How Zuckerberg's changing his mind on Facebook's fake news dilemma

Facebook and other tech companies have long tried to stay out of content curation, verification or censorship. But the volume of unreliable information in circulation, particularly throughout the election cycle, is shifting thinking.

Eric Risberg/AP/File
FILE- In this April 12, 2016, file photo, Facebook CEO Mark Zuckerberg speaks during the keynote address at the F8 Facebook Developer Conference in San Francisco. Facebook says, Wednesday, Nov. 16, it will work with independent companies like Nielsen and comScore to review its metrics after it uncovered new problems with the data it provides to advertisers and publishers that use its network.

After the election, Facebook experienced a wave of backlash from shocked voters over the social media site’s role in propagating misinformation and promoting partisan news bubbles that left readers on both sides of the aisle in the dark about the true state of politics in the US.

After first denying the role that fake news played in the election, Facebook is now taking steps to fix the situation. In a lengthy post on his Facebook page, CEO and founder Mark Zuckerberg detailed how the company could combat fake news – a move that may change the relationship between tech companies and online information.

"While the percentage of misinformation is relatively small, we have much more work ahead on our roadmap," Zuckerberg wrote in a post to his Facebook profile last night.

The post outlines seven different ways Facebook will attempt to improve the quality of its news feed, including improving automatic detection of potentially false content, labeling news that may be false, and making it easier for users to flag content.

"We are raising the bar for stories that appear in related articles under links in News Feed," Zuckerberg continued. "A lot of misinformation is driven by financially motivated spam. We're looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection."

Zuckerberg also says he intends to employ third-party verification and engage in better-quality discussion with journalists and news media outlets.

With this announcement, Facebook is wading into a murky area. While the proposed solutions have the potential to be effective, each could also create problems of its own, as NPR reports. Breaking news stories would be difficult for algorithms and independent fact-checkers alike to verify. And features that allow users to flag content could end up being manipulated into a tool to push individual agendas.

To avoid these complications and many others, Facebook and other tech companies have long tried to stay out of content curation, verification, or censorship in the name of freedom of information. However, the volume of unreliable information in circulation online, particularly throughout the election cycle, is creating a push for higher standards.

However, many observers feel that it ultimately is up to the news consumer, and not tech companies like Facebook and Google, to think critically about the accuracy of information, as The Christian Science Monitor's Harry Bruinius reports:

Scholars say that news consumers have to be more discerning on the sources of their information. And many see profound risks to the free flow of ideas if platforms like Facebook and companies like Google are supposed to become the gatekeepers or ultimate arbiters of what constitutes legitimate information.

It is particularly important to be discerning about one's news diet, say political analysts, when Facebook essentially creates a bubble of like-minded opinions – and, as this election showed, limited exposure to dissenting opinions makes it easy to underestimate their power.

“Americans are ... likely to get what they do know, or think they know, from an echo chamber,”  Krista Jenkins, professor of political science at Fairleigh Dickinson University in Teaneck, N.J., told the Monitor via email. “What’s needed in our discourse is a cross-pollination of ideas and viewpoints so that we begin to turn the tide on the alarming trend of seeing the other side as dangerous and misguided, rather than those whose experiences and perspectives lead them to believe different things about where to go and how to get there.”


