Facebook to flag 'worst of the worst' fake news with fact-checking partners

The social networking giant says it will make it easier for users to report fake news, flag fake stories with the help of fact-checkers, and disrupt financial incentives that drive fake news.

Photo: Facebook via AP. This photo combo of images provided by Facebook demonstrates some of the new measures Facebook is taking to curb the spread of fake news on its huge and influential social network. The company is focusing on the "worst of the worst" offenders and partnering with outside fact-checkers to sort honest news reports from made-up stories that play to people's passions and preconceived notions.

Facebook will enlist the help of fact-checkers to begin flagging fake news stories, the social networking giant announced Thursday, outlining a number of tests and features designed to address critiques that its platform is a hotbed of politically consequential misinformation.

Shortly after last month's elections, Facebook chief executive Mark Zuckerberg said it was a "crazy idea" to think his platform had influenced the outcome. But the company seems to have changed its tone as it walks a tightrope: taking some responsibility for the spread of false information online while trying to avoid accusations of partisanship or censorship.

"We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we're approaching this problem carefully," Adam Mosseri, the vice president of Facebook News Feed, wrote in a blog post. "We've focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organizations."

Mr. Mosseri said the company is focusing on four key areas of improvement: making it easier for users to report an article as fake, flagging stories as disputed, ensuring that those who share disputed stories know what they are sharing, and disrupting the revenue streams that currently drive much of the fake news industry.

To flag fake news, Facebook has begun working with third-party fact-checkers who must agree to abide by a five-part code assembled by The Poynter Institute. Currently, 43 organizations worldwide have signed onto Poynter's statement, including ABC News, the Associated Press, PolitiFact, Snopes, and The Washington Post Fact Checker – all of which have agreed to produce fair, transparent, and nonpartisan work.

Facebook will append an alert to any story the fact-checkers determine to be false.

"If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why," Mosseri wrote.

Users will still be permitted to read and share fake news on Facebook, but they will be confronted with warnings to make it clear that the veracity of a particular article has been questioned. Illustrations published with Mosseri's blog depict a pop-up window that requires users to acknowledge that fact-checkers dispute a story before they share it.

Additionally, flagged stories cannot be promoted or turned into ads on Facebook.

A number of websites – including Liberty Writers News, Alex Jones' Info Wars, and Ending the Fed – have spread lies and conspiracy theories that were particularly popular among supporters of President-elect Donald Trump, as The Christian Science Monitor's Story Hinckley reported on Thursday.

"The problem is that we are too credulous of news that reinforces our predispositions and too critical of sites that contradict them," Brendan Nyhan, a political scientist at Dartmouth College, told the Monitor.

"Facebook created the platform and the election created the topic that would deliver the hits and shares," he added.

As policymakers and the public continue to discuss the real-world implications of fabricated information online, Facebook says it will look to continue modifying its approach, as well.

"We're excited about this progress, but we know there's more to be done," Mosseri wrote. "We're going to keep working on this problem for as long as it takes to get it right."
