With tweaks to trending box, Facebook targets fake news

The changes are designed to make publishers more visible and to make it harder for a single hoax story to go viral, Facebook says.

The Facebook logo is displayed on an iPad in Philadelphia, May 16, 2012. Facebook is updating its 'trending' feature, which shows popular topics discussed and shared on its site, in an effort to root out fake news and misinformation. (Photo: Matt Rourke/AP/File)

Facebook rolled out changes to its trending topics box on Wednesday, in its latest response to the proliferation of fake news items that earned the company censure following the US presidential election.

Trending topics, the company said in a news release, will now factor in user alerts that an item is spam or fake news, and will now identify groups of articles shared on the platform instead of just the mentions earned by a particular topic. The presentation of the box is changing, too: Headlines from an attributed publisher now appear below the topic name, and the same topics will appear uniformly across different regions.

Slight as they are, the modifications return attention to Facebook’s difficulties in striking a balance between its channeling of popular interest and its duties as curator of information.

The company has veered between competing approaches. In August, notes The Wall Street Journal, it fired the teams of contractors it had hired to select headlines – and weed out dubious items that had gained traction – amid accusations that the teams squeezed out news from conservative sources. That decision gave way to the laissez-faire approach of the election season, which saw the ascent of intentionally fabricated items circulating in "echo chambers" that tended to reinforce users' existing beliefs.

The persistent popularity of hoax items has also revealed a pattern in how their readers see the news in general, as The Christian Science Monitor reported this week:

Jack Zhou, an instructor in environmental politics at Duke University in Durham, N.C., says some occupants of so-called "news bubbles" may prefer to accept fake news as truth. "The state of fragmented media may dull the potential practical impact of inoculation messages, particularly in terms of the audiences serviced by those media," Mr. Zhou, who has researched the identity politics of climate change, tells the Monitor in an email.

After all, sites with fake news are only catering to their audiences. Paul Levinson, a communications professor at Fordham University in New York, told the Monitor in December that, "These bubbles have not been imposed upon the public – it was what the people want. As long as social media continues to provide a very easy forum for these news bubbles ... it is not going to stop."

Facebook chief executive officer Mark Zuckerberg initially rejected criticism of how the company curated its election-season items. Within weeks, though, the company started tinkering. In mid-November, Mr. Zuckerberg made a list of ways it could improve, and in December, it started carrying some of them out, enlisting third-party fact-checkers to flag fake news items, as the Monitor reported:

[Vice president of Facebook News Feed Adam] Mosseri said the company is focusing on four key areas of improvement: making it easier for users to report an article as fake, flagging stories as disputed, ensuring that those who share disputed stories know what they are sharing, and disrupting the revenue streams that currently drive much of the fake news industry....

Facebook will append an alert to any story the fact-checkers determine to be false.

"If the fact checking organizations identify a story as fake, it will get flagged as disputed and there will be a link to the corresponding article explaining why," Mosseri wrote.

Part of Wednesday’s changes amounts to finding credibility in numbers: Where a single hoax story could go viral before, the algorithm will now factor in how many publishers are reporting on the topic in question, in addition to taking into account the "historical engagement" of those publishers.

"If just one story or post went viral, it wouldn’t make it into the trending as it might previously," Will Cathcart, a Facebook vice president of product management, told the Journal. "It really takes a mass of publishers writing about the same topic to make the cut."
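Facebook has not published the trending algorithm itself, so the following is purely a hypothetical sketch of the "credibility in numbers" idea described above: a topic only makes the cut when several distinct publishers are covering it, with each publisher's contribution weighted by its engagement. All names and thresholds here are invented for illustration.

```python
# Hypothetical illustration only -- Facebook's actual trending algorithm
# is not public. This sketches the "mass of publishers" rule: a topic
# scores zero unless enough distinct publishers are writing about it,
# so one viral hoax story cannot trend on its own.

from collections import defaultdict

def trending_score(posts, min_publishers=3):
    """posts: list of (publisher, engagement) pairs for one topic.
    Returns 0.0 if fewer than min_publishers distinct publishers
    are covering the topic; otherwise the total engagement."""
    by_publisher = defaultdict(float)
    for publisher, engagement in posts:
        by_publisher[publisher] += engagement

    # The gate: breadth of coverage matters, not just raw virality.
    if len(by_publisher) < min_publishers:
        return 0.0
    return sum(by_publisher.values())

# A single viral story from one source does not make the cut...
print(trending_score([("hoax-site.example", 1_000_000)]))  # 0.0
# ...but moderate coverage from several outlets does.
print(trending_score([("a.example", 500),
                      ("b.example", 300),
                      ("c.example", 200)]))  # 1000.0
```

The design choice mirrored in the sketch is that the publisher count acts as a hard gate before engagement is even summed, which is one simple way to read Mr. Cathcart's remark that "it really takes a mass of publishers writing about the same topic."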
