Facebook's news algorithm promotes 9/11 'truther' article

Is it a mistake for Facebook to rely on algorithms when it comes to sensitive topics?

Matt Rourke/AP/File
In this May 16, 2012, file photo, the Facebook logo is displayed on an iPad in Philadelphia. After switching from human curation to a new algorithm, Facebook's Trending Topics section linked out to a 9/11 'truther' article on the anniversary of the attacks.

On the 15th anniversary of the September 11 attacks, Facebook commemorated the event in a strange way – by promoting an article that claimed the attacks were staged.

The article, which purported to show proof that “bombs were planted in Twin Towers,” appeared in Facebook’s Trending Topics section on Friday. In the weeks since switching from human curators to an automated algorithm to select Trending Topics, the feed has promoted a series of questionable links, including thinly veiled hit pieces and even pornography.

In the wake of such missteps, experts and users have decried the algorithm’s apparent inability to navigate sensitive stories. Do some topics need a human touch?

On Facebook, Trending Topics are chosen based on what users are actively posting about, while featured articles are chosen in-house and cycled hourly. Located in the upper right sidebar, Trending Topics receive high visibility on the social media platform.

So when one topic linked out to a 9/11 "truther" article, you can be sure that a lot of people saw it. The piece, which originally appeared in a UK tabloid called the Daily Star, referred to theories that the attacks had been orchestrated by the US government.

"We're aware a hoax article showed up there and as a temporary step to resolving this we've removed the topic," a Facebook spokesman said in a statement.

On August 27, Facebook announced that it would replace human news curators with a new algorithm. The change was prompted by claims that the Trending Topics section, which was curated mostly by young, Ivy League-educated journalists, had a liberal slant. An algorithm, representatives said, could reduce those biases and present “a breadth of ideas” to users.

Facebook said it would employ a team of real people to monitor the links for quality control. But ever since the switch, the platform has been marred by problematic links. One led to an erroneous claim that Fox News anchor Megyn Kelly had been fired for “backing Hillary.” Another referred to Meghan McCain, the daughter of Arizona Sen. John McCain, as “Miss Piggie.”

Such incidents expose a weakness in Facebook’s algorithm: a lack of human sensitivity.

Algorithms are good at relating phrases and photos across huge swaths of data. They can tell you what topics are trending, and what stories are getting the most clicks. But most can’t recognize a hoax, or distinguish between a news story and a cruel gossip piece. Are there some cues that can’t be taught to technology?

Last week, a Norwegian newspaper posted Nick Ut’s Pulitzer Prize-winning photo of Kim Phuc, originally taken in 1972 after a napalm attack in Vietnam. But Facebook’s algorithm soon deleted the post because it contained child nudity. The system could not weigh the image’s historical significance against its nudity rules: algorithms such as these are designed to “optimize” toward a general rule, rather than find the “correct” answer in each individual case.

For some, the deletion of Ut's photo raised troubling questions of censorship. "The media have a responsibility to consider publication [of stories] in every single case," wrote Espen Egil Hansen, editor at Norway’s largest newspaper, in an open letter to Facebook CEO Mark Zuckerberg. "This right and duty, which all editors in the world have, should not be undermined by algorithms encoded in your office in California."

At press time, the 9/11 anniversary topic was still absent from Facebook’s news feed, replaced by related topics such as “Air Force One” and “Lower Manhattan.”
