Release of Facebook 'trending news' guidelines raises new questions

The company denied a report that employees suppressed stories of interest to conservative readers, but on Thursday it released guidelines that seem to contradict earlier claims that human editors have little influence.

Facebook released more details about its trending news section on Thursday, including an in-depth set of editorial guidelines, amid an ongoing controversy over how the site selects which stories to display.

The social media giant continued to stress that its guidelines don't allow employees who review the section to remove or block particular stories based on political bias.

On Monday, the tech site Gizmodo reported that a former Facebook employee described as having conservative leanings said workers "routinely suppressed" stories on public figures such as Mitt Romney and on the Conservative Political Action Conference.

The company quickly denied the report, but the issue has become one of intense speculation, sparking a Senate committee inquiry and raising questions about the site's claims of technological neutrality.

"They've built a site that is profitable because it caters to people's need to self-express and curate and refine their images and individual brands, and they do that within groups where they feel comfortable because everyone is like them," Bill Bishop, a journalist and author of "The Big Sort: Why the Clustering of Like-Minded America is Tearing Us Apart," told the Monitor on Tuesday. "It's the site for our time."

The 28-page guidelines for reviewing topics on the feed also appear to contradict some of the company's earlier assertions that the stories on the site are primarily sourced by an algorithm and only "lightly curated" by human editors.

The guidelines, which were also released by The Guardian on Thursday, show that editors do mark stories as a trending "national story" by reviewing a select group of 10 news outlets, including the BBC, Fox News, The New York Times, The Guardian and BuzzFeed. Descriptions of each trending topic are sourced from at least three of 1,000 news outlets.
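
The mechanics described in the guidelines lend themselves to a simple illustration. The sketch below is a hypothetical, minimal rendering of that kind of check, not Facebook's actual system: the outlet names come from the article, while the data structures, function name, and threshold are assumptions made only for illustration.

```python
# Minimal, hypothetical sketch of the kind of check the guidelines describe:
# a topic is flagged as a "national story" when enough of the ten review
# outlets are leading with it. Outlet names come from the article; the
# threshold and data structures are illustrative assumptions, not Facebook's
# actual system.

REVIEW_OUTLETS = {
    "BBC News",
    "Fox News",
    "The New York Times",
    "The Guardian",
    "BuzzFeed News",
    # ...the guidelines list 10 outlets in all; only those named in the
    # article are included here.
}


def is_national_story(outlets_covering_topic, min_outlets=3):
    """Return True if at least `min_outlets` review outlets cover the topic.

    `min_outlets` is a hypothetical threshold; per the guidelines, the actual
    decision rests with the human editors reviewing the feed.
    """
    return len(set(outlets_covering_topic) & REVIEW_OUTLETS) >= min_outlets


# Example: a topic covered by three of the listed outlets would be flagged.
print(is_national_story({"BBC News", "Fox News", "The Guardian"}))  # True
```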

A Facebook spokesperson had previously told the Monitor, "I don't know where that's coming from," in response to reports that the site sourced stories from particular outlets.

The version of the guidelines obtained by The Guardian includes instructions for how to "inject" or "blacklist" particular topics, which appear to contradict a statement made by a Facebook executive earlier in the week.

"The editorial team CAN [sic] inject a newsworthy topic in the event that something is attracting a lot of attention, such as #BlackLivesMatter," the version obtained by The Guardian says.

This reference to the civil rights movement, which has made frequent use of social media, does not appear in the version released by Facebook.

In a post on the site on Thursday, chief executive officer Mark Zuckerberg said he intended to meet with "leading conservatives and people from across the political spectrum" to discuss how the site curates stories for readers.

In his post, Mr. Zuckerberg doubled down on the company's assertion that the site's guidelines don't allow "the prioritization of one viewpoint over another or the suppression of political perspectives."

Beyond Facebook's trending sidebar, the algorithm behind the site's main news feed has often been hotly debated by researchers.

Last year the company published a study in the journal Science suggesting that users' own behavior was more responsible than the algorithm for what appeared in their news feeds.

Karrie Karahalios, an associate professor of computer science at the University of Illinois at Urbana-Champaign, published a study showing that more than 60 percent of Facebook users in a small survey had no idea the site was filtering their news feeds at all. She says some researchers outside the company have questioned whether the Facebook study's findings were based on sampling users from extreme ends of the political spectrum.

They've referred to it as "The Facebook, it's not our fault – it's yours" study, she writes in an e-mail to the Monitor on Thursday.

The debate around the trending feed also mirrors broader questions about how the algorithms used by many social media sites work, with Ms. Karahalios and her colleagues pushing for "algorithmic audits" to examine whether the computer programs can "learn" to reflect societal biases, a call that has also been taken up by the White House.

In some instances, she notes, ZIP codes have become a proxy for race. Facebook's practice of marketing to "affinity groups" also came to mainstream attention earlier this year, when it was revealed that Facebook users of different races were shown different trailers for the film "Straight Outta Compton."
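
The "algorithmic audit" that Ms. Karahalios and her colleagues call for can be made concrete with a small example. The sketch below is a hypothetical illustration, not any published audit methodology: it shows one simple check an audit might run, comparing how often an automated decision favors users in different groups. The group labels, example data, and threshold are invented for illustration.

```python
# Hypothetical sketch of one check an "algorithmic audit" might perform:
# compare positive-outcome rates across user groups and flag large gaps.
# Group labels, the example data, and the 0.10 threshold are invented for
# illustration; they do not describe any real audit of Facebook's systems.

from collections import defaultdict


def positive_rate_by_group(decisions):
    """decisions: iterable of (group_label, got_positive_outcome) pairs."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, outcome in decisions:
        totals[group] += 1
        positives[group] += int(outcome)
    return {group: positives[group] / totals[group] for group in totals}


def flag_disparity(decisions, max_gap=0.10):
    """Return (flagged, rates); flagged is True if rates differ by more than max_gap."""
    rates = positive_rate_by_group(decisions)
    return max(rates.values()) - min(rates.values()) > max_gap, rates


# Example with made-up data: outcomes that skew toward one group get flagged.
flagged, rates = flag_disparity(
    [("group_a", True), ("group_a", True), ("group_b", True), ("group_b", False)]
)
print(flagged, rates)  # True {'group_a': 1.0, 'group_b': 0.5}
```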

"Today, with the widespread use of deep neural nets and similar machine learning algorithms, even the developers of the algorithms can't tell what's going on inside them," she writes.
