Facebook released more details about its trending news section on Thursday, including an in-depth set of editorial guidelines, amid an ongoing controversy about how the site selects what stories are displayed.
The social media giant continued to stress that its guidelines don't allow employees who review the section to remove or block particular stories based on political bias.
On Monday, the tech site Gizmodo reported that a former Facebook employee described as having conservative leanings said workers "routinely suppressed" stories on public figures such as Mitt Romney and on the Conservative Political Action Conference.
The company quickly denied the report, but the issue has become one of intense speculation, sparking a Senate committee inquiry and raising questions about the site's claims of technological neutrality.
"They've built a site that is profitable because it caters to people's need to self-express and curate and refine their images and individual brands, and they do that within groups where they feel comfortable because everyone is like them," Bill Bishop, a journalist and author of "The Big Sort: Why the Clustering of Like-Minded America is Tearing Us Apart," told the Monitor on Tuesday. "It's the site for our time."
The 28-page guidelines for reviewing topics on the feed also appear to contradict some of the company's earlier assertions that stories on the site are primarily sourced by an algorithm and only "lightly curated" by human editors.
The guidelines, which were also released by The Guardian on Thursday, show that editors do mark stories as a trending "national story" by reviewing a select group of 10 news outlets, including the BBC, Fox News, The New York Times, The Guardian, and BuzzFeed. Descriptions of each trending topic are sourced from at least three of a list of 1,000 news outlets.
A Facebook spokesperson previously told the Monitor, "I don't know where that's coming from," in response to reports that the site sourced stories from particular outlets.
The version of the guidelines obtained by The Guardian includes instructions for how to "inject" or "blacklist" particular topics, which appear to contradict a statement made by a Facebook executive earlier in the week.
"The editorial team CAN [sic] inject a newsworthy topic in the event that something is attracting a lot of attention, such as #BlackLivesMatter," the version obtained by The Guardian says.
This reference to the civil rights movement that has made frequent use of social media does not appear to be in the version released by Facebook.
In a post on the site on Thursday, chief executive officer Mark Zuckerberg said he intended to have a conversation with "leading conservatives and people from across the political spectrum" about how the site curates stories for readers.
In his post, Mr. Zuckerberg doubled down on the company's assertion that the site's guidelines don't allow "the prioritization of one viewpoint over another or the suppression of political perspectives."
Beyond Facebook's trending sidebar, the algorithm behind the site's main news feed has often been hotly debated by researchers.
The company released a study last year, published in the journal Science, suggesting that users' own behavior was more responsible than the algorithm for what appeared in their news feeds.
Karrie Karahalios, an associate professor of computer science at the University of Illinois at Urbana-Champaign, published a study showing that more than 60 percent of a small survey of Facebook users had no idea the site was filtering their news feeds at all. She says some researchers outside the company have questioned whether the Facebook study's findings were based on sampling users from extreme ends of the political spectrum.
They've referred to it as "The Facebook, it's not our fault – it's yours" study, she writes in an e-mail to the Monitor on Thursday.
The debate around the trending feed also mirrors broader questions about how the algorithms used by many social media sites work, with Ms. Karahalios and her colleagues pushing for "algorithmic audits" to examine whether the computer programs can "learn" to reflect societal biases, a call that has also been taken up by the White House.
In some instances, she notes, Zip codes have become a proxy for race, while Facebook's practice of marketing to "affinity groups" also came to mainstream attention earlier this year when it was revealed that different trailers for the film "Straight Outta Compton" were shown to Facebook users of different races.
"Today, with the widespread use of deep neural nets and similar machine learning algorithms, even the developers of the algorithms can't tell what's going on inside them," she writes.