Facebook puts fewer humans in charge of Trending Topics. Will it reduce bias?

Facebook will rely more heavily on algorithms, rather than humans, to choose trending news topics. Will that eliminate bias?

Facebook CEO Mark Zuckerberg delivers the keynote address at the April F8 Facebook Developer Conference in San Francisco. Facebook recently announced that it is dropping its reliance on news outlets to help determine what gets posted as a “trending topic” on the giant social network, a move adopted after a backlash over a report saying it suppressed conservative views. (Eric Risberg/AP/File)

August 27, 2016

After facing allegations of political bias in its Trending Topics feature earlier this year, Facebook announced Friday that the feature will become more automated than ever.

Facebook says that by using algorithms instead of human judgment, it can both eliminate any possibility of political bias and reach more people worldwide.

“Facebook is a platform for all ideas,” the company writes in a blog post, “and we’re committed to maintaining Trending as a way for people to access a breadth of ideas and commentary about a variety of topics.”


According to a Pew Research Center study released earlier this year, 62 percent of US adults get news from social media platforms, up from 49 percent in 2012. And 2 in 3 Facebook users (66 percent) report getting news from the site.

As social media companies gain ever greater influence over the news their users consume, scrutiny of the platforms has increased. Recently, former Facebook employees claimed that the site’s Trending Topics section exhibited a pro-liberal political bias.

According to a report published in May by the tech news site Gizmodo, Facebook’s Trending Topics section drew its articles from a small number of publications, including the New York Times and the BBC. There were also questions about whether Facebook’s news curators, a “relatively homogenous group of young journalists, primarily educated at Ivy League or private East Coast universities,” were choosing articles that conformed to a political bias.

Facebook’s decision to rely on a news-sorting algorithm – one that combines trending topics (#Clinton or #Trump, for example) with users’ personal interests – is intended to ease concerns that its now-infamous “news curators” will influence which news items are displayed on the site.

While humans will still be involved in the curation process – preventing frivolous topics (e.g., #lunch at lunchtime) from becoming part of users’ news feeds – they will play a much smaller role.


In the past, human news curators wrote descriptions of trending news topics, but no longer. Facebook says that eliminating this step enables Trending Topics to become a global feature – writing descriptions in many languages was nearly impossible to accomplish with human curators.

Instead, users will simply see trending hashtags or terms, along with the number of people talking about them on Facebook. Clicking on the trending term will link to a page showing a number of news articles about that topic.

The Trending Topics section was introduced in 2014 to help users connect to news that was relevant to them, according to Facebook’s blog announcement of the changes. To ensure relevance, Facebook’s news algorithm used information from users’ profiles – pages the user likes, previous stories they have read, and their location – to shape their Trending Topics feed.
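Facebook has not published its ranking code, but the description above can be made concrete with a toy sketch. The Python below shows how such a score might combine a topic’s global momentum with the profile signals the article lists; every name, weight, and formula here is hypothetical, not Facebook’s actual method:

```python
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    """Signals the article says personalization draws on (illustrative only)."""
    liked_pages: set = field(default_factory=set)
    read_topics: set = field(default_factory=set)
    location: str = ""


def trending_score(topic, mentions_this_hour, mentions_prev_hour, user):
    """Toy score: global momentum, nudged by personal relevance.

    The formula and weights are hypothetical; Facebook's real
    ranking logic is not public.
    """
    # Momentum: how quickly conversation about the topic is growing.
    momentum = mentions_this_hour / max(mentions_prev_hour, 1)
    # Relevance: a crude boost when the topic matches the user's history.
    boost = 0.5 if (topic in user.read_topics or topic in user.liked_pages) else 0.0
    return momentum * (1.0 + boost)


user = UserProfile(liked_pages={"#Clinton"}, read_topics={"#Trump"})
print(trending_score("#Trump", 12_000, 4_000, user))     # 4.5 -- boosted for this user
print(trending_score("#Olympics", 12_000, 4_000, user))  # 3.0 -- global momentum only
```

The point of the sketch is only the shape of the computation: a shared, global signal multiplied by a per-user adjustment. It is that per-user adjustment that critics of personalization focus on.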

But does Facebook’s decision to automate Trending Topics eliminate bias?

Earlier this summer, Harvard law professor Cass Sunstein penned an opinion piece for Bloomberg, expressing concern about bias in the news that so many Americans consume each day.

Facebook’s potential political bias had been in the news, but Professor Sunstein’s concern had different roots. Articles curated by humans with an alleged political slant are disconcerting, he says, but Facebook’s algorithm has a bias of its own built in.

Sunstein argues that by using information such as prior article views or liked pages to shape Trending Topics feeds, Facebook’s news algorithm exposes users primarily to news topics or ideas they likely already agree with.

Facebook does this to increase the likelihood that a user will be interested in or read an article. But what it really does, Sunstein says, is reinforce individual views.
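Sunstein’s worry can be restated mechanically: if relevance is learned from clicks, each click on agreeable content pushes similar content up the ranking, which in turn produces more agreeable content to click. A minimal, entirely hypothetical simulation of that feedback loop:

```python
import random

random.seed(0)

# Two competing viewpoints; the feed starts with no personalization.
weights = {"left-leaning": 1.0, "right-leaning": 1.0}

for _ in range(50):
    # The feed shows whichever viewpoint currently ranks highest
    # for this user (ties break arbitrarily).
    shown = max(weights, key=weights.get)
    # Suppose this user clicks agreeable ("left-leaning") items 80% of the time.
    if shown == "left-leaning" and random.random() < 0.8:
        weights[shown] += 0.1  # each click reinforces the ranking

print(weights)  # the gap only widens: the user's own clicks narrow the feed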

“If a major source of the nation’s news is personalizing user experiences, people with different points of view will end up in echo chambers of their own design,” Sunstein writes. “Facebook didn’t create that problem, but it shouldn’t aggravate it.”

Some may ask why users should be shown articles that they aren’t interested in. According to Sunstein, there is value in “immense and unanticipated exposures” to different ideas.

In short, the problem of bias could persist even after Facebook’s transition to automated Trending Topics feeds. Facebook continues to reject this concern, with CEO Mark Zuckerberg saying at a company meeting, “We’ve built Facebook to be a platform for all ideas.”