How Zuckerberg's changing his mind on Facebook's fake news dilemma

Facebook and other tech companies have long tried to stay out of content curation, verification or censorship. But the volume of unreliable information in circulation, particularly throughout the election cycle, is shifting thinking.

Facebook CEO Mark Zuckerberg speaks during the keynote address at the F8 Facebook Developer Conference in San Francisco, April 12, 2016. Eric Risberg/AP/File

November 19, 2016

After the election, Facebook experienced a wave of backlash from shocked voters over the social media site’s role in propagating misinformation and promoting partisan news bubbles that left readers on both sides of the aisle in the dark about the true state of politics in the US.

After initially denying that fake news played a role in the election, Facebook is now taking steps to fix the situation. In a lengthy post on his Facebook page, CEO and founder Mark Zuckerberg detailed how the company could combat fake news – a move that could change the relationship between tech companies and online information.

"While the percentage of misinformation is relatively small, we have much more work ahead on our roadmap," Zuckerberg wrote in a post to his Facebook profile last night.

The post outlines seven different ways Facebook will attempt to improve the quality of its news feed, including improving automatic detection of potentially false content, labeling news that may be false, and making it easier for users to flag content.

"We are raising the bar for stories that appear in related articles under links in News Feed," Zuckerburg continued. "A lot of misinformation is driven by financially motivated spam. We're looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection."

Zuckerberg also says he intends to employ third-party verification and engage in higher-quality discussion with journalists and news media outlets.

With this announcement, Facebook is wading into a murky area. While the proposed solutions have the potential to be effective, they could also create problems of their own, as NPR reports. Breaking news stories would be difficult for algorithms and independent fact-checkers alike to verify quickly. And features that allow users to flag content could be manipulated to push individual agendas.
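To make that manipulation risk concrete, consider a deliberately naive sketch of a flag-threshold rule, written in Python. It is purely illustrative: the function, the 50-flag cutoff, and the sockpuppet data are all hypothetical assumptions for this example, not a description of Facebook's actual systems.

```python
# Illustrative only: a naive rule that demotes any story once enough
# users flag it, plus a demonstration of how coordinated accounts can
# game that rule. All names and thresholds here are hypothetical.
from collections import Counter

FLAG_THRESHOLD = 50  # hypothetical cutoff: demote after 50 flags

def stories_to_demote(flag_events):
    """flag_events: iterable of (story_id, user_id) pairs."""
    counts = Counter(story_id for story_id, _ in flag_events)
    return {story for story, n in counts.items() if n >= FLAG_THRESHOLD}

# Fifty coordinated accounts flagging one accurate story trip the
# threshold just as easily as organic readers flagging a false one.
hostile_flags = [("accurate-story", f"sockpuppet-{i}") for i in range(50)]
print(stories_to_demote(hostile_flags))  # -> {'accurate-story'}
```

A system that counts raw flags cannot distinguish organic skepticism from an orchestrated campaign, which is why weighting flags by account reputation or pairing them with independent fact-checking is usually discussed alongside simple thresholds.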

To avoid these complications and many others, Facebook and other tech companies have long tried to stay out of content curation, verification, or censorship in the name of freedom of information. However, the volume of unreliable information in circulation online, particularly throughout the election cycle, is creating a push for higher standards.

However, many observers feel that it is ultimately up to the news consumer, not tech companies like Facebook and Google, to think critically about the accuracy of information, as The Christian Science Monitor's Harry Bruinius reports:

Scholars say that news consumers have to be more discerning about the sources of their information. And many see profound risks to the free flow of ideas if platforms like Facebook and companies like Google become the gatekeepers or ultimate arbiters of what constitutes legitimate information.

It is particularly important to be discerning about one's news diet, political analysts say, when Facebook essentially creates a bubble of like-minded opinions – and, as this election showed, limited exposure to dissenting opinions makes it easy to underestimate their power.

“Americans are ... likely to get what they do know, or think they know, from an echo chamber,” Krista Jenkins, professor of political science at Fairleigh Dickinson University in Teaneck, N.J., told the Monitor via email. “What’s needed in our discourse is a cross-pollination of ideas and viewpoints so that we begin to turn the tide on the alarming trend of seeing the other side as dangerous and misguided, rather than those whose experiences and perspectives lead them to believe different things about where to go and how to get there.”