Has Facebook done enough for the 2020 US presidential election?

Facebook says it has taken steps to tackle misinformation after its platform was misused in the 2016 U.S. election. But critics are skeptical, saying the safeguards remain insufficient.

Andrew Harnik/AP
Facebook CEO Mark Zuckerberg arrives for a hearing on Capitol Hill in Washington, Oct. 23, 2019. In 2016, Mr. Zuckerberg called the idea that fake news and misinformation could have had an impact on the election "a pretty crazy idea."

Ever since Russian agents and other opportunists abused its platform in an attempt to manipulate the 2016 United States presidential election, Facebook has insisted – repeatedly – that it’s learned its lesson and is no longer a conduit for misinformation, voter suppression, and election disruption.

But it has been a long and halting journey for the social network. Critical outsiders, as well as some of Facebook’s own employees, say the company’s efforts to revise its rules and tighten its safeguards remain wholly insufficient to the task, despite the billions it has spent on the project. As for why, they point to the company’s persistent unwillingness to act decisively over much of that time.

“Am I concerned about the election? I’m terrified,” said Roger McNamee, a Silicon Valley venture capitalist and an early Facebook investor turned vocal critic. “At the company’s current scale, it’s a clear and present danger to democracy and national security.”

The company’s rhetoric has certainly gotten an update. CEO Mark Zuckerberg now casually references possible outcomes that were unimaginable in 2016 – among them, possible civil unrest and potentially a disputed election that Facebook could easily make even worse – as challenges the platform now faces.

“This election is not going to be business as usual,” Mr. Zuckerberg wrote in a September Facebook post in which he outlined Facebook’s efforts to encourage voting and remove misinformation from its service. “We all have a responsibility to protect our democracy.”

Yet for years Facebook executives have seemed to be caught off guard whenever their platform – created to connect the world – was used for malicious purposes. Mr. Zuckerberg has offered multiple apologies over the years, as if no one could have predicted that people would use Facebook to live-stream murders and suicides, incite ethnic cleansings, promote fake cancer cures, or attempt to steal elections.

While other platforms like Twitter and YouTube have also struggled to address misinformation and hateful content, Facebook stands apart for its reach and scale and, compared to many other platforms, its slower response to the challenges identified in 2016.

In the immediate aftermath of President Donald Trump’s election, Mr. Zuckerberg offered a remarkably tone-deaf quip regarding the notion that “fake news” spread on Facebook could have influenced the 2016 election, calling it “a pretty crazy idea.” A week later, he walked back the comment.

Since then, Facebook has issued a stream of mea culpas for its slowness to act against threats to the 2016 election and promised to do better. “I don’t think they have become better at listening,” said David Kirkpatrick, author of a book on Facebook’s rise. “What’s changed is more people have been telling them they need to do something.”

The company has hired outside fact-checkers, added restrictions – then more restrictions – on political advertisements and taken down thousands of accounts, pages, and groups it found to be engaging in “coordinated inauthentic behavior.” That’s Facebook’s term for fake accounts and groups that maliciously target political discourse in countries ranging from Albania to Zimbabwe.

It has also started adding warning labels to posts that contain misinformation about voting and has, at times, taken steps to limit the circulation of misleading posts. In recent weeks the platform also banned posts that deny the Holocaust and joined Twitter in limiting the spread of an unverified political story about Hunter Biden, son of Democratic presidential candidate Joe Biden, published by the conservative New York Post.

All this unquestionably puts Facebook in a better position than it was in four years ago. But that doesn’t mean it’s fully prepared. Despite tightened rules banning them, violent militias still use the platform to organize; recent examples include a foiled plot to kidnap the governor of Michigan.

In the four years since the last election, Facebook’s earnings and user growth have soared. This year, analysts expect the company to rake in profits of $23.2 billion on revenue of $80 billion, according to FactSet. It currently boasts 2.7 billion users worldwide, up from 1.8 billion at this time in 2016.

Facebook faces a number of government investigations into its size and market power, including an antitrust probe by the U.S. Federal Trade Commission. An earlier FTC investigation socked Facebook with a $5 billion fine, but didn’t require any additional changes.

“Their No. 1 priority is growth, not reducing harm,” Mr. Kirkpatrick said. “And that is unlikely to change.”

Part of the problem: Mr. Zuckerberg maintains an iron grip on the company, yet doesn’t take criticism of him or his creation seriously, charges social media expert Jennifer Grygiel, a Syracuse University communications professor. But the public knows what’s going on, they said. “They see COVID misinformation. They see how Donald Trump exploits it. They can’t unsee it.”

Facebook insists it takes the challenge of misinformation seriously – especially when it comes to the election.

“Elections have changed since 2016, and so has Facebook,” the company said in a statement laying out its policies on the election and voting. “We have more people and better technology to protect our platforms, and we’ve improved our content policies and enforcement.”

Professor Grygiel says such comments are par for the course: “This company uses PR in place of an ethical business model.”

Mr. Kirkpatrick notes that board members and executives who have pushed back against the CEO – a group that includes the founders of Instagram and WhatsApp – have left the company.

“He is so certain that Facebook’s overall impact on the world is positive” and that critics don’t give him enough credit for that, Mr. Kirkpatrick said of Mr. Zuckerberg. As a result, the Facebook CEO isn’t inclined to take constructive feedback. “He doesn’t have to do anything he doesn’t want to. He has no oversight,” Mr. Kirkpatrick said.

The federal government has so far left Facebook to its own devices, a lack of accountability that has only empowered the company, according to U.S. Rep. Pramila Jayapal, a Washington Democrat who grilled Mr. Zuckerberg during a July Capitol Hill hearing.

Warning labels are of limited value if the algorithms underlying the platform are designed to push polarizing material at users, she said. 

“I think Facebook has done some things that indicate it understands its role. But it has been, in my opinion, far too little, too late.”

This story is reported by The Associated Press.
