Facebook's democratic role? New steps to stop misinformation.

CEO Mark Zuckerberg says the social media platform will restrict new political ads in the week before the election and remove posts that convey misinformation about COVID-19 and voting. Critics say the measures don't go far enough.

Richard Drew/AP
The logo for Facebook is seen at the Nasdaq MarketSite in New York's Times Square. As the U.S. presidential election nears, Facebook says it is taking steps to encourage voting, minimize misinformation, and reduce the likelihood of post-election “civil unrest.”

With just two months left until the U.S. presidential election, Facebook says it is taking more steps to encourage voting, minimize misinformation, and reduce the likelihood of post-election “civil unrest.”

The company said Thursday it will restrict new political ads in the week before the election and remove posts that convey misinformation about COVID-19 and voting. It will also attach links to official results to posts from candidates and campaigns that declare premature victories.

“This election is not going to be business as usual. We all have a responsibility to protect our democracy,” Facebook CEO Mark Zuckerberg said in a post on Thursday. “That means helping people register and vote, clearing up confusion about how this election will work, and taking steps to reduce the chances of violence and unrest.”

Facebook and other social media companies are under scrutiny over how they handle misinformation, as President Donald Trump and other candidates post false information and Russia continues its attempts to interfere in U.S. politics.

Facebook has long been criticized for not fact-checking political ads or limiting how they can be targeted at small groups of people.

With the nation divided, and election results potentially taking days or weeks to be finalized, there could be an “increased risk of civil unrest across the country,” Mr. Zuckerberg said.

In July, Mr. Trump refused to publicly commit to accepting the results of the upcoming election, as he scoffed at polls that showed him lagging behind Democratic rival Joe Biden. Mr. Trump also has made false claims that the increased use of mail-in voting because of the coronavirus pandemic allows for voter fraud. That has raised concern over the willingness of Mr. Trump and his supporters to abide by election results.

Under the new measures, Facebook says it will prohibit politicians and campaigns from running new election ads in the week before the election. However, they can still run existing ads and change how they are targeted.

Posts containing obvious misinformation about voting policies and the coronavirus pandemic will also be removed. On Messenger, Facebook's messaging app, users will be able to forward articles to a maximum of five people. The company will also work with Reuters to provide official election results and make that information available both on its platform and through push notifications.

After being caught off-guard by Russia’s efforts to interfere in the 2016 U.S. presidential election, Facebook, Google, Twitter, and other companies put safeguards in place to prevent it from happening again. That includes taking down posts, groups, and accounts that engage in “coordinated inauthentic behavior” and strengthening verification procedures for political ads. Last year, Twitter banned political ads altogether.

Mr. Zuckerberg said Facebook had removed more than 100 networks worldwide engaging in such interference over the last few years.

“Just this week, we took down a network of 13 accounts and two pages that were trying to mislead Americans and amplify division,” he said.

But experts and Facebook’s own employees say the measures are not enough to stop the spread of misinformation – including from politicians and in the form of edited videos.

Facebook had previously drawn criticism for its ads policy, which cited freedom of expression as the reason for letting politicians like Mr. Trump post false information about voting.

Trump campaign spokeswoman Samantha Zager criticized the ban on new political ads, saying it would prevent Mr. Trump from defending himself on the platform in the last seven days of the presidential campaign.

This story was reported by The Associated Press.
