The world’s biggest social media network is thinking rather small these days.
On July 18, Facebook announced that it will start removing misinformation on its digital platforms that could spark violence. While the goal is global, the company will act on what it hopes are local community standards, relying on "local context and local partners" to help it decide which language and images to ban as likely to incite physical harm.
The move comes after India accused Facebook of allowing rumors about the kidnapping of children to circulate on its messaging service WhatsApp. The rumors led to the mob killings of several innocent people. Similar violence in Sri Lanka and Myanmar has also been attributed to false information on Facebook platforms. In many countries, rules about the wrong kind of speech resemble the legal norm in the United States that people cannot falsely yell "fire!" in a crowded theater. Certain rights such as free speech are not protected if they are abused to cause harm.
Like other social media companies, Facebook is already trying to meet new privacy standards imposed by the European Union and to regulate content under a new German online hate speech law. It is also providing more transparency about political ads that run on its site. And it is working with local fact-checking organizations to better filter out fake news.
Last April, Facebook founder Mark Zuckerberg admitted that the company faces a dilemma because, while it is based in the US, where universal ideals are largely understood, 90 percent of its 2 billion users live elsewhere, with different social norms and different cultures. “It’s not clear to me that our current situation of how we define community standards is going to be effective for articulating that around the world,” he told Congress.
Facebook is now on a search to find those local norms, either informal or written in law, that will then be used to restrain local content in hopes of keeping the peace in particular societies. At the same time, Facebook also sees its platforms as an opportunity to create a global community with shared values. For example, it imposes a general ban on certain types of expression, such as nudity.
In a 2017 book, Canadian scholar Michael Ignatieff tried to find a balance between universal ideals and what he called “ordinary virtues,” or the moral operating system of local communities. Such virtues are seen not as an obligation within a society but as a “gift,” negotiated between individuals, one at a time and in the spirit of reciprocity and solidarity. He questions whether the language of rights has “reached into ... the common practices of trust and tolerance, forgiveness and reconciliation that are the essence of private moral behavior.”
What is noteworthy, writes Dr. Ignatieff based on his research in several countries, is the common desire for moral order, or a framework of expectations that allows life to be meaningful.
His work builds on that of the late Harvard University moral philosopher Lawrence Kohlberg, whose research on young children discovered an innate ability of humans to rise to higher stages of moral reasoning. His work also discovered that individuals at those higher stages can influence those at lower stages through democratic discussion of moral dilemmas. As people's thinking on issues improves, they develop a "commonality" on right and wrong.
The ordinary virtues of a local community can eventually become universal if people feel both safe and free to communicate with others. Facebook, along with Twitter, Google, and other social media giants, is venturing down this long road. The more these companies honor local virtues, the more they can help define global ethics.