EU urges social media giants to act on hate speech

Facebook, Twitter, YouTube, and Microsoft must act faster to counteract hate speech, the European Commission said on Sunday.


The European Union (EU) issued a warning to US tech giants, including Facebook and Twitter, on Sunday, saying they must act faster to crack down on online hate speech or face legislative measures forcing them to do so.

The urging from the European Commission, the EU's executive branch, comes six months after the companies, which also include Microsoft and Google's YouTube, agreed to a voluntary code of conduct to review, and delete at their discretion, suspected hate speech in Europe. 

At the time, EU Justice Commissioner Věra Jourová cited the "urgent need to address illegal online hate speech" following recent terror attacks, as "social media is unfortunately one of the tools that terrorist groups use to radicalize young people." But the code of conduct faced criticism from some civil liberties and Internet advocacy groups who expressed concerns about potential privacy violations and worried that companies might be overzealous in enforcing it. As Rachel Stern reported for The Christian Science Monitor in June: 

The code is meant to encourage companies to become more vigilant at removing content that violates their own terms of service but that doesn't necessarily violate European law. The problem for civil liberty groups such as [Brussels-based nonprofit digital advocacy organization] Access Now is that companies may monitor for and remove content merely because it’s controversial and they feel they face a liability by leaving it online, says [Estelle Massé, the EU policy analyst with Access Now].

Some critics, such as human rights advocate Jacob Mchangama, the director of Copenhagen-based think tank Justitia, argued that cracking down on hate speech as per the code of conduct could have a slippery slope effect. 

"It seems these companies were given 'an offer they couldn't refuse,' and rather than take a principled stand, they've backed down fearing actual legislation," Mr. Mchangama told the Monitor in May. "And of course, how will global tech companies now be able to resist the inevitable demands from authoritarian states that they also remove content that these countries determine to be 'hateful' or 'extremist'?" 

Others defended the agreement, saying that it sends a powerful message about the values of the tech companies involved. 

"These companies have an important responsibility," Frederick Lawrence, a senior research scholar at Yale Law School, told the Monitor in May, "and they're taking that responsibility seriously."

Six months later, a report commissioned by Ms. Jourová has shown that compliance with the code has not been satisfactory, as the companies reportedly reviewed only 40 percent of the recorded cases in less than 24 hours. 

"After 48 hours the figure is more than 80 percent," said a Commission official, as reported by Reuters. "This shows that the target can realistically be achieved, but this will need much stronger efforts by the IT companies." 

Now, the Commission has said that it may enact laws to force the tech companies to act more quickly. 

"If Facebook, YouTube, Twitter and Microsoft want to convince me and the ministers that the non-legislative approach can work, they will have to act quickly and make a strong effort in the coming months," Jourová told the Financial Times. 

This report contains material from Reuters. 
