Does Telegram's blocking of ISIS reveal a changing definition of free speech online?

As the messaging app reverses course and says it will block accounts used by the Islamic State in the wake of the deadly shootings in Paris, a new report shows that in some countries, many people support the right to make statements critical of religion online. But how social media sites moderate content remains unclear, free speech advocates say.

A man poses with a smartphone in front of a screen showing the Telegram logo in this picture illustration taken in Zenica, Bosnia and Herzegovina on Wednesday. The messaging app has blocked accounts it said were associated with the Islamic State in the wake of killings last week in Paris, but a new report finds a complex debate over the limits of free speech online.

Dado Ruvic/Reuters

November 19, 2015

Amid a swirling debate on Internet censorship and encryption in the wake of the deadly attacks in Paris, mobile messaging app Telegram announced on Wednesday that it was suspending a series of accounts it said were used by the Islamic State (ISIS) to broadcast its messages publicly.

The move represents a seeming about-face for Berlin-based Telegram, which provides secure messaging. The service had set itself apart from US-based sites such as Facebook and Twitter by aggressively proclaiming its users’ right to privacy and its opposition to government censorship.

In September, Pavel Durov, the site’s Russian-born creator, had confirmed that he knew users associated with ISIS were using Telegram’s public broadcast feature, but said he didn’t feel the communications represented a threat.


“Ultimately, ISIS will always find a way to communicate within themselves. And if any means of communication turns out to be not secure for them, they’ll just switch to another one,” he told the site TechCrunch in an interview. “I don’t think we should feel guilty about it. I still think we’re doing the right thing, protecting our users’ privacy.”

In the wake of the recent attacks in Paris, for which ISIS has claimed responsibility, Telegram appears to have reversed course, announcing that it had blocked 78 accounts associated with the Islamic State.

The New York Times compared the gesture to a famous line by a character played by Claude Rains in the film "Casablanca," announcing that he was “shocked, shocked” to find gambling in Rick’s Cafe, just before collecting his own winnings.

But it was difficult to know how Mr. Durov — best known as the founder of VKontakte, Russia’s most popular social network — really felt.

After the Paris attacks, he responded by posting a brooding picture of himself in front of the Eiffel Tower on Instagram, along with a message that criticized “shortsighted socialists” who spend French taxpayers’ money on “waging useless wars in the Middle East and on creating [a] parasitic social paradise for North African immigrants.”


A supporter of former NSA contractor Edward Snowden who has described his political views as "libertarian," Durov is something of a lightning rod on issues of online speech, having become a Russian exile following his dismissal as VK’s chief executive officer. He told the Financial Times that he developed the idea for Telegram after a 2011 confrontation with armed police who attempted to storm his apartment in St. Petersburg.

Now, Telegram says that while it is blocking accounts linked to the Islamic State, it will continue to refuse government-backed requests for censorship.

Amid ongoing tensions over whether states in the US should accept Syrian refugees and a wave of xenophobic online posts directed at Muslims in the US and Europe, it appears this commitment to free speech – even when it may seem offensive or even encourage violence – is not uncommon.

A Pew Research Center report released on Wednesday found that in the US, 95 percent of those polled support the right of people to make statements that criticize government policies, as do 89 percent of those in France.

The report also found that 67 percent in the US supported public statements that “are offensive to minority groups,” compared to 51 percent in France and 27 percent in Germany, which has stricter laws on hate speech. In Lebanon, by contrast, only 1 percent would support such statements.

The Pew report found that 77 percent of those in the US say they would support statements that are offensive to their own religion or beliefs, compared to 57 percent in the UK and 53 percent in France.

But statements that supported violent protest received a more mixed response: 44 percent of those in the US would support them, while support in Europe hovered at a median of 30 percent. In the six Middle Eastern countries surveyed, the median was 15 percent, with the largest measure of support for statements of violent protest coming from the Palestinian territories, at 30 percent.

The group’s survey of 38 countries was conducted this spring, before the attacks last week in Paris, but after the January shootings at the satirical French publication Charlie Hebdo and a kosher supermarket, which raised questions about a growing tide of xenophobia against Muslims and immigrants in France.

The killings last week appear to have created similar tensions, with many Muslims in the US saying they are bracing for a backlash – especially on social media, where such messages can spread quickly.

But free speech advocates have frequently argued that the criteria used to moderate posts on sites such as Facebook and Twitter are often unclear, with posts containing political debate, nudity, or LGBT content often arbitrarily censored while other posts containing hate speech remain online.

In a bid to improve transparency about how social media sites moderate posts, the Electronic Frontier Foundation and Visualizing Impact, an organization that creates data visualizations on social issues, introduced a new site called onlinecensorship.org on Thursday.

The site will allow users to report posts that have been taken down on Facebook, Google+, Twitter, Instagram, Flickr, and YouTube, analyzing and examining trends to determine why particular types of posts are removed. The goal, the site’s founders say, is to improve transparency and examine how removals of online content affect particular groups of users.

“It’s important companies understand that, more often than not, the individuals and communities most impacted by online censorship are also the most vulnerable,” says Ramzi Jaber, the co-founder of Visualizing Impact and a co-founder of the new site, in a statement.

Mr. Jaber and co-founder Jillian C. York, Director for International Freedom of Expression at EFF, noted that the site would also include guides explaining how to navigate the frequently complex process of appealing a content takedown on each site.

“We hope that companies will respond to the data by improving their regulations and reporting mechanisms and processes—we need to hold Internet companies accountable for the ways in which they exercise power over people’s digital lives,” Ms. York says in the statement.