Reddit bans two 'alt-right' communities: Are such bans appropriate?


Robert Galbraith/Reuters/File
Reddit mascots are displayed at the company's headquarters in San Francisco in April 2014. The online forum recently banned two pages dedicated to supporting the so-called alt-right movement, which has roots in white supremacy.

Reddit, the online discussion community with millions of users and visitors, on Wednesday took down two subreddit pages, r/altright and r/alternativeright, communities dedicated to supporting the so-called alt-right movement, whose views espouse a mix of white nationalism and populism.

Reddit, which brands itself as “the front page of the internet,” is a privately owned company that generates revenue through selling ads. This dual role – public forum and private enterprise – creates tensions familiar to other social networking platforms: How do you foster free and open discussion while creating a welcoming space for all users?

“We're certainly entering an age in America and throughout the world where the lines are becoming blurred as to what is a private and what is a more public space,” says Eric Gander, professor of public argument at the City University of New York, in an interview with The Christian Science Monitor.

On Wednesday, Reddit banned the two subreddits for repeatedly posting individuals' personal information. As CNET reports, commenters discussing the ban elsewhere on the site said subreddit users had been "doxxing" people who called for violence against the movement: searching out and publishing someone's personal information with the intent to harass.

“We are very clear in our site terms of service that posting of personal information can get users banned from Reddit and we ask our communities not to post content that harasses or invites harassment,” the company said in a statement to the Daily Beast. “There is no single solution to these issues and we are actively engaging with the Reddit community to improve everyone's experience.”

But many observers noted that this was an unusually political move for a site that bills itself as "the proud home to some of the most authentic conversations" and that, until recently, rarely banned users over the content of their posts.

The senior moderator of r/altright, who calls himself Bill Simpson, said the decision was made without notice or explanation.

“So much for leftist tolerance,” Mr. Simpson told the Daily Beast. “Our moderator team enforced stricter standards of behavior than Reddit requires, and our users were very prompt at reporting violations so we could ban violators and delete posts and comments that broke the rules.”

But Jean Cohen, a professor of political thought at Columbia University, tells the Monitor that if Reddit users were doxxing opponents, they were themselves practicing intolerance.

“That's a question of privacy, not a question of hate speech,” she tells the Monitor. “Posting their private information is an attempt to control their speech.... They are the one who is suppressing speech by actually revealing people's addresses or personal information; there is this trend to make them afraid.”

As David Frum, a senior editor at The Atlantic and a former speechwriter for President George W. Bush, wrote in a recent cover story for the site: 

I’ve talked with well-funded Trump supporters who speak of recruiting a troll army explicitly modeled on those used by Turkey’s Recep Tayyip Erdoğan and Russia’s Putin to take control of the social-media space ... In a society where few people walk to work, why mobilize young men in matching shirts to command the streets? If you’re seeking to domineer and bully, you want your storm troopers to go online, where the more important traffic is. 

Reddit's announcement was applauded by the Southern Poverty Law Center (SPLC), which has previously criticized the site for harboring racist and hateful discussions.

“I'm glad that Reddit is starting to take these things seriously,” Heidi Beirich, the director of SPLC’s Intelligence Project, tells the Monitor. “The doxxing that's been occurring is particularly vicious.… It’s been becoming an increasing tactic by white supremacists to punish what they consider to be their enemies.”

Reddit’s decision comes two months after Twitter terminated the accounts of several alt-right leaders for inciting hate speech.

Those two moves by the social media giants to curb hateful voices should damage the movement, says CUNY's Professor Gander, but they could also trigger backlash on other channels.

"It will clearly have an impact in the sense that they will not be able to as easily get their message out," Gander told the Monitor in November.

Twitter and Reddit offered different justifications for their bans. Twitter's stated reason – that its users were engaging in hate speech – is far harder to pin down than harassment or doxxing, conduct that can carry legal consequences.

“Twitter can define that however they want to, but it is extraordinarily hard to define what counts as hate speech. With respect to disclosure of personal information, there are actual laws on privacy that allow one to sue in state court for a tort law violation,” Gander says. “In that case, Reddit may have just been somewhat cautious on their part … Of course there are no enforceable laws against engaging in hate speech.”

The real challenge is more fundamental than that, says Columbia's Professor Cohen.

“The problem with the Internet is it doesn't have a filter. It doesn't promote thoughtfulness, unfortunately. And so it is a real dilemma that we face as a society – how to handle this,” she says. “The dilemma is where to draw the lines and who should provide the guidelines."
