Free speech vs. hate speech: How Reddit navigates the crosscurrents

Reddit has announced new measures to crack down on abusive users. Will that reduce hate speech, or hinder expression on one of the internet's freest platforms?

Robert Galbraith/Reuters/File
Reddit mascots are displayed at the company's headquarters in San Francisco.

After stumbling into a controversy involving harassing comments last week, Reddit has announced that it will punish users who engage in hateful, abusive speech on the platform, marking a move away from the largely uninhibited, and controversial, space for speech the site has cultivated.

Online communities and social media sites have struggled to define their responsibility in moderating speech and content on their platforms while also trying to create spaces for controversial opinions that have little chance to survive in mainstream media settings. While sites like Facebook have been criticized for heavy-handed censorship and the alleged promotion of certain viewpoints over others, other platforms such as Twitter have suffered the consequences of not curtailing online harassment on their sites.

Reddit, which began as a news aggregation site seeking to become the internet’s “front page,” has set a precedent for being a freer platform than most for anonymous communication, allowing its users to air controversial opinions and connect with one another. But just over a decade after the site’s launch, an unpredictable online sphere is forcing executives to grapple with the company’s identity.

“It’s a constant dilemma that we face on the internet,” Jessie Daniels, a sociology professor at Hunter College in New York who specializes in racism on the internet, tells The Christian Science Monitor. “Part of what the digital era has done is it’s raised new questions about the line between free speech and hate speech.”

And that line is an important one to define: speech dismissed as offensive but meaningless online utterance has the potential to spill over into the real world, leading to hate crimes and violence, she says.

“What you find at places like Reddit, is not what we would say about controversial ideas,” Dr. Daniels adds. “It’s about attacking people and attacking people in a really vicious way that often bleeds over into attacks in the material world and harming people.”

Last week, Reddit's chief executive officer, Steve Huffman, came under fire for editing users' comments that criticized him with abusive language, replacing his username in those comments with the usernames of prominent members of a pro-Donald Trump subreddit. He said the idea started as a joke, and he hoped to show users how it feels when abusive posts are made involving their usernames. He later apologized, admitting the joke was done in bad taste and jeopardized the integrity and authenticity of speech on the site.

“I understand what I did has greater implications than my relationship with one community, and it is fair to raise the question of whether this erodes trust in Reddit,” he wrote in an announcement Wednesday. “I hope our transparency around this event is an indication that we take matters of trust seriously.”

Moving forward, Reddit plans to officially crack down on the site’s “most toxic users,” and has already identified swathes of such accounts and taken initial action against them. These include warnings, timeouts, and permanent bans, Mr. Huffman said.

Daniels says those kinds of approaches are some of the most effective ways to rid platforms of abusive users, but notes that they are largely labor-intensive and expensive. Companies have tried to develop automated filters that censor obscene language, but human users have often outsmarted the algorithms, forcing sites that want to foster civil discourse to aggressively monitor posts.

Huffman’s announcement on the subreddit r/The_Donald said that posts within the group would no longer appear on the popular r/all listing where users from many different backgrounds engage.  

“The sticky feature was designed for moderators to make announcements or highlight specific posts,” he wrote. “It was not meant to circumvent organic voting, which r/the_donald does to slingshot posts into r/all, often in a manner that is antagonistic to the rest of the community.”

Huffman’s “joke,” and his latest announcement, have received mixed reviews from both r/The_Donald members and users on the other end of the political spectrum. While some found the editing amusing, many said it hindered their trust in posts’ authenticity.

“Again, I am sorry for the trouble I have caused,” Huffman wrote. “While I intended no harm, that was not the result, and I hope these changes improve your experience on Reddit.”

But many more have applauded his latest efforts to ban the harassment and spamming that go on as a result of the uninhibited atmosphere on the site, and experts say such moves can help reduce actual hate crimes committed in public.

In the wake of President-elect Trump’s unexpected election victory, more than 800 hate crimes were committed in nearly every state in just 10 days, according to an aggregated accounting compiled by the Southern Poverty Law Center. Many targeted minority groups or bore Trump’s name in graffiti, and some observers suggested that Trump’s harsh rhetoric may have spurred the acts.

“The conclusion that we can draw from that is that words have consequences,” Daniels says. “And I think that’s something that we really have to pay attention to.”

While some may think online threats levied by an anonymous user pose no danger to the person on the receiving end, the expression and normalization of such hatred can still lead to random acts of violence in the name of hate, Daniels says.

“The fact is that we know [hate speech online] creates an environment where some may read that and not get that it’s trolling and see it as a license to go out and assault someone,” she adds, noting that the issue becomes one of weighing free speech over protecting people from physical harm.

“That’s part of the human calculation as people who care about civil society and care about democracy: What is the balance between free speech, and speech that actually harms people?”

