Pulse victims lawsuit: Did social media provide 'material support' for terrorism?

Families of three victims killed last summer in a mass shooting in Orlando, Fla., accused Facebook, Twitter, and Google of violating US anti-terrorism law.

Artwork and signatures cover a fence Nov. 30 around Pulse, the LGBT nightclub and site of a mass shooting last summer, in Orlando, Fla. (John Raoux/AP/File)

December 20, 2016

Lawyers representing the families of three people killed in last summer's mass shooting at the Pulse nightclub in Orlando, Fla., filed suit Monday against Facebook, Twitter, and Google, accusing the three companies of violating US anti-terrorism law by failing to halt terrorist recruiters on their social media platforms.

The allegation – that all three companies "purposefully, knowingly or with willful blindness" provided "material support" to a foreign terrorist organization whose social media propaganda inspired gunman Omar Mateen to target the LGBT club on Latin night, killing 49 people – will be tough to prove.

"Their biggest success is going to be in the national media and causing embarrassment to these providers, because it is true that you have jihadi propaganda that flies across Twitter and flies across the internet," Jeffrey Addicott, the director of the Center for Terrorism Law at St. Mary's University School of Law in San Antonio, Texas, tells The Christian Science Monitor in a phone interview Tuesday.


Professor Addicott says the case boils down to a First Amendment dilemma, and suggests a free society must defend its principles of free speech and reject the lawsuit's "novel" take on the law: "It's a battle between increased security and civil liberties," he says, arguing in favor of the latter.

Perhaps the most pronounced legal hurdle for the lawsuit comes in Section 230 of the Communications Decency Act, which shields "interactive computer services" from liability for the content they transmit.

"Section 230 is a free pass to online service providers as long as they act only as a pass-through," Mark Bartholomew, a professor at the University of Buffalo School of Law, told FoxNews.com, which first reported on the lawsuit. "If you set up a place for people to talk, but don't communicate on it yourself, then you are basically immune from prosecution."

If the plaintiffs' argument were to prevail in court, it would constitute "the first crack at making these companies liable for what shows up on our feeds," Professor Bartholomew added.

Although it seems the US Department of Justice has never filed criminal or civil charges accusing a social media platform of providing "material support" to terrorists, a number of private parties have pursued civil action in recent years, legislative attorney Kathleen Ann Ruane noted in a Congressional Research Service report in September. Only one case has resulted in a ruling with regard to Section 230 as it applies to social media companies and terrorists, Ms. Ruane writes:


In Fields v. Twitter, the plaintiffs, family members of United States government contractors that were killed in a terrorist attack in Jordan, alleged that the Islamic State organization used Twitter to spread propaganda, raise funds, and attract recruits, and that Twitter knowingly permitted such use of its services. The plaintiffs further alleged that Twitter’s provision of these services generally caused their injuries. However, the plaintiffs did not specifically allege that the Islamic State used Twitter to recruit the person who committed the terrorist attack that injured the plaintiffs or that Twitter was used to plan the attack. Instead, the plaintiffs alleged only that the attacker was generally inspired by propaganda that he had seen on Twitter.

That case, which was dismissed by a federal judge in the US District Court for the Northern District of California, suggests the courts could be inclined to keep applying Section 230 to lawsuits brought against online companies that display third-party content, Ruane wrote.

In the Pulse case, however, the plaintiffs plan to pursue a new argument: that by matching users' content with targeted advertising, the social media companies are, in effect, creating new content of their own, and that they profit from terrorists' posts just as they do from all others.

Facebook does not tolerate terrorism or content that endorses terrorism, a company spokesperson tells the Monitor. When such material is reported on its platform, Facebook removes it – a fact acknowledged by the Pulse nightclub plaintiffs' complaint. Twitter and Google did not immediately respond to a request for comment.

Despite such efforts to enforce anti-terrorism clauses in their terms of service, the defendants should be held accountable for failing to more effectively disrupt international recruitment, the plaintiffs argue.

"Social media companies continue to allow terrorists to operate, despite reasonable steps that could be undertaken to stop them," Keith Altman, an attorney with 1-800-LAW-FIRM in Southfield, Mich., who is representing the families of Pulse victims Tevin Crosby, Juan Ramon Guerrero, and Javier Jorge-Reyes, said in a statement. "The defendants not only continue to let terror groups like ISIS use their sites to tout hate and plan attacks, they also profit from it."

This is not the firm's first case against the trio of technology companies. Mr. Altman also represents the family of Nohemi Gonzalez, a student killed in last year's terrorist attacks in Paris, in a similar lawsuit, as USA Today reported.

The lawsuit seeks unspecified monetary damages and an order declaring the companies in ongoing violation of federal anti-terrorism law.