Secret Facebook experiment sparks controversy

A controversial study conducted two years ago had researchers manipulating users' Facebook news feeds to determine how people are emotionally affected by the content they view on social networks.

A sign at Facebook headquarters in Menlo Park, Calif. The social media company came under fire this past weekend for a controversial study that manipulated users' news feeds. (Photo: Ben Margot/AP/File)

July 1, 2014

Facebook came under fire this past weekend for allowing researchers to manipulate users' news feeds to determine whether emotions on social networks are "contagious." The study found that they can be. But what cost did users pay to get this answer?

That's the question being tossed around the Web as everyone from reporters to academics to casual users chimes in with responses to what some say amounts to a shocking transgression of user privacy. 

Here are the facts of the experiment: 

For one week in January 2012, a Facebook data scientist and two university researchers manipulated the news feeds of 689,003 randomly selected Facebook users. The goal was to find out whether exposure to positive content triggered positive or negative reactions in recipients, and likewise for exposure to negative content. To that end, certain users' feeds deliberately received a disproportionate amount of positive or negative content. When people saw more positive content, they tended to respond less negatively; when they saw more negative content, they reacted less positively. These findings went against the prevailing notion that exposure to, say, lots of positive content triggers negative reactions. The findings were published June 17 in the Proceedings of the National Academy of Sciences.
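The paper gauged emotion by counting positive and negative words in the status updates users wrote after their feeds were altered (the study used the LIWC word-counting software). Below is a minimal sketch of that style of analysis in Python, with hypothetical word lists and toy data standing in for LIWC and Facebook's internal datasets:

```python
# Sketch of a word-count sentiment comparison between two groups of posts.
# The word lists and sample posts are hypothetical placeholders; the actual
# study relied on the LIWC dictionaries and Facebook's internal data.
from scipy import stats

POSITIVE_WORDS = {"happy", "great", "love", "wonderful"}   # hypothetical list
NEGATIVE_WORDS = {"sad", "awful", "hate", "terrible"}      # hypothetical list

def positive_rate(post: str) -> float:
    """Fraction of a post's words that appear in the positive-word list."""
    words = post.lower().split()
    if not words:
        return 0.0
    return sum(w in POSITIVE_WORDS for w in words) / len(words)

# Toy data: posts by users whose feeds had negative content reduced,
# versus posts by an unaltered control group.
treatment_posts = ["what a wonderful day", "love this", "feeling great"]
control_posts = ["so sad today", "this is awful", "feeling fine"]

treatment_rates = [positive_rate(p) for p in treatment_posts]
control_rates = [positive_rate(p) for p in control_posts]

# Two-sample t-test: did the treatment group use positive words more often?
t_stat, p_value = stats.ttest_ind(treatment_rates, control_rates)
print(f"treatment mean={sum(treatment_rates)/len(treatment_rates):.3f}, "
      f"control mean={sum(control_rates)/len(control_rates):.3f}, p={p_value:.3f}")
```

The study's published effect sizes were very small; what made them detectable at all was the enormous sample, a point that resurfaces in Facebook's response below.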

The trouble is that none of these users knew they were participating in the experiment, being used as "lab rats," as AnimalNewYork noted in a blog post Friday morning.

Needless to say, revelations of this experiment have incensed privacy advocates at a time when suspicions of government surveillance and of technology companies' big-data collection are running high. However, some have been remarkably blasé about the matter.

According to its terms of service, Facebook uses information it receives about users "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

Many have been shocked that Facebook believes this amounts to informed consent on the part of the participants. In theory, every Facebook user has given consent by using the site and thereby agreeing to this policy, which is why the ethics of the experiment have been called into question. While an independent committee did in fact approve the experiment as ethical, the committee seems to have been "only consulted about the methods of data analysis ... and not those of data collection," according to The Atlantic.

"I was concerned, until I queried the authors and they said their local institutional review board had approved it – and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time," Susan Fiske, a Princeton University psychology professor who edited the study for publication, told The Atlantic

Ms. Fiske's sentiment seems to be echoed by James Grimmelmann, a professor of technology and law at the University of Maryland.

"Facebook knows it can push its users' limits, invade their privacy, use their information and get away with it," Mr. Grimmelmann told Bloomberg. "Facebook has done so many things over the years that scared and freaked out people." 

Psychology professor James Pennebaker at the University of Texas expressed a similarly nonchalant attitude. "It will make people a little bit nervous for a couple of days," he told Bloomberg. "The fact is, Google knows everything about us, Amazon knows a huge amount about us. It's stunning how much all of these big companies know. If one is paranoid, it creeps them out."

Still, many users are worried and have taken to Twitter to vent their frustration.

On Sunday afternoon, Adam Kramer, one of the study's authors and a Facebook employee, said in a public Facebook post that the experiment had little effect on users, noting, "At the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it." But he did apologize for the alarm the experiment's methods caused, writing, "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."
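Mr. Kramer's phrase "the minimal amount to statistically detect it" reflects a basic property of large samples: the standard error of a difference between group means shrinks with the square root of the sample size, so at Facebook's scale even tiny shifts in word usage register as statistically significant. A back-of-envelope illustration, using hypothetical numbers rather than figures from the study:

```python
# Rough illustration of why tiny effects are detectable at large sample
# sizes. The standard deviation and group sizes below are hypothetical,
# not values taken from the study.
import math

def detectable_difference(std_dev: float, n_per_group: int, z: float = 1.96) -> float:
    """Smallest difference in group means distinguishable from zero
    at roughly 95% confidence, assuming two equal-sized groups."""
    standard_error = std_dev * math.sqrt(2.0 / n_per_group)
    return z * standard_error

sd = 0.05  # hypothetical std. dev. of per-user positive-word rate
for n in (1_000, 100_000, 344_000):  # ~344k per arm roughly splits 689,003 in two
    print(f"n={n:>7,}: detectable shift ~ {detectable_difference(sd, n):.5f}")
```

With groups in the hundreds of thousands, an effect far smaller than anything a 1,000-person study could see still clears the significance bar, which is how an impact can be both "minimal" and real.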

In the broader debate over Internet privacy, this controversy comes shortly after Google began removing certain people's data from search results in the European Union under the so-called "right to be forgotten" ruling.