Secret Facebook experiment sparks controversy

A controversial study undertaken two years ago had researchers manipulating users' Facebook news feeds to determine how people are emotionally affected by content they view on social networks. 

[Photo: Ben Margot/AP/File. A sign at Facebook headquarters in Menlo Park, Calif. The social media company came under fire this past weekend for a controversial study that manipulated users' news feeds.]

Facebook came under fire this past weekend for allowing researchers to manipulate users' news feeds to determine whether emotions on social networks are "contagious." The study found that they can be. But what cost did users pay to get this answer?

That's the question being tossed around the Web as everyone from reporters to academics to casual users chimes in with responses to what some say amounts to a shocking transgression of user privacy. 

Here are the facts of the experiment: 

For one week in January 2012, a Facebook data scientist and two university researchers manipulated the news feeds of 689,003 randomly selected Facebook users. The goal was to find out whether exposure to positive content triggered positive or negative reactions in recipients, and likewise for exposure to negative content. To that end, certain users' feeds were deliberately skewed toward a disproportionate amount of positive or negative content. When people saw more positive content, they tended to respond less negatively; conversely, when they saw more negative content, they reacted less positively. These findings ran counter to the prevailing notion that exposure to, say, lots of positive content triggers negative reactions. The study's findings were published June 17 in the Proceedings of the National Academy of Sciences.
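
According to the published paper, posts were classified as positive or negative by counting emotion words with the LIWC text-analysis tool. The snippet below is a minimal sketch of that general word-count approach; the tiny lexicons and the single-label rule are hypothetical illustrations, not the study's actual word lists or classification scheme.

```python
# Minimal sketch of LIWC-style word-count sentiment classification.
# The small lexicons here are hypothetical stand-ins for LIWC's
# much larger positive/negative emotion word lists.
POSITIVE = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE = {"sad", "awful", "hate", "terrible", "angry"}

def classify_post(text: str) -> str:
    """Label a post by the emotion words it contains. The study
    counted positive and negative words independently; this sketch
    simplifies to a single label per post for illustration."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    has_pos = bool(words & POSITIVE)
    has_neg = bool(words & NEGATIVE)
    if has_pos and has_neg:
        return "mixed"
    if has_pos:
        return "positive"
    if has_neg:
        return "negative"
    return "neutral"

print(classify_post("Had a great day, so happy!"))  # positive
print(classify_post("This traffic is awful."))      # negative
```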

The trouble is that nobody knew they were participating in the experiment, being used as "lab rats," as AnimalNewYork noted in a blog post Friday morning. 

Needless to say, revelations of this experiment have incensed privacy advocates at a time when suspicions of government surveillance and of technology companies' big-data collection are running high. Some observers, however, have been remarkably blasé about the matter.

According to its terms of service, Facebook uses information it receives about users "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

Many have been shocked that Facebook believes this amounts to informed consent on the part of the participants. In theory, every Facebook user has given consent simply by using the site and thereby agreeing to this policy, which is why the ethics of the experiment have been called into question. While an independent committee did in fact approve the experiment as ethical, the committee seems to have been "only consulted about the methods of data analysis ... and not those of data collection," according to The Atlantic.

"I was concerned, until I queried the authors and they said their local institutional review board had approved it – and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time," Susan Fiske, a Princeton University psychology professor who edited the study for publication, told The Atlantic

Ms. Fiske's sentiment seems to be echoed by James Grimmelmann, a professor of technology and law at the University of Maryland.

"Facebook knows it can push its users' limits, invade their privacy, use their information and get away with it," Mr. Grimmelmann told Bloomberg. "Facebook has done so many things over the years that scared and freaked out people." 

Psychology professor James Pennebaker at the University of Texas expressed a similarly nonchalant attitude. "It will make people a little bit nervous for a couple of days," he told Bloomberg. "The fact is, Google knows everything about us, Amazon knows a huge amount about us. It's stunning how much all of these big companies know. If one is paranoid, it creeps them out."

Still, many are worried, taking to Twitter to vent their frustration.

On Sunday afternoon, Adam Kramer, one of the study's authors and a Facebook employee, said in a public Facebook post that the experiment had little effect on users, noting, "At the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it." But he did apologize for the alarm the experiment's methods caused, writing, "I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety."
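
Mr. Kramer's phrase "the minimal amount to statistically detect it" points to a general property of very large samples: with hundreds of thousands of users per condition, even a minuscule average shift clears the bar for statistical significance. The sketch below runs a two-sample z-test on invented numbers at roughly the study's scale; the means and standard deviation are hypothetical, not the study's actual figures.

```python
import math

# Hypothetical numbers at roughly the study's scale: two groups of
# ~345,000 users whose posts average 5.0% vs. 4.9% positive words,
# a difference of one tenth of a percentage point.
n1 = n2 = 345_000
mean1, mean2 = 0.050, 0.049
sd = 0.05  # assumed standard deviation of per-user rates

# Two-sample z statistic for the difference in means.
se = sd * math.sqrt(1 / n1 + 1 / n2)
z = (mean1 - mean2) / se
print(f"z = {z:.1f}")  # ~8.3, far beyond the usual 1.96 cutoff
```

With samples this large, a tiny effect can be real in the statistical sense while remaining negligible for any individual user, which is the distinction Kramer was drawing.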

As for the greater debate surrounding the Internet and privacy, this news comes just after Google began removing certain people's data from search results in the European Union under the so-called "right to be forgotten" ruling.
