Facebook experiment on users: An ethical breach or business as usual?

Many Internet companies collect user data. But privacy experts and Internet users question whether Facebook's 2012 experiment marked a breach of corporate ethics and public trust.

Jeff Chiu/AP/File
A man walks past a Facebook sign in an office on the Facebook campus in Menlo Park, Calif. The British Information Commissioner's Office said Wednesday that it is investigating whether Facebook broke European Union laws on data protection when it conducted a psychological experiment on its users.

It's not yet clear if Facebook broke any laws when it manipulated the news feed content of nearly 700,000 users without their explicit consent to test whether social networks can produce "emotional contagion."

(It turns out, to a modest extent, they can.)

But the uproar after release of the results of this 2012 study is raising new questions on how pervasive such practices are – and the extent to which they mark a breach of corporate ethics.

While it is generally known that Internet companies such as Facebook, Google, Microsoft, Twitter, and Yahoo claim the right to collect, store, access, and study data on their users, the Facebook experiment appears to be unique.

Not only is the company the largest social network in the world, the kind of information it accumulates is highly personal, including user preferences spanning politics, culture, sport, sexuality, as well as location, schooling, employment, medical, marriage, and dating history. The social network algorithms are designed to track user behavior in real time – what they click and when.

The Information Commissioner's Office in the United Kingdom announced the launch of an investigation to determine whether Facebook broke data protection laws governed by the European Union. The Federal Trade Commission in the US has not yet said whether it will launch a similar probe. On Thursday, the Electronic Privacy Information Center, a civil liberties advocacy group in Washington, filed a formal complaint with the FTC, urging action.

The experiment, conducted over a week in January 2012, targeted 689,003 users who were not notified that their news feed content was being manipulated to assess their moods in real time. The study determined that an increase in positive content led to users posting more positive status updates; an increase in negative content led to more negative posts.

What alarmed many Internet activists wasn't the use of metadata for a massive study, but rather the manipulation of data to produce a reaction among users, without their knowledge or consent, which they see as a violation of corporate ethics.

“It’s one thing for a company to conduct experiments to test how well a product works, but Facebook experiments are testing loneliness and family connections, and all sorts of things that are not really directed toward providing their users a better experience,” says James Grimmelmann, a law professor and director of the Intellectual Property Program at the University of Maryland Institute for Advanced Computer Studies in College Park.

“These are the kinds of things that never felt part of the bargain until it was called to their attention. It doesn’t match the ethical trade we felt we had with Facebook,” Professor Grimmelmann says.

Many academics studying tech and online analytics worry about the ethics of mass data collection. A September 2013 survey by Revolution Analytics, a commercial software provider in Palo Alto, Calif., found that 80 percent of data scientists believe in the need for an ethical framework governing how big data is collected.

Facebook leaders expressed remorse, but they stopped short of apologizing for the experiment, which reports show reflects just a small portion of the studies that the company regularly conducts on its nearly 1 billion users. On Wednesday, Facebook COO Sheryl Sandberg told The Wall Street Journal the study was merely “poorly communicated.... And for that communication, we apologize. We never meant to upset you.”

In response to its critics, Facebook notes that policy agreements with users say that user data can be used for research. However, the term “research” was added in May 2012, four months after the study took place. Others say the complexities of the tests require stricter oversight, now that it is known the company has been conducting hundreds of similar experiments since 2007 without explicitly notifying the public.

“Burying a clause about research in the terms of use is not in any way informed consent," says Jenny Stromer-Galley, an associate professor who studies social media at the School of Information Studies at Syracuse University in New York.

"The issue is that people don’t read terms of use documents, and ethical principles mandate that people involved in basic research must be informed of their rights as a participant,” she adds.

Some say Facebook could have avoided the controversy simply by providing more transparency and allowing its users to opt out.

Lance Strate, professor of communications and media studies at Fordham University in New York City, says that the revelations, which are among many such privacy violations for Facebook, suggest social networks have outlived their purpose because they no longer adhere to the Internet values of “openness, honesty, transparency, and free exchange.”

“With this move, Facebook has violated the essential rules of online culture, and now begins to appear as an outsider much like the mass media industries. It is almost impossible to recover from the breaking of such taboos, and the loss of faith on the part of users. Zuckerberg started out as one of us, but now we find that he is one of them,” Professor Strate says.
