Facebook experiment on users: An ethical breach or business as usual?

Many Internet companies collect user data. But privacy experts and Internet users question whether Facebook's 2012 experiment marked a breach of corporate ethics and public trust.

Jeff Chiu/AP/File
A man walks past a Facebook sign in an office on the Facebook campus in Menlo Park, Calif. The British Information Commissioner's Office said Wednesday that it is investigating whether Facebook broke European Union laws on data protection when it conducted a psychological experiment on its users.

It's not yet clear if Facebook broke any laws when it manipulated the news feed content of nearly 700,000 users without their explicit consent to test whether social networks can produce "emotional contagion."

(It turns out, to a modest extent, they can.)

But the uproar after the release of the results of this 2012 study is raising new questions about how pervasive such practices are – and the extent to which they mark a breach of corporate ethics.

While it is generally known that Internet companies such as Facebook, Google, Microsoft, Twitter, and Yahoo claim the right to collect, store, access, and study data on their users, the Facebook experiment appears to be unique.

Not only is the company the largest social network in the world, but the information it accumulates is also highly personal, including user preferences spanning politics, culture, sports, and sexuality, as well as location, schooling, employment, medical, marriage, and dating history. The social network's algorithms are designed to track user behavior in real time – what users click and when.

The Information Commissioner's Office in the United Kingdom announced the launch of an investigation to determine whether Facebook broke data protection laws governed by the European Union. The Federal Trade Commission in the US has not yet said whether it will launch a similar probe. On Thursday, the Electronic Privacy Information Center, a civil liberties advocacy group in Washington, filed a formal complaint with the FTC, urging action.

The experiment, conducted over a week in January 2012, targeted 689,003 users who were not notified that their news feed content was being manipulated to assess their moods in real time. The study determined that an increase in positive content led to users posting more positive status updates; an increase in negative content led to more negative posts.

What alarmed many Internet activists wasn't the use of metadata for a massive study, but rather the manipulation of data to produce a reaction among users, without their knowledge or consent, which they see as a violation of corporate ethics.

“It’s one thing for a company to conduct experiments to test how well a product works, but Facebook experiments are testing loneliness and family connections, and all sorts of things that are not really directed toward providing their users a better experience,” says James Grimmelmann, a law professor and director of the Intellectual Property Program at the University of Maryland Institute for Advanced Computer Studies in College Park.

“These are the kinds of things that never felt part of the bargain until it was called to their attention. It doesn’t match the ethical trade we felt we had with Facebook,” Professor Grimmelmann says.

Many academics studying tech and online analytics worry about the ethics involving mass data collection. A September 2013 survey by Revolution Analytics, a commercial software provider in Palo Alto, Calif., found that 80 percent of data scientists believe in the need for an ethical framework governing how big data is collected.

Facebook leaders expressed remorse but stopped short of apologizing for the experiment, which reports show is just a small portion of the studies the company regularly conducts on its nearly 1 billion users. On Wednesday, Facebook COO Sheryl Sandberg told The Wall Street Journal the study was merely “poorly communicated.... And for that communication, we apologize. We never meant to upset you.”

In response to its critics, Facebook notes that its policy agreements with users say that user data can be used for research. However, the term “research” was added in May 2012, four months after the study took place. Others say the complexity of the tests requires stricter oversight, now that it is known the company has conducted hundreds of similar experiments since 2007 without explicitly notifying the public.

“Burying a clause about research in the terms of use is not in any way informed consent,” says Jenny Stromer-Galley, an associate professor who studies social media at the School of Information Studies at Syracuse University in New York.

"The issue is that people don’t read terms of use documents, and ethical principles mandate that people involved in basic research must be informed of their rights as a participant,” she adds.

Some say Facebook could have avoided the controversy simply by being more transparent and allowing its users to opt out.

Lance Strate, professor of communications and media studies at Fordham University in New York City, says the revelations, which follow many other privacy violations by Facebook, suggest social networks have outlived their purpose because they no longer adhere to the Internet values of “openness, honesty, transparency, and free exchange.”

“With this move, Facebook has violated the essential rules of online culture, and now begins to appear as an outsider much like the mass media industries. It is almost impossible to recover from the breaking of such taboos, and the loss of faith on the part of users. Zuckerberg started out as one of us, but now we find that he is one of them,” Professor Strate says.
