Sheryl Sandberg apologizes for Facebook experiment, sort of

Sheryl Sandberg apologized in a TV interview even as British regulators look into Facebook's controversial psychology experiment. "Facebook cannot control emotions of users," she said.

British regulators are investigating revelations that Facebook treated hordes of its users like laboratory rats in an experiment probing their emotions.

The Information Commissioner's Office said Wednesday that it wants to learn more about the circumstances underlying a 2-year-old study carried out by two U.S. universities and the world's largest social network.

The inquiry is being coordinated with authorities in Ireland, where Facebook has headquarters for its European operations, as well as with French regulators.

Sheryl Sandberg, Facebook's chief operating officer, told television network NDTV in India that "we clearly communicated really badly about this and that we really regret." Later she added: "Facebook has apologized and certainly we never want to do anything that upsets users."

Sandberg added that "Facebook cannot control emotions of users. Facebook will not control emotions of users."

The words of contrition sounded hollow to Jeff Chester, executive director of the Center for Digital Democracy, a privacy-rights group. He pointed to Facebook job openings seeking researchers who specialize in data mining and analysis as evidence that the company still has every intention of digging deeper into its users' psyches and preferences.

"They are engaged in secret surveillance of its users to figure out how to make more money for their advertisers," Chester said.

This Facebook psych experiment is just the latest in a string of incidents that have raised questions about whether the privacy rights of Facebook's nearly 1.3 billion users are being trampled by the company's drive to dissect data and promote behavior that could help sell more online advertising.

In this case, Facebook allowed researchers to manipulate the content that appeared in the main section, or "news feed," of about 700,000 randomly selected users during a single week in January 2012. The data scientists were trying to gather evidence for their thesis that people's moods could spread like an "emotional contagion," depending on the tenor of the content they were reading.

The study concluded that people were more likely to post negative updates about their lives after the researchers purposefully reduced the volume of positive information appearing in their Facebook feeds. The opposite reaction occurred when the volume of negative posts in people's news feeds was reduced.

None of the participants in the Facebook experiment were explicitly asked for their permission, though the social network's terms of use appear to allow the company to manipulate what appears in users' news feeds however it sees fit.

As The Christian Science Monitor reported:

According to its terms of service, Facebook uses information it receives about users "for internal operations, including troubleshooting, data analysis, testing, research and service improvement."

Many have been shocked that Facebook believes this amounts to informed consent on the part of the participants. In theory, every Facebook user has given their consent by using the site and, therefore, agreeing to this policy.

The reaction to the study itself provided evidence of how quickly an emotional contagion can spread online. The research was released a month ago, but it didn't provoke a backlash until the past few days, after posts on other social media sites and essays in The New York Times and The Atlantic raised red flags about the ethics of Facebook's experiment.

As it has done in several past breaches of privacy etiquette, Facebook is now trying to make amends.

Whatever Facebook has been doing has been paying off for the company and its shareholders. Facebook's revenue last year rose 55 percent to $7.9 billion and its stock has nearly tripled in value during the past year.

The concern over Facebook's experiment comes amid interest in Europe in beefing up data-protection rules. The European Court of Justice last month ruled that Google must respond to users' requests to remove links to personal information about them.

Suzy Moat, a Warwick Business School assistant professor of behavioral science, said businesses regularly do studies on how to influence behavior. She cited the example of Facebook and Amazon experimenting with showing different groups of people slightly different versions of their websites to see if one is better than another at getting customers to buy products.

"On the other hand, it's extremely understandable that many people are upset that their behavior may have been manipulated for purely scientific purposes without their consent," Moat said. "In particular, Facebook's user base is so wide that everyone wonders if they were in the experiment."

___

Liedtke reported from San Francisco. Mae Anderson in New York contributed to this report.

Copyright 2014 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.
