Is Facebook to blame for making us more polarized? No, we are.

Critics have worried that the algorithm Facebook uses to determine what users see could be creating 'bubbles' that allow us to see only what we agree with. A new study finds that users are driving the trend more than Facebook itself.

Facebook users do more to seal themselves within their own political news and opinion bubbles than the social media site's algorithms do, according to a study published Thursday in the journal Science Express.

A quick Google search for the social-media giant Facebook turns up a range of provocative questions: Is Facebook making us lonely? Is Facebook losing its cool? Is Facebook dying?

Scientists at Facebook have added another: Is Facebook reinforcing ideological bubbles that users build around themselves?

Their short answer is: yes. But the effect is small compared with the contributions users themselves make. Users build those bubbles through their choice of "friends," what those friends share, and how often they open links to news or opinion material offering views that run counter to their own.

On one level, the results, published Thursday in the online journal Science Express, suggest that for now, social media and their complex, user-focused algorithms aren't to blame for the nation's growing political polarization.

That polarization is a trend many political and information scientists see as a threat to a well-oiled democracy, which relies on people with competing ideologies working together toward shared goals. The study reinforces the observation that people are bringing to the virtual world their real-world tendencies to surround themselves with people who think like they do.

On another level, however, the small internal effect the researchers detected from Facebook's algorithm should raise warning flags, says David Lazer, a political scientist at Northeastern University who focuses in part on the impact of the internet on politics and was not a member of the study team.

"There's nothing in the algorithm that says: Let's polarize America," he says. But "the simple rules that might make content more engaging may also result in this kind of bubble."

He notes that Facebook recently tweaked its algorithm, in part to make sure a user sees more material from people the user identifies as close friends.

"Close friends are probably more similar to you in many ways than your distant acquaintances. So it's quite plausible that the change will have the unintended consequence" of further narrowing the range of perspectives that enter a user's news feed, he says.

The new study grew out of surprising results in previous work, which looked at how users got their information on Facebook, says Eytan Bakshy, a data scientist at Facebook and the study's lead author. The earlier study found that on average, the less frequently you interact with a Facebook friend, the more likely you are to share items that come from that friend.

"To our surprise we found that the majority of information that you click on and you end up re-sharing comes from weaker ties," people with whom you interact relatively rarely, Dr. Bakshy says. These people "have the potential to be more dissimilar to you."

That raised a question: What does this imply for the notion of social media as an echo chamber in which people surround themselves only with people who think like they do?

Others have tried to tackle that question, with conflicting results – often in no small part because the sample sizes in the study groups were relatively small.

Bakshy and colleagues tapped data and activity for some 10.1 million Facebook users in the United States, using protocols that ensured their anonymity. These people had listed a political affiliation in their profiles. In addition, the team focused on shared content they dubbed hard news or opinion – politics, US news in general, and international news. No cats or children's birthday parties. Ideology of the source was based on the organization tied to a web link, rather than the content of specific articles.

When the researchers parsed the data, they found that on average, 23 percent of a user's friends are people whose politics are "from the other side." Despite the heavy tilt in friends toward "like me," just under 30 percent of the incoming news represented the other side's perspective – so-called cross-cutting material.

Overall, the algorithm organizing what a user is most likely to see reduces cross-cutting content by slightly less than 1 percent, while a user's self-built bubble reduces that content by about 4 percent.
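To make the comparison concrete, here is a minimal, purely illustrative sketch in Python. The percentages are those reported above; the staged "pipeline" framing and the function name are assumptions for illustration, not the study's own method.

```python
# Illustrative sketch only: percentages come from the study as summarized in
# this article; the staged pipeline is an assumed simplification.

def apply_reduction(share, reduction):
    """Reduce a share of cross-cutting content by a relative percentage."""
    return share * (1 - reduction)

# Just under 30 percent of the news shared by a user's friends is
# "cross-cutting" (it comes from the other side of the political spectrum).
shared_by_friends = 0.30

# The News Feed ranking algorithm trims that share by slightly less than 1 percent.
after_algorithm = apply_reduction(shared_by_friends, 0.01)

# The user's own clicking choices trim it by roughly another 4 percent.
after_user_clicks = apply_reduction(after_algorithm, 0.04)

print(f"Shared by friends:   {shared_by_friends:.1%} cross-cutting")
print(f"After algorithm:     {after_algorithm:.1%}")
print(f"After user's clicks: {after_user_clicks:.1%}")

# The takeaway of the comparison: the user's own choices (about 4 percent)
# remove several times more cross-cutting content than the ranking
# algorithm (under 1 percent) does.
```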

Given the relatively small influence of the algorithm, the results "are not all that different from a lot of what we know about how people are acting across ideological and party lines in the real world," says Patrick Miller, a political scientist at the University of Kansas at Lawrence who also studies the interplay between social media and politics.

In many ways, a "don't shoot me, I'm just the piano player" sensibility about the study is justified, he suggests. A vast amount of social-science research has made it "very clear that when people are building their online social networks, they're building them to reflect their offline social networks."

And offline, people live in partisan bubbles in a country that has become increasingly polarized, he adds.

But that doesn't let Facebook off the hook as the algorithm's designer, others caution.

"Selectivity has always existed. But now we're living in different world," says Dietram Scheufele, who specializes in science communication at the University of Wisconsin at Madison. Facebook "is enabling levels of selectivity that have never been possible before."

For instance, he says, research has shown that two people with identical friends will get different news feeds from each other based on the pictures the two clicked on, posts from those friends they "liked," or even something as unrelated to friends as the websites they used Facebook to log into.

Although people have always built ideological bubbles, "that doesn't mean we have to make it worse" online, he says.

Yet it's also true that people would be overwhelmed by posts if some sort of sifting weren't done ahead of time, Northeastern's Professor Lazer acknowledges.

Perhaps the study's biggest contribution is to provoke recognition of how much information gathered about people is archived and used for everything from organizing and presenting Facebook news items to setting different prices for goods on e-commerce sites, based on what is known about the purchaser.

A lot of the algorithms that focus choices based on personal profiles "are done for our convenience, but some of it, frankly, is to exploit us," he says.

"I'm not saying we need to go back to the pre-internet age," he says. But in "Matrix" like fashion, the line between the real and virtual worlds are blurring, he adds.

"We have to think about what is good and what is bad. This study doesn't answer that question, but it does provoke the question. We're really behind where we should be in terms of debating these things as a society," he says.
