The Filter Bubble: What the Internet is Hiding from You
Is the Internet actually narrowing your world?
Google knows that my wife and I are expecting our first child. Our recent search history goes something like this: “things you need to buy for your first baby”; “why do my fingers get fat when I’m pregnant?”; and “is it worth buying a diaper bin?”
I noticed that in my in-box I was getting lots of baby-related emails; the ads showing up in my Gmail account mostly pertained to infants. No matter how many different ways we searched, all avenues seemed to lead back to the same products. It was as if the Web knew what we wanted.
This is what Eli Pariser, in a fascinating new book about the increasingly personalized Internet, calls The Filter Bubble. Search engines weight our search results to our own preferences. (My search results won’t look like yours.) Sites will filter our news (without asking us) to bring us what they think we want.
Pariser, a former executive director of the advocacy group MoveOn, pulls back the curtain on the dark arts of search and Internet advertising. There is “behavioral retargeting,” which means that you might check out a pair of shoes in an online store and leave without making a purchase – only then to find their ads following you around the Internet. Or advertising based on your “persuasion profile,” which isn’t just concerned with the types of products you like but “which kinds of arguments might cause you to choose one over another.”
With 36 percent of Americans under 30 getting their news through social-networking sites, personalization also affects the news we consume. Ever wonder why you don’t see updates from some Facebook friends in your News Feed? It’s due to an algorithm, partly based on the amount of time you spend interacting with that person.
The consequence of this social engineering, Pariser argues, is that we interact more with people who think as we do. Rather than fulfilling the early Internet dreams of diversity and freedom of choice, we are living in an echo chamber. As a result, there’s less room for “the chance encounters that bring insight and learning.” Where once we had human news editors who would temper the Britney coverage with a foreign war or two, now algorithms select our news for us based on what we click on and what we share.
The idea that the Web is an echo chamber is almost as old as the Web itself. But there still isn’t much empirical evidence to suggest that the Internet is narrowing our collective horizons. A new Pew report, “Social Networking Sites and Our Lives,” found that there is no relationship “between the use of social networking services and the diversity of people’s overall social networks.” Nor were Internet users less likely to consider both sides of an issue.