Google knows that my wife and I are expecting our first child. Our recent search history goes something like this: “things you need to buy for your first baby”; “why do my fingers get fat when I’m pregnant?”; and “is it worth buying a diaper bin?”
Soon I noticed lots of baby-related emails in my in-box; the ads showing up in my Gmail account mostly pertained to infants. No matter how many different ways we searched, all avenues seemed to lead back to the same products. It was as if the Web knew what we wanted.
This is what Eli Pariser, in a fascinating new book about the increasingly personalized Internet, calls “the filter bubble.” Search engines weight our search results toward our own preferences. (My search results won’t look like yours.) Sites will filter our news (without asking us) to bring us what they think we want.
Pariser, a former executive director of the advocacy group MoveOn, pulls back the curtain on the dark arts of search and Internet advertising. There is “behavioral retargeting,” which means that you might check out a pair of shoes in an online store and leave without making a purchase – only then to find their ads following you around the Internet. Or advertising based on your “persuasion profile,” which isn’t just concerned with the types of products you like but “which kinds of arguments might cause you to choose one over another.”
With 36 percent of Americans under 30 getting their news through social-networking sites, personalization also affects the news we consume. Ever wonder why you don’t see updates from some Facebook friends in your News Feed? It’s due to an algorithm, partly based on the amount of time you spend interacting with that person.
The consequence of this social engineering, Pariser argues, is that we interact more with people who think as we do. Rather than fulfilling the early Internet dreams of diversity and freedom of choice, we are living in an echo chamber. As a result, there’s less room for “the chance encounters that bring insight and learning.” Where once we had human news editors who would temper the Britney coverage with a foreign war or two, now algorithms select our news for us based on what we click on and what we share.
The idea that the Web is an echo chamber is almost as old as the Web itself. But there still isn’t much empirical evidence to suggest that the Internet is narrowing our collective horizons. A new Pew report, “Social Networking Sites and Our Lives,” found that there is no relationship “between the use of social networking services and the diversity of people’s overall social networks.” Nor were Internet users less likely to consider both sides of an issue.
While not exactly a techno-pessimist, Pariser falls into the techno-pessimist’s trap of the Imagined Analogue Past. It is a rose-colored world that always forms the backdrop to books about the effects of the Internet. A world without digital distractions, with enlightening serendipitous encounters, where civic-minded news producers made sure we saw reports about famine in distant lands. In the Imagined Analogue Past we all had meaningful offline friendships, devoid of any superficiality.
But of course we never really lived like that. If our worlds are echo chambers now, what were they before, when every day we read the same newspaper, with its inherent biases in politics and scope? If the Internet is an echo chamber, what about the churches or progressive book clubs we attend? If you do live in an Internet echo chamber then that’s probably of your own making; in the pre-digital world you would have lived in one too. And anyone who has never experienced serendipity on the Internet has never been on YouTube.
Where Pariser's book is most effective is in deconstructing the myth of “disintermediation” – the idea, popular among techno-utopians, that the Internet would “flatten society, unseat the elites, and usher in a kind of global utopia,” where we would no longer need gatekeepers such as newspapers, cable television, or even politicians. Pariser eloquently makes the case that we might have gotten rid of a few gatekeepers, but we've just replaced them with new ones (namely Facebook and Google).
“The Filter Bubble” is less clear, though, about what we should do about it. One of Pariser’s proposals – and it's a good one – is for tech companies to make their filtering practices less opaque and be more up front about the way in which they are collecting and using our information. The author goes further, suggesting “filtering systems to expose people to topics outside their normal experience.” But is such social and civic engineering really the job of businesses like Google or Facebook? Paternalism aside, there is an irony in engineering more randomness. Google’s “I’m Feeling Lucky,” after all, isn’t based on luck.
Pariser writes beautifully about the new digital world in which we find ourselves but, ultimately, he doesn’t show us a future that seems to be any bleaker than the past.
Luke Allnutt is a Monitor contributor.