How information overload helps spread fake news

Mathematical modeling of social networks reveals how misinformation finds its way to the top – and offers clues for how to dampen the spread of false information.

Jessica Gresko/AP/File
Flowers and notes left by well-wishers are displayed outside Comet Ping Pong, in December 2016. The Washington, D.C., pizza restaurant was the target of fake news stories linking it to a child sex trafficking ring.

A lie can travel halfway around the world, goes the well-known Mark Twain quote, before the truth can get its boots on.

Twain himself might have appreciated this quotation's self-reflexivity: There's no record of him ever having said or written it.

Today, with half of Americans now turning to social media for news, many of us are getting misinformation – for instance, that NASA has contacted intelligent extraterrestrials, that a “breatharian” couple can survive on a “food-free lifestyle” – mixed in with the legitimate news articles in our feeds. And, as the news cycle accelerates, it's becoming harder to tell the difference. 

A new study reveals the mathematics underlying this phenomenon, modeling how information overload can erode an individual's ability to distinguish high-quality information from its opposite, causing falsehoods to propagate. But with a little effort, readers and social media platforms can cut the information surplus, perhaps sharpening our powers of discernment. 

“On a daily basis,” says Daniel Levitin, a professor of psychology and behavioral neuroscience at McGill University in Montreal, “the onslaught of information is preventing us from being evidence-based decision makers, at our own peril.” 

Misinformation is as old as culture itself, and the phenomenon uncovered in this study shows its spread is not limited to one kind of social media. 

“Many arguments around gossip and rumors are really driven by the same social mechanisms,” says Brian Uzzi, the co-director of Northwestern University's Institute on Complex Systems in Evanston, Ill. “The internet has essentially turbocharged the inclination of human beings to behave this way in regard to news and facts.” 

A paper published Monday in the journal Nature Human Behaviour by an international team of researchers offers a mathematical model demonstrating that, as information load increases, so do the odds that low-quality information will go viral.

“It was the first paper I've seen in this area that quantifies what many people thought was happening, and that's basically with limited attention we're unable to see the full range of potential arguments or sides of the story,” says Dr. Uzzi, who has studied how social media users isolate themselves into echo chambers. 

Losing ability to tell fact from fiction

Using mathematical modeling, a team led by Xiaoyan Qiu and Diego Oliveira of Indiana University's Center for Complex Networks and Systems Research statistically confirmed that when flooded with a steady stream of high- and low-quality information, even the most critical readers start to lose their ability to tell fact from fiction.

“Even when individual users can recognize and select quality information,” says study co-author Filippo Menczer, a professor of informatics and computer science at Indiana University, “the social media market rarely allows the best information to win the popularity contest.”
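The dynamic the researchers describe can be illustrated with a toy agent-based simulation. This is not the authors' actual model from the paper; it is a minimal sketch of the general idea, with all parameter names and values (feed size, information load, number of agents) chosen purely for illustration. Agents have finite feeds, each new post pushes an old one out, and agents reshare from their feed with a preference for higher-quality items:

```python
import random

def simulate(n_agents=50, feed_size=5, steps=20000, load=0.2, seed=1):
    """Toy attention-limited diffusion sketch (illustrative only).

    Each step, a random agent either posts a new meme (with probability
    `load`, standing in for information overload) or reshares one from
    its feed, weighted by quality. Feeds hold only `feed_size` items,
    so fresh posts crowd out older ones. Returns the mean quality of
    the 20 most-shared memes versus the mean quality of all memes.
    """
    random.seed(seed)
    feeds = [[] for _ in range(n_agents)]   # each agent's finite feed
    shares = {}                             # meme id -> times shared
    quality = {}                            # meme id -> quality in [0, 1]
    next_id = 0

    for _ in range(steps):
        a = random.randrange(n_agents)
        if random.random() < load or not feeds[a]:
            # Post a brand-new meme with a random quality score.
            m, next_id = next_id, next_id + 1
            quality[m] = random.random()
            shares[m] = 0
        else:
            # Reshare from the feed, preferring higher-quality memes
            # (imperfect discernment: a weighted lottery, not a ranking).
            weights = [quality[m] for m in feeds[a]]
            m = random.choices(feeds[a], weights=weights)[0]
        shares[m] = shares.get(m, 0) + 1

        # The meme lands in a random neighbor's feed, evicting the
        # oldest item if the feed is full -- this is the attention limit.
        b = random.randrange(n_agents)
        feeds[b].append(m)
        if len(feeds[b]) > feed_size:
            feeds[b].pop(0)

    top = sorted(shares, key=shares.get, reverse=True)[:20]
    top_q = sum(quality[m] for m in top) / len(top)
    all_q = sum(quality.values()) / len(quality)
    return top_q, all_q
```

Running the sketch at a low `load` versus a high one lets you compare how much quality advantage the most-shared memes retain: as the flood of new posts speeds up, each meme spends less time in any feed, giving quality-based selection less chance to operate, which is the paper's core intuition.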

The researchers suggest that social networks could curb information overload by aggressively limiting content shared by so-called bot accounts, software agents that flood social networks with low-quality information. 

“Deceptive bots can be quite sophisticated and hard to recognize even for humans. And huge numbers of them can be managed via software, so it is difficult for operators to keep up,” says Dr. Menczer. 

The research reveals some of the math that drives what psychologists have long known: Information overload makes it harder to make decisions. “The key point of this article is that what neuroscientists have been showing on the biology side has very practical, real-world implications in our daily lives,” says Dr. Levitin, the author of “Weaponized Lies: How to Think Critically in the Post-Truth Era.”

Fighting information overload

Levitin notes that the average American is exposed to about five times as much information as in 1986. “In the old days, I'd get the newspaper in the morning and I'd read about what happened yesterday,” he says. “Now, everyone seems to be addicted to what happened five minutes ago.”

Levitin recommends unplugging from the internet for a couple hours each morning and again each afternoon. “If you're constantly checking your phone for the latest news, you're allowing your thoughts to become disrupted and fractionated and it becomes harder and harder to concentrate, and you get addicted to this constant stimulation,” he says. “So I think what we can do is give ourselves a break.”

But taking a break can be difficult for people who have become accustomed to steady social media contact with friends and family. “Trouble arises when we use the same networks to access news,” says Menczer, who advises against defriending or unfollowing those with different opinions, because echo chambers make users more susceptible to misinformation.

“We hope that by now, citizens and policymakers from across the political spectrum recognize the need for research to study digital misinformation and how to make the web more reliable,” says Menczer. “We are all vulnerable to manipulation irrespective of our political leanings.”
