Research shows how to 'inoculate' readers from fake news

A new study finds that 'inoculation' messages can keep readers from accepting dubious claims. Will the media adopt this approach?

AP Photo/Matt Rourke/File
The Facebook logo is displayed on an iPad in Philadelphia. Facebook is taking new measures to curb the spread of fake news on its huge and influential social network, focusing on the 'worst of the worst' offenders and partnering with outside fact-checkers to sort honest news reports from made-up stories.

The old saying “forewarned is forearmed” applies to the fight against fake news.

Putting a short, contextualizing message before spurious news stories can keep readers from accepting their claims, according to a paper published Monday in the journal Global Challenges.

The paper’s authors – Anthony Leiserowitz and Seth Rosenthal at Yale University; Sander van der Linden at Churchill College, Cambridge, England; and Edward Maibach at George Mason University – wanted to investigate public understanding of the scientific consensus that humans are causing climate change.

To this end, they divided a national sample of the American public into six different groups. Several of them read the Oregon Petition, which claims to have received the signatures of more than 31,000 climate-skeptic scientists – and which Professor Leiserowitz calls a “classic piece of disinformation.” Its signatories include the long-deceased Charles Darwin and members of the Spice Girls.

With the help of social media, sketchy information like this often gets taken as fact, circulates freely, and causes widespread damage before it’s debunked. The researchers say they’ve found a solution, but it’s up to newsreaders and the media to adopt it.

“You can inoculate against the effects of fake news,” Leiserowitz, director of the Yale Program on Climate Change Communication, tells The Christian Science Monitor in a phone interview. He and his colleagues gave one group just the petition. Another group instead viewed a pie chart showing that, in reality, 97 percent of climate scientists agree that humans are driving climate change. A third group saw the pie chart, then read the bogus petition.

Two other groups read the petition after reading what Leiserowitz calls “inoculations”: warnings that politically motivated groups try to use misleading tactics to raise doubts among the public about the scientific consensus, along with specific warnings about the Oregon Petition, followed by the pie chart. “If you can give people a little pre-awareness that what they are likely to hear is bogus, you can inoculate against disinformation,” he says.

The pie charts served that role. Leiserowitz explains that “when we give both messages” – the petition and the pie chart – “side-by-side, they basically cancel each other out,” and that perceptions of the “scientific consensus” did not change. But when readers read inoculation messages before the fake Oregon petition, their own estimate of the degree of scientific consensus still increased. The effect, the researchers noted, was similar “across political party affiliation.”

This means that such “inoculations” can help preserve the truth – and not just with scientific stories. Professor Leiserowitz suggested that similar warnings could also have helped rebut false claims about the number of attendees at President Trump’s inauguration before they gained traction.

But these messages will only work if news organizations use them. Some news websites have doubled down on their commitment to fact-checking: The Washington Post, for instance, recently introduced a Twitter plug-in that checks the veracity of Mr. Trump’s tweets in real time.

But readers of other publications may be less receptive to these messages. Leiserowitz acknowledged that many news sites “basically say, ‘Do not trust any information that comes from outside news sources....' That increasingly walls us off from one another.”

Jack Zhou, an instructor in environmental politics at Duke University in Durham, N.C., says some occupants of so-called “news bubbles” may prefer to accept fake news as truth. “The state of fragmented media may dull the potential practical impact of inoculation messages, particularly in terms of the audiences serviced by those media,” Mr. Zhou, who has researched the identity politics of climate change, tells the Monitor in an email.

After all, sites with fake news are only catering to their audiences. Paul Levinson, a communications professor at Fordham University in New York, told the Monitor in December that, “These bubbles have not been imposed upon the public – it was what the people want. As long as social media continues to provide a very easy forum for these news bubbles ... it is not going to stop.” [Editor's note: An earlier version of this story misspelled Mr. Levinson's first name.]

Social-media giant Facebook recognizes this as a problem. After chief executive officer Mark Zuckerberg initially denied that Facebook’s carrying fake news was harming public discourse, the company announced a campaign to detect and flag unfounded stories.

The campaign was announced in mid-December, before the inoculation paper was published. But it also fights fake news by putting it in context. Facebook has partnered with online fact-checking services to spot dubious links. TechCrunch reports that, when the fact-checkers find one, Facebook will "show posts of those links lower in the News Feed. It will also attach a warning label noting 'Disputed by [one or more of the fact checkers]' with a link to the debunking post on News Feed stories and in the status composer if users are about to share a dubious link."

The same article reported that 44 percent of US adults get news from Facebook. The study of inoculation messages released Monday suggests that its plan to flag and debunk fake news could keep those users from accepting it – if they click at all after seeing a warning label. 

That, in turn, could benefit civic discourse. Without a broadly accepted set of facts, Professor Leiserowitz explains, “it gets harder and harder to have rational conversations.”
