Over the last four decades, a highly organized, well-funded campaign powered by the fossil fuel industry has sought to discredit the science that links global climate change to human emissions of carbon dioxide and other greenhouse gases. These disinformation efforts have sown confusion over data, questioned the integrity of climate scientists and denied the scientific consensus on the role of humans.
Such disinformation efforts are outlined in internal documents from fossil fuel giants such as Shell and Exxon. As early as the 1980s, oil companies knew that burning fossil fuels was altering the climate, according to industry documents reviewed at a 2019 U.S. House of Representatives Committee on Oversight and Reform hearing. Yet these companies, aided by some scientists, set out to mislead the public, deny well-established science and forestall efforts to regulate emissions.
But the effects of climate change on extreme events such as wildfires, heat waves and hurricanes have become hard to downplay (SN: 12/19/20 & SN: 1/2/21, p. 37). Not coincidentally, climate disinformation tactics have shifted from outright denial to distraction and delay (SN: 1/16/21, p. 28).
As disinformation tactics evolve, researchers continue to test new ways to combat them. Debunking by fact-checking untrue statements is one way to combat climate disinformation. Another way, increasingly adopted by social media platforms, is to add warning labels flagging messages as possible disinformation, such as the labels Twitter and Facebook (which also owns Instagram) began adding in 2020 regarding the U.S. presidential election and the COVID-19 pandemic.
At the same time, Facebook was sharply criticized for a change to its fact-checking policies that critics say enables the spread of climate disinformation. In 2019, the social media giant decided to exempt from fact-checking any posts it determines to be opinion or satire, creating a potentially large disinformation loophole.
In response to mounting criticism, Facebook unveiled a pilot project in February for its users in the United Kingdom, with labels pointing out myths about climate change. The labels also point users to Facebook’s climate science information center.
For this project, Facebook consulted several climate communication experts. Sander van der Linden, a social psychologist at the University of Cambridge, and cognitive scientist John Cook of George Mason University in Fairfax, Va., helped the company develop a new “myth-busting” unit that debunks common climate change myths — such as that scientists don’t agree that global warming is happening.
Researchers John Cook and Sander van der Linden hope to inoculate people against climate change denial messages.
Cook and van der Linden have also been testing ways to get out in front of disinformation, an approach known as prebunking, or inoculation theory. By helping people recognize common rhetorical techniques used to spread climate disinformation — such as logical fallacies, relying on fake “experts” and cherry-picking only the data that support one view — the two hope to build resilience against these tactics.
This new line of defense may come with a bonus, van der Linden says. Training people in these techniques could build a more general resilience to disinformation, whether related to climate, vaccines or COVID-19.
Science News asked Cook and van der Linden about debunking conspiracies, collaborating with Facebook and how prebunking is (and isn’t) like getting vaccinated. The conversations, held separately, have been edited for brevity and clarity.
We’ve seen both misinformation and disinformation used in the climate change denial discussion. What’s the difference?
van der Linden: Misinformation is any information that’s incorrect, whether due to error or fake news. Disinformation is deliberately intended to deceive. Then there’s propaganda: disinformation with a political agenda. But in practice, it’s difficult to disentangle them. Often, people use misinformation because it’s the broadest category.
Has there been a change in the nature of climate change denialism in the last few decades?
Cook: It is shifting. For example, we fed 21 years of [climate change] denial blog posts from the U.K. into a machine learning program. We found that the science denialism misinformation is gradually going down — and solution misinformation [targeting climate policy and renewable energy] is on the rise [as reported online in early March at SocArXiv.org].
As the science becomes more apparent, it becomes more untenable to attack it. We see spikes in policy misinformation just before the government brings in new science policy, such as a carbon pricing bill. And there was a huge spike before the Paris climate agreement. That’s what we will see more of over time.
How do you hope Facebook’s new climate change misinformation project will help?
Cook: We need tech solutions, like flagging and tagging misinformation, as well as social media platforms downplaying it, so [the misinformation] doesn’t get put on as many people’s feeds. We can’t depend on social media. A look behind the curtain at Facebook showed me the challenge of getting corporations to adequately respond. There are a lot of internal tensions.
van der Linden: I’ve worked with WhatsApp and Google, and it’s always the same story. They want to do the right thing, but don’t follow through because it hurts engagement on the platform.
But going from not taking a stance on climate change to taking a stance, that’s a huge win. What Facebook has done is a step forward. They listened to our designs and suggestions and comments on their [pilot] test.
We wanted more than a neutral [label directing people to Facebook’s information page on climate change], but they wanted to test the neutral post first. That’s all good. It’ll be a few months at least for the U.K. testing phase to roll out, but we don’t yet know how many other countries they will roll it out to, or when. We all came on board with the idea that they’re going to do more, and more aggressively. I’ll be pleasantly surprised if it rolls out globally. That’s my criterion for success.
Scientists have been countering climate change misinformation for years, through fact-checking and debunking. It’s a bit like whack-a-mole. You advocate for “inoculating” people against the techniques that help misinformation spread through communities. How can that help?
van der Linden: Fact-checking and debunking is useful if you do it right. But there’s the issue of ideology, of resistance to fact-checking when it’s not in line with ideology. Wouldn’t life be so much easier if we could prevent [disinformation] in the first place? That’s the whole point of prebunking or inoculation. It’s a multilayer defense system. If you can get there first, that’s great. But that won’t always be possible, so you still have real-time fact-checking. This multilayer firewall is going to be the most useful thing.
You’ve both developed online interactive tools, games really, to test the idea of inoculating people against disinformation tactics. Sander, you created an online interactive game called Bad News, in which players can invent conspiracies and act as fake news producers. A study of 15,000 participants reported in 2019 in Palgrave Communications showed that by playing at creating misinformation, people got better at recognizing it. But how long does this “inoculation” last?
van der Linden: That’s an important difference in the viral analogy. Biological vaccines give more or less lifelong immunity, at least for some kinds of viruses. That’s not the case for a psychological vaccine. It wears off over time.
In one study, we followed up with people [repeatedly] for about three months, during which time they didn’t replay the game. We found no decay of the inoculation effect, which was quite surprising. The inoculation remained stable for about two months. In [a shorter study focused on] climate change misinformation, the inoculation effect also remained stable, for at least one week.
John, what about your game Cranky Uncle? At first, it focused on climate change denial, but you’ve expanded it to include other types of misinformation, on topics such as COVID-19, flat-earthism and vaccine misinformation. How well do techniques to inoculate against climate change denialism translate to other types of misinformation?
Cook: The techniques used in climate denial are seen in all forms of misinformation. Working on deconstructing [that] misinformation introduced me to parallel argumentation, which is basically using analogies to combat flawed logic. That’s what late night comedians do: Make what is obviously a ridiculous argument. The other night, for example, Seth Meyers talked about how Texas blaming its [February] power outage on renewable energy was like New Jersey blaming its problems on Boston [clam chowder].
My main tip is to arm yourself with awareness of misleading techniques. Think of it like a virus spreading: You don’t want to be a superspreader. Make sure that you’re wearing a mask, for starters. And when you see misinformation, call it out. That observational correction — it matters. It makes a difference.