"For this is good and acceptable in the sight of God our Saviour; Who will have all men to be saved, and to come unto the knowledge of the truth." 1 Timothy 2:3-4 KJV    (AWFSM)


A social psychologist found that showing people how manipulative techniques work can create resilience against misinformation.

Misinformation can feel inescapable. Last summer a survey from the nonprofit Poynter Institute for Media Studies found that 62 percent of people regularly notice false or misleading information online. And in a 2019 poll, almost nine in 10 people admitted to having fallen for fake news. Social psychologist Sander van der Linden of the University of Cambridge studies how and why people share such information and how it can be stopped. He spoke with Mind Matters editor Daisy Yuhas to discuss this work and his new book, Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity, which offers research-backed solutions to stem this spread.

[An edited transcript of the interview follows.]

In Foolproof, you borrow an analogy from the medical world, arguing that misinformation operates a lot like a virus. How did you come to that comparison?

I was going through journals and found models from epidemiology and public health that are used to understand how information propagates across a system. Instead of a virus spreading, you have an information pathogen. Somebody shares something with you, and you then spread it to other people.
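The compartmental models van der Linden is alluding to track how many people are susceptible to a claim, actively spreading it, or done with it. As a rough illustration only, not the researchers' actual model and with made-up parameter values, here is a minimal SIR-style sketch in Python in which "infected" means actively sharing a false claim:

```python
# Illustrative sketch only: a minimal SIR-style model of information spread,
# analogous to the epidemiological models mentioned in the interview.
# All names and parameter values here are assumptions, not figures from the research.

def simulate_spread(population=10_000, beta=0.3, gamma=0.1, initial_sharers=10, days=60):
    """Track susceptible, "infected" (sharing), and recovered (no longer sharing) people."""
    s = population - initial_sharers   # people who have not yet encountered the claim
    i = initial_sharers                # people actively sharing the claim
    r = 0                              # people who have stopped sharing it
    history = []
    for day in range(days):
        new_infections = beta * s * i / population  # exposure through contact with sharers
        new_recoveries = gamma * i                  # sharers losing interest or being corrected
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((day, round(s), round(i), round(r)))
    return history

if __name__ == "__main__":
    for day, s, i, r in simulate_spread()[::10]:
        print(f"day {day:2d}: susceptible={s:5d} sharing={i:5d} stopped={r:5d}")
```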

That led me to wonder: If it’s true that misinformation spreads like a virus, is it possible to inoculate people? I came across some work from the 1960s by Bill McGuire, a psychologist who studied how people could protect themselves from “brainwashing.” He had a very similar thought. That connection led to this whole program of research.

How do we get “infected”?

A virus attacks by exploiting our cells’ weak spots and hijacking some of their machinery. It’s the same for the mind in many ways. There are certain cognitive biases that can be exploited by misinformation. Misinformation infects our memories and influences the decisions that we make.

One example is the illusory truth bias. That’s the idea that just hearing something repeatedly—even if you know that it is wrong—makes it seem more true. These learned automatic associations are part of how the brain works.

In your research, you’ve extended the virus metaphor to argue that we can vaccinate ourselves against misinformation through a technique that you call “prebunking.” How does that work?

Prebunking has two parts. First is forewarning, which jump-starts the psychological immune system because it’s sleeping most of the time. We tell people that someone may want to manipulate them, which raises their skepticism and heightens their awareness.

The second part of the prebunk is analogous to providing people with a weakened dose of the virus in a vaccine. For example, in some cases, you get a small dose of the misinformation and tips on how to refute it. That can help people be more resilient against misinformation.

In addition, there are general techniques used to manipulate the spread of misinformation across a lot of different environments. In our studies, we have found that if we help people spot those broader techniques, we can inoculate them against a whole range of misinformation. For instance, in one study, people played a game [Bad News] to help them understand the tactics used to spread fake news. That improved their ability to spot a range of unreliable information by about 20 to 25 percent.

So you help people recognize and resist incoming misinformation broadly by alerting them to the techniques people use to manipulate others. Can you walk me through an example?

Sure. We created a series of videos in partnership with Google to make people more aware of manipulative techniques on YouTube. One is a false dichotomy, or false dilemma. It’s a common tactic and one that our partners at Google alerted us to because it’s present in many radicalization videos.

In a false dichotomy, someone incorrectly asserts that you have only two options. So an example would be “either you’re not a good Muslim, or you have to join ISIS.” Politicians use this approach, too. In a U.S. political context, an example might be: “We have to fix the homelessness problem in San Francisco before we start talking about immigrants.”

In our research, we have exposed people to this concept using videos that explain false dichotomies in nonpolitical scenarios. We use popular culture like Family Guy and Star Wars. People have loved it, and it’s proved to be a really good vehicle.

So in our false dichotomy video, you see a scene from a Star Wars movie, Revenge of the Sith, where Anakin Skywalker says to Obi-Wan Kenobi, “If you’re not with me, then you’re my enemy,” to which Obi-Wan replies, “Only a Sith deals in absolutes.” The video cuts to explain that Anakin has just used a false dichotomy.

After seeing a video like this, the next time you’re presented with just two options, you realize somebody may be trying to manipulate you.

In August you published findings from a study with more than 20,000 people viewing these videos, which called out techniques such as false dilemmas, scapegoating and emotionally manipulative language. What did you learn?

What we find is that, using these videos, people are better able to recognize misinformation that we show them later both in the lab and on social media. We included a live test on the YouTube platform. In that setup, the environment is not controlled, and people are more distracted, so it’s a more rigorous test.

These videos were part of an ad campaign run by Google that had millions of views. Google has now rolled out videos based on this research that are targeted at misinformation about Ukraine and Ukrainian refugees in Europe. They are specifically helping people spot the technique of scapegoating.

In the book, you point out that many people who think they are immune to misinformation are not. For instance, in one survey, almost 50 percent of respondents believed they could spot fake news, but only 4 percent succeeded. Even “digital natives” can fall for fake content. Can this happen to anyone?

A lot of people are going to think that they’re immune. But there are basic principles that expose us all. For example, there is an evolutionary argument that’s quite important here called the truth bias. In most environments, people are not being actively deceived, so our default state is to accept that things are true. If you had to critically question everything, you couldn’t get through your day. But if you are in an environment—like on social media—where the rate of misinformation is much higher, things can go wrong.

In addition to biases, the book highlights how certain social behaviors and contexts, including online echo chambers, skew what we see. With so many forces working against us, how do you stay optimistic?

We do have biases that can be exploited by producers of misinformation. It’s not easy, given all of the new information we’re exposed to all the time, for people to keep track of what’s credible. But I’m hopeful because there are some solutions. Prebunking is not a panacea, but it’s a good first line of defense, and it helps, as does debunking and fact-checking. We can help people maintain accuracy and stay vigilant.

Are you a scientist who specializes in neuroscience, cognitive science or psychology? And have you read a recent peer-reviewed paper that you would like to write about for Mind Matters? Please send suggestions to Scientific American’s Mind Matters editor Daisy Yuhas at pitchmindmatters@gmail.com.

This is an opinion and analysis article, and the views expressed by the author or authors are not necessarily those of Scientific American.