
The Backfire Effect Backfires

Providing Accurate Information Sometimes Works
August 26, 2019

Here are five untruths that you can easily find on numerous internet sites.

  1. There is considerable debate among climate scientists about whether global temperatures are rising because of human activity.
  2. Genetically modified foods are known to cause cancer.
  3. The measles, mumps, and rubella (MMR) vaccine causes autism.  
  4. Only a small number of people infected with the bacteria that cause Lyme disease ever have a positive blood test.
  5. More people have used guns to successfully defend themselves than have used them to kill themselves.  

The correct information that goes with each of the untruths above is also readily available:

  1. Almost all climate scientists agree that global temperature increases are the result of anthropogenic (i.e. human) activities like burning fossil fuels.  
  2. There is no evidence that genetically modified foods cause cancer.  
  3. Vaccines definitely do not cause autism.
  4. Most people infected with Borrelia burgdorferi, the microorganism that causes Lyme disease, will eventually have a positive Lyme blood test.
  5. Far more people use guns to kill themselves than to defend themselves. 

A striking example of the backfire effect, the phenomenon in which correcting misinformation actually strengthens belief in it, comes from a 2014 study conducted by Brendan Nyhan of Dartmouth College and colleagues and published in the medical journal Pediatrics. In this study, 1,759 parents were randomly assigned to receive one of four information packages about the MMR vaccine:

  1. Written information explaining the lack of evidence that the MMR vaccine causes autism
  2. Textual information about the dangers of the diseases prevented by the MMR vaccine
  3. Images of children who have the diseases prevented by the MMR vaccine
  4. A dramatic narrative about an infant who almost died of measles

To everyone’s surprise, the images of sick children and the dramatic narrative about the infant in danger actually increased the research participants’ belief in a link between the vaccine and autism. For parents who had the least favorable attitudes about vaccines to begin with, any refutation of the claim that there is a link between the MMR vaccine and autism decreased their intention to vaccinate their children.

The reverberations of this rigorously conducted and reported study by a highly respected group of scientists were far-reaching: here was evidence that not only does giving correct information have little effect on changing people’s minds, it can actually backfire and make them more insistent on myths like the alleged connection between vaccines and serious neurological adverse effects. This stimulated extensive research into what is behind the backfire effect and what strategies, other than providing corrective information, need to be employed to encourage people to accept scientific evidence.

Showing parents pictures like this one of children with measles made many of them less likely to want to vaccinate their children, an example of the “backfire effect” (image: Shutterstock).

The backfire effect does have some theoretical support. First of all, it is clear that science denial is not a “deficit” problem; that is, people who hold ideas contrary to what scientists believe often know very well what the science shows. They just do not believe it. People who dispute the validity of evolutionary science, for example, often know Darwin’s theories and their scientific basis in detail, but still adhere to creationist beliefs. So providing correct information addresses a problem that in many cases doesn’t exist.

Second, showing people graphic images of something like a child with severe complications from measles might activate the same fear and disgust pathways in the brain that maintain fears of vaccines. In such a case, stimulating the same neural pathways, regardless of what the stimulus actually is, may just reinforce the earlier fears and not change them. That is, pictures of children near death from measles inadvertently summon up images of children suffering from vaccine complications because the neural pathways underlying these emotions are the same.

Finally, we know that people tend to align themselves into social groups with other people who hold similar beliefs. Challenging those beliefs with competing information threatens their group membership and may stimulate defensive maneuvers that strengthen rather than counteract adherence to misinformation. One study showed that the more facts people actually knew about a controversial topic (gun control), the more they used new information to support their original beliefs rather than to form beliefs consistent with the new data. The authors of that study called this “identity-protective cognition.”

Backfire Doesn’t Always Happen

As persuasive as these theories are, however, some recent studies challenge the evidence supporting the backfire effect and suggest that counteracting misinformation with correct information may actually work to change at least some people’s minds. In a study published this year in Nature Human Behaviour, for example, researchers from the University of Erfurt in Germany conducted six experiments showing that two types of factual rebuttal techniques, one called topic rebuttal and the other technical rebuttal, were equally effective in reducing the influence of vaccine deniers and other types of science deniers on the participants in their experiments. The effect even extended to the people most vulnerable to science denial influencers. In fact, Diana Kwon, reporting on this study in Scientific American, noted that “the influence of deniers was higher in individuals who had low confidence in vaccinations and in those in the U.S. who identified as conservative. People from these groups also benefited the most from the rebuttals.”

Kwon goes on to discuss the backfire effect: “While a handful of studies have provided evidence that such unintended results may be widespread, more recent investigations have found that these effects may be limited to specific circumstances—such as among people whose fundamental beliefs about a functioning society are challenged by the new information.”

It is important to note that the six experiments about which Kwon is writing recruited undergraduates as participants, whereas the participants in the Nyhan et al. study were parents. The latter clearly have more at stake in the vaccination game than the former and perhaps form more intense ideas about vaccination when actually facing the decision of whether to vaccinate a child. It is also noteworthy that when the rebuttal techniques were applied to climate denial, their effects were much weaker. So the positive effects of corrective information may depend on the audience to whom it is directed and the specific topic of science denial.

Focus on Those On the Fence

The notion that corrective information may work only for specific types of people is reinforced by a study from Princeton University investigators demonstrating that people “on the fence” about a scientific issue are most likely to remember the correct information and less likely to remember incorrect information. In general, studies unsurprisingly show that once people have a fixed notion it is very hard, but not impossible, to change their minds. Debating iconoclasts who actively promote scientific misinformation is probably a waste of time because it gives them a platform from which to disseminate myths and creates the impression that their point of view is as valid as that of the scientific expert in the debate.

 Instead, it seems most worthwhile to concentrate on preventing people from developing incorrect, fixed ideas in the first place. Imagine, for example, you are the parent of a two-month-old infant who is scheduled to get her first vaccinations. You’ve heard some things about vaccines, but haven’t really looked into it much yet, so you go to the internet and social media. You light upon an appealing Facebook group called “Parents for Children’s Health” and begin to scroll down the comments. You read things like “the drug companies that make vaccines are covering up the evidence that vaccines cause autism;” “my baby was completely normal until he had his first vaccinations; then tragedy struck;” and “giving so many vaccinations in such a short period of time overwhelms the immune system and makes babies sick.”

These are frightening testimonials, so you next look at some of the websites recommended by people in the Facebook group. There you find information reinforcing the terrifying claims made by members of the group, some of it coming from people who even have PhDs. After a few hours of this, it would not be surprising if you, as a responsible parent, developed a fixed resistance to vaccinating your baby.

The problem with all of this, of course, is that none of the information in the Facebook group is true. But none of it is challenged either. What if someone monitored social media sites for this kind of thing and, whenever such misinformation popped up, inserted a comment like “there is information you can easily find that contradicts many of the things being said here”? Perhaps farther down the line a comment could be inserted like “actually, the studies being talked about here are not being accurately described.” Although this is unlikely to change the minds of the people who started the Facebook group, could it sway a person “on the fence” who is seeing it for perhaps the first time and has not yet developed a rigid position? Are there ways of doing this that are more effective than others? These are the kinds of questions we at Critica are now actively asking and studying.
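To make the idea concrete, here is a minimal, purely illustrative Python sketch of what automated flagging of such comments might look like. The claim patterns, the corrective wording, and the flag_comment helper are all hypothetical, not an existing Critica tool, and any real system would route flags to a human reviewer rather than reply automatically.

    # Hypothetical sketch: flag comments that match known misinformation claims
    # and suggest a gentle corrective reply for a human reviewer to consider.
    import re

    # Illustrative patterns only; a real system would need a vetted claim library.
    MISINFORMATION_PATTERNS = {
        r"vaccines?\s+cause\s+autism":
            "Large, well-controlled studies have found no link between vaccines and autism.",
        r"overwhelm(s|ed)?\s+the\s+immune\s+system":
            "Research shows the recommended schedule does not overwhelm an infant's immune system.",
    }

    def flag_comment(comment):
        """Return a suggested corrective note if the comment matches a known myth, else None."""
        for pattern, correction in MISINFORMATION_PATTERNS.items():
            if re.search(pattern, comment, re.IGNORECASE):
                return ("There is information you can easily find that contradicts "
                        "some of the things being said here. " + correction)
        return None  # nothing matched; no reply suggested

    comments = [
        "Giving so many vaccinations in such a short period overwhelms the immune system.",
        "Where can I find the recommended vaccination schedule?",
    ]
    for c in comments:
        note = flag_comment(c)
        print("FLAG:" if note else "OK:  ", c)
        if note:
            print("  suggested reply:", note)

A sketch like this only surfaces candidate comments; deciding whether and how to respond is exactly the kind of effectiveness question raised above.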

In a provocative op-ed in the New York Times titled “Dr. Google Is a Liar,” cardiologist Haider Warraich calls for “search engines, social media platforms and websites [to] be held responsible for promoting or hosting fake information.” Given that this is not likely to happen soon, or in any particularly effective way, we believe that scientists, healthcare professionals, and public health experts should take an active role in monitoring and counteracting misinformation about science and health issues as it crops up on social media and on the internet. After years of believing that public education is not their responsibility, scientists are increasingly recognizing the unique role they can play in helping people understand and accept scientific evidence.

At the same time, we desperately need more data on the best methods of conveying that corrective information. “Public health messaging,” Roy Grant wrote in the American Journal of Public Health, “is often too nuanced to satisfy people’s need to know.”

Inoculating Against Misinformation

One promising line of research uses what is called “inoculation theory.” This work is based on mounting evidence that people rapidly form fixed ideas about health and safety issues such as climate change, vaccination, and gun ownership, which makes counteracting myths difficult. But it turns out that “inoculating” against misinformation by “exposing people to a dose of refuted arguments before they hear them” may be a way to prevent those myths from becoming fixed in the first place.

In one typical inoculation experiment, research participants were first shown a news article containing misinformation about climate change: it implied that as many climate scientists doubt human-caused climate change as accept it, an example of the “false balance” that unfortunately crops up frequently in media accounts of scientific issues. Two types of counteracting information were also created and shown to participants. The first, called consensus, presented the facts about climate change, including that almost all climate scientists agree it is real. The second, called the inoculation, contained only general information about the false-balance technique used in the article, without providing any specific corrective information about climate change. In the inoculation condition, then, subjects were forewarned that they were about to read something based on a misleading way of reporting information.

In the study, 714 participants were randomly assigned to one of five groups: control (no information); misinformation only; full correct information (consensus) followed by misinformation; inoculation followed by misinformation; and full information (consensus), then inoculation, then misinformation. The results showed that the inoculation condition was particularly effective in neutralizing the effects of the false-balance misinformation. The authors of the study note that “inoculations in this study did not mention the specific misinformation that was presented after the inoculation but rather warned about misinformation in a broader sense by explaining the general technique being used to create doubt about an issue in the public’s mind.”
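As a purely illustrative aside, the balanced random assignment behind such a five-arm design is simple to sketch in code. The condition labels below follow the study’s description, but the code and participant IDs are hypothetical, not the researchers’ actual procedure.

    # Hypothetical sketch of balanced random assignment to five experimental
    # conditions; participant IDs here are synthetic.
    import random

    CONDITIONS = [
        "control",                                        # no information
        "misinformation_only",
        "consensus_then_misinformation",                  # correct info first
        "inoculation_then_misinformation",
        "consensus_then_inoculation_then_misinformation",
    ]

    def assign(n_participants, seed=0):
        """Shuffle participant IDs, then deal them round-robin into conditions."""
        rng = random.Random(seed)
        ids = list(range(n_participants))
        rng.shuffle(ids)
        return {c: ids[i::len(CONDITIONS)] for i, c in enumerate(CONDITIONS)}

    groups = assign(714)  # the study's sample size
    for condition, members in groups.items():
        print(condition, len(members))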

Can we inoculate people against scientific misinformation? (image: Shutterstock).

 

This inoculation method is only one of many ideas being discussed and studied in the area of correcting scientific and health misinformation. Some object to the term “inoculation” because it creates the impression that people exposed to misinformation are somehow getting sick, and one thing we do know is that ad hominem criticism of people who hold incorrect ideas is sure to backfire. The question is whether the presentation of correct information on the right topic to the right audience can set the record straight.

Scientific and medical misinformation is rampant on the internet and social media, and myths get repeated over and over until people believe them. “The glut of medical misinformation is real and it harms,” Jen Gunter wrote in The Lancet. “It turns people away from vaccines, fluoride, and leads them to useless products.” The list of harms caused by scientific misinformation is growing; Critica is proud to join the chorus of scientists, healthcare professionals, and public health experts who say the time has come for us to take active steps, using evidence-based methods, to fight science denial in the places where it crops up and is seen the most. In coming weeks and months, we will describe in more detail our specific projects to tackle this problem.

 

[1] After many years of quibbling about the use of the pronouns “they” and “their” in sentences in which the subject is a singular noun or pronoun (e.g., he or she), we have decided to adopt new recommendations for the use of gender-neutral pronouns and will try to use only “they” and “their” from now on. But it won’t be easy, and we hope you will be indulgent, as we will undoubtedly frequently forget.

 
