Changing Our Minds Is Hard To Do
So how do we actually do it?
Changing our minds is hard to do. But without the ability to do so, science, and indeed all of society, would be paralyzed.
For example, scientists once thought that the earth was the center of the universe, that the sun and planets revolved around it, and that nothing existed beyond the Milky Way galaxy.
The need to change our minds doesn’t apply solely to scientists. Society in general has had to revise its thinking about the safety of smoking cigarettes, using plastics, and burning fossil fuels.
Recently, following an incident in which two African American men were arrested for doing nothing wrong in one of its coffee shops, Starbucks closed its stores for several hours to deliver mandated diversity training to all of its employees. Few people questioned the need to change attitudes and behaviors among Starbucks’ employees. Despite our fondest wishes, racial prejudice exists in almost everyone, implicitly in some and glaringly out in the open in others. But because even the most subconscious forms of bias can influence how we behave toward people of other races, cultures, and groups, improving how we think and act is essential.
Many questioned, however, whether one-off diversity training like the program Starbucks implemented is actually capable of changing attitudes or behaviors. What does it take to actually change a person’s mind, especially when emotion-laden and deeply held beliefs are involved?
In fact, previous research makes it unlikely that the Starbucks program will make a significant difference in how its employees think or act. As Julia Belluz points out in her article in Vox, the data show that anti-bias training like the program Starbucks presented to its workers doesn’t work. “The evidence we have,” she writes, “suggests trainings generally fail to alter racial biases and behaviors in the long-term – and can even backfire.” A long literature shows that you can teach people to respond correctly to test questions after a brief intervention: ask questions about bias and diversity before and after a training session, and most people’s test scores improve. But studies also show that such sessions have little long-lasting impact, and sometimes, especially when they are mandatory, there is a backlash effect that actually worsens bias.
In general, social science research into what can change a person’s most deep-seated ideas and beliefs has lacked rigor, and few firm conclusions have emerged. Some studies suggest that more extensive diversity programs may work. According to Harvard sociologist Frank Dobbin, “things like mentoring, active recruitment programs, the creation of task forces that deal with diversity—all of these put people side-by-side working on something, and that appears to be much more effective [than diversity training] at promoting workforce diversity.”
Does neuroscience contribute anything to help us understand the neural processes that occur when we confront our errors and contemplate changing our minds? Recently, we noted an intriguing suggestion from University of Pennsylvania professor Sigal Barsade: showing people how our brains work to develop and maintain unconscious bias increases their willingness to learn about and deal with it. The obvious question, then, is this: if explaining how the brain works helps people change their minds, what should we tell them about how the brain works when we change our minds?
Emerging animal and human research is now shedding light on basic neural processes that occur when an organism changes its mind about something. Two recent studies are illustrative of this kind of work.
The report from one such study, by scientists from London, Amsterdam, and Princeton, summarizes what is at stake as follows: “Changing one’s mind on the basis of new evidence is a hallmark of cognitive flexibility. Such reversals are supported computationally by sensitivity to post-decision evidence: if I have made an error and the new evidence is compelling, I should change my mind (p. 621).” To figure out how this works, the investigators manipulated visual sensory data presented to human subjects during functional magnetic resonance imaging (fMRI) of the brain. They found an intriguing sequence of events involving the part of the human brain located behind our eyes, the frontal cortex. As the sensory information presented to the subjects changed and they became aware they were making incorrect choices, an area on the inner surface at the back of the frontal cortex, the posterior medial frontal cortex, tracked the new information and recorded the errors. But the decision about whether to change one’s mind in light of the new information, and what choice to make next, was made in a different brain area, on the outer surface at the front of the frontal cortex: the anterior lateral prefrontal cortex.
These results suggest that while we may be aware of the errors we make, we do not necessarily change our minds about what is right and what is wrong. The two-step process involving two different brain regions means we are capable of sticking to our guns even when we know the data do not support our position. As the study’s authors conclude, “we reveal a neural signature of how new evidence is integrated to support graded changes of mind…Together, our findings shed light on the building blocks of changes of mind in the human brain and indicate possible targets for amelioration of such deficits (p. 623).”
To complicate matters, the second study of neural changes associated with changing one’s mind found two completely different brain areas to be involved. In this study, Northwestern University scientists used olfactory stimuli and showed that neurons in a part of the brainstem called the midbrain fired whenever subjects detected a mismatch between the odor they expected and the one actually presented. These neurons project to a region of the frontal lobe called the orbitofrontal cortex (OFC), where the decision to change expectations appears to be made. Thus this study, although it again showed a two-step, two-region process for detecting errors and encoding their significance, implicated two entirely different areas of the brain.
These and similar studies are important because they show that even under very controlled and non-emotional circumstances, we are perfectly capable of knowing that something we believe is wrong without mustering the motivation or confidence to actually change our attitudes or behaviors. In each experiment, we record errors in one part of the brain but interpret their significance and make decisions based on their meaning in another. The message, then, is that while it is important to show people the error of their ways, that alone will usually be insufficient to induce long-term change.
The two studies also highlight another important point about how science changes: it often does so through a process that only slowly resolves contradictory findings. Which of the two brain pathways shown to be important for changing our minds matters most? Perhaps both are important but are activated in different contexts or by different types of information. Cognitive flexibility also means that before allowing new evidence to change our minds, we may have to entertain competing findings and live with the uncertainty that entails.
In our decision making, our brains turn out to be biased toward maintaining consistency. This of course has its virtues: constant fluctuations in attitudes and behaviors would lead to chaotic lives. But it also makes it hard for us to change in the face of obvious error. Indeed, investigators Long Luu and Alan Stocker of the University of Pennsylvania recently showed that we even alter our memories to fit the decisions we make. “In decision-making generally,” Luu says of their study, “the brain focuses more on remaining self-consistent than on remembering precise details of the past.”
Science operates by continually updating what was previously known, leading to new insights and, we hope, information we can use to improve our lives. It requires the ability to accept new evidence, however jarring it may be, and the flexibility to acknowledge that once-held beliefs can be wrong. We are slowly gaining insight from neuroscience about how our brains respond when we are confronted with evidence that we have made an error. That insight should make us both humble and compassionate in our attempts to change our own and other people’s minds.
Fleming SM, van der Putten EJ, Daw ND: Neural mediators of changes of mind about perceptual decisions. Nature Neuroscience 2018;21:617-624.
Howard JD, Kahnt T: Identity prediction errors in the human midbrain update reward-identity expectations in the orbitofrontal cortex. Nature Communications 2018; doi:10.1038/s41467-018-04055-5.
Luu L, Stocker AA: Post-decision biases reveal a self-consistency principle in perceptual inference. eLife 2018;7:e33334.