Saturday, November 16, 2019

Changing Minds is Hard

originally posted Oct 2018; updated Nov 2019

Why Changing Somebody’s Mind, or Yours, Is Hard to Do. David Ropeik, Psychology Today. July 13, 2010.

There are a lot of psychological terms for the fact that people don't like to change their minds: "motivated reasoning," "confirmation bias," "cognitive dissonance." But you don't need academic semantics to know that trying to get somebody to see things your way is tough if they go into the argument with another point of view. You argue the facts, as thoughtfully and non-confrontationally as you can, but the facts don't seem to get you anywhere. The wall of the other person's opinion doesn't move. They don't seem to WANT it to move. 
What's going on there? Why do people so tenaciously stick to the views they've already formed? Shouldn't a cognitive mind be open to evidence...to the facts...to reason? Well, that's hopeful but naïve, and ignores a vast amount of social science evidence that has shown that facts, by themselves, are meaningless. They are ones and zeroes to your mental computer, raw blank data that only take on meaning when run through the software of your feelings. Melissa Finucane and Paul Slovic and others call this "the Affect Heuristic," the subconscious process of taking information and processing it through our feelings and instincts and life circumstances and experiences...anything that gives the facts valence, or meaning...which turns raw meaningless data into our judgments and views and opinions. 
Okay, but why do we cling to our views so tenaciously after they are formed? Interesting clues come from two areas of study... self-affirmation, and Cultural Cognition. Both areas suggest that we cling to our views because the walls of our opinions are like battlements that keep the good guys inside (us) safe from the enemy without (all those dopes with different opinions than ours). Quite literally, our views and opinions may help protect us, keep us safe, literally help us survive. Small wonder then that we fight so hard to keep those walls strong and tall. 
Self-affirmation conditioning studies find that if, before you start to try to change somebody's mind, you first ask them to remember something that gave them a positive view of themselves, they're more likely to be open to facts and to change their opinions. People who feel good about themselves are more likely to be open-minded! (That's far more simplistic than any academic would ever put it!) One study, in press, was done back in 2008 and asked people about withdrawing troops from Iraq. Most Republicans at the time thought the troops should stay. Two separate groups of Republicans were shown statistics about the dramatic reduction of violence in Iraq following the "surge" in American troops. One group was asked to do a self-affirmation activity (they were asked to remember a time when they felt good about themselves by living up to a moral value they held). The other group was just shown the violence statistics, with no self-affirmation. Then both groups were asked whether the dramatic reduction in violence in Iraq was a reason to withdraw U.S. troops. The Republicans who did the self-affirmation activity, the folks who were primed to feel good about themselves, were more likely to change their minds and say that the reduction in violence in Iraq was a reason to begin pulling out of Iraq. The group that had not done the self-affirmation remained adamant that the troops should stay. 
Cultural Cognition is the theory that we shape our opinions to conform to the views of the groups with which we most strongly identify. That does two things. It creates solidarity in the group, which increases the chances that our group's views will prevail in society (e.g. our party is in power). And it strengthens the group's acceptance of us as members in good standing. (Like the litmus test some conservative Republicans have proposed that candidates must pass, making sure their views conform to conservative doctrine before those candidates get party support.) 
Strengthening the group, helping it win dominance, and having the group accept us, matters. A lot. Humans are social animals. We depend on our groups, our tribes, literally for our survival. When our group's views prevail, and our group accepts us, our survival chances go up. So the Cultural Cognition motivation to conform our opinions to those of the groups/tribes with which we identify is powerful. And it would be consistent with that interpretation that the more threatened we feel, by economic uncertainty, or threats of terrorism, or environmental doom and gloom, the more we circle the wagons of our opinions to keep the tribe together and keep ourselves safe... and the more fierce grow the inflexible "Culture War" polarities that impede compromise and progress. The self-affirmation research seems to support this. It appears that the less threatened we feel, the more flexible our opinions are likely to be. 
So the next time you want to have a truly open-minded conversation on a contentious topic with someone who disagrees with you, don't launch right into the facts. Ask them to tell you about some wonderful thing they did, or success they had, or positive feedback they got for something. And try to remember something like that about yourself. Then you might actually have a conversation, instead of the argument you're headed for instead. 
The psychology of risk perception referred to above is described in detail in David Ropeik's new book, How Risky Is It, Really? Why Our Fears Don't Match the Facts.


What Actually Is a Belief? And Why Is It So Hard to Change? Ralph Lewis, Psychology Today. Oct. 7, 2018.

Beliefs evolved as energy saving shortcuts. Restructuring them is costly.

“For some of our most important beliefs, we have no evidence at all, except that people we love and trust hold these beliefs. Considering how little we know, the confidence we have in our beliefs is preposterous—and it is also essential.”
Daniel Kahneman, 2002 Nobel Laureate 1



Beliefs are a slippery concept. What actually are they? Philosophy has long struggled to define them.2 In this post-truth and ideologically polarized world, we need a better understanding of beliefs. As a psychiatrist, my job frequently involves identifying distorted beliefs, understanding how they formed, and helping people to learn to be more skeptical of their own beliefs. 
Let’s consider a helpful evolutionary framework for making more coherent sense of what beliefs really are, and why mistaken beliefs can sometimes be so hard to change. Then we’ll talk about how to gain a more accurate grasp of reality, and how, ultimately, to advance society. 
Beliefs as energy saving shortcuts in modeling and predicting the environment 
Beliefs are our brain’s way of making sense of and navigating our complex world. They are mental representations of the ways our brains expect things in our environment to behave, and how things should be related to each other—the patterns our brain expects the world to conform to. Beliefs are templates for efficient learning and are often essential for survival. 
The brain is an energy-expensive organ, so it had to evolve energy-conserving efficiencies. As a prediction machine, it must take shortcuts for pattern recognition as it processes the vast amounts of information received from the environment by its sense organ outgrowths. Beliefs allow the brain to distill complex information, enabling it to quickly categorize and evaluate information and to jump to conclusions. For example, beliefs are often concerned with understanding the causes of things: If ‘b’ closely followed ‘a’, then ‘a’ might be assumed to have been the cause of ‘b’. 
These shortcuts to interpreting and predicting our world often involve connecting dots and filling in gaps, making extrapolations and assumptions based on incomplete information and based on similarity to previously recognized patterns. In jumping to conclusions, our brains have a preference for familiar conclusions over unfamiliar ones. Thus, our brains are prone to error, sometimes seeing patterns where there are none. This may or may not be subsequently identified and corrected by error-detection mechanisms. It's a trade-off between efficiency and accuracy.
In its need for economy and efficiency of energy consumption, the default tendency of the brain is to fit new information into its existing framework for understanding the world, rather than repeatedly reconstructing that framework from scratch.

Seeing is believing 
It seems likely that the processes in the brain involved in abstract belief formation evolved from simpler processes involved in interpreting sensory perception.
Since we experience the external world entirely through our senses, we find it hard to accept that these perceptions are sometimes subjectively distorted and that they are not necessarily reliable experiences of objective reality. People tend to trust their physical senses and to believe their perceptions even when they are hallucinating and no matter how bizarre their perceptual distortions. People will layer explanations on top of their perception of reality to explain away contradictions. 
We give our subjective experience too much credence, and so too our beliefs. We will more readily explain away evidence that contradicts our cherished belief by expanding and elaborating that belief with additional layers of distorted explanation, rather than abandoning it or fundamentally restructuring it. 
Homeostasis – maintaining stability 
Primitive nervous systems evolved in simple organisms in part to serve the function of homeostasis—a dynamic physiological state of equilibrium or stability, a steady state of internal conditions. Homeostasis is structured around a natural resistance to change, following the same principle as a thermostat.

The lower, primitive parts of our human brains maintain homeostasis of breathing, heart rate, blood pressure, temperature, energy balance (via appetite) and a variety of endocrine processes. So too, beliefs preserve a kind of cognitive homeostasis—a stable, familiar approach to processing information about our world. 
We should expect that the homeostatic function that defined primitive brains would likely have been preserved as an organizing principle in the evolution of more complex brains. Certainly, complex brains are geared toward reacting, learning and adapting, but just like primitive brain functions, these adaptations are ultimately in the service of maintaining homeostasis in an ever-changing environment. 
Radically restructuring our belief system and creating a new worldview engages parts of the brain involved in higher reasoning processes and computation, and is consequently more effortful, time- and energy-consuming. The brain often cannot afford such an investment. This would explain why, when we experience cognitive dissonance, it is easier to resolve this discomfort by doubling down on our existing belief system—ignoring or explaining away the challenging, contradictory information.
A consistent sense of self, and personal investment in one’s beliefs 
Another important factor accounting for resistance to changing our beliefs is the way our beliefs are so often intertwined with how we define ourselves as people—our self-concept. Indeed, beliefs are associated with a part of the brain integrally involved in self-representation—the ventromedial prefrontal cortex.3 We want to feel that we are consistent, with our behavior aligning with our beliefs. We constantly try to rationalize our own actions and beliefs, and try to preserve a consistent self-image. It’s embarrassing and quite often costly in a variety of ways to admit that we are fundamentally wrong.

In many cases, people have a lot invested personally in their belief system. They may have staked their reputation on a particular belief. Not infrequently, people structure their whole lives around a belief. And this investment may go far beyond a sense of self, extending to large material and financial investments or a life’s career. A change of belief for such a person would obviously involve a monumental upheaval and may entail intolerable personal losses. 
No wonder it’s so hard to change our cherished and entrenched beliefs. 
The social dimension of belief 
A lot of our belief framework is learned at an early age from parents and other adult authority figures. Many human beliefs are the cumulative products of millennia of human culture. Children are strongly predisposed to believe their parents, and, as adults, we are inclined to believe authorities. 
It's not surprising that our brains have evolved to more readily believe things told to us than to be skeptical. This makes evolutionary sense as a strategy for efficient learning from parents, and as a social, tribal species it promotes group cohesion. 
People can be swayed by persuasive individuals or compelling ideas to override and reject their previously received authority. Sometimes this is rational. But sometimes it is not: people are susceptible to influence by charismatic ideologues and by social movements, especially when these offer new attachments and new self-identities imbued with more powerful affiliation, validation, esteem, and sense of purpose than the individual previously had in their life. 
Science and the excitement of proving ourselves wrong 
Science values the changing of minds through disproving previously held beliefs and challenging received authority with new evidence. This is in sharp contrast to faith (not just religious faith). Faith is far more natural and intuitive to the human brain than is science. Science requires training. It is a disciplined method that tries to systematically overcome or bypass our intuitions and cognitive biases and follow the evidence regardless of our prior beliefs, expectations, preferences or personal investment. 
The increasing application of the scientific method in the last four centuries ushered in unprecedented, accelerating progress in humanity’s quest to understand the nature of reality and vast improvements in quality of life. Discovering just how mistaken we collectively were about so many things has been the key to sensational societal progress.4 
Imagine if each of us as individuals could cultivate a scientific attitude of rigorous critical thinking and curiosity in our personal lives, and could experience an exhilarated feeling of discovery whenever we find we have been wrong about something important. Perhaps it’s time to stop talking admiringly about faith and belief as if these were virtues.
Faith is based on belief without evidence, whereas science is based on evidence without belief.

References

1. Daniel Kahneman, Thinking, Fast and Slow, New York: Farrar, Straus and Giroux, 2011, p. 209.

2. See for example Schwitzgebel, Eric, "Belief", The Stanford Encyclopedia of Philosophy (Summer 2015 Edition), Edward N. Zalta (ed.), https://plato.stanford.edu/archives/sum2015/entries/belief/

3. Harris, S., et al., The neural correlates of religious and nonreligious belief. PLoS One, 2009. 4(10): p. e0007272; Harris, S., S.A. Sheth, and M.S. Cohen, Functional neuroimaging of belief, disbelief, and uncertainty. Ann Neurol, 2008. 63(2): p. 141-7. The ventromedial prefrontal cortex is also involved in emotional associations, reward, and goal-driven behavior.

4. Democracy also loosely employs the scientific method of conjecture and criticism. Each election platform is a hypothesis, each elected government an experiment, subjected to the peer review process of a free press and the next election. The combination of science and democracy has been the key to human progress. To be sure, this progress has not been smooth or without calamitous derailments in modern history. But the overall trend over time has been definitively and spectacularly positive, and it is indisputably the most successful system humans have invented to date.


Reason Won’t Save Us. Robert Burton, Nautilus. Oct. 17, 2019.
It’s time to accept the limits of how we think.
In wondering what can be done to steer civilization away from the abyss, I confess to being increasingly puzzled by the central enigma of contemporary cognitive psychology: To what degree are we consciously capable of changing our minds?
... 
aliens would be equally befuddled by watching political debates on climate change or universal healthcare. They would observe humans ignoring data that strongly warn of impending catastrophic consequences for their species, apparently preferring and even enjoying conflict, anger, self-righteous indignation, and a wide variety of self-defeating behaviors. They would quickly conclude what most of us also suspect but often fail to acknowledge: Though our genes follow the laws of natural selection to optimize survival of the species, as individuals we are not necessarily similarly inclined.
... 
There is no compelling evidence to suggest that public debate on virtually any subject can ever be resolved through reason. We migrate toward what we feel is best.
... 
If this argument sounds harsh or offensive, so is watching the present-day failure of discourse between those with differing points of view while persisting in the unrealistic hope that we could do better if we tried harder, thought more deeply, had better educations, and could overcome innate and acquired biases. 
If we are to address gathering existential threats, we need to begin the arduous multigenerational task of acknowledging that we are decision-making organisms rather than uniquely self-conscious and willfully rational. Just as we are slowly stripping away pop psychology to better understand the biological roots of mental illnesses such as schizophrenia, stepping back from assigning blame and pride to conscious reasoning might allow us a self-image that reunites us with the rest of the natural world as opposed to declaring ourselves as unique. Only if we can see that our thoughts are the product of myriad factors beyond our conscious control, can we hope to figure out how to develop the necessary subliminal skills to successfully address the world’s most urgent problems.

