(How) Do Minds Change? (Pt. 1)
What are you asking here, Inam?
So, coming off of my previous masterpiece, I thought I'd write a bit about political psychology: a really interesting subject with, in my opinion, profound lessons for daily life, but not much approachable literature for laypeople. I first came across the subject in a highly recommended talk by Dan Saks about convincing C programmers to transition to C++. During my research I discovered that, in the years since he gave the talk, researchers have failed to replicate some of the concepts he covers, but the core tenet of the talk still holds, that tenet being:
Dan explores why that is in his talk, and I will touch on some of it here, but I think that tenet is one of political psychology's most impactful takeaways for day-to-day life.
Either way, I'm going to explore the question of how minds change here; expect a second part in the future on who is most and least susceptible to the implicit biases I talk about.
Motivated reasoning
The study of "hot political cognition" is also known as the study of motivated reasoning and cognition. There is quite a bit of evidence that changing minds is hard, regardless of whether it's your own mind or someone else's. The following few sections will explore those heuristics and biases. If you're interested in cold political cognition (that is, the ways in which nonmotivated information-processing mechanisms and constraints affect political judgement), [1] brings up multiple good literature reviews on page 4.
Quantity and quality of processing
There is good evidence that people will avoid accepting preference-inconsistent conclusions as much as possible, in multiple ways. Firstly, if they come across information that challenges their initial viewpoint, they will demand more information ("engage in more persistent information processing") before giving up their original point of view. A popular account is the "seize and freeze" theory, which states that people persist in information processing until a satisfactory conclusion is reached, then "freeze" (stop searching for new information) and "strive to avoid belief change". This theory easily explains why, for example, cigarette companies could run such successful misinformation campaigns simply by suggesting cigarettes "might or might not be" harmful: that's exactly what smokers wanted to hear.[1]
But "quantity of processing" biases are not the only ones at play, "quality of processing" is also hampered by ideology and motivated reasoning. Research shows that in addition to biased information searching, "inconsistent information is often viewed as less valid and relevant than preference-consistent information."[1] Even memory can be compromised by motivated reasoning, leading to preference-inconsistent information being discarded or morphed in memory.
So we can see that there are a number of biases at play in the information acquisition and interpretation stages of political activity, but there are less obvious ones as well. For example...
Self-esteem
Another really interesting avenue of motivated reasoning is self-preservation and the "stability of the self-concept"[1]. Low self-esteem has been implicated in:
- racial stereotyping and prejudice
- role conformity in decision making among judges
- political conservatism
- passive presidential character
- low political efficacy and trust among black children
- decreased levels of political awareness and increased levels of political cynicism among adolescents
Low self-esteem has also been seen to heighten intolerance among groups. Interestingly, self-affirmation before exposure to information can make people more open to new ideas.
...participants assigned to self-affirmation conditions either (1) wrote about a personally important value or (2) were given positive feedback concerning an important skill. As hypothesized, temporarily bolstering individuals’ feelings of self-esteem in either of these ways enabled them to respond more open-mindedly to the attitude-discrepant report on capital punishment. That is, they were less critical of the evidence, less likely to suspect bias on the part of the study’s authors, and more likely to change their attitudes in the direction of the report’s conclusion.
Beyond just low self-esteem, there is some evidence suggesting that maintaining your beliefs (belief perseverance) and avoiding or ignoring disagreeable information can improve mental health and stability. However, biased information processing is detrimental to society as a whole. It is possible we need to reassess how we judge ourselves, and to offer safe spaces and self-esteem affirmations that open up avenues for changing our beliefs to line up with new information.
Myths: the backfire effect and self-interest
The backfire effect
The backfire effect comes in two forms. In popular use, "backfire effect" often describes two people in an argument or debate each walking away believing their own position more strongly than when they started. This is not generally how the backfire effect is discussed in the political psychology I read, and I didn't go out of my way to look into this particular interpretation, so I can't say for certain how real it is, though from my research I suspect it isn't.
The version of the backfire effect I want to define here is this one:
Particularly among conservatives, attempts to correct misperceptions activated a “backfire effect” against empirical facts, with subjects more strongly expressing a non-factual belief.[4]
This has been shown to be elusive and hard to replicate, as seen in [4]:
The present paper presents results from five experiments in which we enrolled more than 10,100 subjects and tested 52 issues of potential backfire. Across all experiments, we found no corrections capable of triggering backfire, despite testing precisely the kinds of polarized issues where backfire should be expected. Evidence of factual backfire is far more tenuous than prior research suggests.
As well as a tweet from John Cook:
There's been much research into the backfire effect since we published the Debunking Handbook in 2011. Short answer: it's elusive. Researchers have struggled to replicate it. We should be more concerned about leaving misinformation unchallenged than potential backfire effects.
So, in conclusion: the backfire effect is elusive, and while it may be at play in specific circumstances, it is by and large not something to be worried about.
Self-interest
One of the most fascinating outcomes of my research is that it is almost indisputable that voters do not vote in their economic self-interest. For example, from [1]:
However, the empirical evidence as a whole reveals that rational considerations such as economic self-interest play a fairly minor role in shaping evaluations of issues and candidates, unless the stakes are large and unambiguous (Green & Shapiro, 1994; Sears & Funk, 1991). Most citizens do not seem to be “pocketbook” voters (Kinder & Kiewiet, 1981). Kinder and Sears (1985) concluded that “neither losing a job, nor deteriorating family financial conditions, nor pessimism about the family’s economic future has much to do with support for policies designed to alleviate personal economic distress” (p. 671). Poor people are seldom more likely and sometimes even less likely than members of the middle class to support liberal or leftist economic policies that would encourage the redistribution of wealth (e.g., Hochschild, 1981; Jost, Pelham, Sheldon, & Sullivan, 2003; Kluegel & Smith, 1986; Lane, 1962).
And Jonathan Haidt tackles this problem as well in [6]:
With these revisions, Moral Foundations Theory can now explain one of the great puzzles that has preoccupied Democrats in recent years: Why do rural and working-class Americans generally vote Republican when it is the Democratic Party that wants to redistribute money more evenly? Democrats often say that Republicans have duped these people into voting against their economic self-interest. (That was the thesis of the popular 2004 book What’s the Matter with Kansas?.) But from the perspective of Moral Foundations Theory, rural and working-class voters were in fact voting for their moral interests. Their morality is not just about harm, rights, and justice, and they don’t want their nation to devote itself primarily to the care of victims and the pursuit of social justice. Until Democrats understand the Durkheimian vision of society and the difference between a six-foundation morality and a three-foundation morality, they will not understand what makes people vote Republican.
This quote gives a glimpse into Haidt's Moral Foundations Theory. I won't tackle it here, but I might in the future; if you're interested, I highly recommend checking out [6] and the rest of his work, it's very approachable.
The elephant and the rider
Jonathan Haidt has in part popularized another interesting concept, the rider and the elephant, which correspond to Kahneman's System 2 and System 1 respectively, if you're familiar with those. It is very hard to beat Haidt's writing here, so I will quote him at length:
A person's mind is divided into parts that sometimes conflict, like a small rider (controlled processing, including reasoning) sitting on top of a very large elephant (automatic processing, including all of our “gut feelings” and intuitions). Each of us may think that our “rider” is in charge; we think that we come to our views by carefully weighing the evidence on all sides. Yet when we argue with others, it often seems clear to us that their “elephant” is in charge. Others seem to be emotionally committed to a position, working hard to generate post-hoc reasons to justify that position. Of course, those people think the same about us.
The key to understanding politics, partisanship, and voting is to understand the elephant. It is very hard to change someone’s mind by hitting them with arguments, logic, and data, if their elephant doesn’t like you or what you stand for. The key to persuasion is to speak to the elephant first. Great politicians, like great salespeople, understand this. Sometimes they use tricks or emotionally powerful falsehoods; this is sleazy and manipulative. But if you look closely at the greatest and most persuasive speeches in history, they all speak directly to the elephant, while also making good and fair arguments. To be socially or politically influential, you must understand and acknowledge people’s values.[6]
Conclusion
Well, this one was a bit all over the place, with only loose connecting threads, but I hope there were some takeaways of value in there. I tried to keep it vaguely brief, but the subject matter is so vast and interconnected that it's hard to cull things out. I intentionally cut most of the content about how to counteract the biases that stop us from updating our beliefs to align with new information, as I will put that in a future post (pt. 2). When I finish that post I'll come back and edit this one to link to it. There might be a post or two between this one and that one, we'll see. In any case, thanks for reading 🙂.
Works Cited
Once again, if you're having trouble getting access to any of these send me an email and I can send you the files.
1. “Hot” Political Cognition: Its Self-, Group-, and System-Serving Purposes - John T. Jost, Erin P. Hennes, and Howard Lavine
2. RationalWiki - Backfire effect
3. Are Smart People Ruining Democracy? | Dan Kahan | TEDxVienna
4. The Elusive Backfire Effect: Mass Attitudes’ Steadfast Factual Adherence - Thomas Wood and Ethan Porter
5. John Cook on the backfire effect and misinformation
6. “Why Do They Vote That Way” - Jonathan Haidt