Justin McBrayer

When You Won't Change Your Mind

Humans are tribal thinkers. Our minds aren't wired to get to the truth, the whole truth, and nothing but the truth. That's not to say that we can't ever get to the bottom of things. In fact, we've figured out an awful lot about ourselves and the world we live in. But we don't think about the world perfectly: we have biases, blind spots, and other blights that get in the way.


One of our most pernicious tendencies is to wall off evidence, arguments, and ideas that challenge our identities. Humans innately think in terms of groups. It's us versus them: Dems versus Republicans, gay versus straight, religious versus secular, Black versus White. If I were to ask you to tell me about yourself, the first thing you're likely to do is tell me which groups you belong to. Group membership matters to all of us.


Psychologists studying this phenomenon developed social identity theory. The theory explains why we think in terms of groups, how those groups get established, and when thinking in terms of groups gets us into trouble. It's well supported by surprising and troubling experiments.


One of the classic studies asked subjects to estimate the number of dots on a screen. The subjects were then sorted into two groups: those who overestimated the number of dots and those who underestimated it. In reality, the subjects were randomly assigned to one of the two groups. Next, the members of each group had to decide how to split up a small monetary award. With no prompting at all, most subjects gave more money to members of their own group than to members of the other group. In other words, most subjects were willing to discriminate against the other group even over something as trivial as guessing dots.


I learned about a real-life version of this experiment when I was on the job market. I had an on-campus interview at a small school that had traditionally sorted its students by the year they matriculated: there were evens and odds. At a school that small, every student knew whether a fellow student was an even or odd. Students would introduce themselves as evens or odds. Instantly, the entire student body was split into an ingroup and outgroup. The administration, I learned, eventually did away with the tradition because student hazing and inter-group harassment had gotten so bad.


The punchline is that we naturally think in terms of groups, and the groups need not be based on anything real or important. We are tribal thinkers. The problem is that tribal thinking isn't good thinking. Tribal thinking is about loyalty. Good thinking is about truth. The two don't always overlap. And when we think about controversial, identity-impacting things like ethics, politics, and religion, it's easy for us to lapse into tribal thinking as opposed to good thinking.


Of course, many of us know the risks of tribal thinking, and most of us want to avoid it. We want to fairly consider the arguments of others, calibrate our beliefs to the evidence, and stock up on the best ideas in the marketplace regardless of whether they belong to our tribe or another. But how can we know when we're thinking tribally?


There are lots of ways, but here's one that has been salient in many of my recent discussions: your reluctance to change your mind. Let me explain.


Suppose I ask you how much money you'll take to sell one of your children into slavery. You'd be horrified. The answer, of course, is that there isn't enough money in the world. Your children aren't for sale. You'd give the same answer for a wide range of propositions. How much money would it take for you to cheat on your spouse, burn the flag, or desecrate your church pulpit?


Your reluctance to make a deal on these sorts of issues is a mark of loyalty. You're one of us, and you wouldn't sell us out for all of the gold in Texas. It's just beyond the pale to even consider any of these things. They are acts of betrayal.


And just as actions can betray your tribe, so, too, can beliefs. Your tribe believes that human pollution drives climate change. Or that Trump won the 2020 election. Or that God exists. Or that contemporary racism explains all of the empirical differences between White and Black Americans. If that's a mark of your tribe, then you better not even consider whether it's mistaken. That's an act of disloyalty.


Now I'm not saying you should consider offers for selling your children into slavery. But I am saying that a refusal to change your mind, or a preemptive rejection of a deal, is a sign that you're thinking like a loyalist rather than an evidentialist.


Start with actions. You do what you do because it furthers some goal you have. If you want a beer and believe that there's one in the refrigerator, that will explain why you act as you do. If I tell you that there's no more beer in the fridge, that should make you less likely to act in the same way. If it doesn't, something's wrong.


Here's the connection to tribal thinking. A friend of mine recently said that she was committed to voting for the Democratic nominee in 2024 regardless of who it was. Maybe she didn't really think this. But if she did, it's a mark of tribal thinking. "If you're really one of us," says the progressive, "you can't even consider the possibility of voting for the other guy." Track record, character, and evidence be damned. Committing yourself to a course of action regardless of what new information turns up is a mark of loyalty. It's not a mark of good thinking.


A similar story can be told for beliefs. We believe what we do because we think it's true; no one believes something they take to be false. So belief is tied to truth. And just as your course of action should shift with changes in information, so, too, should your beliefs. If they don't, then something's wrong.

For example, a friend of mine recently said that there was no amount of evidence in the world that would be enough for him to get the COVID vaccine. That was beyond the pale. There were no studies I could show him, no scientific explanation, and no expert testimony that was going to change his mind.

That level of confidence can't be justified by an appeal to evidence. Our evidence almost always falls short of certainty. I'm confident that Denver is the capital of Colorado, but I'm not certain. (I've been wrong about that sort of thing in the past.) And this guy didn't have enough evidence to be certain that the COVID vaccine was dangerous or that all other evidence was misleading. Not even close!


Humans just don't see the world clear as day. Instead, we are left, as John Locke puts it, with the twilight of probability. Given that, we should always be ready to consider the prospect that we're wrong and weigh new evidence carefully. The truth may be black-and-white, but our view of it is always a shade of gray.


So, the "I'll believe this no matter what" attitude is not explained in terms of the evidence. Instead, it's best explained by loyalty. When there's no amount of evidence that would change your mind about something, you're not thinking about it in terms of what's likely to be true. You're thinking like someone who wants to stay in a group so badly that you're willing to believe what's false to do so.


The punchline is that good thinking requires us to calibrate our beliefs to reasons. There's nothing wrong with flip-flopping, despite the grief politicians get for it. Indeed, we should flip-flop when we get good reasons to think that our worldview is inaccurate. When we dig in our heels and declare that no amount of evidence could shift us from our positions, that's a sure sign of tribal thinking.
