Worldview Undermines Reason
Caption: Cartoon of ostrich sticking head in the sand (Dawn Hudson/Public Domain Pictures)

By Joel Lehman

_____________________

“And so castles made of sand, fall in the sea, eventually.”
-Jimi Hendrix

Can you recall a romantic relationship gone wrong, where in hindsight the signs of the impending breakup were clear, even though you missed them during the actual unraveling? Maybe there’s a political belief you once embraced that you’ve now let go. Perhaps you thought Ayn Rand was the most important thinker ever, or that Communism was the one true way. Looking back, though, you wonder how you ever ignored all the opposing evidence.

How did it feel before you let those old beliefs go? And how did it feel after you replaced them with brand new ones? In both cases, you probably felt confident that you were right.

It’s natural as we grow and mature, especially as we go from child to adult, to shift our core beliefs. But it’s revealing how confident we are, even a few months before such changes. We’re caught off-guard by changes in ourselves: scientific evidence suggests that we readily recognize how much we’ve changed over the past decade, yet remain overly sure that we’ll stay much the same over the next one, whether we’re 18 or 68. This effect, called the “End of History Illusion,” helps explain why we may feel that we have rock-solid answers, entirely sure that those who disagree are foolishly mistaken, even though next year our answers may well have shifted drastically.

Looking back over these shifts in beliefs, we can begin to see how natural and easy it is to confuse the feeling of certainty for a solid foundation truly worthy of certainty. It’s such a universal tendency that it bears repeating: feeling certain that some belief is true is not a reliable signal that the belief actually is true. We’re bad at separating our desires for how the world should work from what the factual evidence suggests about how it does. No matter how certain I may feel that gravity is an illusion, things can only end badly for me if I jump off the edge of a cliff. The feeling alone is not enough; only with accumulating evidence can we have a dependable basis for confidence.

While it might seem like a person has the right to hold deluded beliefs, and that such overconfidence doesn’t really harm anyone, in reality it’s a serious problem when a large part of the country holds broken beliefs. Beliefs have real power in a democracy: what people believe influences who is elected, what laws and policies are instituted, and even what wars are fought. Real lives hang in the balance. Political beliefs matter; unfortunately, thinking clearly about politics is a huge challenge.

Our political affiliation usually reflects some of our most deep-seated assumptions about how the world works. If we’re not careful, it can pre-commit us to believe particular facts, no matter where the objective evidence really lies, preventing us from updating our beliefs. We can intentionally use the strategy of precommitment to manage our behaviors, such as when managing our weight, but an unaware, autopilot precommitment to certain beliefs easily undermines an accurate evaluation of reality, in politics as in other areas of life.

Precommitment and Climate Change

Let’s take climate change as a case study. Climate change is a controversial topic in America: according to Pew Research, less than half of US adults think that the Earth is warming due to human causes. Now, as a disclaimer: because I’d alienate a large chunk of potential readers by expressing what side I land on, I’m not going to weigh in on the debate directly. This article isn’t about convincing anyone of any particular political position. The aim is to explain why it’s important to have the guts to really examine the foundations of your political beliefs.

An intriguing aspect of Americans’ opinions about climate change is that the deep disagreement centers on a purely scientific question: is human activity warming the Earth in any significant manner? Climate change itself is not some thorny, complicated question about morality or how best to organize society. It centers instead on concrete facts about the world. Is the Earth warming? If it is, is that warming caused to a significant extent by human activity? These are well-defined questions, with right and wrong answers, that science can be applied to answer.

So one might expect belief in climate change to be predicted by how much scientific knowledge a person has. That is, if there’s a right answer to a scientific question, you’d think that those with more scientific knowledge would tend to get it right more often. That’s usually how things work: I have more confidence that an engineer could design an airplane than that a medical doctor could, and more confidence that a medical doctor could set a bone fracture than that an engineer could.

But it turns out that what predicts a person’s belief in climate change, more than how scientifically educated they are, is their political affiliation. Let that sink in: here we have a factual question, one that has been studied intensely by scientists, yet the population is nearly evenly divided over how it is best answered, and where people land has little to do with how much they know about science.

So, no matter what you believe about climate change, it seems clear that something has gone wrong. It seems that one side or the other has let their political beliefs distort how they perceive reality. One hypothesis would be that the narratives of liberals and conservatives predispose them to one answer or the other. There’s evidence that people choose to favor different political parties because of deep-seated differences in what they consider to be morally important and how they believe the world best functions. It’s not that conservatives are smarter than liberals, or vice-versa. It’s that they fundamentally care about different things and have different assumptions about the world.

For example, according to the moral foundations theory of Jonathan Haidt, conservatives tend to place more importance on values like authority, sanctity, and loyalty, while liberals tend to accentuate care and fairness. More relevant to climate change, liberals and conservatives also tend to have different views about nature. According to cultural theory, liberals are more likely to believe that nature is delicate and needs to be protected, while conservatives are more likely to view nature as resilient and not substantially impacted by human influence.

 

Convenience vs. Truth

From this view, it’s just plain convenient for both liberals and conservatives to lean the way they do on climate change, irrespective of the evidence. It’s fine to have different assumptions about how things are, but any hunch should bend in the face of evidence. In fact, we should celebrate updating our beliefs to match reality. Whether the Earth is in fact impacted by human greenhouse gas emissions is not something an assumption can settle. Reality is what it is, and it is genuinely harmful for our society when people rely on assumptions about politically relevant issues despite copious amounts of credible evidence.

 

Caption: Meme saying “It would be very convenient if the things that are most comfortable to believe are also the ones that happen to be the most true” (Image created for Intentional Insights by Isabelle Phung)

 

Perhaps you’ve seen this kind of stubbornness when talking with others about climate change. Most of us have no deep understanding of climate science. While we may argue confidently about whether humans are responsible for warming the Earth, when pressed on details, the discussion often devolves into parroting the line of our political tribe.

Importantly, I don’t want to pretend that both sides are necessarily on even ground: “Well, climate change better fits the worldview of Democrats, and climate change skepticism better fits the worldview of Republicans, so I guess we’ll have to leave it at that; either side could be right.” The lesson is not that we can never understand reality, because over time science does allow us to understand it better. Indeed, there may currently be much more evidence for one side than the other. But for anyone in the grips of an overconfident belief, being on the wrong side of the evidence certainly won’t feel that way. There’s nearly always some kind of convenient escape hatch, some shred of evidence you hold onto that supports your cherished belief, all the while fully discounting the evidence of your “opponents.”

To figure out the weight of evidence on climate change or any other issue, our best bet is to rely on scientific evidence. Because science is the best method we’ve found for getting closer to understanding reality and for better predicting the outcomes of our actions, it’s at our own risk that we hold a strong belief that opposes scientific consensus, “the collective judgment, position, and opinion of the community of scientists in a particular field of study.” One can recognize scientific consensus through position statements from prestigious scientific organizations, or through the results of “meta-analysis” studies (studies that summarize and analyze a wide collection of other studies). In theory, it’s easy to rely on scientific evidence, but in practice it’s hard, because we often cherry-pick what we consider as scientific evidence to support beliefs we already find comfortable to hold.

One way our psychology leads us off the rails here is the Dunning-Kruger effect: when we have little knowledge of a topic, we often reach wrong conclusions while remaining overconfident that we understand what’s going on. If we’re already very sure we understand something, we’re likely to find a piece of evidence credible only if it agrees with our existing thinking. It helps to keep in mind the old adage, “A little learning is a dangerous thing,” and override our strong gut instinct to distrust scientific evidence when we have only a very basic understanding of the topic.

So, for anyone who aspires to become less wrong, the important lesson is to be vigilant when we notice that we’ve become confident about a controversial belief that is both convenient and comfortable for us: we’d be likely to believe it anyway, so there’s ample room for self-delusion. As the physicist Richard Feynman said, “The first principle is that you must not fool yourself, and you are the easiest person to fool.” The challenge is to be self-aware enough to realize when we’re possibly deluding ourselves. Only then can we hope to dig ourselves out of runaway overconfidence.

 

Here are some questions to consider as you strive to evaluate reality accurately and avoid political precommitments:

  • What controversial beliefs of yours are convenient to your politics, and what evidence do those beliefs rest upon?
  • Do you notice a strong emotional response arising when you question a political belief?
  • How has this post influenced your thinking about political beliefs and your own possible political precommitments?

P.S. Want to evaluate reality accurately and avoid political precommitments yourself and have others do so as well? Take the Pro-Truth Pledge, encourage your friends to do so, and call on your elected representatives to take it!
