Kanisha grew up in a Democratic household in Memphis, Tennessee. As far back as she can remember, her family and friends always supported leftist candidates. She watched liberal-leaning television programs. She read leftist newspapers. Her Facebook friends posted overwhelmingly liberal-friendly news articles, and Facebook’s news feed algorithm edited out the articles posted by her few conservative friends. Google and other search engines likewise fed her left-leaning information. Kanisha lives in what is known as a filter bubble, in which she rarely sees information at odds with her views.
So what’s your guess on how she votes?
Even when Kanisha learns about evidence for perspectives other than her own, she generally does not give due weight to that information. For instance, when her teacher offered a balanced perspective on the pros and cons of using religion to guide public policy, Kanisha decided to Google the phrase “Why is using religion to guide public policy the right thing to do?”
Do you think the articles that came up helped her gain the most accurate perspective on this politically sensitive issue? By phrasing her Google search that way, Kanisha did not give due consideration to other perspectives. This is characteristic of Kanisha’s behavior: when she hears something that makes her question her beliefs, she looks for ways to protect them, as opposed to searching for the truth.
Confirming Our Biases
Now, I don’t mean to pick on Kanisha. This technology-enabled filter bubble is a product of the personalization of the web, and it affects many of us. The filter bubble combines with another novel aspect of the Internet: how easily new media sources can capture our attention, even though websites, bloggers, and other online outlets tend to hold lower standards of neutrality and professionalism than traditional news sources. Together, these forces are key contributors to the polarization of political discourse we have seen in recent years.
I have to acknowledge that I sometimes fall for the filter bubble effect myself. However, I fight it with my knowledge of cognitive biases (thinking errors made by our brains on autopilot) and strategies for dealing with them.
When Kanisha, I, or anyone else ignores information that doesn’t fit with our prior beliefs, we are exhibiting a thinking error called confirmation bias. Our brains tend to ignore or forget evidence that runs counter to our current perspective, and will even twist ambiguous data to confirm our existing beliefs.
The stronger we feel about an issue, the stronger this tendency becomes. At the extreme, confirmation bias turns into wishful thinking, in which our beliefs stem from what we want to be true instead of what the evidence shows. Confirmation bias is a big part of the polarization of opinion, in politics and in other areas of life.
Be A Proud Flip-Flopper!
So how do you deal with confirmation bias and other thinking errors? One excellent strategy is to focus on updating your beliefs. This concept has helped me and many others who attended Intentional Insights workshops to deal with thinking errors. To employ this strategy, it helps to practice mentally associating positive emotions such as pride and excitement with the decision to change our minds and update our beliefs based on new evidence.
Imagine how great it would be if Kanisha and everybody else associated positive emotions with changing their minds about political issues and felt proud of doing so. Politics would be so much better if everyone updated their beliefs based on new information. Right now, politicians who change their minds are criticized with the harsh term flip-flopping. How wonderful would it be if not only the citizenry but also our politicians flip-flopped to follow wherever the evidence pointed? We should all be proud flip-floppers!
Protecting Yourself From False Beliefs
Being proud of changing our minds is not intuitive, because the emotional part of the brain finds changing our minds uncomfortable and often persuades us to reject information that would otherwise lead us to rethink our opinions. However, we can use the rational part of our mind to train the emotional part to notice confusion, re-evaluate cached thinking patterns and other shortcuts, revise our mental maps, and update our beliefs.
In addition to associating positive emotions with changing your mind, you can use these habits to develop more accurate beliefs:
1) Deliberately seek out evidence that contradicts your opinion on a topic, and praise yourself after giving that evidence fair consideration.
2) Consider the strongest possible form of the arguments against your position, and be open to changing your mind if those arguments prove better than yours.
3) Focus on updating your beliefs on controversial and emotionally charged topics, as these are the hardest for the human mind to handle well.
It’s especially beneficial to practice changing your mind frequently. Recent research suggests that people who update their beliefs more often end up with substantially more accurate beliefs. So make a systematic habit of asking yourself whether new evidence should change your mind.
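For readers who want to see what "updating" can look like in quantitative terms, the standard formalization is Bayes’ rule; this framing and the numbers below are an illustrative sketch, not part of the original argument:

$$P(H \mid E) = \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}$$

Here H is your belief and E is the new evidence. Suppose you are 80% confident in a belief, so P(H) = 0.8, and you encounter evidence that is twice as likely to appear if the belief is false: P(E | H) = 0.3 versus P(E | ¬H) = 0.6. Then P(H | E) = 0.24 / (0.24 + 0.12) ≈ 0.67, so your confidence should drop from 80% to about 67%. Updating does not mean flipping to the opposite view; it means shifting your confidence in proportion to the strength of the evidence.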
Taking all of these steps and feeling good about them will help you evaluate reality accurately and thus gain agency to achieve your life goals.
Questions for Consideration
- When, if ever, have confirmation bias and associated thinking errors steered you wrong? What consequences resulted from these thinking errors?
- How can you apply the concept of updating beliefs to improve your thinking?
- What other strategies have you found to help you change your mind and gain a clearer evaluation of reality?
- How do you think reading this post has influenced your thinking about evaluating reality? What specific steps do you plan to take as a result of reading this post to shift your thinking and behavior patterns?
—
Dr. Gleb Tsipursky is the author of the Amazon bestseller The Truth-Seeker’s Handbook: A Science-Based Guide. He is an Assistant Professor at The Ohio State University, President of the nonprofit Intentional Insights, and co-founder of the Pro-Truth Pledge.