Caption: Woman looking at homeopathic medicine (Wikimedia Commons)

At least 10 US children died and over 400 were sickened after taking homeopathic teething medicine laced with a poisonous herb called “deadly nightshade.” Carried by CVS, Walgreens, and other major American pharmacies, the pills contained this poison in keeping with the core principle of homeopathy: treating medical conditions with tiny doses of natural substances that, in larger amounts, produce the symptoms of the disease.

These children did not have to die. Numerous research studies show that homeopathy does not work. Despite this evidence, homeopathy is a quickly growing multibillion-dollar business, one that takes advantage of people’s distrust of science and the lack of government regulation of “alternative medicine.”

These deaths are among many terrible consequences of the crisis of trust suffered by our institutions in recent years. While headlines focus on declining trust in the media and the government, science and academia are not immune to this crisis of confidence, and the results can be deadly.

Consider that in 2006, 41% of respondents in a nationwide poll expressed “a lot of confidence” in higher education. Less than 10 years later, in 2014, only 14% of those surveyed showed “a great deal of confidence” in academia.

What about science as distinct from academia? Polling shows that the number of people who believe science has “made life more difficult” increased by 50% from 2009 to 2015. In a 2017 survey, only 35% of respondents expressed “a lot” of trust in scientists, while the number of people who do “not at all” trust scientists rose by over 50% compared to a similar poll conducted in December 2013.

This crumbling of trust in science and academia forms part of a broader pattern, what Tom Nichols called The Death of Expertise in his 2017 book. Growing numbers of people claim their personal opinions hold equal weight to the opinions of experts.

Children dying from deadly nightshade in homeopathic medicine is only one consequence of this crisis of trust. For another example, consider the false claim that vaccines cause autism. This belief has spread widely across the US and leads to a host of problems. For instance, measles was practically eliminated in the US by 2000, yet in recent years measles outbreaks have been on the rise, driven by parents in a number of communities failing to vaccinate their children.

Should We Actually Trust Scientific Experts?

While we can all agree that we do not want children to suffer, what is the underlying basis for trusting the opinions of experts – including scientists – more than those of the average person when evaluating the truth of reality?

The term “expert” refers to someone with extensive familiarity with a specific area, as shown by commonly recognized credentials: a certification, an academic degree, a published book, years of experience in a field, or another marker that a reasonable person would accept as identifying an “expert.” Experts are able to draw on this substantial body of knowledge and experience to provide an opinion, often expressed as “expert analysis.”

That doesn’t mean an expert opinion will always be right: it’s simply much more likely to be right than the opinion of a non-expert. The underlying principle here is probabilistic thinking, our ability to predict the truth of current and future reality based on limited information. Thus, a scientist studying autism would be much more likely to accurately predict the consequences of vaccination than someone who has spent 10 hours Googling “vaccines and autism” online.
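
To make this concrete, here is a minimal sketch of probabilistic thinking as Bayesian updating. The numbers are purely hypothetical, chosen only to illustrate the structure of the reasoning: the more reliable the source, the more a single claim from it should shift our belief.

```python
def update_belief(prior, p_claim_if_true, p_claim_if_false):
    """Bayes' rule: how much should hearing a claim asserted shift our belief in it?"""
    numerator = p_claim_if_true * prior
    return numerator / (numerator + p_claim_if_false * (1 - prior))

prior = 0.5  # we start out undecided about the claim

# Hypothetical reliabilities: an expert asserts true claims far more often
# than false ones, while a casual Googler is barely better than a coin flip.
after_expert = update_belief(prior, p_claim_if_true=0.9, p_claim_if_false=0.2)
after_googler = update_belief(prior, p_claim_if_true=0.55, p_claim_if_false=0.45)

print(f"Belief after expert assertion:  {after_expert:.2f}")   # ~0.82
print(f"Belief after Googler assertion: {after_googler:.2f}")  # ~0.55
```

The point is not the invented numbers but the shape of the argument: an expert’s track record makes their assertion much stronger evidence than a layperson’s.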

This greater likelihood of experts being correct does not mean we should always defer to them. First, research shows that experts evaluate reality best in environments that are relatively stable over time, and thus predictable, and when they have had the chance to learn the regularities of those environments. Second, other research suggests that ideological biases can strongly degrade experts’ ability to make accurate evaluations. Third, material motivations can sway experts to conduct an analysis favorable to their financial sponsor.

However, while individual scientists may make mistakes, it is incredibly rare for the scientific consensus as a whole to be wrong. Scientists are rewarded, in money and reputation, for finding fault with statements about reality made by other scientists. Thus, for the large majority of them to agree on something – for there to be a scientific consensus – is a clear indicator that whatever they agree on reflects reality accurately.
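
As a back-of-the-envelope illustration of why consensus matters, consider a toy model in which each scientist who checks a claim errs with some fixed probability. The numbers below are hypothetical, and real scientists’ errors are correlated rather than independent, so this sketch overstates the effect, but it shows the direction of the argument:

```python
# Toy model: if each of n independent checks wrongly endorses a false claim
# with probability p, the chance that every one of them does so is p ** n.
p_single_error = 0.2  # hypothetical error rate for a single scientist

for n in (1, 5, 20, 100):
    print(f"{n:>3} scientists all wrong together: {p_single_error ** n:.2e}")
```

Even with a generous individual error rate, the probability that a large, mutually critical community converges on the same mistake shrinks toward zero, which is why a consensus carries so much more weight than any single study.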

The Internet Is for… Misinformation

The rise of the Internet, and more recently social media, is key to explaining the declining public confidence in expert opinion.

Before the Internet, the information available to the general public on any given topic usually came from experts. For instance, mainstream media invited scientific experts on autism to speak about the subject, large publishers released those experts’ books, and the same experts wrote the relevant encyclopedia articles.

The Internet has enabled anyone to publish content, connecting people around the world with any and all sources of information. On the one hand, this freedom is empowering and liberating, with Wikipedia a great example of a highly curated and accurate source on the vast majority of subjects. On the other, anyone can publish a blog post making false claims about links between vaccines and autism or the effectiveness of homeopathic medicine. If they are skilled at search engine optimization, or have money to invest in advertising, they can spread their message widely.

Unfortunately, research shows that people lack the skills to differentiate misinformation from accurate information. This lack of skills has clear real-world effects: just consider that US adults believed 75% of fake news stories about the 2016 US presidential election. The more often someone sees a piece of misinformation, the more likely they are to believe it.

Blogs with falsehoods are bad enough, but the rise of social media has made the situation even worse. Most people re-share news stories without reading the actual articles, judging the quality of a story by its headline and image alone. No wonder research indicates that misinformation spreads as much as 10 times faster and farther on social media than accurate information: the creator of a fake news item is free to devise the most appealing headline and image, while credible sources have to stick to factual ones.

These problems result from the train wreck of human thought processes meeting the Internet. We all suffer from a series of thinking errors such as confirmation bias, our tendency to look for and interpret information in ways that conform to our beliefs.

Before the Internet, we got our information from sources such as mainstream media and encyclopedias, which curated the information for us to ensure it came from experts, minimizing the problem of confirmation bias. Now, the lack of curation means thinking errors are causing us to choose information that fits our intuitions and preferences, as opposed to the facts. Moreover, some unscrupulous foreign actors – such as the Russian government – and domestic politicians use misinformation as a tool to influence public discourse and public policy.

The large gaps between what scientists and the public believe about issues such as climate change, evolution, GMOs, and vaccination exemplify the problems caused by misinformation and lack of trust in science. Such mistrust results in great harm to our society, from children dying to damaging public policies.

What Can We Do?

Fortunately, there are proactive steps we can take to address the crisis of trust in science and academia.

For example, we can elevate the role of science in our society. The March for Science movement is a great example of this effort: first held on Earth Day in 2017 and repeated in 2018, it involves people rallying in the streets to celebrate science and push for evidence-based policies. Another example is the Scholars Strategy Network, which helps scholars popularize their research for a broad audience and connects them with policy-makers.

We can also fight the scourge of misinformation. Many world governments are taking steps to combat falsehoods. While the US federal government has dropped the ball on this problem, a number of states have passed bipartisan legislation promoting media literacy. Likewise, many non-governmental groups are pursuing a variety of efforts to fight misinformation.

The Pro-Truth Pledge combines the struggle against misinformation with science advocacy. Founded by a group of behavioral science experts (including myself) and concerned citizens, the pledge calls on public figures, organizations, and private citizens to commit to 12 behaviors, listed on the pledge website, that behavioral science research shows correlate with truthfulness. Signers are held accountable through a crowdsourced reporting and evaluation mechanism, while earning reputational rewards for their commitment. The scientific consensus serves as a key measure of credibility, and the pledge encourages pledge-takers to treat the opinions of experts as more likely to be true when the facts are disputed.

Over 500 politicians have taken the pledge, including state legislators Eric Nelson (PA) and Ogden Driskill (WY), and members of US Congress Beto O’Rourke (TX) and Marcia Fudge (OH). Two research studies at Ohio State University demonstrated, with strong statistical significance, that the pledge changes the behavior of pledge-takers to be more truthful. Thus, taking the pledge yourself, and encouraging people you know and your elected representatives to do the same, is an easy way both to fight misinformation and to promote science.

Conclusion

I have a dream that one day, children will not die from taking poisonous homeopathic medication or get sick with measles because their parents put their trust in a random blogger instead of extensive scientific studies. I have a dream that schools will teach media literacy and people will know how to evaluate the firehose of information coming their way. I have a dream that we will all know that we suffer from thinking errors, and will watch out for confirmation bias and other such problems. I have a dream that the quickly growing distrust of experts and science will seem like a bad dream. I have a dream that our grandchildren will find it hard to believe our present reality when we tell them stories about the bad old days.

To live these dreams requires all of us who care about truth and science to act now, before we fall further down the slippery slope. Our information ecosystem and credibility mechanisms are broken. Only about a third of Americans place a lot of trust in scientists, and most people can’t tell the difference between truth and falsehood online. The lack of trust in science – and the excessive trust in persuasive purveyors of misinformation – is perhaps the biggest threat to our society right now. If we don’t turn back from the brink, our future will not be a dream: it will be a nightmare.