It seems there’s growing doubt about scientific data. While it’s important not to believe everything you hear, it’s vital to understand that conclusions change as new information is discovered, and sometimes new findings contradict what was initially thought. Why is this distrust of scientific data so harmful? And how can you verify the accuracy of the information you’re reading?
Why are some scientific ideas hard to believe? What makes our minds so resistant to certain kinds of facts, even when substantial evidence backs them? One thing is for sure: science denial became deadly in 2020, and it’s more important than ever to understand why some people deny, doubt, or resist scientific explanations – and what can be done to overcome these barriers.
Many factors contributed to this, such as political leaders failing to support prevention measures that scientists knew to be effective. As a result, some people died of COVID-19 still believing the disease didn’t exist.
So, is there a problem with trusting science?
To understand where we are now, we must look to the past. In the mid-20th century, trust in societal institutions was high and polarization was low, the result of remarkable growth in middle-class income and a reduction in inequality. After World War II, that unusually high level of trust gave way to a more cynical view of societal institutions, especially politics and the media, thanks to Vietnam and Watergate.
The General Social Survey (GSS) is one of the oldest and most comprehensive recurring surveys of American attitudes. It shows that although trust in public institutions has declined over the last half-century, science is the one institution that hasn’t been affected. Since 1973, around 40% of Americans say they have a great deal of confidence in science. According to the 2016 GSS data, people trust scientists more than Congress (6%), the executive branch (12%), the press (8%), people who run major companies (18%), banks and financial institutions (14%), the Supreme Court (26%) or organized religion (20%).
This trust in science is critical because science issues are a part of our lives, raising a range of social, ethical, and policy questions. Unfortunately, scientific issues can be a crucial battleground for facts and information. Some public divides over science issues align with partisanship, while others do not. In February 2020, Pew Research released an analysis that provides an in-depth look at how the American public feels about certain aspects of scientific data.
When it comes to childhood vaccines, there are differences in public beliefs about the risks and benefits. According to their review, Pew found that Republicans and independents who lean conservative are just as likely as Democrats and independents who lean liberal to say that, overall, the benefits of the measles, mumps and rubella vaccine outweigh the risks (89% and 88% respectively). However, the political left is more likely to believe the claim that childhood vaccines are causing an epidemic of autism. This belief continues despite over ten years of research disproving this notion. One telltale sign is that autism rates continue to rise, even though the alleged offending agent in vaccines (a mercury-based preservative called thimerosal) has long since been removed.
An issue that swings in the other direction is climate change. Even as the science becomes more unequivocal, Republicans and Democrats have been growing more divided in their views about the topic. Interestingly, a 2008 Pew survey found only 19% of college-educated Republicans agreed that the planet is warming due to human actions, versus 31% of non-college-educated Republicans. Meanwhile, among Democrats and independents, more education correlated with greater acceptance of science.
Science denial seems to be considerably more prominent on the political right. Why?
Some researchers think there are psychological differences between the left and the right: conservatives tend to be more rigid and authoritarian, making them more likely to defend the status quo, while liberals are more tolerant of ambiguity and more willing to accept departures from the current state.
One theme running through the findings is that public hesitancy is often tied to concern about the loss of human control. When it comes to emerging scientific and technological developments, public reactions can vary significantly. A 2018 Pew survey found that Americans have mixed views over whether the use of gene editing to reduce a baby’s risk of serious disease over their lifetime is appropriate (60%) or is taking medical technology too far (38%). As for automation technologies in the workplace, more people say automation has brought more harm than help to American workers.
The Pew research found that most Americans (73%) view current scientific developments as having a positive impact on society, and 82% expect future scientific developments to yield benefits for society in years to come. While views differ sizably by race, ethnicity, and level of science knowledge, these findings are in line with those of the 2018 GSS survey, which found that about 74% of Americans said the benefits of scientific research outweigh any harmful results. Per their past data, this measure has been roughly stable since the 1980s.
As of 2019, Pew data shows that 35% of Americans report a great deal of confidence in scientists to act in the public interest, up from 21% in 2016. However, Americans differ on the role and value of scientific experts in policy matters. About 60% of adults believe that scientists should participate in policy debates about scientific issues, while 39% feel that scientists should focus on establishing sound scientific facts and stay out of such discussions.
One factor in the public trust of scientists is familiarity with their work, which means that trust in practitioners, such as medical doctors and dietitians, is higher than that for researchers. Overall, 63% of Americans say the scientific method generally produces sound conclusions, while 35% think it can be used to create “any result a researcher wants.”
When talking about scientists’ transparency and accountability, less than 20% of Americans say that scientists are transparent about potential conflicts of interest with industry groups all or most of the time. The data also found that 57% of adults trust scientific research findings more when the data is openly available to the public, and 52% say they are more likely to trust research that has been independently reviewed. Surveys by the National Science Foundation found that over 75% of Americans say they’re in favor of taxpayer-funded basic research.
Though science still holds a respected place, there’s a gap between what scientists and some citizens think. When people reject science, it’s usually because they’re being asked to believe something that conflicts with a deeply held view. Often that view is political (my party does not endorse that), religious (my god did not say that), or personal (that’s not how I was raised). This means the gap between what science shows and what people believe is really about identity.
What influences our beliefs?
The first thing to understand about beliefs is that we aren’t blank slates eager to assimilate the latest experiments into our worldview. Instead, we come equipped with all sorts of naïve intuitions, many of which are untrue. These preexisting beliefs can skew our thoughts and shape what we consider our most dispassionate and logical conclusions. This tendency is called motivated reasoning. Modern neuroscience shows that reasoning is actually suffused with emotion: not only are the two inseparable, but our positive or negative feelings arise much more rapidly than our conscious thoughts. This is a result of evolution.
In the past, we were required to react very quickly to stimuli in our environment. So, it’s natural to push threatening information away. Reasoning comes later and works slower. However, our quick-fire emotions can set us on a course of thinking that’s highly biased, especially on topics we care a great deal about. This tendency toward motivated reasoning helps explain why we’re so polarized over matters where the evidence is unequivocal.
One example is from a 2012 Gallup survey that showed 46% of adults said they believed that “God created humans in their present form within the last 10,000 years,” and only 15% agreed that humans had evolved without the guidance of divine power. What might be surprising is that these percentages have remained virtually unchanged since Gallup began asking the question over thirty years ago.
One key thing to know about people who engage in science denial is that very few deny science as a whole. Whether it comes in the form of dismissing fact-based evidence as being untrue or accepting notions that are not factual as being true, science denial is not typically rooted in total anti-science attitudes. It’s often based on motivations other than finding the truth, such as protecting their identity.
Beliefs also have little to do with how educated or intelligent a person is. In fact, research shows that the more science-literate people are, the more strongly they hold to their beliefs, even if those beliefs are totally wrong. People with less education who dislike something can simply reject it out of hand, but educated people can come up with counterarguments to explain why they’re right, which makes their minds harder to change. One study found that when people were shown incorrect information alongside a correction, the correction didn’t change their initial belief in the misinformation. Even worse, some individuals became firmer in their beliefs after reading the correction.
Research shows people with the most scientific intelligence are also the most partisan. According to researchers, a big part of the problem is that people associate scientific conclusions with political affiliations. In the past, political and cultural leaders usually agreed on scientific findings. They were promoted as being in the public’s best interests. Now, scientific facts are being used in a struggle for cultural supremacy. Instead of thinking about the facts, people think about what side they’re on.
One of the most significant cultural shifts in recent years is the rise of fake news. This is where claims with no evidence behind them get shared as fact alongside evidence-based, peer-reviewed findings. Researchers have started calling this trend the ‘anti-enlightenment movement.’ The rise in this results from increased usage of social media and a more partisan press.
Think about how you search for information. You’ll most likely go to specific news sources that share your views, or “talk” on social media to others (family, friends, or members of groups you like) who also hold similar viewpoints. We gravitate toward information that confirms what we already believe, a tendency known as confirmation bias. This has decreased our interaction with information and people we disagree with, deepening polarization and further exacerbating our differences.
What happens when a scientific discovery challenges deeply held beliefs?
The first thing that happens is a subconscious negative response to the new information. This guides the type of memories and associations formed in your conscious mind, leading you to build an argument against what you see or hear. When someone wants to believe something, they act more like lawyers building a case for what they already want to be true. This means that when we think we’re reasoning, we may instead be rationalizing. Many people practice disconfirmation bias, expending disproportionate energy trying to debunk or refute views and arguments they disagree with.
Many psychological studies have shown that people respond to scientific evidence in ways that justify their preexisting beliefs. Even when the subjects are explicitly instructed to be unbiased about the evidence, they often fail. Often when the facts are against a person’s opinions, they don’t necessarily deny the facts but say the facts are less relevant. Further, when you encounter facts that don’t support your idea, your belief in that idea actually grows stronger.
What can be done to counteract human nature?
If you want someone to accept new evidence, you must present it in a context that doesn’t trigger a defensive, emotional reaction. This is especially true if the message comes across as judgmental of a person’s whole character. The research suggests that simply focusing on the evidence and data isn’t enough to change someone’s mind about a particular topic. Instead, try to find the cause of the person’s unwillingness to accept scientific consensus and look for common ground to introduce new ideas.
Each person has multiple social identities. So, when one identity blocks acceptance of the science, leverage a second identity to make a connection. Tailor the message so that it aligns with their motivation. Once respect is established, any criticism is very much tempered.
It’s vital to recognize that all of us are probably operating with misguided beliefs about science to some level. To combat this, we all must strive to adopt a scientific attitude, which is an openness to seeking new evidence and a willingness to change one’s mind. Remember, it’s not knowledge but curiosity that makes us more likely to accept scientific truths. We also need to learn to monitor our quick, intuitive responses. Instead, turn on the rational, analytical mind and ask yourself:
-How do I know this is true?
-Is it plausible?
-Why do I think it is true?
Next, do some fact-checking. Look at articles with both pro and con information, evaluate the source of that information, and be open to the evidence leaning one way or the other. Learn not to immediately accept information simply because you already believe it. It’s also essential to realize the role emotions play in decision-making about science: emotions are fully integrated into how we think and learn about it.
Science is not just for the few. It’s for everyone! Unfortunately, the necessity of speaking with precision, using technical terms and jargon, has removed science almost entirely from the realm of the public and from daily life. We need to find new and better ways to connect the practice and use of science to inform and shape our communities, country, and world.
We must also realize that the practice of science is messy. Hypotheses are proposed and tested with an understanding that progress comes in fits and starts. The trial and error of research, with its repetitive testing, is usually hidden behind the end products, so the public never sees all the required changes along the way. With the Covid-19 pandemic, we’re seeing that process in real time, which means the public has little context for updates in public health guidance.
Science education isn’t simply learning new theories. It requires us to unlearn our instincts, shedding false beliefs. Remember that we never fully unlearn our mistaken intuitions about the world; we just learn to ignore them. While you shouldn’t discard an entire belief system over some new piece of scientific evidence, it’s vital to consider new information.
When discussing new data, don’t lead with the facts to convince someone. Instead, lead with values to give the facts a fighting chance. According to the celebrated Stanford University psychologist Leon Festinger, “A man with a conviction is a hard man to change. Tell him you disagree, and he turns away. Show him facts or figures, and he questions your sources. Appeal to logic, and he fails to see your point.”