Why don’t some people trust science?

December 31, 2023

We recently learned that a third of people in the UK reported that their confidence in science had increased during the pandemic, while 7% said it had decreased. Why do people's answers diverge so sharply?

For many years, the main reason some people reject science was thought to be a simple lack of knowledge, and the fear of the unknown that comes with it. Consistent with this, many studies have shown that people with more textbook knowledge of science hold more positive attitudes towards it.

But if lack of knowledge really were the problem, the solution would be simple: inform people about the facts. Yet this strategy, which dominated science communication for much of the 20th century, has failed on many levels.

Controlled experiments found that giving people scientific information did not change their views. In Britain, providing scientific information about genetically modified technologies even produced the opposite effect.

The failure of the information strategy may be partly explained by people ignoring or avoiding information that contradicts their beliefs, a tendency known as confirmation bias. A second problem, however, is that some people trust neither the message nor the messenger. This means that distrust of science stems not only from a lack of knowledge but also from a lack of trust.

With this in mind, many research groups, including ours, set out to discover why some people trust science and others do not. One strong predictor of people's distrust of science during the pandemic emerged: their distrust of science before it began.

Misplaced confidence

Recent research has shown that people who reject or distrust science do not, in fact, know much about it but, more importantly, typically believe that they do understand it.

This conclusion has been confirmed repeatedly over the past five years, in studies examining attitudes towards a variety of scientific issues, including vaccines and GM crops. We found it to hold even when participants were not asked about any specific technology. It may not apply, however, to some politicized sciences, such as climate change.

Recent research has also found that people who hold the most negative attitudes towards science tend to mistakenly believe that their point of view is the accepted one, and therefore that many others share it.

Other evidence suggests that some science deniers also derive psychological satisfaction from framing their alternative explanations in ways that cannot be refuted. This is a hallmark of conspiracy theories, whether about microchips in vaccines or Covid-19 being caused by 5G radiation.

But the real purpose of science is to examine and test theories that can be proven false—theories that scientists call falsifiable. Conspiracy theorists, on the other hand, often ignore information that is inconsistent with their preferred explanations and, in extreme cases, question the motives of the person disseminating that information.

When a person who trusts the scientific method argues with someone who does not, the two are effectively playing by different rules. This makes it hard to convince skeptics that they might be wrong.

Finding a solution

What can we do with this new understanding of attitudes towards science?

The messenger is as important as the message. Our study confirms many previous surveys showing that, when it comes to communicating science, politicians are widely distrusted while university professors are widely trusted, for example. This is worth remembering.

The fact that some people's negative attitudes are propped up by the false belief that many others share them points to another potential strategy: telling people what the consensus position actually is. The advertising industry got there first: claims such as “eight out of ten cat owners say their pets prefer this brand of food” have long been popular.

A recent meta-analysis of 43 studies examining this strategy (all randomized controlled trials, the gold standard of scientific testing) found support for it as a way of changing beliefs about scientific facts. Stating the consensus position also implicitly defines what counts as misinformation or unsubstantiated opinion, and so addresses the problem that half of people do not know what is true because of the conflicting claims in circulation.

An additional approach is to prepare people for misinformation before they encounter it. Misinformation spreads fast, and unfortunately each attempt to debunk it repeats the false claim, whose influence on people's thinking can persist even after it has been corrected. Scientists call this the “continued influence effect.” Genies never go back into bottles. A better approach is to anticipate objections, or to inoculate people against the strategies used to promote misinformation. This is known as “prebunking,” as opposed to debunking.

Different contexts may call for different strategies. It matters whether the science in question rests on an established expert consensus, like climate change, or is cutting-edge research into the unknown, like a brand-new virus. In the latter case, a good approach is to explain what we know, what we don't know, and what we are doing to find out, emphasizing that the results are preliminary.

Emphasizing uncertainty in rapidly changing fields helps pre-empt the objection that the messenger cannot be trusted because they say one thing today and another tomorrow. But no strategy is guaranteed to be 100% effective. Even with the widely discussed PCR tests for Covid, we found that 30% of the public said they had never heard of PCR.

A perennial problem with science communication is that it tends to reach people who are already interested in science, which may be why you are reading this article. Still, the new science of communication suggests that reaching out to those not already engaged with science is well worth the attempt.

Source: Port Altele
