Promoting science-based medicine involves many components – discussing the often-complex relationship between research and best practice, examining specific claims, promoting science-based regulations, and addressing the many ways in which people try to undermine the scientific basis of medicine.
To those naïve to the challenges we face, the concept of science-based medicine seems obvious at first. Well, of course you’re going to base medicine on science – what are the other options? At its core the idea is simple: medical practice should be informed by the best evidence we have available. In practice, this is complicated, because there are many ways to evaluate the evidence.
Further, people find many ways to deny the science – either a specific scientific conclusion, or science itself. We hear these various excuses all the time: there are “other” ways of knowing, we don’t need science to know what works, I am the evidence, and so on. Sometimes people respect science but just get it wrong. They might misunderstand the placebo effect, overestimate the significance of studies, not understand the nature of p-hacking, or fail to realize the potential for self-deception in less-than-rigorous studies.
Then there are those who will dismiss entire swathes of science out of hand. This is commonly done through an appeal to conspiracy theories, such as references to “Big Pharma”, or the notion that doctors lie to make money, that the system is broken and cannot be trusted, or even that science itself is broken.
While there are differences in cognitive style, both the research and common experience suggest that most people engage in many of these tactics at various times. These cognitive flaws and biases are not something that only other people do; they are things that we all do, unless we vigilantly and carefully guard against them.
A new study further supports this conclusion. Emilio Lobato and Corinne Zimmerman surveyed 244 students and faculty members, asking their opinions on GMOs, climate change, vaccines, and evolution. They also asked open-ended questions about what would change their minds on these topics. They found that people used inconsistent reasoning across the four topics.
Some patterns did emerge, in line with previous research, showing a strong correlation between an analytical style of thinking and acceptance of the scientific consensus. Acceptance also correlated positively with liberal ideology, and negatively with religious belief and conspiracy ideation. Otherwise, the study showed a “consistent inconsistency”.
Subjects were asked to justify their rejection of the scientific consensus. In 33% of cases subjects simply restated their position, essentially giving no justification. In 34% of cases the subjects did cite evidence, and in 20% of cases they referenced their cultural or religious identity. So only about a third of the time did subjects reference evidence as the justification for their belief. This does not mean their belief is based on evidence – only that they justify the belief that way.
We know from other research that people will sometimes come to a conclusion for emotional reasons (identity, ideology) and then rationalize that belief, citing evidence or arguments that were not the real reason for their belief in the first place. They will also resist changing their position, even in the face of solid evidence, if their belief is emotionally held.
There are many studies showing that people will engage in ad hoc motivated reasoning, meaning that the conclusion comes first, and reasoning is used to justify the conclusion rather than to determine it. There is also inconsistent evidence for a possible backfire effect, in which people not only reject evidence that contradicts a held belief, but actually strengthen that belief in the face of contradictory evidence.
The new study also shows evidence of motivated reasoning. Subjects would use various methods to deny the science, shifting from topic to topic. Subjects might cite evidence for one topic, then personal belief for another, then give no justification for a third. Only 11% of subjects cited evidence to justify their position on all four topics.
When subjects were asked what would potentially change their mind on a topic, 45% of subjects stated for at least one topic that nothing would change their mind, and 17% said this of more than one topic. On the positive side, 80% of subjects said that evidence would change their mind on at least one topic, but not a single subject said this about all four topics.
What all of this suggests is that people do not usually engage in metacognition – thinking about their own thinking. They may have a cognitive style that they tend to use, but otherwise they engage in whatever type of reasoning serves their purpose on any particular topic. They might vigorously defend the consensus of scientific opinion on one topic, then reject it on another by citing a vague conspiracy, and dismiss it on a third with no real justification, or by appealing to fallacious logic.
To counter this we cannot just teach science or explain what the evidence says. We need to teach critical thinking skills – that is, metacognition. Critical thinking includes an understanding of all the various ways our thinking can go wrong. Just as important, however, critical thinking involves stepping back from our own cognitive processes to examine them objectively, and making a sincere effort to consistently apply valid logic and the same fair and objective criteria for evidence.
Doing this is really hard, because people are very good at motivated reasoning. We are extremely creative at inventing reasons to reject logic and evidence we don’t like, and those invented reasons can create a powerful illusion of being correct – so much so that almost half of people feel comfortable saying that no new evidence could even theoretically change their mind on a topic.
We are also not doing this alone – we are social creatures and have a robust social network through which we spread ideas. In many cases motivated reasoning comes prepackaged. All the work of inventing creative reasons, cherry-picking evidence, twisting logic, attacking scientists and institutions, and making emotional appeals has already been done. It’s even been market-tested, tweaked, and improved. The result is slick (and often well-funded) science denial that takes real dedication to unpack.
The good news is that critical thinking skills are broadly applicable. That’s why I often encourage teaching critical thinking around topics that are less emotional (for the target audience), and then slowly encouraging the application of those skills to more and more emotionally held beliefs. This is a lifelong process, and it’s never done.
All we can hope to do is move the needle slowly in the direction of increased scientific literacy and critical thinking. That is ultimately the only way to promote science-based medicine.