Educating the public about medical myths and misconceptions presents various challenges. The psychological deck seems to be stacked against us. It’s easier to scare people with possible risks than to reassure them with facts. People tend to be more compelled by emotional anecdotes than by dry data. There is something inherently compelling about conspiracy theories that attracts many people. People are good at remembering dramatic details, but poor at remembering whether or not those details are true, or what the source of the information was.

But perhaps the most profound factor making our job difficult is that once an idea has taken root in someone’s mind, it is remarkably difficult to change. Humans are instinctively good at motivated reasoning and confirmation bias. We see what we want to see, remember the bits that support our narrative, and can rationalize away pesky things like logic and evidence. The result can be a powerful, even overwhelming, illusion of confident knowledge, even in notions that are patently absurd. We can then erect elaborate defenses around these beliefs to protect them from reality.

This is all bad enough for ideas that spread naturally through the culture, but these cognitive bugs can also be hijacked by those with an agenda, which is usually to sell something. This is how pre-packaged narratives create and maintain a market for useless and implausible snake oil and dubious medical treatments. We can point out that homeopathic potions have no active ingredients and essentially break the laws of physics, and the response from defenders is the equivalent of, “Ignore that man behind the curtain.” Just say something about Big Pharma and shills, then coat it with some hand-waving nonsense about water memory, and belief in magic is maintained.

At this point you’re probably thinking – this is all just the setup, right? What’s the solution? There is a solution, actually. Not a silver bullet, but at least something that works. It’s clearly difficult (I won’t say pointless) to give people information after they have already been indoctrinated into pseudoscience and conspiracy theories. So obviously we need to reach them before misinformation has taken root. This is where education comes in, and not just the education of children in school, but lifelong learning.

There are basically three types of preventive education that are likely to reduce susceptibility to pseudoscience. The first is scientific literacy. There is some controversy over how effective this is, however. A few decades ago the “knowledge deficit model” was dominant, and the prescription for belief in pseudoscience was to teach people science. However, recent research has not been kind to the knowledge deficit model. You cannot usually change someone’s mind about an emotionally held belief with just information.

But sometimes you can. You can move the needle a bit with public education about certain topics. Sometimes beliefs are based more on misinformation than emotion or identity, and if you correct that misinformation you can change beliefs. This is very topic specific, and also is affected by the kind of information you give and how it is presented. For example, global warming denial is strongly predicted by political ideology, and not at all by scientific literacy. However, vaccine denial does correlate with low scientific literacy, which implies that science education can be a mitigating factor.

Other research, however, shows that simply giving facts is not enough. It is more effective to replace one narrative with another explanatory narrative. This is because our knowledge is not just a collection of facts. We weave those facts into an explanatory narrative that we use to understand the world, which gives us at least the illusion of control and predictability. If you tell someone a fact that challenges their explanatory narrative, you are attempting to pull the rug out from under them. They will resist (with fervor proportionate to the degree to which they are emotionally invested in the narrative). But – if you simultaneously give them an alternate narrative, they can swap one out for the other without losing their equilibrium.

I am not suggesting this is easy, or always works, but it appears to be statistically more effective than just doling out facts.

The second type of education is critical thinking. This relates more directly to the point about narratives. Critical thinking is about metacognition – knowing how to think with a valid process that is self-reflective and therefore potentially self-corrective. If someone understands exactly how conspiracy thinking is a cognitive trap, they are less likely to fall into that trap. Critical thinking and scientific literacy are a powerful combination – this is essentially what we mean by scientific skepticism, which is exactly what we are doing here (in the realm of medicine).

But there is a third component to education that arguably sits on top of the other two – media literacy. This is something scientific skeptics have addressed tangentially, but it is increasingly being recognized as a primary concern we need to tackle head on. It’s easy to conclude that this is due to the rise of social media, but of course this issue has always existed. It is, however, exacerbated by the fast and open exchange of information and ideas allowed by the World Wide Web.

While we should focus attention on formal education, most people get most of their knowledge and information outside the classroom. We live in a multi-media world, and those with access are constantly consuming information from multiple sources – increasingly online, although traditional media still play a role. Trying to change what people believe through formal education alone is therefore a losing proposition. We need to engage everywhere.

But this also implies that media literacy is critical to overall information literacy. I won’t say it is more important than scientific and critical thinking literacy, because the three work together and cannot be disentangled. Media literacy may be the linchpin, however. Recognized principles of media literacy (again – with much overlap with critical thinking) include:

  • Question everything. Don’t believe something just because it feels right – especially if it feels right.
  • Think about the source of information. Is it a primary or secondary source? What are the likely biases of that source? Is it tied to an agenda? Is it authoritative?
  • Check multiple sources. Are all or most sources saying the same thing?
  • Try to track back to the original primary source, rather than trusting someone else’s summary.
  • Are the claims being made credible or plausible?
  • What does the actual evidence say? Distinguish this from how it is being interpreted, or what speculation is flowing from the data.
  • Are there any apparent attempts to manipulate your emotions? Are they appealing to any form of tribalism, fear, greed, or otherwise pushing emotional buttons? Are they using “click bait” headlines, or sensationalizing the facts?

All that is a lot of work, however. If we want the greatest number of people to actually do it, we need to make it as easy as possible. This is again a significant inherent disadvantage of the skeptical position – it is, by its very nature, difficult and effortful. Going with your feelings and intuitions, meanwhile, is much easier. There is, however, a way to make media literacy more intuitive, and that is through experience. This is the idea behind a recent study that uses a game to teach media literacy through experience.

The idea is to have subjects play a game on their smartphone in which their task is to manipulate social media in order to maintain credibility while fooling as many people as possible. This may seem counterintuitive – aren’t we teaching these subjects to be deceptive? But I think it’s like teaching someone stage magic. Magicians are famously skeptical, because they know how easy it is to deceive others. It’s their job. Knowledge of deception inoculates them against those same methods of deception.

That is what the researchers found with social media as well. Teaching people the mechanisms by which social media can be used to manipulate emotions, by having them do the manipulating themselves, inoculates them against those same methods. Essentially, the game teaches them to be skeptical critical thinkers when it comes to consuming information online.

The job of promoting science-based medicine is increasingly difficult, but at the same time our knowledge of how to do it optimally is at least improving. We now have a clearer idea that we need to focus simultaneously on scientific literacy, critical thinking skills, and media literacy. We also cannot just take comforting beliefs away from people. We need to show them how to make sense of our complex world with facts and logic, and how to avoid all the common cognitive pitfalls. And we need to promote media literacy, so that people can learn for themselves how to evaluate sources and information.

Sure, this is a herculean task. But the alternative is to slide numbly into a post-truth world of fake news, truth decay, pseudoscience, and alternative facts. There is no final or complete success or failure in this struggle. We’ll just keep pushing as hard as we can to move that rock up the hill.


Posted by Steven Novella

Founder and currently Executive Editor of Science-Based Medicine Steven Novella, MD is an academic clinical neurologist at the Yale University School of Medicine. He is also the host and producer of the popular weekly science podcast, The Skeptics’ Guide to the Universe, and the author of the NeuroLogicaBlog, a daily blog that covers news and issues in neuroscience, but also general science, scientific skepticism, philosophy of science, critical thinking, and the intersection of science with the media and society. Dr. Novella has also produced two courses with The Great Courses, and published a book on critical thinking, also called The Skeptics’ Guide to the Universe.