
Part of the SBM mission is to examine, and hopefully help to improve, the ways in which science is communicated – from scientists to press releases to mainstream and social media. After doing this for a few decades I have concluded that it’s very difficult, because science is messy, nuanced, and complicated.

For example, we often walk a fine line at SBM, trying to communicate the problems with the current institutions of science (publication bias, p-hacking, predatory journals, infiltration of pseudoscience, etc.) without making it seem like science itself is broken and doesn’t work. We point these things out in a constructive way, and when possible suggest specific fixes. I am also usually careful to point out that these issues do not mean that science does not work. Eventually the self-correcting nature of science does tend to sort things out. The problem is that it can take an unnecessarily long time, or we can waste resources chasing illusions, or practitioners may be making decisions based on unreliable preliminary evidence. This, of course, is especially true in medicine as an applied science.

In addition to exploring how to make science and medicine more effective and efficient, it is extremely important to educate the public about how science actually functions. Any institution is only as good as the people in it, and the institutions of science and medicine are dependent on public support. It doesn’t matter what scientists may believe in their “ivory towers” if the public decides to trust gurus and charlatans instead. Looked at another way, individually, physicians understand the importance of informed consent. As a profession we need to do better at giving society informed consent, which means educating not only about science but also about anti-science, science denial, and pseudoscience.

How science is reported

A new study looks at various narratives that frame how science stories are often reported in the media, and their effect on public perception. The strength of the study is that it includes 4,497 subjects spread across various arms for comparison. One main weakness is that it looks at short-term effects only.

The authors edited science articles so that each was consistent with one of four narratives:

  • The “honorable quest” or discovery, in which a scientist discovers knowledge that is reliable and consequential;
  • The “counterfeit quest,” or retraction of published work, in which a scientist engages in dishonorable and guileful conduct;
  • The science is “in crisis/broken” narrative, which indicts scientists or the institution of science for failing to address a known problem; and
  • The “problem explored,” where scientists explore and potentially fix a problem revealed by the “crisis/broken” narrative.

They also included a fifth control arm where subjects read a story about baseball. For me the most interesting result was how little these narratives affected the outcome, even though the study is presented as showing a significant effect. Overall trust in science and scientists was moderate to high. The authors report:

Exposure to stories highlighting problems reduced trust in scientists and induced negative beliefs about scientists, with more extensive effects among those exposed to the “crisis/broken” accounts and fewer for those exposed to “counterfeit” and “problem explored” stories.

The effect size, however, was very small. There was a larger effect from the pre-existing belief of the subject:

In the “crisis/broken” and “problem explored” conditions, we identified a three-way interaction in which those with higher trust who considered the problem-focused stories to be representative of science were more likely to believe science is self-correcting and those with lower trust who perceived the stories to be representative were less likely to report that belief.

So if you already have trust in science, you will perceive stories about problems as evidence that science is self-correcting, and if you have a pre-existing low trust in science you will see the same stories as more evidence that science is broken. This is what I find anecdotally. We might report on a story about a problem in science here with the narrative that science works but can be better, while anti-science news outlets will report the exact same story with the narrative that science is broken and you can’t trust it (so buy my supplements).

Interestingly, support for funding science was not affected by any of the narratives. This fact may supersede the other findings. If you still support funding scientific research, you must have some level of confidence in it. At the very least it suggests we need to take the other findings with a grain of salt.

While I ultimately agree with the authors that we need to think carefully about how we structure the narratives of science reporting, I think there is a bigger problem than which of these narratives to use. Rather, the problem is that none of these narratives is adequate. The authors write:

This study demonstrates the detrimental consequences of media failure to accurately communicate the scientific process…

I agree – but none of these narratives captures the scientific process, which is messy, complicated, and nuanced. Instead of the “honorable quest” narrative we need the “let’s put this new evidence into proper scientific perspective” narrative. Instead of a straight “counterfeit quest” we need to also include context about the underlying scientific field or claim, which may or may not be valid despite instances of fraud.

The “crisis/broken” narrative is perhaps the most harmful, because it implies that properly functioning science should have no issues or problems. When such problems are exposed, it then makes it seem like all of science must be hopelessly broken. This is an invitation to science denial – but again, mostly among those who already distrust science. At the very least, however, this narrative reinforces that distrust. The better approach is to combine and moderate the “crisis” narrative with the “problem explored” narrative to create a nuanced narrative: yes, there are problems (not a crisis) in science, but those problems are not fatal and can be fixed.

Also, science is not monolithic. Some disciplines and subcultures of science are much better than others, so they have to be looked at individually. There are some systemic problems, like predatory journals, that need systemic solutions, however.

Similarly, I would not adopt the “science journalism is broken” narrative. There is great science journalism, horrible journalism, and everything in between. But there are some systemic problems that should be improved. That is also part of our mission – to raise the baseline quality of science journalism by exploring what works, what doesn’t, and how it goes wrong. Falling into simplistic narratives is one of the major problems. Science reporting does not lend itself to simple narratives. We need to raise the comfort level with nuance.


Author

  • Founder and currently Executive Editor of Science-Based Medicine, Steven Novella, MD is an academic clinical neurologist at the Yale University School of Medicine. He is also the host and producer of the popular weekly science podcast, The Skeptics’ Guide to the Universe, and the author of the NeuroLogicaBlog, a daily blog that covers news and issues in neuroscience, but also general science, scientific skepticism, philosophy of science, critical thinking, and the intersection of science with the media and society. Dr. Novella has also produced two courses with The Great Courses, and published a book on critical thinking, also called The Skeptics’ Guide to the Universe.

Posted by Steven Novella