One of the things I write a lot about, both here and at my not-so-super-secret other blog, is science denial. This denial of science takes many forms, including denial of the conclusions of climate science that the earth is warming due to human activity, denial of evolution, denial of the science showing that vaccines are safe and effective and do not cause autism or the myriad other conditions for which antivaxers blame them, and many more. Here, most commonly, we tend to write about denial of vaccine science and the antivaccine movement, although other relevant topics include HIV/AIDS denialism (the denial that HIV causes AIDS), denial by a number of cancer quacks that chemotherapy works, and a more general denial of scientific medicine by a number of practitioners of “natural medicine.” This science denialism is virtually always coupled with one or more conspiracy theories. The one I discuss the most is the antivaccine conspiracy theory that claims that the CDC “knows” that vaccines cause autism but has covered up and suppressed the evidence for it. Cancer quacks like to claim that a cure for cancer exists, but that big pharma, oncologists, and the government are keeping it from the people and suppressing knowledge of it in order to maintain their income and profits. Given this level of suspicion, you’d think that the public harbors a high degree of mistrust of scientists and physicians. A new poll released earlier this month by the Pew Research Center shows that that’s (mostly) not true:
In an era when science and politics often appear to collide, public confidence in scientists is on the upswing, and six-in-ten Americans say scientists should play an active role in policy debates about scientific issues, according to a new Pew Research Center survey.
The survey finds public confidence in scientists on par with confidence in the military. It also exceeds the levels of public confidence in other groups and institutions, including the media, business leaders and elected officials.
If you look at the accompanying graph of one of the overall conclusions of the poll, the percentage of Americans who say they have a great deal or fair amount of confidence in scientists to act in the best interests of the public, you’ll see that 86% of Americans trust scientists to act in the best interests of the public, up from 76% in 2016, with 35% saying they have a “great deal” of confidence, up from 21% in 2016. For medical scientists, the percentage is 87%, up only slightly from 84% in 2016, a difference within the margin of error. By way of comparison, business leaders garner only 46% positive responses, with single-digit percentages of respondents answering that they have a “great deal” of confidence in business leaders to act in the public interest. Elected officials fare even worse, with only 35% positive responses and even lower single-digit percentages answering that they have a “great deal” of confidence in them, although, oddly enough, the percentage was only 27% in 2016. (Given the state of our politics in 2019, I find it odd that more people now express confidence in our politicians to do right by the public, even if the overall confidence level remains abysmally low.)
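As an aside, for readers wondering what “within the margin of error” means in practice here: the margin of error on a poll percentage depends on the sample size, which isn’t quoted above. Below is a minimal sketch of the standard calculation for a sample proportion, using a hypothetical sample size of 4,000 respondents (Pew’s surveys typically run to a few thousand, but the exact n is an assumption here):

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error for a sample proportion.

    p: observed proportion (e.g., 0.87 for 87%)
    n: sample size (hypothetical here; the poll's exact n isn't given above)
    z: z-score for the desired confidence level (1.96 for ~95%)
    """
    return z * math.sqrt(p * (1 - p) / n)

# With a hypothetical sample of 4,000 respondents:
for pct in (0.84, 0.87):
    print(f"{pct:.0%}: +/- {margin_of_error(pct, 4000):.1%}")
# 84%: +/- 1.1%
# 87%: +/- 1.0%
```

Whether a given year-over-year shift clears that bar depends on the actual sample sizes in both years; the point is simply that a few percentage points of difference between two polls can easily be sampling noise.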
When Pew gets into the weeds, there are a number of interesting findings regarding who has confidence in scientists and which kinds of scientists are the most trusted. Indeed, it’s rather akin to looking at a state’s overall vaccination level, seeing that it’s high, and dismissing concerns about low vaccine uptake even though there are pockets of low uptake in the state that could predispose to outbreaks. Trust in scientists and physicians might be high overall, but there are nonetheless major areas of concern. I’m going to concentrate mainly on the Pew Research Center’s findings with respect to public trust in medical and nutrition science, because, well, the name of this blog is Science-Based Medicine.
Americans trust practitioners more than scientists, but distrust nutrition science
The Pew survey asked respondents whether scientists in six different specialties can be counted on to act with competence, present their recommendations or research findings accurately, and care about the public’s best interests—or, in the case of physicians, their patients’ best interests. In addition, Pew asked about potential sources of mistrust, including issues of transparency and accountability for mistakes or misconduct. One conclusion of this poll is that Americans tend to trust “science practitioners” (such as physicians and dietitians) more than they trust science or medical researchers. For example, a majority of respondents (54%) say that dietitians do a “good job providing recommendations about healthy eating all or most of the time,” compared with only 28% who say nutrition scientists do a good job conducting research all or most of the time. Personally, I wasn’t surprised by these figures, given how the media report scientific and epidemiological studies of nutrition, particularly the narrative of “Food X prevents cancer” followed later by a report of a study finding that Food X either doesn’t prevent cancer or causes cancer. Indeed, given the uncertainty in a lot of nutritional research, coupled with the sensationalistic way the media like to report on it, it’s not at all surprising that there is so little trust in the research. What I do find surprising is that Pew would spin a finding that only a little more than half of people trust dietitians as positive, or that it would present numbers like these as good news:
In addition, 47% say dietitians provide fair and accurate information about their recommendations all or most of the time, compared with 24% for nutrition scientists discussing their research. Six-in-ten Americans (60%) think dietitians care about the best interests of their patients all or most of the time, while about half as many (29%) believe that about nutrition researchers when it comes to concern for the public.
Again, those are pretty low numbers. The numbers are even lower for environmental health scientists and practitioners:
In contrast, public levels of trust in environmental health specialists and environmental research scientists are roughly the same. For instance, 39% of U.S. adults say environmental health specialists do a good job versus 40% for researchers, and 35% say each provides fair and accurate information all or most of the time.
Those are horrible numbers and could easily be one reason why denialism of environmental science takes root so readily.
When it comes to physicians, the numbers are considerably better, but still not the greatest. Although 74% of the public view physicians mostly positively, there are still problems:
Similarly, the public tends to view medical doctors more positively than medical researchers when it comes to their concern for the public’s interests and providing trustworthy information. For example, 57% of Americans say doctors care about the best interests of their patients all or most of the time, compared with 35% for medical researchers. About half the public (48%) believes that medical doctors provide fair and accurate treatment information all or most of the time, compared with 32% who say this about medical researchers in discussing their findings.
Less than three-fifths of the public think that doctors care about the best interests of their patients all or most of the time? Only 49% of people believe that doctors do a good job (another finding of the survey)? We have a major trust problem here.
Sources of mistrust and trust in scientists
What could be the reasons for this lack of trust? Pew asked respondents about conflicts of interest, transparency, and scientific misconduct. Half of the respondents say that misconduct is a big problem (15% very big and 35% moderately big) among medical doctors, with 48% answering the same (14% very big and 34% moderately big) about medical researchers. Among the other scientific specialties examined, the percentages ranged from 24% for dietitians to 42% for environmental research scientists. Again, we have a trust problem here.
One set of potential reasons:
No more than 19% say that scientists across these six specialties are transparent in revealing potential conflicts of interest with industry all or most of the time. A larger share – ranging from 27% to 37% – believes scientists are transparent only a little or none of the time. Similarly, fewer than two-in-ten Americans say that scientists admit and take responsibility for their mistakes all or most of the time.
If you want to know why antivaxers are so quick to invoke the “pharma shill gambit” when attacking scientists, physicians, and vaccine advocates, look no further. Yes, it fits their conspiratorial mindset and they invoke this gambit so often as to make it easy to joke about it, but it’s also a criticism that has a chance of gaining traction, given how many people view scientists as insufficiently transparent about their financial conflicts of interest.
There are also large numbers of people who believe that scientists don’t face sufficient consequences for misconduct. Depending upon the specialty, up to half of US adults say that scientists and physicians face serious consequences for misconduct “only a little” or “none of the time”. Specifically, 53% say so about nutrition researchers, 48% about environmental researchers, 47% about dietitians, 45% about medical researchers, and 42% about environmental health researchers. Oddly enough, only 30% say that medical doctors rarely face consequences for professional misbehavior.
We know what fuels mistrust, but what fuels trust in scientists? Pew found that people’s level of familiarity with scientists and their level of factual knowledge about science “can be consequential for public trust in scientists”. To sum it up, people with higher levels of factual knowledge about science tend to hold more positive and trusting views of scientists. However, the effect of these factors in promoting trust tends to be more limited than the effects of perceived lack of transparency, conflicts of interest, and failure to admit mistakes in driving trust down. For example, the percentage of respondents who say that doctors provide accurate information about their recommendations all or most of the time is 44% among those who know only a little about doctors and 56% among those who know a lot. Similarly, that percentage is 34% among those with low science knowledge and 58% among those with high science knowledge. For medical researchers, the percentage giving the same answer is 17% among those who know only a little about medical researchers and 53% among those who know a lot about them, while those with low science knowledge give that answer only 22% of the time, compared with 41% of those with high science knowledge.
Another unsurprising finding is that race plays a role in trust in science. For example, a large majority of black Americans (71%) and Hispanics (63%) respond that misconduct by medical doctors is a very/moderately big problem. Compare this to the finding that only 43% of whites give the same response to the question, and you can see that minorities distrust physicians way more than Caucasians do. The same is true regarding people’s opinion of medical researchers. A larger percentage of blacks (59%) and Hispanics (60%) say that misconduct by medical research scientists is a very big or moderately big problem, while only 42% of whites do. It’s not news to physicians that minorities, especially blacks, are much less trusting of doctors. Given the history of medicine in the US, racial inequities in medical care, and the ongoing relative paucity of black physicians, increased suspicion is not unreasonable. Even though medicine has been trying to alleviate medical disparities based on race, recruit more minorities into medicine, and reduce implicit bias among physicians, that history is hard to overcome.
Partisan political differences in public trust of scientists
No discussion of sources of mistrust of medicine and science would be complete without an examination of political effects. I’ve noted in the past, for instance, that, although antivaccine views are roughly equally prevalent on the right and on the left, right now the most prominent and politically active antivaxers tend to be on the right. So it was of great interest to me to see how partisan political beliefs influence trust in scientists.
Not surprisingly, there were wide differences in trust noted, but those differences tended to cluster in certain specialties. For example, comparing the responses of people who are Republican or independents who lean Republican with those of people who are Democrats or independents who lean Democratic, Pew found huge differences in trust of environmental practitioners and scientists. (For purposes of brevity, I will just refer to each as Republicans or Democrats, with independents leaning towards each party lumped in with party members.) Way fewer Republican respondents (40%) reported having a mostly positive view of environmental research scientists than Democratic respondents (70%). Similarly, more Democrats (73%) than Republicans (46%) have a mostly positive view of environmental health specialists. Differences persisted for nutrition scientists, with 57% of Democrats but only 43% of Republicans trusting nutrition research scientists; the equivalent split was 63%/58% for dietitians. When it came to physicians and medical researchers, the differences collapsed to near or at the margin of error, with the Democrat/Republican split on this question being 77%/73% for physicians and 70%/67% for medical research scientists.
The obvious conclusion here is that areas where there is more partisan disagreement about policy show a wider partisan split in favorability rating. This split is even more pronounced when it comes to public perception that a given scientific discipline provides fair and accurate information most of the time. When it comes to the findings of environmental research scientists, only 19% of Republicans trust them to do this, while 47% of Democrats do; for environmental health specialists, the Republican/Democratic split is 25%/43%. On the question of whether environmental scientists care about the best interests of the public all or most of the time, the split was 22%/50%, and for the question of whether they do a good job conducting research all or most of the time, the split was 26%/51%.
There are a couple of remarkable things about these results. First, although I wasn’t surprised that Republicans would be very distrustful of environmental science, given Republican alignment with the interests of business, the fact that climate science is a huge part of environmental science, and the way climate science denialism has over the last couple of decades become baked into GOP ideology, I was amazed that the percentages saying they trust environmental scientists were so low among Democrats, less than 50%. One must wonder if the decades-long campaign to demonize climate scientists has had an effect on the public overall. Second, there’s a devastating finding in this section that shows just how politically polarized environmental science has become. When Pew compared the percentage of people with low, medium, or high science knowledge who say their view of environmental researchers is mostly positive, it found that science knowledge didn’t matter for Republicans. Only 39-41% of Republicans expressed a mostly positive view, regardless of level of science knowledge, while the percentage increased from 48% among Democrats with low science knowledge to 89% among Democrats with high science knowledge. As for the percentage of respondents saying that environmental researchers provide fair and accurate information all or most of the time, the results were the same. For Republicans, only 18-20% agreed with this statement, regardless of their level of science knowledge. Among Democrats, the percentage increased from 29% (low science knowledge) to 65% (high science knowledge). In other words, among those who are or lean Republican, level of scientific knowledge doesn’t affect their opinion of environmental scientists, and similar results were found when the same questions were asked about environmental practitioners.
Truly, climate science denialism and denial of environmental science appear to have become inseparable from GOP ideology and identity, and that’s a huge problem given that it hasn’t always been that way. Remember, the EPA was created during the Nixon administration, and as recently as the 2004 election cycle and even beyond, many Republicans were not hostile to climate science and accepted that human activity is contributing greatly to climate change.
There also remains distrust of science on the Republican side in other areas, just not as strikingly different from that of Democrats. For instance, only 17% of Republicans, compared to 29% of Democrats, agree that nutrition research scientists give fair and accurate information most of the time; oddly enough, there’s essentially no difference in responses to the same question about dietitians (Republicans, 46%; Democrats, 49%). There’s a similar pattern for medical research scientists, with only 29% of Republicans agreeing, versus 35% of Democrats. Partisan differences notwithstanding, I was appalled at how low both percentages were. When the same question was asked about medical doctors, 46% of Republicans and 52% of Democrats agreed.
So, although the overall view of scientists and physicians is favorable, and most people say they trust them, there are some big problems. When minorities don’t trust scientists, that’s a problem. When a whole political party rejects well-established scientific findings as part of its ideology, that’s a problem. When so few people trust certain branches of science, such as nutritional science, that’s a problem.
What to do about trust and mistrust of scientists and physicians
Overall, this survey provides a rather mixed picture of Americans’ trust in scientists and physicians. While overall trust is high, there are pockets of major mistrust based on scientific discipline, political differences, and race. Fortunately, the survey’s findings suggest some strategies that might alleviate that mistrust.
One finding is that people would be more likely to trust scientific findings if researchers were to make their data publicly available. Specifically, 57% would be more likely to trust research if the data were publicly available, with 35% saying it wouldn’t matter and, bizarrely, 8% saying it would make them trust the research less. Open access to data is, of course, more achievable in some fields than in others. For instance, to publish genomics data, most journals require that the raw data be deposited in one of several genomics databases, and promising to do so is increasingly required as a condition of funding by granting agencies. That’s an outlier, though. Open access to data is still not required in the vast majority of scientific disciplines, although the proliferation of online supplemental data sections for journal publications does make some progress in addressing the open access question. Of course, as a scientist myself, I have to wonder how far it is practical to go with this strategy. Does anyone want to see my experiment where I forgot to add one reagent and therefore got a bizarre result?
Similarly, 52% would trust research findings more if the research were reviewed by an independent committee, with 37% saying it would make no difference, and 10% saying it would make them trust the research less. Of course, I can’t help but wonder what the heck an “independent committee” would be. Pew doesn’t define it, and those who know enough about science know that peer review is essentially a small independent committee of two to four reviewers (most of the time), while study sections that review grant applications tend to be large committees of a dozen or more scientists. As they say, the devil is in the details, and it’s not clear just what form of an “independent committee” to review research would increase trust in the results of that research.
Funding sources matter, too:
Industry funding stands out as a factor Americans say leads to lower trust. A majority of Americans (58%) say they trust scientific findings less if they know the research was funded by industry groups.
The effect of government-funded research is less clear. About half of U.S. adults (48%) say learning that a study has been funded by the federal government has no impact on whether they trust its findings. The remainder is closely divided between those who say government funding decreases their trust (28%) and those who say it increases their trust (23%).
Similar factors inspire public trust in practitioners. About two-thirds of the public (68%) say they are more likely to trust practitioners’ recommendations if that practitioner is open to getting a second opinion. About one-quarter (23%) say a practitioner’s willingness to get a second opinion makes no difference, and just 7% say it decreases their trust.
Among practitioners, the answer is obvious: We really have to be a lot more transparent about our industry ties and acknowledge that even small gifts can influence our thinking. I’ve heard more doctors than I can remember vehemently deny that a dinner or gift from a drug rep influences them, but those denials are not consistent with what we know from science about human psychology.
Unfortunately, more government funding of research, while in my opinion a desirable thing, is not desirable to all:
Opinions about government-funded research differ by politics. Among conservative Republicans, just 9% say that government funding increases their trust in research findings, while 41% say it decreases their trust. In contrast, liberal Democrats are more inclined to say government funding increases (34%) rather than decreases (21%) their trust in scientific research.
These findings are in keeping with political divides over support for federal spending on scientific research and an array of other government policy and spending priorities.
As long as there remains a partisan divide on the question of research funding, I don’t know that scientists can do anything other than try to be as transparent as possible about their funding sources, because increasing industry funding risks increasing mistrust among liberals and increasing government funding risks increasing mistrust among conservatives.
In the end, we as scientists and physicians should be heartened that we are among the most trusted groups in the US. We should also be concerned, perhaps even alarmed, by the warning signs in this survey about public distrust of specific scientific disciplines among specific subgroups of Americans. More transparency about funding sources and industry ties could help alleviate some of that distrust, as could more open access to data, but the specifics of how we achieve these things will not be simple or uncontroversial.