Recent evidence shows that public trust in science and vaccines has declined markedly since the pandemic. Why is this, and is there anything we can do about it?
Given all the denial of the science behind vaccines, GMOs, evolution, and climate change, you might think that Americans in general distrust scientists and physicians. In fact, they don't: trust in scientists and doctors remains high overall, but there are still areas where mistrust of scientists is a significant problem. What can be done?