The discovery of various vitamins – essential micronutrients that cause disease when deficient – was one of the great advances of modern scientific medicine. This knowledge also led to several highly successful public health campaigns, such as vitamin-D supplementation to prevent rickets.

Today vitamins have a deserved reputation for being an important part of overall health. However, their reputation has gone beyond the science and taken on almost mythical proportions. Perhaps this is due to aggressive marketing from the supplement industry, or perhaps recent generations have grown up being told by their parents, thousands of times, how important it is to take their vitamins or eat vitamin-rich food. Culture also plays a role – Popeye eating spinach to make himself super strong is an example of this pervasive message.

Regardless of the cause, the general feeling is that vitamins are all good – they are not only important for health, they actively promote health. Many people take vitamin supplements in the belief that more is better, or as nutritional “insurance” to make sure they are getting enough of every vitamin.

The problem with deeply embedded cultural beliefs is that people make decisions based upon assumptions that everyone “knows,” rather than making evidence-based risk vs benefit decisions. This phenomenon is exacerbated when the industry is able to make aggressive health claims without requiring any scientific evidence to back up those claims (as is the case in the US since DSHEA was passed in 1994).

It is therefore important to shatter the pedestal on which vitamins have been placed, and bring them down to the level of scientific evidence. The good news is that there is already a large body of research on vitamins, and they continue to be the subject of much new research. Each year, therefore, the risks and benefits of vitamins become clearer. One recent study that is getting much press adds to this body of knowledge.

In the latest issue of The Archives of Internal Medicine is a population-based observational study looking at health outcomes and vitamin use as part of the larger Iowa Women’s Health Study. The authors looked at 38,772 older women who were asked to self-report their vitamin use. This is a long-term study – vitamin use was reported in 1986, 1997, and 2004, and mortality was followed through 2008. The authors found a small but statistically significant increase in mortality for those taking multivitamins, B6, folic acid, iron, copper, magnesium, and zinc. There was also a small decrease in mortality for those taking calcium.

The strength of this study is that it is large, with long-term follow-up. There are many weaknesses, however. Vitamin use was self-reported. Further, this is a correlational study only, so possible confounding factors could not be controlled for. For example, it is possible that women who have an underlying health issue that increases their mortality were more likely to take vitamins, or to report taking vitamins. In fact, other studies suggest there is such a “sick-user effect” with vitamins.

It is therefore not possible from this study to draw any conclusions about cause and effect – that vitamin use increases mortality. But it does provide a cautionary reminder that it is not reasonable to assume that vitamin supplementation is without any risk. We still need to follow the evidence for the use of specific vitamins at specific doses for specific conditions and outcomes.

As is typical of observational studies, the results are somewhat mixed, depending upon the details of how such studies are conducted. There are also many variables to consider – which vitamins, at which doses, in which populations, with what health conditions. There is therefore a great deal of noise in the data. I do not think we can conclude that the vitamins listed above actually increase the risk of mortality. But neither can we conclude that there is any health benefit from routine supplementation. Years of research have failed to provide such evidence, and the mixed results we are seeing are consistent with there being no effect or only a small one.

Based upon the totality of evidence, the best current recommendation is to eat a well-rounded diet with sufficient fruits and vegetables, which should provide most people with all the micronutrients they require. There is no evidence to support routine supplementation. There is also reason to avoid taking megadoses of vitamins, which can cause toxicity – and even short of toxicity, the evidence of risk from supplementation becomes more compelling at higher doses.

But there are also many situations in which targeted supplementation is evidence-based and appropriate. There is increasing evidence to support vitamin D supplementation for many populations. Many elderly people have borderline or low B12 levels, which correlate with dementia. Pregnant women should take prenatal vitamins. (To give just a few examples.)

Vitamins are just like any other health care intervention – they have potential risks and benefits and it is best to follow the evidence. For most people the best advice is to ask your primary health care provider which supplements, if any, you should take. Recommendations should be based upon specific health conditions and blood tests to measure levels of vitamins, so that specific deficiencies can be appropriately targeted.

Posted by Steven Novella

Founder and currently Executive Editor of Science-Based Medicine Steven Novella, MD is an academic clinical neurologist at the Yale University School of Medicine. He is also the president and co-founder of the New England Skeptical Society, the host and producer of the popular weekly science podcast, The Skeptics’ Guide to the Universe, and the author of NeuroLogica Blog, a daily blog that covers news and issues in neuroscience, but also general science, scientific skepticism, philosophy of science, critical thinking, and the intersection of science with the media and society. Dr. Novella also contributes every Sunday to The Rogues Gallery, the official blog of the SGU.