If you’re thinking of adding vitamins and supplements to your health maintenance routine in the new year, you may be better off improving your diet instead. A recent panel of experts concluded that taking pills containing vitamins and minerals may not actually do much to improve your health.
A group of experts known as the U.S. Preventive Services Task Force reviewed the evidence on whether vitamin supplement pills are as beneficial as they claim, and the results weren’t particularly promising. The task force concluded that it’s generally far more effective to get vitamins from food sources. While vitamins, antioxidants, and other nutrients have been clearly linked to better health, the body’s ability to absorb vitamins depends on how they are ingested.
Nutrient-dense foods like fruits and vegetables contain a variety of vitamins and compounds that work together, amplifying the nutritional benefits and making it easier for the body to process each vitamin. Experts suggest that taking a vitamin in isolated pill form may reduce or even eliminate its health benefits because the supporting nutrients aren’t there to help it work.
Of course, further studies are needed to confirm the task force’s findings, and if your doctor recommends or prescribes a specific supplement, take it as directed. But if you’re heading out to buy vitamins of your own accord in an effort to be healthier, your time and money may be better spent on fruits and vegetables.