
Discoveries

Ironing out a historical problem

By Amos Esty

In 1941, President Franklin Roosevelt faced a looming crisis: poor nutrition. Too many young men enlisting in the Army weren't getting enough vitamins and minerals, especially iron, in their diets. To produce hardier soldiers (and citizens), the government mandated that flour and bread be fortified with iron.

One result of this public-health intervention was a decline in iron-deficiency anemia. But there may be some less-fortunate consequences.

Lungs: Iron performs many critical functions, including carrying oxygen from the lungs to the rest of the body. But the body loses only a tiny amount of iron each day and has no mechanism for excreting the excess, so iron accumulates—usually starting in the teen years for men and after menopause for women.

For years, researchers have hypothesized that iron might contribute to a number of medical problems, but few randomized trials have been done to confirm the connection. One of the few has been led by Leo Zacharski, M.D., a professor of medicine at DMS. Since 1999, he's studied iron levels and disease, including cancer, in 1,277 patients in a trial funded by the Department of Veterans Affairs.

Levels: Zacharski measured the amount of ferritin—a protein that stores iron—in all the patients' blood. Then about half of them underwent periodic phlebotomies—or bloodletting—to reduce their iron levels. At the start of the trial, the average ferritin level for all participants was 122.4 nanograms per milliliter of blood (ng/ml). The goal for those in the reduction group was to keep their ferritin levels between 25 and 60 ng/ml.

The results confirmed the concern about the risks of excess iron. Of the 641 patients in the control group, 60 developed new malignancies during the study and 36 died of cancer. In the reduction group, only 38 of 636 patients developed malignancies and 14 died of cancer—a statistically significant difference.
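For readers who want to check the arithmetic, here is a minimal sketch in Python comparing the reported malignancy counts. The article does not say which statistical test the trial used, so the chi-square test below is only an illustration based on the published figures, not the study's actual analysis.

    # Rough check of the reported malignancy counts (illustrative only).
    # Rows: control group, iron-reduction group.
    # Columns: developed a new malignancy, did not.
    from scipy.stats import chi2_contingency

    table = [[60, 641 - 60],
             [38, 636 - 38]]

    chi2, p_value, dof, expected = chi2_contingency(table)
    print(f"Control malignancy rate:   {60 / 641:.1%}")   # about 9.4%
    print(f"Reduction malignancy rate: {38 / 636:.1%}")   # about 6.0%
    print(f"Chi-square p-value: {p_value:.3f}")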

Further insight came from the fact that not everyone in the reduction group followed the study's guidelines. The intention was that they'd have their ferritin checked every six months and have a phlebotomy if it was over 60 ng/ml, but some patients waited longer than six months. Zacharski says this variability strengthened the finding, because patients who stuck to the schedule were less likely to develop a malignancy than those who did not.

Questions remain about how iron could cause cancer, but Zacharski is convinced of the connection. "Iron loves to react with oxygen," he says. Its affinity for oxygen is what makes it essential for proper cellular functioning. Extra iron, however, can produce molecules called free radicals, known to damage DNA, proteins, and lipids—a possible pathway to malignancy. So is it time to ditch a policy from a different era? Yes, argues Zacharski. For most Americans today, he feels, iron supplementation is unnecessary—and, in some cases, harmful.
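For readers curious about the chemistry: the article does not name a specific reaction, but a textbook example of iron-driven free-radical production is the Fenton reaction, in which ferrous iron converts hydrogen peroxide into the highly reactive hydroxyl radical:

    Fe²⁺ + H₂O₂ → Fe³⁺ + •OH + OH⁻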


