Where did the iodine go?

In the early 1900s, landlocked areas of the US (the Great Lakes region, the Midwest, and the northwest mountainous regions) experienced an epidemic of goiter, a swelling of the thyroid gland due primarily to iodine deficiency. These regions have iodine-deficient soil because they have no access to the sea, the predominant source of iodine. Since the soil is deficient in iodine, plants grown in that soil are deficient in iodine as well. As a result, people in these areas became iodine deficient and, in response, their necks swelled as their thyroid glands grew to trap more iodine from the bloodstream.
[Map of the goiter belt, taken from learnendocrinology.blogspot.com]
To fight this goiter epidemic, iodide was added to salt, since salt consumption was fairly stable at the time. This successfully ended the goiter epidemic, but did it solve the iodine deficiency problem? There is no way of knowing for sure; however, a study done in Italy in 2006 found that despite a very low prevalence of goiter in Piedmont school children, 39% had low urinary iodine excretion levels, suggesting that the absence of goiter may not indicate iodine sufficiency(1). Since then, iodine has been removed from the diet in a couple of important ways, and levels are lower today than they were 40 years ago.
One of the ways iodine has been removed from the US diet is through poor food recommendations from the USDA. It is currently considered good practice to reduce salt and egg consumption to lower the risk of heart disease. Removing these two products from the diet greatly diminishes its iodine content, as they are two of the more significant dietary sources of iodine. In addition, many people have switched from iodized salt to sea salt because they have been told that sea salt is better. The sea salt most people consume does not contain any iodine or iodide, which is plainly written in small letters on every bottle. It's also ironic that most people who buy sea salt are consuming refined sea salt, which can be seen by looking at the color of the salt: if it's white and clean looking, you are essentially eating table salt without added iodine, since the beneficial minerals that make sea salt healthy have been removed in the refining process. So not only are you not getting any iodine, you aren't getting any of the benefits of sea salt either.
Another interesting fact most people don't realize is that up until the 1970s, iodate dough conditioners were used in bread to increase shelf life. Each slice of bread contained 150mcg of iodine, an amount equal to the RDA for adults. However, due in large part to a study performed on rats in 1948 by Wolff and Chaikoff(2), iodine was slowly removed from the medicine cabinet and from our food supply. In that study, the authors concluded that when serum levels of iodine reached a certain point, the thyroid stopped making thyroid hormones, an effect now called the Wolff-Chaikoff effect. This is a temporary phenomenon, perhaps an artifact of the body adapting to a change in its internal environment, and it reverses itself within about two days. The problem is that the authors never measured blood levels of thyroid hormones in this study, and the findings have never been confirmed in humans. So while the thyroid gland may have temporarily stopped making thyroid hormones, we have no idea whether this reflected a drop in circulating serum thyroid hormones. In addition, attempts to reproduce the Wolff-Chaikoff effect in rats have failed.
Two of the biggest problems with the findings of the Wolff and Chaikoff studies are that iodine had been used liberally and successfully in medicine for many conditions up until that point, and that the Japanese, who are considered much healthier than most populations, consume enormous quantities of iodine from seaweed compared to both the RDA and the thresholds implied by the Wolff-Chaikoff study. In a later paper published while he was with the National Institutes of Health (NIH), Dr. Wolff identified iodine consumption in excess of 200mcg as problematic and levels above 2000mcg (2mg) as potentially harmful(3). What is odd about these recommendations is that the vast majority of Americans would have consumed well above 200mcg by simply eating two slices of bread, yet there was no epidemic of thyroid disorders back when this was the standard. In fact, thyroid disorders have actually increased since iodine consumption was reduced. It doesn't end there: when you look at the levels of iodine consumed by the Japanese, these recommendations look even more foolish.
It is currently estimated that the Japanese consume an average of 1-3mg of iodine per day in the form of seaweed(4). Judging by the data compiled in that review, two things are evident. First, while the average Japanese iodine consumption is between 1 and 3mg/day, there appears to be large variability in consumption levels. The largest study in the review contained 4,138 subjects and found a mean urinary iodine concentration of 3300mcg/L. Since the average person urinates 2L/day, this works out to a daily urinary excretion of 6600mcg of iodine, which, assuming roughly 90% of ingested iodine is excreted in the urine, would indicate an intake of about 7333mcg/day, nearly 37 times the 200mcg level Wolff flagged as problematic. Second, as the Japanese have adopted more of a Western diet, iodine consumption has dropped considerably, as can be seen from the fact that iodine consumption is much higher in older Japanese generations than in younger ones(4). In addition, Japanese people who move to the United States have much higher rates of the diseases discussed in part 1 of this blog series than do Japanese people remaining in Japan(4).
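To make the arithmetic behind that estimate explicit, here is a short Python sketch. The 2L/day urine volume comes from the text above; the 90% urinary excretion fraction is an assumption commonly used in back-of-the-envelope estimates like this one, not a figure taken from the cited review:

```python
# Back-of-the-envelope estimate of daily iodine intake from a urinary
# iodine concentration. Assumptions (not from the cited review): the
# average person produces 2L of urine per day, and roughly 90% of
# ingested iodine ends up excreted in the urine.
URINE_VOLUME_L_PER_DAY = 2.0
URINARY_EXCRETION_FRACTION = 0.90  # assumed, for illustration only

def estimated_intake_mcg(urinary_iodine_mcg_per_l: float) -> float:
    """Convert a urinary iodine concentration (mcg/L) to an estimated
    daily intake (mcg/day)."""
    excreted_mcg = urinary_iodine_mcg_per_l * URINE_VOLUME_L_PER_DAY
    return excreted_mcg / URINARY_EXCRETION_FRACTION

intake = estimated_intake_mcg(3300)  # the 3300mcg/L mean from the Japanese study
print(round(intake))                 # ~7333 mcg/day
print(round(intake / 200))           # ~37x Wolff's 200mcg threshold
```

Change the excretion fraction and the intake estimate scales accordingly, but even under generous assumptions the implied intake dwarfs Wolff's thresholds.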
One final example of the use of high doses of iodine can be found by perusing the iodine forums on Curezone. Guided by Drs. Abraham, Brownstein, and Flechas, many people are ignoring the Wolff-Chaikoff recommendations and using very high doses of iodine to reverse many of the issues discussed in part 1 of this blog series. These people are using 50mg, 100mg, and even 150mg of iodine successfully for a range of issues. This has spawned many books, including Dr. Brownstein's Iodine: Why You Need It, Why You Can't Live Without It, as well as The Iodine Crisis by Lynne Farrow. In The Iodine Crisis, Farrow reports many cases of successful high-dose iodine use found on Curezone as well as at her site www.breastcancerchoices.org.
There is no doubt that iodine consumption has decreased substantially since these changes were made to our food system. The National Health and Nutrition Examination Survey (NHANES) has tracked nutritional intake since 1971. Between 1971 and 1999, urinary iodine excretion dropped by 50%(5). However, researchers consider the US population iodine sufficient because the level of iodine found in the urine indicates that iodine consumption approximates the 150mcg RDA for adults. That is, of course, assuming you believe 150mcg to be an adequate amount of iodine. One of the odd things about this data is that the urinary iodine excretion of the 1970s would indicate significant iodine excess by the Wolff criteria. Despite the potential iodine excess back then, and the more appropriate iodine intake (by Wolff's standards) now, thyroid disorders as well as breast cancer rates have increased substantially over this period. Breast cancer rates were 1 in 20 in the 1970s and have risen to 1 in 7 today.
If iodine sufficiency is attained at 150mcg of iodine per day, you would expect higher intakes to simply lead to more iodine excretion in the urine. A study done by Koutras et al. in 1964 found that at iodine intakes of up to 800mcg/day, the body accumulated up to 7mg of iodine over the course of 12 weeks with no change in thyroid output(6). This would indicate not only that 150mcg of iodine per day is not adequate for iodine sufficiency, but also that the extra iodine is being used by tissues other than the thyroid gland.
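Taking the Koutras figures at face value, a quick sketch of the retention arithmetic shows what fraction of the ingested iodine the body held onto. This is my own illustration of the numbers quoted above, not a calculation from the paper itself:

```python
# Rough retention arithmetic from the Koutras et al. figures quoted above:
# an intake of 800mcg/day sustained for 12 weeks, with 7mg (7000mcg)
# accumulating in the body over that period.
DAYS = 12 * 7                              # 12 weeks
total_intake_mcg = 800 * DAYS              # total iodine ingested over the study
retained_mcg = 7000                        # reported whole-body accumulation
retention_pct = 100 * retained_mcg / total_intake_mcg

print(total_intake_mcg)                    # 67200 mcg ingested in 12 weeks
print(round(retention_pct, 1))             # ~10.4% retained rather than excreted
```

In other words, roughly a tenth of the ingested iodine was being stored somewhere in the body rather than excreted, which is what points to uptake by non-thyroid tissues.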
Conclusion

Changes to the food system have substantially decreased the iodine intake of people in the United States. While urinary iodine excretion levels indicate iodine sufficiency, the data currently used to set iodine intake recommendations are not adequate. It seems foolish to use a single study performed in rats as the basis for recommending the intake of any nutrient in humans. While it seems safe to say that the RDA of iodine is adequate to prevent goiter, it is a big jump to use it as the basis for total body iodine sufficiency. Further studies are needed to identify the level of dietary iodine necessary for optimal health. The fact that the Japanese, as well as other Asian cultures, consume far more iodine than Americans and tend to have better health outcomes indicates that higher levels of iodine intake are, at the very least, tolerated very well and, at best, better for your health. This wide range of tolerable iodine intake, coupled with the temporary nature of the Wolff-Chaikoff effect, suggests that the effect may merely be the body adapting to a rapid change in iodine intake rather than evidence of damage from high iodine intake.
In the final installment of this series, we will look at the iodine loading test, how to safely use high levels of iodine from food or supplements, and factors that may increase your iodine needs. While removing iodine from bread may have been a bad idea, what it was replaced with may have made matters far worse.