Chapter Two
Iodine: An Essential Dietary Element
Introduction
The protagonist here is iodine.[1] This chapter provides the medical background for the remainder of this work. Conventional medical presentations focus on an organ, a physiological system, or a disease, while this presentation focuses on an element. My approach relates the dynamics of iodine's movement through physical, commercial, and organic systems to the theory of trace elements, to iodine's essential role in thyroid function, and to the most common disorders engendered by iodine deficiency.
Western biomedicine recognizes the importance of iodine, but ordinary clinicians often take adequate iodine intake for granted. Spanish physicians did not normally question, or were not normally taught to ask, whether their thyroid patients' iodine intake was sufficient, though many patients came from areas previously identified as iodine deficient. Clinical symptoms were therefore interpreted as idiopathic (stemming from no known external cause) rather than as the result of malnutrition. This is an instance of a widespread tendency in the medical profession to assume wrongly that a disease is idiopathic, and to treat it as such, when it should in fact be seen as a case of endemic malnutrition that must not only be treated but also prevented.
Knowledge of iodine as an essential dietary element, and of the consequences of its absence, is crucial to correct diagnosis and action. I seek to give an overview of these matters here. First, I discuss the way iodine normally moves through the environment and within organisms. A discussion of iodine-related pathologies follows.
Physical and Organic Iodine
Theory of Trace Elements
A trace element's essentiality is difficult to demonstrate, for, in contrast to bulk and macro elements, which are ingested and concentrated in living tissue at levels measured in grams and kilograms, trace elements are ingested and concentrated in tissue at levels measured in milligrams and micrograms. The biological role of only a few of these elements is known at present, but the list is expanding. An early definition of essentiality held that an element is essential if it is required for the maintenance of life and if the organism dies in its absence. This definition was problematic, however, for even in a laboratory experiment it is difficult to eliminate all traces of any particular element and hence to demonstrate that death follows from total deficiency. As a result, a more workable definition of essentiality has been proposed:
An element is essential when a deficient intake consistently results in an impairment of a function from optimal to suboptimal and when supplementation with physiological levels of this element, but not of others, prevents or cures this impairment. (Mertz 1981)
A trace element is now considered essential if, when ingested in suboptimal amounts, it impairs function and, on supplementation, function is restored. This change in definition is significant for health policy because the presence of apparently unafflicted individuals amid a population believed to be deficient posed, under the old definition, a problem: their very presence cast doubt on the notion that the element was essential to the maintenance of life. A well-formed, intelligent individual amid a cretinous and goitrous population seemed, in the case of iodine, to call into question the whole idea of essentiality. The new definition disposes of that obstacle to prophylaxis.
The Dose Response Curve
The dose response curve (fig. 1) illustrates the new definition. It facilitates consideration of impaired function and deals with overintake as well as deficiency. Arsenic's toxic effects in large doses are well known, for example, but its deficiency effects are only beginning to be documented. Conversely, effects of molybdenum deficiency were well known before its toxic effects were even surmised.
The shaded area on the left in figure 1 shows impaired function below a certain threshold and adequate function above it. The shaded area on the right indicates the dysfunctional aspects of overdosage. The intake of iodine at either extreme can produce a hypo- or hyperfunctional thyroid gland. Optimal function takes place within a wide range of intake, allowing for daily and seasonal variation. People can take in most of their annual iodine requirement, for example, over the course of a fishing season. The breadth of that safe margin makes it unnecessary for policymakers to spend time pinpointing "locally ideal" levels of supplementation.
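For readers who prefer to see the logic of the curve stated operationally, the following sketch (in Python) maps a daily intake onto the three zones of figure 1. The threshold values are purely illustrative placeholders, not clinical standards.

```python
# Illustrative sketch of the dose-response logic of figure 1.
# Both thresholds are hypothetical, chosen only to mark the zones.

DEFICIENT_BELOW = 50    # micrograms/day (illustrative)
EXCESSIVE_ABOVE = 1000  # micrograms/day (illustrative)

def classify_intake(micrograms_per_day: float) -> str:
    """Map a daily iodine intake onto the three zones of the curve."""
    if micrograms_per_day < DEFICIENT_BELOW:
        return "deficient (hypofunction)"
    if micrograms_per_day > EXCESSIVE_ABOVE:
        return "excessive (hyperfunction)"
    return "optimal"

print(classify_intake(150))  # optimal
```

The width of the middle zone is the point: any intake landing on the plateau, however much it varies from day to day or season to season, yields the same classification.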
A trace element does not act by itself. Its efficacy depends on organification, that is, on its becoming part of a carbon compound within a living organism. It becomes effective only on forming part of larger molecules, such as the pair of thyroid hormones, T3 and T4.
Homeostatic mechanisms buffer the ends of the range of optimal intake. Supraoptimal amounts of a trace element may simply be excreted when intake far exceeds the required level. Suboptimal intake may be buffered, as in the case of iodine, by shifting production of hormone to T3, the generally more potent of the pair, which uses fewer atoms of iodine.
Below, I describe how iodine moves through the environment, the food chain, and the body; how certain factors impede its transformation into hormone; how the body responds to marginal intake; and the disorders in which iodine deficiency plays an important though often poorly appreciated role. This understanding of the cycle, and of iodine physiology and pathology, comes from standard biomedical sources (Stanbury 1969, 1978; Matovinovic 1983; Fisher 1983; Utiger 1979; Tepperman 1980; Petersdorf 1983; Netter 1965; Pitt-Rivers 1961; Thompson and Thompson 1980).[2]

Fig. 1.
Dose Response Curve (based on Mertz 1981:1332)
My purpose in focusing on the element needs to be underscored and explained: conventional medical presentations leave clinicians and health officials without a proper appreciation for the movement of iodine through the physical and organic world, setting the stage for taking the presence and availability of iodine for granted. Prophylaxis may take a back seat to therapeutics when this movement fails to be appreciated.
The Cycle of Iodine in the Environment
Iodine makes up only a minute fraction of the earth's crust (on the order of half a part per million) and is unevenly distributed. It is present in rock and earth in the form of soluble iodine salts that, when taken up by plants, enter the food chain. Iodine's solubility makes it prone to being leached out of soil, especially in areas of heavy precipitation. In this way, it gravitates toward the sea, where it becomes concentrated.
Oceanic evaporation permits iodine to become airborne and return to the land by way of atmospheric iodine transport. Climatic forces of glaciation and high precipitation leach iodine out of high-lying mountain areas such as the Alps, Himalayas, and Andes, leaving many mountain populations severely iodine deficient.
Leaching is particularly severe where the parent rock is limestone, as in the Cantabrian range of central and eastern Asturias. Limestone lowlands, once glaciated, tend also to be poor in iodine. In such areas, problems of iodine deficiency are compounded, for limestone dissolves as water percolates through it, thus charging groundwater with minerals. Ingested with drinking water, those minerals bind with iodine, making it less available for organification. The "goiter belt" of the United States, stretching from New York State to Minnesota and beyond, exemplifies such a case. As a general rule, the farther an area lies from the sea, the more slowly it is replenished by atmospheric transport.
Iodide is more abundant in rock and soil than in seawater, but the life forms that thrive in seawater concentrate it, for example, in kelp and fish thyroids. These substances themselves, or the ash derived from them, have long been used in China, the Andes, and Asturias as folk remedies for goiter.[3]
The largest natural storehouse and site of extraction of iodide is the Chilean nitrate bed, which was formed when ancient sea-
beds became mineralized. Until recently, most of the world's iodide production came from this deposit. With the multiplication of industrial uses of iodine in the twentieth century, iodide production has diversified, drawing on both minerals and plants for raw material. Kelp, for example, is harvested on Asturian shores and sent to other Spanish provinces for processing into gums and chemicals. Indeed, more than 99.5 percent of the world's current production of iodide and iodate is destined for industrial ends not related to nutrition. Supplementation of the world's human population with prophylactic iodine would annually take no more than 370 tons. However scarce iodine may be, even in the diets of people harvesting it from the sea for industrial purposes,[4] it cannot be considered a scarce world resource.
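The arithmetic behind the 370-ton figure can be checked on the back of an envelope. The sketch below multiplies the 150-microgram daily supplement by person-days per year; the population figure is an assumption (roughly the world population of the late 1980s), since the text does not state one.

```python
# Back-of-envelope check of the claim that supplementing the whole
# world's population would take no more than 370 tons of iodine a year.
# The population figure is an assumption, not taken from the text.

population = 5.3e9        # people (assumed, late-1980s world population)
dose_g_per_day = 150e-6   # 150 micrograms, expressed in grams

annual_tonnes = population * dose_g_per_day * 365 / 1e6  # grams -> metric tons
print(round(annual_tonnes))  # about 290 tons, under the 370-ton ceiling
```

At roughly 290 metric tons a year under these assumptions, the requirement indeed stays comfortably below the chapter's 370-ton ceiling, and is a vanishing fraction of industrial production.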
Dry salt mined from interior deposits may, before it is purified, be rich in iodide. But contrary to popular belief, solar salt and sea salt made from iodine-rich brine are not themselves rich in iodine, for the iodine, along with other impurities, is drawn off before the salt is harvested. Only artificial applications of iodide during later stages of salt manufacture ensure its iodine content.
Drinking water is frequently used as an indicator of local iodine status, though humans rarely receive more than 10 percent of their dietary iodine from drinking water. It may be an appropriate indicator of intake if one recognizes that water draining the local environment generally reflects the iodine content of the vegetation and thus the iodine status of people subsisting chiefly on locally grown plant food. It is, however, a poor indicator when the diet includes goitrogens (see below) or many foods of animal origin, since terrestrial iodine becomes concentrated at the top of the food chain. This means that people with greater access to milk, eggs, blood, and meat—to foods at the top of the food chain—are less likely to experience pathology than those subsisting almost exclusively on a diet of roots, nuts, and grain. A dual diet within a single zone can thus exempt the richer segment of society from symptoms while producing them in the poorer. Unfortunately, this differential effect props up belief in the innate vulnerability of the poor, while seeming to undermine the environmental hypothesis.
Iodine has been withdrawn or added to diets in unexpected ways. Disturbance of trade routes or a change in salt supply has
brought symptoms of iodine deficiency to populations formerly free of them. In Nepal, for example, newly available solar salt has supplanted the unrefined rock salt formerly transported by animal power over difficult mountain passes (Mumford pers. comm.). In New Guinea, noniodized commercial salt has suppressed traditional salt laboriously extracted from certain rare iodide-concentrating plants (Buchbinder 1977).
Commerce and industry have adventitiously introduced iodine in several ways. Subsistence agropastoralists turning to commercial feeds, for example, have inadvertently introduced iodine from outside the local ecosystem into their own food chain.[5] People have unknowingly absorbed iodine in medications and applied it as a first aid measure to the skin. The expanding food industry has introduced it into food, prompting the National Academy of Sciences to propose that "any additional increases should be viewed with concern. It is recommended that the many adventitious sources of iodine in the American food system, such as iodophores in the dairy industry, alginates, coloring dyes and dough conditioners, be replaced wherever possible by compounds containing less or no iodine" (National Academy of Sciences 1970). A more balanced statement by the academy would have addressed not only national surfeits but also global deficiencies, taking into account as well the dangers at the low end of the dose response curve. The academy thus displayed the unexamined assumption of "iodine affluence" characteristic of much of Western biomedicine. Health workers in the Midwest have recently reported the reappearance of goiter on farms (NYT Sept. 29, 1987:1), calling into question the assumption of iodine affluence even in the United States. In chapter 7, we will see how this assumption has been exported around the globe.
The Physiology of Iodine
Basic Understandings
Marine demonstrated in 1915 the essential role of iodine in thyroid physiology. His findings led to pilot iodization projects in both Switzerland and the United States. Favorable evaluations led to mass prophylactic programs carried out by governmental authorities in Switzerland and essentially by commercial entities in the
United States (Matovinovic 1983). Mass prophylaxis was not, however, extended to most other populations in which deficiency was known to be endemic, such as people residing in the Alps of Austria, Germany, and France or in parts of Scandinavia and Spain. Why this should be so is, of course, the problem of this book. To begin to answer that question, one must know the basic scientific premises on which the prophylactic programs of the 1920s were launched.
Iodine compounds, once ingested, are broken down and pass into the blood as inorganic iodine (see fig. 2). The thyroid then captures the circulating iodine, joins it onto proteins, and transforms it into the hormone thyroxine, which is stored in the thyroid and released into the blood stream as needed. Thyroxine is essential for optimum growth and for the metabolic processes taking place in tissue. After the hormone has been used, it is broken down and its iodine component recirculated, part of it passing out of the system by way of the kidneys. Iodine lost through this route is known as urinary iodine excretion (UIE),[6] a measure used to assess the iodine status of a population. By WHO standards, a population is iodine deficient when its average UIE falls below 50 micrograms (µg) per day. Mass iodine supplementation averaging 150 micrograms per day gradually raises a population's UIE to normal.
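The WHO criterion just cited reduces to a simple threshold test. A minimal sketch, assuming UIE values are reported in micrograms per day and that a plain arithmetic mean is an adequate population summary:

```python
# Minimal sketch of the WHO-style criterion stated above: a population
# counts as iodine deficient when its average urinary iodine excretion
# (UIE) falls below 50 micrograms per day.

def population_iodine_deficient(uie_samples_ug_per_day):
    """True if the mean UIE of the sampled population is below 50 ug/day."""
    mean_uie = sum(uie_samples_ug_per_day) / len(uie_samples_ug_per_day)
    return mean_uie < 50

# Hypothetical survey values, for illustration only.
print(population_iodine_deficient([30, 42, 55, 38]))  # True  (mean ~41)
print(population_iodine_deficient([120, 95, 140]))    # False
```

Note that the criterion is a population average: individual readings above the threshold (like the 55 in the first sample) do not exempt a deficient population, just as the new definition of essentiality anticipates.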
Goiter is an enlargement of the thyroid gland, variably manifest as a bulging growth situated at the front of the neck, a diffuse thickening, or an enlargement behind the sternum. The enlargement permits the gland to trap a higher proportion of circulating iodine. Supplementation diminishes the need for trapping and permits glands not too long established to recede to normal size.
In the early days of prophylaxis in Europe, a set of "anthropological" traits was also taken, apart from goiter, as an indicator of iodine deficiency. Corporeally, these indicators were short stature, dwarfism, structural peculiarities of the shoulder, hip and foot defects, and a peculiar walk. Facially, they were a broad nose bridge, droopy eyes, and lack of expression. Prevalence of these signs in combination with a conspicuous number of deaf-mutes was taken as a sign of severe iodine deficiency. Severely impaired individuals were known as cretins.[7] Their pathology was seen as separate from but related to endemic goiter, for scientists and laymen had long observed that endemic cretinism was rarely found where goiter was not also endemic.

Fig. 2.
The Thyroid and Its Feedback System
Public health officials in Switzerland, where cretinism was endemic, therefore targeted the iodine supplement at both conditions, while in the United States, where cretinism was not endemic,[8] goiter alone was targeted for eradication. That difference in targets becomes significant in elucidating the obstacles to prophylaxis, for iodine prophylaxis in the United States came to be
associated exclusively with the prevention of goitrous deformities, not with the prevention of motor and sensory disabilities.
As a result of these differing approaches, the essentiality of iodine came to be widely appreciated: in Switzerland, through official public health channels and in the United States, through commercial advertisements for salt. However vaguely, people consuming iodized salt accepted the theory of iodine deficiency.
Except for the Swiss, however, few Europeans were exposed to the theory of iodine deficiency. Endemic goiter nevertheless gradually declined in most of the Western world as food supplies became increasingly delocalized and as iodine entered increasingly into the diet. In other words, neither supplementation nor public education played a significant role for most Europeans in the decline of endemic goiter and cretinism.
Medical publications reflected this decline: once the overt threat of endemic goiter had receded, so did articles focusing on the once-threatening disease. However vividly the theory of iodine deficiency had once been presented on both sides of the Atlantic,[9] the public and its physicians before long came to take iodine sufficiency for granted.
Current Understandings
The basic knowledge on which prophylaxis was established has, over the intervening sixty years, been elaborated into a refined theory and practice that is a powerful agent in the management of thyroid disorders (see, e.g., Stanbury 1978). But these advances concern us here in only a limited way: (1) insofar as they promote dietary intervention or cast doubt on it, and (2) insofar as they prepare us to understand the villagers' symptoms and the treatments to which, as we will see in the ethnography, they have been exposed. Four examples will illustrate these advances.
First, concern about the loss of homeostasis has restrained many physicians from endorsing prophylaxis. These physicians were convinced that the sudden introduction of physiological amounts of iodine into individuals long adjusted to a scarcity of iodine might trigger hyperthyroidism (Plummer 1936). This conviction was propounded with much flair during the prophylactic era, so that persistent fears about sudden iodization lingered even after the
idea was disproved. In Europe, the feared phenomenon came to be known as "Basedowification"—hyperthyroidism renamed for Basedow, a nineteenth-century physician who was a militant opponent of iodine supplementation. In his day, supplements were administered on an empirical basis, in doses now known to have been of pharmacological, rather than physiological, magnitude. Such doses may well have prompted pathology. Fear of Basedowification is now uncalled for, however, because, among other reasons, dietary iodine supplements are available only in physiological doses. Yet at least one diagnostic manual recently republished in Spain still cautions physicians against abusive self-dosing (Marañón y Balcells 1984).
Second, the discovery of thyroid stimulating hormone (TSH) strengthened the view that goiter is an anatomical/physiological adaptation that need not be prevented. The adaptationist view holds that TSH, rising in response to low levels of circulating thyroxine, prompts the proliferation of thyroid cells, implying thereby neither dysfunction nor uncontrolled cellular proliferation. TSH does stimulate the thyroid into work hypertrophy and does enable it to trap a higher proportion of circulating iodide; in this sense, it is indeed adaptive. The view fails to take into account, however, that under conditions of optimal iodine intake, rising TSH warns of dysfunction. The adaptationist view also fails to take into account the higher risk of thyroid cancer.
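The feedback mechanism underlying the adaptationist view can be caricatured in a few lines. In the toy model below, every constant is hypothetical; the sketch shows only the direction of the responses described above: TSH at baseline when circulating T4 meets the body's set point, and TSH rising as T4 falls short.

```python
# Toy model of the negative-feedback loop of figure 2: TSH rises as
# circulating thyroxine (T4) falls below a set point, and elevated TSH
# in turn drives the gland's work hypertrophy and iodide trapping.
# All constants are hypothetical and illustrate direction only.

T4_SET_POINT = 1.0  # arbitrary units
GAIN = 2.0          # how strongly TSH responds to a T4 shortfall

def tsh_level(t4: float) -> float:
    """TSH rises in proportion to the shortfall of T4 below the set point."""
    shortfall = max(0.0, T4_SET_POINT - t4)
    return 1.0 + GAIN * shortfall  # baseline TSH of 1.0

print(tsh_level(1.0))  # 1.0: adequate T4, baseline TSH
print(tsh_level(0.5))  # 2.0: low T4, TSH climbs
```

The model also makes the author's objection concrete: a rising TSH output is "adaptive" only in the sense that it compensates; under optimal iodine intake the same rise would be read as a warning of dysfunction.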
Third, the uneven distribution of goiter, which tends to affect females more than males, has come to be understood in the following way. Estrogens increase during adolescence, rise during pregnancy, fluctuate during menopause, and are exogenously introduced by way of birth control pills. Estrogens increase the binding of iodine, making it less available for organification. Thus, periods of elevated estrogen production in the female life cycle increase the need for iodine and thyroid hormone. Increased levels of TSH reflect this need and may, during these phases of a woman's life cycle, drive the gland into hypertrophy so as to meet the increased demand. This sex difference—where the severity of endemic iodine deficiency is such that the necks of most males appear normal—allows goiter to be seen as a woman's problem rather than a problem of malnutrition that, however variably,[10] affects both
sexes. Observed solely as a woman's problem, goiter seemed to call for therapeutics rather than for massive dietary intervention to correct the underlying environmental deficiency.
Fourth, thyroxine was in 1953 differentiated into two hormones, T3 and T4, differing in potency and in the number of iodine atoms in the molecule (MIT and DIT).[11] Knowledge of the two hormones seemed for a time to make dietary intervention less urgent, for it was observed that under conditions of iodine deficiency, the T3/T4 ratio shifted in favor of T3, the more potent hormone. Animal experiments later disclosed that while T3 does rise compensatorily in most of the body, it does not rise in brain tissue, where T4, under conditions of suboptimal dietary intake, is already low. In rats, this constellation of hormone levels was accompanied by suboptimal brain function (Greene 1973; Escobar del Rey et al. 1981b), which improved measurably after supplementation. These animal experiments led researchers to infer that the brain function of clinically symptomless children might also be improved by supplementation.
Indeed, supplementation has been found to increase the level of circulating thyroid hormones in children whose hormone levels were within the so-called normal range (Connolly, Pharoah, and Hetzel 1979). The rise occurred only in subclinical cases, in children free of symptoms who nevertheless had low levels of hormone. The rise did not occur in children whose hormone levels were normal and whose iodine intake was optimal. These findings suggest that the apathy and low cerebral function attributable to suboptimal hormone levels tend to escape the clinician's notice. Both hormone levels and function can be improved, however, as was shown when the children's biochemical levels and school performance both rose on supplementation. One can therefore conclude that the hormonal shift preserves corporeal but not cerebral homeostasis. In other words, it protects the body more than the brain (Lancet 1979:1165–1166, 1983:1121–1122), making dietary intervention more urgent. Denying iodine supplements to a population because it is not blatantly goitrous may then be seen as a means of keeping it apathetic and docile. The absence of goiter, in short, is no reason to withhold iodine prophylaxis, for subclinical thyroxine levels reduce vigor and intelligence (DeLong, Robbins, and Condliffe 1989).
Other Factors: Goitrogens and Metabolic Error
Western medicine, during the prophylactic era, understood iodine deficiency as the major cause of endemic goiter and cretinism. Since that era it has given increasing prominence to the role of goitrogens and metabolic error.
Goitrogens are substances or agents that induce goiter. They act in at least three ways, by affecting (1) the absorption of iodine into the bloodstream, (2) the chemical coupling of MITs and DITs to tyrosine, and (3) the binding of molecules. Goitrogens play an insignificant role in goiter and cretinism where the diet is varied, but where it is not (as in rural Asturias), they can play an extremely important one. In this chapter, I discuss only the goitrogenic mechanisms. In chapter 5, I will show how poverty induces a high goitrogen intake and how a selectively goitrogenous diet, for the physiological reasons given here, helps to keep segments of the population socially marginalized.
Absorption is affected by thiocyanate, a substance produced either in the liver or the intestine during the course of digesting foods from three botanical groups.[12] Cassava (genus Manihot), a starchy root widely consumed in the developing world, belongs to the family Euphorbiaceae. Cabbage, cauliflower, broccoli, rapeseed, kale, collards, and turnips belong to the genus Brassica, while radish, cress, and mustard are members of related genera of the same family, the Cruciferae. Foods from these botanical groups are widely consumed in Europe and the temperate parts of the world, sometimes as garnishes and often as daily fare.
Thiocyanate produced by these goitrogenous foods preempts the sites on fatty acids to which iodine ordinarily binds for its passage through the intestinal membranes. Thiocyanate thus impedes the passage of iodine into the circulation and promotes its loss through feces. This loss is insignificant where iodine is abundant, but where it is scarce, thiocyanate slows down hormone production.
Perchlorate is another goitrogen acting in the same preemptive way in the intestine. While present in a variety of nuts, perchlorates are especially abundant in chestnuts (Castanea sativa), beechnuts, and acorns.[13] In historical Europe, these nuts were considered "hungry foods," consumed as staples of daily fare only during war or when grain crops failed. But as will be seen in chapters 5 and 7, they were regularly consumed where grain was habitually scarce.
Goitrins, another form of goitrogen, interfere with the coupling of MIT and DIT molecules. They too may be derived from cruciferous plants, more from turnips than from cabbage, becoming goitrins only in the presence of certain intestinal parasites that arise locally. They may also, as in one well-known case, be derived from volatile compounds of geologic origin. Where, because of the prohibitive cost of fuel, only the richer segment of the population boiled its water, driving off these volatile compounds, only the poor became goitrous (Gaitán 1974). Goitrins like these pique the curiosity but play an insignificant role in the global distribution of endemic goiter and cretinism and contribute little to understanding the obstacles to prophylaxis.
Finally, there are mineral goitrogens that, absorbed through drinking water and passed into the bloodstream, act to bind iodine, thus making it less available for organification. The best-known mineral goitrogens occur in groundwater flowing over limestone bedrock, where minerals such as calcium and fluorine dissolve out of the rock, enter the water supply, and are ingested with drinking water. Mineral goitrogens like these are characteristic of central Asturias and of the historical goiter belt of the American Midwest.
Hereditary metabolic error has come to assume, since midcentury, an increasing role in thyroidology and has become an important consideration in the diagnosis and management of goiter. Metabolic errors may impede iodine metabolism at several sites: they may impede the transport of iodine, the coupling and breakdown of molecules, and the recapture of iodine, leading in these several ways to symptoms like those produced by suboptimal iodine intake. While metabolic errors must be seriously considered in any idiopathic case of goiter or of other thyroid-related disease, they have rarely been shown to play an important role in endemic goiter and cretinism. Even where the stage has been set for the concentration of metabolic error in inbred populations, iodine supplements have dramatically reduced the incidence of iodine deficiency disorders (IDD). However, since popular interest in inbreeding outweighs popular interest in prevention, it serves the interest of the opponents of prophylaxis to stir up renewed interest in inbreeding, thus distracting attention from prevention.
Supplementation
Determination of appropriate levels of iodine supplementation has frequently posed what might be considered a spurious problem for health officials deliberating over the institution of mass prophylaxis. This is largely because the breadth of the margin of optimal intake—as we saw in the dose response curve—has not been widely appreciated.
Even when mass prophylaxis was initially being tested, iodine supplementation gave generally satisfactory results. It was seen early on as preventing the appearance of goiter in the young; reducing diffuse, hyperplastic goiters; promoting the gestation and birth of normal offspring; and—once supplementation had been under way for the length of a gestation period—halting the addition of cretins and congenitally deaf individuals to the population. Goiters did not, however, recede in those cases where a stimulus to cell growth had been present over a long period.
In cases where goiter was well established and of long duration, supplementation was counterproductive, making some older women's hypertrophied glands tender and painful and raising the threat of recurrent problems. This response made some physicians hesitate to introduce mass prophylaxis, for they feared inducing toxic goiter, an acute and life-threatening form of hyperthyroidism.[14] As a result of such experience, guidelines were developed suggesting that iodine supplements be withheld from goitrous individuals over age forty. Such individuals could be supported with desiccated thyroid or with the synthetic thyroid hormone that became available after midcentury.
Supplementation posed other problems. Mistakes were made in the early prophylactic programs when very goitrous individuals were wrongly offered hope that their goiters would recede. When the goiters, unresponsive to supplementation, did not recede, their bearers were, on occasion, coerced into having them excised under primitive village conditions.[15] It is understandable that women operated on so peremptorily might become apprehensive and resistant when large-scale prophylactic programs were later
undertaken. Memories of such interventions are passed down by word of mouth and produce psychological obstacles that, on the inauguration of new prophylactic campaigns, require sensitive management.[16]
Prophylactic programs were originally aimed at eradicating only goiter and cretinism, since only these were understood as the acute and measurable manifestations of iodine deficiency. The conditions impeding vigor and cerebral function, surmised in those days but not then amenable to quantification, were therefore not targeted for eradication. Medical men working in nonsupplemented areas where the incidence of endemic goiter was nevertheless declining measurably came to assume, wrongly, that iodine intake was in the optimal range.[17]
Under these circumstances, goiter and thyroid complaints brought to the clinic ceased to be viewed as responses to malnutrition. Instead, they came to be seen as "idiopathic" thyroid disease. At least in Asturias, therefore, thyroid disorders came to be managed surgically, pharmaceutically, or with radiation.[18] But there is little reason to think these invasive practices were peculiar to this region or to Spain. There is reason to think that, after the decline of endemic goiter in most of the Western world, enlargements of the thyroid came generally to be seen as idiopathic.[19] The idea that, on a global scale, most thyroid disorders are preventable thus gradually faded away.[20]
Disorders Stemming from Iodine Deficiency
Hypothyroidism
In hypothyroidism, too little hormone is produced for the maintenance of ordinary metabolic processes. A woman who is pregnant and hypothyroid runs the risk of giving birth to a cretin. Symptoms of hypothyroidism are morphological, dermal, neural, and behavioral. Facial expression may be dull, and the voice may become hoarse. Swelling occurs around the eyes, and the eyelids droop. Hair becomes sparse, coarse, and dry. The skin, especially on the shins, becomes scaly and thick. Hypothyroidism impairs memory and intellectual function, and apathy takes over. Some patients become psychotic. Hypothyroidism affects the heart, reflexes, and
menses. Anemia may occur as a result of prolonged and excessive bleeding, or menses may disappear, leading to premature aging. In biochemical terms, there is a drop in T4, uncompensated by rising T3.
Hypothyroidism is treated with animal or synthetic thyroxine. Treatment runs the risk of side effects and overreactions that tip the body toward hyperthyroidism.
Hyperthyroidism
An overactive thyroid gland is hyperthyroid. Symptoms are diverse: neural, ocular, cardiac, metabolic, and behavioral. Patients may suffer from nervousness, tremors, and insomnia; pressure behind the eyes, spontaneous tears, and photophobia; heat intolerance; abnormal sweating; irregular and rapid heartbeat; changes in appetite and weight loss; and hyperactivity and fatigue. Hyperthyroidism, if accompanied by fever, can lead to psychosis and coma and is therefore life threatening.
On a global scale, hyperthyroidism tends to be secondary to simple goiter, hypothyroidism, and the treatment of these conditions. American medical researchers have found, however, that most of the hyperthyroidism seen in the United States is neither iatrogenic nor secondary to simple goiter but immunologic in origin. These findings apply to urban populations whose iodine status, generally good, is the result of a diversified diet.
Patients are tested for T3 and T4 and for radioactive iodine uptake (RAI). Antithyroid agents[21] are given to impair the coupling of MIT and DIT and thereby decrease hormone levels. Arriving at the minimal dose that will produce a euthyroid state is difficult and risky, for antithyroid medications have side effects such as nausea and loss of the sense of taste and run the risk of inducing hypothyroidism. Radiation and surgery are other treatments for hyperthyroidism, but these are reserved especially for goiters that are large or multinodular. Pharmacologic doses of iodine are administered prior to surgery to reduce the vascularity of the thyroid.[22] Goiterectomized patients in the United States are put onto replacement hormone; elsewhere, as in Spain, the remaining portion of the gland is usually expected to maintain adequate hormone production. Because hyperthyroidism often produces hypothyroidism and raises the probability of thyroid cancer, it is standard practice in the United States, but not always elsewhere, to monitor patients after excision. However adaptive a simple goiter may be, the complications just sketched make goitrousness a serious health hazard.
Incidence
The incidence of goiter, its sequelae, and other thyroid conditions is difficult to know, especially in isolated or impoverished areas or where people are unaccustomed to receiving or demanding modern medical service. Incidence is affected by a number of factors. For example, the incidence of hyperthyroidism in populations long free of endemic goiter is low compared to that in populations where goiter was until recently endemic. At the same time, and at a local level, a high reported incidence of hyperthyroidism may include cases that elsewhere would be separately classified as autoimmunologic conditions.
Traditional classification systems make it difficult to assess the incidence of thyroid disorders stemming from iodine deficiency. The conventions of international nomenclature have not combined the disorders under a single rubric that signals their common cause. The term iodine deficiency disorders (IDD) has been proposed to fill this need (Lancet 1983:1165), but as of 1988 no medical data system, such as MeSH or ICD, had adopted the term. Hyperthyroidism is tallied on one list, toxic coma on another,[23] and cretinism on yet another. In the absence of a unifying concept, the incidence of IDD, as evidenced by these symptoms alone or in combination, will remain difficult to ascertain, as it was for this investigator in a modern Spanish village.
Cretinism: The Transgenerational Effect of Hypothyroidism
Cretinism results from a deficiency of thyroid hormone during gestation. It is caused by an inadequate maternal intake of iodine or by an enzymatic defect in the infant. Cretinism becomes apparent during the first year of life and is manifested in stunted growth and intellect. Endemic cretinism occurs in areas of endemic iodine deficiency; when it occurs otherwise, in areas not known to be deficient in iodine, it is attributed to the presence of a defective enzyme and is called sporadic cretinism. While the causes are different, the outcomes can be so similar as to be viewed under the single concept of congenital hypothyroidism.
Endemic cretinism is the result of a sequence of processes. The levels of estrogen and iodine-binding proteins rise rapidly in a woman who has recently conceived, and on occasion the demand rises so high as to shift a nongoitrous euthyroid woman into hypothyroidism during the first eleven weeks of gestation. In that event, the gestating fetus is deprived of thyroid hormone, for during this early part of development he is totally dependent on maternal hormone. Since critical developmental events ordinarily take place during this period, deprivation of thyroid hormone at this time adversely affects the organism's eventual neurological and intellectual capacity (Fierro-Benitez 1968; Hetzel 1989, chaps. 3 and 6).
The exact sequences and timing of these developmental events are difficult to pinpoint in humans, but the implications of hormone deprivation can be inferred from rats made hypothyroid in the laboratory. In the offspring of hypothyroid mother rats, neural tissue and the myelin sheath develop poorly, so that nerve impulses are slowed and the potential complexity of perceptual and behavioral responses is reduced. In rats, these failures and developmental delays cannot be compensated for by introducing exogenous hormone after the fourteenth postnatal day.
The developmental agenda for humans is different. After the twelfth week, the human fetus begins to produce its own hormone, but adequate production depends on the availability of unbound iodine in the shared bloodstream of mother and fetus. While the exact timing is unknown, a limit presumably exists for humans also, after which muscular coordination and learning ability are forever impaired.
Natural experiments have suggested what that agenda may be. An iodized oil injection received by a woman before she conceives has been found to protect her offspring from iodine deficiency during his uterine phase, thus protecting him from endemic cretinism. By contrast, the newborn with an uncompensated enzymatic defect who becomes cretinous has obtained the benefits of maternal hormone during the critical period of nerve development. He has lacked it only during the latter phase of gestation, when he is dependent on his own hormone production. Postnatal introduction
of exogenous hormone effectively compensates for his enzyme defect and permits recovery from the developmental delay if treatment is begun promptly and sustained for life.[24]
Developmental deficiencies may be compounded by postnatal sociogenic brain damage (Montagu 1972). Since the hypothyroid infant is relatively unresponsive to most stimuli and elicits fewer social interactions, he is predisposed to this additional kind of brain damage.
Paradoxically, then, and contrary to popular belief, the congenital hypothyroidism of the endemic cretin is not genetic, while genetic defect does characterize the sporadic cretin whose heredity the public rarely calls into question. This irony has implications for mass screening and prophylaxis, as will be seen in chapter 7.
It is a mistake to believe that endemic cretinism is hereditary. It is also a mistake to believe that the most common thyroid disorders are idiopathic or autoimmunologic in origin. These mistaken beliefs, operative until very recently in a modern industrialized nation and probably still operative in many parts of the world, pose an obstacle to prophylaxis, for they wrongly classify most of the world's thyroid disorders into categories of disease that have little to gain from dietary intervention.
Conclusion
I began by elucidating the theory of trace elements and describing the iodine cycle. I then traced the movement of iodine through the body to show the pathological effects of iodine deficiency and of the ingestion of goitrogens. Disorders such as hypothyroidism, hyperthyroidism, motor and cerebral defects, and cretinism were traced to underlying iodine deficiency. Attention was drawn to the historical difficulties in recognizing the common denominator of these iodine deficiency disorders and in recognizing that, on a global scale, they represent the greatest proportion of thyroid disease. IDD, like other kinds of deficiency disease, can be dealt with effectively only if it is first seen to exist in relation to a set of contexts that are historical, social, and political. These will now be examined.