Preferred Citation: Greenwald, Howard P. Who Survives Cancer? Berkeley: University of California Press, 1992. http://ark.cdlib.org/ark:/13030/ft9b69p365/


 
Chapter Four
Can Cancer Be Prevented?

Exogenous Causes of Cancer: Opportunities for Prevention

Effective cancer prevention requires recognition of exogenous causes of the disease—factors that originate outside the body. Some experts, in fact, prefer to characterize these factors simply as "environmental." The exogenous causes of cancer include a large number of agents and stimuli recognizable above the level of the cellular changes that initiate malignant disease. Phenomena of this kind are presumably more easily recognizable than those lurking in the deep recesses of cells. Once recognized, it is thought, they can be controlled.

Physicians and scientists have recognized exogenous causes of cancer for hundreds of years. Discovery of such factors often began as an observation that some people, particularly those in certain trades or professions, have especially high risks of developing cancer. Exposure to hazards at the workplace, such as industrial chemicals, has presented the most obvious risk. Modern observers have included radiation, diet, and personal lifestyle practices in this area of concern, expanding the notions of "exogenous" and "environmental" beyond their original meanings.

Exogenous stimuli appear to cause or contribute to the initiation of cancer by damaging the structure of DNA in the cell. An exogenous stimulus may also promote the proliferation of cancer cells and the development of tumors by helping create conditions within the body that are favorable to cancer growth. One exogenous factor may have its most powerful effect when combined with another, the two multiplying each other's influence.

Exogenous causes of cancer, proven and suspected, became public issues in the 1960s and 1970s, surfacing alongside more general concerns with ecology and the environment. Chemical hazards and pollution attracted the widest attention. In 1967, one authority estimated that 90 percent of cancers were caused by chemicals in the environment.[1]

E. Boyland, "The Correlation of Experimental Carcinogenesis and Cancer in Man," Progress in Experimental Tumor Research 11 (1967): 222-34.

A highly comprehensive analysis, examining chemicals (mostly industrial exposure), radiation, diet, alcohol, and tobacco, concluded that "in most parts of the United States in 1970 about 75 or 80 percent of the cases of cancer in both sexes might have been avoidable."[2]

R. Doll and R. Peto, "The Causes of Cancer: Quantitative Estimates of Avoidable Risks of Cancer in the United States Today," Journal of the National Cancer Institute 66 (1981): 1196-1308.

These and other estimates have led to a widespread belief among both laypeople and health professionals that most cancers are environmental in origin. A strong tradition of research has indeed identified many exogenous factors as causes of cancer. But the conclusion that most cancer is avoidable is so striking that the related research requires very careful examination. Assessment of the degree to which prevention can actually replace cure requires weighing the evidence. Risks associated with chemicals, lifestyle, and other exogenous causes of cancer must be systematically compared with those that unquestionably reside, in varying degrees, within the organs, tissues, and cells of every human being.

Hazardous Substances And Chemicals

The late twentieth century brought widespread awareness among the public and health professionals that exposure to the by-products of industrial development increased the individual's likelihood of contracting cancer. Many industrial processes utilize or produce carcinogenic substances to which workers face potential exposure. People with no industrial exposure face similar hazards, as suspect chemicals are released into the atmosphere and water supply, remain active in manufactured products, or are produced by consumers themselves.

Astute observers of public health have realized the connection between hazardous substances and cancer for centuries. As long ago as the 1500s, observers linked exposure to arsenic with skin disease. In the twentieth century, researchers demonstrated that people exposed to arsenic in the metal-smelting industry or through air and water pollution tended to develop cancers of the lung and skin. In the 1700s, the English surgeon Percivall Pott noticed that chimney sweeps had a surprisingly high probability of developing cancer of the scrotum. Progress in the techniques available to science and medicine allowed twentieth-century investigators to link scrotal cancer to a specific chemical found in coal tar (benzopyrene).

The period between the late 1800s and the mid-twentieth century brought a rapid succession of discoveries linking industrial chemicals and cancer. Milestones included discovery of a link between organic chemicals used in the German dye industry and bladder cancer (1895); between chromates and lung cancer in miners and smelters (1948); between asbestos and a variety of cancers in several industries (1949).[3]

P. Decoufle, "Occupation," in Cancer Epidemiology and Prevention, ed. D. Schottenfeld and J. F. Fraumeni (Philadelphia: Saunders, 1982), pp. 318-35.

Inaction by industrialists or public officials in response to long-recognized hazards has produced outrage among some public health spokespersons. As long ago as the 1800s, life insurance companies noted the high rate of early deaths among workers in locomotive shops, where much asbestos was used, and refused them coverage. Yet asbestos remained in widespread use through most of the twentieth century. Thousands received massive exposure to asbestos with minimal protection in the shipyards of World War II. Millions completely outside asbestos-related industries received doses of asbestos through its use in the construction of homes, schools, hospitals, roads, and recreational facilities, or its presence in food, water, and air. A crusade by Dr. Irving Selikoff and others in the 1960s and 1970s finally mobilized organized labor and the public health profession to safeguard asbestos workers and remove asbestos from the environment.

By the late twentieth century, scientists were taking more proactive steps to identify cancer-causing substances. One method, rodent bioassay, simply exposed laboratory mice and rats to high concentrations of the suspected substance and observed the animals over their lifetimes to determine whether cancers developed. Because rodent bioassay was expensive and time-consuming, scientists developed an alternative method, called "mutagenesis assay," for testing suspected cancer-causing agents. This type of test makes use of the fact that cancer-causing chemicals usually have their strongest effects in the initiation phase of cancer development. Typically, these agents cause cancer by damaging DNA, resulting in production of abnormal cells. In mutagenesis assay, investigators expose well-known strains of bacteria to the substance suspected of causing cancer, and look for signs that the exposure gives rise to mutant cells.[4]

B. I. Weinstein, "The Scientific Basis for Carcinogen Detection and Primary Prevention of Cancer," Cancer 47 (1988): 1133-41.

If mutant cells do appear, the suspect chemical is subjected to further testing.

Even the development of actual cancer in rodents does not indicate that a chemical causes cancer in humans. Such a conclusion would require epidemiological validation, based on comparison of cancer rates among people exposed to the suspected agent and those without such exposure. At least in the modern United States, scientists do not purposely expose people to suspected chemicals to see whether they cause cancer. They may, however, keep track of people who work in industries that produce, utilize, or otherwise promote exposure to suspect chemicals. The experience of workers in such industries generates knowledge about substances with which citizens and consumers may come in contact regularly, although in weaker concentrations.

By the 1980s, scientists had accumulated solid evidence that workers in a small number of industries faced heightened risks of cancer, which could be traced to specific chemicals and processes. A landmark review by the English scientists Richard Doll and Robert Peto of all research completed by 1980 serves as an excellent source of information in this crucial area.[5]

Doll and Peto, "The Causes of Cancer."

Miners and metal refiners face a variety of cancer risks as a result of chemical exposure. Copper and cobalt smelters have elevated risks of skin and lung cancer because of contact with arsenic, a hazard shared by some pesticide manufacturers. Cadmium workers have higher rates of prostate cancer because of their exposure to the metal. Chromium workers, including people who manufacture paints containing chromium, have high rates of lung cancer. Nickel refiners have an elevated tendency to contract nasal, sinus, and lung cancers.

In other industries, complex organic chemicals—particularly polycyclic hydrocarbons, elaborate chains and rings of carbon atoms—have been found to cause cancer. Persons employed in dye and rubber manufacturing may be exposed to a class of chemicals called aromatic amines, which cause bladder cancer. Workers exposed to benzene in the manufacture of glues and varnishes may contract blood-related cancers. Persons who work with coal tar derivatives—for instance, gas workers, roofers, asphalters, and aluminum refiners—have a tendency to develop cancers of the skin, scrotum, and lung. Vinyl chloride causes cancer of the liver among workers who manufacture this widely used substance; hardwood furniture makers and leather workers, exposed to wood and leather dusts in their industries, tend to develop cancers of the nasal sinuses.

According to Doll and Peto, cancer-causing chemicals and processes encountered in industry accounted for about 4 percent of all cancer deaths in the United States in 1978. Taking into account both the potency of the agents involved and the number of people affected, they concluded that the greatest threats occurred in lung and bladder cancer and leukemia. These researchers estimated that industrial hazards caused 15 percent of the deaths from lung cancer, 10 percent of the deaths from bladder cancer, and 10 percent of the deaths from leukemia. Radiation, which may cause leukemia and which sometimes occurs as an industrial hazard, receives detailed attention below.

People who work outside the industries described above face hazards related to some of the same substances, which are distributed as components of consumer goods or are purposely or accidentally released into the environment. Asbestos used in construction and some consumer goods represents a continuing threat. Cancer has been linked with a variety of pesticides, including DDT and the grain fumigant EDB (ethylene dibromide).[6]

B. N. Ames, R. Magaw, and L. S. Gold, "Ranking Possible Carcinogenic Hazards," Science 236 (1987): 271-80.

While workers involved in the manufacture or utilization of these chemicals encounter higher doses, others may take in their residues with food.



Radiation

"Radiation" joins "hazardous chemicals" as a major scare word related to cancer. The term radiation generally refers to energy emitted from a source in the form of particles or waves. All radiation transmits energy from its source to a target: radio waves from an oscillating electrical circuit, radar beams from a radar transmitter, light from an incandescent filament. The level of energy associated with a particular form of radiation is measured by its frequency and wavelength. Higher frequency and shorter wavelength correspond to greater energy. The range of energy levels in various kinds of radiation is called the electromagnetic spectrum, encompassing, in order of increasing energy, radio waves, microwaves, infrared light, visible light, ultraviolet light, and ionizing radiation. Ionizing radiation, with sufficient energy to knock electrons out of their atomic orbits, encompasses X rays, alpha and beta particles, gamma rays, and, at the far end of the spectrum, cosmic rays. Ultraviolet light and all ionizing radiation are known to cause cancer.

Like hazardous chemicals, ionizing radiation may damage the structure of DNA required for normal cell reproduction. When high-energy particles or rays strike individual cells, they impart energy to the cells' molecules, including the DNA. Many scientists adhere to the so-called two-hit theory, believing that radiation must affect at least two very nearby cells to produce visible tissue abnormality.[7]

J. D. Boice and C. E. Land, "Ionizing Radiation," in Cancer Epidemiology and Prevention, pp. 231-53.

This logic suggests that greater radiation exposure will result in more tissue damage and an increased possibility of cancer. Certain kinds of ultraviolet light may also induce changes in DNA.[8]

J. Scotto, T. R. Fears, and J. F. Fraumeni, "Solar Radiation," in Cancer Epidemiology and Prevention, pp. 254-76.

Like hazardous chemicals, radiation has become a major public concern. Part of this concern has arisen from the belief that officials in government and industry, while aware of the risks represented by radiation, kept this information secret. The comments of Dr. H. Jack Geiger, a professor of community medicine and prominent public health spokesperson, exemplify this concern. With reference to the Hanford facility in Washington, operated since World War II for nuclear weapons research and manufacture, Geiger wrote in 1990:

Never have so many human beings been exposed to so much radiation over so long a time—while so few knew about it. Now, more than 40 years after it began, we are just beginning to learn what the U.S. government's Hanford nuclear weapons plant in Washington State has done to thousands of citizens who unwittingly drank radioactively contaminated water, breathed radioactively contaminated air, and fed milk laced with radioactive iodine to their children.

… From 1944 to 1947 alone, the nuclear weapons factory spewed 400,000 curies of radioactive iodine into the atmosphere. The bodily absorption of 50 millionths of a single curie is sufficient to raise the risk of thyroid cancer. For years thereafter, Hanford poured radioactive water into the Columbia River, and leaked millions of gallons of radioactive waste from damaged tanks into the groundwater.

Geiger charged the "Department of Energy, its predecessor agencies and their contractors" with an "official coverup," "two decades of lies," and "callous indifference to public health."[9]

H. J. Geiger, "Generations of Poison and Lies," New York Times, August 5, 1990.

Potentially cancer-causing radiation comes from many less sensational sources than the Hanford plant. Since the 1800s, medical and scientific reports have noted high rates of skin cancer among people such as farmers and sailors exposed to large amounts of sunshine. Several forms of skin cancer are relatively harmless, not spreading very far beyond their origins and easily removable by surgery. Malignant melanoma, however, which comprises about 5 percent of all skin cancers, was estimated to cause 6,300 deaths in the United States in 1990.[10]

American Cancer Society, Cancer Rates and Risks, 1990 (Atlanta: American Cancer Society, 1990).

Ultraviolet light is the cancer-causing agent in sunshine. Fair-skinned people living in desert areas or southern latitudes face the highest risks of contracting skin cancer. Data collected around 1970 illustrate the effect of sunlight on these risks. White people living in Fort Worth, Texas, had well over twice the risk of developing malignant melanoma as those in Minneapolis, Minnesota. Fort Worth is located at 32.8 degrees north latitude and receives twice as much ultraviolet radiation as Minneapolis at 44.9 degrees north.[11]

Scotto, Fears, and Fraumeni, "Solar Radiation."

Ionizing radiation is more powerful than sunlight as a cause of cancer. Workers in industries that utilize radioactive materials experience elevated risks of cancer. Medical and dental procedures expose patients to radiation. People near the sites of atomic and hydrogen bomb detonations have absorbed significant amounts of radiation in their bodies. The natural environment itself contains many sources of radiation. In addition to sunlight, naturally occurring radioactive material in the earth heightens cancer risks. Cosmic radiation from outer space contributes significantly to the incidence of cancer.

Workers in industries that utilize radioactive materials have been the subject of intense interest. A classic investigation of women employed in the radium dial–painting industry before 1930 revealed a high incidence of bone cancer.[12]

R. E. Rowland, A. F. Stehney, and H. F. Lucas, "Dose-Response Relationships for Female Radium Dial Workers," Radiation Research 76 (1978): 368-83.

In order to keep their points fine, workers often licked the brushes they used to paint the dials of watches and clocks. This practice resulted in actual ingestion of the radium. A study of uranium miners indicates that they are over five times as likely as the general population to develop respiratory cancers.[13]

V. E. Archer, J. D. Gillam, and J. K. Wagoner, "Respiratory Disease Mortality among Uranium Miners," Annals of the New York Academy of Sciences 271 (1976): 280-93.

Security regulations prohibited independent scientists from obtaining data on the health of workers at nuclear facilities such as Hanford for most of the twentieth century. Studies based on these data were not yet available at the time this book was written.

Physicians and dentists must share the blame for many avoidable cancers. The history of medicine before the late twentieth century was marked by extensive overuse of radiation. Doctors used X rays to treat rheumatoid conditions, enlarged thymus glands, ringworm, and tonsil ailments. These procedures resulted in elevated rates of cancer in the organs targeted for medical treatment and in anatomical sites in the path of the radiation beams.[14]

Boice and Land, "Ionizing Radiation."

Growing awareness of the hazards represented by radiation led medicine to abandon many of these practices around mid-century. Physicians and dentists, of course, still use X rays and other radiation-dependent imaging techniques to help diagnose illnesses and injuries. These techniques have become considerably safer over the years because equipment now focuses X-ray beams more precisely and films require less exposure time.

Still, the use of radiation-based diagnostic tools—even such techniques as mammography, designed to provide early detection of breast cancer—involves some cancer risk. One investigator has identified medical procedures as the source of the greatest average radiation exposure for the United States population. He indicates that people who undergo medical procedures involving radiation receive an average dose of 120 millirems (a measure of radiation absorbed by the body). In comparison, the average worker at a nuclear power plant receives a dose of 400 millirems—higher, but within the same general range.[15]

R. E. Shore, "Electromagnetic Radiation and Cancer: Cause and Prevention," Cancer 62 (1988): 1747-54.

Populations near the detonation sites of atomic and hydrogen bombs received high doses of radiation from the blasts themselves and from subsequent fallout. The continuing study of persons who survived the atomic bombings of Hiroshima and Nagasaki in 1945 must serve as one of the key documents of the nuclear age. Increased rates of leukemia, which were most marked among young children, became apparent one year after the bombing. The risk of leukemia continued to rise for six to seven years after the bombing, and then declined steadily. Forty-three years after the bombing, leukemia deaths among those exposed to intense radiation were more than double the leukemia deaths among those not exposed. Deaths from other cancers also increased, though at a lesser rate, including cancers of the stomach, colon, lung, breast, urinary tract, and bone marrow.[16]

Y. Shimizu, W. J. Schull, and H. Kato, "Cancer Risks among Atomic Bomb Survivors," Journal of the American Medical Association 264 (1990): 601-4.



Exposure to radioactive fallout also increases cancer risks, although to a much lesser degree. Two hundred residents of the Marshall Islands who were exposed to fallout from a United States hydrogen bomb test in 1954 were examined regularly for twenty years. Girls (but not boys or adults of either sex) developed thyroid cancers in unexpectedly high numbers.[17]

R. A. Conrad, "Summary of Thyroid Findings in Marshallese 22 Years after Exposure to Radioactive Fallout," in Radiation-Associated Thyroid Carcinoma, ed. L. De Groot (New York: Grune and Stratton, 1977), pp. 241-57.

Military personnel present at a famous Nevada nuclear detonation called "Smoky" experienced an increased risk of leukemia—exposed soldiers developed leukemia at over twice the rate normally expected.[18]

G. G. Caldwell, D. B. Kelly, and C. W. Heath, "Leukemia among Participants in Military Maneuvers at a Nuclear Bomb Test: A Preliminary Report," Journal of the American Medical Association 244 (1980): 1575-78.

Among children growing up in Utah during the era of aboveground bomb testing, fallout exposure was weakly related to death from leukemia. According to a highly regarded study led by Dr. Walter Stevens of the University of Utah, the relationship between exposure to fallout and death from leukemia was so weak that it could have arisen purely by chance.[19]

W. Stevens et al., "Leukemia in Utah and Radioactive Fallout from the Nevada Test Site," Journal of the American Medical Association 264 (1990): 285-91.

Natural sources of radiation include soil, rocks, food, and water that contain naturally occurring radioactive elements, as well as cosmic rays. People living in Denver experience higher radiation exposure than those at sea level because of their greater exposure to cosmic rays. It is estimated that 50 percent of radiation exposure to the United States population comes from natural sources. According to one investigator, natural background radiation accounts for an average dose in the United States population of 80 millirems, compared with an average of 3 millirems for airline travelers and about 1 millirem for those who live near (but do not work in) nuclear power plants.[20]

Shore, "Electromagnetic Radiation and Cancer."

Radon, an inert, radioactive gas occurring naturally in the soil, has drawn significant attention in the parts of the United States where it is common—for instance, in certain areas of the Northeast. This gas seeps into houses, giving rise over time to radioactive compounds. A lifetime of breathing radon at levels often found in the home produces a lung cancer risk of 1 per 1,000, in the absence of other risk factors. Between 0.5 and 2.0 percent of U.S. homes have radon levels eight times the normal level, raising the risk of lung cancer to 16 per 1,000.[21]

National Council on Radiation Protection, Evaluation of Occupational and Environmental Exposures to Radon and Radon Daughters in the United States (Bethesda, Md.: National Council on Radiation Protection and Measurements, 1984).

Public concern has arisen in recent years over the possibility that still other forms of radiation may cause cancer. High-tension power lines produce electromagnetic fields, which, according to some, can cause leukemia. Studies of this form of radiation, whose energy level is too low to cause ionization, have not produced any clear indication of danger.[22]

Shore, "Electromagnetic Radiation and Cancer," p. 1748.

A large study, involving 20,000 Navy radar technicians who performed tasks associated with microwave radiation, found no increase in the risk of cancer.[23]

C. Robinette, C. Silverman, and S. Jablon, "Effects upon Health of Occupational Exposure to Microwave Radiation (Radar)," American Journal of Epidemiology 112 (1980): 39-53.

Computer programmers, word processors, and others who work at cathode-ray terminals (CRTs) have become concerned that radiation from these devices exposes them to cancer risk. CRTs are essentially television screens, which do emit some ionizing radiation. The amount of radiation an average American absorbs per year from television viewing is about 0.5 millirem—small in comparison with the annual radiation exposure of the average airline traveler, and tiny in comparison with natural background radiation.

Diet

Most health professionals and an ever larger segment of the public recognize the importance of diet in the maintenance of good health. Health authorities were joined by gurus and cultists of all kinds in the late twentieth century in urging the adoption of specialized food regimens. The death of nutrition spokesperson Adelle Davis from cancer at a not especially old age in the 1970s may have weakened the dietary commitments of some. But, while perhaps exaggerated in some quarters, diet has real importance in prevention of illness. Substantial research suggests that diet affects cancer risk.

Food Additives.

"Poison in the food" always attracts attention. Throughout the late twentieth century, the news media carried stories about newly discovered cancer risks from chemicals widely used in the flavoring, coloration, or preservation of food. Alleged danger from food additives has always made good copy. But there is little or no scientific evidence that food additives pose significant, widespread cancer risks.

Today food additives undergo careful laboratory screening. Before such testing was routinely performed, some chemicals later found to cause cancer were widely used. A cancer-causing chemical dye called "butter yellow," for example, was used for many years to make margarine look like the real thing. The additive is no longer used for this purpose. The most intense discussion in the United States during this era focused on saccharin, a ubiquitous sweetener discovered in 1879, and nitrites, a class of chemicals long used to preserve and flavor meats. Saccharin and nitrites have been regularly ingested by millions of people.

Several widely used artificial sweeteners were legally banned as food additives in 1969. Federal legislation a decade earlier had required the U.S. Food and Drug Administration to ban all food additives that had been found to cause cancer in human beings or animals. Rats fed and injected with large quantities of saccharin had been shown to develop bladder cancer.

The rat study, however, did not demonstrate that saccharin could cause cancer in human beings. It is practically impossible to imagine a human taking in as much saccharin per body weight as the laboratory rats had received. Other species of laboratory animals, including mice, hamsters, and monkeys, did not develop cancer when fed large quantities of saccharin. The United States experienced no major increase in bladder cancer after the introduction of saccharin; diabetics, more likely than others to use sugar substitutes such as saccharin and cyclamates, evidenced no special proclivity to cancer.[24]

Doll and Peto, "The Causes of Cancer."

Evidence linking nitrites in food with cancer in humans proved to be even weaker. Experiments with animals indicate that ingested nitrites react with other substances derived from food, as well as digestive juices, to form carcinogenic compounds in the body.[25]

P. N. Magee, R. Montesano, and R. Preussmann, "N-Nitroso Compounds and Related Carcinogens," in Chemical Carcinogens, ed. C. E. Searle (Washington, D.C.: American Chemical Society, 1976), pp. 491-625.

In rodents and other animals, the compounds derived from nitrites have caused cancers of the liver, kidney, esophagus, and respiratory tract.[26]

S. A. Yuspa and C. C. Harris, "Molecular and Cellular Basis of Chemical Carcinogenesis," in Cancer Epidemiology and Prevention, pp. 23-43.

But serious doubts remained about whether nitrites taken in with food significantly contributed to the formation of carcinogens in the body. No scientific consensus ever developed about the degree of risk (if any) that people incurred when they ate meat or other foods prepared or preserved with nitrites. While public concern was aroused and "aware" individuals may have avoided them, nitrites were never banned.

Dietary Fat.

Perhaps no other food component has received as much attention as fat. Most people know that a diet high in fats can lead to impaired blood circulation, causing heart attacks and strokes. The effects of fatty foods on personal appearance have doubtless contributed to the widespread belief that they are deleterious to health. Certainly, research has identified body weight as a contributor to heart attacks, strokes, and other life-threatening illnesses. More recently, scientists have uncovered evidence that a diet containing too much fatty food can also cause cancer.

Scientists who believe that a connection exists between fatty foods and cancer have several powerful theories to support their thinking. One of the major theories focuses on the promotion phase of cancer development, during which changes in body chemistry stimulate a few abnormal cells to multiply. Researchers have reported that fatty substances in the diet give rise to high levels of the hormones that promote development of tumors. All things being equal, people who consume a fatty diet risk becoming overweight, their weight gain coming primarily from an increase of fat (adipose) tissue. An important study has demonstrated that an enzyme in human fat tissue enables the body to produce an especially potent hormone.[27]

P. C. MacDonald et al., "Effect of Obesity on Conversion of Plasma Androstenedione to Estrone in Premenopausal Women with and without Endometrial Cancer," American Journal of Obstetrics and Gynecology 130 (1978): 448-55.

This and related hormones, it is believed, create an environment in the body that encourages whatever small number of abnormal cells may exist to multiply and form visible tumors.[28]

B. E. Henderson, R. K. Ross, and L. Bernstein, "Estrogen as a Cause of Human Cancer: The Richard and Hilda Rosenthal Foundation Award Lecture," Cancer Research 48 (1988): 246-63.

Researchers have demonstrated that fat people face especially high risks of cancer. A massive study by the American Cancer Society in the 1970s, which followed 750,000 people for thirteen years, demonstrated that the overweight were much more likely to die from cancer than people in the normal weight range. Women whose weight was 40 percent above normal, for example, had five times as high a rate of death from endometrial cancer and twice as high a rate from cervical cancer as those of normal weight. Men whose weight was 40 percent above normal had a 70 percent greater chance of dying from colon cancer and a 30 percent greater chance of dying from prostate cancer.[29]

E. A. Lew and L. Garfinkel, "Variations in Mortality by Weight among 750,000 Men and Women," Journal of Chronic Diseases 32 (1979): 563-76.

Other studies reported even stronger relationships between overweight and dying from some forms of cancer.

Not all scientists believe that hormones associated with fat promote cancer development.[30]

C. W. Welsch, "Interrelationship between Dietary Fat and Endocrine Processes in Mammary Gland Tumorigenesis," Progress in Clinical and Biological Research 222 (1986): 623-54.

But some scientists have developed explanations that do not involve hormones as intervening factors in the linkage between fatty diets and cancer. These researchers focus on colon cancer and the effects of dietary fat on bowel contents. In a series of research studies beginning in the 1960s, scientists have examined the feces of population groups throughout the world, comparing those that have high rates of colon cancer with those having low rates.[31]

D. G. Zaridze, "Environmental Etiology of Large Bowel Cancer," Journal of the National Cancer Institute 70 (1983): 389-400.

These comparisons have revealed that groups whose diets were high in fats had high concentrations in their large intestines of certain steroids, bile acids, and microorganisms. Members of these groups were the very ones who faced the highest risks of contracting cancer of the colon. Ongoing research in this area proved inconclusive, however: some studies continued to demonstrate a relation between eating fat and developing colon cancer, while others yielded neutral or contradictory results.[32]

T. Byers, "Diet and Cancer," Cancer 62 (1988): 1713-24.

Nobody really knows why some groups that have little fat in their diets are less likely to develop cancer. Reports in this area tend to identify total dietary fat as the culprit. Attempts to find relationships with more specific dietary components and their products have not been successful. Meat eating (except for fats contained in many meats) does not seem to be related to malignancies. Serum cholesterol, important in the development of heart disease, has not been isolated as a cause of cancer.



In the late twentieth century, researchers characterized some forms of cancer as "diseases of the affluent." Colon and breast cancer, for example, seemed to strike the city dweller, who had sufficient resources to consume processed foods, meats, and other fat-containing items, more often than the peasant who obtained his or her nutrition from grains and other traditional sources. A comparison of Japanese-Americans living in Hawaii in the 1970s with both Hawaiian Caucasians and Japanese living in Japan seems to support this general impression. Among the Japanese-Americans living in Hawaii, rates of colon and breast cancer more closely approximated those of the Caucasian population than those prevailing in Japan.[33]

J. Waterhouse et al., eds., Cancer Incidence in Five Continents, vol. 3 (Lyon: International Agency for Research on Cancer, 1976).

It is tempting to infer that the Japanese-Americans in Hawaii, mostly descendants of immigrants in earlier generations, had adopted the food habits (including increased fat consumption) of their Caucasian neighbors, and thus incurred similar cancer risks.

Beta-Carotene and Vitamin A.

The possibility that vitamin A and substances related to it prevent cancer has caused great excitement in medical and public health circles. Interest has focused principally on beta-carotene. Found in green and yellow vegetables (especially, of course, in carrots), fruits, and other foods of plant origin, beta-carotene is converted to vitamin A in the intestines. In the 1970s, researchers began to suspect that beta-carotene inhibited the promotion of cancer within the body. Although abnormal cells may have already been present, the theory went, beta-carotene kept them from developing into truly malignant cells and proliferating to become visible tumors. Evidence for this theory began to accumulate from studies using experimental animals and cell cultures.

Several large-scale, long-term studies concluded in the 1970s and 1980s indicated relationships between vitamin A (and related substances) and reduced risk of lung cancer in humans. A five-year study of over 8,000 Norwegian men indicated that those whose diet included larger amounts of foods containing (or capable of producing) vitamin A experienced lower risks of lung cancer. Reduced occurrence of lung cancer was found even among cigarette smokers. This finding was confirmed by studies of thousands of additional individuals in the United States, Singapore, and Japan. Two important studies saved and analyzed blood samples from apparently healthy people in Maryland and Japan. Those who developed lung cancer in later years were found to have had relatively low levels of beta-carotene in their blood serum. Still other studies have reported a preventive effect of beta-carotene on several forms of cancer other than lung cancer.[34]

National Research Council, Diet and Health (Washington, D.C.: National Academy Press, 1989), p. 313.



Like other lines of inquiry on dietary practices and cancer, studies of beta-carotene and cancer have not produced entirely consistent results. After reviewing seven well-conducted studies carried out in the 1980s, one authority concluded that "those in the lowest third to quarter of the population distribution of carotene intake" had between a 50 and 100 percent greater risk of lung cancer.[35]

Byers, "Diet and Cancer."

Other studies, however, which examined all forms of cancer, found no evidence that beta-carotene or vitamin A helped protect against malignancies.[36]

W. C. Willett et al., "Relation of Serum Vitamins A and E and Carotenoids to the Risk of Cancer," New England Journal of Medicine 310 (1984): 430-34; A. Paganini-Hill et al., "Vitamin A, Beta-Carotene, and Risk of Cancer: A Prospective Study," Journal of the National Cancer Institute 79 (1987): 443-48.

Researchers in the 1990s, moreover, were still not sure whether vitamin A, beta-carotene, some other carotene-related substance, or a constituent of fruits and vegetables unrelated to any of these accounted for the observations of a preventive effect.

Fiber.

In the 1980s, messages to the public about the advantages of dietary fiber became frequent. Some research offered support for the belief that high amounts of fiber in the diet could help prevent cancer of the colon, one of the most widespread forms of cancer in the United States. The American Cancer Society recommended that Americans "eat more high-fiber foods such as whole grain cereals, fruits, and vegetables" to "help reduce the risk of colon cancer." The cautious advice of the American Cancer Society became a highly visible marketing message in the hands of food manufacturers, who highlighted the connection between fiber and cancer prevention in their packaging and promotion of cereals and baked goods of all kinds. "Fiber" became a household word, its role in preventing cancer the kind of health fact that "everybody knows."

Yet scientists have trouble agreeing on many things about fiber. First, what is it? An early investigator of the effects of fiber on cancer risks defined it as the components of plant cell walls that resist digestion in the human body. This definition includes a broad range of substances that may be found in food. Other scientists have included an even broader range of substances: indigestible or partially digestible proteins, sugars, and starches; cellulose; gums; mucilages; shell material from shrimp and other crustaceans; waxes; silicon.[37]

H. Trowell et al., "Dietary Fiber Redefined," Lancet 1 (1976): 967.

Clearly, substantial differences exist among these substances. Some are composed of relatively simple molecules; others, highly complex ones. Some dissolve in water; others do not. Some are almost entirely digested in human intestines; others pass through largely unaffected by human digestive agents.

The most important theory attributing an important role to dietary fiber in the prevention of colon cancer rests on the presumption that this substance absorbs or dilutes acids, biles, and other potential carcinogens in the large intestine, reducing their effects in initiating or promoting cancer. It is also thought that high levels of dietary fiber decrease the "transit time" of feces, that is, the time that passes between eating and excreting, and thus reduce the period during which carcinogens are in contact with intestinal tissues.

Several early studies demonstrated relationships between high dietary fiber and low rates of colon cancer. One representative study published in 1977 compared residents of Copenhagen, Denmark, and Kuopio, Finland. Although people in both locations consumed similar amounts of several nutrients, including fat, those in Kuopio consumed significantly higher amounts of fiber. Colon cancer was four times as frequent in Copenhagen.[38]

International Agency for Research on Cancer, Intestinal Microecology Group, "Dietary Fiber, Transit Time, Fecal Bacteria, Steroids, and Colon Cancer in Two Scandinavian Populations," Lancet 2 (1977): 207-11.

While fecal transit time was similar in both locations, stool bulk was higher in Kuopio.

A review of seven highly regarded studies published in the 1980s, however, revealed inconsistent results for fiber.[39]

Byers, "Diet and Cancer."

At least one study that showed a relation between fiber and frequency of colon cancer suggested that only one of the many forms of fiber was associated with the disease. Recent research has suggested that low dietary fiber may also contribute to stomach, breast, ovarian, and endometrial cancers.[40]

National Research Council, Diet and Health, p. 299.

But studies attempting to correlate fiber intake with cancers other than colon have been few.

Other Dietary Items.

Food additives, fats, the beta-carotene complex, and fiber are the best-known subjects in the area of diet and cancer. A complete list of food items known or thought to cause cancer would be long indeed. A few less familiar items balance those already discussed.

If "everybody knows" that hazardous chemicals, radiation, and low dietary fiber cause cancer, very few realize that "natural" foods consumed by Americans every day may contain even more potent carcinogens. These foods contain cancer-causing chemicals not added by farmers, processors, or distributors, but produced naturally by the plants themselves. In millions of years of evolution, many plants have acquired the ability to produce chemicals that act as natural pesticides. These agents protect against fungi, insects, and animal predators. One authority has remarked that "we are ingesting in our daily diet at least 10,000 times more natural pesticides than man-made pesticide residues."[41]

Ames, Magaw, and Gold, "Ranking Possible Carcinogenic Hazards."

Although only a few of these naturally occurring substances have been tested, it appears likely that many are carcinogens.

Several very familiar foods contain naturally occurring carcinogens. Basil contains estragole, mushrooms contain hydrazines, brown mustard contains allyl isothiocyanate—all capable of causing cancer in laboratory animals. Celery contains psoralens, light-activated carcinogens that become more concentrated in the presence of mold. Molds, which frequently contaminate food substances, produce antibiotic materials to protect themselves from other microorganisms. These materials are frequently carcinogenic. Aflatoxin, an extremely powerful carcinogen produced by mold, is found in wheat, corn, nuts, peanuts, and other stored grains and seeds.[42]

Ibid.

Traditional methods of cooking and storing food may also raise cancer risks. Broiling and frying meat and fish can produce chemicals in the same class (polycyclic hydrocarbons) as those associated with cancer among the chimney sweeps of old and among today's roofers, gas workers, and coke-oven operatives. Consumption of fried, salted, or smoked fish or pickled vegetables appears to cause stomach cancer, perhaps because of the large amounts of salt these foods contain.[43]

National Research Council, Diet and Health, p. 594.

As the proportion of such items in the American diet has declined (and the consumption of fresh fruits and vegetables has increased), the frequency of stomach cancer has plummeted. Japanese-Americans in Hawaii—who, it would appear, consume an "Americanized" diet—have a much lower rate of stomach cancer than Japanese people living in Japan. The two populations are genetically similar, but the Japanese-Americans consume a much lower proportion of smoked, pickled, and salted foods.

Finally, it is interesting to review the reported relationship between coffee drinking and cancer. A 1981 article in the prestigious New England Journal of Medicine compared Boston patients hospitalized for pancreatic cancer with those hospitalized for other reasons. To the consternation of coffee drinkers, patient interviews indicated that those who had pancreatic cancer habitually consumed more coffee than the comparison group.[44]

B. S. MacMahon et al., "Coffee and Cancer of the Pancreas," New England Journal of Medicine 304 (1981): 630-33.

But by 1986, the researchers had changed their minds, reporting in a larger study that coffee drinking was unrelated to pancreatic cancer.[45]

C. C. Hsieh, B. S. MacMahon, and D. Yen, "Coffee and Cancer of the Pancreas (Chapter 2)," New England Journal of Medicine 315 (1986): 587-89.

Lifestyle And Behavior

Important causes of cancer and opportunities for avoiding the disease seem to lie in the area of lifestyle and behavior. Even under the most constrained circumstances, Americans typically exercise substantial choice over their day-to-day activities and personal practices. Researchers have carried out numerous studies of the relationship between personal behavior and the development of cancer. The areas of greatest importance include sexual practices, alcohol consumption, exercise, childbearing, and use of tobacco.

Sex.

Certain sexual practices predispose women to developing cancer of the uterine cervix. These practices include early marriage, first pregnancy at a young age, sexual activity in early adolescence, and multiple sexual partners. Women who have histories of venereal disease face higher risks of cervical cancer. The disease tends to occur most frequently among minorities and economically disadvantaged people.

It is tempting to ask whether men face a similar risk of cancer as a result of promiscuous sex. Like cervical cancer, prostate cancer has been found more frequently among men who begin to have intercourse at an early age, have had many sexual partners, and have histories of venereal disease. Findings from a comparison of Catholic priests with other men in the Los Angeles area, however, contradict this hypothesis. Among five hundred deceased priests, thirteen were found to have died of prostate cancer. In a comparable population of male decedents, only eight such deaths would have been expected. The absence of a lower death rate from prostate cancer among men who were presumably celibate contradicts the belief that the disease is sexually transmitted.[46]

R. K. Ross, A. Paganini-Hill, and B. E. Henderson, "Epidemiology of Prostate Cancer," in Urological Cancer, ed. D. G. Skinner (New York: Grune and Stratton, 1983).

Childbearing.

If promiscuous sex exposes women to greater cancer risks, having children may protect them. Pregnancy and childbirth seem to play a significant role in preventing cancers of the endometrium, ovary, and breast.[47]

Doll and Peto, "The Causes of Cancer," p. 1237.

Generally, women who have full-term pregnancies experience fewer full ovulation cycles over the course of their lives than those who neither become pregnant nor have children. Reduction in the number of cycles, it appears, results in lower overall production of female gonad-stimulating hormones, which promote the growth of malignant tissues. The relation between hormone production and pregnancy is complex, however. Women who have never had children may have a lower cancer risk than those who first become mothers in their late thirties.

Exercise.

People who exercise regularly may help protect themselves from some forms of cancer, particularly malignancies of the colon and breast. Several researchers have reported that sedentary people have greater risks of colon cancer than those who exercise regularly. Explanations have focused on the fact that exercise tends to stimulate peristalsis, which may in turn reduce contact time between carcinogens and the tissues that line the colon. Two studies by researchers at the University of Southern California compared sedentary and physically active individuals. One study compared men whose occupations required strenuous physical work with those in sedentary occupations. Men with sedentary jobs had an 80 percent greater risk of colon cancer than those with physically demanding jobs. A second study compared retirement community residents who reported exercising less than one hour per day with those averaging two or more hours per day. Among men, those with low activity developed colon cancer 2.3 times more often than the highly active; no difference was detected among women.[48]

R. K. Ross et al., "Avoidable Nondietary Risk Factors for Cancer," American Family Practitioner 38 (1988): 153-60.

Physical activity among women, particularly in girlhood or adolescence, has shown signs of helping protect against breast cancer. As with ovarian cancer, reduced hormone production may serve as the preventive mechanism. As noted earlier, a greater number of complete ovulation cycles (due to lack of interruption from pregnancy or to early commencement of the cycles) seems to coincide with a higher risk of certain cancers. Exercise in adolescence seems to delay the establishment of regular ovulation, reducing exposure to these hormones.[49]

L. Bernstein et al., "The Effects of Moderate Physical Activity on Menstrual Cycle Patterns in Adolescence: Implications for Breast Cancer Prevention," British Journal of Cancer 55 (1987): 681-85.

Alcohol.

A growing number of researchers have begun to suspect a connection between alcohol consumption and cancer. Physicians have known for some time that heavy drinkers develop cancers of the mouth, larynx, and esophagus more often than light drinkers or nondrinkers. But a series of studies in the 1980s suggests that light or social drinkers may also face an increased risk of breast cancer.[50]

Byers, "Diet and Cancer."

Although a definite link between light drinking and cancer has not been established, this possible risk is a continuing concern of researchers.

Tobacco.

The fact that tobacco causes lung cancer is almost universally acknowledged. Tobacco receives attention here only to illustrate the magnitude of risk its use involves. Because the relationship between tobacco and cancer is so strong, and because its use is so widespread, tobacco represents far and away the greatest overall cancer hazard discussed in this chapter. Authoritative studies indicate that about one-third of all cancer deaths in the United States around 1980 were caused by tobacco. A person who smokes twenty cigarettes per day beginning at age twenty has ten times a nonsmoker's risk of developing lung cancer.[51]

Ross et al., "Avoidable Nondietary Risk Factors."

Tobacco use raises the chance not only of lung cancer but also of bladder, pancreatic, oral, laryngeal, pharyngeal, and esophageal cancers.



Use of tobacco multiplies the carcinogenic effect of other agents, sometimes greatly. The risk for cancer of the esophagus associated with moderate use of alcohol, for example, is three times greater among heavy smokers than among nonsmokers.[52]

Doll and Peto, "The Causes of Cancer."

Risk increases of even greater magnitude have been observed for industrial carcinogens. Workers who both smoke and are exposed to asbestos or uranium-mine dusts, for example, experience astronomical lung cancer rates.

This already extensive list could continue indefinitely. Researchers on the exogenous causes of cancer have identified factors as exotic as parasites in undercooked snake meat consumed in Africa and Asia. They have reported phenomena as familiar to the public as the "epidemic" of cervical cancer in young women whose mothers received prescriptions of diethylstilbestrol (DES) during their pregnancies. Whether exotic or mundane, exogenous causes of cancer are of special interest because they are presumably avoidable.

