4
Recombinant DNA Research
In May 1984 Federal District Judge John Sirica delayed an outdoor experiment that would have tested a genetically engineered organism developed to protect crops against frost damage. This decision was, to that date, the most visible expression of public concern about regulating the environmental risks created by the biotechnology industry. But the attention directed at biotechnological risks during the mid-1980s was mild compared to the alarm these risks caused a decade earlier. At that time genetic engineering seemed headed for the same regulatory fate as nuclear power.
In the mid-1970s the risks associated with splicing genes from one organism into another through recombinant DNA (rDNA) techniques were causing widespread concern in the scientific community. Public fears of rDNA research, fueled by the possibility of a catastrophic release of dangerous organisms into the environment, were remarkably similar to fears about nuclear power.[1] Interest groups led by dissident scientists gained widespread visibility, media coverage became increasingly sensational, and local controversies raged in towns where universities proposed conducting rDNA research. Congress was close to passing legislation that would have created a new regulatory agency, patterned on the Nuclear Regulatory Commission, to oversee rDNA research.
By the early 1980s, however, the furor surrounding rDNA
research had all but vanished. The regulations that seemed so imminent in 1977 never materialized. Congress failed to pass a single regulatory measure, and most local governments followed the same path. The federal regulations imposed by the National Institutes of Health (NIH) prior to 1977 were progressively relaxed, and the complete abolition of these regulations was seriously considered (but rejected) in 1982. Where the media had initially emphasized the risks of the research, it now focused on the future benefits and the exciting prospects of the biotechnology industry. This unexpected course of the recombinant DNA debate was at least in part the result of the strategies employed in diagnosing and preventing anticipated rDNA research-related catastrophes. This chapter examines those strategies.
Stages of the Controversy
In 1971 the Stanford biochemist Paul Berg proposed to insert DNA from SV40, a virus that can cause tumors in lower animals, into a common type of bacterium that inhabits the human intestine. Berg's proposal, which would have been the first recombinant DNA experiment, triggered concern about the risks of such research. Suppose, some of Berg's colleagues hypothesized, the recombined genes could initiate tumor growth in humans, and suppose a lab worker were accidentally infected with them? It was possible that an epidemic of an unknown disease might thereby be created. These scientists believed that a closer examination of risks should precede such an experiment.
In deference to his colleagues, Berg canceled the experiment, but the doubts raised about it and similar experiments did not abate. Concern in the scientific community continued to increase until June 1973, when participants at the Gordon Research Conference on Nucleic Acids formally requested that the National Academy of Sciences appoint a committee to examine the risks of rDNA research. A committee was formed, and in July 1974 it announced its recommendations in a letter published simultaneously in Science and Nature, journals
that are widely read in the scientific community. The committee recommended that: (1) certain types of recombinant DNA experiments that seemed especially hazardous should be deferred until the risks were better understood, (2) the National Institutes of Health should appoint a committee to evaluate the possible risks and establish research guidelines and procedures, and (3) an international conference on the potential risks should be held. All three recommendations were carried out. Scientists voluntarily observed a moratorium on the specified types of experiments, the NIH established what became known as the Recombinant DNA Advisory Committee to assess risks and establish research guidelines, and an international conference on rDNA research was held at Asilomar, California, in February 1975.
The conference at Asilomar is considered a landmark in the history of the rDNA controversy. After four days of meetings, the participants concluded that the moratorium on especially risky experiments should continue and that all other experiments should be performed according to safety guidelines that were set forth at the conference. The guidelines were elaborated and refined by the Recombinant DNA Advisory Committee, and in June 1976 they were formally promulgated by the NIH.[2] The guidelines prohibited six types of experiments, classified all other experiments according to degree of possible hazard, and required varying degrees of containment for each of these classes of experiments. The more potentially hazardous the experiment, the more extensive the containment requirements.
Just as the scientific community was reaching agreement on how to handle the risks of these experiments, public controversy erupted. It began at the local level in university towns throughout the United States. The most notorious local dispute took place in Cambridge, Massachusetts, in the summer of 1976. At issue was whether an old biology building should be modified to meet the containment standards required by the NIH for relatively risky rDNA experiments.[3] Disagreement within the Harvard biology department grew into a campus-wide debate. Among the leaders of the opposition to the proposed rDNA lab were Nobel laureate George Wald and his
wife Ruth Hubbard, a group known as Science for the People, and several environmental groups that had been active in the debate over nuclear power. Cambridge's mayor, Al Vellucci, long at odds with Harvard, led the opposition and stated its case in this way:
We want to be damn sure the people of Cambridge won't be affected by anything that would crawl out of the laboratory. . . . It is my responsibility to investigate the danger of infections to humans. They may come up with a disease that can't be cured—even a monster. Is this the answer to Dr. Frankenstein's dream?[4]
The controversy spread throughout the country, though rarely with the intensity reached in Cambridge. By 1977 the risks of rDNA research had been debated at the local level in Ann Arbor, Bloomington, Berkeley, Madison, and San Diego and at the state level in New York, California, and New Jersey. The debates resulted in extensive media coverage, much of it sensational: "Science That Frightens Scientists" (Atlantic); "Creating New Forms of Life—Blessing or Curse?" (U.S. News and World Report); "New Strains of Life or Death?" (New York Times Magazine). With the media interest came a flurry of books: Recombinant DNA: The Untold Story; Biohazard; Playing God.[5] During 1977 alone sixteen bills for regulating rDNA research were proposed in Congress and twenty-five hearings were held. The primary bills that finally emerged in both the House and Senate called for extensive federal regulations. Senator Kennedy's bill, which was passed by the Senate Committee on Human Resources, proposed a national regulatory commission empowered to promulgate safety regulations, license and inspect facilities, and fine violators.
Kennedy's proposed national regulatory commission was reminiscent of the Nuclear Regulatory Commission; by 1977 the controversy had all the earmarks of the nuclear power debate—aroused citizenry, media sensationalism, congressional concern, and vocal opposition by dissident scientists and environmental groups. Nowhere was the intensity of the controversy and its similarity to the nuclear debate more striking than at a National Academy of Sciences meeting held in Washington,
D.C., in March 1977. The opening session was dominated by protesters. In front of TV cameras, they chanted, waved banners, demanded that the conference be "opened up to the people," and asserted:
This is just the first protest. . . . We are just the little ruffling wind before the storm of public outrage. . . . This is the most important social issue of the coming next decade. . . . We are not going to go quietly. We have means at our command to resist the change in the human species. We will not go gentle [sic] into the brave new world, that new order of the ages that is being offered to us here.[6]
But the promised "storm of public outrage" and the new regulatory regime never materialized. Recombinant DNA research, unlike nuclear power, quickly receded as an issue of contention. In Congress not one of the sixteen bills on rDNA research proposed in 1977 ever reached the floor. Only two bills were proposed in 1978, none the following year, and one in 1980. By 1981 there was almost no congressional interest in regulating rDNA research.[7] Any remaining interest focused not on the physical risks but on the ethical and legal ramifications of the research and on the genetically engineered products that would eventually be commercially marketed.
At the local political level, the flurry of regulations that seemed so imminent in 1977 also failed to materialize. The few laws that were imposed essentially required conformance to the NIH guidelines. No funds were appropriated and no local governmental agencies were created to regulate rDNA research. Meanwhile, the NIH revised its guidelines in 1978, 1980, 1981, and 1982, each time making more rDNA experiments permissible at ever lower levels of containment.
Why Did Concern Evaporate?
Why is the case of regulating rDNA research so dramatically different from that of nuclear power? One possibility is that there has been a growing conservative, antigovernment, antiregulation mood in the nation since the early 1980s. However,
this reason does not explain the continuing public concern over nuclear power, acid rain, and other technology-related issues. Another possible explanation is that the risks of recombinant DNA research are lower than those of nuclear power. This assumption runs counter to the opinions of esteemed scientists who have been arguing for years that the risks of nuclear power are far lower than those of other widely accepted activities such as burning coal for fuel, flying in airplanes, and driving automobiles. However, the public has not accepted this testimony from scientists. Why, in the case of rDNA research, should scientists' views have been accepted?
Perhaps the most striking difference between the two cases is that the scientific community reached more of a consensus about the risks of rDNA research than it did about the risks of nuclear power. Continuing disputes among the experts seem to lead to ongoing conflicts about policy. In all the cases cited in this book, scientists triggered the initial concern and then played prominent roles in the subsequent debates. Members of Congress, interest groups, and the media inevitably joined in, but because of the technical nature of the issues, they relied on the opinions of the experts. This pattern emerges repeatedly. On issues ranging from the arms race to carcinogens in the workplace to the prospects of future energy technologies, Senator X cites the testimony of an esteemed scientist from a prestigious scientific body, only to be countered by Senator Y, who cites opposing testimony by an equally esteemed scientist from an equally prestigious university.[8]
For nuclear power, the most vivid example of such conflict among experts was the dispute over the Rasmussen report. This major study of reactor safety, sponsored in 1975 by the Atomic Energy Commission and directed by MIT Professor Norman Rasmussen, estimated that the worst plausible reactor accident would lead to thirty-three hundred early fatalities, forty-five thousand early illnesses, and fifteen hundred latent cancer fatalities. However, it gave the probability of such a severe accident as one in a billion (per reactor per year).
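To see why these figures could be read as reassuring, it helps to combine them in a back-of-the-envelope expected-value calculation. The arithmetic below is only an illustration built from the numbers quoted above; it is not a computation that appears in the report itself:

$$
\underbrace{10^{-9}}_{\text{accident probability per reactor-year}} \times \underbrace{(3300 + 1500)}_{\text{early plus latent cancer fatalities}} \approx 5 \times 10^{-6}\ \text{expected fatalities per reactor-year,}
$$

or roughly one statistical fatality per 200,000 reactor-years from this worst-case category of accident. On that reading, even a large fleet of reactors would contribute a negligible expected toll; the dispute, as the following paragraphs show, was over whether the probability and consequence estimates could be trusted in the first place.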
Nuclear power advocates seized on this report as support for their cause. The exceedingly low probability of an accident, they argued, demonstrated the safety of nuclear power. But
others, including the American Physical Society, countered that because of substantial uncertainties (such as those discussed in chapter 3), the likelihood and effects of a serious accident could be considerably greater than estimated. The effects of low-level radiation alone would add several thousand to the estimated latent cancer fatalities.[9]
By 1985 the debate over the Rasmussen report still had not been resolved. Studies by the American Nuclear Society and the nuclear industry, motivated in part by the Three Mile Island accident, indicated that the report had overestimated the amount of radioactive fission products that would actually be released in an accident by as much as a thousand times. If agreement could be reached on the validity of the new figures, a significant easing of restrictions on nuclear power might be justified. But these optimistic findings were at least partially contradicted by an American Physical Society study group that found that in some types of accidents the release might be greater than had been estimated.[10]
Initially, the rDNA research debate also was characterized by disagreements within the scientific community. It was a group of scientists who first raised the alarm about rDNA research and who insisted, despite the skepticism of some of their colleagues, on assessing the risks. But once the NIH guidelines were established, the politics of the rDNA debate began to change. By 1977 the scientific community seemed to have closed ranks on the issue, and it presented to Congress and the media a far more united front on the risks of rDNA research than it ever had on the risks of any other major technological policy issue. In one forum after another, one prestigious spokesperson after another—including some who had initially urged caution—argued that the earlier concerns had been greatly overstated.[11]
To be sure, skeptics were still to be found, but many were associated with public interest groups rather than mainstream scientific organizations and were vastly outweighed in prestige and number by proponents of rDNA research. There was no equivalent in the rDNA debate to the dissenting opinion posed by the American Physical Society to the Rasmussen report's estimates of reactor safety. Instead, the majority of scientists
favored expanded rDNA research, and, faced with this predominant opinion, most of those advocating extensive regulation backed down. Senator Kennedy withdrew support from his own bill to regulate rDNA research, which had been the leading piece of such legislation in the Senate.
It is clear that a consensus in the scientific community on rDNA research and a lack of consensus about nuclear power at least partly account for the differences in policy directions between the two technologies. But why were scientists able to come to near consensus about rDNA research and not about nuclear power? We return to our original question: what is it about the rDNA research controversy that has made it so different from our other cases?
Several critics have suggested that the scientific community closed ranks because it had a vested interest in ending the regulatory debate. Unlike nuclear power, where regulations restrict industry, regulations of rDNA research would have restricted scientific endeavors. Regulations would have constrained both researchers working in universities and the growing number of scientists who were finding lucrative opportunities in the nascent rDNA industry. As long as they did not fear outside control, the critics contended, scientists expressed their doubts freely; once the threat of regulation beyond their control developed, scientists quickly closed ranks and covered over their differences about the risks.[12]
Similarly, it can be argued that because the NIH was both funding rDNA research and regulating it, scientists may have been biased toward understating the risks. Scientists seeking research grants obviously had an incentive to avoid antagonizing the NIH, their potential sponsor. In 1978 Joseph Califano, the secretary of the Department of Health, Education, and Welfare, ruled that representatives of public interest groups should be added to the Recombinant DNA Advisory Committee, but by that point the controversy was already waning. On the other hand, scientists can advance their careers by challenging mainstream views with solid research, so there are incentives to speak out as well as to acquiesce. And in the case of nuclear power, the Atomic Energy Commission was in the same position as the NIH—funding as well
as regulating nuclear research and development—and vigorous disputes about nuclear power risks still occurred. While it would be surprising if self-interest did not play some role in the resolution of the rDNA research issue, it is not a sufficient explanation, and we must explore further to understand the rDNA research story.
Containing Hazards and Verifying Risks
The most distinctive characteristic of rDNA research is that the risks proved relatively manageable. The consequences of potential accidents could be limited, and estimates of risk could be verified. Therefore, the inherent nature of the problem was different from that posed by nuclear power or by most other risky technologies, and a consensus about the risks was much easier to achieve.
Strategies to Limit Hazards
While it is impossible to be absolutely certain that the consequences of a serious rDNA research accident will be contained, the probability of containing such accidents within acceptable limits is substantially higher than for nuclear reactor accidents. In addition to requiring that steps be taken to prevent accidents, the NIH relied on two tactics for limiting such hazards.[13] First, and reminiscent of the early approach to reactor safety, the NIH required both physical and biological containment. The NIH classified all permissible rDNA experiments according to their level of potential hazard. The greater the potential hazard, the more extensive the containment requirements. For example, according to the original NIH guidelines, experiments in the high potential hazard class could only be performed in labs of monolithic construction that were equipped with air locks, systems for decontaminating air, autoclaves, shower rooms, and other physical containment safeguards.
Biological containment also was required for more dangerous experiments.[14] Scientists were required to use as host
organisms only bacteria that had been shown in animal and human tests to be so enfeebled that they could not survive outside the lab. In this way, even should an accident occur during a hazardous experiment and physical containment fail, biological containment (the inability of the organism to survive outside the lab) would limit the hazard. Even experiments in the least hazardous category had to use only approved bacteria as the host organism. The most common host organism was E. coli K-12, a laboratory strain of a common, well-studied colon bacterium (Escherichia coli).
The NIH also limited potential hazards by prohibiting particularly risky experiments. Originally, six classes of experiments were banned: (1) experiments with more than ten liters of culture; (2) experiments in which organisms containing rDNA were deliberately released into the environment (such as the one delayed by Judge Sirica in 1984); (3) experiments using DNA from certain pathogens; (4) experiments using DNA segments that code for vertebrate toxins; (5) experiments using rDNA techniques to create certain plant pathogens; and (6) experiments in which drug resistance traits were transferred to disease-causing organisms. These prohibitions were not absolute, since the NIH would make exceptions if sufficient safety precautions were demonstrated.
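The decision logic just described, banning the riskiest classes outright and scaling physical and biological containment with hazard for everything else, can be sketched in a few lines of code. The sketch below is purely illustrative: the hazard classes, the level names (loosely modeled on the P1 to P4 physical and EK1 to EK3 biological scales of the original guidelines), and the mapping between them are simplifications introduced here, not the NIH's actual rule set.

```python
# Illustrative sketch of the graded-containment logic described above.
# The hazard classes, containment level names, and mapping are hypothetical
# simplifications, loosely modeled on the original P1-P4 (physical) and
# EK1-EK3 (biological) scales; they are not the NIH's actual rule set.

from dataclasses import dataclass
from typing import Optional

# Classes of experiments that were prohibited outright (see the list above).
PROHIBITED = {
    "culture_over_10_liters",
    "deliberate_environmental_release",
    "dna_from_listed_pathogens",
    "vertebrate_toxin_genes",
    "novel_plant_pathogens",
    "drug_resistance_into_pathogens",
}

# The greater the potential hazard, the more stringent the required
# physical and biological containment.
CONTAINMENT_BY_HAZARD = {
    "minimal":  ("P1", "EK1"),
    "low":      ("P2", "EK1"),
    "moderate": ("P3", "EK2"),
    "high":     ("P4", "EK3"),
}

@dataclass
class Experiment:
    description: str
    hazard_class: str                      # one of the keys above
    prohibited_category: Optional[str] = None

def review(exp: Experiment) -> str:
    """Return the ruling a guideline-style review might issue."""
    if exp.prohibited_category in PROHIBITED:
        return (f"PROHIBITED ({exp.prohibited_category}); "
                "exception possible only with demonstrated safeguards")
    physical, biological = CONTAINMENT_BY_HAZARD[exp.hazard_class]
    return f"Permitted at {physical} physical containment with {biological} host-vector system"

if __name__ == "__main__":
    print(review(Experiment("shotgun cloning of mouse DNA in E. coli K-12", "moderate")))
    print(review(Experiment("field release of engineered bacteria", "high",
                            prohibited_category="deliberate_environmental_release")))
```

The point of the sketch is simply that hazard classification, graded containment, and outright prohibition are rules that can be stated and checked explicitly, which is part of what made the guidelines comparatively easy to administer and, later, to relax.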
Prohibiting risky experiments was a very simple method of limiting hazards. To apply this method to nuclear power, reactors would either have to be limited to sizes that would virtually guarantee containment or be banned everywhere except in very remote areas.
Being able to limit the consequences of an accident from rDNA research simplified the NIH's task. The alternative would have been to rely primarily on prevention, which, as we have seen for nuclear reactors, is a difficult strategy to implement in practice. Limiting harm from accidents is simpler than trying to prevent accidents altogether. Regulators were thus able to focus on the relatively limited problem of estimating and protecting against the worst potential consequences of accidents. This is by no means a trivial task, but it is more manageable than trying to anticipate all possible causes of accidents.
Ways to Verify Risks
For both nuclear reactors and rDNA research, reputable experts argued that the risks were acceptably low, but such claims were much more easily verified for rDNA research than they were for nuclear power.
Substantial efforts have been made to assess the risks of rDNA research. At least sixteen categories of risks have been identified as shown below:[15]
Gordon Conference, 1973. Risks: new types of plasmids and viruses; large-scale preparation of animal viruses.
Asilomar Conference, February 1975. Risks: spread of antibiotic-resistant organisms; alteration of the host range of bacteria; and "shotgun" experiments with unknown outcomes.
Senate Health Subcommittee, April 1975. Risks: infections not susceptible to known therapies; animal tumor virus genes in bacteria.
Director's Advisory Committee Meeting, February 1976. Risks: new routes of infection; novel pathogens; disease transmitted by E. coli.
Senate Health Subcommittee, 1976. Risks: spread of "experimental cancer"; virulent hybrids due to combining two mild viruses; unknown impact on biosphere if new species created.
House Science and Technology Subcommittee, 1977. Risks: altering the course of evolution; transmission of laboratory recombinants to wild strains of bacteria; colonizability of E. coli.
National Academy of Sciences Forum, 1977. Risks: latent tumor viruses; insulin genes in E. coli; extraintestinal E. coli infections; breakdown products of recombinant DNA molecules.
Senate Health Subcommittee, 1977. Risks: possible interference with human autoimmune system; unanticipated hazards; disturbance of metabolic pathways.
Falmouth Workshop, 1977. Risks: creation of more dangerous bacteria; spread of R-factors.
Ascot Workshop, 1978. Risks: penetration into intestinal lining; new types of experiments with viruses.
Workshop on Agricultural Pathogens, 1978. Risks: E. coli transformed into a plant pathogen; more virulent strains of plant pathogens.
These hazards have been discussed at more than a dozen scientific workshops and conferences, analyzed through experiments, and reviewed by the Recombinant DNA Advisory Committee, the NIH, the Office of Technology Assessment, and other federal organizations. Much of the research has attempted to answer one or more of the following questions:
Could an organism escape from a laboratory and establish itself in humans, animals, or other parts of the natural environment?
Could a recombinant organism transfer its rDNA to other organisms?
Could the rDNA make the escaped organism dangerous to man or the environment?
From the outset, some scientists asserted that the answer to all of these questions is "no." But the majority of scientists originally said, in effect, "We do not know, so we will have to proceed cautiously until we find out." Their attempts to answer these key questions are an important part of the rDNA story and reveal a great deal about the task of averting catastrophe.
Survival of Organisms Outside the Lab
The most likely way that organisms could escape from a research laboratory, many scientists believed, was by their unwitting ingestion by a lab worker. At least four sets of studies were performed in which large amounts of E. coli K-12 (up to ten billion organisms) were fed to volunteers. All of the organisms died in a few days, and none were passed out in the stool of the volunteers.
In another study, fecal cultures from sixty-four lab personnel working with E. coli K-12 were periodically obtained over two years. The workers used no precautions other than standard microbiological techniques. At no time in the two years was E. coli K-12 recovered in the stool, again suggesting that the bacteria could not survive outside the lab.[16] These findings, among others, were interpreted as follows in a letter to the NIH from the chairman of a conference of infectious disease experts held at Falmouth, Massachusetts, in June 1977: "On the basis of extensive tests already completed, it appears that E. coli K-12 does not implant in the intestinal tract of man."[17]
There were some qualifications to this conclusion, however, as the chairman's summary in the published proceedings of the Falmouth conference attested:
A number of variables are known to influence the colonization of organisms in the intestinal tract. Implantation can be altered by antibiotic administration, starvation, the type of diet, reduction in gastric acid, and antimotility drugs. It is clear that more implantation experiments need to be performed with attention to these variables, many of which may be found in laboratory workers exposed to these organisms.[18]
The report acknowledged that certain strains of E. coli can implant under certain conditions. Persons undergoing antibiotic treatment, for example, are susceptible.[19] The NIH subsequently established laboratory practice guidelines to protect against some such dangers, but it is questionable whether the guidelines are enforceable. Overall, however, a great majority of scientists working in this field concluded that the probability of hazard was too low to warrant precautions beyond those that are standard in microbiological work.
Transmission to Other Organisms
Even if some of the organisms escaped, they would soon die unless they could establish themselves in a human, animal, or other part of the natural environment. Research has also been conducted to evaluate this possibility. In more than thirty years of use in genetics laboratories, E. coli K-12 has a perfect safety record,
and in a study of almost four thousand laboratory-acquired infections, only two involved any type of E. coli. In neither of these cases was the infection passed on to another person.[20] Moreover, large numbers of organisms would have to be ingested in order for a human to develop an intestinal illness. This requires more than personal contact—usually a grossly contaminated source of food or water. Given contemporary sanitation and food standards, it was judged highly improbable that an infected laboratory worker could start an epidemic of gastrointestinal illness.
Again, however, this conclusion must be qualified. E. coli can cause diseases outside the intestine, and in these cases it takes fewer organisms to produce such illnesses. But most of the rDNA research scientists concluded that transmission of genetically altered organisms would be very unlikely.[21]
Creating a Dangerous Organism
Even if the escape and transmission of organisms containing recombined genes are very unlikely, they are not impossible. If a dangerous organism were created and somehow did escape, the consequences conceivably could be catastrophic. Therefore, research was conducted to examine the likelihood that a dangerous organism might unwittingly be created. In one class of rDNA experiments, known as "shotgun" experiments, the genes of a plant or animal cell were cut into many segments, and these were inserted simultaneously and at random into E. coli K-12. These experiments in particular raised fears of an unwitting creation of harmful organisms. To test these fears, the Recombinant DNA Advisory Committee recommended a so-called worst-case experiment that the NIH agreed to fund.
The intention of the worst-case experiment was to see if a dangerous organism could be created deliberately. Investigators inserted the DNA for a virus into E. coli, and then administered the altered organisms to mice. Only one of thirteen experimental combinations produced illness, and at a rate much lower than the virus would have caused without the DNA manipulation. Most scientists found the experiment reassuring; as one biochemist put it: "This type of experiment is therefore safer than handling the virus itself . . . and a major
lowering of the required containment levels seems clearly justified."[22] Again, however, this conclusion required qualification. According to several critics, the one case of viral infection represented a new disease pathway that would not have been possible without rDNA techniques.[23]
In another series of experiments to evaluate risks, scientists introduced a similarly altered organism into hamsters to test for tumor formation. No tumors developed under normal experimental conditions, but under less likely experimental conditions, where two copies of the virus were inserted simultaneously (as might accidentally occur), tumors were produced. The researchers interpreted this evidence as supporting the safety of the research, as did most other scientists. However, a small minority again dissented and criticized the experimenters' methodology and interpretations.[24]
Because of studies such as these, rDNA research seemed less threatening, and subsequent research reinforced this impression. The gene structure of plants and animals was shown to be significantly different from the gene structure of bacteria. Plant or animal DNA must be altered deliberately to make it compatible with the host if it is to become an active part of the bacteria into which the genes are inserted.[25] Similarly, in most rDNA experiments, the new segment of genes constitutes no more than 0.5 to 1 percent of the total gene set of the host organism. Unless this segment is carefully and deliberately integrated with the host's genes, the host is unlikely to be significantly affected.[26]
In another worst-case analysis, E. coli bacteria containing rDNA designed to produce insulin were assumed to have escaped from the lab, established themselves in the human intestine (replacing all of the normal intestinal E. coli), and produced the maximum amount of insulin. Even given these assumptions, the analysts concluded that the total amount of insulin that could be produced would "not be expected to have much, if any, effect on a mammalian host." The results of the analysis, when generalized to more active proteins, indicated that most experiments would pose little problem.[27]
The risk that rDNA could turn a host into a pathogen was
summarized at another important meeting held in 1978 at Ascot, England, and attended by twenty-seven scientists, most of whom were virologists:
The workshop concluded that inserting part or all of the gene set of viruses into E. coli K-12, with approved vectors, poses "no more risk than work with the infectious virus or its nucleic acid and in most, if not all cases, clearly presents less risk. In fact, . . . cloning of viral DNA in E. coli K-12 may produce a unique opportunity to study with greatly reduced risks the biology of extremely pathogenic and virulent viruses." In other words, inserting a piece of viral DNA into E. coli K-12 locks it up inside the bacterium and makes it much safer to handle than the naked virus itself.[28]
A few scientists did not share this view, but the great majority believed that the evidence showed it to be very unlikely that genetically altered bacteria would be dangerous in the unlikely event that they did escape and the even more unlikely event that they established themselves in the environment.
Appraisal of Risks
These experiments and analyses do not show that there is no risk from rDNA research, and in fact they reveal that risks can arise under certain very specific, highly unlikely conditions. But such controlled experimentation and the data generated through experience led a great majority of scientists and experts to conclude that most of the early concerns about rDNA research were unfounded. The ability to assess risk in practice is the essential difference between the nuclear power and rDNA research issues. Much of the analysis of reactor safety is hypothetical, focusing on unlikely conditions that might arise during events that have never occurred. In developing policy about rDNA research, by contrast, policy makers have been able to make such analyses on a more concrete basis; risks have been evaluated by empirical tests governed by the usual scientific methods and standards.
The concrete nature of this evaluation has had an enormous effect on the regulatory process. As confidence grew that E.
coli K-12 was unlikely to survive outside the lab or be dangerous if it did, the guidelines governing rDNA research were gradually relaxed. In 1978 the requirements for biological containment in experiments using E. coli K-12 were reduced to the lowest level, and in 1981 most classes of experiments using E. coli K-12 were exempted from the guidelines altogether. Moreover, new host systems were approved for use, the NIH abolished or reduced restrictions on most experiments not using E. coli K-12, and by spring 1982 there was no longer a category of prohibited experiments.
Conclusions
In chapter 7 we will describe the major characteristics of a catastrophe-aversion system that is at least partially illustrated in each of our cases. The history of recombinant DNA research provides the clearest example of such a system in operation. When concerns about the risks of rDNA emerged, steps were taken to protect against potential catastrophe. Certain classes of experiments were altogether banned, and all others had to be performed within biological and physical containment. Once precautions had been taken, attempts were made to learn more about the risks, both by deliberate testing and by monitoring experience. As more was learned about the risks and as concerns decreased, the initial precautions were progressively relaxed.
Critics have argued that this strategy should have been implemented more stringently. It is true that testing could have been more thorough, the burden of proof could have been kept more squarely on the advocates of potentially risky research for a longer period, nonscientists could have been given more decision-making authority earlier in the process, more of the opposing scientific opinions could have been evaluated, more experiments could have been prohibited, and the guidelines could have been relaxed more slowly.
How cautiously to proceed in protecting against potential risks is always a matter of judgment as well as a matter of science. In the case of rDNA research, a substantial majority
of experts in the field agreed that the risks were very low. The NIH, local governments, and members of Congress went along with this judgment, and it has been borne out by experience. Former critics of rDNA research are no longer vocal about the dangers—perhaps persuaded at last or maybe just overwhelmed. Whether or not the critics were correct, sensible strategies were developed to cope with the uncertainties and potential risks of rDNA research. We discuss these strategies in more detail in our concluding chapters.
A secondary point in this chapter has concerned the striking difference in the fates of the public debates over rDNA research and nuclear power. We have argued that this difference can be attributed largely to the differing natures of the two problems. In rDNA research, a worst-case event can be contained and estimates of risk can be tested empirically. A worst-case event for a nuclear reactor cannot be contained and therefore must be prevented, and estimates of risk must necessarily remain hypothetical. Although one side or the other in the nuclear controversy may be mistaken, there is a sound reason why no strategy has been developed that can resolve the debate.
If the objective nature of the technological issue affects its fate, what about the many other explanations for technology policy debates? And how does the objective nature of a technology relate to the widely accepted view that public perceptions and fears are what really guide technology debates and shape policy? We believe that the nature of a social problem limits what constitutes legitimate debate. The nature of the problem in the nuclear power debate is such that there is no way to establish definitively the magnitude of the risks. Advocates of nuclear power can insist that the probabilities of accidents are very low, that the major causes of accidents have been anticipated, and that the worst case would not really be that bad, but none of these arguments can be fully verified. Regulators are left with no conclusive basis for deciding between these claims and the opposing ones. When there is no basis for resolving the facts of the matter, factors like public perceptions and general attitudes become important. The position one takes on the conflicting estimates of the risks depends on
whether one trusts government institutions, whether one fears high technology, and so on.
In contrast, the nature of the rDNA problem imposed objective constraints on the resulting debate. Once studies demonstrated that E. coli K-12 was highly unlikely to survive outside the lab and cause epidemics, the credibility of counterclaims about high risks diminished. The burden then shifted to the opposition to provide evidence, based on experience or experiments, that the risks were in fact as great as claimed. In other words, the facts constrained the debate. Such "facts," determined at any given time by scientific consensus, may later prove to be mistaken; until that time, they restrict the range of legitimate disagreement. And they constrain the currency of the debate as well, since only new "facts" are sufficient to displace old ones.
It is important to note that the discussion in this chapter has been limited to the physical risks of recombinant DNA research. As commercial biotechnology products are introduced, a new set of potential risks and regulatory issues is emerging. Moreover, as scientific techniques become increasingly powerful, debates surely will arise over the propriety of intervention into the human gene structure.[29] Whether appropriate strategies will be developed for handling these ethical and philosophical issues remains to be seen. To the extent feasible, the strategies used in the rDNA research controversy will be worth emulating.