Quarantine and the Problem of AIDS
David F. Musto
Men take diseases, one of another.
Therefore let men take heed of their company.
Henry IV, Part 2
In ancient times citizens noted that, occasionally, a disease from a distant locale was sweeping toward them from neighboring villages, or that after a ship from a foreign land reached shore with ill persons aboard, residents in the port city would take ill. Such temporal sequences could not be ignored and, if the illness were a serious one, fears escalated as the illness came closer. Knowing the cause of an illness or its mode of transmission provides the basis for some rational approach to containing the spread of the disease. Prior to the nineteenth century, however, these agents were unknown, and civil authorities were thus left with whatever means seemed reasonable in the wisdom of the time to fight the spread of diseases. Protective measures were based on what we would now consider erroneous explanations for contagion. From this era of scant knowledge comes the origin of the familiar word we use to describe the isolation of the sick or contagious from the healthy. "Quarantine" comes from the Italian word for "forty days," and refers to the period during which ships capable of carrying contagious disease, such as plague, were kept isolated on their arrival at a seaport.[1]
Today, quarantine has come to mean a marking off, the creation of a boundary to ward off a feared biological contaminant lest it penetrate a healthy population. The essential characteristic of quarantine is the establishing of a boundary to separate the contaminated from the uncontaminated. But to consider only those quarantines of diseases that are infectious or that have short periods of illness, characterized by, say, fever, would be to overlook the deeper emotional and broader aggressive character of this measure. Evidence of this elemental fear of contagion includes such instances as measures taken against yellow fever in the eighteenth century and the growing fear of the AIDS epidemic in the late twentieth century. The assumptions and psychology of quarantine are evident in restrictions against groups thought liable to degrade "racial purity" if allowed to immigrate into a "racially healthy" country. The multiple determinants of quarantine can be seen in a much earlier age also. The social history of leprosy is an enduring and dramatic example of boundaries being drawn around those with a lengthy illness that was highly feared and believed to be highly contagious.
Leprosy
The bacillus responsible for leprosy was not discovered until 1874, one of the first bacterial pathogens to be described. In preceding centuries the early stages of leprosy had often been confused with other skin diseases, but the advanced stages of leprosy—characterized by loss of nerve conduction and bodily disfigurement—occurred frequently enough to ensure continuous alarm about physical signs that might foretell the gradual and, for all practical purposes, irreversible wasting of the body by leprous infection. Leprosy was dreaded first of all because it was frequently assumed to be incurable and eventually fatal. Second, it was thought to be contagious—somehow. The strict rules established over the millennia to quarantine lepers reveal that people commonly believed they could be infected by touching a leper or coming into contact with his or her breath.
Medical care often falls into the simple sequence of diagnosis, then treatment. For leprosy the sequence was diagnosis, then separation. Leviticus, the third book of Moses, contains detailed rules for the diagnosis of leprosy. Once the diagnosis is made, the following is commanded by the Lord:
The leper who has the disease shall wear torn clothes and let the hair of his head hang loose, and he shall cover his upper lip and cry, "Unclean, unclean." He shall remain unclean as long as he has the disease; he is unclean; he shall dwell alone in a habitation outside the camp.
(Lev. 13:45-46, RSV)
We often associate leprosy with Europe's Middle Ages, and indeed leprosy was a widespread problem then. It is estimated that thousands of individual or group asylums called leprosaria existed in the thirteenth century.[2] The Christian church had reaffirmed the Mosaic concern with diagnosis and separation. The Third Lateran Council (1179) mandated living provisions for lepers, and elaborate rituals were decreed for the ceremony of separation. The common image of the medieval leper is of a forlorn individual coldly isolated and seeking sustenance through begging. It was not uncommon to believe that the loathsome disease was God's punishment for sin, particularly venereal transgressions. This linkage of leprosy with sexual promiscuity, with promiscuity seen either as a cause or a consequence of the disease, is interesting in light of our present attitudes toward AIDS.
But medieval society also took a larger and more humane view of leprosy. The church, the chief instrument for dealing with disease and sin during this era, devised religious ceremonies that enlisted the leper's cooperation in his or her isolation. The ritual centered on the leper and presented separation from society as a mutually wise decision. Sometimes the leper was encouraged to regard the disease as the sufferings of purgatory here on earth; leprosy was a sign that the leper would pass directly into heaven without the intervening punishment other mortals must endure in order to attain a purified form. Buttressing this concept were the Crusaders returning to Europe with leprosy apparently acquired in the Holy Land. A link between sin and the disease in these cases was unthinkable.
The ritual varied from one diocese to another and over time—for leprosy was a problem that, unlike its victims, would not go away. Fundamentally, the ritual was a service for the dead, because lepers, in effect, were declared dead to their society and the communion of the healthy. A priest would conduct the leper to church where the leper would hear mass kneeling under a black cloth suspended over his head. After mass he would be led again by the priest preceded by a crossbearer to another site in the church where comforting passages from the Bible would be read. As the leper left the church, he was sprinkled by the priest with holy water. The whole procedure was similar to that of conducting a dead body to the church, the saying of a requiem mass, and the passage from the church to the cemetery. Indeed, some rituals specified that dirt be scattered over the head of the leper or onto his feet; in some dioceses, the leper would stand in a freshly dug grave. When at last the leper had concluded his role in these elaborate ceremonies, he separated himself from society, while the priest admonished him:
I forbid you ever to enter the church or monastery, fair, mill, marketplace, or company of persons. I forbid you ever to leave your house without your leper's costume, in order that one may recognize you and that you never go barefoot. I forbid you to wash your hands or anything about you in the stream or in the fountain and to ever drink; and if you wish water to drink, fetch it in your cask or porringer. I forbid you to touch anything you bargain for or buy, until it is yours. I forbid you to enter a tavern. If you want wine, whether you buy it or someone gives it to you, have it put in your cask. I forbid you to live with any woman other than your own. I forbid you, if you go on the road and you meet some person who speaks to you, to fail to put yourself downwind before you answer. I forbid you to go in a narrow lane, so that should you meet any person, he should not be able to catch the affliction from you. I forbid you, if you go along any thoroughfare, to ever touch a well or the cord unless you have put on your gloves. I forbid you ever to touch children or to give them anything. I forbid you to eat or drink from any dishes other than your own. I forbid you drinking or eating in company, unless with lepers.[3]
The priest might follow these uncompromising orders with a comforting message. At Reims the ritual included this expression:
This separation is only corporeal; as for the spirit, which is uppermost, you will always be as much as you ever were and will have part and portion of all the prayers of our mother Holy Church, as if every day you were a spectator at the divine service with others. And concerning your small necessities, people of means will provide them, and God will never forsake you. Only take care and have patience. God be with you.[4]
Lepers took a prominent role in the diagnosis of leprosy. One or more lepers might be on the committee responsible for these fateful examinations. Within the asylums lepers took care of one another. Religious orders sometimes cared for the sick and for the farm sometimes associated with the lepers' enclosure, but such a formal mixture of lepers with the healthy was limited.
In the course of the long period during which lepers were feared and segregated, it became apparent that not only was it difficult to control lepers who remained unpersuaded that they should be isolated but also that the placement of large numbers of lepers in quarantined farms required a degree of social organization and resources lacking in many parts of Europe. Prodded by the widespread fear of leprosy, however, church and state institutions perpetuated the practice of quarantine. Although the quarantine ideally was softened by religious rituals as described above, such benign practices were balanced by other instances of brutality in some places and by extermination programs carried out by Henry II of England and Philip V of France. Eventually, leprosy became a metaphor for heresy, moral turpitude, and unnatural and excessive lust. Leprosy resisted one wave of attempted cure after another—alchemy, miracle, penance, whatever stirred hope—while disfigured people suffering its late stages continued to evoke dread, thereby promoting quarantine.
Leprosy can be contrasted with diseases whose courses are dangerous but brief, such as plague, yellow fever, and cholera. The isolation of ships coming from lands where plague was present was the classic example of quarantine. During the Black Death of the fourteenth century, when a sizable fraction of Europe's population perished through a rapidly spreading, quickly fatal infection, attempts were made, on the one hand, to establish quarantine for habitations still spared and, on the other, to isolate the sick. Physicians and others with a need to visit the diseased wore apparel that entirely enclosed the body: gloves, shoes, headgear, and a gown with a cache under the nose for holding strong-smelling herbs to purify the air breathed in. Clearly, quarantine and such elaborate apparel carry an assumption that diseases are contagious; the means of contagion, however, remained unclear. The breath, putrefying organic matter, even the patient's gaze was suspected. With no certainty about what was the target of control, the citizenry's anxiety could quickly shift from one possibility to another, even to groups of people, as when Jews were suspected of poisoning wells and deliberately spreading plague. Frustration over their society's failure to halt a terrifying contagion led to destructive, irrational outbursts.
Yellow Fever
The New World was not immune to epidemics. North American port cities were subject to occasional but nevertheless disastrous onslaughts of yellow fever, a viral infection now known to be transmitted by mosquitoes. Cholera spread fear and death through several waves of infection, particularly during the nineteenth century. Cholera was later discovered to be caused by a bacterium and spread through food and water contaminated by human waste. For many years, though, both diseases confounded physicians and citizens alike. Observers divided roughly into two camps, contagionists and anticontagionists, a division that had considerable bearing on the issue of quarantine. Although writers on epidemic disease during the eighteenth and nineteenth centuries did not always maintain a pure belief in one or the other alternative, the differences can be simply stated. Contagionists took what appears to have been the commonsense position of most people through the ages, that a disease was transmitted from one person to another. Anticontagionists, on the other hand, believed that both yellow fever and cholera were caused by many individuals coming into contact with the products of putrefaction as a result of hot weather or the inadequate cleansing of streets, homes, and businesses.[5]
These two views postulated strikingly dissimilar conclusions not only for the origin of epidemic diseases but also for their control. When yellow fever struck Philadelphia—then capital of the United States—in 1793, government officials fled, many people died, and an acrimonious controversy ensued over the origin of the ailment. Contagionists, who were in the majority at the College of Physicians, argued that the disease had been brought into the city by a ship from the West Indies. Under this line of reasoning, quarantine of suspect ships was a wise precaution. Dr. Benjamin Rush professed the opposing view. He argued that the epidemic was caused by summer weather and the spoilage of a shipment of coffee near the wharf. He went on to assert that yellow fever was only the intensification of fever which normally "prevails every year in our city, from vegetable putrefaction."[6] This latter view was quite in keeping with Rush's assertion that all diseases were essentially the same disruption of the body's function. From the point of view of Philadelphians, however, Rush's position was a condemnation of the city itself, while the contagionists' explanation merely called for greater vigilance, with the help of quarantine, against danger from the outside such as ships from the West Indies and visitors to the city of Philadelphia.
From the perspective of the twentieth century, the contagionist-anticontagionist controversy seems paradoxical. The contagionists correctly assumed that a specific infectious agent had to be transmitted to a person in order to elicit a specific disease. But it was the anticontagionists who, although etiologically incorrect, championed sanitary measures such as clean streets and efficient elimination of human waste, which we now consider essential to a healthy community. Only later in the nineteenth century would the roles of inadequate waste disposal and mosquitoes breeding in stagnant pools be seen to be links in the epidemic chain. Rush denounced the contagionists for advocating quarantines; their "faith in their efficacy . . . has led to the neglect of domestic cleanliness." Further, he claimed, "From this influence, the commerce, agriculture, and manufacturing of our country have suffered for many years."[7]
The social effects of quarantine were equally deplorable:
A belief in the contagious nature of yellow fever, which is so solemnly enforced by the execution of quarantine laws, has demoralized our citizens. It has, in many instances, extinguished friendship, annihilated religion, and violated the sacraments of nature, by resisting even the loud and vehement cries of filial and parental blood.
Rush maintained that yellow fever "is propagated by means of an impure atmosphere, at all times, and in all places." Do not quarantine, he admonished, but drain the marshes and clean the streets instead. His plea to reject the contagionists' solution might have been written today about conditions found with AIDS patients: "A red or a yellow eye shall no longer be the signal to desert a friend or a brother to perish alone in a garret or a barn, nor to expel the stranger from our houses, to seek asylum in a public hospital, to avoid dying in the street."[8] Benjamin Rush responded to the fear that prompted the imposition of quarantine, as well as to the cruelty that sometimes accompanied it. Such consequences are all the more regrettable now that we know that isolating yellow fever patients has no public health value whatsoever. The history of medicine, however, is filled with useless and even harmful remedies confidently applied to the trusting patient. Rush was one of many anticontagionists who believed not only that quarantines were useless but also that those who advocated them were themselves obstacles to clean, airy, and sanitary cities.
Cholera
By 1832, when the first cholera epidemic struck the United States, enlightened physicians were much more in Rush's camp than in that of the contagionists. In fact, anticontagionism had become a mark of the educated physician, although the populace continued to hold the unsophisticated view that diseases such as cholera were transmittable from one person to another. Indeed, cities did declare quarantines, over the objections of physicians. The president of New York City's Special Medical Council, Dr. Alexander H. Stephens, privately characterized the quarantine he was supposed to help enforce as a "useless embarrassment to commerce." Politically, however, not to have enforced quarantines would have been "suicidal," according to Charles E. Rosenberg, author of the chapter on disease and social order in this book. Still, cities that did not impose a quarantine had a commercial advantage over those that turned away or detained ships seeking to enter their ports. Agitation within a city would increase if potential victims could not flee to a countryside believed to be safer.
The first cases appeared in New York City in late June, and the epidemic was upon the city for the remainder of the summer. The Board of Health was greatly criticized for its efforts: The job of cleaning the city was too big to accomplish in such short order, the cholera hospitals were overcrowded, and it was not easy to find caretakers for the sick and dying. The public had demanded protection, and the response of government at the state and local level was quick and authoritarian. The natural response of the populace was to cordon off the healthy or to confine the sick; a show of support for the creation of boundaries overwhelmed the medical experts' assurances that the disease was not contagious and that quarantine was an expensive and useless weapon.
Cholera, as we saw in the chapter by Guenter B. Risse, was associated with the poor and the immoral. About two weeks into the epidemic the Special Medical Council stated that the disease was "confined to the imprudent, the intemperate and to those who injure themselves by taking improper medicines."[9] The highest incidence of cholera occurred in the red-light district, which the New York Evening Post reported to be populated by the vilest brutes whose breath would contaminate and infect the atmosphere with disease, even "be the air pure from Heaven."[10] Cholera arrived in the 1830s, and the social reaction to the ensuing epidemic was greatly complicated by the emotionally charged atmosphere of an active temperance movement in which moralizing was common. Advice for resisting the disease frequently included warnings against ardent spirits. One of the first and most prominent of American psychiatrists, Dr. Amariah Brigham, advocated in 1832 that boards of health be given "the power to change the habits of the sensual, the vicious, the intemperate."[11] The link between illness and morality has maintained a long and strong tradition. When an epidemic illness hits hardest at the lowest social classes or other fringe groups, it provides that grain of sand on which the pearl of moralism can form. Such was the case with a disease that has elicited alarmed calls more recently for isolation: tuberculosis.
Tuberculosis
Tuberculosis resembles leprosy in that it often is a long-term illness that permits the sufferer to remain ambulatory, perhaps for years, while potentially infectious. The victim might recover, but the high mortality rate for the illness makes the diagnosis a very serious matter. By the nineteenth century tuberculosis had become one of the most frequent causes of death in the Western world. If the cause and contagiousness of cholera were disputed until a bacterium was proved responsible in 1883, it is not surprising that tuberculosis, a more obscure and chronic infection, also sparked debate. The general opinion during the last century was that some people harbored a hereditary tendency toward tuberculosis that was exacerbated by poor sanitation and living conditions. The value of quarantine under these circumstances therefore seemed doubtful. But tuberculosis evoked quarantine responses once the cause was established to be a bacterium by Robert Koch in 1882.
Ten years after Koch's astounding announcement that the cause of tuberculosis had been found, the first tuberculosis association in the United States was formed in Pennsylvania. From this early effort to combine lay and professional support to battle one disease grew many other associations; eventually the National Tuberculosis Association (now the American Lung Association) emerged. The goal of the society was the prevention of tuberculosis by, first of all, "promulgating the doctrine of the contagiousness of the disease."[12] At about the same time, the New York City Health Department initiated steps toward mandatory reporting of tuberculosis cases. Beginning in 1894 institutions were required to submit such reports and three years later physicians were similarly obligated. Opposition among physicians to this requirement was substantial. Some argued that the mandatory reporting of cases of tuberculosis implied a lack of faith in the practitioners' abilities to take care of their patients. Others resented what they considered to be state interference in the patient-physician relationship, while still others believed the disease was hereditary regardless of what might be seen under a microscope.[13] Eventually, however, reporting of tuberculosis cases became compulsory throughout the nation.
Identification of tubercular patients led to requirements that the disease be properly treated. An effective antibiotic against the tubercle bacillus was not found until the 1940s, so treatment for the illness shifted from a relatively benign open-air regimen in cold climates, such as at Saranac Lake under the direction of Dr. Edward Trudeau, to a later, more drastic vogue for the collapsing of one lung and resecting part of the rib cage. A general consensus that patients needed extended periods of bed rest and that the sick needed to be isolated from the healthy led to the construction of tuberculosis sanatoriums by state and local governments. The federal government built hospitals for Native Americans, who gave evidence of being particularly susceptible to the disease.
We have all but forgotten the terror tuberculosis aroused earlier in this century. The death rate from tuberculosis in 1900 exceeded today's death rate from cancer and accidents combined. As its contagiousness became more widely acknowledged, medical experts increasingly advocated early detection and treatment. Some potential patients, however, tried to evade diagnosis not only to avoid the bad news but also because being reported as a tubercular would make it difficult or even impossible to obtain insurance or to keep a job. Public health officials, seeking authority to bring into treatment anyone who in their view was irresponsible, supported state laws to permit enforced treatment of the "careless consumptive" and to prohibit the discharge of a patient without approval of the medical staff.
Reports of involuntary-treatment laws in Connecticut suggest they were used infrequently and may have served more as a threat to obtain the cooperation of a patient. One reason appears to have been simply the expense of caring for a patient against his or her will, but it is unclear how many patients or potential patients were affected by the threat to invoke this stringent public health law. The health officer of New Britain, Connecticut, estimated he had invoked it "ten to fifteen times" in the period from 1920 to 1945.[14]
Gradually, the prevalence of tuberculosis, along with the fear it inspired, has declined to the point that neither is even a memory for many Americans today. The disgrace of having a disease often associated with unhealthy habits, not to mention the isolation from family and neighbors, has faded along with the many hospitals that were once strung across the nation for the care of the tubercular. It is clear, though, that by the time the disease reached its height, public-health-control measures had overcome many obstacles: The chest X-ray and the tuberculin skin test became so routine as to evoke hardly a comment from the patient.
Quarantine measures were also applied to other communicable diseases as their pathogens became identified. Efforts to quarantine sick persons and their households were dropped, however, when, in the light of new knowledge, it became apparent that such measures were ineffective. The infectious period of an illness, it was discovered, may occur prior to the onset of obvious symptoms; and the problem of enforcing quarantine, in any event, had always proved extremely difficult. Just as quarantine appeared to have no remarkable effect on the control of cholera in nineteenth-century America, so did the closing of schools in response to infectious diseases such as scarlet fever and diphtheria, which broke out in the twentieth century.[15] Similarly, during World War I, an equally ineffective response to disease was to hold soldiers with venereal diseases in special enclosures.[16] Still, it should be borne in mind that quarantine has been most popular when the fear or prevalence of a serious disease has been highest. The fear of a disease, as the history of quarantine indicates, is not aroused by the simple knowledge of physiological effects of a pathogen, but by an ill-informed consideration of the "kind of person" liable to become ill, and the habits thought to cause or predispose people to the disease. Likewise, quarantine is a response not only to the actual mode of transmission, but also to a popular demand to establish a boundary between the "kind of person" so diseased and the "respectable people" who hope to remain healthy.
Quarantine and the "Disease" of Immigration
Creating boundaries between groups to prevent entry of undesirable agents of disease (an essential element in the concept of quarantine) can be seen in the tacit philosophy of some of the United States' immigration laws. Immigration laws have traditionally sought to prevent entry of anyone who would create a public burden. The philosophy of immigration laws early in this century, however, carried the notion of quarantine much further than the restricted entry of the diseased or disabled. Hereditarian theories of race and racial superiority were buttressed by the discoveries of Mendelian genetics and reports of animal-breeding experiments, all of which combined to create the eugenics movement. Those Americans alarmed by the influx of immigration from southern and eastern Europe late in the nineteenth century found, in what was then modern genetics, "scientific" support for their long-standing fear: Undesirable races would pollute the Anglo-Saxon germ plasm if allowed to enter the United States and to intermarry with the extant population. There were many exponents of this theory, which so closely resembled a simple view of the germ causation of disease: If a germ entered the body, a specific disease would be caused—neither the environment, nor educational efforts, nor biological variability of the individual infected by the germ were important. This racial theory surely demanded a line of defense around the racially pure, just as any quarantine drew the line against the biological contaminant, the cholera germ.
The ideas calling for a racial quarantine are summed up in Madison Grant's The Passing of the Great Race, a pessimistic account, published in 1916, of undesirable immigration run amok, and of the glory of the Nordic race gradually fading into oblivion. Using eugenics theory to impart a "scientific" justification for his fears, Grant warned that such intermarriage "gives us a race reverting to the more ancient, generalized and lower type." Accordingly, racial disease could be prevented only by excluding carriers of biological contamination—the central concept of quarantine. This outlook triumphed in the Immigration Act of 1924, which drastically limited the influx of Europeans whom a person like Madison Grant would have found undesirable. The act was so effective that a year after its enactment the commissioner of immigration at Ellis Island reported that now almost all immigrants looked exactly like Americans.[17]
Drugs and Feared Minorities
The quarantine model can also be found in American reaction to the use of drugs by feared minority groups. The United States had an almost unrestricted market in morphine, opium, cocaine, and heroin during the nineteenth century and the first decade of this century. The use of these drugs became widespread, and in the years around World War I opposition to their nonmedicinal use reached a peak. Stringent federal laws assisted a variety of partial and conflicting state statutes attempting to control the use of narcotics. Interestingly, the campaigns that led to these laws ascribed the use of certain drugs to specific feared groups. Opium was linked to Chinese immigrants; cocaine to southern blacks; and heroin to an urban, violent, and criminal underclass. In the 1930s a similar, specific assignment was made of marijuana to Mexican immigrants who had come to the agricultural regions of the nation during the booming 1920s. In the crusade to control dangerous drugs, the emotional energy released by associating drugs with feared minority groups helped pass legislation prescribing severe penalties. The contrast with drugs that might be addicting and dangerous but are commonly used by the middle class, such as barbiturates, illustrates the intense emotions that can be evoked by appealing to the kind of fears that gave rise to the immigration laws of the 1920s.[18]
By the 1960s, a time of renewed addiction problems in the United States, simply being an addict rendered a person subject to involuntary confinement for therapeutic purposes. The Supreme Court declared that "in the interest of the general health or welfare of its inhabitants," a state "might establish a program of compulsory treatment for those addicted to narcotics. Such a program of treatment might require periods of involuntary confinement."[19] Justice William O. Douglas in his concurring opinion went so far as to add that confinement might be justified "for the protection of society" and not just for the treatment of the addict. California and New York both established sites where addicts could be committed for treatment. In 1966 the federal government made provision for civil commitment through the Narcotic Addict Rehabilitation Act. All of these programs for massive detention of addicts failed to meet legislators' expectations: Detention proved expensive and the rehabilitation rate was quite low. For our purposes—that is, to compare these latter measures with the possibility of quarantine in response to the AIDS epidemic—it is worth emphasizing that a group without an explicit ethnic affiliation but marked by a primary, and much feared, trait—addiction—was seen to deserve confinement "for the protection of society" by no less a champion of personal liberties than Justice Douglas. We have the advantage of knowing that the programs supported by such juridical sentiment proved impracticable.
The role of drugs among feared minority groups was thought to be similar to that of a virus in an otherwise fairly healthy group. Eliminate the virus and the group would not only function much more efficiently but would also cease being a source of infection to the remainder of society. In a way, however, the fear of drug contagion was a little more optimistic than the eugenicists' pessimism that ascribed an unalterable inferiority to some ethnic groups. Remove the drug, or discourage its use by punishment, and the person and the group would be more easily assimilable and certainly less dangerous. Even so, some said the Chinese, for example, had a racial weakness for opiates. Broadly speaking, however, the tangible reality of the drug encouraged the hope that its removal would make a threatening group more tractable.
Early in this century, cocaine was said to cause southern blacks' hostile attacks on whites. Fear of cocaine fed the mounting racial tensions in the southern states. Cocaine was thought to improve marksmanship, while alcohol made it worse. Believing that blacks might be high on cocaine, officers in one police department traded their guns for larger calibers because they thought a mere .32 caliber revolver could not stop a "cocaine-crazed" black.
The smoking of opium by Chinese was used as an argument against Chinese immigration. Opium was said to be the means Chinese men used to seduce white women. Heroin, on the other hand, supposedly bolstered the courage of underworld figures before a robbery. Champions of the strictest and most punitive antinarcotics laws, such as Capt. Richmond Pearson Hobson, considered narcotics a "racial poison." Hobson warned that the United States was under bombardment by the rest of the world, which sought to undermine American values and government through addicting narcotics. Each continent sent its wicked poison: Africa, hashish; Asia, opium; South America, cocaine; Europe, heroin. Captain Hobson was a keen student of the notion of racial degeneration, and the parallel he drew with undesirable races who wished to "invade" the United States is clear. The solution was to establish a boundary no foreign contaminant could pass.[20]
Some drug experts consider quarantine a remedy because they believe the isolation of drug-users is a protection against contagion. The idea that drug abuse is contagious is not new. In 1915 a Tennessee state official responsible for control of narcotic use, Lucius P. Brown, wrote in the American Journal of Public Health that
contagion is undoubtedly a very frequent method of spread. I have met many instances in which more than one member of a family was infected, the first case acquired accidentally or through a physician, infecting the other members of the family largely through a certain tendency on the part of the addict, particularly in the early stages, to introduce others to the delights of addiction.[21]
The spread of addiction through contagion, or, as it is more commonly described now, "peer pressure," has led to some forms of isolation in the United States. During the years just after World War I, for example, addicts in New York City were brought to North Brother Island in the East River. In the 1930s a federal narcotics hospital was built in the form of a prison in Lexington, Kentucky. The major reason for these isolated locations was to ensure that the patient would have no access to drugs, although treatment and imprisonment also removed "pushers" from communities.
With the second major onslaught of drug use in the United States and other nations in the 1960s, the contagion model again proved popular, both to explain the growing use of dangerous drugs and to suggest a means of control. Dr. Henry Brill, later a member of the National Commission on Marihuana and Drug Abuse (1970-1973), described in 1968 two kinds of addicts: the medical, caused by treatment for a painful disease; and the nonmedical, or "street," addicts. The former he found to be solitary users, but the latter frequently used drugs in groups, and their primary mode of spreading addiction was through "psychic contagion," as Brill labeled it, which may assume "epidemic proportions."[22]
A prominent Swedish drug expert, Dr. Nils Bejerot, agreed that interfering with this form of spread was a key to stopping epidemics of drug abuse. In Sweden the problem in the 1960s was the abuse of stimulants such as amphetamines and other "diet pills," but the principle still held, he believed, for other forms of drugs. He called this situation an "epidemic toxicomania" and recommended establishment of "treatment villages" in open locations "without the patients being able to escape at the first impulse." He favored islands or depopulated areas for the construction of treatment villages. Dr. Bejerot thought a year in a village would be the minimum required. Women would have intrauterine devices inserted to prevent pregnancies.[23] Although these villages have not been adopted in Sweden or the United States, the proposal is an interesting look at the wish to apply quarantine to a feared and massive social problem.
Quarantine boundaries are best defended if there is a clear distinction between the feared aggressors and those requiring protection. The leper had a prescribed costume and warning cry. Immigrants often looked different from settled citizenry; in the cities, the poor could be distinguished from the middle and upper classes. In the case of narcotics, Chinese, blacks, and Mexicans stood out from mainstream society; and society, threatened by their discontent and hostility, hoped to stop their use of dangerous drugs, if not to expel them and "their" drug from the nation altogether. How convenient it was to discover a contaminant among a group already held in low esteem and easily distinguishable from the majority of the population; the role that this view of addiction played in race discrimination should not be underestimated.
When such groups are quarantined, lasting psychological damage may follow. Insights into the emotional sequelae (aftereffects of disease) that would be involved in quarantining those who test positive for human immunodeficiency virus (HIV) but are otherwise unaffected by the illness may be gathered from studies of Americans of Japanese ancestry who were interned in concentration camps during World War II simply because of their lineage. About 120,000 persons—men, women, and children living in western states—were abruptly taken from their homes and settled in government camps for several years on the grounds that they presented a security risk to the United States. In recent years deep regret for this action has been expressed in Congress and by many citizens aware of what happened under the stress of war. Studies conducted on the former detainees reveal a number of reactions, including denial; loss of faith in legal protections; aggression turned inward, with consequent feelings of guilt, shame, and inferiority; and identification with the aggressor.[24] We should try to learn from that era of fear and to consider the effects of quarantine on the targets of that fear. The efficacy of the quarantine procedure itself must also be questioned.
Acquired Immune Deficiency Syndrome
In light of the history of quarantine and its various ramifications, the position of the AIDS victim and society's response to the disease can be better appreciated. The large majority of AIDS patients in the United States are found in two groups, male homosexuals and intravenous drug users. The disease itself is caused by a virus that is transmitted by means of an infected needle or during sexual activity, especially anal-receptive sex. The disease occurs in an uncertain fraction of those who have been exposed to the virus. So far, the mortality rate for AIDS has been nearly 100 percent, although the patient may live a year or two after the diagnosis has been made, mostly in the community and not in a hospital.
The question is whether AIDS possesses those characteristics that have led healthy citizens to call for a quarantine. It is indeed a serious disease with, so far, no cure. In this regard, AIDS patients face an irrevocable death sentence, much like the lepers of the Middle Ages. Furthermore, the groups with which AIDS is most closely associated in this country have typically been held in low esteem by the general population, the objects of discrimination in jobs, housing, and everyday social contact. Also, the disease is generally transmitted among drug addicts and homosexuals by means that have been or are still illegal in the United States. In this respect, AIDS, like other contagious diseases of the past, is associated with minorities who are considered sexually deviant and promiscuous. Like tuberculars and lepers, AIDS patients may have relapses between which life might continue outside the hospital, at home, or, at the least, in the community. During this time, however, the patient remains infectious and is therefore a source of apprehension. Recalcitrant patients who do not follow recommendations for "safe sex" evoke memories of "careless consumptives" whose presence motivated the passage of special laws permitting their involuntary isolation. Like tuberculous patients, AIDS patients have difficulty obtaining insurance and, like members of any rejected minority linked to a serious communicable illness, the group as a whole may be treated as if all its members have the most dangerous form of the disease when any one of them applies for employment or housing, an ascription similar to the widespread association of specific drugs with feared minorities. In sum, AIDS patients have reason to be concerned over the possibility of quarantine or isolation. Are there any countervailing arguments?
The first restraint against a rush to institute quarantine measures against AIDS victims is the extensive experience showing that sustained quarantine for large numbers of people has not been successful. The great efforts to control the individual behavior of drug addicts have obviously been thwarted, or drug users would not now be spreading AIDS by injecting substances into their veins. Further, the spread of AIDS has not been found to occur through casual contact, and there is reason to believe that not all of those with AIDS antibodies will develop a serious illness. If, however, longer experience with patients who have tested positive for AIDS antibodies reveals a very high incidence of illness in later years, or that AIDS is rapidly spreading from groups now chiefly associated with it—i.e., intravenous drug users, male homosexuals, and recipients of blood infected with the AIDS virus—the general population will in all likelihood become highly anxious.
The United States has a long history of mistrust of physicians and the medical establishment. The government also has had difficulty regaining its credibility about dangerous drugs after so many excessive warnings, particularly about marijuana, in the 1960s. When authorities make pronouncements about AIDS, their comments meet with considerable public skepticism. This skepticism must be borne in mind by those trying to provide reassurance, for if their reassurance is later found to have been overstated, the public confidence, which is needed to contain destructive emotions, will be compromised.
Strong reactions to the threat of AIDS will more likely result in restrictions on individuals if the disease continues to spread and to affect many more unsuspecting citizens. Passions could be mobilized politically and could result in a program to mark or isolate persons testing positive for AIDS antibodies. Just because quarantines are not effective does not mean they will not be attempted. The 1832 cholera epidemic in New York City led to politically mandated quarantine in spite of the almost unanimous opinion of leading physicians that it was a useless expenditure of time and funds. Perhaps the most helpful counter to unenlightened outrage is public awareness of the enormous effort under way to understand and treat AIDS. This effort includes evidence of the growing success of educational programs among the groups most affected by AIDS.
If the AIDS crisis persists for some years, one can speculate that society or the groups most involved may develop ritual forms to recognize the mutual responsibilities between the healthy and the diseased. It would appear that such ceremonies for leprous persons helped both the healthy but vulnerable and the afflicted to accept their condition. Of course, in the absence of a single religious authority today, whatever ritual is developed may take on a more civic character.
If other diseases, say, multiple sclerosis and some cancers, are found to be preceded by a lengthy, asymptomatic viral infection, we may see the establishment of a new class of patients in circumstances common to AIDS victims now: A test may reveal the likelihood of death years in the future. What are these people to do in the meantime? How will they deal with the inevitable shock and grief that follow such a diagnosis? Our society may become motivated to create a sympathetic ritual not only to sustain but also to acknowledge these citizens. AIDS may be the model for ways to help both the well and the sick deal with such conditions produced by medical advances in etiology and diagnosis, but not in curative therapy.
In conclusion, the quarantine of AIDS patients remains a possibility and depends on such factors as the time until an effective vaccine or treatment becomes available, secondary and tertiary spread of the virus, and the faith of the public in official pronouncements regarding the illness. AIDS possesses many of the characteristics that have motivated past quarantine efforts—association with feared social subgroups, transmission through means the public has deemed unlawful or distasteful, the potential for spread outside these rejected groups to the public at large, and a lengthy infectious period outside hospital confinement. There is no assurance that quarantine will not be attempted, but awareness of its past ineffectiveness, accurate information, and an understanding of the irrational fears that wrongly prompt quarantine are good defenses against it.
Notes
An earlier version of this paper appeared in the Milbank Quarterly 64, suppl. 1 (1986): 97-117.
1. J. Gerlitt, "The Development of Quarantine," Ciba Symposia 2 (1940): 566-580.
2. George Rosen, "Forerunners of Quarantine," Ciba Symposia 2 (1940): 563-565.
3. Saul N. Brody, The Disease of the Soul: Leprosy in Medieval Literature (Ithaca: Cornell University Press, 1974), 66-67.
4. Ibid., 68.
5. Erwin H. Ackerknecht, "Anticontagionism between 1821 and 1867," Bulletin of the History of Medicine 22 (1948): 562-593.
6. Benjamin Rush, "An Account of the Bilious Yellow Fever, as it Appeared in Philadelphia in 1793," in Medical Inquiries and Observations, ed. Benjamin Rush, 4th ed., 4 vols. (Philadelphia: M. Carey, 1815), 3:111.
7. Benjamin Rush, "An Inquiry into the Various Sources of the Usual Forms of Summer and Autumnal Disease, in the United States and the Means of Preventing Them," in Medical Inquiries, 4:138.
8. Benjamin Rush, "Facts, Intended to Prove the Yellow Fever not to be Contagious," in Medical Inquiries, 4:170.
9. Charles E. Rosenberg, The Cholera Years: The United States in 1832, 1849, and 1866 (Chicago: University of Chicago Press, 1962), 30.
10. Ibid., 34.
11. Amariah Brigham, A Treatise on Epidemic Cholera (Hartford, Conn.: H. and F.J. Huntington, 1832), 338, emphasis in original.
12. George Rosen, A History of Public Health (New York: M.D. Publications, 1958), 388.
13. Daniel M. Fox, "Social Policy and City Politics: Tuberculosis Reporting in New York, 1889-1900," Bulletin of the History of Medicine 49 (1975): 169-195.
14. Connecticut Public Health and Safety Committee, "An Act Concerning Prevention of the Spread of TB," Hearings, stenographic transcript, State of Connecticut Legislative Archives, 17 April 1945, p. 178.
15. A. L. Hoyne, "Are Present-Day Quarantine Methods Archaic?" Illinois Medical Journal 80 (1941): 205-208.
16. Allan M. Brandt, No Magic Bullet: A Social History of Venereal Disease in the United States Since 1880 (New York: Oxford University Press, 1985), 116.
17. John Higham, Strangers in the Land: Patterns of American Nativism, 1860-1925 (New York: Athenaeum, 1963), 156, 325.
18. David F. Musto, The American Disease: Origins of Narcotic Control, exp. ed. (New York: Oxford University Press, 1987), 3-8, 219.
19. Robinson v. California, 370 U.S. 660 (1962).
20. Musto, The American Disease, 190-197.
21. Lucius P. Brown, "Enforcement of the Tennessee Anti-Narcotics Law," American Journal of Public Health 5 (1915): 323-333.
22. Henry Brill, "Medical and Delinquent Addicts or Drug Abusers: A Medical Distinction of Legal Significance," Hastings Law Journal 19 (1968): 783-801.
23. Nils Bejerot, Addiction and Society (Springfield, Ill.: Charles C. Thomas, 1970), 271-275.
24. U.S. Commission on Wartime Relocation and Internment of Civilians, Personal Justice Denied (Washington, D.C.: Government Printing Office, 1982), 295-301.