Coping with the Urban Environment
Public crises and private benefit; the response of charity, reform, and science
The problems of health and housing in today's American cities are often perceived as belonging to two quite different domains. One cluster of problems relates to the fundamental inequality of our citizens and is manifested in the inability of inner-city poor whites and blacks to obtain the levels of health care and of a safe, sanitary environment that were achieved by the majority of their fellow urban dwellers half a century ago. A second cluster of problems concerns the inability of more prosperous white Americans to obtain the kind of preventive care, day-to-day medical service, and supportive physical and social environments to which they should be entitled by modern medicine and modern physical planning. Because of the segregated structure of the metropolis and the class and racial politics of health and housing, the two problems are now dealt with as isolated issues. We debate welfare, clinics, public housing, and urban renewal for the inner city; we also debate voluntary insurance, aid to the medically indigent, community hospital service, group practice, the housing shortage, and improved planning and subdivision control for the suburbs. Yet because the entire metropolis or megalopolis is part of one national urban system and is dependent in most of its parts upon the workings of that system, the two clusters of problems are in fact inseparable: the failure of the health-delivery institutions to meet the acute needs of the inner-city poor is tied to the failure of the preventive services to meet the needs of the outer-city
majority; today's housing crisis of the slum is a product of yesterday's planning failure of the suburbs.
Although the death rate in American cities varies systematically by race and class, with the poor and the black having the shortest lives, the gap is small and would disappear as a by-product of the modernization of health and housing services as a whole. Today's problems rest in the context of a stable incidence of mortality and a widespread expectation of a long life. The difficult issues of our own time turn on the universal experiences of city dwellers as they live out a more or less common life span. The urgent agenda of both inner city and suburbs speaks to the quality of life as we all undergo generally inevitable traumas, accidents, communicable diseases, confusions, criminal assaults, and physical disabilities from birth to death. The quantity and quality of available housing is inextricably tied to well-baby, pediatric, communicable-disease, drug, and accident services for young people and their families; community planning is linked to mental-health, accident, and chronic-disease care for middle age; and the placement, design, and supply of housing are crucial ingredients of geriatric care and social services to the old. The failure of the modern city to realize its potential in these fields is as much a product of the workings of the urban system as is its failure to distribute its wealth equitably in respect to family income and full employment.
The gap in health and housing between potential and realization can be understood in two ways: in terms of class and in terms of institutions. First, the construction of housing (and therefore the available urban stock of housing) has always depended on the capital resources and rent-paying abilities of city dwellers. It has always reflected the differential distribution of income in the city, and it has been dependent upon the fashions and abilities of the upper one-third to one-half of the population. Similarly, medical services have been closely bound to patients' ability to pay, to the class ambitions of doctors, and to the philanthropic styles of the rich, so that the health of the urban population as a whole has always mirrored the class structure of the city. Second, in institutional terms our provisions for both housing and health take their basic configuration from the pre-1920 era of the industrial metropolis. The real-estate and housing industries and their regulatory monitors assumed most of their current form in response to the problems and capabilities of the old crowded metropolis. Similarly, the modes of American medicine—public-health institutions, acute-care
hospitals, private doctors—arose in the late nineteenth and early twentieth centuries when such a structure suited the financial resources, scientific progress, and personnel capabilities of that era. Considering the advances in social science and medical science since 1920, one might reasonably expect that American society would have moved more rapidly away from these inherited constraints, but the painfully slow advance in these key elements of a humane environment can be attributed to the interactions between class-based financing and the power relationships of the accompanying institutions.
During America's first century of rapid urbanization, the years from 1820 to 1920, our urban environments polarized about two extremes. In the early nineteenth century the health of city dwellers depended upon the amplitude and adequacy of the traditional design of individual houses, upon the purity of family and neighborhood wells, the happenstance of open lots, and the variations in care of backyard and basement privies. Booming growth and mass migrations relentlessly pressed against this almost universal big-city environment, placing all citizens in continuous jeopardy from fire and disease. Innovations in transportation, municipal sanitary services, plumbing, heating, lighting, and to a lesser extent in the design of housing created a new environment for one-half to two-thirds of the urban population. By the 1890s the disparate trends of urban growth had become apparent. At one extreme stood the new urban world of single-family houses, row houses, two-families, and apartments, where an unprecedented part of the population enjoyed equally unprecedented security and a rapidly rising standard of living; at the other extreme stood the old big-city world of overcrowding in rooms and obsolete structures, faulty or nonexistent plumbing and heating, firetraps, fever nests, and malfunctioning integration of public and private sanitary systems. The potentials of the new environment and their unavoidable tensions with the old combined to call forth the housing practices and health programs of the industrial metropolis. The same potentials and tensions also set for our own era the basic institutional structures that still determine our housing supply, public health, and private care systems.
The safer and more wholesome urban environment sprang from a series of complementary events in transportation, public services, site planning, mechanical inventions, and home design. The succession of transportation innovations, from the introduction of horse-drawn streetcars in the 1830s to the electrification of street railways in the 1890s
and the supplementing of public transit by the automobile after 1910, had a contradictory environmental effect. As possible commuting distances lengthened with each transportation advance, the supply of land expanded exponentially, thereby relieving what would otherwise have been an intolerable pressure upon land within the reach of pedestrian journeys to work. The fact that fringe land around each booming city grew at a rate even more rapid than the city's population made possible a lowered density of many new residential environments in the industrial metropolis.
For housing built after about 1880 a new minimum standard prevailed. In most cities the standard manifested itself in miles upon miles of small wooden freestanding houses set back by a tiny lawn from the dirt and dust of the street, each separated from its neighbor by a narrow side yard and boasting a rear yard often as deep again as the house itself. The cumulative effect of forty years of such construction was to free the middle-class and typical working-class Americans from the dangers of alley housing, boardinghouses, and jerry-built conversions typical of the high land values of the big city of the early nineteenth century. Even in the nation's largest cities, where land costs were high and multistory housing prevailed, the opening of new land brought salutary effects. The universal two-family structures, the three-deckers of New England, and the flats of Chicago, though they crowded the land by today's standards, at least guaranteed no windowless rooms, and two exits—front and back stairs—in case of fire. Philadelphia's and Baltimore's row houses gained in amenity when builders stopped squeezing them into courts and rear alleys and began to lay them out instead in strips fronting only the main streets. The perspective of mile upon mile of houses of the industrial metropolis presents a dreary aspect to today's viewer, but in the essential ingredients of light, air, and fire safety the structures represent an important advance over the earlier practices of urban land crowding. Only in the inner-city tenements of every city, and especially in the crowded centers of New York and Boston, did the industrial metropolis's new transportation fail to improve the environment of large numbers of its citizens.
A contrary tendency of transportation innovation controlled the inner city. The ability of electric-powered surface transportation to deliver ever more thousands of commuters to the downtown sent the price of centrally situated parcels of land skyrocketing, thereby raising the rents for close-in housing to higher and higher levels. For the poor, confined by their job-access needs to the center of the city, this effect of transportation improvement on rents proved an insurmountable barrier to their realization of the benefits of modernization. For the well-to-do a modest move uptown and the purchase of new building designs and mechanical services in the form of firewalls, fire barriers, central heating and lighting, and a full complement of sanitary equipment secured the safety of their town houses. In addition, the invention of a wholly new structure, the fire-resistant steel-frame multistory apartment house with elevators and generous light and air shafts, allowed the inner-city middle class to attain the minimum benefits of suburban environmental safety in the face of high-density urban living.
Not more orderly land use alone, but light and air in conjunction with a more dependable water supply and waste disposal, made the early twentieth-century urban environment the safest form of mass habitation yet built. The mode of construction of water-supply and sewerage systems divided the responsibility between municipal capital on the one side and the individual installations of middle-class homeowners and home builders for the middle-class market on the other. For the majority of city dwellers this division of responsibility proved to be a workable partnership for the raising of living standards. Yet the unnecessary exclusion of perhaps a third of urban Americans in the 1920s from these standards may be laid to a lack of public attention to the inevitable class shortcomings that would spring from such a division.
The mode of construction of waterworks and sewerage in the United States arose out of the traditions and exigencies of the mid-nineteenth-century city. Sheer numbers, onrushing growth, and the crowding of land broke down the earlier small-town checks against fire and communicable disease. In every American city devastating fires swept whole blocks of valuable downtown districts. The Chicago Fire of 1871 was
but the most celebrated of a half century of conflagrations. Contaminated wells, overused and ill-tended privies, overcrowded buildings and rooms, and shiploads of undernourished and sick immigrants simultaneously brought epidemic waves of cholera, typhus, and yellow fever which swept the downtown districts of the poor, seeped into hotels and public places, and frightened all classes of city dwellers. Those who could afford to commute or leave their jobs fled the city during periodic epidemics, and the well-to-do adopted the habit of spending summers in distant suburbs in order to escape the season of greatest danger. Although public toleration for fire and disease stood at a much higher threshold in 1840 than in 1920, the desire to mitigate these trials found daily reinforcement in the sheer lack of reasonably clear water in many parts of the city. Water peddlers' wagons moved through the streets of New York selling spring water to housewives so they could brew a palatable pot of tea or coffee. A clouded and murky pail was often the best that a backyard well or neighborhood pump could offer for the family washing. By the 1840s the merchants' fear of fires and the desire for household convenience reached a pitch that overcame the universal distaste for taxes and heavy public expense.
During the 1840s and 1850s the major cities of the nation built reservoirs, aqueducts, and pumping stations, and laid water mains through almost every street. Yet no sooner had these giant municipal undertakings been completed than the abundance of water clogged the haphazard neighborhood sewers and flooded the streets, alleys, and back-yards of the city. And the threat of epidemics did not disappear. According to contemporary theory, stagnant water, putrefaction, and bad odors were the breeders and carriers of disease. Repeated statistical investigations by doctors and laymen established an incontrovertible correlation between the incidence of infection and inadequate sanitation. Thus from the 1850s to the 1870s cities shouldered the heavy burden of constructing their initial unified sewer systems to match the waterworks of the previous decades.
Both halves of the sanitary system again rested upon the divided responsibility of public and private effort. The division of labor seemed perfectly natural to the age. It minimized public costs, especially when complete systems had to be constructed from scratch, and at the same time it continued the long-standing tradition by which each property owner shouldered the responsibility for the improvement of his own buildings. The public water effort stopped with the laying of water mains in the streets and placing of hydrants from which householders could draw water and to which fire pumps could be attached. Any abutter who wished to tap the main in the street for service to his house, store, or factory could do so, but he had to bear the expense of the connection as well as to pay for his own plumbing and fixtures. Similarly, the sewer ran underground through the street, available to those who chose to make a direct connection to it.
The immediate consequence of this division of labor and responsibility was to hobble the effectiveness of the sanitary system. Homeowners and landlords whose tenants could afford a moderate increase in rents rapidly installed the water tap at the kitchen sink and the flush toilet, the essentials of the new environmental safety. Bathtubs, long considered a luxury, gained popularity more slowly. At the growing fringe of the city the financial partnership of public and private effort placed even more expense on the individual household. Here costs were allocated according to the traditions of beneficial assessment—that is, owners of land abutting a street were charged for a share of any public improvement that raised the value of their land. In opening up new land, the purchaser of each lot had to pay for all or some very substantial fraction of the costs of laying the water mains and sewers, as well as for the house connections and equipment. The effect was to raise both the amenity level and the costs of new construction beyond pre-plumbing levels. For its part the city waterworks and later metropolitan water and sewer boards endeavored to keep up with suburban demand by building new (and rebuilding old) water mains, pumping stations, and trunk-line sewers. It was a costly race in pursuit of new development, and some
modern authors who have reviewed the pricing of water contend that the total effect was to encourage not only suburbanization but also the commercial and industrial waste of water.
Whatever the merits of alternative pricing schedules might be, there can be no doubt about the long-term environmental effects of the municipal-private partnership. By 1920 the middle class both within the city and in the suburbs had attained a newly safe and salubrious environment, while the working-class families who inherited old middle-class neighborhoods, or rented newly constructed multifamily housing of their own, reaped the same benefit. The poorest third of the population, however, was left out or lagged badly behind, suffering either from the complete absence of the new facilities in their homes or from limited plumbing facilities used by too many people. The water rates could easily have been used to install and maintain the necessary faucets and toilets, thereby overcoming some of the worst effects of the unequal distribution of personal income in the society. The public costs would have been relatively slight and the gains in health substantial. All that was lacking was the popular willingness to make available to all the minimum standards of decent middle-class and working-class life.
Because the lower-income half of American urban families had to find their housing in the structures vacated by the upper half, the new environment of the 1880-1920 years is the old environment of today's cities. These former growth rings are now the gray areas of today's metropolis—the Brooklyns and Bronxes of New York, the West and South Sides of Chicago, the East Sides and Hamtramcks of Detroit. The weaknesses of a previous style of environmental progress have ripened into contemporary problems. It therefore repays us to identify the inadequacies of the past so that we will not repeat the same behavior.
The most serious failures of the 1880-1920 environment stemmed from faulty land practices. The structures themselves now suffer from inevitable obsolescence and aging, but many could be brought up to current standards by sustained national prosperity and steady attention from homeowners and municipalities. In the boom after World War II an extraordinary modernization of American housing went forward, and so it could again. The social and economic consequences of bad land planning, however, confront today's householder and public official with extremely costly and painful choices.
The ugliness of these old areas stems directly from the habitual land crowding of the past and from its use of uniform grid streets and narrow rectangular lots. Yesterday's developer, like today's, sought his profit by putting together a land-house package in which modish ornament and late-model fixtures were combined with a generous house size. The structure conforming so nicely to fashion was the sales item, and the land beneath it was skimped so that the total price could be held down and the lot-house package marketed to as wide a market as possible. Actually there was much good sense in this strategy. Buyers could easily compare one standardized house with another, and the developer could save little on his houses by alternative designs or by cutting corners in construction. On the other hand, much could be gained by the developer who shopped in the metropolitan market for land. Land always ranged widely in price, in accordance with numerous variables, so that the developer who took his profit by marking up the land in the lot-house package rather than by alienating the buyer by radical alterations in the structure enriched himself, while he simultaneously catered to a mass middle-income market.
The consequences of this strategy in private development have been an array of relatively generous structures and pinched and inflexible land divisions. From such practices came the handkerchief front lawns, dark and narrow side yards, garage-lined alleys, solid blocks without parks or playgrounds, and the apartment-walled streets that are common to all our cities. In their day such areas were to achieve aesthetic success through the softening of awkward structures by means of trees planted between sidewalk and street and the visual merging of one tiny lawn with its neighbor. Although each lot might be small, the overall effect of the
block would be the relief of repeated buildings by continuous bands of green. Moreover, land covenants and later zoning ordinances against particular uses were designed to protect the residential grids from the encroachment of commerce and industry by confining these activities to a corner store, strips along the main thoroughfare, or bands on each side of railroad tracks.
Over time such expectations and achievements have been severely eroded by the coming of the automobile and also by the inherent rigidity of the social and economic requirements of the land plan. These grids were not designs for future growth and inevitable change; they were static layouts, and this in a country with a long history of racing urban transformation. Sheer crowding of the streets and yards by automobiles since World War II has destroyed the trees, hedges, and lawns, and since side and rear yards were small (or nonexistent on apartment blocks) cluster parking could not be introduced except by tearing down some houses and apartments. With the coming of the automobile the old residential areas of American cities have irrevocably lost their earlier pleasant qualities.
Such crowded and unwalled land presupposed adequate incomes and a neighborhood consensus for the private maintenance of what were in function the public amenities of the block. Children had to be continually restrained; lawns, hedges, and trees tended and replanted; janitors, tenants, and homeowners had to be fussy about trash; home businesses and car repairs had to be excluded; and the city had to be vigilant in its cleaning, policing, and planting if the fragile green strips were to be preserved. Declining incomes of old people, lowered wealth of successors to the first settlers, crowding by the poor, small businesses, multiple occupancy, multiplication of automobiles, loss of political status at City Hall, and impoverishment of municipal governments made such a demanding neighborhood performance impossible to perpetuate. High walls and enclosed gardens and courtyards in the European manner would have enabled American residential neighborhoods to be used more comfortably by people of varying incomes and ways of life, but to modernize our inherited gray areas in such patterns and also to make room for automobiles would require a heavy investment, to say nothing of tearing down structures to make parking places. All in all, such modernization will demand an investment in landscape construction which far exceeds that allotted to our common open urban and suburban styles.
Residential areas are not alone in suffering from the real-estate practices of the past. The developer's goal of selling every last lot meant that commercial and industrial strips were cut up and filled without regard for a reserve of space needed for the future. Today narrow strips of stores laid out to serve pedestrian and streetcar traffic cannot be readily adapted to automobile traffic. No land was set aside for commercial and industrial expansion, so that firms that prosper in old sections of the metropolis must move, much to the detriment of the local economy, to find adequate space. The inner city and gray areas thus become, by a kind of anti-Darwinian selection, the sites of the old-fashioned and least successful enterprises.
Finally, the new environment of 1880-1920 was more a machine for social mobility than a model for urban communities. The sociability of Americans, especially housewives and their children, did create friendly neighborhoods within the ever-expanding grids of streets and houses, but these important interactions took place despite, not because of, the land plans. The shopping strips, scattered churches and schools, and grid streets did not focus the paths of neighboring and daily errands in a way that made it possible for groups of people living within the same few blocks to know or recognize each other. This absence of widespread acquaintanceship caused by disparate daily paths has hindered the informal policing of old urban and suburban neighborhoods. Such sociability networks as did establish themselves were hardly a match for the contrary impulsion toward anonymity which American mobility patterns foster. We use housing as an expression of family status and affluence, to move out when we move up, or to shift houses in a restless search for better jobs. The inevitable consequence of these habits has been very high levels of neighborhood turnover, with all the social stresses and threats to stable property maintenance and values that such behavior entails.
The instability of residential property, the customary failure of builders to lay out or maintain a gardenlike neighborhood, and the universal lack of community solidarity led some wealthy nineteenth- and twentieth-century Americans to experiment with communities planned to overcome these failings and to protect suburbs from the usual processes of urban growth. The earliest experiment, Llewellyn Park, New Jersey (1853-69), had a single-gated entrance which opened to a sinuous ribbon of streets laid out along the contours of a hilly site. Four hundred acres were subdivided into one-acre sites abutting an interior fifty-acre park. The park was to be controlled and maintained by the homeowners as common land. Thus the enjoyment of a country gentleman's park in the then-popular romantic style became possible for several hundred families, each of whom was responsible individually for the upkeep of only one acre. A similar design for sixteen hundred acres in suburban Chicago was laid out by New York City's Central Park designers, Frederick L. Olmsted and Calvert Vaux, in 1868-69. Here an entire residential community was contemplated. The commuters' railroad station and the town stores served as the community center, while curved streets sunk below the grade of the house lots, reserved parkland, and subtle alterations in the Des Plaines River created the garden effect. In this case the subdivision—Riverside, Illinois—constituted a single political unit, so that the regular political machinery of local government could be employed for the maintenance of public spaces and for the policing of the subsequent development of the town.
The enthusiasm for golf, which seized the rich in the late 1880s, offered new devices for community planning of wealthy subdivisions. The golf club with its expensive lawns and plantings could serve as a park in its own right and also as a barrier to later encroachment by smaller houses and apartments. It could in addition serve as a powerful mechanism for controlling the social unity of the area, and with such merits the golf club became the most widespread tool of suburban community design in the American metropolis. Its failings of course lay in the substitution of private club for public community and in the extreme class, race, and ethnic segregation it inevitably imposed.
The largest and most successful of all these upper-income residential communities has been the Country Club District of Kansas City, begun in 1905. One firm has since continued to develop a succession of subdivisions that fan out from the axes of two main streets that join at a shopping center. An extensive list of covenants between the developer and the purchaser—including, for many years, covenants against black purchase—and active homeowners' associations have been used to control the siting of the houses and the type of structure built and to maintain the public services of the streets and district.
In 1911 the Russell Sage Foundation attempted to demonstrate in its Forest Hills, Long Island, project that these examples of design of suburban communities for the wealthy could be adapted for middle-income housing. The experiment, taking place within the municipal boundaries of New York City, attracted a great deal of attention and was a considerable success in the field of design, but it also conclusively proved that the conventional subdivision was more profitable. The lesson Forest Hills taught the infant city-planning profession was that the community planning features of curvilinear streets, cul-de-sacs, playgrounds, parks, and unified shopping centers would be adopted by subdividers only if local government regulations required them. This lesson has been well learned, and much of the superiority of post-1920 suburban subdivisions over their predecessors comes from the imposition of such rules for land platting by professional planners employed by the local governments of the American metropolis.
In particular the neighborhood unit scheme, derived from nineteenth-century planned community experiments and advocated by Russell Sage Foundation executive Clarence Perry, proved a flexible device. The neighborhood unit idea, modeled in part on Forest Hills, was an institutional and traffic design program for promoting the social organization of new suburbs. Each neighborhood was to be defined by one primary school, situated in a central park. The borders of the neighborhood were to be set off by main traffic arteries. In this way only neighborhood-serving and local residents' traffic would move through the area, while
the schoolchildren and their after-school play would bring resident families into contact with one another. Service stores were to be so placed as to make for social unification, and the daily round of errands would also promote acquaintanceship. In varying modifications the neighborhood-unit idea for suburban planning has been promoted by professional planners and widely adopted in middle-class subdivisions across the nation.
Gans's study of Levittown, New Jersey, demonstrates that school, street, and errand planning do not make communitarians out of America's nuclear and highly mobile families, but such designs do reduce traffic accidents and provide an informal atmosphere in which to raise children. Only in cases where subdivisions of limited class range have coincided with local political boundaries do strong suburban communities seem to develop in the metropolis. In such cases the positive effects of the promotion of public facilities, high levels of maintenance, and innovative municipal services manifest themselves, but so also do the negative effects of racial and class exclusiveness. For the preponderance of Americans, neither the old grids nor the community-planning experiments of the nineteenth century seem to create an adequate urban environment—an environment able to roll with the social impacts of a rapid rate of urban development and at the same time to fill the gap between family isolation and the goal of an open democratic community life.
If the environment of 1880-1920 is the physical inheritance of our cities, it was the medical responses of that same era that fashioned the basic set of institutions established to protect the health of our urban populations. The staying power and rigidity of this legacy derives from its extraordinary successes in its own time. Armed with new scientific discoveries and techniques, these medical institutions scored an undreamed-of victory over the epidemic and mortality crises of the nineteenth-century city. Simultaneously doctors, hospitals, dispensaries, and public-health units offered a broad range of acute-care services which met many of the day-to-day needs of the upper two-thirds of the population. Such remarkable accomplishments so raised the status and popularity of doctors and their institutions that not until our own time have the shortcomings of these arrangements from the past come under scrutiny. Yet today's problems were also those of the years of first
triumph: a badly skewed delivery of health-care services that favored city dwellers, whites, men, and the well-to-do; an arrested environmentalism that neglected nutrition, housing, community, family life, and preventive care; narrow specialization and bureaucratization which dehumanized the patient; an overemphasis on drugs, surgery, and advanced instrumentation which drew scarce resources from the essential, if less heroic, long-term physical and mental therapies; a general self-satisfaction on the part of the medical fraternity which isolated it from a range of overlapping professions in education, engineering, planning, and social science.
The sustained contagious-disease and mortality crises of giant nineteenth-century cities manifested themselves most ominously in the old environment, the quarters of the poor untouched by or only partially improved by the new patterns of city building or the new sanitary services of the 1880-1920 metropolis. Here society had proved unwilling or unable to extend its environmental remedies, but fortunately the nation was spared the endless recurrence of the ancient disabilities of great cities because the discoveries of medical science were able to deal with a select list of diseases and thereby reach out to protect almost the entire urban population. In addition, such was the new wealth of these cities that the adequate-income majority was able to purchase a greatly enlarged range of services for its routine health care.
In the 1820s, at the onset of rapid urbanization, American cities were virtually defenseless against both epidemics and the normal incidence of disease. The art of medicine could do little but set bones, amputate limbs, pull teeth, vaccinate against smallpox, and assist births. The few drugs doctors prescribed were unspecific and often given in debilitating dosages. Worse still, the contemporary custom of drawing blood and purging bowels was actually injurious to the ill. In these early nineteenth-century years the best medical care consisted of commonsensical home nursing by relatives and the family physician so that nature's own cures could most effectively take place. For the ordinary citizen the comfort of one's own family and the attendance of the solo practitioner were the normal recourse in times of accident or sickness.
For those outside the ministrations of family comfort, the largest
cities like Philadelphia, New York, and Boston had established hospitals open to migrants, sailors, the old, mentally ill, and the sick poor. Despite good intentions, extreme class segregation inevitably undermined these institutions. As custodians of outsiders and castoffs, these early hospitals fell far short of contemporary home standards. Except in the few hospitals staffed by Catholic orders, nurses were those with the lowest status and little opportunity for other employment and were sometimes even superannuated prostitutes and former felons. Only when the middle class itself experienced hospital conditions, as thousands did as soldiers and volunteer nurses during the Civil War, did the importance of hospital nursing impress itself on the consciousness of the mainstream of the society. Hospital funds were always short, rooms overcrowded, bedding dirty; in the absence of special operating rooms and anesthesia, the screams of the patients echoed through the wards. Under such conditions, alcohol, then a major hospital remedy, was perhaps the most humane prescription. In the early nineteenth century, hospitals deserved their popular reputation as places where shiploads of sick immigrants were dumped, and where the poor and the unfortunate went to die.
A more successful institution of these early years was the dispensary, a neighborhood clinic where medicines and advice were given to poor patients. Since dispensaries held no resident patients, they escaped some of the effects of ward contagion. They were cheap to run, and they also enjoyed a measure of public support as the one medical institution that could help the city in its continuing struggle against smallpox. The common council of New York, for instance, frequently voted funds to the dispensaries for immunization, but popular distrust of vaccination limited their effectiveness. Though charities, the dispensaries rose above some of the worst degradation of philanthropy because the needs of the medical profession elevated the quality of their services. There were few medical schools in those days and even less clinical supervision, so that ambitious young doctors who wanted to extend their apprenticeship sought dispensary positions much as today they seek hospital residencies. With such diverse roots of support, city dispensaries multiplied during the years before the Civil War; New York's first dispensary opened in 1791, and by 1866 there were ten.
Altogether, family nursing, the private practitioner, the hospital
and the dispensary were but a tenuous defense against accident and disease. Today the national death rate stands at about 9.5 per thousand inhabitants; in 1900 it was 17.2; in New York prior to the Civil War, so far as records tell, it fluctuated between 26.1 and 40.7. Infants and children dwelt in greatest jeopardy, suffering about two-thirds of each year's deaths. Yet public concern did not then, nor did it earlier, focus on childbirth and child care, or even on the major day-to-day causes of adult death and morbidity: tuberculosis, typhoid, and dysentery. The public accepted these diseases as the hazards of life itself, though statistics seem to show that in the nineteenth-century urban environment such dangers to life increased with city size. Rather, it was the dramatic summer incursions of epidemics of yellow fever, Asiatic cholera, and to a lesser extent typhus (a disease the well-to-do could ignore as the special province of poverty-stricken immigrants) that mobilized public opinion. These epidemics called forth the nation's earliest environmental public-health programs: quarantines, emergency and immigrant pesthouses, disinfection of the rooms and houses of the stricken, and attacks on nuisances and filth.
Although the causes of the epidemics were unknown at the time, European sanitarians had conclusively shown (and American investigators confirmed) that overcrowding and bad sanitary conditions were correlated with a high incidence of such cases. Landmark studies were Lemuel Shattuck's Census of Boston (1845) and his Report of a General Plan for the Promotion of Public and Personal Health (1850), John H. Griscom's The Sanitary Conditions of the Laboring Population of New York (1845), and the American Medical Association's multi-city investigations of 1849. The net effect of these measures and of the water and sanitary constructions which accompanied them seems to have been to stem the potential for an ever-rising death rate, which unattended urban growth would have unleashed. Until the new medical science arrived, high mortality could not be turned back, but the early environmentalists did at least succeed in holding the half-dozen largest cities of 250,000 to 2,000,000 inhabitants to levels of safety commensurate with those of cities of less than 100,000.
The rapid succession of medical discoveries which began to accelerate after 1870 led to a proliferation of medical institutions whose innovative services were as important to urban living as the sanitary engineering of the former big-city era had been. The discoveries of bacteriology made possible the specific identification of an impressive list of common diseases like pneumonia, typhoid, tetanus, dysentery, whooping cough, tuberculosis, and numerous wound infections. Parallel discoveries in chemistry, pathology, and endocrinology allowed the effective intervention by doctors in a considerable number of both children's and adults' illnesses. Thanks to the new science, by 1900 the profession of medicine was rushing forward from its previous statistical observations and commonsensical nursing toward active intervention both in individual cases and in the urban environment. The new capabilities manifested themselves in traditional and novel forms alike: in the private practice of the single physician, in the totally refashioned institution of the voluntary hospital, in the expanded private and municipal public-health clinics and dispensaries, and in new regulatory programs.
For the majority of urban dwellers, the most obvious gift of the new science appeared in the augmented effectiveness of the neighborhood physician. In 1900, solo practitioners' training and equipment were still quite primitive. Nevertheless those who kept abreast of recent discoveries could carry a few efficacious drugs and vaccines in their black bags, by now knew enough pharmacology to avoid the destructive dosages of unspecific drugs, owned a small table-top laboratory where they could perform a few simple urine and blood tests, had a systematic method for examining patients to detect their symptoms, and possessed sufficient knowledge and technique of asepsis to treat minor injuries, deliver babies, and handle contagious disease without endangering their patients.
Furthermore, now that the scientific foundations of medicine had been established beyond cavil, it became possible to standardize the norms of competence and to extinguish the professional conflicts that had raged among believers in various causes of disease and different methods of therapy. The nineteenth-century hodgepodge of quacks and of physicians trained in commercial medical schools as well as in universities was soon placed under strict licensing and educational standards. By 1920, city dwellers who could pay a private doctor's fee could expect
a fair level of competence in the treatment of a considerable list of common diseases and injuries. This new effectiveness, achieved in the lifetime of one generation, raised the status of the family physician to a position of extraordinary popularity. The private practitioner became that legendary figure of healer, father, and family guardian which enabled the medical profession to defend itself against major reform in our own time.
In the late nineteenth and early twentieth centuries a wholly new kind of institution, the voluntary general hospital, served as the social agent of medical progress and the adjuvant, teacher, and disciplinarian of the private city physician. By ceasing to be merely the repository for the unfortunate and becoming instead the home of the new advanced practice and the servant of the middle class, it moved from the periphery to the center of medical care. In the years after the Civil War, everything about the hospital changed. The discovery and perfection of techniques of asepsis made it a reasonably safe place to go for treatment of serious illness or severe accidents. Surgery became reliable and effective. With the growth of science and the shift in hospital clientele, nursing ceased to be the resort of undesirables or a province of religious orders. It matured instead into a suitable lay occupation for those educated middle-class and working-class girls who were seeking independent roles in a society that had formerly offered little outside the factory, shop, school, or home. Finally, the hospital became the center of scientific progress. Only the large hospital could afford the expensive equipment and laboratories required for complicated techniques; only the large hospital could provide the variety of cases essential to research and medical education. These science-based changes in the hospital engendered a new fusion—the union of university medical schools, voluntary general hospitals, medical researchers and specialists, private practitioners, and their middle-class clients.
Such a fusion was a reflection of the social structure and economic power of the industrial metropolis. The fabulous fortunes of the late nineteenth and early twentieth centuries were represented on the governing boards of general hospitals and universities. The urban rich, as yet but minimally taxed for public programs, expressed their enthusiasm for
the new science, and often their gratitude for medical care, by endowing chairs of medicine and furnishing the capital for new universities, new hospitals, and numerous additions to existing facilities. On these boards representatives of wealthy families met with fashionable practitioners and distinguished specialists to determine the broad policies of medical schools, hospitals, and research. Here lay the source and direction of pre-World War I medical capital. The middle-class patient, too, made his contribution. Hospital fees began to be levied for service, and these fees provided a major fraction, or even the entire funding, of the day-to-day operation of the hospital. The working class and the poor, here as in the city at large, were subject to means tests to set the degrees of remission of their charges. Moreover, as charity cases they were subject to crowding, segregation in the wards and outpatient clinics, and a cheapening of service that paralleled their outside lives as residents of the city and as low-income consumers. Nevertheless, analogous to the rising living standards of the industrial metropolis itself, the new general hospital did give the poor access to an unprecedented level of medical care.
The economic formula of the new voluntary hospital, altered by the omission of expensive charity, research, and training by proprietary hospitals but emulated in the best municipal hospitals by the substitution of the city's funds for the wealthy donor, proved so successful that hospitals multiplied at exceptional rates in the late nineteenth and early twentieth centuries. In 1873 there were only 178 hospitals in the United States; by 1909 there were 4,400; by 1918, when the number of non-federal institutions peaked, there were 7,000. A totally unplanned growth, which was the outcome of the potentials of new science and the wealth of industrialization, had produced the social structure of the hospital-based medical profession that has proved to have serious consequences for our own time. In the early years, when research was first lifting the veil of ignorance and when new techniques and new hospital practices constituted such tremendous advances over what had prevailed, the gifts of the rich and the making of decisions by the wealthy and the professional elite seemed natural and beneficent. Yet the failure of the American medical structure to represent either the middle-class or low-income patient in what were in fact public policy decisions has badly distorted our medical undertakings.
Ever more expensive research, ever more elaborate techniques, and
the concentration on surgery at the expense of long-term care for the old were some of the most obvious results of the exclusive representation of wealthy donors and the medical elite. Environmental and preventive measures, chronic diseases, dentistry, the day-to-day rendering of service, and what might be termed throat-stick medicine have been relatively neglected. Moreover, with the rapid advance of science the solo practitioner became more and more closely allied to the hospital because it was the source of personal prestige and advanced knowledge and technique. As a result, neighborhood practice almost disappeared from poor districts, and the working class has come to be dependent upon the accident of location of hospital outpatient facilities. In 1920 the future of increasing medical specialization and the class, racial, and neighborhood consequences of the hospital structure of American medicine were only beginning to be perceived, but the seeds of our current difficulties had been sown.
Not that the industrial metropolis ignored the public-health possibilities of the new science; it pursued them vigorously, constructing the institutional framework and practices that still supplement our practitioners and hospitals. Indeed, the totality of environmental services and health-care institutions succeeded at last in bringing an end to the historic linkage of large cities and death. In New York, despite its gigantic size, and despite some of the most densely crowded wards in the world, the death rate had already by 1900 been brought below its lowest nineteenth-century levels, and it continued to fall in the big cities almost every year thereafter so that in our own time urban and rural death rates have at last converged. But at such a moment we face a familiar historical crisis—the institutions of the past fail to adapt to the needs of the present. As our health concerns have shifted from mortality to morbidity, we begin to experience all the failings of our old health-delivery structure.
The discoveries of bacteriology reinforced the nineteenth-century campaign of urban sanitarians. Water departments introduced filtration and chemical purification in the early years of the twentieth century. The identification of both human and animal tuberculosis bacilli led to the testing of herds and the certification and pasteurization of milk, which
had been a major source of child-killing infections. The precision of modern chemistry, coupled with the new large-scale marketing of meat, food products, and drugs, made it possible for the federal government to augment ineffective municipal market inspections by nationally enforced standards for purity in foods and drugs that moved in interstate commerce. The ability of the new science to explain how vaccination gave immunity even made it possible for cities and states to overcome hoary public prejudice and to institute safe, effective, compulsory smallpox vaccination for schoolchildren. Finally, by shifting the focus of attention in the campaigns to remedy the ills of slum housing, the new science brought the regulatory effort to its peak and logical stopping place.
The "fever nest" slum blocks, with their high incidence of cholera and typhus mapped by the early sanitarians, had spurred the public to establish municipal boards of health and to support their pioneering programs for the removal of nuisances and for cleanup and disinfection. In New York such expert reforms had been given impetus by the frightening experience of the Draft Riots of 1863, and the city enacted the nation's first tenement-house regulation. But when quarantine measures ended the plagues, the working-class and middle-class voters lost their fright and with it their enthusiasm for aggressive public-health measures. Housing reformers were forced to fall back more and more on appeals for public support on the grounds that overcrowding led to drink, crime, and prostitution, rather than urging a community of interest in safety from disease. The threads of moral horror and the community of health have always been intertwined in American housing proposals, and the early twentieth-century tuberculosis and well-baby campaigns did aid housing reform by contributing a set of causes for which there was broad popular experience and sympathy.
As a result of the intensification of the sanitary attack on slum housing, by World War I all of the nation's large cities had modern housing codes specifying permissible room density, ventilation, and sanitation. These laws were an important achievement in ensuring that all future construction would conform to decent minimum standards. But regulation of housing cannot by definition expand the supply of housing, and indeed it tends to raise rents when it is enforced. Also, it offers no
remedy for the common situation where poor tenants and poor landlords meet. Much slum property is owned by slum dwellers, not by rich corporations. Many slum owners scrape their way into a heavily mortgaged landlord status. Neither they nor their tenants welcome the news that costly repairs must be undertaken to bring their old buildings up to modern standards. The nineteenth- and early twentieth-century housing-regulation movement was after all the achievement of sanitary specialists and wealthy philanthropists, both of whom were unwilling to disturb the basic property and income relationships of the society. Therefore however well-meaning, and despite its long-term contributions, the regulatory movement often appeared in poor neighborhoods as an exercise in harassment of the poor, against which petty bribes and aid from the ward boss in inducing inspectors to wink at violations were the best defense. Housing reform shed some of its early philanthropic incubus thereafter, and in the thirties it picked up labor support when it was recast in the form of public construction and appeared as an aid to full employment for building-trades workers.
The wealth of the industrial metropolis and the efficacy of the new science also enabled the cities of the nation to establish a series of institutions that would offer specialized medical services to supplement the basic system of private practitioners and hospitals. Unfortunately for the public welfare, the urban health-delivery system was weakest in low-income areas, as housing regulations also had been, and it was in these areas that the incidence of disease and accidents rose most sharply.
Infants and children of the poor lived in the greatest danger. In the summer of 1893, Dr. Abraham Jacobi and the philanthropist Nathan Straus opened a milk station, where boiled milk and advice on infant care were offered free to mothers of slum children. The immediate success of the project in preventing deadly summer fevers led to imitation and the rapid maturing of municipally managed well-baby clinics. Soon clinics for tuberculosis and venereal disease were added to the public list. The new medicine encouraged the multiplication of dispensaries, both in the old form of the freestanding clinic located in a poor neighborhood and in the new form of the outpatient departments of public and voluntary general hospitals.
By 1920 New York City possessed 228 dispensaries and clinics. There were 60 baby-health stations; 21 tuberculosis clinics; 12 venereal-disease clinics, only two of which offered treatment; 26 municipal single-purpose clinics for treatment of eyes, teeth, rabies infection, and occupational hazards; 34 independent dispensaries; 65 outpatient departments of hospitals; four children's dental clinics in schools; and even six dispensaries for college students. A fourth of New York's eight thousand physicians put in some of their time staffing these institutions, at which approximately 1,250,000 patients were treated annually. Although these statistics of institutional growth, doctor participation, and patient use were impressive when held against the light of the preceding half century, the deficiencies of these supplements to private fee-paying doctor and hospital care had already revealed themselves. They were second-rate charitable supplements and as such were bound to atrophy in a society that honored self-help and responded most positively to fee-paying patients.
The entire list of clinics and dispensaries was not regarded by doctors as a group of institutions on their way to the provision of complete neighborhood care. Instead they were viewed as charities for the improvident and the unfortunate or as a restricted concourse of specialists who would not compete with the private doctor's general practice. The New York dispensary law required that all patients be subjected to a means test to determine medical indigency before treatment, granting exceptions only for a few contagious diseases, notably tuberculosis and venereal disease. Numerous studies were conducted, as in welfare today, to detect cheating by patients who could afford to pay the normal rates. The baby-health stations could minister only to well babies, while sick babies had to be taken to a general practitioner, dispensary, or outpatient department of a hospital. The public acquiesced in these constraints, and the network of supplementary institutions was used by the working class and the poor as their means of access to specialists and to doctors essential in cases of serious accidents or of sicknesses they could not neglect.
The whole charitable nature of these institutions prevented their maturing into adequate general-care centers. The doctors who staffed them were either ill-paid or, in the majority of cases, contributed their services. Dispensary and clinic work carried no prestige; such jobs were
either the doctor's tithe or were sought by young men hoping for an entrée to a regular hospital appointment by way of clinic duty. Although the clinics dealt with a public which suffered special hardship from the loss of working hours or days, only 2.5 percent of the total New York clinics' time was scheduled outside the normal business day. Patients had to wait in long lines. There was a shortage of supplementary personnel for the routing of patients, follow-up of cases, handling of records, and the offering of social services. Diagnosis was weak, records fragmentary and often illegible; tests were neglected and treatment haphazard. "Among cases of syphilis studied in only 50 percent was an indication found that the patient had been given the proper treatment," one study reported. Even the outpatient departments of general hospitals, where the latest equipment and laboratories at least existed within the same building, suffered because such facilities were planned and scheduled for resident patients. The outpatient service was the stepchild of the hospital, its trustees, its administrators, and its doctors. All in all, despite its lusty growth from 1870 to 1920, the system of clinic and dispensary was poor man's medicine.
The important public consequences of this charitable incubus lay in the withdrawal of popular support for all kinds of group and socialized medicine. As in the case of public housing, promising reforms directed to the social consequences of the unequal distribution of personal income went unsupported by working-class organizations because of their experience with services which departed from the normal private market form. In medicine the urban dispensary and clinic did not grow into a successful neighborhood or district institution, and the campaigns for health insurance faced opposition or apathy from organized labor. Similarly, the federal government's promising demonstration of non-charity public housing during World War I died as suddenly as it appeared. In both cases small groups of professionals and intellectuals had demonstrated that they had fully mastered the logic of the industrial metropolis's housing and health structures, but many more years of investigations, reports, and social failures would be required before major segments of the public would mobilize for change.
The extreme shortage of housing near war plants and navy yards forced a reluctant federal government into its first venture in civilian public housing. Prior to the war the housing-reform movement had been
102. Row House Yards, South and Iranistan Avenues, Bridgeport, Connecticut, 1919. The U.S. Housing Corporation employed the most advanced land-planning practices of its day. On an expensive 25-acre site, using the smallest two-bedroom units of any project, designers mixed a colonial American building style with contemporary English Garden City site planning to provide these generous garden spaces. National Archives
103. War Workers' Housing, Madison Street, Waterbury, Connecticut, 1919. English cottage version of contemporary suburban styles employed by the federal government in its first public housing venture. These duplexes, pairing five- and six-room units, were erected for skilled brass workers. National Archives
104. War Housing Twenty Years Later, off Lincoln Street, Bath, Maine, 1940. Because the sites were well planned and the architecture met the local consensus about what constituted decent housing, the U.S. Housing Corporation's work remained popular and aged well. Library of Congress
105. Country Club Plaza, Kansas City, ca. 1930. Since 1905 the J. C. Nichols Company has been managing the nation's only continuously planned residential development. The basic strategy has been to follow the city's growth by subdividing land in an ever-widening triangle. Houses and lots are sold, but the company keeps the shopping centers it builds. This is the first (1923), at the intown apex of the development triangle. Nearby apartments helped to get the shopping center started. J. C. Nichols Company
106. Grand Drive from 53rd Terrace, Country Club District, Kansas City, ca. 1914. Following contemporary examples of other upper-middle-class suburbs, especially Roland Park, Baltimore, the company gave special attention to site preparation and landscaping. Small parks, winding streets, and generous plantings are its hallmarks. With each subdivision Homes Associations are formed to maintain the local common grounds. J. C. Nichols Company
107. Belinder Avenue, Country Club District, Kansas City, 1963. Uniform setbacks of the houses, careful plantings and maintenance produce the epitome of the American residential street. Photograph taken twenty-five years after first development. J. C. Nichols Company
108. Parking Garage, 47th Street, Country Club Plaza, Kansas City, ca. 1948. By keeping title to the shopping centers the developers can control and
finance continuous modernization. Here a former parking lot was converted into a free parking structure for 400 cars. In contrast to the pains of urban renewal, the Country Club Plaza stands as a convincing argument for municipal ownership and management of the commercial and industrial land of the metropolis. J. C. Nichols Company
109. Homestead Country Club, Country Club District, Kansas City, 1954. Since full-amenity development to high standards can only be profitable for middle-to-upper-income families, the entire Country Club District is a city planning triumph but a social disaster for Kansas City. Only public financing and control of land and housing development could have prevented the inevitable side effects of class and racial segregation. J. C. Nichols Company
110. Public Housing, Holyoke, Massachusetts, 1941. Controls against giving too much to the poor reduced New Deal housing to levels below that of the World War I projects, thereby ensuring that the gap between the rewarded poor and the middle class would widen disastrously once the Depression ended. Nevertheless, in small cities across the nation where land costs were low and projects not too large, substantial gains were made over local slum conditions. Compare these new two-story row-house apartments to the four-story wooden tenements in the background. Library of Congress
111. Subsistence Homesteads, El Monte, Los Angeles, 1936. A Resettlement Administration demonstration project of farm homes for clerks and industrial workers employed in the city. The three-quarter-acre lots and locally designed five-room houses successfully captured a broad popular demand. Two thousand families applied initially for the planned 140 units. Houses were ultimately sold to their occupants without loss to the government because the project was sensibly located along one axis of metropolitan growth. Library of Congress
112. Ida B. Wells Housing Project, Pershing Road and Martin Luther King Drive, Chicago, 1942. Typical New Deal big-city federal housing project—barracks for 1,655 black families. White antipathy to public housing outside the established ghetto forced enlargement of the project in 1955 and 1961 so that it is now an all-black philanthropic city of 12,000 inhabitants. Library of Congress
113. Lakeview Terrace, Whiskey Island, Cleveland, ca. 1936. Local housing authorities, unwilling and unable to see housing as an opportunity to let the poor move to modern neighborhoods and closer to the growing sectors of the metropolitan economy, frequently repackaged the poor in old slum sites. Here, 620 families were settled in an industrially impacted neighborhood. Subsequently an interstate highway has further blighted one edge of the project. Urban Archives, Temple University
114. Lafayette Park, Detroit, ca. 1962. A 164-acre urban-renewal project one mile east of the downtown. Conceived first in 1949 as a public housing program, it was redesigned when urban-renewal legislation offered a tax-hungry city the opportunity to clear land and build for the wealthy. Low-income property on the site was leveled without adequate relocation measures, and luxury row houses and apartment towers were built. Completed 1971. U.S. Department of Housing and Urban Development
115. Prefabricated Housing Experiment, Akron, Ohio, 1971. Mistaking a social and economic problem for a technological one, the U.S. Department of Housing and Urban Development recently launched a high-publicity program for factory-made homes. Scarcity and costs of well-prepared land, class and racial segregation, and the financing of adequate social and educational services to housing, not the structures themselves, have been the real problem. U.S. Department of Housing and Urban Development
116. Scattered-Site Public Housing, Mount Clemens, Michigan, ca. 1964. Occasionally public housing meets the popular norms for decent living. Here 160 units for blacks, whites, and the elderly were mixed within a small metropolitan satellite city's urban-renewal program. Eight sites were scattered over 485 acres so as not to disturb the existing neighborhood fabric. Tenants paint their own apartments and keep up their own lawns and gardens. Problems so far: kitchens too small, not enough closet space! U.S. Department of Housing and Urban Development
fully occupied with building regulations, the brand-new controls of zoning, and experiments with philanthropic and limited-dividend model-tenement housing. Continental and British examples had begun to attract attention in advanced professional circles, and Massachusetts had tried a small experiment, but to most Americans governmental construction of houses seemed to represent a dangerous step toward socialism and a direct threat to the genius of the Republic. Despite successive reports of an inability to attract and hold skilled workers without the provision of some decent accommodation for their families, Congress delayed authorization of public war housing. The fact that such an undertaking resembled German socialism more than anything else made it doubly unpalatable to Congress. Yet skilled workers would not tolerate for long the boardinghouses, barracks, and made-over garages that unskilled men and women accepted. Therefore, five months before the Armistice, a public-housing program for skilled workers was at last authorized, subject to the strict condition that all housing so built be sold to private persons at the war's end. Two federal agencies, the Emergency Fleet Corporation and the U.S. Housing Corporation, undertook the rush task and together they built or subsidized the construction of more than fifteen thousand dwelling units at seventy-nine project sites across the country.
The final report of Frederick Law Olmsted, Jr., planner of Forest Hills and manager of the Town Planning Division of the U.S. Housing Corporation, is of exceptional interest because it demonstrates that professionals had, even at that early date, thoroughly understood the mechanisms and limitations of the private housing market and the remedies needed to maximize the social benefits of private construction. The report also shows that the U.S. Housing Corporation had demonstrated how a model public-housing program should be managed, had the government ever wished to move beyond the limitations of the private market.
First, the summary report recognized the basic trickle-down nature of American housing. New housing is built for the middle class and the upper levels of the working class, and all others inherit what is vacated by these. Thus the quality of housing in a given city depends directly upon its quantity. If there is a shortage of housing, Olmsted stated, then those least able to pay rents must double up and occupy unfit structures,
and the immediate result is "slum conditions unfavorable to that self-respecting family life upon which the security of our democracy rests."
Second, he recognized local housing conditions to be a national and not a local problem, because of the crucial role played by the national flows of mortgage capital. Thus during the years from 1914 to 1918 rising building costs and more lucrative opportunities for investment elsewhere had driven capital away from new construction, so that a housing shortage existed even prior to our entry into the war in 1917. As a permanent remedy for the inevitable periodic shortage of money for home construction, Olmsted recommended federal intervention in the capital-supply market along the lines that had recently been followed by the 1916 Farm Loan Act. Under this program the government lent money to local cooperative banks, and they in turn extended cheap long-term mortgages to farmers. In 1933, with the crisis of the Great Depression, the Home Owners Loan Corporation was created exactly along such lines. This measure, plus subsequent New Deal additions, established the basic American housing strategy: to encourage the private trickle-down housing market through government intervention and government support for the supply of mortgage funds.
Olmsted noted that the U.S. Housing Corporation had "dealt but little with the more difficult problem of satisfactory and economical housing for the families of unskilled and relatively low paid workers." Yet looking back on the pioneering work of this agency and assessing its accomplishment in the light of America's subsequent public-housing disasters, one can appreciate these World War I construction projects as model programs that defined the basic conditions under which any successful public-housing policy must proceed. The essence of the corporation's work lay in its adoption of a contemporary consensus for standard new housing. In designing for skilled workers who were engaged in a common patriotic enterprise, the corporation's program was not obliged to lower its standards to a level below that of private housing. It did not have to avoid offending the sensibilities of private tenants and homeowners by offering less than the equivalent housing to its recipients of public welfare. On the contrary, the central office in Washington set normal prevailing standards, called together mixed planning and architectural teams, and turned them loose to do as good a job as they could. The result varied from the ordinary to the excellent. Many projects used the latest traffic, curvilinear-street, park-reservation, and community-center devices of the best English Garden City and wealthy American garden-suburb practice. Taste was not regimented. There were Colonial and Tudor houses in the East, stone houses in Ohio, Spanish stucco in California, and neat wooden bungalows in the state of Washington. Moreover, by a strong emphasis on site planning these projects enjoyed the lasting advantage of having their utilities, streets, and services finished and located in a way that would enhance the long-term use and maintenance of the homes, instead of leaving newcomers stranded and struggling for city services, as had so often happened in low-cost outlying private developments and would occur later in public housing.
When the federal government did finally enter on public housing during the New Deal, it violated (except in its three controversial Greenbelt towns) the basic World War I conditions of success. Instead of building to the standard of middle-class private consensus it built second-class philanthropic housing. By so doing it drove off local architectural and planning talent, erected obsolete structures that would have to be lived in for fifty years, and stigmatized the beneficiaries as second-class citizens.
The health-insurance reformers of the pre-World War I era also displayed a competent perception of the shortcomings of the existing medical structure. The remedies they proposed, like those of the U.S. Housing Corporation group, were essentially conservative—designed to use government to make the private system more effective, not to institute a novel public organization. The health-insurance movement began in Europe and was carried to America by intellectuals, and accordingly it was never a campaign of the medical practitioners. It commenced with a concern for the maintenance of income for injured workers' families, and then as it gained momentum it moved on to proposals for insurance against everyday medical expenses. As the campaign progressed from legal and industrial reform to contact with medical practitioners and conflict with private insurance companies, it encountered a paranoid counterattack which defended the recently developed institutional structure of medicine as if it were the last bastion of American free enterprise and the most sacred of the nation's ancient traditions.
The first phase of reform, employers' liability laws and workmen's compensation insurance, advanced smoothly because it proved itself able to gather adherents from all the concerned parties successively—reformers, labor unions, industrialists, and insurance carriers. Under the old common law, each individual worker had been held to have assumed the risks of accident and disease inherent in his occupation. If he suffered injury or disability in the course of his employment, he had to pay his own expenses or else initiate a lawsuit to prove not only that he was without fault but that his employer had in fact been negligent in the operation of his business. The expense of these suits, the callousness of the age, and the imbalance of power between worker-plaintiff and boss-defendant made such a recourse uncertain and the awards niggardly.
Yet the accident and disease rates in such large industries as textiles, steel, glass, mining, chemicals, and railroads added scandalously to the local relief rolls. In 1885 Alabama enacted an employer liability law making manufacturers responsible for their employees' injuries. Other states followed, and they simultaneously established numerous commissions to investigate European schemes for insuring workers' wages against days lost because of occupation-related disease and injury. These studies uncovered a mutuality of private interest. Manufacturers, by devoting increased attention to guards on machinery, the handling of materials, dust control, and prevailing shop conditions, could increase the productivity of their crews. The incidence of accidents and disease proved calculable, and it was found that insurance companies could write policies at reasonable rates and workers could benefit by an administered schedule of payments for lost wages, injuries, dismemberment, and even death. Maryland passed the first workmen's compensation law in 1902, and by 1920 forty-two states had followed. Though a highly successful advance, workmen's compensation always suffered from the weaknesses of its original consensus. As an insurance program it never covered all workers and, since employer and carriers both had an interest in low and stable payments, compensation schedules in America have recompensed workers for only a small fraction of their real costs. Although provisions for medical charges were added to the original wage-based cash benefits, the restriction of the program to work-related health problems seems to have prevented practitioners from
perceiving this insurance scheme as a threat to private doctor-patient relationships. Hospitals did, however, receive direct payments from insurance companies for treatment of accident cases, and this innovation seems to have softened hospital administrators toward insurance schemes in general.
Again it was Europe that pioneered in payments for sickness and accidents not related to the job. Here the issue concerned income maintenance for workers' families when the wage earner could not work, payments for medical care, and funeral expenses. As early as 1883 Germany had begun contributory employer-employee local insurance funds, and in subsequent years the number of industries to undertake such coverage was steadily expanded. Great Britain followed a parallel course in enacting in 1911 a National Insurance Act, which established compulsory unemployment and health insurance. Workers were to receive some measure of protection against the inevitable occurrence of periodic unemployment as well as some assistance to defray the costs of health maintenance. A special feature of the British scheme was its accommodation of existing benefit associations and insurance companies. The government promulgated a list of approved insurance societies, and these were to receive the joint payments of workers, employers, and the government. Local boards of doctors, insurance representatives, and government officials were to oversee the payments.
Reformers in the United States were primarily academicians organized in the American Association for Labor Legislation (AALL). In 1914 the Association reported on its studies of European precedents and opened a campaign for medical insurance at the federal and state levels. The Association did not contemplate total unemployment compensation. The reformers hoped to insure industrial workers against the expenses of childbirth, accidents, sickness, and funerals so that the working class could become full-paying patients of the private medical-care system. As in Great Britain, government, employer, and employee would all contribute to funding the compulsory insurance pools. The AALL report also allowed self-employed persons not covered by the legislation to join such programs on a voluntary basis. Either state or private insurance carriers were envisaged as insurers, and the schedule of payments was to
be administered by employer-employee boards supervised by the government. The report also expressed the hope that such an insurance scheme would encourage physical examinations, early diagnosis, and other general preventive health practices, as well as finance the care of acute illness.
From 1916 to 1920, bills for federal investigation of health insurance and bills for state programs were put forward. The American Hospital Association and the three nursing associations reviewed the question and issued reports calling such a step inevitable and urging hospital administrators to be sure that the scheduled fees were adequate even as they prepared for increased case loads. The National Association of Manufacturers, pleased with its workmen's compensation experience and safety-first campaigns and impressed by German business practice, expressed itself at first as favorable and then moved to a position of supporting private insurance only. Organized labor offered weak support at best along with some opposition, with the A.F. of L. executive committee unable to agree on a position. President Samuel Gompers testified before Congress that such schemes would lead to federal spying on the homes of workingmen. The treasurer and a vice-president of the same federation testified in favor, as did the railroad conductors. The poor quality of medical service offered by practitioners working for British insurance funds created unfavorable publicity in America, but the medical insurance campaign here did not in any case call for group practice or any other alteration in the delivery of medical care. The low quality of American clinic practice may also have entered the minds of union leaders, since one of their major goals at the time was to achieve full equality of status for the American workingman.
But the violent opposition came from a coalition of private insurance companies and doctors. One insurance executive campaigned full-time against the legislation. Once again the war inflated the specter of German socialism. "When compulsory health insurance enters the United States, Socialism will have its feet upon the throat of the Nation," he said. In 1917, private insurance companies had written industrial policies covering in some way 37,500,000 workers. Conservative doctors in the American Medical Association repudiated the early stand of its leadership in favor of insurance and in a fit of wartime xenophobia voted the "do-gooders" out of office. Everywhere the state
bills were defeated, and the isolated reform intellectuals faced heated opposition from doctors, insurance companies, and even Christian Scientists.
The turning back of medical insurance proved more than a temporary setback for an idea whose time had not yet come. The campaign took place during the wave of reaction that swept the country during and immediately after World War I. In this climate the American Medical Association, an institution that had begun its organized life with an advanced survey of urban slum conditions, confused the conservative reform of insurance with public medicine and nailed itself to an intransigent defense of the institutional system of American medicine as it obtained in 1920. The Association's permanent resolution on state medicine, one just recently modified, forbade support of government programs of any kind except those already in existence for charity, mental health, communicable diseases, and military care. This legacy from the first half century of rapid progress has cost the nation dearly, the rural areas even more than the cities. It has cost the public and doctors alike. For years large numbers of American citizens have been denied access to decent medical services, and the medical profession itself has been denied the adjustments and steady evolution that would have attended the public insurance reforms. Instead of medical problems, our cities now face a medical crisis.