Who They Were and What They Did
Clinicians were the war's most visible psychological experts. Not only did their numbers exceed those of their policy-oriented counterparts, but their immediate clientele—literally millions of soldiers—eclipsed the relatively small group of war managers and policy-makers whose needs governed the path of experts with more conventional social scientific inclinations. One-third of the psychiatrists in the United States volunteered immediately to serve in the massive effort to screen every single one of the fifteen million recruits to the armed forces. But this amounted to a mere three thousand people, less than 2 percent of all U.S. doctors (who numbered around 180,000 in 1940) and less than 3 percent of military physicians.[3]
It was clear early on that a critical shortage of psychiatrists would hobble the effort unless a crash course in mental medicine were provided to the military's general medical personnel. Twenty-five psychiatrists constituted the military's entire psychiatric staff when the United States entered the war, but another twenty-four hundred medical officers were rapidly trained in the treatment of emotional disorders, along with a wide assortment of allied professionals, from clinical psychologists to social workers and nurses. One-quarter of the country's trained psychologists, to take only one example, served in the military by the end of the war years.[4] And the numbers of clinicians increased dramatically as a result of military requirements. In 1940 a bare 272 members of the American Psychological Association (less than 10 percent of the entire membership) had been employed in clinical capacities of any kind, and among these, very few were assigned major psychotherapeutic tasks.[5] By July 1945 seventeen hundred psychologists were working for the military, a significant number of them in clinical capacities.[6] War had offered many of them their first opportunities for clinical training and practice, persuading them that the field of individual treatment was the place to be in the future.
Because of psychiatry's medical origins, experts involved in clinical tasks appeared much less controversial at first than experts assigned to propaganda or intelligence operations, whose delicate tasks were almost
always shrouded in secrecy. Indeed, before the shocking results turned information about the military's mental state into top-secret data, clinical experts proudly broadcast their plans to mount screening and treatment programs in the name of humanitarianism as well as effective management. Properly supported and implemented, clinicians argued, their programs could increase military efficiency by selecting out individuals identified in advance as psychological drags on the war effort and by dealing quickly with cases of mental breakdown after the fact. While clinicians shared with the experts discussed in chapters 2 and 3 a commitment to advancing national security and the skillful conduct of war by the U.S. military, their historical reputations marked them as virtuous healers rather than skillful manipulators. Years spent caring for the sick and unfortunate had offered psychiatrists precious insights into the general human condition, argued Alan Gregg, director of the Rockefeller Foundation's Medical Science Division. "By showing us the common rules, the uniform limitations, and liberties all human beings live under because they are human, psychiatry gives us a sort of oneness-with-others, a kind of exquisite communion with all humanity, past, present, and future. It is a kind of scientific humanism that frees us from dogma and the tyranny of the mind, a relief from the inhuman straitjacket of rigid finality of thought."[7] According to William Menninger, who led the military's psychiatric effort during the war, Gregg's call to reveal the transcendent existential truths standing behind clinical experience was "a credo for every psychiatrist."[8]
Because of the memory that 69,394 men (around 2 percent of all those examined) had been rejected from the World War I military, the first priority was a screening program for inductees.[9] Robert Yerkes's notorious World War I intelligence testing program also came to mind as an unsettling reminder. Even though the testing had not been comprehensively administered, Yerkes had found a 50 percent rate of mental defectiveness among inductees, and 60 to 70 percent of the rest had demonstrated very low levels of intelligence: the average white, native-born soldier scored a mental age of thirteen.[10] This amounted to a virtual epidemic of feeblemindedness among the young men who were to be the country's first line of defense. In 1940 psychiatrists faulted their World War I counterparts for being insufficiently rigorous in their preemptive screening. They had relied too heavily on physical exams and symptoms, and psychiatrists were called in only on a referral basis, when some other military gatekeeper suspected the existence of a mental problem. When psychiatrists did have the chance to investigate, they
complained that military bureaucrats frequently ignored their recommendations and labeled clinicians "nutpickers" or "nutcrackers."[11] Psychiatrists accused the World War I armed forces of harboring attitudes toward their profession that were "colored by a mixture of prejudice and ignorance."[12]
Without a doubt, the World War II screening effort would have to be a substantial improvement, and early indications were positive. "The Selective Service System seems to be fully awake to the importance of psychiatric considerations," reported one professional committee with great satisfaction.[13] Designed and run by psychiatrist Harry Stack Sullivan, director of the prestigious William Alanson White Psychiatric Foundation, the program was incorporated into the 1940 Selective Service Act upon the express request of President Roosevelt, who was worried about the projected high costs of psychiatric hospitalization. The screening process itself entailed a series of four to five thorough psychiatric examinations, beginning at the local draft board level. Each exam was supposed to last fifteen to twenty minutes so as to avoid "the ridiculous business of staring at people for a moment and pulling a few out of the line for further study."[14] According to the plan, standardized interviews would elicit detailed information about registrants' family backgrounds and emotional profiles with such questions as "Do you suddenly get so mad you don't know what you're doing?"[15] All screening interviews would be conducted in private.
The screening program assumed psychiatrists' ability to identify "predisposed" individuals and thus predict mental trouble, two skills that would, experts claimed, save the government much time and expense. It was widely publicized that psychiatric services and disability payments to veterans had cost close to $1 billion between 1925 and 1940 and it was estimated that each psychiatric casualty during World War II would cost at least $30,000.[16] If only screening were properly implemented, "human values will be conserved; a great burden of unnecessary disability compensation payments, hospitalization expenses, and pensions will be avoided—and the prestige and effectiveness of psychiatry, greatly expanded."[17] Every physician working for a local draft board, after all, would necessarily come into enlightening contact with psychiatry, most of them for the first time. For physicians unfamiliar with psychiatric diagnoses, guidelines for questioning were provided, including strict instructions that problematic candidates be immediately referred to the psychiatric member of the nearest medical advisory board.[18]
Predisposition was a psychiatric concept with roots in nineteenth-century medicine. By 1940 large-scale socioeconomic events like the Depression had moved the concept away from a narrow, genetic meaning, and neo-Freudians, including Sullivan himself, were stressing the power of culture to shape and reshape human behavior. During the interwar period, a great deal of discussion revolved around ameliorating the detrimental social conditions—childhood delinquency, sexual perversion, unemployment, and so forth—that could enhance the biological predisposition of individuals to mental troubles.
As understanding of predisposition broadened, so too did psychiatric comprehension of the condition to which it pointed: mental illness. In contrast to the narrow criteria employed during the World War I effort, psychiatric disability was defined very broadly in World War II. At the inception of the draft in November 1940, Selective Service System Medical Circular No. 1 recommended summary disqualification of individuals displaying symptoms of any one of eight types of psychiatric handicap. Stupidity, serious personality disorders, substance abuse, and organic brain disease were four of the officially sanctioned grounds for psychiatric rejection. Physicians without psychiatric training were reminded that "these conditions are likely to escape notice unless one is particularly looking for them."[19] They were also told to immediately refer individuals exhibiting the following "deviations" to the nearest psychiatrist: "instability, seclusiveness, sulkiness, sluggishness, discontent, lonesomeness, depression, shyness, suspicion, overboisterousness, timidity, sleeplessness, lack of initiative and ambition, personal uncleanliness, stupidity, dullness, resentfulness to discipline, nocturnal incontinence, sleep walking, recognized queerness, suicidal tendencies either bona fide or not, and homosexual proclivities."[20] Draftees who expressed any discomfort at all about undressing in the presence of examiners were considered potentially unsuited to the conditions of military life and were therefore subject to disqualification. "Fatigue, increase in use of alcohol or tobacco, tendency to show increasing irritability, increase in profanity, decrease in neatness, being at odds with officers, and desire for transfer" were shortly added to the long list of offenses deemed worthy of discharge.[21]
Psychiatric screening did not live up to its architects' hopes. It probably could not have done so, given the breadth of the screening criteria and the drastic shortage of trained personnel. With millions of men flooding into the military, it was simply impossible to conduct the program as it had been designed, and one or two quick exams, lasting a
minute or two at most, were the rule. So overwhelming were the practical problems that entirely self-administered psychological tests, scored in a minute or less, were eventually designed for use with inductees and trainees.[22] Questions, too, varied from place to place, and time pressures often reduced what was supposed to be a serious probe to yes or no answers to questions such as "Do you think you had a happy childhood?" and "Do you wet your bed?" Results, too, were inconsistent. One psychiatrist might judge manic-depressive candidates eminently qualified for military service while another routinely rejected all who divulged vegetarian dietary habits.[23]
Frustrated by logistical hurdles, Harry Stack Sullivan quit his Selective Service post in 1942. Others also regarded the screening effort as "little more than a farce" and concluded that the constraints under which psychiatrists were operating were likely to impair their professional reputations as well as military effectiveness.[24] "Under such circumstances psychiatric screening was bound to be a hit-or-miss affair in which the hapless psychiatrist had to spice his knowledge and experience with large sprinklings of hunches and fortune-telling."[25]
Equally serious were the disagreements that surfaced among psychiatrists themselves about the recognizability of predisposition or the qualities necessary in a good soldier. Did the military's well-known sensitivity to signs of predisposition present unexpected opportunities to malingerers, who exploited psychiatric concern to avoid military service? Could the very aggressiveness that made mental patients unmanageable prove a distinct asset in combat? Was homosexuality, surely among the most common forms of perversion in men, really such a blight on military discipline, and did it, when discovered, merit automatic discharge and criminal prosecution? Such controversial questions were responses to wartime imperatives, but they also threatened psychiatrists' hard-earned authority to predict, not to mention treat, mental trouble.
The overall results of psychiatric screening and examination were both militarily alarming and publicly contentious. A total of 1,846,000 recruits were rejected from the armed forces for "neuropsychiatric" (NP) reasons, a full 12 percent of all recruits and a full 38 percent of all rejections. (No other justifications for military rejection approached NP deficiency; only "musculo-skeletal" and "eye, ear, nose, throat" came close with 17 and 10 percent, respectively.) An additional 550,000 or so men who survived their initial exam were eventually given NP discharges, a full 49 percent of all discharges for mental and physical defects. Of these, 386,600 were "honorable" medical discharges based on a range of diagnoses, especially "psychoneurosis." Another 163,000 were "dishonorable," administrative discharges for reasons including psychopathic personality, drug addiction, alcoholism, and homosexuality. The total number of individuals formally disqualified from military service because of psychological malfunction was 2.5 million, a number dramatic enough to provide convincing evidence that rampant emotional disturbance constituted a threat to national security.[26]
More detailed statistics were just as staggering. Of the casualties severe enough to require evacuation during the major U.S. campaign in the Pacific, at Guadalcanal in summer and fall 1942, 40 percent were psychiatric. In a six-month period in 1944, combat divisions in Europe experienced a psychiatric casualty rate of 26 percent; with intensive combat, this figure jumped to 75 percent. Resentment also materialized around the disproportionately high rejection rates of Native Americans (40 percent) and black Americans (53 percent). Leaders of these communities often accused psychiatrists of racial bias and demanded easier entrance into the military. Psychiatric discharges were also 10 percent higher in the Women's Army Corps than they were among male soldiers, but no protest about gender bias was mounted. Indeed, alarm over the potential masculinization of female recruits insulated disproportionately stringent psychological screening and discharge practices from criticism. Other citizens grew impatient with all the talk about neurosis. They were convinced, as some military officials were, that perfectly capable men were using the excuse of mild or nonexistent maladjustment to remain safe at home.[27]
By 1943 the military considered such attitudes serious enough to do two things: order a major study to calm mounting objections to psychiatric screening and censor information about rejection rates and the mental state of soldiers.[28] Most clinicians believed public opinion on matters of mental health and illness was dreadfully ignorant, and they admonished that too few men were being screened out of the military, rather than too many.[29] The backlash nevertheless forced them to rethink their role. Clinicians had mobilized for the patriotic purpose of assisting the U.S. military, only to find their good intentions and diligent work overshadowed by their exposure of mental problems in millions of ordinary men.
With such grim statistics and with the military's continuing need
for massive infusions of manpower, it is not surprising that the initial enthusiasm for avoiding mental troubles entirely by screening them out slid gradually into an emphasis on effectively treating men who showed signs of mental trouble. During the first two years of the war, psychiatric casualties had been summarily discharged; they were given a diagnosis but little else, because "the official point of view of the Army toward psychiatric illness was a mixture of fatalism and disinterest; treatment was discouraged."[30] By 1944 the army's Neuropsychiatric Consultants Division, headed by William Menninger (the first psychiatrist ever elevated to the rank of brigadier general), was downplaying the Selective Service emphasis on screening and lobbying to overturn the policy of therapeutic skepticism. Aggressive treatment programs, William Menninger argued, would allow psychiatry to display its powerful healing capabilities and shine up its tarnished image.
By March 1945 the practice of automatically discharging soldiers with NP diagnoses was terminated. Determined not to let the disappointments of the early war stand as setbacks, William Menninger pushed military clinical practices in directions ever more sensitive to social context, abandoning as unhelpful, or at least insufficient, the notion that individuals could be conclusively categorized as either predisposed to mental trouble or not. The war's progress had transformed mental troubles into transitory and relative phenomena, with a number of possible outcomes. At one extreme was descent into more or less permanent mental disturbance and incapacity of the variety familiar on the wards of state hospitals. At the other was return to normality. William Menninger suggested that, if caught early in the form of simple maladjustment, mild mental trouble would rarely lapse into severe mental illness. It was due to this belief—that prompt treatment would arrest deterioration and probably guarantee recovery—that psychotherapy came into its own.
Efforts therefore shifted from screening soldiers to educating vast numbers of military clinicians in up-to-date methods of psychiatric diagnosis and treatment. Trained psychiatrists worked in induction centers, basic training camps, and hundreds of general military hospitals at home and overseas; ten hospitals were devoted exclusively to NP casualties. Some were assigned to combat units. Most of the people who had direct contact with soldiers, however—48,000 medical officers and 872,000 nonmedical officers—had no previous psychiatric training. Consultants, whose job it was to spread psychiatric knowledge around
as liberally as possible, were in the vanguard of the treatment campaign, responsible for developing and maintaining high and consistent clinical standards throughout the military.
Personnel shortages gave psychiatrists the reason they needed to proselytize, which they did with missionary zeal. Here was an opportunity to place general psychiatric principles at the center of all medical education and practice and correct the woeful errors of doctors ignorant of psychological factors by introducing them, and impressionable medical students, to "the anatomy and the physiology of the personality."[31] Exasperated too that their fees lagged behind those of other physicians, many psychiatrists seized the opportunity war presented to raise the prestige of psychiatry within medicine. They agreed with Alan Gregg, one of psychiatry's biggest professional boosters, when he declared that it was high time for "radical change."[32]
[Psychiatry's task] derives in part from the incomprehension of all the rest of medicine which has gone so heavily technical and specialized that the psychiatrists are the only people left who are likely in many instances to insist upon a comprehensive view of the patient. . . . I come to the conclusion that unless psychiatry could be spread as a leaven in the lump of medicine and throwing most of its emphasis not upon madhouse material but upon the psycho-pathology of everyday life, psychoneuroses and behavior abnormalities, we would have to work in vain for any substantial improvement in the physician's comprehension of his patient.[33]
The army sponsored various efforts to shore up the numbers of military psychiatrists (including schools of military neuropsychiatry at Brooke General Hospital at Fort Sam Houston in Texas and at Lawson General Hospital in Atlanta) by offering intensive introductory courses.[34] But chronic shortages of qualified faculty brought pleas to private organizations to fund visits by civilians in order to improve the sophistication of military clinicians. The Rockefeller Foundation, which had allocated over $10 million of its medical research funds to psychiatry in the decade before the war, willingly shipped in a crew of "visiting firemen" to lecture on diagnostic procedures and demonstrate case conferences.[35] Gregg hoped they would convert their students to the messianic view that "the convergent rays of psychiatry, psychoanalysis and psychology now flood the conduct of man with light as it has never before been illuminated."[36]
The combination of advocates' enthusiasm and wartime necessity
succeeded in increasing the profession's status and numbers. As of 1944, psychiatry was accorded a division of its own in the army's Office of the Surgeon General, ranking on a par with surgery and medicine. By the end of the war, twenty-four hundred physicians were working as military psychiatrists, a number equal to the total membership of the American Psychiatric Association in 1940. A majority had no prewar psychiatric training.[37] William Menninger estimated that the military trained more psychiatrists in a few short years than all U.S. medical schools could have produced in a decade.[38]
Personnel shortages also temporarily curbed rivalries between psychiatrists and nonmedical clinicians who specialized in mental troubles, especially clinical psychologists. William Menninger was a tireless advocate for clinical teamwork. He adapted the innovative models tried before the war in his family's Topeka, Kansas, clinic (which would become a national hub of interdisciplinary training after the war) and agitated for resources with which to train clinical psychologists as well as psychiatric social workers and nurses.[39] Not content with a traditional division of labor that would have left psychologists in charge of testing, he encouraged them to participate in activities limited to psychiatrists before the war: diagnosis and even the practice of psychotherapy.
If the war generated a spirit of professional cooperation, the professions nevertheless remained unequal; psychiatrists were to supervise all others who ventured into the sacred territory of individual treatment. Because their subordinate position during World War I had produced much tension and little collaboration, psychologists resisted working within the medical corps under psychiatric authority. Psychological testing, of course, remained an important—and relatively autonomous—function assigned almost exclusively to psychologists. Psychologists recalled that, more than any other single activity, military testing had paved the way for professional advances in the past. Because it "brought psychology down from the clouds" during World War I and "transformed the 'science of trivialities' into the 'science of human engineering'" in the interwar period, psychologists in World War II persuaded the military early on to locate administrative responsibility for testing in the Army Adjutant General's Office, where it was insulated from psychiatric interference.[40]
Testing programs were intended to accomplish important administrative goals; their clinical value was secondary, at least at first. By war's end, nine million men, almost 15 percent of the country's entire male population, had taken military General Classification Tests.[41] Designed
as basic job placement tools and measures of trainability, these tests included exercises in reading comprehension, basic arithmetic, mechanical knowledge and aptitude, and so forth. Exercises in sentence completion included items such as:
Always __________________ the salute of those under you.
1. approve 2. seek 3. appreciate 4. watch 5. return
It was clear in 1942 that victory over Japan would be an _____________ victory indeed if it were coupled with a United Nations defeat in Europe at the hands of Germany.
1. important 2. appalling 3. empty 4. officious 5. indirect[42]
In 1944 alone, sixty million standardized tests were administered to twenty million individuals in the military for the purpose of efficiently sorting men into the two thousand occupational and training categories that existed in the military.[43] Military testers avoided using terms like "intelligence" and "IQ" to describe what they were doing (semantic choices such as these had drawn much controversial attention to their World War I predecessors) even though the results correlated neatly with educational background.[44]
It was exactly such unglamorous administrative personnel work that psychologists' applied roles in mass institutions before World War II had prepared them to do. Leaders of the personnel effort emphasized the significance of this brand of psychological management by calling testers "the working architects and builders of the modern Army" and their tests "war weapons, although the roar and bang of machinery is absent from the silent room in which they work."[45] But the tide was already moving away from classification and toward new areas of applied work in which tests figured prominently. The momentum of war itself swept psychologists away from administration and into the clinical picture in significant numbers, where personality inventories and projective tests became more common features of the therapeutic process, similarly valued for their time-saving attributes.[46]
The appearance of symptoms of mental trouble in countless soldiers and the serious problems these posed for the fighting efficiency of the U.S. military were the most compelling reasons why psychologists tried to make their tests promote individual healing as well as military efficiency and also took on new diagnostic and interviewing tasks previously monopolized by psychiatrists. Skyrocketing breakdown rates (NP admissions in the United States went from 31.2 per thousand per
year in January 1942 to 68.9 per thousand per year in August 1943) prompted the military to set up a training program in clinical psychology at Brooke General Hospital, alongside the School of Military Neuropsychiatry.[47] Five other training centers were envisioned but never materialized. There were not enough psychiatrists to serve as teachers.
In spite of logistical obstacles to their training, psychologists rose to the challenge before them. According to a 1944 report by Robert Sears (one of the authors of the important Frustration and Aggression), psychologists throughout the military were quietly taking case histories or even conducting psychotherapy, learning as they went, sometimes with little or no formal training.[48] One administrator in the Veterans Administration agreed. "This was therapy and it was called 'therapy'—recourse was rarely had to the euphemism 'counseling.'"[49] They did it because it was necessary at the time, but intelligent psychologists could certainly see that individual treatment was the wave of the future. In 1946 a survey of every psychologist and psychologist-in-training who had served in the military showed a striking movement toward clinical work during the war years. Hundreds of them had practiced psychotherapy for the first time and many intended to return to school for further training in this field.[50]
Blurring the division of labor between psychiatrists and clinical psychologists did more than permanently alter the balance of power between these two professions, although it did that too. It contributed to normalization, the dramatic shift in the subject and aims of clinical expertise. Before 1940 psychological testers worked to achieve the managerial goals of mass institutions like businesses or schools, performing the administrative tasks required in the interest of scientific management, educational progress, and operational efficiency. While most psychiatrists prior to the war worked in the institutional context of state hospitals, they believed firmly that their most profound loyalty was to individual patients and the alleviation of their mental troubles. Psychology's historical bond with reformist social science and psychiatry's origins in medicine undoubtedly had much to do with this difference in disciplinary identity.
Being drawn into diagnosis and treatment during the war made psychologists appreciate and identify with the ideal of personal mental health to a greater extent than they had in the past. This in turn helped them realize that administratively useful activities like testing could double as therapeutic aids; by the end of the war, projective personality
and other psychological tests were being utilized to encourage self-reflection in individuals as well as provide information to military policy-makers.
Psychiatrists, on the other hand, became more aware than ever that their roles as healers and guarantors of military efficiency might be at odds. A myriad of morale-related responsibilities and the expectation that they treat men who broke down in order to return them to duty made it clear to psychiatrists that their first duty was to the military institution—and not necessarily to the mental health of the soldiers in it. Psychiatrists, according to Harry Stack Sullivan, had to absorb the lesson that their role was similar to that of other wartime experts.
The Public expects a considerable human cost in war, and it hasn't much native sympathy for people who can't stand the gaff. It is the Army's business first and foremost to win the war. Considerations of needless human cost are relevant only to the extent that precaution will not hamper the war effort. Medical men are peculiarly obtuse to this. They simply have not learnt to put first things first. . . . The war calls on psychiatry to be practical. No one expects it to be perfect.[51]
Winning the war was the first priority. Humanitarian concerns were acceptable as long as they did not obstruct victory.