Preferred Citation: Modell, John. Into One's Own: From Youth to Adulthood in the United States, 1920–1975. Berkeley: University of California Press, 1989. http://ark.cdlib.org/ark:/13030/ft7h4nb4tz/



Into One's Own

From Youth to Adulthood in the United States 1920–1975

John Modell

UNIVERSITY OF CALIFORNIA PRESS
Berkeley · Los Angeles · Oxford
© 1989 The Regents of the University of California

For Judith




Acknowledgments

When one has worked as long and hard on an intellectual project as I have on this book, one builds up a lengthy list of people who have contributed to it. The extent of a list of "intellectual contributors" risks numbness in any readers it might have, and its close overlap with a list of one's friends and the difficulty of deciding at years' remove whether someone's contribution was to the book in particular or to its author's thinking in general produce a dilemma, whether one's inclination is to be inclusive or exclusive.

I have elected to acknowledge here only the most prominent, direct, and purposive contributions to the project, for I am sure that readers will realize that I am both thankful to and proud of the larger list of people I might have included.

Some of my debts are to institutions. The John Simon Guggenheim Foundation provided a fellowship during which this book "spun off" a related project. Among specialized libraries, the Institute for Sex Research, Indiana University, the Minnesota Historical Society, and the Social Welfare History Archives, University of Minnesota, were especially generous in their efforts on my behalf. Both of my home institution libraries, at the University of Minnesota and at Carnegie Mellon University, were supportive, assiduous, and resourceful: without their help, the book couldn't have been. Let the name of Erika Linke, who (coincidentally) departed my initial institution for my current one just when I did, and who has been exceptionally helpful at both, stand for that of a half-dozen librarians to whom I am indebted.

The History Departments at Minnesota and at Carnegie Mellon were both wonderful places for me to work and think. The Minnesota Family Study Center, University of Minnesota, was too. These were daily settings and always mattered a great deal to me, but so, also, in different ways, did the occasional and varying institutional contexts provided to me by the Committee on Child Development Research and Public Policy of the National Research Council (National Academy of Sciences), the Social Science Research Council, and the Social Science History Association. To cite these institutions is in part shorthand for listing the many individuals I've met thanks to them.

Tanya Rogers, for two years my secretary, was the best person at any job that I've ever seen. For this book, she typed and carried out some data entry. I thank her.

Copy editor Sheila Berg has done all that she can to make me sound less like a second-rate Tobias Smollett and more like a writer from my own century. It's not welcome help, exactly, but thanks are due.

Stanley Holwitz of the University of California Press, my editor, has offered patient, shrewd, and encouraging advice to me over quite a long time. That he's been a fan of the idea of this book has mattered to me.

Machine-readable data would have served me hardly at all had I not had the intelligent assistance, in sequence, of Phil Voxland, Director of the Social Sciences Research Facilities Center, University of Minnesota, and John Stuckey, then Director of Computing at the College of Humanities and Social Sciences, Carnegie Mellon University.

In the process of collaboration on closely-related projects, I was taught much of what I know by John Campbell, Glen H. Elder, Jr., Frank F. Furstenberg, Jr., Tamara K. Hareven, Theodore Hershberg, the late Reuben Hill, Duane Steffey, and Douglas Strong.

The manuscript was read critically and indispensably in its final form by Glen Elder, Doug Gower, Carol Z. Stearns, Peter Stearns, and Viviana Zelizer. This was an enormous chore, and I appreciate it greatly.

DEPARTMENT OF HISTORY
CARNEGIE MELLON UNIVERSITY
PITTSBURGH, PA 15213
AUGUST 20, 1988



1—
Defining One's Own

Sylvia's Choice

At sixteen, Sylvia rejected the hand of the boy next door. Her mother made the decision easy, reminding her that she was too young to know what love was. When Sylvia was eighteen, her mother prompted her to reject the suit of a poor but ambitious young man with whom Sylvia thought she really was in love. At nineteen, Sylvia considered but finally rejected the proposal of a young man of wealth and family, whom she did not love. "When Sylvia was twenty-five she was much lovelier than she had been at nineteen. At least, so her mother said. . . . Somehow, the men she met [now] were not so eager for matrimony. Most of them were earning smallish incomes, most of them had someone dependent upon them, most of them, when they did consider marriage, looked for a girl who had some earning power."[1] For a period, Sylvia rejected the logic of this proposition but eventually acceded. Her rebellion was episodic and individual.

This story in The Ladies' Home Journal in 1941 presenting the workings of the marriage market was typical of the genre that was a staple of the monthly woman's fiction mill. Marriage at some age, Americans held and still hold, is clearly too young; love at sixteen is either impossible or empirically unrecognizable. The winnowing process of courtship, however, rapidly reduces the pool of eligibles to those with special demands or disqualifications. The corrosion of age on woman's physical allure begins its cruel work; and the great, if lessening, social disadvantages of the single female allow even the bachelor dregs to demand not only beauty but economic resources. In this account, the events leading to marriage are presented as essentially a learning process. The literary token of the accomplishment of this process was a recognizable expression of true love, and marriage was the melodramatic climax or humorous resolution toward which the action tended. The protagonist's uncertainty about marriage was followed by a declaration of intention to marry, after a learning process in which both sexuality and some kind of nonsexual "rightness" were discovered to unite the couple. This learning process was formally analogous to the "search" phase of the "marriage market" as abstracted by economistic model builders.[2]

In 1941, Sylvia knew that her mother knew the rules of the game all too well. But in more recent decades, the path has become obscured—indeed, contested—in many of its particulars. Most obviously, it has become an embarrassment to present marriage itself as a happy ending, not so much because marriage is not a happy event but because so often it is no longer an ending. The impact of divorce and serial marriage on parenthood, on children, indeed on the kinship system as a whole, is under wide debate today.[3] The search for "the" husband in women's fiction today has dissolved into a variety of quests with less-determinate patterns: for physical gratification, for love, for self, for security, for "fulfillment." These may take longer to find; and both men and women may gain the capacity to contribute to them only slowly and, indeed, may develop them only rather late.

At the same time, entry into marriage in American society, no less than earlier in the century, is still said to depend on love, which in our culture is understood to be spontaneous. But love ordinarily has an explicitly age-graded aspect: "puppy" love is different from "mature" love. "If you've never been kissed, you've never been ardently loved, before you are twenty-six, then beware! Love, at eighteen may be just a lark, a game, but at twenty-six, the starved senses, suddenly aroused, whirl with a giddiness that blinds clear thinking."[4] A second culturally defined dimension of marital love, roughly distinguishing "fleshly" from what might be called "obligational" love, has also usually been thought to be influenced by the chronological ages of the lovers and their ages relative to one another.[5] Thus have age norms of marriage been intertwined, as in Sylvia's case, with the ways people are supposed to feel toward each other and the forms these feelings are encouraged to take.

Two decades after the exposition of the conventions by which Sylvia finally learned to live, two best-selling books roundly condemned contemporary patterns of early marriage as a special bane to American middle-class women. Debate over the shape of the way young people should approach marriage had moved from the personal to the political. Today we view Helen Gurley Brown's Sex and the Single Girl as a period piece and honor Betty Friedan's The Feminine Mystique as the opening (or reopening) gun in a heroic battle to realign the genders. Both books offered arresting arguments that women's personal fulfillment was sabotaged by early pursuit of marriage and parenthood. But their prescriptions differed radically.[6]

I think a single woman's biggest problem is coping with the people who are trying to marry her off! . . . Finding him is all she can think about or talk about when . . . her years as a single woman can be too rewarding to rush out of. . . . I think marriage is insurance for the worst years of your life. During your best years you don't need a husband. You do need a man of course every step of the way, and they are often cheaper emotionally and a lot more fun by the dozen.[7]

The problem that has no name—which is simply the fact that American women are kept from growing to their full human capacities—is taking a far greater toll on the physical and mental health of our country than any known disease. . . . If we continue to produce millions of young mothers who stop their growth and education short of identity. . . . we are committing, quite simply, genocide, starting with the mass burial of American women. . . . We need a drastic reshaping of the cultural image of femininity to reach maturity, identity, completeness of self, without conflict with sexual fulfillment, . . . to stop the early-marriage movement, stop girls from growing up wanting to be 'just a housewife.'[8]

At the time these tracts appeared, the age at which women were marrying had already been moving upward for half a decade. What is important is not demographic precision, however, but the passion with which the authors spoke to and as women, yet from startlingly different perspectives and with such contrasting tone: one recalling the coyness of such Hollywood confections as the Tony Curtis-Natalie Wood Sex and the Single Girl, the other foreshadowing changes we even now are assimilating. Women who faced the world quite differently sensed that there was something wrong with young women's life course and that as women they had a stake in rectifying it.

Brown took on herself the major task of promoting an open and enthusiastic recognition of female sexuality, so that in its various guises it is seen as suffusing the life of the "mature" single woman. "Theoretically a 'nice' single woman has no sex life," she remarks. "What nonsense! She has a better sex life than most of her married friends. . . . Since for a female getting there is at least half the fun, a single woman has reason to prize the luxury of taking long, gossamer, attenuated, pulsating trips before finally arriving in bed. A married woman and her husband have precious little time and energy for romance."[9] But Brown's transvaluation is accomplished by promoting the single-girl phase as a period of almost single-minded focus on fun with men, however varied, exquisite, and (but for that last time when the right man comes along) transitory. "Liking men is sexy. It is by and large just about the sexiest thing you can do. . . . And there is quite a lot more to it than simply wagging your tail every time a man pats you on the head. You must wag your tail, of course, . . . but there are about five thousand more aggressive ways to demonstrate liking. . . . You must spend time plotting how to make him happier. Not just him . . . them!"[10] Sexiness, practically, inheres in plotting, luring, tempting, challenging, especially at the workplace, and, above all, enjoying men. All this is a learned skill, one substantial enough to rightly command a longish time in its practice, a period brought to an end only by a marriage on the terms that Brown understands it.

For Friedan, this kind of sexual triumph, on the ideological level, is no solution; it is part of the problem. Once, she argues, women did have to liberate their sexuality from the pedestal, but postwar gender ideology had already changed this before she (or Brown) wrote. "The split in the new image opens a different fissure—the feminine woman, whose goodness includes the desires of the flesh, and the career woman, whose evil includes every desire of the separate self."[11] And this separate self is exactly what Friedan believed women deserved as their birthright, and needed for their mental health—contradicting the popular psychology of the day that (like Brown) saw women's problem in sexual neuroses. Education and then employment would save women, not lustier, more extensive courtship habits.

Contemporary married life seemed dreary to both Friedan and Brown, and both believed that it would be far less dreary if it were entered into later. The postponement of marriage had for both authors the secondary advantage of superior choice of mate, and the primary advantage of prior "fulfillment" for the woman. "Those who glom on to men so that they can collapse with relief, spend the rest of their days shining up their status symbol and figure they never have to reach, stretch, learn, grow, face dragons or make a living again are the ones to be pitied. They, in my opinion, are the unfulfilled ones." (And this is Brown speaking.)[12] The two authors each sought to revise the life course of American women, in the belief that the content and value of marriage (and, explicitly in both cases, parenthood) are in part determined by the courses women took on the way there.

Both authors would extend schooling, the extent of which both saw as far too subject to foreshortening by women in the interest of early marriage. Both would make the occupational life a far lengthier and less casual part of women's life courses, although Friedan advised that women seek vocation in the classic sense, while Brown advised frequent job changes or at best a shallow careerism to facilitate the pursuit of fun with men. And they agreed that sexuality must be recognized and accepted outside of marriage, lest it drive toward one that was poorly timed. While the prophetic quality of these prescriptions may (in hindsight) be more a matter of simple observation, the fact is that the scenarios Friedan and Brown proposed do describe in many ways how life was to change in the next decade and a half.

How people grow up—the life course—has been a subject for debate through much of our century. The debate, however, more commonly addresses directly the content of phases of the life course rather than their proper timing or sequencing. Recent debate on marriage provides a case in point.

How to Marry

Typically, pre-World War II fictions played with the age norms of marriage by setting youthfully eager wishes off against essentially external hindrances, which delayed marriage. The culmination of true love was postponed because, while the flesh was eager, the economy often made it impossible for couples to fulfill with sufficient certainty the obligations of true love. In the prosperous postwar period, however, the willing flesh of the enamored arrived at marriage (younger) after conquering not external hindrances but the actors' own doubts and confusions, characteristically placed within the sexual realm. A typical didactic fiction in a 1957 issue of True Love Stories, "Engagement Jitters," provides a case in point.

Diane Glazer had met Raymond Tappan eighteen months before. Their courtship was in no way unusual; as their interest in one another grew, so did the number of their dates. They'd been going steady a little over a year when Ray asked Diane to marry him. He was twenty, his military training was behind him, his future as a clerk in the post office promising. Diane had suffered no doubts when Ray proposed. She loved him, he loved her; what could be simpler? Of course she'd marry him! In six months, a June wedding? Of course! . . . But as the date of the wedding grew nearer, Diane found some of her excitement dying down. . . . Before, when Ray had kissed her, she'd always had to fight her raging emotions. Now sometimes, she wanted to run when he drew her into his arms. Oh sure, they'd talked frankly about sex. . . . but talking and doing were two different things! And the doing part was only a few weeks away.[13]

Correspondingly, when Hannah Stone and Abraham Stone added a new section on ideal marriage age to their virtual catechism on health in marriage in the "completely revised" postwar edition of their well-known Marriage Manual (1937), their prescription moved the offset against premature lust from the external realm to the internal. "The best age for marriage is the age at which emotional and social maturity is attained. In general . . . the early twenties are the best years for marriage." But "the extent of a person's maturity in thinking and behavior" outranked both "chronological age" and "the economic situation" in indicating when to marry.[14]

In retrospect, we are hard put to determine whether the ideal marriage age had shifted downward because people grew up emotionally quicker, or vice versa, or whether the removal of material hindrances to marriage allowed many people to marry younger, encouraging a simultaneous change in the age people considered best for marriage and the way people at a given age felt about themselves. In fact, one cannot say in the abstract, for material circumstances, values, feelings, and institutional arrangements are all thoroughly intertwined. Transitions like marriage often demand a certain material wherewithal, and under some conditions, changes in material circumstances may be granted a certain primacy, on the assumption of institutional constancy, as in the matter of parental underwriting of marriage. But just as this volume will discern changes in the material environment, it will also show institutional changes as well as normative and even emotional ones. My purpose is not to disentangle cause so much as it is to portray in some richness the way in which the push "into one's own" was repeatedly revised over a half-century.

A series of life course transitions, including marriage, similarly freighted and indeed interrelated, is the subject of this book. Sequentially, marriage is at the center of the events I will explore; it is preceded by the inception of what roughly can be called dating and by the initiation of sexual intimacy.[15] As courtship "leads to" marriage, so marriage "leads to" parenthood, the fourth transition treated here. (First marriage is by no means ultimate marriage since the 1960s; thus, my account of "family-building" also treats divorce.)

The path into one's own is somewhat vaguely bordered, but it is no less bordered for that fact. In twentieth-century America, for instance, as elsewhere and at other times, powerful social forms have gathered about life course transitions which are distinctly but not precisely prescriptive in content.[16] The most obvious is the wedding, a ritual that in twentieth-century America has always seemed somehow anachronistic, but which, as "tradition," has always seemed to renew itself. Bride's Magazine's 1973 revision of its Bride's Book of Etiquette instructs readers that "most wedding customs evolved from a wish to symbolize all the good things the union meant to the couple and the community. . . . Those that continue to symbolize the same good intentions . . . will flourish. . . . Other, older traditions are gradually outgrown and eventually abandoned. . . . Do look over some of these time-honored customs and choose those that appeal to you and your families' sentiments."[17] The wedding is the particular ritual whose form symbolizes compliance with widely held values, including those regarding appropriate timing.

Religious weddings, especially large church weddings, constitute in the contemporary American context a form of communal ritual oversight of the marriage.[18] Couples marrying in religious ceremonies have been markedly more concentrated in the modal age-at-marriage categories than those marrying in civil ceremonies. This pattern, if anything, intensified over time between 1961 and 1974.[19] Reeves's data on marriages in New Haven indicate a marked trend toward a somewhat enlarged proportion of civil marriages among all marriages from 1870 to 1940 but rising only to about 18 percent.[20] Long-term annual observations for the city of Philadelphia indicate a slow, gradual increase in secular weddings from an initial figure of about 2 percent around the turn of the century to a peak of around 8 percent in the early 1920s, followed by another decline, to about 5 percent in 1937, at the end of the series.[21] National data for 1939, 1940, and 1948 show that at this time about a quarter of marriages were civil.[22] When the national vital registration system began to regularly record type of ceremony in 1960, the proportion of civil marriages was slightly lower than this. The trend since then has been a very gradual increase.[23] The data, taken together, indicate that during the twentieth century, there have been modest changes in fashion in type of ceremony, but nothing more than this. In view of the dramatic changes in the timing, structuring, and terminability of marriages during this period, the stability in the ritual is remarkable. As a passage from one stage of life to another—although both stages may have developed new content—marriage continued to matter, to the community as to the bride and groom. Continuity in ritual provided resources that in part offset circumstantial changes in the way young people came into their own.

Obtaining systematic information on weddings themselves over time requires a certain resourcefulness. Newspaper reports of weddings offer such an insight into trends in ritual surrounding the entry to married life. Wedding notices are stylized, their contents partly editorial whim and partly the preference of the family member who reports the wedding to the paper, so we read not so much a report on what happened as an account of what should have happened.[24] But exactly this quality is what interests us about the wedding as a ritual, and from this perspective, stylized stories in local newspapers serve nobly. To respond to the indeterminate but not improbable inclusiveness bias of wedding notices, and especially the possibility that this has changed over time, I have drawn clusters of wedding stories from consecutive late spring and early summer issues from 1925 to 1975 of three newspapers from places varying in population size, on the grounds that the smaller the town, the likelier the incorporation of people of lesser means. The three newspapers, all from Minnesota, included a metropolitan but localistic newspaper, the St. Paul Pioneer Press, a small-town daily, the Albert Lea Times, and a small-town weekly, the Thief River Falls Tribune.[25]

As though in response to the plasticity of marriage timing—a central theme of this book—there has been a distinct secular trend toward increasing elaboration of the rituals surrounding these events. One of the more prominent aspects of this trend has been a growing emphasis (in the wedding notices) on large ceremonies. To accommodate expanded attendance, weddings have been shifted from weekdays (6 in 10 in the 1920s in all three towns) to weekends (8 in 10 by the 1970s). Newspaper accounts more and more have included the names and origins of wedding guests from beyond the vicinity. By 1957, they often listed "honored guests" from afar.

Concurrently, the number of named offices in the wedding expanded markedly, as formalities became more elaborate. And the reception has emerged as a central part of the wedding story. "A wedding is a solemn ceremony and the reception that follows should be joyous. It's traditional to gather friends and relatives to celebrate the happy day."[26] Table 1 details two aspects of the reception trend. The first shows that gradually (in each newspaper) the reception became an obligatory part of the story, and thus, putatively, of a ritually complete wedding. At the same time, the reception moved from the home of the bridal family to church parlors in the small towns and to private clubs or restaurants in St. Paul. The reception's rise points to the secularization of the wedding ritual (even as the proportion of religious ceremonies has remained roughly stable) and its increasingly public orientation. Surely, this shift does not bespeak a lessening of social oversight over the marriage but a shift—or, more properly, a broadening—in its focus. Indeed, nuptial couples now were twice on inspection, twice required to be grave, then joyous and sociable before they left on their honeymoons, symbolic of their separateness.

Table 1. Wedding Receptions in Minnesota Newspapers

                                Where Place of Reception Is Mentioned,
                                Proportion Held in (in percentages)
                  % with
                  Reception     Private    Hall, Club,    Church-Related    Number of
                  Mentioned     Home       Restaurant     Building          Articles

St. Paul
  1925–26             72           90           10                0             105
  1932–33             78           66           25                9              58
  1941                80           43           47               11              61
  1946                79           33           35               33              58
  1957                92           20           58               22              60
  1969                92            6           66               28              57

Albert Lea
  1925–26             78           95            5                0              27
  1932–33             63           61           28               11              30
  1941                86           63           22               14              58
  1946–47             88           27           28               47              60
  1957                98           10           16               75              59
  1975                96            9           24               67              50

Thief River Falls
  1922–25             79           96            4                0              33
  1940–41             87           67           18               15              39
  1946                93           70           11               19              30
  1957                97           15           21               64              35
  1973               100            6           24               70              54

SOURCE: See discussion in text.

The trend of officiants within the formal portion of the wedding has been distinctly toward the masculine, a tendency tied to the move toward elaborate weddings, and also their increasingly public orientation. A marked rise in the prevalence and number of ushers represents the most striking instance. Ushers, typically, were of the generation of the couple who saw fit to affirm the value of an orderly passage into marriage. In Albert Lea, for example, wedding stories in 1925 mentioned an average of three wedding officiants but in the period from 1973 to 1975, no fewer than nine. Girls and women, however, have increasingly filled a burgeoning, imaginative list of reception roles—"coffee pourer" and the like—that scarcely ever were assigned to men. In the case of both wedding and reception officiants, nonrelatives have gained in numbers more quickly than have relatives, but there have also been growing numbers of relatives in official roles. When we categorize relatives with stated wedding roles according to whether they are relatives of the bride or the groom, we find consistently heavier participation from the brides' side. The wedding in American culture—no less so in 1975 than in 1925—was to be arranged by the bride's side of the family, as etiquette books insisted. As ever, the bride had more at stake, and she accordingly convened more of her kin. But at the same time, the community's ritual stake had seemingly grown.

Although the rather steady trends in ritual oversight do not correspond to the ups and downs in the fragility of marriages, at least as registered by divorce rates, it certainly is plausible that a growing awareness of the voluntary nature of both entrance into and departure from the married state occasioned the evidence of enlarged ritual communal concern.


12

Defining Sylvia's Choices

The timing of transitions in lives is individually determined in our society—to a degree, probably increasingly so (as in the case of marriage), but not entirely. At the same time, the location of the transition point along the life course is socially recognized, monitored, and sanctioned, although the timing of some transitions is obviously more strongly sanctioned than that of others. When I was of an age to protest such matters, a popular song lamented that "they tried to tell us we're too young, too young to really be in love." "They" cared not because they believed that young people's emotions were of real concern to them but because "in love" has been a significant marker in twentieth-century American lives, with attendant rights and privileges, with consequences for related and subsequent action.[27]

Analogously, if less sublimely, the licensing of automobile drivers, which at two distinct points in the course of life has been a matter of concern to me, is and has long been an age-graded, gender-differentiated, societally sanctioned phenomenon. It is also a phenomenon with an unremarked recent history that is indicative of the ways the life course may change. For boys initially and increasingly for girls, the capacity to drive virtually defined a life course stage. That is, driving was not simply a privilege with obvious utility but also definitive of a stage in one's life, although admittedly one without a particular name attached to it. Excellent national data since World War II on drivers' licenses by age[28] reveal that the growing availability of automobiles encouraged more and more boys, younger and younger, to take out licenses. The steepest increase was at age fifteen to seventeen, when most American boys in the late 1940s became licensed drivers. For girls, the age-grading pattern was always less steep, taking more years for an entire cohort of girls to become drivers. But over time, girls have increasingly approached boys' pace of transition to licenseship, the convergence occurring initially at the older teen ages and more recently at the younger ages. Over time, the steepness of age-grading for girls has come to approach that of boys. This narrowed the age span during which in any boy-girl couple the boy alone would possess this legal, practical, and symbolic competence, a point of some symbolic and perhaps practical consequence for gender relationships.

There is more to the story than adolescents' own choices, as in the case of many of life's highly freighted transitional moments. For adults controlled the governments that licensed drivers, and their response to adolescents' increased material resources was symptomatic of the often quiet debate over the nature of the adolescent years that has been carried on in twentieth-century America. In the early 1960s, there was a broad movement to limit the freedom of children to drive by raising the legal age for licensing. Some states came to offer two age-graded licenses, a full and an aptly named "junior" license. By the late 1960s, at just about the point when adolescent boys were about as completely licensed as they would become, adults relented and began to add their full normative sanction to an early transition to driverhood. Often, adults now inserted the completion of school-sponsored drivers' training courses as an intermediate stage of adult-organized socialization to the road.[29]

In American society, as in most societies, although with varying emphasis, age is an important social marker. Yet age (even in combination with gender) is not ordinarily—in our society—a status to which in and of itself particular rights and privileges are due, certainly after early childhood and before retirement. Rather, chronological age provides the most important single cue for a series of transitions that mark the departure from a prior status or relationship to a major social institution and the entry into a subsequent status or relationship. Two major American institutions affecting young adults, formal education and the armed forces, are explicitly age stratified. Many occupations build age increments of income into the normal careers they imply.[30] The paths through life have been, accordingly, marked by traditions, entered into by individuals attendant on more or less clear cues and sanctions.

On the one hand, on-time transitions are, as a matter of course, culturally prepared, cushioned by anticipatory socialization and by supportive institutional arrangements.[31] On the other hand, and correspondingly, individuals moving too slowly or too quickly through a particular transition are often admonished, where they are not restrained by administrative regulations or by positive law itself: a too-early retiree will receive no Social Security benefits for some years; youths seeking to marry too young may be told by the state to get their parents' permission, or they may not be allowed to marry even with their parents' consent; school "dropouts" are so stigmatized that they will feel they have failed to complete an expected transition rather than having simply chosen to spend those years at work instead of in school. The violation of these norms may be quite powerfully sanctioned. School dropout offers an example of a norm for which strong sanctioning has developed recently and rapidly. In 1964, 9 percent of white male high school graduates ages 16 to 24 not in college were unemployed, compared to 14 percent of like high school dropouts. Among blacks, the comparable figures were 19 percent and 18 percent, respectively. But by 1976, this "price" of dropping out had risen from 5 percent for whites and −1 percent for blacks to 11 percent for whites (9 percent vs. 20 percent) and 10 percent for blacks (22 percent vs. 32 percent).[32]

The life course perspective holds that while biographical sequences are not by any means wholly determinate, they are determined to a degree, and in two senses. First, the steps one has already taken make more probable particular future outcomes: if I marry at 21, I am more likely to have a child by 25 than if I marry at 23. Second, both the timing and the sequencing of important life events are to a degree socially determined, whether structurally, normatively, or both: if married men, or fathers, are deferred from military service in their early twenties, and military service is a life stage neither greatly honored nor highly rewarded, there will be added incentive to marry at 21 rather than at 25.[33] The life course perspective argues that the determinate elements of these patterns constitute objective "social facts" and, no less, that individuals live and experience their own biographies as aware actors, who do not merely receive these patterns as in the nature of things, but construct and evaluate them as they move along, looking both forward and back. Culture, in this view, although both a set of symbols and a structure of belief and thus not equal to the sum of individual outlooks, is in substantial measure responsive to this sum.

Under current assumptions, conformity with the social and cultural cues promoting timely movement through the life course is expected to be directly satisfying to the actor. When, on balance, this seemingly does not happen—as has been documented commonly happens when married couples first become parents—troubled commentary is heard. In 1926, Margaret Sanger expressed the conventional understanding of the motivation to parenthood among happily married wives in terms of a "maternal desire . . . intensified and matured, . . . the road by which she travels onward toward completely rounded self-development, . . . the unfolding and realization of her higher nature."[34] A decade later, however, Lewis Terman was embarrassed to report on the basis of his extensive empirical investigation, Psychological Factors in Marital Happiness, that "the widespread belief" that Sanger and others reflected was not on the average borne out by the facts. Nevertheless, it was "reasonable to suppose that the presence of children is capable of affecting the happiness of a given marriage in either direction."[35]

Another decade later, Evelyn Millis Duvall and Reuben Hill reasserted that "for the couple ready for this step, having a baby is a supremely satisfying experience," a position for which at least one fine height-of-the-baby-boom study found some empirical justification.[36] But data from the late 1960s and 1970s showed that this was no longer true—if it ever was—for the average American couple. Summarizing the results of many studies, including a soundly based study of their own, Norval Glenn and Sara McLanahan concluded that in view of the fact that "in American society children tend to lower their parents' marital and global happiness," it was "ironic that most Americans want to have children" and that they do so.[37] The irony, of course, hinges entirely on the individualistic—and arguably hedonistic—assumptions governing our interpretation of life course transitions. Such assumptions, however useful they may be in simplifying interpretation of motivation, fly in the face of evidence that even in a relatively short number of years, contraception, by changing the material circumstances of choice, has participated in a redefinition of the "should" that has surely always played a part in the motivation to become a parent.[38]

Some transitions are typically more age determinate than others. On the whole, transitions earlier in the life course—where state bureaucracies are given greater sway—have tended to occur more uniformly to members of a given cohort than those occurring later.[39] And for some elements of the population, some life course transitions have been relatively more loosely timed. An instance of this, relevant to the account to follow, has to do with the timing of marriage, which has always been considerably more closely supervised (and, correspondingly, more nearly uniform) for women than for men. Intuitively, one can see how this fact is related to other aspects of the asymmetry between the genders. And thus it should not be too surprising that the gender differential in this regard has been declining recently. How culturally influenced the marriage transition is, for both men and women, is attested to by the near-disappearance of the category "bachelor" as a culturally recognized (if not universal) life course stage for men and the development during the past two decades of a closely parallel popular understanding of a rather extended unmarried adult state—"living with"—for members of both genders.[40]

Indeed, the very concept of a life course "stage" like bachelorhood implies cultural notions about the content of that stage and about its place within one or more of the trajectories its occupants are presumed to be working out. We here witnessed, for instance, the passing of one strongly supported middle-class norm, that of men's economic independence at marriage.[41] It must be remembered, however, that if culture sets some of the terms for the staging of the life course, it does not set them all—certainly for individuals, but perhaps for whole cohorts. Gunhild Hagestad's insight, that "some of us find ourselves in life stages for which our society has no clear culturally shared expectations" is important for understanding the recent social history of the American people and useful for interpreting the materials presented below. "Demographic change [for instance] may have been so rapid and so dramatic that we have experienced 'cultural lags'" in the construction of normatively defined "stages."[42]

Constructing a History of the Life Course

Increasing attention has been given to the life course over the past two decades by an interdisciplinary grouping of scholars.



Their concerns have evolved from a focus on the cohort among demographers,[43] the relevance of the notion of age stratification to social gerontologists,[44] and a concern for life span psychology among students of human development.[45] Somewhat more recently, it became evident to workers in several of these fields that if they were genuinely to import a processual orientation to social science, historical change could no longer be ignored, as was so characteristic of American social science at the time. "Career lines are structured by the realities of historical times and circumstance; by the opportunities, normative pressures, and adaptive requirements of altered situations; and by those expectations, commitments, and resources which are brought to these situations."[46] Both historical events and trends affect individuals differently according to life course stage, sometimes affecting the life course itself in the process. "Processes commonly denoted as [individual] development . . . [are] social products to be understood within the particular features of a specific societal and historical context." In that context, the analyst seeks "the causal bases of age stratification within the social system that lead to some level of age-graded events for a collectivity at a particular historical moment and to broad similarities in individual life courses or psychological biographies during that period."[47]

From a historical life course standpoint, structure may—sometimes—be seen in dynamic perspective. "The important contribution that historical research makes is in specifying and examining diachronic changes, which often have a more direct impact on the life course than macrosocial changes. Most importantly, historians can identify the convergence of socioeconomic and cultural forces, which are characteristic of a specific time period and which more directly influence the timing of life transitions than more large-scale or long-term linear developments."[48] Glen Elder's Children of the Great Depression (1974) is justly viewed as the pioneering empirical exploration of this fundamental insight.[49] It examines life courses of children who in varying ways faced the Depression's rigors and provides an acute treatment of many of the theoretical issues. Especially eloquent has been Elder's insistence that the historically oriented life course approach be explicitly connected with the agentic perspective on individual experience and choice carried within the sociological discipline by the "Chicago school" variant developed by W. I. Thomas and carried on by Herbert Blumer and Everett C. Hughes.[50]

Martin Kohli has argued in an exceptionally thought-provoking essay that not only have particular stages changed historically but also the salience of the life course itself.[51] The "chronologization" of life, he maintains, has grown apace with modernity (or capitalist development), as "part of the more general process in which individuals are set free from the bonds of status, locality and family." Such a process is of quite long standing, of course, and yet there now appear signs of reversals—the kinds of indefinition that individuals themselves must resolve, which Hagestad refers to. Kohli admits there are many hints that individuation, not chronologization, has become the dominant trend over the last decade or two. Nevertheless, he maintains, "the successful institutionalization of the life course is the basis for the present individualizing departure from it."

Presented narratively, the burden of my account is to demonstrate concretely the power of such insights as Kohli's. The chapters that follow show a life course segment rendered (somewhat ironically) more salient and, in some respects, more determinate by the increasingly explicit debate that has emerged over its construction. The number of contestants in this debate has been progressively enlarged, so that over the twentieth century, teenagers qua age group have come to articulate—and to have articulated for them, especially in music—a distinctive view of how they wish to grow up. This is not to say that teenagers differed from adults in what they wanted to grow up into, but, instead, in how and when. I show, thus, how dating, a contested institution constructed by "kids," was connected with the institution of marriage in a way that by the 1970s seemed decidedly conservative. As I also show, increasingly self-conscious considerations of gender played a part in the debate about dating, marriage, and the youthful life course as a whole. It is apparent, too, that a distinctive organization of the youthful life course has more lately emerged among the inner-city black poor, a subject for debate within the black community and for denunciation outside it.



We currently are witness to an adult effort to condemn large portions of American youth as a "postponed generation."[52] Explaining the inappropriateness of youth's hesitant passage through the life course by "scarcity," Susan Littwin describes a generation of middle-class young people who had learned to "paint or run a mock constitutional convention or jog six miles," only to learn that in the hard world beyond adolescence "no one cared." "It is hard enough to establish an adult identity, even in the best of times," she argues, employing a characteristic translation of roles into a psychological state. "What today's twenty-to-thirty-year-olds have elected to do is continue the identity search while avoiding reality," that is, the signals of the current job market, "and that makes it exceedingly slow work."[53] The reader can hardly fail to detect like themes in neoconservative condemnation of schools and students for failing one another.

My examination of transitions is embedded in a more inclusive study of the life course in which transitions are seen from the perspective of their sequence. Determinate sequences underlie the "career," or, in the less evocative terminology that Elder for that reason prefers, trajectories. Within a given culture, those transitions seen as part of the same trajectory commonly have a normatively prescribed (or at least preferred) sequence. "Through cultural and structural forces, established career lines present individuals with particular constraints, incentives, and options as they work out their trajectories."[54] Through this perspective, one is led to link the examination of the socially and culturally structured circumstances individuals find themselves in, with their chosen responses to those circumstances. Individuals understand their own situations in terms of the process—their relative efficaciousness in it, the extent of positive or negative sanctions they have received—by which they have arrived at their present stage.

Often quite prominent in popular debates about the life course are disagreements not about timing but about the sequencing of transitions, about the appropriate shape of trajectories, about what it means when a handful of actors, or growing numbers of actors, violate what is ordinarily done in relating one change in their lives to another. Such an argument often has a less arbitrary sound than that over the timing of transitions, being couched in terms of "competence" rather than "maturity"—something presumably an attribute of the individual rather than something substantially derived from the social definition of the individual's chronological age. Consistent with the individualistic trend of our times, however, sequencing arguments have commonly faded on the demonstration of effective "competence" by those claiming the right to out-of-sequence transitions.

An important example is the blurring of the once well-guarded normative sequence of leaving school and entry into the labor force. The decades since World War II have seen a massive expansion of the employment rate (and the hours of work) among high school (and college) boys and girls, at the same time as out-of-school boys and girls of the same ages are suffering increasing unemployment. In the interest of reducing the risk of "dropping out"—a distasteful transition—schools have adopted a number of mechanisms that permit and even encourage tentative entry to the labor force before graduation.[55]

The increasingly embarrassed giggle that accompanies contemporary use of the term "virgin" (in the context of persons, not derived uses applied to materials) is likewise evidence of massively lessened vigilance regarding the sequence of coitus and marriage, especially for women. To recur to popular music, whether one wishes to take "love" literally or metaphorically, it is apparent that the "love and marriage . . . go together like a horse and carriage" sequencing formula of my adolescence has been uncoupled, to be reassembled every which way. Many hope that the fear of heterosexually transmitted AIDS will return this sequence to its earlier state.

The life course perspective brings together historians' concern with experience and the recognition that aggregates like "populations" do not have intentions. It allows us to take advantage of the fact that samples of populations leave accounts of how they feel about their actions. Obviously, this is a very broad perspective, and not one that proposes a singular methodology. But it does propose that students of the life course focus their attention in a number of ways.

1. As a necessary step toward simplifying, we reduce what is in fact a continuous moment-to-moment development to a series of what, a priori, are defined as transitions: marriage, parenthood, and military service are typical.

2. These transitions are seen as involving changes in individuals' social roles, to accord with changed statuses as defined by social institutions. Thus, becoming a father involves acting in a particular kind of reciprocal relationship with an infant and involves being known publicly as one who should perform a certain set of obligations that pertain to occupying the status of "father" in the institution "family."

3. The cultural meaning attached to such roles and statuses is not fixed, but, in part, changes according to the experience of the actors who are living in them. That "fatherhood" no longer brings draft deferral somewhat changes what it means to fathers.

4. The experience of a given status is not divorced from the other sets of roles and statuses occupied simultaneously by the actors: the experience of "motherhood" is different for mothers who are simultaneously wives and those who are not.

The empirical emphasis of students of the life course has been substantially, although not entirely, on concerns usually associated with social psychology, especially having to do with the learning of life stage roles. A number of scholars within the discipline of sociology, however, especially in its more demographic reaches, have worked with life course concepts in such a way that they move toward aggregate concerns that are in some ways more akin to the kind of questions posed by historians (which I emphasize in the chapters that follow). Sandra Hofferth, for instance, has apportioned the aggregate experience of recent cohorts, subdivided by race, to time in childhood spent in incompleted, completed, and broken families.[56] Peter Uhlenberg, in a number of superb studies that take up particular transitions and sequences, has estimated the prevalence, timing, and variation in timing of these transitions over historical time, alerting scholars to the truly marked changes in the modal life experiences of historical populations.[57] Dennis Hogan's ambitious Transitions and Social Change[58] is based almost exclusively on a single large retrospective interview survey of the transitions of American men from youth to adulthood, covering in a very different way the same general subject as this volume. Hogan has looked closely at the individual level at both the timing and sequencing of transitions and is as concerned with the amount and sources of variation within the single-year birth cohorts he examines as with central tendencies. In addition, he has sought to explain these statistically with a number of independent "historical" variables characterizing succeeding birth cohorts.

My interest has long been in the Janus-faced relationship of these changes in the aggregate to the choices as faced by the individuals making them. In reviewing Elder's path-breaking Children of the Great Depression in 1975, I argued that for all Elder's concern to place individual development in historical context, in the end he was most interested in the one-way relationship between the two—in the impact of large-scale historical change on the way individuals' lives were lived. I argued that a social-historical approach to the life course might be no less interested in the way those altered individual experiences aggregated to constitute a new context for others living through these changes. I maintained that when Elder examined the impact of the Great Depression on the subsequent lives of children and youth at the time by comparing those whose families had suffered substantial declines in income with those whose families had not suffered declines, he implicitly assumed that the direct impact of economic deprivation markedly outweighed the indirect impact—that which might be felt by all families who observed others' plights, who anticipated hardship, who compared their situations not with the period before the Depression but with what might have been. That Elder found differences on the level of individual families implies nothing about the existence or magnitude of universal, contextual effects.[59] Even "kids" can make history, as their choices aggregate into behavioral patterns and, rationalized, become normative. It will be shown how dating, in the 1920s a liberating invention largely of girls' making, became by the 1960s a vehicle that often constrained girls in the choices they now were permitted to make.

The amount of certainty and determinism in the environment of individuals has varied historically,[60] and I find it a fascinating paradox that the relaxation of certainty in the material environment may possibly give one's community the freedom to impose a regime of individual decision-making that in fact may be more, rather than less, externally determined in the perception of the individual.[61] The early commitment of young members of the postwar cohort to marriage or childbirth had not in itself committed other members to similar prompt action. Rather, change in cohort behavior was essentially the sum of annual responses to period phenomena: That is, memory was not cohort-specific. But the kinds of period phenomena I show to have had an impact on the timing of vital events were sometimes subtle enough that actors did not always understand themselves to be responding to them. In fact, their response was not to them directly but to changed circumstances underlying the balance of prudence, idealism, and optimism that characterizes individuals' decisions to form a family. As environments have gradually shifted, so has Americans' sense of how one "ought" to form a family, but these shifts did not affect particular cohorts uniquely; they did not bring about a society that on the level of belief about family formation was stratified by date of birth (or marriage, or parenthood). Yet environment did not impinge uniformly on people of different ages, and herein lay the mechanism by which characteristic timing patterns in the life course changed.

Were these kinds of changes over time in the experience of cohorts a product of some initial cohort characteristic—whether predisposition or powerful early experience or radically different upbringing—or did historical experience occurring over the life course of the cohorts produce the distinctive life course curves? This question amounts to trying to decipher the impact on age-graded behavior of "cohort" and "period" processes.[62] The original impetus for this line of questioning came from the discovery in the early 1950s[63] that the accelerated birth schedules of that period were not a reversion to older large-family norms but instead constituted a long-lasting revision of the tempo of Americans' childbearing, a new style of family formation possibly related to a new style of family.[64]

"Period" effects were overall the most important in explaining those aspects of the life course that concern me here. In no instance did the kind of circumstances that typically have differential effects on persons of different ages—say, the unemployment produced by economic depression or the severe dislocation of a large call to military conscription—set a whole cohort into a distinctive timing pattern that was sustained through its life course. Social history does not exclusively study cause and effect, but it ought to sort them out when possible. I am arguing here that a set of environments promoting early marriage and childbirth (for example) made possible the articulation and, no doubt, the practice of a variety of normative schedules that were not themselves innovations but rationalizations. Indeed, as we shall see, these rationalizations typically were drawn from elements already present within the set of ideas explaining (and, admittedly, setting outer limits to) family-formation behavior in recent times.

Even though it is apparent that the marriage and parenthood "schedules" of cohorts changed very markedly, it also seems to be the case that it was the environment for marriage and childbirth that changed lastingly, that this changed environment eventually affected members of virtually all cohorts undergoing either of these transitions, and that new life course schedules tended to come into effect which influenced all subsequent birth cohorts —at least until another "period" phenomenon contributed to the establishment of a new pattern. This is not to say that the heightened early pace of vital events had no impact on the lives of cohort members in subsequent years. It does mean, however, that the sets of actors' perceptions, values, and understandings that arose as part of these new schedules were not unique to particular cohorts but were shared by all of an age to be married or become a parent.

Demography and the Social History of the Life Course

This volume is about the summing up of multiply caused, individually engaged lines of action that altogether amount to a change in the way a whole cohort of individuals face the world. I would like to know with certainty whether (as I suspect, and as I will argue based on admittedly modest evidence) as dating became an institutionalized stage in the adolescent life course, an introduction to heterosexual physical pleasure became a more rapid and more certain concomitant of courtship.



This must inevitably be "latent history," in Bernard Bailyn's sense, history that emphasizes themes—certainly the aggregate themes—that were not necessarily important or perhaps even present in the minds of the participants. The debates over aspects of the life course that I discuss as often as not followed behavioral change; or, a quietly institutionalized pattern like engagement may change with no explicit cultural debate at all. The justification of writing latent history is that the themes it takes up are important in some sense that contemporaries did not recognize but that we can now recognize in hindsight. The justification here is that the life course, as a socially organized process of growing up, is an abstraction that allows us to focus a variety of simultaneously acting demographic, material, and cultural developments on one coherent aspect of experience of contemporaries. I hope this effort will enable us to see how sometimes subtle shifts in the way the sequence of life course events has been organized have brought individuals to the stage of parenthood, and to antecedent stages, differently prepared and with different understandings of what that stage entails. "One's own," along with the process of achieving it, has changed.

At the core of this book lies a demographic approach, sometimes applied unconventionally, one not massively different in its logic from that employed by the aggregate-level, neatly demographic empiricists, like Hofferth, Uhlenberg, or Hogan, although far more informal. In this vein I seek to discover, and thereafter to explain, group and over-time variations in rates of transitions—for example, annual rates of marriage among the unmarried, of first parenthood among those married during the previous year, or, by extension, of first premarital coitus among single virgins. Around this core, as much as evidence and imagination have permitted, I have tried to build a double contextual framework—of the fit of the individual transitions into the socially constructed life course and of the fit of this life course into the material and institutional imperatives of the day, as they impinged on individuals.

My commitment is to understand the life course as a series of individual decisions that are not determined but are nevertheless structured by external phenomena, including the prior behaviors of others in the same cohort. I argue that in the twentieth century, the youthful life course of Americans has been quite malleable. This is not so startling, however; in early modern England, the age of marriage moved sharply downward in response to the shift from a landed to a protoindustrial economic base.[65] What is special about the American situation in the twentieth century is the variety of forces to which life course timing responds, most notably, in the realm of beliefs. Especially striking in this account are the subcultural and institutional structures erected by young people themselves, which have played a substantial part in setting the timetable for coming into one's own. This underlines much of the enlarged salience of the youthful life course and explains, too, some of the heat of the on-and-off debate over it. For the way one grows up is closely related to what one becomes.

Demographers proceed by confining their measures as much as possible to those "at risk" of experiencing that which is of analytic interest, as I try to do here. Thus, although Alfred Kinsey's extraordinary data on sexual behavior permit me to measure a fair amount of important information on "petting," enabling an estimate of annual rates of transition into the status of "having petted," for instance, they unfortunately do not permit me to estimate like rates based only on those who are dating—even though (with some exceptions, of course) only those dating are really "at risk" of petting. Likewise, demographers proceed by progressively "refining" their measures. As much as possible, they measure what they are interested in for narrower and narrower groups, so that they may discover differential rates and seek reasons for these. So do I, although I am often constrained by the modest evidence available.

Because two of the transitions centrally treated in this book are commonly understood as demographic phenomena, and because my most secure and therefore more primary methods are demographic, this book focuses more than it otherwise might on marriage and parenthood, and somewhat less on other elements of the youthful life course. Marriage ("nuptiality" is the demographers' technical term for its study) has been widely studied and in recent times has been well documented. I focus on first marriage, with some reflections on departure or non-departure from it by way of divorce.[66]

My treatment of parenthood is something of a twist on the best-studied, best-documented aspect of demography, "fertility." While demographers are mainly interested in one particular product of the act of giving birth, the babies who will eventually replenish the population, I am interested in a different product, the parents who came into being with the birth of their first child. This means that only nonparents are at risk of becoming parents and that only firstborn children can be counted when I compile the rates according to which those at-risk couples become actual parents. Although most demographers' analysis of fertility is thus of no direct use to me, fertility data has been commonly enough tabulated by parity (birth order) that I can base my argument about the transition to parenthood quite solidly. Unfortunately, only in recent years has reliable information linking first-parity birth data to time since marriage been widely available, for where this is available, it permits me my preferred way of examining fertility—as the subsequent transition of a married couple, after however many years of marriage, into parenthood.

On the whole, marriage serves as the centerpiece around which I array other life course transitions, especially as I work to establish the relationship of the timing of one event to prior and subsequent events. In this account, then, divorce is by and large examined in a demographer's life course style as an event terminating a marriage after however many years, or, more in keeping with the marriage-centric tendencies of this book, non-divorce in any given year after a first marriage is seen as an indication of the survival of that marriage. By the same token, my perspective moves me toward statements about the extent to which coitus and other, less culturally freighted aspects of sexual exploration have preceded marriage. Were there consistent, reliable data, I would wish to know (changes in) the proportion of dating couples who had already petted who went on to coitus, as well as the proportion of individuals who had dated by ages 12, 13, 14, and so on, who had petted by ages 14, 15, 16, and so on, and who had had coitus by 16, 17, 18, and so on. I have to make do. By and large, I have chosen to relax my standards of certainty rather than my descriptive and analytic ambitions. But because the data are invariably weak as they approach the edges of my account, I use the solid core as an anchor.

Because this is a social-historical account, not a demographic one, the circumstances in which transitions are accomplished are of particular interest. Especially interesting are the institutionalized structures and rites that commonly surround transitional events. In a subsequent chapter, I make an effort to study engagement at a particular historical juncture at which that partly institutionalized life course stage was under pressure. Because of that pressure, I believe, documentary materials were produced from which the historian could discern at least a speculative account. But engagement also proved one of my most conspicuous failures in research, for I had—quite erroneously—imagined that both secondary and primary sources would be readily available. In fact, neither is (again apart from the normative, and even these are slim).[67] Perhaps we can take this lack of interest as indicative of a lack of importance placed on engagement by twentieth-century American culture, but both the Kinsey data and a variety of studies of marital happiness indicate that both the fact and the length of an engagement have mattered to the kind of marriage that eventuates.[68] This suggests that the event has a place in the analysis of the life course.

The preponderance of evidence presented here is quantitative. At first glance, this may not seem entirely consistent with my focus on material, institutional, and cultural considerations facing individuals constructing their life courses or with my wish to imaginatively reconstruct the contents of life course transitions, but I believe that it is. The reason for my having made a more determined search for quantitative materials than for (say) diaries and letters that might directly reveal individuals' own constructions of their situation lies in the nature of the life course as I understand it. My first concern here has necessarily been to describe in considerable detail and with as much precision as possible the range of options that individuals might have taken and the distribution of the options actually elected. Only after having assessed the overall, aggregate structure of "experience" in this sense do I move to the macroscopic level, to the level of material circumstance, institutional arrangement, and cultural prescription.

The optimal kind of document for my quantitative use sometimes exists, for it is a kind of document that has come to be in exceptionally high demand as social science has moved toward seeking processual views, namely, the individual-based, longitudinal record that allows one to describe transitions, sequences, and, sometimes, actors' perspectives on these. With such records, one can examine the delicate weave of individuals' trajectories through the structures that, from one perspective, help form them, and, from another, that they help to create. But such data do not exist, not remotely, for the earlier periods that I treat in this book. Therefore, I must use a variant of the historian's craft, must make do with all sorts of unconventional and admittedly imperfect evidence, used—sometimes—unconventionally.

And to do this, I must proceed first by pressing hard against the available aggregate, quantitative materials to pull from them plausible suggestions of portions of the careers of the individuals. To try to read the behavioral options of individuals out of the observed behavior of a cohort (or, worse, a cross-section of individuals at different ages viewed at a single point in time) is, formally speaking, a perversion of the data.[69] But, then, historians always pervert data. An operational definition of a historian's methodological skill, I believe, would be the ability to find in the shards of the past something that their creators did not intend to express by having created them. One can do so as responsibly as possible, seeking, as would a demographer, the most precisely constructed "at-risk" measures that can be discerned there. The quantitatively sophisticated reader will recognize that I am using many kinds of data here as though they were demographic, around which interpretation will be arrayed.

The story I am telling is a national story. Regional variation is not one of the phenomena I am particularly interested in exploring, even where the data are available, except where such variation allows inferences about change at the national level. But often I must have recourse to local data, for, very often, that is all that is available. Most census data, but not all, pertain to the national level, but only quite recently have vital-registration data been uniformly available at this level. A good deal of the social research that I cite or on which I carry out "secondary analysis"[70] in pursuit of my story is local. My assumption is that good samples representative of identifiable local populations thereby preclude the largest dangers, but sometimes I have been forced even to relax these cautions. Usually, however, I use local data—like the Minnesota wedding reports—either to provide time series information that otherwise is not available, where the trends are presumably produced by responses to the same kind of macroscopic changes affecting the nation generally, or to examine systematic variation within the data. Insofar as national trends outweigh place-to-place variation, I believe I am generally on safe ground, that the range of phenomena I find out about is important, and that we can informally take into consideration the fact that my materials are derived from wherever they could be found. But I have regularly taken the more aggressively interpretive path in preference to the more cautious.

The variety of social science data sources that are available to the resourceful historian of the twentieth century is surprising. Wherever possible, of course, I have attempted to reanalyze the original data, not because I mistrust previous analysts but because many of my purposes are "perversely" demographic, or call for close comparisons to other materials of no interest at the time of the first analysis, or in some other way run athwart others' purposes. The more recent the period on which one seeks information and, generally, the richer the survey, the more such raw data tend to be available; I was fortunate to be able to use the raw data from three superb and very old studies: an extraordinary commercial survey of youth taken in 1939, the Indianapolis Fertility Study of 1941, and Alfred Kinsey's Study of Sexual Behavior, gathered from the late 1930s into the early 1960s. More often than not, however, raw data are not available,[71] but often enough even published data arrays reveal things other than those initially seen or initially of interest. In this category, large numbers of publications of the Census Bureau have been most useful, as were a number of social inquiries carried out under WPA aegis in the mid-1930s.[72]



While I have, where possible, utilized public opinion survey research as an important clue to cultural change and relied in later chapters rather heavily on others' syntheses of such materials, I have also carried out a fair amount of primary research in a number of unconventional nonquantitative sources. In the examination of the changing nature of wedding ritual, presented above, I quantified a type of "belletristic" evidence. In another instance, I systematically examined another stylized belletristic source—lovelorn letters—with respect to the evolving vocabulary for describing "dating" relationships of young people. And in surveying a substantial amount of imaginative but tasteless short fiction describing courtship, I read with an eye closely attuned to the formal structuring of courtship problematics in those stories and the diction used to engage readers' emotions in the events of the fiction. But I sometimes simply read "culture" as historians ordinarily do—in a nontechnical sense in which descriptive, hortatory, normative, and personal documents are intuitively scanned for "what was at issue" at that time and place. In this vein, the documentation pertaining to cultural conflicts and difficult-to-resolve issues related to family formation was of particular interest to me. My reliance on such materials lessened with the greater availability of social science data for more recent periods.

On the whole, I feel that such informal procedures as I used with the cultural materials are justifiable in the main because I have assessed them in the light of the behavioral, "demographic" core of my account, which was independently gathered. This, of course, was the evidentiary strategy of my research in the first place. But neither author nor reader should blink at the fact that the fundamental criterion for accepting the interpretation that this book constitutes is that it is intuitively plausible in view of a large and varied body of evidence. That is, despite all the numbers, this book is history, not social science; it is a piece of conventional history about an unconventionally chosen subject, employing unconventional evidence. It is an effort at writing a history of an aspect of social change the conceptualization for which is drawn from social science and the data for which have typically been provided by social scientists.

I had initially hoped to explore three aspects of differential experience systematically: by gender, by race, and by social class. It is apparent that males and females, blacks and whites, working-class and middle-class people grew up according to somewhat different schedules at any given time, and often with somewhat different values. But my account has not turned out to be systematically comparative in this sense. I have succeeded best with regard to gender, which is the most important differentiator and also, happily, the one for which evidence is easily the fullest.

But demographic data are not always available broken down by race, and belletristic evidence is slight for blacks. Indeed, because Afro-American history is only now beginning to develop the broader outlines of a social history, the kinds of relatively intimate questions raised in this book are as yet quite obscure, the evidence required to elicit answers to them not in the least obviously available. Where I have been able, I have made racial comparisons, interpreting these differences in the light of the broader general trends (pertaining, I am often afraid, especially to the white majority). And the evidence on social class, while often more readily available than that on race, is also more difficult to interpret, with a variety of indices of class raising the question of exactly whom one is talking about. Again, where possible, I have detailed and discussed socioeconomic differentials, but, as with blacks, I have never felt confident of interpreting these within the distinctive social history of particular classes, rather than as somewhat simple comparisons to more aggregate trends. The largest reason, in life course terms, to have developed the race and class differentials fully is to test the intuitive hypothesis that while class differentials are declining, the experience of the two races in this important aspect of social and personal life is becoming more distinctive. But to carry these accounts beyond the essentially demographic terms in which scholarship so far has taken them is beyond my capacity at this point.

This book is conventionally historical in that, after two general chapters, it uses periodization, not just to define chapters but to allow me to focus attention on different portions of the life course at different historical periods and to argue for varying causes, from period to period. Obviously, "periodization" is a radical simplification, serving stylistic no less than analytic purposes. The formal assumption behind periodization is that the periods can be treated as more or less internally homogeneous with regard to some important underlying dimension or trend, or at any rate as more alike in this regard than they are like the periods that precede and follow.

The 1920s, not unusually, are treated as a distinct period. The Great Depression serves as a second period for my account and World War II as a third. The "baby boom" constitutes a fourth period, one a bit more unconventional than the others in that it is periodized according to somewhat uncommon criteria. A fifth chronological chapter treats the period that runs from the end of the "baby boom" to 1975, at which point, roughly speaking, many of the family formation phenomena under study began to change once again. The "periods" I treat are short, far shorter than those in most social-historical accounts. I have arranged the account, first, to emphasize my substantive argument about just how malleable the youthful life course has become and how subject it is to a shifting debate. But I have periodized also to highlight the ways in which material or institutional change—which largely defines all but the most recent period—intertwines with the more manifest and more commonly remarked cultural change. And I have exercised my historian's right to leave the most recent phase of development to others.

I have emphasized in each chronological chapter a single contested or sharply modified transition around which to organize a larger part of the story than might have been available to contemporaries. Frankly, I do this partly for the modest drama it brings to the longish and complicated story I tell here. For, despite this selective emphasis, it is the argument of the life course approach that the sequence of events is cut of a single cloth. In the first "period" chapter, focusing on the 1920s, I elect to emphasize the evolution of an institution that governed (and brought progressively earlier into the lives of individuals) the transition to heterosexual erotic and emotional exploration: dating. In the chapter on the Great Depression, as dating continued to develop and diffuse, my emphasis shifts to a phase of the life course that lost much of its meaning: engagement. The following chapter, dealing with World War II, looks closely at the way military service affected entry into marriage. In turn, the focus shifts to parenthood, appropriately enough, in treating the baby boom decades that followed World War II. The final chronological chapter, dealing with the challenge to and repudiation of the baby boom in many of its salient aspects, actually has several emphases, notably, the freeing of sexuality from the "timing" elements previously contained within the institutions of dating and marriage.



2—
The Changing Life Course of America's Youth

The transitions into which the life course can be analyzed are deeply embedded in the material, institutional, and cultural circumstances in which they are accomplished. This will be the subject of subsequent chapters. As background, this chapter lays out a number of trends and patterns seen across the half-century examined in this book, trends in the very transitions on which the analysis will henceforth focus as well as in closely related contexts. The materials presented here are highly aggregate, and the treatment is descriptive and fairly close to the underlying data.

The sequence of discussion essentially follows the sequence of the typical life course. The data presented will show that:

• schooling has come to extend later into the life course, but gainful employment, receding for a while, has advanced in recent decades.

• for young men, military service (which greatly affects the timing of marriage as well as the departure from school and entry into the civilian labor force) was initially rare, became common for a generation, and then became rare once again.

• premarital coitus, first with fiancés and then among other intimates, became more common, first slowly and then more rapidly; no less significant, women's patterns have converged on men's. Premarital conceptions followed a different path, growing somewhat earlier than the major expansion of female premarital coitus and being especially prominent among black women.

• marriage, long moving earlier in the life course, turned around (as Friedan and Brown had hoped) and did so earlier for women than for men. In addition, the determinants of marriage timing have changed.

• parenthood trends have generally followed nuptiality trends, but the relationship of these two transitions has changed in that the sequencing of pregnancy and marriage has become less determinate, and the average interval between the two has changed markedly from period to period.

• divorce propensity has grown unevenly, but rapidly, without the typical timing of divorce within marriage changing much and with no trend in the strong relationship of the age at marriage to probabilities of divorce. But an initially negative relationship of divorce and parenthood has largely disappeared.

• overall, many of women's life course patterns have come more nearly to approximate men's. But many patterns of blacks and whites have become less similar to one another.

Leaving School, Beginning Work, and Military Service

The median ages of leaving school and of entering gainful employment have risen in parallel from about 16 in 1920 to 18 or 19 in 1980. Transitions like that from out of school into work typically imply changes of roles that themselves determine substantial elements of one's daily rounds, perspectives, and obligations. Because they require learning of both a cognitive and an emotional sort, life course transitions of this magnitude are commonly psychologically demanding. In the years following the Great War, educators resumed their effort, begun decades before, to extend the influence of the school into late adolescence. Proportions of adolescents entering high school increased rapidly, so that four in five boys and five in six girls did so by the early 1940s. By 1980, virtually all did. The proportions of young people who graduated from high school increased right along with increased access to high school: from some 16 percent around 1920 to about half by 1940, and to 67 to 85 percent between the immediate post-World War II high school cohort and those of the late 1960s and early 1970s, after which the figure stabilized. And through the 1960s, for increasing proportions of high-schoolers, graduation led to college.[1] Since World War II, however, the age of school departure has become rather less narrowly defined, with a more substantial proportion of eligible young people than previously remaining enrolled in school until their early twenties.[2] As schooling has pressed later in the lives of many Americans, entry to the labor force has come earlier for a substantial minority of the youthful population, and earlier also than the initial departure of a substantial minority from school.

Thus, there has been a secular trend since the 1920s to push later in life the ages of a full transition from school to work, while, since World War II, there has been an accompanying loosening of the latest age at which young people leave school and of the earliest age at which they enter gainful employment.[3] These trends exist for both genders. The transition thus involves an increasing number of people who simultaneously find themselves in school and at work, as table 2 documents. Growing from a low proportion among boys and a very low proportion among girls, rapidly and regularly increasing proportions of those who were either in school or at work were simultaneously engaged in both, rising to over one in four for boys at 16 and girls at 17. This reflects a phase in late adolescence that for increasing numbers was complicated by the simultaneous occupancy of roles that many have said might have implied conflicting demands. Here we find an increasing tendency to learn the new without exchanging it for the old and, as such, a subtle but significant change in the construction of the life course.
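
Table 2's measure can be restated as a small calculation. The counts below are invented for illustration, not taken from the table:

```python
# Share simultaneously in school and at work, among those in either
# status (Table 2's denominator excludes those doing neither).

in_school_only = 500
in_work_only = 300
in_both = 200

either_status = in_school_only + in_work_only + in_both
share_in_both = in_both / either_status  # 200 / 1,000 = 0.20
```

Expressed as a percentage, this invented cohort would show 20.0 in a cell of Table 2.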

More than any other important youthful life course commitment, military service has varied irregularly from cohort to cohort because of the sporadic wartime mobilization.[4] Before World War II, the demands of military service on young people were minimal. The standing army was small in the 1920s and 1930s, and service was entirely voluntary. Conscription began in 1940, and because of the relatively small cohorts that came of age during the war and then again during the Korean conflict, large proportions of young men passed through the military at one time or another during the following three decades, until the peacetime draft was abolished in 1973. During 1953, the last year of war in Korea, very nearly one in two at the peak age of 20 served. Military manpower needs stabilized and were satisfied increasingly smoothly by the growing cohorts attaining military age. Vietnam, by the end of the 1960s, produced a new phase, particularly affecting those who had recently finished high school and who did not go on to college. A third of all male 20-year-olds served during 1968—high but not rivaling the Korean demand. But after Vietnamization and then the ending of the draft, the proportions of young men called on to serve in the military dropped to levels lower than at any time since World War II. After conscription was replaced by a volunteer army, authorities had to depend substantially on the material benefits they offered to induce young people to commit a phase of their lives to the service. For adolescents, the military became just another job option, as indicated in table 3, which summarizes the military experience of successive cohorts of American males.

Table 2. Proportion of Those Either Enrolled at School or in the Labor Force Who Are Involved in Both, by Age and Sex, 1930–70 (in percentages)

                        Male                      Female
At age         1930    1950    1970      1930    1950    1970
14–15           7.2    13.3    13.1       3.2     4.7     6.5
16              9.9    17.8    26.5       5.1     9.0    16.3
17              9.0    18.7    26.4       5.4    13.2    26.4
18              7.1    13.4    29.3       5.7    11.2    22.9
19              5.5     9.5    21.9       5.9     9.3    18.2
20              4.3     7.6    19.2       5.6     8.0    16.1
21–24           2.8     7.8    13.3       4.3     5.4     9.0

SOURCES: Calculated from Census 1930–1, 1182–1183, and Census 1970–4, 97–98.

Table 3. Proportions of Men Serving in the Military by Service in War or Otherwise and Year of Birth (in percentages)

Year of Birth    All Service    War Service    Peacetime Only
1901–05              16.5           13.3             3.2
1906–10              24.3           22.0             2.3
1911–15              35.6           34.0             1.6
1916–20              57.3           56.5             0.8
1921–25              74.2           73.5             0.6
1926–30              72.3           69.2            <3.2
1930–34              64.9           54.2            10.7
1935–39              42.3           14.2            28.1
1940–44              36.1           23.9            12.3
1945–49              39.2           38.1             1.1
1950–54              17.0           14.7             2.3

SOURCES: Calculated from Census 1960–6, 1; Census 1970–5, 1; U.S. Veterans Administration, Veterans in the United States: A Statistical Portrait from the 1980 Census (Washington, D.C.: Office of Information Management and Statistics, 1984), table 1; and census tabulations of age by sex.
NOTE: All men's veteran status determined between ages of 35 and 44 except the two earliest (at 50–59) and the two youngest (at 25–34) cohorts.

Premarital Coitus

Something of a consensus on periodization has emerged among students of premarital sexuality. They record two sexual "revolutions," one in the first two decades of the twentieth century, and a second spanning from the mid-1960s to the late 1970s. Catherine Chilman, reviewing the literature on adolescent sexual behavior, sees continuity across the two revolutions.

Sharp changes in the United States toward greater sexual liberalism occurred in the early 1900s and were reflected in the more emancipated behaviors of a sizable proportion of middle- and upper-class women in the 1920s. . . . As women became more emancipated from earlier puritanical prescriptions, men became more emancipated too, especially in terms of greater freedom to have premarital and extramarital sex relations on a more equalitarian, companionship basis with women in their own reference groups. This trend toward increased sexual liberalization has strengthened recently, especially since the mid-1960s. . . . It is probable . . . that further liberalization of attitudes, if not of behaviors, had taken place between the 1920s and the 1960s, especially as a result of upheavals caused by World War II.[5]

Our understanding of the first revolution rests heavily on the retrospective accounts of premarital sexual behavior tabulated by Alfred Kinsey and his associates.[6] The Kinsey data presented in table 4 reveal a general increase in premarital coitus for both white men and women, with the greater increase occurring with fiancés for women and with nonaffianced friends for men. Throughout this period, many men apparently engaged in occasional casual premarital coitus with relatively promiscuous women, who constituted a relatively small portion of the female population. The persistent, perhaps increasing gender difference is the outstanding datum in the table and will bear contrasting with patterns seen in the second "sexual revolution." The modest changes in incidence of premarital coitus in the early figures, too, were accompanied by a decline in the variance in age at which individuals experienced coitus for the first time.[7]

The second sexual revolution is better documented and reveals that 1960s women were different from their predecessors. A late 1950s Midwest study, replicated in 1968, saw women's rates of coitus increase from 21 percent to 34 percent, while men's rates were stable.[8] These figures were virtually duplicated by a probability sample of students in nonreligious colleges taken in 1965 and in a 1967 national sample of college students.[9] The 1967 sample was tabulated by class at school and indicates an initiation by sophomore year of nearly all the young men who were going to be sexually initiated at any time during college. For young women, however, the rise was steady, year by year. A thorough, representative national study of girls' sexuality in 1971 revealed that by that date, the high-school years were a time of steadily increasing coital incidence, to the point where one-third of the eighteen-year-old white girls had had intercourse and four in ten of the nineteen-year-olds.[10] Replications of this survey have shown a continued enlargement of the field of adolescent girls who had had coitus from the early 1970s to the mid-1970s and again to the late 1970s.[11]

Table 4. Proportion of Ever-Married Persons Having Had Premarital Coitus with Fiancé(e)s and with Others (excluding Prostitutes), by Birth Cohort and Sex (in percentages)

Birth       Approx. Period of        Males                 Females
Cohort      Premarital Coitus   Fiancées   Others    Fiancés   Others
< 1900      < 1920                 45.4      66.3       31.1     15.3
1900–09     1920s                  61.1      71.9       40.2     27.2
1910–19     1930s                  50.2      76.5       41.2     18.9
1920–24     early 1940s            57.6      81.6       33.7     21.1
1925+       mid-1940s+             48.0      80.1       38.4     22.1

SOURCE: Computed from Kinsey Sex Histories, standardized for year of interview and educational level.

Substantial proportions of firstborn children in America have long been premaritally conceived, but this proportion has changed markedly.[12] Careful tabulations of retrospective family formation schedules gathered in 1975[13] allow us to examine closely the early months of marriage to see how fertility and marriage were sequenced. Figure 1 represents the proportions of women married at 18–21 or at 22 or older who became mothers at such a time that premarital pregnancy was clearly indicated, for five-year marriage cohorts beginning with women first married between 1930 and 1934.[14] Both the racial differentials and the time trends are quite large. A marked rise in antenuptial pregnancy among both whites and blacks seems to date from about the late 1950s, so that while about one in eight 1930s white marriages were preceded by pregnancy, close to one in five 1960s white marriages were. For blacks, corresponding figures differed by age but indicate, overall, a rise from about one in three black marriages preceded by pregnancy to over one in two. Among blacks, premarital pregnancy characterized young brides rather than older brides, but among whites, the relationship with age was quite weak.[15] The heightened emphasis on marriage (and, ideally, subsequent parenthood) for young white women in the baby boom did seem to argue subtly for a corresponding relaxation of the ban on premarital conception. There was, however, a striking downturn in premarital conception during the last decade on which the figure bears, a period known for its considerable liberation of female sexuality. The availability of legal abortion probably explains the downturn more than does any tendency toward other prudent behaviors; but the downturn points also to a reduction in the favorable attitude toward parenthood, as reflected in the cumulative-parenthood graphs just examined.

[Figure 1. Proportions of Women Married at 18–21 or 22 and Older Who Had a Child within 8 Months of Marriage]
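
The indicator behind Figure 1, a first birth within 8 months of marriage, could be coded along these lines; the dates and function names are hypothetical illustrations:

```python
# Premarital-pregnancy indicator in the spirit of Figure 1: a first birth
# within 8 months of marriage implies conception before the wedding.
# Dates below are invented; the 8-month cutoff follows the figure.

from datetime import date

def months_between(earlier, later):
    """Whole-month difference between two dates (ignores day of month)."""
    return (later.year - earlier.year) * 12 + (later.month - earlier.month)

def premaritally_conceived(married_on, first_birth_on, cutoff_months=8):
    return months_between(married_on, first_birth_on) < cutoff_months

early_birth = premaritally_conceived(date(1955, 6, 1), date(1955, 11, 15))  # 5 months
later_birth = premaritally_conceived(date(1955, 6, 1), date(1956, 9, 1))    # 15 months
```

Applied to each first-married woman in a marriage cohort, the share flagged by such an indicator is what each point of the figure plots.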

Further understanding of the normative aspect of the sequencing of marriage and fertility can be obtained by examining the subject of illegitimacy, using data from another retrospective family-formation survey carried out by the Current Population Survey to analyze births to mothers (not just wives) at ages 15–19 and ages 20–24 during successive four-year periods.[16] By dividing antenuptial pregnancies into those that come to term before and after marriage, we find that the former have been far more common among black than white women, even when the larger component of antenuptial pregnancies among black women altogether is taken into account. Between the late 1930s and the late 1960s, the legitimization ratio for both races remained roughly steady, but in recent years, among whites and even more so among blacks, younger girls became sharply less prone to legitimizing their pregnancies through marriage before parenthood. Or, to put it another way, as the normatively defined schedule of marriage moved once again toward a later preferred date, higher and higher proportions of younger girls who had become pregnant could not or would not marry promptly, and to a greater extent than among women somewhat older. Single parenthood became more acceptable, illegitimacy less of a curse—changes evidently more quickly remarked by relatively young girls than by their elders. And this was particularly true among black girls. By the 1975–1978 period, seven in eight births to black teenagers were outside of marriage. The differential, and the trend, is great enough to suggest a distinctive normative element to at least this element of the black life course.

Marriage

Over our period, the phase of life in which marriage is typically contracted converged with that of the school-to-work transition. For men, the entire period up to the 1960s saw a general, if uneven, decline in the median age at first marriage and a marked contraction (certainly during the 1940s and perhaps beyond) of the span of years over which a great majority of young men contracted their first marriages. Bachelorhood became a less and less prevalent stage, and whatever bachelorhood there was became something that happened at a relatively young age. For women, these trends were present until the late 1950s, when there was some reversal. What this means in practical terms is that for increasing proportions of young men and women, the processes of leaving school, entering the labor force, and entering marriage occurred at nearly the same time.

Thus, in 1930 fewer than 1 percent of young women ages 16–19 who were in school were married, as compared with nearly half of those who had left school.[17] By ages 20–24, at which point 85 percent of young women out of school were married, a minute 6 percent of those few who remained in school were married. For women, and generally for men, too, schooling and marriage were virtually exclusive statuses in 1930. By 1970, the proportion of women in school and married at 16–19 had risen slightly, although the proportion for those who had left school had declined.[18] And at 20–24, while the proportion married among those out of school was only three-fourths as high as it had been in 1930, the proportion married among the considerably enlarged group still in school had increased threefold. The rather determinate sequence of school departure and marriage of 1930 had weakened.

The sequencing of labor force entry and marriage had not become less determinate so unequivocally as had that of school departure and marriage. At 20–24, the proportion married among those men not in the labor force roughly doubled from 1940 to 1970, a more rapid increase than that among men in the labor force.[19] Among women, the change was enormous, but it reflected not so much the destruction of determinate sequencing of the transitions of the life course as new assumptions about appropriate roles for married women. Essentially, the propensity of women in the labor force to be married increased so much between these years that it almost effaced the initially radical difference in this propensity between them and nonworking women of like ages.

Young men and women, then, while facing a set of developmental tasks that was not markedly different over the years in question, accomplished them with timing and sequencing that had changed. In part, these behavioral changes simply reflected different constraints and resources facing the participants. But in part, they reflected and even encouraged changed outlooks on "growing up," as experienced. They also reflected institutional change.

Figures 2 and 3 present the annual likelihood of marriage among young white men and young white women still unmarried, at given ages.[20] We here examine first marriage at a relatively early age (taken as 19 for males, 17 for females), at two ages at the beginning and end of roughly the most typical marriage ages (22 and 25 for males, 20 and 23 for females), and at an age taken as somewhat old (28 for males, 26 for females). Turning first to white males, and looking first at marriage at the youngest age shown, we see in Figure 2 a pattern of only very modest change through the mid-1960s. A brief downward deflection at the beginning of the Depression and another in the last two years of World War II break a slowly upward-drifting pattern, as youthful marriage for white men became just slightly more prevalent, the most notable period of increase coming in the decade following the end of World War II. The marked downward trend in the mid-1960s[21] reflects a marked and persistent backing away from youthful marriage on the part of white males, to the point where the annual probability of marriage of an unmarried 19-year-old was lower by 1979 than it had been fifty years earlier, before the great restructuring of this part of the life course around World War II.


Figure 2. Annual Likelihood of Marriage among Young Unmarried White Males

We see even more striking changes in men's nuptial patterns at the older ages, especially those in the central ages at which first marriage was the most probable. At 22 and at 25, the graph moves up quite steeply from a less than 10 percent probability in the earliest years considered to more than a 20 percent probability by the mid-1960s. Once again, from the perspective of the young people themselves, this quantitative change represents a real modification of experience. Young men entered on courtship—indeed, life more generally—with different expectations when they knew from experience that there was a one in five chance of marrying in the year than when they knew that there was merely one chance in ten.

Figure 3. Annual Likelihood of Marriage among Young Unmarried White Females

The marked upward trend was broken by sharp downturns. The Depression mattered a great deal, although the tendency toward postponement of marriage at these two ages was reversed by the middle of the 1930s. The sharp upward spike in World War II was prominent at both these ages; the postwar spike was even more so. The era following the Korean conflict constituted another upward spike, and a lasting one, so that by the time Vietnam came to influence the choices of young white men, the movement up to a probability of marriage in a given year of more than one in four for unmarried white men at 22 and 25 was only a slight upward deviation from the trend. The end of Vietnam, however, represented the end of an era in family-formation patterns. The decline in marriage probabilities for white men at the central marriage ages was nothing short of spectacular. White men in their young twenties clearly looked forward differently after Vietnam. Less and less of their youthful life course came to be organized around marriage plans and preparations.

Nonwhite single men, in a general way, changed over the period much as did white single men, but their temporal variations were even more extensive. The Depression and the war had a greater impact at all ages, and the marriage boom associated with the 1950s and early 1960s was very apparent. The collapse in marriage probabilities after the mid-1960s was a highly prominent trend among black men. The suddenness of the decline is evident in the drop from about one in four single black men at 22 marrying in each year of the Vietnam era to a mere one in ten by the early 1970s. This perspective helps link the crisis of black teenage unemployment to the crisis of the black family.

The data for white females shown in figure 3 closely parallel those for white males, except for an earlier decline in marriage probabilities among younger women starting in the late 1950s. The peaks and troughs present in the male graph are once again present, but white females exhibited one markedly new tendency: the still-single "older" women lost considerably in marriageability under the baby boom marriage regime. The "sorting" of women into the marriageable and the future spinsters occurred early and vigorously in the baby boom, one of that period's experiential meanings. The minority that failed to "pass muster" at the age when most succeeded were as though stigmatized thereby. Men, too, married at somewhat more uniform ages during the baby boom, but this was far truer among white women. There are clear signs of a strikingly broader definition of "old maid" during the late 1940s and early 1950s. This pattern persisted to the mid-1970s: among white women, marriage was for the young only.

Nonwhite women, like their nonwhite male counterparts, experienced considerably more year-to-year fluctuation than did whites, owing presumably to their greater vulnerability to external circumstances. Early marriage disappeared far earlier and more dramatically than among white women. And, for black women, as for their men, the period since the late 1960s has been one of general deterioration of marriage possibilities.

Detailed New York State marriage-registration data allow us to explore the question of interannual variability in nuptiality, with close attention to age.[22] All age groups varied very considerably from year to year in the numbers who married, average annual variation exceeding 10 percent. There were, moreover, consistent differences by age—consistent also for both women and men—in the extent of this variability. Those who married older were more likely to postpone marriages in one year and, perhaps, to accelerate them the next, as circumstances became less or more propitious for marriage. Young people, it seemed, were relatively inflexible, perhaps pushed by inner urges that they had not learned to weigh against less intimate concerns, perhaps pushed by premarital conceptions.

When the observations are subdivided into the period extending from the 1920s to World War II and the period from that war to 1967 (when the data series ends), the variability of annual marriages is seen to be far greater in the latter period, despite the rigors of the Depression. More important yet, the relationship of age to interannual variability was present only in the later era. In earlier times, it seems, younger people had to exercise about the same degree of prudence that older people did. (Lower rates of premarital pregnancy, and hence of marriage promoted by necessity rather than by choice in the usual sense, probably also played something of a role.)

To a significant degree, the year-to-year variations in cohort first-marriage experience, as well as the pronounced secular trends, can be explained with reference to a number of measurable, intuitively comprehensible circumstances, both demographic and economic. These differ somewhat by gender, and from the prewar to the postwar period, but they basically tell a consistent story. Most of these circumstances influenced the initial nuptiality of the young most of all, for their resources were generally the least and their buffering from external circumstances the least substantial. Thus, in certain years, because of the volatility of fertility rates in this century and the tendency for men to marry women somewhat younger than themselves, men were occasionally in short supply, slightly accelerating their marriages while retarding those of young women. In like fashion, military draft calls have varied markedly; when they have been high, this has ordinarily been a signal to young people to marry sooner rather than later. Marriage has been a life experience that we consider highly consequential for the individuals directly involved, consequential as well (if less so) for the two families thereby joined, and consequential for subsequent demography and for the nature of aggregate demand for goods. The paths that have led two individuals to join together in marriage at given points in their own lives have been in a sense prepared by others, by institutions and by rules that the partners have played by in getting to the altar.

In years when jobs were relatively plentiful and wage income relatively high, marriages were solemnized rather than postponed. A concomitant of twentieth-century prosperity has been increasing job opportunities for women, and their work lives in one way encouraged, but in another way discouraged, nuptiality. On the one hand, women's premarital incomes have made possible earlier marriage: it is, indeed, the dynamic aspect of growing disposable income per capita that is most closely related to marriage trends. On the other hand, gainful employment—the possibility of a decent independent livelihood outside of marriage—has on the individual level tended to predispose women not to hasten into marriage. On balance, younger first marriages were promoted by women's gainful employment, while older ones were retarded.

These striking age-to-age differences in the determinants of marriage probabilities are strictly a product of the postwar period. In the prewar period, one single model seems to suffice for men and women of all ages—a rather simple economic model. The constantly shifting age relationships characteristic of the postwar period were not characteristic of the prewar period nor, arguably, of periods previous to that. The age-differentiation of the postwar period, and the great awareness of age-related phenomena, were a product of the way the environment impinged so variably on people of different ages. After World War II, young men but not older men were strongly influenced in their marriage decisions by the military draft and by higher current incomes per capita. For older men, the draft operated mildly to encourage marriage, while current income had no particular impact. Younger women, unaffected directly by the draft in their marriage behavior, were affected by current and relative income, as were the older men; for older women, these relationships were more or less the same, except that for them the draft encouraged marriage.

Parenthood

In the minds of most Americans for much of the twentieth century, marriage not followed by childbirth within a few years was somehow lacking, and partly for this reason the age probabilities of first childbirth—certainly for whites—have generally resembled those for first marriage, slightly lagged and at a somewhat lower level. Childbirth rates have shown more extreme downward cohort-to-cohort movement than rates of first marriage (notably in the 1909 and 1944 cohorts), a reflection of the far more irreversible economic impact of childbirth. However, marriage—and again this is understandable in commonsense terms—seems to have been sometimes but not always as upwardly "flexible" as first childbirth. This was so especially in the early and mid-twenties in pre-World War II cohorts. Such external events as economic downturns have affected not only the way couples passed through courtship into marriage but also the way they committed themselves to parenthood, once married. Military service and the circumstances surrounding the draft call seem not to have encouraged couples to become parents so much as they encouraged marriage.



Parenthood patterns for white and black women are substantially different variants of the same family. On the whole (the difference was especially apparent for the earliest cohorts but remained visible in later years), black women initially had a far more rapid pace of childbearing but trailed off very substantially by their mid-twenties. The tendency toward early childbearing among nonwhite women, perhaps a product of ignorance of and then disregard for birth control, fairly well characterized all the cohorts, from the earliest to the most recent—in contrast to the far more varied teenage-fertility patterns among the cohorts of white women. The slope representing movement into parenthood was generally steeper for white women than for black women, and there was for them no particular trend in this regard. White women were more nearly universally mothers by age 34 than were black women, a fact that was true for all cohorts examined here but especially true for the 1926 and 1936 birth cohorts, which participated in the extensive baby boom of the years following World War II. Whites participated more fully than nonwhites in this striking episode in the reordering of the life courses of young people.

Tsui's calculations of the average number of months elapsing between first marriage and first childbirth for white women still married offer a summary perspective on the relationship between the timing of marriage and the timing of parenthood.[23] She shows that the rapid decline in the average interval continued to the late 1950s, followed by a near plateau into the mid-1960s and then, in the late 1960s and early 1970s, by an increase even sharper than the earlier decline. The years of decline in the first-birth interval were years of especially steep decline in this measure for those marrying oldest. Marital careers had once been sharply defined by the point in the life course at which couples took the plunge into marriage, the cautious distinguished from the incautious. But by the post-World War II period, this was no longer so, for in the war years the average first-birth interval of those marrying under 25 actually increased, owing in part to the draft and other war-related inconveniences. For those who married at older ages in those years, however, presumably less often subject to the draft, jobs and prosperity allowed considerably earlier parenthood.



The sharp increase in the average pause between marriage and first childbirth in the 1965–1974 period again brought about differentiation by age at marriage, but with a new twist. In the new arrangement, those marrying youngest were anomalous, because some fairly substantial proportion of their marriages followed antenuptial conceptions. But for the other three age-at-marriage groups, the pace of first births on average now ran in the opposite direction to the youthfulness of marriages. Almost as though couples had, at marriage, both a distinct fertility target and an age beyond which childbirth was thought undesirable, and as though "child-free" years were now valued as a distinctive stage within marriage, it was now younger-marrying couples who postponed first births, while older-marrying couples did so only to a lesser extent, perhaps for fear that they might not attain their fertility targets or might overrun their preferred date for leaving the childbearing stage.

There are very few reliable data describing the differential pace of parenthood by the age at marriage of the partners for the earlier years of this century. One source, although only imperfectly comparable, is available in the retrospective fertility questions on the 1910, 1940, and 1950 censuses, and is displayed in table 5. When we tabulate for current ages of white and black women of under 20, 20–24, and 25–29,[24] we find that for marriages contracted late in the first decade of the twentieth century by native white women, there was a considerably higher propensity to become parents in the first few years of marriage than there would be for marriages contracted late in the Great Depression and even in the half-decade following the conclusion of World War II. These turn-of-the-century couples showed the same marked relationship that baby boom couples showed between youthful marriage and prompt parenthood. The data on both white and black women indicate, in short, that the Depression seems to have brought primarily a general slowing-down of patterns of transition from marriage into parenthood, although those who married older (and there were somewhat more of these) postponed parenthood a bit more commonly than those who married relatively young. After the Depression, however, the hesitancy shown by those black women who married older seems to have persisted far longer than it did for white women, whose passage to parenthood almost returned to its 1910 swiftness.[25] By the time of the baby boom, this racial distinctiveness was effaced. It seems to have taken nearly a decade after the war's end, however, for the most "prudent" among black women to abandon the highly cautious attitude toward parenthood that the Depression had instilled—considerably longer than it took white women.

Table 5. Proportion of Women Married 3–4 Years and in Intact Marriages Who Have Had a Child, by Race and Current Age, 1910, 1940, and 1950 (in percentages)

Age of first         White Women                  Black Women
marriage ****     1910*    1940*    1950**     1910     1940     1950***
20–24             86.2     74.7     79.1       79.0     61.6     75.9
25–29             76.7     59.2     73.6       67.6     45.6     58.0
30–34             68.3     51.1     59.2       62.0     39.2     50.9

    * Native white women only.
    ** All white women living with their first husbands.
    *** All nonwhite women living with their first husbands.
    **** The ages at marriage reflect current ages of 20–24, 25–29, and 30–34 at the census date and the 3–4 years since marriage.
    SOURCES: Calculated from Census 1940–3, tables 1, 2, 3, 4; Census 1950–4, tables 18, 19.

Figure 4 examines trends in immediate transitions from marriage into parenthood, those that occur between the ninth and twelfth months of marriage, "wedding-bed conceptions" and their near cousins. The graph suggests that this kind of unguided or eager initial approach to pregnancy early in marriage described a curvilinear pattern, rising from the depths among Depression marriages to maxima among marriages during or just after the baby boom. The turning away from early parenthood was considerably more abrupt than its earlier expansion. Throughout, those who married older were either better armed with knowledge of birth-control techniques or less eager to become parents right away, or both. And, throughout, black women who entered marriage without being pregnant were slower thereafter to become pregnant than their white counterparts.


Figure 4. Trends in Transition from Marriage into Parenthood between the Ninth and Twelfth Month of Marriage


Figure 5. First-Birth Probabilities of White Women between Successive Anniversaries

When we examine the first-birth probabilities of women between successive anniversaries of their marriages, the historical patterns speak even more eloquently of the short-term malleability of the youthful life course. The 1975 retrospective data allow us to examine marriage cohorts from the early Depression on. To simplify, figure 5 shows this story only for white women and only for three marriage cohorts, with separate tabulations for those who married "young" (under 22) and those who married "old" (22 or older). Several generalizations obtain:

• The transition to parenthood is more likely in the second year of marriage than in subsequent years. The reason for this was rarely, if ever, biological. Rather, the relationship is to be explained by the unique standing of motherhood among women who had married younger, strongly influenced by the cultural prescription of marriage and parenthood, before they had a chance to develop other interests and roles.[26]

• Couples who married younger were considerably more likely to become parents promptly, but this difference waned over time.

• After the second year, however, the rate of decline in the likelihood of a first birth is slight: once couples had made a "decision" to become parents promptly or not, normative scheduling became less pressing on them. A marriage cohort moved gradually into parenthood for idiosyncratic or circumstantial reasons. (The baby boom marriage cohort was something of an exception here—for reasons I will consider below.)

Beyond these generalizations, the most striking phenomena that figure 5 indicates are two differences between the baby boom marriage cohort and those earlier and subsequent. Most evident, of course, is the increased probability of parenthood at almost every point in the marital career. But equally significant was the extent to which the third year of marriage continued to be one of relatively eager motherhood for brides of this cohort. Indeed, we can fairly say that for the baby boom marriage cohort alone, prompt parenthood was a transition that was virtually enjoined on them. Most couples' effort to accomplish this in the first few years of marriage left only normatively or biologically "deficient" couples still childless at their fourth wedding anniversaries. As we shall shortly see, later in this chapter and elsewhere, as the baby boom faded, so did the normatively promoted connection between parenthood and a lasting marriage. That is, the timing of parenthood after marriage became more volitional, just as, eventually, would the sequencing of the two events.

Figure 5 is particularly eloquent in this regard in its representation of the 1970 marriage cohort. For both younger and older women who married then, the pace of the transition into parenthood was not only slower but also less differentiated over the years that followed marriage—the exact obverse of the baby boom. Correspondingly, the age at marriage mattered less to the pace of transition into parenthood than it once had. The disappearance of this formerly pronounced pattern could possibly be explained by a spread of contraceptive knowledge and capacity, and no doubt this played a part. But a normative explanation also deserves considerable weight: both in the 1930s and in the 1950s, marriage was essentially supposed to be followed by prompt parenthood, the marriage being understood as completed, perfected, by the arrival of a child. But by the 1970s, with new definitions of marriage abroad and divorce considerably more prominent, both in a statistical sense and as an understood and frequent contingency of marriage, couples moved into parenthood both more slowly and more evenly.

Black women's patterns, not shown here, differed from those of white women in some significant ways. If black women who married young were considerably more likely than their white counterparts to become pregnant before marriage, as we have earlier seen, they were and remained considerably less likely to become mothers at later points in their marriages. Black women who married later were, as has already been suggested, far less likely to become mothers than were their white counterparts. In these respects, the Depression cohort was particularly affected among blacks. And for older black women, there was no "baby boom" to speak of.

Taking together the seeming reluctance of childless married black women to become mothers, the considerably greater black premarital conception rate, and the far more accelerated youthful fertility rate for all black women than for whites, we can see that even in the earliest period this book treats, the entry into adulthood for black women differed from that of white women, both in timing and in sequencing. Indeed, throughout the period discussed here, blacks differed so markedly in the way they typically constructed their life courses that it at least suggests a subculturally distinctive (but no less dynamically changing, and in some of the same ways as for whites) normative structure. (Such a subculture would clearly not necessarily encompass the entire black community, but it surely must incorporate substantial numbers.) No less, however, we must recognize how much more subject to circumstantial pressures blacks have been as they have negotiated their own life courses. The flexibility of the black family structure has often been remarked upon, with differing evaluations. Their life courses, too, were flexible. Some reflection will suggest that these two observations are in fact related.

Divorce

Marriage and the onset of coital activity have become less closely interconnected with time, as have also marriage and parenthood. These trends are intuitively consistent with the well-known rise in divorce, which implies that whereas formerly for most people (in contemplating marriage, and in enacting it) one's first marriage was one's final marriage, this has become decreasingly true. Divorce, moreover, has changed its relationship to the life course, notably, its relationship to the age pattern at which marriage was initially contracted.

The general upward tendency of divorce during much of the twentieth century was not steady but included lengthy periods of approximate plateau in the rate, following the rather sudden upturns that account for much of the secular upward trend. World War I marriages brought such a sharp increase, which was followed by a plateau that lasted until about the middle of the Great Depression, at which time increasing numbers of marriages begun incautiously in the face of the highly challenging material circumstances earlier in that decade began to fail. On top of this rising rate, failed World War II marriages pushed the divorce rate to a new peak in 1946. But at this point the rate declined, rather persistently, reaching a low in 1958 that resembled the level of divorce in 1940.[27] The decade and more of steady decline was then replaced by an initially slow increase in divorce that picked up headway around 1963 and accelerated into the mid-1970s. Black marriages, initially hugely more prone to divorce than those of whites, were less subject to the upward trend. Glenn and Supanic note that "every recent study that has focused on black-white differences in divorce (or divorce and separation) in the United States has found more marital dissolution among the blacks." In their own sample, from the early 1970s, they discovered that "the correlates of divorce/separation" of the two races "seem to differ in important ways" and could not be reduced substantially by racial differences in demographic or socioeconomic characteristics.[28]

Preston and McDonald partition change in marriage-cohort divorce rates into a trend component and episodic deviations therefrom.[29] They concede that the pronounced underlying upward trend in divorce rates is not amenable to the kind of explanation they offer for deviations from it. Upward deviations, they discover, are in part products of destabilizing circumstances surrounding the couple at the initial solemnization of the marital tie: marriages contracted in depression or war have been prone to impermanence. Stress may have made it hard for the couple to establish a satisfactory relationship during the early days of the marriage, or material resources or social support may have been lacking. When the wars ended, when the depressions passed, the upward deviations in divorce rates receded. This argument accords well with the interpretive summary of Glenn and Supanic: the circumstances conducing, in the early 1970s, to relatively unstable marriage—residence in mobile regions and communities, absence of religious affiliation and of attendance at services—were those that suggest relatively little integration into social groups that might help to hold a marriage together.[30]

Through the whole period examined in this book, with a slight moderation for wartime marriages, marriages contracted young were considerably more likely to eventuate in divorce.[31] Early on, the fragility of marriages contracted relatively young was recognized, documented in empirical investigation, and warned against. For both whites and blacks, men and women, younger marriages were considerably more prone to divorce than were others, and it is hard to discern any trend in this relationship. Glenn and Supanic established that in the 1970s, among a list of nine aspects of background, circumstance of marriage, and current circumstance, age at first marriage was by far the most important predictor of divorce for women and one of two overwhelmingly important predictors (along with religious commitment) for men.[32] Marriages entered into at a young age, on average, undergo relative economic privation early on, just as their patterns are being established; persons less "mature" are presumably less able to recognize what a lasting marital relationship will require; and, insofar as marriages contracted "too" young are deemed ill-advised, social support for such marriages is compromised by doubts.

Recent national data[33] show more precisely wherein the stigma of prior divorce has lessened as divorce has become progressively incorporated in the American marriage system.[34] Thus, in 1961, 3.5 times as many divorced men who remarried at 25–34 married women who had previously been divorced as did single men marrying at 25–34. For remarrying divorcees, the ratio was a bit lower, 2.9 times as many marrying divorced men. So, quite clearly, "endogamy" among the divorced was the pattern in 1961. And so it was also by 1978 (the most recent data available), but by that date the ratio for remarrying men at this age had dropped to 2.8. For women, it had dropped to 2.4. This is rapid change. Acceptance of the legitimacy of divorce as a necessary component of a marriage system incorporating large volitional elements has encouraged acceptance of divorced people as legitimate—if not quite yet fully fledged—reentrants to the not-yet-quite-integrated marriage pool.[35]

Over this period of time, how long it ordinarily took failing marriages to fail changed very little, even though the proportion of couples choosing to terminate their marriage has risen greatly. Typically, six or seven years has been the median duration of a first marriage that ends in divorce, while three to thirteen years' duration encompasses the central 50 percent of such marriages.[36] The 1920s saw a slight lengthening of the median duration of marriages terminated by divorce, but by 1948, the average was below what it had been at the beginning of the 1920s. The median length of marriage terminated by divorce climbed again, peaking in 1963 at 7.5 years (the highest recorded figure since the early years of the century) before gradually moving downward during the divorce "boom" that ensued at about this time.[37] This stability, however, disguises an important trend that is quite relevant to the life course: divorces of childless couples have come more quickly than they used to, while couples with children who eventually elected to divorce have waited somewhat longer to do so.

Even while the overall divorce rate held fairly steady over much of the first nearly fifty years we are examining, there was a change in the life course location of divorce that foreshadowed the large change in the incidence of divorce that was shortly to come. Gradually, parenthood became less of a hindrance to divorce (while still remaining a substantial hindrance, to be sure). The statistics indicate directly the proportion of divorces that are of couples who have children, leaving us to infer from this the divorce-proneness of married couples with and without children. Only couples with children are at risk of divorcing after having had children, after all, and those who have become parents are no longer at risk of divorcing without having any children. We must therefore gather some estimate of trends in the proportion of all married couples who had children of their own under 18 years old living in their families, so as to compare it at least roughly to the trends in proportions of divorces that are of parents.[38] Such data became available only in 1940, but what they reveal makes interpretation easy: proportions of currently married couples who were parents varied only slightly, changes in age structure offsetting changes in the timing of fertility.

Divorce registration was spotty and irregular until recently, and the patterns we discern there may be inaccurate in detail. But it is the gross patterns that are relevant to the case at hand. In 1916, three in eight couples divorcing reported that they had one or more children.[39] By 1922, this proportion had dropped slightly, one in three couples who were divorced in the year reporting that they had a child. But during the decade of the 1920s and into the 1930s (when the data series was dropped), this proportion rose gradually to just slightly above the point it had been in 1916.[40] When the data series picked up again in 1950, the proportion of divorcing couples who had children was up to 46 percent. By about 1957, the proportion crossed the 50 percent mark and continued to rise, passing the 60 percent level by 1963. And at roughly this point, before divorce rates per se rose at a pace and to a level that tended to suggest even to dispassionate observers a change in the nature of the institution of marriage, the level stuck. In 1975, the proportion of divorcing couples who were parents was 59 percent.[41] The trends, then, indicate an increasing integration between parenthood and a commitment to permanent marriage, at least between the 1930s and the mid-1960s. During this period, there was no lasting pattern of childlessness among couples, but there was a decided increase in the proportion of divorces granted to parents. After this period, the same trend in effect continued, but with a different surface manifestation, as the proportion of divorcing couples who were parents held steady even as the proportion of parents among all couples at risk of divorce declined rather dramatically. Thus, in 1930, between about 38 and 40 percent of all couples divorcing were parents,[42] while 59 percent of all married couples in that year were.[43] Thus (ignoring age), couples with children divorced at perhaps two-thirds the rate they would have had children posed no hindrance. In 1975, however, 59 percent of divorcing couples had children, while 54 percent of all married couples—a smaller proportion—had children. Once again ignoring age, the presence of children now acted almost as though it were no hindrance to divorce.
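The rough comparison above can be made explicit with a small calculation. The sketch below is purely illustrative (the function name is mine, not the author's); it uses the approximate figures quoted in the text and, as the author notes, ignores age structure. It compares parents' share of divorces with their share of all married couples:

```python
# Illustrative only: formalizes the text's rough comparison of parents'
# share of divorcing couples with their share of all married couples.

def representation_index(divorce_share, couple_share):
    """How over- or under-represented parents are among divorcing couples,
    relative to their share of all married couples (1.0 = parity)."""
    return divorce_share / couple_share

# 1930: roughly 39% of divorcing couples were parents; 59% of all married
# couples were.  Parents appear among divorces at about two-thirds of parity.
index_1930 = representation_index(0.39, 0.59)

# 1975: 59% of divorcing couples were parents; 54% of all married couples
# were.  Children now act almost as though they were no hindrance at all.
index_1975 = representation_index(0.59, 0.54)

print(round(index_1930, 2), round(index_1975, 2))  # 0.66 1.09
```

The index is deliberately crude, matching the text's own "ignoring age" caveat: it says nothing about which couples divorce, only how far parents' presence among divorces falls short of (or exceeds) their presence among all couples.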

Table 6, based on retrospective data, provides some historical depth. It shows the disposition of marriages by divorce in succeeding decades, examining the racially varying propensity at different stages of existing marriages for divorce to be constrained by the presence of children. Thus, we see that on the whole, parenthood has militated against divorce, at least in the aggregate, but that this relationship is relatively weak for both whites and blacks in the first two years of marriage and then, again, fades somewhat. More striking are the racial differentials: historically, and especially more recently, black divorce rates have exceeded white rates by more for the childless than for parents, suggesting real racial differences in the way this aspect of the life course has been constructed. Black divorce rates among parents considerably exceeded those for whites in marriages in existence in the 1940s, almost but not quite as much as black rates exceeded white rates where no children were involved. But increasingly, the presence of a child has seemingly brought a constraint on black couples which cut down considerably on their readiness (or ability) to seek divorce. By contrast, among the childless, blacks' rates of divorce have grown to exceed whites' by even more over the period to which the data pertain. Interpretation is probably premature, but it is hard not to perceive in this weave of patterns the continued existence of a "conventional" life course pattern, more characteristic of whites than blacks, of women who marry older rather than younger, in which divorce is still seen to violate the most approved family-building patterns. But, as demonstrated above, the trend data point clearly to the declining relevance of this pattern.

Table 6. Average Annual Proportion of Women's Marriages Ending in Divorce,
by Year of Divorce, Whether Wife Has Living Child, and Race (per 1,000)

                                        Whites             Blacks
Divorces during                      Parent    Not      Parent    Not
1st–2nd year of marriage
    1940–49                            16      16         31      25
    1950–59                            12      12         18      18
    1960–66                            13      25         10      15
3rd–5th year of marriage
    1940–49                            12      10         25      18
    1950–59                            17      13         24      17
    1960–66                            16      15         29      13
6th–10th year of marriage
    1940–49                            16       9         15      18
    1950–59                             9       8         17      19
    1960–66                            13       9         24      13
11th–15th year of marriage
    1940–49                             9       7         11      14
    1950–59                            12       6         25       9
    1960–66                            10       8         12      14

SOURCE: Calculated from Census CPS P20-223, table 4.



We will return later to the notion that blacks have increasingly evolved a variant, and distinctive, life course, one that characterizes some but not all of that race and very few outside it.

Divorce, in this sense, had worked its way fully into the family-building sequence and was now a life course event that could "legitimately" take place indifferently before or after parenthood. While this trend seems certain, the story is probably even more complicated, depending on prior aspects of the life course of the members of the couple. Thus, Moore and Waite, employing longitudinal data referring to 1968 through 1972, show that the impact of children on marriages then varied according to both wives' age at marriage and the interval before parenthood. They show, further, that the circumstances—or the rules—of the life courses of black and white people were sufficiently differentiated at this date that some patterns varied markedly by race.[44] They argue that, contrary to what they had anticipated, "early childbearing does not seem to increase the probability of marital break-up among whites, quite the opposite." But, for blacks, "there is a strong association between teenage childbearing and marriage breakup" which is not simply a product of the also-significant relationship of youthful marriage and subsequent divorce, which obtains for blacks as for whites.[45] They further show that while the "presence of children under three years of age has no influence on marital stability among brides 18 or younger and significantly increases the likelihood of dissolution for those 19–20 at first marriage, . . . [it] has a significant inhibiting effect among those who delayed first marriage until at least 21."[46]

Conclusion

Divorce, the usually voluntary ending of one marriage as often as not followed by entry into another, is an appropriate place to end this introductory chapter on trends. Nothing so well as the periodic alternation of spectacular upward lunges and level plateaus of divorce rates suggests the sometimes dramatic ways in which the aggregated choices—constrained, often anguished choices, but choices nevertheless—have modified the normative expectations surrounding the youthful life course. No less, the new incorporation of divorce into the understanding of marriage, together with the greatly enlarged prevalence and widespread expectation of coital experimentation before marriage, points to the way in which the meanings of phases of life have changed in the current century.

And yet, as we saw, the ritual of the wedding has become, if anything, increasingly encrusted by "tradition." If individual volition is more commonly reflected in the circumstances in which marriage is undertaken and exited, this does not point to a loss of the cultural or institutional importance of marriage in the life course. It may, as Kohli has suggested, point to an enlarged pressure on increasingly self-aware individuals to harmonize what they understand to be what they themselves want with what they understand to be the social prescription for people like themselves. This paradox may well lie at the base of the current interest in life schedules to which this book, among others, responds.



3—
Modern Youth: The 1920s

Cultural Innovation

That the 1920s was not the decade of unalloyed prosperity that myth proposes should not blind us to the pervasiveness of cultural themes reflecting a sense of growing plenty, themes that fed and were fed by a focus on the post-Victorian sense of individualized satisfaction. If some elements of the consumer economy lagged, many evolved rapidly.[1] New expectations and elements of a revised organization of the youthful life course, ineluctably intertwined, emerged from an enlarging and increasingly self-confident middle class.

In the 1920s, the United States moved a long way toward reducing the enormous heterogeneity that had been created by the headlong development of an urban nation and had for half a century focused the nation's cultural and political energies. Immigration, to take the most obvious example, severely constricted by the Great War, was now sharply reduced by statute. The proportions of the foreign-born who were passing through their childbearing years declined sharply, not compensated for by a comparable increase in second-generation "ethnic" youth. Emigration and death, in addition to cultural adaptation, both consciously foisted and unplanned, worked their effects on the foreign-born community. As the "second generation" of the last great stream of immigration grew up in the 1920s, the once-acute sense that heterogeneity was a "problem" for American democracy began to fade. Evidences of ethnic and cultural discontinuities, to be sure, were still numerous; but a sense of gradual assimilation had begun to overwhelm more conflictual imagery.

The twenties began with a sharp depression, as overoptimistic entrepreneurs failed to anticipate the degree of retardation the changeover to a peacetime economy would entail. By 1922, however, the economy had largely recovered and in short order had absorbed the substantial enlargement in production capacity that war mobilization had produced. With productive capacity so near to the ability to consume (given the current organization of demand and wants), the precise recognition and imaginative reformation of demand, and those financial and informational services that subserved the closer coordination of activity, became crucial economic skills. Among the rewards of economic growth for many, then, was accession to white-collar jobs that offered shorter hours, cleaner conditions, and prestige.[2]

When Robert Lynd and Helen Merrill Lynd traveled to "Middletown" (Muncie, Indiana) in 1923 to discover how the twentieth century had changed the now-booming American industrial heartland, changes in women's lives seemed to the Lynds to lie at the center of what felt new and different. Wives were more able than before to support themselves and better informed about sex and contraception; but the breadwinner/homemaker dichotomy had remained firm. Despite the absence of any sign of change in the ideologies in which this distinction was embedded, behavior had changed in response to the pervasive seeking after material well-being for oneself and one's family and the increasingly favorable evaluation of such motives. As new wants—for an automobile, for a tract house, for commercialized leisure—came to motivate family getting and spending, the coordination of daily life in the family changed, as did the family bonds that subtly drew on these patterns.[3]

Even economically comfortable families seemed to the Lynds less able than before to derive unquestioned satisfaction from "the plans for today and tomorrow, the pleasures of this half-hour."[4] The heroic economic performance of goods in the household appliance, health, cleanliness, beauty, recreation, and entertainment categories during the 1920s points out dramatically the newness of the consumer preference schedule that was emerging. Otis Pease plausibly finds a new cultural theme in the "conspicuous preoccupation with leisure and the enjoyment of consumption. . . . Leisure to consume and to enjoy material goods was an effective guarantee of happiness. . . . [Advertising copywriters] looked on themselves, in effect, as crusaders for the liberation of a middle-class people from the tyranny of Puritanism, parsimoniousness, and material asceticism."[5] The expansion of national, branded products presenting their case in national periodicals led to advertising campaigns of great skill and impact. The world of objects and possessions burned brightly in many of these 1920s publications, with new graphic techniques complementing a profound shift in advertising philosophy—toward evoking a favorable aura that might be associated with the product.[6] A new standard of living was being defined, whether one thinks of advertising as manipulative or as merely educating the values inherent in the new goods being distributed. "Consumer durables" came to occupy a far larger corner of the daily routine, setting a kind of standard of interest and excitement in acquisition. "There is probably today a greater variation from house to house in the actual inventory list of family possessions and of activities by family members than at any previous era in man's history. The consumer's problem is one of selection to a degree never before known."[7]

By the 1920s, Americans lived with an internal monologue about the short-run satisfaction of wishes. To be sure, there were voices that upheld self-denial for its own sake, but to young people, these voices increasingly sounded anachronistic.[8] The psychology of advertising at this time recognized just this and identified the sexual as prominent among these wishes. "Advertisers should realize that appeals to the physical aspect of the sex instinct will get attention without question but will lead only to such action as is in accord with man's selfish wants. It is only when the psychological aspect is aroused that man wants to do something for his wife, sweetheart, mother or sister."[9] Among the goods merchandised so successfully by the new methods were clothing, accessories, and toilet goods, all depending on links to sexual expressiveness. American young men and women had, of course, long prepared themselves for one another's eyes, but the new emphasis on "aura" had a profound effect on many who beheld them, proposing legitimacy and propriety for an open and unashamed focus on self-presentation.

Contemporaries were momentarily exercised over a rapid expansion of consumer debt. So startling was this development that two-thirds of a sample of Oregon credit buyers raised moralistic objections to such borrowing against the future.[10] Within the decade, consumer borrowing had become pervasive and was understood as a normal, neutral way of increasing purchasing power.[11]

For many youth of the middle and more prosperous working classes, the material prosperity of the period meant they grew up with access to a family car, the enlarged range of casual social intercourse offered by the telephone, and the beginning of a "by rights" claim to discretionary spending within the family budget.[12] A parents' group secretary's report reflects nicely both the misgivings engendered in the older generation and their resolution of the matter.

In the young days of many of us, clothes meant quality, but that's not so now-a-days—at least the main thing is style, and continuous change in terms of color and style. Quality doesn't count for so much. Clothes are cheaper too. But with our early induced feeling for quality we can't understand this constant buying of cheaper clothes, and think it wasteful. . . . We need to reorganize our thinking. Why not more dresses as an expression of the individual?[13]

Childbearing and even child rearing were postponed by some women who now could work gainfully, achieving a range of material comfort so that family life could embody the new sense of domesticity. As before, "child-bearing is . . . to Middletown a moral obligation. Indeed, in this urban life of alluring alternative choices, . . . there is perhaps a more self-conscious weighting of the question with moral emphasis." When Middletowners reduced their fertility, they only shifted their emphasis "somewhat from child-bearing to child-rearing." By this point in a marriage, "in general, a high degree of companionship [between marriage partners] is not regarded as essential for marriage," although hopes for a lifetime of "being in love" were seemingly on the rise.[14] Divorce became more imaginable for women, and this now began to be a consideration in their initial choice of marriage partners and in marriage timing. "Apparently this growing flexibility in attitude toward the marriage institution reacts back upon itself; one factor in the increasing frequency of divorce is probably the growing habituation to it."[15] It was not so much that marriages were less successful than before as that people were prepared—and women, with more gainful employment open to them, more able—to sever ties that had not proven satisfying. When the Lynds revisited Middletown in 1935, they noted "a growing belief" among youth "that marriage need not be final since divorce is no longer a serious disgrace."[16]

In Middletown, marriage age continued to move downward, but not because unions had become more impetuous. Rather, young people responded to the ability to postpone fertility, the availability of remunerative work for wives, and the replacement of communitywide socializing by an increasingly privatized life.[17] Weddings in Middletown were now often only a "brief ceremonial exchange of verbal pledges," but bride and groom were by convention and generally in fact linked by being "in love."[18] In a sense, their courtship was now the better suited to exactly that value. "Sexually, their awareness of their maturity is augmented by the maturity of their social rituals," which still went on with subtle parental guidance. The Lynds' account suggests that a rather callous approach to the opposite sex was gradually being replaced by a deeper way of knowing, accomplished perhaps under the influence of the "personal intimacy" now permissible.[19]

The 1910s and early 1920s were characterized by a "dance craze," which contributed to a new definition of appropriate heterosexual relationships among young people. Before the 1910s, open-admission dances had most characteristically been held by ethnic, neighborhood, and other established social organizations, largely catering to their own members, who, knowing one another, restrained one another's tendencies to overstep moral rules. But in 1911, a dance "palace" was opened in New York City, an arrangement that spread rapidly, especially in the early 1920s.[20]

The dance halls dazzled and featured lively jazz; the surroundings and the music (sometimes aided by liquor) encouraged easy and spontaneous contact between unacquainted or slightly acquainted members of the opposite sex. The music was sensual, offering a rhythm to which two bodies moved smoothly, together.

Into these halls come many types seeking many ends. There are those fascinated by the promise of a thrill, college boys whose purpose is to "sow wild oats," high school girls and boys in search of sophistication, the repressed and inhibited in conventional grundies, and frustrated women who seek Bohemianism. The majority, however, are not cases requiring social therapy. They find here a means of social contact. Here they may mingle freely with others in an emotionally charged atmosphere.[21]

The new dances of the era (sometimes called "tough dancing") were less formalized in the steps they demanded of participants, correspondingly offering room for expressiveness of body movement, in keeping with the jazz-derived rhythms that underlay them. "The dances fostered an unheard-of casualness between partners, permitted greater options in holds and distances, and symbolized the high value placed on mutual heterosexual intimacy and attraction." Working-class boys and girls ordinarily came separately to the dance hall, seeking out partners during the evening. But middle-class youth drawn into this world typically arrived at the dance palace in boy-girl couples, their dance-floor intimacy part of a longer-term "career" as a couple.[22]

Identifying the openness of sensual expression as the common element, an authority on adolescence explained the rapid spreading of petting among young people by the overt public acceptance of the new dances. "Some of the modern dances and petting are parallel forms of excitement and experience in sexual affairs."[23] Despite the obvious risks, high schools were quick to institute dances, in an effort of varying success to take the play away from commercial dance halls and roadhouses.[24] The meaning of dancing was, in the main, recreation and structured sociability, but inherently suggestive and potentially explosive. Boys, as a group, found it in their interest to press dancing in a sexual direction, which suited girls' purposes insofar as the dancing also served as a declaration of generational freedom; but for them, the sexualization of dancing also inched the terms of the boy-girl negotiation that much closer to "going too far," at which point they had more to lose.

The morally innovative meaning of the movies was also apparent to adolescents.

It is important to consider that the movies do not come merely as a film that is thrown on a screen; their witnessing is an experience which is undergone in a very complex setting. There is the darkened theater . . . ; there is the music which is capable not merely of being suggestive and in some degree interpretive of the film but is also designed to raise the pitch of excitement, to facilitate shock and to heighten the emotional effect of the picture; there are the furnishings—sometimes gaudy and gorgeous, which help to tone the experience.[25]

Between 1921 and 1930, average weekly attendance at motion pictures increased rapidly. A weekly movie habit, or more, was typical of unmarried youth, who characteristically attended with age-peers, except among those exceptional boys and girls whose "moral habits" were considered especially "high" by adult standards.[26] Even in rural areas—at least those that were neither geographically isolated nor poverty-stricken—teenage patterns of recreation were no less transformed by this form of commercial entertainment.[27] Films were very special events for their new fans in the 1920s, if hardly rare ones, and in their content no less than in the emotional "tone" their purveyance suggested a larger world of possibility to their viewers.

A movie is judged by the thrill it produces. . . . The scenes which make the greatest appeal to the boys are usually those which satisfy some desire which is in them. The scenes which appeal most to the girls are those which correspond but apparently do not satisfy some desire they have. The boys seem to be content with the things as they see them on the screen while the girls only long for the things that they see there.[28]

Mary Pickford and Douglas Fairbanks, by easy stages, and then the morally ambivalent dramas of Cecil B. DeMille led Americans to accept the body as a legitimate locus of pleasure, and gratification as a worthwhile and even necessary aspect of marriage, replacing the compartmentalization of sex implied by the risqué short films characteristic of the previous era of the American industry.[29]

One student of contemporary motion pictures counted an average of five and one-half love scenes per film. In love films in which the circumstances surrounding love could be determined, just under half the occasions of initial love were love at first sight, and more than half the remaining occasions were at second or third sight. The formal needs of movie dramaturgy explained much of this, but there was a strong cumulative impact on young people's sense of the timing and sequencing of the emotional structure of the life course. Among the "goals of the leading characters" of 115 motion pictures studied, "winning another's love" was a clear goal in 70 percent. "Marriage for love" was present in 36 percent of the movies, "illicit love" in 19 percent. These three motives alone accounted for 45 percent of all goals detectable.[30] For the more impressionable young viewers, the films provided explicit content for sex fantasies and instruction in lovemaking techniques.[31] True Confessions, in working out its formula for morally subversive moralizing, moved from literally true confessions through a slick presentation of movie stars' glamorous, too-worldly lives, to a mixture of mythic confession and movie-star revelation.

If sexuality was purveyed commercially to youth in the 1920s in deliciously small doses, so also, in what was deemed to be denatured form, it was increasingly supplied gratis by adult authorities. Right education about sexuality was an occasion for progressive educators to address children on a subject not perhaps of their own choice but, under the circumstances, compelling.[32] The sex-education movement, in fact, had grown from the successful suppression of officially—or unofficially—tolerated segregated urban vice districts during World War I. Ironically, the movement thus triumphant had ramified considerably beyond the suppression of prostitution, shattering in the process "the conspiracy of silence" about sexuality that had coexisted with tolerated but circumscribed vice.[33] To banish vice, vice must be discussed, and thereby sexuality must be discussed, as a question of policy. The young conscripts of World War I received a broadened range of official information regarding venereal diseases. Volunteer social hygienists told them yet more, placing their sexual drives into a larger context that moralized them but stopped short of the repressive levels of the prewar period. As tolerated prostitution was vanquished, the "social purity" movement expanded its concerns to include venereal disease and, thereby, sex education. At this point, a long-lasting alliance was struck up with those promoting the notion of eugenic contraception, who sought to diffuse the motive and means of family limitation among the immigrant and working-class population.

By the 1920s, sex educationists were increasingly eager to free their subject from a narrow focus on the biological aspects of sexuality and to incorporate larger psychological and guidance components.[34] Sexuality and the study of sexuality became a national fascination, as science and scandal.[35] In the early 1920s, a large sample of junior high students was asked their opinion of taking a "course dealing with marriage, home and parenthood." At this time, only 46 percent of the girls and 41 percent of the boys who offered an opinion supported the courses—and a majority of the boys and one in three girls ventured no opinion at all. But by the mid-1930s, seven in ten Maryland 16-year-olds believed that the schools should incorporate sex education, one-quarter of these feeling that elementary school was the appropriate level. (Girls were slightly more in favor of sex education, especially early, than boys.) The only remaining pockets of opposition were among those youth who had dropped out of school at an early age.[36] One would not call the viewpoint of this movement "modern" today, certainly not in the sense of an explicit embracing of sexuality as a good. But as a public movement, as a group of respectable propagandists with scientific and religious authority to speak publicly on issues that for some time had been hushed, it certainly was a "modernizing" movement.[37]

Before the end of the 1920s, it had become conventional wisdom in substantial segments of the population that adult sexual expression was not merely permissible but "a duty toward one's 'mental health' or 'whole personality,'" one that had a "pivotal place . . . in marriage."[38] Thus, the U.S. Children's Bureau cautioned parents that in responding to their developing children's questions about sex, "no attempt should be made to bolster up good, sound advice with statements of dangers which, in the first place, may not exist and, in the second place, serve no other purpose than the creation of unreasonable fears at the time and may well become handicaps to him later in life."[39] By 1941, the American Association of School Administrators would seek to assume a moral entrepreneurship in the now thoroughly acceptable field by urging "that the school offer leadership to the entire community, and especially to parents, on problems of marriage and parenthood."[40] A systematic study of the results of a "personal improvement" curriculum offered within Home Economics in a Pittsburgh high school in the mid-1930s provides evidence that the curriculum was effective in promoting girls' "social skills," adding to the like impact of simply growing older. At the same time, the special curriculum contributed most to the social skills of daughters of middle-class parents. Further, self-perception of social competence proved to be largely independent of social skills, while exposure to the special curriculum seems to have increased working-class girls' self-consciousness about their own social failings even as their objective social skills increased.[41]

The High School and the Transition to Adulthood

In a setting of increasing disposable wealth, decreasing population heterogeneity, and an enlarged emphasis on the individual's ability to choose his or her own way of life, the idea of universal high school "took" as a broad-gauge instrument of socialization. The schooling explosion of the 1920s was characterized by a greater extension of schooling among most of those groups previously least exposed to schooling (and especially in urban places, where high schooling was already relatively prevalent): an educated population was presented as no less a public good than a private one.[42] To be sure, even at the end of the 1920s, native whites of native parentage received more schooling than nonwhites and than the foreign-born and their children, but the differences had narrowed significantly. Table 7 suggests the pace and location of high school attendance by contrasting the schooling of the birth cohorts who were of high school age in the late 1910s and the late 1920s, subdivided by race, sex, and the degree of urbanization of the state in which they lived. In the decade, high school experience (about twice as common as high school graduation) became modal for whites in all categories, especially (and most rapidly) in the most urbanized states. And in the most urbanized states, too, high schools began to reach a majority of black youth. Starting considerably below females in both high school attendance and graduation, males caught up slightly during the decade, but only in the more urbanized states. In those places, as well, the rate for nonwhites grew the most rapidly and converged the most rapidly on the rate for whites.

Table 7. Proportion Having Completed at Least One Year of High School,
Cohorts of High School Age around 1918 and 1928, by Sex, Race, and Urban
Proportion of Population in State of Residence (in percentages)

                  Most Rural    Rather Rural   Middling      Rather Urban   Most Urban
                  Male  Female  Male  Female   Male  Female  Male  Female   Male  Female
White     1918    38.7  47.0    43.7  52.2     44.8  52.6    37.3  52.0     46.7  48.5
          1928    51.4  59.7    55.7  63.7     64.2  70.9    66.9  69.5     64.6  65.1
Nonwhite  1918     8.2  12.7    10.0  12.4     21.3  28.9    22.5  27.8     25.4  28.4
          1928    13.0  20.8    16.5  24.9     36.9  45.5    42.7  48.1     40.4  45.3

SOURCE: Census—Population Trends, table 6.
NOTE: Proportion urban in state calculated as of 1960. Urban/rural differentials will be somewhat exaggerated by differential migration of the more educated to more-urban areas between 1918–1926 and 1960. More-urban states in 1960, however, by and large will have also been more urban in 1918–1926.

The enlargement of the high school experience proved to be of particular importance to young women, both because they found in the high school an especially consequential social setting and because they learned there employable skills that were to be useful immediately and were to draw them back into the labor force in later decades. As Claudia Goldin's fine cohort analysis of women's work patterns shows,

although change in the labor force participation rates of married women did accelerate [only] after World War II, many of the preconditions for the expansion had been set decades before. . . . New social norms of the 1920s may have influenced the decisions of many young women to delay leaving the labor force until their first pregnancy, rather than with marriage. . . . This change may have . . . provided that critical break on which future change was founded.[43]

Each year in the decade saw greater numbers of both boys and girls graduating from high school. Between 1922 and 1924 there were annual increases of no less than 2.5 percent in the proportion of all 17-year-olds graduating.[44] The reduction of child labor in good part preceded the school increase, especially at the youngest ages—through 15—for both boys and girls. In Philadelphia, the proportion of white boys and girls out of school and in gainful employment at 15 declined from nearly half in 1915 to one in six in 1925, at which point it leveled off. The proportion of white boys and girls out of school without work, although far lower, dropped even more precipitously. Black boys and girls, far less likely to be gainfully employed but somewhat more likely to be at home without work, showed parallel trends. Both public schooling and Philadelphia's strong parochial school system eagerly took up those who moved out of the labor force. The school trend slowed considerably in Philadelphia in the second half of the decade, but pushed up again in the Depression, never to be reversed thereafter, except briefly during World War II. In 1932, only about 5 percent of Philadelphia's 15-year-olds were out of school.[45] In Pittsburgh, the numbers of boys at work at age 14 declined by two-thirds between 1923 and 1929, while the numbers of boys still enrolled at school at 15 increased by half. Girls had previously been less given to work, more given to school. In the seven-year period, enrollment gained and work declined by 40 percent. The distribution of what jobs there were shifted sharply away from adultlike work in heavy industry toward service, clerking, and message carrying.[46]

In 1910, half of all boys of 15 had been gainfully employed, nationally. By 1930, the proportion was down to one in six. Girls at work at 15 declined from one in four to one in twelve. These figures bespeak a change in the operation of the family economy so rapid that it must have been felt quite consciously. The age at which children would begin "paying back" their parents for the investments they had made in them was postponed over this period for two years or longer, the greater part of this change coming in the 1920s. Communities proudly established more and more high schools, formally training their adolescents for a new kind of work life and participating in a democratization of secondary education that brought in students of less-favored socioeconomic background as well as students of less promising scholastic aptitude.[47] Only the fact that parents were having fewer children allowed them, and their communities, to support the new youthful life course that was elaborated at this time. These patterns coincided with, and were intensified by, the development of nearly purely residential suburban areas much given to high-quality schooling, where parents had clearly made a choice about the kind of youthful life courses their children were to have.[48]

In the 1920s, age homogenization within grades was a self-conscious policy of many high school administrators, accomplished even as the schools expanded rapidly to include students from families that were close enough to the economic margin to require supplementary income from their children from time to time.[49] Failure to promote was increasingly seen to encourage dropping out, an outcome regarded as unfortunate. A dramatic example of the progress of age homogenization is found in the school system of heavily working-class Duluth, Minnesota. Among sixteen-year-olds attending Duluth public schools in 1920–21, only 25 percent of the boys and 36 percent of the girls were to be found in a single grade; five years of policies promoting age homogenization brought these figures to 33 percent and 40 percent. By the mid-1930s, they had been brought up to 42 percent and 54 percent.[50]

The 1930 census affords a full picture of children's passage out of school and into the work force at decade's end. Figure 6, presenting the story for males, shows that by that date, after the innovations of the past decades, the movement from school into work began apace for boys only between 15 and 16. School and extended gainful work were rather rarely pursued simultaneously. Nor did many boys at any age emerge from school without promptly completing the transition to the work force. The proportion of boys enrolled in school who were at the same time in the work force rose from about one in seven at age 16 to nearly two in five at age 20, but the proportionate rise was more the product of leaving school than of finding jobs. Even at 16, fewer than one in three boys who had left school were not yet at work, a proportion that had dropped to about one in twelve by age 20. By 20 (by which age fewer than one in eight young men were married), eight in ten had entered into adult labor force status.

Figure 6. Boys' Passage Out of School and into the Work Force, 1930

Figure 7 presents like data for girls and young women. The dominant trend here, as with boys, is a movement away from exclusive attention to schooling as they grew older. The level of enrollment at each age, and the inflection points, in fact, are very similar to boys' levels. Boys, however, quite clearly had a single complementary activity: work. When boys moved from school, they ordinarily tried to move directly into work, but the "idle" proportion among girls was considerably greater and increased with age. The proportion of girls neither in school nor at work, in fact, was at each age almost identical with that of girls at work. As it happens, the census data include marital status for girls (but not for boys), which allows us to examine the extent to which the "idle" girls were generally married or, alternatively, helping around the parental home. The data indicate that only by age 19 were more than half the "idle" girls married. At age 17, by contrast, only three in ten girls who were neither at school nor at work were married. At this age, the "idle" constituted nearly one in four girls.

Figure 7. Girls' Passage Out of School and into the Work Force, 1930

Dropping out of school, then, was itself in a sense normative for girls, rather than being propelled by an immediate transition to wife or worker. Although the 1920s saw a very significant intertwining of courtship with schooling, it is equally clear that a subsequent—and far more elusive—phase of the female life course typically supervened before marriage. At the same time, the graph reminds us that gainful employment was quite common but far from universal for late-adolescent girls. It is thus understandable that girls both worked and attended school less than half as frequently as boys. Among the concomitants of recent economic developments was a noteworthy rise in women working at sales and clerical jobs.[51] At this date, however, married women worked, essentially, under necessity: with no pressing family need for income, being a homemaker was commonly prescribed. Yet the concept of necessity was being broadened.[52] The decade saw more, and more prominent, exceptions to that rule, and they seem to have conveyed to contemporaries a sense of a norm that was changing.[53]

We can detect emergent in the decade a new sense of life course organization closely connected to extended education, a sequence of events deliberately geared toward material accumulation and personal gratification, in which middle-class women's work was seen as far more compatible with marriage than formerly. The new pattern can be seen to advantage in the data collected in 1941 from native white Protestant couples in Indianapolis who had married in the years 1927 to 1929. These data reveal a close, but by no means perfect, connection among the young people's socioeconomic background, their own educational attainment, and their labor force experience. Socioeconomic background and educational attainment helped determine a cluster of subsequent behaviors, including the age at which women married, the frequency with which they had worked before marriage and after marriage, and, as we shall see later, the timing (and means of managing the timing) of their transition to parenthood. Thus, among women who graduated from high school, only 14 percent had married at 18 or younger, a figure contrasting strongly with the 53 percent among girls who did not finish high school. Among women who did not complete high school, although they married younger, 21 percent never worked before marriage, a figure exceeding the 15 percent among those who did complete high school. After marriage, however, these patterns were to reverse: 52 percent of women who had not graduated from high school but 58 percent of the graduates worked shortly after the marriage.

School and marriage were incompatible statuses for women during this period, and work and marriage, if decreasingly so, were substantially incompatible as well. There is really no secure way of estimating the separate "contribution" to young women's marriage of leaving school, entering work, and simply growing older. Table 8 shows the proportions of young women in 1930 who were married, for each single year of age, according to school and work status. It is apparent that age had a large direct effect on marriage-proneness, apart from its indirect effect through discouraging school enrollment and, conversely, encouraging gainful employment. Even among those in school, as among those out of school both in and out of work, proportions married increased with each single year of age. The whole sequence is best described as an occasionally varying sequence of transitions, school leaving coming first and proportions attending school falling dramatically between the ages of 15 and 18. Leaving school can be said to have precipitated some openness to marriage, but not overmuch, with a greater effect on promoting gainful employment. After age 17, the proportions of young women out of school who were gainfully employed leveled off at a shade under half, but this was to a growing extent a product of the competition between work and marriage: of those who were out of school and unmarried, the proportion at work increased year by year, reaching a majority by age 17 and two in three by age 19. The increasing proportion married of the entire group—38 percent by age 20—was drawn substantially from those who had completed the entire series of transitions, or who promptly left school or work when they married.

Table 8. Proportion of Young Women Married, by School Enrollment and Labor Force Status, 1930 (in percentages)

                                         14     15     16     17     18     19     20
Young Women in School
  Not working                           0.1    0.2    0.5    0.9    1.5    2.2    3.3
  Working                               0.0    0.1    0.2    0.4    0.7    0.9    1.2
Young Women not in School
  Not working                           3.6    9.2   17.7   30.6   45.7   58.5   68.1
  Working                               1.9    2.9    3.4    4.6    9.5   10.2   14.1
% in School                            92.9   84.5   66.8   48.8   29.4   18.8   11.7
% of those not in School
  who Work                             19.9   27.7   38.9   46.0   49.6   49.1   47.4
% of those not in School and
  Unmarried who Work                   20.2   29.1   42.8   54.0   62.1   67.6   70.8

SOURCE: Derived, with some interpolations, from Census 1930–1, 1180–1181.

The exfoliating high school was already a hotbed of anxieties and longings when the automobile, World War I, and a new definition of adolescence banished the chaperone and direct parental oversight of courtship. "Who has not observed the various ways in which the high school girl, while not admitting her motive even to her self, endeavors to draw the regard of her male companions?" asked psychologist Phyllis Blanchard in 1920. The high school, Blanchard observed, was a haven of "incessant giggling" produced by girls' "new consciousness of sexual differences" and the rigors of new "social situations for which she as yet feels herself lacking in poise. . . . With the dawn of adolescence comes a new self-consciousness as the awakening sexual and social instincts induce comparison with others and emphasize personal deficiencies hitherto discarded."[54] The old term "calf-love" no longer seemed to describe behavior adequately and now seemed too dismissive. "If a child is the product either of a modern home or of a coeducational school of today, his adolescent fixations, if any, are likely to be directed heterosexually to a person of approximately his own age," wrote a researcher in 1934, contrasting his findings to those made in the first decades of the century. Child study experts observing adolescents in these settings discovered that girls now directed their early amorousness toward far more plausible love-objects, and shortly developed a vocabulary with which to engage them.[55]

The Dating System

"The outside world of today has no use for flimsy worshipers of petty idols such as 'popularity,'" thundered a Minneapolis Central High editorialist in 1923, but popularity was the universally understood term for what the great majority of high schoolers sought to a greater or lesser degree. Popularity and cliquishness were closely related and tied closely to dating: both were parts of a new system of social relations governed informally but firmly by young people themselves. Well before the adult world took much notice, most boys and girls from their mid-teens on came to organize their social lives around an institution not of their elders' making.[56] This was so even before they evolved the dating system. As early as the turn of the century, in places where high schooling was common enough to enroll a socially heterogeneous student body, students "transferred emotional ties from the family to the peer group. Students felt compelled to present themselves to win approval from their classmates," in activities in which they "carved out personalities derived from the role models of their parents and teachers, but infused with unique youthful styles to win popularity or prestige."[57] Not all youth saw dating in the same light, to be sure, even in the high schools. Material wherewithal made a difference; so also did cultural heritage. And asymmetries of gender roles were the armature around which the dating system would evolve.

A fine historical account of the emergence of the dating system, first suggested by Paula Fass and recently elaborated by Beth L. Bailey,[58] understands dating as one among many of the achievements of a self-conscious generation, acting in part over and against its predecessors:

It was not caprice . . . that made them question traditional proprieties in sexual morality and in such areas as smoking, drinking, and dancing. These the young defined as the private sector, as a sphere for personal expression to be governed by need and taste rather than by laws and morals. . . . The young knew that their patterns and attitudes provided a margin of difference between them and their elders, and gave them a vehicle for group cohesion.[59]

Fass and Bailey demonstrate that the symbols of generational revolt were preeminently borne by women and that they took the form of the narrowing of the differences in the behaviors of the two genders: language, clothing, smoking, hair style, and social intercourse between the sexes, the latter constituting a modest challenge to the double standard of sexual propriety. If "freedom" or autonomy seemed to contemporaries to be at stake, in retrospect, youth—and young women in particular—seem to have proposed no fundamental changes in the moral order, only the lifting of limitations, based on age and gender, on their own right to choose among conventional options. They challenged received definitions of authority, not morality, positioning themselves to take advantage of the alluring but hardly revolutionary range of new consumer choice created by an expanding economy. As Bailey astutely notes, "sex became the central public symbol of youth culture, a fundamental part of the definition that separated youth from age."[60] If sex was now "as frankly discussed as automobiles or the advantage of cold storage over moth balls, why should our elders consider our interest in this subject a sign of unnaturalness or perversion? Should it not constitute the chief concern of those in whose hands the future generation lies?"[61]

Young people, however, did attend to what their elders said in condemnation and alarm, and they formed their own responses partly in opposition to them, with sex no less symbolic to them than to their parents. "In forging the new conventions and living with them, the meaning of youth's sexual experience was transformed."[62] If young people in rejecting received courtship procedures also rejected traditional romanticism, they by no means rejected marriage or marriage based on love. The new dating system, as they understood it, was an institutional framework that subserved exactly this end.

While the Fass-Bailey description of the dating scene is persuasive, its focus on collegians slights evidence that the dating system evolved simultaneously among high school students. This part of the system affected more people and coincided with the phase of heterosexual awakening in participants' lives. Both the age homogenization of the high schools and their expansion promoted the evolution of a dating system, since dating depended on freely entered short-term agreements between near equals, differentiated mainly by gender and overseen by the opinion of mutually valued, interrelated sets of age peers. Age, with its correlated experience and earning capacity, was the kind of differentiator that could render the negotiation at dating's core too unequal. (In parallel with cultural expectations governing marriage, girls could date somewhat older boys.) Age homogenization limited exploitation and permitted the girls to move somewhat beyond the constrictive safety provided by adherence to the double standard.

The defining characteristic of the new dating system and, what is more critical here, of the graduated series of dates that might lead to a more lasting commitment between young men and women was that a date was away from home, proposed and paid for by the boy, unchaperoned, and not subject to detailed parental veto; it depended on the free election of the participants. "An invitation to go out on a date," as Bailey maintains, "was an invitation into man's world—not simply because dating took place in the public sphere (commonly defined as belonging to men), though that was part of it, but because dating moved courtship into the world of the economy," where the boy's money paid for the date.[63] Certainly, some American boys and girls of the middle classes had coupled in every imaginable way without parental awareness before dating was practiced, but encounters of this sort lacked the continuity and regularity that the full evolution of the dating system after World War I would permit. (Bailey offers some evidence of "dating"—for instance, the use of the word—among select groups as early as the mid-1910s.)[64] Under the older system, there was no normatively sanctioned way for an adolescent to get "serious" about someone of the opposite sex without submitting the relationship for parental approval. Chaperonage asserted parents' oversight of what boys and girls might do together, and the home visit assured girls' parents of some control over whom their daughters might be seeing. Both were important, and both vanished with dating, which substituted peer oversight. Not the occurrence of emotional or physical intimacy but the question of whose advice guided young people in developing heterosexual ties was the critical difference between dating and the practice of "calling" and "keeping company" that it was rapidly supplanting in the 1920s.

Parents with cars or the wherewithal to get them indeed found them near the core of their conflicts with their children, as indicated in Middletown, where the automobile was the most visible sign of change. The Lynds note that "social fitness" and possession of an auto were closely linked in the minds of local high schoolers, explaining their urgency when addressing their parents on the subject. When, in the next paragraph, the Lynds explored changes in youth standards of sexual behavior, automobiles, along with movies, were cited as causes, or near-causes.[65] In this matter, conventional accounts have taken the Lynds too literally, affected perhaps by the fascination automobiles have long held for boys in "wild" phases, sexual and otherwise.[66] The far more mundane telephone would seem to have been a more crucial piece of dating technology, and the motion picture and the motion picture theater even more so.

Cars certainly were important to boys and girls who dated, and permitted much explicitly sexual behavior to transpire, but it is doubtful if the automobile importantly promoted the change that dating (or petting) constituted. For one thing, there simply were not enough cars to go around. Even in San Jose, California, at the end of the decade, nearly one in three high school junior boys never drove, and nearly as many again had the family car only "seldom."[67] The car was in fact only the most conspicuous of the heightened consumption patterns that were associated with dating. Cars were no more literally prescribed than was any other unique item or gesture. It was in large cities, where cars remained notably fewer than in the countryside and small towns, that dating evolved; there, the streetcar sufficed as a means of moving about on dates. Transportation was less important than the availability of a legitimate but individualized activity that the girl was willing to attend and could persuade her parents to let her attend. Thus, a sociologist's inquiry into cultural change in the 1920s that compared rural with county-seat life observed that it was in the rural areas—where dating was not yet practiced—that parents saw the automobile as a threat to their authority.[68]

The elaboration of dating as a system began in the first quarter of this century and spread apace during the 1920s and 1930s from its initially urban and middle-class center. A large study of schoolchildren in Kansas City, Kansas, and nearby communities in 1923–1926 found that among boys of 13, "having dates" (as the questionnaire collected from the subjects put it) was the tenth most favored activity (football was tops) and advanced to the fifth most favored by age 17. Girls at 13 (who liked reading best) were even more fond of dating, and retained their lead over boys in this regard, with dating the fourth leading activity at age 17. In a San Jose, California, high school, two-thirds of the sophomore boys and three-fourths of senior boys were dating in 1930. Data collected in 1933 from a large sample of high school girls (oversampled among Catholic schools) found that half of the freshmen and 84 percent of the seniors had begun dating. Blumenthal's 1932 ethnography of isolated "Mineville" found the dating system in operation.[69]

A careful study of upstate New York rural girls in 1933 revealed that the institution had begun to make its way into the countryside. Only 33 percent of the girls aged 15 to 17 had never yet dated, and an additional 49 percent did not yet date "consistently." At ages 18 to 20, a somewhat greater proportion, 58 percent, were not yet consistent daters, suggesting that dates had arrived there quite recently. For each younger cohort of girls interviewed, dating had begun younger, as the institution diffused. These girls, when they dated, went to movies, dances, parties, and motor rides, just as did urban youth.[70] But in less prosperous rural locations, social life devoted exclusively to youth was exceptionally truncated. In the countryside, parents' capacity to exercise close supervision was often too great; many farm youth were even said to seek the city partly on this account.[71]

Urban working-class youth seem to have had quite sufficient distance from parental oversight to erect a dating system, but other matters at first militated against it. The sociological accounts of Donovan on waitresses, Thrasher on the boy gang, and Thomas on girl delinquents discuss non-middle-class milieus of the 1920s which lacked both material wherewithal and peer groups with wide enough consensus to oversee dating. Here, dating in the sense we are discussing clearly did not organize heterosexual contact.[72] Nor did it in factories and shops with mixed work forces, sexualized as byplay became there, precisely because females were at such a disadvantage in the work world that they ordinarily shied away from the kind of exploratory gestures characteristic of dating.[73] Whyte's Boston observations in the mid-1930s pointed out the continued existence of ethnic working-class settings in which highly asymmetrical assumptions about gender roles rendered dating inappropriate.[74] Working-class children at first could not control the time, place, or tempo of boy-girl contacts. They also lacked both the wherewithal for the "good time" dating asked of the boy and the effective, school-based, same-age peer group that oversaw behavior within the dating system.[75]

Information on black youth is rare, but a suggestive account can be put together which argues that lower-class urban blacks, at any rate, had not by the early 1930s elaborated a dating system on the order of that developed by whites. In Kansas City, Kansas, in 1926, although black children did date a bit in their teens, boys and girls both omitted dating from their list of favorite activities.[76] Instead, black boys and girls socialized commonly in large mixed-age settings of various sorts, some of which did but others of which did not offer the kinds of protections against boys' sexually threatening behavior that were provided by the elaborated dating system, as among whites.[77] Such protections were both lacking and—for a subset of girls who were striving for "respectability"—necessary because talk of sexual matters—and not just as fantasy—was prevalent among both black boys and girls. A proper seventeen-year-old Washington girl of lower-class background told an investigator:

We girls often discuss boys and having relations with them. All my girls friends think about the same as I do. They don't want to have any now. I know it's natural, and I don't object if people want to do it. But, you see, my mother trusts me and lets me go with boys because she thinks I won't go wrong.[78]

Among urban black youth, relative license in sexual matters for boys, and a sharp discomfort caused by such license on the part of a self-consciously "respectable" grouping of the girls, fit with a marriage schedule that was appreciably earlier than for whites. The highly restrained attitude toward sexuality within dating that later marriage permitted seemed out of place to blacks who would shortly begin marrying, even without pregnancy. An eighteen-year-old boy in Cincinnati offered to a social investigator a plaint that, with moving naiveté, incorporated prompt marriage.

She has subthing of mine. I ask her to let me walk home with her but she said no. I have not got the nevers to ask her do she love me. I am going to stop school and get a job and I am going to ask her to marry me. she look like the morning star. she look like a sweet rose in a valley. I love her so my heart ach. it dance around. no Jive.[79]

In more realistic contact with the severe family economic circumstances that promoted early marriage was a fifteen-year-old girl.

My father has no regular job and I have some more little sisters and a brother. Friends have told me I ought to marry. But I want to go through high school. I haves a good home and very kind sweet loving parents. There is a boy who loves me and ask me to marry. But I refuse.[80]

Among middle-class whites, the fully evolved date itself had a compelling logic quite distinct from that of prior forms of courtship: it was a step in an ongoing negotiation, with rules defined and deviations punished by age peers. The logic of the date anchored it in modest pleasures and centered the choices it occasioned in the daters themselves (within limits imposed by the peer culture). The home visit or chaperoned dance, in essence, had been either purely sociable—part of a group occasion—or explicitly related to courtship. The date might turn out to be either of these, or both, or, most commonly, something else again, but what it turned out to be depended on how well the negotiation at its core went, a negotiation regarding short-term gratification. By definition, boys planned and paid for "a good time" and asked of their girls a bit of physical intimacy. How a boy pled his case, how his date responded, and the future of the pair as a couple depended not only on the boy's sense of his investment and the girl's scale of values but also on the public commitment each was willing to make to the other and their capacity for emotional intimacy, which "modern" girls (like their nineteenth-century predecessors) wanted badly, and often missed, in their consorts.[81]

A charming 1929 story in The Ladies' Home Journal celebrated the diffusion of the date and its code by a nice reversal. When a rich and attractive, but somewhat behindhand, girl coolly plans to hone her date-related skills on the young handyman at her summer place—the better to succeed with her chosen suitor—her conventionalized behaviors work too well: her wiles capture both the handyman and her designated boyfriend. But the handyman captures her—and turns out to be working his way through college, and thus acceptable and, at the story's conclusion, accepted.[82]

The developing internal logic of the date can be discerned in the statements of those whose dating experiences seemed to them imperfect enough that they wrote to newspaper advice columnists.[83] In the broad shifts in vocabulary, usage, and assumption contained in these published letters can be seen the progressive definition of the institution of dating as it spread. Internal evidence points to regular editing (even apart from selectivity) by the columnists, and scuttlebutt suggests some fabrication; newspaper readerships were narrower than the full range of the population, and only readers possessing both a sense of moderate anguish and a yen for disclosure would even consider writing. However, if the letters had not smacked of verisimilitude, the advice proffered would have read as a parody of itself; and to judge from the generally sober (while distinctly adolescent) tone of the great majority of the letters examined that dealt with problems in the early stages of boy-girl relationships, adolescent readers were in fact reached. That parodies appeared frequently in the high school newspapers attests to the intense, if ambivalent, interest of young readers.

Urban youth were not yet entirely familiar with dating in 1920: even the simplest rules of the dating system might not be well understood. Doris Blake's early correspondents often asked about when boys might be and should be invited to girls' homes, reflecting the transition from the older tradition. But it was the goodnight kiss that provided the most common perplexity at this early date. W. A. wrote to Blake, "I am a girl seventeen years of age. I have been going with a young man three years my senior, whom I love and admire very much. . . . Is 11 o'clock too late to arrive home from a show or some other place? Is it all right to allow him to kiss me good night, even though we are not engaged?" Within a few years, kissing would imply to all only the most evanescent commitment.[84]

The growing recognition that dates should incorporate an ambiguous mixture of physical pleasure and self-restraint did not by itself remove all the perplexities of daters. They had still to learn how to "read" the dating situation. R. S., for instance, could not quite fathom the implications of the behavior of the young man "that I care for." "He has declared his love for me also. But he goes to visit other girls and takes them to places and has never yet taken me anywhere. He's forever praising those girls. All this makes me doubt that he really cares for me. Do you think he does?" R. S. simply did not know whether "caring for" is in any way articulated to the dating system, and while she obviously intuited that there was such a thing as a boy's "line," she lacked confidence in her ability to discern it in action. Only over time were symbol and gesture fitted into a changing code of dating that was a thoroughly known part of the developing culture of adolescence. "I am 16, good looking and a good sport. A is 17, bashful, and not very good looking. His friends say he likes me"; "I am a young boy of 16 and am in love with a girl 5 months my junior. So far I have not told the girl anything but have confided in two of my boy friends. One of these boys went back and told her. As a result she was just a bit peeved."[85]

Culture, as always, though supportive, could also be confining, as when the peer group's influence extended too far into the dating situation. "Heartbroken" was a girl of sixteen, dating a boy of seventeen in 1925: "I love this fellow very much and I know he loves me. When we are at a party or a dance he is always with me, and he always asks to take me home, and I let him. He is very nice, but when he is with a bunch of boys he just says hello and keeps right on going. I would like to know the reason for this (he is very bashful), because I love him." Or a jealousy composed of confused frustration might appear a product of divergent definitions of the two partners over the degree of articulation of the dating system with intimacy, on the one hand, and the peer popularity system, on the other: "My friend's chum is keeping him away from me because my sister doesn't care to go out with him."[86]

The reader of adolescent lovelorn letters from this period can hardly fail to observe the generally shallow connotation of the word "love." The notion, of course, was by 1920 carried into teenage courtship parlance through the insipid romantic fiction of stage, screen, and print, so teenagers had good authority for feeling "love" easily and often. In the 1920 letters, the vocabulary is limited to a few variant usages of "love" and occasional references to "care for." By 1925, the range of expression had widened a bit, with a new verb or two enlarging the capacity for discrimination and a raft of new, conventionalized intensive adverbs. By 1930, even the brief letters to Blake indicate a concern for emotional precision. Connie wrote Blake that her fellow "never told me he even cared for me." "Doubtful" reported to Blake that her "fellow says he loves me. I like him as a friend." Blue Peggy, seventeen years old, wrote to Martha Carr that she felt left out because while "several of my girl friends have fellows and seem so in love," she herself "can't seem to get enthused" over her "several boy friends." In Blue Peggy's view, being "in love" was something one might be but at least should "seem" to be at seventeen, and such a seeming might be approached through enthusiasm in dating, if only she could experience even that.[87]

In the 1930–31 letters, "steady" relationships of one kind or another, including references to "going steady," virtually absent before, were quite common. Lacking such a defined stage, earlier daters like Tootsie had been confused:

I am a young girl of 17 and am really in love with a young man of 19. I have known him for over a year. We are not exactly engaged, but he has promised not to go with any other girls, nor I with any other boys. I am in a suburb now and am attending school. He goes to a university. I love this boy with all my heart. But some time it is such a temptation to go out with the boys.[88]

A few years later, a metropolitan seventeen-year-old would have known that going steady was easy to begin or terminate and that it combined clear behavioral prescriptions with undefined emotional commitment and was in fact merely the boundary between casual dating and the steep and demanding road to marriage, rather than the first step on that road. Tootsie could have negotiated with her young man for gradually enhanced emotional intimacy without such risk of irrevocable sexual intimacy or premature marriage, which was possible in an overheated, unstable relationship of "not exactly" engagement.

The main architects of the dating system were middle-class girls.[89] Girls had more to gain by the establishment of dating, because the new version of the double standard that it put in place was considerably less restrictive to them than the one it replaced. Before dating, parents had tended to construe strictly girls' obligation to enter marriage untainted by even a hint of scandal, and they supervised courting accordingly, limiting both its occasion and the set of eligibles. The boy who came calling had not only to be prepared to behave himself but he also had to pass prima facie muster as a boy who by reputation would behave himself. Under the double standard, however, boys' reputations were both subject to repair and of far less interest to their own families. Girls were far more constrained by parental oversight.

Despite their substantially united front toward their parents' generation, boys and girls had by no means identical interests in the new dating scheme. The female physical-growth spurt came earlier and provided a convenient sign for what contemporaries believed (and thereby encouraged) to be girls' earlier awareness of the opposite sex as objects of interest. Contemporary accounts of adolescent behavior had boys entering the high school ages still in a "gang state," while girls had long before turned to "fancies . . . of men and boys, and of herself as the center of attraction and interest. . . . She becomes interested in dress and personal adornment . . . [and] ruin[s] her healthy skin with rouge and lipstick."[90] Furthermore, girls more often than boys remained through high school to graduation. If there were more girls in high school potentially to be seeking dates, so also higher proportions of them, particularly among the freshmen and sophomores, presumably hoped to date. Accordingly, girls sought to limit competition by defining its terms, and they sought to enlarge the pool of eligible boys. There was, of course, the alternative possibility for a girl to be a collegian's or an employed boy's "townie," but such a choice took the date outside its familiar negotiating balance and outside the supportive structure of peer-group gossip.[91] Gossip and the clique system operated to limit the terms of competition among girls, most particularly by regulating the amount of physical gratification with which they could reward their dates. Commonly, such gossip took the form of "catty" statements that anyone could get boys by giving a good deal of sex: doing so would only counterfeit popularity.

The date, as a bargain, was unromantic but affectionate. In dating, style mattered a great deal. Performance was far more important than the unmediated expression of feelings. The very ordinariness of dating placed practical limits on the amount of romantic idealization that courtship could now support.[92] The success of the dating system encouraged a set of rules, rules of performance more than of feeling, rules that even young boys and girls could learn. Thus, Ernie, thirteen, stoutly denied in 1931 that "I want to call on girls and take them out" but admitted to having girlfriends and that in defiance of his parents' wishes he liked "to have friendly talks with girls over the telephone." "Every boy my age likes to have money to spend and to dress up," Ernie lectured a love advisor.[93]

The Gendered Reconstruction of Sexuality

Petting, that delicate standoff between sensual indulgence and constraint, was almost universal in the sense that all daters petted at some time but not in the sense that all couples petted. Graduated physical intimacy became an accepted part of lasting teen relationships, both a marker of affection and a spur to increased commitment. The sexual histories collected by Kinsey and his associates point to a distinct sexualization of noncoital relations far more pronounced than the often-remarked increase in premarital coitus also recorded. Between the pre-World War I and postwar adolescent cohorts, the Kinsey data show, the proportion of girls who petted before sixteen rose from 29 percent to 43 percent, and the proportion of boys from 41 percent to 51 percent.[94]

A decided reduction in the typical age at which petting began was coupled with a marked increase, especially in women, in orgasm achieved by petting. Unconventional sex practices, like fellatio and cunnilingus, likewise increased, as did premarital coitus, especially with eventual marriage partners. Overall, the increase of sexualization of the whole path to marriage is inescapable, although qualitatively the downward extension of erotic petting was the most pronounced and the most significant in restructuring the life course.[95] One particularly acute observer of campus mores understood the enlargement of sensuality in the lives of students as an offset, engineered by girls, to the economically based reluctance to marry that young college men were expressing, and in this sense it was a reassertion of older values regarding marriage rather than an abrupt assertion of moral innovation. "Since petting leads to 'dates,' and dates lead to more dates and to real romance [i.e., marriage], one must pet or be left behind."[96] It was not thoughts of future bliss that bound two people together but mutual gratification in the present. "The modern lover daydreams not merely of a lifelong companionship, but of a lifelong state of being in love."[97]

In American middle-class ideology before the 1920s, the deferral of sexual pleasure until marriage had provided the pledge that cemented love unions—the chastity of the bride and the definition by the groom of his prior sexual experiences, if any, as the unfortunate yielding to instinctive drives and the temptation of "bad" women of no account. Adherents of the older structure of values maintained that "in the general wreck" of prewar values, "the wreck of love is conspicuous and typical. . . . Sex, we learned, was not so awesome as once we had thought. God does not care so much about it as we had formerly been led to suppose; but neither, as a result, do we. Love is becoming gradually so accessible, so un-mysterious, and so free that its value is trivial."[98] Edward Sapir defined this concern presciently: "Sex as self-realization unconsciously destroys its own object by making of it no more than a tool to a selfish end."[99] But by the 1920s, modest sexual pleasure was little more than one of several commonplace "thrills" available to young people. "The adolescent convention of petting is used not as a preliminary to the sex act but as a pseudo-substitute for it, as a means of working off tense emotions."[100] Dating, and even petting, fit appropriately into a view of adolescent development that favored "an emotional attitude of free, wholesome contact with members of the opposite sex" during the teen years, when not thwarted by "psychologically inept efforts [by adults] to create inhibitions in the young" as by the "over-idealization of womankind . . . as . . . almost too delicate to touch."[101]

Many young women of the postwar generation asserted in word and gesture that they were sexual beings quite like men—and not ashamed of it. Paula Fass sees this insinuation in the most characteristic and explosive aspects of the "new woman's" appearance. Bobbed hair, flattened breasts, shortened skirts created "a well-poised tension between the informal boyish companion and the purposely erotic vamp. . . . Smoking implied a promiscuous equality between men and women and was an indicator that women could enjoy the same vulgar habits and ultimately also the same vices as men." Such signs could assert sexuality as long as they could play off against the double standard, focusing on the right to be openly sexual rather than the still-outrageous notion of actually behaving with "masculine" lustfulness.[102] Sociologist Joseph Folsom, from his vantage point at Vassar College, argued that a "woman may conscientiously allow herself to feel passion to the same extent as the man, if she controls its expression."[103]

The double standard was not overthrown but modified. Rearguard actions, like the bills submitted in a number of state legislatures regulating the cut and material of women's dresses, so patently attacked symptoms alone that they invited ridicule that pressed the argument further than most defenders were ready to face. "Why should men be permitted to tell us how to dress? Why should women always have to protect their 'feelings'? Why are not men made to control their 'feelings' just as women are? Why should the fact that a girl has legs arouse the wrong kind of impulses in a man? Does he think we travel on wheels?"[104]

When asked for a stark statement of personal preference, sizable majorities of sophisticated young men and women (two-thirds of the men and seven-eighths of the women in one 1920s college study, half of the men and two-thirds of the women in another) rejected the dual standard.[105] Girls could now express themselves sexually. Indeed, as Folsom remarked, "a new method of adjustment [of gender relations] has begun, namely, the education of women to find greater pleasure in sex." The great majority of contemporary testimony, however, indicates that even while girls who seemed unawakened sexually were made the butt of humor, girls (not boys) who (even if in love) proceeded to coitus and spoke too widely of the fact were devalued. "Our newer mores permit us to experiment widely with human emotions, yet they do not permit us to observe freely the results of these experiments."[106] Young men of the times were "for the most part disposed to try to face the problem of sexual urgencies before marriage, and of responsibility, like the problems, on a more nearly mutual basis. To some extent, nevertheless, they too are likely to hope and expect that the girl will prove more worthy than they feel they can hope to be."[107]

In time, the "modern" perspective became mere common sense.

Not so very long ago make-up was associated with prostitutes and the kind of women who laid themselves out to attract men and parents still associate the use of cosmetics with that class—and though latterly make-up is part of the general effect of the costume. . . . Leader thinks that's all there is to it—The whole design counts—it's a matter of taste. . . . It's surely not necessary to consider the moral end of it. Fashions change. . . . But we are ruled by fashions. . . . It is better to conform to the prevailing style—as well as possible—take it out of the realm of right and wrong.[108]

The advice subtly moved beyond appearances.

However difficult it may be for parents who are themselves neurotically afraid of sex to accept the healthy conditions of our unsegregated modern adolescence, we cannot oblige them to turn back the clock to the patriarchal era. Our world needs adults who have grown up emotionally and who can be enough in love with their mates to stay in love without economic and social pressures. Petting-parties . . . are for Phyllis a natural and wholesome part of growing up emotionally into womanhood.[109]

These cultural assumptions underlay a dating system in which boys were by convention assumed to be always on the lookout for some petting, but girls were conventionally assumed to get far less physical pleasure on the whole from the act itself.[110] Boys pursued; girls moderately rewarded boys who were affectionate, restrained, and provided a pleasant time. When girls were fond of petting, they found that their peer group (aided by boy gossip) stood in the way of their being too easy. Even for girls in love, peer pressure set limits to lovemaking. Thus, a high school girl noted in 1929, "The girl who permits liberties is certainly popular with boys, but her popularity never lasts very long with any one boy. You know the saying, 'Just a toy to play with, not the kind they choose to grow old and grey with.' "[111] Boys' behavior could be modified. "Even freshmen girls know . . . that a boy who considers himself a gentleman may have standards that vary according to those of the girl with whom he may be," wrote a high school dean of women.[112] Dating, thus, operated still within a double standard of sexual conduct that demanded of girls the strength to say no and the strength of mind to prevent matters from coming to such a pass.



In dating, physical pleasure was defined as properly a token of affection and commitment. Through dating, girls considerably before marriage could discover patterns of emotional intimacy with boys congruent with those the female subculture had long valued, but without ultimate commitment, physical or marital.[113] Nor need the task of finding a good mate be forgotten, for the dating system elaborated a series of stages that led toward engagement and beyond. The tender interpersonal qualities sought in a good date, while not identical to those of a good mate, were nevertheless among the desirable traits. For female readers slow to pick up the detailed, subtle relationship among sexuality, emotional intimacy, popularity, and eventual marriage, "A High School Boy" in True Confessions explicated the ideal girlfriend and acceptable variants. "The kind of girl that will kiss you and let you know that the kiss means something and that that's all there is, there isn't any more, is one of the square shooters and if you can get her to marry you you're lucky, and you needn't ask any questions."[114]

The terms of the dating exchange were widely understood among the young but not entirely uniformly. Petting was particularly often at the heart of misunderstanding, especially in that it incorporated a partial revision of the deeply inculcated double standard.[115] Certain adolescents, like "Miss Dateless," found themselves essentially outside of the dating pool because they failed or refused to recognize that this fundamental exchange in dating was normatively governed and structured by a sense of the emotions appropriate to age and stage.

I am 20 years old and, to use the slang expression, "hard up for dates." I am rather small, but have my share of good looks. I am inevitably cheerful, like sports of all kind and like to talk of them. I am interested in good music. . . . But—I sit at home without the boys. I think one of the reasons is that I am not common enough. I let a boy know it if he gets fresh with me and scratch him off my list. I use cosmetics, but sometimes look pale near some of these "clowns." However, they get "dates."[116]

Other girls, who yielded too readily to the combined pressures of boys' entreaties and the clear-cut injunction to date and—to a degree—enjoy physical "thrills," were devalued as dates for being too easy. A popular and spirited girl, whose friend had inadvertently become the butt of boyish ribaldry ("they called her the 'lemon' because they said she was made to squeeze"), castigated the boys for their insensitivity. She recalled that she "told those boys just what I thought of them, and they hadn't a word to say when I got through, either." But then, her friend's behavior was to be explained by inexperience and excused because "she hasn't any mother."[117]

The formal extracurricular life of the high school quickly came to be articulated with the gender-structured dynamics of the dating system. "It is a well-known fact that club pins are an absolute necessity when a young man wishes to plight his time-enduring regard for some lady; but, even considering this, it ought not be necessary to have more than three or four."[118] Beyond visible symbols, word of mouth was powerful where everybody was likely to know everybody. "Why should we have so many idle gossipers in the school? . . . Much to our dislike we have many social groups and this lowers cooperation within the student body."[119] Gossip, of course, while lowering cooperation, also regulated behavior—reassuringly for the most part, oppressively on occasion. Trends in fashion were sharply defined and served to mark out those who qualified for the dating pool. A "bobbed hair census" at Little Falls (Minnesota) High in 1923 indicated the strength of fashion: in each of the four classes, more than three girls in four had adopted this hair style, so rich in affirmation of modernity.[120] Even among "subfreshmen," 65 percent had already caught on.

A ritualized jousting and chiding of the boy population in general (sometimes, happily for the historian, in the high school newspaper) served to bring marginal boys into the dating pool. Chiding served to educate boys to the proper ways of behaving toward girls, so that the rules of the dating system might be learned even by the more backward among them:

Boys, is it fair to make the girls come to a school entertainment unescorted? So far, I have not been to an entertainment without seeing three-fourths of the girls come without escorts. The most disgusting thing about it is, that the boys act as though they did not realize the predicament they've placed the girls in. . . . I believe the faculty should make a rule that no girls come to the parties unescorted and that no boy be admitted without a young lady.[121]

Or:

What has come over the boys of this school? . . . Is it the lack of carfare? I am sure that we girls would be happy to supply that . . . instead of going home alone after 11 o'clock. Fewer girls will be allowed to attend parties at school, since they must return late alone. Just because a boy is gentleman enough to take a girl home, is no reason that he is in love with her. All we want is common courtesy, not husbands.[122]

Boys had to be taught the nonbinding quality of a date, to distinguish it from the courtship system that dating was replacing. The complaint was not misdirected, for an earnest correspondent responded in the next issue:

There are many reasons. Not that the young man has not the price of carfare, or is too stingy, but that the girls of to-day are too different from those of yesterday. He has not as yet become acquainted with their ways. It will take a long time unless the girls do their part and bring the boys out of that bashful state which is keeping them from mixing in with the girls and being treated as equals. Therefore, act as though you wanted to be taken home, and I am sure you will not be disappointed.[123]

"Bashful" was the word. Throughout the decade, female correspondents in high school newspapers would resurrect it as an adjective of mild condescension addressed to the boys they hoped to recruit to the pool of dating eligibles:

As usual, only senior and junior girls are to be present, but boys of the lower classes are allowed to come. In that case the senior and junior girls must wait to be invited before they can attend. It would be unfortunate to have these girls left out and, weird as it may seem, the task of inviting them is up to the boys—bashful and otherwise. Let's have as many junior and senior girls asked as possible, boys.[124]

The public nature of the high school dance—aside from fueling the competitive element of the dating system—served girls' purposes ideally. In the 1923–1926 Kansas City study, boys consistently ranked social dancing below "having 'dates' " while girls consistently ranked it above. When the Alexandria (Minnesota) High School in 1927 circulated a questionnaire to its students regarding more parties and, for the first time, school dances, both boys and girls voted overwhelmingly for more parties, but boys only split evenly on dances, which girls supported by five to two.[125] "Stags" posed a problem, however, and girls pressed for the elimination of stags and the establishment of fixed-partner dates at school dances and no doubt elsewhere. For girls, the stag arrangement and its attendant "cutting in" at dances was an invitation to humiliation or boredom and left all the power of decision making in the hands of boys, who not rarely looked after one another's interests and gave no thought to the wallflowers the system inevitably created. "Just fancy knowing that a boy is dancing past the stagline and waving a five-dollar bill behind your back as an offer to anyone who'll come and take you away?"[126]

Occasionally rebelling verbally against "girls who have dates four or five out of the seven days of the week" and the "sort of contest" among girls "to see who can get the most dates in one week," boys accepted the new regime.[127] For them, it was something of a gain, in the sensual pleasures of petting, in the tenderness of occasional intimate conversation, in the articulation of "popularity" with the bumptiously democratic tone (and stratified structure) of the new, expanded, age-homogenized high schools. " 'It's just that I like to take her places,' explained one among the many suitors of Bette, the most popular date in the junior class. 'You're sure to have a good time with her. She's never a liability, you know that she'll be the belle of the ball. But really I'm not crazy about her.' "[128]

Were the interests of middle-class girls harmed by the new institution they had promoted? Considerable evidence from the 1950s, to be discussed below, indicates that dating overwhelmed most other concerns for many high school girls—and many college girls—thereby perpetuating disadvantages in other realms to which schooling was relevant, especially the world of work. But domesticity hardly seems to have been the gender issue in the 1920s and 1930s that it was to become a generation later. A more serious charge against the new dating mechanism concerns girls' sexual vulnerability. We have seen it to be the case that premarital coitus, both with fiancés and with others, did increase in the first cohort of girls within the new dating regime, at which point knowledge of birth control technique was obviously too shallow to offer reliable protection to many. But the evidence presented on age at marriage and pregnancy status at marriage does not point to forced marriages owing to pregnancy, nor to numbers of women condemned to spinsterhood through youthful loss of virginity and subsequent consignment to the category of "soiled goods." On balance, it seems that moral innovation did bring female sexuality into the arena of boy-girl relations in a new way but not without peer-group safeguards, imperfect but, because quickly institutionalized, perhaps no less effective than the forgone familial mechanisms that sometimes failed in the face of passion.

The Transition to Marriage

Youthful emotions were given more play in the heightened pace of courtship, which continued selectively into the 1920s. Details are shown in table 9, which presents proportions married at young and average ages for marriage, from which can be derived a sense of a cohort's movement into marriage.[129] For both young men and young women, the table shows, the most prominent continued downward trend of marriage age was among the native whites of native parentage who lived in cities, the prime locus of economic and cultural innovation—including the new dating system. Urban marriage ages moved downward, approaching those of rural people of like nativity. By contrast, the downward movement of marriage age for second-generation Americans and for blacks, like those of rural native whites of native parentage, essentially ceased.

Table 9. Proportions Ever Married, by Sex, Age Group, and
         Social Characteristics, 1910–1930 (in percentages)

                                               Males             Females
                                            20–24   25–34     15–19   20–24
Native white of native parentage
  Urban
    1910                                     23.6    62.7       9.1    44.5
    1920                                     29.6    68.7      11.9    49.8
    1930                                     31.0    73.3      12.3    52.0
  Rural
    1910                                     30.0    72.4      15.7    59.7
    1920                                     32.9    73.6      15.0    60.4
    1930                                     32.7    73.1      16.2    61.5
Native white of foreign or mixed parentage
    1910                                     15.8    56.1       5.6    37.2
    1920                                     18.7    59.9       6.5    40.8
    1930                                     17.9    63.6       6.0    41.5
Black
    1910                                     40.3    74.9      18.8    65.1
    1920                                     45.1    74.9      21.3    68.4
    1930                                     45.3    76.2      22.1    66.9

    SOURCES: Calculated from Census 1920–1, 391–393; Census 1930–1, 848–850.
    NOTE: Data are available for foreign-born whites, but in a strict sense these, and those for the children of the foreign-born, are not commensurable across censuses in the same way that natives are, since they are highly subject to changing migration patterns, recent or remote.

The continuing eagerness of young people for marriage was a matter of some relief to contemporary students of manners and morals, for it spoke to a considerable continuity of values at a point of apparent upheaval. Blanchard and Manasses thus reported that their survey revealed that "the modern girl seems to want marriage most of anything in life." In their college sample, nine in ten supported wives' work to permit timely marriage, but only half this proportion when wives' earnings "are not necessary."[130] At the same time, they noted that this old-fashioned concern led to innovation, for girls now were choosing to remain at work after marriage "in order to achieve an earlier mating."[131] New material circumstances, innovations in courtship practices, and changing prescriptions for prudence promoted a confusion about the right age to marry. Young
people—"modern" urban girls particularly—seemed to press for a downward revision of the marriage schedule. Thus, an advice book, in some alarm, told young women not to rush, reminding them quite inaccurately that "marriages are being made much later than they were a few generations ago."[132] Correspondingly, "A Family Doctor" worried: "with changing economic conditions, just how are we going to tide young people over the years when they are physically ready to marry but not yet ready financially." The author leaned even harder on young men, drawing on their sense of economic prudence to construct a right basis for assessing marriage. "If men entered into marriage as carefully and deliberately as most of them enter into business deals, the outcome would be more certain of success."[133]

With girls newly able to display their charm and sex appeal unashamedly and in new social settings, boys had to be more careful not to fall too quickly. "She intends to marry at a more specific date if she can bring it about, have a definite number of children at desirable intervals, and earn a definite sum toward the upkeep where she needs to. . . . And she is determined to have more of a grip on the bank account than her mother, to help to swell it with her own earnings, married or single, and to do so in chiffon stockings and silk underwear."[134] Success in entrancing men and designing one's own marriage, however, ran the risk not only of overprudent men but of experienced, exploitative ones. A popular short story on this theme, evoking the mythic opposition between country tradition and urban oversophistication, places a charming, capable young woman between the contrary pulls of a professionally ambitious spinster—her supervisor at the department store—and a good man from her old hometown. Ultimately, success at the department store required a too-blatant use of her body (in dress modeling). The plot resolves itself, on these grounds, in favor of the boy next door, and prompt marriage.[135]

Figures 8 and 9 show the changes over the decades of the 1910s and the 1920s in proportions of men and women who had married by successive single years of age.[136] Among men, it is apparent, the most substantial gains in proportions married in the 1910s had occurred at the younger ages—the late teens and early twenties—for both native whites of native parentage and for blacks (native whites of foreign parentage followed essentially the same path as native whites of native parentage, here and elsewhere). Black women, too, had moved toward earlier marriage in the 1910s, especially at about ages 17 to 20. Native white women of native parentage were more likely to be married in 1920, too, but not especially so at the youngest ages. But we should not make too much of the age-specificity of the decline: the strongest point is its substantial generality across groups and the amplitude of the change in the 1910s. Thus, even after leveling off after the youngest ages, about 2.5 percent more young men were married at any given age in 1920 than in 1910.

Figure 8.
Changes in Proportions of Men Who Had Married, by Successive
Single Years of Age, 1910s and 1920s

The 1920-to-1930 trends are trickier. On the whole, the 1920s represent a dampened continuation of the downward movement in marriage age. By far the greater proportion of men married after age 21, and after that age, the 1930 data show even greater proportions married at single years of age than did the 1920 data. This is the case for both native whites of native parentage and for blacks. Since virtually all marriages for people this young would have occurred at the very end of the decade, it is possible that the earliest phases of the Depression caused the observed downturn. More likely, however, is that just as 1920 was a period in which relatively young marriages for men became notably more common, the end of the decade saw a return to the previous pattern.

Figure 9.
Changes in Proportions of Women Who Had Married, by Successive
Single Years of Age, 1910s and 1920s

Beyond age 23, however, the 1920s saw an increase in the pace of men's marriage that was about as great as that in the 1910s. There can be no doubt (given the age schedule of men's marriages) that most of these marriages were taking place in the latter half of the decade, so it is safe to conclude that except for the relatively young ages, the 1920s were a decade in which, rather regularly, younger cohorts could look forward to somewhat younger marriages than their immediate predecessors and to distinctly younger marriages than in the generation of their parents.

Close examination of the year-to-year changes over the 1920s in numbers of first marriages of young men and women at selected single years of age in New York State (exclusive of New York City) reveals a rather complicated pattern. Among women, the most rapid increase before about 1926 was in young marriages—at ages up to about 20. After mid-decade, however, this trend reversed, and it was marriages between the ages of 20 and 25 that increased the most rapidly, particularly at age 21. For men, trends were gradual throughout the decade, toward younger marriages, at ages that increasingly approached those of women.[137]

When we examine figure 9, we find that most of the trends for women in the 1910s and 1920s resemble men's. For older white women, nuptiality was increasing even more rapidly in the 1920s than in the 1910s. For black women, however, things were different. The decade's marked black migration from southern agriculture to northern urban centers definitively interrupted the increases in nuptiality that had characterized the previous decade. If the 1910s had brought a great enlargement
of marriage probabilities for black women, the 1920s was a decade of return to prior patterns. But these black women provide the only marked example in which the 1920s saw a clear reversal of the downward movement of the marriage transition that had lasted over a generation.[138]

Because states differed from one another in the characteristics that may have promoted early marriage, we can enrich our sense of what lay behind the new life course scheduling by examining what was associated on a state-to-state basis with continuing nuptiality increase in the 1920s. The simplest and also the most powerful way to analyze these data is to examine proportions ever-married for the single age group 20–24, encompassing a critical half-decade of marrying for both male and female cohorts. (To avoid confounding with differential in-migration patterns, I examine only native whites of native parentage. No analysis of black marriage patterns is feasible by this method at this time, because black marriage patterns were considerably differentiated by region, and the period was one of great interregional migration among black people.) The analysis here, to be sure, is not causal, but, literally, only tells us what characteristics of states—rather large and heterogeneous places at that—were associated with increasingly earlier marriage in the years just prior to 1930.[139] But by including as a predictor variable the extent of downward edging of the marriage age in the 1910s, we will be able to see not only what characteristics conduced to earlier marriage but also which of these were especially relevant to the shifts, discussed above, in the trends that were already extant by the 1920s.

Earlier marriage during the 1920s was facilitated by whatever promoted relatively rapid population growth.[140] The balance of migration—surely in a decade during which traditional channels of overseas migration were substantially plugged by restrictive legislation—probably indicates well the kinds of possibilities for starting a life on one's own that were reflected also in decisions to marry earlier rather than later. On balance, then, we may conclude that the prosperity of the 1920s conduced to couples taking the plunge into marriage. Two other easily measured factors were also reflected in the state-to-state patterns for males. The first—best measured by the proportion
foreign born in the population—suggests what inspection confirms: the states of the industrial belt of the Middle Atlantic and Midwest where large numbers of foreign-born persons resided more often than one would otherwise expect showed small or no statewide increases in proportions of men marrying young, a pattern strongly contrasting with the continued drop in marriage age in the homogeneously white areas in the Mountain and Western states (but not in the agriculturally depressed South). A second factor improving male marriage prospects depended not on improved resources in the hands of the potential groom but rather on improvements in the marriage market itself. Across the nation, ratios of native white males of native parentage to native white females of native parentage (taken as indicative of a single marriage pool) were in 1920 strongly skewed in favor of males in a considerable number of states, especially in the West, a product largely of "frontier" patterns of long-distance interstate migration. By 1930, this skewing had on balance considerably declined, and the state-to-state variance was markedly reduced.[141] The approach of states to sex parity among native whites of native parentage "explained" just about as much of the downward movement of male marriage ages as did population increase or proportions foreign born. We need not insist on the details of the model; but we should recognize that both the means to contract marriage younger and easier access to appropriate mates explain how men came, on balance, to continue their trend toward earlier marriage in the 1920s. In states that were among the faster in population growth and also among those in which the sex ratio moved most rapidly toward parity, 2.2 percent more of the males at 20–24, on average, were married in 1930 than had been in 1920.
At the other end of the distribution, in those states where population growth was slower and in which sex ratios among native whites did not move rapidly toward parity, or moved away from it, about 0.7 percent fewer men were married at 20–24 in 1930 than had been a decade earlier.

For women, the "market" for acceptable mates also improved, and in such a way that accounted for about as much of the continued if modest decline in women's marriage age in the 1920s as did population increase. But women were of course not aided by the general movement toward parity in sex
ratio that had been important for men. Rather, what improved women's marriage-market opportunities was an institutional change—the great expansion of the high school. Women at this time married on average between two and three years younger than men; and as high schooling became typical within the population, the age of school-leaving came close enough to marriage age that the high school and its informal social life became a critical arena for increasing numbers of girls to contract marriages. Accordingly, when states like California, New York, and Ohio added to the school rolls something like 25 percent of the 16- and 17-year-olds during the 1920s, it is not surprising that for women at ages 20–24, the proportions who were married increased rather markedly. In states that were among the faster half in population growth and the half in which school was extended the fastest, the average increase in the proportion married at 20–24 was 2.1 percent; in the states below par in population growth and school extension, the comparable proportion was a reduction by 1.6 percent in the proportion married. In assessing the significance of these patterns, it is worth considering that, outside of those alluded to, regional patterns did not appear; nor did proportion urban or pace of urbanization; nor did initial values of youthful-marriage proportions explain trends; nor did the trends in the 1920s merely continue on a state-to-state basis the trends of the previous decade. That is, marriage age dropped where circumstances facilitated it.

These factors, statistically significant, certainly do not deny the emergence of new values that addressed the right construction of the life course. But the state data do not seem to "require" normative change, suggesting rather that the sometimes-realized but always-intriguing possibility of earlier marriages for "modern" urban young people emerged out of the new material and social circumstances of the decade.

Parenthood

The decade of the 1920s was a period of overall fertility decline, a continuation and intensification of prior trends. In several respects, these years were marked by a sharpening of the differences in family behavior between classes and between rural and urban residents. Strongly associated with the fertility decline, among white women, was the spread of secondary and higher education. Black women, too, reduced their fertility, but for them, the decline was also quite steep within groups divided by educational attainment, as it was only among the better-educated whites. On the whole, these patterns of decline for whites (blacks were not so tabulated) held in both urban and rural nonfarm areas, although they were attenuated among farm women.[142]

When we turn from overall fertility to the timing of initial parenthood, the pattern becomes more complex. Decline in first-childbirth rates was sharpest among women already somewhat on the old side to be having a first child—those 24 or 25 and up.[143] By contrast, among those 18 or under, there was but little overall reduction in initial fertility. Indeed, something of a peak in first-parity fertility was attained by young women around 1925. While, overall, the 1920s were a period in which the likelihood of becoming a young mother at first grew, then declined markedly, for older women not yet mothers the pattern was one of slow decline for the first half of the decade followed by rapid decline in the second half.

Corresponding to and complementing the rise in nuptiality in the first part of the 1920s was a tendency to slightly more rapid movement into parenthood after marriage, a tendency that was common to whites and blacks, and across socioeconomic levels.[144] The trend was not matched, however, by similar increases in the years following the first one or two after marriage, a pattern that is congruent with the overall decline in fertility during the decade. First parenthood, then, following a marriage that often was earlier than in preceding decades, was relatively often attained relatively early in the marriage: a substantial subset of couples was taking advantage of a period of relative economic promise by moving quickly through the steps of family formation. Other couples—notably those who had already resisted parenthood for the first phase of marriage—were more often than previously postponing parenthood for several more years, many presumably having chosen the temporary childless marriage as desirable for a while and having been in command of the technology that enabled this. After mid-decade, however,
this pattern ceased and was replaced by one in which more couples than in the first half of the decade delayed parenthood longer after marriage. It is apparent from the data that it was not lifetime childlessness that took the place of the trend toward early parenthood within early marriage but rather a somewhat delayed parenthood within marriages that themselves continued to be contracted earlier in the American life course.

Birth control was rather widely but by no means universally diffused even in the late 1920s. Clinic data collected in the 1920s and 1930s indicate that the proportion using some method of fertility limitation had nearly doubled since the early years of the century, condoms being the method accounting for much of this increase.[145] At the time of the late 1920s marriages of the native white Protestants who were to be studied by the first major survey of such matters, the Indianapolis Fertility Study,[146] about two in three of these couples employed some method of birth control (not necessarily a contraceptive method, of course). The proportion doing so was only slightly in excess of half for wives who had ended their schooling before attending any high school but exceeded seven in eight among wives who had attended college.[147]

A retabulation of Kinsey data indicates that there were at this time sharp changes in marital contraceptive use (at least among Kinsey's relatively sophisticated respondents).[148] Coitus interruptus and the douche gave way to the diaphragm, while condom use remained constant. Retrospective data from relatively sophisticated New York City couples showed, likewise, a reduction by about half in dependence on withdrawal, but the condom, rather than the diaphragm, was becoming the contraceptive method of choice.[149] Women who used diaphragms were not only in a sense taking control of their own bodies; they were also taking a part in a revolution of sexual attitudes, in which the formerly unspeakable was necessarily now spoken.

Contraceptive information was not yet so widespread as it would be in subsequent decades. The Lynds described contraception in Middletown as publicly condemned but gradually tolerated as inevitable, couples of advantaged backgrounds being the quickest to embrace the practice. The Lynds were
struck by a conflict between individual beliefs and behaviors and far more restrictive official group norms about birth control (as about much else) that suggested "an underlying bewilderment considerably . . . widespread and more pervasive of the rest of their lives."[150] The Kinsey data indicate that perhaps one in four women who had sexual intercourse before marriage conceived premaritally—a proportion that gave no signs of declining in the 1920s, even as the proportion of young women placing themselves at risk increased.[151] Only 6 percent of Indianapolis wives who were to practice some form of fertility control before their first births had known any contraceptive practices so much as a year before their marriages; another 35 percent learned shortly before their wedding day. The largest category of brides who practiced fertility control before their pregnancies learned about contraception on their wedding night. Indeed, almost 6 percent practiced fertility control in some form without ever learning about contraception. When wives knew of a method before their marriage, quite routinely that method was douching, a method that was under their own control, if not especially effective from a strictly contraceptive standpoint.[152] About half as many knew of the condom at this point; almost none of the Indianapolis wives knew of the medically controlled diaphragm.

The organized movement arguing the virtues of birth control and making an effort to diffuse contraceptive information reached few women, having elected a physician-and-patient model rather than a public-health model and promoting the highly effective but highly cumbersome and intricate diaphragm.[153]

In Indianapolis, only half of those couples marrying in the late 1920s who tried to postpone their first birth succeeded.[154] The risks of pregnancy were considerable even among those who approached coitus planfully, which suggests how extreme they must have been among neophytes, many of whom, even at marriageable age, were ignorant of all but the absolute rudiments of sexual matters.[155] Such uncertainty helped sustain the general understanding that premarital coitus was on balance not to be entered into with anyone whom one was not prepared to marry.

Within marriage, almost six in ten of the Indianapolis women who had not finished high school used contraceptives of some kind or other between their marriage at the end of the 1920s and their first childbirth. Three in ten postponed childbirth to beyond their second wedding anniversary. The comparable figures for high school graduates were four in five contraceptors and 45 percent with their first births so late. In Indianapolis, some means of fertility control was employed after marriage and before first conception by only half of those who, not finishing high school, married before 19. But two-thirds of their educational peers who delayed marriage somewhat more practiced family limitation in some form. Among high school graduates, 77 percent of those who married young employed a method of birth control, and no less than 85 percent of those who married at 23 or older, with the intermediate group middlingly prone to limit fertility.[156] The results are visible in the way these different groups of married couples structured the transition to parenthood. Fewer than one-quarter of the early-marrying non-high-school-graduates were childless on their third wedding anniversary (most of those with children having had them within the first year of marriage), but 35 percent were if they had delayed marriage until after 23. Among high school graduates, the proportions ranged from 36 percent of the young marriers childless at three years to 51 percent childless among those who married late.

Girls' access to contraceptive information at this time, in fact, was closely related to their backgrounds and the kinds of lives they were moving toward. The clearest indication of this is probably the relationship of the timing of contraceptive education to the level of formal education they eventually achieved, as shown in table 10.

Table 10. Timing of Contraceptive Information, by Own
          Eventual Educational Attainment, Indianapolis Wives
          Who Contracept

                       Grade        High School            College
                       School     Some    Graduate     Some    Graduate
As girl                  4.2%     3.2%      8.1%       3.7%      6.1%
Premarital              26.3     30.0      32.6       36.6      47.9
At marriage             40.7     39.6      42.2       43.9      47.8
After marriage          28.8     27.2      17.1       15.9       8.2
                       100.0%   100.0%    100.0%     100.0%    100.0%

    SOURCE: Computed from Indianapolis Fertility Study data, unweighted.

We see at the end of the 1920s a diffusion of a morally significant piece of technological information. For those girls who were advancing the most markedly toward "modernity," contraceptive information was beginning to become something learned in adolescence, in advance of marriage, potentially able to structure boy-girl relationships, potentially able to affect the way marriage and family building were approached. Still modal, however, was for contraceptive knowledge to enter only as an element of explicit premarital instruction rather than as a part of the common wisdom of adolescence. (Girls of the most straitened socioeconomic background—and many of these must have seemed "bad" girls by the standards of the day—were the most likely to have learned about contraceptive methods in their girlhoods.)

Conclusion

Dating, petting, birth control, and an increase in the sexualization of life generally can be seen as having their roots as elements of the youthful life course well before World War I. That these new arrangements ramified through the experience of family life is apparent in contemporaneous revisions of divorce. The decade saw a rapid increase in the numbers of divorces, which greatly alarmed contemporaries. In terms of crude divorce rates, however, the entire decade of the 1920s seems far more like an accommodation to patterns rather suddenly introduced during the war period and its immediate aftermath.[157] More precisely, divorce trends in the 1920s can be seen here in good part as a response to impermanencies structured into marriages at their inception. When one examines rates based on cohorts of marriages begun at different dates, the second half of the 1920s was a time when divorce did seem to pick up momentum.[158] Divorces characteristically occurred progressively earlier within 1920s marriages, although the most substantial enlargement of the deterioration of marriage occurred around the modal point for divorce, three to seven years into marriage. Accompanying the increase in the number of divorces was a parallel growth in the proportion of all divorces in which the stated grounds were "cruelty," rather than "desertion," the previous modal category. "Cruelty" was as close as the legal rules of most states at the time came to permitting divorce on the consensual grounds that the marriage just did not work out.[159] It was as though the visible challenge to lifetime marriage suggested by changing attitudes as the 1920s advanced led, in turn, to an increased willingness to contract marriages that were less and less seen as permanent at their inceptions. Elaine May's reading of divorce actions leads her to conclude that while "most divorcing urbanites were not in the vanguard of a moral revolution" at this time, they were subject to a new "confusion surrounding domestic aspirations" and the nature of marital happiness.

The pursuit of happiness took couples . . . into wedlock, and then out again. Along with marriage, divorce was another step in this quest. . . . But, . . . rather than a triumph, it often seemed like a personal failure. In the divorce court, unhappily married individuals blamed their spouses. But away from the Court, they often blamed themselves.[160]

The 1920s promoted the emergence of our modern youthful life course, normatively sanctioned for the middle class, spreading among other urbanites: extended schooling combined with an early and gradual peer-structured courtship system, while promoting an early and often romantic marriage, in which the romance was in effect prolonged by the modest postponement of parenthood. The value change so often remarked in the 1920s was the sound that the middle class made in recording its somewhat ambivalent approval of what had increasingly become its own behavior and in proposing these values to the rest of the population as the right way to live. Where these values seemed morally vulnerable and also felt bad, as in the case of easier divorce, there breast-beating occurred, along with a
tendency to try to excise that portion of the evolving family-formation process and define it as the product of individual error, capable of being reformed out of existence.

In the largest sense, we are dealing with a change in the way families organized their behavior over their life cycles and understood the ways they were doing so, thus influencing the way that individuals, in structuring their own life courses, anticipated coming into their own. This reorganization was considerably influenced by more general aspects of the outlook of Americans in the 1920s, an outlook in which proximate gratification of the self was more highly prized (or less commonly condemned) than before and in which optimism about the material possibilities of the future was at a new high. At the same time, such values—and the material circumstances that underlay them—were not uniformly shared throughout the population. They were, however, held particularly commonly by young people, thus creating a modest but challenging rift between generations and thereby setting newly formed families off on their own somewhat more than otherwise would have been the case.

Shortly, the Great Depression was to alter the economic organization of the family—and, to an extent, its moral organization. The impact of the extended economic downturn on the way young people came of age and sought to form their own families was ramified and was the more dramatic because of the contrast it posed to the 1920s. The life course had changed in the 1920s, when individuals gained new options. In the 1930s, fresh reminders of external constraint on the individual would modify once again the youthful life course.

4—
In the Great Depression

Youth and Work

If the 1920s did not bring prosperity to all, times had been good for the families of most urban entrepreneurs, professionals, and salaried persons, and during most of the decade, urban manufacturing and service workers enjoyed rising wages and rather steady employment. White-collar jobs for both men and women had proliferated, and postprimary schooling had expanded concomitantly. A motif common to persons of both genders from these backgrounds was the assumption of fuller control over the construction of their own life courses, even if the families they founded were not particularly innovatory. The Great Depression was to cut bitterly into these newly developed life course patterns, even before the first cohort to enjoy them as adolescents had had their own children, even before youth could expect parents directly familiar with the new patterns to help in their implementation. Only one in twenty-five early-Depression teenagers and only about one in four of the late-Depression cohort of 'teens had mothers whose own teenage years had followed World War I.[1] The effects of the Depression on the structuring of the youthful life course were to be felt for at least two decades thereafter. And in the case of its effects on engagement, an institution with particular implications for the structuring of this part of the life course, the effects were to last at least through the mid-1970s.

Unemployment offers us a vivid clue to the severity with which the Depression hit young people. Children were still counted on as emergency economic assets in times of family economic hardship. However much they may have wished to do so, this was not a role young people could play in the depths of the Depression, for reductions in employment opportunities
were greatest among young people, of both sexes. Seniority, quite understandably, often determined who would and who would not survive work force cutbacks. Initial entry to the labor force was exceedingly trying.[2] Partly because of the decline in competition from remunerative jobs, formal education continued its rapid expansion in the 1930s. Proportions of young people completing high school thus grew markedly during the Depression decade, even though the immediate economic rationale of extended schooling was weakened by the abysmal job market for new graduates. Especially prominent were the increases in the proportions of high school entrants who persisted to graduation.[3] The decade of the 1930s slowed but did not halt the long-term trend among youth away from gainful employment and toward school, while at the same time slightly hastening the more gradual long-term trend toward labor force participation by married women. Organized labor, and others not previously allied with the "progressive" coalition that had worked to reduce child labor, now had become more concerned that youth be as much as possible withdrawn from the already-too-large labor pool, that is, they should be retained in school or drawn into a National Youth Act group or other government-sponsored age-segregated youth setting.[4]

Massive numbers of young people sought work. The most detailed set of figures available is for a large group of varied urban places in 1935, a middling economic year by Great Depression standards. These patterns are displayed in figures 10 and 11, for young men and young women, respectively. Neither part-time work nor the combination of school and work cushioned youth's emergence onto the labor market: until their early 20s, for both young men and young women, complete idleness was the modal experience at any given moment. Through age 18, more emerged to nonemployment than to employment, if we exclude those who were placed directly on the emergency-work rolls. A 1935 Michigan census showed that more adolescent youth were simply not looking for a job than were actively seeking a first job. Only at age 19 were more unemployed young men looking for their second jobs than for their first, despite their emergence from school some years earlier. And only at 18 were more of the employed youth working for


Figure 10. Patterns of Young Men Seeking Work, 1935

wages outside their family establishments. Dependence extended long for these young men.[5] In Massachusetts in 1934, a hard year in a hard-hit state, no fewer than 29 percent of all 19-year-old males in the labor force had been out of work for over a year, as had 22 percent of the young women of that age. This was the peak age for both, but even at ages 21 to 24, 22 percent of young men and 13 percent of young women in the labor force had been looking for jobs for no less than a year.[6]

More young women than young men, of course, did not offer themselves for gainful employment—and thus relatively often avoided distress in this sphere. There is some indication that, given the occupational distributions of the two genders, young


Figure 11. Patterns of Young Women Seeking Work, 1935

women's unemployment was exaggerated somewhat less by the Depression than was men's.[7] Even so, the proportion of young women unable to find full-time work was terrible—12 percent unemployed and another 21 percent not employed but seeking work at age 18, considerably more than the proportion actually employed at that age. By their nineteenth year, more young women were entering marriage than were leaving school, but despite this, and despite the fact that many sought no job, unemployment was a common experience, even among those who had held a job before. Inexperience was a terrible disadvantage to young people entering the Depression job market, with graduation a highly anxious rather than a proud and optimistic moment.[8]

Many young people did not have the luxury of holding themselves out of a labor market they knew to be especially hard on them, because the pressing need felt by their families outweighed the trepidation they must have felt. Young men left school and became available for employment far, far more quickly than the clogged labor market could extend jobs to them. Six months after leaving school in 1936, more than a quarter of male high school graduates in New York State were still unemployed, and one in seven was employed only part-time or seasonally. Of the employed male graduates, fully one-third had found jobs only as messengers, laborers, or assembly-line workers. Although the quality of their jobs was somewhat better, graduation from high school had not much improved these boys' likelihood of finding any kind of job at all, as compared with those who had left school before finishing high school.[9] Those who found work, because of "their weak bargaining position, because of their inadequate knowledge of the labor market and the oversupply of young workers in relation to the available jobs," were forced into physically harsh, ill-paid, insecure positions with "little opportunity for advancement or training for more desirable work." This was hardly calculated to provide a sense of control over present or future.[10] A Massachusetts study of urban twenty-year-olds carried out in 1935 showed that unemployed boys were twice as likely as employed boys to "strongly agree" that the Depression had "retarded the prospects of youth" and one and one-half times as likely to disagree strongly that opportunities "to get ahead" were as good now as in the mid-1920s. Unemployed young women were even more discouraged—relatively and absolutely—than were young men. The attitudes of those still at school—evidently still somewhat protected from harsh realities—were approximately equivalent to those of youth who actually had jobs.[11]

Youth's earning position by comparison with that of their elders had so deteriorated that, at the end of the Depression, the $629 median wage and salary income of young men 20–24 earning any income was less than half that of men their fathers' age. Even incorporating a wife's wage income, young couples' earnings summed to less than 90 percent of fathers' income, and the measured young-family wage and salary income was less than 60 percent of that for families whose heads were one generation older. We might contrast these figures with those of prosperous times a decade later: total incomes of families headed by persons under 25 were then 68 percent of those of families headed by the generation older. Nonheads at 20–24 had incomes that were 63 percent those of men of their fathers' generation.[12]

For such reasons, children remained longer than before as members of their families of origin. In 1940, it was only at age 23 that half of the males had left off living as "children" in their parents' households, and only at 25 were fewer than half living either as children or as other relatives, generally dependent. Although women married younger, only at 22 had half of them left their parents' homes.[13] Sometimes, however, children lived at home to help support the family. At the end of the Depression decade, if we set aside families in which no one was employed, more families had secondary workers than a decade earlier.[14] Among Maryland youth in mid-Depression, the proportions who said that their parents were at all dependent on them financially varied from 15 percent among those whose fathers worked in professional or technical occupations to no fewer than 57 percent among those with fathers in unskilled labor and 64 percent among children of farm laborers.[15] A 1936 study of youth in agricultural villages revealed that even here, where family labor was quite underpaid, by the late teens a quarter of boys and 14 percent of girls shared expenses with parents when they lived at home, and another sprinkling paid board. By ages 22–24, despite the obvious difficulty of locating paid employment there, 24 percent of the sons and 11 percent of the daughters paid board, and 43 percent of the sons and 36 percent of the daughters shared expenses.[16]

Of course, while part of these differentials can be accounted for by class variations in unemployment and insecurity during the Depression, part can be explained by different needs for supplementary income even in the best of times. The class system as it affected growing up was no creation of the Depression. But the Depression in various ways exacerbated its workings.[17] Thus, it was the least-educated youth who were drawn into the labor force in the greatest proportion and who, with the greatest need to supplement family income, found themselves in the worst labor-force position.[18] They had to compete for unskilled jobs not only with one another but also with adult males who had been laid off from better jobs. The pathos of these differentials is recognizable close to the surface of the figures themselves. In two-worker families in Philadelphia in 1933, for instance, 25 percent of second workers were out of work when the first earner was himself employed. But where the first worker was himself out of work, no fewer than 58 percent of the supplementary workers were also out of work. This ironic syndrome had existed before the Depression, too, but now became considerably more acute.[19]

Even when husbands were out of work, wives were only rarely able to compensate through their own gainful employment. Wives of unemployed men found it especially difficult to find jobs when they presented themselves in the labor market, although it helped somewhat to be a bit older; relatively few wives of unemployed workers were.[20] At this point, fully nine in ten men believed that married women should work only if their husbands were not "capable of supporting them." By comparison, nine in ten favored women working before marriage. At this time, 88 percent of women said wives should bow to their husbands' preferences in these matters. Another poll revealed that only 12 percent of the male respondents and 18 percent of the females said that they "believe[d] that married women should have a full-time job outside of the home," while an additional 31 percent of the men and 38 percent of the women would accept such employment by wives in time of duress.[21] The Depression thus helped create patterns of growing up that differed markedly from class to class just at a point when prosperity and social legislation, the cessation of immigration, and the development of a peer-conscious youth culture had seemed to be moderating such differentials. One tabulation in particular indicates the extent of this generalization and is presented in table 11 below. The tabulation derives from a large and carefully conducted national urban survey taken in 1935–1936, aimed at eliciting socioeconomic variations in health conditions and services; the survey subdivided youth into five socioeconomic categories


Table 11. School or Labor Force Status of White Urban Youth, by Age, Sex, and Family Income, 1935–36

                    Relief   <$1,000   $1,000–1,999   $2,000–2,999   $3,000–3,999
Males 16–17
  In school         63.8%    73.9%     80.7%          87.8%          91.7%
  Employed*         10.7      8.6       8.1            6.1            5.2
  Seeking work      23.2     15.1       9.8            5.3            2.3
Males 18–19
  In school         20.9%    33.9%     37.4%          45.0%          45.8%
  Employed          33.6     30.3      35.1           36.6           31.2
  Seeking work      43.2     33.6      25.9           17.0            9.1
Males 20–24
  In school          2.6%    10.5%      9.9%          12.7%          21.1%
  Employed          58.5     62.6      69.6           71.8           68.7
  Seeking work      37.2     25.9      19.8           14.7            9.6
Females 16–17
  In school         58.5%    65.4%     74.7%          80.5%          85.0%
  Employed           8.3      7.6       8.0            9.2            8.5
  Seeking work      17.6     11.5       8.8            5.1            2.4
  Housewives         5.8      7.1       2.7            1.3            0.8
Females 18–19
  In school         14.0%    24.0%     26.3%          33.6%          39.9%
  Employed          25.5     25.7      33.7           38.1           39.6
  Seeking work      30.9     21.3      20.7           14.5            8.0
  Housewives        18.9     21.3      10.5            4.8            3.2
Females 20–24
  In school          1.4%     5.3%      4.4%           7.1%          11.3%
  Employed          25.6     33.9      43.1           57.1           59.9
  Seeking work      17.9     11.6      10.2            9.7            7.1
  Housewives        47.6     44.2      36.1           17.8           10.4

    * For families on relief only, this category includes youth enjoying work relief: 3.9%, 14.5%, and 26.4%, respectively, of all males of the indicated ages and 1.8%, 5.9%, and 5.4% of all females. "Colored" youth (N = 38,523) were tabulated for only three socioeconomic levels and two age groups.
    SOURCE: Calculated from National Health Survey data tabulated in U.S. Federal Security Agency, Social Security Board, Bureau of Research and Statistics, Statistics of Family Composition, vol. II: The Urban Sample (Bureau Memorandum No. 45 [Washington, n.p., 1942]), 165–167.
    NOTE: Between 0.5% and 3.3% of boys, varying from category to category, and between 0.8% and 11.3% of girls were "at home," generally living with their parents but not at school, at work, or looking for work.

based on family income and included a distinction between on-relief and other low-income families. The table demonstrates how profound a difference family income made in how one grew up. Poor youth—reliefers somewhat more than those from families off the rolls at the time of the survey—received less schooling yet also experienced less gainful employment. By age 18–19, when a quarter to a third of youth were gainfully employed, the proportion at work had equalized between poor and better-off youth, and with increased age it was the better-off who found jobs. This differential was, if anything, even clearer among young women. The experience that was characteristic of poor children was "seeking work"—unemployment.[22]

Growing up with Stringency

Families that suffered no direct economic deprivation seem, on the whole, to have maintained their prior consumption habits. But when families had to cut back, the realms in which they were most able to economize were precisely those areas that had become in the preceding decade so important to the new adolescent styles of life: recreation, automobiles, and clothing.[23] The Depression affected different categories of persons in different ways, even within individual families. Detailed expenditure data, presented in capsule form in table 12, offer a clue to how changes occurred. The table indicates how much more a family with an income $10 greater than that of the next family would typically allocate to certain categories of clothing expenditure.[24] It shows, for instance, that in the expansive economy of 1918–19, a family that had $10 more annual income than the next family would probably allocate about 14 cents of it to more or better trousers and shirts for the male household head but fully 30 cents to more or better skirts, waists, and blouses for the late-adolescent daughter or daughters. In the straitened 1930s, fathers' pants would receive the same honor as before, while this aspect of daughters' clothing would receive only one-third the increment it had before, less now than their fathers'.

During World War I, families had seemingly favored their adolescent children, especially their daughters. Daughters' increments had exceeded those for all other family members.[25]


Table 12. Average Increase in Expenditures on Given Family Members for Selected Items of Clothing per $10 Increase in Family Income, 1918–19 and 1935–36* (increases in cents)

                    Trousers, etc.**      Hats, Caps          Shoes
                    1918–19  1935–36   1918–19  1935–36   1918–19  1935–36
Husband              13.8     14.1       1.8      1.8       3.5      3.1
Wife                 13.7      9.7       3.4      2.3       3.3      3.5
Son (15+)            22.0      8.7       3.2      0.8       7.4      2.3
Daughter (15+)       30.2     10.0       6.4      1.3      10.5      4.3
Son (12–15)           8.2      6.2       1.0      0.3       7.4      2.3
Daughter (12–15)      5.0      5.6       1.6      0.9       2.9      2.1

    * Average for New England, West Central, East Central, and Rocky Mountain states.
    ** Also includes suits, shirts, skirts, waists, blouses, etc.
    SOURCES: Estimated from U.S. Bureau of Labor Statistics, Cost of Living in the United States (Bulletin No. 357 [Washington, D.C.: USGPO, 1924]), table C; and U.S. Bureau of Labor Statistics, Study of Consumer Purchases: Urban Technical Series. Family Expenditures in Selected Cities, 1935–36: Vol. III, Clothing and Personal Care (Bulletin No. 648 [Washington: USGPO, 1941]), table 5.
    NOTE: Based on families with such members within a comparable range of family incomes for each year and excluding open-ended categories. The sequencing of items in the two surveys indicates that the researchers sought comparable categories of expenditure. The 1918–19 set includes five income categories over the range $900–2,500, while the 1935–36 set includes four over the range $500–3,000, because the latter is not limited to workingmen. I have excluded higher income categories included in this latter study so as to approximate the same socioeconomic range of families. The consumer price index was just slightly higher in 1918–19 than in 1935–36, so deflation does not much confuse these reckonings.

The comparison with somewhat younger children—girls especially—is most instructive. These children were apparently seen as simply too young to need much more than utilitarian clothing, and family budget patterns reflected this. And although wives' increments for hats slightly exceeded those of their adolescent and young-adult sons, sons received more for shoes and for outer wear than did their mothers or fathers. In this generally prosperous period, families were using a relatively substantial portion of their prosperity to adorn their late-adolescent sons and daughters as these children neared marriageable age in the new marriage market that was beginning to incorporate dating.

The Depression intervened in this development, and the new regime fits with a particular frustration felt by youth of these ages, as we shall see.[26] On a per capita basis, even after correcting for deflation, sons in two of the three measured categories of clothing and daughters in all three saw absolutely lower expenditures on these kinds of clothes. There was, too, a general decline in the proportion of incremental income spent on clothing of all sorts for all family members except the male breadwinner, for whom expenditure patterns remained just about as they had been before the Depression. Wives and younger children lost out somewhat, but the really precipitous declines hit late-adolescent sons and daughters. The rearrangement of the budget is understandable: the father was still ideally, and most often in fact, the main breadwinner, and his appearance at work, or in applying for work, might be crucial in determining whether he could hold a job. Less important now was the sharp appearance an older son or daughter might make among his or her peers. Fortune's acute mid-decade study, "Youth in College," discussed young women's tweeds, sweaters, and striking restraint in self-adornment. "With no make-up and little lipstick, she presents a casual, even an untidy, appearance while on the campus. And the casualness is carried over into the girl's surface air of self-possession, which is unstudied."[27]

Dating, which placed a premium on up-to-date dress and on the material capacity to purchase commercial entertainment for "a good date," was a system that had spread throughout the country only in the preceding decade. The parents of the children of the Depression had generally not themselves dated and thus had not experienced the particular strains on consumption capacities built into dating. While, as Glen Elder has cogently argued, in most families the need to pull together during hard times was obvious to all family members and no doubt generally acquiesced in, exactly how families might economize was by no means self-evident: each family had to make, and did make, its own decision. Who would pay the price of Depression stringencies and uncertainties had to be thrashed out again and again. This is exactly what Mirra Komarovsky remarked in her examination of family strain in the Depression. The extent of adolescent-parent breakdown in unemployed Depression families, she argued, depended considerably on the degree to which the child saw his or her own personal interests thwarted by the Depression. The example Komarovsky used to illustrate this point had to do with expenditures on a girl's clothes.[28]

Depression stringencies and insecurities modified residence patterns, income composition, and family expenditure budgets. These modifications, in turn, affected the allocation of family members' time. The straitened circumstances that Ethel Beer described for the single "business girl" living at home applied far more widely. Families simply did not have the secure wherewithal to create the circumstances under which a young woman could move easily through courtship. She lacked the clothes. No less, she lacked the leisure. "She may not have any conscious antagonism towards her family. Nevertheless, she feels constrained in this environment." Marriage, for her

is an escape, the only escape she can conceive of from her family. The husband as the rescuer from this tedious, restricted household is the only possible hero of her dreams. . . . How can she compensate for her social lack unless she breaks bonds and reaches freedom? Since to her the wedding ring represents this freedom, it is quite understandable that she should bend every effort towards procuring it.[29]

Postponed Marriage

Knowing what we do about the operations of mate selection processes in American society at the time, it is fair to say that those who married young often married into chronic economic insecurity. The Depression in this sense served to exacerbate the impact of social class on family formation and, in effect, to segregate the young families of this time into two unusually sharply distinguished categories: those formed by people who could see a clear personal route through the uncertainties of the Depression, who married prudently and could generally hope to take advantage of a steady income and declining prices; and those formed, by people who had but little to hope for economically in the immediate situation, only on some other, more generalized confidence in the ability to sustain a family in the future. In the Depression, marriages were postponed prudently by families of particular backgrounds and entered into with less prudence by others of other backgrounds. Sociologist James Bossard explained these kinds of differential marriage patterns in terms of the differing criteria that "older, more established groups" and "newer, less established groups" entertained for the conferral of status. Both classes were challenged by Depression stringencies. But the more established groups postponed marriage to maintain their "plane of living," while for the others, simply to marry was more honorific and important.[30]

It was difficult for young people, when they wanted it, to gain independence from their parents as reflected in neolocality and nuclearity. In the Depression, the ratio of household heads to currently married men of ages 15–24 declined from 79:100 in 1930 to 76:100 in 1940, although it caught up to parity by ages 25–34.[31] Among whites, the more education, the more difference age made to the circumstances of marriage: accumulation started more slowly among the more educated, making independent residence at marriage relatively rare, but given a few years it progressed faster, so that by their mid-twenties young married men with more education were considerably more likely to have their own households. Theirs was the greater "Depression penalty" if they married when young but the greater reward for patience.[32] Blacks married earlier, regardless of educational attainment, but at all educational levels and at all ages were far less able to afford independent households. At all levels of education, those who married later more typically settled into their own homes, independent of relatives and others. For blacks, whose generally precarious life courses were rendered all the more so by the Depression, it was invariably the more educated whose presumed wish for independent nuclear families was the more often denied—at least through their mid-twenties. The reason is eloquently simple: blacks in the Depression simply could not translate schooling into jobs that allowed a reasonably rapid accumulation for an independent residence at an appropriate standard of living.

The price paid for extended prudence under uncertainty was a challenge to the normative moorings that had governed the delicate timing of family-building transitions involved in prudence of this kind. How to maintain the nice degree of reserve required to commit oneself to a "steady" or fiancée but not foreclose other options, in view of the extended waiting time that prudence required in the Depression? How to postpone childbearing longer than the couple of years formerly characteristic for prudent but not downright emancipated couples, without falling into the selfishness that extended childlessness was thought to encourage? How to preserve the emotional tenor of such relationships over a different course than that worked out by the couples who had, within limits, redefined tradition just a decade before? Asked in a 1934 survey what irritated her about her extended engagement, a twenty-six-year-old secretary put it thus: "People asking when we are going to get married—giving us advice and making our plans and arrangements for us."[33] Much was now problematic: not the rules themselves, as in the 1920s, but how those rules could work themselves out in particular lives.

It is eloquent testimony to the importance men and women attributed to marriage that although marriage rates were indeed affected by the Great Depression, they were affected only somewhat. The economic downturn far exceeded the nuptial downturn: a drop of 18 percent in gross national product between 1926–1929 and 1930–1934, for instance, was associated with only about a 9 percent drop in the number of marriages between the same dates. Marriages could be postponed only so long in hopes of happier economic circumstances without wholly disrupting the existing mechanisms of the marriage market. Eventually most would be celebrated, under modified economic expectations. Even during the course of the Depression, Jessie Bernard noted that marriages postponed when the Depression first struck showed up as a bumper crop of marriages among somewhat older people only a few years later.[34] When we examine proportions ever married in the 1950 census, we find no lasting deficits whatever in the cohorts that might ordinarily have expected to marry during the Depression, cohorts that might have found they had postponed too long ever to marry.[35] Most did marry, even if many of these marriages were long delayed. Their taste for matrimony was not permanently or even long affected. Tastes for rather young marriage were still in place when war mobilization produced circumstances that promoted a speedy transition. And, as we will shortly see, the Depression had by then weakened the moral—and chronological—effect of engagement.

The Depression even had an impact on the ceremonial context of those marriages that were contracted in the face of its hardships and uncertainties. The scant data point to a tendency for Depression marriages to be undertaken with less ritual than before and, arguably, with somewhat less ritual oversight than was the norm. A subsequent poll that asked the setting of respondents' marriages shows a distinct dip in church weddings among persons whose ages suggest they married in the early Depression years. Weddings both in city hall and in the home of one of the couple's parents increased correspondingly.[36] In Philadelphia, weddings at city hall (only the tip of the iceberg of marriages with reduced ceremony) rose from about 7 percent just before the Depression to 11 percent in 1932 and 12 percent in 1933, then fell back to 11 percent in 1934 and to an ordinary level by 1936. Jacobson notes declines in New York City and Milwaukee and comments that "some of the couples, who would have had a church wedding, postpone their marriage[s] until better times; others are wed by a civil officiant."[37]

On the aggregate, marriage age edged somewhat upward during the early Depression, as relatively young and average-age couples postponed matrimony. The initial cluster of marriage delays came disproportionately from persons living in socioeconomically less well-off areas, but these residents adjusted to lowered material standards at marriage relatively more quickly than those living in more prosperous areas.[38] Detailed New York State figures on first marriages, by single calendar year and single year of age, also reveal a change in preferred or prescribed marriage ages. In the late 1920s in New York State (outside of New York City), the age distribution at first marriage had been bimodal for women, with the primary mode at 21 but a pronounced secondary mode at 18. For men, the lone mode was 21, and it was sharply peaked. When the Depression hit, the numbers of young women's marriages at 18 and young men's at 21 dropped markedly, while marriages at older ages gained. Even as the Depression was ending, the 18-year secondary mode had not been reestablished among women, and 21 remained in eclipse among men. For men and women, especially women, the marriage ages, both the expected and the presumably preferred, had shifted later.[39] Whatever moral or emotional premium had been achieved by men and women who married precisely at 18 or 21 was evidently forgone in the face of sterner exigencies. But the marriage market continued to run smoothly.

Both the New York State data and annual Massachusetts data[40] reveal two interrelated patterns over the Depression years. Numbers of first marriages varied from year to year, following business conditions, but not to an equal extent for all age groups. Both younger and older marriages—the latter trending somewhat upward, the former downward over the period—varied less than did marriages at about the modal ages. Younger marriages, it seems, were in a sense insulated from the economic cycle by the fact that many were contracted under duress of pregnancy and in any case were rarely prudent. Many of the marriages of older people, in contrast, were insulated by the economic reserves such people had accumulated.

Brides' marriage ages shifted toward grooms'. The point is of more than demographic significance, as can be seen by contemplating the probable explanation for the phenomenon: women's gainful employment before marriage. Quite likely, material accumulation before marriage was widely seen as more crucial than before, if also more difficult. Women's contribution to the enterprise was proportionately emphasized, and this was surely consequential for the roles they could claim within marriage. And this new effort was most prominent among women with the training to take up clerical occupations, which fared rather better than manufacturing employment during the Depression, even drawing new kinds of entrants into the labor force.[41] The tensions of marriage postponement, and certainly the sexual ones, were mostly ascribed to the man, a pattern especially true in the late-marrying middle class.

Table 13 takes advantage of a special census carried out in Cincinnati at the middle of the Great Depression. It reveals that the downward trend in age at marriage that had characterized the 1920s experience of both males and females, for native


Table 13. Proportion Ever Married in Cincinnati, 1920–1935, by Sex and Age, for Native Whites and Blacks (in percentages)

                              Males                     Females
                     1920     1930     1935     1920     1930     1935
Native white
  15–19               1.3      1.6      1.3      8.2     11.2      5.0
  20–24              22.9     28.1     22.3     40.9     46.8     33.4
  25–34              61.8     68.8     68.0     67.7     74.6     73.9
Black
  15–19               4.7      3.5      3.1     25.4     26.3     17.4
  20–24              42.9     43.5     38.6     69.7     72.9     66.5
  25–34              69.4     74.4     71.8     85.4     89.6     86.4

    SOURCES: Calculated from Census 1920–1, 474; Census 1930–1, 977; Cincinnati Employment Center, Ohio State Employment Service, The Population of Hamilton County, Ohio, in 1935 (Studies in Economic Security: II [Cincinnati: Cincinnati Employment Center, 1937]), 52–53.

whites and blacks alike, was sharply reversed in the first half of the 1930s. Proportions married by age 24 were invariably reduced to levels even lower than those of 1920. But for those above age 25, the Depression effect was almost invisible in Cincinnati. As we have gathered from the national data, Cincinnati men and women were already catching up on their marriages, after a delay, by mid-Depression.

Small-area data for 1935—107 census tracts in the city of Cincinnati—allow us to explore the factors that influenced marriage timing as of that year, examining the characteristics of census tracts that contained relatively high proportions married among native white young men and women at 20–24 and 25–29.[42] Because we can compare correlates of marriage timing for different ages, we can infer from the differences the way various relevant factors impinged on marriage decisions made by those who may have felt themselves at a relatively comfortable age to delay and those who, a bit older, perhaps were more likely to be strongly propelled into marriage now rather than later. A common set of factors influenced marriage timing across genders and ages, but with certain telling differences. For all the age categories we are examining, a neighborhood sex ratio approximating equality with the opposite sex conduced to marriage, as did whatever circumstances had promoted population growth in that tract during the preceding five years. The presence of large numbers of foreign-born whites was, for all groups, associated with lower proportions married, probably because where there were many foreign born, there were also many young native white men and women who were second-generation Americans, long quite slow to marry. High-rent districts were characterized by late marriages. Finally, where high proportions of all adults—and thus of marriageable-age women—in the census tract were engaged in gainful occupations, marriage was considerably inhibited. The population growth variable (which may mainly have indicated suburbanization) was associated with enhanced marriage propensities of older more than of younger people. This relationship, however, was powerfully reversed for the economic variables, which were far more strongly associated with variation in younger people's marriage propensity. The impact of proportions at work, finally, differed for men and women: although for both sexes high proportions of the tract's population at work inhibited marriage propensities, it did so more for younger men and for older women. But for neither sex was this factor anywhere near so powerful as the apparent tendency for those living in higher-priced areas, especially the young, to postpone marriage considerably more than those living in cheaper venues. Put another way, whereas young people in relatively prosperous circumstances were much more likely than people of like age living elsewhere to put off their marriages in the middle of the Depression, by the time one was between the ages of 25 and 29, this differential caution no longer obtained. Normative considerations, and the possibility that people so well circumstanced had put aside an adequate nest egg to sustain a marriage even in uncertain times, now took over.

The number of marriages contracted from 1929 to 1935 by men in different occupational categories is shown in figure 12, indexed with the pre-Depression number of marriages set equal to 100.0.[43]

Figure 12.
Marriage Rates for Men, by Occupation, 1929–1935

The graph shows that young men pursuing certain occupations in the early Depression—notably skilled workers (in part because of their sensitivity to the highly cyclical building trades) and proprietors—characteristically followed the economic trends closely. But the semiskilled—factory employees—were slow to defer marriage and quick to resume it. Unskilled laborers, who no doubt were thrown out of work more quickly than the others, showed a variant of this pattern. One suspects that for those with skills nowhere demonstrable outside the particular work situation, long-term prudence was not particularly called for, only good sense enough not to marry just at the moment one was oneself unemployed. Clerical workers were particularly quick to delay marriage but quite responsive as well to early signs of recovery. The professionals showed a distinctive pattern, promptly postponing their marriages and hardly letting up on their restrained nuptial tendencies even as the Depression moderated, no doubt in response to the continuing need to establish a firm. Retrospective census materials on marriage age by educational attainment for native white women in 1940 show that as the Depression years passed, less-educated women seemed to countenance younger marriages, while the more educated essentially maintained their initial hesitancy. The kinds of cautions that the Depression imposed on young couples seemingly affected primarily those whose educational backgrounds suggested middle-class aspirations and a relatively high target of material sufficiency for marriage.[44]

Current and prospective unemployment, too, inhibited marriage more for those in some occupations than in others. The dividing line was not socioeconomic status, it seems, but job-related outlook on the future. Among higher-status males, young professionals and low-status white-collar employees, but not proprietors, were rarely married, for their human capital investment in the future was wholly dependent on the right employment opportunity. Among manual workers, too, it was those with valued, hard-gained skills who apparently held off marriage until reemployment. But for those who, like miners or masons, had skills their industries all too frequently underused, marriage postponement during unemployment was rarer.[45]

The Subtle Alteration of Normative Patterns: Engagement

The dominant interpretation of the impact of the Depression on the American family is one in which the stress served in effect to weed out the less resilient families, emphasizing thus the considerable vigor of the family institution.[46] Elder's powerful work has especially had the effect of emphasizing this general conclusion.[47] There is, however, another way we can turn the family-in-the-Depression question around to suggest a somewhat different conclusion. For if the institution of the family emerged from the Depression strengthened, so also subtle changes were introduced. We will here turn attention to a single aspect of family building—engagement—that was subjected to a kind of pressure by the Depression which palpably modified its content. Thus modified, but still a common constituent of institutionalized family-building patterns, engagement made its contribution to post-Depression and particularly postwar marriages that looked much the same as what had come before but felt rather different.

One of eighteen "charges" brought against "Society" in 1935 during a well-publicized mock trial staged by the Council of Social Agencies was "allowing conditions to exist under which young people are unable to marry due to lack of employment." After a reported eighty thousand words of testimony and due deliberation by a jury of twelve (adults) drawn from "all known organizations" in the community, "Society" was held guilty on six counts. Negligence to create material conditions conducive to marriage was one of the six.[48] To be sure, the connection with legislative action is strictly speculative, the item highly hypothetical and out of context in the brief questionnaire. However, for all the caveats it properly calls forth, it is suggestive of the terms in which large numbers of Americans contemplated the specter of delayed marriage in the middle of the Great Depression. The eighth of "Ten Modern Commandments" of love propounded in the same year by True Confessions was "Thou shalt make use of thy emotional energy through sublimation." It followed that "society's job just now is to make it economically possible for young folks of marriageable age, who are in love, to find fulfillment within marriage."[49] Roy Dickerson, the YMCA marriage counselor, held that when "the natural hopes of a[n engaged] couple are frustrated, . . . they are likely to feel rebellious against a social system that they hold responsible for their disappointment."[50]

The idea of social action was cheerful fantasy. Perhaps instead, couples' "natural hopes" were redirected and the institutions channeling their hopes changed. At least one restrained scholarly study argued that Depression age norms were so violated by delayed marriage that either economic or moral restructuring must "certainly" soon come.[51] A 1935 survey of New York City youth found that one-fifth of young women aged 18 to 24 and one-third of young men aged 21 to 24 believed "the Depression had interfered with their marrying." The survey analysts noted "a sameness to the reasons [i.e., economic] that is monotonous until one tries to visualize something of the unique sense of misery" the youths were expressing. They concluded that "the effect on mental well-being . . . is far-reaching."[52] The engaged couples in the late-1930s study by Burgess and Wallin repeatedly expressed frustration because economic conditions prolonged their engagements unduly. Engagement, which had evolved into a time for testing true love, had become a trying time in a different sense.[53]


Such respected experts on adolescence as Carolyn Zachry reflected on the marital-scheduling difficulties of the Depression as an opportunity to intermingle subsidy and socialization for marriage.

To many an adolescent, a job is also the prime factor in determining whether or not he can get married. For those thousands of young people joblessness means frustration, not only of their ambitions in the business and professional world, but frustration of their psycho-sexual desires as well. Of course, in many boys and girls the desire for marriage is confused with their desire for status and prestige. . . . We—parents and educators and miscellaneous adults alike—[should] recognize that our task is much more than that of enabling young people to get married by helping them to become financially self-sufficient. We must also help them to achieve more mature attitudes toward marriage.[54]

Birth control advocate Robert Latou Dickinson, evolving a new notion of family formation that took fuller notice of the urgent drives of the parties involved, argued in 1936 that "early marriage and [a] shorter period of engagement are necessary; yet they are impractical unless contraception may be employed."[55]

Addressing frustrated youth directly, schlock publisher Bernarr MacFadden spoke less of maturity but trumpeted the same theme of the threat posed by frustration to the very institution of marriage.

Young people are afraid to marry these days unless they can begin, materially, where their parents left off. They want all material comforts ready to hand, and an assurance that they will not have to give up one small thing for the added gift of love. . . . They get things, but miss the fine edge of marriage. That fine edge belongs to youth. It is youth, the joy of struggling together, of building together. . . . Security is no gift from the outside. . . . Why not marry while young?[56]

And eugenic marriage counselor and popular writer Paul Popenoe likewise urged direct action lest the road to marriage stretch so long that disastrous tensions develop. "In heaven's name, why wait? . . . If you are sincerely in love, old enough to know what you are doing, understand what marriage means and are free to enter into it, you have no right to let anything, least of all money, bar you from happiness. . . . I've never known of a home broken up by lack of money," exhorted the marriage counselor, noting that money worries were a special "ogre" to engaged people, so much so that it is sometimes "so frightening that they wish they weren't engaged."[57]

In 1937, the Roper Organization posed a remarkable question to a representative sample of the American population: "Should the government give financial aid to young people to help them get married and establish homes?" Remote as this was from the nation's highly private conception of the circumstances and basis of marriage, no fewer than 38 percent of all respondents answered "Yes." Just 54 percent rejected it.[58] Women were slightly more enthusiastic about Roper's government-subsidy proposition than were men: 41 percent to 37 percent. This difference showed up not among the younger respondents but among the older ones. Older women were almost as favorable to the proposition as were younger women: their vote was, in a sense, an assent to the critical importance of marriage to women generally rather than to their own immediate needs. For men, however, age mattered. Forty-three percent of both men and women under 24 supported the hypothetical proposition, but by gradual steps, assent among the male respondents declined to 29 percent among men 55 or older. Independent of age, socioeconomic status was negatively related to assent to the government subsidy.

Many contemplated a change in some of the rules surrounding the marriage process, a reorganization of some elements to preserve the essence. Engagement was an institutionalized pattern located at a life-course phase of deep frustration. Among Kinsey's respondents married during the Depression, between two-thirds and three-fourths had been engaged prior to marriage, with engagement just slightly more common among the college-educated than among those who never went to college. Kinsey's data also point to a slight but temporary decline in the incidence of engagement during the Depression.[59]

Too amorphous to be altered in any formal way, engagement was subtly—but lastingly—modified in its meaning, particularly with regard to the constraint the institution placed on sexual expression. The 1938 Good Housekeeping Marriage Book maintained that "an engaged couple who are sure of their hearts and minds should be helped to marry as soon as the plans for the marriage can be wisely worked out." Since "this usually involves financing, . . . wise parents today cooperate so that the young couple do not have to wait too long."[60] Family sociologists at Cornell University reported that as the Depression wore on, parental subsidy was in fact offered more often than before and accepted (even expected) more often, although the students were still wary of the possible strings that might be attached to such support. Their campus survey suggested that in 1940–1941, more than one-third favored parental support for college marriages and support while couples "get on their feet," and an additional one-fifth favored emergency aid.[61] A student poll at the University of Colorado in mid-decade reported that six in ten male and female students believed that financial aid from parents was acceptable to permit marriage; almost as many also said they would accept a "dowry" system.[62] Parental aid to prompt middle-class marriage was in the early stages of being institutionalized. But this was not yet common enough to take the pressure off engagement. "I was engaged for a long time, but I couldn't get a job." "We broke off—no money."[63]

In the Depression, engagement became an especially ambiguous institution. "About four months ago I met the man that I have chosen for my husband. He proposed about a month ago, but has not as yet given me an engagement ring. Should I consider myself engaged before I have the ring?"[64] Neither an element of peer culture, as was dating, nor a step of unquestioned legal significance, as was marriage, engagement was nevertheless held by numbers of advisors and commentators to be either or both. In law, engagement was equivalent to a contract to marry—but a contract that was open to highly discrepant construction. The engagement ring was legally interpreted as a "consideration" that made a promise to marry contractual, but both legal texts and advice texts indicate that many couples instead saw the ring as a gift. Presumably, they also viewed engagement as less than a formal contract.[65]


According to Burgess and Wallin, "the proportion of broken engagements is on the increase" in the mid-1930s, because "even engagement has become a trial relationship during which love is assessed."[66] Etiquette books of the day expressed distress at the dyadic, negotiated quality of recent engagement, which, unlike betrothal, no longer required the suitor to have gained prior permission from his prospective father-in-law.[67] Burgess and Wallin chose to interpret engagement as an institution in transit from a less to a more important function. "In the past three decades [i.e., since about World War I] there has been a marked change in attitude toward engagement. It is now considered as the last stage in the selection process, . . . its preeminent function the final opportunity for the couple to find out if they are fitted for each other."[68] Engagement thus linked dating to marriage, enlarging the sphere of young people's volition. All writers agreed with Ernest Groves that "the engagement can have little value as a preliminary testing of the relationship before marriage unless with it goes the possibility of breaking off the relationship."[69] An etiquette for breaking an engagement was developed which was more completely elaborated than that for establishing one. And Burgess and Wallin found that 30 percent of their engaged respondents in the mid-1930s had been previously engaged.[70]

Engagement was now supposed to serve to help a couple navigate a safe course to lasting marriage where tradition had largely ceased to offer explicit rules for behavior. By the 1930s, dating couples might form and reform without great social or emotional costs, enabling young women and men to learn the range of personalities to be found among socially acceptable partners. Dating, however, depended so much on the partners exchanging the material and physical wherewithal of "a good time" that dating seemed too brittle, too brief, perhaps too exploitative, to contain the more tender phases of courtship.[71] For some, but not all, "going steady" constituted an intermediate step in this direction.[72] The question of sexual compatibility was something else again. "The social attitude toward betrothal should not be too rigid," wrote Popenoe, but should allow for a gradual, cautious loosening of the inhibitions that govern dating. "Where betrothal is regarded as equally sacred and binding with marriage," that is, governed by social controls rather than the situational application of internalized values regarding intimacy, on the one hand, and the double standard, on the other, "this [exploratory] function is largely lost. Equal loss results from taking the betrothal too lightly—where it is merely regarded as a convenient cover for intimacies that would not otherwise be approved socially."[73]

The critical distinction between engagement and dating lay in the way that the extent of physical intimacy was settled on. When dating, boys proposed and girls disposed, this being one element of a culturally defined and peer-overseen negotiation. In engagement, the couple was now publicly recognized as a unit, the constancy between the partners reinforced by the social recognition of the "opalescent mist of gossamer delicacy" that convention enjoined between the couple.[74]

When courtship prospers it leads to the mutual fixing of affection and this in turn creates need of a public recognition of a special relationship. The betrothal expresses the wish of both the man and the woman for a sense of security and exclusiveness in their love. From the point of view of its function as related to marriage, the engagement, by removing uncertainty in their relationship, provides favorable conditions for each person to become well acquainted with the other before making a commitment which is presumed to be a life union.[75]

The assumption was that the period of asymmetrical bargaining ended with engagement and that a wholly mutual period of "exploration and discovery of personalities, a period of adventuring in adjustments," ensued.[76] Engaged, one no longer simply accepted or rejected what one was offered. Instead, one sought to discern and perhaps to undertake those changes one should and could make in oneself in order to enrich the unity of the couple.

An engagement period of about six months is not too long . . . to be sure that upon the instinctive basis of sex attraction a truly personal love has been founded. For sex must be built upon to create love. . . . These ideals of sex relationships and love relationships should form part of that great bulk of questions that must be talked over between a betrothed couple.[77]


A suitable degree of physical intimacy short of coitus, which would lead inevitably to "an anti-climax of relationship,"[78] was exactly one of the things a couple was supposed to discover in engagement. Margaret Sanger laid out the situation with unusual precision in 1926, and on entirely conventional premises.

The fiancé's breath, odor, touch, embrace and kiss must be pleasing to her. If they are not . . . then under no circumstances should the engagement be prolonged. . . . The intimacies permitted during the engagement, the legitimate intimacies of kisses and caresses, in the protecting atmosphere of poetic romance, thus fulfill a distinct and all-important function—the deepening of desire and the commingling of the spiritual and the physical. The engagement with its growing emotional bond is thus not merely a social convenience; it is the fulfillment of a necessary and vital process.[79]

The sexual tensions of engagement were entirely congruent with the "frankness" appropriate to personality exploration. "Frankness means that whenever either one becomes aware of a rising surge of sexual desire, it will be possible to say, 'I think we had better be doing something else.' . . . Engagement may be still further enriched by the development of the spiritual resources of personality."[80] An author in The Good Housekeeping Marriage Book reflected the ambiguity engendered by engagement in a time of general social stress and change when he reassured his readers both that, on the one hand, "if they have . . . decided to wait, they need have no fear that this indicates a lack of sex feeling," and that, on the other, if they find waiting hard, "they should be glad that they do have 'sex hunger.'"[81]

Some sense of the variability of engagement as an institution can be inferred from the great difference among instances of how long engagements lasted before marriage. Engagements might at their outset incorporate considerable certainty about a marriage date, or they might imply nothing more than an intention to marry at some point. Half a year to something over a year was generally held to be the optimal length for an engagement, but about one in three of Burgess's 1930s couples was engaged less than half a year before they married; Terman's findings were similar.[82] A slightly greater proportion of the entire Kinsey sample had short engagements.[83] However, these same sources indicate that almost two in ten engagements that eventually led to marriage were at least two years long. The Burgess couples reinterviewed after their eventual marriages on average received higher "marital adjustment" scores the longer they had been engaged, although the evidence presented suggests that this was perhaps a function of duration of acquaintance rather than a result of engagement per se.[84] In an engagement lasting indefinitely, could one sustain such a subtle interpenetration of egos without at the same time according one's partner other intimacies? The institution was vulnerable. Timing and content could not be separated where couples were expected to pet heavily but to refrain from coitus, but the duration of this period was subject to unpredictable upward revision. "I am 20 years old and am engaged to a fine boy who is 21. Unlike most boys, he realizes that it is not right to monopolize me and keep me from going with other boys, because he is not working seriously and cannot afford to take me everywhere or to marry just now."[85] What was to be the content of an engagement that was thrown off schedule, where mutual exploration would lead perilously close to forbidden sexuality, where even day-to-day pleasures were either riskily domestic or prohibitively costly?

Engagement was exactly that point in the family-formation process at which young people were supposed to experience, and weather, their acute doubts about subsequent steps in the process. The opalescent mist was also almost invariably a period of episodic tension. Longer engagements may or may not have promoted such doubts, but surely they were the occasion for many of them, especially in view of how hard the economic uncertainties of the period bore down on those in the family-formation years. The Depression induced a particularly focused eagerness to be done with one's engagement. The war—and the prosperity that followed—would soon provide an occasion for this, so that the family-formation process came to be permanently modified through an eagerness to change it at its weakest, least defined, least normatively satisfying element: engagement.


A 1935 True Confessions story, "Love Hazards," purports to be parallel interviews with an engaged pair, Dorothy, 20, and Bill, 22, who have realized, after two years of waiting in the engaged state, that another two years' wait will be required.[86] The editorial presence asks, "What is Society going to do about them—all these young people who want to get married and can't?" The editor says that the Depression is immediately to blame for this tension but that really it is built into the mores and the economy more generally. The whole is a mythic explanation for, and thereby justification of, change in the normative structuring of the youthful life course.

"But gosh," exclaims Bill in his text, "nature never meant the preliminaries to last two years! Nature never intended the courtship to be dragged out forever. . . . I can't think that it's anything but natural for a chap who's in love to want his girl. I can't think he'd be much of a lover or much of a man if he didn't." Bill worries that if unsatisfied, his "urgent physical need of her" would so structure their engagement that all her other attentions would cease to please him. "Our engagement and my disposition are being ruined." Bill contemplates having recourse to Jenny, "an easy girl, a good-natured, cheap little girl," to slake his immediate cravings but tentatively rejects this solution, out of respect for Dorothy (not Jenny). He also looks forward to heightened sublimation through his studies. Yet he cannot believe that the institution of engagement is binding enough to prevent jealousy from creeping in, given a long delay: "Can I expect a gay, pretty girl to stay home and hold her hands evening after evening for me?" Bill proposes, as the only half-satisfactory solution to this dilemma, immediate sexual consummation with his fiancée, an emotionally appropriate if morally mediocre expression of intimacy in engagement dragged out too long. "Suppose I knew she was all mine—really mine. Do you suppose I'd be jealous of any chap then? Not I. And do you suppose I'd ever look at another girl?" He quiets his moral qualms in classic fashion: "It's not us that's wrong, but Society."

Dorothy says no, and the matter is in limbo as of the narrative present. Dorothy protests that her physical needs are every bit as urgent as Bill's. "Girls aren't different. I do want you. . . . But don't you see, don't you see, you're so worth waiting for." Where Bill evokes nature, Dorothy evokes "a sentimental little picture in my mind of our wedding night—mine and Bill's, myself in white satin and lace, shy and yet eager, Bill ardent and glowing." Under proper restraint, Bill's fancied animality suits Dorothy's conception of things, but much of the time, now, it seems all too much like Bill's lusting after cheap, easy Jenny. Dorothy knows that sex is not a sufficient basis for lasting marriage. Dorothy protests that "much as I want to belong to Bill, I can be happy just being engaged to him." But Bill puts it thus: "She'll someday be my wife, legally as well as actually." Mapped on the somewhat shaky double standard of the day, the idea that sexuality is the touchstone of possession together with the enlightened, volitional definition of engagement as a period of growing mutual commitment had made the institution itself a murky battleground between the sexes.

We are offered an unresolved mythic struggle between nature and culture, following the then-polite convention of man as natural, woman as cultural. The context, however, is distinctly historical. In their different ways, both Bill and Dorothy deny the operating assumptions of the double standard of sexual conduct. Yet circumstances prevent a symmetrically structured family-formation process. Even if the sexual and moral natures of male and female were no longer assumed to be at opposite poles, nevertheless Dorothy's sense of the strength of their love is only enhanced by the challenge the economic pinch poses to their shared timetable. But for Bill, "Society" proved bankrupt when it failed to provide the couple in timely fashion the promised marriage that was to be their reward for noncoital courtship.

Contemporary students of mores recognized the difficulties of the situation. McGill and Matthews noted that even dating was difficult because so few had their accustomed pocket money. Even worse was a decline in the quality of friendships between boys and girls because of "what happens to friendship between the sexes when this important possibility [marriage] is ruled out." Cavan and Ranck noted that employed girls had the advantage that income brought to their dating lives, recording such forlorn expressions among unemployed girls of marriageable age as "has boy friends but no clothes to wear when she goes out" and "the boys do not have jobs."[87]

From a narrowly behavioral perspective, engagement timing seems to have changed only slightly during the period we are considering. We have already noted a slight reduction during the Depression among Kinsey's married respondents in the proportions who had been engaged. The same cohort seems to have had, on average, somewhat shorter engagements, perhaps an extension of a trend established during the previous decade. The data, however, do point to a considerably more rapid decline in the average length of engagements as World War II approached and through the war, although the total period of acquaintanceship preceding engagement remained roughly unchanged.

Might not an institution that no longer restrained but now expressed a highly sexualized intimacy, that was growing shorter, and that arose as marriage ages became younger, begin to lose the special, tentative, exploratory quality that many applauded, perhaps passing some of it on to the marriage itself? Might not the slippage of the institution, reflected in the notes of a young person's discussion group, be general? "Is it all right," the 18-to-25-year-olds asked the minister who led the discussion, "for engaged couples to have sexual relations and if not, why not? Suppose they can't afford to be married? What about couples who are not engaged? It's natural, isn't it?"[88] The unraveling of the one tie seemingly threatened the next. Nearly two in three University of Colorado students—with no difference by gender—told an interviewer that at least some "sex liberties" might be taken during the engagement period.[89] Hints of what happened can be found in the Kinsey sex-history data. These show that those who had not been engaged before marriage had for decades been those whose premarital sexuality had evidently been the less constrained by social conventions. This relationship held true in the Depression, at higher levels of premarital intercourse. In the Depression, too, long engagements led more often than before or subsequently to marriages in which the couples had bedded before marriage. In the Depression, long engagement came to mean something special to those who, like Dorothy and Bill, enacted it. We must at least entertain the idea that a part of the Depression concern about delayed marriage responded to a real change in the ways engaged couples understood their relationship—in the resolution, coyly omitted from the text, of the struggle of Dorothy and Bill.

On the whole, Kinsey's respondents who had married without previously having had coitus with their intended expressed higher assessments of their marital happiness. This relationship, however, varied with engagement and, interactively, with date of marriage. Those who had never been engaged were the least affected by premarital intercourse in their assessments of their own marital happiness; those who had had quite long engagements spoke commonly of the unfortunate consequences of their transgression. This relationship makes perfect sense in view of the ambiguous test of restraint and intimacy that engagement still posed until the Depression: those couples who took the test most seriously, and passed it, celebrated their joint triumph in the day-to-day context of their marriages. But those who yielded to their frustration had failed, and this failure promoted a sense of weakness in their marriages. This was true up to those cohorts marrying in the Depression. But the relationship began to be effaced in the Depression cohort. No overt revolution overthrew engagement. But it changed within, and with this, one more small social restraint on individual (or dyadic) volition began to vanish.

The exemplary demographic investigation by Preston and McDonald has shown us that, followed for long enough, marriages contracted in the Depression were relatively prone to divorce, as though subject to particular strain at the outset.[90] Their analysis contradicts the common perception—based on current numbers of divorces—that the Depression, if anything, had a moderating influence on the divorce trend, as families pulled together and as divorce came to be seen as simply too expensive. Two material aspects of the Depression had an impact on marital durability: reduced material means in the crucial years immediately following the marriage, and unemployment or uncertain employment. Marriages contracted young during the Depression appear to have been even more susceptible to eventual divorce than they previously had been, as compared to those contracted at older ages at the same time.[91] The largest effect of the Depression on marriage, however, may well have been on the cultural level, affecting in a widespread way people's senses of the importance of planning and self-reliance, of cooperation within intimacy. We may assume that these motifs were most prominent among the young, especially those whose imprudently conducted or terminated engagements broke loose from acceptable standards under Depression strains.

The Surprising End of the Baby Bust

If the Depression weakened engagement by calling into question too many engaged couples' capacity for right behavior, it in a sense strengthened the dyadic component of marriage. Thus, one common response of families when the Depression hit was to limit fertility, a development of which contemporaries were quite aware. A national survey of women in 1937 discovered that three in four said "no" to the proposition that "young married people should have a child in the first year of marriage." No fewer than 84 percent of women under thirty answered this way, as did about the same proportion of single women.[92]

A variety of means existed by which fertility might be reduced, including mechanical and chemical contraceptives that had begun to diffuse rapidly in the 1920s and seem to have done so even more rapidly in the 1930s.[93] Four in five American women (nine in ten of those under 30 years of age) told a representative survey in 1938 that they were "in favor of birth control."[94] Abortion, too, seems to have played a role.[95] For all this, though, the effect of the Depression on fertility was by no means uniformly and invariably to reduce it. In fact, in the middle of the decade, the direction of the birth rate turned around altogether, after a good century of steady decline. This surprising reversal—which began a two-decade rise that came to be recognized as the "baby boom"—coincided with the point at which young people who had postponed their marriages because of the Depression began to "catch up" with the family-building plans that we may presume they (or the culture in which they grew up) had had. The family had evidently retained—more than retained—its salience in hard times, and such a period now proved more conducive to the reassertion of older ways than had the assertive consumerism of the 1920s.

Some components of total fertility were more elevated as the Depression faded before the war economy than they had been at its beginning. Careful empirical investigations of the differential impact of the Depression on urban fertility rates indicate a common initial reaction of fertility reduction, across socioeconomic classes and holding for blacks as well as for whites. After about 1933, the more prosperous classes began to relax their caution somewhat, while those poorer (but not blacks, at any economic level!) continued to restrict their childbearing. In a New York study, the reduction in differential fertility among whites was so great that the 67 percent gap separating the fertility rates of white women living in the poorest and richest fifths of the city's areas in 1929 dropped to 19 percent in 1941. The author of a Chicago study speculated that, when combined with the generally earlier marriage schedule that emerged during World War II, the middle-class upsurge in fertility during the latter part of the Depression foreshadowed the less-class-differentiated family commitment so typical of the postwar baby boom.[96]

The differential fertility patterns, however, contrast sharply with those we have earlier seen with regard to marriage. Poorer people relatively quickly resumed their pace of marriage but returned only slowly to their earlier, relatively expansive fertility patterns. We may infer, accordingly, that the Depression systematically reordered the timing of childbirth and its relationship to the timing of marriage, and that it did so differentially by social class. Overall, just as the Depression made marriage timing more a decision that segregated the prudent from those whose circumstances did not encourage (or permit) prudence, so entry into parenthood came to constitute a second segregation point where once again the question of prudence was rather explicitly raised. At this point, those whose prudence in marrying had encouraged delay in marriage yet again postponed their transitions to parenthood. But as prosperity showed signs of returning, fertility patterns reversed themselves sharply, and the prudent moved briskly to complete their strongly held commitment to parenthood. The cumulative effects of postponing intimacy, marriage, and parenthood had brought the segments of the youthful population ordinarily the most prone to prudent waiting to the point where, by the end of the decade, they were eager to grasp at the hopeful indications of future prosperity and to press toward early family formation.

Table 14. Average Annual Change during Selected Periods of Total
                 Fertility Rate and Age-Specific First-Parity Fertility Rates, by
                 Race (in percentages)

                             Whites                          Nonwhites
               All       1st Children              All       1st Children
               Fert.   All   15–19  20–24  25–29   Fert.   All   15–19  20–24  25–29
1925–1930       -3     -1     -2     -3     -2      -4     -3     -3     -5     -6
1930–1933       -5     -3     -4     -6     -6      -3     -3     -3     -4     -7
1933–1937        0      3      1      4      3       0      2      4      0     -1
1937–1941        2      4      1      4      8       1      1      0      1      3

    SOURCE: Calculated from Robert L. Heuser, Fertility Tables for Birth Cohorts by Color (Rockville, Md.: National Center for Health Statistics, 1976), 23–25, 30–32, 362–364, 369–371.

A compressed account of selected birthrates is offered in table 14. First births—certainly among whites—had declined less sharply in the years preceding the Depression than did the fertility rate based on all birth orders. But in the first four years of the Depression, initial parenthood was if anything more inhibited than was the enlargement of families in being. The increase of first-birth rates after the Depression's initial impact is striking when compared with the trend for all births. So is its race- and age-specific pattern, the upturn affecting older childless white women from 1933, without a parallel tendency for black women. Postponing the transition to parenthood, then, was an early Depression strategy for coping with the uncertainty to which the disordered economy exposed young Americans, a way of permitting marriage while limiting sacrifice of material standards, a strategy that was consistent with already existing trends, which had by now begun to affect almost all segments of the American population similarly. The return to parenthood expressed by white women by about the middle of the Depression, however, reached black women only in a far more meager way.[97]

It was not primarily deprivation per se but an environment of widespread deprivation that encouraged prudent behavior. One study of fertility between 1929 and 1932 showed that the age-adjusted birthrate of those who had become "poor" after having had a "moderate" income before the Depression was 39 percent higher than that of those who had maintained their moderate incomes, while those once "comfortable" but poor in 1932 had a birthrate 26 percent above that of those who started comfortable and remained comfortable or at least "moderate."[98] In Indianapolis, the proportion of long-term family planners among couples marrying just before the Depression ranged from 38 percent among those with low income in the first years of their marriages to 58 percent among those whose incomes were high. But whatever aspects of personal organization promoted family planning, they were so strongly associated with the qualities conducing to husbands' material success in the work world that among those of low initial income who were to achieve medium or high incomes by the late Depression, family planning was just as prevalent as among those whose initial incomes were high.[99]

The Indianapolis research found that a sense of "economic security" characterized middle-class families to a markedly greater degree than those whose husbands enjoyed less-favored occupations and that these same people were less likely to feel "economic tension"—a sense of a gulf between desired material well-being and attained well-being. But these were also the very people who had married later and who were more likely to be planful generally and specifically with regard to fertility. The most planful, when asked to sum up whether they believed that the Depression had caused them to have fewer children than they might otherwise have wished or intended to have, were the least likely to answer "yes." For many but not all Indianapolis couples by the late 1930s, self-conscious family planning was an element of their ambient culture, which they comfortably appropriated. And it served them in good stead in the trying decade. Planfulness in general, family planning in particular, a sense of economic security, feelings of personal adequacy, and a reportedly happy marriage were all correlated with one another. They were also all correlated positively with income.[100]

Table 15. Proportions Still Childless in 1940, for Urban Native
                 White Women Still Living with Their First Husbands,
                 by Age at Marriage and Successive Marriage Cohort
                 (in percentages)

Approximate                 Age at Marriage               Older Minus
Date of Marriage    20–21    25–26    22–24    27–29      Younger
1921–26             13.1     24.0                          10.9
1923–29                               18.8     28.6         9.8
1926–31             15.7     29.7                          14.0
1928–34                               24.4     26.7        12.3
1931–36             24.9     38.4                          13.5
1933–39                               44.5     51.6         7.1
1936–40             57.5     61.4                           3.9

    SOURCE: Calculated from Census 1940–4, 37–40.

Retrospective questions about fertility and marital timing asked in the 1940 census enable us to sketch in more detail how the Depression altered the prudent considerations of family building. Table 15 compares two sets of urban native white women, each still living in 1940 with her first husband, for a number of marriage cohorts, asking what proportion of each set—those who had married on the young side and those who had married on the old side—had still not become mothers.[101] The women being compared, then, had been married equally long but had married at different ages.

The rising absolute proportion of the childless, of course, is to a large degree simply a product of the briefer time those married more recently have had to give birth by the census date. The differences by age among those married for the same amount of time, however, are significant. Invariably, those who married younger were quicker to have a first child and less prone to remain childless for the intermediate or long run. And there is a marked increase in this differential between early- and mid-Depression marriages. For late Depression marriages, however, this differential declined, rather more than could be explained by the truncation effect produced by the 1940 census date. Those, that is, who for reasons of prudence or normative conformity to socioeconomic-group values were late to marry were also more leisurely about moving into parenthood. But this was truest in the middle of the Depression—at which time, as we have seen, the marriage-timing considerations of the socioeconomic groups were diverging from one another rather sharply. The 1941 Indianapolis Fertility Study of native white Protestant couples who had married just before the Depression looked closely into attitudes of this sort and found that even within its restricted sample, class background made an enormous difference to planfulness, generally and with specific reference to fertility behavior. The Depression brought about an immediate slowing-down of childbearing among working-class families, which in some cases led to a repeated postponement that ended in lifetime childlessness. For couples marrying in 1927, the pace of entry into parenthood was fairly evenly distributed across classes. By 1929, it was working-class couples who, initially postponing their childbearing, continued to postpone their fertility into and beyond the third, fourth, and fifth year of marriage, with substantial proportions in fact remaining childless throughout their lives.[102]

The kind of prudence implied by relatively late marriage nevertheless had its most substantial impact among those of lower socioeconomic status who held off marriage, as table 16 shows. For them, considerably delayed marriage was unusual at their socioeconomic level, perhaps reflecting an especially self-conscious calculation of the kind of family life they considered fitting in light of their values. Such "prudent" behaviors were considerably more common among the middle classes, and on the whole, it is likely that the Depression did as much to reawaken these attitudes among them as it did to diffuse them to the working class. Marriage with white-collar workers, almost in and of itself, was associated with a slow move into parenthood in the mid-Depression. Equal prudence was not rare among the working class, but it was less uniformly distributed: only those who had also married at an older age than was ordinary in this class were so cautious about entering parenthood.

Table 16. Proportion Still Childless in 1940, for Urban Native
                 White Women Marrying in 1930–1937 and Still Living
                 with Their First Husbands, by Age at Marriage and
                 Husband's Occupation (in percentages)

                             Age at Marriage
                     16–21      21–26      26–31
Professional          39.2       47.5       47.6
Prop., Mgr.           38.0       44.5       49.6
Cler., Sales          37.7       48.2       56.5
Crafts                30.6       43.9       54.2
Operative             27.5       42.1       53.8
Service               35.3       45.2       54.7
Laborer               24.4       42.4       52.7
Unemployed            20.1       40.3       48.1

    SOURCE: Calculated from Census 1940–4, 37–38.

Planful attitudes toward family building were part of a complex of values and attitudes, most commonly found among the middle classes, according to which individuals felt more in control of their own destinies and, so feeling, successfully employed tools like effective contraception to that end. Even apart from their common relationship with socioeconomic status, there was also a separate, independent relationship between fertility planning and planfulness in affairs more generally, the very essence of a "modern" attitude toward the disposal of life's means to one's own end.[103] Fertility control made sense to planful people because children were too important to be left to chance and because children's economic well-being could be seen to only if there were not too many of them. Fertility planners were no less child-centered than those who considered children a gift of God, although planners expressed more specific pleasure in specific children and less in parenthood per se. The Indianapolis researchers found that fertility planners were not motivated by a sense that the Depression (or anything else) had made it impossible for them to carry out their chosen procreative roles.[104]

Looking toward the Future

The 1920s had intimated to the middle class, and to those increasing numbers aspiring to a middle-class style of life, the possibility of an affluent, predictable world. In this spirit, many of the 1920s youth generation had set themselves off from the parental generation and successfully demanded the right to develop a modified pattern of family formation, incorporating enlarged volitional elements. Tendencies toward a lessening of the asymmetries of the double standard, toward a more overt acceptance of pleasure—and especially sexual pleasure—as a motive for behavior, and toward a more jointly planful attitude to family building characterized the patterns that spread during the decade.

The sharp age-specific reversals of the Great Depression undercut the material basis on which this new arrangement of affairs had rested. Youth were now needed as workers or money-earners by their parental families, and their income, when they had some, was less available for dating and for moving from engagement to marriage. Schooling was extended, and with it the setting in which the newly evolved dating system had developed, but the comfortable transition to adult roles that 1920s youth had anticipated was disquietingly uncertain for their immediate successors. There is evidence that an attitudinal set emphasizing planfulness surrounded and perhaps was furthered among many young people by the caution encouraged by the Depression. Others, however, their means of livelihood rendered regularly uncertain, were moved altogether to abandon planfulness and providence and to hope to find in their newly formed families a semblance of the content and satisfaction they had grown up to anticipate would be theirs as young adults.

The two ends of the family-formation process as I have treated it here were affected by this age-specific impact of the Depression in simple enough ways. Dating, although some boys were priced out of entry into the pool and some girls were unable to dress up to the levels they felt dating required, was not seriously affected but continued to spread with the high school and into the working class. Cheaper dates were not all that difficult to accomplish, and one might just date less frequently.

On first childbirth, the Depression seems to have had a basically temporary demographic effect, with the attitudes we have called "prudent" leading to childbirth delayed within marriage, but most typically only for a few years. Couples, even the most prudent among them, rarely decided to wait out the entire unsettled economic period before having children; rather, they merely awaited enough of an upturn that they could execute their prior plans. In a 1936 poll, less than one-half percent of white Americans said that their ideal number of children was none at all.[105] The proportion who in fact were to have none rose sharply—a price, perhaps, of prudence—but for most groups, lifetime childlessness was not a common outcome.

Correspondingly, the strains on marriages formed during the Depression showed in their slightly elevated propensity to end in eventual divorce, perhaps no less because of the fading normative structure of engagement and attendant tensions in the couples as they approached the altar. Under challenge, the marital institution stood up very successfully indeed and would soon prove a most attractive beacon to young men and women whose lives were caught up by the demands of a nation at war. But the modest plasticity of the life course surrounding marriage pointed to circumstances less under the control of the contracting partners than the innovative cultural prescriptions which their immediate predecessors had invoked. The marital institution, however, emerged from the Depression in a healthy state. To it, the war that ended the Depression was to prove a surprising tonic, providing it both material wherewithal and a fresh cultural sanction.



5—
War and Its Aftermath

Prosperity and Optimism

Conventional wisdom holds that the frustrations built up during World War II gave rise to pent-up "familistic" motives whose eventual product was the baby boom. And there is some truth to the story. The war surely impinged in multifarious ways on the family-formation process. Yet, in the aggregate, it is hard to detect this. In fact, the dominant lasting effect of the war seems to have been that the economic forces it unleashed, and the personal optimism and sense of efficacy that it engendered, combined with prior preferences to set into being a postwar family-formation schedule that was at once more relaxed about what was seen to constitute adequate prudence, more flexible about both the sources and timing of economic wherewithal, and (perhaps consequently) more insistently early and modal in its timing. The impact of the war on various aspects of the life course was not uniform, sometimes surprisingly great, sometimes surprisingly slight, sometimes long-lasting, sometimes transitory. This chapter will show, for instance, that young people, briefly, reassessed the transition from school to work; that marriage rates fluctuated widely during the war and the years immediately preceding and following it, modifying the immediate context of marriage without modifying the rules of the marriage market; and that, seemingly, the new mechanisms and attitudes permitted over one-tenth of the American population to be called to arms and then returned to and reincorporated in the civilian population with life courses smoothly resumed.

Only when the nation entered World War II had economic optimism fully dissipated the cruel uncertainty that the seemingly endless Depression had engendered. Just months before America entered the war, a national survey found that six in ten of its citizens believed that "after the present war is over," people would receive lower wages than before the war, and a like proportion believed that there would be considerable unemployment. Only 11 percent thought that there would "be jobs for everybody," and fully seven in ten responded that "after the present war is over . . . people will have to work harder . . . [than] before it started."[1] But only a few months into the war, 46 percent believed that "the average young man will have more opportunity . . . to get ahead than a young man had after the last war," and only 17 percent thought they would have less opportunity. One year further into the war, the proportion expecting young men's postwar opportunities to exceed those at the end of the last war had risen again, to six in ten.[2]

Between 1935–36 and 1941, incomes had on average increased by a quarter, basically overcoming the effects of the Depression. The distribution of that income shifted somewhat to the advantage of Americans of lower income. The war years exaggerated these trends. For families as a whole, average income increased by 28 percent between 1941 and 1944, with a redistribution that sharply favored those of relatively low income. The mean income of the poorest one-fifth of all families grew by no less than 73 percent in these three years. The next-poorest fifth saw average increases of 52 percent.[3] Jerome S. Bruner concluded in 1944 that an important feature of the home front during the war was "the almost unconquerable faith of Americans in their own personal futures." Seventy-nine percent of employed men and women queried in March 1943 believed that their present job would continue after the war. Even 50 percent of war workers in eight large cities gave this response. Asked if they would have enough money to tide them over until they found another job, should they lose theirs in reconversion, two-thirds thought that they did. People evidently now believed that the economy was meaningfully changed, and for the better.[4] Perhaps, too, the war's uniquely widespread sense of economic well-being was amplified by the postponement of spending caused by war shortages and reflected in war-bond purchases that included perhaps 40 percent of all families and single consumers in the first three months of 1942 alone.[5]

Table 17. Approximate Age Structure of U.S. Armed Services during
                 World War II, Males Surviving to 1947

                                                         Disability Compensation
              Number of      Proportion    % of Birth
Born          Survivors      of All        Cohort
              1947           Survivors     Served           Number     % of Served
To 1902          333,000       2.4%          2.4             42,980       13.0
1903–12        2,035,000      14.8          22.8            343,412       16.9
1913–17        2,507,000      18.2          46.8            384,825       15.4
1918–22        4,344,000      31.6          78.4            606,343       14.0
1923–27        4,218,000      30.6          74.6            378,226        9.0
1928–29          328,000       2.4          17.9              2,881        0.9
Total         13,765,000     100.0%                       1,758,667

    SOURCES: Calculated from Census CPS P20-15, 15; U.S. Administrator of Veterans Affairs, Annual Report 1947 (Washington: USGPO, 1948), 160.
    NOTE: War fatalities might have had an age bias, which would lead to an undercount of the proportions of age groups who served. Civilian mortality was highest in the oldest cohorts, but civilian mortality before 1942 would not much distort these figures.

Economic well-being, of course, could coexist with a profound sense of being balked in one's most important personal projects, surely during a war that was, predictably but to an unpredictable extent, to draw heavily on the nation's youth. As the leaders and populace of the United States began to contemplate what belligerency was going to demand of just whom, it was not clear whether young men might continue their schooling, whether young women might be in effect drafted into the labor force, whether lovers might be separated by national manpower needs, whether conscientious parenthood might prove difficult in view of competing time commitments and the kind of widespread challenge to morality that wars commonly pose.[6]

The military effort was to be an enormous one, calling to arms some fifteen million American men, as shown in table 17. Of these, constituting more than one-tenth of the total American population at the beginning of the war, six in ten were provided by the birth cohorts 1918–1927, requiring military service of three in four of all living American men born in those years. Overall, more than one in eight suffered a disability of some sort—generally small—and received monetary compensation from the government.[7]

Mobilizing Adolescents

No segment of the population felt the war's impact more acutely than did youth, just as they had felt the Depression's. For years a problem because unemployed and almost without hope of regular employment, young people were suddenly in heavy demand. This was to have an enormous impact on their life courses. Young people in the Depression had often extended their schooling so as to fill time usefully, but in the early 1940s, the vast enlargement of production began to draw young people from schools. Many of those remaining in school pressed for additional vocational training—echoing an argument made by some educators even before the war—a demand that led in some cases to the reassignment of academic instructors to vocational courses. The national need came to be focused on the here-and-now, and schooling that led to use only through leisurely, indirect pathways caused distress.[8]

Shortly after Pearl Harbor, state legislatures were approached with proposals to relax child labor standards, but they generally resisted. Soon, however, legislators responded to the state of emergency. In 1943, "sixty-two acts affecting the employment of minors were passed in twenty-seven states. Of these, fifty-four included some backward steps. . . . Most of these statutes apply only for the duration of the war." Already in 1942 the pool of young people eager to enter the labor market was showing signs of wearing thin, so that by 1943, large numbers of young workers at 16 and 17 were taking full-time jobs, leaving part-time employment to those younger.[9] In industrial Franklin County, Ohio, increases in first-time work permits amounted to 52 percent in 1941, 184 percent in the following year, and 32 percent in 1943, when they peaked.[10] A Census Bureau sample survey in April 1944 indicated that over one in five schoolboys 14 to 15 were gainfully employed, and over two in five at 16 to 17. By this time, 35 percent had left school altogether and were working. Teenage girls were less prone to both full-time and part-time work, but fully a third had jobs by 16 to 18.[11] The twentieth-century trend toward an extended period of economic dependency, based on school extension and exclusion from the full-time labor force, had been reversed momentarily.

The expansion of manufacturing production explains a fair amount of youth's enlarged work opportunity. In 1940, only one in five employed youths 14 to 19 worked in manufacturing. But by 1944, more than one in four working boys and more than one in three working girls were in manufacturing. In Franklin County, Ohio, this was true even of first-time work permit applicants, previously more likely to find work only in ill-paid (if somehow age-appropriate) service jobs. Manufacturing employment had important implications for the way these children grew up, because the kinds of demands the coordination of manufacturing work made on children's time were different from those in other sectors of the economy, making academic training difficult to maintain.[12] Even when the United States Employment Service asked local manufacturers to employ part-timers if at all possible so as not to disrupt schooling, the manufacturers, straining to meet defense contract deadlines, gave the request only lip service.[13]

In the industrial and port city of Duluth, Minnesota, there were signs among boys of increased dropping out of school as early as 1942–43, at ages as young as 14 and 15. Perhaps as many as a quarter more boys than before the war dropped out at 15, a third more at 16. The change in the life course of girls was not so massive, but it too was substantial, especially by age 17. As with boys, changes were apparent from 1942–43. In 1943, the typical grade level of young workers receiving their first regular employment certificates began to drop nationally, as labor demand exceeded available sources of child workers at the higher grades.[14]

The hours were often demanding by any standard, sometimes illegally so, and even when legal, the combined hours of employment and school of many schoolchildren were excessive.[15] Young people, however, compared their options with those their older brothers and sisters had enjoyed and responded enthusiastically. During the war, families that had more workers were generally unusually prosperous—more so than had been the case before Pearl Harbor.[16] A survey of young people in three Michigan high schools in spring 1944 found that students reported their jobs to be both educational and enjoyable and said that the jobs did not interfere with school.[17] In fact, boys (but not girls) who had gainful employment received higher grades on average than those who had no jobs. School absences, contrary to prediction, did not increase.[18] Curricula were massively given over to the perceived needs of wartime morale and to preparing boys more speedily for military service, favoring especially vocational and civics training.[19] The usual determinants of educational attainment were somewhat scrambled, but in the long run the average level of attainment was not.[20] The war drew many young people's attention from the schools, but in no long-term way.

As the eventual military outcome became a certainty, educators began to worry that in conceding so much to war exigencies, they had sacrificed the long-standing trend toward increased schooling. The Director of Pupil Personnel and Counseling for the Philadelphia Board of Public Education expressed his anxieties eloquently in 1944, as he looked forward to the postwar. "We can't very well blame children for succumbing to the lure of easy money [during the war]. They have half a notion that it can't last, but it's quite another thing to expect youngsters undergoing all of the uncertainties of adolescence to suddenly and quietly settle down after having known such bonanza days."[21] Given the go-ahead for a "National Go-to-School Drive" as academic year 1944 began, educators rallied the community to their side.[22]

Hats off to American boys and girls! They have shown superb readiness and eagerness to share in the work of the war. . . . Millions of youngsters have taken full-time jobs. Others have added jobs on top of school work. Now the time has come when all of us must scrutinize far more carefully than we have in the first 3 years of the war the use that is being made of the capacities, energies, and time of our teen-age young people. . . . Some work experience may have significant educational value for some young people. For the vast majority of them, however, school provides the greatest opportunity for development, and adults should help them to give school PRIORITY NUMBER ONE now.

Pearlman and Eskin, however, writing at the end of the war, delineated acutely what the lasting impact of the wartime expansion of youthful employment would be. "The number of in-school workers . . . will vary with the level of economic activity. If a high level of employment is maintained, the number of students who take advantage of the opportunities for part time and summer work will probably exceed the number who were in the pre-war labor market."[23] One of the important schooling reforms of the past generation had been the reduction of the proportion of students who straggled far behind their fellows, accomplished by a combination of exhortation, pressure, and easier promotion. Among boys, the war undid some of this: the standard deviation of boys' grade levels at each age increased markedly during the war. In the years following the war, the trend toward increasing age standardization in school seems to have quickly reasserted itself. As of October 1945, however, the proportion of girls and, even more so, of boys in their upper teens who were enrolled in school was still below that of 1940, and the older the youngster, the more pronounced the shortfall.[24]

A common adult response to the reformulation of aspects of the adolescent life course is to decry "juvenile delinquency," and in this regard, the World War II period was no exception. The conventional wartime view asserted a substantial increase in juvenile delinquency and most particularly in the sexual delinquency of alarmingly young girls who were thought all too often to have "made the mistake of confusing sex with patriotism."[25] " 'A guy ought to have something to remember when he's facing submarines and death', he said huskily. 'Something more than a few hugs and kisses.' "[26] The scare reflected the accurate perception adults had that the war had placed young people in control of aspects of their own lives formerly overseen by parents.[27]

Some maintained that apparent wartime increases in juvenile delinquency were to be understood not in terms of moral change or of any particularly important shifts in the circumstances of young people but simply in terms of prosperity and of the temptations of forbidden pleasures and objects all of a sudden placed before young people.[28] A U.S. Office of Education pamphlet addressed to counselors considered it "obvious" that there was an "unfortunate effect" of young people's sudden prosperity: their "opportunity to have a good time; to enjoy elaborate food, clothing, automobiles." Counselors were urged to encourage suddenly prosperous youth to buy war bonds or to make other savings, lest they develop tastes the future would not be able to meet.[29]

The formidably upright Children's Bureau itself pooh-poohed those who thought they detected major moral trends. It did note "a sharp rise in the number of girls' cases [in juvenile courts]" but explained this by enhanced legal vigilance: the police were now raiding places where promiscuity was said to be practiced.[30] The juvenile court data that are available in comparable form through the early war years give some support to the less alarmed view and certainly argue against any unbridling of youthful sexuality.[31] Acute observers of youth emphasized the "channeling of emotions into one burning feeling of patriotism," which often had no legitimate immediate outlet. Because of conventional gender expectations, adolescent girls in particular suffered from this problem. They were permitted, for instance, to join the women's military branches only at age 20, although in adolescence their physical and emotional maturity was farther advanced than boys', and boys could join up at 18. Boys, too, with more money in their pockets and more responsibility by far than they had ever faced, were agitated. "Reports from schools and other sources indicate clearly that restlessness, turbulence, and emotional instability are increasing among adolescents everywhere. There are evidences also of increasing hostility toward adult authority."[32] But in "Prairie City," the intensively studied adolescent was described by Havighurst and Taba as "down-to-earth and unimaginative," the peer group culture oriented to social participation, group loyalty, and individual achievement and responsibility.[33]

Accommodating War

At minimum, parents were beginning to feel differently about their children. Adults' war anxieties helped crystallize the notion of the age-stratified society, a formulation that would come to full fruition after the war in the functionalist concepts of—and partial concession to—a distinctive youth culture, notions reflected in youth policy both during the war and afterward.[34] Although the government supported the war effort by exhorting married women to enter the wartime labor force, at the same time it supported the conventional role structure of the family (and, by intent, soldier morale) by providing dependency allotments for wives of servicemen. D'Ann Campbell points out that although the number of "new" adult women workers recruited to the labor force during the war years was only 2.7 million, as compared with the 12 million men drawn into military service, "wives continued to switch into and out of paid employment, only going into the job market a little more often than before the war," so that "the number of women with work experience" increased considerably more than would be gleaned from examining the numbers at work at any given time.[35]

The best quantitative data on the relationship of women's wartime labor-force behavior to their family-formation patterns are contained in the 1944 Current Population Survey commissioned by the Women's Bureau. Retabulated slightly, table 18 indicates that whatever the pressures, and whatever the opportunities for attractive or rewarding gainful employment during the war, marriage and particularly parenthood still militated heavily against employment. Women who were single in 1944 and had already been employed in December 1941, as had most single women not in school, usually remained in the labor force. But among women single and employed in 1941 who had married by 1944, large numbers had left the work force during the war, even when they had worked for a period of time to get the marriage soundly on foot. Wives in 1944 whose husbands were at home were, however, fairly likely to be gainfully employed if they had been before the war—and this was only slightly truer for wives whose husbands were away in the military. The considerably greater differences in 1944 labor-force participation rates were the product of women's 1941 work patterns rather than of their stage of family formation. Only a small proportion of wives whose husbands were at home and who were not already in the labor force as of Pearl Harbor had been induced into gainful employment by 1944. This proportion was considerably higher when these wives' husbands were off fighting the war. But in a sense the most striking finding here is the fact that only about a third of wives with husbands away in the armed forces were in the labor force as of March 1944. (One in five of these had a child under 10, a proportion considerably lower than the corresponding proportion among working wives whose husbands were not under arms.)[36] The rest—surprisingly many of whom had already entered parenthood—were supported by their husbands' military allotments or in other ways.

Table 18. Proportion of Women in the Labor Force in 1944, by 1941 Labor Force Status and Age and Marital Status in 1944 (in percentages)

                                 Not in the Labor Force in 1941
                                  <20      20–44      45+
Single                           29.4       59.2     10.1
Married, husband present         16.2       11.9      6.7
Married, husband in service      38.1       39.2     26.0
Married, other                   23.9       32.7     13.9
Widowed or divorced              20.3       46.1      7.5

                                   In the Labor Force in 1941
                                  <20      20–44      45+
Single                           89.8       94.6     93.3
Married, husband present         26.0       65.2     81.9
Married, husband in service      56.0       71.8     87.6
Married, other                   29.6       87.5     88.6
Widowed or divorced              43.7       93.2     86.0

    SOURCE: Calculated from Mary Elizabeth Pidgeon, Changes in Women's Employment During the War (U.S. Women's Bureau, Special Bulletin no. 20 [Washington: USGPO, 1944]), tables 11 and 12. (The Pidgeon study was based on a special Current Population Survey.)

As of May 1945, armed forces pay and allotments amounted to about one-third of total family income among all families receiving any such income, and because pay was higher for noncommissioned and especially for commissioned officers, this proportion was roughly constant from relatively poor to relatively well-off families.[37] Military compensation averaged approximately $900 a year, almost identical to the average annual income of gainfully employed women at that time and about half the average annual income of all families headed by women, which often included military and dependency pay. The average income of these female-headed families, in turn, was about three-quarters that of families headed by men, a ratio that exceeded the comparable figure for 1939 and would not be equaled by the regularly declining ratios after the war ended.[38] To be sure, the proportion of single women who entered gainful employment during the war was not much higher, although these women did not have soldiers' allotments as a source of income. It was preeminently the working wives of soldiers who did not share in the widespread wish of wartime women workers to continue working after the war was over.[39]

War and Marriage

As it turned out, the approach of war, and even much of the war period itself, actually promoted marriage. As the wartime marriage boom peaked, a family sociologist remarked, quite correctly, that "the function of war marriages is wider than that of marriages consummated in normal times" and cautioned marriage counselors that the basis for "success" in war marriages had become no less various.[40] An index of this is offered by a tabulation for the period 1940 to 1946 of the monthly totals of marriage licenses (which reflect impulse more directly than actual marriages do) issued in thirty-four cities with populations of over 100,000.[41] These are shown in figure 13. As international rearmament brought increasing prosperity to the nation, the impulse to marry trended upward. Pearl Harbor, in December 1941, considerably changed matters, with a deluge of marriage licenses being taken out. If some of these represented snap decisions on marriage partners, most were probably decisions between persons long embarked on courtship and fearful of the war's interruption of their plans. After a settling-in period, the pace of marriage picked up again by 1943. By mid-1945, the war in Europe and then in the Pacific was won, and the pace of marriage quickened once more. The real outburst of marriages awaited November and December 1945. This pace continued into mid-1946, reaching a peak in June.

Figure 13. Marriage Licenses Issued Monthly, 1940–1946

So successful had young people been in marrying during the war that a higher proportion of women under age 20 was actually married in February 1944 than in 1940, fully three in four of them married to men away in the armed forces. For women 20 to 24 years of age, marriage probabilities also increased during the war: 58 percent were married in 1944, as compared to 51 percent in 1940. Almost one in three of their husbands were living away from home because of military service. Among wives aged 25 to 34, however, 13 percent of whose husbands were away serving their country, marriage patterns had fallen behind the 1940 pace.[42] Some fascinating estimates by Paul C. Glick of the special factors that contributed one way or another to the wartime household-formation rate provide an initial view of how the family fared in World War II. A condensation of these is presented in figure 14, where the source-specific contributions to the rate are shown as percentages of the approximate rate of household formation that would likely have occurred had there been no war. Aggregating Glick's estimates for the five years during which the nation was at war, we can say that overall a mere 400,000 or so fewer households came into existence than perhaps might have had the nation not been at war—a tiny number compared with the millions of new households that actually did form over the period.

Figure 14. Wartime Family Formation, 1941–1945

Fundamentally, the wartime household-formation rate during the rearmament year and the first year of American involvement was a composite of the early and huge increase in the marriage rate and the inability of many of these new couples to find separate housing for themselves, or their reluctance to do so in view of impending induction. By 1943 (the dates in the estimates run from July 1 of the preceding year through June 30 of the named year), actual draft calls were accounting for about as much delay in household formation as was the postponement of uncoupling by newly married couples, and each of these weighed about as heavily as the now-declining marriage rate. Adding to the overall decline in household formation by this time was a growing number of divorces, products of wartime stresses. More than offsetting this in a statistical sense, however, was a rapidly growing number of what Glick calls "wartime families": couples or fragments of couples who in ordinary times would still have been living with parents or otherwise nonindependently but who had found independent housing because of the recoupling of soldier families, or who had found so much prosperity on account of war employment that they uncoupled early, or who actually maintained two households, one of them in a city to which temporary labor opportunities called them.

In 1944, military requirements peaked and reached deeply into the ranks of the married, uncoupling many families, losses no longer even remotely offset by war marriages. By this time, the number of war dead who had been household heads came to be large enough to register in these estimates—but its impact was small. And in 1945, as the war wound down, family formation startlingly resembled the ordinary.

Many contemporary observers did not see wartime marriage as benign. Sociologist Constantine Panunzio remarked that

the very movement of a considerable number of young people from the country districts and small centers to the large cities, the stimulation of city life, their being suddenly thrown together with persons of the opposite sex in boarding houses, shops, and restaurants, their need for intimate companionship to compensate for ordinary 'homesickness' and . . . [the] sudden possession of ready money in fairly good quantity—all of these no doubt contributed to the great increase of marriage in the larger cities.[43]

A religiously oriented marriage counselor wrote in 1945 that the war had powerfully exacerbated the existing tendency in the American marriage system to elevate romance above other considerations in marriage. "Romantic marriage was society's attempt to recognize and protect the right to personal satisfaction and romantic happiness in marriage and its resulting parenthood." In wartime, he argued, personal satisfaction could hardly be had other than by lightning courtships and marriages.[44] Some contemporaries—including some marriage counselors seeking to enhance the felt need for their services—saw the rush as being led by women who feared that military casualties would spoil their chances of ever marrying.[45] The several phases of war-induced marriage patterns affected different age groups differently. That the impact of the war on marriage was extremely age specific is hardly surprising in view of the age specificity of military service. Prewar economic recovery most potently improved marriage chances at the more modal marriage ages, for both men and women. The beginnings of the draft, in contrast, produced a marriage rush that was quite focused in age among younger men (many of whom no doubt hoped to avoid military service through family deferment) but not so focused among women. War itself produced a dramatic surge of relatively young marriage—again more so for men than for women—followed by a dearth most apparent at the ages of army service. When the war began to stop moving young men about, they once again started to marry. Again, the most affected were especially those in the very cohort—otherwise modally situated for marriage—that had most often been denied timely marriage during the war. The women they married were characteristically of the ages deemed appropriate for such men, and this forced some younger men out of the marriage market—but not very many.
As the impact of the wartime marriage deficit let up—and this occurred rather quickly—relatively young men and women seemed prone to try to keep up the marriage boom, but this could not be sustained on such slim residues of marriageable people. Nevertheless, marriage timing was becoming more modal, and younger, as a result of the new patterns of marriage developed with the postwar stabilization of the wartime marriage market.

New York State data are deployed in table 19, showing for the years 1939–1946 the trends in the numbers of men and women marrying for the first time at ages that were relatively young (20 for men, 17 for women), roughly modal (24 for men, 21 for women), and on the old side (28 for men, 25 for women). Among males, a deformation of the age structure of marriage was adumbrated already by 1941, as younger marriage became especially common. By 1942, the tendency had become more pronounced, as many somewhat older men were already in uniform. The overall deficit of marriages in 1943 and, to a lesser extent, 1944, yet again shifted the age structure of marriage downward. By the last year of the war, older men, undoubtedly including large numbers of returned veterans, were marrying. For older people, the enormous postwar marriage boom began promptly. By 1946, all age groups were participating heavily, postponed marriages in part accounting for the excess of the modal-age marriage increase over the quite huge one for the young men and women. But just as apparently, the postwar marriage boom affected all ages.

Table 19. Annual Increase or Decrease of Numbers of First Marriages in New York State (apart from New York City), by Sex and Age, 1940–1946 (in percentages)

                            Males                     Females
                      20      24      28        17      21      25
1939 to 1940        39.7    21.0    24.8      25.9    17.7    31.1
1940 to 1941        56.5    12.0     2.1      18.4    24.7     4.9
1941 to 1942        26.8   -17.5   -13.9       8.9    -7.4   -16.0
1942 to 1943       -40.7   -29.2   -26.4     -31.0   -61.8   -61.9
1943 to 1944        -6.3   -12.1   -19.3     -11.0    63.5    53.2
1944 to 1945         7.6    22.0    41.5     -52.4    15.1    75.5
1945 to 1946        56.3    87.9    77.0      43.1    71.0    74.9

    SOURCE: Calculated from New York State, Department of Health, Division of Vital Statistics, Annual Report, annual.

Whites and blacks differed somewhat in their nuptial responses to the rapidly changing circumstances of World War II. Blacks apparently were a bit slower in intensifying their pace of marriage as the Depression faded—and the Depression really did not fade so rapidly for blacks.[46] By 1941–42, at any rate, blacks were moving even more smartly into marriage than were whites. Thereafter, blacks were somewhat more reluctant than whites to reduce their nuptiality in the mid-war period and, correspondingly, a little less explosive in their late-war and postwar marriage booms. Having been subject to severe constraints on their family-formation patterns during the especially long economic depression they suffered, blacks seemed even readier than whites to transform their newfound prosperity into marriage during the war.

Given the immensity of the personal upheavals promoted by the war and the tremendous annual variations in the raw numbers of persons marrying, the marriage market continued to function astonishingly smoothly, the widespread economic wherewithal proving able to conquer all in the presence of love.[47] The highly detailed New York data show that the age distribution of those in the marriage market varied greatly from year to year, as one would assume in view of the changing, age-specific nature of the draft call. One might anticipate that because the numbers of men who had access to the marriage market varied in an age-specific fashion, as those of women did not, the women theoretically available for men of different ages to marry would change over the war years.[48] Courtship patterns are structured by ascriptive characteristics, of which, in the American system, age is among the most important, as is prior marital status. If there were a restructuring of the timing of marriage, a shuffling of age-related courtship patterns would be a likely concomitant. If younger women were most highly prized, for instance, one might anticipate that in a year like 1943, when relatively few men were present for marriage and the number of marriages was reduced accordingly, those men who did marry would marry relatively younger women. But this did not happen. Massachusetts registration data on age at first marriage of both partners in new unions indicate that the marriage booms, both during and immediately after the war, were achieved without much altering the relationship of the ages of bride and groom.[49] That is, the data suggest that young men and women sought—and found—mates whose ages bore roughly the same relationship to their own as in the late Depression. First-time grooms were over time just a little more prone to marry young in the wartime marriage market than they had been before the war, but somewhat surprisingly, the same shift was apparent among first-time brides.[50]

There was, however, a tendency during the war for first-time grooms to marry previously married partners more readily than before. But on closer inspection, this too represents not a shift in market patterns but simply the fact that there were increasing proportions of once-married people in the marriage market during these years, as there would be subsequently. Indeed, separate examination of the previously married shows that when one takes their number and proportion among all marrying people into consideration, they became just slightly less likely than before the war to join in marriages with persons who had never married. The marriage market held remarkably firm: cultural preferences were maintained through the war, despite circumstances that shifted the timing of many marriages. Youthful marriage was youthful for both partners. Even the marriage of veterans would be accomplished without any particular upsetting of the age structure of marriage, despite the dramatic "time out" that had seemingly been introduced into their search for a mate. Youthful girls were not snapped up by the returning heroes. Rather, the veterans apparently picked up where they had left off, and with much the same age group.

The marriage market itself held up in a surprisingly orderly fashion through the war, in the face of the remarkable bursts of nuptial energy documented earlier in this chapter. In figure 15, we shift our angle to a longitudinal one, examining the single-year marriage probabilities at given ages of three birth cohorts of men, those born about 1916, 1920, and 1924, tracking each cohort through the war years. The retrospective data used to construct the graph indicate that Americans only advanced or suspended marriage schedules during the war; they did not abandon them. And much of the change that took place was incorporated into a longer-term tendency to earlier marriage. Easily the largest part of the proportion of eligibles marrying at any given age was, as always, a reflection of the ordinary age curve of marriage. The 1924 birth cohort provides a nice example. The 4 percent of the single men of this cohort who married at 18 in the first full year of the war was well above the 2.7 percent of like eligibles who, four years before, had married at 18. But for all that, only 4 percent married, less than the proportion of single men who had married at 19 three years earlier, at the end of the Depression, and less also than the proportion of single 19-year-old men who had married in the very midst of the Depression.

Figure 15. Percent of Single Men Marrying, for Three Birth Cohorts

The dominant pattern among the three birth cohorts of men was in fact not that induced immediately by the war but rather the secular trend toward increasingly earlier marriage. For all the distortions of the war, subsequent cohorts did not abandon under its strain the "gains" in more rapid wedlock that previous cohorts had made. And shortfalls were made up quickly by those cohorts whose marriages had been temporarily postponed. In fact, among men, the retrospective data indicate that the most dramatic single-year shifts occurred in the huge marriage boom after the war. And yet these shifts were general enough that they, too, were considerably outweighed in their effect on the age structure of marriage by the secular trend downward in marriage age.

When a mid-1946 survey asked a representative sample of Americans what they thought was the ideal age for men and women to marry, it found that the then-current youthful patterns were closely embraced by the majority.[51] In fact, ideal ages showed a high degree of consensus—a considerably higher degree than did actual behavior at the time. The norms, moreover, were more uniformly held than they would be even at the height of the baby boom. As a result, the wartime-marriage cohort—despite the distortions in detail produced by the war—was more likely than either the marriage cohort of the Depression or preceding cohorts to endorse its own marriage timing as roughly the ideal, and this was so for both men and women. In the war-marriage cohort, neither veterans of World War II nor working wives nor husbands of working wives—those respondents whose own marriage timing had most likely been affected in the complex developments of the preceding half decade—differed significantly from others in their ideals of marriage timing.

Burgess, concerned about the predictably fragile quality of wartime marriages, set up an extensive interview protocol for his students and, in about 1945, directed them each to interview a handful of married women who had been separated from their husbands by the latter's wartime duty. The questions probed the circumstances of the marriage, the strains of the separation and both partners' ways of handling them, sexual jealousies and concerns, letter writing, and plans, hopes, and anxieties about the reunion that would soon take place. The sample was inevitably weighted toward the kinds of women who would be acquaintances of University of Chicago students, but many of the novice interviewers were at pains to contact a variety of respondents. Selected quotations provide a sense of the riskiness of life course formation in wartime, combined with a shared belief in the inevitability of the sequencing that made sense of these events for the women.



Rare were interviewees who worried that their marriages had been contracted in haste, or too young.