CHAPTER 11 Physical and Cognitive Development in Young Adulthood

Garrett, a 19-year-old, is a 2nd-year college student. He enjoys many privileges that could be construed as “adult.” He drives a car, votes in elections, and owns a credit card. He shares an apartment near his college campus with two other undergraduates. He drinks alcohol with his friends at parties (albeit illegally) and has regular sexual relations with his girlfriend of 10 months. The couple split up once and subsequently reunited. During the separation, Garrett dated another young woman. At this point, Garrett and his girlfriend have no plans to marry. He wanted to get a job immediately after high school, but job opportunities in the trade he aspired to went to more experienced workers. He decided to give higher education a try. Garrett has little idea of what his ultimate career path will be. Garrett’s father, who is divorced from Garrett’s mother, pays for his tuition and housing costs. He has taken out loans to help finance his son’s education. Garrett’s mother provides him with an allowance for food. He works part-time for low wages at a clothing store in a local mall, which helps him pay for clothes and entertainment. Garrett is very responsible at work, but he is an unenthusiastic college student. His study habits lean toward procrastinating and then trying to make up for lost time by staying up all night to finish assignments by their deadlines. When Garrett broke his wrist playing sports, his mother had to take a few days off from her job to accompany him to doctor’s visits. Her health insurance covered most of the bills. Yet, despite their financial support, Garrett’s parents have no legal right to see Garrett’s college grades. Is Garrett an adult? Scholars are likely to disagree about the answer to this question. Most would agree that the onset of adolescence is marked by the changes of puberty. But there are no easily observed physical changes that signal entry into adulthood. Instead, adulthood is a social construction. One or more culturally determined criteria usually must be met before one’s maturity is established (Hogan & Astone, 1986), and the criteria vary depending on the observer and the culture. In the past, sociologists have emphasized the achievement and timing of marker events as criteria for adulthood. These have included completing formal education, entering the adult workforce, leaving the family home, getting married, and becoming a parent. Around the middle of the last century, a large proportion of the American population achieved these marker events between the ages of 18 and 24 (Rindfuss, 1991). However, if we evaluate our hypothetical student, Garrett, according to these traditional marker events, he would not be an adult despite being in the right age range. From a sociological perspective, it seems to take longer to grow up today than it did at earlier points in history for many reasons. Some of these include the demand for a highly educated workforce and the increased cost of this education (Jacobs & Stoner-Eby, 1998), the difficulties inherent in earning enough to support children and in achieving stable employment (Halperin, 1998b), and the frequency of early, nonmarital sexual activity and the availability of contraception (Warner et al., 1998). All have had profound effects on the timing of life events. On one hand then, some markers of adulthood are considerably delayed. For example, the median age for marriage in 1976 was about 22 or 23. 
By 2006, the median age for marriage had risen to about 27, a difference of more than 4 years in only 30 years (Arnett, 2010). On the other hand, other indicators of adulthood, such as the onset of sexual activity, occur much earlier than they did in the past. Such shifts in the timing of marker events appear to have delayed the onset of adulthood, especially in Western societies, where these shifts have most often occurred. However, even in more traditional, nonindustrialized cultures, the transition to adulthood can be a slow process. For example, after completing the puberty rites that induct boys into the adult ranks of some societies, young males in about 25% of cultures pass through a period of youth (Schlegel & Barry, 1991). In these societies, males are seen as needing a period of time to prepare for marriage. Serving as warriors during the transition period, for example, allows boys an opportunity to develop skills and to accumulate the material goods needed to afford a family. Girls enjoy a similar period of youth in 20% of cultures. Thus, even in many non-Western, traditional societies, the movement to full adult status takes time. In cultures such as that of the United States, the pathways to adulthood are remarkable in their variability, so that specifying when adulthood has been achieved is difficult. Arnett (2000, 2004, 2007; Arnett & Taber, 1994) has explored a new way of conceptualizing adulthood. In addition to examining the timing of marker events, he considered individuals’ own conceptions about what makes them adults. The reasoning goes like this: If one’s own judgment of adult status were based on criteria other than marker events, we might find more consistency in criteria across cultures or even within the same culture over a certain period. Arnett (2000) asked young people in the United States to rate the importance of criteria in several areas (such as cognitive, behavioral, emotional, biological, and legal criteria, role transitions, and responsibilities) as definitions of adult status. He also asked participants whether they felt they had reached adulthood. A majority of respondents in their late teens and early twenties answered both “yes and no.” Perhaps Garrett would say something similar. As you can see in Figure 11.1, the proportion of young people in Arnett’s study who judged themselves to be adults gradually increased with age, with a clear majority of participants in their late 20s and early 30s doing so. FIGURE 11.1 “Do you feel that you have reached adulthood?” SOURCE: Arnett, J. (2000). Emerging adulthood: A theory of development from the late teens through the twenties. American Psychologist, 55, 472. Reprinted by permission of the American Psychological Association. Can this delay in identifying oneself as an adult be attributed to the timing of role transitions? Apparently not. Chronological age and role transitions such as marriage and parenthood, on their own, were not considered significant markers. Arnett’s respondents indicated that the two most important qualifications for adulthood are first, accepting responsibility for the consequences of one’s actions and second, making independent decisions. Becoming financially independent, a traditional marker event, was third. Consequently, the subjective sense of being an adult may be more important than accomplishment and timing of discrete tasks. 
There is consensus among many researchers that a broad psychological shift toward increasing independence and autonomy characterizes the subjective experience of what it means to be an adult, at least for members of the American majority culture (Greene & Wheatley, 1992; Scheer, Unger, & Brown, 1994). Moreover, achieving a sense of autonomy is among the identity-related tasks described in Chapter 9. Because the shift to feeling autonomous is a lengthy process for many, Arnett (2000, 2004, 2007) advocates that the time period roughly between ages 18 and 25 be considered a distinct stage of life called emerging adulthood. In past decades, adult status was conferred through the achievement of marker events, such as marriage and parenthood. We should note that emerging adulthood is made possible by the kind of economic development that characterizes industrialized, Western cultures. If the labor of young people is not urgently needed for the economic well-being of their families, and if many occupations require extended years of education, then work is postponed, marriage and childbearing are likely to be delayed, and self-exploration can continue. Arnett (e.g., 2002) has argued that increasing globalization is spreading the experience of this new stage from Western societies to other parts of the world. Globalization is “a process by which cultures influence one another and become more alike through trade, immigration, and the exchange of information and ideas” (p. 774), a process that has shifted into high gear with advances in telecommunications and transportation. In developing countries, young people increasingly move from rural communities to urban centers as they pursue expanding economic opportunities. More occupational and lifestyle options are available, and self-development tends to continue well past the teen years. “For young people in developing countries, emerging adulthood exists only for the wealthier segment of society . . . however, as globalization proceeds . . . the emerging adulthood now normative in the middle class is likely to increase as the middle class expands” (p. 781). Whereas Arnett (2000) found that American respondents to his questionnaire emphasized individualistic criteria in defining adulthood, we should note that they also acknowledged the importance of emotional support. As you will see in the next chapter, strong relationships with others are important in and of themselves and can provide a bulwark for the development of personal autonomy. Encouragement and tolerance for independent action and belief, however, may continue to be greater for young people in the American majority culture than for those in non-Western societies, such as China (e.g., Nelson & Chen, 2007), or in some diverse communities in the United States. Even so, several studies suggest that young people in minority samples can be characterized as defining adulthood much the way majority White samples do, and as experiencing the self-exploration or identity development that seems to be the hallmark of emerging adulthood. For example, Black males define manhood individualistically, characterized by personal responsibility and self-determination, much as Whites do (Hunter & Davis, 1992). Yip, Seaton, and Sellers (2006) investigated African American racial identity across the life span, and found that although more young adults had reached identity achievement status than adolescents, identity issues were still front and center for more than half of the college age group. Yip et al. 
argue that college experiences can “intensify the process of developing a racial and/or ethnic identity” (p. 1515). Deaux and Ethier (1998) make a similar argument, suggesting that college itself tends to serve as a catalyst for ethnic identity development, partly because of opportunities such as access to groups organized around race and college courses on racial or ethnic history. These kinds of experiences may make race and ethnicity more salient. Interestingly, Yip and her colleagues found that the only age group in which an identity status (diffusion) was linked to depression was the college age group, indicating that identity constructions in this age range have critical psychological consequences. Whether or not emerging adults attend college, their social worlds are likely to expand beyond immediate family, friends, and neighbors (Arnett & Brody, 2008, p. 292). For African Americans, this often means moving into a much more ethnically diverse world than the schools and neighborhoods of their childhood. In college, fewer instructors or students are likely to be Black; in work environments, few employers and co-workers will be Black. Minority stress, the experience of prejudice and discrimination due to membership in a stigmatized group (Meyer, 2003) is very likely to increase. Arnett and Brody (2008) argue that dealing with identity issues with these added sources of stress may intensify the process. “We believe that identity issues are especially acute for African American emerging adults due to the injection of discrimination and prejudice, and that this may explain a range of puzzling findings” (p. 292). One is the racial crossover effect. African American adolescents engage in less substance use than White adolescents. But the reverse is true in adulthood, when African Americans use substances more than Whites. Another puzzling finding concerns male suicide rates. Males are much more prone to suicide than females. White males show a steady rise in suicide rates through much of adulthood, with the sharpest rise after age 65. But for Black males the peak suicide rate occurs much earlier, between 25 and 34. Arnett and Brody speculate that “there are uniquely formidable challenges to forming a Black male identity” in the United States and that for some the strain may become intolerable during early adulthood (p. 293). Gary grew up in a Vietnamese American family, who had certain expectations for his future. He describes clearly the stress that arises when need for autonomy and desire to meet family obligations come into conflict in young adulthood. Research on other ethnic minorities in the United States supports the idea that moving into a more ethnically diverse world after adolescence both extends the process of identity formation and increases its complexity. Fuligni (2007) looked at the children of Asian and Latin immigrant families and found, as you would expect, culture-specific concerns among young adults. The children of the immigrant families had a stronger sense of “family obligation” than European American offspring. They expected to support and assist their families in many ways (e.g., caring for siblings, providing financial support, living near or with the family) and believed that they should consider the family’s wishes when making important decisions. Interestingly, the researchers found that these kinds of values, while strongest in the immigrant children, also increased in young adulthood for all ethnic groups, including the European Americans. 
Apparently, the importance of connection, as well as autonomy, becomes clearer after adolescence. Immigrant children were most likely, as young adults, to experience a much more expanded social world if they went to college than if they did not, and this happened most often for those of East Asian background. With these young people, Fuligni found a trend toward the kind of extended identity development that characterizes emerging adulthood. Their sense of obligation to family did not disappear, but it competed with new aspirations, "to be able to be doing something that I like" and "to be the person you're supposed to be" (p. 99).

Is a stage of emerging adulthood a good or a bad thing? Could it be viewed as a tendency for modern youth to simply avoid taking on adult roles and responsibilities, nurtured by overprotective, indulgent parents? Arnett (e.g., 2007) argues that, in fact, there could be a grain of truth here; many young adults seem to "find it burdensome and onerous to pay their own bills and do all the other things their parents (have) always done for them" (p. 71). But he generally sees the ambivalence of young adults as a recognition of the value of an extended period of self-development to help them prepare more adequately for taking on adult roles in a complex society. He points out that few emerging adults fail to "grow up." By age 30 nearly all are stably employed, and three quarters are married and have a child. There is also evidence that adolescents moving into young adulthood today are not substantially different from their counterparts in the 1970s. For example, today's high school seniors report levels of loneliness, antisocial behavior, self-esteem, and life satisfaction similar to those reported by seniors decades ago (Trzesniewski & Donnellan, 2010). But some theorists are not as sanguine about this new stage. Hendry and Kloep (2007), for example, express concern that young people are more often inadequately prepared for adulthood than they are benefiting from an extended transition into adulthood. Modern parents, they suggest, tend both to overindulge their children and to pressure them to excel, rather than assuring that their children are adequately educated in basic life skills. These competing views may each hold a grain of truth; only continuing research on this interesting new stage can resolve the issue.

In this and the next chapter we will examine some of the key characteristics of life after adolescence, primarily for young people in the United States. We will refer to the period from about 18 to 30 as young adulthood, although we acknowledge Arnett's (2000, 2010) argument that many 18- to 25-year-olds are better described as "emerging adults." The early years of young adulthood are often an extended period of transition involving exploration of potential adult identities. In this chapter, we will begin by examining the physical characteristics of young adults and then move on to consider the cognitive changes that are likely in this period of life. In Chapter 12, we will explore the complexities of forming intimate, enduring adult attachments, maintaining or revamping family relationships, and making vocational commitments. That is, we will look at some of the myriad processes involved in taking one's place as a contributing member of an adult community.

PHYSICAL DEVELOPMENT IN YOUNG ADULTHOOD

Reaching Peak Physical Status

By age 18 to 20, most people have reached their full physical growth. Sometime between 18 and 30, all our biological systems reach peak potential.
For example, we can see, hear, taste, and smell as well as we ever will; our skin is as firm and resilient as it can be; the potential strength of muscle and bone is as great as we will ever experience; and our immune systems provide us with the most effective protection we will ever have from diseases ranging from the common cold to cancer. Not all physical capacities reach their peak simultaneously. Visual acuity, for example, reaches a maximum level at about age 20, with little decline for most people until about age 40. But auditory acuity appears to peak before age 20 and may show some declines soon after (Saxon & Etten, 1987; Whitbourne, 1996). There are certainly individual differences among us in the achievement of peak physical status—for example, some people reach their full height by age 15, whereas others may not finish growing until age 18 or 20. There are also substantial differences among different physical skills in the timing of peak performance, which is usually assessed by looking at the records of “super-athletes” (e.g., Tanaka & Seals, 2003; Schulz & Curnow, 1988). Schulz and Curnow examined athletic performance records for superathletes in a wide variety of sports. On one hand, they found that maximal performance for most sports is reached within the young adult period; on the other hand, they found that the average age of greatest skill (e.g., winning an Olympic gold medal in track or achieving a Number 1 world ranking in tennis) is different from one sport to another, and sometimes depends on which particular skill is examined within a given sport. For example, the average age at which Olympic swimmers win gold medals is 19; professional golfers typically do not achieve a Number 1 ranking until they have moved out of the young adult period, at age 34. For a professional baseball player, the average age for “most stolen bases” is 23, but the mean age for “peak batting average” is 28 and for “hitting the most doubles” is 32! The differences in age of peak performance suggest that the relative importance of practice, training, knowledge, experience, and biological capacity varies from one skill to another. Skills that are based on muscle strength, flexibility, and speed of movement and response tend to peak early. Abilities that are heavily dependent on control, arm–hand steadiness, precision, and stamina tend to peak later. Overall, the greater the importance of cognitive factors in performance, factors such as strategy knowledge and use, the later a skill will top out (Schulz & Curnow, 1988). An interesting finding from the research on superathletes is that physical development progresses at different rates for the two sexes. Men reach their peak of performance in many skills approximately 1 year later than women do. No simple explanation is available for the gender differences, but in some instances they appear to be based on earlier skeletal-muscular maturation in women. In other cases, they may depend on the fact that the smaller, more streamlined bodies of young adolescent females confer some speed advantages, as in long-distance swimming. Superathletes are those whose performance of a skill seems to match their full potential. Most of the rest of us are not concerned about achieving maximal skill, but we usually are motivated to maintain our physical capacities at high levels—including not only performance skills, but also sensory abilities, good health, and youthful appearance—during and beyond the early adult period. Clearly, biology plays a role here. 
For example, regardless of activity level, muscular strength begins to decline somewhat by about age 30. But research supports the importance of lifestyle in this process. There are good habits that help maintain peak or near-peak functioning and appearance, and there are bad habits that can erode functioning (Whitbourne, 1996). For example, regular exercise can help both younger and older adults maintain muscle and bone strength and keeps the cardiovascular and respiratory systems functioning well. Smoking, poor diet, and a sedentary lifestyle accelerate loss of peak cardiovascular and respiratory functioning and loss of muscle and bone. Smoking or any excessive drug or alcohol use can diminish functioning in a variety of physiological systems. For example, smoking contributes to more rapid wrinkling of the skin, and alcohol causes damage to the nervous system, the liver, and the urinary tract. “Eating right” is part of a healthy lifestyle. It means having regularly spaced meals (including breakfast) that are low in fat and that sample a range of food groups, allowing a proper balance of nutrients. Failure to eat right contributes to obesity, to depressed mood, and to many aspects of physical decline. Overweight and obesity are epidemic in the United States (65%) and Canada (59%), and weight gain is especially likely between the ages of 20 and 40 (Tjepkema, 2005; U.S. Department of Health and Human Services, 2005). Longitudinal studies that have followed participants for 40 to 50 years have made clear that people who fail to follow healthy lifestyles in their young adult years suffer from poorer health later and that they are less satisfied with their lives in late adulthood when compared to people who do adopt healthy habits in young adulthood (e.g., Belloc & Breslow, 1972; Mussen, Honzik, & Eichorn, 1982). None of this information is likely to be new to you. Many Americans, including young adults, are aware of the benefits of a healthy lifestyle and of the liabilities that bad living habits pose. Do they heed what they know? We learned in Chapter 10 that adolescents often act in reckless ways that compromise their health and wellness. Often young adults are not much better. Consider their alcohol use. As we have seen, nearly half of college students report that they drink heavily, often binge drinking (i.e., having at least five drinks in a row), and many indicate that their drinking has caused them problems, such as having unplanned or unprotected sex, getting hurt, or causing property damage (e.g., Fromme, Corbin, & Kruse, 2008; Hingson, 2010). Studies of adolescents and young adults indicate that the ongoing development of memory and learning abilities may be inhibited in binge drinkers (Ballie, 2001). In a longitudinal study of 33,000 people, problem drinking and drug use shifted during the young adult period (Bachman, 1997; Johnston, O’Malley, Bachman, & Schulenberg, 2009). These problems began to decline as participants reached their mid-20s and as their reasons for drinking began to change. “To have a good time with my friends” was the most common reason given by younger participants, but it gradually declined after age 20 and was surpassed by the desire “to relax or relieve tension” as participants approached age 30 (see Figure 11.2; Patrick & Schulenberg, 2011). 
This study also indicated that being in college is a contributing factor to substance abuse in the United States, because college students drink more alcohol and smoke more marijuana than same-age peers who have never attended college. Although the reasons for substance use in emerging adulthood do not seem to vary across demographic groups in the United States, the extent of substance use does. Alcohol consumption is greater for males than females, and it is greater for Caucasians than for other ethnic or racial groups, although Hispanics run a close second to Caucasians (e.g., Johnston et al., 2009; LaBrie, Atkins, Neighbors, Mirza, & Larimer, 2012; Smith et al., 2006). Sexual minority individuals appear to be at particular risk. Another large longitudinal study of adolescents and young adults found that even as teens, lesbian, gay, and bisexual youth, especially girls, were at higher risk of alcohol abuse than heterosexuals (Dermody et al., 2013). And the risk disparity between sexual minorities and heterosexuals increased in young adulthood. It appears that the stresses of dealing with sexual identity issues, along with discrimination and victimization, may be important drivers of these differences in susceptibility to hazardous drinking (e.g., Hatzenbuehler, Corbin, & Fromme, 2011).

FIGURE 11.2 Frequency of endorsement of five reasons for drinking in the young adult years.
SOURCE: Patrick, M. E., & Schulenberg, J. E. (2010). How trajectories of reasons for alcohol use relate to trajectories of binge drinking: National panel data spanning late adolescence to early adulthood. Developmental Psychology, 47, 314. Reprinted by permission of the American Psychological Association.

In addition to substance use problems, attending college appears to have negative health effects in general. In another survey, over 20,000 college students completed a questionnaire in the fall of their 1st year of college and again 1 year later (Keup & Stolzenberg, 2004). Over the course of that year, they reported substantial declines in their emotional well-being, in their physical health, and in their health habits (e.g., reduced levels of exercise). Fromme et al. (2008) also found substantial increases in alcohol and marijuana use, as well as sex with multiple partners, between high school and the end of freshman year of college in a sample of 2,000 young people. There was some variability depending on whether students lived at home (less increase) or not, and whether they came from rural, urban (less increase), or suburban high schools, but in all cases there was an increase.

The unhealthy, underregulated lifestyles of many young adults are probably an outgrowth of multiple factors: poor application of problem-solving skills to practical problems (see the next section on cognitive development); perhaps a continuing sense of invulnerability that began in the adolescent years, which may be exacerbated by the fact that young adults can bounce back from physical stress far more readily than they will in later years; and the stresses of leaving home and facing the social and academic demands of college or the workplace, all steps that create new challenges to one's identity.

The Changing Brain

As we saw in Chapter 9, it is now clear that major brain developments continue in adolescence and into early adulthood, to the surprise of many researchers.
Measures of brain growth (utilizing magnetic resonance imaging, or MRI, procedures) indicate that a resurgent growth of synapses (synaptogenesis) occurs around puberty in some parts of the brain, followed by a long period of pruning, which continues into the adult years (e.g., Petanjek, Judas, Simic, & Rasin, 2011). Such changes may mean an expanded capacity for cognitive advancement. Hudspeth and Pribram (1992) measured brain activity in children and adults up to 21 years old (using electroencephalography, or EEG, technologies), and they were astonished to find accelerated maturing of electrical activity in the frontal cortex of the 17- to 21-year-olds. Pribram (1997) argues that such acceleration could mean that early adulthood is especially important for the advanced development of frontal lobe functions, such as the ability to organize and reorganize attention, to plan, and to exercise control over one's behavior and emotions. He suggests that the typical timing of a college education may be ideally suited to what he assumes is the heightened flexibility and plasticity of the frontal cortex in young adulthood. A great deal more research needs to be done to determine the degree to which accelerated brain growth is related to the acquisition of cognitive skills and to evaluate the direction of effects. To what extent are observed brain changes the cause, as opposed to the outcome, of learning and learning opportunities? Clearly, new technologies are making this an exciting time in the study of adult brain development.

COGNITIVE DEVELOPMENT IN YOUNG ADULTHOOD

Each year, millions of young people participate in a rite of passage that marks their entry into young adulthood: the transition to college. Although not all adolescents go on to higher education, statistics indicate the numbers continue to increase. A century ago, fewer than 5% of young adults attended college in the United States; today, more than 60% do (Arnett, 2000). So, at least for a sizable subset of American youth, the college experience represents a major influence on their cognitive and social development.

We can all picture the scenes: students piling out of cars driven by their anxious parents, descending on dorms at the beginning of the semester, bustling across campus as fall's first colors begin to tint the foliage, eagerly anticipating the educational challenges that await them in their classes, gathered late into the night, talking and laughing with their newfound community of peers, relishing the heady freedom of young adulthood. If these images remind you of far too many movies you have seen, it may be because they have been overly romanticized, due in equal parts to advertising and to nostalgia. Below the attractive exterior these images suggest lies a core set of developmental challenges that await individuals at this time of life. For the most part, these tasks involve continuing the hard work of carving out an identity, now all the more pressing because of one's status as an "adult." Moreover, most of the work takes place outside the protected environments of home and high school, even though continuing attachments to family remain important, as we shall see in the next chapter. In this section, we will examine the changes in cognitive functioning that appear to characterize many young adults.
Most of the research on young adult development has been done on college students and focuses on the kinds of change that one can expect to find among people with the opportunity to continue their education beyond high school, delaying many other adult responsibilities. As we noted earlier, more than half of young Americans fall into this category. We know very little about cognitive change in those individuals who move directly from adolescence into the world of work. In the next chapter, we examine some of the special issues that may apply to this segment of our young people. Unquestionably, early adulthood is a time of great learning. Whether in college or on the job, young people are faced with being the “novices,” the “unknowledgeable,” or the “inexperienced” when they enter the world of adults, and they spend a great deal of time building their knowledge base and becoming more expert in particular domains of knowledge, such as computer science or philosophy or mechanics. Not surprisingly, at the end of 4 years of college, students perform better on tests of general knowledge than they do as entering students, and the majority judge themselves to be “much stronger” than they were as 1st-year students in knowledge of a particular field (Astin, 1993). Comparable change measures are not available for young people who enter the workforce after high school, but it seems reasonable to assume that after 4 years on the job, some of which may be in job training programs, at least their knowledge of a particular field would have increased. On the whole, longitudinal research on intellectual change across the life span indicates that many skills (such as spatial orientation abilities and inductive reasoning skills) improve throughout young adulthood, with measures of knowledge acquisition or breadth, such as understanding of verbal meanings, showing the most improvement in this time frame (e.g., Schaie, 1994). As long as opportunities to learn exist, the acquisition of knowledge seems to proceed rapidly during early adulthood. In later chapters, we will look in more detail at the typical progress of specific intellectual abilities throughout adulthood. Logical Thinking: Is There Qualitative Change? Is growth of knowledge and skill the only kind of cognitive change that we can expect to find in early adulthood? Or, as in childhood, does the nature of one’s thinking and problem solving change as well? Piaget’s analysis of structural shifts in children’s logical thinking skills ends with the description of formal operational thought in the adolescent years (see Chapter 9). Formal operational thinking allows us to think logically about abstract contents. We can discover and understand the implications of relationships among pieces of information that may themselves be abstract relationships—such as proportions, for example. But many theorists have speculated that more advanced forms of rational thought are possible and emerge sometime in adulthood. Several schemes, many borrowing heavily from Piaget’s seminal work, attempt to describe the cognitive shifts that might occur in the adult years. Among these theories are those that propose a stage of adult cognitive thought that has variously been called postformal or fifth-stage thinking, implying an extension of Piaget’s sequence of stages (e.g., Arlin, 1984; Basseches, 1984; Commons & Richards, 1984; Sinnott, 1984, 1998). Some of these theories actually elaborate a number of substages in the movement from formal operations to postformal thought. 
Two of these—Perry’s (1970/1999) and Kitchener’s (Kitchener & King, 1981; Kitchener, King, Wood, & Davison, 1989)—we describe in detail in later sections to illustrate the possible processes of change in young adulthood. In all these theories, a fifth stage of logical thought is said to evolve as people begin to recognize that logical solutions to problems can come out differently depending on the perspective of the problem solver. In postformal thinking, the problem solver is said to coordinate contradictory formal operational approaches to the same problem. Each approach is logically consistent and leads to a necessary conclusion, and each can be valid. The postformal thinker can both understand the logic of each of the contradictory perspectives and integrate those perspectives into a larger whole. Although she will likely make a commitment to one of these for her purposes, she recognizes that more than one can be valid. Thus, postformal thought incorporates formal thinking and goes beyond it. The assumption is that one could not be a post-formal thinker without going through Piaget’s four stages of thought development. As you can see, postformal stage theorists have incorporated some features of the Piagetian framework, including stage sequences, into their theories. They assume that there are qualitatively different structural organizations of thought and its contents at each stage. They also have a constructivist focus. That is, they assume that what one knows and understands about the world is partly a function of the way one’s thought is, or can be structured. Yet gradual reorganizations of thought are possible as one confronts and accommodates her thinking to stimuli that cannot be fully assimilated into her current ways of thinking. Some theorists disagree with the concept of a fifth stage of cognitive development. They often argue that the formal operational system of thinking is powerful enough to address any kind of logical problem. They are more inclined to see qualitative differences in adult problem solving and logical thinking not so much as an indication of a stage beyond formal operations but as a sign that the kinds of problems adults must solve are different from those that children are usually trained and tested on. As a result, adults must adapt their existing problem-solving skills to the new kinds of problems they face in adulthood (e.g., Chandler & Boutilier, 1992; Labouvie-Vief, 1984; Schaie, 1977–1978). Part of what people may learn as they confront adult problems and responsibilities is the limits of their own problem-solving abilities. That is, they may grow in metacognitive understanding, recognizing that in some circumstances logical thinking will lead to a clear solution but that in other circumstances they must make decisions based in part on values, needs, and goals (e.g., Chandler, Boyes, & Ball, 1990; Kuhn, Garcia-Mila, Zohar, & Andersen, 1995; Moshman, 1998). In sum, theorists disagree about whether adult problem solving represents a fifth stage in the development of logical thinking or is a reflection of the fact that life presents adults with new problems. In the latter view, adults do not achieve a new rational system but learn to recognize the limits of their existing problem-solving systems and to evolve new strategies for applying them. Despite some disagreements, nearly all theorists agree that problem solving takes on a different look and feel in adulthood. 
In the following sections, we will summarize a few theoretical descriptions of adult logical thinking, and we will examine some of the research demonstrating that indeed, something changes as people face life’s grown-up challenges. Schaie’s View of Adults Adjusting to Environmental Pressures Schaie’s (1977–1978; Schaie & Willis, 2000) theory emphasizes the importance of new roles, needs, and responsibilities in determining adult intellectual functioning. Schaie does not argue for postformal thought but for shifts in cognitive functioning, or in the use of knowledge and skills, that are straightforward adaptations to the new demands that adults face at different times of life. According to Schaie, we can think of the child and adolescent years as a time when the individual is sheltered from much of life’s responsibilities. Schaie calls this the acquisition stage of cognitive development, when youngsters can learn a skill or a body of knowledge regardless of whether it has any practical goal or social implications. Practical problems and goal setting are monitored by parents and others who take on the responsibility for making decisions that will affect the child’s life course. The child has the luxury of learning for learning’s sake or problem solving just to sharpen her logical thinking skills. Many of the problems she confronts in this phase are those with preestablished answers. In young adulthood the protections of childhood rapidly recede and the individual is faced with taking responsibility for her own decisions. The problems she must solve—such as how to maintain good health, what career path to choose, whom to vote for, or whether to marry—usually do not have preestablished answers. Many theorists have described these kinds of problems as ill-defined or ill-structured. Not only do they have no preestablished answers, but the “right” answer may be different depending on circumstances and on the perspective of the problem solver. Further, when we solve such problems we often do not have access to all the information that might be helpful. Young adults are in the achieving stage of cognitive development, when an individual must apply her intellectual skills to the achievement of long-term goals, carefully attending to the consequences of the problem-solving process. Schaie assumes not that additional thinking skills are emerging beyond formal operational abilities but that previously acquired skills are being sharpened and honed on very different kinds of problems, such that the solution to one problem must be considered and adjusted relative to other life problems and goals. For example, an adult who is contemplating a divorce must contend with a number of issues: her future happiness, her economic status, and the well-being of her children, just to name a few. According to Schaie, each new stage of adult life brings new kinds of problems, with different skills more likely to play an important role in one stage than in another. In middle adulthood, the responsible stage, ill-defined problems are still the norm, but problem solving must take into account not only one’s own personal needs and goals but also those of others in one’s life who have become one’s responsibility: spouse, children, coworkers, members of the community. Schaie suggests that the greater impact of one’s problem solutions leads adults to become more flexible in their thinking and to expand their knowledge and expertise and use those qualities more widely than before. 
For people who take on supervisory functions at work and in the community, the extended impact of one’s problem solving is even greater than for others, and the responsible stage becomes the executive stage, requiring that one focus heavily on learning about complex relationships, multiple perspectives, commitment, and conflict resolution. Such individuals must sharpen skills in integrating and hierarchically organizing such relationships. People’s responsibilities usually narrow in early old age as their children grow up and retirement becomes an option. This is the reorganizational stage, when flexibility in problem solving is needed to create a satisfying, meaningful environment for the rest of life, but the focus tends to narrow again to a changed set of personal goals and needs. Practical concerns, such as planning and managing one’s finances without an income from work, require applying one’s knowledge in new ways. Many of the problems that emerging adults face are ill-defined and have no right answers. The ability to deal with complexity is a characteristic of postformal thought. As people move further into their elder years, called the reintegrative stage, they need less and less to acquire new domains of knowledge or to figure out new ways of applying what they know, and many are motivated to conserve physical and psychological energy. Schaie suggests that elderly people are often unwilling to waste time on tasks that are meaningless to them, and their cognitive efforts are aimed more and more at solving immediate, practical problems that seem critically important to their daily functioning. A legacy-leaving stage may also characterize people whose minds are sound but whose frailty signals that their lives are ending. Such people often work on establishing a written or oral account of their lives or of the history of their families to pass on to others. Consider Jean, who used her considerable organizational ability to construct a detailed genealogy to pass along to her only son. The activity gave her a sense of satisfaction, purpose, and meaning. Clearly, these goals require substantial use of long-term memory and narrative skill, more than problem-solving skills, but as Schaie points out, such accounts do require decision making, or the use of judgment, about what is important and what is not. This discussion of Schaie’s theory has involved describing cognitive functions beyond early adulthood. We will return to the later stages in Chapters 13 and 14. For now, Schaie’s depiction of cognitive functioning as heavily affected by the environmental pressures people face at different times of life should help set the stage for understanding other theories of young adult cognition. Most theories emphasize that advancements or changes in problem solving are embedded in the new experiences faced during adulthood. Amanda is still figuring out her adult roles and responsibilities. Which stage of Schaie’s theory of development would you say best characterizes her? Schaie’s description of environmental pressures clearly is focused on typical middle-class experiences in Western cultures. He would probably be the first to acknowledge that adults in other cultures, or in some North American cultural groups, might show different shifts in the polishing or use of cognitive skills through life, depending on the unique demands that their environments impose. 
Arnett and Taber (1994) point out, for example, that in Amish communities within the United States, the importance of mutual responsibility and interdependence is emphasized from childhood through all phases of adulthood. A sense of duty and of the need to sacrifice on behalf of others is central to everyone’s life within the culture. Thus, these obligations should be expected to affect cognitive functioning in important ways well before middle adulthood, which is when Schaie considers them to become influential. Keep in mind as you read the following accounts of postformal thought that they are almost entirely based on observations of members of the majority culture in Western societies and that it remains to be seen whether these conceptions adequately characterize adult cognitive development in other cultures. Postformal Thought Many theorists argue that the realities of adult experience actually lead to new forms of thought (e.g., Arlin, 1984; Basseches, 1984; Commons & Richards, 1984; Sinnott, 1984, 1998). The full flower of postformal thinking may not be realized until middle adulthood or even later, but the experiences of young adulthood contribute to the reconstruction of logical thinking. As we saw with formal operations, not all individuals will necessarily reach postformal operations. If they do, we can expect them to “skip in and out” of this type of thinking (Sinnott, 1998). Sinnott (1984, 1998) captures many of the features of postformal thinking described by others. For Sinnott, the essence of postformal thought is that it is relativistic: “[S]everal truth systems exist describing the reality of the same event, and they appear to be logically equivalent” (1998, p. 25). The knower recognizes both the consistencies and the contradictions among the multiple systems of truth, or systems of formal operations, and depending on her goals and concerns, in many situations she will make a subjective commitment to one; in other situations, she may seek a compromise solution that integrates some of each perspective, but will not lose sight of the inherent contradictions. For example, advanced study of a science often reveals that more than one theoretical system can account for much of the data, although perhaps not all of it. Let’s use Sinnott’s example from mathematics to begin our demonstration: The knower may be aware that both Euclidean and non-Euclidean geometries exist and that each has contradictory things to say about parallel lines. In Euclidean geometry parallel lines never come together; in non-Euclidean geometry, parallel lines eventually converge. These are two logically contradictory truth systems that are logically consistent within themselves and logically equivalent to one another. A mathematician bent on knowing reality must decide at a given point which system he or she intends to use, and must make a commitment to that system, working within it, knowing all along that the other system is equally valid, though perhaps not equally valid in this particular context. (Sinnott, 1998, p. 25) In the behavioral sciences and the helping professions, we are quite familiar with the phenomenon of competing truth systems. For example, a counselor may be aware of multiple theories to account for snake phobias. One might be biologically based, another based on assumptions about the symbolic meaning of snakes in a person’s life, and another a behavioral theory arguing that irrational fears are classically conditioned. 
Suppose she understands the logic of each theory and knows that each is supported by a set of evidence. Yet, in a therapeutic situation, she must make a commitment to one of these systems of “truth” for the purposes of developing a therapeutic plan that will achieve relief for her client as quickly as possible. As Sinnott argues, for her purposes that system then becomes her “true description of the world,” but if she remains aware of the inherent contradictions among the different systems and realizes that each has some claim on being true, her thinking has postformal characteristics. Truth is relative, but one truth system may be more valid than another, depending on our goals. This example illustrates that descriptions of postformal thinking have parallels in the descriptions of “reflective practice” presented in Chapter 1. Sinnott’s characterization of postformal thought is consistent with what Chandler (1987; Chandler et al., 1990) calls postskeptical rationalism, in which we abandon the empty quest for absolute knowledge in favor of what amounts to a search for arguably good reasons for choosing one belief or course of action over another . . . an endorsement of the possibility and practicality of making rational commitments in the face of the clear knowledge that other defensible alternatives to one’s views continue to exist. (Chandler et al., 1990, p. 380) Interestingly, Chandler and his colleagues disagree that postskeptical rationalism actually represents thinking that is more advanced than formal operational thought. They are more inclined to see it as a result of self-reflection, a growing metacognitive awareness that is the product of “an ongoing effort to reflect on the status of the general knowing process” (p. 380) and to understand its strengths and its limits. Whether relativistic thinking is truly postformal is perhaps less important than when and under what circumstances it emerges. Let’s consider in greater detail two descriptions of cognitive change in the college years and the research in which they are grounded. In each of these theories, the final accomplishment is relativistic reasoning like that described by Sinnott. But each draws on data from studies of young adults to specify in detail how thinking might be restructured and why it is, especially for college students. They each describe a series of stages that we might consider substages in the progression from early formal operational thought to a postformal kind of thinking. Perry’s Theory of Intellectual and Ethical Development in the College Years William Perry’s (1970/1999) theory focuses on the cognitive and moral development of college students. Perry was a professor of education at Harvard and founder of the Harvard Bureau of Study Council, a counseling and tutoring center. Using many of Piaget’s ideas, Perry proposed a stage-based theory that depicts the typical intellectual and ethical transitions experienced by students in higher education settings, from absolute adherence to authority to beliefs founded on personal commitment. Perry’s theory examines the changes that occur over time in the structure of young adults’ knowledge, or, put another way, the changes in their expectations and assumptions about the world. Perry’s original study involved hundreds of volunteer Harvard and Radcliffe students from 1954 through 1963. The theory was constructed from extensive interviews of students as they moved through their college years. 
In general, interview questions were open ended, such as "Why don't you start with whatever stands out for you about the year?" (Perry, 1970/1999, p. 21), allowing students maximum freedom to talk about their experiences. Initially, Perry considered the differences in students' thinking or worldviews to be a function of their personality differences. It was only after careful reflection on many transcriptions that Perry and his team of raters began to consider the possibility of a developmental sequence. He states, "We gradually came to feel that we could detect behind the individuality of the reports a common sequence of challenges to which each student addressed himself in his own particular way" (p. 8). Although Perry acknowledged that specific forms of knowing do vary across domains of knowledge (as we saw in our examinations of cognitive development in childhood), he believed it was possible to identify a dominant position or overarching form of thought for a given individual at a given time.

Perry constructed a sequence of nine "positions," or stages, ranging from extreme dualistic thinking to high levels of personally committed beliefs. What happens in between is the stuff of intellectual growth during the college years. Few students, if any, enter college at the first position, and few leave having achieved the ninth position. Like Piaget's theory, Perry's is a theory of continual movement and transition. Students "rest" for a time at each of the positions, but the dynamic clearly moves forward. From his perspective, the experience of a liberal arts college education accelerates the growth process, particularly in a society that values pluralism, because students are invariably confronted with diversity of thought, values, and beliefs. Emerging adults may find the challenges of new academic and social responsibilities difficult to cope with, absent the familiar supports of home and family. To understand Perry's ideas, let's consider each of the positions and the three alternatives to growth (see Table 11.1 for a summary).

TABLE 11.1 Moving Toward Postformal Thought: Descriptions by Perry and Kitchener
(Perry: From Dualism to Relativism, paired with Kitchener: Emergence of Reflective Judgment)

Dualism
Perry Position 1 (Strict Dualism): There is right vs. wrong; authorities know the truth.
    Kitchener Stage 1: Knowing is limited to single concrete instances.
    Kitchener Stage 2: Two categories for knowing: right answers and wrong answers.
Perry Position 2 (Multiplicity, Prelegitimate): Multiple ideas exist; some authority knows what's right.
    Kitchener Stage 3: Knowledge is uncertain in some areas and certain in others.
Perry Position 3 (Multiplicity, Subordinate, or Early Multiplicity): Multiple perspectives are real and legitimate.
    Kitchener Stage 4: Given that knowledge is unknown in some cases, knowledge is assumed to be uncertain in general.
Perry Position 4 (Late Multiplicity): Oppositional solution: either "authority is right" or "no one is right." Relative subordinate solution: some opinions are more legitimate (better supported); outside guidance may be needed to learn how to evaluate and to reach this conclusion.
    Kitchener Stage 5: Knowledge is uncertain and must be understood within a context; can be justified by arguments within those contexts.
Relativism
Perry Position 5 (Contextual Relativism): Respectful of differing opinions, but belief that ideas can be evaluated based on evidence.
    Kitchener Stage 6: Knowledge is uncertain; constructed by comparing and coordinating evidence and opinions.
Perry Position 6 (Commitment Foreseen): Preference for a worldview begins to emerge despite awareness of legitimacy of other views.
    Kitchener Stage 7: Knowledge develops probabilistically through inquiry that generalizes across domains.
Perry Positions 7, 8, and 9 (Commitment and Resolve): "Flowering" of commitment; resolve to continue reflecting.

SOURCE: Based on Kitchener, K. S., Lynch, C. L., Fischer, K. W., & Wood, P. K. (1993). Developmental range of reflective judgment: The effect of contextual support and practice on developmental stage. Developmental Psychology, 29, 893–906; and Perry, W. G. (1970/1999). Forms of ethical and intellectual development in the college years: A scheme. San Francisco, CA: Jossey-Bass.

Position 1: Strict Dualism. Strict dualism is really a downward extrapolation of higher stages, given that virtually no one enters college at this level. Strict dualistic thinking implies a rigid adherence to authoritarian views, a childlike division between in-group (the group that includes me, my family, and authorities who have the "right" idea) and out-group (the group that is "wrong" or has no legitimate authority). Individuals in this stage simply never think to question their belief that authority embodies rightness. Because most adolescents have struggled with parents over autonomy issues and have experienced peers and teachers who, at the very least, have exposed them to various viewpoints, it is unlikely that many students would enter college with this extremely simplistic view of the world.

Position 2: Multiplicity (Prelegitimate). Multiplicity (prelegitimate) is characterized by the student's first encounters with multiplicity, that is, multiple ideas, answers to life's questions, or points of view. Students now find themselves face-to-face with uncertainty when exposed to a mass of theories, social experiences, and information. Their confusion is exacerbated because they lack the structure to accommodate the sheer volume of ideas. Despite their confusion, however, individuals at this stage maintain the belief that some "authority" possesses the ultimate truth or right answers. It is just up to the individual to find it. It is not uncommon, according to Perry, for students to sort through and organize confusing or contradictory information by creating mental dichotomies. For example, they may distinguish between "factual" courses, such as those in the sciences, and "vague" courses, such as those in the humanities. Students who pursue fields that are relatively clear-cut, at least at the early stages of study, may experience confusion when they later have to confront the multiplicity inherent in advanced levels of study. (Remember our examples of multiple truth systems in advanced sciences.) As one student in Perry's study complained about an instructor:

    He takes all sort of stuff that, that isn't directly connected with what he's talking about . . . so you get just a sort of huge amorphous mass of junk thrown at you which doesn't really mean much until you actually have some sort of foundation in what the man is talking about. (1970/1999, p. 97)

Position 3: Multiplicity (Subordinate), or Early Multiplicity. In the stage of multiplicity (subordinate), the individual grudgingly acknowledges the reality and legitimacy of multiple perspectives. For example, it becomes more difficult to deny that reasonable people can differ in their perspectives on life, and people who hold different views are not so easily dismissed as being wrong.
Some of the students’ beliefs in a just world (Lerner, 1980), beliefs that the world is fair and that people in it get what they deserve, are now reevaluated. Students realize that working hard on assignments or putting many hours into studying does not necessarily guarantee wished-for results. They may observe other students doing far less work than they do themselves and getting better grades. They may be distressed by their inability to understand “what the professors want.” They are nudged toward the sometimes painful realization that even their professors and other authority figures around them don’t have all the answers. They may also be distressed by the fact that their teachers continue to evaluate them, despite not having the “right” answers themselves. Position 4: Late Multiplicity. Late multiplicity was the modal position of Harvard and Radcliffe students in the original study in the latter part of their first year. Perry’s research identified two possible epistemologies or adaptations to the problem of multiplicity at this point in development. In effect, students at this stage now fully realize that even experts differ among themselves in regard to what is true. Students handle the realization in one of two ways. One response, identified as oppositional, is characterized by legitimizing multiplicity as one pole of a new kind of dualism. The right–wrong dualism of Position 1 moves to one end of a new continuum, with multiplicity on the other end. Individuals taking this view of the world succeed in maintaining a dualistic either-or structure in their thinking. In other words, either “authority is right” or “all opinions are equally right.” One student in Perry’s study captured the essence of this position when commenting to the interviewer about his English course: “I mean if you read them [critics], that’s the great thing about a book like Moby Dick. Nobody understands it” (1970/1999, p. 108). The viewpoint that nobody possesses the truth, thus rendering all people’s opinions equally valid, can provoke students to irritation when they believe their work or the content of their ideas has been evaluated unfairly. The second alternative, called relative subordinate, is less oppositional. Students with this perspective begin to understand that some opinions are more legitimate than others, presaging the relativism of Position 5. The value of a perspective is now understood to be related to the supporting arguments and evidence for the position. However, the consideration of alternative points of view is still done primarily under the guidance of authority. A Perry interviewee reported that his first set of grades in a literature course was mediocre because he could not understand the kinds of thinking required: Finally I came to realize, about the middle of the second term, that they were trying to get you to look at something in a complex way and to try to weigh more factors than one, and talk about things in a concrete manner. That is, with words that have some meaning and some relevance to the material you were studying.” (1970/1999, p. 112) Often students receive explicit guidance in helping them weigh opinions or compare and contrast ideas. Instruction such as this fosters the kind of metacognition—awareness of how rational arguments are constructed and weighed—that is the foundation of later relativistic thinking. Position 5: Contextual Relativism. The move to Position 5, contextual relativism, represents a major achievement in intellectual development. 
The first four positions are variants of a basic dualistic structure. The later positions represent a qualitatively different way of looking at the world. Knefelkamp (1999) reports on a common misunderstanding of Perry’s theory, which confuses the “anything goes” quality of late multiplicity with the concept of relativism. He recalls what Perry himself used to say: “Relativism means relative to what—to something—it implies comparison, criteria, and judgment!” (pp. xix–xx). The individual can no longer accept the fiction that everyone’s ideas are as good as everyone else’s. Although she respects the rights of others to hold diverse views, the student at this stage possesses sufficient detachment to “stand back” on her own and consider ideas and values more objectively than before. In a very real way, the student develops the habit of thinking that relies on some standard of evidence that is appropriate to the domain in question. Students’ new analytic abilities allow them to appreciate the merits of diverse perspectives and to find convincing elements in multiple points of view. Thinking relativistically, or thinking about knowledge in context, becomes more habitual. Authority figures are seen more as colleagues than they were before, as people grappling with the same conflicts that beset students, only with more experience in dealing with those conflicts. They are figures no longer to be opposed but to be respected, as this 3rd-year college student from the study illustrates:

I think when I was younger, when people in general are young, there’s [sic] so many problems that they feel they don’t have to face, and that’s why they’re indifferent to them. Either it’s something that somebody else—the hierarchy, like the family—worries about, or it’s something in the future that isn’t any problem yet. And then you, when you mature you begin facing these problems for yourself, and looking at them, and then the family just becomes a help to people . . . with more, with a lot of experience. To help you, and not to take the brunt of the problem or something that’s your worry. (Perry, 1970/1999, p. 138)

Position 5 also represents a watershed stage for religious belief, the point of demarcation between belief and the possibility of faith. No longer can an individual’s religious belief rest on blind adherence to authority. Real faith, Perry maintains, has been tested and affirmed in the context of a relativistic world. This implies that those who hold viewpoints other than one’s own may be wrong, but no more wrong than oneself, given that the student now rejects the idea of absolute truth. With some effort, individuals come to respect and tolerate those who hold different viewpoints even while they struggle to clarify their own beliefs.

Position 6: Commitment Foreseen or Anticipation of Commitment. With commitment foreseen, we hear echoes of Erikson’s discussion of identity development (see Chapter 9). Thinking at this stage incorporates a measure of moral courage, as the individual begins to affirm what it is she believes in, all the while knowing that reason will never provide absolute proof that her ideas or perspectives are right or better than others. Commitments to a set of beliefs, to a field of study or career, to relationships, and so forth, like the constructed commitments we discussed in Chapter 9, can take place “after detachment, doubt, and awareness of alternatives have made the experience of choice a possibility” (Perry, 1970/1999, p. 151).
This way of thinking incorporates not only respect for diverse ideas and understanding of their rationales but also emerging, personally chosen, preferences for worldviews. One student captures this element of Position 6: It seems to me that so much of what I’ve been forced to do here, this taking of two sides at once, just suspends my judgment. There is a value in it, in seeing any perspective, or any one particular facet of, of a problem. But there’s also a value in, in being able to articulate one side more than another. (p. 157) One notices a general trend in thinking toward personal meaning making or reflective thinking. Positions 7, 8, and 9: Commitment and Resolve. Perry discusses Positions 7 (initial commitment), 8 (multiple commitments), and 9 (resolve) together. Taken as a group, they suggest a flowering of the commitments anticipated in Positions 5 and 6. Changes in thinking are more qualitative than structural. According to Perry, 75% of students in the study had a level of commitment at Positions 7 or 8 at graduation. Despite its place at the end of the line, Position 9 does not imply a static resolution of existential conflict. On the contrary, it characterizes a state of courageous resolve to continue the work of reflecting on one’s commitments throughout adulthood. Perry also accounted for individuals who refrain from taking the intellectual challenge necessary for growth through these stages. Fallback positions include temporizing, retreat, and escape. Temporizing refers to delaying movement to the next stage. Escape characterizes a movement back to relativism when the demands of commitment prove too taxing. Retreat occurs when individuals revert to dualistic thinking in times of stress in order to seek the intellectual security of absolute right or wrong, a position that is unavailable at the level of committed relativism. Although Perry’s theory has been extremely popular, particularly among student personnel professionals, it has some limitations. The first five positions of the theory emphasize intellectual development; the last four pertain to moral and identity development. Thus, Perry’s scheme incorporates several abstract constructs such as identity, ego development, and cognitive development simultaneously, making it difficult for researchers to agree on definitions and measurement. Some have noted that the lack of uniformity in assessment of stages prevents researchers from making valid comparisons of their findings across studies. Consequently, there is a paucity of recent empirical data on the linkages between Perry’s theory and general cognitive processes in adulthood. Some efforts have been made to address the issue of assessment. Several instruments and questionnaires have been developed to identify stages of reasoning that are less time consuming to administer than the interview method used in Perry’s original work (see Baxter-Magolda & Porterfield, 1985; Moore, 1982; Taylor, M. B., 1983). Suggestions for informal assessment of cognitive development for use by residence life professionals have also been proposed (see Stonewater & Stonewater, 1983). There is some research using Perry’s framework to explore the connection between students’ beliefs about knowledge and their approach to learning. For example, Ryan (1984) found that relativists were more successful in their college classes because they tended to use more constructivist approaches to studying course material. 
They paid attention to context, constructed meaningful interpretations of textual information, and summarized main ideas. Dualists, in contrast, were more likely to focus on memorization of factual information. These differences were significant even when the effects of scholastic aptitude were eliminated statistically. Using slightly different conceptual frameworks, other researchers have demonstrated that effective problem solving is related to relativistic thinking (Schommer, Crouse, & Rhodes, 1992) and that relativistic thinkers are more likely than dualistic thinkers to provide legitimate evidence to support their thinking and problem solving (Kuhn, 1992). Wilkinson and Maxwell (1991) found support for the relationship between college students’ epistemological style (dualistic, multiplistic, or relativistic) and their approach to problem-solving tasks. Dualists took a rather narrow view of the tasks, breaking them down into unrelated, discrete parts and ignoring some important aspects. Relativists were more likely to consider the whole problem, processing and taking into account all of its components before attempting a solution.

Gilligan (1977) applied Perry’s description of the development of relativistic thinking to an analysis of Kohlberg’s scoring of young people’s responses to moral dilemmas. She discussed a phenomenon called “late adolescent regression,” wherein about one third of Kohlberg’s samples actually regressed to lower levels on his scoring criteria. Gilligan argued that this “regression” actually indicates a more contextualized, relativistic stance in response to moral dilemmas, representing a more inclusive form of principled reasoning. As such, so-called regressions should more reasonably be considered advances. Kohlberg’s original scoring system has since been revised.

Kitchener’s Model of the Development of Reflective Judgment

As we noted earlier, and as helpers know all too well, many problems of adulthood are ill-defined. An ill-defined problem has neither one acceptable solution nor one agreed-on way to solve it (Kitchener, 1983). Should a talented athlete stay in college or accept an attractive job offer? Should a young woman pursue a high-powered career that will leave little room in her life for marriage and child rearing? How can a young adult deal with the pressures of academic and social life? Moreover, how do helpers deal with the messy issues that come to them on a daily basis? Kitchener and her associates (Kitchener & King, 1981; Kitchener et al., 1989) have proposed a seven-stage theory outlining the development of reflective judgment, that is, how people analyze elements of a problem and justify their problem solving (see Table 11.1). They presented individuals with a standard set of ill-structured problems from the social and physical sciences and questioned them about the reasoning they used in coming to conclusions about the problems. Like Perry, these researchers found a predictable, sequential progression that moved from a belief in the existence of absolute, fixed certainty to a kind of contextual relativism (see Kuhn, 2009; Moshman, 2008). For Kitchener, different stages of thinking can be differentiated on the basis of three dimensions: certainty of knowledge, processes used to acquire knowledge, and the kind of evidence used to justify one’s judgments.
As you can see from Table 11.1, the early stages (1 through 3) are characterized by a belief in the existence of certainties and the use of personal justification (“This is just the way it is”) or reliance on authorities for guidance. Individuals in the early stages also tend to use personal observation as evidence of the rightness of their judgments. Individuals in the middle stages (4 and 5), similar to Perry’s multiplists, perceive knowledge as uncertain. They believe in the supremacy of personal opinion and tend to make judgments based on idiosyncratic kinds of reasoning. Those in the later stages (5 through 7) resemble Perry’s contextual relativists in that they tend to make judgments based on a set of rules or logic in combination with personal reflection. For example, one reflective judgment problem concerned whether certain chemicals in foods, such as preservatives, are good or bad for us. The following is a prototypic example of a Stage 5 response to such a problem: I am on the side that chemicals in food cause cancer, but we can never know without a doubt. There is evidence on both sides of the issue. On the one hand there is evidence relating certain chemicals to cancer, and on the other hand there is evidence that certain chemicals in foods prevent things like food poisoning. People look at the evidence differently because of their own perspective, so what they conclude is relative to their perspective.” (Kitchener, Lynch, Fischer, & Wood, 1993, p. 896) Some research demonstrates that reflective judgment is related to level of education (Dunkle, Schraw, & Bendixen, 1993; Kitchener & King, 1981) as well as to the kind of training one has received (Lehman, Lempert, & Nisbett, 1988). Graduate students, for example, reason at higher levels than do college undergraduates, and graduate students in psychology, a discipline that emphasizes statistical reasoning, show higher levels of proficiency on such tasks than do graduate students in chemistry, medicine, or law. Specific kinds of training or support appear to improve skills in reasoning and judgment. In one study, individuals from middle through graduate school were provided with prototypic statements like the one quoted above, with each statement modeling successively higher levels of reflective judgment, and then they were asked to explain the reasoning in the prototypic statement. The results indicated that after such modeling and practice, participants’ own levels of reasoning on such problems had advanced. One classic study illustrates that reflective judgment in social and personal issues tends to lag behind problem solving in domains that do not relate to one’s own personal concerns (Blanchard-Fields, 1986). In this study, participants ranging in age from 14 to 46 were presented with two accounts of each of three events. One event that had little personal relevance for most people was an account of war (the Livia task) by two opposing parties (see Kuhn, Pennington, & Leadbeater, 1982, for a full description of the task). The remaining two events were characterized as “a visit to the grandparents” and “the pregnancy,” and both events were rated by participants as emotionally involving. In the first of these, a teenage boy and his parent each present a story about a time when the boy was required to accompany his parents on a visit to his grandparents. The two stories are inconsistent in emotional tone and in many details (see Box 11.1 for the full text of the competing accounts). 
In the second emotionally involving event, a woman and a man each take a different stance on the woman’s pregnancy, she favoring an abortion, he against an abortion. For each of the three events, study participants were asked to explain what the conflict was about and what happened. They also responded to probe questions such as “Who was at fault?” and “Could both accounts be right?” The participants’ understanding and analysis of the events were scored based on six levels of reasoning, combining features of Perry’s (1970/1999) and Kitchener and King’s (1981) levels of cognitive maturity. Performance on the Livia task was better at earlier ages than performance on the more emotionally involving tasks. Performance continued to improve on all tasks from adolescence to young adulthood and from young adulthood to middle adulthood. The following are examples of performance for these three age groups on the emotionally involving “visit to the grandparents” event (from Blanchard-Fields, 1986). Level 2. This was the average level of response for adolescents on the two emotionally involving events. It is close to an absolutist conception of reality. There’s a lot more said by John of what they did and they had an argument and the parents did not say it like—how he talked, and that he wanted to be treated like an adult. It seems more right because I don’t like the parents’ talk.” (p. 327) Level 3. This level was about average for young adults in this sample. They recognized that different perspectives appear valid, but they tended to cling to the possibility that there may be an absolute truth, even in such ill-defined situations. Yes [they could both be right]. I think you’d have to have a third person not involved emotionally with either party. They’d be able to write without feeling, the facts, just what happened. (p. 327) Note that “just what happened” suggests one correct interpretation of events. Level 4. This level was about average for middle-aged adults. They were not biased toward one side or the other, and the idea that more than one truth might exist was intimated, but there was still a strong sense that one can identify an essential similarity or truth despite the different perspectives. I think the accounts, as far as the actual events, are pretty much the same. The important differences are in the presentation . . . the important differences are in their perceptions of what was going on. The actual “this happened” are the same, but the interpretation of it is different. (p. 328) Level 6. Perhaps you will recognize this relativistic view as one that helping professionals are trained to assume in highly personal matters. There is no hint here that one experience of the event was more valid than the other. They’d have to be able to really share [to resolve this conflict] . . . this way with each other, and even deeper, in terms of getting into some of their fears and some of their angers. They have to be able to accept it in each other, and maybe, they can resolve the conflict, only if they can accept feelings in each other. (Blanchard-Fields, 1986, p. 328) Only a few participants, all of them middle-aged, responded at such an advanced relativistic level in this study. As Kuhn et al. (1995) point out, “topics in the social sphere both engage people and challenge them. They are easy to think about but hard to think well about” (p. 120, italics added). In Box 11.2 we consider some of the mistakes in problem solving that researchers have found to be quite common, even among adults. 
You will notice that these tend to occur in situations that can be very personally involving, so they have particular relevance to helping professionals. Box 11.1: A Visit to the Grandparents Blanchard-Fields (1986, p. 333) presented adolescents and adults with three tasks, each consisting of two discrepant accounts of the same event. Participants were interviewed to assess their reasoning about the events. The following are the two accounts for the task called “A Visit to the Grandparents.” Adolescent’s Perspective I’d been planning on spending the whole weekend with my friends. Friday, in school, we’d made plans to go to the video game arcade and the school carnival. We were all looking forward to a lot of fun. Saturday morning my parents surprised me by telling me that we were going to visit my grandparents that day. They reminded me that they’d planned this a long time ago. But how am I supposed to remember those things? So, we ended up in one of those big arguments with the typical results; I gave in and went with them. The worst part was when they lectured me on why it was my duty to go without ever once looking at my side of it. When we finally got to my grandparents, I had to do everything they wanted to do. I had to answer silly questions about school, play their games, and see their old slides and movies. It eventually blew up in an argument between me and my parents over the legal age for drinking. Even though I was being as polite as I could, it was boring. I felt forced into everything. I just can’t wait until I am free and out on my own. I was really angry with them. So, on the way home I told them that I wanted to be treated more like an adult; that I wanted more respect from them and I wanted them to take my plans seriously. They seemed to agree with this and decided that I was now old enough to make my own decisions. Parents’ Perspective Two months had gone by since we had visited the grandparents. We try to visit them at least once a month, but everyone gets so busy that a month slips away very quickly. This time, we planned the visit far enough in advance so that everyone would come. When we tried to get John ready to go with us on Saturday morning, he put up a battle. After all, he hadn’t seen his grandparents for a long time. They are getting old and won’t be around much longer. We tried reasoning with John, stressing the importance of family unity and obligation as well as consideration of others. Certainly, in the future, he would regret not spending more time with his grandparents after they’ve gone. Although John was reluctant to go, he finally came with us and actually seemed to really enjoy himself. Since he seemed to be having a good time, we were surprised by how angry he became when we all got into a discussion about the legal age of drinking. Even though he was reluctant to go with us at first, he seemed to have a good time, to enjoy the family closeness. He showed respect for his grandparents and seemed to understand how good it made them feel to see him. What this means to us is that he’s old enough now to enjoy being with adults more and to learn from them. On the way home we agreed that John should take a more active part in discussions about family matters. SOURCE: Blanchard-Fields, F. (1986). Reasoning on social dilemmas varying in emotional salience: An adult developmental perspective. Psychology and Aging 1, 325–333. Reprinted by permission of the American Psychological Association. 
Box 11.2: Helper Beware: Decision-Making Pitfalls Our ability to think our way logically through problems improves and expands throughout childhood and adolescence. As this chapter indicates, problem-solving skill continues to improve in adulthood as well, especially for ill-defined problems. We have also seen in this chapter that effectively using our logical abilities to make decisions can be especially difficult when we deal with social or emotional issues that have personal relevance. Among the many kinds of logical fallacies that commonly ensnare adults (see Stanovich, 1998) are some that may be especially problematic for helping professionals and their clients. Suppose you want to encourage a client to consider beginning an exercise program. You are concerned that her sedentary lifestyle is contributing both to her depression and to other health problems. When you introduce the idea, however, she counters, “That won’t help. I have a neighbor who has run 20 miles a week for all the years I’ve known her, but she had to be hospitalized last year because she was suicidal.” The logical error that your client is making is sometimes called “the person who” fallacy. She is refuting a well-documented finding, like the correlation between regular exercise and well-being (both physical and emotional), by calling on knowledge of a person who is an exception. Especially when dealing with psychological and social issues, matters in which most of us have great personal interest, “people tend to forget the fundamental principle that knowledge does not have to be certain to be useful—that even though individual cases cannot be predicted, the ability to accurately forecast group trends is often very informative. The prediction of outcomes based on group characteristics is often called aggregate or actuarial prediction” (Stanovich, 1998, p. 149). So, for any individual, a prediction about the effectiveness of a treatment is more likely to be accurate if we base that prediction on general findings for people with that individual’s characteristics than if we base it on one or a few other individuals whom we have known or observed. Like many logical mistakes, “the person who” fallacy involves failing to step back and consider how well the evidence supports one’s theory. Your hypothetical client has a theory that exercise will not relieve her depressive symptoms, perhaps motivated in part by her distaste for exercise. She is aware of one instance in which the prediction of her theory appears to have been correct. She fails to recognize that a single case is an inadequate test of a treatment’s effectiveness. One outcome can be influenced by multiple factors, many of them unknown. Also, your client completely ignores the evidence against her theory: Proportionally, more people experience long-term emotional benefits from exercise than not. In other words, probabilistically, regular exercise has a good chance of helping. Part of the reason that “the person who” fallacy occurs is because of the vividness effect. When we are trying to make decisions, some salient or vivid facts are likely to attract our attention regardless of their actual value as evidence. Even when other facts are available the vividness of personal experience or of the personal testimony of other individuals can be greater than that of any other information that we might access. Wilson and Brekke (1994) documented the strength of the vividness effect in a study of consumer behavior. 
Participants in the study were told they would be given free condoms and that they could choose between two brands. They were also given access to two kinds of information about the condoms to help them choose a brand. One was an extensive analysis of survey data on the performance of condoms from a Consumer Reports magazine and the other was a pair of student testimonials. Objectively, the data from the Consumer Reports analysis were more useful in this case, but most participants asked for the testimonials as well, and when the testimonials were in conflict with the survey research, about one third of the participants were swayed by the testimonials. Thus, even in a situation in which both carefully collected group data and individual testimony were readily available, the vividness of the less appropriate individual testimony was hard for many participants to resist. The insidiousness of vividness effects is especially problematic when testimonials are one’s source of evidence for the effectiveness of a treatment. When your client’s Cousin George swears that his son was relieved of his depression by “Dr. Olivino’s oil immersion therapy,” the appeal of Cousin George’s testimonial may be irresistible to your client, especially when she observes for herself that George’s son does indeed appear to be doing quite well. Unfortunately, people can be found to offer testimonials for any therapy or treatment that has ever been offered. One reason for the ready availability of testimonials is, of course, the placebo effect. People sometimes improve with attention or in the course of time, no matter what the medical or psychological intervention. The actual effectiveness of a treatment can only be determined in careful studies in which some participants are given the treatment and some are given a dummy, or placebo, version of the treatment. Typically in such studies, a substantial percentage of participants given the placebo control will improve, and typically they are convinced that their “treatment” is responsible. Clearly, the actual effectiveness of a treatment approach cannot be determined by testimonials. The only useful indicator of the effectiveness of a treatment is whether treatment participants are benefited more than placebo controls. Counselors, too, often find the vividness effect hard to resist. For example, our own experience with individual cases often looms larger in our decision making about probable treatment outcomes than the actuarial evidence that is available to us from controlled studies of a treatment’s effectiveness. How can we protect ourselves and our clients from making poor decisions based on “the person who” fallacy and the vividness effect? Increased metacognitive awareness—that is, awareness of one’s own thinking—seems to be the key. For example, educating clients to see that personal experience or the testimony of others can be limited in its generality, despite its vividness, can help them resist the appeal of such evidence. More generally, we can encourage clients to understand the decision-making process itself. When we try to decide whether a treatment should be pursued, we are actually evaluating the theory that a particular treatment alternative will cause improvement. Our task is to specify all the possible theories (treatment options) and to evaluate the evidence for each. When people have a favorite theory, they often do not realize that it is just a theory. As Kuhn (1991) points out, they think with their theory, not about their theory. 
In other words, the theory guides their thought and what they pay attention to instead of being an object of thought and evaluation. So, for example, if your client believes that Dr. Olivino’s oil immersion therapy works, she will pay more attention to examples that support her theory and may either fail to look for, or will ignore, counterevidence. Helping clients make better decisions, then, will include encouraging them to recognize that they are working with a theory, not a fact. A theory must be justified by evidence, and all the available evidence, both pro and con, should be considered. In addition, alternative theories (treatment options) and the evidence for them should be considered. Testimonials or personal experiences are not likely to become less vivid in this process. But when our clients are armed with knowledge about the shortcomings of such data, they may become more cautious about using them as evidence.

Applications

On the cusp of adulthood, individuals in their late teens and 20s confront a number of new developmental milestones. The serious tasks of consolidating an identity, solidifying a career path, and realizing the capacity for intimacy (Erikson, 1968; Keniston, 1971) are both challenging and time consuming. One competency required for each of these tasks is the ability to make decisions and choices given a wide array of possible alternatives. Helpers are frequently called on to help clients make decisions, and, to some, this is the quintessential role of the helper. Decision-making (Krumboltz, 1966; Stewart, Winborn, Johnson, Burks, & Engelkes, 1978) and problem-solving (D’Zurilla, 1986; Egan, 1975) models are widely used among clinicians to help people with personal and career-related issues. Such models have in common a series of general steps, including (1) defining the problem, (2) setting realistic goals, (3) developing a variety of possible solutions, (4) assessing the costs and benefits of each alternative solution, (5) selecting and implementing one alternative, and (6) reviewing the effectiveness of the solution after implementation (Nezu, Nezu, & Lombardo, 2003). On its face, this strategy appears to incorporate some aspects of postformal thinking, specifically the relativistic, pro-and-con nature of solutions to fuzzy problems, for it is rare to find solutions to difficult problems that don’t have potential disadvantages as well as advantages. But it can also suggest to more dualistic thinkers that problems are “well defined” and have correct solutions, if only someone can figure them out. Although most professionals understand that any solution or decision incorporates some good and some bad, their clients may not. Given what we know about the importance of belief systems, the helper might be remiss if she did not consider the client’s epistemology, or her beliefs about the nature of knowledge, when working on decision making—or other problems, for that matter. In other words, seeing a dilemma through the client’s eyes requires an appreciation of the way she views truth, knowledge, and meaning in life. If a client tends to see the world in absolute terms, she might be confused as to why alternatives are even being considered! Twenty-year-old Hannah recently moved into her boyfriend Mike’s apartment near the college campus where both are students. She is upset that Mike wants to spend so much time with his friends, playing basketball and going to bars.
Hannah resents the fact that he doesn’t spend his free time with her, and, to make matters worse, she believes drinking alcohol is morally wrong. Because the tension between the two is so great, Hannah consults a counselor for advice about how to change Mike’s behavior. Hannah becomes indignant when the counselor asks her to consider the advantages of allowing Mike to socialize with his friends and to reflect on her role in the relationship problems. “Mike is the one with the problem,” Hannah states emphatically. “I came here to find out how to get him to stop drinking and to spend time with me.” Hannah will become frustrated if the counselor presses her to adopt a relativistic stance too quickly, and she may seek out another “authority.” She will probably not return to this counselor. Level of epistemological thinking also influences academic success. Research has demonstrated that students who believe that learning should be quick, that prolonged concentration is a waste of time, and that memorization of concrete facts is sufficient for learning demonstrate poor performance on mastery tests despite reporting high levels of confidence in their ability (Schommer, 1990). Students who perform better on mastery tests are more likely to disagree with the view that knowledge is certain and that single, simple answers are most reflective of reality (Schommer, 1993; see Muis, 2007). Skills in reflective thinking are also correlated with greater understanding of multicultural issues (King & Howard-Hamilton, 1999). In general, relativistic thinking is considered to be more sophisticated because most problems in life involve uncertainty as well as some measure of uncontrollability. Expectations that complex problems should have clear-cut solutions can complicate the problem-solving process by increasing individuals’ anxiety when simple answers prove inadequate (D’Zurilla, 1988). The ability to generate solutions, assess advantages and disadvantages of each, and then integrate aspects of several solutions into one presumes a level of epistemic knowledge that has come to terms with the ambiguity of real-world problems. Yet as Perry and others have demonstrated, this skill develops gradually; it does not necessarily come with a high school diploma. Professionals who are knowledgeable about cognitive development can help clients resolve specific problems and help them acquire more general skills in decision making, such as knowing when and where a strategy is effective. Stonewater and Stonewater (1984) have proposed a problem-solving model based on Perry’s theory that balances a bit of challenge (to introduce disequilibrium) with support and engagement by the counselor. They suggest that individuals who are closer to the dualistic end of the continuum need more carefully controlled exposure to diverse ways of thinking and can benefit from the provision of adequate structure as a framework for incorporating new ideas. For example, students who find the process of career selection overwhelming may be aided by a very specific set of activities that guides them through the exploration process. Students who are at the stages of dualism or early multiplicity may find that knowing what questions to ask or what criteria to investigate allows them to engage in the confusing career decision-making process with a supportive road map. 
In a process very consistent with Piaget’s ideas, individuals will not be able to accommodate new ways of thinking if the complexity of the information is too great and the support is not sufficient. Likewise, individuals will not be motivated to accommodate at all if the challenge is minimal and support is overdone. King and Kitchener (2002) offer specific suggestions for teachers and others who work with emerging adults. Several of these ideas can be useful in promoting reflective thinking about academic and psychoeducational content. First, demonstrate respect for young adults’ thinking, keeping in mind that they are most likely to open up to new ways of reflection in a supportive atmosphere. Second, provide opportunities for young adults to gather information about an issue, evaluate the quality of that information, and draw conclusions based on that data. Third, teach young adults to explicitly examine the epistemological assumptions they use in making decisions about ill-defined problems. The theories of cognitive development described in this chapter make another important contribution to the practicing clinician. They help us understand that certain kinds of thinking, sometimes labeled “irrational,” may not be attributes of a dysfunctional personality style but rather the manifestation of a developmental stage. Construing clients’ thinking and judgments from this perspective allows us to be more forgiving of their idiosyncrasies and more supportive about their potential for growth and change. Sometimes, clients with fairly rigid belief systems may not have been exposed to the kinds of contexts that encourage them to examine beliefs carefully or to consider alternative explanations. Or clients who hold less mature perspectives may be retreating from the confusion of too many ideas and too little support. Here the helper has a good opportunity to assess the person’s thinking and provide a balance of support and challenge that facilitates progress from a dualistic, polarized belief system to one that is more cognitively complex (Sanford, 1962). Putting Things Off The road to adulthood is marked by a point when individuals are no longer bound by the externally controlled routines of home and school. This freedom can be exhilarating, but many are set adrift by the lack of structure. Feeling unsafe and framework-less once again, many young people search for the certainty of “right” answers and struggle with the challenge of organizing large blocks of unstructured time to work on tasks that may have no clear-cut boundaries. In the past, they could rely on authoritative sources to provide guidelines and directions. Now, the need to become more self-reliant increases dramatically. One problem that arises frequently at this point in the life span is procrastination. This problem has the power to derail, even if temporarily, emerging adults’ potential success in academic and work environments. Procrastination is a multidimensional construct having motivational, behavioral, cognitive, and personality components, and may be the end product of diverse pathways. Those who investigate procrastination as a dispositional, traitlike characteristic (Paunonen & Ashton, 2001; Schouwenburg, 2004) report that procrastinators share low levels of trait conscientiousness (see the section on personality in Chapter 13), rendering this group more likely to be disorganized, poor at planning, and lacking the self-regulation needed to accomplish things in a timely way. 
Several subtypes have been identified within the general population of procrastinators. Those with low levels of conscientiousness and high levels of neuroticism tend to be fearful, anxiety-prone, and perfectionistic. These perfectionistic procrastinators abide by the “procrastinator’s code” (Burka & Yuen, 1983), that specific cognitive tendency to equate failure in one area with a generalized sense of failure in all aspects of the self. An unrealistic set of perfectionistic standards and irrational beliefs, namely that failure is a fate worse than death, may fuel their avoidant behavior. Those whose combination of personality traits includes low conscientiousness, low neuroticism, and high extraversion are more likely to procrastinate without the worry. These individuals also avoid tasks they consider unpleasant and prefer to engage in more satisfying pursuits, but, unlike their perfectionistic counterparts, their avoidance is not driven by fear of failure. Some have also identified a “rebellious” type of procrastinator who resists authority and asserts independence through procrastination. In counseling situations, care should be taken not to try to fit everyone who procrastinates into one of these “types” or to assume that patterns of dilatory behavior in certain contexts necessarily reveal the nature of that person’s personality. Most of us have avoided aversive tasks at one time or another, and sometimes we may procrastinate in one setting (academic) but not in another (job). We suggest, as well, that there are developmental reasons for the ubiquity of procrastination at this stage of the life span. The landscape of emerging adulthood can be perceived by some as a frontier with open-ended possibilities and many high-stakes choices. The skills of decision making and self-management are honed in the identity development process, which proceeds concurrently with entry into college, military service, or the world of work. Procrastination may be the manifestation of confusion or lack of certainty about how to proceed. Although this is a growing area of research, we still need more evidence to sort out information about the nature of procrastination and ways to intervene to reduce it. Behavioral approaches such as instruction in time management and organizational skills are very popular (Tuckman, Abry, & Smith, 2002). These techniques can be useful in helping students learn to segment tasks into manageable components, develop timelines and benchmarks for task completion, and provide themselves with regular feedback about performance. Other approaches blend behavioral elements with specific components that address underlying psychological needs. For example, van Essen, van den Heuvell, and Ossebaard (2004) developed a self-management course that facilitated self-reflection about personal reasons for procrastination as the basis for change. Rational Emotive Behavior Therapy (REBT) principles were incorporated to address the irrational beliefs that supported procrastination. These authors reported that the program was effective in enhancing self-efficacy and reducing procrastination. For perfectionistic procrastinators, helpers may find stress-management interventions such as guided relaxation training and systematic desensitization procedures to be helpful adjunctive therapies (Flett, Hewitt, Davis, & Sherry, 2004).

Growth and Change in Professionals’ Epistemology

Don’t be alarmed if you recognize that some aspects of your own thinking might seem dualistic or authority-oriented.
Perry noted that new learning in any discipline can evolve in a similar stagelike way. Table 11.2 presents Stoltenberg and Delworth’s (1987) model of stages in counselor development. As you can see, Level 1 counselors operate somewhat like dualists. They tend to be dependent on supervisors to tell them the “right” way to conceptualize cases. Level 1 counselors try to fit clients into categories and often rely on canned strategies. In general, they also tend to attribute too much pathology to their clients because of their own anxiety level. Level 2 counselors vacillate between dependence on supervisors and personal autonomy. Because they have more experience with difficult cases, they may be less optimistic about the possibilities for change in certain circumstances. As with multiplists, exposure to competing theories of human behavior and therapy undermines confidence in the validity of each. Counselors at this level are more skillful yet also more confused about their own efficacy. At Level 3, helping professionals are more independent and also more tolerant of divergent opinions. They come to accept the ambiguity that is inherent in the helping process. They are creative problem solvers and more objective in their assessment of their clients. In the fluid way they utilize the discipline’s knowledge base to adjust to the needs of the client, these counselors truly embody the characteristics of reflective practice.

TABLE 11.2 Client Conceptualizations at Stages of Counselor Development

Level 1
Self- and Other-Awareness—Emotional and cognitive self-focus. Indications: Diagnoses/conceptualizations will be “canned” or stereotypical, trying to fit clients into categories. Incomplete treatment plans will focus on specific skills or interventions, often quite similar across clients. Treatment plans may not reflect diagnoses.
Motivation—High, with strong desire to learn to become effective diagnostician and therapist. Indications: Willing student, will seek out additional information from books, colleagues, and other sources.
Autonomy—Dependent. Indications: Relies on supervisor for diagnoses and treatment plans. Locus of evaluation rests with supervisor.

Level 2
Self- and Other-Awareness—Emotional and cognitive focus on the client. Indications: Realizes treatment plan is necessary and logical extension of diagnosis. May resist “labeling” client into diagnostic classifications. Treatment plans may prove difficult due to lack of objectivity. May reflect various orientations yet lacks integration.
Motivation—Fluctuates depending on clarity regarding various clients. Indications: May be pessimistic, overly optimistic or confident at times.
Autonomy—Dependency-autonomy conflict. Indications: May depend on supervisor for diagnoses and treatment plans for difficult clients, may avoid or resist supervisor suggestions concerning others. Confident with some less-confusing clients. More resistance to perceived unreasonable demands of supervisor, threats to tenuous independence and therapeutic self-esteem.

Level 3
Self- and Other-Awareness—Emotional and cognitive awareness of client and self. Indications: Able to “pull back” affectively and cognitively, monitor own reactions to client. The client’s perspective and a more objective evaluation will be reflected in conceptualizations. Treatment plans will flow from diagnoses, taking into account client and environmental characteristics. Reflects therapist’s own therapeutic orientation.
Motivation—Consistently high, based on greater understanding of personality-learning theory and self. Indications: Not as susceptible to pessimism or undue optimism. Diagnoses and treatment plans consistently thought through and integrated.
Autonomy—Independent functioning. Indications: Seeks consultation when necessary. Open to alternative conceptualizations and treatment approaches but retains responsibility for decisions. Makes appropriate referrals.

SOURCE: Stoltenberg, C. D., & Delworth, U. (1987). Supervising counselors and therapists: A developmental approach. San Francisco, CA: Jossey-Bass. Used with permission by John Wiley & Sons, Inc.

Focus on Developmental Psychopathology

Depression

Most people have some familiarity with the concept of depression, either from professional training and work with clients, from widespread coverage in popular media, or from all-too-common personal experience. Unfortunately, depression can be trivialized and misunderstood by those who have not lived through its torment. The novelist William Styron (1990), having suffered from clinical depression himself, takes issue with our limited understanding of this terrible ordeal. He described the experience as “a howling tempest in the brain” (p. 37), not at all consistent with the standard comments of people who have not experienced it, such as “We all have bad days” or “You’ll pull out of it.” This misunderstanding might be clarified if depression were specified more clearly. Some advocate differentiating between a depressive symptom (e.g., sad mood), a depressive syndrome (e.g., sad mood plus anxiety), and a depressive disorder (Rutter, Tizard, & Whitmore, 1970). According to formal diagnostic criteria, depression is a serious mood disorder that has several presentations, including unipolar, bipolar, dysthymic, and cyclothymic forms. Some of the core features of major depressive disorder (MDD) include sad affect, anhedonia, fluctuations in weight and/or sleep, psychomotor changes, fatigue, cognitive impairments, feelings of guilt or worthlessness, and suicidal thoughts or acts. At least five symptoms must be present for at least 2 weeks, be clinically significant, and impair normal functioning in order to qualify for diagnosis (APA, 2013). Although this seems simple enough, diagnostic problems do result from the counting up of symptoms because normal mood fluctuations can be misconstrued as clinical depression. Evidence is building that it is the number and intensity of symptoms, and not the presence of symptoms alone, that should distinguish between depressed and nondepressed individuals (Angst & Merikangas, 2001; Slade & Andrews, 2005). The various manifestations, symptom sets, risk factors, and courses of depression have led researchers to consider depression as a heterogeneous category of disorders (Chen, 2000; Chen, Eaton, Gallo, Nestadt, & Crum, 2000). Because all types of depression include alterations in mood, others argue that depression is best represented as a spectrum disorder, with unipolar depression on one end, atypical depression in the middle, and bipolar disorder at the other end (Akiskal & Benazzi, 2007). Although it is clear that children suffer from depression, the question of whether depression in childhood is qualitatively different from depression in adolescence and adulthood has not been resolved (Goodyear, 1996; Weiss & Garber, 2003).
Nolen-Hoeksema (2008) sums up the issue, writing that “the notion that there is a coherent entity we call depression that will be identifiable in the body is just wrong, or at least, not very useful” (p. 178). Fundamentally, symptoms that collectively manifest as depression appear to arise from high levels of negative affect and stress.

Prevalence and Comorbidity

The World Health Organization predicts that depression will become the second most costly disease in the world by 2020 (WHO, 2004). The scope of the problem is staggering. Based on interviews used for the National Comorbidity Survey Replication (Kessler et al., 2003), the lifetime prevalence rate of MDD in the United States is 16.6%, or 32.6 to 35.1 million adults. Within the 12-month period prior to interview, 6.6% of U.S. adults (13.1 to 14.2 million) suffered from MDD. Among those in the 12-month MDD group, functional impairments were great. Approximately 97% reported significant problems in social relationships and work roles. As we have noted, depression is not just a problem for adults. Rates of depression increase from approximately 4% in childhood (Angold & Rutter, 1992) to 5% to 15% in adolescence (Brooks-Gunn & Petersen, 1991). There is also good evidence that the incidence of MDD is increasing worldwide, and its age of onset is decreasing. This troubling finding means that depression has been appearing in successively younger generations since 1940 (Cross-National Collaborative Group, 1992; Kessler et al., 2003). Recently, a similar trend has been observed for bipolar illness (Chengappa et al., 2003). This phenomenon, called the “Birth Cohort Effect,” cannot simply be explained by genetics, since genes presumably do not change so rapidly. Furthermore, earlier and earlier spikes in depression are supported by objective measures such as increases in hospitalizations and suicides. Females are diagnosed with disproportionately higher rates of depressive syndromes and serious disorders than males from mid-adolescence through late adulthood (Compas et al., 1993). Before puberty, boys’ rates of depressive symptoms and disorders are equal to or higher than those of prepubescent girls (Nolen-Hoeksema et al., 1992). Both genders suffer from bipolar disorder in equal numbers, an exception to the sex-based trends found in unipolar disorders (American Psychiatric Association, 2000). Females are also more likely than males to ruminate about their sad moods when depressed, to admit feeling sad, and to seek help (Nolen-Hoeksema, 2002). However, we can’t assume from this evidence that males do not also suffer from depression in high numbers. Addis’s (2008) Gendered Responding Framework of depression offers a way of understanding how male gender norms interact with the experience of negative affect. In this framework, males’ experience of negative affect (grief, sadness, dysphoria) may be handled through distraction, anger, or avoidance because of prototypical masculine role expectations. Males, too, feel sad and upset, but they may learn, through social conditioning, to express their vulnerabilities in ways that don’t neatly map onto available checklists of depressive symptoms. Depression frequently coexists with other illnesses.
In children, anxiety and impulse-control disorders have been found to manifest prior to depression and show high comorbidity with depression later on (Kessler et al., 2003). Angold and Costello (1993) reported that depressed children had rates of conduct problems 3 to 9 times higher and rates of anxiety disorders 2 to 25 times higher than nondepressed children. This pattern may point to a pathway from anxiety and/or behavioral problems to later depression in youth. As is the case in childhood, depression often coexists with other problems in adulthood, impacting physical health and mortality (Klerman & Weissman, 1989). Recently, Chatterji (2008) reported that people who suffered from depression in addition to a chronic illness such as diabetes, angina, arthritis, or asthma were in the poorest health category when compared to all other individuals.

The Perplexing Search for Causes

Despite enormous research efforts, the specific causes of depression remain unknown. The possible heterogeneity of the disease and the complexity of interacting influences make simple answers unlikely. However, some variables are clearly correlated with risk. Genetic/familial predispositions, early adverse life experiences, hormonal changes in puberty, cognitive and motivational processes such as attributional and coping style, number and intensity of stressors, and absence of protective factors like social support all have influential parts to play (Rutter, 1986). Much attention has focused on three main neurotransmitter systems: dopamine, associated with loss of pleasure; norepinephrine, associated with psychomotor retardation; and serotonin, associated with depressive ideation (Sapolsky, 2004). There is no clear consensus about which system is most critical, nor any clear understanding of whether there is too much or too little of some neurotransmitter in the brains of some depressed individuals. The question is complicated: Does a neurochemical imbalance cause depression, or does depression (e.g., stress, trauma, distorted cognitions) cause a change in the chemistry of the brain? There is good evidence for the latter view, particularly in the case of early life stress. In Chapter 2, we described the HPA axis as central to the body’s stress response. Remember that the perception of a stressor causes the hypothalamus to trigger a cascade of neurochemical changes, via corticotrophin-releasing factor (CRF), that operate in a coordinated way to respond to threat. As it turns out, CRF is released from nerve terminals that also communicate directly with the serotonin, dopamine, and noradrenergic (norepinephrine) systems, modulating the release of these neurotransmitters (Austin, Rhodes, & Lewis, 1997). Early exposure to adversity can alter the development of these systems and sensitize children so that it doesn’t take much subsequent stress to activate the neurochemical pathways associated with depression (Graham, Heim, Goodman, Miller, & Nemeroff, 1999; Rudolph & Flynn, 2007). Even a genetically based biochemical imbalance can be an underlying diathesis that can be activated by adverse life experiences. Sapolsky (2004) argues that the ability to recover from stress distinguishes who becomes depressed when stressed and who does not. The presence of a gene variant that limits the effectiveness of a neurotransmitter system to recover (like the 5-HTT gene mentioned in Chapter 7) may play a key role in determining who is most at risk.
Certain medications, especially SSRIs (selective serotonin reuptake inhibitors), assist in the recovery of neurotransmitter systems, which may explain their success in reducing symptoms for some patients. It is important to emphasize, however, that reliable differences in neurotransmitter levels in the brains of depressed and nondepressed individuals have not been found, and medications are ineffective for a substantial percentage of depressed individuals (Leventhal & Martell, 2006).

Pathways of Risk

The recognition that serious adversities experienced in childhood are related to later depression is nothing new (Goodman, 2002). Early adverse experiences, such as parental loss, family disruption, and neglectful or abusive parenting, have long been considered serious contributors to psychopathology. To shed light on how these experiences might exert their impact, Duggal, Carlson, Sroufe, and Egeland (2001) followed a group of at-risk, low-SES children from birth to age 17.5. During the children's first 3.5 years, the researchers conducted observations of mother–child interaction style to assess attachment-related constructs. A high percentage (19% of 168 children) demonstrated clinical depression in childhood, and 18% did so in adolescence, rates higher than expected from epidemiological estimates.

Two different pathways to depression were observed. Depression that presented first in childhood was best predicted by the accumulation of adverse family circumstances and characteristics. Think of these children as being born into very difficult or inadequate family circumstances. In this pathway, it was the combination of the number and types of stressors on mothers, insensitive and emotionally unsupportive parenting, maternal depression, and abuse that was shown to "interfere with responsiveness to developmental needs, increasing the probability of depressive symptoms" (p. 159). Adolescent-onset depression, in contrast, was best predicted by maternal depression and lack of early supportive care, rather than by the aggregate of factors that predicted prepubertal depression. Interestingly, gender differences emerged in the pathways to pubertal depression. Maternal depression, assessed when children were 7 or 8, predicted pubertal depression best for females, possibly because of sex-role identification processes. For males, unsupportive early care in infancy and early childhood was the strongest predictor of depression in adolescence. As the authors conclude, "for males, it seems to be what the caretaker did rather than who the caretaker was during childhood that is most relevant to depressive symptomatology in adolescence" (p. 160). All of these unfortunate circumstances could foster the particular cognitive vulnerabilities associated with depression: hopelessness, low self-worth, negative expectations, and cognitive distortions, to name a few (Garber & Flynn, 2001). Findings such as these highlight the complex processes involved in the development of vulnerability to psychopathology.

Painful Passages

Stressors that occur after childhood and adolescence can also precipitate the emergence of depression in vulnerable individuals. The role transitions involved in moving out of the home and into the larger world of adult responsibilities are one example. While not all emerging adults experience depression or even depressive symptoms, the challenges that are part of the developmental work of constructing an adult identity can take a toll on mental health.
The separations and losses involved in the predictable life transitions and changes in interpersonal relationships that occur at this stage are stressful and can compound the risk for already vulnerable individuals. Emerging adulthood typically involves a major separation from home, whether physical, psychological, or both. For some, the combination of leaving family and friends, disruptions in romantic attachments, the stresses of academic or work responsibilities, and the day-to-day challenges of caring for oneself may seem overwhelming. A stark reminder is the fact that the highest prevalence rate for depression across the entire life span occurs between the ages of 15 and 24 (Blazer, Kessler, McGonagle, & Schwartz, 1994).

The stress of the passage into adulthood can be especially intense for minority students in the United States, as you saw earlier in this chapter. Apparently as a result, minority college students can be at greater risk of suffering from depression than majority students. The direct experience of racism and discrimination is not the only source of students' minority stress. Minority status often brings with it other problems, such as "interethnic difficulties (e.g., difficulty in making White friends), within-group conflicts (e.g., being viewed as 'acting White'), and achievement stress (e.g., feeling less intelligent or less capable than others, or the pressure of high expectations for college success from one's family)" (Wei et al., 2010, pp. 411–412). An important factor that can reduce depression risk for minority students is bicultural competence, which involves having the social skills for getting along in both the majority and the minority culture "without compromising . . . one's sense of cultural identity" (LaFromboise, Coleman, & Gerton, 1993, p. 404). These skills include knowledge of the beliefs and values of both cultures, a belief that one can function well in two cultural groups while maintaining one's cultural identity, good communication skills, a repertoire of appropriate role behaviors in each culture, and so on. Minority students who perceive themselves as having such skills report fewer depressive symptoms and more feelings of psychological well-being (David, Okazaki, & Saw, 2009; Wei et al., 2010). Supporting bicultural competence may be an important added ingredient in any kind of treatment for minority students.

Fortunately, there are effective treatments for depression, including pharmacological, cognitive-behavioral (CBT), interpersonal (IPT), and mindfulness-based cognitive (MBCT) therapies. The foundations of interpersonal therapy may have particular relevance for development, insofar as the struggle involved in making normal life transitions is one of the primary factors assumed to contribute to the disorder. Interpersonal therapy (Klerman, Weissman, Rounsaville, & Chevron, 1984; Weissman, Markowitz, & Klerman, 2000) assumes that depression both affects and is affected by interpersonal relationships, and that aspects of the depressed person's social network are of primary importance in understanding and treating the disorder. The theoretical foundations of the interpersonal approach derive from the work of Sullivan (1953) and Meyer (1957), among others, who emphasized the here-and-now social contexts of the depressed individual over causes of depression that were intrapsychic and rooted in the distant past. While IPT does not dismiss the cognitive processes that support depression, irrational, distorted thinking is not the primary focus of treatment.
The causes of depression are presumed to fall into four broadly defined interpersonal problem areas: grief, role disputes (conflicts with significant others), life changes or role transitions, and significant interpersonal deficits. Using a time-limited approach (roughly up to 20 sessions) and functioning as a supportive ally, the therapist assesses symptoms, connects the depression to one of the four major problem areas, and assigns the client a "sick role" (Parsons, 1951). Assigning a sick role legitimizes the client's need for support from others (who may be included in the therapeutic process), temporarily frees the client from unmanageable responsibility, and allows the person to focus on recovery during the restricted period of therapy.

During the intermediate stage of therapy, the therapist takes a moderately directive role and helps clients make real changes in relationships and renegotiate their roles in interpersonal contexts. In the case of difficult role transitions, such as the transition to adulthood, clients are helped to mourn the old "adolescent" role by reviewing what was good and bad about it, to clarify feelings about the new role, and to explore opportunities that the new role offers. When anxieties surface about one's ability to manage the new life stage successfully, expectations about what "being an adult" might mean are discussed and readjusted if necessary. Sources of social support are identified and recruited, and incentives for taking on new developmental challenges are created. The final stage of therapy acknowledges termination as a loss but also focuses on sustaining the gains made in therapy in the posttherapy social environment. Some techniques used in IPT are common to supportive psychotherapies: questioning, clarification, support of emotional expression, behavior change strategies, and development of a strong therapeutic alliance. IPT has also been adapted for use with younger depressed adolescents (Mufson, Moreau, Weissman, & Klerman, 1993).

A key point in the interpersonal conceptualization of depression is its transactional emphasis. Unfortunately, it has been repeatedly demonstrated that depressed people suffer more frequent rejection from others than do nondepressed counterparts (Segrin & Abramson, 1994) and thus need to exert more effort and display more skill in order to overcome this social tendency. However, the life events that may have precipitated depression in vulnerable individuals in the first place, such as a move to a new school, reduced contact with family and friends, and increased maturity demands, can produce symptoms (chronic fatigue, poor concentration, indecisiveness, sad mood, anhedonia, etc.) that make engaging in positive social interactions an enormous effort (Coyne, 1999). The vicious cycle is obvious. Regardless of theoretical bent, it is very important for helpers to pay serious attention to the environmental contexts that sustain depression. As Coyne warns, "depressed individuals' statements about themselves and their relationships get interpreted as enduring cognitive structures, a sociotropic trait, or working model of relationships, and these reified entities are then given causal priority over any interpersonal processes" (1999, p. 368). Even though IPT is an individual approach to counseling, the usefulness of including significant others in treatment in some manner should be emphasized. More research is needed to help us understand how to intervene in these interpersonal processes more directly.
Summary

Specifying exactly when an individual reaches adulthood is surprisingly complex. Sociologists look to marker events, such as completing one's education, entering the workforce, leaving the family home, and so on. Young adults themselves tend to emphasize accepting responsibility for their own behavior and making independent decisions. In today's society, the period from about 18 to 25 may be described as a time of emerging adulthood.

Physical Development in Young Adulthood

Between 18 and 30, all biological systems reach peak potential. Individuals differ in when they reach peak potential, and different systems and skills differ in the timing of peak status. For different skills, these differences reflect the relative importance of practice, training, knowledge, experience, and biological capacity. Males and females also peak at different times. Lifestyle affects the achievement and maintenance of peak or near-peak functioning. Healthy lifestyles include getting regular exercise, eating a healthy diet, and avoiding unhealthy behaviors such as smoking and drug use. Unhealthy behaviors in young adulthood are reflected in poorer health later in adulthood, yet many young adults have unhealthy, underregulated lifestyles, probably as a result of poor application of problem-solving skills, continued feelings of invulnerability, the fact that young adults "bounce back" quickly from physical stress, and the many stresses they face. The pruning of synapses continues in young adulthood. The frontal lobes continue to mature, perhaps playing an important role in the young adult's advancing abilities in organization, attention, planning, and self-regulation.

Cognitive Development in Young Adulthood

Young adulthood is a time of great learning. We have clear data on the growth of knowledge in the college population (roughly 60% of young adults today), but it seems likely that rapid growth of knowledge characterizes all young adults as they gain training and experience in their vocations and avocations.

Logical thinking also appears to change beginning in young adulthood. Theorists and researchers disagree about the nature of the change. Some propose that a more advanced kind of thinking, postformal or fifth-stage thinking, emerges. Others argue that the formal operational abilities of the adolescent period represent the most advanced form of thinking for humans but that adults learn to apply this kind of thinking to the more ill-defined or ill-structured problems that adults face. As part of this process, they may also gain a better understanding of the limits of their own problem-solving abilities; that is, they may grow in metacognitive understanding. Schaie argues against a new kind of adult thinking, proposing instead that people face different kinds of problems at different times of adult life and bring different skills to bear on those problems. He describes seven stages in adults' intellectual functioning, with each new stage a result of shifts in the challenges people face.

Theorists who argue that there is a stage of postformal thought suggest that it may not reach full development until middle adulthood, but its emergence begins in young adulthood. Most theorists, such as Sinnott, describe postformal thought as relativistic: The same reality can be described within several different truth systems, all of which are valid from one perspective or within some context.
The postformal thinker recognizes the validity of different truth systems. She may make a subjective commitment to one in some situations or seek a compromise in other situations. Sinnott's description of postformal thought is similar to what Chandler called postrational skepticism, although Chandler disagrees that this kind of thinking is more advanced than formal operational thought. Perry's stage theory of intellectual and ethical development in the college years describes a sequence of steps in the movement from more absolutist or dualistic thinking (there is one right answer; other answers are wrong) to relativistic thinking (there is more than one correct way to view the same issue). In addition to Perry's own longitudinal interview study, a number of researchers have provided some evidence for aspects of Perry's theory. For example, students' beliefs about knowledge have been found to relate to their approach to learning, as Perry's theory predicts.

Kitchener provides a seven-stage theory of the development of relativistic thinking, calling it reflective judgment. Research in which subjects are given ill-defined problems indicates that in the early stages of adult thinking, individuals believe in the existence of certainties. In the middle stages, people perceive knowledge as uncertain. In the later stages, they base their judgments on a set of rules or logic in combination with personal reflection; essentially, they are relativistic. Some research indicates that reflective judgment is related to level of education and to the specific kind of training people have received. Graduate students in psychology, for example, tend to show more proficiency than graduate students in other disciplines. Evidence also indicates that people benefit from exposure to modeling of forms of thinking more advanced than their own.