What does it take to get to college graduation? The question becomes more urgent as college tuitions rise and education debt accumulates, even though baccalaureate completion remains a baseline credential for at least modestly secure employment. Our sprawling nation’s deep divisions in terms of class, race, and geography mean that people arrive at college with very different expectations of what will happen to them there, and with different amounts and kinds of resources for navigating the college journey.
Social scientists have responded to this problem with an array of monographs on the challenges students and families face. The titles of such works are typically ominous: Lower Ed, Parenting to a Degree, Paying for the Party, and Paying the Price, to name a few. Their general message: college, long imagined as a virtuous doorway to lifelong prosperity, can be a risky, costly, and emotionally taxing experience.
Now comes Anthony Abraham Jack’s The Privileged Poor, a heartfelt ethnography by a sociologist on the faculty of the Harvard Graduate School of Education. Jack spent two years as a participant observer on the campus of an elite private university with highly selective admissions, drawing on interviews with over one hundred students who self-identify as white, black, or Latinx. Jack’s analytic focus is on what social scientists call the “intersectionality” of race, class, and organizational context as simultaneous mechanisms of social ordering and inequality. Jack never names the school, which he calls only “Renowned University,” but he does describe it as an institution directly adjacent to Harvard Square in Cambridge.
Jack’s reported findings likely made many administrators at Renowned University, and not a few Renowned students, squirm. We learn of students who casually swapped notes about upcoming holidays in Europe, within earshot of students who wouldn’t have steady meals when Renowned’s dining halls closed for spring break; of students wealthy enough to hire professional decorators to spiff up their dorm rooms while others worked cleaning toilets in those same residence halls. Jack also draws out more subtle contrasts among students in terms of their academic sensibility and know-how. One student, for example, was savvy enough to compliment a professor, whom she and Jack bumped into at a theater performance, in order to ask for a book signing. Jack contrasts this social fluency with the experience of students who have no idea what faculty “office hours” even are; for them, unfamiliarity breeds an intimidation that keeps them from seeking out the very mentorship many professors are eager to give.
What Jack contributes to the recent spate of books on college is not only the inside access to what we might reasonably presume to be America’s oldest and most prestigious university, but the illumination of a distinct group of students within this elite institution. These students are “the privileged poor” of Jack’s title: students from economically disadvantaged backgrounds, but with prior experience in elite educational settings. The privileged poor are often black or Latinx. They come to Renowned from selective boarding and country day schools, where their admission has been celebrated as enhancing “diversity.” The privileged poor know a lot about how wealthy white people work, talk, dress, and play; they tend to be less overwhelmed by college than those Jack calls the “Doubly Disadvantaged,” who arrive at Renowned as if from another planet. The Doubly Disadvantaged have no prior experience with elite schooling, so they have that much more to learn, that much more journey to travel, that much more renegotiating of self to do if they choose to remain in the competitive social world of the educated upper-middle class.
How much inequality are elite colleges willing to sustain on their campuses, and at what social cost?
A century ago, there were far fewer privileged poor students on elite college campuses. Harvard, Princeton, Yale, and Stanford were for country-club WASPs, hopeless wonks, shameless strivers, future ministers, and schoolteachers. Each group had its own reasons for being on campus. The WASPs came in pursuit of the paradigmatic “College Man”: a comely fellow from an affluent family who is athletically capable and modestly—but never excessively—interested in academic pursuits. The College Man lived in a masculine world. Girlfriends and future wives attended women’s colleges, elsewhere, if indeed they were deemed college material at all. For College Men, the undergraduate years were time for the serious play that nurtured strong affective bonds, ensuring the intergenerational reproduction of elite (“old boy”) networks. Wonks, strivers, and do-gooders were necessary contributors to the social machinery of campus. Such students enabled people to believe that higher education was a virtuous civic endeavor, worthy of its tax exemption and freedom from government regulation. And as E. Digby Baltzell pointed out years ago, these students incrementally brought fresh intelligence and network ties to an otherwise homogeneous upper-class milieu.
What changed? Conventional wisdom and a good bit of social science have it that college changed due to technological transformations in the character of work, which obliged people to seek postsecondary degrees in order to maintain their employability as the 20th century progressed.1 But it is equally true that college and university leaders have long been adept at selling academic credentials to new patrons.2
First were members of the status-seeking professions—doctors and lawyers—who used postsecondary degrees to establish cartels of who could legally diagnose tonsillitis, prescribe narcotics, file lawsuits, and settle estates. Next were newly affluent urban Jews and Catholics, who vied for prestige with Anglo-Protestant elites. Then came the federal government—US higher education’s biggest patron to date—whose various agencies relied on colleges and universities to train and reward human capital for the serial military conflicts of the 20th century: World Wars I and II, and the subsequent Cold War.
The epitome of government patronage of college degrees is the Servicemen’s Readjustment Act of 1944, popularly known as the GI Bill, which ultimately underwrote college educations for two million Americans while linking the idea and value of college to the nation’s most privileged category of citizens: white men.3 The GI Bill laid the foundation for a truly revolutionary expansion of college access during the Cold War.4 As part of Lyndon Johnson’s Great Society initiatives, the federal government pumped up tuition subsidies, while at a state level, legislatures built new campuses to absorb the federally subsidized demand and spur regional economic development. Very quickly, college became a rite of passage for every American fortunate enough to have finished high school.
It is this shift in the meaning of college, from a privilege to a quasi-right, that defines current debates about what it takes to get to graduation day, and who is responsible for getting people there.
In the 1940s, the responsibility of successfully navigating college clearly lay with students themselves. Then, the opportunity of going to college was a gift, and getting out was defined as an individual accomplishment. Schools offered little in the way of support services or even creature comforts along the way. Food and lodgings were famously poor (that’s part of what fraternity houses were for) and extracurricular shenanigans largely unsupervised (ditto fraternity houses). The distances between administrators, faculty, and students that characterized college life through the middle of the 20th century were largely taken for granted. Today the social contract among students, schools, governments, and families is considerably more complex.
First, there is college cost. Retail tuition rates have grown faster than the rate of inflation for a generation, making finishing college a major financial investment that typically involves borrowing from future income. The opportunity costs of postsecondary education now typically include debt that is especially difficult to pay off without the benefit of a college diploma.5 This makes the financial risk of starting a degree, but not finishing it, much higher than it used to be.
Second, college students are much younger, phenomenologically, today. When the massification of college access began in the 1940s, 18-year-olds were presumed to be functional adults who were capable of making long-term decisions on their own behalf. Today that same chronological age is defined as adolescence.6 Eighteen-year-olds are children, parents remain heavily involved in their lives, and the grown-ups who administer academic and extracurricular life in college are held at least partially liable for the academic progress, psychological well-being, and physical safety of students under their charge.
Third, college campuses are now among the more economically and ethnically diverse domains of a society that is otherwise increasingly segregated by class and race. This is another far-reaching consequence of postsecondary expansion during the Cold War. As college became a quasi-right, people from a wide variety of social backgrounds successfully claimed more and better real estate in higher education. Black, Latinx, and Asian students from a wide range of class backgrounds came to vastly outnumber wealthy white men—even on campuses like Jack’s Renowned University, once ground zero for the College Man. Students from very different backgrounds encounter one another in classrooms, bathrooms, and residence halls. They overhear each other’s elevator conversations, struggle to decipher the semiotic codes of clothing and hairstyles, and are alternately intrigued and frustrated by their differences. Such diversity generates thrilling insight and pleasure, routine misunderstanding, and—occasionally—outright conflict.
Finally, college itself has become a bewilderingly complicated social asset. It is partly a civic, public good: tax-exempt, tuition-subsidized, and reliant on government grants to support basic research. It is partly a philanthropic endeavor, supported by the intergenerational patronage of the very wealthy. It is partly big business, even when officially “nonprofit”: with semiprofessional sports teams, multibillion-dollar research portfolios, and endless capital campaigns. It is also partly a private good: an “investment” that individual students make in their own lives, and an opportunity for future well-being. These complexities make it difficult to find a clear rationale for just who is entitled to the experiences afforded by college, and who is responsible when things go awry.
In the present moment, these questions are addressed within two primary arenas of public discourse: affordability and civility. Take affordability: How much should college cost, and who should pay? This question is only partly about the approximately $1.5 trillion that Americans have amassed in student loan debt. Progressive advocates of greater public subsidy for higher education have expanded their ambitions to include childcare, healthcare, and food security. To the extent that academic persistence is predicated on basic health and well-being, they argue, colleges and universities should explicitly endorse the overall welfare of the students they enroll.
Next is civility: What do college students owe each other in the course of their daily academic lives, and what is the proper role of faculty and administrators in enforcing the reciprocal recognition of difference? If college was once defined as an individual accomplishment—with bad food, party cultures, and racial discrimination (both subtle and outright) as part of the bargain—it is now experienced as a joint venture in which parents, campus grownups, and fellow students are all implicated in making the journey to graduation a humane endeavor. What, in the end, do schools like Renowned University owe their students, and what do its students owe each other?
Our ability to ask these questions out loud, as Jack does in The Privileged Poor, undoubtedly counts as moral progress—but it has a price tag. Mere access to college is no longer a sufficient mechanism of opportunity creation in American educational culture. Ensuring that students graduate with their identities intact is now part of the cost of academic business. Add that cost to the tuition, loans, tax exemptions, and years of life that college already consumes, and Americans have a social welfare project whose scale and reach have scarcely been apprehended.
This article was commissioned by Caitlin Zaloom.
1. Claudia Goldin and Lawrence F. Katz, The Race between Education and Technology (Harvard University Press, 2010).
2. Randall Collins, The Credential Society: An Historical Sociology of Education and Stratification (Columbia University Press, 2019).
3. Suzanne Mettler, Soldiers to Citizens: The G.I. Bill and the Making of the Greatest Generation (Oxford University Press, 2005).
4. Christopher P. Loss, Between Citizens and the State: The Politics of American Higher Education in the 20th Century (Princeton University Press, 2011).
5. Joel Best and Eric Best, The Student Loan Mess: How Good Intentions Created a Trillion-Dollar Problem (University of California Press, 2014).
6. Richard Settersten and Barbara E. Ray, Not Quite Adults: Why 20-Somethings Are Choosing a Slower Path to Adulthood, and Why It’s Good for Everyone (Bantam, 2010).