Forty years prior to COVID-19, Alvin Toffler saw the future of working from home, and it looked very good. In his 1980 best seller, The Third Wave, the futurist author declared that modern economies would soon shift away from the office and toward “the electronic cottage”—a retro-utopian update of the preindustrial days of home work and piece work, now wired to the modern world via desktop computers, faxes, and dial-up modems. “The electronic cottage raises once more on a mass scale the possibility of husbands and wives, and perhaps even children, working together as a unit,” he explained. This arrangement would propel “greater community stability” and “a renaissance among voluntary organizations.”
Fast-forward to the great pandemic and shutdown of 2020, an extraordinary social experiment unfurling at global scale and astounding speed. By June, 42 percent of the American workforce was working from home. The benefits of the new normal became readily apparent—no commutes! comfy sweatpants!—and many relished the slowdown in the relentless pace of 21st-century life. As Toffler predicted, America’s remote-working classes became simultaneously placeless and newly rooted in place, their mental maps shrinking to a few neighborhood blocks, the local grocery, the nearby park.
Yet Toffler’s optimistic, communitarian forecast failed to perceive how this new electronic reality would exact a toll on mental and financial health; split open new fault lines of class, gender, and race; and accelerate a long-brewing social reckoning. Schools and child-care facilities shuttered, leaving working parents, especially mothers, struggling to balance professional and domestic duties. Some had to cut back work hours; others quit their jobs altogether.
Seven months into the pandemic, the US employment statistics reflected the sharp inequalities of COVID’s economic toll, with job losses falling disproportionately on women and people of color.
Many such losses were among those who could not stay home in the first place, on whose labor in grocery stores, Amazon warehouses, and meatpacking plants all the comforts of the electronic cottage depended.
There already has been a great deal of speculation about the lasting effects of this information-overloaded digital year on work, schooling, and the public realm. As retailers shutter and major corporations announce they will keep workers home for good, it is clear the pandemic has already changed some things permanently. But looking backward to the roots of remote work is equally important. It turns out that these systems were never really designed to benefit the groups that could gain the most from them: working mothers, caretakers, and their children, especially those without easy access to new technology.
Perhaps the lesson to take from this year of living online is not just about making better, more humane work-and-learn-from-anywhere technology. It is about recognizing technology’s limits.
Like many of Toffler’s ideas, the “electronic cottage” combined ahistorical grandiosity with canny insight into emerging technological and demographic trends. By the early 1980s, a growing number of mothers of young children had entered America’s waged workforce and, thanks to woefully inadequate child-care infrastructure, were struggling to balance work and family life. One job they could do from home? Code.
As the personal-computer market boomed, some employers hungry for talented programmers used telecommuting as a way to recruit women back into the waged workforce. “I want to spend as much time as possible with my child and not having to commute gives me extra time,” one mother and part-time programmer told the New York Times in 1985. Work-from-home life offered flexibility and job fulfillment even for those without children. “I have the kind of personality that likes to make my own schedule,” one unmarried female programmer told the Times. She arranged to spend three days a week in the office and two at home.
But both the women and the men who became telecommuting’s early adopters quickly saw the tradeoffs. By moving out of the office, workers lost many of the rituals and regulations that protected them from overwork and exploitation. “Whenever I’m awake,” one telecommuting engineer admitted to a Washington Post reporter in 1980, “I’m working.” Labor unions were so concerned about remote workers’ susceptibility to employer surveillance and isolation that the AFL-CIO issued a resolution opposing “computer homework” in 1983.
The commercialization of the internet in the 1990s set off another wave of ebullient predictions about the work-from-home future, with little attention paid to addressing those early concerns about its impact on workers’ well-being. Thus, even as work-from-anywhere information-technology jobs increased, the actual percentage of telecommuters remained vanishingly low. One 1994 survey of companies that allowed remote work found that less than 1 percent of employees took advantage of it. The chief obstacle was managerial resistance. “Managers won’t give up control,” one researcher noted. “They still can’t trust that employees are working when they aren’t present.”
Stubborn insistence on face time helped explain why, even at the apex of the dot-com boom in 1999, a mere 7 percent of the American workforce worked remotely. What’s more, 1990s telecommuters often were not working from home. Instead, they flocked to satellite offices built to shorten commutes in traffic-choked regions like Los Angeles and Washington, DC. Grand predictions that the average knowledge worker would soon retreat to an internet-enabled cabin in the woods never came to pass.
The case of the tech industry is particularly revealing. Even as dot-com-era leaders steadfastly preached the gospel that computer hardware and software would upend the way the world worked, played, and communicated, they too remained firmly committed to the office. Skyrocketing real-estate prices in 1990s Silicon Valley and Seattle showed that even the builders of this miraculous new online infrastructure believed it was far better to work face-to-face.
This commitment to the office only intensified after 2000. Instead of dispatching workers to self-directed lives in their electronic cottages, internet-age Silicon Valley traded in drab office spaces for far-grander facilities designed to make workplaces compelling playgrounds that met employees’ every need. Google, founded by two Stanford graduate students, built an elaborate Silicon Valley headquarters that was a fantasy version of a richly endowed college campus, drenched with amenities like free food in the cafeteria, climbing walls, and massage rooms.
As CEO of Apple and Pixar, Steve Jobs helped popularize the gospel of innovation-by-serendipitous-encounter, facilitated by offices with open layouts and spots for impromptu connection. The perks that tech companies loaded into these campuses reflected the kind of employee they wanted to recruit and retain: young, unattached, able to put work first at any cost. Apple’s new corporate headquarters, opened in 2017, featured custom-designed ergonomic desk chairs and a two-story yoga room. Missing from the $5 billion facility: a child-care center.
Even firms that once embraced telecommuting pulled back from it. IBM had made a big remote-work push at the start of the “electronic cottage” era, but slumping stock prices and employee attrition helped prompt a reversal in policy. In a preview of what many would experience in 2020, IBM found that remote work made it difficult to build strong teams and mentor junior employees. Workers could easily be lured away by superstar tech companies with glitzy campuses where they could, as Amazon’s employee motto put it, “work hard, have fun, make history.” By the mid-2010s, Big Blue had joined the rush to build perk-filled offices in what one executive termed “really creative and inspiring locations.”
Soon after Marissa Mayer, a longtime Googler, became CEO of Yahoo! in 2012, she banned remote work altogether. “Speed and quality are often sacrificed when we work from home,” the company’s human-resources director said at the time. Many employees found Mayer’s move particularly distressing because the CEO was the mother of young children. They had hoped she would be more sympathetic to the pressures working mothers faced.
Discouragement of telecommuting supercharged the workaholic vibe of the tech world and contributed to an abysmal record on gender diversity that has worsened over time. A 2018 survey of 80 tech companies found that women made up only 24 percent of the technical workforce, down from 36 percent in 1991. Employees—many of whom, as ever, were mothers of young children—asking for partial or full-time work-from-home arrangements found themselves sidelined from important projects or denied permission altogether. Generous parental benefits were at odds with the realities of workplace culture, one Seattle-area engineer lamented to a labor researcher in 2019. “Everyone is supported before they take maternity leave, but when it comes time to be promoted they are questioned for being absent.”
Into this state of affairs came the novel coronavirus. The American workforce suddenly divided into three: those thrown out of work by the shutdowns, those deemed “essential”—from grocery clerks to surgeons—who continued to work outside the home, and those now working through screens, with nearly all human contact filtered through software. The vast majority of America’s children and college-age adults abruptly began learning online as well, their teachers and professors scrambling to catch up.
The economic and psychic effects rippled outward from the electronic cottage. “Weekends and weekdays are the same,” one Chicago-area mother remarked to a reporter in September. “I don’t really know where I am in time, if that makes sense.” Working parents, meanwhile, were not all right. “All the choices stink,” one researcher said in August, as schools prepared for another term of mostly remote education. “Parents tell me about not being able to sleep because they’re so anxious, or tell me they’ve been crying a lot.”
As the homebound classes logged onto Facebook, ordered necessities from Amazon, and upgraded laptops and smartphones, the quarterly earnings and market valuations of tech’s largest companies soared into the stratosphere. The most popular portal for work and learning, Zoom, transformed from a software product into a verb. By the end of summer, the value of the American tech sector exceeded that of the entire European stock market.
But as public events were canceled, so too were serendipitous connections and accidental meetings. Online events meant that human experience was opted into, not happened upon. Running into strangers immediately signaled danger, whether on the subway or in the grocery aisle. City dwellers no longer wandered down the street to choose a restaurant; delivery-app algorithms chose it for them.
Even in this placeless fog, geography was destiny, perhaps more than ever. Generations of racial and economic segregation of the housing market meant that where you lived at the start of the pandemic greatly determined how well you survived its physical and economic hardships. America’s pixelated portals filled with scenes of stark inequity. Hundreds of cars lined up outside food banks as the unemployment rate soared to levels unseen since the Great Depression. The toxic combination of spatial segregation, health-care inequity, and economic precarity was compounded by a profoundly bungled federal response that left Black Americans and other racial minorities more likely to fall ill and die from COVID-19.
The destiny of geography, race, and income was evident in the Department of Labor’s September 2020 jobs report, which revealed that the COVID recession was starkly different from those before it. The Great Depression had generated social solidarity partly because the economic pain was so broadly felt across social classes. So too had the Great Recession of 2008–9 left its mark on nearly every income tier.
But 2020’s great disruption made already-staggering inequality even greater. Employment in upper income brackets—including and especially within the roaringly profitable tech industry—was bouncing above pre-COVID levels, while employment among working-class and poor Americans spiraled down.
What comes next? The uncertain trajectory and duration of the pandemic make it particularly difficult to see what work, school, and life will look like on the other side, but clues are emerging.
It is now easier to see why earlier forecasters got the electronic revolution wrong, and why telecommuting and online education never quite gained traction. Months without normal social interaction have reinforced the value of ordinary human connection and the power of place, whether it be a national park or a café table on a car-free city street. The deficit is particularly keen in education, most glaringly for younger children but even visible at the collegiate level, as professors struggle mightily to maintain student engagement via Zoom.
These trends uncovered something that information-society futurists and the techno-optimistic Silicon Valley moguls long pushed to one side: computers do not change everything. Digital tools and connections neither transcend society’s problems nor solve them. Sometimes, in fact, they make inequities greater. Analog ways of doing things—going to work in an office, going to school in a classroom, attending college on a physical campus—have persisted not only because of technological limitations, but because they serve human needs that digital tools cannot fulfill.
Yet the great disruption of 2020 is establishing digital habits that are unlikely to disappear. It turns out that Google and Amazon can continue to generate enormous profits even while their employees work from home. Smaller companies don’t need costly office leases dragging down their bottom line, nor do employees need to live in expensive places like the Bay Area, where the average home price exceeds $1 million.
Some companies have declared they are giving up the office for good. Others may return but ask their workers to come in only a few days per week. The partly-from-home arrangements so many workers hungered for in the past may now become commonplace: some workdays spent in the office, the others in the electronic cottage. But the old, familiar issues with remote work are still with us, no closer to being addressed by employers or government regulators.
There are worrisome signs, too, that the bifurcated economy of our pandemic year may continue to intensify. The commercial-real-estate sector faces a reckoning as offices shrink or disappear, and the retailers and restaurants that depend on office-worker business will fade away with them. As the urban job base shrinks, large American cities face a possible redux of the fiscal crises of the 1970s and 1980s, with the urban working class again suffering the brunt of the pain from cuts to transit, housing, and health-care programs. Public universities, their budgets never fully restored since state legislatures slashed funding during the last decade’s financial crisis, are newly reeling from the economic impact of lost tuition and the costs of online instruction.
In some ways, the nation’s public realm is back to where it was when Toffler was writing The Third Wave, at the end of the stagflating 1970s: sapped of resources, riven by social discord and public distrust, unable to serve all fairly. Then as now, technologists saw technology as the answer to America’s many problems. A computer on every desk, a modem to communicate, an electronic cottage to free oneself from workplace drudgery.
But as the past year has made clear, the digital realm is ultimately a poor substitute for the public one. Our responsibility, after 2020, is to recognize how these digital shortcomings reflect the analog world: job insecurity for too many, a child-care system that remains piecemeal and patchwork, standards of work performance that rarely account for the realities and rhythms of everyday life for workers of every age, gender, and caregiving status. If we do that, we might finally create the sort of electronic cottage that works for everyone.