If you descend into the bowels of the subway station at 34th Street–Herald Square, you might find yourself inside a Burger King. On a Monday evening, it’s packed with people—on their way to or from somewhere else—grabbing a quick meal. And, so, the small, subterranean space smells like grilled meat—like Whopper®, to be exact. But, now, it’s possible to order a new one that is meatless: “100% WHOPPER®, 0% Beef.” Stuck inside a soft, sesame seed–studded bun and topped with ketchup, mayonnaise, slices of tomato, crinkle-cut pickles, some lettuce, and rings of raw white onion, this particular quarter-inch-thick Impossible™ burger patty is so thin and thoroughly cooked that it does not “bleed” as advertised. Still, the “0% beef” does taste pretty good.
Ironically, the success of novel foods like the Impossible™ Whopper®—a plant-based version of an iconic American fast-food chain’s signature hamburger—comes from a worrying blindness. These new foods make sense only in denouncing the way we eat now as “unsustainable.” And, yet, in many ways, they depend on the same power relations and preexisting production and distribution infrastructures as do the “old foods” they aim to replace. Indeed, they even depend on the same patterns of fast and cheap consumption and on the aesthetics of consumer desire—flavor, form, and texture—of “old foods,” like industrially raised and slaughtered beef, one of the major causes of the mass ecological harm that these “new foods” claim to fight.
Consequently, understanding the history of why beef hamburgers have become so cheap, ubiquitous, and desired is necessary, especially for understanding why their replacements “must” be plant-based burgers that taste and bleed like ground bovine flesh.
So, the popularity of meat analogues, like the Impossible™ burger, and “the rise of cultured meat”—meat grown from animal stem cells multiplied into muscle fiber in a lab, which is predicted to hit markets soon—explains Lenore Newman, “simply reflects the hegemony of meat itself.” Newman, a Canadian professor of culinary geography, is the author of Lost Feast: Culinary Extinction and the Future of Food. Newman’s is one of a handful of recently published books—including American journalist Amanda Little’s The Fate of Food: What We’ll Eat in a Bigger, Hotter, Smarter World and British food writer and historian Bee Wilson’s The Way We Eat Now: How the Food Revolution Has Transformed Our Lives, Our Bodies, and Our World—that place current eating practices within long histories of human consumption. And all these books do so in hopes that looking to the past will help steer the way toward a “sustainable” future.
The issue is certainly urgent. Global demand for meat is growing, and industrialized meat production as it is practiced today has a huge ecological footprint. Food production now “accounts for about a fifth of total greenhouse gas emissions annually, which means that agriculture contributes more than any other sector, including energy and transportation, to climate change,” and, at the same time, “the single biggest threat of climate change is the collapse of food systems.”
The Impossible™ burger does pollute less; CEO and founder Patrick Brown says that “pound for pound, [his] product generates less than an eighth of the greenhouse gasses created by conventional beef production.” But how big or meaningful a change is this, really, especially in the face of capitalism’s continued control of the global food system?
Despite claims of revolutionary, tech-driven innovation, the past is always present in the “future” of food. The inverse of Burger King’s “100% WHOPPER®, 0% Beef” campaign rings dully but just as true: “0% Beef, 100% WHOPPER®.” What we are watching is not the replacement of old, bad foods with new, good ones, but, instead, how what appears as resistance to our environmentally destructive capitalist food system actually perpetuates it. We may leave beef behind, but, apparently, we can’t leave the Whopper®. Which is another way of saying, “so long as capitalism controls what we eat, capitalism is here to stay.”
It was marketed as a “carnivore’s dream.” In 2016, after seven years of development, the Impossible™ burger hit the scene.
The brainchild of Stanford biochemist Patrick Brown, the Impossible™ burger is a plant-based patty flavored with synthetic animal blood—“heme,” the iron-containing molecule at the core of hemoglobin—that is manufactured in modified yeast cells. Snippets of genome from soybean root nodules have been added to these yeast cells, turning them into little heme factories that can now produce the iron-rich substance that gives animal blood its deep red color and thick metallic flavor. The rest of the patty is made from water, textured wheat protein, coconut oil, potato protein, yeast extract, salt, and about a dozen other additives commonly found in processed food.
Chef David Chang was the first to put an Impossible™ burger on a restaurant menu. In July 2016, the plant-based, lab-derived meat debuted at his restaurant Momofuku Nishi in New York City’s Chelsea neighborhood.
Since then, the FDA has approved the Impossible™ burger and its “heme,” and the Silicon Valley–based start-up company has raised nearly $1.3 billion to push its products into the mainstream. Impossible™ meat—ground beef, sausage—is now widely available. You can buy it at the grocery store and at fast-food chains around the country. White Castle now sells its signature slider made with an Impossible™ patty, for $1.99.
This new food product offers a fresh answer to the perennial question: “What should we have for dinner?” This question articulates how eaters make daily calculations when deciding on food, including factors of taste, availability, cost, nutrition, and convenience.
But now, many eaters factor in “sustainability” as well. This shift in the question comes as consumers understand more about their eating habits and personal health, and about how these patterns are inextricable from the future of our global food systems and of the planet.
Fortunately, companies like Impossible™ appear to offer a path forward, ushering us into this “more sustainable” future. But even as they claim to provide new and better answers to this old question, many of these companies remain mired in the long history of our toxic, exploitative, and unsustainable contemporary food system.
In the 19th century, the violent frontier army enabled grass-fed cows to dominate the landscape, as the army strategically killed off the buffalo in order to eliminate the Plains Indians by cutting off their food supply and destroying native economies and ways of life.1 And the invention of barbed wire, in 1874, transformed cattle raising from a small, pastoral practice to a massive, agricultural industry. Tellingly, the iconic figure of the American cowboy emerged from this era of westward expansion, as well—an indication of the influence of these vast changes on world culture.2
In the 20th century, grain-fed cows dominated American cuisine. Thanks to industrialization, the railways, refrigeration, and Chicago’s packing companies, America “turned the cow into a cog in an industrial machine, putting beef on every American plate” and transforming it into an “everyday luxury.”
Decades later, postwar America proved the perfect setting for the rise of burgers. “When wartime rationing ended,” Newman writes, “a newly prosperous and increasingly suburban nation raised a simple beef preparation, the hamburger, to iconic status.”
The American beef industry, born as a solution for the post–World War I grain gluts and buoyed by the government grain subsidies that have persisted ever since, has made beef widely available and cheap. Today, “on average, Americans eat three burgers a week.”
This dietary shift to eating lots of meat is part of a larger trend—abandoning staple foods and embracing industrially farmed and processed foods—that well-known nutrition researcher Barry Popkin has called the “nutrition transition.” And this dietary transition, as Wilson notes in The Way We Eat Now, is linked to sociopolitical changes, as well: the shift from manual labor to industrialization, the expansion of cities, and more sedentary lifestyles.
All these changes manifest in human populations as the decline of deficiency diseases and the rise of diet-related chronic disease. In the West, the transition began in the decades after World War II.
During the Cold War, the United States became obsessed with food production as a weapon in the war against communism. Consequently, the United States doubled down on policies to encourage abundant excess, such as subsidies for farmers based on sheer quantity—rather than, say, quality—of food produced. These domestic policies—as well as the spread of high-yield crops and expensive technologies (such as synthetic-chemical fertilizer and pesticides) to the global South since the 1950s—resulted in “the greatest expansion of agriculture the world has ever seen.” Today, the nutrition transition is happening around the globe—and quickly.
This is no accident. All that excess from “quantity over all else” global agriculture had to go somewhere.
The surplus wasn’t just grain (there was excess soy, too, for example), and the sacrifice wasn’t just the quality of the food but entire local economies that previously had not been based on monocultures.
And the glut—just like after World War I—was fed to animals, fueling a growing global meat industry. (The excess grain was also dumped into developing countries as “aid,” undercutting local farmers and economies, and it was made profitable when transformed into highly processed snack foods.) It is hard to overemphasize the power and dominance that multinational food companies accrued in this process. Laments Wilson: “It was these companies, more than the farmers themselves, who profited from the overproduction of subsidized crops in the West.”
The nutrition transition is not just about changes at the level of supply. Wilson writes that “it has also altered our personal hungers, so that we become people who—to a bizarre extent—gravitate toward the same foods … The recent global homogenization of taste is unprecedented.” When it comes to meat, rising incomes bolster this global homogenization. Citing a study showing that for each $1,000 increase in annual income, meat consumption in Asian countries goes up by 2.6 pounds per person, Wilson states: “There’s a direct correlation between bigger incomes and eating more meat.”
Historian Julie Livingston puts this another way, in Self-Devouring Growth: A Planetary Parable as Told from Southern Africa, a critique of consumption-driven growth under capitalism, when she writes:
Globally, the demand for beef continues to escalate at a dizzying rate of approximately 10 million tons per decade. Even as wealthier Europeans and Americans fearing the risk of cardiovascular disease or colon cancer might begin to cut back on beef consumption, the global South middle classes and the urbanizing working classes will absorb the difference and then some. The taste of beef, in other words, is the taste of development.
Meat, then, isn’t something we can simply choose or refuse, meal by individual meal. Meat dominates the planet, for the meat industry is one of the pillars of capitalism today.
When placed in the context of this long history of subsidized food production, conglomerated corporate power, and global inequality, does opting for an Impossible™ Whopper in the Burger King in Herald Square really make a difference? The answer is no. “Voting with our forks” simply is not enough. That’s because when we buy food—as Wilson shows—we are acting within large economic structures that are beyond our control: “Much of what we consume is virtually pushed down our throats by forces of supply over which we have no control and of which we are only dimly aware.”
As we eat our way into the future, instead of choosing “new foods” that perpetuate the same systems, perhaps we should be opting into different systems. Today, many farmers, scientists, and tech entrepreneurs around the world are hard at work trying to make changes in our food system by shifting the priority in food production from “quantity”—a holdover from the high-yield postwar era—to “quality,” by which they varyingly mean nutrient density, flavor, low water usage, low waste, tolerance to climate volatility, and community focus.
“The single biggest blowback of the Green Revolution is climate change,” explains professor and acclaimed journalist Amanda Little, who profiles some of them in The Fate of Food: What We’ll Eat in a Bigger, Hotter, Smarter World. But it doesn’t stop there, according to Little. The people whose work she investigates are part of what some are calling “the Next Green Revolution”: the patchy movement to utilize long-held wisdom and technological innovations to lower the environmental impact of the way we eat. Impossible™ is but one of a wide range of players in this movement, and the path it is charting forward is not the only one.
One of the biggest consequences of a food system based on high-quantity agriculture is mass amounts of waste. In the United States, 52 million tons of food are sent to garbage dumps each year and another 10 million tons are left to rot on farms. Little speaks to Darby Hoover, a waste researcher with the environmental group Natural Resources Defense Council, who stresses the importance of transforming our linear economies into circular ones based on “growing, reusing, and generating resources,” rather than “consuming, depleting, and throwing out.” The idea of circularity is old, “as old as time,” Little notes, and “still present in subsistence farming systems the world over,” though it has been intentionally engineered out of the industrial food system. Hoover says: “Now’s our chance to engineer it back in.”
The taste of beef—be it raised on a sprawling industrial feedlot, or designed in an Impossible™ lab—is the much-craved taste of consumption-driven development under capitalism. What is the taste of a circular economy? We don’t yet know exactly what those circular economies of the future might look or taste like. But we do know that they won’t be the same global food systems that most of us rely on for our dinner now.
This article was commissioned by Caitlin Zaloom.
1. David D. Smits, “The Frontier Army and the Destruction of the Buffalo: 1865–1883,” Western Historical Quarterly, vol. 25, no. 3 (1994), p. 314.
2. Thinking about what else was happening during this time, Newman suggests another factor that may have contributed to the proliferation of cows. “Lamb might have caught on,” she writes, “but the Southern cotton industry, fueled by slave labor, left little room for wool to compete as a textile.”