Is Our Food Culture Killing Us?

The Ethics of Eating

I was born and grew up mostly in Japan, the second of four children of missionary parents who went there in 1955 to convert the Japanese to Lutheranism. Instead, the Japanese converted my family to better eating. Japan’s luscious fruits seduced us first (loquats, persimmons, nashi, the ubiquitous mandarin oranges), but we children fell hard too for its “diner” foods (donburi, ramen) and fishy snacks, especially dried squid and those delectable tiny spicy fish whose bones crunched so satisfyingly when we bit into them. My parents were Western Canadians, so we mostly sat down at dinner to what I still think of as “Lutheran food”—meatloaf, scalloped potatoes, casseroles—but here too Japanese habits slowly transformed our table. Feeding four children on a missionary’s pay is no easy business. Meat was expensive, and there weren’t really any supermarkets in Nagoya or Tokyo in those days, either. So my mother biked daily to the local shops, buying small quantities of meat or fish and larger quantities of vegetables—greens, beans, those lovely little eggplants—that she began heretically sautéing or steaming rather than boiling to death. The electric rice cooker that is a fixture in all Japanese kitchens got hard use in ours, too, and while my mother cooked most of the time, my adventurous father occasionally took a turn. The food I still associate most strongly with my childhood is my father’s fried rice. He’d sauté some onions, throw in whatever leftovers he found in the fridge—two hot dogs, some cooked carrots, half a cup of peas—add a couple of cups of boiled rice from the cooker and pour soy sauce over the whole mess. To our mother’s irritation, it was the favorite food of us kids.

Families with more than one or two children put them through a kind of sorting hat, and I somehow ended up as both surrogate son (the one my father shouted for when he needed to move a couch) and surrogate cook, left in charge when my parents had Bible studies or fellowships. I can’t remember when I learned to roll piecrust, knead bread, fry a chicken, or make a basic white sauce, but long before we left Japan for Minnesota in 1974, when I was almost 15, I could be trusted to put on the table a dinner that everyone would eat. And while we mostly went back to Lutheran food in Minnesota and began buying the kind of sugared cereals and soft drinks that have caused obesity and diabetes rates to shoot up worldwide, we still ate more vegetables than any other family we knew. My father, who never paid for anything that he could make or do himself, pegged out a huge garden behind our mission-owned house in St. Paul, and it was my job and that of my older sister to keep it weeded. This was hot work, but we had salads throughout the summer and frozen beans, peas, and corn, as well as jams and jellies and quarts of pickles, throughout the winter.

I left my family, and my faith, before I was out of my teens, but the foodways of my childhood have stayed with me. I still tend to shop almost daily, often by bike, and for four decades, I’ve cooked and sat down to dinner—with housemates, with family, or on my own—almost every night. Sure, I went vegetarian for a time in my 20s. I also learned that the white sauce and piecrust I made as a child have fancy French names (béchamel, pâte brisée) and developed more range. But I still don’t cook the expensive cuts of meat that I never ate as a child (or buy prepared foods or order takeout), and the pastas and risottos I served my kids and their ever-famished friends are, after all, just the Italian relations of my father’s fried rice. I stopped growing tomatoes when I moved from Massachusetts to New York and stopped baking all our bread about a decade after that, but I still can’t roast a chicken without boiling the carcass down for stock or pass the farmers market at the end of the summer, when field tomatoes are knocked down to a dollar a pound, without making my son carry home 40 pounds so I can cook and freeze pasta sauce for the winter. “You might be a professor, Susan,” one friend once said to me, picking her kids up after dinner, “but you’re also a farm wife.”

I’m not, of course, not least because a real farm wife in the United States today might live amid an ocean of soybeans and in a food desert, miles from a decent grocery store. I, on the other hand, live in Manhattan, less than six blocks from two farmers markets laden with grass-fed beef, free-range eggs, and all manner of fresh produce from Ulster and Dutchess counties’ orchards and family farms. We think of how we eat within a framework of choice, but as Bee Wilson tells us in her new book, The Way We Eat Now, that is ridiculous. We “choose” within a contained food environment, one shaped by availability and advertising, prices and profits, traditions and trends. How we eat has less to do with conviction and still less to do with virtue than with habits and traditions, environment and especially economics—that is, with the complex social order within which we live. New York, for example, seems to offer almost limitless “choice,” but its cornucopia and variety (its Michelin-starred restaurants and gourmet food trucks, its farmers markets and specialty stores) coexist with rampant inequality and ill health. New York might seem a foodie’s heaven, but one-fifth of its residents live below the poverty line and an additional 25 percent in what the city calls “near poverty,” and homelessness has reached levels not seen since the Great Depression. In 2016, 1.2 million New Yorkers were “food insecure.” Today almost 1 million are living with Type 2 diabetes. Commendably activist though the city’s government is, battling those statistics with a host of neighborhood- and school-based health and nutrition programs, New York nevertheless captures perfectly the polarization and paradoxes of “the way we eat now.” And this is why I fell so hungrily on Wilson’s book—devouring it, really—in a quest to understand a global food culture that, frankly, is killing us.

Wilson doesn’t mince words about the magnitude of the problem. We think of food crises in terms of famine or scarcity, and while those can still strike hard, we also face a different kind of challenge today. For the first time, more people (1 billion) are overweight or obese worldwide than are underfed (about 800 million). Today, more people die from diet-related diseases than from the effects of tobacco use. To explain how a prosperous world could defeat hunger, only to fall victim to toxic diets, Wilson turns to nutrition scientist Barry Popkin’s account of the stages through which human diet has evolved. To the meat and foraged plants of the hunter-gatherer days (Stage 1), settled societies added a staple farmed cereal such as wheat, rice, sorghum, or maize (Stage 2), and from the crop rotation and commercialization that allowed for more variety (Stage 3), they saw the rise of cheap-energy-enabled food processing, industrial farming, and the heavy consumption of meat, fats, and sugars (Stage 4). The nations of the West had mostly moved to Stage 4 by the 1960s and ’70s, but the big story about food is how quickly the rest of the world followed suit. Rising mass purchasing power, food companies’ restless search for new markets, and our human cravings have utterly transformed diets—and with them, health—around the globe.

Reading Wilson’s first chapters, I realized that I had lived through one of these transitions in the 1960s, when Japan moved from Stage 3 to Stage 4 in the midst of a period of rampant economic growth. Suddenly, other aspects of my childhood fell into place. The fathers of my friends at the American School in Japan, I remember, worked for Chase Manhattan, Caterpillar, and Coca-Cola. They were changing Japanese food culture just as Japan was changing my family’s dinner. And Japanese diets, although still rice-based, did change and change fast in that decade, incorporating more animal products, fewer vegetables, and much, much more sugar. According to Japan’s National Nutrition Survey, begun by the Americans in 1946 to assess need in the terrible postwar period, average meat consumption tripled between 1961 and 1972, and confectionery consumption peaked in 1971, sparking an epidemic of tooth decay among preschoolers. McDonald’s moved in that year, too, opening its first Tokyo branch in Ginza, the heart of the downtown area. My family started going there for lunch after church on Sundays.

This transformation would be repeated, more rapidly and often more disastrously, in one country after another. As the price of cooking oil tumbled, food became oilier. Fast foods, snack foods, and carbonated beverages swept across the world, and more calories (but not healthy nutrients) came in these unsatisfying forms. Mexico, which serves the behemoth US food market and has been infiltrated by American food companies in turn, shows those effects especially strikingly. The percentage of the population classed as overweight or obese has nearly doubled, from 33 percent in 1988 to almost 60 percent a mere 10 years later, yet without an equivalent gain in nutrition: Women in all weight percentiles showed an equal propensity for anemia. The rest of Latin America, China, and India are also undergoing rapid change. Everywhere, people are consuming more calories (500 more per day than 50 years ago), in forms that are energy-dense but nutrient-poor, creating the phenomenon of the overweight malnourished—who, in a world addicted to a language of choice, are then blamed for their own illnesses. Westerners too readily think of Africa as the locus of famines, but—though food shortages are very real—the continent is a place where most people still eat a staple cereal and a lot of unrefined foods, making it a last home of healthy diets in terms of quality. But African diets, too, are worsening, as beverage companies work to get coolers and sodas into every village shop.

Wilson does a creditable job summarizing the economics and nutritional effects of these transitions, but she’s a food writer, not an agronomist or policy-maker. She’s more interested in eaters and consumers than in growers and producers. She says relatively little about the factories and methods that produce so much of our food (including our meat) and still less about the sustainable and ethical farmers challenging those conglomerates. Nor is she focused mainly on body politics: While she denounces the ubiquitous stigmatization and sheer cruelty to which the overweight are subject and cites studies that demonstrate that stigma’s serious economic, health, and psychological effects, she doesn’t really tackle the question of how treating obesity (and not ill health) as an epidemic itself contributes to that stigmatization or delve into recent research on the “healthy obese.” What she does do, very well, is tease out the relationship between the pell-mell socioeconomic and lifestyle changes since the 1980s, changes that have swept the globe and affected virtually all of us, and our often obsessive and self-punishing attitudes and habits around eating. Our “culture of extreme individualism,” she shows, is a major cause of the way we eat now.

Take mealtimes, for example. Most European societies once had fairly strict norms about mealtimes, even though those norms differed. Germans often ate a main meal at midday and sat down to Abendbrot in the evening. Scandinavians ate dinner comparatively early, and the Spanish comparatively late. But “areagrafs”—visualizations of the proportion of the population doing a given activity (working, eating, sleeping, leisure) at a given time—reveal that while sleep and work still fall into conventional periods, eating has become unmoored. People in Mediterranean Europe still have regular mealtimes, with clear activity bubbles showing about 50 percent of the population eating at some point between noon and 2 pm and between 7 pm and 10 pm, but in Northern Europe and especially in England, the most “liberalized” European country, those bubbles have pretty much disappeared. Instead, as Wilson puts it, eating has just become “something that around 10% of people, give or take, might be doing at any given time.” Recent research on London families confirms this: Fewer than a third managed to eat together most nights.

Nor is this just a matter of family meals. Regular workplace mealtimes have nearly vanished too. Labor unions spent years struggling for decent meal breaks and regular hours, but factory canteens have gone the way of most factories, and more work is shift work done in hospitals or offices with nothing but snack machines to turn to during breaks. And once eating becomes radically individualized and done on the fly, norms about what constitutes a meal erode as well. Instead, meals are replaced by a kind of generalized grazing.

It’s true that we eat out more than ever, at all income levels, and order prepared food to our homes as well—patterns that undoubtedly add interest and convenience but worsen our health for the simple reason that restaurant (especially fast food) meals tend to be less healthy than ones cooked at home. We also eat more and more snacks. By the turn of the 21st century, Americans were eating on average 22 pounds of commercial snack food per year; 10 years later, the average American child was getting 37 percent of calories (but nothing like a third of nutrition) from snacks, and the rest of the world isn’t far behind. Of course, people knew that cheese-and-onion-flavored crisps and shrimp-flavored crackers weren’t exactly healthy, but American capitalism is adept at turning critique into opportunity. Before we knew it, a whole line of ostensibly healthy snacks had hit our shelves. Wilson, bless her, is absolutely scathing about these: the yogurt-covered strawberry bits that have more sugar than a Mars bar, the pumpkin-and-chia “power ball” that has more sugar than Ben & Jerry’s chocolate fudge brownie ice cream. The “protein bar,” she says stoutly, is just “a license to eat candy and call it a virtuous main course.”

Wilson recognizes that millions of people want to eat healthily and well, and she catalogs the many hindrances in their way. Nowadays, diets promising health can gain an immediate global following, but since food grows more slowly than Facebook likes, trends can also have unanticipated effects. The identification of quinoa as a superfood drove prices up by 600 percent in eight years, leading the Bolivian farmers who grow it to make the economically rational but nutritionally disastrous decision to let college students pile it on their Sweetgreen salads and feed their own families American processed foods instead. The craze for guacamole and avocado toast quadrupled US avocado consumption, leading to deforestation and water depletion in the Michoacán region of Mexico and attracting the interest of the drug cartels—which are experienced, after all, at running protection rackets on desirable crops. Crazes induce adulteration, too. As Wilson dryly notes, not enough pomegranates are grown in the whole world to account for all the juice marketed as “100% pomegranate juice.”

Small wonder, then, that debates over food have become so strident and angry—and that some people have just gone off “food” altogether. The market for drinkable “meal replacements” like Soylent and Huel is growing. If you want to meet your nutritional needs while minimizing your carbon footprint and not harming animals, mixing up regular portions of Huel, which advertises itself as a vegan “human fuel” concoction of pea protein, rice, flaxseed, and all the vitamins and nutrients you need, is the way to go. Wilson, who is nothing if not game, tried Huel for a time but just didn’t like it. (She found it “grainy” and “slimy.”) But she clearly also hated seeing food simply in terms of “fuel” rather than as a medium through which people have created culture and meaning for millennia.

So what is to be done? Wilson closes her book with some sensible, if fairly obvious, suggestions: Downsize your plates and glasses; don’t drink anything but water; eat regular meals and avoid snacks; eat more vegetables and less meat; ignore food crazes. It would be a shame, though, if people read this book as simply a guide to healthy living, for Wilson’s more profound point is that we need to stop treating eating as an individual “choice,” as if it were just an “amusing leisure activity” rather than a “basic human need.” Improving diets and health, she says rightly, will require action on many fronts, including much more aggressive public policy. This will not be easy. “The idea that government has a duty to help its citizens eat and drink more healthily remains deeply controversial,” she notes, and it’s readily dismissed as yet another elitist ploy to keep people from eating (and feeding their kids) whatever they like.

Wilson, however, insists that when diets are killing us, “we are allowed to be pejorative,” not toward our fellow eaters but toward those corporations that have virtually shoved bad food down our throats. In her last chapter, she spends some time on the state and local interventions that are attempting to change food cultures. Chile, which had the highest per capita consumption of sugary drinks on the planet in 2016, has brought in tough laws taxing sodas, banning cartoon characters on cereal boxes, and sticking scary black warning labels on foods high in sugar, salt, or fat, including foods like sweetened yogurts long marketed as “healthy.” Still more heartening are policies that seek to turn some good things—clean water, fresh vegetables, collective meals, cooking or farming skills—from “choices” into habits or even entitlements. Wilson credits Amsterdam’s ban on cookies, cupcakes, and all drinks except milk and water in schools with contributing to bringing childhood obesity rates down, but what has brought the city’s children on board, at least if the Dutch 10-year-old I know is any guide, is the program that gave him and his school friends allotments to farm for a year.

New York City has also taken on the battle against food insecurity and poor diet through efforts to raise incomes and expand public services. Wilson approvingly cites a city program that distributes farm produce to senior centers, among other places, where it is turned into low-cost healthy communal meals. The city has other programs as well, including the creation of school gardens and farm-to-school initiatives that bring fresh produce into schools for students to handle, prepare, and eat. It is also committed to increasing public water fountains, which could cut plastic and soda consumption alike. (Next frontier, please: attended public toilets. If the Germans can have these, why can’t we?) Seeing food policy as a matter of distribution and right, not just incentive and choice, might also help us address the pressing question of how we might eat in ways that will do less damage to our already overstretched planet—an issue about which Wilson says too little and that may have to be addressed through rationing, as Guardian columnist Sonia Sodha has suggested, instead of through tax policies that will put carbon-heavy foods (like beef) out of the reach of poor people altogether.

But if we need to think about food more in terms of entitlements, we also need to think of it more in terms of skills and duties, which brings me back to where I started: We need to think about cooking. Wilson is obviously a good and imaginative cook, someone who wants more people to know how to handle a chopping knife and a frying pan and to put a reasonable meal on the table in 30 minutes. Her respect for the skill and value of cooking leads her to say nice things about meal kits, which for all their wasteful packaging and expense are at least instructing some novice cooks. She even finds hope in statistics that reveal that although only 10 percent of people say they love to cook and 45 percent say they hate it, the remaining 45 percent are ambivalent and perhaps willing to give it a go. I tend to think, however, that this whole survey is another sign of our mixed-up ideas about eating. I cook every day, but I don’t know whether I’d say I “love to cook.” That’s like asking me whether I love to wash or loved to walk my kids to school. It’s like asking hunter-gatherers whether they love to hunt and gather. Some things are necessities or duties, not choices. We treat a great many skills—riding a bike, driving a car, typing, using a smartphone—that way. What gives me hope, then, is less the pricey innovations like meal kits and more the return of cooking to the schools, and not in the girls-only “home economics” form it was taught to me but as part of farm-to-school and, in New York, Cookshop programs that teach much younger children to touch, taste, prepare, and like fresh food.

When my daughter was in the first grade, she had a wonderful, charismatic teacher who disliked the school administration’s determination to put 6-year-olds in front of computer screens and filled her classroom instead with books and drawings, silkworms and turtles. Unsurprisingly, she was fired after a year. (Safety! Hygiene!) But for a few months another mother and I found ourselves in the school’s kitchen every so often, teaching the children math through cooking: If you need one teaspoon of baking soda and half a teaspoon of salt for one pan of cornbread, how many teaspoons do we need for three pans? If the recipe calls for one cup of cornmeal and one cup of flour and we know we need two-thirds as much of the wet ingredients as of the dry, how much honey and buttermilk should we put in if the egg we’re adding measures about a quarter of a cup? The children would puzzle out the math, measure and stir, taste and pour—and then we ate the cornbread together.
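For readers who want to check the classroom arithmetic, here is a minimal sketch in Python using only the quantities named above (1 teaspoon of baking soda and half a teaspoon of salt per pan, 1 cup each of cornmeal and flour, wet ingredients totaling two-thirds of the dry, and an egg of about a quarter cup); the variable names are my own:

```python
from fractions import Fraction

# Scaling the leavening from one pan of cornbread to three.
pans = 3
baking_soda_total = Fraction(1) * pans         # 3 teaspoons of baking soda
salt_total = Fraction(1, 2) * pans             # 1 1/2 teaspoons of salt

# Wet ingredients should total two-thirds as much as the dry.
dry = Fraction(1) + Fraction(1)                # 1 cup cornmeal + 1 cup flour
wet_total = Fraction(2, 3) * dry               # 4/3 cups of wet ingredients
egg = Fraction(1, 4)                           # the egg measures about 1/4 cup
honey_and_buttermilk = wet_total - egg         # 13/12 cups still to add

print(baking_soda_total, salt_total, honey_and_buttermilk)
```

Exact fractions, rather than floating-point cups, keep the answers in the kitchen-friendly form the children would have worked out: just over a cup of honey and buttermilk combined.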
