Twenty-first-century affluent societies have a schizophrenic approach to food, a combination of neglect and preoccupation. Neglect is well documented. Google’s Ngram Viewer, quantifying the appearance of specific words in English-language books published between 1800 and 2008, shows the following relative maximum frequencies for topics of public interest: peaks of up to 0.02% for both “population” and “energy,” only half as much for “environment” and “economy” (0.01%), a mere 0.004% for “agriculture,” and 0.001% for “nutrition.”1 While discussion of the environment peaked around the year 2000, concerns about energy and population peaked some 40 years ago. Those about agriculture have gone downhill ever since the 1960s.
An outright irrational perspective justifying this neglect comes from the economists, who point out that agriculture is now a marginal sector that contributes little to gross domestic product in modern societies. For example, in 2016, farm production accounted for 0.7% of the United States’ GDP.2 My suggestion: just let them live off the output of the most important sector, which now accounts for more than 20% of GDP, the category labeled by the Bureau of Economic Analysis as “finance, insurance, real estate, rental, and leasing.” Bon appétit!
Food output in affluent societies, which are already producing vastly more food than they can consume, keeps on rising. Advertising for food shows no signs of decline. And preoccupation with the consequences arising from a surfeit of food—ubiquitous dieting and obesity—or from consuming specific nutrients—fats, sugars, vitamins—is reaching new highs. These highs are in no small part aided by the many unscrupulous endorsements of outrageous diets, which range from pseudo-Paleolithic carnivory to uncompromising veganism, by one-time Hollywood actresses and Oz-like publicity-hungry physicians.
Of course, our society’s collective lack of concern about food production is a perfect testimony to the enormous achievements that have resulted from more than a century of agricultural advances. Land and labor have been made more productive through innovations in plant breeding, including improved cultivars and genetically modified crops, and through agronomic, technical, and managerial advances ranging from the introduction of crop rotations to the use of field machinery, synthetic nitrogenous fertilizers, and other agrochemicals. In 1800, a New England farmer, with his two oxen, wooden plow, brush harrows, sickles, and flails, had to invest more than seven minutes of labor to produce a kilogram of wheat that would make two small loaves of whole wheat bread.3 By the year 2000, better cultivars, fertilizers, pesticides, and mechanization reduced that time to less than six seconds. During the second half of the twentieth century, average growth of US agricultural productivity had surpassed that of manufacturing productivity.4
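The magnitude of that gain is easy to check with the numbers just cited:

\[
\frac{7\ \text{min} \times 60\ \text{s/min}}{6\ \text{s}} = \frac{420\ \text{s}}{6\ \text{s}} = 70,
\]

that is, at least a seventyfold reduction in the labor needed to produce a kilogram of wheat over those two centuries.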
Such advances were replicated in the cultivation of other crops and in the production of meat, eggs, and dairy products. The unprecedented availability of high-quality foods resulted in a nutritional transition, in which many positive outcomes were joined by a few worrisome consequences. No outcomes have been more welcome than the elimination of famine, the drastic reduction of undernutrition, and the declining share of disposable income spent on food. Unfortunately, agriculture’s great productivity has been accompanied by increased food waste—typically one-third or even two-fifths of all produced food—as well as by excessive eating, unhealthy diets, and rising obesity. Some foods and nutrients have been rightly implicated in these shifts, but the best evidence shows that others have been undeservedly maligned.
Ending Famines and Reducing Undernutrition
Lists of modern advances are dominated by technical inventions. If a representative sample of Western intellectuals were polled, the ending of famines would likely not be mentioned among the first half-dozen most important accomplishments of the modern world; such a list would almost certainly include computers, mobile phones, and the internet.
In all of Europe except Russia, threats of recurrent famine were eliminated during the early stages of agricultural transition.5 The Irish famine of 1845–49 was the last event that could be attributed to a large degree to failed harvests.6 Russia’s last peacetime famine was in 1891–92. The major famines in the USSR, between 1921 and 1923 along the Volga, in the Ukrainian Holodomor between 1932 and 1933, and in the summer of 1947, were not due to any inherent inability to produce enough food, but were the result of violent conflicts and destructive policy decisions, including Joseph Stalin’s deliberate attempt to starve the Ukrainian population into submission.7 The months of severe malnutrition and famine in western provinces of the Netherlands, Hongerwinter between October 1944 and May 1945, were caused by the Nazi blockade of food shipments.8
In Europe, North America, and Australia, agricultural changes eliminated the worst forms of malnutrition more than a century ago, decades before the common use of agrochemicals and the complete mechanization of field work. Thanks to scientifically based food rationing, Britain was able to maintain an adequate food supply even during the nearly six years of World War II.9 As stunting was eliminated, average heights for every age group rose.10 The largest height increments were registered in modernizing Asia. South Korean women gained just over 20 centimeters on average over the twentieth century,11 and by the century’s end 18-year-old Japanese males were nearly 12 centimeters taller than their counterparts in 1900.12 Japanese data show how sensitive this key anthropometric variable is to any deterioration of food supply. Between 1900 and 1940, the average height of 10-year-old boys rose by about 0.15 centimeters per year, but wartime food shortages reduced it by 0.6 centimeters per year. Because of continuing postwar food shortages and severe rationing, growth resumed only in 1949, and subsequently 10-year-old boys gained on average 0.25 centimeters per year for half a century.13
Affordability of food supply is best measured by the share of disposable income that an average household spends on food. A relationship between income and family expenditure on food was first defined by Ernst Engel, a German economist and statistician, more than one and a half centuries ago: “The poorer a family, the greater share of its expenditure must be spent on the procurement of food.”14 This relationship holds in every modernizing society. Engel’s law refers only to the share of disposable income; the total amount spent on food is actually rising in wealthy households and in affluent societies, while its share of the total expenditure is falling.
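A pair of purely illustrative households (hypothetical figures, not from the cited surveys) makes the distinction concrete:

\[
0.25 \times \$30{,}000 = \$7{,}500
\qquad\text{versus}\qquad
0.08 \times \$120{,}000 = \$9{,}600.
\]

The richer household spends a much smaller share of its income on food, yet its absolute outlay is higher, exactly as Engel’s law and its modern corollary imply.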
During the late nineteenth century, poor, working-class, urban households commonly spent 60% of their disposable per-capita income on food. In 1900, the share in large US cities was about 43%. By 1950, it was reduced to 20%, and since the year 2000 it has been below 10%.15 Within this transition, an increasing share of food spending has gone to eating out. In the US in 1900, that share was well below 10%. By 1950, it was about 20%. The two kinds of spending were nearly equal by the early 1990s. Since 2010, total expenditures on food away from home have been higher than on food prepared at home; the latest totals show a difference of about 16%.16 While restaurant food delivery has long had a significant niche in the urban lunch and dinner market—sushi in Japan and pizza in the US being two common examples—this practice has become more common with effortless ordering on mobile phones. And there is yet another new, hybrid category consisting of meal-kit options with portioned ingredients and accompanying instructions delivered to be cooked at home.17
Higher affordability of food has been accompanied by a still-increasing diversity of readily available foodstuffs. Even in cold-climate societies whose average food supply contained sufficient energy and provided the minima of essential nutrients, traditional diets were rather monotonous. In non-Mediterranean Europe, diets were dominated by bread, coarse grains, and, starting in the sixteenth century, potatoes, usually eaten in soups and stews. There was little distinction in the composition of the three, or sometimes just two, daily meals. Now most supermarkets in affluent countries carry products from distant continents: Andean quinoa, Ethiopian teff, a variety of Asian noodles, Spanish jamón, Italian prosciutto, and guava or mango juice. Successive waves of non-Western cuisines have been introduced in often less-than-authentic restaurants—epitomized by the Chinese-American creations of beef and broccoli, orange chicken, and fortune cookies.18 Imported cuisines eventually entered the mainstream: perhaps nothing characterizes this better than tikka masala becoming the most popular meal in England.
In the US, the non-Western cuisines that have been absorbed, adapted, embraced, and mainstreamed include Chinese, Indian, Japanese, Korean, Mexican, and Thai. In the UK, the most pronounced influence comes with Bangladeshi, Indian, and Pakistani dishes. In continental Europe, there have been strong incursions of Chinese, Indian, Lebanese, Thai, and Turkish cuisines. In terms of social impact, these dietary invasions and domestications are an important part of coming to terms with “others” during rapid globalization, particularly in societies that were until recently fairly insular.
Food Waste and Obesity
All affluent countries have been producing substantially more food than could possibly be eaten, even by gluttonous populations. With the sole exception of Japan, average daily supply is now in excess of 3,000 kcal per capita. More than 30%, and sometimes even more than 40%, of all food energy originates from lipids. Dietary protein is far in excess of daily needs: the highest supplies are up to 120 grams per day, with 55–65% of it coming from animal foods. Annual per-capita meat consumption, in terms of carcass weight, generally exceeds average adult body weight (70 kilograms), and in some countries it surpasses 100 kilograms.19 The two highly undesirable phenomena of food waste and obesity should not be seen just as inevitable consequences of excessive food supply. Rather, they are specific, food-related demonstrations of much wider social unraveling—with one characterized by the human disregard of other forms of life, and the other by the retreat of once commonly shared behavioral norms.
Food production is the single largest activity putting humans into competition with other species inhabiting the biosphere. For that reason, it is particularly destructive to produce food that will be wasted. Food balance sheets put the recent American food supply between 4,000 and 4,200 kcal per capita.20 Only tall, well-muscled and hardworking adult males—think of lumberjacks, fishermen, or miners—would need that much food energy to cover their average daily needs. That much food is actually never available at the retail level. Adjustments for “spoilage and other waste” in production applied by the United States Department of Agriculture reduce the average daily supply to about 2,600 kcal per capita. But even this rate, although 35% lower than the gross supply, is still excessive as a population-wide mean of actual consumption. While it may be required by healthy, active young males, it far exceeds the needs of infants, young children, and housebound octogenarians. Individuals in the last group may need fewer than 1,500 kcal per day.
US studies of actual average daily food consumption rely on dietary recall, self-reported food consumption by a representative sample of the population. These studies have determined a population-wide mean consumption of about 2,100 kcal per capita.21 With average supply at 3,600 kcal per capita, this results in daily food waste of 1,500 kcal per capita. This is an astonishingly high rate that could be dismissed by pointing out the notorious inaccuracy of dietary recall methods. But there is another way to confirm the large gap: quantifying metabolic requirements with physiological models that relate body weight to the amount of food eaten.
Kevin Hall et al. found a slight increase in the average amount of food eaten by the US population, from about 2,100 kcal per day in 1974 to 2,300 kcal per day in 2005.22 But during those three decades, average per-capita food supply rose from about 3,000 to 3,700 kcal per day. Average food waste increased from 28% of the supply in 1974 to almost 40% in 2005, when it averaged more than 1,400 kcal per day. Assuming that rate had not risen further by 2017, the food energy wasted by the US population of 325 million could have provided adequate nutrition, at 2,200 kcal per day, for just over 200 million people. That is almost the entire population of Brazil, the world’s fifth most populous nation.
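The underlying arithmetic, using the 2005 rates and the 2017 population cited above, is straightforward:

\[
3{,}700 - 2{,}300 = 1{,}400\ \text{kcal/day wasted per capita}\ (\approx 38\%\ \text{of supply}),
\]
\[
\frac{325 \times 10^{6}\ \text{people} \times 1{,}400\ \text{kcal/day}}{2{,}200\ \text{kcal/day per person}} \approx 2.07 \times 10^{8}\ \text{people}.
\]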
American food losses are no exception. Japan has been the affluent world’s most frugal food consumer. As its population ages, the actual average per-capita intake has fallen below 2,000 kcal per day.23 Even so, the food supply is about 2,500 kcal per capita. This prorates to retail and household food waste of at least 25%. British surveys found that waste amounts to about 21% of all food purchased by households, with specific rates including 40% for vegetables and 20% for meat and fish.24 Some items, including sausages, bacon, and meat-based ready-to-eat and takeaway meals, are often thrown away unopened. Canada has the highest loss-and-waste estimate: 58% of all food produced, a total of 35.5 megatons (Mt) per year. Of this amount, 34% is wasted during processing and 24% during production.25 Revealingly, researchers found that 32% of this loss and waste could be rescued for consumption.
A study by the Food and Agriculture Organization of the United Nations concluded that annual losses in Europe and North America prorate to about 100 kg per capita.26 In 2018, with a combined population of about 860 million in the EU, US, and Canada, that would amount to nearly 90 Mt/year. In mass terms, this is nearly equivalent to the combined annual wheat harvests in Canada, France, and Germany. Waste rates for specific food categories vary, from more than 10% for meat to about 25% for grains and 20–30% for vegetables. But behind every unit of wasted meat, there are at least three to five units of wasted plant feed used to raise the animals.
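The aggregate follows directly from the per-capita rate:

\[
860 \times 10^{6}\ \text{people} \times 100\ \text{kg per capita} = 8.6 \times 10^{10}\ \text{kg} = 86\ \text{Mt per year}.
\]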
Producing food that ends up wasted also causes environmental damage: nitrate leaching, excessive soil erosion, and the spread of antibiotic resistance. The damage is compounded by the inability to deal with the large volumes of refuse that cannot be easily recycled, since a large share of food waste is commingled with paper, metal, glass, and several kinds of plastic. Given the scale and complexity of food production, trade, distribution, and preparation, it would be unrealistic to believe that food losses can be reduced to rates in the low single digits. At the same time, there is no justification for tolerating losses in excess of 20 or 25%.
Current food production is so excessive that, even after large-scale losses, too much remains to be consumed. This excess contributes to the recent wave of obesity. This condition is not a metabolic inevitability. It can be directly traced to a combination of ignorance regarding healthy eating, deliberate and often compensatory overeating, and a preference for indolent, sedentary lifestyles. The body mass of any population is normally distributed; there will always be people whose weight is far above the mean value. But there is no reason to claim that the mean itself must be expected to rise.
Historical data show clearly that the recent wave of obesity has not been a simple function of rising food supply but, overwhelmingly, a matter of choice. The US has the highest obesity rate among all populous countries. Normal weight is defined as a body mass index (BMI) between 19 and 25.27 Overweight people have a BMI between 25 and 30; above 30 signifies obesity. After World War II, the share of the US population with a BMI below 25 remained steady until the late 1970s. The overweight population was a third of the overall population, and the obese share was less than an eighth. By the century’s end, excessive energy intake and a decline in daily activity had caused a well-documented epidemic of obesity: the share of obese adults had more than doubled. By 2010, it had risen to nearly 36% for adults aged 20 years and older; 5% of these were categorized as extremely obese.28 Almost the same number of adults were overweight. Three in four American adults now have an abnormal BMI, and this unmistakable rotundity is evident in all public spaces. Unhealthy weight gain now begins at earlier ages. Fifty percent of both children 6–11 years old and youth aged 12–19 years are overweight or obese; percentages are even higher among lower-income groups. This reality is particularly worrisome because childhood obesity commonly becomes a life-long condition.
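For reference, BMI is simply body mass in kilograms divided by the square of height in meters; a hypothetical adult illustrates the threshold:

\[
\text{BMI} = \frac{m}{h^{2}}, \qquad \frac{95\ \text{kg}}{(1.78\ \text{m})^{2}} \approx 30,
\]

placing that person right at the boundary between overweight and obesity.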
America, although still the unenviable obesity leader among populous countries, has some close runners-up: combined overweight and obesity ratios for adult men are above 60%, in descending order, in tiny Kuwait and Qatar, as well as in Australia, Mexico, the UK, Germany, the Czech Republic, and Portugal. Among adult females, the combined shares are higher than 50% in all of the above countries as well as in South Africa and Morocco.29 The worldwide trend has been going in the wrong direction: between 1980 and 2008, adult BMI increased in all but eight of the world’s nearly 200 countries. Global BMI averages rose to about 24 for both men and women. The total of obese children and adolescents increased tenfold in four decades to reach 124 million by 2016, at which time a further 213 million were overweight.30 Continuation of this trend would see the worldwide number of obese children surpassing that of undernourished youngsters as early as 2022.
Nutrition and Cardiovascular Disease
Research to link cardiovascular diseases, the leading cause of death in affluent societies, to specific nutrients has been ongoing for decades. But any review of the evidence should consider this caveat: Edward Archer et al. believe that during the past 60 years, all large-scale diet–disease studies that relied on dietary recalls have failed to measure actual food intakes and thus have engendered a fictional discourse.31 They argue that memory-based assessment methods are unreliable, unverifiable, and antithetical to scientific inquiry. Yet such methods have been used in thousands of studies to collect data on millions of individuals, which means that, “[c]ontrary to current conjectures, there are no valid data demonstrating that ‘diet’ per se is causal [emphasis original] to increased mortality from obesity, [non-communicable diseases], and metabolic diseases.”32 Is this an excessively sweeping conclusion? The consensus regarding links between diet and disease has repeatedly reversed, and many past claims now appear to have been unsound.
Post–World War II epidemiological studies, and perhaps most prominently the Framingham Heart Study, identified the intake of saturated fat and cholesterol in meat, eggs, and dairy products as a very important, even the decisive contributor to the rise of cardiovascular mortality in the Western world.33 Michael Rees concluded that “[t]here is no doubt [emphasis original] that diet and diet alone is responsible for the vast majority of all coronary heart disease in western [sic] society.”34 In 1957, the American Heart Association recommended lowered fat intake as the best way to reduce the incidence of coronary heart disease.35 Subsequently, Ancel Keys became the leading promoter of the low-fat Mediterranean diet.36
These findings led to a strong advocacy of low-fat diets for all Americans. Between 1968 and 1987, all major recommendations concerning diet and coronary heart disease advocated that dietary shift.37 American, Canadian, and European dietary guidelines for adults recommended that fat intake should remain below 30% and saturated fat consumption below 10% of total food energy, and that daily cholesterol intake should not exceed 250–300 milligrams. Ann La Berge chronicled in detail how this
low-fat approach became an overarching ideology, promoted by physicians, the federal government, the food industry, and the popular health media … even though there was no clear evidence that it prevented heart disease or promoted weight loss. Ironically, in the same decades that the low-fat approach assumed ideological status, Americans in the aggregate were getting fatter, leading to what many called an obesity epidemic. Nevertheless, the low-fat ideology had such a hold on Americans that skeptics were dismissed. Only recently has evidence of a paradigm shift begun to surface.38
Even during the 1970s and the 1980s, at the high point of low- and no-fat advocacy, there was plenty of epidemiologic and demographic evidence that made it impossible to conclude that “these interventions alone will result in major declines of cardiovascular mortality.”39 This particular diet–disease link was never as strong as claimed by its proponents. Since the 1950s, consumption of animal lipids has fallen substantially, and lipid intake has shifted from saturated to poly- and monounsaturated fats. Despite this transformation, the latest studies confirm that there are no strong, universal causal links between diet and heart disease. More specifically, the intake of saturated fats is not associated with all-cause mortality, cardiovascular and coronary heart disease, ischemic stroke, or type 2 diabetes.40
The most comprehensive metastudy of links between the consumption of saturated fats and heart disease concluded that, “the pattern of findings from this analysis did not yield clearly supportive evidence for current cardiovascular guidelines that encourage high consumption of polyunsaturated fatty acids and low consumption of saturated fats.”41 Moreover, a study of nearly 3,000 older US adults concluded that long-term exposure to the fatty acids present in dairy was not significantly associated with total mortality or with the incidence of cardiovascular disease (CVD) but that “[h]igh circulating heptadecanoic acid was inversely associated with CVD and stroke mortality and potentially associated with higher risk of non-CVD death.”42 Again, this is a reversal of former conclusions, one that recognizes the complex nature of the exposure: heptadecanoic acid appears to be CVD- and stroke-protective, but its long-term presence carries a potentially higher risk of non-CVD death.
Another, more specific link between fatty acids and cardiovascular disease now appears in doubt. Higher intakes, in food or as supplements, of omega-3 polyunsaturated fatty acids (eicosapentaenoic acid and docosahexaenoic acid) from oily fish, as well as of alpha-linolenic acid from plants, were advocated to lower the risks of cardiovascular events. But the most extensive meta-analysis of relevant studies shows, with a high degree of confidence, that higher intakes of these acids have little or no effect on overall mortality or on cardiovascular events, including cardiovascular death, coronary death, stroke, or heart irregularities.43
The latest addition to this mounting reversal of long-ruling orthodoxies is a finding about the perils of eating red meat.44 A systematic review concluded that adults should continue current moderate consumption of both unprocessed and processed red meat—the latter being charcuterie ranging from hams and sausages to pâté and terrines. This conclusion is based on the fact that “the certainty of evidence for the potential adverse health outcomes associated with meat consumption was low to very low” and that “there was a very small and often trivial absolute risk reduction based on a realistic decrease of 3 servings of red or processed meat per week.”45
Simple Lessons
The most obvious concern should not be what specific foods to avoid and what diets to follow. Our evolutionary heritage is indisputable: we are an omnivorous species. The advice to consume a wide variety of plant and animal foodstuffs has always been sound. In recent years, this course has been reaffirmed in new findings that overturn, or greatly weaken, former restrictive and prescriptive recommendations, some of which had gone as far as demonizing almost any intake of bread or red meat.
This omnivory should remain within the confines of actual energy and nutrient needs, particularly because in modern societies dominated by sedentary employment, excessive food consumption tends to be combined with reduced physical activity. But if the grossly excessive food supply is not reduced, remaining within the confines of healthy omnivory would further increase today’s unacceptably high level of food waste. Omnivorous moderation should then be accompanied by significant curtailment of the oversupply of food. Such a change would bring many environmental benefits, but it would require a fundamental shift of priorities, from growth to deliberately managed retreat, an antithetical move for modern economies.