Humans may be the species with the longest list of nutritional requirements (Bogin 1991). This is because we evolved in environments with a high diversity of edible species but low densities of any one species. Homo sapiens sapiens require 45–50 essential nutrients for growth, maintenance, and repair of cells and tissues. These include protein, carbohydrates, fats, vitamins, minerals, and water. As a species, we are (or were) physically active with high metabolic demands. We are also omnivorous and evolved to choose foods that are dense in essential nutrients. One of the ways we identified high-calorie resources in our evolutionary past was through taste, and it is no accident that humans find sweet, salty, and fatty foods appealing.
The human predisposition toward sugar, salt, and fat is innate (Farb and Armelagos 1980). Receptors for sweetness are found in every one of our mouth’s 10,000 taste buds (Moss 2013). Preference for sweet makes sense in an ancestral environment where sweetness signaled high-value resources like ripe fruits. Likewise, “the long evolutionary path from sea-dwelling creatures to modern humans has given us salty body fluids, the exact salinity of which must be maintained” (Farb and Armelagos 1980), drawing us to salty-tasting things. Cravings for fat are also inborn, with some archaeological evidence suggesting that hominins collected animal bones for their fatty marrow, which contains two essential fatty acids necessary for brain development (Richards 2002), rather than for any meat remaining on the surface (Bogin 1991).
Bioarchaeological studies indicate Paleolithic peoples ate a wider variety of foods than many people eat today (Armelagos et al. 2005; Bogin 1991; Larsen 2014; Marciniak and Perry 2017). Foragers took in more protein, less fat, much more fiber, and far less sodium than modern humans typically do (Eaton, Konner, and Shostak 1988). Changes in tooth and intestinal morphology illustrate that animal products were an important part of human diets from the time of Homo erectus onward (Baltic and Boskovic 2015; Richards 2002; Wrangham 2009). Once cooking became established, it opened up a wider variety of both plant and animal resources to humans. However, the protein, carbohydrates, and fats preagricultural peoples ate were very different from those we eat today. Wild game lacked the antibiotics, growth hormones, and high levels of cholesterol and saturated fat associated with industrialized meat production today (Walker et al. 2005). Wild game was also protein dense, providing only 50% of its energy as fat (Lucock et al. 2014). The ways meat is prepared and eaten today also have implications for disease.
Meats cooked well done over high heat and/or over an open flame, including hamburgers and barbecued meats, are highly carcinogenic due to compounds formed during the cooking process (Trafialek and Kolanowski 2014). Processed meats that have been preserved by smoking, curing, salting, or by adding chemical preservatives such as sodium nitrite (e.g., ham, bacon, pastrami, salami, and beef jerky) have been linked to cancers of the colon, lung, and prostate (Abid, Cross, and Sinha 2014; Figure 16.1). Nitrites/nitrates have additionally been linked to cancers of the ovaries, stomach, esophagus, bladder, pancreas, and thyroid (Abid, Cross, and Sinha 2014). In addition, studies analyzing the diets of 103,000 Americans for up to 16 years indicate that those who ate grilled, broiled, or roasted meats more than 15 times per month were 17% more likely to develop hypertension (high blood pressure) than those who ate meat fewer than four times per month, and participants who preferred their meats well done were 15% more likely to suffer from hypertension than those who ate their meats rare (Liu 2018). A previous study of the same cohort indicated “independent of consumption amount, open-flame and/or high-temperature cooking for both red meat and chicken is associated with an increased risk of type 2 diabetes among adults who consume animal flesh regularly” (Liu et al. 2018). Although meat has been argued to be crucial to cognitive and physical development among hominins (Wrangham 2009), there has been an evolutionary trade-off between the ability to preserve protein through cooking and the health risks of cooked meat and chemical preservatives.
Although carbohydrates represent half of the diet on average for both ancient foragers and modern humans, the types of carbohydrates consumed are very different. Ancient foragers ate fresh fruits, vegetables, grasses, legumes, and tubers, rather than the processed carbohydrates common in industrialized economies (Moss 2013). Their diets also lacked the refined white sugar and corn syrup found in many modern foods that contribute to the development of diabetes (Pontzer et al. 2012).
Physical Activity Patterns
How do we know how active our ancestors were? Hominin morphology and physiology provide us with clues. Adaptations to heat discussed in Chapter 14 evolved in response to the need for physical exertion in the heat of the day in equatorial Africa. Human adaptations for preventing hyperthermia (overheating) suggest an evolutionary history of regular, strenuous physical activity. Research with modern foraging populations also offers clues to ancient activity patterns. Although subject to sampling biases and marginal environments (Marlowe 2005), modern foragers provide the only direct observations of human behavior in the absence of agriculture (Lee 2013). From such studies, we know foragers cover greater distances in single-day foraging bouts than other living primates, and these treks require high levels of cardiovascular endurance (Raichlen and Alexander 2014). Recent research with the Hadza in Tanzania indicates they walk up to 11 kilometers (6.8 miles) daily while hunting and gathering (Pontzer et al. 2012), engaging in moderate-to-vigorous physical activity for over two hours each day, thereby meeting the U.S. government’s weekly requirements for physical activity in just two days (Raichlen et al. 2016; Figure 16.2). That humans were physically active in our evolutionary past is further supported by evidence that regular physical exercise protects against a variety of health conditions found in modern humans, including cardiovascular disease (Raichlen and Alexander 2014) and Alzheimer’s dementia (Mandsager, Harb, and Cremer 2018), even in the presence of brain pathologies indicative of cognitive decline (Buchman et al. 2019).
Population size and density remained low throughout the Paleolithic, by some estimates only 0.4 inhabitants per square kilometer (McClellan and Dorn 2006). This limited morbidity and mortality from infectious diseases, which sometimes require large populations to sustain epidemics. Our earliest ancestors contended with primarily two types of infections (Armelagos 1990). The first were organisms that adapted to our prehominin ancestors and have been problems ever since. Examples include head lice, pinworms, and yaws. The second type was zoonoses, diseases that originate in animals and mutate into a form infectious to humans. One example is the Human Immunodeficiency Virus (HIV), which originated in nonhuman primates and was likely passed to humans through the butchering of hunted primates for food (Sharp and Hahn 2011). Zoonoses that could have infected ancient hunter-gatherers include tetanus and vector-borne diseases transmitted by flies, mosquitoes, fleas, midges, ticks, and the like. Many of these diseases are slow acting, chronic, or latent, meaning they can last for weeks, months, or even decades, causing low levels of sickness and allowing victims to infect others over long periods of time. Survival or cure does not result in lasting immunity, so survivors return to the pool of potential victims. Such diseases often survive in animal reservoirs, reinfecting humans again and again (Wolfe et al. 2012). A study of bloodsucking insects preserved in samples of amber dating from 15 to 100 million years ago indicates that they carried microorganisms that today cause diseases such as river blindness, typhus, Lyme disease, and malaria (Poinar 2018). Such diseases may have been infecting humans throughout our history, and they may have had significant impacts on small foraging communities because they more often infected adults, who provided the food supply (Armelagos et al. 2005).
Given their diets, levels of physical activity, and low population densities, nomadic preagricultural humans were likely in better health than many modern populations. This assertion is supported by comparative research conducted with modern foraging and industrialized populations. Measures of health taken from 20th-century foraging populations demonstrate excellent aerobic capacity, as measured by oxygen uptake during exertion, and low body-fat percentages, with triceps skinfold measurements half those of white Canadians and Americans. Serum cholesterol levels were also low, and markers for diabetes, hypertension, and cardiovascular disease were absent (Eaton, Konner, and Shostak 1988; Raichlen et al. 2016).