While it may be hard to imagine today, for most of our species’ existence we were nomadic: moving through the landscape without a singular home. Instead of a refrigerator or pantry stocked with food, we procured nutrition and other resources as needed based on what was available in the environment. Instead of collecting and displaying shelves of stuff, we kept our possessions small for mobility. This section gives an overview of how the foraging lifestyle enabled the expansion of our species and how the invention of a new way of life caused a chain reaction of cultural change.
The Foraging Tradition
There are a variety of possible subsistence strategies, or methods of finding sustenance and resources. To understand our species is to understand the subsistence strategy of foraging, or the search for resources in the environment. While most (but not all) humans today live in cultures that practice agriculture (whereby we greatly shape the environment to mass produce what we need), we have spent far more time as nomadic foragers than as settled agriculturalists. As such, our traits have evolved to be primarily geared toward foraging. For instance, our efficient bipedalism allows persistence-hunting across long distances as well as movement from resource to resource.
How does human foraging, also known as hunting and gathering, work? Anthropologists have used all four fields to answer this question (see Ember n.d.). Typically, people formed bands, or kin-based groups of about 50 or fewer people (rarely more than 100). A band’s organization would be egalitarian, with a flexible hierarchy based on an individual’s age, level of experience, and relationships with others. Everyone would have a general knowledge of the skills assigned to their gender roles, rather than specializing in different occupations. A band would move from place to place in the environment, using knowledge of the area to forage (Figure 12.21). In varied environments—from savannas to tropical forests, deserts, coasts, and the Arctic Circle—people found the sustenance needed for survival. Our species’ omnivorous diet and cultural abilities led us to excel in the generalist-specialist niche.
Humans made extensive use of the foraging subsistence strategy, but this lifestyle had limitations. The ease of foraging depended on the richness of the environment. Because there was no storage, resources had to be dependably found when needed. While a bountiful environment might require just a few hours of foraging a day and could allow a focus on one location, the amount and duration of labor increased greatly in poor or unreliable environments. Labor was also needed to process the acquired resources, which added to the foragers’ daily workload (Crittenden and Schnorr 2017).
The adaptations to foraging found in modern Homo sapiens may explain why our species became so successful, both within Africa and in its rapid expansion around the world. To overcome the limitations of foraging, each generation at the edge of our species’ range would have found it beneficial to expand a little farther, keeping contact with other bands while moving into unexplored territory where resources were more plentiful. The cumulative effect would have been the spread of modern Homo sapiens across continents and hemispheres.
After hundreds of thousands of years of foraging, some groups of people around 12,000 years ago started to practice agriculture. This transition, called the Neolithic Revolution, occurred at the start of the Holocene epoch. While the reasons for this global change are still being investigated, two likely co-occurring causes are a growing human population and natural global climate change.
Overcrowding could have affected the success of foraging in the environment, leading to the development of a more productive subsistence strategy (Cohen 1977). Foraging works best with low population densities, since each band needs a lot of space to support itself. If too many people occupy the same environment, they deplete the area faster. A high population could exceed the carrying capacity, or the number of people a location can reliably support. Once humans had spread across the major continents by 14,600 years ago, a growing population and shrinking room for expansion would have made carrying capacity an increasingly pressing issue at the global level.
A changing global climate immediately preceded the transition to agriculture, so researchers have also explored a connection between the two events. After the Last Glacial Maximum, around 23,000 years ago, the Earth slowly warmed. Then, from 13,000 to 11,700 years ago, temperatures in most of the Northern Hemisphere dropped suddenly in a phenomenon called the Younger Dryas. Glaciers returned in Europe, Asia, and North America. In Mesopotamia, which includes the Levant, the climate changed from warm and humid to cool and dry. The change would have occurred over decades, disrupting the usual nomadic patterns and subsistence of foragers around the world. This disruption could have been a factor in spurring a transition to agriculture. Gregory K. Dow and colleagues (2009) propose that foraging bands would have clustered in the remaining resource-rich places, where people began directing their labor toward farming the limited area. After the Younger Dryas ended, people expanded out of these clusters with their agricultural knowledge (Figure 12.22).
The double threat of limited room for continental expansion and sudden global climate change may have placed bands in peril as more populations outpaced their environments’ carrying capacities. Not only did a growing population lead to increased competition between bands, but environments worldwide also shifted, creating more uncertainty. As people in different areas around the world faced this unpredictable situation, they became the independent inventors of agriculture.
Agriculture around the World
Due to global changes to the human experience starting around 12,000 years ago, cultures with no knowledge of each other turned toward intensively farming their local resources (see Figure 12.22). The first farmers engaged in artificial selection of their domesticates to enhance useful traits over generations. The switch to agriculture took time and effort, with no guarantee of success and constant challenges (e.g., fires, droughts, diseases, and pests). The regions whose agriculture had the most widespread impact in the face of these obstacles became the primary centers of agriculture (Figure 12.23; Fuller 2010):
- Mesopotamia: The Fertile Crescent from the Tigris and Euphrates rivers through the Levant was where bands started to domesticate plants and animals around 12,000 years ago. The connection between the development of agriculture and the Younger Dryas was especially strong here. Farmed crops included wheat, barley, peas, and lentils. This was also where cattle, pigs, sheep, and goats were domesticated.
- South and East Asia: Multiple regions across this area had domesticated varieties of rice, millet, and soybeans by 10,000 years ago. Pigs were domesticated here independently of Mesopotamia. Chickens also originated in this region, bred first for fighting and only later for food.
- New Guinea: Agriculture started here 10,000 years ago. Bananas, sugarcane, and taro were native to this island. Sweet potatoes were brought back from voyages to South America around C.E. 1000. No known animal farming occurred here.
- Mesoamerica: Agriculture from Central Mexico to northern South America also dates from 10,000 years ago; it, too, was only plant based. Maize, a crop bred from teosinte grass, has become one of the world’s staple foods. Beans, squash, and avocados were also grown in this region.
- The Andes: Starting around 8,000 years ago, local plant domestication began with squash and later included potatoes, tomatoes, beans, and quinoa. Maize was brought down from Mesoamerica. The main farm animals were llamas, alpacas, and guinea pigs.
- Sub-Saharan Africa: This region went through a change 5,000 years ago called the Bantu expansion. The Bantu agriculturalists were established in West Central Africa and then expanded south and east. Native varieties of rice, yams, millet, and sorghum were grown across this area. Cattle were also domesticated here.
- Eastern North America: This region was the last major independent center of agriculture, beginning 4,000 years ago. Squash and sunflowers are the crops from this region best known today, though sumpweed and pitseed goosefoot were also farmed. Hunting remained the main source of animal products.
By 5,000 years ago, our species was well into the Neolithic Revolution. Agriculturalists spread to neighboring parts of the world with their domesticates, further expanding the use of this subsistence strategy. From this point, the human species changed from primarily foragers to primarily agriculturalists with skilled control of their environments. The planet went from being mostly unaffected by human presence to being greatly transformed by it. The revolution took millennia, but it was a true revolution, as our species’ lifestyle was dramatically reshaped.
Cultural Effects of Agriculture
The worldwide adoption of agriculture altered the course of human culture and history forever. The core change in human culture due to agriculture is the move toward not moving: rather than live a nomadic lifestyle, farmers had to remain in one area to tend to their crops and livestock. The term for living bound to a certain location is sedentarism. This led to new aspects of life that were uncommon among foragers: the construction of permanent shelters and agricultural infrastructure, such as fields and irrigation, plus the development of storage technology, such as pottery, to preserve extra resources in case of future instability.
The high productivity of successful agriculture sparked further changes (Smith 2009). Since successful agriculture produced a much greater amount of food and other resources per unit of land than foraging, the population growth rate skyrocketed. The surplus of a bountiful harvest also provided insurance for harder times, reducing the risk of famine. Society changed as well. With a few farming households producing enough food to feed many others, the rest could focus on other tasks. So began specialization into different occupations, such as craftsperson, trader, religious figure, and artist, spurring innovation in these areas as people could now devote time and effort to specific skills. These interdependent people would settle an area together for convenience. The growth of these settlements led to urbanization, the founding of cities that became the foci of human interaction (Figure 12.24).
The formation of cities created new issues that sparked the development of institutions: cultural constructs that exist beyond the individual and exert wide control over a population. Leadership of these cities became hierarchical, with different levels of rank and control. This stratification of society increased social inequality between those with more and less power. Under such leadership, people built impressive monumental architecture, such as pyramids and palaces, that embodied the wealth and power of these early cities. Alliances could unite cities, forming the earliest states. In several regions of the world, state organization expanded into empires, wide-ranging political entities that encompassed a variety of cultures.
Urbanization brought new challenges as well. The concentration of sedentary peoples was ideal for infectious diseases to thrive, since pathogens could jump from person to person and even from livestock to person (Armelagos, Brown, and Turner 2005). While successful agriculture provided a large surplus of food to thwart famine, it offered less diverse food sources than foragers’ diets (Cohen and Armelagos 1984; Cohen and Crane-Kramer 2007). This shift in nutrition caused other ailments to flourish among those who adopted farming, such as dental cavities and malocclusion (the misalignment of teeth caused by soft, agricultural diets). The need to extract “wisdom teeth,” or third molars, seen in agricultural cultures today stems from this mismatch between the environment our ancestors adapted to and our lifestyles today.
As the new disease trends show, the adoption of agriculture and the ensuing cultural changes were not entirely positive. It is also important to note that human culture did not progress in a strictly linear way from simple to complex. Empires have collapsed, and in some cases cities dispersed into low-density bands that rejected institutions. However, a global trend has emerged since the adoption of agriculture: population and social inequality have increased, leading to the massive and influential nation-states of today.
The rise of states in Europe has a direct impact on many of this book’s topics. Science started as a cultural practice of the European upper class and became a standardized way to study the world. Education became an institution providing a standardized path toward producing and gaining knowledge. The scientific study of human diversity, embroiled in the race concept that still haunts us today, was connected to the European slave trade and colonialism.
Also starting in Europe, the Industrial Revolution of the 18th and 19th centuries turned cities into centers of mass manufacturing and spurred the rapid development of inventions (Figure 12.25). In the technologically interconnected world of today, human society has reached a new level of complexity with globalization. In this system, goods are mass-produced and consumed in different parts of the world, weakening reliance on local farms and factories. The imbalanced relationship between consumers and producers of goods further increases economic inequality.
As states based on agriculture and industry keep exerting influence on humanity today, there are people, like the Hadzabe of Tanzania, who continue to live a lifestyle centered on foraging. Due to the overwhelming force that agricultural societies exert, foragers today have been marginalized to live in the least habitable parts of the world—the areas that are not conducive to farming, such as tropical rainforests, deserts, and the Arctic (Headland et al. 1989). Foragers can no longer live in the abundant environments that humans would have enjoyed before the Neolithic Revolution. Interactions with agriculturalists are typically imbalanced, with trade and other exchanges heavily favoring the larger group. One of anthropology’s important roles today is to intelligently and humanely manage equitable interactions between people of different backgrounds and levels of influence.
Special Topic: Indigenous Land Management
Insight into the lives of past modern humans has evolved as researchers revise previous theories and establish new connections with Indigenous knowledge holders.
The outdated view of foraging held that people lived off the land without leaving an impact on the environment. Accompanying this idea was anthropologist Marshall Sahlins’s (1968) proposal that foragers were the “original affluent society,” meeting basic needs and achieving satisfaction with fewer work hours than agriculturalists and city dwellers. This view countered an earlier idea that foragers were always on the brink of starvation. Sahlins’s theory took hold in the public eye as an attractive counterpoint to our busy contemporary lives, in which we strive to meet our endless wants.
A fruitful type of study, involving researchers collaborating with Indigenous experts, has found that foragers neither lived off the land with minimal effort nor barely survived in unchanging environments. Instead, they shaped the landscape to their needs using labor and strategies that were more subtle than what European colonizers and subsequent researchers were used to seeing. Research from two regions shows the latest developments in understanding Indigenous land management.
In British Columbia, Canada, the bridging of scientific and Indigenous perspectives has shown that the forests of the region are not untouched wilderness but, rather, were shaped by Indigenous peoples over thousands of years. Forest gardens adjacent to archaeological sites show higher plant diversity than unmanaged places even after 150 years (Armstrong et al. 2021). On the coast, 3,500-year-old archaeological sites are evidence of constructed clam gardens, according to Indigenous experts (Lepofsky et al. 2015). Another project, in consultation with Elders of the T’exelc (Williams Lake First Nation) in British Columbia, introduced researchers to explanations of how forests were managed before the practice was disrupted by European colonialism (Copes-Gerbitz et al. 2021). Careful use of controlled fires reduced the density of the forest to favor plants such as raspberries and allow easier movement through the landscape.
Similarly, the study of landscapes in Australia, in consultation with Aboriginal Australians today, shows that areas previously considered wilderness by scientists were actually the result of careful management of fauna and fire. Grasslands with adjacent forests were purposely constructed to attract kangaroos for hunting (Gammage 2008). People also managed other animal and insect life, from emus to caterpillars. In Tasmania, a shift from productive grassland to wildfire-prone rainforest occurred after Aboriginal Australian land management was replaced by British colonial rule (Fletcher, Hall, and Alexander 2021). The site of Budj Bim of the Gunditjmara people has archaeological features of aquaculture, or the farming of fish, dating back 6,600 years (McNiven et al. 2012; McNiven et al. 2015). These examples show that Indigenous knowledge of how to manage the environment may be invaluable at the state level, such as through Aboriginal ranger programs that guide modern land management.