Humans’ ecological footprint has been increasing while the Earth has remained the same size. Especially in the last three centuries, the impact of human populations on surrounding landscapes and resources has grown enormously. In the United States, the footprint’s swelling can be explained in large part by the shift from subsistence to profit-minded production. The colonists who brought European ideas and techniques to America instigated this shift, which began in the late seventeenth century and has arguably continued to the present. The abundance of resources in early America, and the ease with which they could be exploited, facilitated this change toward a profiteering mindset. It is with this observation in mind that I suggest that the fertile nature of early America contained the seeds of today’s profit-oriented attitude, and with it an ever-growing ecological footprint.
Men such as Gifford Pinchot and John Muir recognized the dangers of this attitude of excess and, hoping to prevent the exploitation of American forests and mountains, attempted to incorporate the ideas of conservation and preservation into the nation’s framework by fighting for national parks. Many of their own arguments, however, were subverted by the same profit-minded attitude they sought to restrain. Pinchot’s idea of conservation, heavily influenced by his background as a forester, was very much focused on the development and use of resources by the present generation: “Conservation does mean provision for the future, but it means also and first of all the recognition of the right of the present generation to the fullest necessary use of all the resources with which this country is so abundantly blessed” (emphasis mine). Although Pinchot did call for prevention of waste, egalitarian benefits, and “foresight, prudence, thrift, and intelligence” in the use of resources, he partly rejected what we might now call “sustainability,” taking a less modest view of the resources left for future generations by allowing qualified exploitation of the land. The more modern goal of ecological footprint analysis, on the other hand, is to achieve sustainability by preventing human consumption from outgrowing “biocapacity”—i.e., the capacity of the Earth’s renewable natural resources to yield goods and services such as biological materials and waste absorption.
John Muir, too, exhibited the influence of our profit-minded roots. Although he celebrated areas of nature that he advertised as “untainted” by the human footprint, he still hoped people would visit them by the hundreds: “All the Western mountains are still rich in wildness, and by means of good roads are being brought nearer civilization every year.” Although he worried that the discovery of gold might bring about the destruction of the Klondike mountains with holes “burned and dug into the hard ground here and there, and into the quartz-ribbed mountains and hills … and mills and locomotives will make rumbling, screeching, disenchanting noises,” he then reasoned that “the roads of the pioneer miners will lead many a lover of wildness into the heart of the reserve, who without them would never see it.” Muir did not seem to care about the carbon or iron footprint of the railways leading to Californian mountain ranges, or the effect that thousands of visitors a year would have on local resources; he wanted people to seek salvation in the mountains and valleys, and thereby to build support for the preservation of these places through national parks. This self-contradiction exemplifies Muir’s generally ambivalent attitude toward the relationship between wildness and humans, in which he ignores the consequences of the footprint these visitors might leave on the local ecology.
The environmental impact of those paths leading to the West cannot be passed over, however. Mere decades before Muir published his work on national parks, the bison population of the Great Plains had been reduced to fewer than two thousand where there had once been over thirty million, and wherever railroads unfurled, bison disappeared. Native Americans had hunted bison intensively since the late seventeenth century, but they were spiritually (and perhaps “survivalistically”) bound to using every part of each huge bovid, and killed them for subsistence rather than profit, just as other tribes had done with beavers and passenger pigeons in pre-Columbian times. Upon the arrival of colonists, all these animals suffered (in the famous case of the pigeon, terminally) from the drastic alteration in the way humans pursued them. The colonists (and later, Americans) did not single-handedly diminish the populations of such fauna, however. Their “Western” (European), profit-seeking mentality, engendered by the abundance of the land, changed Native Americans’ traditional hunting practices by adding a price to bison hides, beaver furs, and pigeons. Commodifying animal parts was easy when plentiful resources offered such quick opportunities for money, which in turn could buy land, tools, and food more easily than hides or feathers could.
The drive for profit also encouraged slavery in the South and deforestation in New England. Immense plantations of monocrops, especially ones like tobacco, had detrimental consequences for the soil; even after emancipation, sharecropping led to the same effects. Deforestation occurred at such a high rate in the early years of United States history that Alexis de Tocqueville and his friend Gustave Auguste de la Bonninière de Beaumont exclaimed that there was a “general feeling of hatred against trees,” as colonists chopped and burned forests for reasons as varied as potash production, mast exportation, livestock herding, and grain agriculture. In many paintings of the period, the colonists seem literally to be stamping their print into the forest, cutting down trees as if combating the darkness itself and planting their cottages amid countless stumps.
Indeed, as the Southern plantations and New England cattle pastures showed, agriculture and livestock, far from “improving” the land, quite often destroyed it. Microclimate change, soil erosion, infertility, and other problems plagued fields all around the country. Although the difficulties were not ubiquitous, areas such as the Midwest faced calamitous cases of soil loss and drought, especially in the famous Dust Bowl of the 1930s. Arthur Rothstein depicts such manifestations of catastrophe in his photographs “Soil erosion, Alabama, 1937” and “Farm in Dust Bowl, Oklahoma, 1936” (The Depression Years, 16-17). In “Alabama,” a young man is leaning dejectedly against a porch, his cheeks slightly sunken, his overalls tattered, his right foot shoeless. Failing potted plants fill a yard faultily fenced with useless dry posts that barely reach waist-height. The land in the background is violently fissured: deep, undulating tracks rip like a god’s angry gouges (or salty tears sowing infertility) down the dirt of a hill that is crowned with the leafless skeleton of what was once a tree, now a lightning rod for the dereliction that has overtaken the land. Even worse desertification can be seen in “Oklahoma,” where a group of cattle huddle together in a small outcrop of wispy dead trees, surrounded on all sides by fine, dusty sand. A windmill is savagely spun by a wind the same color as the dust; there is little color or even textural difference between the sky and the ground. Rothstein chose to make these two photographs in black-and-white, perhaps because the natural hues were already so desaturated that the grimness of the situation could only be further highlighted. 
I cannot argue that American greed was the cause of the Dust Bowl, but it seems likely that the unsustainable agricultural practices brought across the Atlantic—the monocropping, intensive tilling, overgrazing, and deforestation—all contributed to the resource problems that American farmers faced at one point or another.
George Perkins Marsh argued in his 1864 book “Man and Nature” that ancient civilizations collapsed due to resource overexploitation. Current scholars such as Jared Diamond have echoed this sentiment, but I think that the most succinct sentence arguing against the Promethean-Cornucopian philosophy (i.e., the idea that we ought to approve of a free market unleashing human ingenuity on the world’s “unlimited” resources) was formulated by John Lorain: “Man is the most destructive force in the universe when he considers his resources infinite [i.e., abundance generates wastefulness].” When discussing the ecological footprint of the United States, one of the most important things to realize is that the biocapacity deficit has been running for the past couple of centuries and has grown significantly during that time mostly because we have considered our resources all too plentiful. To those who would advocate technological innovation as a solution to any potential resource shortages (Promethean-Cornucopians), I would point out that the Second Industrial Revolution’s shifts in energy from wood to coal to oil (and perhaps to nuclear power) have all extended our efficiency, but have also contributed to our grossly swollen ecological (and specifically carbon) footprint as a nation.
Passenger pigeons, white pines, codfish, and fertile soils were all considered close to infinitely abundant during early American history, but the eagerness to fill purses from this fantastic cornucopia (i.e., the profit-seeking mentality I have described) was not matched by a readiness to conserve for the future. In the end, part of Pinchot’s job was not only to plant and nurture trees, but also to harvest them. His emphasis on the present generation’s right to use resources “to the fullest” shows the surprising (but not incredible) resilience and popularity of the idea that America is forever “abundantly blessed.”