The Anthropocene Debate – Part I
At the heart of modern environmentalism is the idea that the planet must be saved from further damage by humanity. But it is far from clear that this is possible or even that the transformation of nature by human beings with technology is necessarily a bad thing.
Oh anthropocentrism, how is it I never tire of your flaccid arguments and blind hypocrisy in search of species domination. You’re like a middle school bully playing king of the hill who has suddenly decided that even this wasn’t enough; to reach new levels of arrogance and domination he must now also take credit for creating the hill. Then again, perhaps you’re more like a sloppy drunk in some back corner karmic bar sipping your own lukewarm swill all the while telling old war stories about how you “conquered” the monster of nature, who was red in tooth and claw, and now you are basking in the glory that has come to you like some modern day Gilgamesh after you smote the ruins of this wicked beast. Wildness, you smugly declare, has been “improved.” Now we can all live safer, more aseptic lives, secure in our belief that we have become like unto God, and that our dominion reaches even to the ends of space and time.
The latest piece in this ideological goose step comes from Michael Lind in a post at Salon titled Is it time to embrace environmental change? This post is yet another PR piece by those pushing the Anthropocene hoax as the new mantra for a greenwashed neoliberal environmentalism, one which traces its ideological roots to writings like The Skeptical Environmentalist (1998) and The Death of Environmentalism (2004). Lind writes:
In 2000 Eugene Stoermer, an ecologist, and Paul Crutzen, a Nobel Prize-winning atmospheric chemist, proposed that human agency has so transformed the Earth’s ecosystem that we are living in a new geological epoch: the Anthropocene, or the Human Age. Proponents of the Anthropocene concept disagree about when the era began. Was it with the industrial revolution, which began to release great quantities of pollutants and gases including greenhouse gases into the atmosphere? Was it earlier, when agrarian societies cleared vast tracts of wilderness for farms and pasture? Or was it earlier still, at the end of the last Ice Age, when, according to the “Pleistocene overkill” hypothesis, hunter-gatherers drove large animals including mammoths, mastodons and giant ground sloths to extinction in both the Old World and the New?
I stumbled on this Salon post after reading a short hype piece at the Breakthrough Institute for their own book – Love Your Monsters, an edited anthology trumpeting the wonders of anthropocentrism and technology from the liberal greenwashing duo Shellenberger and Nordhaus, whom I’ve written about here before.
Apparently never satisfied with their efforts to dupe a well-meaning but ill-informed public, this ideological group has now begun pushing the attack on nature with renewed vim and vigor under the banner of the Anthropocene–the human era. The basic claim behind this theory, as originally developed by Paul Crutzen in a piece called The New World of the Anthropocene, goes something like this: humans have advanced to the point where we can shape ecological and geological processes to such an extent that modern environmental changes and phenomena are more man-made than natural, thus signaling the beginning of a new geologic era defined by man–hence the -cene of Anthropos. The problem with this claim, among other things, is its flimsy scientific basis and thick ideological veneer.
When we look at the available evidence, this so-called theory holds about as much water as a wet duck. Take one of the often-cited examples from this camp–the “Pleistocene overkill” hypothesis. While impossible to totally debunk–one of the true powers of speculative counterfactual historical claims–the evidence for it is sketchy, and it is discounted by many serious folks who have looked into the matter without an a priori belief that humans must have caused those past extinctions. The Anthropocene crowd needs that belief, of course, because they need a historical analog for the modern extinctions–a ‘golly gee, we found a historical parallel for human intervention on a massive scale, which obviously supports our claim that modern intervention is natural’ argument. Voilà! It’s a great magic trick that uses a lot of science and has just enough plausibility to seem convincing.
But as one review of the latest studies on this theory notes, the evidence to support it is scant, and as a single causal variable (i.e., overhunting killed off all the megafauna), it has more holes than the Titanic.
Intuitively, the relative rarity of remains of extinct species in many archaeological sites could simply reflect a situation in which remains of these heavy animals were seldom brought back to any human site. Yet, the extinct large mammals were equally rare in paleontological sites at similar latitudes. This latter observation indicates that the rarity of extinct species within human ranges (and not just in archaeological sites) was a real ecological phenomenon, and strongly argues against the notion that humans were responsible for their demise. Our data suggest that if a direct avoidance of humans by these species via gathering in northern areas was possible, it was not due to human overexploitation in the south.
For a more favorable argument in support of part of the “Pleistocene overkill” hypothesis, check out the study by Barnosky et al., Assessing the Causes of Late Pleistocene Extinctions on the Continents, which was published in Science in 2004. In this piece they argue that:
Evidence from paleontology, climatology, archaeology, and ecology now supports the idea that humans contributed to extinction on some continents, but human hunting was not solely responsible for the pattern of extinction everywhere. Instead, evidence suggests that the intersection of human impacts with pronounced climatic change drove the precise timing and geography of extinction in the Northern Hemisphere. The story from the Southern Hemisphere is still unfolding.
Regardless of the overall merits of this argument, one finding from this study caught my attention, both for its conservation biology implications and for its connection to some of my other ecology research–this point about food webs and system sustainability:
A significant implication for conservation biology is that the coupling of marked climatic change with direct human impacts on fauna is especially pernicious. Both effects are under way today at unprecedented rates. Data generated in the Pleistocene extinctions debate are now robust enough to support earlier contentions that the modern global ecosystem is unique in having vast populations of one species (humans) and a depauperate [i.e., less diverse] array of megafauna. The net effect, through loss of many herbivores, carnivores, and scavengers, has been simplification and loss of redundancy in food webs. This has implications for the stability of global ecosystems.
But setting aside this complex and ongoing debate on the Pleistocene, there’s another problem with the Anthropocene argument, which has to do with dating the Anthropocene to the Industrial Revolution. This time period (roughly 1750-1850) is when humans started burning fossil fuels in earnest, and on most charts it looks like that’s where much of the current CO2 rise begins. The problem with this claim, like the one above, is that the science here is sketchy as well. In fact, one important study from 2003 suggests that this timeline is so far off it’s almost laughable–greenhouse gas concentrations, as measured in CO2 and CH4 (carbon dioxide and methane) levels, have been rising since somewhere between 6000 and 8000 years before present. And guess what the cause is–you got it, humans–or more specifically, deforestation for agriculture.
The study, titled The Anthropogenic Greenhouse Era Began Thousands of Years Ago, published by William Ruddiman in Climatic Change (vol. 61, pp. 261-293), argues that with the introduction of rice farming in Asia and India, and of agriculture in the Middle East and North Africa, as well as parts of Europe and the Americas, massive deforestation occurred, driving significant increases in CO2 and CH4. Later, plague outbreaks killed off enough people that abandoned farmland reverted to forest, re-sequestering some amount of those formerly released gases. Here’s his conclusion in the paper:
In contrast, a self-consistent story emerges if carbon sequestration on farms is assumed to be the first step in the causal chain [of CO2 level changes]. If the 10-ppm CO2 decreases are caused by plague-induced reforestation events, they would cool northern hemisphere temperatures by ∼0.17 ◦C, assuming a 2 × CO2 sensitivity of 2.5 ◦C. A cooling of this amplitude fits well within the constraints of the temperature record reconstructed by Mann et al. (1999). In short, if the CO2 records from Taylor and Law Domes are a more accurate measure of past CO2 changes than other icecore records from Antarctica, then plague-induced reforestation events are strongly implicated in the amplitude and timing of the 10-ppm drops in CO2. Moreover, if plague caused most of the 10-ppm CO2 drops…it must also have been a major factor in the climatic cooling that led from the relative warmth of 1000 years ago to the cooler temperatures of the Little Ice Age. A tentative assessment based on the relative radiative forcings…is that CO2 changes were on average comparable in importance to solar and volcanic forcing in this cooling. Solar and volcanic forcing appear to have been dominant at times such as the cooler decades near 1450 and 1825 AD. Plague-driven CO2 decreases were probably most important just after 1350 AD and between 1500 and 1750 AD. A more complete assessment of the role of plague driven CO2 changes in climate change during the last millennium would require a narrowing of uncertainties in both the spatial and temporal occurrence of plague and in the amount of farm abandonment (and reforestation), as well as a resolution of the inconsistencies among the CO2 trends from different Antarctic ice cores.
Finally, Lamb (1977) has argued that cooler Little Ice Age climates caused famine and depopulation, as well as increased incidence of disease. This study comes to nearly the opposite conclusion: plague outbreaks caused major population reductions and at the same time contributed significantly to cooler climates.
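As a sanity check on that ∼0.17 ◦C figure, Ruddiman’s back-of-envelope cooling estimate can be approximated with the standard logarithmic CO2-forcing rule in a few lines of Python. Note the caveats: the 280-ppm pre-industrial baseline below is my assumption, not a number from the paper, and the bare formula gives roughly 0.13 ◦C of cooling for a 10-ppm drop–the same ballpark as his 0.17 ◦C, which presumably reflects a slightly different baseline or feedback accounting.

```python
import math

def co2_temp_change(c0_ppm, c1_ppm, sensitivity_per_doubling=2.5):
    """Equilibrium temperature change (deg C) for a CO2 shift from c0 to c1,
    using the standard logarithmic approximation:
        dT = S * ln(C1/C0) / ln(2)
    where S is the warming per doubling of CO2 (Ruddiman's assumed 2.5)."""
    return sensitivity_per_doubling * math.log(c1_ppm / c0_ppm) / math.log(2)

# A plague-induced 10-ppm drop from an assumed ~280 ppm baseline:
dT = co2_temp_change(280, 270)
print(f"{dT:.2f} deg C")  # roughly -0.13 deg C of cooling
```

Whatever the exact value, the point survives: a 10-ppm wiggle driven by reforestation produces a cooling signal big enough to matter in the Little Ice Age record.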
The whole paper is worth reading, as it is quite fascinating, and it calls into question the basic assumption that human-driven climate change started only in the last 150-300 years, as is the common line in climate debates. If humans have been changing the climate for the last 5000-8000 years, dating this hypothesized Anthropocene to somewhere in the past 250 years not only makes little sense, it would also be empirically false.
But to really understand this debate, we need to know what we are supposedly looking for in terms of visible changes that the Anthropocene would be based on, and here is where it gets even more complicated. Currently, there are several benchmarks in the geologic community to consider when evaluating such a theory:
- Changes to Physical Sedimentation
- Carbon Cycle Perturbation and Temperature
- Biotic Change
- Ocean Changes
Each of these factors tells us something about how the earth is, or is not, changing. Taken together, they can tell us a lot about how the earth looked in the past and how it might look in the future, helping us piece together the climatic puzzle. Briefly, the ways we can get a handle on the four items above are as follows:
- Sedimentation: what are known as “distinct lithostratigraphic signals,” i.e., major changes to sediment layers, primarily through erosion and construction (both physical building and the damming of natural waterways). Think of it like this: when you take a cross-section of rock, you are looking at sediment layers laid down in the past.
- Carbon and temperature: this is a fancy way of saying changes in carbon levels, which lead to changes in temperature. According to IPCC (UN climate panel) estimates, we will see a temperature rise of between 1.1 °C and 6.4 °C by 2099. Carbon-cycle perturbations of comparable scale in the geologic record line up with mass extinctions (goodbye, dinosaurs) and major climate swings like the onset of the most recent Ice Age. Yeehah!
- Biotic change: as noted above, one good way to tell things are changing is when all the living stuff dies or changes radically. In particular, mass extinctions, major species migrations, and loss of biodiversity are all key markers of major biological change that would be visible given a long enough historical view.
- Ocean changes: the ocean warms and cools, sometimes releases wild stuff like frozen layers of methane hydrate, and even freezes over at times. And as temperatures and atmospheric CO2 rise or fall, ocean acidity and sea levels change as well.
Luckily, given the complexity of the topic, we don’t have to do all of this work ourselves. Back in 2008, members of the Stratigraphy Commission of the Geological Society of London wrote a paper called Are we now living in the Anthropocene?, in which they set out to evaluate the Anthropocene hypothesis as proposed by Crutzen. So what did they conclude?
The preceding discussion makes clear that we have entered a distinctive phase of Earth’s evolution that satisfies geologists’ criteria for its recognition as a distinctive stratigraphic unit, to which the name Anthropocene has already been informally given.
We consider it most reasonable for this new unit to be considered at epoch level. It is true that the long-term consequences of anthropogenic change might be of sufficient magnitude to precipitate the return of “Tertiary” levels of ice volume, sea level, and global temperature that may then persist over several eccentricity (100 k.y.) cycles…This, especially in combination with a major extinction event, would effectively bring the Quaternary period to an end. However, given the large uncertainties in the future trajectory of climate and biodiversity, and the large and currently unpredictable action of feedbacks in the earth system, we prefer to remain conservative. Thus, while there is strong evidence to suggest that we are no longer living in the Holocene (as regards the processes affecting the production and character of contemporary strata), it is too early to state whether or not the Quaternary has come to an end.
Sufficient evidence has emerged of stratigraphically significant change (both elapsed and imminent) for recognition of the Anthropocene—currently a vivid yet informal metaphor of global environmental change—as a new geological epoch to be considered for formalization by international discussion. The base of the Anthropocene may be defined by a GSSP in sediments or ice cores or simply by a numerical date.
So the short answer is no, there is not enough scientific consensus to support the Anthropocene claim that we have taken over control of the planet, but the long answer is yes, there is enough evidence to suggest that we sure are doing a bang-up job in our clumsy effort to play God–so good in fact that we can be assured of a fossil legacy of mass devastation under our watch–an Ecological Dark Age if you will. But then again, perhaps that would be a fitting “golden spike” for the Anthropocene after all, marking both the source of our rise to power and the cause of our ultimate industrial collapse. I bet the dinosaurs are really laughing at us about now…
As a final note, I ran across a Resilient Earth post by Doug Hoffman called Welcome to A Brave New Epoch, which ends with a point similar to mine on this debate that I thought was worth sharing:
I have a suggestion for Crutzen, Zalaziewicz, McCarthy and all the rest of the Anthropocene fan club—leave the identification of the current geological time period, its timing and naming, to future generations of geologists. The Anthropocene, if it is ever recognized, will simply be the ruins of human hubris writ in stone. Given that science is hard pressed to explain the vagaries of climate and the environment during the present, there is little hope of correctly identifying what permanent impact humans will have on Earth, if any.
Be safe, enjoy the interglacial and stay skeptical.
Stay tuned for my follow up post, The Anthropocene Debate – Part II, where I’ll delve into more of the environmental politics behind the Anthropocene crowd and try to understand just what makes them tick.
Until next time…Go Tertiary!