Wednesday, October 19, 2016

The Future Hiding in Plain Sight

Carl Jung used to argue that meaningful coincidences—in his jargon, synchronicity—were as important as cause and effect in shaping the details of human life. Whether that’s true in the broadest sense, I’ll certainly vouch for the fact that they’re a constant presence in The Archdruid Report. Time and again, just as I sit down to write a post on some theme, somebody sends me a bit of data that casts unexpected light on that very theme.

Last week was a case in point. Regular readers will recall that the theme of last week’s post was the way that pop-culture depictions of deep time implicitly erase the future by presenting accounts of Earth’s long history that begin billions of years ago and end right now. I was brooding over that theme a little more than a week ago, chasing down details of the prehistoric past and the posthistoric future, when one of my readers forwarded me a copy of the latest Joint Operating Environment report by the Pentagon—JOE-35, to use the standard jargon—which attempts to predict the shape of the international environment in which US military operations will take place in 2035, and mostly succeeds in providing a world-class example of the same blindness to the future I discussed in my post.

The report can be downloaded in PDF form here and is worth reading in full. It covers quite a bit of ground, and a thorough response to it would be more the size of a short book than a weekly blog post. The point I want to discuss this week is its identification of six primary “contexts for conflict” that will shape the military environment of the 2030s:

“1. Violent Ideological Competition. Irreconcilable ideas communicated and promoted by identity networks through violence.” That is, states and non-state actors alike will pursue their goals by spreading ideologies hostile to US interests and encouraging violent acts to promote those ideologies.

“2. Threatened U.S. Territory and Sovereignty. Encroachment, erosion, or disregard of U.S. sovereignty and the freedom of its citizens from coercion.” That is, states and non-state actors will attempt to carry out violent acts against US citizens and territory.

“3. Antagonistic Geopolitical Balancing. Increasingly ambitious adversaries maximizing their own influence while actively limiting U.S. influence.” That is, rival powers will pursue their own interests in conflict with those of the United States.

“4. Disrupted Global Commons. Denial or compulsion in spaces and places available to all but owned by none.” That is, the US will no longer be able to count on unimpeded access to the oceans, the air, space, and the electromagnetic spectrum in the pursuit of its interests.

“5. A Contest for Cyberspace. A struggle to define and credibly protect sovereignty in cyberspace.” That is, US cyberwarfare measures will increasingly face effective defenses and US cyberspace assets will increasingly face effective hostile incursions.

“6. Shattered and Reordered Regions. States unable to cope with internal political fractures, environmental stressors, or deliberate external interference.” That is, states will continue to be overwhelmed by the increasingly harsh pressures on national survival in today’s world, and the resulting failed states and stateless zones will spawn insurgencies and non-state actors hostile to the US.

Apparently nobody at the Pentagon noticed one distinctly odd thing about this outline of the future context of American military operations: it’s not an outline of the future at all. It’s an outline of the present. Every one of these trends is a major factor shaping political and military action around the world right now. JOE-35 therefore assumes, first, that each of these trends will remain locked in place without significant change for the next twenty years, and second, that no new trends of comparable importance will emerge to reshape the strategic landscape between now and 2035. History suggests that both of these are very, very risky assumptions for a great power to make.

It so happens that I have a fair number of readers who serve in the US armed forces just now, and a somewhat larger number who serve in the armed forces of other countries more or less allied with the United States. (I may have readers serving with the armed forces of Russia or China as well, but they haven’t announced themselves—and I suspect, for what it’s worth, that they’re already well acquainted with the points I intend to make.) With those readers in mind, I’d like to suggest a revision to JOE-35, which will take into account the fact that history can’t be expected to stop in its tracks for the next twenty years, just because we want it to. Once that’s included in the analysis, at least five contexts of conflict not mentioned by JOE-35 stand out from the background:

1. A crisis of legitimacy in the United States. Half a century ago, most Americans assumed as a matter of course that the United States had the world’s best, fairest, and most democratic system of government; only a small minority questioned the basic legitimacy of the institutions of government or believed they would be better off under a different system. Since the late 1970s, however, federal policies that subsidized automation and the offshoring of industrial jobs, and tacitly permitted mass illegal immigration to force down wages, have plunged the once-proud American working class into impoverishment and immiseration. While the wealthiest 20% or so of Americans have prospered since then, the other 80% of the population has experienced ongoing declines in standards of living.

The political impact of these policies has been amplified by a culture of contempt toward working class Americans on the part of the affluent minority, and an insistence that any attempt to discuss economic and social impacts of automation, offshoring of jobs, and mass illegal immigration must be dismissed out of hand as mere Luddism, racism, and xenophobia. As a direct consequence, a great many working class Americans—in 1965, by and large, the sector of the public most loyal to American institutions—have lost faith in the US system of government. This shift in values has massive military as well as political implications, since working class Americans are much more likely than others to own guns, to have served in the military, and to see political violence as a potential option.

Thus a domestic insurgency in the United States is a real possibility at this point. Since, as already noted, working class Americans are disproportionately likely to serve in the military, planning for a domestic insurgency in the United States will have to face the possibility that such an insurgency will include veterans familiar with current counterinsurgency doctrine. It will also have to cope with the risk that National Guard and regular armed forces personnel sent to suppress such an insurgency will go over to the insurgent side, transforming the insurgency into a civil war.

As some wag has pointed out, the US military is very good at fighting insurgencies but not so good at defeating them, and the fate of Eastern Bloc nations after the fall of the Soviet Union shows just how fast a government can unravel once its military personnel turn against it. Furthermore, since the crisis of legitimacy is driven by policies backed by a bipartisan consensus, military planners can only deal with the symptoms of a challenge whose causes are beyond their control.

2. The marginalization of the United States in the global arena. Twenty years ago the United States was the world’s sole superpower, having triumphed over the Soviet Union, established a rapprochement with China, and marginalized such hostile Islamic powers as Iran. Those advantages did not survive two decades of overbearing and unreliable US policy, which not only failed to cement the gains of previous decades but succeeded in driving Russia and China, despite their divergent interests and long history of conflict, into an alliance against the United States. Future scholars will likely consider this to be the worst foreign policy misstep in our nation’s history.

Iran’s alignment with the Sino-Russian alliance and, more recently, overtures from the Philippines and Egypt, track the continuation of this trend, as do the establishment of Chinese naval bases across the Indian Ocean from Myanmar to the Horn of Africa, and most recently, Russian moves to reestablish overseas bases in Syria, Egypt, Vietnam, and Cuba. Russia and China are able to approach foreign alliances on the basis of a rational calculus of mutual interest, rather than the dogmatic insistence on national exceptionalism that guides so much of US foreign policy today. This allows them to offer other nations, including putative US allies, better deals than the US is willing to concede.

As a direct result, barring a radical change in its foreign policy, the United States in 2035 will be marginalized by a new global order centered on Beijing and Moscow, denied access to markets and resources by trade agreements hostile to its interests, and will have to struggle to maintain influence even over its “near abroad.” It is unwise to assume, as some current strategists do, that China’s current economic problems will slow that process. Some European leaders in the 1930s, Adolf Hitler among them, assumed that the comparable boom-bust cycle the United States experienced in the 1920s and 1930s meant that the US would be a negligible factor in the European balance of power in the 1940s. I think we all know how that turned out.

Here again, barring a drastic change in US foreign policy, military planners will be forced to deal with the consequences of unwelcome shifts without being able to affect the causes of those shifts. Careful planning can, however, redirect resources away from global commitments that will not survive the process of marginalization, and toward securing the “near abroad” of the United States and withdrawing assets to the continental US to keep them from being compromised by former allies.

3. The rise of “monkeywrenching” warfare. The United States has the most technologically complex military in the history of war. While this is normally considered an advantage, it brings with it no shortage of liabilities. The most important of these is the vulnerability of complex technological systems to “monkeywrenching”—that is, strategies and tactics targeting technological weak points in order to degrade the capacities of a technologically superior force.  The more complex a technology is, as a rule, the wider the range of monkeywrenching attacks that can interfere with it; the more integrated a technology is with other technologies, the more drastic the potential impacts of such attacks. The complexity and integration of US military technology make it a monkeywrencher’s dream target, and current plans for increased complexity and integration will only heighten the risks.

The risks created by the emergence of monkeywrenching warfare are heightened by an attitude that has deep roots in the culture of US military procurement:  the unquestioned assumption that innovation is always improvement. This assumption has played a central role in producing weapons systems such as the F-35 Joint Strike Fighter, which is so heavily burdened with assorted innovations that it has a much shorter effective range, a much smaller payload, and much higher maintenance costs than competing Russian and Chinese fighters. In effect, the designers of the F-35 were so busy making it innovative that they forgot to make it work. The same thing can be said about many other highly innovative but dubiously effective US military technologies.

Problems caused by excessive innovation can to some extent be anticipated and countered by US military planners. What makes monkeywrenching attacks by hostile states and non-state actors so serious a threat is that it may not be possible to predict them in advance. While US intelligence assets should certainly make every effort to identify monkeywrenching technologies and tactics before they are used, US forces must be aware that at any moment, critical technologies may be put out of operation or turned to the enemy’s advantage without warning. Rigorous training in responding to technological failure, and redundant systems that can operate independently of existing networks, may provide some protection against monkeywrenching, but the risk remains grave.

4. The genesis of warband culture in failed states. While JOE-35 rightly identifies the collapse of weak states into failed-state conditions as a significant military threat, a lack of attention to the lessons of history leads its authors to neglect the most serious risk posed by the collapse of states in a time of general economic retrenchment and cultural crisis. That risk is the emergence of warband culture—a set of cultural norms that dominate the terminal periods of most recorded civilizations and the dark ages that follow them, and play a central role in the historical transformation to dark age conditions.

Historians use the term “warband” to describe a force of young men whose only trade is violence, gathered around a charismatic leader and supporting itself by pillage. While warbands tend to come into being whenever public order collapses or has not yet been imposed, the rise of a self-sustaining warband culture requires a prolonged period of disorder in which governments either do not exist or cannot establish their legitimacy in the eyes of the governed, and warbands become accepted as the de facto governments of territories of various size. Once this happens, the warbands inevitably begin to move outward; the ethos and the economics of the warband alike require access to plunder, and this can best be obtained by invading regions not yet reduced to failed-state conditions, thus spreading the state of affairs that fosters warband culture in the first place.

Most civilizations have had to contend with warbands in their last years, and the record of attempts to quell them by military force is not good. At best, a given massing of warbands can be defeated and driven back into whatever stateless area provides them with their home base; a decade or two later, they can be counted on to return in force. Systematic attempts to depopulate their home base simply drive them into other areas, causing the collapse of public order there. Once warband culture establishes itself solidly on the fringes of a civilization, history suggests, the entire civilized area will eventually be reduced to failed-state conditions by warband incursions, leading to a dark age. Nothing guarantees that the modern industrial world is immune from this same process.

The spread of failed states around the periphery of the industrial world is thus an existential threat not only to the United States but to the entire project of modern civilization. What makes this a critical issue is that US foreign policy and military actions have repeatedly created failed states in which warband culture can flourish: Afghanistan, Iraq, Syria, Libya, and Ukraine are only the most visible examples. Elements of US policy toward Mexico—for example, the “Fast and Furious” gunrunning scheme—show worrisome movement in the same direction. Unless these policies are reversed, the world of 2035 may face conditions like those that have ended civilization more than once in the past.

5. The end of the Holocene environmental optimum. All things considered, the period since the final melting of the great ice sheets some six millennia ago has been extremely propitious for the project of human civilization. Compared to previous epochs, the global climate has been relatively stable, and sea levels have changed only slowly. Furthermore, the globe six thousand years ago was stocked with an impressive array of natural resources, and the capacity of its natural systems to absorb sudden shocks had not been challenged on a global level for some sixty-five million years.

None of those conditions remains the case today. Ongoing dumping of greenhouse gases into the atmosphere is rapidly destabilizing the global climate, and triggering ice sheet melting in Greenland and Antarctica that promises to send sea levels up sharply in the decades and centuries ahead. Many other modes of pollution are disrupting natural systems in a galaxy of ways, triggering dramatic environmental changes. Meanwhile breakneck extraction is rapidly depleting the accessible stocks of hundreds of different nonrenewable resources, each of them essential to some aspect of contemporary industrial society, and the capacity of natural systems to cope with the cascading burdens placed upon them by human action has already reached the breaking point in many areas.

The end of the Holocene environmental optimum—the era of relative ecological stability in which human civilization has flourished—is likely to be a prolonged process. By 2035, however, current estimates suggest that the initial round of impacts will be well under way: shifting climate belts causing agricultural failure, rising sea levels imposing drastic economic burdens on coastal communities and the nations to which they belong, rising real costs for resource extraction driving price spikes and demand destruction, and increasingly intractable conflicts pitting states, non-state actors, and refugee populations against one another for remaining supplies of fuel, raw materials, topsoil, food, and water.

US military planners will need to take increasingly hostile environmental conditions into account. They will also need to prepare for mass movements of refugees out of areas of flooding, famine, and other forms of environmental disruption, on a scale exceeding current refugee flows by orders of magnitude. Finally, since the economic impact of these shifts on the United States will affect the nation’s ability to provide necessary resources for its military, plans for coping with cascading environmental crises will have to take into account the likelihood that the resources needed to do so may be in short supply.

Those are the five contexts for conflict I foresee. What makes them even more challenging than they would otherwise be, of course, is that none of them occur in a vacuum, and each potentially feeds into the others. Thus, for example, it would be in the national interest of Russia and/or China to help fund and supply a domestic insurgency in the United States (contexts 1 and 2); emergent warbands may well be able to equip themselves with the necessary gear to engage in monkeywrenching attacks against US forces sent to contain them (contexts 4 and 3); disruptions driven by environmental change will likely help foster the process of warband formation (contexts 5 and 4), and so on.

That’s the future hiding in plain sight: the implications of US policies in the present and recent past, taken to their logical conclusions. The fact that current Pentagon assessments of the future remain so tightly fixed on the phenomena of the present, with no sense of where those phenomena lead, gives me little hope that any of these bitter outcomes will be avoided.

There will be no regularly scheduled Archdruid Report next week. Blogger's latest security upgrade has made it impossible for me to access this blog while I'm out of town, and I'll be on the road (and my backup moderator unavailable) for a good part of what would be next week's comment cycle. I've begun the process of looking for a new platform for my blogs, and I'd encourage any of my readers who rely on Blogger or any other Google product to look for alternatives before you, too, get hit by an "upgrade" that makes it more trouble to use than it's worth.

Wednesday, October 12, 2016

An Afternoon in Early Autumn

I think it was the writer John McPhee who coined the term “deep time,” and the late Stephen Jay Gould who did the most to popularize it, as a name for the vast panorama opened up to human eyes by the last three hundred years or so of discoveries in geology and astronomy. It’s a useful label for an even more useful concept. In our lives, we deal with time in days, seasons, years, decades at most; decades, centuries and millennia provide the yardsticks by which the life cycles of human societies—that is to say, history, in the usual sense of that word—are traced.

Both these, the time frame of individual lives and the time frame of societies, are anthropocentric, as indeed they should be; lives and societies are human things and require a human measure. When that old bamboozler Protagoras insisted that “man is the measure of all things,” though, he uttered a subtle truth wrapped in a bald-faced lie.* The subtle truth is that since we are what we are—that is to say, social primates who have learned a few interesting tricks—our capacity to understand the cosmos is strictly limited by the perceptions that human nervous systems are capable of processing and the notions that human minds are capable of thinking. The bald-faced lie is the claim that everything in the cosmos must fit inside the perceptions human beings can process and the notions they can think.

(*No, none of this has to do with gender politics. The Greek language, unlike modern English, had a common gender-nonspecific noun for “human being,” anthropos, which was distinct from andros, “man,” and gyne, “woman.” The word Protagoras used was anthropos.)

It took the birth of modern geology to tear through the veil of human time and reveal the stunningly inhuman scale of time that measures the great cycles of the planet on which we live. Last week’s post sketched out part of the process by which people in Europe and the European diaspora, once they got around to noticing that the Book of Genesis is about the Rock of Ages rather than the age of rocks, struggled to come to terms with the immensities that geological strata revealed. To my mind, that was the single most important discovery our civilization has made—a discovery with which we’re still trying to come to terms, with limited success so far, and one that I hope we can somehow manage to hand down to our descendants in the far future.

The thing that makes deep time difficult for many people to cope with is that it makes self-evident nonsense out of any claim that human beings have any uniquely important place in the history of the cosmos. That wouldn’t be a difficulty at all, except that the religious beliefs most commonly held in Europe and the European diaspora make exactly that claim.

That last point deserves some expansion here, not least because a minority among the current crop of “angry atheists” have made a great deal of rhetorical hay by insisting that all religions, across the board, point for point, are identical to whichever specific religion they themselves hate the most—usually, though not always, whatever Christian denomination they rebelled against in their adolescent years. That insistence is a fertile source of nonsense, and never so much as when it turns to the religious implications of time.

The conflict between science and religion over the age of the Earth is a purely Western phenomenon.  Had the great geological discoveries of the eighteenth and nineteenth centuries taken place in Japan, say, or India, the local religious authorities wouldn’t have turned a hair. On the one hand, most Asian religious traditions juggle million-year intervals as effortlessly as any modern cosmologist; on the other, Asian religious traditions have by and large avoided the dubious conviction, enshrined in most (though not all) versions of Christianity, that the Earth and everything upon it exists solely as a stage on which the drama of humanity’s fall and redemption plays out over a human-scaled interval of time. The expansive Hindu cosmos with its vast ever-repeating cycles of time, the Shinto concept of Great Nature as a continuum within which every category of being has its rightful place, and other non-Western worldviews offer plenty of room for modern geology to find a home.

Ironically, though, the ongoing decline of mainstream Christianity as a cultural influence in the Western world hasn’t done much to lessen the difficulty most people in the industrial world feel when faced with the abysses of deep time. The reason here is simply that the ersatz religion that’s taken the place of Christianity in the Western imagination also tries to impose a rigid ideological scheme not only on the ebb and flow of human history, but on the great cycles of the nonhuman cosmos as well. Yes, that would be the religion of progress—the faith-based conviction that human history is, or at least ought to be, a straight line extending onward and upward from the caves to the stars.

You might think, dear reader, that a belief system whose followers like to wallow in self-praise for their rejection of the seven-day creation scheme of the Book of Genesis and their embrace of deep time in the past would have a bit of a hard time evading its implications for the future. Let me assure you that this seems to give most of them no trouble at all. From Ray Kurzweil’s pop-culture mythology of the Singularity—a straightforward rewrite of Christian faith in the Second Coming dolled up in science-fiction drag—straight through to the earnest space-travel advocates who insist that we’ve got to be ready to abandon the solar system when the sun turns into a red giant four billion years from now, a near-total aversion to thinking about the realities of the deep time ahead of us is astonishingly prevalent among those who think they’ve grasped the vastness of Earth’s history.

I’ve come to think that one of the things that feeds this curious quirk of collective thinking is a bit of trivia to be found in a great many books on geology and the like—the metaphor that turns the Earth’s entire history into a single year, starting on January 1 with the planet’s formation out of clouds of interstellar dust and ending at midnight on December 31, which is always right now.

That metaphor has been rehashed more often than the average sitcom plot. A quick check of the books in the study where I’m writing this essay finds three different versions, one written in the 1960s, one in the 1980s, and one a little more than a decade ago. The dates of various events dance around the calendar a bit as new discoveries rewrite this or that detail of the planet’s history, to be sure; when I was a dinosaur-crazed seven-year-old, the Earth was only three and a half billion years old and the dinosaurs died out seventy million years ago, while the latest research I know of revises those dates to 4.6 billion years and 65 million years respectively, moving the date of the end-Cretaceous extinction from December 24 to December 26—in either case, a wretched Christmas present for small boys. Such details aside, the basic metaphor remains all but unchanged.

There’s only one problem with it, but it’s a whopper. Ask yourself this: what has gotten left out of that otherwise helpful metaphor? The answer, of course, is the future.

Let’s imagine, by contrast, a metaphor that maps the entire history of life on earth, from the first living thing on this planet to the last, onto a single year. We don’t know exactly when life will go extinct on this planet, but then we don’t know exactly when it emerged, either; the most recent estimate I know of puts the origin of  terrestrial life somewhere a little more than 3.7 billion years ago, and the point at which the sun’s increasing heat will finally sterilize the planet somewhere a little more than 1.2 billion years from now. Adding in a bit of rounding error, we can set the lifespan of our planetary biosphere at a nice round five billion years. On that scale, a month of thirty days is 411 million years, a single day is 13.7 million years, an hour is around 571,000 years, a minute is around 9514 years, and a second is 158 years and change. Our genus, Homo,* evolved maybe two hours ago, and all of recorded human history so far has taken up a little less than 32 seconds.

(*Another gender-nonspecific word for “human being,” this one comes from Latin, and is equally distinct from vir, “man,” and femina, “woman.” English really does need to get its act together.)

That all corresponds closely to the standard metaphor. The difference comes in when you glance at the calendar and find out that the present moment in time falls not on December 31 or any other similarly momentous date, but on an ordinary, undistinguished day—by my back-of-the-envelope calculation, it would be September 26.
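
For readers who like to check this sort of arithmetic, here is a minimal back-of-the-envelope sketch in Python. The numbers in it are simply the round figures used above (a five-billion-year run for the biosphere, life beginning 3.7 billion years ago, and a 365-day year), and the little calendar_date helper is my own invention for illustration; the dates it prints can wobble a day or so from the ones quoted here, which is about all the precision this game deserves.

from datetime import date, timedelta

# Round figures from the essay: the biosphere's whole run is five billion years,
# and life began 3.7 billion years ago, which the calendar counts as January 1.
BIOSPHERE_YEARS = 5_000_000_000
LIFE_BEGAN_YEARS_AGO = 3_700_000_000
YEARS_PER_DAY = BIOSPHERE_YEARS / 365        # roughly 13.7 million years
YEARS_PER_SECOND = YEARS_PER_DAY / 86_400    # roughly 158 years

def calendar_date(years_ago):
    """Map a moment so many years before the present onto the one-year calendar."""
    years_since_life_began = LIFE_BEGAN_YEARS_AGO - years_ago
    # The calendar year itself is arbitrary; 2015 is used only because it is not a leap year.
    return (date(2015, 1, 1) + timedelta(days=years_since_life_began / YEARS_PER_DAY)).strftime("%B %d")

print("today:", calendar_date(0))                    # falls in late September
print("end-Cretaceous:", calendar_date(65_000_000))  # around September 22 or 23
print("end-Permian:", calendar_date(252_000_000))    # early September
print("recorded history:", round(5_000 / YEARS_PER_SECOND), "seconds")  # about 32 seconds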

I like to imagine our time, along these lines, as an instant during an early autumn afternoon in the great year of Earth’s biosphere. Like many another late September day, it’s becoming uncomfortably hot, and billowing dark clouds stand on the horizon, heralds of an oncoming storm. We human mayflies, with a lifespan averaging maybe half a second, dart here and there, busy with our momentary occupations; a few of us now and then lift our gaze from our own affairs and try to imagine the cold bare fields of early spring, the sultry air of summer evenings, or the rigors of a late autumn none of us will ever see.

With that in mind, let’s put some other dates onto the calendar. While life began on January 1, multicellular life didn’t get started until sometime in the middle of August—for almost two-thirds of the history of life, Earth was a planet of bacteria and blue-green algae, and in terms of total biomass, it arguably still is.  The first primitive plants and invertebrate animals ventured onto the land around August 25; the terrible end-Permian extinction crisis, the worst the planet has yet experienced, hit on September 8; the dinosaurs perished in the small hours of September 22, and the last ice age ended just over a minute ago, having taken place over some twelve and a half minutes.

Now let’s turn and look in the other direction. The last ice age was part of a glacial era that began a little less than two hours ago and can be expected to continue through the morning of the 27th—on our time scale, they happen every two and a half weeks or so, and the intervals between them are warm periods when the Earth is a jungle planet and glaciers don’t exist. Our current idiotic habit of treating the atmosphere as a gaseous sewer will disrupt that cycle for only a very short time; our ability to dump greenhouse gases into the atmosphere will end in less than a second as readily accessible fossil fuel reserves are exhausted, and it will take rather less than a minute thereafter for natural processes to scrub the excess CO2 from the atmosphere and return the planet’s climate to its normal instability.

Certain other consequences of our brief moment of absurd extravagance will last longer.  On our timescale, the process of radioactive decay will take around half an hour (that is to say, a quarter million years or so) to reduce high-level nuclear waste all the way to harmlessness. It will take an interval of something like the same order of magnitude before all the dead satellites in high orbits have succumbed to the complex processes that will send them to a fiery fate in Earth’s atmosphere, and quite possibly longer for the constant rain of small meteorites onto the lunar surface to pound the Apollo landers and other space junk there to unrecognizable fragments. Given a few hours of the biosphere’s great year, though, everything we are and everything we’ve done will be long gone.

Beyond that, the great timekeeper of Earth’s biosphere is the Sun. Stars increase in their output of heat over most of their life cycle, and the Sun is no exception. The single-celled chemosynthetic organisms that crept out of undersea hot springs in February or March of the great year encountered a frozen world, lit by a pale white Sun whose rays gave far less heat than today; one of the most severe ice ages yet identified, the Cryogenian glaciation of the late Precambrian, was apparently cold enough to freeze the oceans solid and wrap most of the planet in ice. By contrast, toward the middle of November in the distant Neozoic Era, the Sun will be warmer and yellower than it is today, and glacial eras will likely involve little more than the appearance of snow on a few high mountains normally covered in jungle.

Thus the Earth will gradually warm through October and November.  Temperatures will cycle up and down with the normal cycles of planetary climate, but each warm period will tend to be a little warmer than the last, and each cold period a little less frigid. Come December, most of a billion years from now, as the heat climbs past one threshold after another, more and more of the Earth’s water will evaporate and, as dissociated oxygen and hydrogen atoms, boil off into space; the Earth will become a desert world, with life clinging to existence at the poles and in fissures deep underground, until finally the last salt-crusted seas run dry and the last living things die out.

And humanity? The average large vertebrate genus lasts something like ten million years—in our scale, something over seventeen hours. As already noted, our genus has only been around for about two hours so far, so it’s statistically likely that we still have a good long run ahead of us. I’ve discussed in these essays several times already the hard physical facts that argue that we aren’t going to go to the stars, or even settle other planets in this solar system, but that’s nothing we have to worry about. Even if we have an improbably long period of human existence ahead of us—say, the fifty million years that bats of the modern type have been around, some three and a half days in our scale, or ten thousand times the length of all recorded human history to date—the Earth will be burgeoning with living things, and perfectly capable of supporting not only intelligent life but rich, complex, unimaginably diverse civilizations, long after we’ve all settled down to our new careers as fossils.

This does not mean, of course, that the Earth will be capable of supporting the kind of civilization we have today. It’s arguably not capable of supporting that kind of civilization now.  Certainly the direct and indirect consequences of trying to maintain the civilization we’ve got, even for the short time we’ve made that attempt so far, are setting off chains of consequences that don’t seem likely to leave much of it standing for long. That doesn’t mean we’re headed back to the caves, or for that matter, back to the Middle Ages—these being the two bogeymen that believers in progress like to use when they’re trying to insist that we have no alternative but to keep on stumbling blindly ahead on our current trajectory, no matter what.

What it means, instead, is that we’re headed toward something that’s different—genuinely, thoroughly, drastically different. It won’t just be different from what we have now; it’ll also be different from the rigidly straight-line extrapolations and deus ex machina fauxpocalypses that people in industrial society like to use to keep from thinking about the future we’re making for ourselves. Off beyond the dreary Star Trek fantasy of metastasizing across the galaxy, and the equally hackneyed Mad Max fantasy of pseudomedieval savagery, lies the astonishing diversity of the future before us: a future potentially many orders of magnitude longer than all of recorded history to date, in which human beings will live their lives and understand the world in ways we can’t even imagine today.

It’s tolerably common, when points like the one I’ve tried to make here get raised at all, for people to insist that paying attention to the ultimate fate of the Earth and of our species is a recipe for suicidal depression or the like. With all due respect, that claim seems silly to me. Each one of us, as we get out of bed in the morning, realizes at some level that the day just beginning will bring us one step closer to old age and death, and yet most of us deal with that reality without too much angst.

In the same way, I’d like to suggest that it’s past time for the inmates of modern industrial civilization to grow up, sprout some gonads—either kind, take your pick—and deal with the simple, necessary, and healthy realization that our species is not going to be around forever. Just as maturity in the individual arrives when it sinks in that human life is finite, collective maturity may just wait for a similar realization concerning the life of the species. That kind of maturity would be a valuable asset just now, not least because it might help us grasp some of the extraordinary possibilities that will open up as industrial civilization finishes its one-way trip down the chute marked “decline and fall” and the deindustrial future ahead of us begins to take shape.

Wednesday, October 05, 2016

The Myth of the Anthropocene

To explore the messy future that modern industrial society is making for itself, it’s necessary now and again to stray into some of the odd corners of human thought. Over the decade and a bit that this blog has been engaged in that exploration, accordingly, my readers and I have gone roaming through quite an assortment of topics—politics, religion, magic, many different areas of history, at least as many sciences, and the list goes on. This week, it’s time to ramble through geology, for reasons that go back to some of the basic presuppositions of our culture, and reach forward from there to the far future.

Over the last few years, a certain number of scientists, climate activists, and talking heads in the media have been claiming that the Earth has passed out of its previous geological epoch, the Holocene, into a new epoch, the Anthropocene. Their argument is straightforward: human beings have become a major force shaping geology, and that unprecedented reality requires a new moniker. Last I heard, the scholarly body that authorizes formal changes to that end of scientific terminology hasn’t yet approved the new term for official use, but it’s seeing increasing use in less formal settings.

I’d like to suggest that the proposed change is a mistake, and that the label “Anthropocene” should go into whatever circular file holds phlogiston, the luminiferous ether, and other scientific terms that didn’t turn out to represent realities. That’s not because I doubt that human beings are having a major impact on geology just now, far from it.  My reasons are somewhat complex, and will require a glance back over part of the history of geology—specifically, the evolution of the labels we use to talk about portions of the past. It’s going to be a bit of a long journey, but bear with me; it matters.

Back in the seventeenth century, when the modern study of geology first got under way, the Book of Genesis was considered to be an accurate account of the Earth’s early history, and so geologists looked for evidence of the flood that plopped Noah’s ark on Mount Ararat. They found it, too, or that’s what people believed at the time. By and large, anywhere you go in western Europe, you’ll be standing on one of three things: the first is rock, the second is an assortment of gravels and compact tills, and the third is soil. With vanishingly few exceptions, where they overlap, the rock is on the bottom, the gravels and tills are in the middle, and the soil is on top. Noting that some of the gravels and tills look like huge versions of the sandbars and other features shaped by moving water, the early geologists decided the middle layer had been left by the Flood—that’s diluvium in Latin—and so the three layers were named Antediluvian (“before the flood”), Diluvian, and Postdiluvian (“after the flood”).

So far, so good—except then they started looking at the Antediluvian layer, and found an assortment of evidence that seemed to imply that really vast amounts of time had passed between different layers of rock. During the early eighteenth century, as this sank in, the Book of Genesis lost its status as a geology textbook, and geologists came up with a new set of four labels: Primary, Secondary, Tertiary, and Quaternary. (These are fancy ways of saying “First, Second, Third, and Fourth,” in case you were wondering.) The Quaternary layer consisted of the former Diluvian and Postdiluvian gravels, tills, and soil; the Tertiary consisted of rocks and fossils that were found under those; the Secondary was the rocks and fossils below that, and the Primary was at the bottom.

It was a good scheme for the time; on the surface of the Earth, if you happen to live in western Europe and walk around a lot, you’ll see very roughly equal amounts of all four layers. What’s more, they  always occur in the order just given.  Where they overlap, the Primary is always under the Secondary, and so on; you never find Secondary rocks under Primary ones, except when the rock layers have obviously been folded by later geological forces. So geologists assigned them to four different periods of time, named after the layers—the Primary Era, the Secondary Era, and so on.

It took quite a bit of further work for geologists to get a handle on how much time was involved in each of these eras, and as the results of that line of research started to become clear, there was a collective gulp loud enough to echo off the Moon. Outside of India and a few Native American civilizations, nobody anywhere had imagined that the history of the Earth might involve not thousands of years, but billions of them. As this sank in, the geologists also realized that their four eras were of absurdly different lengths. The Quaternary was only two million years long; the Tertiary, around sixty-three million years; the Secondary, around one hundred eighty-six million years; and the Primary, from there back to the Earth’s origin, or better than four billion years.

So a new scheme was worked out. The Quaternary era became the Quaternary period, and it’s still the Quaternary today, even though it’s not the fourth of anything any more. The Tertiary also became a period—it later got broken up into the Paleogene and Neogene periods—and the Tertiary (or Paleogene and Neogene) and Quaternary between them made up the Cenozoic (Greek for “recent life”) era. The former Secondary era became the Mesozoic (“middle life”) era, and was divided into three periods; starting with the most recent, these are the Cretaceous, Jurassic, and Triassic. The former Primary era became the Paleozoic (“old life”) era, and was divided into six periods; again, starting with the most recent, these are the Permian, Carboniferous, Devonian, Silurian, Ordovician, and Cambrian. The Cambrian started around 542 million years ago, and everything before then—all four billion years and change—was tossed into the vast dark basement of the Precambrian.

It was a pretty good system, and one of the things that was pretty good about it is that the periods were of very roughly equal length. Thus the Paleozoic had twice as many periods as the Mesozoic, and it lasted around twice as long. The Mesozoic, in turn, had three times as many complete periods as the Cenozoic did (in pre-Paleogene and Neogene days)—the Quaternary has just gotten started, remember—and it’s around three times as long. I don’t know how many of my readers, as children, delighted in the fact that the whole Cenozoic era—the Age of Mammals, as it was often called—could be dropped into the Cretaceous period with room to spare on either end, but I did. I decorated one of my school notebooks with a crisp little drawing of a scoreboard that read DINOSAURS 3, MAMMALS 1. No, nobody else got the joke.

In recent decades, things have been reshuffled a bit more.  The Precambrian basement has been explored in quite some detail, and what used to be deliciously named the Cryptozoic eon has now sadly been broken up into Proterozoic and Archean eons, and divided into periods to boot. We can let that pass, though, because it’s the other end of the time scale that concerns us. Since Cenozoic rock makes up so much of the surface—being the most recently laid down, after all—geologists soon broke up the Tertiary and Quaternary periods into six shorter units, called epochs: from first to last, Eocene, Oligocene, Miocene, Pliocene, Pleistocene, and Holocene. (These are Greek again, and mean “dawn recent, few recent, some recent, many recent, most recent,” and “entirely recent”—the reference is to how many living things in each epoch look like the ones running around today.) Later, the Eocene got chopped in two to yield the Paleocene (“old recent”) and Eocene. Yes, that “-cene” ending—also the first syllable in Cenozoic—is the second half of the label “Anthropocene,” the human-recent.

The thing to keep in mind is that an epoch is a big chunk of time. The six of them that are definitely over with at this point lasted an average of almost eleven million years apiece. (For purposes of comparison, eleven million years is around 2200 times the length of all recorded human history.) The exception is the Holocene, which is only 11,700 years old at present, or only about 0.1% of the average length of an epoch. It makes sense to call the Holocene an epoch, in other words, only if it’s just beginning and still has millions of years to run.

If in fact the Holocene is over and the Anthropocene is under way, though, the Holocene isn’t an epoch at all in any meaningful sense. It’s the tag-end of the Pleistocene, or a transition between the Pleistocene and whichever epoch comes next, whether that be labeled Anthropocene or something else. You can find such transitions between every epoch and the next, every period and the next, and every era and the next. They’re usually quite distinctive, because these different geological divisions aren’t mere abstractions; the change from one to another is right there in the rock strata, usually well marked by sharp changes in a range of markers, including fossils. Some long-vanished species trickle out in the middle of an epoch, to be sure, but one of the things that often marks the end of an epoch, a period, or an era is that a whole mess of extinctions all happen in the transition from one unit of time to the next.

Let’s look at a few examples to sharpen that last point. The Pleistocene epoch was short as epochs go, only a little more than two and a half million years; it was a period of severe global cooling, which is why it’s better known as the ice age; and a number of its typical animals—mammoths, sabertooth tigers, and mastodons in North America, giant ground sloths and glyptodons in South America, cave bears and woolly rhinoceroses in Europe, and so on—went extinct all at once during the short transition period at its end, when the climate warmed abruptly and a wave of invasive generalist predators (i.e., your ancestors and mine) spread through ecosystems that were already in extreme turmoil. That’s a typical end-of-epoch mess.

Periods are bigger than epochs, and the end of a period is accordingly a bigger deal. Let’s take the end of the Triassic as a good example. Back in the day, the whole Mesozoic era routinely got called “the Age of Reptiles,” but until the Triassic ended it was anybody’s guess whether the dinosaurs or the therapsid almost-mammals would end up at the top of the ecological heap. The end-Triassic extinction crisis put an end to the struggle by putting an end to most of the therapsids, along with a lot of other living things. The biggest of the early dinosaurs died off as well, but the smaller ones thrived, and their descendants went on to become the huge and remarkably successful critters of the Jurassic and Cretaceous. That’s a typical end-of-period mess.

Eras are bigger than periods, and they always end with whopping crises. The most recent example, of course, is the end of the Mesozoic era 65 million years ago. Forty per cent of the animal families on the planet, including species that had been around for hundreds of millions of years, died pretty much all at once. (The current theory, well backed up by the data, is that a good-sized comet slammed into what’s now the Yucatan peninsula, and the bulk of the dieoff was over in just a few years.) Was that the worst extinction crisis ever? Not a chance; the end of the Paleozoic 251 million years ago was slower but far more ghastly, with around ninety-five per cent of all species on the casualty list. Some paleontologists, without undue exaggeration, describe the end-Paleozoic crisis as the time Earth nearly died.

So the landscape of time revealed to us by geology shows intervals of relative stability—epochs, periods, and eras—broken up by short transition periods. If you go for a walk in country where the rock formations have been exposed, you can literally see the divisions in front of you: here’s a layer of one kind of rock a foot or two thick, laid down as sediment over millions of years and then compressed into stone over millions more; here’s a thin boundary layer, or simply an abrupt line of change, and above it there’s a different kind of rock, consisting of sediment laid down under different climatic and environmental conditions.

If you’ve got a decent geological laboratory handy and apply the usual tests to a couple of rock samples, one from the middle of an epoch and the other from a boundary layer, the differences are hard to miss. The boundary layer made when the Mesozoic ended and the Cenozoic began is a good example. The Cretaceous-Paleogene boundary layer is spiked with iridium, from space dust brought to earth by the comet; it’s full of carbon from fires that were kindled by the impact over many millions of square miles; and the one trace of life you’ll find is a great many fungal spores—dust blown into the upper atmosphere choked out the sun and left most plants on Earth dead and rotting, with results that rolled right up the food chain to the tyrannosaurs and their kin. You won’t find such anomalies clustering in the rock sample from the middle of the epoch; what you’ll find in nearly every case is evidence of gradual change and ordinary geological processes at work.

Now ask yourself this, dear reader: which of these most resembles the trace that human industrial civilization is in the process of leaving for the rock formations of the far future?

It’s crucial to remember that the drastic geological impacts that have inspired some scientists to make use of the term “Anthropocene” are self-terminating in at least two senses. On the one hand, those impacts are possible because, and only because, our species is busily burning through stores of fossil carbon that took half a billion years for natural processes to stash in the rocks, and ripping through equally finite stores of other nonrenewable resources, some of which took even longer to find their way into the deposits we mine so greedily. On the other hand, by destabilizing the climate and sending cascading disturbances in motion through a good-sized collection of other natural cycles, those impacts are in the process of wrecking the infrastructure that industrial society needs to go its merry way.

Confronted with the tightening vise between accelerating resource depletion and accelerating biosphere disruption, the vast majority of people in the industrial world seem content to insist that they can have their planet and eat it too. The conventional wisdom holds that someone, somewhere, will think of something that will allow us to replace Earth’s rapidly emptying fuel tanks and resource stocks, on the one hand, and stabilize its increasingly violent climatic and ecological cycles, on the other.  That blind faith remains welded in place even as decade after decade slips past, one supposed solution after another fails, and the stark warnings of forty years ago have become the front page news stories of today. Nothing is changing, except that the news just keeps getting worse.

That’s the simple reality of the predicament in which we find ourselves today. Our way of life, here in the world’s industrial nations, guarantees that in the fairly near future, no one anywhere on the planet will be able to live the way we do. As resources run out, alternatives fail, and the destructive impacts of climate change pile up, our ability to influence geological processes will go away, and leave us once more on the receiving end of natural cycles we can do little to change.

A hundred million years from now, as a result, if another intelligent species happens to be around on Earth at that time and takes an interest in geology, its members won’t find a nice thick stratum of rock marked with the signs of human activity, corresponding to an Anthropocene epoch. They’ll find a thin boundary layer, laid down over a few hundred years, and laced with exotic markers: decay products of radioactive isotopes splashed into the atmosphere by twentieth-century nuclear bomb testing and nuclear reactor meltdowns; chemical markers showing a steep upward jolt in atmospheric carbon dioxide; and scattered freely through the layer, micron-thick streaks of odd carbon compounds that are all that’s left of our vast production of plastic trash. That’s our geological legacy: a slightly odd transition layer a quarter of an inch thick, with the usual discontinuity between the species in the rock just below, many of whom vanish at the transition, and the species in the rock just above, who proliferate into empty ecological niches and evolve into new forms.

In place of the misleading label “Anthropocene,” then, I’d like to propose that we call the geological interval we’re now in the Pleistocene-Neocene transition. Neocene? That’s Greek for “new recent,” representing the “new normal” that will emerge when our idiotic maltreatment of the planet that keeps us all alive brings the “old normal” crashing down around our ears. We don’t call the first epoch after the comet impact 65 million years ago the “Cometocene,” so there’s no valid reason to use a label like “Anthropocene” for the epoch that will dawn when the current transition winds down. Industrial civilization’s giddy rise and impending fall are the trigger for the transition, and nothing more; the shape of the Neocene epoch will be determined not by us, but by the ordinary processes of planetary change and evolution.

Those processes have been responding to the end of the so-called Holocene—let’s rename it the Late Pleistocene, given how extremely short it turned out to be—in the usual manner.  Around the world, ice caps are melting, climate belts are shifting, acid-intolerant species in the ocean are being replaced by acid-tolerant ones, and generalist species of animals such as cats, coyotes, and feral pigs are spreading rapidly through increasingly chaotic ecosystems, occupying vacant ecological niches or elbowing less flexible competitors out of the way. By the time the transition winds down a few centuries from now, the species that have been able to adapt to new conditions and spread into new environments will be ready for evolutionary radiation; another half a million years or so, and the Neocene will be stocked with the first preliminary draft of its typical flora and fauna.

It’s entertaining, at least to me, to speculate about what critters will roam the desert sands of Kansas and Nebraska or stalk their prey in the forests of postglacial Greenland. To many of my readers, though, I suspect a more pressing question is whether a certain primate called Homo sapiens will be among the common fauna of the Neocene. I suspect so, though of course none of us can be sure—but giving up on the fantasy that’s embodied in the label “Anthropocene,” the delusion that what our civilization is doing just now is going to keep on long enough to fill a geological epoch, is a good step in the direction of our survival.

Wednesday, September 28, 2016

The Coming of the Postliberal Era

One of the big challenges faced by any student of current events is that of seeing past the turmoil of the present moment to catch the deep trends shaping events on a broader scale. It’s a little like standing on a beach, without benefit of tide tables, and trying to guess whether the tide’s coming in or going out. Waves surge, break, and flow back out to sea; the wind blows this way and that; it takes time, and close attention to subtle details, before you can be sure whether the sea is gradually climbing the beach or just as gradually retreating from it.

Over the last year or so, though, it’s become increasingly clear to me that one of the great tides of American politics has turned and is flowing out to sea. For almost precisely two hundred years, this country’s political discourse has been shaped—more powerfully, perhaps, than by any other single force—by the loose bundle of ideas, interests, and values we can call American liberalism. That’s the tide that’s turning. The most important trends shaping the political landscape of our time, to my mind, are the descent of the liberal movement into its final decadence, and the first stirrings of the postliberal politics that is already emerging in its wake.

To make sense of what American liberalism has been, what it has become, and what will happen in its aftermath, history is an essential resource. Ask a believer in a political ideology to define it, and you’ll get one set of canned talking points; ask an opponent of that ideology to do the same thing, and you’ll get another—and both of them will be shaped more by the demands of moment-by-moment politics than by any broader logic. Trace that ideology from its birth through its adolescence, maturity, and decline into senescence, and you get a much better view of what it actually means.

Let’s go back, then, to the wellsprings of the American liberal movement. Historians have argued for a good long time about the deeper roots of that movement, but its first visible upsurge can be traced to a few urban centers in the coastal Northeast in the years just after the War of 1812. Boston—nineteenth century America’s San Francisco—was the epicenter of the newborn movement, a bubbling cauldron of new social ideas to which aspiring intellectuals flocked from across the new Republic.  Any of my readers who think that the naive and effervescent idealism of the 1960s was anything new need to read Nathaniel Hawthorne’s The Blithedale Romance; it's set in the Massachusetts counterculture of the early nineteenth century, and most of the action takes place on a commune. That’s the context in which American liberalism was born.

From the very beginning, it was a movement of the educated elite. Though it spoke movingly about uplifting the downtrodden, the downtrodden themselves were permitted very little active part in it. It was also as closely intertwined with Protestant Christianity as the movement of the 1960s was with Asian religions; ministers from the Congregationalist and Unitarian churches played a central role in the movement throughout its early years, and the major organizations of the movement—the Anti-Slavery Societies, the Temperance League, and the Non-Resistant League, the first influential American pacifist group—were closely allied with churches, and staffed and supported by clergymen. Both the elitism and the Protestant Christian orientation, as we’ll see, had a powerful influence on the way American liberalism evolved over the two centuries that followed.

Three major social issues formed the framework around which the new movement coalesced. The first was the abolition of slavery; the second was the prohibition of alcohol; the third was the improvement of the legal status of women. (The movement traversed a long and convoluted road before this latter goal took its ultimate form of legal and social equality between the genders.) There were plenty of other issues that attracted their own share of attention from the movement—dietary reform, dress reform, pacifism, and the like—but all of them shared a common theme: the redefinition of politics as an expression of values.

Let’s take a moment to unpack that last phrase. Politics at that time, and at most other periods throughout human history, was understood as a straightforward matter of interests—in the bluntest of terms, who got what benefits and who paid what costs. Then and for most of a century thereafter, for example, one of the things that happened in the wake of every Presidential election was that the winner’s party got to hand out federal jobs en masse to its supporters. It was called the “spoils system,” as in “to the victor belong the spoils;” people flocked to campaign for this or that presidential candidate as much in the hope of getting a comfortable federal job as for any other reason. Nobody saw anything wrong with that system, because politics was about interests.

In the same way, there’s no evidence that anybody in the Constitutional Convention agonized about the ethical dimensions of the notorious provision that defined each slave as being 3/5ths of a person. I doubt the ethical side of the matter ever crossed any of their minds, because politics was not about ethics or any other expression of values—it was about interests—and the issue was simply one of finding a compromise that allowed each state to feel that its interests would be adequately represented in Congress. Values, in the thought of the time, belonged to church and to the private conscience of the individual; politics was about interests pure and simple.

(We probably need to stop here for a moment to deal with the standard response: “Yes, but they should have known better!” This is a classic example of chronocentrism. Just as ethnocentrism privileges the beliefs, values, and interests of a particular ethnic group, chronocentrism does the same thing to the beliefs, values, and interests of a particular time. Chronocentrism is enormously common today, on all sides of the political and cultural landscape; you can see it when scientists insist that people in the Middle Ages should have known better than to believe in astrology, for example, or when Christians insist that the old Pagans should have known better than to believe in polytheist religions. In every case, it’s simply one more attempt to evade the difficult task of understanding the past.)

Newborn American liberalism, though, rejected the division between politics and values. Its adherents’ opposition to slavery, for example, had nothing to do with the divergent economic interests of the industrializing northern states and the plantation economy of the South, and everything to do with a devoutly held conviction that chattel slavery was morally wrong. Their opposition to alcohol, to the laws that denied civil rights to women, to war, and to everything else on the lengthy shopping list of the movement had to do with moral values, not with interests. That’s where you see the impact of the movement’s Protestant heritage: it took values out of the church and tried to apply them to the world as a whole.  At the time, that was exotic enough that the moral crusades just mentioned got about as much political traction as the colorful fantasies of the 1960s did in their own day.

Both movements were saved from complete failure by the impact of war. The movement of the 1960s drew most of its influence on popular culture from its opposition to the Vietnam War, which is why it collapsed nearly without a trace when the war ended and the draft was repealed.  The earlier movement had to wait a while for its war, and in the meantime it very nearly destroyed itself by leaping on board the same kind of apocalyptic fantasy that kicked the New Age movement into its current death spiral four years ago. In the late 1830s, frustrated by the failure of the perfect society to show up as quickly as they desired, a great many adherents of the new liberal movement embraced the prophecy of William Miller, a New England farmer who believed that he had worked out from the Bible the correct date of the Second Coming of Christ. When October 22, 1844 passed without incident, the same way December 21, 2012 did, the resulting “Great Disappointment” was a body blow to the movement.

By then, though, one of the moral crusades being pushed by American liberals had attracted the potent support of raw economic interest. The division between northern and southern states over the question of slavery was not primarily seen at the time as a matter of ethics; it was a matter of competing interests, like every other political question, though of course northern politicians and media were quick to capitalize on the moral rhetoric of the Abolitionists. At issue was the shape of the nation’s economic future. Was it going to be an agrarian society producing mostly raw materials for export, and fully integrated into a global economy centered on Britain—the southern model? Or was it going to go its own way, raise trade barriers against the global economy, and develop its own industrial and agricultural economy for domestic consumption—the northern model?

Such questions had immediate practical implications, because government policies that favored one model guaranteed the ruin of the other. Slavery was the linchpin of the Southern model, because the big southern plantations required a vast supply of labor at next to no cost to turn a profit, and so it became a core issue targeted by northern politicians and propagandists alike. Read detailed accounts of the struggles in Congress between northern and southern politicians, though, and you’ll find that what was under debate had as much to do with trade policy and federal expenditures as with slavery itself. Was there to be free trade, which benefited the South, or trade barriers, which benefited the North? Was the federal budget to pay for canals and roads, which benefited northern interests by getting raw materials to factories and manufactured products to markets, but were irrelevant to southern interests, which simply needed riverboats to ship cotton and tobacco to the nearest seaport?

Even the bitter struggles over which newly admitted states were to have slave-based economies, and which were not, had an overwhelming economic context in the politics of the time. The North wanted to see the western territories turned into a patchwork of family farms, producing agricultural products for the burgeoning cities of the eastern seaboard and the Great Lakes and buying manufactured goods from northern factories; the South wanted to see those same territories made available for plantations that would raise products for export to England and the world.

Yet the ethical dimension became central to northern propaganda, as already noted, and that helped spread the liberal conviction that values as well as interests had a place in the political dialogue. By 1860, that conviction had become widespread enough that it shaped thinking south of the Mason-Dixon line. As originally written, for example, the first line of the Confederate song “The Bonny Blue Flag” ran “fighting for the property we won by honest toil”—and no one anywhere had any illusions about the identity, or skin color, of the property in question. Before long, though, it was rewritten as “fighting for our liberty, with treasure, blood and toil.” The moment that change occurred, the South had already lost; it’s entirely possible to argue for slavery on grounds of economic interest, but once the focus of the conversation changes to values such as liberty, slavery becomes indefensible.

So the Civil War raged, the Confederacy rose and fell, the Northern economic model guided American economic policy for most of a century thereafter, and the liberal movement found its feet again. With slavery abolished, the other two primary goals took center stage, and the struggle to outlaw alcohol and get voting rights for women proceeded very nearly in lockstep.  The 18th Amendment, prohibiting the manufacture and sale of alcohol in the US, and the 19th Amendment, granting women the right to vote, were passed in 1919 and 1920 respectively, and even though Prohibition turned out to be a total flop, the same rhetoric was redirected toward drugs (most were legal in the US until the 1930s) and continues to shape public policy today.  Then came the Great Depression, and with the election of Franklin Roosevelt in 1932—and above all with his landslide reelection victory in 1936, when the GOP carried only two states—the liberal movement became the dominant force in American political life.

Triumph after triumph followed.  The legalization of unions, the establishment of a tax-funded social safety net, the forced desegregation of the South: these and a galaxy of other reforms on the liberal shopping list duly followed. The remarkable thing is that all these achievements took place while the liberal movement was fighting opponents from both sides. To the right, of course, old-fashioned conservatives still dug in their heels and fought for the interests that mattered to them, but from the 1930s on, liberals also faced constant challenge from further left. American liberalism, as already mentioned, was a movement of the educated elite; it focused on helping the downtrodden rather than including them; and that approach increasingly ran into trouble as the downtrodden turned out to have ideas of their own that didn’t necessarily square with what liberals wanted to do for them.

Starting in the 1970s, in turn, American liberalism also ended up facing a third source of challenges—a new form of conservatism that borrowed the value-centered language of liberalism but used a different set of values to rally support to its cause: the values of conservative Protestant Christianity. In some ways, the rise of the so-called “new conservatism” with its talk about “family values” represented the final, ironic triumph of the long struggle to put values at the center of political discourse. By the 1980s, every political faction in American public life, no matter how crass and venal its behavior or its goals, took care to festoon itself with some suitable collection of abstract values. That’s still the case today; nobody talks about interests, even when interests are the obvious issue.

Thus you get the standard liberal response to criticism, which is to insist that the only reason anyone might possibly object to a liberal policy is because they have hateful values.

Let’s take current US immigration policy as an example. This limits the number of legal immigrants while tacitly allowing unlimited illegal immigration.  There are solid pragmatic reasons for questioning the appropriateness of that policy. The US today has the highest number of permanently unemployed people in its history, incomes and standards of living for the lower 80% of the population have been moving raggedly downward since the 1970s, and federal tax policies effectively subsidize the offshoring of jobs. That being the case, allowing in millions of illegal immigrants who have, for all practical purposes, no legal rights, and can be employed at sweatshop wages in substandard conditions, can only drive wages down further than they’ve already gone, furthering the impoverishment and immiseration of wage-earning Americans.

These are valid issues, dealing with (among other things) serious humanitarian concerns for the welfare of wage-earning Americans, and they have nothing to do with racial issues—they would be just as compelling if the immigrants were coming from Canada.  Yet you can’t say any of this in the hearing of a modern American liberal. If you try, you can count on being shouted down and accused of being a racist. Why? I’d like to suggest that it’s because the affluent classes from which the leadership of the liberal movement is drawn, and which set the tone for the movement as a whole, benefit directly from the collapse in wages that has partly been caused by mass illegal immigration, since that decrease in wages has yielded lower prices for the goods and services they buy and higher profits for the companies for which many of them work, and whose stocks many of them own.

That is to say, a movement that began its history with the insistence that values had a place in politics alongside interests has ended up using talk about values to silence discussion of the ways in which its members are pursuing their own interests. That’s not a strategy with a long shelf life, because it doesn’t take long for the other side to identify, and then exploit, the gap between rhetoric and reality.

Ironies of this sort are anything but unusual in political history. It’s astonishingly common for a movement that starts off trying to overturn the status quo in the name of some idealistic abstraction or other to check its ideals at the door once it becomes the status quo. If anything, American liberalism held onto its ideals longer than most and accomplished a great deal more than many, and I think that most of us—even those who, like me, are moderate Burkean conservatives—are grateful to the liberal movement of the past for ending such obvious abuses as chattel slavery and the denial of civil rights to women, and for championing the idea that values as well as interests deserve a voice in the public sphere. It deserves the modern equivalent of a raised hat and a moment of silence, if no more, as it finally sinks into the decadence that is the ultimate fate of every successful political movement.

The current US presidential election shows, perhaps better than anything else, just how far that decadence has gone. Hillary Clinton’s campaign is floundering in the face of Trump’s challenge because so few Americans still believe that the liberal shibboleths in her campaign rhetoric mean anything at all. Even among her supporters, enthusiasm is hard to find, and her campaign rallies have had embarrassingly sparse attendance. Increasingly frantic claims that only racists, fascists, and other deplorables support Trump convince no one but true believers, and make the concealment of interests behind shopworn values increasingly transparent.  Clinton may still win the election by one means or another, but the broader currents in American political life have clearly changed course.

It’s possible to be more precise. Bernie Sanders and Donald Trump, in stark contrast to Clinton, have evoked extraordinarily passionate reactions from the voters, precisely because they’ve offered an alternative to a status quo pervaded by the rhetoric of a moribund liberalism. In the same way, in Britain—where the liberal movement followed a somewhat different trajectory but has ended up in the same place—the success of the Brexit campaign and the wild enthusiasm with which Labour Party voters have backed the supposedly unelectable Jeremy Corbyn show that the same process is well under way there. Having turned into the captive ideology of an affluent elite, liberalism has lost the loyalty of the downtrodden that once, with admittedly mixed motives, it set out to help. That’s a loss it’s unlikely to survive.

Over the decades ahead, in other words, we can expect the emergence of a postliberal politics in the United States, England, and quite possibly some other countries as well. The shape of the political landscape in the short term is fairly easy to guess.  Watch the way the professional politicians in the Republican Party have flocked to Hillary Clinton’s banner, and you can see the genesis of a party of the affluent demanding the prolongation of free trade, American intervention in the Middle East, and the rest of the waning bipartisan consensus that supports its interests. Listen to the roars of enthusiasm for Bernie Sanders and Donald Trump—or better still, talk to the not inconsiderable number of Sanders supporters who will be voting for Trump this November—and you can sense the emergence of a populist party seeking the abandonment of that consensus in defense of its very different interests.

What names those parties will have is by no means certain yet, and a vast number of other details still have to be worked out. One way or another, though, it’s going to be a wild ride.

Wednesday, September 21, 2016

A Time for Retrovation

It's been a little more than a year now since I started the narrative that wrapped up last week. The two weeks that Peter Carr spent in the Lakeland Republic in late November of 2065 ended up covering a little more ground than I’d originally intended, and of course the vagaries of politics and culture in the twilight years of the American century got their share of attention on this blog. Now that the story’s told and the manuscript is getting its final revisions before heading off to the publisher, I want to talk a bit about exactly what I was trying to do by taking an imaginary person to an imaginary place where things work better than they do here and now.

Part of it, of course, was an attempt to sketch out in detail the practical implications of a point I’ve been exploring on this blog for a good while now. Most people in today’s industrial society believe, or think they believe, in progress: they believe, that is, that human history has a built-in bias that infallibly moves it from worse things to better things over time. These days, that belief in progress most often attaches itself to the increasing complexification of technology, and you get the touching faith in the imminence of a Star Trek future that allows so many people to keep slogging through the wretchedly unsatisfactory and steadily worsening conditions of the present.

Faith does not depend on evidence. If that statement needs any further proof, you can get it by watching the way people respond to technological failure. Most of us these days know perfectly well that every software “upgrade” has more bugs and fewer useful features than what it replaced, and every round of “new and improved” products hawked by the media and shoveled onto store shelves is more shoddily made, more loaded with unwanted side effects, and less satisfactory than the last round. Somehow, though, a good many of the people who witness this reality, day in and day out, still manage to insist that the future is, or at least ought to be, a paradise propped up by perfectly functioning machines. That the rising tide of technological failure might be something other than an accidental roadbump on the way to utopia—that it might be trying to tell us something that, by and large, we don’t want to hear—has not yet entered our society’s darkest dream.

It so happens that in very many cases, older, simpler, sturdier technologies work better, producing more satisfactory outcomes and fewer negative side effects, than their modern high-tech equivalents. After most of two years taking apart the modern mythology of progress in a series of posts that became my book After Progress: Reason and Religion at the End of the Industrial Age, and most of another year doing the more pragmatic posts that are being turned into a forthcoming book tentatively titled The Retro Future, I decided that the best way to pursue the exploration further was to imagine a society very much like ours that had actually noticed the declining quality of technology, and adjusted public policies accordingly. That was the genesis of Retrotopia: the attempt to show, by means of the toolkit of narrative fiction, that deliberate technological regression as public policy didn’t amount to a return to the caves—quite the contrary, it meant a return to things that actually work.

The form that this exploration took, though, was shaped in important ways by an earlier venture of the same kind, Ernest Callenbach’s Ecotopia. I don’t know how many of my readers realize just how dramatic a change in utopian literature was marked by Callenbach’s solidly written tale. From the days of Thomas More’s novel Utopia, which gave the genre its name, utopian literature worked with the contrast between the world as it is and an ideal world as imagined by the author, without any connection between the two outside of the gimmick, however it was worked, that got a viewpoint character from one to the other. More’s Utopia was a critique of the England of Henry VIII, but there was never any suggestion on More’s part that England might be expected to turn into Utopia one of these days, and nearly all the utopian tales that followed his embraced the same approach.

With William Morris, things began to shift. Morris was a socialist, and thus believed devoutly that the world could in fact turn into something much better than it was; during the years that his commitment to socialism was at its height, he penned a utopian tale, News from Nowhere, which was set in a future England long after Victorian capitalism had gone gurgling down history’s sewer pipe. (Later on, in the pages of his tremendous fantasy novel The Well at the World’s End, he wove a subtle but pervasive critique of the socialist views he’d championed—socialism appears there in the stark and terrible symbolic form of the Dry Tree—but that’s a subject for a different post entirely.)

News From Nowhere was quite the controversial book in its day, not least because the socialist future Morris imagined was green, agrarian, and entirely free of the mechanized regimentation of humanity that played such a huge role in the Marxist imagination then as now.  Still, the historical thread that linked Morris’ utopia to the present was very thin.  The story was set far off in the future, and Morris skimmed lightly over the process that led from the dark Satanic mills of Victorian England to the green and pleasant land of his imagined socialist England.

That was where Callenbach took hold of the utopian narrative, and hammered it into a completely new shape. Ecotopia was set barely a quarter century in Callenbach’s own future. In his vision, the states of Washington, Oregon, and the northern two-thirds of California had broken away from the United States in 1980, and the usual visitor—journalist William Weston, from what’s left of the United States—came to pay the usual visit in 1999. Over the nineteen years between independence and Weston’s visit, the new nation of Ecotopia had entirely reshaped itself in the image of the Whole Earth Catalog, adopting the technologies, customs, and worldview that San Francisco-area eco-radicals of the 1970s dreamed of establishing, and here and there actually adopted in their own lives.

It really is a tour de force. One measure of its impact is that to this day, when you ask people on the leftward end of things to imagine an ideal future that isn’t just a lightly scrubbed version of the present, dollars will get you organic free range doughnuts that what you’ll hear is some version or other of the Ecotopian future: wind turbines and solar panels, organic farms everywhere, and everyone voluntarily embracing the social customs and attitudes of the San Francisco-area avant-garde circa 1975 in perfect lockstep. While I was writing Retrotopia, until some of my readers got the hang of the fact that I don’t crowdsource my fiction, I fielded any number of comments and emails insisting that I really ought to incorporate this or that or the other aspect of the Ecotopian future into my narrative. I didn’t take offense at that; it was pretty clear to me that for a lot of people nowadays, Ecotopia is literally the only alternative to the status quo that they can imagine.

We’ll get to the broader implications of that last point in a moment. Just now, I want to talk about why I didn’t write a mildly retro version of Ecotopia. I could have; it would have been easy and, frankly, quite entertaining to do that. I’ve imagined more than once writing a tale about somebody from our world who, via some bit of science-fictionish handwaving, is transported to an alternate America in which Ronald Reagan lost the 1980 election, the Three Mile Island nuclear power plant underwent a full-scale Fukushima Daiichi meltdown with tens of thousands of casualties, and the United States had accordingly gone careening ahead toward the sustainable future we almost adopted. I may still write that story someday, but that wasn’t what I chose to do this time around.

Partly, of course, that was because Ernest Callenbach was there already forty years ago. Partly, though, it’s because not all the assumptions that undergirded Ecotopia have worn well in the decades since he wrote. It’s become painfully clear that renewable energy sources, valuable and necessary though they are, can’t simply be dropped into place as a replacement for fossil fuels; huge changes in energy use, embracing issues of energy concentration and accessibility as well as sheer quantity, will have to be made as fossil fuels run out and we have to make do with the enduring power sources of sun, wind, water, and muscle. It’s also become clear, painfully or amusingly as the case may be, that the notions that Sausalito intellectuals thought would save the world back in the 1970s—communal living, casual pansexuality, and the like—had downsides and drawbacks that nobody had gotten around to noticing yet, and weren’t necessarily as liberating and transformative as they seemed at the time.

Ecotopia also fell headlong into both of the standard pitfalls of the contemporary liberal imagination. The first of these is the belief that a perfect society can be attained if we can just abolish diversity of ideas and opinions, and get everyone to believe what the affluent liberal intelligentsia think they ought to believe. That’s why I put ongoing controversies between conservative and restorationist blocs into the story.  It’s also, on another level, why I put in repeated references to religious diversity—thus there are people running for public office in the Lakeland Republic who end an oath of office with “So help me Jesus my Lord and Savior,” just as there are military officers there who spend every Sunday at the Greek Orthodox cathedral in Toledo, and politicians who attend the Atheist Assembly.

The second pitfall, which follows from the first, is the belief that since you can’t get “those people” to have the ideas and opinions you think they ought to have, the proper response is to hole up in a self-referential echo chamber from which all unacceptable views are excluded. Ecotopia assumes implicitly that the United States, and by inference the rest of the world’s nations as well, are utterly irredeemable; the nation of Ecotopia thus barricades itself inside its borders and goes its green and merry way, and the climax of the story comes when William Weston decides to stay in Ecotopia and become one of the good people. (He had a significant other back home in the USA, by the way; what she thought of his decision to dump her for a San Francisco hippie chick is nowhere mentioned.)

We’ll be discussing both those pitfalls at length in future posts, not least because they bid fair to exert a massive influence on contemporary politics, especially but not only in the United States. The point I’d like to make here, though, is just how deep the latter habit runs through the liberal end of our collective imagination. I’m thinking here of another powerful and morally problematic work of fiction to come out of the same era, Ursula K. LeGuin’s haunting story “The Ones Who Walk Away From Omelas.” The core of the story is that there’s a splendid city, Omelas; its splendor depends on the infliction of suffering on one helpless person; now and again, people get upset by this, and leave the city. It’s stunningly well written but evades a crucial question: does walking away do anything to change the situation, or does it just let the ones who walk away from Omelas feel morally superior?

That was one of the reasons why the conclusion of Retrotopia didn’t feature Peter Carr chucking his Atlantic Republic passport and moving in with Melanie Berger. Instead, he caught the train back home, having committed himself to the challenge of trying to move his own country in the direction that the Lakeland Republic has already taken, in the full knowledge that he might not succeed. I had the entire last scene in mind from the beginning of the project, partly as a deliberate challenge to that aspect of Ecotopia, partly because that sort of leap into uncertainty seems much more relevant to our present predicament. We don’t know, any more than Carr did, what lies behind the clouds that hide the future.

Of course the primary difference between Ecotopia and Retrotopia was that my narrative was meant to explore a very different approach from Callenbach’s. He was trying to propose a new, avant-garde, cutting-edge future—it’s often forgotten that the kind of thing Callenbach was talking about really was seen as the next great wave of progress in the 1970s, before the current fad for schizoid withdrawal into a cybernetic Neverland took that title away from it in the 1980s. I’m trying to explore the possibility that going back to what worked is a better idea than plunging forward along a trajectory that leads to no place any sane human being would want to go. He was talking about innovation, while I’m talking about retrovation: the strategy of using the past as a resource for problem-solving in the present.

Retrovation used to be utterly unthinkable in modern industrial societies. At the moment, it’s making the transition from utterly unthinkable to unspeakably heretical—thus another term for it I introduced in a post a while back, the heresy of technological choice—but a lot of people still can’t get their minds around it at all. When I’ve proposed steampunk technology as one model for the future, I’ve inevitably fielded a flurry of comments insisting that you can’t possibly have Victorian technology without child labor and oppressive gender politics—and while I was writing Retrotopia, quite a few readers assumed as a matter of course that the tier system in the Lakeland Republic governed every detail of daily life, so that you weren’t allowed to have anything belonging to a post-1830 technological suite if you lived in a tier one county.

Not so. The word I’ve coined for the strategy under discussion, retrovation, is obviously backformed from “retro” + “innovation,” but it’s also “re-trove-ation,” re-finding, rediscovery: an active process of searching through the many options the past provides, not a passive acceptance of some bygone time as a package deal. That’s the strategy the Lakeland Republic puts to use in my narrative, and those of my readers who know their way around the backwaters and odd corners of history may find it entertaining to figure out the sources from which I lifted this or that detail of Retrotopian daily life. The rhetoric of progress, by contrast, rejects that possibility, relies on a very dubious logic that lumps “the past” together as a single thing, and insists that wanting any of it amounts to wanting all of it, with the worst features inevitably highlighted.

I’ve long since lost track of the number of times I’ve been told that rejecting the latest new, shiny, and dysfunctional technology, in favor of an older technology that works, is tantamount to cheerleading for infant mortality, or slavery, or living in caves, or what have you. I’ve sometimes thought that it might be entertaining to turn that around—“if you won’t use a cell phone, you must be in favor of bringing back a balanced global climate!”—or simply to take it in directions a little more absurd than it’s gone already—“if you prefer rail travel to air travel, why, you might as well just restart the Punic Wars!”  In either case, the point that might be made is the silliness of the progress-worshippers’ insistence that the past, or the present, or for that matter the future, is an all-or-nothing deal.

That’s also why, to return to my narrative for a moment, I made a point of showing that the sexual mores of people in the Lakeland Republic didn’t correspond to how people behaved at some point in the past—or, more to the point, to the mythical notion of how people behaved in the past that’s been circulated by certain pseudoconservatives in recent decades. Thus industrial magnate Janice Mikkelson is a lesbian with a lovely wife, Peter Carr happens to see two young men who’ve just gotten married on their way to their honeymoon, and when Peter and Melanie go out for dinner and an opera, the evening ends in her bedroom. I know that was uncomfortable for the social and religious conservatives among my readers, but it had to be there, for two reasons.

On the one hand, as a moderate Burkean conservative, I see absolutely no justification for imposing legal restraints on what consenting adults do in the privacy of their own bedrooms, or for that matter in that dimension of the public sphere that pertains to marriage licenses—and, after all, this is my utopia and I’ll permit what I want to.  On the other hand, just as I put devoutly religious people into the story to discomfit the sort of doctrinaire liberals who believe that nobody should follow traditional religious teachings, I put married gay and lesbian people into the story to discomfit the sort of doctrinaire conservatives who believe that nobody should follow contemporary sexual mores. In both cases, the point I hoped to make is that the Lakeland Republic, with its policy of retrovation and its relative comfort with a diversity of ideas and lifestyles, hasn’t gone “backward,” or for that matter “forward,” but off in a direction all its own—a direction that can’t be defined in terms of the monomaniacally linear fixations of the worshippers of progress.

And of course that’s the crucial point, the most important thing that I hope my readers got out of the narrative. At the heart of most of the modern world’s insoluble problems is the faith-based claim that human history is a straight line with no branches or meanders, leading onward and upward from the caves to the stars, and that  every software upgrade, every new and improved product on the shelves, every lurch “forward”—however that conveniently floppy word happens to be defined from day to day by marketing flacks and politicians—therefore must lead toward that imaginary destination.

That blind and increasingly untenable faith, I’ve come to think, is the central reason why the only future different from the present that most people can imagine these days, if it’s not Ecotopia, is either a rehash of the past in every detail or some kind of nightmare dystopia. These days, as often as not, that even extends to science fiction, once our society’s most effervescent cauldron of novel futures. While writing an essay on the genre for a new magazine of science fiction and fantasy, Mythic, it occurred to me—and not for the first time—how few recent works of science fiction seem to be able to portray a future society that isn’t either a straight-line extrapolation from the present, complete with all its most parochial features, a carbon-copy rehash of some specific society of the past, or a smoking wasteland.

Not all that many decades ago, SF authors routinely spun future societies as radically different from ours as ours is from, say, the ancient Maya, but such visions are rare now. I don’t think that’s accidental.  To borrow a metaphor from Retrotopia, when you’ve driven down a blind alley and are sitting there with your bumper pressed against a brick wall, the only way forward starts by backing up—but if you’ve been convinced by your society’s core ideological commitments that “backing up” can only mean returning whole hog to the imaginary, awful past from which the ersatz messiah of progress is supposed to save us, you’re stuck. There you sit, pushing uselessly on the pedal, hearing the engine labor and rattle, and watching the gas gauge move steadily toward that unwelcome letter E; it’s no surprise that after a while, the idea of a street leading somewhere else starts to seem distinctly unreal.

Other futures are possible. Retrotopia isn’t the only option, though I have to say it strikes me as a much more pleasant choice than what we’ve got now, and retrovation isn’t the only tool we need to get us out of that blind alley, though I suspect it’s more useful than a good many of the more popular items in our contemporary toolkit. Still, time will tell—and if my narrative irritates some of my readers enough to get them working on their own, radically different visions of a future that breaks free of the blind alley of linear progress, all the better.