Editor’s note: Today we have the final installment of our “Anthropocene Melbourne Campus” series, featuring two related posts by Lauren Rickards and Ruth Morgan.
Producing the Anthropocene, Producing the Future
Lauren Rickards, RMIT University
Images of the future are increasingly cast on the widescreen of the Anthropocene: the planetary-scale shift from the comfy Holocene to an unknown and threatening new ‘operating space’ for the Earth. How humanity inadvertently shifted the whole planet so radically and in such a self-damaging manner is now the subject of intense debate. Different narratives of blame locate relative responsibility with various sectors, activities and groups. Common candidates include farming, colonial plantations, industrialisation and urbanisation, and the post-war acceleration in consumption and pollution. From a material perspective, there is a strong geological rationale for naming each as a major source of planetary-scale environmental and social impacts and “terraforming.” Indeed, this is how these various proposed starting dates for the Anthropocene have been identified: through the pursuit of changes in the geological record widespread and sharp enough to count as what geologists call a “Golden Spike”, the prerequisite for declaring a new epoch. Yet this search for the physical origins of the Anthropocene in the historical record needs to extend far past physical signals and their proximate causes to the visions, goals and assumptions underlying the activities involved, including what Ian Hacking would call styles of reasoning. Reading the Anthropocene in this light reveals many limitations in the outlooks, ideas and values that informed the activities mentioned above, including an often wilful ignorance of the immediate impacts on people, nonhumans and the abiotic environment, as well as the “unknown unknowns” of the long-term, accumulative changes being wrought.
In combing history for the sources of our present problems, it is easy to lament past generations’ failure to care about the future. Yet their ignorance of future effects should not be conflated with indifference to the future per se. In fact, past fixations with and assumptions about the future were core to the problematic styles of thought that enabled Anthropocene-inducing actions. Some of the proposed start dates for the Anthropocene are notable not just for the physical changes they mark, but for the ideas about the future with which they were associated. In the remainder of this essay, I consider three of them.
The earliest of the three is an increasingly prominent origin date for the Anthropocene: the beginning of the Colonial era, when certain Europeans began their Earth-changing project of finding, appropriating and exploiting lands, natures, and peoples across the planet. This ‘organizational transition’, as Simon Lewis and Mark Maslin call it, was essentially about creating a globalized economy. As they put it, ‘A new world order driven by the search for private profit was born’. Key to this globalized economy was merchant capitalism: the distribution of products around the world via increasingly long transport routes involving ships. Conducting business through long, ocean-going supply chains demanded a new comfort with uncertainty, notably a new and positive calculative attitude toward risk. The dangers of merchant capitalism meant that the successful ‘creation of profits depended on foresight and planning’. This led to a new vision of time as ‘a commodity to be used, saved or sold to create profits, rather than something that was simply doled out by the creator.’
This reframing of the future, like space, as ‘a territory to be conquered or colonised’ was amplified with the onset of the Industrial Revolution, the second proposed start date for the Anthropocene that has a distinctive future orientation. Helped along by the same discipline of geology now looking to document the Anthropocene epoch, new sources of concentrated, underground, fossil energy were found and extracted, accelerating the ongoing terraforming of the Earth and the continual compression of the ‘turnover time’ of capital. Society sought to break away from not only nature but also the past. Some groups and places were considered more successful at this than others and were ranked as “closer” to the future. Underpinning this modern imaginary was a still dominant linear sense of time that reinforced a sense of the future as something to pursue as quickly as possible.
Acceleration is also the key characteristic of the third Anthropocene start date to highlight: the “Great Acceleration” of the post-war era, when production, consumption, and pollution all jumped in a frenzy of post-war world-making. Here, the shock of the wars, competing visions of the world to be rebuilt, and new computing technologies of the sort that eventually enabled the Anthropocene to be detected by Earth System scientists, led to a different stance on the future. No longer was time understood as reliably linear or the future as singular. Rather, a new profound uncertainty about what could happen led to a sense of the future as an array of multiple possibilities; a maze to be negotiated carefully, not just pursued blindly. Fittingly, the computer power that enabled this new stance on the future was a by-product of the atomic weapons program that produced the world’s first atomic bomb, detonated in July 1945, a moment now proposed as a suitable Golden Spike for the Great Acceleration phase of the Anthropocene. In 1964, RAND Corporation scientists in the United States used the ‘massive data processing and interpreting capability that… created the breakthrough which led to the development of the atomic bomb’ to produce a ‘general theory of prediction’, the aim of which was to ‘enable us to deal with socio-economic and political problems as confidently as we do with problems in physics and chemistry’. This ‘radical shift in notions of the future… became known as “forecasting”’.
Forecasting is now ubiquitous, thanks in part to the radical uncertainty that the Anthropocene has ushered in. While consideration of the future is more important than ever, it is prudent to remember that our assumptions about the future, and the tools we use to perceive it, are themselves rooted in the very origins of the Anthropocene. Thus, how suitable they are for helping us understand and navigate the Anthropocene is open to debate. One group that contests the implicit determinism of futurists and their forecasts is futurologists, who argue instead for both a greater degree of openness to unexpected possibilities, and for a commitment to actively and inclusively choosing desirable futures. Many commentators on the Anthropocene similarly call for greater reflexivity about what futures humanity as a whole is now creating, knowingly or not. With the future recast in this way as less a territory to map and claim, and more a path to be carefully and painstakingly created, we may be witnessing yet another stance on the future emerging, one more attuned to the blind spots and responsibilities involved.
Ruth Morgan, Monash University
Historians are not usually the ‘go-to’ people for talking about the future, or speculating on what the future might look like. But we can reflect on futures past, that is, on the kinds of futures that peoples in the past expected to inhabit. Although some of these futures have come to pass, it is not realisation of these futures that makes them worthy of study. Rather, it is the particular historical conditions that produced those forecasts—and the decisions they set in train—that is the bread and butter of historians. Allow me to reflect on the futures past of water in Western Australia, the vast and climatically variable state that covers nearly a third of the Australian continent.
In July of 1953, the West Australian newspaper declared that ‘High water consumption creates storage problem.’ The article assured readers, however, that, ‘At the rate of progress the department has made so far in the supplying of water, it seems unlikely that any day we turn on the tap in our bathroom or kitchen, no water will appear.’ According to the author, suburban supplies were adequate to meet the growing demands of Perth households, so long as there were sufficient rains to fill the dams. Less than a decade after the Second World War, a situation of path dependency had emerged in the state capital’s suburbs, whereby the existing infrastructure was shaping and directing future water resource development. As a result, anxieties about running out of water combined with increased supply to generate further demand, which in turn created more apprehension, a cycle that ultimately diminished the city’s resilience to fluctuating water supplies.
After the war, Perth households further increased their water use, outstripping the residents of other capitals. Total water consumption in Perth had been doubling almost every fifteen years since 1920. With the economic development of the 1950s and 1960s, the state’s population had soared and was increasingly concentrated in the suburbs of Perth. The Metropolitan Water Board was struggling to keep pace with the growth in demand, which fuelled engineers’ concerns that supplies would run out by the end of the century. How would they continue to slake the thirst of the suburbs? From where would the city drink next?
By the late 1960s, engineers considered that Perth’s dams would only be able to supply two-thirds of the water the suburbs would need by the end of the century. Further calculations in the early 1970s only confirmed these anxieties: demand for water would exceed the available supplies by the mid-1980s.
At this time, water planning in Australia was very rudimentary compared with that of Canada and the United States. Estimations of future demand tended to assume that population and per capita consumption would continue to increase at existing rates and water would have to be supplied accordingly; this approach reflected the relatively high level of available water resources per head in Australia’s cities. However, in Perth, this abundance was now in doubt, as the most accessible and cheapest sources had already been developed and some of these were being threatened by salinity. The city was running out of options.
Another key assumption underpinning water planning concerned the climate: water engineers expected it to remain unchanged. This, too, came into question in the mid-1980s, when it began to be understood that decision-makers could no longer rely on ‘the assumption that past climatic data without modification are a reliable guide to the future’.
Dry years in the mid-to-late 1980s in southern Australia coincided with the emergence of the enhanced greenhouse effect on the Australian political agenda. Local water managers feared that these conditions were indicative of a changing climate, with severe consequences for water supplies if they were to continue. How long might the dry years last? When would they abate? Would rainfall return to ‘normal’?
The developing climate change agenda of the mid-1980s prompted Australia’s national science agency, CSIRO, and the Federal government to convene the Greenhouse87 conference in late 1987. By this time, the increasing scientific and political concern about anthropogenic climate change and its likely impacts had begun to seriously challenge conventional approaches to environmental and resource management. Greenhouse87 was the first national meeting of scientists and resource managers to discuss the potential effects of anthropogenic climate change on Australia. The basis of these discussions was a CSIRO climate scenario for the year 2030, when the concentration of carbon dioxide in the atmosphere was expected to have doubled. According to this model, the resultant changes in atmospheric circulation would cause a decline in the rainfall of southwestern Australia.
The Water Authority suspected that the expected drop in rainfall might have already commenced in about 1970 and that it would continue into the middle of the twenty-first century. Such a climatic change would lead to a 20 per cent reduction in rainfall and an even greater decline (over 40 per cent) in the average streamflow of the region’s rivers, due to the relationship between the soils, climate and vegetation in catchment areas. With lower rainfall and streamflow, demand for scheme water would exceed supplies more quickly than anticipated. This new line of thinking suggested that water supplies could be insufficient by as early as 2020, rather than lasting until nearly 2040. Other sources had to be found and water demand had to be curtailed quickly.
Although Western Australian water managers were not the first to consider the challenges that climate change posed to existing water supply networks, they faced a unique situation where the predictions were remarkably similar to the climatic conditions the south-west had actually experienced since the 1970s. In these circumstances, the urgency of preparing a strategy to protect the southwest’s water supplies in the face of anthropogenic climate change was far greater than in other regions at the time.
With the overhaul of the management of the state’s water resources in the mid-1990s came closer scrutiny of the lower levels of rainfall that had prevailed since the 1970s. The state’s water authorities now identified a ‘non-linear jump’ to a new regional climate equilibrium: a state of lower winter rainfall. This new perspective on the region’s climate saw the state’s water managers further reduce the estimated long-term annual inflow to the southwest water supply system. In order to meet demand and to avoid imposing tighter water restrictions, the Water Authority brought forward plans to expand and develop additional water supplies. Doing so required them to fast-track their plans by a decade to ensure that there would be sufficient supplies as early as 2010.
Since the late 1990s, the Water Authority has continued with an ambitious program of water resource development for the city of about 2 million people. Groundwater aquifers and two desalination plants provide the lion’s share of water supplies, with recycled wastewater being pumped back into the ground to stabilise these precious reserves. The predictions of the late 1980s have been borne out: streamflow in the region has fallen by about 90 per cent. Furthermore, all the global climate models suggest that the decline of winter rainfall in south-western Australia will continue. Perth is not alone here: rainfall has also been declining in other areas with Mediterranean-type climates around the world, from South Africa to the southwest United States.
In retrospect, the decisions of Western Australia’s water managers in the late 1980s appear to have been very much informed by the ‘precautionary principle’, later set out in the 1992 Rio Declaration on Environment and Development: taking action in the face of scientific uncertainty to avoid serious or irreversible environmental damage. They found themselves among the first to respond to climate change on the basis of observable change. They were dealing with a canary in the climate change coal mine and chose to act on this very possible dry future.
Such studies of futures past shed light on cultures of prediction as well as the material and discursive consequences of the impulse to discern what lies ahead. Comprehending the likelihood of a particular future rests not only on a trust in numbers and models, but also on the ways in which they align with conditions experienced over time. Attending to the temporality of prediction in this way helps to anchor futures past in particular moments and places, revealing the extent to which the horizon remains unmade. Reading the future as yet undetermined may offer hope and opportunity for human agency in the face of the enormity of planetary crisis.
Andersson, J., 2012: The great future debate and the struggle for the world. The American Historical Review, 117, 1411-1430.
Giddens, A., 1999: Risk. Lecture 2 of ‘The Runaway World’, the 1999 BBC Reith Lectures, BBC, London. http://news.bbc.co.uk/hi/english/static/events/reith_99/week2/lecture2.htm
Hacking, I., 1992: ‘Style’ for historians and philosophers. Studies in History and Philosophy of Science Part A, 23, 1-20.
Harvey, D., 1989: The Condition of Post-Modernity, Blackwell, Oxford.
Lewis, S.L., and Maslin, M.A., 2018: The Human Planet: How We Created the Anthropocene, Penguin UK.
Massey, D., 2005: For Space, Sage, London.
Reith, G., 2004: Uncertain Times: The notion of ‘risk’ and the development of modernity. Time & Society, 13, 383-402.
Steffen, W., Leinfelder, R., Zalasiewicz, J., Waters, C.N., Williams, M., Summerhayes, C., Barnosky, A.D., Cearreta, A., Crutzen, P., and Edgeworth, M., 2016: Stratigraphic and Earth System approaches to defining the Anthropocene. Earth’s Future, 4, 324-345.
Steffen, W., Rockström, J., Richardson, K., Lenton, T.M., Folke, C., Liverman, D., Summerhayes, C.P., Barnosky, A.D., Cornell, S.E., Crucifix, M., Donges, J.F., Fetzer, I., Lade, S.J., Scheffer, M., Winkelmann, R., and Schellnhuber, H.J., 2018: Trajectories of the Earth System in the Anthropocene. Proceedings of the National Academy of Sciences.