A humanist on thin ice

by Tom Griffiths

I FEEL LUCKY to have visited both of Earth’s polar ice caps. Seven years ago I voyaged to Antarctica on an Australian ship making a routine expedition to resupply one of the scientific stations down south. I experienced that awesome encounter by sea with the great ring of ice that surrounds the southernmost continent. You sail directly south from Australia across the stormiest sea in the world for nine days and then, finally, the ocean stills, for you are among giants. The icebergs. They loom into sight, at first a craggy eminence in the mist and then a faceted jewel revealing its edges and cliffs and threatening fragility. These great icebergs made me shiver with more than cold. In their stateliness and marbled grandeur they provoke a frisson of fear, because of what they come from, because of what they are fragments of. You are heading towards a giant ice-making machine, and these are the most trivial of its industrial residues.

Two years ago I had a more intimate encounter with ice at the other end of the globe, when I visited Greenland. Instead of being on a container ship I was on a tiny fishing boat, and we sailed perilously close to fragmenting icebergs in an ice-choked fjord. I learned later that up to five people die every year in that bay doing just this, mostly Inuit fishermen from a small community of five thousand people and eight thousand sledge dogs. I was in Ilulissat, under the eaves of the bergs in Disko Bay, at the mouth of the fastest-moving glacier in the world. It is now moving forty metres a day, and its pace has doubled in the past decade. The Sermeq Kujalleq glacier is allegedly the source of the iceberg that sank the Titanic. Now it is a physical and political frontier of climate change. This is where American senators fly in to get a visceral sense of what greenhouse gases are doing.

In the early twenty-first century, when you contemplate a retreating glacier, what do you feel? You are overwhelmed by a timeless wonder and a primal fear. You are witness to an ancient, remorseless natural force, one that makes humans seem trivial, expendable and irrelevant. Ice is abiotic and inhuman. Yet, with what we now know about global warming, you can no longer regard calving ice as simply natural. The pace and circumstances of the event implicate us. In our generation, ice has become moral and political. So now we have other feelings when we contemplate the melting ice – ethical anguish about humanity’s responsibility, political passion to reduce greenhouse emissions, apocalyptic doom about our prospects and even perhaps an opportunistic zeal about what our nations might have to gain in the short term. When Greenlanders recently voted for independence from Denmark, some noted that once Greenland’s ice sheet melts, Denmark – one of the flattest countries in the world – will be under water anyway. Nationalism can look pretty parochial.

At the beginning of the twenty-first century, we are standing on the brink of a precipice, but at least we know that we are. We surely don’t understand all the dangers and opportunities ahead of us, but we are now roughly aware of our predicament. That at least is an achievement. The same industrial capitalism that has unleashed carbon has given us a planetary consciousness that reveals a calving glacier not as a random, local act of nature, but instead as the frightening frontier of a possibly irreversible global, historical event. How did we come to this realisation? It emerged partly from our contemplation of glaciers. Understanding ice – its history and its future – is the key to understanding climate change.


A HUNDRED YEARS ago we didn’t know much about ice. At the beginning of the twentieth century explorers still thought they might find polar bears in Antarctica. Even fifty years ago we were still finding out how much ice there is clamped to the base of the globe. One of the paradoxical responses I had to seeing both ice caps was to be reminded of just how much ice there is on Earth. After all, we do live in the lee of an ice age. The top third of the latitudes of each hemisphere has ice. Admittedly, we live in a brief relatively warm phase, a precious interglacial. Calling our own geological age the Holocene and thereby separating it from the Pleistocene – the Ice Ages – is misleading. As humans, we are inhabitants and creations of an extended, continuing series of ice ages, the Quaternary.

In the mid-nineteenth century a fine scientist shocked his colleagues and the public with a revolutionary view of the global past. I’m not referring to Charles Darwin, although that intellectual drama was happening in parallel. I mean the charismatic Swiss-born Professor Louis Agassiz, who in the late 1830s proposed not only that glaciers had moved rocks around and later retreated – hence explaining the puzzling presence of isolated boulders in Swiss valleys – but that whole countries had once been buried under miles of ice. There was something obsessive about Agassiz’s catastrophist crusade. He was prepared to be lowered into the bowels of glaciers to understand their dynamics. His friend and mentor, Alexander von Humboldt, warned him against the ambition of his theory: ‘Your ice frightens me.’

The idea of a former ice age of such a massive scale took several decades to be fully accepted by the scientific community. It overturned the dominant scientific assumption that Earth was gradually cooling down through history. And it challenged the biblical account of a great deluge. Instead, Agassiz proposed that it was frozen water that had wiped Earth clean. Climate, it seemed, was capable of variability beyond remembered human experience, even beyond imagination.

Although Louis Agassiz promoted the idea of an ice age, he did not spend much energy pondering its cause. It was this quest to understand the causes of ice ages that provoked the climate change science we rely on today. In the 1860s a Scottish scientist, James Croll, proposed an astronomical theory of ice ages, arguing that they were triggered by fluctuations in Earth’s orbit, tilt and wobble that affect the power of the sun. This idea was revived and improved in the 1920s and 1930s by the brilliant Serbian mathematician Milutin Milankovitch, but it long languished in the face of healthy scientific scepticism. The astronomical theory of ice ages finally gained confirmation in the 1970s, following the acceptance of another theory, that of plate tectonics (continental drift), and also due to new evidence of past climates, drawn from oxygen isotope analyses of deep-sea sediments.

Today, we accept the Croll-Milankovitch theory of the astronomical cause of the ice ages, but we acknowledge that other factors are also involved – and that there are still mysteries. Fluctuations in Earth’s orbit and orientation are clearly triggers. But these astronomical patterns also interact with earthly factors. To generate an extended ice epoch like the Quaternary it helps to have some land close to the poles, and plate tectonics explains how the continents can get themselves there from time to time. But it is the much faster feedback mechanisms of the ebb and flow of ice sheets, and the fluctuations in greenhouse gases and their absorption in the oceans, that can amplify small changes in global temperature, thereby precipitating an ice age.

How was human influence identified as a factor in modern climate change? Let us revisit four moments in that dawning understanding. Together they constitute a useful greenhouse history, a guide to the scientific insights – all of them in some way inspired by ice – that underpin our contemporary environmental consciousness.


IT IS 1859, the Origin of Species has just burst upon the world and we are in London in the laboratory of the Irish scientist John Tyndall. Like Louis Agassiz, Tyndall is a very keen mountain climber, and he too is spellbound by glaciers. He has just returned from a summer in the European Alps and in a few years will lead an assault on the Matterhorn. Tyndall is investigating a possible cause of ice ages; he is interested in how the atmosphere might control Earth’s temperature. He wants to test the accepted notion that all gases are transparent to radiant heat. In his laboratory he first tests the main gases in the atmosphere, oxygen and nitrogen, and he finds that, yes, they are indeed transparent. Perhaps he will abandon this experiment. But then he thinks to test coal gas. There it is, easily accessible to him, for it is piped into his laboratory for lighting, an industrial gas, mostly methane produced by heating coal. When he tests it, he finds that it is opaque – it traps heat rather than letting it through. ‘Thus,’ as one writer put it, ‘the Industrial Revolution, intruding into Tyndall’s laboratory in the form of a gas jet, declared its significance for the planet’s heat balance.’ Tyndall then tries other gases and finds that CO2 is also opaque. In 1859, already a celebrated year in the history of science, the role of greenhouse gases in controlling the temperature of the planet has been identified.

Now it is 1896, and we’re in Sweden looking over the shoulder of Svante Arrhenius. He is doing lots of tedious calculations – he is also interested in what might cause ice ages and he is investigating whether levels of atmospheric carbon dioxide might be a factor. What happens to Earth’s temperature, he wonders, if the level of CO2 in the air doubles? He estimates that it would raise global temperatures by 5 or 6 degrees Celsius. But he believes that such a change could only happen after thousands of years of burning fossil fuel – and anyway, higher temperatures don’t seem such a bad thing in wintry Stockholm.

Now it is 1938, and we’re in London again. A British steam power engineer, Guy Stewart Callendar, has the temerity to stand up before a gathering of professional meteorologists – the Royal Meteorological Society, no less – and tell them about the weather. He is speaking to them with the spectre of war overhanging them all, and at the end of three decades of mild winters in Europe and North America. He is drawing on recent and historical accounts of glaciers; scholars have started referring to the period from 1300 to 1860 as the Little Ice Age, and Callendar is one of many who are beginning to reflect on the recent experience of glacial retreat. But in this talk he uses the relatively new scientific artefact of global temperature records, which began to be constructed from the 1870s. Callendar’s subject is the recent upward trend in these global temperatures. See, he tells the meteorologists, it is unmistakable since 1910. He also ventures an explanation: the globe is warming because of rising CO2 levels in the atmosphere. This expert on steam power technology, this professional of the industrial revolution, declares: ‘As man is now changing the composition of the atmosphere at a rate which must be very exceptional on the geological time scale, it is natural to seek for the probable effects of such a change.’ As a true scientist, Callendar proposes that the test of his theory will be the next twenty years of global temperatures. He utters these words on the eve of several decades of average global cooling, and so his theory will fall into neglect.

IT IS NOW 1957, the International Geophysical Year, and Antarctica has become the focus of international scientific activity. The Cold War has generated considerable superpower investment in polar expertise, but many other nations are also involved and, astonishingly, a Cold Peace (the Antarctic Treaty) is soon to be negotiated. One of the scientists working on the ice is Charles David Keeling, an American. But his interest is in the clean air down there. He is keen to measure levels of atmospheric carbon dioxide, and so he is looking for the purest, freshest, windiest air to test. Keeling has chosen two places: one is down at the South Pole (where the Americans have just established a station) and the other is on top of a mountain in Hawaii, in the middle of the Pacific Ocean. It takes only eighteen months for Keeling to discover that CO2 is rising steadily and noticeably at both sites. His graph also reveals the planet breathing, for in the northern hemisphere (where most of the land is) sprouting vegetation draws CO2 from the atmosphere each spring and then releases it again in the autumn. But against this gentle, predictable rhythm there emerges a relentless upward trend. It was this graph that so disturbed Al Gore as an undergraduate, as he explained in An Inconvenient Truth – it was Gore’s professor, Roger Revelle, who set Keeling on this work. Although Keeling begins his measurements in a period of global cooling, his findings will swiftly and decisively precipitate scientific concerns about future global warming well before the warming trend reasserts itself in the 1970s. The science will thus show its ability to distinguish weather from climate.

This brief history of ideas reveals that most of the foundational climate science consisted of dispersed insights generated by lone, underfunded and part-time researchers. It is too easy, in sketching the history of science and technology, to portray a heroic, stumbling progress towards enlightenment and an inevitable, if faltering, revelation of truth. It is worth noting that this was curiosity-driven research, and mostly non-urgent and non-anxious, and that many of the scientists were researching the ancient past – the causes of ice ages – rather than the immediate future. And where they speculated on future global warming, they saw it mostly as positive. The prospect of several degrees’ rise or fall was greeted equably. Indeed, if the world were warmer, it might make winters more comfortable and agriculture more productive, or even help stave off the next ice age. For the first two-thirds of the twentieth century the global warming trend was called the ‘embetterment’ of climate, or the ‘recent amelioration’. The history of the science helps us to see through contemporary accusations that we are all in thrall to some sinister, late-twentieth-century global conspiracy.


BUT A CRITICAL question remains: where does our current sense of urgency and crisis come from? Here, once again, we must return to the ice.

Embedded in ice are tiny air bubbles from hundreds of thousands of years ago. When you drill into an ice cap that is kilometres thick, you can extract a core that is layered year by year, a precious archive of deep time. Ice cores are the holy scripts, the sacred scrolls of our age. When I was in Denmark in 2008, I paid reverent pilgrimage to the Centre for Ice and Climate in the Niels Bohr Institute at the University of Copenhagen. This was where, fifty years ago, the Danish scientist Willi Dansgaard commenced his work on oxygen isotopes and began coring the Greenland ice cap. My host at the institute kindly took me to the ice archive. It was an unexpected treat, so I did not have polar gear. After locating the hidden key, we pushed open the door to the sacred chamber of ice. The temperature of the first room was -14 degrees Celsius with a significant wind-chill factor, for the air conditioner was labouring hard to reduce the temperature further. We admired some ice and instruments, and then with due solemnity entered the inner precinct, a room at -24 degrees Celsius and going down. Here my breath began to shatter and my brain ground to a halt. There, in ordinary boxes, sample ice cores were kept in labelled plastic bags. Each core was clearly stratified, with layers like bleached tree rings. Some younger sections had visible bubbles; older sections were grey and murky with silt.

My host explained that the university has several back-up systems of refrigeration, including an emergency generator in a hidden place – hidden, because if the city lost power for some time generators would be at a premium, and what could be more important than preserving ice? I was told that some of the ice core collection is kept offsite, in a nuclear-fallout shelter. When occupying this unusual structure, the scientists had to sign an agreement that ‘in the event of a nuclear war’ they would vacate the shelter within three days. Ice archives are serious business.

It was ice that delivered the scary sense of urgency that we now feel about global warming. The oldest Greenland cores go back to the last interglacial, about a hundred and twenty thousand years ago, whereas the deepest Antarctic cores currently retrieve eight hundred thousand years of climate history. In Antarctica, there is less precipitation and seasonality and more compression of the layers of ice; resolution is traded for time. In Greenland the layers are clearer, because of the greater annual accumulation of ice. And so the more discriminating Greenland cores are essential to calibrating the longer, more condensed Antarctic archive. The polar ice caps therefore combine beautifully to give us detailed long-term climate data.

In the early 1980s the work of Dansgaard and others on the Greenland ice sheet delivered a shock. Until that time there had been a reassuring assumption that if global warming was happening, it would happen slowly, at a stately pace, as it presumably had happened in the past. But the fine, year-by-year Greenland data revealed that climate change had often occurred quite quickly, with temperature changes of as much as 5 or 6 degrees Celsius in a few decades.

This history of ‘unpleasant surprises’ (as the American earth scientist Wally Broecker termed them) was deepened by drilling in the southern ice cap. In Antarctica, in the 1990s, a long ice core reaching back some four hundred and twenty thousand years was extracted from the middle of the ice sheet near the Soviet station, Vostok. The Vostok core, which charted four full cycles of glacial and interglacial periods, established that the carbon dioxide and methane concentration in the atmosphere had ‘moved in lockstep’ with the ice sheets and the temperature. It also revealed that present-day levels of these greenhouse gases are unprecedented during the past four hundred and twenty thousand years. And when the ancient temperature variations were graphed, they seemed to be the nervous palpitations of a highly sensitive system.

Whereas the Keeling graph of rising CO2 gave climate science a future to be concerned about and a predictive hypothesis to think with, the Greenland and Vostok ice cores gave climate science a deep past that was equally unsettling. So it was really only from the late 1980s, as abrupt climate change began to emerge from the ice record, that there was urgency and anxiety – and this coincides with the fast warming of the ’80s onwards. This was the same period in which ecological science abandoned the idea of the ‘balance of nature’ and accepted ‘disturbance’ as normal in ecosystems. It was the same period when ‘punctuated equilibrium’ – the idea of sudden change – re-entered debates in evolutionary science. Today, especially following the UN Climate Summit in Copenhagen, we feel that we are responding to this crisis too slowly – and we are. Yet a clear sense of urgency among climate scientists is only twenty years old and a confident scientific consensus has been articulated only over the past decade. So the task of coming to terms with its social and political implications has barely begun.


THE SCIENTIFIC AND cultural history of ice, especially of the southern ice cap, has driven and reflected our changing understandings of climate. Antarctica, which was so little known at the start of the twentieth century, became by the end of it the key to the future of humanity. Antarctica, it was slowly discovered, was quite unlike the Arctic. Northern hemisphere assumptions foundered on a very different, southern reality: a continent at the pole, an ice age in action, the cold core of Earth’s atmosphere, the engine of global climate. Antarctica is Louis Agassiz’s dream – or nightmare – discovered on our planet, in our own time. It is where nine-tenths of the world’s land ice resides.

Antarctic voyagers of the nineteenth century had originally wanted rock. They wanted rock, and soil they could plant a flag in and claim for their country. The ice was in the way; it was a nuisance, an obstacle. It quickly became a testing ground for physical endeavour and moral courage, a source of beauty and fear, yet still an obstruction stopping people from reaching, studying and claiming the land beneath. But by the mid-twentieth century the ice itself became a primary scientific focus and was no longer regarded just as an obscuring and frustrating barrier. People began to see Antarctica as a vestigial landscape, a giant white fossil. The continent of ice seemed to allow us to travel through time to the Pleistocene Earth. The ice had a history, one that revealed the vicissitudes of the whole Earth.

So the intellectual and physical focus of Antarctic exploration shifted. The great symbol of heroic endeavour in Antarctica had been the sledging journey, and its aim had been primarily geographical: to go furthest south, to reach a pole, to extend territorial boundaries, or to map features or coastlines. Explorers travelled over the ice and fought its surface, and they hungered for any rock that might defy or penetrate it. A century ago this year, Roald Amundsen and Robert Falcon Scott began their Antarctic voyages and were preparing to race each other over the ice to 90 degrees south.

But, from the mid-twentieth century, a different type of journey became the centrepiece of Antarctic expeditions. The traverse of the ice cap by tractor was the dominant heroic feat of the 1950s and ’60s, and its chief purpose was the study of the ice itself. Scientists began to measure the depth of the ice sheet; the pioneering glaciological work was done by a Norwegian-British-Swedish expedition in 1949-52. The men of this expedition thought the ice cap was no more than a few hundred metres thick, and so they were stunned to find that it was kilometres thick. They doubted their own arithmetic. They checked and rechecked their measurements in disbelief. These men in their tractors crossing the ice cap found that the driest of all continents was actually a vast elevated plateau of frozen water. This startling discovery, little more than half a century ago, revealed that the world’s sea level is principally controlled by the state of the Antarctic ice sheet.

Questions about the ice changed from how frustrating it was to how vast, to how continental, to how deep, to how old and then to how stable. Ice, it was realised, was vulnerable and endangered. The confirmation of anthropogenic global warming came not only from the behaviour of the ice sheet, but more clearly and dramatically from the air bubbles trapped in the ice. Before Antarctica was even seen by humans, it was recording our footprint.

If a hundred years ago the defining Antarctic journey was the sledging expedition across the surface of the ice, and fifty years ago it was the tractor traverse that, with seismic soundings, measured the volume of the ice sheet, then the defining Antarctic journey of our own era goes straight down, with the help of a drill, from the top of the ice dome to the continental bedrock, a vertical journey back through time. And the ice core extracted enables us to see our civilisation in the context of hundreds of thousands of years of climate history. Right now, in Antarctica, the international race is on again – not for the South Pole, not for the first trans-Antarctic traverse, but for the first million-year ice core.


SCIENTISTS OFTEN MOVE between two time scales. One goes back only five or ten years and is determined by the need to remain on what is seen to be a fast-moving, progressive frontier of scientific knowledge, and the other embraces a geological and evolutionary time span of millions of years. The time scales in between – those that represent a human lifetime or the centuries that characterise a society and its environmental practices – are the expertise of the humanities scholar. Science can also have a distinct spatial expertise, often bringing into focus the very distant and the microscopic, but giving less attention to human-scale geographies. Humanities scholars are thinking their way into these gaps and constantly refining the tools of analysis for the middle-level dimensions of time and space. Thus, the study of century-scale environmental and social change has become a growing area of humanities research.

Some of the new sense of urgency and anxiety about climate change has been delivered by historical research into the past thousand years of human history. Ice cores and pollen analyses have shown impressive chronological discrimination, but documentary research is capable of even greater precision. Scientific studies often strain for an ‘objective method’ over and above what is perceived to be the ‘subjective’ reading of historical documents, and thereby bypass a rich, tangled archive in favour of a constructed, countable one. We need both, together. Thanks to fine-grained social and political analysis, and to burgeoning scholarship in the field of historical climatology, we have come to a better understanding of what a difference a degree makes.

The second half of the nineteenth century saw the ending of an amorphous event that had stalked five hundred years of human history. The Little Ice Age of slightly lower average temperatures ended in 1860, and as the glaciers receded before their eyes scientists began to investigate the causes of glacial epochs. In the same decades, the industrial revolution gathered momentum and human-made carbon emissions began to take effect. The Little Ice Age began as a natural fluctuation, but did it end naturally? Where, we now wonder, did the Little Ice Age end and human-induced global warming begin? Historical climatology aims to give us a more accurate understanding of the natural background climate variability, against which to measure anthropogenic influence.

The Little Ice Age, which began about 1300 and was most pronounced in the North Atlantic, was one of two significant, sustained temperature fluctuations in the past millennium. The other is known as the Medieval Warm Period and lasted from about 900 to 1300. Each of these fluctuations is tiny, compared with the great swings of temperature during the Pleistocene and those that we now face – they have been in the range of only about 1 degree Celsius. But even these relatively small fluctuations have had significant effects on human geography and society. Most famously, Vikings colonised Iceland and Greenland in the Medieval Warm Period, and then probably abandoned those settlements as the ice re-invaded pastures and seaways. In Europe the worst years of the Little Ice Age, 1560-1660, were the period of the witch persecutions. In the late eighteenth century hard seasons, poor harvests and food shortages were a factor in the French Revolution. And on the other side of the world, at the same time, the British colonisation of Australia was threatened by a historically severe El Niño. The contours of climate and the vicissitudes of weather are deeply inscribed in the social record.

One of the great climate historians of our age is a French cultural hero, Emmanuel Le Roy Ladurie. He was a student of Fernand Braudel’s and a member of France’s Annales school, which championed historical investigation of la longue durée, the history of the long term. Le Roy Ladurie is best known for path-breaking, even racy books in cultural micro-history – especially Montaillou (1975) and Carnival in Romans (1980), detailed reconstructions of French village life in the fourteenth and sixteenth centuries. But you could say that climate history has been his enduring passion, for his research into this subject began in the 1950s and culminated in 2009 with the publication of the final volume of his trilogy on the climate of the past millennium. His changing historical interpretation of the climate of the twentieth century provides a measure of a cultural scholar’s cautious awakening to the sense that something disturbing is indeed unfolding around us.

Steeped in the history of the rural economy of southern France, Le Roy Ladurie discerned the variety of daily weather and the fluctuations in climate in the background of ordinary and revolutionary human events. Until the eighteenth century, when scientific instruments such as the thermometer and barometer became refined and available, climate seemed largely unquantified and unquantifiable. Weather was described in qualitative and impressionistic terms, and was interpreted portentously or theologically. As Le Roy Ladurie commented, ‘the handwritten comments on climate from some parish register, or the worm-eaten and illegible records of some lawyer, were too accidental and irregular to provide material for really organised knowledge.’ So he asked himself: ‘How, before thermometers came into use, can variations of 1 degree Celsius or less be detected?’ He drew on dendrochronology and on Willi Dansgaard’s first ice cores from Greenland, but he also turned to the documentary records of wine and ice, to the fluctuating dates of the grape harvest and to written accounts of the advance and recession of glaciers in alpine valleys. And, like Louis Agassiz, he strode the mountain valleys and compared glaciers of today with medieval descriptions of them.

Le Roy Ladurie began work on his first climate book, Times of Feast, Times of Famine, about the same year that Charles David Keeling started measuring the rising atmospheric levels of CO2. There had been a decade and a half of slightly cooler average global temperature, and the English historian of weather Gordon Manley even thought that the ‘recent amelioration’ of climate might be over. He contemplated the return of a colder climate and feared for the future of Icelanders. This was the period, too, when scientists strongly doubted two theories now considered fundamental to an understanding of natural climate change: the Croll-Milankovitch astronomical theory of ice ages, and the idea of continental drift. So, as Le Roy Ladurie began his work, global temperatures seemed to have stopped rising and ice ages remained an enigma.

In place of all this uncertain scientific speculation, this ‘cyclomania’, Le Roy Ladurie offered the solidity of year-by-year social data. He was not interested in cycles (which he believed were speculative and ahistorical), but in fluctuations and trends (which could be shown to exist, or not). He believed in the ‘possibility of a scientific history of climate’ and wanted to redeem historical climatology from its identification with environmental determinism, especially the simplifying and racial arguments of geographers like Ellsworth Huntington, the American academic. He also aimed to immerse the study of climate in the quirky contingencies of history. ‘Climate,’ he declared, ‘is a function of time. It varies; it is subject to fluctuation; it has a history.’

Amid mid-century doubts about the continuation of ‘the recent amelioration’ Le Roy Ladurie set out to show that ‘climatic fluctuation is not peculiar to the twentieth century.’ He compared the mild twentieth century with the Medieval Warm Period and the Little Ice Age, and confirmed that all are ‘modest variations’ (to use Manley’s term) within 1 degree Celsius. And, looking back from the 1960s, history seemed to show a routine return to the norm rather than any progressive long-term change. Le Roy Ladurie explained: ‘none of these minor fluctuations has been such as to modify the pollen diagrams seriously or for any considerable length of time; none of them has driven back the beech forests, or brought back the hazel to central Sweden or the holm oak to Normandy.’ He believed that historians tend to argue a ‘rigid fixity’ against ‘the headlong exaggerations of climate fiction’.

Forty years later, in the early twenty-first century and in the final volume of his climate trilogy, Le Roy Ladurie has accepted that there is ‘something new under the sun’. He retains his caution about climatic determinism and his prudence about over-reading singular weather events and short-term climatic fluctuations, but he is now ready to contemplate relinquishing his ‘fixity’. As a Frenchman he reluctantly acknowledges that, now even on matters of meteorology, Charles Darwin the evolutionist has probably triumphed over Georges Cuvier the fixiste. As a historian of material life he enthusiastically reads the seasons not only in the dates of the grape harvests but in the quality of the vintages. As a student of Braudel’s he is intrigued by the way climate history constantly poses the challenge of how to distinguish l’histoire événementielle from la longue durée, the surface event from the underlying trend. He emphasises the short-term benefits of twentieth-century warming for agriculture, the wine industry and tourism. His year-by-year history of weather and decade-by-decade tracking of climate brings him to a present that is more than thirty years into a period of sustained and rapid warming, one that began in 1976.

Le Roy Ladurie is a historian, he says, and not a forecaster. His professional training bids him to stop on the steps of the temple of the future. But he can look back from his vantage point and survey the knowable landscape of the past, and he can discern that we are at that point in our history where our own climate ‘fluctuation’ is poised to go beyond the scale of what people have previously experienced and recorded. So it is not only looking forward that can alarm us. Looking backward confirms that we are now tumbling into a new era.


THE FIRST DAY of last summer, 1 December 2009, dawned unseasonably cold in Canberra. The minimum temperature that morning was more than 6 degrees Celsius below normal. That’s an ice-age kind of difference. As members of the Liberal Party made their way to a leadership meeting that day, they would have had to put on winter jackets and clear condensation from their car windows. On that crisp ‘summer’ morning, to some members of parliament global warming may indeed have seemed like ‘absolute crap’, as the aspiring leader Tony Abbott described it. The leadership crisis had been prompted by a policy disagreement about climate, but it may have been weather that delivered Abbott’s one-vote victory.

‘Climate is what you expect, weather is what you get,’ runs the old adage. But these days we hardly know what to expect. And what we expect depends less on statistics than on belief. The public debate about climate change is now not really about the science – even when it looks like it is. We are in the realm of competing ideologies and differing belief systems; we are engaged in politics. If you know your politics, it is easy to predict who will think what about the science. We need to wonder why, as Richard Hamblyn has observed, climate change may be the ‘first major environmental crisis in which experts appear more alarmed than the public’.

The public tends to define climate as the domain of the natural and physical sciences, and yet the greatest challenges in this area are political, cultural, social, psychological and emotional. As Mike Hulme writes in Why We Disagree about Climate Change (Cambridge, 2009), ‘We should not hide behind science when difficult ethical choices are called for. We must not always defer to “science” or to the “voices of scientists” when we need to make decisions about what to do. These are decisions that in relation to climate change will always entail judgments beyond the reach of science.’

Bridging the divide between science and the humanities is vital. Understanding climate requires us to be highly interdisciplinary; we all have to venture on to thin ice. That is why the science is so exciting and the politics so slippery. We have to draw upon physics, astronomy, oceanography, chemistry, geology, biology, medicine, geography, glaciology, meteorology, archaeology, anthropology, history, law, literature, languages, art, politics – and more. Many previous scientific revolutions – the Copernican, the Darwinian, the discovery of deep time – have decentred and diminished the power and significance of humanity. By contrast, the scientific revolution of climate change reveals the cumulative, insidious, all-pervading power of people on Earth. This is not just a technical issue; it implicates and challenges our humanity.


CULTURAL AND INTELLECTUAL histories are an essential part of our scientific literacy. Understanding the history of ideas enables a more subtle and discriminating assessment of the public debate about climate science today. There is not only widespread confusion about the science, but also a misunderstanding of how science – or any disciplined knowledge – operates. Acknowledging that science is inescapably political and cultural is one thing; calling professional scientific consensus a conspiracy is quite another. The current debate about the workings of ‘this purified body of information’ takes us back to the Enlightenment and the origins of scientific method. What is this specialised, systematised, self-critical form of knowledge, and how did its conventions evolve? At the heart of its identity, resilience and usefulness is the tradition of rigorous scepticism. It is a scepticism that challenges and questions but which submits to the rules of evidence and review.

The very different kind of so-called scepticism that has recently gained power in political debates about climate change is contemptuous not only of the science, but also of the disciplinary and professional traditions of science. Scepticism of this kind has been justified as a personal and political right, as an optional belief system and as an act of courage. It is reasonable, therefore, to give it a different name: ‘denialism’ or ‘contrarianism’ better captures its oppositional and partisan character. It needs to be taken seriously, not as science but as culture – as a social and psychological phenomenon with a history and pathology. Clive Hamilton’s recent book, Requiem for a Species (Allen & Unwin, 2010), begins that necessary task.

A suspicion of scholarly ‘elites’ characterises and impoverishes the Australian debate about climate change. I am reminded of the vigorous historical dispute several years ago about white-Aboriginal frontier warfare, when accusations of a professional conspiracy were made against academics. The campaign was led by Keith Windschuttle, Quadrant and The Australian; that Windschuttle’s historical work failed academic scrutiny seemed just further proof to many media commentators that a conspiracy existed. Similarly, the geologist Ian Plimer’s unwillingness to submit his alternative vision of climate change to peer review or to debate questions of evidence, and his consequent professional isolation, have been interpreted – again especially by The Australian – as confirmation that he is a bold whistleblower. Frontier warfare and climate change have little in common other than that they are identified by some conservatives as concerns merely of the left.


AUSTRALIA IS GOING to be at the frontier of climate change pain. As inhabitants of an arid continent in the grip of the El Niño Southern Oscillation, a land of drought and flooding rains, a place of escalating fatal bushfire, and with a small and embattled agricultural economy, Australians might have been expected to rush to sign Kyoto a decade ago and to have brought credible legislation for reducing carbon emissions to the Copenhagen summit. The growing Australian public rejection of climate change science may be merely another example of our southern isolationism. Or it may be further testimony to the power that American politics and culture have over Australian society. Or perhaps it is because, for two hundred years, the European colonisers of Australia have struggled to come to terms with the extreme climatic variability of the continent. Australia has a boom-and-bust ecology. Settlers have had to learn, slowly and reluctantly, that ‘drought’ is not aberrant but natural; they have struggled to understand aseasonal and non-annual climatic variation; they have had to accept a wilful nature that they cannot control or change. They are still learning. And now, suddenly, Australians are confronted by long-term, one-way climatic change for which they, in part, are held responsible. It challenges everything they have so far learned about their new land.


THROUGHOUT HIS LIFE the famous French historian Fernand Braudel championed the multiplicity of time, and the need for historians to look beyond ‘social time’ or l’histoire événementielle, the history of events, in order to embrace la longue durée, the slower-moving structures and rhythms of centuries. He saw the history of events as a ‘surface disturbance, the waves stirred up by the powerful movement of tides. A history of short, sharp, nervous vibrations...A world of vivid passions, certainly, but a blind world, as any living world must be, as ours is, oblivious of the deep currents of history, of those living waters on which our frail barks are tossed.’ ‘In truth,’ he wrote, ‘the historian can never get away from the question of time in history: time sticks to his thinking like soil to a gardener’s spade.’

Braudel’s long-term thinking was a product of the thirty-year war of the early twentieth century. ‘Can there be any humanism at the present time,’ he wrote in 1946, ‘...without an ambitious history, conscious of its duties and its great powers?’

The nineteenth and early twentieth centuries had seen the rising dominance of political history and the short time span. It was a consequence of great technical advances in the discipline of history, the refinement of the science of the document and the institutionalisation of the archive. Political history satisfied a ‘desire for exactitude’ and ‘followed the history of events step by step as it emerged from ambassadorial letters or parliamentary debates’. But, for Braudel, decades of twentieth-century war ‘have thrown us violently back into our deepest selves, and thence into a consideration of the whole destiny of mankind – that is to say, into the crucial problems of history. It is a time to lament our state, to agonise, to ponder, a time in which we must of necessity call everything into question.’ Braudel’s own experience of wartime captivity made him seek escape from the grim immediacy of the history of events, and made him reject the short time span, itself a sort of imprisonment. So he participated in that urgent postwar search for a history to live by, one that found human commonality beyond the categories of nation or race, and one that pushed history back into prehistory.

In the half-century and more since Braudel wrote those words, his longue durée has lengthened beyond his dreams as the Darwinian revolution has continued to unfold. Uncertainty and contingency – qualities at the very heart of the historical enterprise – have entered the descriptive theories of the physical and biological sciences. It is our relatively recent discovery, as David Christian put it when teaching his Big History of the past fifteen billion years, that ‘history is not an attribute of human society alone, but that the earth itself, life on earth and even the universe, have histories.’ It is not surprising that much world history is now also environmental history, a coming together of science and the humanities. Both find stimulus in the Darwinian revolution, the detonation of the atom bomb, the landing on the moon, the revelation of global warming. The Earth floats alone in space, and life upon it seems historical, vulnerable, contingent, finite.

Responding to his earlier sense of world crisis, Fernand Braudel found that humanism needed an ‘ambitious history’, one that connected la longue durée with social time: ‘nothing is more important, nothing comes closer to the crux of social reality than this living, intimate, infinitely repeated opposition between the instant of time and that time which flows only slowly.’ ‘Deep time’ and ‘social history’ seem to be the antithesis of one another, each operating on utterly different time scales and subject matter. One conjures up ancient evolutionary history, even a non-human world, while the other is engaged in the study of society. One deals in awesome geological eras, while the other takes its chronological scale from a human lifespan. The climate change crisis challenges us to connect these dimensions, to work audaciously across time and space and species.

Humans have always been biological agents – but, as the historian Dipesh Chakrabarty has put it, we can become geological agents only historically and collectively. And making sense of the crisis of climate change demands that we relate two apparently incompatible chronologies: the modern divergent history of human freedom and inequality, on the one hand, and the universalising deep-time history of ourselves as a species, on the other. The summit in Copenhagen was a global political theatre for those tensions.

A deep-time environmental history of humanity poses the question of whether we might possibly be a species of the ice ages. About 3.6 million years ago the first northern hemisphere glaciation of the current ice epoch reduced rainfall in East Africa and forced the least specialised and most versatile apes to the edges of the contracting forests. There they faced severe pressure to innovate or perish. Interglacials enabled the stronger survivors to consolidate, while the next ice age tested and extended their descendants even further. Deep in the forest, the more specialised and successful tree-climbing apes faced fewer selection pressures. The drier cycles brought by the ice ages – ten or a dozen every million years – combined with the reduction in African rainfall resulting from Australia’s drift north into equatorial seaways, pushed early humans out onto the plains, encouraged them to walk upright and demanded they live on their wits. By the time of the last ice age, fully modern humans had spread across the globe.

In the past two hundred years humans have registered on Earth’s meter as a geological force. We are changing the climate by ‘digging up the dead’, by bringing to the surface the fossilised, organic matter of once-living things and burning it. Industrialisation has initiated a new geological era that historians like to call the Anthropocene, characterised by pervasive human influence on Earth processes. It is both awful and awe-inspiring that we are now crossing a threshold of geological eras. As a result of our own actions we may be leaving behind not just the Holocene, the past ten thousand years of relatively stable climate, but also the Pleistocene, the several-million-year period of cyclical ice ages that has seen the evolution of modern humans. We have collectively become a force in climate that is comparable to the astronomical causes of ice ages.


THE ONLY WAY to make sense of our predicament is to look deeply into the ice we are losing. It is to go back to the last big ice age and beyond, to times of rapid and substantial temperature change. And when we are searching for some vestige of human agency among all this icy determinism, we might turn to an example in our own backyard. The history of the Aboriginal peoples of Australia takes humans back, if not into the ice, then certainly into the ice age, into the depths of the last glacial maximum of twenty thousand years ago and beyond, into and through periods of temperature change of 5 degrees Celsius and more, such as those we might also face. When Europeans and North Americans look for cultural beginnings they are often prompted to tell you that humans and their civilisations are products of the Holocene, and that we are all children of this recent spring of cultural creativity. By contrast, an Australian history of the world takes us back to humanity’s first deep-sea voyagers of sixty thousand years ago, to the experience of people surviving cold ice-age droughts in the central Australian deserts, and to the sustaining of human civilisation in the face of massive climate change. This is a story that modern Australians have only just discovered, and now perhaps it offers a parable for the world.

We need meaningful histories of the really longue durée that enable us to see our own fossil fuel society in proper perspective, and to see ourselves not just as a civilisation but as a species. As we confirm our making of the Anthropocene, we need to develop our capacity to look back even further than the Holocene. We need to be able to look into the disappearing ice and see how much of our history, how much of our future, is bound up with its fate.

From Griffith REVIEW Edition 29: Prosper or Perish © Copyright Griffith University & the author.