Thoughts on passing 400 ppm

In the next few days, the Mauna Loa atmospheric CO2 record will pass 400 ppm. This isn't the first time that's happened – we first crossed the 400 ppm threshold in May 2013 – but the annual, saw-tooth variation in levels, as the Northern Hemisphere's boreal forests breathe in and out, has dipped us back below 400 a couple of times since. This crossing is likely to be special, however, as it is probably the last time anybody alive today will experience an atmosphere with LESS than 400 ppm CO2.

Human emissions have been pushing atmospheric levels up by about 2.2 ppm per year in recent years, so we would normally expect the annual minimum to climb beyond 400 ppm from this year's September minimum of 397.1 ppm before long. However, we are also in the midst of one of the strongest El Nino events in more than a decade, and the tropical drought that accompanies El Nino slows tree growth relative to normal years and increases fires. Previous strong El Nino years (such as 1997) helped push the annual CO2 increase to a massive 3.7 ppm, and this year's strong El Nino, coupled with increased forest burning in Indonesia on top of fossil fuel burning, has led Ralph Keeling to predict that the rise could be as much as 4.4 ppm this year.
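To see why the El Nino matters for this particular crossing, here is a minimal back-of-the-envelope sketch (in Python, purely for illustration): it simply adds each quoted annual rise to this year's September minimum, which is a crude assumption rather than a real forecast.

```python
# Back-of-the-envelope projection of next year's seasonal CO2 minimum at
# Mauna Loa, using the figures quoted in the text. Adding the annual rise
# to this year's minimum is an illustrative simplification, not a forecast.

current_minimum_ppm = 397.1  # this September's minimum, quoted above

annual_rise_scenarios = {
    "recent average rise": 2.2,                  # ppm per year
    "strong El Nino year (like 1997)": 3.7,      # ppm per year
    "Keeling's prediction for this year": 4.4,   # ppm per year
}

for label, rise_ppm in annual_rise_scenarios.items():
    projected = current_minimum_ppm + rise_ppm
    status = "above" if projected > 400 else "still below"
    print(f"{label:>35}: ~{projected:.1f} ppm ({status} 400 ppm)")
```

At the recent average rate the next minimum would only creep up to around 399 ppm, still just under the line; the El Nino-boosted rates push it clearly past 400 ppm, which is why this dip below 400 may well be the last.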

So why does it matter? 400 is in truth a fairly arbitrary value to get excited about, a neat quirk of our counting system and no more important as a value to the atmosphere than your car odometer ticking from 99,999 to 100,000. It doesn't mean the car is going to collapse, but it certainly catches your attention. It's the same with the atmosphere – it gives us pause to consider what we've done, and what it might mean for the climate system. For me, the most outrageous thing is that we, an insignificant population of carbon-based life forms, have managed to alter the chemical composition of the atmosphere! And not just by a little – by a lot! And let's not forget that the atmosphere is big – really big!

As an Earth scientist, that leads me to think about when in Earth history the planet has experienced such high levels of CO2 before. Measuring atmospheric CO2 in the geological past is tricky – for the past ~800 thousand years we have a fantastic archive of trapped atmospheric gas bubbles in ice cores, and for the whole of that record CO2 never peaked above 300 ppm. Beyond the reach of the ice cores, we rely on geochemical proxies in marine and terrestrial sediments to estimate CO2, and that is the heart of my research. In a paper we published last year we showed that we have to go back more than 2.3 million years, to the very earliest Pleistocene and the Pliocene, to find atmospheric CO2 levels as high as we are about to permanently experience. What does that mean? Well, the Pliocene was a similar world to today – the continents were in much the same place and the vegetation mix across the Earth was the same – except that global temperatures were 2–3°C higher than now, driven primarily by those high levels of CO2.

Another thing that strikes me today is how rapidly we've managed to change the atmosphere. In a little over 150 years since we started burning fossil fuels with alacrity, we've gone from 280 ppm to 400 ppm. It's hard to find geological records with the temporal precision to see changes that quick, but we know of no time in Earth history when CO2 has changed so much, so quickly.

With COP21 in Paris just around the corner, perhaps saying goodbye to sub 400 ppm will focus minds to come up with a solution. I don’t know whether it will, or what a global solution would look like, but I hope beyond anything that we don’t do nothing.
—————–
Cabot Institute member Dr Marcus Badger is a Research Associate in the Organic Geochemistry Group in the School of Chemistry. His research involves using biomolecules and climate models to better understand the Earth system.

AGU 2013: The importance of 400 ppm CO2

On 1 June 2012, a concentration of 400 ppm of carbon dioxide was measured in air samples in the Arctic. On 9 May 2013, Mauna Loa, the longest-running recording station, measured a daily average of 400 ppm. Next year we may see the global average concentration reach 400 ppm, and the year after that 400 ppm could be measured at the South Pole. The 400 ppm number is arbitrary, but it is a symbol of the anthropogenic climate change that scientists have been talking about for many years.

Here at the University of Bristol, the approaching 400 ppm epoch prompted us to ask what we know about 400 ppm CO2 climates and how that knowledge could be used to galvanize action on climate change. But 400 ppm and climate change is a bigger issue than one university can take on, so we took our idea to the American Geophysical Union (AGU) Fall Meeting. With more than 24,000 attendees each year, AGU is the perfect place to talk about what 400 ppm CO2 means in a scientific sense and what we as scientists should do about communicating it.

Two sessions were proposed: one looking at the science of 400 ppm CO2 climates, co-convened with Kerim Nisanciouglu of the University of Bergen, Norway; the other looking at communicating 400 ppm, co-convened with Kent Peacock of the University of Lethbridge and Casey Brown of UMass Amherst.

Naomi Oreskes (pictured) asked why scientists don't allow themselves to sound alarmed when reporting alarming conclusions from their research.

The communication session looked at how climate science could be communicated effectively. First to speak was Naomi Oreskes, who asked why scientists don't allow themselves to sound alarmed when reporting alarming conclusions. Citing neuroscience research, Oreskes argued that when scientists conform to the 'unemotional scientist' paradigm they actually risk being less rational and sounding inauthentic. It was clear that Oreskes' points struck a chord with the audience, as many of them queued up to ask questions.

Myles Allen made a compelling case for 'sequestered adequate fraction of extracted' (SAFE) carbon – i.e. compulsory carbon capture and storage. Allen pointed out that people will always pay to burn carbon and argued that a carbon price is just a way to 'cook the planet slower'. Robert Lempert took a less controversial stand and explained how uncertainty can be managed in robust decision making. Using hydrological examples, Lempert suggested that by starting with the desired outcome and working backwards, uncertainty can be dealt with. The session finished with James Hansen, who talked about getting the message right and argued that the things people care about need to be communicated by the best communicators. Criticising the pursuit of unconventional fossil fuels, Hansen argued for a carbon tax redistributed back to the public. A lively question and answer session followed, with all the speakers engaging in a strong discussion and the audience contributing pointed questions. There was certainly no shortage of emotion in this session!

The 400 ppm physical science session started by focussing on what information we can draw from climates in the past when CO2 is believed to have been ~400 ppm. The first speaker was Alan Haywood, who summarised the work of the PlioMIP project, which tries to understand the climate of the Pliocene (~3 million years ago) – what it was like and why. The Pliocene is the most recent period in the past when atmospheric CO2 concentrations could have been as high as they are today. Two more Pliocene presentations followed. First, Natalie Burls (standing in for Chris Brierley) explained that even with CO2 set to 400 ppm in their climate model simulations they could not match the warm temperatures reconstructed from Pliocene data – suggesting either that the climate models are not sensitive enough to CO2 or that there are other dynamical processes we do not yet fully understand. Thomas Chalk then compared different methods for reconstructing past CO2 and concluded that the Pliocene concentration was indeed around 400 ppm. The final talk in the palaeoclimate part of the session was given by Dana Royer, who presented the most compelling evidence for very different climates in the past: polar forests at 80°N indicating annual mean temperatures in the Arctic some 30°C warmer than today! Royer also presented new CO2 reconstructions demonstrating that the CO2 concentration at the time of the polar forests could have been around 400 ppm, again suggesting that our climate models may not be sensitive enough to CO2.

The next part of the session looked at current CO2 levels, with a presentation by Steven Davis on the amount of CO2 that we have already committed to putting into the atmosphere. The energy infrastructure we have already built commits us to future CO2 emissions of 318 Gt, and new global commitments are still increasing. Vaughan Pratt followed with a talk on the reasons for the recent pause in the global warming trend, separating natural from anthropogenic causes using mathematical and statistical analyses. He concluded that the recent pause is of natural origin.

The final part of the session peered through the looking glass into the future. Andrew Friedman investigated the causes of the temperature asymmetry between the northern and southern hemispheres and how that asymmetry may alter under future emission scenarios. He concluded that the asymmetry is set to increase into the next century, with the northern hemisphere warming faster than the southern hemisphere, and projected that the tropical rainbelt will shift northwards as a result.

Kirsten Zickfeld has found that warming in the next millennium might amount to 1°C globally, concentrated at the poles. Sea levels are projected to rise by 0.8 m.

The final talk of the session was given by Kirsten Zickfeld, who examined the climate changes we may already be committed to as a result of the CO2 we have already emitted (under the assumption that atmospheric CO2 stays at 400 ppm). She used a climate model with biogeochemical components to identify how long it would take for the climate to reach equilibrium with the present CO2 concentration of 400 ppm, what the climatic impacts of that equilibrium might be, and whether it might be possible to return to CO2 levels below 400 ppm on human timescales using negative emissions (carbon capture and storage schemes). She found that the committed warming into the next millennium might amount to 1°C globally, concentrated at the poles. Sea levels are projected to rise by 0.8 m from thermal expansion alone, and further increases of 10 m due to ice melt are possible over much longer timescales. Committed changes for the 'other CO2 problem' – ocean acidification – are relatively small, with a pH drop of only 0.01 projected. She concluded that even if CO2 levels could drop below 400 ppm in the future, while air temperatures might stabilise, sea level could continue to rise due to thermal expansion alone.

Both of the sessions were recorded for access after the event and provoked a lot of debate, during the sessions and online.  We hope that in some small way these sessions have helped scientists think differently about what 400 ppm means and what we can do about it.

This blog was written by T Davies-Barnard and Catherine Bradshaw, Geographical Sciences, University of Bristol.