AGU 2013: The importance of 400 ppm CO2

On 1 June 2012, a concentration of 400 ppm carbon dioxide was measured in air samples in the Arctic. On 9 May 2013, Mauna Loa, the longest-running recording station, measured a daily average of 400 ppm carbon dioxide. Next year we may see the global average concentration reach 400 ppm, and the year after that 400 ppm could be measured at the South Pole. The 400 ppm number is arbitrary, but it is a symbol of the anthropogenic climate change that scientists have been talking about for many years.

Here at the University of Bristol, the approaching 400 ppm epoch prompted two questions: what do we know about 400 ppm CO2 climates, and how could that knowledge be used to galvanize action on climate change? But 400 ppm and climate change is a bigger issue than one university can take on, so we took our idea to the American Geophysical Union (AGU) Fall Meeting. With more than 24,000 attendees each year, AGU is the perfect place to talk about what 400 ppm CO2 means in a scientific sense and what we as scientists should do about communicating it.

Two sessions were proposed: one looking at the science of 400 ppm CO2 climates, co-convened with Kerim Nisancioglu of the University of Bergen, Norway; the other looking at communicating 400 ppm, co-convened with Kent Peacock of the University of Lethbridge and Casey Brown of UMass Amherst.

Naomi Oreskes (pictured) asked why scientists don’t allow themselves to sound alarmed when reporting alarming conclusions from their research.

The communication session looked at how climate science could be communicated effectively. First to speak was Naomi Oreskes, who asked why scientists don’t allow themselves to sound alarmed when reporting alarming conclusions. Citing neuroscience research, Oreskes argued that when scientists conform to the ‘unemotional scientist’ paradigm they actually risk being less rational and sounding inauthentic. It was clear that Oreskes’ points struck a chord with the audience, as many of them queued up to ask questions.

Myles Allen made a compelling case for sequestered adequate fraction of extracted (SAFE) carbon – i.e. compulsory carbon capture and storage. Allen pointed out that people will always pay to burn carbon and argued that a carbon price is just a way to ‘cook the planet slower’. Robert Lempert took a less controversial stand and explained how uncertainty can be managed in robust decision making. Using hydrological examples, Lempert suggested that by starting with the desired outcome and working backwards, uncertainty can be dealt with. The session finished with James Hansen, who talked about finding the right message and argued that the things people care about need to be communicated by the best communicators. Criticising the pursuit of unconventional fossil fuels, Hansen argued for a carbon tax whose revenue is redistributed back to the public. A lively question and answer session followed, with all the speakers engaging in a robust discussion and the audience contributing pointed questions. No problem with talking without emotion in this session!

The 400 ppm physical science session started by focussing on what information we can draw from climates in the past when CO2 is believed to have been ~400 ppm. The first speaker was Alan Haywood, who summarised the work of the PlioMIP project, which tries to understand the climate of the Pliocene (~3 million years ago) – what it was like and why. The Pliocene is the most recent period in the past when atmospheric CO2 concentrations could have been as high as they are today. Two more Pliocene presentations followed. First, Natalie Burls (standing in for Chris Brierley) explained that even with CO2 set to 400 ppm in their climate model simulations they could not match the warm temperatures reconstructed from Pliocene data – suggesting either that the climate models are not sensitive enough to CO2 or that there are other dynamical processes that we do not yet fully understand. Thomas Chalk then compared different methods for reconstructing past CO2 and concluded that the Pliocene concentration was indeed around 400 ppm. The final talk in the palaeoclimate part of the session was given by Dana Royer, who presented the most compelling evidence for very different climates in the past, with polar forests at 80°N indicating annual mean temperatures in the Arctic that were 30°C warmer than they are today! Royer also presented new CO2 reconstructions suggesting that the CO2 concentration at the time of the polar forests could have been around 400 ppm, again implying that our climate models may not be sensitive enough to CO2.
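To put these sensitivity arguments into rough context, here is a simple illustration of our own (not taken from any of the talks): using the standard simplified expression for CO2 radiative forcing and an assumed climate sensitivity of about 3°C per CO2 doubling, 400 ppm on its own gives only around 1.5°C of equilibrium warming above pre-industrial – modest compared with the warmth reconstructed for these past 400 ppm worlds, which is one way of seeing why the speakers suspect that sensitivity, or some feedback, is being underestimated. The sensitivity value below is an assumption for illustration only.

```python
# Rough illustration (ours, not from the session): equilibrium warming expected
# from 400 ppm CO2 alone, using the standard simplified forcing formula
# dF = 5.35 * ln(C / C0) and an assumed climate sensitivity parameter.
import math

c0_ppm = 280.0               # approximate pre-industrial CO2 concentration
c_ppm = 400.0                # the concentration discussed throughout the session
sensitivity_c_per_wm2 = 0.8  # assumed sensitivity (~3 degC per CO2 doubling) - an assumption

forcing_wm2 = 5.35 * math.log(c_ppm / c0_ppm)    # about 1.9 W/m^2
warming_c = sensitivity_c_per_wm2 * forcing_wm2  # about 1.5 degC
print(f"forcing ~ {forcing_wm2:.1f} W/m^2, equilibrium warming ~ {warming_c:.1f} degC")
```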

The next part of the session looked at current CO2 levels, with a presentation by Steven Davis about the amount of CO2 that we are already committed to putting into the atmosphere. The energy infrastructure that has already been built amounts to future CO2 emissions of 318 Gt, and new global commitments are still increasing. Vaughan Pratt followed with a talk about the reasons for the recent pause in the global warming trend, separating natural causes from anthropogenic causes using mathematical and statistical analyses. He concluded that the recent pause is of natural origin.
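For readers curious what this kind of trend-versus-variability separation can look like in principle, here is a minimal, purely illustrative sketch – our own toy example, not Pratt's data or method. A synthetic temperature record is built from a smooth CO2-driven trend plus a slow oscillation and noise, and ordinary least squares is then used to recover the two contributions. Every number and functional form in it is hypothetical.

```python
# Purely illustrative toy example - not Vaughan Pratt's data or method.
# Build a synthetic temperature record from a smooth CO2-driven trend plus a
# slow oscillation and noise, then separate the two by least-squares regression.
import numpy as np

years = np.arange(1880, 2014)
co2 = 285.0 + 110.0 * ((years - 1880) / 133.0) ** 2             # toy CO2 curve (ppm), hypothetical

rng = np.random.default_rng(0)
anthro_true = 2.5 * np.log(co2 / 285.0)                          # toy log-CO2 response (degC)
natural_true = 0.1 * np.sin(2 * np.pi * (years - 1880) / 65.0)   # toy ~65-year oscillation (degC)
temperature = anthro_true + natural_true + 0.1 * rng.standard_normal(years.size)

# Regress the record onto the assumed components: log CO2, a sine/cosine pair and a constant.
predictors = np.column_stack([
    np.log(co2 / 285.0),
    np.sin(2 * np.pi * (years - 1880) / 65.0),
    np.cos(2 * np.pi * (years - 1880) / 65.0),
    np.ones(years.size),
])
coeffs, *_ = np.linalg.lstsq(predictors, temperature, rcond=None)
print(f"recovered CO2-trend coefficient: {coeffs[0]:.2f} degC per ln(CO2/285)")  # ~2.5
```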

The final part of the session peered through the looking glass into the future. Andrew Friedman investigated the causes of the temperature asymmetry between the northern and southern hemispheres and how that asymmetry may change under future emission scenarios. He concluded that the asymmetry is set to increase over the next century, with the northern hemisphere warming faster than the southern hemisphere, and projected that the tropical rainbelt will shift northwards as a result.

Kirsten Zickfeld has found that warming in the next millennium might amount to 1 degree globally, concentrated at the poles. Sea levels are projected to rise by 0.8 m.

The final talk of the session was given by Kirsten Zickfeld, who examined the climate changes we may already be committed to as a result of the CO2 emissions we have already released (under the assumption that atmospheric CO2 stays at 400 ppm). She used a climate model with biogeochemical components to identify how long it would take for the climate to reach equilibrium with the present CO2 concentration of 400 ppm, what the climatic impacts of that equilibrium might be, and whether it might be possible to return to CO2 levels below 400 ppm on human timescales by using negative emissions (carbon capture and storage schemes). She found that the committed warming into the next millennium might amount to 1°C globally, concentrated at the poles. Sea levels are projected to rise by 0.8 m due to thermal expansion alone, and further increases of 10 m due to ice melt are possible over much longer timescales. Committed changes for the ‘other CO2 problem’ – ocean acidification – are relatively small, with a pH drop of only 0.01 projected. She concluded that even if CO2 levels could be brought back below 400 ppm in the future, air temperatures may stabilise but sea level may continue to rise due to thermal expansion alone.
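As a back-of-envelope check of our own (not Zickfeld's calculation), the thermal expansion figure is roughly what you would expect: taking a representative expansion coefficient for seawater of around 2 × 10⁻⁴ per °C, a mean ocean depth of about 3,700 m, and the 1°C committed warming quoted in the talk – and crudely assuming the whole ocean column eventually warms by about that much – gives a rise of roughly 0.7 m, the same order as the 0.8 m projection.

```python
# Back-of-envelope check (our illustration, not Zickfeld's model):
# thermal expansion sea level rise ~ expansion coefficient x ocean depth x warming,
# crudely assuming the whole ocean column eventually warms by about the same amount.
alpha_per_degc = 2.0e-4    # representative thermal expansion coefficient of seawater (1/degC) - assumed
mean_depth_m = 3700.0      # approximate mean ocean depth (m)
committed_warming_c = 1.0  # committed warming figure quoted in the talk (degC)

rise_m = alpha_per_degc * mean_depth_m * committed_warming_c
print(f"rough thermally driven sea level rise: {rise_m:.2f} m")  # ~0.7 m, same order as 0.8 m
```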

Both sessions were recorded for access after the event and provoked a lot of debate, both during the sessions and online. We hope that in some small way these sessions have helped scientists think differently about what 400 ppm means and what we can do about it.

This blog was written by T Davies-Barnard and Catherine Bradshaw, Geographical Sciences, University of Bristol.
