Global warming ‘pause’ was a myth all along, says new study

The idea that global warming has “stopped” is a contrarian talking point that dates back to at least 2006. The framing originated on blogs, was then picked up by segments of the media, and ultimately found its way into the scientific literature itself. There are now numerous peer-reviewed articles that address a presumed recent “pause” or “hiatus” in global warming, and it features in the latest IPCC report.

So did global warming really pause, stop, or enter a hiatus? At least six academic studies have been published in 2015 that argue against the existence of a pause or hiatus, including three that were authored by me and colleagues James Risbey of CSIRO in Hobart, Tasmania, and Naomi Oreskes of Harvard University.

Our most recent paper has just been published in Nature’s open-access journal Scientific Reports and provides further evidence against the pause.

Pause not backed up by data

First, we analysed the research literature on global temperature variation over the recent period. This turns out to be crucial because research on the pause has addressed – and often conflated – several distinct questions: some papers asked whether there was a pause or hiatus in warming, others asked whether warming slowed compared with the long-term trend, and yet others examined whether warming has lagged behind expectations derived from climate models.

These are all distinct questions and involve different data and different statistical hypotheses. Unnecessary confusion has resulted because they were frequently conflated under the blanket labels of pause or hiatus.

 

New NOAA data released earlier this year confirmed there had been no pause. The author’s latest study used NASA’s GISTEMP data and reached the same conclusion. Image credit: NOAA

To reduce the confusion, we were exclusively concerned with the first question: is there, or has there recently been, a pause or hiatus in warming? It is this question – and only this question – that we answer with a clear and unambiguous “no”.

No one can agree when the pause started

We considered 40 recent peer-reviewed articles on the so-called pause and inferred what the authors considered to be its onset year. There was a spread of about a decade (1993-2003) between the various papers. Thus, rather than being consensually defined, the pause appears to be a diffuse phenomenon whose presumed onset is anywhere during a ten-year window.

Given that the average presumed duration of the pause in the same set of articles is only 13.5 years, this is a concern: it is difficult to see how scientists could be talking about the same phenomenon when the short trends they discuss commenced up to a decade apart.

This concern was amplified by our third point: the pauses in the literature are by no means consistently extreme or unusual when compared with all possible trends. If we take the past three decades, during which temperatures increased by 0.6℃, we would have been in a pause between 30% and 40% of the time under the definitions used in the literature.

In other words, academic research on the pause is typically not talking about an actual pause but, at best, about a fluctuation in warming rate that is towards the lower end of the various temperature trends over recent decades.
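To make these statistics concrete, below is a minimal Python sketch of the kind of sliding-window calculation involved: it builds a synthetic temperature series with the 0.02℃-per-year average trend implied by the figures above, fits a least-squares trend to every 14-year window, and reports the fraction of windows that fall below a “pause” threshold. The synthetic data, the window length and the threshold are illustrative assumptions, not the definitions or data used in our paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic annual temperature anomalies: a 0.02 degC/yr trend
# (0.6 degC over three decades, as in the text) plus interannual noise.
# Illustrative only -- NOT the GISTEMP record analysed in the paper.
years = np.arange(1985, 2015)
anoms = 0.02 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

def window_trends(t, y, length):
    """Least-squares trend (degC/yr) for every contiguous window of `length` years."""
    return np.array([
        np.polyfit(t[i:i + length], y[i:i + length], 1)[0]
        for i in range(t.size - length + 1)
    ])

# Call any 14-year window (close to the 13.5-year average "pause"
# duration reported in the literature) a "pause" if its trend falls
# below a threshold. The threshold value here is a hypothetical choice.
trends = window_trends(years, anoms, length=14)
PAUSE_THRESHOLD = 0.005  # degC/yr -- hypothetical
print(f"Windows below threshold: {np.mean(trends < PAUSE_THRESHOLD):.0%}")
```

Running the same calculation on a real record such as GISTEMP simply means swapping the synthetic series for observed anomalies. The point the sketch illustrates is that, with realistic year-to-year noise, a substantial fraction of short windows dips below almost any low threshold even though the underlying warming trend never changes.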

How the pause became a meme

If there has been no pause, why then did the recent period attract so much research attention?
One reason is a matter of semantics. Many academic studies addressed not the absence of warming but a presumed discrepancy between climate models and observations. Those articles were scientifically valuable (we even wrote one ourselves), but we do not believe they should have been framed in the language of a pause: the relationship between models (what was expected to happen) and observations (what actually happened) is a completely different issue from the question of whether or not global warming has paused.

A second reason is that the incessant challenge of climate science by highly vocal contrarians and Merchants of Doubt may have amplified scientists’ natural tendency to be reticent about reporting the most dramatic risks that concern them.

We explored the possible underlying mechanisms for this in an article earlier this year, which suggested climate denial had seeped into the scientific community. Scientists have unwittingly been influenced by a linguistic frame that originated outside the scientific community, and by accepting the word “pause” they have subtly reframed their own research.

Research directed towards the pause has clearly yielded interesting insights into medium-term climate variability. My colleagues and I do not fault that research at all, except that it was not about a (non-existent) pause – it was about a routine fluctuation in warming rate. With 2015 virtually certain to be another hottest year on record, this routine fluctuation has likely already come to an end.

—————————————
This article was originally published on The Conversation. Read the original article.

This blog is by Cabot Institute member Prof Stephan Lewandowsky, University of Bristol.

Prof Stephan Lewandowsky



Read the official press release.

Change Agents UK: Empowering people to have a positive impact on the world

One of our more exciting and inspirational collaborations this year has been with a fantastic charity called Change Agents UK.  This group works on developing a network of change agents: people empowered to live and work in a way that makes a positive impact on the world around them.  During the European Green Capital year, the University of Bristol’s Cabot Institute has collaborated with Change Agents UK to support an EU programme called the Green Capital European Voluntary Service.

Change Agents UK coordinated the programme, hosting 30 young volunteers from across Europe to work on activities related to Bristol European Green Capital 2015 for two months in the summer of 2015.  Cabot Institute Manager Hayley Shaw helped to shape the programme around their visit, during which we connected volunteers to:

  • Naomi Oreskes, the prominent historian of science whose work focuses on climate change. The Change Agents went to see the film ‘Merchants of Doubt’, based on her book, and met with her beforehand.
  • Andrew Garrad, Chair of Bristol 2015 and Cabot Institute Advisory Board member with a special meeting before his Cabot Institute lecture on renewable energy.
  • The Cabot Institute’s Withdrawn art event by the famous artist Luke Jerram.

By helping them to connect with local activity and intellectually stimulating events, we gave the volunteers valuable experiences towards earning their Change Agents Certificate of Achievement. The Cabot Institute also sponsored the Change Agents’ final event, which celebrated their fantastic achievements with their host organisations, host families and others from the Bristol Green Capital Partnership.

This project has been really successful and has contributed to the objectives of the Bristol Green Capital programme by providing enthusiastic and capable volunteers to act as Bristol Green Capital ‘change agents’ in projects across Bristol.  This has increased capacity and raised the profile of local projects that are making a positive impact on sustainability in the city.  You can find out more about the positive experiences of Change Agents in Bristol in the brilliant video made by one of the project partners, Chouette Films, below.

The programme is now over for this year, and the volunteers have taken the wisdom they earned back to their home countries. The organisations are currently exploring funding opportunities to run similar programmes in the future.

If you would like to find out more about Change Agents UK, please visit their website.

http://www.changeagents.org.uk/

Follow on Twitter @changeagentsuk

———————————————
This blog has been written by Amanda Woodman-Hardy, Communications Officer at the University of Bristol’s Cabot Institute.  Follow @cabotinstitute and @Enviro_Mand.

Amanda Woodman-Hardy

Naomi Oreskes – are the merchants of doubt still selling?

Naomi Oreskes at the Cabot Institute. Image credit: Hayley Shaw.

With breakthrough scientific findings and new technologies emerging every day, one major challenge for scientists is convincing the public to embrace them. However, despite more and more effort being put into public engagement, the credibility of science is still faltering. Surprisingly, one of the biggest frustrations comes from some fellow scientists, who deny the existing scientific consensus and incite doubt in the public’s mind. Last Thursday, Naomi Oreskes gave a fascinating Cabot Institute/Bristol Festival of Ideas seminar about the agenda of these doubt merchants and the reasoning behind it.  This blog highlights key parts of her talk.

In her talk, Naomi took climate change as a key example. Carbon dioxide (CO2) was established as a greenhouse gas in the mid-19th century, and the burning of fossil fuels was shown to be an emitter of CO2 at the beginning of the 20th century. In 1965, after years of intensive observation and recording, Keeling demonstrated the steady rise of CO2 in the atmosphere, and a prediction of global warming was made at the same time. Serious discussions about global warming in the 1970s led to a consensus in the US National Academy of Sciences (NAS), which described global warming as a threat and suggested immediate action to curb the trend. The effects of warming were subsequently recorded in 1988, which confirmed scientists’ worries and prompted the creation of the Intergovernmental Panel on Climate Change (IPCC).

Since its first report, the IPCC has made it clear that global warming is happening, and that it is caused by the increase of greenhouse gases in the atmosphere as a result of human activities. In 2004, after analysing 928 papers related to climate change published in peer-reviewed journals between 1993 and 2003, Naomi found that none of them disagreed with the IPCC’s conclusion. Nowadays, 72% of Americans believe that global warming is happening and 62% think that action needs to be taken.

Nevertheless, some American politicians and think tanks still claim that there is no scientific consensus on global warming, and actively campaign against it. Among the think tanks denying global warming, the George C. Marshall Institute is the most prominent. Founded by three famous physicists, Fred Seitz, Robert Jastrow and Bill Nierenberg, the Marshall Institute has played a significant role in stimulating public doubt about scientific findings on issues like the ozone layer, acid rain, DDT and, most importantly, smoking. Seitz, who was affiliated with the tobacco company R.J. Reynolds, along with Fred Singer, a rocket scientist who worked for Philip Morris, voiced their disapproval of the EPA’s conclusion that second-hand smoke is a carcinogen. They claimed that there was still room for debate on the tobacco issue and that the EPA’s findings were inconclusive, and this strategy (the “tobacco strategy”) is now being used by the same group of people in their attacks on the EPA over global warming.

Besides the obvious economic connections, Naomi argues that the reasons these people act this way may run even deeper. After WWII, Friedrich Hayek and Milton Friedman led the neo-liberal movement, arguing for free markets and less government control. This ideology was massively popularised in the 1980s by UK Prime Minister Margaret Thatcher and US President Ronald Reagan, and was absorbed by the Conservative camp. Against the historical backdrop of the Cold War, such an ideology became even more appealing to conservatives, who believe that personal freedom depends on economic freedom. With this mindset, global warming and tobacco control can both seem like socialist conspiracies to tighten government control over civil liberties, a slippery slope that would eventually morph the West into the Soviet Union.

Knowing the reasoning behind these antagonists makes it easier to tackle the problem. Of course, as the battlefield is public opinion, the most fundamental work still lies with science communication and public engagement, which shoulder the responsibility of bringing scientific findings within the public’s view. Beyond that, controlling greenhouse gas emissions does not always have to work against the free market. The creation of carbon credits and a market to trade them is a great experiment, which opens up possibilities beyond sometimes crude legislation. After all, climate change is a burning issue that needs immediate attention and action from the whole of human society. Now that the debate over climate change is no longer a scientific one but a political one, it is worth recruiting political wisdom to think beyond the science and come up with a package of solutions that minimises the obstacles to acting upon it.
———————————————
This blog was written by Cabot Institute member Dan Lan, a PhD student in the School of Biological Sciences at the University of Bristol.

Dan Lan

AGU 2013: The importance of 400 ppm CO2

On 1 June 2012, a concentration of 400 ppm carbon dioxide was measured in air samples in the Arctic.  On 9 May 2013, Mauna Loa, the longest-running recording station, measured a daily average of 400 ppm carbon dioxide. Next year we may see the global average concentration reach 400 ppm, and the year after that 400 ppm could be measured at the South Pole. The 400 ppm number is arbitrary, but it is a symbol of the anthropogenic climate change that scientists have been talking about for many years.

Here at the University of Bristol, the upcoming 400 ppm epoch prompted the question of what we know about 400 ppm CO2 climates and how that knowledge could be used to galvanise action on climate change.  But 400 ppm and climate change is a bigger issue than one university can take on, so we took our idea to the American Geophysical Union (AGU) Fall Meeting.  With more than 24,000 attendees each year, AGU is the perfect place to talk about what 400 ppm CO2 means in a scientific sense and what we as scientists should do about communicating it.

Two sessions were proposed: one looking at the science of 400 ppm CO2 climates, co-convened with Kerim Nisanciouglu of the University of Bergen, Norway; the other looking at communicating 400 ppm, co-convened with Kent Peacock of the University of Lethbridge and Casey Brown of UMass Amherst.

Naomi Oreskes (pictured) asked why scientists don’t allow themselves to sound alarmed when reporting alarming conclusions from their research.

The communication session looked at how climate science could be communicated effectively.  First to speak was Naomi Oreskes, who asked why scientists don’t allow themselves to sound alarmed when reporting alarming conclusions. Citing neuroscience research, Oreskes argued that when scientists conform to the ‘unemotional scientist’ paradigm they actually risk being less rational and sounding inauthentic.  It was clear that Oreskes’ points struck a chord with the audience, as many of them queued up to ask questions.

Myles Allen made a compelling case for sequestered adequate fraction of extracted (SAFE) carbon – i.e. compulsory carbon capture and storage. Allen pointed out that people will always pay to burn carbon and argued that a carbon price is just a way to ‘cook the planet slower’.  Robert Lempert took a less controversial stand and explained how uncertainty can be managed in robust decision making.  Using hydrological examples, Lempert suggested that by starting with the desired outcome and working backwards, uncertainty can be dealt with.  The session finished with James Hansen, who talked about finding the right message, and how the things that people care about need to be communicated by the best communicators.  Criticising the pursuit of unconventional fossil fuels, Hansen argued for a carbon tax whose revenues would be redistributed back to the people.  A lively question and answer session followed, with all the speakers engaging in a strong discussion and the audience contributing pointed questions. No danger of anyone talking without emotion in this session!

The 400 ppm physical science session started by focussing on what information we could draw from climates in the past when CO2 is believed to have been ~400 ppm. The first speaker was Alan Haywood, who summarised the work of the PlioMIP project, which tries to understand the climate of the Pliocene (~3 million years ago) – what it was like and why.  The Pliocene is the most recent period in the past when atmospheric CO2 concentrations could have been as high as they are today.  Two more Pliocene presentations followed.  First, Natalie Burls (standing in for Chris Brierley) explained that even with CO2 set to 400 ppm in their climate model simulations they could not match the warm temperatures reconstructed from Pliocene data – suggesting either that the climate models are not sensitive enough to CO2 or that there are other dynamical processes that we do not yet fully understand.  Thomas Chalk compared different methods for reconstructing CO2 in the past and concluded that the Pliocene concentration was indeed around 400 ppm. The final talk in the palaeoclimate part of the session was given by Dana Royer, who presented the most compelling evidence for very different climates in the past, with polar forests at 80°N indicating annual mean temperatures in the Arctic that were 30°C warmer than today!  Dana presented new CO2 reconstructions demonstrating that the CO2 concentration at the time of the polar forests could have been around 400 ppm, again suggesting that our climate models may not be sensitive enough to CO2.

The next part of the session looked at current CO2 levels, with a presentation by Steven Davis about the amount of CO2 we have already committed to putting into the atmosphere. The energy infrastructure we have already built amounts to future CO2 emissions of 318 Gt, and new global commitments are still increasing. Vaughan Pratt followed with a talk about the reasons for the recent pause in the global warming trend, separating out natural and anthropogenic causes using mathematical and statistical analyses. He concluded that the recent pause is of natural origin.

The final part of the session peered through the looking glass into the future.  Andrew Friedman investigated the causes of the temperature asymmetry between the northern and southern hemispheres and how that asymmetry may alter under future climate emission scenarios.  He concluded that the asymmetry is set to increase into the next century, with the northern hemisphere warming faster than the southern hemisphere, and projected that the tropical rainbelt will shift northwards as a result.

Kirsten Zickfeld has found that warming in the next millennium might amount to 1°C globally, concentrated at the poles.  Sea levels are projected to rise by 0.8 m.

The final talk of the session was given by Kirsten Zickfeld, who examined the climate changes we might already be committed to as a result of the CO2 emissions we have already released (under the assumption that atmospheric CO2 stays at 400 ppm). She used a climate model with biogeochemical components to identify how long it would take for the climate to reach equilibrium with the present CO2 concentration of 400 ppm, what the climatic impacts of that equilibrium might be, and whether it might be possible to return to CO2 levels below 400 ppm on human timescales by using negative emissions (carbon capture and storage schemes). She found that the committed warming into the next millennium might amount to 1°C globally, concentrated at the poles. Sea levels are projected to rise by 0.8 m due to thermal expansion alone, and further increases of 10 m due to ice melt are possible over much longer timescales. Committed changes for the ‘other CO2 problem’ – ocean acidification – are relatively small, with a pH drop of only 0.01 projected. She concluded that even if CO2 levels could drop below 400 ppm in the future, while air temperatures may stabilise, sea level may continue to rise due to thermal expansion alone.

Both sessions were recorded for access after the event and provoked a lot of debate, both during the sessions and online.  We hope that in some small way these sessions have helped scientists think differently about what 400 ppm means and what we can do about it.

This blog was written by T Davies-Barnard and Catherine Bradshaw, Geographical Sciences, University of Bristol.