Decarbonising the UK rail network

Image source: Wikimedia Commons

Caboteer Dr Colin Nolden blogs on a recent meeting of the All-Party Parliamentary Rail and Climate Change Groups on ‘Decarbonising the UK rail network’. The event was co-chaired by Martin Vickers MP and Daniel Zeichner MP. Speakers included:

  • Professor Jim Skea, CBE, Imperial College London
  • David Clarke, Technical Director, RIA
  • Anthony Perret, Head of Sustainable Development, RSSB
  • Helen McAllister, Head of Strategic Planning (Freight and National Passenger Operators), Network Rail

The meeting kicked off with a broad overview of the global decarbonisation challenge by Jim Skea. As a former member of the UK’s Climate Change Committee, Co-chair of Working Group III of the Intergovernmental Panel on Climate Change, which oversaw the 1.5°C report published in October 2018, and a member of the Scottish Just Transition Commission, he emphasized that the net-zero target ‘is humongously challenging’. We need to recognise that all aspects of our land, economy and society require change, including lifestyles and behaviours. At the same time, the loophole of buying in permits to ‘offset’ decarbonisation in the UK net-zero target increases uncertainty, as it is unclear what needs to be done territorially. The starting point for decarbonising mobility and many other sectors is nevertheless the decarbonisation of our electricity supply by 2030, as this allows the electrification of energy demand.

The recent International Energy Agency report on the ‘Future of Rail’ was also mentioned. It suggests that the rail sector is one of the blind spots for decarbonisation: rail carries 8% of passenger transport and 7% of freight transport while accounting for only 2% of transport energy demand. The report concludes that a modal shift and sustainable electrification are necessary to decarbonise transport.

David Clarke pointed towards the difficulties encountered in the electrification of the Great Western line to Bristol and beyond to Cardiff, but stressed that this was not a good yardstick for future electrification endeavours. Electrification was approached too ambitiously in 2009, following a 20-year electrification hiatus. Novel technology and deadlines with fixed time scales implied higher costs on the Great Western line. Current electrification phases, such as the Bristol–Cardiff stretch, are by contrast being delivered within the cost envelope. A problem now lies in the lack of further planned electrification schemes, as there is a danger of demobilising the relevant teams. Such a hiatus could once again lead to teething problems when electrification is prioritised again. Bimodal trains that have accompanied electrification on the Great Western line will continue to play an important role in ongoing electrification, as they allow at least part of each journey to be completed free of fossil fuels.

Anthony Perret outlined the RSSB’s role in the ongoing development of a rail system decarbonisation strategy. The ‘what’ report was published in January 2019 and the ‘how’ report is still being drafted. Given that 70% of journeys and 80% of passenger kilometres are already electrified, he suggested that new technology combinations such as hydrogen and battery will need to be tested to fill the gap where electrification is not economically viable. Hydrogen is likely to be a solution for longer distances and higher speeds, while batteries are more likely to be suitable for discontinuous electrification, such as the ‘bridging’ of bridges and tunnels. Freight transport’s 25,000V requirement currently implies either diesel or electrification to provide the necessary power. Anthony finished with a word of caution regarding rail governance complexities: rail system governance needs an overhaul if it is not to hinder decarbonisation.

Helen McAllister is engaged in a task force to establish what funding needs to be made available for deliverable, affordable and efficient solutions. Particular interest lies in the ‘middle’, where full electrification is not economically viable but where promising combinations of the technologies that Anthony mentioned might provide appropriate solutions. This is where emphasis on innovation will be placed and economic cases are sought. It is particularly relevant to the Riding Sunbeams project I am involved with, as discontinuous and innovative electrification is one of the avenues we are pursuing. However, Helen highlighted the failure of current analytical tools to take carbon emissions into account. The ‘Green Book’ requires revision to place more emphasis on environmental outcomes and to specify the ‘bang for your buck’ in terms of carbon, making it a driving factor in decision-making. At the same time, she suggested that busy commuter lines, the obvious choice for electrification, are also likely to score highest on decarbonisation.

David pointed out that, despite the ambitious targets in place, new diesel rolling stock ordered before decarbonisation took priority will only enter service in 2020 and will in all likelihood continue running until 2050. This is an indication of the lock-in associated with durable rail assets that Jim Skea also strongly emphasized as a challenge to overcome. Transport for Wales, on the other hand, is already looking into progressive decarbonisation options, including Riding Sunbeams, alongside four other such projects currently being implemented. Helen agreed that diesel will continue to have a role to play, but that franchise specification for passenger rolling stock and commercial specification for freight can help bring the retirement date forward.

Comments and questions from the audience suggested that the decarbonisation challenge is galvanising the industry, with both rolling stock companies and manufacturers putting their weight behind progressive solutions. Ultimately, more rail capacity is required to enable a modal shift towards sustainable rail transport. In this context, Helen stressed the need to apply the same net-zero criteria across all industries, from aviation to railways, to ensure that all sectors engage in the same challenge. Leo Murray from Riding Sunbeams asked whether unelectrified railway lines into remote areas such as the Scottish Highlands, Mid-Wales and Cornwall could be electrified alongside overhead electricity transmission lines that carry power from such remote areas to urban centres, with rail electrification as a by-product. Chair Daniel Zeichner pointed towards a project that seeks to connect Calais and Folkestone with a thick DC cable through the Channel Tunnel, and this is something we will follow up with some of the speakers.

In conclusion, Anthony pointed towards the Rail Carbon Tool, which from January 2020 will help measure the capital carbon involved in all projects above a certain size, as a step in the right direction. David pointed towards increasing collaboration with the advanced propulsion centre at Cranfield University to cross-fertilise innovative solutions across different mobility sectors.

Overall it was an intense yet enjoyable hour in a sticky room packed full of sustainable rail enthusiasts. Although this might evoke images of grey hair, ill-fitting suits and the odd pair of trainspotting binoculars, it was refreshing to see so many ideas and so much enthusiasm brought to the fore by a topic as mundane as ‘decarbonising the UK rail network’.

——————————————-
This blog is written by Dr Colin Nolden, Vice-Chancellor’s Fellow, University of Bristol Law School and Cabot Institute for the Environment.

Colin is currently leading a new Cabot Institute Masters by Research project on a new energy system architecture. This project will involve close engagement with community energy organizations to assess technological and business model feasibility. Sound up your street? Find out more about this masters on our website.

Sowing the seeds of collaborations to tackle African food insecurity

A group of early career researchers from 11 African countries got together in Bristol, UK, this month for a two-week training event. Nothing so unusual about that, you may think.

Yet this course, run by the Community Network for African Vector-Borne Plant Viruses (CONNECTED), broke important new ground.

The training brought together an unusual blend of researchers: plant virologists and entomologists studying insects which transmit plant diseases, as an important part of the CONNECTED project’s work to find new solutions to the devastation of many food crops in Sub-Saharan African countries.

The CONNECTED niche focus on vector-borne plant disease is the reason for bringing together insect and plant pathology experts, and plant breeders too. The event helped forge exciting new collaborations in the fight against African poverty, malnutrition and food insecurity.

‘V4’ – Virus Vector Vice Versa – was a fully-funded residential course which attracted great demand when it was advertised. Places were awarded by competitive application, with funding awarded to cover travel, accommodation, subsistence and all training costs. For every delegate who attended, five applicants were unsuccessful.

The comprehensive programme combined scientific talks; general lab training skills; specific virology and entomology lectures and practical work; workshops; field visits; career development; mentoring; and desk-based projects.


Across the fortnight delegates received plenty of peer mentoring and team-building input, as well as an afternoon focused on ‘communicating your science.’


New collaborations will influence African agriculture for years to come

There’s little doubt that the June event, hosted by The University of Bristol, base of CONNECTED Network Director Professor Gary Foster, has sown seeds of new alliances and partnerships that can have global impact on vector-borne plant disease in Sub-Saharan Africa for many years to come.
CONNECTED network membership has grown in its first 18 months to the point where it is approaching 1,000 researchers from over 70 countries. The project, which derived its funding from the Global Challenges Research Fund, is actively looking at still more training events.
The V4 training course follows two successful calls for pump-prime research funding, leading to nine projects now operating in seven different countries, and still many more to come. Earlier in the year CONNECTED ran a successful virus diagnostics training event in Kenya, in close partnership with BecA-ILRI Hub. One result of that training was that its 19 delegates were set to share their new knowledge and expertise with a staggering 350 colleagues right across the continent.

Project background

Plant diseases significantly limit the ability of many Sub-Saharan African countries to produce enough staple and cash crops such as cassava, sweet potato, maize and yam. Farmers face failing harvests and are often unable to feed their local communities as a result. The diseases ultimately hinder the countries’ economic and social development, sometimes leading to migration as communities look for better lives elsewhere.
The CONNECTED network project is funded by a £2 million grant from the UK government’s Global Challenges Research Fund, which supports research on global issues that affect developing countries. It is co-ordinated by Prof. Foster from the University of Bristol School of Biological Sciences, long recognised as world-leading in plant virology and vector-transmitted diseases, with Professor Neil Boonham from Newcastle University as its Co-Director. The funding is being used to build a sustainable network of scientists and researchers to address the challenges. The University of Bristol’s Cabot Institute, of which Prof. Foster is a member, also provides input and expertise.
———————-
This blog is written by Richard Wyatt, Communications Officer for the CONNECTED network.

Indoor air pollution: The ‘killer in the kitchen’

Image credit Clean Cooking Alliance.

Approximately 3 billion people around the world rely on biomass fuels such as wood, charcoal and animal dung, which they burn on open fires or in inefficient stoves to meet their daily cooking needs.

Relying on these types of fuels and cooking technologies is a major contributor to indoor air pollution and has serious negative health impacts, including acute respiratory illnesses, pneumonia, strokes, cataracts, heart disease and cancer.

The World Health Organization estimates that indoor air pollution causes nearly 4 million premature deaths annually worldwide – more than the deaths caused by malaria and tuberculosis combined. This led the World Health Organization to label household air pollution “The Killer in the Kitchen”.

As illustrated on the map below, most deaths from indoor air pollution occur in low- and middle-income countries across Africa and Asia. Women and children are disproportionately exposed to the risks of indoor air pollution as they typically spend the most time cooking.

Number of deaths attributable to indoor air pollution in 2017. Image credit Our World in Data.
Replacing open fires and inefficient stoves with modern, cleaner solutions is essential to reduce indoor air pollution and personal exposure to emissions. However, research suggests that only significant reductions in exposure can tangibly reduce negative health impacts.
The Clean Cooking Alliance, established in 2010, has focused mainly on the dissemination of improved cookstoves (ICS) – wood-burning or charcoal stoves designed to be much more efficient than more traditional models – with some success.
Randomised controlled trials of the sole use of ICS have shown reductions in pneumonia and in the duration of respiratory infections in children. However, other studies, including some funded by the Alliance, have shown that ICS have not performed well enough in the field to reduce indoor air pollution sufficiently to lessen health risks such as pneumonia and heart disease.
Alternative fuels such as liquefied petroleum gas (LPG), biogas and ethanol present other options for cooking, with LPG already prevalent in many countries across the world.
LPG is clean-burning and produces much less carbon dioxide than burning biomass but is still a fossil fuel.
Biogas is a clean, renewable fuel made from organic waste, and ethanol is a clean biofuel made from a variety of feedstocks.
Image credit PEEDA

Electric cooking, once seen as a pipe dream for developing countries, is becoming more feasible and affordable due to improvements and reductions in costs of technologies like solar panels and batteries.

Improved cookstoves, alternative fuels and electric cooking have been gaining traction, but there is still a long way to go in solving the deadly problem of indoor air pollution.
———————-
This blog is written by Cabot Institute member Peter Thomas, Faculty of Engineering, University of Bristol. Peter’s research focusses on access to energy in humanitarian relief. This blog is co-written by Will Clements, Faculty of Engineering.

Science in action: Air pollution in Bangkok

Bangkok haze 2019 March. Wikimedia Commons.

I was given the opportunity to spend a significant part of 2018 in Bangkok, Thailand, working with the Chulabhorn Research Institute (CRI) Laboratory of Environmental Toxicology on a Newton Fund project on air quality. Bangkok is a large city with over 14 million inhabitants, and it suffers high levels of traffic and congestion, with consequent high exposure to traffic-related pollution. Reducing the number of deaths caused by pollution by 2030 is one of the UN Sustainable Development Goals. Air pollution is a global problem – a major threat to health throughout the world – but particularly so in low- and middle-income countries, which account for 92% of pollution-related deaths (1). The poor and the marginalised often live in areas of high pollution, and children have a disproportionate exposure to pollutants at a vulnerable stage of development.

The Chulabhorn Research Institute is an independent research institute in Bangkok whose mission includes the application of science and technology to improve the Thai people’s quality of life. The Laboratory of Toxicology, under Professor Mathuros Ruchirawat, has a very strong record of using its results to inform policy and make a real impact on the lives of people in South East Asia affected by environmental toxins. For example, a previous CRI study found that people living and working near busy roads were exposed to benzene from traffic exhaust and showed increased DNA damage. Once this was known, the Thai government was persuaded to alter fuel mixtures in cars to protect the population (2).

I was in Bangkok from January to June 2018, and returned from September to December. I brought with me particle measurement and sampling equipment to count particles and sample particulate matter (PM) in Bangkok, to supplement the toxicology work of the Research Institute. PM can be described by its size fractions: usually reported are PM10 (aerosol diameter 10 micrometres and lower) and PM2.5 (2.5 micrometres and lower). Less often measured are the sub-micron range (sometimes referred to as PM1) and the ultrafine range (less than 100 nm).
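To make the size-fraction definitions concrete, here is a minimal Python sketch (my own illustration, not part of the study; the cutoffs are the standard definitions above and the sample diameters are invented):

```python
def pm_fractions(diameters_um):
    """Count particles in each cumulative PM size fraction (diameters in micrometres)."""
    cutoffs = {"PM10": 10.0, "PM2.5": 2.5, "PM1": 1.0, "ultrafine": 0.1}
    return {name: sum(d <= cutoff for d in diameters_um)
            for name, cutoff in cutoffs.items()}

# Invented particle diameters (in micrometres) standing in for a measured sample
sample = [0.05, 0.08, 0.3, 0.9, 1.8, 2.4, 4.6, 8.2, 12.0]
print(pm_fractions(sample))
# -> {'PM10': 8, 'PM2.5': 6, 'PM1': 4, 'ultrafine': 2}
```

Note that the fractions are cumulative: every ultrafine particle also counts towards PM1, PM2.5 and PM10.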

James Matthews with his particle measurement and sampling equipment on public transport in Bangkok.

Below 1 μm it becomes more difficult to measure particle numbers, as optical techniques fail on particles around 200 nm and smaller. To count them, the particles must first be grown to a detectable size by condensing a solvent onto them. The need to regularly replenish this solvent, and the high price of aerosol instrumentation for the smallest sizes, mean that particle number concentration is not always measured as a matter of course, but new research indicates that these particles may be a significant health concern. Smaller particles can penetrate further into the lung, and there is some evidence that they may pass further into the body, possibly even making their way into the brain. While much more research is needed – in both the toxicological and epidemiological domains – to understand the effects of these smaller particles, I would not be surprised if the narrative on air quality moves further toward the ultrafine size range in the very near future.

While in Bangkok, I added my aerosol science expertise and experience in aerosol field measurements to the team at the CRI, taking measurements of particle number count using a handheld particle counter and collecting samples of PM using both PM10 samplers and a cascade impactor (the Dekati Electrical Low Pressure Impactor), which allowed samples to be separated by aerodynamic size and then collected for further analysis by ICP-MS (inductively coupled plasma mass spectrometry). In this way, metal concentrations within all the different size fractions of aerosol could be found. Within the first few months of the project I was able to train the staff at the CRI to use this equipment, so measurements could continue when I returned to the UK.

As well as taking measurements at the CRI in the Lak Si district, north of Bangkok, we chose three sites in the wider Bangkok area that represented different exposure conditions. We were given access to the Thai governmental Pollution Control Department (PCD) air quality measurements sites, where our instrumentation was set up next to their other pollutant measurements.

A toll road and railway in Lak Si – from Bangkok toward the Don Mueang airport. Image credit James Matthews.

The three sites included Ayutthaya, a UNESCO World Heritage site north of Bangkok. Ayutthaya, while a busy tourist destination, has considerably less traffic, and therefore fewer traffic-related pollutants, than the other sites. The second site, Bang Phli, was a heavily industrial area to the south of Bangkok. The third, Chok Chai, was a roadside site in central Bangkok.

Survey measurements of particle count were taken in several locations using a hand-held particle counter. Particle numbers were highest in measurements on the state railway and on the roads in Bangkok. The state railway through Bangkok is slow moving, with diesel engines repeatedly starting and braking, all of which contributes to particulates. Conversely, the newer sky train and underground railways had low particle counts (the underground had the lowest counts I measured anywhere). At the CRI site, long-term measurements near a toll road showed that the particle number count was highest at rush hour, indicating traffic as the dominant contributor. Walking routes in both Bangkok and Ayutthaya showed high concentrations near roads, and in markets and street stalls where vendors prepare food.

Within our measurements in Bangkok we were able to measure the mass fraction, and the metal (and some non-metal, such as arsenic) content, over 12 size fractions from 10 μm down to 10 nm. Metals known to be deleterious to human health include cadmium, chromium, nickel and lead, which are class 2 (possible or probable) or class 1 (known) carcinogens. Comparing the reference site (Ayutthaya) with the roadside site (Chok Chai) over several 3-day sampling periods showed that these toxic metals were present in higher concentrations in the area with higher traffic. They were also present in higher concentrations in the lower size ranges, which may result in these metals penetrating deeper into the lung.

One episode demonstrated the need for local knowledge when measurements are taken. Normally we would expect weekday measurements in working areas to be higher than weekend ones, due to increased work traffic, and in most cases this was so (Ayutthaya was often an exception, likely due to weekend traffic from tourists). One Saturday evening, however, saw a notable peak in aerosol concentrations at a site, which was difficult to explain. Colleagues at the institute pointed out that this weekend was a festival during which the Chinese community in Bangkok, following a long-standing tradition, burn items including fake money. The peak in particles fitted this explanation.

Knowing more about the nature and level of pollutants in a city is an important first step, but challenges persist for the people of Bangkok and other polluted cities as to how to reduce these exposures. The problem of rural crop burning is one that frustrates many in Thailand, as it is well known that the particulates from burnt crops are harmful to the population. While there are strong restrictions on the deliberate burning of crops, it is still common to see fields burning between January and March in areas of northern Thailand. Similarly, Bangkok remains the city of the car, with residents accepting that they may have to sit in traffic for long periods to get anywhere.

Burning mountains in Thailand. Wikimedia Commons.

Researchers from Bristol were able to discuss our results alongside measurements from the PCD and the CRI in Bangkok in a seminar held in 2019. It was apparent that there is great awareness of the dangers of air pollution, but it still seems that more needs to be done to address these problems. In January 2019, Bangkok made the headlines for PM2.5 exposures at dangerously high levels. January in Thailand falls within the ‘cool’ season, when both temperatures and rainfall are low. This weather traps pollutants within the city, increasing exposure levels. When I discussed this with pollution experts in Thailand, they argued that the levels that year were typical for January, but that the reporting of those levels had changed. The Thai PCD advocates communication of pollutant levels through its website and app, and until recently the PCD did not measure PM2.5 at a sufficient number of stations to use it in its air quality index calculations. That year, the calculation was changed to include PM2.5 and, as a consequence, the high pollutant levels discussed above were reported. The reporting of these pollutant levels can be attributed to the greater awareness of the population of the problem of pollution, which in turn is leading to a greater urgency in finding solutions.

So there is a role for direct engagement with the population, which may lead to pressure on governments to respond. There is also a role for science to provide leaders with tangible solutions, such as the suggestion to change fuel mixtures. But the huge challenge of reducing sources of pollutants in a growing city like Bangkok remains.

1 Landrigan, P. J. et al., 2018. Lancet 391, 462–512.
2 Davies, R., 2018. Lancet 391, 421.

————————–
This blog is written by Cabot Institute member Dr James Matthews, School of Chemistry, University of Bristol. James’ research looks at the flow of gases in urban environments, and the use of perfluorocarbon trace gas releases to map the passage of air in cities.
James Matthews

How we traced ‘mystery emissions’ of CFCs back to eastern China

Since being agreed in 1987, the Montreal Protocol – the treaty charged with healing the ozone layer, and now universally ratified – has been wildly successful in causing large reductions in emissions of ozone depleting substances. Along the way, it has also averted a sizeable amount of global warming, as those same substances are also potent greenhouse gases. No wonder the ozone process is often held up as a model of how the international community could work together to tackle climate change.

However, new research we have published with colleagues in Nature shows that global emissions of the second most abundant ozone-depleting gas, CFC-11, have increased globally since 2013, primarily because of increases in emissions from eastern China. Our results strongly suggest a violation of the Montreal Protocol.

A global ban on the production of CFCs has been in force since 2010, due to their central role in depleting the stratospheric ozone layer, which protects us from the sun’s ultraviolet radiation. Since global restrictions on CFC production and use began to bite, atmospheric scientists had become used to seeing steady or accelerating year-on-year declines in their concentration.

Ozone-depleting gases, measured in the lower atmosphere. Decline since the early 1990s is primarily due to the controls on production under the Montreal Protocol. AGAGE / CSIRO

But bucking the long-term trend, a strange signal began to emerge in 2013: the rate of decline of the second most abundant CFC was slowing. Before it was banned, the gas, CFC-11, was used primarily to make insulating foams. This meant that any remaining emissions should be due to leakage from “banks” of old foams in buildings and refrigerators, which should gradually decline with time.

But in a study published last year, measurements from remote monitoring stations suggested that someone was producing and using CFC-11 again, leading to thousands of tonnes of new emissions to the atmosphere each year. Hints in the data available at the time suggested that eastern Asia accounted for some unknown fraction of the global increase, but it was not clear exactly where these emissions came from.

Growing ‘plumes’ over Korea and Japan

Scientists, including ourselves, immediately began to look for clues from other measurements around the world. Most monitoring stations, primarily in North America and Europe, were consistent with gradually declining emissions in the nearby surrounding regions, as expected.
But all was not quite right at two stations: one on Jeju Island, South Korea, and the other on Hateruma Island, Japan.

These sites showed “spikes” in concentration when plumes of CFC-11 from nearby industrialised regions passed by, and these spikes had got bigger since 2013. The implication was clear: emissions had increased from somewhere nearby.

To further narrow things down, we ran computer models that could use weather data to simulate how pollution plumes travel through the atmosphere.
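To give a flavour of how such simulations are turned into emissions estimates, here is a toy Python sketch of the underlying idea. It is not the method used in the study, which relies on full atmospheric transport models and far more careful statistics; every number below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n_obs, n_cells = 200, 10  # number of observations; emission grid cells

# Stand-in for transport-model output: the sensitivity ("footprint") of
# each observation to emissions from each grid cell
footprint = rng.exponential(1.0, size=(n_obs, n_cells))

true_emissions = np.zeros(n_cells)
true_emissions[3] = 5.0  # one strongly emitting region

# Simulated observations: transport times emissions, plus measurement noise
obs = footprint @ true_emissions + rng.normal(0, 1.0, n_obs)

# Recover the emissions field (plain least squares for brevity; real
# inversions add prior information, uncertainties and non-negativity)
est, *_ = np.linalg.lstsq(footprint, obs, rcond=None)
print(np.round(est, 2))  # cell 3 should stand out near 5.0
```

The real inversions work the same way in spirit: the measured enhancements over the baseline constrain where, and how strongly, the gas must have been emitted for the simulated plumes to match what the stations saw.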

Atmospheric observations at Gosan and Hateruma monitoring stations showed an increase in CFC-11 emissions from China, primarily from Shandong, Hebei and surrounding provinces. Rigby et al, Author provided

From the simulations and the measured concentrations of CFC-11, it became apparent that a major change had occurred over eastern China. Emissions between 2014 and 2017 were around 7,000 tonnes per year higher than during 2008 to 2012. This represents more than a doubling of emissions from the region, and accounts for at least 40% to 60% of the global increase. In terms of the impact on climate, the new emissions are roughly equivalent to the annual CO₂ emissions of London.

The most plausible explanation for such an increase is that CFC-11 was still being produced, even after the global ban. On-the-ground investigations by the Environmental Investigation Agency and the New York Times appeared to confirm continued production and use of CFC-11 as recently as 2018, although they were not able to determine how significant it was.

It is not known exactly why production and use of CFC-11 apparently restarted in China after the 2010 ban. These reports suggested that some foam producers may have been unwilling to transition to second-generation substitutes (HFCs and other gases, which do not harm the ozone layer) as the supply of first-generation substitutes (HCFCs) became restricted for the first time in 2013.

Bigger than the ozone hole

Chinese authorities have said they will “crack down” on any illegal production. We hope that the new data in our study will help. Ultimately, if China successfully eliminates the new emissions sources, then the long-term negative impact on the ozone layer and climate could be modest, and a megacity-sized amount of CO₂-equivalent emissions would be avoided. But if emissions continue at their current rate, it could undo part of the success of the Montreal Protocol.


The network of global (AGAGE) and US-run (NOAA) monitoring stations. Luke Western, Author provided

While this story demonstrates the critical value of atmospheric monitoring networks, it also highlights a weakness of the current system. As pollutants quickly disperse in the atmosphere, and as there are only so many measurement stations, we were only able to get detailed information on emissions from certain parts of the world.

Therefore, if the major sources of CFC-11 had been a few hundred kilometres further to the west or south in China, or in unmonitored parts of the world, such as India, Russia, South America or most of Africa, the puzzle would remain unsolved. Indeed, there are still parts of the recent global emissions rise that remain unattributed to any specific region.

When governments and policy makers are armed with this atmospheric data, they will be in a much better position to consider effective measures. Without it, detective work is severely hampered.


—————————
This blog is written by Cabot Institute member Dr Matt Rigby, Reader in Atmospheric Chemistry, University of Bristol; Luke Western, Research Associate in Atmospheric Science, University of Bristol; and Steve Montzka, Research Chemist, NOAA ESRL Global Monitoring Division, University of Colorado. This article is republished from The Conversation under a Creative Commons license. Read the original article.

Listen to Matt Rigby talk about CFC emissions on BBC Radio 4’s Inside Science programme.

Climate change: sea level rise could displace millions of people within two generations

A small boat in the Ilulissat Icefjord is dwarfed by the icebergs that have calved from the floating tongue of Greenland’s largest glacier, Jakobshavn Isbrae. Image credit: Michael Bamber

Antarctica is further from civilisation than any other place on Earth. The Greenland ice sheet is closer to home but around one tenth the size of its southern sibling. Together, these two ice masses hold enough frozen water to raise global mean sea level by 65 metres if they were to suddenly melt. But how likely is this to happen?

The Antarctic ice sheet is around one and a half times larger than Australia. What’s happening in one part of Antarctica may not be the same as what’s happening in another – just as the east and west coasts of the US can experience very different responses to, for example, a change in the El Niño weather pattern. El Niño events are periodic climate phenomena that result in wetter conditions across the southern US, warmer conditions in the north and drier weather on the north-eastern seaboard.

The ice in Antarctica is nearly 5km thick in places, and we have very little idea what conditions are like at the base, even though those conditions play a key role in determining how quickly the ice can respond to climate change, including how fast it can flow toward and into the ocean. A warm, wet base lubricates the contact between ice and bedrock, allowing the ice to slide over the land beneath.

Though invisible from the surface, melting within the ice can speed up the process by which ice sheets slide towards the sea. Gans33/Shutterstock

These issues have made it particularly difficult to produce model simulations of how ice sheets will respond to climate change in future. Models have to capture all the processes and uncertainties that we know about and those that we don’t – the “known unknowns” and the “unknown unknowns” as Donald Rumsfeld once put it. As a result, several recent studies suggest that previous Intergovernmental Panel on Climate Change reports may have underestimated how much melting ice sheets will contribute to sea level in future.

What the experts say

Fortunately, models are not the only tools for predicting the future. Structured Expert Judgement is a method one of us used in a study published in 2013. Experts give their judgement on a hard-to-model problem, and their judgements are combined in a way that takes into account how good they are at assessing their own uncertainty. This provides a rational consensus.
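As a rough illustration of the principle – not the actual protocol, which follows Cooke’s classical model with formal calibration and information scores – the Python sketch below weights hypothetical experts by how often known ‘seed’ quantities fall inside their stated 90% intervals, then pools their answers to a target question. All numbers are invented.

```python
import numpy as np

# True answers to five "seed" questions the experts were also asked
seed_truth = np.array([12.0, 3.5, 40.0, 0.8, 150.0])

# Each expert states (5th, 50th, 95th) percentiles for the seeds and for a
# target question (here a hypothetical ice sheet contribution in cm)
experts = {
    "A": {"seeds": [(8, 12, 16), (2, 3, 5), (30, 42, 55),
                    (0.5, 0.9, 1.2), (100, 140, 190)],
          "target": (15, 26, 80)},
    "B": {"seeds": [(10, 11, 11.5), (3.0, 3.2, 3.4), (39, 40, 41),
                    (0.5, 0.6, 0.7), (140, 145, 148)],
          "target": (20, 30, 45)},  # overconfident: narrow intervals that miss
}

def calibration(seeds, truths):
    """Crude score: fraction of seed truths inside the expert's 90% interval."""
    return sum(lo <= t <= hi for (lo, _, hi), t in zip(seeds, truths)) / len(truths)

raw = {name: calibration(e["seeds"], seed_truth) for name, e in experts.items()}
weights = {name: score / sum(raw.values()) for name, score in raw.items()}

# Pool the target medians using the performance weights
pooled_median = sum(weights[n] * experts[n]["target"][1] for n in experts)
print(weights, f"pooled median: {pooled_median:.1f} cm")
```

Expert B answers every seed question precisely but with intervals so narrow they usually miss the truth, so B ends up with little weight: being honest about your own uncertainty is rewarded.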

The approach has been used when the consequences of an event are potentially catastrophic, but our ability to model the system is poor. These include volcanic eruptions, earthquakes, the spread of vector-borne diseases such as malaria and even aeroplane crashes.

Since the study in 2013, scientists modelling ice sheets have improved their models by trying to incorporate processes that cause positive and negative feedback. Impurities on the surface of the Greenland ice sheet cause positive feedback as they enhance melting by absorbing more of the sun’s heat. The stabilising effect of bedrock rising as the overlying ice thins, lessening the weight on the bed, is an example of negative feedback, as it slows the rate that the ice melts.

The record of observations of ice sheet change, primarily from satellite data, has also grown in length and quality, helping to improve knowledge of the recent behaviour of the ice sheets.

With colleagues from the UK and US, we undertook a new Structured Expert Judgement exercise. With all the new research, data and knowledge, you might expect the uncertainties around how much ice sheet melting will contribute to sea level rise to have got smaller. Unfortunately, that’s not what we found. What we did find was a range of future outcomes that go from bad to worse.

Reconstructed sea level for the last 2500 years. Note the marked increase in rate since about 1900 that is unprecedented over the whole time period. Robert Kopp/Kopp et al. (2016).


Rising uncertainty

We gathered together 22 experts in the US and UK in 2018 and combined their judgements. The results are sobering. Rather than shrinking over the last six years, the uncertainty in future ice sheet behaviour has grown.

If the global temperature increase stays below 2°C, the experts’ best estimate of the average contribution of the ice sheets to sea level was 26cm. They concluded, however, that there is a 5% chance that the contribution could be as much as 80cm.

If this is combined with the two other main factors that influence sea level – glaciers melting around the world and the expansion of ocean water as it warms – then global mean sea level rise could exceed one metre by 2100. If this were to occur, many small island states would experience their current once-in-a-hundred-year flood every other day and become effectively uninhabitable.

A climate refugee crisis could dwarf all previous forced migrations. Punghi/Shutterstock

For a climate change scenario closer to business as usual – where our current trajectory for economic growth continues and global temperatures increase by 5°C – the outlook is even bleaker. The experts’ best estimate of the ice sheet contribution in this case is 51cm of sea level rise by 2100, with a 5% chance that global sea level rise could exceed two metres by 2100. That has the potential to displace some 200 million people.

Let’s try to put this into context. The Syrian refugee crisis is estimated to have caused about a million people to migrate to Europe. This occurred over years rather than a century, giving much less time for countries to adjust. Still, migration driven by sea level rise of this size might threaten the existence of nation states and result in unimaginable stress on resources and space. There is time to change course, but not much, and the longer we delay the harder it gets: the bigger the mountain we have to climb.




This blog was written by Cabot Institute member Jonathan Bamber, Professor of Physical Geography, University of Bristol and Michael Oppenheimer, Professor of Geosciences and International Affairs, Princeton University.  This article is republished from The Conversation under a Creative Commons license. Read the original article.

The future of sustainable ocean science

Westminster Central Hall

May 9th ushered in the 9th National Oceanography Centre (NOC) Association meeting, held among the crowds, statues, flags, and banners, at Central Hall in an unseasonably chilly and rainy Westminster. But it was the first such meeting where the University of Bristol was represented, and I was honoured to fly our own flag, for both University of Bristol and the Cabot Institute for the Environment.

NOC is – currently – a part of the Natural Environment Research Council (one of the UK Research Councils, under the umbrella of UKRI), but will soon become an independent entity: a charitable organisation in its own right aimed at the advancement of science. If you’ve heard of NOC, you’re likely aware of the NOC buildings in Southampton (and the sister institute in Liverpool). However, the NOC Association is a wider group of UK universities and research institutes with interests in marine science, and with a wider aim: to promote a two-way conversation between scientists and other stakeholders, from policy makers to the infrastructure organisations that facilitate – and build our national capability in – oceanographic research.

The meeting started with an introduction by the outgoing chair of the NOC Association, Professor Peter Liss from the University of East Anglia, who is handing over the reins to Professor Gideon Henderson from Oxford University. The newly independent NOC Board will face the challenges of a changing scientific community, including the challenge of making the Association more visible and more diverse.

Professor Peter Liss, outgoing chair of the NOC Association, giving the welcome talk

As well as the changes and challenges facing the whole scientific community, there are some exciting developments in UK and international marine science in the next two years, which are likely to push the marine science agenda forward. In the UK, the Foreign and Commonwealth Office’s International Ocean Strategy will be released in the next few months, and there is an imminent announcement of a new tranche of ecologically-linked UK Marine Protected Areas (MPAs) for consultation. On the international stage, a new Intergovernmental Panel on Climate Change special report on the Oceans and Cryosphere is due to be released in September; the Biodiversity Beyond National Jurisdiction (BBNJ) report on deep sea mining will be announced in the next few months; and the next United Nations Framework Convention on Climate Change (UNFCCC) Conference of Parties (COP) climate change conference, scheduled for the end of this year in Chile, has been branded the “Blue COP”.

The afternoon was dedicated to a discussion of the upcoming UN Decade of Ocean Science for Sustainable Development, starting in 2021. With such a wealth of national and international agreements and announcements in the next two years, the UN Decade will help to “galvanise and organise” the novel scientific advice in the light of ever-increasing and cumulative human impacts on the oceans.
Alan Evans, Head of the International and Strategic Partnerships Office and a Marine Science Policy Adviser, giving a presentation on the UN Decade of Ocean Science for Sustainable Development
The UN Decade is aligned strongly with the key global goals for sustainable development and has two overarching aims: to generate ocean science, and to generate policy and communication mechanisms and strategies. The emphasis is being placed on “science for solutions”, bringing in social scientists and building societal benefits: making the oceans cleaner, safer, healthier and – of course – all in a sustainable way.

Research and development priorities include mapping the seafloor; developing sustainable and workable ocean observing systems; understanding ecosystems; management and dissemination of open access data; multi-hazard warning systems (from tsunamis to harmful algal blooms); modelling the ocean as a compartment of the Earth system; and pushing for a robust education and policy strategy to improve “ocean literacy”.

Whilst these are exciting areas for development, the scheme is still in its very early stages, and there’s a lot to do in the next two years. As the discussion progressed, it was clear that there is a need for more “joined-up” thinking on international collaboration. There are so many international marine science organisations that collaboration can be “messy”; it needs to be more constructive, and we need to be talking on behalf of each other. On a national level, there is a need to build a clear UK profile, with a clear strategy, that can be projected internationally. The NOC Association is a good place to start, and Bristol and the Cabot Institute for the Environment can play their parts.

Lastly, a decade is a long time. If the efforts are to be sustained throughout, and be sustainable beyond the Decade, we need to make sure that there is engagement with Early Career Researchers (ECRs) and mid-career researchers, as well as robust buy-in from all stakeholders. Whilst there are several national-scale organisations with fantastic programmes to promote ECRs, such as the Climate Linked Atlantic Sector Science (CLASS) fellowship scheme and the Marine Alliance for Science and Technology for Scotland (MASTS) doctoral training programme, this needs to be extended to ambitious international ECR networking schemes. Together with the future generation of researchers, we can use the momentum of the UN Decade to make marine research sustainable, energised and diverse.

———————————

This blog is written by Dr Kate Hendry, a reader in Geochemistry in the University of Bristol School of Earth Sciences and a committee member for the Cabot Institute for the Environment Environmental Change Theme. She is the UoB/Cabot representative on the NOC Association, a member of the Marine Facilities Advisory Board (MFAB), and a co-chair of a regional Southern Ocean Observing System (SOOS) working group.

Global warming ‘hiatus’ is the climate change myth that refuses to die

riphoto3 / shutterstock

The record-breaking, El Niño-driven global temperatures of 2016 have given climate change deniers a new trope. Why, they ask, hasn’t it since got even hotter?

In response to a recent US government report on the impact of climate change, a spokesperson for the science-denying American Enterprise Institute think-tank claimed that “we just had […] the biggest drop in global temperatures that we have had since the 1980s, the biggest in the last 100 years.”

These claims are blatantly false: the past two years were two of the three hottest on record, and the drop in temperature from 2016 to 2018 was less than, say, the drop from 1998 (a previous record hot year) to 2000. But, more importantly, these claims use the same kind of misdirection as was used a few years ago about a supposed “pause” in warming lasting from roughly 1998 to 2013.

At the time, the alleged pause was cited by many people sceptical about the science of climate change as a reason not to act to reduce greenhouse pollution. US senator and former presidential candidate Ted Cruz frequently argued that this lack of warming undermined dire predictions by scientists about where we’re heading.

However, drawing conclusions on short-term trends is ill-advised because what matters to climate change is the decade-to-decade increase in temperatures rather than fluctuations in warming rate over a few years. Indeed, if short periods were suitable for drawing strong conclusions, climate scientists should perhaps now be talking about a “surge” in global warming since 2011, as shown in this figure:

Global temperature observations compared to climate models. Climate-disrupting volcanoes are shown at the bottom, and the purported hiatus period is shaded. 2018 values based on year to date (YTD).
NASA; Berkeley Earth; various climate models., Author provided

The “pause” or “hiatus” in warming of the early 21st century is not just a talking point of think-tanks with radical political agendas. It also features in the scientific literature, including in the most recent report of the Intergovernmental Panel on Climate Change and more than 200 peer-reviewed articles.

Research we recently published in Environmental Research Letters addresses two questions about the putative “pause”: first, is there compelling evidence in the temperature data alone of something unusual happening at the start of the 21st century? Second, did the rise in temperature lag behind projections by climate models?

In both cases the answer is “no”, but the reasons are interesting.

Reconstructing a historical temperature record from instruments designed for other purposes, such as weather forecasting, is not always easy. Several problems have affected temperature estimates for the period since 2000. The first of these is that the uneven geographical distribution of weather stations can influence the apparent rate of warming. Other factors include changes in the instruments used to measure ocean temperatures. Most of these factors were known at the time and reported in the scientific literature, but because the magnitudes of the effects were unknown, users of temperature data (from science journalists to IPCC authors) were in a bind when interpreting their results.

‘This glacier was here in 1908’: warming might fluctuate, but the long-term trend is clear.
Matty Symons/Shutterstock

A more subtle problem arises when we ask whether a fluctuation in the rate of warming is a new phenomenon, rather than the kind of variation we expect due to natural fluctuations of the climate system. Different statistical tests are needed to determine whether a phenomenon is interesting, depending on how the data are chosen. In a nutshell, if you select data because they are unusual in the first place, then any statistical test that seemingly confirms their unusual nature gives the wrong answer. (The statistical issue here is similar to the fascinating but counterintuitive “Monty Hall problem”, which has caught out many mathematicians.)
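A short simulation makes the point. In the Python sketch below (a toy model with invented parameters, not real temperature data), every series has the same steady underlying trend plus random noise. If we always pick each series’ slowest-warming 15-year stretch and test it against an honest distribution of unselected 15-year trends, the “pause” looks statistically significant far more often than the nominal 5%:

```python
import numpy as np

rng = np.random.default_rng(0)
years, window = 60, 15
trend, noise_sd = 0.02, 0.1  # degC/yr warming and year-to-year noise (illustrative)

def slope(y):
    """Least-squares trend of a series."""
    return np.polyfit(np.arange(len(y)), y, 1)[0]

# Honest null distribution: trends of windows chosen *without* selection
null = [slope(trend * np.arange(window) + rng.normal(0, noise_sd, window))
        for _ in range(5000)]
threshold = np.percentile(null, 5)  # correct 5% cutoff for one random window

n_sims, hits = 1000, 0
for _ in range(n_sims):
    series = trend * np.arange(years) + rng.normal(0, noise_sd, years)
    # Cherry-pick the slowest-warming window in the series...
    slowest = min(slope(series[i:i + window]) for i in range(years - window + 1))
    # ...then test it as if it had been chosen at random
    hits += slowest < threshold

print(f"'Significant' pauses in {100 * hits / n_sims:.0f}% of simulations (nominal: 5%)")
```

Nothing unusual ever happens in these simulated climates, yet the cherry-picked windows fail the naive significance test most of the time. The selection step, not the climate, creates the apparent anomaly.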

When the statistical test is applied correctly, the apparent slowdown in warming is no more significant than other fluctuations in the rate of warming over the past 40 years. In other words, there is no compelling evidence that the supposed “pause” period is different from other previous periods. Neither is the deviation between the observations and climate model projections larger than would be expected.

That’s not to say that such “wiggles” in the temperature record are uninteresting – several of our team are involved in further studies of these fluctuations, and the study of the “pause” has yielded interesting new insights into the climate system – for example, the role of changes in the Atlantic and Pacific oceans.

There are lessons here for the media, for the public, and for scientists.

For scientists, there are two lessons: first, when you get to know a dataset by using it repeatedly in your work, make sure you also still remember the limitations you read about when first downloading it. Second, remember that your statistical choices are always part of a cascade of decisions, and at least occasionally those decisions must be revisited.

For the public and the media, the lesson is to check claims about the data. In particular, when claims are made based on short periods or specific datasets, they are often designed to mislead. If someone claims the world hasn’t warmed since 1998 or 2016, ask them why those specific years – why not 1997 or 2014? Why have such short limits at all? And also check how reliable similar claims have been in the past.

The technique of misinformation is nicely described in a quote attributed to climate researcher Michael Tobis:

“If a large data set speaks convincingly against you, find a smaller and noisier one that you can huffily cite.”

Global warming didn’t stop in 1998. Don’t be fooled by claims that it stopped in 2016 either. There is only one thing that will stop global warming: cuts to greenhouse gas emissions.

———————————
This blog is written by Kevin Cowtan, Professor of Chemistry, University of York and Professor Stephan Lewandowsky, Chair of Cognitive Psychology, University of Bristol Cabot Institute. This article is republished from The Conversation under a Creative Commons license. Read the original article.

Kevin Cowtan
Stephan Lewandowsky

Should we engineer the climate? A social scientist and natural scientist discuss

Ekaterina Karpacheva/Shutterstock.com

This is an article from The Conversation’s Head to Head, a series in which academics from different disciplines chew over current debates. Let us know what else you’d like covered – all questions welcome. Details of how to contact us are at the end of the article.


Rob Bellamy: 2018 has been a year of unprecedented weather extremes around the world. From the hottest temperatures ever recorded in Japan to the largest wildfire in the history of California, the frequency and intensity of such events have been made much more likely by human-induced climate change. They form part of a longer-term trend – observed in the past and projected into the future – that may soon make nations desperate enough to consider engineering the world’s climate deliberately in order to counteract the risks of climate change.

Indeed, the spectre of climate engineering hung heavily over the recent United Nations climate conference in Katowice, COP24, having featured in several side events as negotiators agreed on how to implement the landmark 2015 Paris Agreement – a deal that leaves many worried it does not go far enough.

Matt Watson: Climate engineering – or geoengineering – is the purposeful intervention in the climate system to reduce the worst side effects of climate change. There are two broad types: greenhouse gas removal (GGR) and solar radiation management (SRM). GGR focuses on removing anthropogenically emitted gases from the atmosphere, directly reducing the greenhouse effect. SRM, meanwhile, is the label given to a diverse mix of large-scale technology ideas for reflecting sunlight away from the Earth, thereby cooling it.

An engineered future?

RB: It’s increasingly looking like we may have to rely on a combination of such technologies in facing climate change. The authors of the recent IPCC report concluded that it is possible to limit global warming to no more than 1.5°C, but every single one of the pathways they envisaged as consistent with this goal requires the use of greenhouse gas removal, often on a vast scale. While these technologies vary in their levels of maturity, none are ready to be deployed yet – for technical or social reasons, or both.

If efforts to reduce greenhouse gas emissions by transitioning away from fossil fuels fail, or greenhouse gas removal technologies are not researched and deployed quickly enough, faster-acting SRM ideas may be needed to avoid so-called “climate emergencies”.

SRM ideas include installing mirrors in Earth’s orbit, growing crops that have been genetically modified to make them lighter, painting urban areas white, spraying clouds with salt to make them brighter, and paving mirrors over desert areas – all to reflect sunlight away. But by far the best known idea – and that which has, rightly or wrongly, received the most attention by natural and social scientists alike – is injecting reflective particles, such as sulphate aerosols, into the stratosphere, otherwise known as “stratospheric aerosol injection” or SAI.

MW: Despite researching it, I do not feel particularly positive about SRM (very few people do). But our direction of travel is towards a world where climate change will have significant impacts, particularly on those most vulnerable. If you accept the scientific evidence, it’s hard to argue against options that might reduce those impacts, no matter how extreme they appear.

Do you remember the film 127 Hours? It tells the (true) story of a young climber who, pinned under a boulder in the middle of nowhere, eventually ends up amputating his arm, without anaesthetic, with a pen knife. In the end, he had little choice. Circumstances dictate decisions. So if you believe climate change is going to be severe, you have no option but to research the options (I am not advocating deployment) as broadly as possible. Because there may well come a point in the future where it would be immoral not to intervene.

SRM using stratospheric aerosols has many potential issues but does have a comparison in nature – active volcanism – which can partially inform us about the scientific challenges, such as the dynamic response of the stratosphere. Very little research is currently being conducted, due to a challenging funding landscape. What is being done is at small scale (financially), is linked to other, more benign ideas, or is privately funded. This is hardly ideal.

A controversial idea

RB: But SAI is a particularly divisive idea for a reason. For example, as well as threatening to disrupt regional weather patterns, it, and the related idea of brightening clouds at sea, would require regular “top-ups” to maintain cooling effects. Because of this, both methods would suffer from the risk of a “termination effect”, where any cessation of cooling would result in a sudden rise in global temperature in line with the level of greenhouse gases in the atmosphere. If we hadn’t been reducing our greenhouse gas emissions in the background, this could be a very sharp rise indeed.
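The termination effect is easy to see in a toy one-box energy-balance model. The Python sketch below is only an illustration with assumed, textbook-style parameter values, not a projection: greenhouse forcing rises steadily, SRM cancels it for four decades, and when the SRM stops the temperature jumps rapidly toward the level set by the accumulated greenhouse gases.

```python
import numpy as np

lam = 1.2  # climate feedback parameter, W/m2 per K (assumed)
C = 8.0    # effective heat capacity, W yr/m2 per K (assumed)
years = np.arange(2020, 2121)

ghg_forcing = 0.04 * (years - 2020)  # steadily rising greenhouse forcing, W/m2
srm_forcing = np.where((years >= 2030) & (years < 2070),
                       -ghg_forcing, 0.0)  # SRM cancels GHGs 2030-2069, then stops

T = np.zeros(len(years))
for i in range(1, len(years)):
    F = ghg_forcing[i] + srm_forcing[i]
    T[i] = T[i - 1] + (F - lam * T[i - 1]) / C  # dT/dt = (F - lam*T)/C, 1-yr step

print(f"T in 2069: {T[years == 2069][0]:.2f} K; T in 2080: {T[years == 2080][0]:.2f} K")
```

While the SRM is maintained the model world barely warms, but the underlying greenhouse forcing keeps growing; switching the SRM off exposes the full forcing at once, and most of the pent-up warming arrives within a decade or two.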




Read more: Time is running out on climate change, but geoengineering has dangers of its own


Such ideas also raise concerns about governance. What if one powerful actor – be it a nation or a wealthy individual – could change the global climate at a whim? And even if there were an international programme, how could meaningful consent be obtained from those who would be affected by the technology? That’s everybody on Earth. What if some nations were harmed by the aerosol injections of others? Attributing liability would be greatly contentious in a world where you can no longer disentangle natural from artificial.

And who could be trusted to deliver such a programme? Your experience with the SPICE (Stratospheric Particle Injection for Climate Engineering) project shows that people are wary of private interests. There, it was concerns about a patent application that in part led to the scientists calling off a test of delivery hardware for SAI that would have seen the injection of water 1km above the ground via a pipe and tethered balloon.

MW: The technological risks, while vitally important, are not insurmountable. While non-trivial, there are existing technologies that could deliver material to the stratosphere.

Most researchers agree that the socio-political risks, such as those you outline, outweigh the technological ones. One researcher remarked at a Royal Society meeting in 2010: “We know that governments have failed to combat climate change; what are the chances of them safely implementing a less-optimal solution?” This is a hard question to answer well. But in my experience, opponents of research never consider the risk of not researching these ideas.

The SPICE project is an example of scientists and engineers taking the decision to call off part of an experiment. Despite what was reported, we did this of our own volition. It annoyed me greatly when others, including those who purported to provide oversight, claimed victory for the experiment not going ahead. Such claims belie the amount of soul-searching we undertook. I’m proud of the decisions we made, essentially unsupported, and in most people’s eyes they have added to scientists’ credibility.

Moral hazard

RB: Some people are also worried that the promise of large-scale climate engineering technologies might delay or distract us from reducing greenhouse gas emissions – a “moral hazard”. But this remains to be seen. There are good reasons to think that the promise (or threat) of SRM might even galvanise efforts to reduce greenhouse gas emissions.

MW: Yes, I think it’s at least as likely that the threat of SAI would prompt “positive” behaviour, towards a sustainable, greener future, as “negative” behaviour, where we assume that some currently imaginary technology will solve our problems (in fact, our grandchildren’s problems, 50 years from now).

RB: That said, the risks of a moral hazard may not be the same for all climate engineering ideas, or even for all SRM ideas. It’s a shame that the specific idea of stratospheric aerosol injection is so frequently conflated with its parent category of SRM, and with climate engineering more generally. This leads people to tar all climate engineering ideas with the same brush, to the detriment of the many other ideas that have so far raised fewer societal concerns, such as more reflective settlements or grasslands on the SRM side of things, or virtually the entire category of greenhouse gas removal ideas. We risk throwing the baby out with the bathwater.

MW: I agree with this, somewhat. It’s certainly true that all techniques should be given the same amount of scrutiny, based on evidence. Some techniques, however, look benign but aren’t: modifying crops to make them more reflective, brightening clouds, even planting trees all have potentially profound impacts at scale. I disagree a little insofar as we simply don’t know enough yet to say which technologies have the potential to reduce the impacts of climate change safely. This means we do need to be thinking about all of these ideas, but objectively.

Anyone who passionately backs a particular technology concerns me. If it could be conclusively proven that SAI did more harm than good, then we should stop researching it. All serious researchers in SAI would accept that outcome, and many are actively looking for showstoppers.

RB: I agree. But at present there is very little demand for research into SRM from governments and wider society. This needs to be addressed. And we need broad societal involvement in defining the tools – and terms – of such research, and indeed in tackling climate change more broadly.

Read more: Why you need to get involved in the geoengineering debate – now

The question of governance

MW: Some people think that we should just be getting on with engineering the climate, whereas others feel that the very idea should not even be discussed or researched. Most academics value governance as a mechanism that allows the freedom to explore ideas safely, and there are very few serious researchers, if any, who push back against this.

A challenge, of course, is who governs the governors. There are strong feelings on both sides: depending on your viewpoint, scientists either must, or cannot, govern their own research. Personally, I’d like to see a broad, international body set up with the power to govern climate engineering research, especially outdoor experiments. The hurdles to conducting such experiments should weigh both environmental and social impacts, but should not be an impediment to safe, thoughtful research.

RB: There are more proposed frameworks for governance than you can shake a stick at. But there are two major problems with them. The first is that most of these frameworks treat all SRM ideas as though they were stratospheric aerosol injection, and call for international regulation. That might be fine for technologies whose risks cross national boundaries, but for ideas like reflective settlements and grasslands, such heavy-handed governance might not make sense. It is also at odds with the bottom-up architecture of the Paris Agreement, under which countries make nationally determined efforts to tackle climate change.

Which leads us to the second problem: these frameworks have almost exclusively arisen from a very narrow set of viewpoints – either those of natural or social scientists. What we really need now is broad societal participation in defining what governance itself should look like.

MW: Yes. There are so many questions that need to be addressed. Who pays for development and delivery and, critically, for any consequences? How is the global south to be enfranchised? Its nations are the least responsible, the most vulnerable and, given current geopolitical frameworks, unlikely to have a strong say. And what does climate engineering mean for our relationship with nature: will anything ever be “natural” again (whatever that is)?

All these questions must be considered against a backdrop in which we continue to emit CO₂ and the existing risks from climate change keep growing. That climate engineering is sub-optimal compared with a pristine, sustainably managed planet is hard to argue against. But we don’t live in such a world. And when climate engineering is weighed against a +3°C world, I’d suggest the opposite is highly likely to be true.

If there’s a specific topic or question you’d like experts from different disciplines to discuss, you can:

  • Email your question to josephine.lethbridge@theconversation.com
  • Tell us on Twitter by tagging @ConversationUK with the hashtag #HeadtoHead, or
  • Message us on Facebook.

———————————
This blog was written by Dr Rob Bellamy, Presidential Fellow in Environment, University of Manchester, and Dr Matthew Watson, Reader in Natural Hazards, University of Bristol Cabot Institute. This article is republished from The Conversation under a Creative Commons license. Read the original article.


Quality through Equality – tackling gender issues in hydrology

Quality through Equality organising committee (l-r Dr Francesca Pianosi, Dr Valentina Noacco, Sebastian Gnann, Lina Stein, Dr Maria Pregnolato, Elisa Coraggio, Melike Kiraz, Lina Wang)

Results of a one-day workshop organised by the University of Bristol’s Water Engineering Group

Last year a professor asked our group of PhD students, “Who here is thinking of staying in academia after finishing their PhD?” Of the 10 male students present, 4 or 5 said they could imagine continuing in academia. None of the 5 female students raised their hand. When asked for their reasons for not wanting to stay in academia, they mentioned, among other things, the challenge of combining family and academia, a lack of role models, and different career aspirations.

This experience sparked the idea of organising a workshop on gender issues in hydrology, with the aims of raising awareness of unconscious bias, offering role models, and discussing ideas on how to make the hydrological community more diverse. Although the focus of the workshop was on gender diversity, most of what we learned applies equally to the under-representation of ethnic minorities and disabled scientists.

To achieve these aims, the workshop included three invited speakers (Prof Hannah Cloke, Dr Joshua Larsen, Prof Elena Toth), who shared their experiences of gender issues in hydrology; a talk and training session on unconscious bias (Prof Havi Carel); and a group discussion. The workshop was attended by 44 hydrologists, mainly PhD students, of whom 28 were female and 16 were male.

One highlight of the day was Hannah Cloke’s presentation on her progression to full professor while raising four children. Together with Elena Toth and Joshua Larsen, she agreed that combining academia with raising a family is possible, because academia offers one of the most flexible work environments there is. However, it does require a supportive stance from the university to enable that flexibility (flexitime working hours, childcare facilities, flexible childcare support for conferences), as well as supportive colleagues. Hannah finished with good advice for all PhD students, but especially for women and members of minorities: a work-family-life balance is essential. Say no before you are overwhelmed and exhausted, but be brave! Say yes to opportunities that scare you and do great science! And encourage each other to be brave. This is definitely advice I will try to implement in my own life.

The afternoon included an unconscious bias training session led by Professor Havi Carel (watch her TED talk about unconscious bias) and group discussions on how academia can become more diverse and how we can create an enjoyable academic environment.

Some of the topics we discussed were:

What can senior and peer colleagues do?

Postgraduate and early career researchers often suffer from a lack of communication at their institutions. Peer-to-peer or senior-to-junior mentoring can create space for discussion, particularly about equality, inclusion and diversity issues. When exclusion or discrimination is experienced or witnessed, having a range of peers and senior colleagues to talk to becomes very important, and makes it easier to report to leadership if needed. These meetings and discussions also give people who might otherwise feel their problems are overlooked the opportunity to find support, be empowered and build up their self-confidence.

What can leadership do?

To specifically include researchers with caring responsibilities, some attendees suggested that institutions improve access to affordable childcare. This could include nurseries at the university, as well as more flexible reimbursement of childcare costs during specific events, such as conferences, where children cannot be brought along by parents.

What is the role of role models?

The attendees agreed that role models can be vital in shaping career pathways: they inspire, act as advisors, and can start or change career aspirations. Role models should be relatable (by gender, ethnicity, etc.) and are thus not always available in less diverse environments. Where local role models are lacking, however, new ways of finding them should be encouraged; Twitter and other social media, for example, offer a great selection of diverse role models from all over the world.

What is success in academia (or in life)?

Success can be defined in many ways. Some people want to make a difference, some want to publish high-quality work, some want a good work-family-life balance, and some want all of these together. This makes it important for line managers, supervisors and colleagues to accept and nurture that diversity. Definitions of success should be flexible and shaped by the people in a given work environment. This will hopefully lead to a more enjoyable, and more productive, workplace.

The feedback we received from the day, both in conversations with attendees and in the questionnaires filled out at the end, was overwhelmingly positive. The discussions and the opportunity to share experiences with others were the highlights of the workshop. Many participants felt more aware of biases and more empowered to tackle them. Some changes are already happening as a result of the workshop: our research group is diversifying its social activities to be more inclusive, and both the British Hydrological Society and the Young Hydrologic Society have now appointed EDI (Equality, Diversity & Inclusion) champions! And with one third of the 44 attendees being male, the workshop demonstrated that it is not just women who are interested in learning about biases and discussing their experiences.

We thank the GW4 Water Security Alliance, the Cabot Institute and the School of Engineering of the University of Bristol for funding this event. A big thank you to our three speakers, to Havi Carel, who conducted the training, and to all attendees for creating an inclusive and productive atmosphere. Now it is our task to implement what we have learned and to communicate the results as widely as possible. And on a personal note, I definitely feel there is a future in academia for me now.

If you are interested in organising a similar event at your institution and have any questions, feel free to contact us: hydro-equality2019@bristol.ac.uk

Further information and material can be found on our website.

Some further reading on diversity and bias in STEM is also available, including a list of scientific literature documenting the challenges women and minorities face in STEM subjects.

———————————
This blog was written by Cabot Institute member Lina Stein, a hydrology PhD student in the Department of Civil Engineering at the University of Bristol, together with other members of the organising committee.