East Africa must prepare for more extreme rainfall during the short rainy season – new study

Rainy season in Kenya

East Africa has recently had an unprecedented series of failed rains. But some rainy seasons are bringing the opposite: huge amounts of rainfall.

In the last few months of 2023, the rainy season known as the “short rains” was much wetter than normal. It brought severe flooding to Kenya, Somalia and Tanzania. In Somalia, more than 2 million people were affected, with over 100 killed and 750,000 displaced from their homes. Tens of thousands of people in northern Kenya lost livestock, farmland and homes.

The very wet short rainy seasons are linked to a climate event known as a positive Indian Ocean Dipole (the “IOD”). And climate model projections show an increasing trend of extreme Indian Ocean Dipole events.

In a new research paper, we set out to investigate what effect more frequent extreme Indian Ocean Dipole events would have on rainfall in east Africa. We did this using a large number of climate simulations and models.

Our results show that they increase the likelihood of very wet days – and therefore of very wet seasons.

This could lead to weather events even more extreme than the floods of 1997, which left 10 million people requiring emergency assistance, or those of 2019, when hundreds of thousands were displaced.

We recommend that decision-makers plan for this kind of extreme rainfall, and the resulting devastating floods.

How the Indian Ocean Dipole works

Indian Ocean Dipole events tend to occur in the second half of the year, and can last for months. They have two phases: positive and negative.

Positive events occur when the temperature of the sea surface in the western Indian Ocean is warmer than normal and the temperature in the eastern Indian Ocean is cooler than normal. Put simply, this temperature difference happens when winds move warmer water away from the ocean surface in the eastern region, allowing cooler water to rise.

In the warmer western Indian Ocean, more heated air will rise, along with water vapour. This forms clouds, bringing rain. Meanwhile, the eastern part of the Indian Ocean will be cooler and drier. This is why flooding in east Africa can happen at the same time as bushfires in Australia.

The opposite is true for negative dipole events: drier in the western Indian Ocean and wetter in the east.

Under climate change we’re expecting to see more frequent and more extreme positive dipole events – bigger temperature differences between east and west. This is shown by climate model projections. These changes are believed to be driven by different paces of warming across the tropical Indian Ocean, with western and northern regions projected to warm faster than eastern parts.

Often heavy rain seasons in east Africa are attributed to El Niño, but recent research has shown that the direct impact of El Niño on east African rainfall is actually relatively modest. El Niño’s principal influence lies in its capacity to bring about positive dipole events. This occurs because El Niño events tend to cool the water in the western Pacific Ocean – around Indonesia – which also helps to cool down the water in the eastern Indian Ocean. These cooler temperatures then help kick-start a positive Indian Ocean Dipole.

Examining unprecedented events

Extreme positive Indian Ocean Dipole events are rare in the recent climate record. So to examine their potential impacts on rainfall extremes, we used a large set of climate simulations. The data allowed us to diagnose the sensitivity of rainfall to larger Indian Ocean Dipole events in a statistically robust way.

Our results show that as positive dipole events become more extreme, more wet days during the short rains season can be expected. This effect was found to be largest for the frequency of extremely wet days. Additionally, we found that as the dipole strength increases, the influence on the most extreme days becomes even larger. This means that dipole events which are even slightly “record-breaking” could lead to unprecedented levels of seasonal rainfall.
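The sensitivity described above – more wet days as dipole strength increases – is, at its simplest, a regression of wet-day counts on a dipole index across many simulated seasons. This toy sketch uses made-up numbers purely to illustrate the idea; it is not data or code from the study:

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sum((a - mx) ** 2 for a in x)
    slope = num / den
    return slope, my - slope * mx

# Illustrative values only: dipole strength and very-wet-day counts
# for five hypothetical simulated short-rains seasons.
iod_index = [0.2, 0.5, 0.9, 1.4, 2.1]
wet_days = [4, 6, 9, 13, 19]

slope, _ = linear_fit(iod_index, wet_days)
print(f"{slope:.1f} extra very wet days per unit of dipole strength")
```

With a large ensemble of simulations, the same fit can be made statistically robust even for events rarely seen in the observed record.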

Ultimately, if positive Indian Ocean Dipole events increase in frequency, as predicted, regular seasons of damaging floods will become the new normal.

One aspect not included in our analysis is the influence of a warmer atmosphere on rainfall extremes. A warmer atmosphere holds more moisture, allowing for the development of more intense rain storms. This effect could combine with the influence of extreme positive dipoles to bring unprecedented levels of rainfall to the Horn of Africa.

2023 was a year of record-breaking temperatures, driven both by El Niño and global warming. We might expect that this warmer air intensified rain storms during the season. Indeed, evidence from a recent assessment suggests that climate change-driven warming is highly likely to have been responsible for increased rainfall totals.

Responding to an unprecedented future

Policymakers need to plan for this.

In the long term it is crucial to ensure that any new infrastructure is robust enough to withstand more frequent and heavier rains, and that government, development and humanitarian actors have the capacity to respond to the challenges.

Better use of technology, such as innovations in disseminating satellite rainfall monitoring via mobile phones, can communicate immediate risk. New frontiers in AI-based weather prediction could improve the ability to anticipate localised rain storms, including initiatives focusing on eastern Africa specifically.

Linking rainfall information with hydrological models designed for dryland environments is also essential. These will help to translate weather forecasts into impact forecasts, such as identifying risks of flash flooding down normally dry channels or bank overflow of key rivers in drylands.

These technological improvements are crucial. But better use of the forecast information we already have can also make a big difference. For instance, initiatives like “forecast-based financing”, pioneered by the Red Cross Red Crescent movement, link forecast triggers to pre-approved financing and predefined action plans, helping communities protect themselves before hazards have even started.

For these endeavours to succeed, there must be dialogue between the science and practitioner communities. The scientific community can work with practitioners to integrate key insights into decisions, while practitioners can help to ensure research efforts target critical needs. With this, we can effectively build resilience to natural hazards and withstand the increasing risks of our changing climate.

———————————

This blog is written by David MacLeod, Lecturer in Climate Risk, Cardiff University; Erik W. Kolstad, Research professor, Uni Research; Cabot Institute for the Environment member Katerina Michaelides, Professor of Dryland Hydrology, School of Geographical Sciences, University of Bristol, and Michael Singer, Professor of Hydrology and Geomorphology, Cardiff University. This article is republished from The Conversation under a Creative Commons license. Read the original article.

Intense downpours in the UK will increase due to climate change – new study

A flash flood in London in October 2019.
D MacDonald/Shutterstock

Elizabeth Kendon, University of Bristol

In July 2021, Kew in London experienced a month’s rain in just three hours. Across the city, tube lines were suspended and stations closed as London experienced its wettest day in decades and flash floods broke out. Just under two weeks later, it happened again: intense downpours led to widespread disruption, including the flooding of two London hospitals.

Colleagues and I have created a new set of 100-year climate projections to more accurately assess the likelihood of heavy downpours like these over the coming years and decades. The short answer is climate change means these extreme downpours will happen more often in the UK – and be even more intense.

To generate these projections, we used the Met Office operational weather forecast model, but run on long climate timescales. This provided very detailed climate projections – for every 2.2km grid box over the UK, for every hour, for 100 years from 1981 to 2080. These are much more detailed than traditional climate projections and needed to be run as a series of 20-year simulations that were then stitched together. Even on the Met Office supercomputer, these still took about six months to run.

We ran 12 such 100-year projections. We are not interested in the weather on a given day but rather how the occurrence of local weather extremes varies year by year. By starting the model runs in the past, it is also possible to verify the output against observations to assess the model’s performance.

At this level of detail – the “k-scale” – it is possible to more accurately assess how the most extreme downpours will change. This is because k-scale simulations better represent the small-scale atmospheric processes, such as convection, that can lead to destructive flash flooding.

The fire service attending to a vehicle stuck in floodwater.
Flash flooding can be destructive.
Ceri Breeze/Shutterstock

More emissions, more rain

Our results are now published in Nature Communications. We found that under a high emissions scenario, downpours in the UK exceeding 20mm per hour could be four times as frequent by the year 2080 compared with the 1980s. This level of rainfall can potentially produce serious damage through flash flooding, with thresholds like 20mm/hr used by planners to estimate the risk of flooding when water overwhelms the usual drainage channels. Previous, less detailed climate models project a much lower increase of around two and a half times over the same period.
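The frequency statistic behind the "four times as frequent" result – how often hourly rainfall exceeds a fixed threshold such as 20mm/hr in a simulation period – is straightforward to compute. The hourly values below are invented purely for illustration:

```python
THRESHOLD_MM_PER_HR = 20.0  # planning threshold quoted in the text

def exceedance_count(hourly_rainfall_mm, threshold=THRESHOLD_MM_PER_HR):
    """Number of hours with rainfall at or above the threshold."""
    return sum(1 for r in hourly_rainfall_mm if r >= threshold)

def frequency_ratio(future, baseline, threshold=THRESHOLD_MM_PER_HR):
    """How many times more frequent exceedances are in `future`."""
    base = exceedance_count(baseline, threshold)
    return exceedance_count(future, threshold) / base if base else float("inf")

# Illustrative hourly series: one exceedance in the baseline,
# four in the "future" period.
baseline = [0.0, 3.2, 21.5, 0.4, 8.0, 1.1, 0.0, 19.9]
future = [0.0, 25.0, 22.3, 0.4, 30.1, 1.1, 20.0, 5.5]

print(frequency_ratio(future, baseline))  # -> 4.0
```

In practice this counting is done over decades of simulated hours in every 2.2km grid box, which is what makes the k-scale runs so computationally expensive.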

We note that these changes assume greenhouse gas emissions continue to rise at current rates; this is therefore a plausible but upper estimate. The changes are not inevitable: if global carbon emissions follow a lower emissions scenario, extreme rain will still increase in the UK, but at a slower rate.

The increases are significantly greater in certain regions. For example, extreme rainfall in north-west Scotland could be almost ten times more common, while it’s closer to three times more frequent in the south of the UK. The greater future increases in the number of extreme rainfall events in the higher resolution model compared with more traditional lower resolution climate models show the importance of having k-scale projections to enable society to adapt to climate change.

As the atmosphere warms, it can hold more moisture, at a rate of 7% more moisture for every degree of warming. On a simple level, this explains why in many regions of the world projections show an increase in precipitation as a consequence of human-induced climate change. This new study has shown that, in the UK, the intensity of downpours could increase by about 5% in the south and up to about 15% in the north for every degree of regional warming.
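The per-degree figures above compound for larger amounts of warming: a rate of 7% per degree means a factor of 1.07 raised to the power of the temperature change. A quick illustrative calculation (this is the standard compounding rule applied to the rates quoted in the text, not code from the study):

```python
def compound_increase(delta_t_celsius, rate_per_degree):
    """Fractional increase after warming, compounding per degree."""
    return (1 + rate_per_degree) ** delta_t_celsius - 1

# Atmospheric moisture: ~7% more per degree of warming.
print(f"moisture, 2 C warming: {compound_increase(2, 0.07):.1%}")

# Downpour intensity scalings from the study:
# ~5% per degree in the south of the UK, ~15% in the north.
print(f"southern UK downpours, 2 C: {compound_increase(2, 0.05):.1%}")
print(f"northern UK downpours, 2 C: {compound_increase(2, 0.15):.1%}")
```

So two degrees of regional warming implies roughly a 14% rise in moisture capacity, and around a 10% (south) to 32% (north) rise in downpour intensity under these scalings.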

Group of girls with an umbrella walking through a city.
The projected increase in the intensity of rainfall is significantly greater in certain regions.
NotarYES/Shutterstock

However, it is far from a simple picture of more extreme events, decade by decade, as a steadily increasing trend. Instead, we expect periods of rapid change – with records being broken, some by a considerable margin – and periods when there is a pause, with no new records set.

This is simply a reflection of the complex interplay between natural variability and the underlying climate change signal. An analogy for this is waves coming up a beach on an incoming tide. The tide is the long-term rising trend, but there are periods when there are larger waves, followed by lulls.

Despite the underlying trend, the time between record-breaking events at the local scale can be surprisingly long – even several decades.

Our research marks the first time that such a high-resolution data set has spanned over a century. As well as being a valuable asset for planners and policymakers to prepare for the future, it can also be used by climate attribution scientists to examine current extreme rainfall events and see how much more likely they have become because of human greenhouse gas emissions. The research highlights the importance of meeting carbon emissions targets and of planning for increasingly prevalent extreme rainfall events, which look highly likely, to varying degrees of intensity, under all greenhouse gas emissions scenarios.

The tendency for extreme years to cluster poses challenges for communities trying to adapt to intense downpours and risks infrastructure being unprepared, since climate information based on several decades of past observations may not be representative of the following decades.


This blog is written by Cabot Institute for the Environment member Elizabeth Kendon, Professor of Climate Science, University of Bristol. This article is republished from The Conversation under a Creative Commons license. Read the original article.

Professor Lizzie Kendon

The Horn of Africa has had years of drought, yet groundwater supplies are increasing – why?

 

Harvepino / shutterstock

The Horn of Africa – which includes Somalia, Ethiopia, Kenya and some surrounding countries – has been hit by increasingly frequent and devastating droughts. Despite this, it seems the region has an increasing amount of groundwater. And this water could help support drought-stricken rural communities.

That’s the key finding from our new research, in which we discovered that while overall rainfall is decreasing, an increase in “high-intensity” rainfall has led to more water being stored deep underground. It’s a paradoxical finding, yet one that may help one of the world’s most vulnerable regions adapt to climate change.

In the Horn of Africa, rural communities live in a constant state of water scarcity punctuated by frequent periods of food insecurity. People there rely on the “long rains” between March and May and the “short rains” between October and December to support their lives and livelihoods.

As we write this, the region’s drylands are experiencing a fifth consecutive season of below-average rainfall. This has left 50 million people in acute food insecurity. The droughts have caused water shortages, livestock deaths, crop failures, conflict and even mental health challenges.

The drought is so severe that it is even affecting zebras, giraffes and other wildlife, as all surface waters are drying up and edible vegetation is becoming scarce. Worryingly, a sixth failed rainy season has already been predicted for March to May 2023.

Long rains down, short rains up

In a new paper we investigated changes in seasonal rainfall in the Horn of Africa over the past 30 years. We found the total rainfall within the “long rains” season is declining, perhaps related to the warming of a particular part of the Pacific Ocean. However, rainfall is increasing in the “short rains”. That’s largely due to a climate phenomenon known as the Indian Ocean Dipole, when a warmer-than-usual Indian Ocean produces higher rainfall in east Africa, similar to El Niño in the Pacific.

We then investigated what these rainfall trends mean for water stored below ground. Has it decreased in line with declining “long rains”, or risen due to the increasing “short rains”?

Map of East Africa
The Horn of Africa borders the Red Sea, the Gulf of Aden and the Indian Ocean.
Peter Hermes Furian / shutterstock

To do this we made use of a pair of satellites which orbit repeatedly and detect small changes in the Earth’s gravitational field that can be interpreted as changes in the mass of water stored. If there is a significant increase in water storage underground, the satellites will record a stronger gravity field at that location compared with the previous measurement, and vice versa. From this, the mass of water added or lost at that location can be determined.
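The bookkeeping behind such estimates is simple: a gravity-derived change in "equivalent water thickness" over a region translates directly to a mass of water. The numbers below are illustrative, not values from the study:

```python
WATER_DENSITY_KG_M3 = 1000.0

def water_mass_change_gigatonnes(area_km2, thickness_change_cm):
    """Mass of water gained (+) or lost (-) over a region, in Gt."""
    area_m2 = area_km2 * 1e6
    thickness_m = thickness_change_cm / 100.0
    mass_kg = WATER_DENSITY_KG_M3 * area_m2 * thickness_m
    return mass_kg / 1e12  # 1 Gt = 1e12 kg

# e.g. a 2 cm equivalent-water-thickness gain over a
# hypothetical 1,000,000 km2 region: roughly 20 Gt of water.
print(water_mass_change_gigatonnes(1_000_000, 2.0))
```

The hard part of the real analysis is isolating the water signal from other mass changes and noise; the conversion itself is just density times area times thickness.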

Using these satellite-derived estimates, we found that water storage has been increasing in recent decades. The increase correlates with the increasing “short rains”, and has happened despite the “long rains” getting drier.

Given that the long rains deliver more seasonal rain than the short rains, we wanted to understand the paradoxical finding that underground water is increasing. A clue is given by examining how rainfall is converted into groundwater in drylands.

When rain is light and drizzly, much of the water that reaches the ground dampens the soil surface and soon evaporates back into the warm, dry atmosphere. To become groundwater, rainfall instead needs to be intense enough so that water will quickly infiltrate deep into the soil. This mostly happens when lots of rain falls at once and causes dry riverbeds to fill with water which can then leak into underground aquifers.
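The intensity-dependent partitioning described above can be caricatured with a simple threshold rule: rain on light-rain days mostly evaporates, while rain on intense days contributes to recharge. The 10mm/day cutoff and the daily totals below are invented for illustration, not values from the paper:

```python
INTENSE_THRESHOLD_MM = 10.0  # illustrative cutoff, not from the study

def groundwater_recharge_mm(daily_rainfall_mm):
    """Total rain falling on 'intense' days: a crude recharge proxy."""
    return sum(r for r in daily_rainfall_mm if r >= INTENSE_THRESHOLD_MM)

# A hypothetical week of daily totals: only the 35.0 and 12.5 mm
# days pass the intensity threshold and count towards recharge.
season = [1.2, 0.5, 35.0, 2.0, 12.5, 0.0, 3.3]
print(groundwater_recharge_mm(season))  # -> 47.5
```

Under a rule like this, a season can deliver less total rain yet more recharge, which is the heart of the paradox the study resolves.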

People stand in river, rainy sky.
Heavy rains fill a dry river bed in the Somali region of Ethiopia.
Stanley Dullea / shutterstock

These most intense rainfall events are increasing in the “short rains”, in line with the overall increase in total rain in that season. And despite a decrease in overall rainfall in the “long rains”, intense rainfall has remained consistently high over time. This means that both rainy seasons have enough intense rainfall to increase the amount of water stored underground.

Finally, we demonstrated that the increasing water storage in this region is not connected to any rise in soil moisture near the surface. It therefore represents “banked” water that resides deep below ground and likely contributes to a growing regional groundwater aquifer.

Groundwater can help people adapt to climate change

While early warning networks and humanitarian organisations focus on the urgent impacts of drought, our new research points to a silver lining that may support long-term climate adaptation. Those rising groundwater supplies we have identified may potentially be exploited to support people in rural areas whose food and water are increasingly insecure.

But there are some caveats. First, we have not assessed the depth of the available groundwater across the region, but we suggest that the water table is shallow enough to be affected by seasonal rainfall. This means it may also be shallow enough to support new bore holes to extract it. Second, we do not know anything about the quality of the stored groundwater and whether it can be deemed suitable for drinking. Finally, we do not know exactly what will happen if the most extreme droughts of the past few seasons continue and both long and short rains fail, causing intense rainfall to decrease too.

Nevertheless, our findings point to the need for extensive groundwater surveys across the Horn of Africa drylands to ascertain whether this increasing water resource may be viable enough to offset the devastating droughts. Groundwater could potentially irrigate fields and provide drinking water for humans and livestock, as part of a strategy to help this vulnerable region adapt to the effects of climate change.

————————

This blog was written by Cabot Institute for the Environment member Katerina Michaelides, Associate Professor, School of Geographical Sciences, University of Bristol; Michael Singer, Professor in Physical Geography (Hydrology and Geomorphology), Cardiff University; and Markus Adloff, Postdoctoral Researcher, Earth System Modelling, Université de Berne. This article is republished from The Conversation under a Creative Commons license. Read the original article.

World Water Day: Climate change and flash floods in Small Island Developing States

Pluvial flash flooding – that is, flash flooding caused by rain – is a major hazard globally, but a particularly acute problem for Small Island Developing States (SIDS). Many SIDS experience extreme rainfall events associated with tropical cyclones (often referred to as hurricanes), which trigger excess surface water runoff and lead to pluvial flash flooding.

Following record-breaking hurricanes in the Caribbean such as Hurricane Maria in 2017 and Hurricane Dorian in 2019, the severe risk facing SIDS has been reaffirmed and labelled by many as a sign of the ‘new normal’ under rising global temperatures. Nonetheless, the Disaster Risk Reduction community has a limited understanding both of current tropical cyclone-induced flood hazard and of how it might change under different climate change scenarios, which inhibits attempts to build adaptive capacity and resilience to these events.

As part of the first year of my PhD research, I am applying rainfall data that has been produced by Emily Vosper and Dr Dann Mitchell in the University of Bristol BRIDGE group using a tropical cyclone rainfall model. This model uses climate model data to simulate a large number of tropical cyclone events in the Caribbean, which are used to understand how the statistics of tropical cyclone-induced rainfall might change under the 1.5C and 2C Paris Agreement scenarios. This rainfall data will be input into the hydrodynamic model LISFLOOD-FP to simulate pluvial flash flooding associated with hurricanes in Puerto Rico.

Investigating changes in flood hazard associated with different rainfall scenarios will help us to understand how flash flooding, associated with hurricanes, emerges under current conditions and how this might change under future climate change in Puerto Rico. Paired with data identifying exposure and vulnerability, my research hopes to provide some insight into how flood risk related to hurricanes could be estimated, and how resilience could be improved under future climate change.

————————————-
This blog is written by Cabot Institute member Leanne Archer, School of Geographical Sciences, University of Bristol.
Leanne Archer

The East Asian monsoon is many millions of years older than we thought

Sub-tropical rainforest in China. Image credit: UMBRELLA project

The East Asian monsoon covers much of the largest continent on Earth, bringing summer rain to Japan, the Koreas and much of China. Ultimately, more than 1.5 billion people depend on the water it provides for agriculture, industry and hydroelectric power.

Understanding the monsoon is essential. That is why colleagues and I recently reconstructed its behaviour throughout its 145m-year history, in order to better understand how it acts in response to changes in geography or the wider climate in the very long term, and what that might mean for the future.

Our study, published in the journal Science Advances, indicates that the East Asian monsoon is much older and more varied than previously thought. Until quite recently the general consensus was that the monsoon came into being around 23m years ago, some time after the Tibetan Plateau was formed.

However, we show that it has been ever present for at least the past 145m years (except during the Late Cretaceous: the era of T. Rex), regardless of whether there was a Tibetan Plateau or how much CO₂ was in the atmosphere.

What is a monsoon?

At its simplest, a monsoon is a highly seasonal distribution of precipitation, leading to distinct “wet” and “dry” seasons – the word even derives from the Arabic “mausim”, translated as “season”.

The East Asian monsoon is a “sea breeze monsoon”, the most common type. They form because land and sea heat up at different rates, so high pressure forms over the sea and low pressure over land, which results in wind blowing onshore in the summer.

 

It’s the world’s largest, highest plateau.
Rashevskyi Viacheslav / shutterstock

Although the Tibetan Plateau is not strictly needed to form the East Asian monsoon, it can serve to enhance it. At 5km or more above sea level, the plateau simply sits much higher in the atmosphere and thus the air above it is heated much more than the same air would be at a lower elevation (consider the ground temperature in Tibet compared to the freezing air 5km above your head). As that Tibetan air is warmer than the surrounding cold air it rises and acts as a heat “pump”, sucking more air in to replace it and enhancing the monsoon circulation.

Changes over the (millions of) years

We found the intensity of the monsoon has varied significantly over the past 145m years. At first, it was around 30% weaker than today. Then, during the Late Cretaceous 100-66m years ago, a huge inland sea covered much of North America and weakened the Pacific trade winds. This caused the monsoon to disappear, leaving East Asia very arid.

However, rainfall patterns changed substantially after the Indian tectonic plate collided with the Asian continent around 50m years ago, forming the Himalayas and the Tibetan Plateau. As the land rose up, so did the strength of the monsoon. Our results suggest that 5-10m years ago there were “super-monsoons” with rainfall 30% stronger than today.

But how can we be sure that such changes were caused by geography, and not elevated carbon dioxide concentrations? To test this, we again modelled the climate for all different time periods (roughly every 4m years) and increased or reduced the amount of CO₂ in the atmosphere to see what effect this had on the monsoon. In general, irrespective of time period chosen, the monsoon showed little sensitivity (-1% to +13%) to changes in CO₂ compared to the impact of changes in regional geography.

Climate models are working

The monsoon in East Asia is mainly a result of its favourable geographic position and regional topography – though our work shows that CO₂ concentrations do have an impact, they are secondary to tectonics.

The past can help us better understand how the monsoon will behave as the climate changes – but it’s not a perfect analogue. Although rainfall increased almost every time CO₂ doubled in the past, each of these periods was unique and dependent on the specific geography at the time.

The reassuring thing is that climate models are showing agreement with geological data through the past. That means we have greater confidence that climate models are able to accurately predict how the monsoon will respond over the next century as humans continue to emit more CO₂ into the atmosphere.

—————————–
This blog was written by Cabot Institute member Dr Alex Farnsworth, Postdoctoral Research Associate in meteorology at the University of Bristol. This article is republished from The Conversation under a Creative Commons license. Read the original article.

Alex Farnsworth

World Water Day: How can research and technology reduce water use in agriculture?

Record breaking temperatures in 2018 led to drought in many European countries. Image credit Wikimedia Domain Mimikry11.

World Water Day draws attention to the global water crisis and addresses why so many people are being left behind when it comes to having access to safe water. The UN estimates that globally 80% of people who have to use unsafe and unprotected water sources live in rural areas. This can leave households, schools, workplaces and farms struggling to survive. On farms water is vital for the production of food and is used in a huge range of processes, including irrigation and watering livestock. In this blogpost I will lightly review the current issues around water in agriculture and highlight some exciting research projects that may offer potential solutions.

What is the water crisis?

The UN Sustainable Development Goal 6 is to ensure that all people have access to sustainable, safe water by 2030. Unfortunately, we’re a long way off achieving this goal: a recent report from UNICEF/WHO estimates that 2.1 billion people currently live without access to safe water in their homes and workplaces. Another report estimates that 71% of the global population experiences severe water scarcity during at least one month of the year. In recent years we have seen water risks increase, with severe droughts in Africa, China, Europe, India and the US. In sub-Saharan Africa, the number of record-breaking dry months increased by 50% from 1980 to 2013. Unfortunately, droughts, floods and rising sea levels are predicted to continue and become more unpredictable under climate change scenarios. And as the global population continues to grow, there will be increasing demands on water supplies. Increases in water scarcity are likely to lead to increases in political and economic instability, conflict and migration.

Why is water important to agriculture?

In agriculture, water is vital for growing crops and sustaining livestock. Farmers use water to irrigate, apply pesticides and fertilizer, and protect crops from heat and frost. This heavy reliance means that when water supplies run out, farmers are unable to effectively maintain their crops and livestock, leading to food insecurity. Drought stress can result in yield losses of 64% in rice, 50% in chickpea and 18–32% in potato. Drought has particularly devastating effects in tropical and sub-tropical regions, where climate change is predicted to have the biggest impact.

The amount of water it takes to produce food and drink products is pretty shocking. Beef production in particular is associated with high levels of water usage. It is estimated that the global average water footprint of a 150g beef burger is 2,350 litres; despite producing just 5% of the world’s food calories, beef production is reported to create 40% of the water scarcity burden. There are big variations in the environmental impacts of beef farming, however, with grassland-fed, rotational systems being less intensive than grain-fed herds on deforested land.

Where does water used for agriculture come from?

The water that is used in agriculture comes from a range of sources, including surface and ground water supplies, rivers and streams, open canals, ponds, reservoirs and municipal systems. Globally, the FAO estimates that agriculture accounts for 70% of freshwater withdrawals, which are predominantly used for irrigation. In many areas the high level of groundwater used for irrigation is unsustainable, leading to depletion. For instance, the OECD estimates that groundwater supplies 60% of India’s agricultural water needs, but groundwater sources are suffering from depletion and pollution in 60% of states. A big problem is that irrigation is often highly inefficient; in the US, the FAO estimates that only 56% of irrigated water is actually used by plants. Large amounts of energy are also needed to withdraw, treat and supply agricultural water, leading to significant greenhouse gas (GHG) emissions.

What happens to agricultural water after use?

As well as depleting freshwater supplies, agriculture can also pollute them, with runoff containing large quantities of nutrients, antibiotics, growth hormones and other chemicals. This in turn has big effects on human health, through contamination of surface and ground water with heavy metals, nitrate and pathogens, and on the environment, where it can cause algal blooms, dead zones and acidification of waterways. Combined, these issues mean that better management of water in agriculture has huge potential for improving sustainability, climate resilience and food security, while reducing emissions and pollution.

What are the potential solutions?

Thankfully there are many innovative projects that are working to improve issues around water in agriculture. Below are a few examples that I find particularly promising.

How can technology help?

To reduce water wastage on farms, agri-technology is being developed whereby multiple wireless sensors detect soil moisture and evapotranspiration. The sensors feed this information to a cloud-based system that automatically determines precisely how much water to use in different parts of the field, leading to increased yields and saving water. Farmers can get water management recommendations via a smartphone app and the information automatically instructs irrigation systems. At a larger scale, these data systems can feed into a regional crop water demand model to inform decision-making on agricultural policies and management practices, and to provide early warnings of potential flood and drought risks.
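The decision step described here – turning soil moisture and evapotranspiration readings into a per-zone watering amount – might look something like the sketch below. The thresholds, formula and zone names are all illustrative assumptions, not the logic of any real system:

```python
def irrigation_recommendation_mm(soil_moisture_pct, evapotranspiration_mm,
                                 target_moisture_pct=30.0,
                                 mm_per_pct_deficit=0.5):
    """Recommended irrigation depth (mm) for one field zone."""
    deficit = max(0.0, target_moisture_pct - soil_moisture_pct)
    # Replace what the crop lost to evapotranspiration, plus enough
    # water to close the soil-moisture deficit.
    return round(evapotranspiration_mm + deficit * mm_per_pct_deficit, 1)

# One (moisture %, evapotranspiration mm) reading per hypothetical
# field zone: the drier "north" zone gets more water.
zones = {"north": (22.0, 4.5), "south": (31.0, 5.0)}
for name, (moisture, et) in zones.items():
    print(name, irrigation_recommendation_mm(moisture, et), "mm")
```

In a real system the readings would stream in from the wireless sensors, and the per-zone outputs would drive the irrigation hardware or a farmer-facing app.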

Sensor that detects leaf moisture levels. Image credit: Massimiliano Lincetto, Wikimedia Commons

Irrigation systems are also being made more efficient; one study found that simply changing from surface sprinklers to drip irrigation, which applies water directly to plant roots through low-pressure piping, reduced non-beneficial water wastage by 76% while maintaining yields. In arid areas these systems can be used for a technique called partial root drying, whereby water is supplied to alternate sides of the roots; the water-stressed side then sends signals to close stomatal pores, which reduces water lost through evapotranspiration.

These efficient precision irrigation systems are becoming cheaper and easier for farmers to use. However, in tropical and sub-tropical areas the technology can be difficult to apply to smallholder farming, where there is often insufficient Internet connectivity, expertise, capital investment, and supply of energy and water. Several precision agriculture projects are working to overcome these challenges and promote efficient use of irrigation water, including in the semi-arid Pavagada region of India, the Gash Delta region of Sudan and São Paulo, Brazil. In Nepal, a Water Resources Information System has been established that collects data to inform river management, while in Bangladesh hundreds of solar-powered irrigation pumps have been installed, simultaneously reducing reliance on fossil fuels and cutting GHG emissions.

Hydroponic systems, whereby plants are grown in water containing nutrients, are becoming increasingly popular; the global market for hydroponics is projected to reach £325 million by 2020. Compared with land-based agriculture, hydroponics uses less land and causes less pollution and soil erosion, making these systems less vulnerable to climate change. Critically, they also reduce water use: once the initial water requirements are met, the closed system recycles water and there is less evapotranspiration. The adoption of these systems is predicted to occur predominantly in water-stressed regions of the Middle East and Africa and in highly urbanised countries such as Israel, Japan and the Netherlands.

How can researching traditional approaches help?

It’s not just about agri-tech; there are relatively simple, traditional ways to tackle water issues in agriculture. To protect against drought, farmers can harvest and store rainwater during heavy downpours by building ponds and storage reservoirs. To reduce water wastage, farmers can improve the ability of soil to absorb and hold water through reducing tillage and using rotational livestock grazing, compost, mulch and cover crops. Wetlands, grasslands and riparian buffers can be managed to protect against floods, prevent waterlogging of crops and improve water quality. Increasingly, these traditional methods are being valued, and research is being done to optimise them. For instance, a novel forage grass hybrid has been developed that is more resilient to water stress and can reduce runoff by 43–51% compared with conventional grass cultivars.

A small-scale farmer in Kenya harvesting rainwater. Image credit: Timothy Mburu, Wikimedia Commons.

How can crop and livestock breeding help?

In the past, crop and livestock varieties have been selected for high productivity. However, these varieties are often severely affected by changes in climate and extreme weather events such as drought, and require high levels of water and nutrients. To improve resilience and sustainability, breeders increasingly need to also select for stress responses and resource use efficiency. In crops, drought resilience and water use efficiency are influenced by many traits, including root and shoot architecture, stomatal density and the thickness of the waxy cuticle that covers leaves and reduces evapotranspiration. The complexity of these traits makes breeding crops for drought resilience challenging, as many different groups of genes need to be selected for. To deal with this, the International Rice Research Institute’s Green Super Rice project has been crossing high-yielding parent lines with hundreds of diverse varieties to produce new high-yielding varieties that require less water, fertilisers and pesticides. These varieties are now being delivered to farmers in countries across Asia and Africa. Similarly, climate change resilience is also vital for current and future livestock farming. Projects run by Professor Eileen Wall (SRUC) have identified novel traits and genes associated with drought and heat resilience in UK and African dairy cattle, which can be incorporated into breeding programmes.

What are the incentives?

Although these projects might sound promising, without incentives to drive their uptake it may take a long time for real impacts to come to fruition. Unfortunately, in some countries such as India there can be a lack of monetary incentives that would effectively enable farmers to take up new water management technology and practices. In the EU, the Common Agricultural Policy (CAP) has allocated funds to support farmers in complying with ‘greening rules’ that improve sustainability, preserve ecosystems and promote efficient use of natural resources, including water. Farmers across the EU receive CAP payments for environmentally friendly farming practices, such as crop diversification and maintaining permanent grassland.

In many European countries, there is increasing consumer demand for sustainably farmed food products. This is driving large and small manufacturers to seek out sustainable suppliers, and so farmers are incentivised to improve the sustainability of their farming practices so that they can be certified. For instance, the Sustainable Farming Assurance Programme requires farmers to follow good agricultural and environmental protection practices, including sustainable water use. In the coming years, more food products are likely to carry water footprint labels that tell the consumer how much water was used during production and processing. This places considerable power in the hands of the consumer, and large manufacturers are responding. For instance, Kellogg has pledged that by 2020 it will buy ten priority ingredients (corn, wheat, rice, potatoes, sugar and cocoa) only from farms that prioritise protecting water supplies, as well as using fertilisers safely, reducing emissions and improving soil health. And PepsiCo has created sustainable agriculture sourcing programmes that aim to help farmers improve water and soil resource management, protect water supplies, minimise emissions and improve soil health.

What can we do?

There are ways to take responsibility for reducing our own water footprints, including reducing consumption of meat and animal products, reducing food wastage and buying sustainably farmed products. Finally, we can all get involved with communicating and promoting the importance of water in agriculture so that more people are aware of the issues. Head to the World Water Day website to find out about resources and events that may be happening near you.

——————————
This blog is written by Caboteer Dr Katie Tomlinson, who recently completed her PhD at the University of Bristol on cassava brown streak disease. Katie is now an Innovation and Skills manager at the BBSRC and is running the Sustainable Agriculture Research and Innovation Club. Views presented in this blog are her own. You can follow Katie on Twitter: @KatieTomlinson4.

Dr Katie Tomlinson

 

Play stops rain: could ‘cloud seeding’ deliver perfect Wimbledon weather?

Image credit: Carine06, Wikimedia Commons

Wimbledon, 2026. Bright blue skies and a wonderful late afternoon sun lights up the lush green grass of centre court. Out strides the British number one and four-time winner, Andy Henman, to the cheers of the excitable, partisan crowd.

Somewhere nearby, at the headquarters of WeatherMod Inc, a group of technicians are busily checking data, confident that their efforts have worked. They have been in contact with two pilots who have just completed their spray sorties and are returning to land at Heathrow’s new third runway. Thanks to the delivery into clouds upwind of London of 4kg of silver iodide (AgI), a powder that is yellowish in its pure form, it is now raining over Salisbury Plain, 100 miles away, and the rain predicted for later in SW19 is now 92% less likely.

This scenario probably sounds a little far-fetched, not least the bit about the repeatedly successful home-grown tennis player. However, weather modification occurs more often than most people are aware. For example, as I wrote that first paragraph I genuinely didn’t realise that a Weather Modification Incorporated actually already exists in Fargo, North Dakota. They, and other companies like them, have sprung up over the past few years, promising to manage water for crops, clear fog and even protect wedding days from ill-timed hail.

But two questions need further investigation to consider the likelihood of the above scenario at Wimbledon: can we do it (that is, does it work) and should we do it? Neither, it turns out, is particularly easy to answer.

Changing the weather

In order to make rain several processes need to occur. First, small particles known as cloud condensation nuclei (CCN) are required onto which water can condense. Then these droplets need to grow to a size where they precipitate out of the cloud, finally falling where and when required.

In our hypothetical scenario we would therefore need to be able to either control, or at least predict accurately, the concentration of CCN, the rate at which droplets form, and the evaporation rates within the clouds. We’d also need some handle on the rate and direction in which rain would fall.

Silver iodide dumped into a cloud attracts water, which turns into rain.
Smcnab386 / wiki, CC BY-SA

In reality, cloud seeding with AgI – the current default option – only really tackles the first of these processes, forming the condensation nuclei. Even if clouds are seeded, it is still a matter of debate as to whether they actually create much additional rain. While companies claim success, some scientists are more wary. Although other seeding agents (and methodologies) exist, it is worth noting that, in the case of AgI, the nature of the clouds into which the particles are injected will govern the outcome.

Seeding works best in clouds which have a pre-existing mixture of water droplets and ice, as this type of nucleation requires ice crystals to form. Following the production of CCN, we’d then need to be able to predict, through computer modelling, how small droplets will form into rain and eventually fall.

One of the major drawbacks of cloud seeding is a lack of proof that it works: given weather forecasting remains imperfect, how would you know what would have happened without intervention? The second part of the question is arguably even harder to approach. What are the ethics of removing water from one part of the world, even on a small scale, and moving it somewhere else? Is this “messing with nature” or “playing God”? Water is, after all, the most precious commodity on Earth.
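This "would it have rained anyway?" problem is exactly what randomised seeding trials try to address: seed some candidate clouds, leave others alone, and test whether the difference in rainfall is larger than chance would produce. Here is a small sketch of such an evaluation using a permutation test; the rainfall figures are entirely made up for illustration.

```python
# Illustrative evaluation of a randomised seeding trial: a permutation
# test on rainfall from hypothetical seeded vs unseeded (control) clouds.
import random

seeded  = [5.1, 6.3, 4.8, 7.0, 5.9, 6.5]   # mm of rain, made-up values
control = [4.2, 5.0, 4.6, 5.4, 4.9, 4.4]

# Observed difference in mean rainfall between the two groups.
observed = sum(seeded) / len(seeded) - sum(control) / len(control)

# Shuffle the pooled data many times: how often does a random split
# produce a difference at least as large as the one we observed?
random.seed(0)
pooled, n = seeded + control, len(seeded)
trials, extreme = 10_000, 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = sum(pooled[:n]) / n - sum(pooled[n:]) / n
    if diff >= observed:
        extreme += 1
p_value = extreme / trials
print(f"mean difference {observed:.2f} mm, p = {p_value:.3f}")
```

A small p-value would suggest seeding had an effect; in practice, real trials need many more clouds than this, which is one reason the debate has run for decades.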

Let’s assume for now that it is possible to alter local weather patterns and to prevent or cause rain. This could be used for both good and evil, and the potential for abuse is worth considering. While manipulating the weather as a weapon is now explicitly outlawed by the UN’s ENMOD treaty, there have been efforts to alter the outcome of conflict using cloud seeding.

‘Operation Popeye’: the US used cloud-seeding to extend the monsoon season during the Vietnam war, causing delays on the waterlogged Ho Chi Minh Trail.
manhhai, CC BY

Deliberate and accidental effects from commercial activity also seem possible. That dreamy, rain-free wedding ordered up by an anxious billionaire could easily ruin a school sports day in a nearby town.

The question of attribution is possibly the most challenging. Without any alternative outcomes to analyse, how can you really know what the impacts of your actions are? Some even claim that cloud seeding experiments caused floods, such as those that killed 35 people in the English village of Lynmouth in 1952, although expert opinion leans strongly against that idea. Nonetheless, conspiracy theories persist. If, in our hypothetical Wimbledon scenario, bits of Wiltshire flooded, who would foot the bill?

It’s certainly possible in theory to prevent rain in one place by using cloud seeding to induce it in another, upwind. But there are huge challenges and the jury is still out about whether such efforts really work.

There are some very good causes, such as inducing rainfall in Sub-Saharan Africa during drought, where I would sanction intervention. For something as frivolous as a sporting event I feel differently. Just last weekend I played cricket for four hours in unrelenting drizzle (thanks Skip). While not a massively enjoyable experience it was at least familiar, and is part of the essence of both cricket and tennis. There’s some comfort in that.

—————————————————-

This blog is by Matthew Watson, Reader in Natural Hazards at the University of Bristol.
This article was originally published on The Conversation. Read the original article.

Matthew Watson

Why do flood defences fail?

More than 40,000 people were forced to leave their homes after Storm Desmond caused devastating floods and wreaked havoc in north-west England. Initial indications were that the storm may have caused the heaviest local daily rainfall on record in the UK. As much as £45m had been spent on flood defences in the region over the previous ten years, and yet the rainfall still proved overwhelming. So what should we actually expect from flood defence measures in this kind of situation? And why do they sometimes fail?

We know that floods can and will happen. Yet we live and work and put our crucial societal infrastructure in places that could get flooded. Instead of keeping our entire society away from rivers and their floodplains, we accept flood risks because living in lowlands has benefits for society that outweigh the costs of flood damage. But knowing how much risk to take is a tricky business. And even when there is an overall benefit for society, the consequences for individuals can be devastating.

We also need to calculate risks when we build flood defences. We usually protect ourselves from some flood damage by building structures like flood walls and river or tidal barriers to keep rising waters away from populated areas, and storage reservoirs and canals to capture excess water and channel it away. But these structures are only designed to keep out the waters of typical-sized floods. Bigger defences that could protect us from the largest possible floods, which may only happen once every 100 years, would be much more expensive to build, and so we choose to accept the residual risk rather than bear those costs.
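It is worth being clear what "once every 100 years" means: a 1% chance in any given year, which adds up quickly over a defence's lifetime. A short calculation makes the point (the horizons chosen are illustrative):

```python
# Chance of seeing at least one "T-year" flood over an n-year horizon:
# P = 1 - (1 - 1/T)**n

def exceedance_prob(return_period_years, horizon_years):
    return 1 - (1 - 1 / return_period_years) ** horizon_years

# A "100-year" flood is far from unlikely over a defence's design life.
for horizon in (30, 50, 100):
    print(f"{horizon} years: {exceedance_prob(100, horizon):.0%}")
```

Over a 100-year horizon the chance of at least one such flood is roughly two in three, which is why "protected to the 1-in-100-year standard" never means "protected".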

Balancing the costs and benefits

In the UK, the Environment Agency works with local communities to assess the trade-off between the costs of flood protection measures and the benefits of avoiding flood damage. We can estimate the lifetime benefits of different proposed flood protection structures in the face of typical-sized floods, as well as the results of doing nothing. On the other side of the ledger, we can also estimate the structures’ construction and maintenance costs.
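The arithmetic behind such an appraisal can be sketched in a few lines. A common approach is to estimate expected annual damage (EAD) as the area under a damage versus annual-exceedance-probability curve, with and without the scheme; all the damage and cost figures below are invented for illustration, not real appraisal data.

```python
# Sketch of flood-scheme cost-benefit arithmetic with made-up figures.
# Expected annual damage (EAD) approximates the area under the
# damage vs annual-exceedance-probability curve.

def expected_annual_damage(points):
    """points: list of (annual exceedance probability, damage in £m),
    ordered from frequent/small floods to rare/large ones."""
    ead = 0.0
    for (p1, d1), (p2, d2) in zip(points, points[1:]):
        ead += (p1 - p2) * (d1 + d2) / 2  # trapezoidal slice
    return ead

without_scheme = [(0.5, 0.0), (0.1, 5.0), (0.01, 40.0)]
with_scheme    = [(0.5, 0.0), (0.1, 0.0), (0.01, 25.0)]  # wall holds to 1-in-100

annual_benefit = (expected_annual_damage(without_scheme)
                  - expected_annual_damage(with_scheme))
lifetime_benefit = annual_benefit * 50   # 50-year appraisal, no discounting
cost = 15.0                              # £m, build plus maintenance
print(f"benefit-cost ratio: {lifetime_benefit / cost:.1f}")
```

Real appraisals discount future benefits and carry large uncertainties on every number, but the shape of the decision, benefits avoided per pound spent, is the same.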

In some cases, flood protection measures can be designed so that if they fail, they do the least damage possible, or at least avoid catastrophic damage. For example, a flood protection wall can be built so that if flood waters run over it they run into a park rather than residential streets or commercial premises. And secondary flood walls outside the main wall can redirect some of the overflow back towards the river channel.

 

Thames Barrier: big costs but bigger benefits.
Ross Angus/Flickr, CC BY-SA

The Environment Agency puts the highest priority on the projects with the largest benefits for the smallest costs. Deciding where that threshold should be set is a very important social decision, because it provides protection to some but not all parts of our communities. Communities and businesses need to be well-informed about the reasons for those thresholds, and their likely consequences.

We also protect ourselves from flood damage in other ways. Zoning rules prevent valuable assets such as houses and businesses being built where there is an exceptionally high flood risk. Through land management, we can choose to increase the amount of wooded land, which can reduce the impact of smaller floods. And flood forecasting alerts emergency services and helps communities rapidly move people and their portable valuables out of the way.

Always some risk

It’s important to realise that since flood protection measures never eliminate all the risks, there are always extra costs on some in society from exceptional events such as Storm Desmond, which produce very large floods that overwhelm protection measures. The costs of damage from these exceptional floods are difficult to estimate. Since these large floods have been rare in the past, our records of them are very limited, and we are not sure how often they will occur in the future or how much damage they will cause. We also know that the climate is changing, as are the risks of severe floods, and we are still quite uncertain about how this will affect extreme rainfall.

 

At the same time we know that it’s very hard to judge the risk from catastrophic events. For example, we are more likely to be afraid of catastrophic events such as nuclear radiation accidents or terrorist attacks, but do not worry so much about much larger total losses from smaller events that occur more often, such as floods.

Although the process of balancing costs against benefits seems clear and rational, choosing the best flood protection structure is not straightforward. Social attitudes to risk are complicated, and it’s difficult not to be emotionally involved if your home or livelihood is at risk.

————————————–
This blog is written by Cabot Institute member Dr Ross Woods, a Senior Lecturer in Water and Environmental Engineering, University of Bristol.  This article was originally published on The Conversation. Read the original article.

Ross Woods