Downhill all the way: Monitoring landslides using geophysics

Developments in geophysical methods used to monitor surface and subsurface changes prior to landslides can lead to improved prediction and early warning.

 

Every year, landslides cause fatalities and destruction in locations worldwide. Nevertheless, what triggers them and when they occur can often be difficult to predict. A recent article in Reviews of Geophysics examined developments in landslide monitoring using insights and methods from geophysics. Here, one of the authors of the paper answers some questions about landslide monitoring and prediction.

Why is the monitoring of landslides important, and what role can geophysics play?

Sometimes the most effective option for mitigating the risk from landslides is monitoring.

In an ideal world, we would have the geotechnical and financial resources to be able to remove all landslide hazards through slope stabilization and remediation. In reality, this just isn’t possible, and sometimes the most effective option for mitigating the risk from a landslide is to monitor it.

Historically, this has been done by monitoring deformation at the surface and by looking at changes from localised points in the subsurface; for example, by measuring fluctuations in the water table in a borehole. Variations in these data may provide clues about when slope failure is imminent.
The advantage of geophysical methods is that they can not only monitor subsurface properties and how they change over time but can also do so at much higher spatial resolution and over a wider area than point sources of information, such as boreholes.

What are the different types of landslides and why are geophysical methods particularly useful for monitoring “moisture-induced” landslides?

“Landslide” is one of those words that sounds simple enough to define but in reality is very complex.

One of the distinctions we can make between landslide types is their triggering mechanism: most landslides are triggered directly by increased rainfall or earthquake shaking, but they can also result from secondary factors such as deforestation.


Between 2007 and 2016, 83% of landslides globally were triggered by rainfall or other hydrological events. This is why we use the term “moisture-induced” in our review article, as it reflects the complicated nature of all sources of water present in landslide systems, including rainfall, snow-melt, and groundwater, amongst others.

Introducing increased amounts of water into a landslide changes the properties of the subsurface, which leads to destabilization and, when a critical threshold is exceeded, slope failure. These changes in material properties can be monitored by geophysical methods and, by comparing data collected over time, it is possible to make inferences about the destabilizing processes that are occurring in the subsurface of the landslide system.
Changes in subsurface ground moisture derived from a semi-permanent, 3D electrical resistivity (ER) array at the Hollin Hill Landslide Observatory, North Yorkshire, UK. The left image shows wet winter conditions, in which the western lobe of the landslide has significantly more subsurface moisture than the eastern lobe. The right image shows drier summer conditions, showing subsurface drainage from the failing Whitby Mudstone Formation to the underlying Staithes Sandstone Formation, despite dry ground at the surface of the landslide. Credit: Uhlemann et al. [2017], Figure 11
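Petrophysical relationships such as Archie's law are one common way to translate measured resistivity into an estimate of subsurface moisture. The sketch below is purely illustrative: the parameter values are invented for demonstration and are not taken from the Hollin Hill study.

```python
# Illustrative use of Archie's law to convert bulk resistivity to water
# saturation. All parameter values here are hypothetical, chosen only to
# demonstrate the calculation.

def archie_saturation(rho_bulk, rho_water, porosity, a=1.0, m=2.0, n=2.0):
    """Invert Archie's law, rho_bulk = a * rho_water * phi**-m * Sw**-n,
    for the water saturation Sw (0..1)."""
    sw = (a * rho_water / (rho_bulk * porosity ** m)) ** (1.0 / n)
    return min(sw, 1.0)  # saturation cannot exceed 1

# A drop in bulk resistivity between two surveys implies wetter ground.
winter = archie_saturation(rho_bulk=20.0, rho_water=2.5, porosity=0.4)
summer = archie_saturation(rho_bulk=60.0, rho_water=2.5, porosity=0.4)
print(f"winter Sw = {winter:.2f}, summer Sw = {summer:.2f}")
```

Comparing such saturation estimates from repeat surveys is the basic idea behind time-lapse moisture imaging like the figure above.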

 

What different geophysical methods are used to gather information about moisture-induced landslides?

The majority of studies used passive seismic and active geoelectrical methods.

Our review article looks at published case studies from the past 12 years to see what kinds of methods are being applied to monitor moisture-induced landslides. What struck us was that the majority of studies used one of two methods: passive seismic and active geoelectrical methods.


Passive seismic monitoring has been used for many decades in global seismological studies, but really only started to be scaled down to look at smaller scale features, such as landslides, in the mid-1990s.

Although passive seismic monitoring has been around longer, monitoring landslides using active geoelectrical methods, primarily electrical resistivity (ER), has really taken off in the last decade or so. There have been several studies in which ER technologies have been developed specifically for landslide monitoring. Consequently, ER monitoring is currently able to provide more information than passive seismic monitoring on the pre-failure conditions of landslides.
Lower equipment costs and power consumption, combined with better data management and equipment durability, mean we can collect more geophysical data for longer from landslides. Each of the points in this plot shows information gathered from published case studies about the length of time and amount of data acquired during a single geophysical monitoring campaign. Multiannual campaigns are becoming increasingly common compared to nearly a decade ago. Credit: Whiteley et al. [2018], Figure 6

 

What do these methods tell us about the subsurface conditions of landslides?

The two approaches provide an opportunity to better understand the variable nature of the subsurface in time and space.

Passive seismic and active geoelectrical approaches complement each other very well. First, they tell us about different aspects of the subsurface conditions beneath a landslide. Seismic methods are able to tell us about the strength of the ground, while ER methods provide information about subsurface moisture dynamics. Both of these aspects are very important when trying to predict landslide movements.


Second, passive approaches tend to have great temporal resolution, but their spatial coverage can be limited by the number of seismic sensors deployed on a slope, usually due to cost or power requirements. On the other hand, ER methods can provide very high spatial resolution, but as they are dependent on collecting a set of data from many measurements, their temporal resolution can be limited. Together, the two approaches provide an opportunity to better understand the variable nature of the subsurface in time and space.

What advances in equipment and data analysis have improved understanding of landslide processes?

The financial, computational, and energy cost of equipment is continually reducing, which means we can collect more data for longer periods, and send data from the field to the lab for near real-time analysis.

The financial, computational, and energy cost of equipment is continually reducing, which means we can collect more data for longer periods. Also, data telemetry means we can send data from the field to the lab for near real-time analysis. Both of these are crucial when using geophysical methods for early warning of landslide failure.

Recently, there has been an increase in the use of 3D surveys and of petrophysical relationships linking geophysical measurements to subsurface properties.

In ER monitoring, movements of the electrode array would historically have produced errors in the resistivity model, but developments in ER data inversion can now use this source of “error” to track movements in the landslide. Similarly, seismic “ambient noise” is being used in innovative ways to monitor landslides, even though these background signals would traditionally have been undesirable in seismological surveys.
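The core idea behind ambient-noise monitoring is that a change in seismic velocity within a slope shifts the arrival time of energy in the cross-correlation between two sensors. The toy sketch below detects such a lag between a "reference" and a "current" correlation function using entirely synthetic data; real dv/v workflows use more sophisticated stretching or moving-window cross-spectral methods.

```python
# Toy illustration of lag detection between two correlation functions,
# the building block of ambient-noise velocity monitoring. Synthetic data.
import numpy as np

def lag_samples(reference, current):
    """Lag (in samples) that best aligns `current` with `reference`."""
    xc = np.correlate(current, reference, mode="full")
    return int(np.argmax(xc)) - (len(reference) - 1)

fs = 100.0                      # sampling rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
reference = np.exp(-3 * t) * np.sin(2 * np.pi * 8 * t)   # synthetic coda
current = np.roll(reference, 5)                          # 5-sample delay

print(lag_samples(reference, current))   # detects the 5-sample lag
```

In practice this kind of measured delay, tracked over weeks or months, can reveal the gradual weakening of a slope before failure.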

Left: The “Automated time-Lapse Electrical Resistivity” (ALERT) geoelectrical monitoring system installed at Hollin Hill, North Yorkshire, UK. Right: Inside the cabinet, the system acquires geoelectrical, geotechnical and weather data. Collecting geophysical measurements alongside local displacement and environmental data allows for more robust interpretations of the changes in subsurface geoelectrical data over time. Credit: British Geological Survey

Where is the field of geophysical monitoring of moisture-induced landslides heading?

The challenge now is to start looking for clues to identify precursory conditions to slope failure and to develop geophysical thresholds to inform early-warning approaches. 

The great news is that this is a very active area of research! There is a lot of work being done in environmental seismology to increase the number of low-cost, low-power seismic sensors that can be deployed in landslide settings. This is important, as it will allow us to monitor landslides at very high resolution in both the spatial and temporal domains.

Looking to the future, one can envision “smart sensor” sites that provide power, data storage, and telemetry, accommodating a wide range of integrated geophysical, geotechnical, and environmental monitoring methods. These could include seismic and electrical arrays, wireless sensor networks, and weather stations, with data relayed back to central processing sites for near-real-time assessment and early warnings of impending failure based on calibrated geophysical thresholds.
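A threshold-based check of the kind such a “smart sensor” site might run can be sketched in a few lines. The metric names and threshold values below are illustrative assumptions, not calibrated quantities from any real system.

```python
# Hypothetical sketch of a threshold-based early-warning check: compare the
# latest observations against calibrated thresholds and flag exceedances.
# All metric names and values are invented for illustration.

ALERT_THRESHOLDS = {
    "resistivity_drop_pct": 25.0,   # relative drop vs. seasonal baseline
    "seismic_noise_ratio": 3.0,     # current / background amplitude ratio
}

def check_alerts(readings):
    """Return the names of any metrics that meet or exceed their threshold."""
    return [name for name, value in readings.items()
            if value >= ALERT_THRESHOLDS.get(name, float("inf"))]

latest = {"resistivity_drop_pct": 31.2, "seismic_noise_ratio": 1.4}
print(check_alerts(latest))   # ['resistivity_drop_pct']
```

A real system would of course calibrate such thresholds against site history and combine multiple lines of evidence before issuing a warning.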

———————————
This blog was written by Cabot Institute for the Environment member James Whiteley, postgraduate researcher at the University of Bristol’s School of Earth Sciences and the British Geological Survey, with contributions from the article’s co-authors. The blog was originally published by Editors’ Vox.


Original blog citation: Whiteley, J. (2019), Downhill all the way: monitoring landslides using geophysics, Eos, 100, https://doi.org/10.1029/2019EO111065

Participating and coaching at a risk communication ‘pressure cooker’ event

Anna Hicks (British Geological Survey) and BUFI Student (University of Bristol) Jim Whiteley reflect on their experiences as a coach and participant of a NERC-supported risk communication ‘pressure cooker’, held in Mexico City in May.

Jim’s experience….

When the email came around advertising “the Interdisciplinary Pressure Cooker on Risk Communication that will take place during the Global Facility for Disaster Reduction and Recovery (GFDRR; World Bank) Understanding Risk Forum in May 2018, Mexico City, Mexico” my thoughts went straight to the less studious aspects of the description:

‘Mexico City in May?’ Sounds great!
‘Interdisciplinary risk communication?’ Very à la mode! 
‘The World Bank?’ How prestigious! 
‘Pressure Cooker?’ Curious. Ah well, I thought, I’ll worry about that one later…

As a PhD student using geophysics to monitor landslides at risk of failure, communicating that risk to non-scientists isn’t something I am forced to think about too often. This is paradoxical, as the risk posed by these devastating natural hazards is the raison d’être for my research. As a geologist and geophysicist, I collect numerical data from soil and rocks, and try to work out what this tells us about how, or when, a landslide might move. Making sense of those numbers is difficult enough as it is (three and a half years’ worth of difficult to be precise), but the idea of having to take responsibility for, and explain, how my research might actually benefit real people in the real world? Now that’s a daunting prospect to confront.

However, confront that prospect is exactly what I found myself doing at the Interdisciplinary Pressure Cooker on Risk Communication in May this year. The forty-odd attendees at the pressure cooker were divided into teams; our team was made up of people working or studying in a staggeringly wide range of areas: overseas development in Africa, government policy in the US, town and city planning in Mexico and Argentina, disaster risk reduction (DRR) in Colombia, and of course, yours truly, the geophysicist looking at landslides in Yorkshire.

Interdisciplinary? Check.

One hour before the 4am deadline.

The possible issues to be discussed were as broad as overfishing, seasonal storms, population relocation and flooding. My fears were alleviated slightly when I found that our team was going to be looking at hazards related to ground subsidence and cracking. Easy! I thought smugly. Rocks and cracks, the geologists’ proverbial bread and butter! We’ll have this wrapped up by lunchtime! But what was the task? Develop a risk communication strategy, and devise an effective approach to implementing this strategy, aimed at a vulnerable target group living in the district of Iztapalapa in Mexico City, a district of 1.8 million people. Right.

Risk communication? Check.

It was around this time I realised that I had glossed over the most imperative part of the email that had been sent around so many months before: ‘Pressure Cooker’. It meant exactly what it said on the tin: a high-pressure environment in which something, in this case a ‘risk communication strategy’, needed to be cooked up quickly. Twenty-four hours quickly, in fact. There would be a brief break circa 4am when our reports would be submitted, and then presentations were to be made to the judges at 9am the following morning. I checked the time. Ten past nine in the morning. The clock was ticking.

Pressure cooker? Very much check.

Anna’s experience….

What Jim failed to mention up front is it was a BIG DEAL to win a place in this event. 440 people from all over the world applied for one of 35 places. So, great job Jim! I was also really grateful to be invited to be a coach for one of the groups, having only just ‘graduated’ out of the age bracket to be a participant myself! And like Jim, I too had some early thoughts pre-pressure cooker, but mine were a mixture of excitement and apprehension in equal measures:

‘Mexico City in May?’ Here’s yet another opportunity to show up my lack of Spanish-speaking skills…
‘Interdisciplinary risk communication?’ I know how hard this is to do well…
‘The World Bank?’ This isn’t going to be your normal academic conference! 
‘Pressure Cooker?’ How on earth am I going to stay awake, let alone maintain good ‘coaching skills’?!

As an interdisciplinary researcher working mainly in risk communication and disaster risk reduction, I was extremely conscious of the challenges of generating risk communication products – and doing it in 24 hours? Whoa. There is a significant lack of evidence-based research about ‘what works’ in risk communication for DRR, and I knew from my own research that it was important to include the intended audience in the process of generating risk communication ‘products’. I need not have worried though. We had support from in-country experts who knew every inch of the context, so we felt confident we could make our process and product relevant and salient for the intended audience. This in part was also down to the good relationships we quickly formed in our team, crafted from patience, a desire and ability to listen to each other, and an unwavering enthusiasm for the task!

The morning after the night before.

So we worked through the day and night on our ‘product’ – a community-based risk communication strategy aimed at women in Iztapalapa, with the aim of fostering a community of practice through ‘train the trainer’ workshops and the integration of art and science to identify and monitor ground cracking in the area.

The following morning, after only a few hours’ sleep, the team delivered their presentation to fellow pressure-cooker participants, conference attendees, and importantly, representatives of the community groups and emergency management teams in the geographical areas on which our task was focused. The team did so well and presented their work with confidence, clarity and bags of the one thing that got us through the whole pressure cooker: good humour.

It was such a pleasure to be part of this fantastic event and meet such inspiring people, but the icing on the cake was being awarded ‘Best Interdisciplinary Team’ at the awards ceremony that evening. ‘Ding’! Dinner served.

—————
This blog has been reposted with kind permission from James Whiteley. View the original blog on BGS Geoblogy. This blog was written by James Whiteley, a geophysicist and geologist at the University of Bristol, hosted by the British Geological Survey, and Anna Hicks from the British Geological Survey.

Deploying and Servicing a Seismic Network in Central Italy

From a scientific point of view, the seismicity that is hitting Central Italy presents an unmissable opportunity for seismologists to analyse the triggering and evolution of an earthquake sequence. From the tens of instruments installed in the affected area, a huge amount of data is being collected. Such a well-recorded sequence will allow us to produce a comprehensive seismic catalogue of events. Using this large quantity of data, new algorithms will be developed and tested for the characterisation of even the smallest earthquakes. Moreover, they will enable the validation of more accurate and testable statistical and physics-based forecast models, which is the core objective of my Ph.D. project.
Seismicity map of the Amatrice-Norcia sequence updated 5 November 2016.
The Central Apennines are one of the most seismically hazardous areas in Italy and in Europe. Many destructive earthquakes have occurred throughout this region in the past, most recently the 2009 MW = 6.4 L’Aquila event. On August 24th, just 43 km north of the 2009 epicentre, an earthquake of magnitude 6.0 occurred and devastated the villages of Amatrice and Accumoli, leading to 298 fatalities, hundreds of injuries and tens of thousands of people affected. The mainshock was followed, in under an hour, by a MW = 5.4 aftershock. Two months later, on October 26th, the northern sector of the affected area was struck by two earthquakes of magnitude 5.4 and 5.9, respectively, with epicentres near the village of Visso. To make things even worse, on October 30th the city of Norcia was hit by a magnitude 6.5 mainshock, which has been the biggest event of the sequence to date and the strongest earthquake in Italy in the last 36 years. Building collapse and damage were severe in many villages, and many historical heritage buildings suffered irreparable damage, such as the 14th century St. Benedict cathedral. Luckily, there have been no further fatalities since the very first event of August 24.
St. Benedict cathedral (Norcia), erected in the late 14th century and completely destroyed after the Mw 6.5 earthquake of October 30th.
Immediately after the first big event, an emergency scientific response team was formed by the British Geological Survey (BGS) and the School of GeoSciences at the University of Edinburgh, to support the rapid deployment of high-accuracy seismometers in collaboration with the Istituto Nazionale di Geofisica e Vulcanologia (INGV). The high detection capabilities made possible by such a dense network will let us derive a seismic catalogue with great regional coverage and improved magnitude sensitivity. This new, accurate catalogue will be crucial in developing operational forecast models. The ultimate aim is to understand the potential migration of seismic activity to neighbouring faults as well as the anatomy of the seismogenic structure, and to shed light on the underlying physical processes that produce the hazard.
Thanks to the quick response of the Natural Environment Research Council (NERC) and SEIS-UK, 30 broadband stations were promptly dispatched from Leicester and arrived in Rome in less than 48 hours. There, a group of 9 people composed of INGV and BGS seismologists, technicians and Ph.D. students (including myself) from the University of Bristol, the Dublin Institute for Advanced Studies (DIAS) and the University of Ulster was ready to travel across the Apennines to deploy this equipment. The first days in Rome were all about planning; the location of each station was carefully decided so as to integrate the existing Italian permanent and temporary networks in the most appropriate way. After having performed the ‘huddle test’ at the INGV, which involves parallel checking of all the field instrumentation in order to ensure its correct functioning, we packed all the equipment and headed to the village of Leonessa, a location considered safe enough to be used as our base camp (despite the village being damaged and evacuated after the 30th October event).
Preparing instrumentation for the huddle test in one of INGV’s storage rooms.
In order to optimise time and resources, and to start recording data as soon as possible, we decided to split into three groups so that we could finish our work between the end of August and the first week of September. Each seismic station is composed of a buried sensor, a GPS antenna, a car battery, a regulator and two solar panels. The current deployment will stay in place for one year, collecting data continuously. Each sensor had to be carefully buried and levelled to guarantee the highest quality of recording, which was a strenuous challenge where the ground was quite rocky!
Typical setting of our deployed stations. On the left, the buried sensor. Its cables, buried as well, connect it to the instrumentation inside the black box (a car battery, and a regulator). On the right, the solar panel (a second one was added in October service) and the white GPS antenna.
Aside from the scientific value of the expedition, the deployment week was a great opportunity to get to know each other, share opinions and ideas and, of course, get some training in seismology! In the end, we managed to install 24 stations across an area of approximately 2,700 km².
As this type of seismic station doesn’t have telemetry, each needed to be revisited to retrieve data. For this purpose, from October 17th, David Hawthorn (BGS) and I flew to Italy again and stayed for the following ten days to service the seismometers and do the first data dump. Our goals were also to check the quality of the first month of recordings, to add a second solar panel where needed, and to prepare the stations for the forthcoming winter. To do that, a lot of hammering and woodworking was needed. We serviced all the sites, raising the solar panels and GPS antennas on posts securely anchored to the ground, to prevent snow from covering them. The stations were all in good condition, with just minor damage caused by some very nosy cows.
David Hawthorn (BGS) servicing the stations – A second solar panel was added. Panels and GPS antennas were raised on posts anchored to the ground through timbers.
Dumping data from the stations using a netbook and specific hard drive.
On October 26, just the night before leaving for Rome, we experienced first-hand the frightening feeling of a mainshock just below our feet. Both of the quakes that evening surprised us while we were inside a building; the rumble just a few seconds before each quake was shocking and the shaking was very strong. Fortunately, there was no severe damage in Leonessa, but many people in the village refused to spend the night in their own houses. It was also impressive to see the local emergency services’ response: only a few minutes after the first quake, policemen were already out patrolling the village, checking for anyone experiencing difficulties.
The small village of Pescara del Tronto suffered many collapses and severe damages after the 24 August earthquakes. View from the motorway above.
Throughout our car transfers from one site to another we frequently found roads blocked by a building collapse or a landslide, but we could also admire the mountains in a mantle of beautiful autumnal colours and the spectacular landscapes of the Apennines, such as Monte Vettore, Gran Sasso (the highest peak in the Apennines) and the breath-taking Castelluccio plain near Norcia.
View of the Norcia plain, near to the 24th August magnitude 5.4 and the 30th October magnitude 6.5 epicentres.
View of the Castelluccio plain. This picture was taken from the village of Castelluccio, just 5 days before it was totally destroyed by the magnitude 6.5 mainshock.
From my point of view, I learned a lot and really enjoyed this experience. I feel privileged to have started my Ph.D. in leading institutions like the University of Bristol and the BGS and, at the same time, to be able to spend time in my home country (yes, I am Italian…) with such interesting scientific questions. What I know for sure is that we will be back there again.

Blog written by Simone Mancini, 1st year Ph.D. student, University of Bristol and British Geological Survey.

Implementing volcanic hazard assessment operationally

Following the 2010 eruption of the Eyjafjallajökull volcano in Iceland, the National Risk Register now lists volcanic hazards at the highest priority level. Volcanic hazard assessment draws together scientific knowledge of volcanic processes, observational evidence and statistical modelling to assess and forecast hazard and risk. Researchers at the University of Bristol have been central to the development of local, regional and global volcanic risk modelling over recent decades. One aspect of ongoing research is to develop a strategy for devising and implementing hazard assessments in an operational environment, to provide decision support during a volcanic crisis.

Cabot Professor Willy Aspinall
demonstrating the application of
Expert Elicitation in volcanic
hazard modelling at the OTVHA
workshop, Vienna, April 2014

Last week, I organised a workshop on Operational Techniques for Volcanic Hazard Assessment. The 2-day workshop, held in Vienna, Austria and supported by the European Geosciences Union and the Cabot Institute, brought together researchers from 11 institutions in eight countries to explore current practice in methods applied to operational and near-real time volcanic hazard assessment. I was assisted in organising by Dr Jacopo Selva of INGV in Bologna, and speakers included Cabot Institute members Professor Willy Aspinall and Dr Thea Hincks, Dr Richard Luckett of the British Geological Survey and Dr Laura Sandri of INGV, Italy.

There is a real gap between our ability to monitor and understand volcanic processes and our capacity to implement that understanding in a way that is useful operationally. In this workshop, we were able to bring together some of the leading researchers from around the world to explore how different tools and techniques are deployed. Better integration of these tools is essential for volcanic hazard forecasting to be useful for risk management.

The workshop involved discussion sessions and practical demonstrations of tools for real-time monitoring alerts, the use of expert judgment, Bayesian event tree scenario modelling and Bayesian belief network inference tools.  Dr Mike Burton from INGV Pisa, who took part in the workshop, said,

“It’s really important for volcanologists to engage with how our science can be adapted and incorporated in hazard assessments. The OTVHA workshop was a really useful exercise in exploring how our knowledge and uncertainty can be assimilated for real time decision support.”
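The arithmetic at the heart of a Bayesian event tree is simple: the probability of an outcome is the product of conditional probabilities along its branch. A toy sketch (all numbers are invented for illustration and do not come from any real assessment):

```python
# Toy sketch of Bayesian event tree arithmetic for volcanic hazard:
# multiply conditional probabilities along a branch of the tree.
# All probabilities below are invented for illustration.

def branch_probability(*conditionals):
    """P(outcome) = P(node1) * P(node2 | node1) * ..."""
    p = 1.0
    for c in conditionals:
        p *= c
    return p

p_unrest = 0.5     # P(unrest in the next month)
p_magmatic = 0.4   # P(unrest is magmatic | unrest)
p_eruption = 0.3   # P(eruption | magmatic unrest)

print(f"{branch_probability(p_unrest, p_magmatic, p_eruption):.3f}")  # 0.060
```

Operational tools such as those demonstrated at the workshop build on this structure, updating the conditional probabilities as monitoring data and expert judgment come in.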

Monitoring a volcano in Ethiopia

My research in Bristol concerns the interface between volcano monitoring data and hazard scenario models and I felt the workshop was a great success.  A few groups have developed approaches to modelling volcanic hazard and risk. This workshop provided a great forum for detailed discussion of how these tools and techniques can be combined and compared.  As scientists, we need to understand how to optimise and communicate our model output to be useful for decision makers.

Developing tools that are both scientifically and legally defensible is a major challenge in natural hazard science. The idea of organising the OTVHA workshop was to further explore the opportunities in addressing these challenges, which are central to the mission of the Cabot Institute. We’ve already started planning for the next workshop!

The OTVHA workshop was followed up with an associated session at the EGU General Assembly meeting, ‘Advances in Assessing Short-term Hazards and Risk from Volcanic Unrest or Eruption’, with a keynote presentation by Prof Chuck Connor on assessment of volcanic risk for nuclear facilities.

——————————–
This blog is written by Cabot Institute member Henry Odbert, School of Earth Sciences, University of Bristol.

———————————-

There are a few places left on the Cabot Institute Summer School on Risk and Uncertainty in Natural Hazards, featuring Willy Aspinall and other leading Cabot Institute academics.  Book your place now.