National greenhouse gas reporting needs an overhaul – it’s time to directly measure the atmosphere

Junk Culture / shutterstock

How much greenhouse gas is emitted by any individual country? With global emissions of carbon dioxide hitting a record of 36.8 billion tonnes this year, and delegates gathering in Madrid for the latest UN climate talks, it’s a pressing question.

One might assume that we know precisely how much is emitted by any given country, and that such figures are rigorously cross-checked and scrutinised. And in some respects, this is true – countries are required to report their emissions to the UN, based on exhaustive guidelines and with reams of supporting data.

Yet these reports are based on what are known as inventory (or “bottom-up”) methods. To simplify, this means that governments figure out how much greenhouse gas is emitted by a typical car, cow, or coal plant, and then add up all the cows, cars and so on to get an overall emissions figure.

Map showing the UK’s CO2 emissions, calculated using ‘bottom-up’ methods. Daniel Hoare, University of Bristol, © Crown 2019 copyright Defra & BEIS, Author provided.

 

While this method is essential to understand the make-up of a country’s emissions, it is ultimately reliant on accurate and comprehensive information on economic activity, some compromises to allow standardisation across countries, and some element of trust.

And such reporting can go awry. In 2018 and again earlier this year, colleagues and I made headlines when we first identified mystery emissions of a banned ozone-depleting substance and greenhouse gas and then later tracked its source down to factories in eastern China.




Read more: How we traced ‘mystery emissions’ of CFCs back to eastern China


The problem is that these “bottom-up” emissions reports do not generally include what some might consider key information: measurements that can indicate the actual amount of greenhouse gas in the atmosphere.

So could new data help us better understand how much we are emitting?

A national greenhouse gas monitoring network

The UK, Switzerland and Australia have pioneered a measurement-based approach to add credibility and transparency to their emissions reports. In 2012, a network of measurement stations was established on telecommunications towers across the UK to sniff out greenhouse gases emitted from around the country.

A tower used to sample the greenhouse gases in the air in Norfolk, England. Inset: a researcher working on the project. University of Bristol, Author provided
To interpret these measurements, we use sophisticated computer models that simulate how gases are transported from the surface, through the atmosphere, to the points where they are observed. By comparing the modelled and measured concentrations, we can determine the national emission rate.
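
To give a feel for how comparing modelled and measured concentrations yields an emission rate, here is a heavily simplified sketch in Python. The sensitivity ("footprint") matrix, emission rates and noise are all invented for illustration; real national inversions use far more measurements, full atmospheric transport models and careful uncertainty statistics.

```python
# Toy "top-down" emissions estimate: invert simulated measurement
# sensitivities to recover emission rates from observed concentrations.
import numpy as np

# H[i, j]: simulated concentration enhancement at measurement station i
# per unit emission from source region j (from a transport model)
H = np.array([[0.8, 0.1],
              [0.3, 0.6],
              [0.5, 0.5]])

x_true = np.array([10.0, 4.0])                   # "true" regional emission rates
noise = np.array([0.05, -0.02, 0.01])            # measurement noise
y = H @ x_true + noise                           # measured enhancements

# Find the emission rates that best reproduce the observations
# (ordinary least squares); the national total is their sum.
x_est, *_ = np.linalg.lstsq(H, y, rcond=None)
national_total = x_est.sum()
print(national_total)  # close to the true total of 14
```

In practice the inversion is Bayesian, weighting a "bottom-up" prior against the atmospheric data, but the core idea of adjusting emissions until model matches measurement is the same.
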
These “top-down” estimates, which now form a key part of the UK’s National Inventory Report to the UN, have yielded some surprising insights. Sceptics may suspect that governments would be keen to “hide” emissions from the rest of the world, but in at least one case atmospheric data suggests that the UK has for years actually over-estimated, by around 100%, emissions of a potent greenhouse gas used in car air conditioners (HFC-134a). In contrast, for the major greenhouse gases methane and nitrous oxide, the data in recent years corroborates the UK inventory reports remarkably well.

More questions than answers?

Naturally, once this measurement data is available, new questions emerge. For example, the UK inventory suggests that methane emissions have gradually declined since 1990 but the atmospheric data suggests little trend, if any. This is important, because the UK benchmarks its emissions reductions against the year 1990.
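The kind of trend comparison at stake can be illustrated with a toy calculation: fit a straight line to each time series and compare slopes. The numbers below are invented and bear no relation to the actual UK methane record; they simply show how a declining inventory can disagree with trendless atmospheric estimates.

```python
# Toy comparison: a steadily declining inventory series versus a
# variable-but-trendless "atmospheric" series, each fitted with an
# ordinary least-squares slope.
import numpy as np

years = np.arange(1990, 2020)
inventory = 5.0 - 0.04 * (years - 1990)       # declining reported emissions (Tg/yr)
atmosphere = 5.0 + 0.1 * np.sin(years * 1.7)  # variable, no underlying trend

def slope(y):
    """OLS slope of a series against the year axis (Tg/yr per year)."""
    return np.polyfit(years, y, 1)[0]

print(slope(inventory))   # clearly negative, ~ -0.04
print(slope(atmosphere))  # indistinguishable from zero
```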

Could this suggest that the country has not been as successful as it thought at reducing methane leaked from landfills, for example? Or have such emissions reductions been offset by some other source? Unfortunately, such questions are difficult to answer using “standard” atmospheric measurement techniques – a molecule of methane emitted from a landfill looks very similar to one from a cow.

Very similar, that is, but not identical. I am involved in a new £3m project called DARE-UK (Detection and Attribution of Regional Emissions in the UK), which looks for tell-tale features that can help us identify where carbon dioxide, methane and nitrous oxide in the atmosphere came from.
One type of signal that we are looking for is a tiny perturbation to the ratio of heavy and light isotopes of methane and carbon dioxide in the air. Isotopes are almost identical to one another but differ in their molecular mass. It turns out that cow burps, for example, emit methane with less of the heavy isotope than similar amounts of methane from a leaky gas boiler. So, we hope that this type of data may help the UK’s inventory team identify which sectors of the bottom-up reports may require re-examination.
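As a rough illustration of how isotope ratios can separate sources, here is a toy two-member mixing calculation. The δ13C values are typical literature ranges rather than DARE-UK data, and real attribution must also account for background air, fractionation and measurement uncertainty.

```python
# Toy two-source attribution of a methane enhancement using its
# carbon isotope signature (delta-13C, in per mil).
d13C_biogenic = -60.0   # cow burps, landfills: depleted in the heavy isotope
d13C_fossil = -40.0     # gas leaks: relatively enriched

def biogenic_fraction(d13C_mixture):
    """Fraction of the methane enhancement from the biogenic source,
    assuming a simple linear two-member mixing model."""
    return (d13C_mixture - d13C_fossil) / (d13C_biogenic - d13C_fossil)

f = biogenic_fraction(-55.0)   # an observed enhancement at -55 per mil
print(f)  # 0.75: three quarters biogenic in this toy example
```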

We need improved transparency

While these measurements are proving a valuable aid for inventory compilers, their main utility is likely to be in ensuring trust and transparency in the international reporting process. Atmospheric measurements do not suffer from the confidentiality issues that can prevent interested parties from peeking behind the scenes of national inventories.

Could governments still hide their emissions? It’s unlikely, provided top-down methods are used with open and transparent protocols and data sharing. This should help avoid accusations of foul play that could derail international initiatives like the Paris Agreement.

The UK example shows this type of emissions evaluation is now ready for the international stage. Institutions such as the World Meteorological Organization are working with governments and sub-national stakeholders to try to make it happen. Hopefully policymakers will see the value of finding out what’s really being released into their air.

————————————
This blog is written by Cabot Institute member Dr Matt Rigby, Reader in Atmospheric Chemistry, University of Bristol. This article is republished from The Conversation under a Creative Commons license. Read the original article.

An insight into aviation emissions and their impact on the atmosphere

Image credit: El Ronzo, Flickr

The proliferation of aviation has brought huge benefits to our society, enhancing global economic prosperity and allowing humanity to travel faster, further and more frequently than ever before. However, the relentless expansion of the industry is a major detriment to the environment at the local, regional and global level, owing to the vast amounts of pollution produced by the jet fuel combustion that propels aircraft through the air and sustains steady, level flight.

Aircraft impact the climate largely through the release of CO2, which contributes directly to the greenhouse effect, absorbing terrestrial radiation and trapping heat within the atmosphere, leading to rising temperatures. However, it is vital not to overlook the non-CO2 aircraft emissions such as NOx, soot and water vapour, which drive additional climate change mechanisms – the indirect greenhouse effect, the direct aerosol effect and aviation-induced cloudiness. When these non-CO2 effects are accounted for, the total climate impact of aviation is estimated to be two to three times that of its CO2 emissions alone.

This report provides the necessary background to grasp the science behind aircraft emissions and delves into the impacts aviation has on the atmosphere’s ability to cleanse itself of harmful emissions, otherwise known as the oxidising capacity of the atmosphere. It does so through an analysis of three distinct and commonly flown flight routes, investigating the impact that each flight’s emissions have on the surrounding atmospheric chemistry and discussing the potential effects this has on our Earth-atmosphere system.

Read the full report by Kieran Tait

——————————
Read our other blogs about air travel:

  1. To fly or not to fly? Towards a University of Bristol approach
  2. I won’t fly to your conference, but I hope you will still invite me to participate
Watch our latest video on air travel at the University of Bristol.

AI & sustainable procurement: the public sector should first learn what it already owns

While carrying out research on the impact of digital technologies on public procurement governance, I have realised that the deployment of artificial intelligence to promote sustainability through public procurement holds some promise. There are many ways in which machine learning can contribute to enhancing procurement sustainability.

For example, new analytics applied to open transport data can significantly improve procurement planning to support more sustainable urban mobility strategies, as well as the emergence of new models for the procurement of mobility as a service (MaaS).* Machine learning can also be used to improve the logistics of public sector supply chains, as well as unlock new models of public ownership of, for example, cars. It can also support public buyers in identifying the green or sustainable public procurement criteria that will deliver the biggest improvements measured against any chosen key performance indicator, such as CO2 footprint, as well as support the development of robust methodologies for life-cycle costing.
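As a toy illustration of the life-cycle costing idea mentioned above, the following sketch computes a discounted whole-life cost for an asset. All figures and the discount rate are invented, not drawn from any procurement methodology or guidance.

```python
# Toy life-cycle cost: purchase price plus discounted annual running
# costs plus discounted end-of-life disposal cost (net present cost).
def life_cycle_cost(purchase, annual_running, years, disposal, rate=0.03):
    """Net present cost of owning an asset for `years` years at a
    given annual discount rate."""
    running = sum(annual_running / (1 + rate) ** t for t in range(1, years + 1))
    return purchase + running + disposal / (1 + rate) ** years

# Invented example: a vehicle-sized asset over an 8-year life
cost = life_cycle_cost(purchase=20_000, annual_running=1_500, years=8, disposal=500)
print(round(cost, 2))
```

Machine learning enters when such parameters (running costs, realistic lifetimes, disposal values) are estimated from data on the public sector’s actual asset stock rather than assumed.
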

However, it is also evident that artificial intelligence can only be effectively deployed where the public sector has an adequate data architecture.** While advances in electronic procurement and digital contract registers are capable of generating that data architecture for the future, there is a significant problem concerning the digitalisation of information on the outcomes of past procurement exercises and the current stock of assets owned and used by the public sector. In this blog, I want to raise awareness about this gap in public sector information and to advocate for the public sector to invest in learning what it already owns as a potential major contribution to sustainability in procurement, in particular given the catalyst effect this could have for a more circular procurement economy.

Backward-looking data as a necessary evidence base

The public sector’s management of procurement-related information is notoriously lacking. It is difficult enough to access information on ‘live’ tender procedures. Accessing information on contract execution and any contractual modifications was nigh impossible until the very recent implementation of the increased transparency requirements imposed by the EU’s 2014 Public Procurement Package. Moreover, even where that information can be identified, there are significant constraints on the disclosure of competition-sensitive information or business secrets, which can also restrict access.*** This can be compounded in the case of assets subject to outsourced maintenance contracts, or of assets procured under mechanisms that do not transfer property to the public sector.

Accessing information on the outcomes of past procurement exercises is thus a major challenge. Where the information is recorded, it is siloed and compartmentalised. And, in any case, this is not public information and it is oftentimes only held by the private firms that supplied the goods or provided the services—with information on public works more likely to be, at least partially, under public sector control. This raises complex issues of business to government (B2G) data sharing, which is only a nascent area of practice and where the guidance provided by the European Commission in 2018 leaves many questions unanswered.*

I will not argue here that all that information should be automatically and unrestrictedly publicly disclosed, as that would require some careful consideration of the implications of such disclosures. However, I submit that the public sector should invest in tracing back information on procurement outcomes for all its existing stock of assets (either owned, or used under other contractual forms) – or, at least, for the main categories of buildings and real estate, transport systems, and IT and communications hardware. Such a database should then be made available to data scientists tasked with seeking all possible ways of optimising the value of that information for the design of sustainable procurement strategies.

In other words, in my opinion, if the public sector is to take procurement sustainability seriously, it should invest in creating a single, centralised database of the durable assets it owns as the necessary evidence base on which to seek to build more sustainable procurement policies. And it should then put that evidence base to good use.

More circular procurement economy based on existing stocks

In my view, some of the main advantages of creating such a database in the short-, medium- and long-term would be as follows.

In the short term, having comprehensive data on existing public sector assets would allow for the deployment of different machine learning solutions to seek, for example, to identify redundant or obsolete assets that could be reassigned or disposed of, or to reassess the efficiency of existing investments, eg in terms of levels of use and potential for increased sharing of assets, or in terms of the energy (in)efficiency derived from their use. It would also allow for a better understanding of potential additional improvements in eg maintenance strategies, as services could be designed taking the entirety of the relevant stock into consideration.

In the medium term, this would also provide better insights on the whole life cycle of the assets used by the public sector, including the possibility of deploying machine learning to plan for timely maintenance and replacement, as well as to improve life cycle costing methodologies based on public-sector specific conditions. It would also facilitate the creation of a ‘public sector second-hand market’, where entities with lower levels of performance requirements could acquire assets no longer fit for their original purpose, eg computers previously used in more advanced tasks that still have sufficient capacity could be repurposed for routine administrative tasks. It would also allow for the planning and design of recycling facilities in ways that minimised the carbon footprint of the disposal.

In the long run, in particular post-disposal, the existence of the database of assets could unlock a more circular procurement economy, as the materials of disposed assets could be reused for the building of other assets. In that regard, there seem to be some quick wins to be had in the construction sector, but having access to more and better information would probably also serve as a catalyst for similar approaches in other sectors.

Conclusion

Building a database of existing public sector-used assets as the outcome of earlier procurement exercises is not an easy or cheap task. However, it would have transformative potential and could generate sustainability gains aimed not only at reducing the carbon footprint of future public expenditure but, more importantly, at correcting or somehow compensating for the current environmental impacts of the way the public sector operates. This could make a major difference in accelerating emissions reductions and should, in my view, be a matter of high priority for the public sector.

* A Sanchez-Graells, ‘Some public procurement challenges in supporting and delivering smart urban mobility: procurement data, discretion and expertise’, in M Finck, M Lamping, V Moscon & H Richter (eds), Smart Urban Mobility – Law, Regulation, and Policy, MPI Studies on Intellectual Property and Competition Law (Berlin, Springer, 2020) forthcoming. Available on SSRN: http://ssrn.com/abstract=3452045.

** A Sanchez-Graells, ‘Data-driven procurement governance: two well-known elephant tales’ (2019) Communications Law, forthcoming. Available on SSRN: https://ssrn.com/abstract=3440552.

*** A Sanchez-Graells, ‘Transparency and competition in public procurement: A comparative view on a difficult balance’, in K-M Halonen, R Caranta & A Sanchez-Graells (eds), Transparency in EU Procurements: Disclosure within public procurement and during contract execution, vol 9 EPL Series (Edward Elgar 2019) 33-56. Available on SSRN: https://ssrn.com/abstract=3193635.

————————————
This blog was written by Cabot Institute member Professor Albert Sanchez-Graells, Professor of Economic Law (University of Bristol Law School).

Albert Sanchez-Graells

Science in action: Air pollution in Bangkok

Bangkok haze 2019 March. Wikimedia Commons.

I was given the opportunity to spend a significant part of 2018 in Bangkok, Thailand, working with the Chulabhorn Research Institute (CRI) Laboratory of Environmental Toxicology on an air quality project funded by the Newton Fund. Bangkok is a large city of over 14 million inhabitants, which suffers high levels of traffic and congestion, resulting in high exposure to traffic-related pollution. Reducing the number of deaths caused by pollution by 2030 is one of the UN Sustainable Development Goals. Air pollution is a global problem – a major threat to health throughout the world – but particularly so in low- and middle-income countries, which account for 92% of pollution-related deaths (1). The poor and the marginalised often live in areas of high pollution, and children have a disproportionate exposure to pollutants at a vulnerable stage of development.

The Chulabhorn Research Institute is an independent research institute in Bangkok whose mission includes the application of science and technology to improve the Thai people’s quality of life. The Laboratory of Toxicology, under Professor Mathuros Ruchirawat, has a very strong record of using its results to inform policy and make a real impact on the lives of people in South East Asia affected by environmental toxins. For example, a previous CRI study found that people living and working near busy roads were exposed to benzene from traffic exhaust and showed increased DNA damage. Once this was known, the Thai government was persuaded to alter fuel mixtures in cars to protect the population (2).

I was in Bangkok from January to June 2018, then returned from September to December. I brought with me particle measurement and sampling equipment to count particles and sample particulate matter (PM) in Bangkok, to supplement the toxicology work of the institute. PM can be described by its size fractions; usually reported are PM10 (aerosol diameter 10 micrometres and lower) and PM2.5 (2.5 micrometres and lower). Less often measured are the sub-micron range (sometimes referred to as PM1) and the ultrafine range (less than 100 nm).
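As a rough illustration, these size fractions can be thought of as cumulative mass below a diameter cutoff. The particle list and the sharp cutoffs below are invented simplifications; real PM10/PM2.5 samplers use a 50% collection-efficiency cut-point rather than a hard threshold.

```python
# Toy tally of PM mass in nested size fractions from size-resolved data.
particles = [  # (aerodynamic diameter in micrometres, mass in micrograms)
    (0.05, 0.01),  # ultrafine (< 0.1 um)
    (0.5,  0.10),  # sub-micron (PM1)
    (2.0,  0.40),  # fine (PM2.5)
    (8.0,  1.20),  # coarse (PM10)
]

def pm_mass(particles, cutoff_um):
    """Total mass of particles at or below a diameter cutoff
    (idealised sharp cutoff, for illustration only)."""
    return sum(m for d, m in particles if d <= cutoff_um)

pm10 = pm_mass(particles, 10.0)
pm2_5 = pm_mass(particles, 2.5)
pm1 = pm_mass(particles, 1.0)
```

Note how each coarser fraction contains the finer ones: PM10 includes everything in PM2.5, which in turn includes PM1 and the ultrafines.
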

James Matthews with his particle measurement and sampling equipment on public transport in Bangkok.

Below 1 μm it becomes more difficult to measure particle numbers, as optical techniques fail for particles around 200 nm and smaller. To count them, the particles must first be grown to a detectable size by condensing a fluid onto them. The need to regularly replenish this fluid, and the high price of aerosol instrumentation for the smallest sizes, mean that particle number concentration is not routinely measured, but new research indicates that these particles may be a significant health concern. Smaller particles can penetrate further into the lung, and there is some evidence that they may pass further into the body, possibly even making their way into the brain. While much more research is needed – in both the toxicological and epidemiological domains – to understand the effects of these smaller particles, I would not be surprised if the narrative on air quality moves further toward the ultrafine size range in the very near future.

While in Bangkok, I added my aerosol science expertise and experience in aerosol field measurements to the team in the CRI, taking measurements of particle number count using a handheld particle counter, and collecting samples of PM using both PM10 samplers, and a cascade impactor (the Dekati Electrical Low Pressure Impactor) that allowed samples to be separated by aerodynamic size, then collected for further analysis on ICP-MS (inductively coupled plasma mass spectrometry). Thus, metal concentrations within all the different size fractions of aerosol could be found. Within the first few months of the project, I was able to train the staff at the CRI to use this equipment, and so measurements could continue when I returned to the UK.

As well as taking measurements at the CRI in the Lak Si district, north of Bangkok, we chose three sites in the wider Bangkok area that represented different exposure conditions. We were given access to the Thai governmental Pollution Control Department (PCD) air quality measurements sites, where our instrumentation was set up next to their other pollutant measurements.

A toll road and railway in Lak Si – from Bangkok toward the Don Mueang airport. Image credit James Matthews.

The three sites included Ayutthaya, a UNESCO World Heritage site north of Bangkok. Ayutthaya, while a busy tourist destination, has considerably less traffic, and therefore fewer traffic-related pollutants, than the other sites. The second site, Bang Phli, was a heavily industrial area to the south of Bangkok. The third, Chok Chai, was a roadside site in central Bangkok.

Survey measurements of particle count were taken in several locations using a hand-held particle counter. The particle numbers were highest in measurements on the state railway and on the roads in Bangkok. The state railway through Bangkok is slow moving, with diesel engines repeatedly starting and braking, both of which contribute to particulates. Conversely, the newer sky train and underground railways had low particle counts (the underground had the lowest counts I measured anywhere). At the CRI site, long-term measurements near a toll road showed that the particle number count was highest at rush hours, indicating traffic as the dominant contributor. Walking routes in both Bangkok and Ayutthaya showed high concentrations near roads, and in markets and street stalls where vendors produce food.

Within our measurements in Bangkok, we were able to measure the mass fraction and the metal (and some non-metal, such as arsenic) content over 12 size fractions, from 10 μm down to 10 nm. Metals known to be deleterious to human health include cadmium, chromium, nickel and lead, which are class 2 (possible or probable) or class 1 (known) carcinogens. Comparing the reference site (Ayutthaya) with the roadside site (Chok Chai) over several three-day sampling periods showed that these toxic metals were present at higher concentrations in the area with more traffic. They were also present at higher concentrations in the lower size ranges, which may result in these metals penetrating deeper into the lung.

One episode demonstrated the need for local knowledge when measurements are taken. Normally, we would expect weekday measurements in working areas to be higher than at weekends, due to increased work traffic, and in most cases this was so (Ayutthaya was often an exception, likely due to weekend tourist traffic). However, one Saturday evening saw a notable peak in aerosol concentrations at one site that was difficult to explain. Colleagues at the institute pointed out that that weekend was a festival during which the Chinese community in Bangkok, following a long-standing tradition, burns items including fake money. The peak in particles fitted this explanation.

Knowing more about the nature and level of pollutants in a city is an important first step, but challenges persist for the people of Bangkok and other polluted cities as to how to reduce these exposures. The problem of rural crop burning is one that frustrates many in Thailand, as it is well known that the particulates from burnt crops are harmful to the population. While there are strong restrictions on the deliberate burning of crops, it is still common to see fields burning from January to March in areas of northern Thailand. Similarly, Bangkok remains the city of the car, with residents accepting that they may have to sit in traffic for long periods to get anywhere.

Burning mountains in Thailand. Wikimedia Commons.

Researchers from Bristol were able to discuss our results alongside measurements from the PCD and the CRI in Bangkok in a seminar held in 2019. It was apparent that there is great awareness of the dangers of air pollution, but it still seems that more needs to be done to address these problems. In January 2019, Bangkok made the headlines for PM2.5 exposures at dangerously high levels. January falls in Thailand’s ‘cool’ season, when both temperatures and rainfall are low; this weather traps pollutants within the city, increasing exposure levels. On discussing this with pollution experts in Thailand, it was argued that the levels that year were typical for January, but that the reporting of those levels had changed. The Thai PCD advocates communication of pollutant levels through its website and app, but until recently it did not measure PM2.5 at a sufficient number of stations to use it in its air quality index calculations. That year, the PCD changed the calculation to include PM2.5 and, as a consequence, the high pollutant levels discussed above were reported. The reporting of these levels can be attributed to the population’s greater awareness of the problem of pollution, which in turn is leading to a greater urgency in finding solutions.
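A sketch of why including PM2.5 changed the reported numbers: air quality indices typically compute a sub-index per pollutant by piecewise-linear interpolation between breakpoint concentrations and report the maximum, so adding a pollutant that peaks in the cool season raises the headline figure. The breakpoints and concentrations below are illustrative only, not the Thai PCD’s actual tables.

```python
# Toy air quality index: per-pollutant sub-indices via piecewise-linear
# interpolation; the overall index is the maximum sub-index.
def sub_index(conc, breakpoints):
    """Map a concentration to an index value using (c_lo, c_hi, i_lo, i_hi)
    breakpoint segments."""
    for (c_lo, c_hi, i_lo, i_hi) in breakpoints:
        if c_lo <= conc <= c_hi:
            return i_lo + (conc - c_lo) * (i_hi - i_lo) / (c_hi - c_lo)
    raise ValueError("concentration out of range")

# Invented breakpoint tables (ug/m3 -> index)
PM10_BP = [(0, 50, 0, 50), (50, 120, 50, 100), (120, 350, 100, 200)]
PM25_BP = [(0, 25, 0, 50), (25, 50, 50, 100), (50, 150, 100, 200)]

pm10_conc, pm25_conc = 80.0, 90.0   # a hazy January day (invented values)

without_pm25 = sub_index(pm10_conc, PM10_BP)               # PM10 only
with_pm25 = max(without_pm25, sub_index(pm25_conc, PM25_BP))
print(without_pm25, with_pm25)   # the headline index jumps once PM2.5 is included
```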

So there is a role for the direct engagement with the population, which may lead to pressure on governments to respond. There is also a role for science to provide leaders with tangible solutions, such as the suggestion to change fuel mixtures. But the huge challenge of reducing sources of pollutants in a growing city like Bangkok remains.

1 Landrigan, P. J., et al., 2017. Lancet 391, 462-512.
2 Davies R., 2018. Lancet 391, 421.

————————–
This blog is written by Cabot Institute member Dr James Matthews, School of Chemistry, University of Bristol. James’ research looks at the flow of gases in urban environments, and the use of perfluorocarbon trace gas releases to map the passage of air in urban cities.
James Matthews

Collecting silences

‘Noise’ is the greenhouse gas (GHG) emissions that have resulted from fossil-fuel-powered economic growth, which is measured as GDP for particular territories. In Figure 1, ‘noise’ is the area below the green line to the left of the vertical dotted line (historical) and below the blue line to the right of the vertical dotted line (projected). ‘Silence’ is the reduction of fossil-fuel use and the mitigation of carbon emissions. In Figure 1, ‘silence’ is the green shaded area above the blue line and below the dotted blue line, to the right of the vertical dotted line.

Figure 1

To ensure that we maintain atmospheric GHG emission concentrations conducive to human habitation and the ecosystems that support us, we need to assign less value to ‘noise’ (burning fossil fuels) and more value to ‘silence’ (GHG emission mitigations). Creating a system which assigns value to ‘silences’ by turning them into investable resources requires an effort sharing mechanism to establish demand and organizational capacity alongside accurate measuring, reporting and verification for supply.
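Quantifying a ‘silence’ of the kind shown in Figure 1 amounts to integrating the gap between a baseline emissions trajectory and actual emissions over time. The following sketch uses invented trajectories and simple trapezoidal integration; any real scheme would need an agreed, verified baseline.

```python
# Toy quantification of 'silence': cumulative emissions mitigated
# relative to a projected baseline, integrated over a decade.
import numpy as np

years = np.arange(2020, 2031)
baseline = np.linspace(40.0, 44.0, len(years))   # projected 'noise' (GtCO2e/yr)
actual = np.linspace(40.0, 30.0, len(years))     # a mitigation pathway

gap = baseline - actual
# Trapezoidal rule with 1-year steps: area between the two curves
silence = gap[1:-1].sum() + 0.5 * (gap[0] + gap[-1])
print(silence)   # cumulative mitigation (GtCO2e) available for assetization
```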

Organizational capacity for supplying ‘silences’ depends on the ability of organizations to create, trade and accumulate GHG emission mitigations. Due to the intangible nature of such ‘silences’, turning GHG emission mitigations into investable resources requires their assetization as quasi-private goods with well-defined and delineated quasi-property rights. As these rights preserve the intangible commodity of stable atmospheric GHG concentrations through the reduction of pollution, they need to protect investment by ensuring that the resulting private goods are definable, identifiable and capable of assumption by third parties. Such rights also require enforcement and protection against political and regulatory risk.

Commodifying GHG emission mitigations as quasi-private goods by assetizing them with well-defined and delineated quasi-property rights therefore provides the basis for the supply of ‘silences’. Rather than ‘internalising’ the cost of stabilising or reducing atmospheric GHG concentrations, this approach assigns value to GHG emission mitigations. Yet, if we want to avoid climate catastrophe according to the most recent IPCC 1.5C report and the UNDP Emissions Gap Report, GHG emission mitigations also require concretization on the demand-side. There are several examples of GHG emission mitigation and energy demand reduction assetization that can help illustrate how such systems of demand and supply can function.

Similar to GHG emission mitigations, energy demand reductions also represent the reduction of an intangible commodity vis-à-vis a baseline. While stable atmospheric GHG levels are the intangible commodity in the case of the former, in the case of the latter it is the energy supply that fuels economic growth. Both require the assetization of mitigations/reductions to create ‘tangibility’, which provides the basis for assigning value. To illustrate: energy demand reductions are absent from domestic and corporate accounts and are consequently undervalued vis-à-vis increases in revenues.

Market-based instruments that succeed in setting and enforcing targets and creating systems of demand, however, can create ‘tangibility’. Energy demand reductions, for example, are assetized as white certificates representing equal units of energy savings (negawatts) in white certificate markets. Similarly, demand-side response enables the assetization of short-term shifts in energy (non-)use (flexiwatts) to benefit from flexibility and balancing markets. Carbon emission mitigations are assetized under the Clean Development Mechanism (CDM) as Certified Emissions Reductions (CERs).

Crucially, these examples shift the emphasis from the cost of pollution and the need to ‘internalise’ this cost or from turning pollution into a quasi-private good through Emissions Trading Schemes (ETS) towards the positive pricing of energy demand reductions and carbon emission mitigations. Positive pricing turns their respective reduction and mitigation itself into a quasi-private good by turning ‘silences’ into investable resources.

The main technical difficulty of establishing such systems lies in the definition of baselines and in measuring, reporting and verification (MRV) against those baselines. The difficulties inherent in this approach are well documented, but improved sensing technology, such as the Internet of Things (IoT), and distributed ledgers promise greatly improved granularity and automated time-stamping of all aspects of energy (non-)use at sub-second intervals. If structures of demand are clearly identified through target-driven market-based instruments, and supply is facilitated through the assetization of ‘silences’ as quasi-private goods with clearly defined and enforced quasi-property rights, a clear incentive also exists to ensure that MRV structures are improved accordingly.

Key to the implementation of such target-driven market-based instruments are mechanisms to ensure that efforts are shared among organisations, sectors or countries, depending on the scale of implementation. Arguably, one of the reasons why the CDM failed in many aspects was because of the difficulty of proving additionality. This concept was supposed to ensure that only projects that could prove their viability based on the availability of funds derived from the supply, trade and accumulation of CERs would be eligible for CDM registration.

The difficulty of proving additionality increases cost and complexity. To ensure that new mechanisms no longer require this distinction, a dynamic attribution of efforts is required. A mechanism to dynamically share efforts can also help address the rebound effects inherent in energy efficiency and energy demand reduction: the target-driven nature of the associated market-based instruments allows any rebound (i.e. an increase in carbon emissions) to be shared equitably among organisations, sectors or countries. With an appropriate effort-sharing mechanism in place, the demand and supply of ‘silences’ can be aligned with targets aiming to maintain atmospheric GHG concentrations at levels conducive to human habitation and the ecosystems that support us.
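To make the idea concrete, here is a minimal sketch (entirely hypothetical, with invented names and numbers, and not part of the original proposal) of a dynamic effort-sharing rule that tightens each participant's target in proportion to its share of the collective target whenever a rebound is observed:

```python
# Hypothetical sketch of a dynamic effort-sharing mechanism.
# Any rebound (observed collective emissions above the collective target)
# is distributed among participants in proportion to their target shares,
# tightening each participant's target for the next period.

def share_rebound(targets, observed_total):
    """Return adjusted targets after distributing any rebound proportionally.

    targets: dict mapping participant -> agreed emission target
    observed_total: actual collective emissions in the period
    """
    total_target = sum(targets.values())
    rebound = max(0.0, observed_total - total_target)
    return {
        name: target - rebound * (target / total_target)
        for name, target in targets.items()
    }

# Invented example: three sectors with a joint target of 100 units emit 110;
# the 10-unit rebound tightens each target in proportion to its share.
adjusted = share_rebound(
    {"power": 50.0, "transport": 30.0, "buildings": 20.0}, 110.0
)
# e.g. "power" moves from 50.0 to 45.0
```

A real mechanism would of course need equity weightings and negotiation rather than simple proportionality, but the sketch shows how a rebound can be attributed dynamically instead of being ignored.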

—————————-
This blog is written by Cabot Institute member Dr Colin Nolden, a Vice Chancellor’s Fellow in sustainable city business models. The blog has been reposted with kind permission of World Sustainable Energy Days. If you would like to read more on this topic, you can read Colin’s research paper here.

Colin Nolden

 

Monitoring greenhouse gas emissions: Now more important than ever?

As part of Green Great Britain Week, supported by BEIS, we are posting a series of blogs throughout the week highlighting what work is going on at the University of Bristol’s Cabot Institute for the Environment to help provide up-to-date climate science, technology and solutions for government and industry. We will also be highlighting some of the big sustainability actions happening across the University and local community in order to do our part to mitigate the negative effects of global warming. Today our blog will look at ‘Explaining the latest science on climate change’.

The IPCC report

On 8 October 2018 the Intergovernmental Panel on Climate Change (IPCC) [1] published their special report on Global Warming of 1.5 ˚C. As little as 24 hours after the report had been published, the results of the report were already receiving extensive global coverage in the media, with BBC News describing the report as the “final call”. The BBC News article also explicitly mentions that this is “the most extensive warning yet on the risks of rising global temperatures. Their dramatic report on keeping that rise under 1.5 ˚C states that the world is now completely off track, heading instead towards 3 ˚C. Staying below 1.5 ˚C will require ‘rapid, far-reaching and unprecedented changes in all aspects of society’ [2].”

Reading the report has quite honestly been somewhat overwhelming, but it is necessary if we are to understand exactly what we are in for. And as much as I understand the difficulty one might face with the report’s technical terms or its sheer volume of information, I would really encourage you to give it a read. The special report covers a wide range of subjects, from oceans, ice and flooding to crops, health and the economy. If you do find the chapters themselves too lengthy or difficult, Carbon Brief’s website offers an excellent interactive tool for exploring the impacts of 1.5 ˚C, 2 ˚C and beyond.

The IPCC special report has two distinct parts: the full technical report, consisting of five chapters, and a short Summary for Policymakers (SPM). The SPM clearly states that “Estimated anthropogenic global warming matches the level of observed warming to within ±20%”, which translates into ‘almost 100% of the warming is the result of human activity’ [3] [4].

We know for a fact that human activity is warming the planet

One outcome of this “human activity” that we often discuss is the emission of greenhouse gases (GHGs). Through various types of activity, whether agriculture, deforestation or the burning of fossil fuels, GHGs are emitted to the atmosphere. Without going too deeply into the chemistry and physics, these GHGs change the mixing ratios within the atmosphere, resulting in greater absorption of infrared radiation. It is this change in the composition of our atmosphere that we refer to as the man-made greenhouse effect, and it leads to the warming described in the IPCC report. Beyond the warming itself, global warming has all sorts of impacts, most of which you can explore through the interactive link above.

Greenhouse gases and a long history of monitoring

Some of the ‘usual suspects’ in the discussion of GHG emissions are carbon dioxide (CO2), methane (CH4) and nitrous oxide (N2O) (often described as the ‘major’ greenhouse gases [5]). However, an often-overlooked set of halogenated greenhouse gases is playing an increasingly large role in anthropogenic climate change. Gases such as perfluorocarbons (PFCs) and hydrofluorocarbons (HFCs) are emitted through some form of human activity. In the case of PFCs, for example, CF4 and C2F6 are two of the most volatile and long-lived gases monitored under the Kyoto Protocol [6], and both are emitted primarily through industrial processes. In contrast, HFCs are used widely as coolants in refrigerators and air-conditioning units, as blowing agents in foam manufacture and as propellants in aerosols. They were originally introduced to replace ozone-depleting gases such as chlorofluorocarbons (CFCs) but, like their predecessors, are potent greenhouse gases. Given the long lifetimes of many of these halogenated gases, current emissions will influence the climate system for decades to come.

In order to monitor the accumulation of these gases in the atmosphere, high-precision measurements are required. Through projects such as the Advanced Global Atmospheric Gases Experiment (AGAGE) [7] (figure 1 [8]), which has been measuring the composition of the global atmosphere continuously since 1978, and the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory Global Monitoring Division, scientists have tracked the atmospheric concentrations of climate-forcing gases from as far back as the 1950s [9].

Figure 1: The AGAGE network

The Atmospheric Chemistry Research Group (ACRG) Chemistry Department, University of Bristol

The ACRG carries out research in the UK and worldwide in collaboration with other atmospheric chemistry research centres, universities and third parties. In the UK, the ACRG runs the UK Deriving Emissions linked to Climate Change (DECC) network [10], funded by the Department for Business, Energy and Industrial Strategy (BEIS), to measure atmospheric GHGs and ozone-depleting substances over the UK. These measurements are fed into sophisticated atmospheric models to create top-down emission estimates for the UK and to verify the UK GHG inventories submitted to the United Nations Framework Convention on Climate Change (UNFCCC) as part of the Kyoto Protocol. Worldwide, the group is involved in the AGAGE network, monitoring global background levels of a wide range of GHGs. The ACRG runs two of the nine global background stations under the AGAGE programme. One of these is the Mace Head station (Figure 2) on the west coast of Ireland, which is ideally placed for resolving northern hemispheric baseline air amongst European pollution events. The other AGAGE research station managed by the ACRG is the site at Ragged Point, Barbados, which sits just north of the tropics on the eastern edge of the island, directly exposed to the Atlantic. The researchers in the ACRG study a wide variety of GHGs, and their work spans everything from maintaining instrument suites to ensuring the quality of the resulting data so that it can be used in modelling studies.
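As a caricature of the top-down approach (a sketch with invented numbers, not the ACRG's actual methodology, which uses Bayesian inversion with full atmospheric transport models and uncertainty treatment), an inversion finds the emissions that best explain observed concentration enhancements, given each measurement's sensitivity to emissions from each source region:

```python
import numpy as np

# Toy "top-down" emissions estimate (illustrative only). H encodes how
# sensitive each observation (rows) is to emissions from each region
# (columns), in concentration units per unit emission; in reality H comes
# from an atmospheric transport model.
H = np.array([[0.8, 0.1],
              [0.2, 0.9],
              [0.5, 0.5]])

prior = np.array([10.0, 5.0])     # inventory ("bottom-up") estimate per region
obs = np.array([10.5, 9.0, 9.5])  # measured concentration enhancements

# Least-squares estimate of the emissions that best reproduce the observations
posterior, *_ = np.linalg.lstsq(H, obs, rcond=None)

# Where the atmosphere disagrees with the inventory
mismatch = posterior - prior
```

The posterior fits the observations at least as well as the prior by construction; it is the size and sign of `mismatch` that flags potential problems in a reported inventory.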

Figure 2: The Mace Head Station (Credit: Dr Kieran Stanley)

Why are measuring stations and networks like AGAGE so valuable and more important than ever?

The answer to this question is straightforward. Without measurement stations and their underlying networks, we would have very few means [11] by which to measure the accumulation of GHGs in the global atmosphere, and consequently no way of evaluating their emissions without relying on statistics from the industries that emit them. The current IPCC report is underpinned by such measurements, which allow scientists to estimate the impact of anthropogenic activity on past, present and future climates.

From Mauna Loa and its 60-year record of atmospheric CO2 [12], to the unexpected growth in emissions of banned substances such as CFC-11 [13], to monitoring the accumulation of extremely long-lived greenhouse gases in the global atmosphere, atmospheric measurement stations have been our inside man when it comes to keeping track of what is happening in our atmosphere and of the extent to which human activities are altering its composition.

Perhaps now more than ever, in light of the IPCC report, we can appreciate the importance of the data that have been collected over decades, and also the efforts of those who have been directly or indirectly involved in this work. Continuing and expanding the measurement networks for these gases is vital, and will become even more so, for a continued understanding of global and regional GHG emission trends.

References

[1] http://www.ipcc.ch/
[2]  https://www.bbc.co.uk/news/science-environment-45775309
[3]  http://report.ipcc.ch/sr15/pdf/sr15_spm_final.pdf
[4]  https://www.carbonbrief.org/analysis-why-scientists-think-100-of-global-warming-is-due-to-humans
[5]  https://www.c2es.org/content/main-greenhouse-gases/
[6]  https://www.atmos-chem-phys.net/10/5145/2010/acp-10-5145-2010.pdf
[7]  https://agage.mit.edu/
[8]  https://agage.mit.edu/
[9]  https://www.esrl.noaa.gov/gmd/about/aboutgmd.html
[10]  http://www.bristol.ac.uk/chemistry/research/acrg/current/decc.html
[11]  https://www.co2.earth/co2-ice-core-data
[12]  https://www.co2.earth/daily-co2
[13]  https://www.theguardian.com/environment/2018/may/16/mysterious-rise-in-banned-ozone-destroying-chemical-shocks-scientists

—————————-
This blog is written by Cabot Institute members Eleni Michalopoulou, Dr Dan Say, Dr Kieran Stanley and Professor Simon O’Doherty from the University of Bristol’s School of Chemistry.

Dan Say
Eleni Michalopoulou

 

Read other blogs in this Green Great Britain Week series:
1. Just the tip of the iceberg: Climate research at the Bristol Glaciology Centre
2. Monitoring greenhouse gas emissions: Now more important than ever?
3. Digital future of renewable energy
4. The new carbon economy – transforming waste into a resource
5. Systems thinking: 5 ways to be a more sustainable university
6. Local students + local communities = action on the local environment

Forest accounting rules put EU’s climate credibility at risk, say leading experts

**Article re-posted from EURACTIV**


Forest mitigation should be measured using a scientifically-objective approach, not allowing countries to hide the impacts of policies that increase net emissions, writes a group of environmental scientists led by Dr Joanna I House.

Dr Joanna I House is a reader in environmental science and policy at the Cabot Institute, University of Bristol, UK. She co-signed this op-ed with other environmental scientists listed at the bottom of the article.

From an atmospheric perspective, a reduction in the forest sink leads to more CO2 remaining in the atmosphere and is thus effectively equivalent to a net increase in emissions. [Yannik S/Flickr]

When President Trump withdrew from the Paris Agreement, the EU’s Climate Commissioner, Miguel Arias Cañete spoke for all EU Member States when he said that, “This has galvanised us rather than weakened us, and this vacuum will be filled by new broad committed leadership.” The French President, Emmanuel Macron, echoed him by tweeting, “Make our planet great again”.

But as the old saying goes, ‘If you talk the talk, you must walk the walk,’ and what better place to start than the very laws the EU is currently drafting to implement its 2030 climate target under the Paris Agreement. This includes a particularly contentious issue that EU environment leaders will discuss on 19 June, relating to the rules on accounting for the climate impact of forests.

Forests are crucial to limiting global warming to 2 degrees Celsius. Deforestation is responsible for almost one tenth of anthropogenic carbon dioxide (CO2) emissions, while forests remove almost a third of CO2 emissions from the atmosphere.

In the EU, forests currently grow more than they are harvested.  As a result, they act as a net ‘sink’ of CO2 removing more than 400 Mt CO2 from the atmosphere annually, equivalent to 10% of total EU greenhouse gas (GHG) emissions.

New policies adopted or intended by Member States will likely drive them to harvest more trees (e.g. for the bioeconomy and bioenergy), reducing the sink. The controversy is, in simple terms, if forests are taking up less CO2 due to policies, should this be counted?

Based on lessons learnt from the Kyoto Protocol, the European Commission proposed that accounting for the impacts of forests on the atmosphere should be based on a scientifically robust baseline. This baseline (known as the ‘Forest Reference Level’) should take into account historical data on forest management activities and forest dynamics (age-related changes). If countries change forest management activities going forward, the atmospheric impact of these changes would be fully accounted based on the resulting changes in GHG emissions and sinks relative to the baseline. This approach is consistent with the GHG accounting of all other sectors.

Subsequently, some EU member states have proposed that any increase in harvesting, potentially up to the full forest growth increment, should not be penalised. This would be achieved by including this increase in harvesting, and the related change in the net carbon sink, in the baseline.

As land-sector experts involved in scientific and methodological reports (including for the Intergovernmental Panel on Climate Change, IPCC), in the implementation of GHG inventory reports, and in science advice to Governments, we have several scientific concerns with this approach.

From an atmospheric perspective, a reduction in the forest sink leads to more CO2 remaining in the atmosphere and is thus effectively equivalent to a net increase in emissions. This is true even if forests are managed “sustainably”, i.e. even if harvest does not exceed forest growth.

This is further complicated as the issues are cross-sectoral. Higher harvest rates may reduce the uptake of CO2 by forests, but use of the harvested wood may lead to emissions reductions in other sectors e.g. through the substitution of wood for other more emissions-intensive materials (e.g. cement) or fossil energy. These emission reductions will be implicitly counted in the non-LULUCF sectors.  Therefore, to avoid bias through incomplete accounting, the full impact of increased harvesting must be also accounted for.

Including policy-related harvest increases in the baseline could effectively hide up to 400 MtCO2/yr from EU forest biomass accounting compared to the “sink service” that EU forests provide today, or up to 300 MtCO2/yr relative to a baseline based on a scientific approach (up to two thirds of France’s annual emissions).

If policy-related impacts on net land carbon sinks are ignored or discounted, this would:
  • Hamper the credibility of the EU’s bioenergy accounting: current IPCC guidance on reporting emissions from bioenergy is not to assume that it is carbon neutral; rather, any carbon losses should be reported under the ‘Land Use, Land-Use Change and Forestry’ (LULUCF) sector rather than under the energy sector (to avoid double counting). EU legislation on bioenergy similarly relies on the assumption that carbon emissions are fully accounted under LULUCF.
  • Compromise the consistency between the EU climate target and the IPCC trajectories. The EU objective of reducing GHG emissions by 40% by 2030 (80–95% by 2050) compared to 1990 is based on the IPCC 2°C GHG trajectory for developed countries. This trajectory is based not just on emissions, but also on land sinks. Hiding a decrease in the land sink risks failure to reach temperature targets and would require further emission reductions in other sectors to remain consistent with IPCC trajectories.
  • Contradict the spirit of the Paris Agreement, i.e., that “Parties should take action to conserve and enhance sinks”, and that Parties should ensure transparency in accounting providing confidence that the nationally-determined contribution of each country (its chosen level of ambition in mitigation) is met without hiding impacts of national policies.
  • Set a dangerous precedent internationally, potentially leading other countries to do the same (e.g. in setting deforestation reference levels). This would compromise the credibility of the large expected forest contribution to the Paris Agreement.

The Paris Agreement needs credible and transparent forest accounting, and EU leaders are about to make a decision that could set the standard. Including policy-driven increases in harvest in baselines means the atmospheric impacts of forest policies will be effectively hidden from the accounts (while generating GHG savings in other sectors). Basing forest accounting on a scientifically-objective approach would ensure the credibility of bioenergy accounting, consistency between EU targets and the IPCC 2°C trajectory, and compliance with the spirit of the Paris Agreement. The wrong decision would increase the risks of climate change and undermine our ability to “make the planet great again”.

Disclaimer: the authors express their view in their personal capacities, not representing their countries or any of the institutions they work for.

***

Signatories:

Joanna I House, Reader in Environmental Science and Policy, Co-Chair Global Environmental Change, Cabot Institute, University of Bristol, UK
Jaana K Bäck, Professor in Forest – atmosphere interactions, Chair of the EASAC Forest multifunctionality report, University of Helsinki, Finland
Valentin Bellassen, Researcher in Agricultural and Environmental Economics, INRA, France
Hannes Böttcher, Senior Researcher at Oeko-Institut.
Eric Chivian M.D., Founder and Former Director, Center for Health and the Global Environment Harvard Medical School
Pep Canadell, Executive Director of the Global Carbon Project
Philippe Ciais, scientist at Laboratoire des Sciences du Climat et de l’Environnement, Gif sur Yvette, France
Philip B. Duffy, President and Executive Director Woods Hole Research Center, USA
Sandro Federici, Consultant on MRV and accounting for mitigation in the Agriculture and land use sector
Pierre Friedlingstein, Chair, Mathematical Modelling of Climate Systems, University of Exeter, UK.
Scott Goetz, Professor, Northern Arizona University
Nancy Harris, Research Manager, Forests Program, World resources Institute.
Martin Herold, Professor for Geoinformation Science and Remote Sensing and co-chair of Global Observations of Forest Cover and Land Dynamics (GOFC-GOLD), Wageningen University and Research, The Netherlands
Mikael Hildén, Professor, Climate Change Programme and the Resource Efficient and Carbon Neutral Finland Programme, Finnish Environment Institute and the Strategic Research Council, Finland
Richard A. Houghton, Woods Hole Research Centre USA
Tuomo Kalliokoski University of Helsinki, Finland
Janne S. Kotiaho, Professor of Ecology, University of Jyväskylä, Finland
Donna Lee, Climate and Land Use Alliance
Anders Lindroth, Lund University, Sweden
Jari Liski, Research Professor, Finnish Meteorological Institute, Finland
Brendan Mackey, Director, Griffith Climate Change Response Program, Griffith University, Australia
James J. McCarthy, Harvard University, USA
William R. Moomaw, Co-director Global Development and Environment Institute, Tufts University, USA
Teemu Tahvanainen, University of Eastern Finland
Olli Tahvonen, Professor forest economics and policy, University of Helsinki, Finland
Keith Paustian, University Distinguished Professor, Colorado State University, USA
Colin Prentice, AXA Chair in Biosphere and Climate Impacts, Imperial College London, UK
N H Ravindranath, Centre for Sustainable Technologies (CST), Indian Institute of Science, India
Laura Saikku, Senior Scientist, Finnish Environment Institute
Maria J Sanchez, Scientific Director of BC3 (Basque Center for Climate Change), Spain
Sampo Soimakallio, Senior Scientist, Finnish Environment Institute
Zoltan Somogyi, Hungarian Forest Research Institute, Budapest, Hungary
Benjamin Smith, Professor of Ecosystem Science, Lund University, Sweden
Pete Smith, Professor of Soils & Global Change, University of Aberdeen, UK
Francesco N. Tubiello, Team Leader, Agri-Environmental Statistics, FAO
Timo Vesala, Professor of Meteorology, University of Helsinki, Finland
Robert Waterworth
Jeremy Woods, Imperial College London, UK
Dan Zarin, Climate and Land Use Alliance

Reflections on sustainability in my first few months in the UK

I’m Michael Donatti, a Cabot Institute Masters Research Fellow for 2016-2017. I have come to the University of Bristol from Houston, Texas, to read for an MSc in Environmental Policy and Management. Alongside studying, I have had the chance to experience this new city and this new country from an outsider’s perspective, and here are some of my initial thoughts on environmental sustainability in Bristol and the UK.

To be quite honest, Bristol is an interesting choice for a European Green Capital city. It lacks the biking infrastructure of Amsterdam, Copenhagen’s strict ambition to attain carbon neutrality, and the abundance of green space of Ljubljana. According to the University of Bristol student newspaper, Epigram, 60% of Bristol’s air contains illegal levels of nitrogen dioxide. While walking and running along the streets of Bristol, I have felt this pollution. Partly, the low air quality results from the differing priorities of American and British/European emissions regulations. According to David Herron, Europe focuses on carbon dioxide and carbon monoxide to increase efficiency, decrease dependency on Russian oil, and curb climate change; in contrast, the USA focuses on nitrogen oxides and particulate matter to improve local air quality and reduce smog. Combined with Bristol’s traffic problem, it is no wonder the air quality is actually quite bad.

Before arriving here, I expected more farmers’ markets and small shops. Like much of England (purely from my personal experience), the city seems to suffer from an overabundance of chain stores, cafes, supermarkets, and the like. Coming from the capitalists’ land of strip malls and chain stores, this observation shocked me. Tesco and Sainsbury hold a position of power scarcely rivalled by any supermarket chains back home, and while prices tend to be good because of their power, sustainability is lacking. Fruits and vegetables come wrapped up in plastic cases and bags; differentiating between what is local or organic or seasonal and what is not often takes detailed inspection. At HEB, a supermarket chain in Texas where I do much of my shopping, produce is in bulk bins, not unnecessary plastic cases. They have marketing campaigns and price markdowns for what is in season and what is local (granted, “local” in America’s second largest state is a loaded term).

I have felt excited to find nice coffee shops and cute stores, only to shortly thereafter realise that even those are chains, like Friska on Bristol’s Queen’s Road (which has at least two other locations). In Oxford, I was disappointed to learn that even The Eagle and Child, the pub where C.S. Lewis and J.R.R. Tolkien hung out, has been bought out by Nicholson’s, a chain of pubs across the UK. Not all chains inherently lack environmental sustainability, but they certainly lack character and promote the social inequalities that pervade modern capitalism.

Don’t get me wrong, Bristol and the UK are certainly much better in some areas of sustainability than Houston and the USA. As environmentalists, we can’t revel in our successes for too long without then setting higher goals and getting back to work. However, to paint a better picture of Bristol and the UK (because I have truly loved it here), I will include some of those successes. The British train network is far more extensive than any in the USA. Recycling is more ingrained in British cities; in Bristol, we even separate food waste, which is far from commonplace in Texas. Charging for plastic bags in stores and supermarkets appears more widespread here; while Austin, Texas, promotes using reusable bags, Houston has no restrictions on them. Bristol is far denser and more walkable than most American cities; Houston is the quintessential American automobile city. Not having a car is almost unheard of, partly because the greater Houston area is almost 40 times larger than the greater Bristol area and many families live in single-family homes.

Another aspect of Bristol that earned it its European Green Capital status is its social capital. I have only been here a few months, but I have tried to plug in to the city’s network of change makers. The city’s Green Capital Partnership has over 800 business partners that have pledged to improve their sustainability; the city has initiatives in resident health, happiness, and mobility; and it has set lofty goals for carbon emissions reductions. The challenge now is to make Bristol’s social capital accessible to all. I realised I could buy my food from the Real Economy Co-operative to waste less plastic, reduce transport emissions, and help local farmers, but how do we transfer opportunities like that into the mainstream? I am excited to keep learning more about Bristol’s initiatives in sustainability as I study here, and hopefully what I learn I can take back with me to Houston and Texas, which sorely need the help. I also hope Bristol will not become complacent with its Green Capital designation or too focused on nice-sounding rhetoric. Society needs real environmental improvements, and those improvements need to happen now.

References

“Bristol Green Capital Partnership.” Bristol Green Capital. Accessed November 16, 2016. http://bristolgreencapital.org/.
“European Green Capital.” Accessed November 16, 2016. http://ec.europa.eu/environment/europeangreencapital/winning-cities/2015-bristol/index.html.
Herron, David. “Differences in US and EU Emissions Standard Key Cause of Dieselgate.” The Long Tail Pipe, October 2, 2015. https://longtailpipe.com/2015/10/02/differences-in-us-and-eu-emissions-standard-key-cause-of-dieselgate/.

Brexit: can research light the way?

What could Brexit mean for UK science? What impact will it have on UK fisheries? Could Brexit be bad news for emissions reductions? These were just some questions discussed at a Parliamentary conference last week, organised by the Parliamentary Office of Science and Technology (POST), the Commons Library and Parliament’s Universities Outreach team.

MPs’ researchers, Parliamentary staff and academic researchers from across the country came together to consider some of the key policy areas affected by the UK’s decision to leave the EU.

Why does academic research matter to Parliament?

Given the uncharted waters that Parliament is facing as the UK prepares to withdraw from the EU, it is more important than ever that Parliamentary scrutiny and debate are informed by robust and reliable evidence.

Academic research is expected to meet rigorous standards of quality, independence and transparency. Although it is far from being the only source of evidence relevant to Parliament, it has a vital role to play in the effective scrutiny of Government.

“Academics can help ensure that we get the best possible outcome for the British public through describing the state of knowledge, setting out comparative knowledge (whether in different territories or over time), and evaluating what’s happening as it plays out” said Penny Young, House of Commons Librarian, in her keynote speech.

Last week’s meeting showcased relevant UK academic research as well as giving participants the opportunity to hear the perspectives and concerns of different groups. With over 100 participants, the organisers made the wise decision to split us up into smaller groups to discuss specific policy areas.  This worked rather well, although most people would have liked to be in several groups at once!

What does the future hold for UK research?

In the session on science and research funding a mix of early career researchers and more seasoned academics set out their top issues. The discussion quickly moved beyond research funding. All the researchers agreed free movement of researchers between the UK, other parts of the EU, and beyond the EU, was a top priority.  Several researchers were concerned that the UK research community would become more isolated as a result of Brexit, making it more difficult to recruit and retain the best academic staff.

The group also discussed what kind of data we needed to gauge the impact of Brexit on UK research.  One researcher argued that if we wait until we have “hard data” – such as statistics on citations, publications and collaborations, it might be too late for decision-makers to intervene in any meaningful way.

Economic Impact of Brexit: New Models Needed

Researchers participating in the session on “trade relations and economic impact” highlighted that research on the economic impact of Brexit tends to focus on trade.  New models are needed that take trade into account, along with other relevant factors such as investment, migration and regulation. Participants also felt that more data on the local effects of trade deals would be useful to policymakers, but there are very few studies looking at such effects because of the many uncertainties involved.

Environment, agriculture and fisheries: ‘Cod Wars’?

What would the loss of subsidies under the Common Agricultural Policy mean for UK agriculture? Participants highlighted that areas such as horticulture and fisheries in particular could end up struggling with workforce retention. On a brighter note, one researcher thought there could be some financial gain for UK fisheries if the UK took back its Economic Exclusion Zone (EEZ), but warned of possible future “Cod Wars” if countries clashed over fishing rights.

Immigration: how many EU nationals live in the UK?

Participants in the immigration discussion group highlighted that we do not have reliable figures for how many EU nationals live in the UK. According to some estimates the figure is around 3 million, but this is based on survey data. More reliable data is needed to make informed policy decisions. Participants also highlighted that while most of the discussion around border control focuses on people, movement of goods across borders was also vitally important.

Energy and climate: who will drive emissions reductions targets?

The energy and climate group considered the impact of Brexit across Europe as a whole. The UK has been a strong driver for ambitious emissions reduction targets for the EU. Would other nations continue to drive such targets? Participants also speculated over whether UK would remain part of the European Emissions Trading Scheme and stay involved with some of the EU’s internal energy market regulatory bodies after Brexit.

Foreign and security policy

Participants covered a huge range of topics from UK-Irish relations to the future of NATO and drug trafficking and border control. The importance of learning lessons from history was a key theme in the session, whether it related to the future of NATO or to major treaty negotiations more generally.

What next…

These conversations were not based entirely on research evidence, not least because there are simply too many uncertainties for research to answer all our questions on the impact of Brexit. In the end, our discussions drew on a mix of anecdote, opinion and ‘hard’ evidence. Overall it was a very enriching experience, and we came away with lots of new contacts and ideas.

Many of the researchers said that they’d had relatively few opportunities to feed into policy discussions with parliament and government and that there needed to be many more meetings like this one!

This article was written for The House of Commons Library Blog Second Reading by Chandy Nath, acting Director of the POST and Cressida Auckland, a POST fellow.

Picture credit: Brexit Scrabble, by Jeff Djevdet; Creative Commons Attribution 2.0 Generic (CC by 2.0)

Measuring greenhouse gases during India’s monsoon

NERC’s BAe-146 research aircraft at the Facility for Airborne Atmospheric Measurements (FAAM). Image credit: FAAM
This summer, researchers across the UK and India are teaming up to study the Indian monsoon as part of an £8 million observational campaign using the NERC research aircraft, the BAe-146.

India receives around 80% of its annual rainfall during the monsoon season, between June and September. There are large year-to-year differences in the strength of the monsoon, which is heavily influenced by drivers such as aerosols and large-scale weather patterns, and this variability has a significant impact on the livelihoods of over a billion people. For example, due to the strong El Niño last year, the 2015 monsoon saw 14% less precipitation than average, with some regions of India facing up to a 50% shortfall. Forecasting the timing and strength of the monsoon is critical for the region, and particularly for India’s farmers, who must manage water resources to avoid failing crops.


Roadside mural of the BAe-146 in Bangalore, India. Original artist unknown.  Image credit: Guy Gratton

The observational campaign, which is part of NERC’s Drivers of Variability in the South Asian Monsoon programme, is led jointly by UK researchers: Professor Hugh Coe (University of Manchester), Dr Andy Turner (University of Reading) and Dr Adrian Matthews (University of East Anglia) and Indian scientists from the Indian Space Research Organization and Indian Institute of Science.

Bristol PhD student Dan Say installing sample containers on the BAe-146. Image credit: Angelina Wenger

To complement this project studying the physical and chemical drivers of the monsoon, I am measuring greenhouse gases from the aircraft with PhD student Dan Say (School of Chemistry, University of Bristol). Dan is gaining valuable field experience by operating several instruments aboard the BAe-146 through the intense heat and rain of the Indian monsoon.

Two of the greenhouse gases that we are studying, methane and nitrous oxide, are primarily produced during the monsoon season by India’s intensive agriculture. Methane is emitted from rice paddies, where flooded soils create prime conditions for “anaerobic” methane production. Nitrous oxide is also emitted from these flooded soils, due to the large quantities of fertiliser that are applied, again through anaerobic pathways.


Rice fields near Bangalore, India. Image credit: Guy Gratton.

Our understanding of the large-scale emissions of these greenhouse gases from India’s agricultural soils has so far been limited, and we aim to further our knowledge of what controls their production. In addition to the methane concentrations measured on the aircraft, we are also measuring, with collaborators at the isotope facility at Royal Holloway, University of London, the ratio of the 13-carbon isotope in methane, which will provide a valuable tool for differentiating between agricultural and other sources of methane in the region. By combining this information with other measurements from the aircraft (for example, of moisture and of other atmospheric pollutants), we aim to gain new insights into how these emissions might be reduced in the future.
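The idea behind the isotope measurements can be illustrated with a simple “Keeling plot” sketch: when a single source adds methane to background air, the measured isotope ratio varies linearly with the inverse of the concentration, and the intercept of that line recovers the isotopic signature of the source. The numbers below are illustrative assumptions for a rice-paddy-like source, not campaign data.

```python
import numpy as np

# Illustrative (made-up) values: background methane plus an agricultural plume.
# delta-13C values are in per mil; mole fractions in ppb.
bg_conc, bg_delta = 1900.0, -47.0   # background concentration and signature
src_delta = -62.0                   # assumed signature of the added source

# Mix increasing amounts of source methane into the background.
added = np.array([10.0, 50.0, 100.0, 200.0])   # ppb of source methane added
conc = bg_conc + added
# The mixture's isotope value is the concentration-weighted mean.
delta = (bg_conc * bg_delta + added * src_delta) / conc

# Keeling plot: regress delta against 1/concentration; the intercept
# (the limit 1/conc -> 0) recovers the source signature.
slope, intercept = np.polyfit(1.0 / conc, delta, 1)
print(round(intercept, 1))  # -> -62.0, the assumed source signature
```

In practice the measured air contains several overlapping sources and a varying background, so the real analysis is considerably more involved, but the principle of using isotope ratios to separate source signatures is the same.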

In addition, many synthetic (man-made) greenhouse gases are being measured for the first time in South Asia, giving us a first look at this region’s emissions of some of the most potent warming agents. These include halocarbons such as the hydrofluorocarbons (HFCs) and their predecessors, the hydrochlorofluorocarbons (HCFCs) and chlorofluorocarbons (CFCs). These gases will be measured at the University of Bristol School of Chemistry’s ‘Medusa’ gas chromatography-mass spectrometry (GC-MS) facility, run by Professor Simon O’Doherty.


Sample canisters for collecting air that will be measured on the School of Chemistry’s ‘Medusa’ GC-MS facility. Image credit: Angelina Wenger

————————————-

This blog is written by University of Bristol Cabot Institute member Dr Anita Ganesan, a NERC Research Fellow, School of Geographical Sciences, who looks at greenhouse gas emissions estimation.
Anita Ganesan