Countries may be under-reporting their greenhouse gas emissions – that’s why accurate monitoring is crucial

Luciann Photography / Pexels

Pledges to cut greenhouse gas emissions are very welcome – but accurate monitoring across the globe is crucial if we are to meet targets and combat the devastating consequences of global warming.

During COP26 in Glasgow, many countries have set out their targets to reach net-zero by the middle of this century.

But a serious note of caution was raised in a report in the Washington Post. It revealed that many countries may be under-reporting their emissions, with a gap between actual emissions into the atmosphere and what is being reported to the UN.

This is clearly a problem: if we are uncertain about what we are emitting now, we will not know for certain that we have achieved our emission reduction targets in the future.

Quantifying emissions

Currently, countries must follow international guidelines when it comes to reporting emissions. These reports are based on “bottom-up” methods, in which national emissions are tallied up by combining measures of socioeconomic activity with estimates on the intensity of emissions involved in those activities. For example, if you know how many cows you have in your country and how much methane a typical cow produces, you can estimate the total methane emitted from all the cows.
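
As a rough illustration of how such a bottom-up tally works (all activity data and emission factors below are invented for the example, not official inventory values):

```python
# Minimal sketch of a "bottom-up" inventory calculation:
# national emissions = sum over sources of (activity data x emission factor).
# All numbers here are made up for illustration only.

activity_data = {
    "dairy_cows": 1_900_000,       # head of cattle (hypothetical)
    "landfill_sites": 500,         # number of sites (hypothetical)
}

emission_factors_kg_ch4 = {
    "dairy_cows": 120.0,           # kg CH4 per cow per year (illustrative)
    "landfill_sites": 200_000.0,   # kg CH4 per site per year (illustrative)
}

total_ch4_tonnes = sum(
    activity_data[source] * emission_factors_kg_ch4[source]
    for source in activity_data
) / 1000.0  # kg -> tonnes

print(f"Estimated national CH4 emissions: {total_ch4_tonnes:,.0f} tonnes/year")
```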

There are internationally agreed guidelines that specify how this kind of accountancy should be done, and there is a system of cross-checking to ensure that the process is being followed appropriately.

But, according to the Washington Post article, there appear to be some unexpected differences in emissions being reported between similar countries.

Reporting expectations also differ considerably between countries. Developed countries must submit detailed, comprehensive reports each year. But, acknowledging the administrative burden of this process, developing countries can currently report much less frequently.

Plus, there are some notable gaps in terms of what needs to be reported. For example, the potent greenhouse gases that were responsible for the depletion of the stratospheric ozone layer – such as chlorofluorocarbons (CFCs) – are not included.

A ‘top-down’ view from the atmosphere

To address these issues, scientists have been developing increasingly sophisticated techniques that use atmospheric greenhouse gas observations to keep track of emissions. This “top-down” view measures what is in the atmosphere, and then uses computer models to work backwards to figure out what must have been emitted upwind of the measurements.
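
A heavily simplified sketch of that idea is below. It assumes we already have a sensitivity matrix from an atmospheric transport model linking emissions in a few regions to concentration enhancements at a few measurement stations; all numbers are invented. Real inversions combine such a model with prior estimates and uncertainties in a Bayesian framework, but the principle of working backwards from concentrations to emissions is the same.

```python
import numpy as np

# Toy "top-down" inversion: find the regional emissions x that best explain
# measured concentration enhancements y, given a transport-model sensitivity
# matrix H (y ≈ H @ x). All values are invented for illustration.

H = np.array([          # ppb of methane at each station per unit emission
    [0.8, 0.1, 0.0],    # station 1 mostly "sees" region 1
    [0.2, 0.7, 0.1],    # station 2 mostly "sees" region 2
    [0.0, 0.2, 0.9],    # station 3 mostly "sees" region 3
])

y = np.array([42.0, 35.0, 50.0])   # measured enhancements above baseline (ppb)

# Least-squares estimate of emissions from each region
x_est, *_ = np.linalg.lstsq(H, y, rcond=None)
print("Estimated emissions per region (arbitrary units):", x_est.round(1))
```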

To demonstrate the technique, an international team of scientists converged on Glasgow to observe how levels of carbon dioxide and methane have changed during the COP26 conference.

While this approach cannot provide the level of detail on emission sectors (such as cows, leaks from pipes, fossil fuels or cars) that the “bottom-up” methods attempt, scientists have demonstrated that it can show whether the overall inventory for a particular gas is accurate or not.

The UK was the first country to routinely publish top-down emission estimates in its annual National Inventory Report to the United Nations; it is now one of three, along with Switzerland and Australia.

A network of five measurement sites around the UK and Ireland continuously monitors the levels of all the main greenhouse gases in the air using tall towers in rural regions.

Emissions are estimated from the measurements using computer models developed by the Met Office. And the results of this work have been extremely enlightening.

In a recent study, we showed that the reported downward trend in the UK’s methane emissions over the last decade is mirrored in the atmospheric data. But a large reported drop before 2010 is not, suggesting the methane emissions were over-estimated earlier in the record.

In another, we found that the UK had been over-estimating emissions of a potent greenhouse gas used in car air conditioners for many years. These studies are discussed with the UK inventory team and used to improve future inventories.

While there is currently no requirement for countries to use top-down methods as part of their reporting, the most recent guidelines and a new World Meteorological Organisation initiative advocate their use as best practice.

If we are to move from only three countries evaluating their emissions in this way, to a global system, there are a number of challenges that we would need to overcome.

Satellites may provide part of the solution. For carbon dioxide and methane, the two most important greenhouse gases, observations from space have been available for more than a decade. The technology has improved dramatically in this time, to the extent that imaging of some individual methane plumes is now possible from orbit.

In 2018, India, which does not have a national monitoring network, used these techniques to include a snapshot of its methane emissions in its report to the UN.

But satellites are unlikely to provide enough information alone.

To move towards a global emissions monitoring system, space-based and surface-based measurements will be required together. The cost to establish ground-based systems such as the UK’s will be somewhere between one million and tens of millions of dollars per country per year.

But that level of funding seems achievable when we consider that billions have been pledged for climate protection initiatives. So, if the outcome is more accurate emissions reporting, and a better understanding of how well we are meeting our emissions targets, such expenditure seems like excellent value for money.

It will be up to the UN and global leaders to ensure that the international systems of measurement and top-down emissions evaluation can be scaled up to meet the demands of a monitoring system that is fit for purpose. Without robust emissions data from multiple sources, the accuracy of future claims of emission reductions may be called into question.

————————-

This blog is written by Cabot Institute for the Environment member Professor Matt Rigby, Reader in Atmospheric Chemistry, University of Bristol

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Time for policymakers to make policies (and to learn from those who are)

From a social scientist’s point of view, the recent IPCC report and the reception it has received are a bit odd. The report certainly reflects a huge amount of work, its message is vital, and it’s great so many people are hearing it. But not much in the report updates how we think about climate change. We’ve known for a while that people are changing the climate, and that how much more the climate changes will depend on the decisions we make.

What decisions? The Summary for Policymakers—the scientists’ memo to the people who will make the really important choices—doesn’t say. The words “fossil fuel”, “oil”, and “coal” never even appear. Nor “regulation”, “ban”, “subsidy”, or “tax”. The last five pages of the 42-page Summary are entitled “Limiting Future Climate Change”; but while “policymakers” appear, “policies” do not.

This is not the fault of the authors; Working Group I’s remit does not include policy recommendations. Even Working Group III (focused on mitigation) is not allowed to advocate for specific choices. Yet every IPCC contributor knows the most important question is which emission pathway we take, and that will depend on what policies we choose.

Which is why it’s so odd that big policy issues and announcements get comparatively little airtime (and research funding). For example, in June, the European Union codified in law the goal of reducing its greenhouse gas emissions by 55% by 2030 (relative to 1990), and last month the European Commission presented a set of ambitious proposals for hitting that target. As a continent, Europe is already leading the world in emission reductions (albeit starting from a high level, with large cumulative historical emissions), and showing the rest of the world how to organize high-income societies in low-carbon ways. But the Commission’s proposals—called “Fit for 55”—have gone largely under the radar, not only outside of the EU but even within it.

The proposals are worth examining. At least according to the Commission, they will make the EU’s greenhouse gas emissions consistent with its commitments under the Paris Agreement. (Independent assessments generally agree that while a 55% reduction by 2030 won’t hit the Paris Agreement’s 1.5˚C target, it would be a proportionate contribution to the goal of limiting global heating to no more than 2˚C.) And they will build on the EU’s prior reduction of its territorial emissions by 24% between 1990 and 2019.

A change of -24% over that period, and -18% for consumption emissions, is in one sense disappointing, given that climate scientists were warning about the need for action even before 1990. But this achievement, inadequate though it may be, far exceeds those of other high per-capita emitters, like the U.S. (+14%), Canada (+21%), or Australia (+54%).

The most notable reductions have been in the areas of electricity generation and heavy industry—sectors covered by the EU’s emissions trading system (ETS). Emissions from buildings have not declined as much, and those from transportation (land, air, and marine) have risen. Several of the Fit for 55 proposals therefore focus on these sectors. Maritime transport is to be incorporated into the ETS; free permits for aviation are to be eliminated; and a new, separate ETS for fuels used in buildings and land transport is to be established. Sales of new cars and trucks with internal combustion engines will end as of 2035, and increased taxes will apply to fuels for transport, heat, and electricity.

The Commission also proposes to cut emissions under the ETS by 4.2% each year (rather than 2.2% currently); expand the share of electricity sourced from renewables; and set a stricter (lower) target for the total amount of energy the EU will use by 2030—for the sake of greater energy efficiency.
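
As a back-of-the-envelope illustration of what raising the annual reduction factor from 2.2% to 4.2% means, here is a sketch with an invented starting cap. It assumes the factor works as a linear reduction (a fixed share of a base amount removed each year, as in the current ETS design); the real cap and baseline are set in the legislation itself.

```python
# Back-of-the-envelope sketch of how a linear reduction factor shrinks the
# EU ETS emissions cap. The starting cap and horizon are illustrative only.

base_cap = 1_600  # Mt CO2e, invented round number for the annual cap

def cap_in_year(years_ahead: int, linear_reduction_factor: float) -> float:
    """Cap after a number of years, reduced each year by a fixed share of the base cap."""
    return base_cap * (1 - linear_reduction_factor * years_ahead)

for lrf in (0.022, 0.042):  # 2.2% vs the proposed 4.2% per year
    print(f"LRF {lrf:.1%}: cap after 10 years ≈ {cap_in_year(10, lrf):,.0f} Mt CO2e")
```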

All of this is going to be hugely contentious, and it will take a year or two at least for the Commission, the member-states, and the European Parliament to negotiate a final version. Corporate lobbying will shape the outcome, as will public opinion (paywall).

Two of the most interesting proposals are meant to head off opposition from industry and voters. A carbon border adjustment mechanism will put a price on greenhouse gases emitted by the production abroad of selected imports into the EU (provisionally cement, fertiliser, iron, steel, electricity, and aluminium). This will protect European producers from competitors subject to weaker rules. A social climate fund, paid for out of the new ETS, will compensate low-income consumers and small businesses for the increased costs of fossil fuels—thereby preventing any rise in fuel poverty.

No country is doing enough to mitigate emissions. But Fit for 55 represents the broadest, most detailed emissions reductions plan in the world—and, in some form, it will be implemented. Decision-makers everywhere should be studying, and making, policies like this.

—————————–

This guest blog is by friend of Cabot Institute for the Environment and PLOS Climate Academic Editor Malcolm Fairbrother. Malcolm is a Professor of Sociology at Umeå University (Sweden), the Institute for Futures Studies (Stockholm), and University of Graz (Austria). Twitter: @malcolmfair. This blog has been reposted with kind permission from Malcolm Fairbrother. View the original blog.

Top image credit: Cold Dawn, Warm World by Mark McNestry, CC BY 2.0

 

National greenhouse gas reporting needs an overhaul – it’s time to directly measure the atmosphere

Junk Culture / shutterstock

How much greenhouse gas is emitted by any individual country? With global emissions of carbon dioxide hitting a record of 36.8 billion tonnes this year, and delegates gathering in Madrid for the latest UN climate talks, it’s a pressing question.

One might assume that we know precisely how much is emitted by any given country, and that such figures are rigorously cross-checked and scrutinised. And in some respects, this is true – countries are required to report their emissions to the UN, based on exhaustive guidelines and with reams of supporting data.

Yet these reports are based on what are known as inventory (or “bottom-up”) methods. To simplify, this means that governments figure out how much greenhouse gas is emitted by a typical car, cow, or coal plant, and then add up all the cows, cars and so on to get an overall emissions figure.

Map showing the UK’s CO2 emissions, calculated using ‘bottom-up’ methods. Daniel Hoare, University of Bristol, © Crown 2019 copyright Defra & BEIS, Author provided.

 

While this method is essential to understanding the make-up of a country’s emissions, it ultimately relies on accurate and comprehensive information on economic activity, some compromises to allow standardisation across countries, and some element of trust.

And such reporting can go awry. In 2018 and again earlier this year, colleagues and I made headlines when we first identified mystery emissions of a banned ozone-depleting substance and greenhouse gas, and then tracked its source down to factories in eastern China.

The problem is that these “bottom-up” emissions reports do not generally include what some might consider key information: measurements that can indicate the actual amount of greenhouse gas in the atmosphere.

So could new data help us better understand how much we are emitting?

A national greenhouse gas monitoring network

The UK, Switzerland and Australia have pioneered a measurement-based approach to add credibility and transparency to their emissions reports. In 2012, a network of measurement stations was established on telecommunications towers across the UK to sniff out greenhouse gases emitted from around the country.

A tower used to sample the greenhouse gases in the air in Norfolk, England. Inset: a researcher working on the project. University of Bristol, Author provided

To interpret these measurements, we use sophisticated computer models that simulate how gases are transported from the surface, through the atmosphere, to the points where they are observed. By comparing the modelled and measured concentrations, we can determine the national emission rate.

These “top-down” estimates, which now form a key part of the UK’s National Inventory Report to the UN, have yielded some surprising insights. Sceptics may suspect that governments would be keen to “hide” emissions from the rest of the world, but in at least one case atmospheric data suggests that the UK has for years actually over-estimated, by around 100%, emissions of a potent greenhouse gas used in car air conditioners (HFC-134a). In contrast, for the major greenhouse gases methane and nitrous oxide, the data in recent years corroborates the UK inventory reports remarkably well.

More questions than answers?

Naturally, once this measurement data is available, new questions emerge. For example, the UK inventory suggests that methane emissions have gradually declined since 1990 but the atmospheric data suggests little trend, if any. This is important, because the UK benchmarks its emissions reductions against the year 1990.

Could this suggest that the country has not been as successful as it thought at reducing methane leaked from landfills, for example? Or have such emissions reductions been offset by some other source? Unfortunately, such questions are difficult to answer using “standard” atmospheric measurement techniques – a molecule of methane emitted from a landfill looks very similar to one from a cow.

Very similar, that is, but not identical. I am involved in a new £3m project called DARE-UK (Detection and Attribution of Regional Emissions in the UK), which looks for tell-tale features that can help us identify where carbon dioxide, methane and nitrous oxide in the atmosphere came from.

One type of signal that we are looking for is a tiny perturbation to the ratio of heavy and light isotopes of methane and carbon dioxide in the air. Isotopes are almost identical to one another but differ in their molecular mass. It turns out that cow burps, for example, emit methane with less of the heavy isotope than similar amounts of methane from a leaky gas boiler. So, we hope that this type of data may help the UK’s inventory team identify which sectors of the bottom-up reports may require re-examination.
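
A toy calculation shows how such an isotopic signal can be used to apportion sources. The delta-13C values below are illustrative, not measured values from the project; the point is only that a mixture's signature sits between those of its sources.

```python
# Toy two-source isotope mixing calculation. If methane from cows ("biogenic")
# and from gas leaks ("fossil") carry different 13C/12C signatures (delta-13C),
# the signature of a measured mixture tells us the share from each source.
# The delta values below are illustrative, not measured values.

delta_biogenic = -65.0   # permil, roughly typical of ruminants (illustrative)
delta_fossil   = -40.0   # permil, roughly typical of natural gas (illustrative)
delta_measured = -55.0   # permil, hypothetical measured mixture

# Linear mixing: delta_measured = f * delta_biogenic + (1 - f) * delta_fossil
f_biogenic = (delta_measured - delta_fossil) / (delta_biogenic - delta_fossil)
print(f"Estimated biogenic share of the methane: {f_biogenic:.0%}")
```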

We need improved transparency

While these measurements are proving a valuable aid for inventory compilers, their main utility is likely to be in ensuring trust and transparency in the international reporting process. Atmospheric measurements do not suffer from the confidentiality issues that can prevent interested parties from peeking behind the scenes of national inventories.

Could governments still hide their emissions? It’s unlikely, provided top-down methods are used with open and transparent protocols and data sharing. This should avoid accusations of foul play that could threaten to derail initiatives like the international climate accord, the Paris Agreement.

The UK example shows this type of emissions evaluation is now ready for the international stage. Institutions such as the World Meteorological Organization are working with governments and sub-national stakeholders to try to make it happen. Hopefully policymakers will see the value of finding out what’s really being released into their air.

————————————
This blog is written by Cabot Institute member Dr Matt Rigby, Reader in Atmospheric Chemistry, University of Bristol. This article is republished from The Conversation under a Creative Commons license. Read the original article.

An insight into aviation emissions and their impact on the atmosphere

Image credit: El Ronzo, Flickr

The proliferation of aviation has brought about huge benefits to our society, enhancing global economic prosperity and allowing humanity to travel faster, further and more frequently than ever before. However, the relentless expansion of the industry is a major detriment to the environment at a local, regional and global level, owing to the vast amounts of pollution produced by the jet fuel combustion needed to propel aircraft through the air and sustain steady, level flight.

Aircraft impact the climate largely through the release of CO2, which contributes directly to the greenhouse effect, absorbing terrestrial radiation and trapping heat within the atmosphere, leading to rising temperatures. However, it is also vital not to overlook the non-CO2 aircraft emissions such as NOx, soot and water vapour, which drive additional climate change mechanisms: the indirect greenhouse effect, the direct aerosol effect and aviation-induced cloudiness. When these non-CO2 effects are accounted for, aviation’s total climate impact is estimated to be roughly two to three times that of its CO2 emissions alone.
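
As a rough, hedged illustration of that multiplier, the sketch below converts an assumed fuel burn into CO2 and then applies an illustrative non-CO2 factor; the fuel burn and the factor of two are assumptions for the example, not figures from the report.

```python
# Rough sketch of estimating a flight's climate impact in CO2-equivalent terms.
# The fuel burn and the non-CO2 multiplier are assumptions for illustration;
# ~3.16 kg CO2 per kg of jet fuel burned is a widely used combustion factor.

fuel_burned_kg = 60_000        # hypothetical fuel burn for a long-haul flight
co2_per_kg_fuel = 3.16         # kg CO2 per kg kerosene burned
non_co2_multiplier = 2.0       # illustrative factor for NOx, contrails, etc. (often quoted as ~2-3)

co2_tonnes = fuel_burned_kg * co2_per_kg_fuel / 1000
total_impact_tonnes = co2_tonnes * non_co2_multiplier

print(f"CO2 alone: {co2_tonnes:.0f} t; with non-CO2 effects: ~{total_impact_tonnes:.0f} t CO2-equivalent")
```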

This report provides the necessary background to grasp the science behind aircraft emissions and delves into the impact aviation has on the atmosphere’s ability to cleanse itself of harmful emissions, otherwise known as the oxidising capacity of the atmosphere. It does so through an analysis of three distinct and commonly flown flight routes, investigating the impact that each flight’s emissions have on the surrounding atmospheric chemistry and discussing the potential effects on our Earth-atmosphere system.

Read the full report by Kieran Tait

——————————
Read our other blogs about air travel:

  1. To fly or not to fly? Towards a University of Bristol approach
  2. I won’t fly to your conference, but I hope you will still invite me to participate
Watch our latest video on air travel at the University of Bristol.

AI & sustainable procurement: the public sector should first learn what it already owns

While carrying out research on the impact of digital technologies for public procurement governance, I have realised that the deployment of artificial intelligence to promote sustainability through public procurement holds some promise. There are many ways in which machine learning can contribute to enhancing procurement sustainability.

For example, new analytics applied to open transport data can significantly improve procurement planning to support more sustainable urban mobility strategies, as well as the emergence of new models for the procurement of mobility as a service (MaaS).* Machine learning can also be used to improve the logistics of public sector supply chains, as well as unlock new models of public ownership of, for example, cars. It can also support public buyers in identifying the green or sustainable public procurement criteria that will deliver the biggest improvements measured against any chosen key performance indicator, such as CO2 footprint, as well as support the development of robust methodologies for life-cycle costing.

However, it is also evident that artificial intelligence can only be effectively deployed where the public sector has an adequate data architecture.** While advances in electronic procurement and digital contract registers are capable of generating that data architecture for the future, there is a significant problem concerning the digitalisation of information on the outcomes of past procurement exercises and the current stock of assets owned and used by the public sector. In this blog, I want to raise awareness about this gap in public sector information and to advocate for the public sector to invest in learning what it already owns as a potential major contribution to sustainability in procurement, in particular given the catalyst effect this could have for a more circular procurement economy.

Backward-looking data as a necessary evidence base

It is notorious that the public sector’s management of procurement-related information is lacking. It is difficult enough to have access to information on ‘live’ tender procedures. Accessing information on contract execution and any contractual modifications has been nigh impossible until the very recent implementation of the increased transparency requirements imposed by the EU’s 2014 Public Procurement Package. Moreover, even where that information can be identified, there are significant constraints on the disclosure of competition-sensitive information or business secrets, which can also restrict access.*** This can be compounded in the case of procurement of assets subject to outsourced maintenance contracts, or in assets procured under mechanisms that do not transfer property to the public sector.

Accessing information on the outcomes of past procurement exercises is thus a major challenge. Where the information is recorded, it is siloed and compartmentalised. And, in any case, this is not public information and it is oftentimes only held by the private firms that supplied the goods or provided the services—with information on public works more likely to be, at least partially, under public sector control. This raises complex issues of business to government (B2G) data sharing, which is only a nascent area of practice and where the guidance provided by the European Commission in 2018 leaves many questions unanswered.*

I will not argue here that all that information should be automatically and unrestrictedly publicly disclosed, as that would require some careful consideration of the implications of such disclosures. However, I submit that the public sector should invest in tracing back information on procurement outcomes for all its existing stock of assets (either owned, or used under other contractual forms)—or, at least, for the main categories of buildings and real estate, transport systems, and IT and communications hardware. Such a database should then be made available to data scientists tasked with seeking all possible ways of optimising the value of that information for the design of sustainable procurement strategies.

In other words, in my opinion, if the public sector is to take procurement sustainability seriously, it should invest in creating a single, centralised database of the durable assets it owns as the necessary evidence base on which to seek to build more sustainable procurement policies. And it should then put that evidence base to good use.

More circular procurement economy based on existing stocks

In my view, some of the main advantages of creating such a database in the short-, medium- and long-term would be as follows.

In the short term, having comprehensive data on existing public sector assets would allow for the deployment of different machine learning solutions to, for example, identify redundant or obsolete assets that could be reassigned or disposed of, or to reassess the efficiency of existing investments, eg in terms of levels of use and potential for increased sharing of assets, or in terms of the energy (in)efficiency derived from their use. It would also allow for a better understanding of potential additional improvements in, eg, maintenance strategies, as services could be designed taking the entirety of the relevant stock into consideration.
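
As a minimal sketch of the kind of screening such a database would enable (the asset register and the rules below are entirely hypothetical, standing in for what would in practice be machine learning over much richer data):

```python
# Hypothetical public-sector asset register and a crude screening rule.
# Real deployments would train models on far richer data; this only
# illustrates the kind of question a centralised database makes answerable.

assets = [
    {"id": "PC-0001",  "category": "laptop",  "age_years": 6, "utilisation": 0.05},
    {"id": "VAN-0101", "category": "vehicle", "age_years": 2, "utilisation": 0.80},
    {"id": "PRN-0042", "category": "printer", "age_years": 9, "utilisation": 0.02},
]

def flag(asset, min_utilisation=0.10, max_age=8):
    """Very crude rules standing in for a trained model."""
    if asset["utilisation"] < min_utilisation and asset["age_years"] > max_age:
        return "candidate for recycling / disposal"
    if asset["utilisation"] < min_utilisation:
        return "candidate for reassignment or sharing"
    return "keep"

for a in assets:
    print(a["id"], "->", flag(a))
```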

In the medium term, this would also provide better insights on the whole life cycle of the assets used by the public sector, including the possibility of deploying machine learning to plan for timely maintenance and replacement, as well as to improve life cycle costing methodologies based on public-sector specific conditions. It would also facilitate the creation of a ‘public sector second-hand market’, where entities with lower levels of performance requirements could acquire assets no longer fit for their original purpose, eg computers previously used in more advanced tasks that still have sufficient capacity could be repurposed for routine administrative tasks. It would also allow for the planning and design of recycling facilities in ways that minimised the carbon footprint of the disposal.

In the long run, in particular post-disposal, the existence of the database of assets could unlock a more circular procurement economy, as the materials of disposed assets could be reused for the building of other assets. In that regard, there seem to be some quick wins to be had in the construction sector, but having access to more and better information would probably also serve as a catalyst for similar approaches in other sectors.

Conclusion

Building a database on existing public sector-used assets as the outcome of earlier procurement exercises is not an easy or cheap task. However, in my view, it would have transformative potential and could generate sustainability gains aimed not only at reducing the carbon footprint of future public expenditure but, more importantly, at correcting or somehow compensating for the current environmental impacts of the way the public sector operates. This could make a major difference in accelerating emissions reductions, and it should consequently be a matter of high priority for the public sector.

* A Sanchez-Graells, ‘Some public procurement challenges in supporting and delivering smart urban mobility: procurement data, discretion and expertise’, in M Finck, M Lamping, V Moscon & H Richter (eds), Smart Urban Mobility – Law, Regulation, and Policy, MPI Studies on Intellectual Property and Competition Law (Berlin, Springer, 2020) forthcoming. Available on SSRN: http://ssrn.com/abstract=3452045.

** A Sanchez-Graells, ‘Data-driven procurement governance: two well-known elephant tales’ (2019) Communications Law, forthcoming. Available on SSRN: https://ssrn.com/abstract=3440552.

*** A Sanchez-Graells, ‘Transparency and competition in public procurement: A comparative view on a difficult balance’, in K-M Halonen, R Caranta & A Sanchez-Graells (eds), Transparency in EU Procurements: Disclosure within public procurement and during contract execution, vol 9 EPL Series (Edward Elgar 2019) 33-56. Available on SSRN: https://ssrn.com/abstract=3193635.

————————————
This blog was written by Cabot Institute member Professor Albert Sanchez-Graells, Professor of Economic Law (University of Bristol Law School).

Albert Sanchez-Graells

Science in action: Air pollution in Bangkok

Bangkok haze 2019 March. Wikimedia Commons.

I was given the opportunity to spend a significant part of 2018 in Bangkok, Thailand, working with the Chulabhorn Research Institute (CRI) Laboratory of Environmental Toxicology on a Newton Fund project on air quality. Bangkok is a large city with over 14 million inhabitants, who endure heavy traffic and congestion and, as a result, high exposure to traffic-related pollution. Reducing the number of deaths caused by pollution by 2030 is one of the UN Sustainable Development Goals. Air pollution is a global problem – a major threat to health throughout the world – but particularly so in low- and middle-income countries, which account for 92% of pollution-related deaths (1). The poor and the marginalised often live in areas of high pollution, and children have a disproportionate exposure to pollutants at a vulnerable stage of development.

The Chulabhorn Research Institute is an independent research institute in Bangkok whose mission includes the application of science and technology to improve the Thai people’s quality of life. The Laboratory of Toxicology, under Professor Mathuros Ruchirawat, has a very strong record of using its results to inform policy and make a real impact on the lives of people in South East Asia affected by environmental toxins. For example, a previous study undertaken by the CRI found that people living and working near busy roads were exposed to benzene from traffic exhaust and showed increased DNA damage. Once this was known, the Thai government was persuaded to alter fuel mixtures in cars to protect the population (2).

I was in Bangkok from January to June, then returned in September until December 2018. I brought with me particle measurement and sampling equipment to count particles and sample particulate matter (PM) in Bangkok to supplement the toxicology work of the research institute. PM can be described by its size fractions; usually reported are PM10 (aerosol diameter 10 micrometres and lower) and PM2.5 (2.5 micrometres and lower). Less often measured are the sub-micron range (sometimes referred to as PM1) and the ultrafine range (less than 100 nm).

James Matthews with his particle measurement and sampling equipment on public transport in Bangkok.

Below 1 μm, it becomes more difficult to measure particle numbers, as optical techniques fail on particles around 200 nm and smaller. To count them, the particles must first be grown with a solvent to a countable size. The need for regular solvents, and the high price of aerosol instrumentation that can measure the smallest sizes, mean that particle number concentration is not always measured as a matter of course, but new research indicates that these smallest particles may be a significant health concern. Smaller particles can penetrate further into the lung, and there is some evidence that this may allow them to pass further into the body, possibly even making their way into the brain. While much more research is needed – in both the toxicological and epidemiological domains – to understand the effects of these smaller particles, I would not be surprised if the narrative on air quality moves further towards the ultrafine size range in the very near future.

While in Bangkok, I added my aerosol science expertise and experience in aerosol field measurements to the team in the CRI, taking measurements of particle number count using a handheld particle counter, and collecting samples of PM using both PM10 samplers, and a cascade impactor (the Dekati Electrical Low Pressure Impactor) that allowed samples to be separated by aerodynamic size, then collected for further analysis on ICP-MS (inductively coupled plasma mass spectrometry). Thus, metal concentrations within all the different size fractions of aerosol could be found. Within the first few months of the project, I was able to train the staff at the CRI to use this equipment, and so measurements could continue when I returned to the UK.

As well as taking measurements at the CRI in the Lak Si district, north of Bangkok, we chose three sites in the wider Bangkok area that represented different exposure conditions. We were given access to the Thai governmental Pollution Control Department (PCD) air quality measurements sites, where our instrumentation was set up next to their other pollutant measurements.

A toll road and railway in Lak Si – from Bangkok toward the Don Mueang airport. Image credit James Matthews.

The three sites included Ayutthaya, a UNESCO World Heritage site north of Bangkok. Ayutthaya, while a busy tourist destination, has considerably less traffic, and therefore fewer traffic-emission-related pollutants, than the other sites. The second site, Bang Phli, was an area to the south of Bangkok with a lot of industry. The third, Chok Chai, was a roadside site in central Bangkok.

Survey measurements of particle count were taken in several locations using a hand-held particle counter. The particle numbers were highest in measurements on the state railway and on the roads in Bangkok. Trains on the state railway through Bangkok move slowly, and their diesel engines repeatedly start and brake, all of which contributes to particulate emissions. Conversely, the newer sky train and underground railways had low particle counts (the underground had the lowest counts I measured anywhere). At the CRI site, long-term measurements near a toll road showed that the particle number count was highest at rush hour, indicating traffic as the dominant contributor. Walking routes in both Bangkok and Ayutthaya showed high concentrations near roads, and in markets and street stalls where vendors prepare food.

Within our measurements in Bangkok, we were able to measure the mass fraction, and the metal (and some non-metal, such as arsenic) content, across 12 size fractions from 10 μm down to 10 nm. Metals that are known to be deleterious to human health include cadmium, chromium, nickel and lead, which are class 2 (possible or probable) or class 1 (known) carcinogens. Comparing the reference site (Ayutthaya) with the roadside site (Chok Chai) over several three-day sampling periods showed that these toxic metals were present in higher concentrations in the area with higher traffic. They were also present in higher concentrations in the lower size ranges, which may result in these metals penetrating deeper into the lung.
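
The sketch below illustrates the kind of site-to-site, size-resolved comparison described here; the concentrations are invented placeholders, not the study's data.

```python
# Sketch of comparing size-fractionated metal concentrations between a roadside
# and a reference site. All concentrations (ng/m3) are invented purely to show
# the kind of comparison made; they are not the study's data.

size_bins_um = [0.01, 0.1, 1.0, 10.0]          # upper edge of each size bin
chok_chai_pb = [1.8, 2.4, 1.1, 0.6]            # roadside site, lead (hypothetical)
ayutthaya_pb = [0.4, 0.6, 0.5, 0.3]            # reference site, lead (hypothetical)

for edge, road, ref in zip(size_bins_um, chok_chai_pb, ayutthaya_pb):
    print(f"<= {edge:5.2f} um:  roadside/reference ratio = {road / ref:.1f}")
```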

One episode demonstrated the need for local knowledge when measurements are taken. Normally, we would expect measurements during weekdays to be higher in working areas than at weekends, due to increased work traffic, and in most cases this was true (Ayutthaya was often an exception, likely due to weekend traffic from tourists). However, one Saturday evening saw a notable peak in aerosol concentrations at one site, which was difficult to explain. Colleagues at the institute pointed out that this weekend coincided with a festival during which the Chinese community in Bangkok traditionally burns items, including fake money. The peak in particles fitted this explanation.

Knowing more about the nature and level of pollutants in a city is an important first step, but challenges persist for the people of Bangkok and other polluted cities as to how to reduce these exposures. The problem of rural crop burning is one that frustrates many in Thailand, as it is well known that the particulates from burnt crops are harmful to the population. While there are strong restrictions on the deliberate burning of crops, it is still common to see fields burning from January to March in areas of northern Thailand. Similarly, Bangkok remains the city of the car, with residents accepting that they may have to sit in traffic for long periods to get anywhere.

Burning mountains in Thailand. Wikimedia Commons.

Researchers from Bristol were able to discuss our results alongside measurements from the PCD and the CRI in Bangkok in a seminar held in 2019. It was apparent that there is great awareness of the dangers of air pollution, but it still seems that more needs to be done to address these problems. In January 2019, Bangkok made the headlines for PM2.5 levels that were dangerously high. January in Thailand falls in the ‘cool’ season, when both temperatures and rainfall are low. This weather traps pollutants within the city, increasing exposure levels. On discussing this with the pollution experts in Thailand, it was argued that the levels this year were typical for January, but that the reporting of those levels had changed. The Thai PCD advocates communication of pollutant levels through its website and app, but until recently PM2.5 was not measured at enough of its stations to be included in its air quality index calculations. This year, the calculation was changed to include PM2.5, and as a consequence the high pollutant levels discussed above were reported. The reporting of these levels can be attributed to the population’s greater awareness of the problem of pollution, which in turn is leading to a greater urgency in finding solutions.

So there is a role for the direct engagement with the population, which may lead to pressure on governments to respond. There is also a role for science to provide leaders with tangible solutions, such as the suggestion to change fuel mixtures. But the huge challenge of reducing sources of pollutants in a growing city like Bangkok remains.

1 Landrigan, P. J. et al., 2017. Lancet 391, 462-512.
2 Davies R., 2018. Lancet 391, 421.

————————–
This blog is written by Cabot Institute member Dr James Matthews, School of Chemistry, University of Bristol. James’ research looks at the flow of gases in urban environments, and the use of perfluorocarbon trace gas releases to map the passage of air in urban cities.
James Matthews

Collecting silences

‘Noise’ is the greenhouse gas (GHG) emissions that have resulted from fossil-fuel-powered economic growth, which is measured as GDP for particular territories. In Figure 1, ‘noise’ is the area below the green line to the left of the vertical dotted line (historical) and below the blue line to the right of the vertical dotted line (projected). ‘Silence’ is the reduction of fossil-fuel use and the mitigation of carbon emissions. In Figure 1, ‘silence’ is the green shaded area above the blue line and below the dotted blue line to the right of the vertical dotted line.

Figure 1

To ensure that we maintain atmospheric GHG concentrations conducive to human habitation and the ecosystems that support us, we need to assign less value to ‘noise’ (burning fossil fuels) and more value to ‘silence’ (GHG emission mitigations). Creating a system which assigns value to ‘silences’ by turning them into investable resources requires an effort-sharing mechanism to establish demand and organizational capacity, alongside accurate measuring, reporting and verification (MRV) for supply.
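
The basic arithmetic of valuing 'silences' can be sketched as follows; the baseline, actual emissions and price are all invented, and a real mechanism would of course also need the MRV and quasi-property rights discussed below.

```python
# Toy illustration of valuing 'silences': mitigation is counted as the gap
# between a baseline emissions trajectory and actual emissions, and a price
# is attached to that gap. All figures are invented.

baseline = [100, 100, 100, 100, 100]   # Mt CO2e per year under business as usual
actual   = [100,  95,  88,  80,  70]   # Mt CO2e per year actually emitted
price_per_tonne = 50                    # currency units per tonne CO2e (illustrative)

silences = [b - a for b, a in zip(baseline, actual)]        # mitigations per year
value = sum(silences) * 1_000_000 * price_per_tonne         # Mt -> tonnes

print("Mitigations per year (Mt CO2e):", silences)
print(f"Total value of the 'silences': {value:,.0f} currency units")
```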

Organizational capacity for supplying ‘silences’ depends on the ability of organizations to create, trade and accumulate GHG emission mitigations. Due to the intangible nature of such ‘silences’, turning GHG emissions mitigations into investable sources requires their assetization as quasi-private goods with well-defined and delineated quasi-property rights. As preservations of the intangible commodity of stable atmospheric GHG concentrations through the reduction of pollution, such rights need to protect investment by ensuring that these private goods are definable, identifiable and capable of assumption by third parties. Such rights also require enforcement and protection against political and regulatory risk.

Commodifying GHG emission mitigations as quasi-private goods by assetizing them with well-defined and delineated quasi-property rights therefore provides the basis for the supply of ‘silences’. Rather than ‘internalising’ the cost of stabilising or reducing atmospheric GHG concentrations, this approach assigns value to GHG emission mitigations. Yet, if we want to avoid climate catastrophe according to the most recent IPCC 1.5°C report and the UNEP Emissions Gap Report, GHG emission mitigations also require concretization on the demand side. There are several examples of GHG emission mitigation and energy demand reduction assetization that can help illustrate how such systems of demand and supply can function.

Similar to GHG emission mitigations, energy demand reductions also represent the reduction of an intangible commodity vis-à-vis a baseline. While stable atmospheric GHG levels are the intangible commodity in the case of the former, in the case of the latter the intangible commodity is the energy supply which fuels economic growth. Both require the assetization of mitigations/reductions to create ‘tangibility’, which provides the basis for assigning value. To illustrate, energy demand reductions are absent from domestic and corporate accounts and subsequently undervalued vis-à-vis increases in revenues.

Market-based instruments that succeed in setting and enforcing targets and creating systems of demand, however, can create ‘tangibility’. Energy demand reductions, for example, are assetized as white certificates representing equal units of energy savings (negawatts) in white certificate markets. Similarly, demand-side response enables the assetization of short-term shifts in energy (non-)use (flexiwatts) to benefit from flexibility and balancing markets. Carbon emission mitigations are assetized under the Clean Development Mechanism (CDM) as Certified Emissions Reductions (CERs).

Crucially, these examples shift the emphasis from the cost of pollution and the need to ‘internalise’ this cost or from turning pollution into a quasi-private good through Emissions Trading Schemes (ETS) towards the positive pricing of energy demand reductions and carbon emission mitigations. Positive pricing turns their respective reduction and mitigation itself into a quasi-private good by turning ‘silences’ into investable resources.

The main technical difficulty of establishing such systems lies in the definition of baselines and measuring, reporting and verification vis-à-vis these baselines. The difficulties inherent in this approach are well documented but improved sensing technology, such as the Internet of Things (IoT), and distributed ledgers promise greatly improved granularity and automated time-stamping of all aspects of energy (non-)use at sub-second intervals. If structures of demand are clearly identified through target-driven market-based instruments and supply is facilitated through the assetization of ‘silences’ as quasi-private goods with clearly defined and enforced quasi-property rights, a clear incentive also exists to ensure that MRV structures are improved accordingly.

Key to the implementation of such target-driven market-based instruments are mechanisms to ensure that efforts are shared among organisations, sectors or countries, depending on the scale of implementation. Arguably, one of the reasons why the CDM failed in many aspects was because of the difficulty of proving additionality. This concept was supposed to ensure that only projects that could prove their viability based on the availability of funds derived from the supply, trade and accumulation of CERs would be eligible for CDM registration.

The difficulty of proving additionality increases cost and complexity. To ensure that new mechanisms no longer require this distinction, a dynamic attribution of efforts is required. A mechanism to dynamically share efforts can also help address the rebound effects inherent in energy efficiency and energy demand reduction efforts. Key is the target-driven nature of the associated market-based instruments and a dynamic mechanism which shares any rebounds (i.e. increases in carbon emissions) equitably among organisations, sectors or countries. With an appropriate effort-sharing mechanism in place, the demand and supply of ‘silences’ can be aligned with targets aiming to maintain atmospheric GHG concentrations at levels conducive to human habitation and the ecosystems that support us.

—————————-
This blog is written by Cabot Institute member Dr Colin Nolden, a Vice Chancellor’s Fellow in sustainable city business models. The blog has been reposted with kind permission of World Sustainable Energy Days. If you would like to read more on this topic, you can read Colin’s research paper here.

Colin Nolden

 

Monitoring greenhouse gas emissions: Now more important than ever?

As part of Green Great Britain Week, supported by BEIS, we are posting a series of blogs throughout the week highlighting what work is going on at the University of Bristol’s Cabot Institute for the Environment to help provide up to date climate science, technology and solutions for government and industry.  We will also be highlighting some of the big sustainability actions happening across the University and local community in order to do our part to mitigate the negative effects of global warming. Today our blog will look at ‘Explaining the latest science on climate change’.

The IPCC report

On 8 October 2018 the Intergovernmental Panel on Climate Change (IPCC) [1] published their special report on Global Warming of 1.5 ˚C. As little as 24 hours after the report had been published, the results of the report were already receiving extensive global coverage in the media, with BBC News describing the report as the “final call”. The BBC News article also explicitly mentions that this is “the most extensive warning yet on the risks of rising global temperatures. Their dramatic report on keeping that rise under 1.5 ˚C states that the world is now completely off track, heading instead towards 3 ˚C. Staying below 1.5 ˚C will require ‘rapid, far-reaching and unprecedented changes in all aspects of society’ [2].”

Reading the report has quite honestly been somewhat overwhelming, but also necessary to understand exactly what we are in for. And as much as I understand the difficulty one might face with either the technical terms of the report or the sheer volume of information, I would really encourage you to give it a read. This special report covers a wide range of subjects, from oceans, ice and flooding to crops, health and the economy. However, if you do find the chapters themselves too lengthy or difficult, there is an excellent interactive, and very easy-to-use, tool on Carbon Brief’s website that will help you explore the impacts of 1.5 ˚C, 2 ˚C and beyond.

There are two distinct parts to the IPCC special report: the full technical report, which consists of five chapters, and a short Summary for Policymakers (SPM). The SPM clearly states that “Estimated anthropogenic global warming matches the level of observed warming to within ±20%”, which translates into ‘almost 100% of the warming is the result of human activity’ [3][4].

We know for a fact that human activity is warming the planet

One outcome of this “human activity” that we often discuss is the emission of greenhouse gases (GHGs). Through various types of activities, whether agriculture, deforestation or burning fossil fuels, GHGs are emitted to the atmosphere. Without going too much into the chemistry and physics, what these GHGs do is change the mixing ratios within the atmosphere, resulting in greater absorption of infrared radiation. It is this change in the composition of our atmosphere that we refer to as the man-made greenhouse effect, which leads to the warming described in the IPCC report. But far beyond the warming effect itself, global warming has all sorts of impacts, most of which you can explore through the interactive link above.

Greenhouse gases and a long history of monitoring

Some of the ‘usual suspects’ in the discussion of GHG emissions are carbon dioxide (CO2), methane (CH4) and nitrous oxide (N2O), often described as the ‘major’ greenhouse gases [5]. However, an often-overlooked set of halogenated greenhouse gases is playing an increasingly large role in anthropogenic climate change. Gases like perfluorocarbons (PFCs) and hydrofluorocarbons (HFCs) are compounds that are emitted through some form of human activity. In the case of PFCs, for example, the GHGs CF4 and C2F6 are two of the most volatile and long-lived gases monitored under the Kyoto Protocol [6], and both are primarily emitted through or during industrial processes. In contrast, HFCs are used widely as coolants in refrigerators and air-conditioning units, as blowing agents in foam manufacture and as propellants in aerosols. They were originally introduced to replace ozone-depleting gases such as chlorofluorocarbons (CFCs) but, like their predecessors, are potent greenhouse gases. Given the long lifetime of many of these halogenated gases, current emissions will influence the climate system for decades to come.
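
To see why these gases matter despite their small emitted masses, inventories convert them into CO2-equivalents using global warming potentials (GWPs). The sketch below uses round, illustrative GWP100 values of the order reported in IPCC assessments; official inventories use the exact values specified in the relevant IPCC report, and the emissions are hypothetical.

```python
# Sketch of converting emissions of halogenated gases into CO2-equivalents
# using 100-year global warming potentials (GWP100). The GWP values are round,
# illustrative numbers of the order reported in IPCC assessments; consult the
# relevant IPCC report for the values used in official inventories.

gwp100 = {
    "CF4":      6600,
    "C2F6":    11000,
    "HFC-134a": 1300,
}

emissions_tonnes = {   # hypothetical annual emissions
    "CF4":       10.0,
    "C2F6":       2.0,
    "HFC-134a": 150.0,
}

co2e = {gas: emissions_tonnes[gas] * gwp100[gas] for gas in gwp100}
print({gas: f"{value:,.0f} t CO2e" for gas, value in co2e.items()})
print(f"Total: {sum(co2e.values()):,.0f} t CO2e")
```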

In order to monitor the accumulation of these gases in the atmosphere, high-precision measurements are required. Through projects such as the Advanced Global Atmospheric Gases Experiment (AGAGE) [7] (Figure 1 [8]), which has been measuring the composition of the global atmosphere continuously since 1978, and the National Oceanic and Atmospheric Administration’s Earth System Research Laboratory Global Monitoring Division, scientists have tracked the atmospheric concentrations of climate-forcing gases from as far back as the 1950s [9].

Figure 1: The AGAGE network

The Atmospheric Chemistry Research Group (ACRG), Chemistry Department, University of Bristol

The ACRG carries out research in the UK and worldwide in collaboration with other atmospheric chemistry research centres, universities and third parties. In the UK, the ACRG runs the UK Deriving Emissions linked to Climate Change network (DECC) [10], funded by the Department for Business, Energy and Industrial Strategy (BEIS), to measure atmospheric GHGs and ozone-depleting substances over the UK. These measurements are used in elaborate mathematical models to create top-down emission estimates for the UK and to verify the UK GHG inventories submitted to the United Nations Framework Convention on Climate Change (UNFCCC) as part of the Kyoto Protocol. Worldwide, the group is involved in the AGAGE network, monitoring global background levels of a wide range of GHGs. The ACRG runs two of the nine global background stations under the AGAGE programme. One of these is the Mace Head station (Figure 2) on the west coast of Ireland, which is ideally placed for resolving northern hemispheric baseline air amongst European pollution events. The other AGAGE research station managed by the ACRG is the site at Ragged Point, Barbados. This site, just north of the tropics, sits on the eastern edge of the island of Barbados and is directly exposed to the Atlantic. The researchers in the ACRG study a variety of GHGs and cover a very large range of topics, from maintaining instrument suites to ensuring the quality of the resulting data so that it can be used in modelling studies.

Figure 2: The Mace Head Station (Credit: Dr Kieran Stanley)

Why are measuring stations and networks like AGAGE so valuable and more important than ever?

The answer to this question is straightforward. Without measurement stations and their underlying networks, we would have very few means [11] by which to measure the accumulation of GHGs in the global atmosphere, and consequently no way of evaluating their emissions without relying on statistics from the industries that emit them. The current IPCC report is underpinned by such measurements, which allow scientists to estimate the impact of anthropogenic activity on past, present and future climates.

From Mauna Loa and its 60-year record of atmospheric CO2 [12], to the unexpected growth in emissions of banned substances such as CFC-11 [13], to monitoring the accumulation of extremely long-lived greenhouse gases in the global atmosphere, atmospheric measurement stations have been our inside man when it comes to keeping track of what is happening in our atmosphere and the extent to which human activities are altering its composition.

Perhaps now more than ever, in the light of the IPCC report, we can appreciate the importance of the data that have been collected over decades, and also the efforts of those who have been directly or indirectly involved in this kind of work. Continuing and expanding the measurement networks for these gases is, and will remain, vital for a continued understanding of global and regional GHG emission trends.

References

[1] http://www.ipcc.ch/
[2]  https://www.bbc.co.uk/news/science-environment-45775309
[3]  http://report.ipcc.ch/sr15/pdf/sr15_spm_final.pdf
[4]  https://www.carbonbrief.org/analysis-why-scientists-think-100-of-global-warming-is-due-to-humans
[5]  https://www.c2es.org/content/main-greenhouse-gases/
[6]  https://www.atmos-chem-phys.net/10/5145/2010/acp-10-5145-2010.pdf
[7]  https://agage.mit.edu/
[8]  https://agage.mit.edu/
[9]  https://www.esrl.noaa.gov/gmd/about/aboutgmd.html
[10]  http://www.bristol.ac.uk/chemistry/research/acrg/current/decc.html
[11]  https://www.co2.earth/co2-ice-core-data
[12]  https://www.co2.earth/daily-co2
[13]  https://www.theguardian.com/environment/2018/may/16/mysterious-rise-in-banned-ozone-destroying-chemical-shocks-scientists

—————————-
This blog is written by Cabot Institute members Eleni Michalopoulou, Dr Dan Say, Dr Kieran Stanley and Professor Simon O’Doherty from the University of Bristol’s School of Chemistry.

Dan Say
Eleni Michalopoulou

 

Read other blogs in this Green Great Britain Week series:
1. Just the tip of the iceberg: Climate research at the Bristol Glaciology Centre
2. Monitoring greenhouse gas emissions: Now more important than ever?
3. Digital future of renewable energy
4. The new carbon economy – transforming waste into a resource
5. Systems thinking: 5 ways to be a more sustainable university
6. Local students + local communities = action on the local environment

Forest accounting rules put EU’s climate credibility at risk, say leading experts

**Article re-posted from EURACTIV**


Forest mitigation should be measured using a scientifically-objective approach, not allowing countries to hide the impacts of policies that increase net emissions, writes a group of environmental scientists led by Dr Joanna I House.

Dr Joanna I House is a reader in environmental science and policy at the Cabot Institute, University of Bristol, UK. She co-signed this op-ed with other environmental scientists listed at the bottom of the article.

From an atmospheric perspective, a reduction in the forest sink leads to more CO2 remaining in the atmosphere and is thus effectively equivalent to a net increase in emissions. [Yannik S/Flickr]

When President Trump withdrew from the Paris Agreement, the EU’s Climate Commissioner, Miguel Arias Cañete, spoke for all EU Member States when he said: “This has galvanised us rather than weakened us, and this vacuum will be filled by new broad committed leadership.” The French President, Emmanuel Macron, echoed him by tweeting, “Make our planet great again”.

But as the old saying goes, ‘If you talk the talk, you must walk the walk,’ and what better place to start than the very laws the EU is currently drafting to implement its 2030 climate target under the Paris Agreement. This includes a particularly contentious issue that EU environment leaders will discuss on 19 June, relating to the rules on accounting for the climate impact of forests.

Forests are crucial to limiting global warming to 2 degrees Celsius. Deforestation is responsible for almost one tenth of anthropogenic carbon dioxide (CO2) emissions, while forests remove almost a third of CO2 emissions from the atmosphere.

In the EU, forests currently grow more than they are harvested. As a result, they act as a net ‘sink’ of CO2, removing more than 400 Mt CO2 from the atmosphere annually, equivalent to 10% of total EU greenhouse gas (GHG) emissions.

New policies adopted or intended by Member States will likely drive them to harvest more trees (e.g. for the bioeconomy and bioenergy), reducing the sink. The controversy is, in simple terms, if forests are taking up less CO2 due to policies, should this be counted?

Based on lessons learnt from the Kyoto Protocol, the European Commission proposed that accounting for the impacts of forests on the atmosphere should be based on a scientifically robust baseline. This baseline (known as the ‘Forest Reference Level’) should take into account historical data on forest management activities and forest dynamics (age-related changes). If countries change forest management activities going forward, the atmospheric impact of these changes would be fully accounted based on the resulting changes in GHG emissions and sinks relative to the baseline. This approach is consistent with the GHG accounting of all other sectors.

Subsequently, some EU member states have proposed that any increase in harvesting, potentially up to the full forest growth increment, should not be penalised. This would be achieved by including this increase in harvesting, and the related change in the net carbon sink, in the baseline.
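To make the difference concrete, here is a minimal numerical sketch of the two approaches, using hypothetical figures chosen only to mirror the magnitudes quoted in this article (a historical net sink of around 400 MtCO2/yr and an assumed policy-driven fall to 100 MtCO2/yr). It illustrates the accounting logic, not the official methodology.

```python
# Illustrative sketch of forest accounting against a baseline (Forest Reference Level).
# All figures are hypothetical; sign convention: negative = net removal (sink).

HISTORICAL_NET_FLUX = -400   # MtCO2/yr: roughly today's EU forest sink
POLICY_NET_FLUX = -100       # MtCO2/yr: assumed sink after a policy-driven harvest increase

def accounted(actual_flux, baseline_flux):
    """Accounting debit/credit = actual net flux minus the baseline net flux."""
    return actual_flux - baseline_flux

# Commission proposal: the baseline reflects historical management,
# so the policy-driven loss of sink shows up as a debit.
print(accounted(POLICY_NET_FLUX, HISTORICAL_NET_FLUX))   # 300 MtCO2/yr counted as emissions

# Member-state proposal: the planned harvest increase is folded into the baseline,
# so the same loss of sink produces no debit at all.
print(accounted(POLICY_NET_FLUX, POLICY_NET_FLUX))       # 0 MtCO2/yr: the impact is hidden
```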

As land-sector experts involved in scientific and methodological reports (including for the Intergovernmental Panel on Climate Change, IPCC), in the implementation of GHG inventory reports, and in science advice to Governments, we have several scientific concerns with this approach.

From an atmospheric perspective, a reduction in the forest sink leads to more CO2 remaining in the atmosphere and is thus effectively equivalent to a net increase in emissions. This is true even if forests are managed “sustainably”, i.e. even if harvest does not exceed forest growth.

This is further complicated as the issues are cross-sectoral. Higher harvest rates may reduce the uptake of CO2 by forests, but use of the harvested wood may lead to emissions reductions in other sectors, e.g. through the substitution of wood for more emissions-intensive materials (such as cement) or for fossil energy. These emission reductions will be implicitly counted in the non-LULUCF sectors. Therefore, to avoid bias through incomplete accounting, the full impact of increased harvesting must also be accounted for.

Including policy-related harvest increases in the baseline could effectively hide up to 400 MtCO2/yr from EU forest biomass accounting compared to the “sink service” that EU forests provide today, or up to 300 MtCO2/yr relative to a baseline based on a scientific approach (up to two thirds of France’s annual emissions).
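As a rough check on those figures (taking France’s total annual GHG emissions to be roughly 450 MtCO2e, an assumed round number that is not given in the article):

```python
# Order-of-magnitude check on the quoted figures.
hidden_vs_scientific_baseline = 300   # MtCO2/yr, figure from the article
france_annual_ghg_assumed = 450       # MtCO2e/yr, assumed round figure for illustration
print(hidden_vs_scientific_baseline / france_annual_ghg_assumed)   # ~0.67, i.e. about two thirds
```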

If policy-related impacts on net land carbon sinks are ignored or discounted, this would:
  • Hamper the credibility of the EU’s bioenergy accounting: Current IPCC guidance on reporting emissions from bioenergy is not to assume that it is carbon neutral; rather, any carbon losses should be reported under the ‘Land Use, Land-Use Change and Forestry’ (LULUCF) sector rather than under the energy sector (to avoid double counting). EU legislation on bioenergy similarly relies on the assumption that carbon emissions are fully accounted under LULUCF (see the sketch after this list).
  • Compromise the consistency between the EU climate target and the IPCC trajectories. The EU objective of reducing GHG emissions by 40% by 2030 (and by 80–95% by 2050) compared with 1990 is based on the IPCC 2°C GHG trajectory for developed countries. This trajectory is based not just on emissions, but also on land sinks. Hiding a decrease in the land sink risks failure to reach temperature targets and would require further emission reductions in other sectors to remain consistent with IPCC trajectories.
  • Contradict the spirit of the Paris Agreement, i.e., that “Parties should take action to conserve and enhance sinks”, and that Parties should ensure transparency in accounting providing confidence that the nationally-determined contribution of each country (its chosen level of ambition in mitigation) is met without hiding impacts of national policies.
  • Set a dangerous precedent internationally, potentially leading other countries to do the same (e.g. in setting deforestation reference levels). This would compromise the credibility of the large expected forest contribution to the Paris Agreement.
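The double-counting convention behind the first bullet can also be sketched with hypothetical numbers; the 50 MtCO2/yr figure below is invented purely for illustration.

```python
# Where does the CO2 from burning harvested biomass get counted?
# By the convention described above, the energy sector reports biomass combustion as zero
# and the carbon loss is debited under LULUCF instead, so each tonne is counted exactly once.

extra_harvest_co2 = 50   # MtCO2/yr embodied in an extra harvest (hypothetical figure)

energy_sector = 0                                  # biomass combustion reported as zero by convention
lulucf_with_science_baseline = extra_harvest_co2   # debit appears against a historical baseline
lulucf_with_policy_baseline = 0                    # ...but vanishes if the harvest is built into the baseline

print(energy_sector + lulucf_with_science_baseline)   # 50 MtCO2/yr: accounted exactly once
print(energy_sector + lulucf_with_policy_baseline)    # 0 MtCO2/yr: the emission disappears from the books
```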

The Paris Agreement needs credible and transparent forest accounting, and EU leaders are about to make a decision that could set the standard. Including policy-driven increases in harvest in baselines means the atmospheric impacts of forest policies will be effectively hidden from the accounts (while generating GHG savings in other sectors). Basing forest accounting on a scientifically-objective approach would ensure the credibility of bioenergy accounting, consistency between EU targets and the IPCC 2°C trajectory, and compliance with the spirit of the Paris Agreement. The wrong decision would increase the risks of climate change and undermine our ability to “make the planet great again”.

Disclaimer: the authors express their view in their personal capacities, not representing their countries or any of the institutions they work for.

***

Signatories:

Joanna I House, Reader in Environmental Science and Policy, Co-Chair Global Environmental Change, Cabot Institute, University of Bristol, UK
Jaana K Bäck, Professor in Forest – atmosphere interactions, Chair of the EASAC Forest multifunctionality report, University of Helsinki, Finland
Valentin Bellassen, Researcher in Agricultural and Environmental Economics, INRA, France
Hannes Böttcher, Senior Researcher at Oeko-Institut.
Eric Chivian M.D., Founder and Former Director, Center for Health and the Global Environment Harvard Medical School
Pep Canadell, Executive Director of the Global Carbon Project
Philippe Ciais, scientist at Laboratoire des Sciences du Climat et de l’Environnement, Gif sur Yvette, France
Philip B. Duffy, President and Executive Director Woods Hole Research Center, USA
Sandro Federici, Consultant on MRV and accounting for mitigation in the Agriculture and land use sector
Pierre Friedlingstein, Chair, Mathematical Modelling of Climate Systems, University of Exeter, UK.
Scott Goetz, Professor, Northern Arizona University
Nancy Harris, Research Manager, Forests Program, World Resources Institute
Martin Herold, Professor for Geoinformation Science and Remote Sensing and co-chair of Global Observations of Forest Cover and Land Dynamics (GOFC-GOLD), Wageningen University and Research, The Netherlands
Mikael Hildén, Professor, Climate Change Programme and the Resource Efficient and Carbon Neutral Finland Programme, Finnish Environment Institute and the Strategic Research Council, Finland
Richard A. Houghton, Woods Hole Research Center, USA
Tuomo Kalliokoski University of Helsinki, Finland
Janne S. Kotiaho, Professor of Ecology, University of Jyväskylä, Finland
Donna Lee, Climate and Land Use Alliance
Anders Lindroth, Lund University, Sweden
Jari Liski, Research Professor, Finnish Meteorological Institute, Finland
Brendan Mackey, Director, Griffith Climate Change Response Program, Griffith University, Australia
James J. McCarthy, Harvard University, USA
William R. Moomaw, Co-director Global Development and Environment Institute, Tufts University, USA
Teemu Tahvanainen, University of Eastern Finland
Olli Tahvonen, Professor of Forest Economics and Policy, University of Helsinki, Finland
Keith Paustian, University Distinguished Professor, Colorado State University, USA
Colin Prentice, AXA Chair in Biosphere and Climate Impacts, Imperial College London, UK
N H Ravindranath, Centre for Sustainable Technologies (CST), Indian Institute of Science, India
Laura Saikku, Senior Scientist, Finnish Environment Institute
Maria J Sanchez, Scientific Director of BC3 (Basque Center for Climate Change), Spain
Sampo Soimakallio, Senior Scientist, Finnish Environment Institute
Zoltan Somogyi, Hungarian Forest Research Institute, Budapest, Hungary
Benjamin Smith, Professor of Ecosystem Science, Lund University, Sweden
Pete Smith, Professor of Soils & Global Change, University of Aberdeen, UK
Francesco N. Tubiello, Team Leader, Agri-Environmental Statistics, FAO
Timo Vesala, Professor of Meteorology, University of Helsinki, Finland
Robert Waterworth
Jeremy Woods, Imperial College London, UK
Dan Zarin, Climate and Land Use Alliance