Plan B: Saving Political Face Beyond 2 Degrees

So far the ‘targets and timetables’ approach to keeping climate change below 2°C has done very little to reduce emissions. What happens when we start thinking about giving up the 2°C target?

WHO:  Oliver Geden, German Institute for International and Security Affairs (Stiftung Wissenschaft und Politik)

WHAT: Looking at the ‘politically possible’ in light of our failures to get anywhere near the emissions reductions needed to keep global warming below 2°C.

WHEN: June 2013

WHERE: Online at the German Institute for International and Security Affairs

TITLE:  Modifying the 2°C Target: Climate Policy Objectives in the Contested Terrain of Scientific Policy Advice, Political Preferences, and Rising Emissions (open access)

This paper is all about the realpolitik. At the outset, it points out that in the 20 years since the UN Framework Convention on Climate Change (UNFCCC) was adopted, progress has been ‘modest at best’. Also, in order to keep global emissions from soaring quickly beyond the 2°C limit, significant reductions will be needed in the decade from 2010 to 2020, which is ‘patently unrealistic’.

Ok, so we’ve procrastinated away the most important decades that we had to do something about climate change with minimal impacts on both the economy and the wider environment. What now?

This paper suggests that the best bet might be changing or ‘modifying’ the internationally agreed-upon 2°C target. The author points out (quite rightly) that unrealistic targets signal that you can disregard them with few consequences. For instance, I’m not about to say that I’m going to compete in the next Olympic marathon, because having never run a full marathon before, the second I missed a single training session it would obviously be time to give up.

So if the world is going to fail on our 2°C training schedule, what will we aim for instead? Should we just aim for redefining ‘safe’ and ‘catastrophic’ climate change? Should we aim for 2.5°C? Should we aim for short-term overshoot in the hope that future humans will pick up the slack when we’ve kicked the can down the road for them?

The author points out what many people don’t like to notice when their countries are failing on their carbon reduction diets – not only have we already warmed by 0.8°C, but we’ve already baked in another 0.5°C from current emissions, so we’re already really close to 2°C without even starting to give up our fossil fuel habits. Also, those reductions we’ve all been promising to make and failing to make (or withdrawing from completely in Canada’s case)? Yeah, if we met all those targets, we’d still blow ‘significantly’ past 2°C. Ouch.

The emissions gap (from paper)

Another issue – the current top-down UNFCCC approach assumes that once we reach an agreement, effective governance structures can be set up and operating within a matter of years, which is highly unlikely given we can’t even reach an agreement yet.

So what does a ‘more pragmatic stance’ for the EU on climate policy look like if we’re going to collectively blow past 2°C? Will climate policy have any legitimacy?

The author argues that the coming palpable impacts of climate change will soon remove the political possibility of ignoring climate change as an issue while in office (which I for one am looking forward to). He also doesn’t place much faith in the UN process finding a global solution with enough time – if an agreement is reached in 2015, it’s unlikely to be ratified by 2020, at which point the targets from 2015 are obsolete.

One suggestion for the EU is reviewing the numbers for the likelihood of passing 2°C. Currently, humanity is vaguely aiming to have a 50/50 chance of staying below 2°C. If we could roll the dice with slightly higher odds of blowing past 2°C, maybe we could buy some time to get our political butts in gear?

That idea puts all the hard work of mitigation on everyone post-2050, at which point we’ll all be dealing with the climate impacts as well as trying to find the time for mitigation.

The other option is to say that 2°C is a ‘benchmark’ (only slightly better than an ‘aspirational target’?) and put our faith in climate inertia allowing humanity to overshoot on emissions and then increase the amount of sequestration (negative emissions) to pull back from the brink of the next mass extinction.

The paper does acknowledge that this will implicitly approve a temperature overshoot as well as an emissions overshoot, which could possibly kick the can down the road to 2300 before global temperatures are below 2°C above what we used to call ‘normal’. Apologies to everyone’s great-great-great-great-grandchildren for making you responsible for all of that.

Kicking the can down the road to 2300 (from paper)

The author also acknowledges that overshoot policies will only be accepted by the wider public if they’re convinced that this time governments will actually respect them as limits not to be passed. Previous experience with UNFCCC processes shows that any extra time that can be wrangled through carbon accounting is likely to be procrastinated away as well.

The other option could be a target of 2.5°C or 550 ppm of CO2 in the atmosphere, but as the paper points out, the ‘targets and timetables’ policies haven’t worked yet, and it might be time to look more towards feasible ‘policies and measures’.

The problem for me with this paper is that while it’s practical to look at aiming for what humanity can politically achieve in terms of climate policies, redefining what ‘dangerous climate change’ is to fit with realpolitik rather than physics won’t work. Physics doesn’t negotiate – the first law of thermodynamics doesn’t care that there was an economic downturn in 2008 that has made it harder to pass climate legislation.

So yes, we need to think about what is politically possible in the current ‘we can still procrastinate on this’ climate. But we also need to be planning for the tipping point once all the extreme weather adds up to business as usual no longer being feasible. We may be able to ‘postpone the impending failure of the 2°C target’, but we won’t be able to ignore the impacts of climate change.

CO2 garden steroids

Is additional atmospheric CO2 fertilizing plants? Is this a good thing?

WHO: Randall J. Donohue, Tim R. McVicar, CSIRO Land and Water, Canberra, ACT, Australia
Michael L. Roderick, Research School of Biology, Australian National University, Canberra,
Research School of Earth Sciences, Australian National University, Canberra,  Australian Research Council Centre of Excellence for Climate System Science, Sydney, New South Wales, Australia.
Graham D. Farquhar, Research School of Biology, Australian National University, Canberra, ACT, Australia.

WHAT: Trying to measure the effect that CO2 fertilization from increased atmospheric CO2 has on plant growth

WHEN: 19 June 2013

WHERE: Geophysical Research Letters, Vol. 40, Issue 12, June 2013

TITLE: Impact of CO2 fertilization on maximum foliage cover across the globe’s warm, arid environments (subs req.)

Climate deniers and people who don’t want to take action on climate change often say that increased levels of CO2 in our atmosphere will be a good thing, because plants need CO2 to grow, so why not let the plants have all the CO2 they want and watch them grow like gangbusters!?

More CO2= good? (Chris Gin, flickr)

This is the same as suggesting that because humans need water, I should drink water all the time without pause and that would be awesome (it wouldn’t; I would die from water intoxication). Flaws in denier logic aside, these researchers from the CSIRO tried to find out whether there was an increase in plant growth from increased atmospheric CO2, and whether that increase could be measured.

The researchers looked at warm, arid areas of the world where rain is the limiting factor for plant growth – places like south-western Australia, southern California and south-eastern Spain. They then looked at the plant growth data from 1982-2010 and broke it into three-year segments that were averaged, to account for the lag time between changes in rain patterns and plant growth.
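That block-averaging step is simple enough to sketch. A minimal illustration with made-up numbers (not the authors’ code or data), assuming consecutive non-overlapping three-year blocks:

```python
def three_year_means(yearly):
    """Average a yearly series into consecutive non-overlapping 3-year block means."""
    return [sum(yearly[i:i + 3]) / 3 for i in range(0, len(yearly) - 2, 3)]

cover = [10, 11, 12, 13, 14, 15]  # made-up yearly foliage-cover values
print(three_year_means(cover))  # [11.0, 14.0]
```

Averaging over three years smooths out the year-to-year noise, so a change in rainfall one year and the vegetation response the next end up in the same data point.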

Warm arid areas included in the study (yearly data in grey, 3yr averages in red, from paper)

Then they ran the numbers for what plant growth would have been with constant amounts of rain so they could separate out the effect of CO2 alone.

What they found was that transpiration and photosynthesis are directly coupled to atmospheric CO2, which plays a role in setting the Fx edge (upper limit of plant growth). Plant growth increased ~11% between 1982 and 2010 from increased levels of atmospheric CO2 fertilization.

Then, just to make sure they were correct, they looked at the different things that could have influenced that increase. Increased temperatures lowered plant growth (too hot to grow). Plant productivity increased to a certain point under drought conditions (as plants became more water efficient and drought tolerant), but that couldn’t account for the 11% increase. There was an observed 14% increase in plant growth from a 10% increase in precipitation, but that couldn’t account for their numbers either because they ran them for a constant level of rain.

So, as the researchers stated in their paper, this ‘provides a means of directly observing the CO2 fertilization effect as it has historically occurred across the globe’s warm, arid landscapes’.

But does this mean all plants will grow more and we don’t have to worry about climate change anymore? Unfortunately, no.

This only applies to warm arid areas where water is the dominant limit to growth. Also, the other effects of climate change – longer droughts, more extreme storms, longer heatwaves, more extreme bushfires – are likely going to outweigh the positive effect of the increase in growth from CO2 fertilization.

The researchers point out in their conclusion that this research doesn’t simply translate to other environments with different limitations. In a Q&A with the CSIRO when asked whether this research means that climate change is good, the lead author Dr. Donohue stated: ‘this does not mean that climate change is good for the planet.’

So yes, there is a fertilization effect from increased CO2, but no, it doesn’t mean we get to keep burning carbon.

Extreme Overachiever Decade 2001-2010

The World Meteorological Organisation global climate report for last decade shows it winning all the most extreme awards.

WHO: The World Meteorological Organisation (WMO) in conjunction with international experts and meteorological and climate organisations.

WHAT: The Global Climate Report for the decade 2001-2010

WHEN: July 2013

WHERE: Online at the WMO website

TITLE: The Global Climate 2001-2010 A Decade of Climate Extremes (open access)

The World Meteorological Organisation (WMO) recently released their wrap-up of the last decade’s worth of global weather-related data. Now, as you will all remember, climate is the long-term trend of the weather (generally over a 30-year period), but it’s also important to keep track of the decade-to-decade trends, if only because 10 is a nice round number to deal with.

So what did the last decade’s report card look like? Turns out 2001-2010 was an overachiever when it came to records and extremes, which is bad news for all of us, really. The last decade was the warmest on record for overall temperatures and the speed of warming has amped up as well. The long-term warming trend is 0.062°C/decade, but since 1970 it’s sped up to 0.17°C/decade.
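A per-decade trend like those figures is just the slope of a line fitted through temperature anomalies. A toy sketch of an ordinary least-squares slope, using fabricated, perfectly linear numbers (real anomaly series are far noisier, and this is not the WMO’s actual method):

```python
def ols_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

decades = [0, 1, 2, 3]                 # decade index
anomalies = [0.0, 0.17, 0.34, 0.51]    # made-up anomalies warming 0.17 °C per decade
print(round(ols_slope(decades, anomalies), 2))  # 0.17
```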

Decade by decade temperature record (from report)

If you only look at land surface temperatures, 2007 held the record for hottest at 0.95°C above average, with 2004 the ‘least warm’ (there are no long-term cold records happening here) at 0.68°C above average.

If you look at sea surface temperatures, 2003 wins with 0.40°C above average and 2008 was the least warm at 0.26°C above average. The warmest year in the Northern Hemisphere was 2007 at 1.13°C above average, and in the Southern Hemisphere 2005 wins with 0.67°C.

When it comes to rain, it’s a mixed bag. There were places that got more rain, there were places with drought, there were more extremes. South America was wetter than normal, Africa was drier than normal. Overall, 2002 was the driest year of the decade and 2010 was the wettest. The problem with rain patterns shifting under climate change is that the location and the time frame change. Instead of slow soaking rains, it’s extremes of dry spells followed by flash flooding.

The patterns of El Niño and La Niña switched back and forth quite a lot during the decade, with El Niño generally creating warmer trends and La Niña creating cooler trends.

El Niño and La Niña trends for sea surface temperatures (from report)

To qualify as extreme, an event needs to result in 10 or more people dying, 100 or more people being affected, a declaration of a state of emergency and the need for international assistance, which I think is a pretty high bar. But of course, since the last decade was overachieving, there were 400 disasters of this scale that killed more than 1 million people.

Weather disasters represented 88% of these extreme events, and the damage from these events has increased significantly, along with a 20% increase in casualties from the previous decade. The extra casualties came from some extreme increases in certain categories like heatwaves: in 1991-2000, 6,000 people died from heatwaves; in 2001-2010, that jumped to 136,000 people.

The price of extreme weather events has also gone up, with 7,100 hydrometeorological events carrying a price tag of US$1 trillion and resulting in 440,000 deaths over the decade. It’s also estimated that 20 million people worldwide were displaced, so this was probably our first decade of a sizable number of climate refugees. Internal displacement will be one of the biggest factors as people move away from the more extreme parts of their country to the places where it still rains (e.g. from Arizona to Oregon).

Tropical storms were a big issue, with the report noting ‘a steady increase in the exposure of OECD countries [to tropical cyclones] is also clear’. It’s nice to see them point out that extreme weather is not just a developing-world problem of inadequate infrastructure, as shown by the flooding in Germany and Australia last decade.

There was also a special shout-out to my homeland of Australia, for the epic heatwave of 2009, during which I experienced 46°C in Melbourne and can attest to it not being fun. Of course, that epic heatwave was beaten by 2013’s new extreme map colour. I also noted Australia was getting pretty much all of the extreme weather effects over the decade. Ouch.

Australia – can’t catch a break (compiled from report)

Even for the category of ‘coldwaves’, where the Northern Hemisphere saw an increase in freak snowstorms, the average winter temperature was still 0.52°C warmer than average.

Last decade also set lots of records in the cryosphere (the frozen part of the planet). 2005-2010 included the five lowest Arctic sea ice extents on record, with both extent and volume declining at a disturbingly rapid rate in what is commonly known as the Arctic Death Spiral. There’s been an acceleration of mass loss from the Greenland and Antarctic ice sheets and a decrease in glaciers worldwide.

Arctic Death Spiral (from report)

The World Glacier Monitoring Service describes the glacier melt rate and cumulative loss as ‘extraordinary’ and noted that glaciers are currently so far away from their equilibrium state that even without further warming they’re still going to keep melting. Oh, and last decade won the record for loss of snow cover too.

Declining snow cover (from report)

Sea level rose faster last decade, at a rate of 3 mm/yr, double the historical average of 1.6 mm/yr. Interestingly, sea level rise is not even across the globe due to ocean circulation and mass. In a La Niña year, the Western Pacific Ocean can be 10-20 cm higher than the average for the decade, but there’s only one way sea levels are going as the water warms and expands and the ice sheets melt – up.

Sea level rise (from report)

Finally, if all of that wasn’t enough bad news for you – the report looked at the gas concentrations in the atmosphere and found (surprise, surprise) that CO2 is up, accounting for 64% of the increase in radiative forcing (making our planet warmer) over the past decade and 81% of the increase in the last five years. Methane is responsible for 18% of the increase and nitrous oxide chips in 6%.

Does it make anyone feel better if I tell you the hole in the ozone layer isn’t getting bigger anymore?

Basically, the world we live in is getting more extreme as it heats up at an ever-increasing rate. Given that these are the changes we’re seeing with a 0.8°C increase in global average temperatures, and that’s from carbon we burnt over a decade ago, how about we stop burning carbon with a little more urgency now?

Your odds just shortened – Aussie Heatwaves

Climate change has increased the chance of extreme heatwaves in Australia by more than five times.

WHO: Sophie C. Lewis and David J. Karoly, School of Earth Sciences and ARC Centre of Excellence for Climate System Science, University of Melbourne, Victoria, Australia

WHAT: Looking at how much human-caused climate change is pushing Australian summers into more extreme heat.

WHEN: July 2013

WHERE: Geophysical Research Letters (DOI: 10.1002/grl.50673), pre-publication release

TITLE: Anthropogenic contributions to Australia’s record summer temperatures of 2013 (subs. req)

There’s some interesting research happening at my alma mater Melbourne University these days (go Melbourne!). Even if you weren’t there to experience the extreme summer of 2012-13 in Australia, I’m sure you all remember the new colour that had to be created by the Australian Bureau of Meteorology for the weather maps when they maxed out above 50°C, or maybe the new rating for bushfires of ‘catastrophic’ for the climate-fuelled fires that are beyond just an extreme risk?

Extreme Heat in January 2013 (Bureau of Meteorology)

So, to answer that age-old question ‘exactly how much have we messed this up?’ these researchers looked at the historical monthly weather patterns, weather patterns with natural forcing only and patterns with natural forcing and human forcing and matched those up with what actually happened.

They looked at the average monthly mean temperatures, maximum temperatures and minimum temperatures, and found that the monthly extremes are increasing faster than the daily extremes – that is, extreme months are becoming more common faster than extreme days are.

The historical data they used for the experiment was from 1850 to 2005, with the baseline climate data (what they used as a reference for ‘normal’) being 1911-1940 because 30 years of weather data makes a climate!

They then created experiments for the data with natural forcing only, with natural and human forcing and ran exciting statistical functions like a probability density function with a kernel smoothing function that almost sounds like a super-cool popcorn maker.
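A probability density function with kernel smoothing is less exotic than it sounds: each data point contributes a little Gaussian ‘bump’, and the bumps are averaged into a smooth density curve. A minimal illustrative sketch (made-up temperatures and a hand-picked bandwidth, not the paper’s actual code or data):

```python
import math

def kde(samples, bandwidth):
    """Kernel-smoothed density estimate: the average of one Gaussian bump per sample."""
    n = len(samples)
    norm = n * bandwidth * math.sqrt(2 * math.pi)
    def density(x):
        return sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples) / norm
    return density

temps = [24.1, 25.3, 23.8, 26.0, 25.5, 24.7]  # made-up summer mean temperatures (°C)
pdf = kde(temps, bandwidth=0.8)

# Sanity check: the density should integrate to ~1 (trapezoid rule over 18..32)
xs = [18 + 0.1 * i for i in range(141)]
area = sum(0.1 * (pdf(a) + pdf(b)) / 2 for a, b in zip(xs, xs[1:]))
print(round(area, 2))  # ~1.0
```

With densities like this for the natural-only and natural-plus-human runs, you can read off how much probability each scenario puts beyond any extreme-heat threshold.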

To double-check for error, they used the second-hottest summer temperatures to make sure they could pick out the human influences from the noise in the data, thereby deliberately making their findings conservative.

Once they’d run their fun popcorn-sounding numbers, they calculated the FAR – Fraction of Attributable Risk, which is exactly what it sounds like – the fraction of the risk of something happening that can be attributed to a cause.

So if our ‘bad guy’ is human-induced climate change, how much can we blame it for the Australian Angry Summer of 2012-13? Well, a fair bit.

When they compared the numbers, they had 90% confidence that there was a 2.5 times increase in extreme heat from human influences. When they compared 1976-2005 data and extended the model out to 2020, the likelihood increased again, to 5 times.

Extreme heat is ‘substantially more likely’ because of humans burning fossil fuels, which are pretty bold words from research scientists – when there’s a greater than 90% chance of something they say ‘very likely’ where most people would say ‘very certain’. In their research, events that should have been occurring 1-in-16 years naturally were happening 1-in-6 years with the historical numbers and 1-in-2 years with the model out to 2020. Ouch – summer just got more uncomfortable more often.
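Those return periods plug straight into the FAR formula, FAR = 1 - P(natural)/P(forced). A quick sketch using the 1-in-16, 1-in-6 and 1-in-2 year figures quoted above (treating probability as simply 1/return-period is my simplification for illustration, not the paper’s full method):

```python
def far(p_natural, p_forced):
    """Fraction of Attributable Risk: FAR = 1 - P(natural) / P(forced)."""
    return 1 - p_natural / p_forced

p_natural = 1 / 16  # extreme summer expected 1-in-16 years without human forcing
p_hist = 1 / 6      # roughly 1-in-6 years in the historical runs
p_2020 = 1 / 2      # roughly 1-in-2 years in the model extended to 2020

print(round(far(p_natural, p_hist), 3))  # 0.625
print(round(far(p_natural, p_2020), 3))  # 0.875
```

A FAR of 0.875 reads as: under those assumed probabilities, seven out of every eight such extreme summers would be attributable to the forced (human) component.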

MTSOfan, flickr

For me, the kicker came when the paper pointed out that the 2012-13 summer in Australia came in a La Niña year. Extreme heat events normally come with an El Niño year – La Niña years are cooler with more rain. So the fact that the Angry Summer occurred in a La Niña year is scary – sea surface temperatures were average or cooler in some places while at the same time the Bureau of Meteorology was scrambling for new map colours.

The paper concludes that their research supports ‘a clear conclusion that anthropogenic climate change had a substantial influence on the extreme summer heat over Australia’ and that these kinds of events are now five times as likely to occur. Welcome to summers on climate change?

Drought – worse than we thought

Inconsistencies with drought models that don’t account for sea surface temperature changes mean that drought in a climate changed world could be worse than predicted.

WHO: Aiguo Dai, National Center for Atmospheric Research, Boulder, Colorado, USA

WHAT: Looking at the impact of sea surface temperature variability on drought

WHEN: January 2013

WHERE: Nature Climate Change, Vol. 3, No. 1, January 2013

TITLE: Increasing drought under global warming in observations and models (open access)

Climate deniers love it when the models are slightly wrong for predicting future climate changes, and believe me, I’d love it if climate change weren’t so verifiably real and we could all retire and live la dolce vita.

However, that’s not reality, and in the case of this paper, where the model doesn’t quite line up with the observed changes, it’s because things are worse than we previously thought. Oh dear.

Global warming since the 1980s has contributed to an 8% increase in drought-ridden areas in the 2000s. It’s led to things like diminished corn crops and the steady draining of underground water aquifers in the USA, much of which is currently experiencing persistent drought. The letter L on the map below stands for long term drought.

Long term drought in the Southwest of the USA (from US Drought Monitor)

What’s that got to do with climate models? Well, while the droughts in Southern Europe or my homeland of Australia are due to lack of rain drying things out, drought can also be from increased evaporation from warmer air temperatures, which the models don’t fully take into account.

These droughts are harder to measure because they’re related to sea surface temperature changes that take decades and can be hard to identify as a human forced signal rather than just natural variations. So this researcher compared sea surface temperatures with drought predictions and observed warming to try and work out what is going on.

Predicted changes in soil moisture globally for 1980–2080 (black dots are where 9 out of 11 models agree on data) (from paper)

There were two areas where the models differed from the observed changes – the Sahel area in Africa and the USA.

In the Sahel, the models predicted there would be warming in the North Atlantic Ocean, which would lead to increased rain. What actually happened was large warming in the South Atlantic compared to the North Atlantic, and steady warming over the Indian Ocean, which meant less rain, not more. Similarly, for the predicted patterns in the USA, the Pacific multidecadal oscillation was not known to be influenced by human climate forcing; however, it switched to a warm phase driven by above-normal sea surface temperatures.

Top: Observed sea surface temperatures. Bottom: predicted sea surface temperatures (from paper)

These sea surface variations that were missed in some of the previous models have some obvious consequences for planning for the slow pressure cooker of stress that drought is on anyone living through it, let alone trying to make a living from agriculture.

The researcher noted that there were also some differences from the models when looking at sulphate aerosols; however, for the 21st century the signal from greenhouse gases will be much stronger than that from aerosols, so they shouldn’t mess with the data too much.

So what does this all mean? Well, it means that there are both regional and broader trends for drought in a changed climate. The broad patterns are fairly stable ‘because of the large forced trend compared with natural variations’, which is scientist for humans are making a large enough mess out of this to see the evidence clearly.

The paper ends quite bluntly, stating that having re-worked the simulations to take into account the new data for sea surface temperature and other variables, it’s only more bad news.

It’s likely to be ‘severe drought conditions by the late half of this century over many densely populated areas such as Europe, the eastern USA, southeast Asia and Brazil. This dire prediction could have devastating impacts on a large number of the population if the model’s regional predictions turn out to be true.’

Yes, a researcher actually used the word ‘dire’ in a scientific paper. Oh, and this was with an intermediate emissions scenario, not the business as usual path we’re currently all on. How about we all agree to stop burning carbon now?

Greenland Whodunit

“The next 5–10 years will reveal whether or not [the Greenland Ice Sheet melting of] 2012 was a one off/rare event resulting from the natural variability of the North Atlantic Oscillation or part of an emerging pattern of new extreme high melt years.”

WHO: Edward Hanna, Department of Geography, University of Sheffield, Sheffield, UK
 Xavier Fettweis, Laboratory of Climatology, Department of Geography, University of Liège, Belgium
Sebastian H. Mernild, Climate, Ocean and Sea Ice Modelling Group, Computational Physics and Methods, Los Alamos National Laboratory, USA, Glaciology and Climate Change Laboratory, Center for Scientific Studies/Centre de Estudios Cientificos (CECs), Valdivia, Chile
John Cappelen, Danish Meteorological Institute, Data and Climate, Copenhagen, Denmark
Mads H. Ribergaard, Centre for Ocean and Ice, Danish Meteorological Institute, Copenhagen, Denmark
Christopher A. Shuman, Joint Center for Earth Systems Technology, University of Maryland, Baltimore, USA,  Cryospheric Sciences Laboratory, NASA Goddard Space Flight Center, Greenbelt, USA
Konrad Steffen, Swiss Federal Research Institute WSL, Birmensdorf, Switzerland, Institute for Atmosphere and Climate, Swiss Federal Institute of Technology, Zürich, Switzerland, Architecture, Civil and Environmental Engineering, École Polytechnique Fédéral de Lausanne, Switzerland
 Len Wood, School of Marine Science and Engineering, University of Plymouth, Plymouth, UK
Thomas L. Mote, Department of Geography, University of Georgia, Athens, USA

WHAT: Trying to work out the cause of the unprecedented melting of the Greenland Ice Sheet in July 2012

WHEN: 14 June 2013

WHERE: International Journal of Climatology (Int. J. Climatol.) Vol. 33 Iss. 8, June 2013

TITLE: Atmospheric and oceanic climate forcing of the exceptional Greenland ice sheet surface melt in summer 2012 (subs req.)

Science can sometimes be like being a detective (although I would argue it’s cooler) – you’ve got to look at a problem and try and work out how it happened. These researchers set out to do exactly that to try and work out how the hell it was that 98.6% of the ice sheet on Greenland started melting last summer.

Greenland – July 8th on the left only half melting. July 12th on the right, almost all melting (Image: Nicolo E. DiGirolamo, SSAI/NASA GSFC, and Jesse Allen, NASA Earth Observatory)

For a bit of context, Greenland is the kind of place where the average July temperature in the middle of summer is 2°C and the average summer temperature at the summit of the ice sheet is -13.5°C. Brrr – practically beach weather! So there’s got to be something weird going on for the ice sheet to start melting like that. Who are the suspects?

Atmospheric air conditions
Suspect number one is the atmospheric air conditions. The summer of 2012 was influenced strongly by ‘dominant anti-cyclonic conditions’ which is where warm southerly air moves north and results in warmer and drier conditions. There was also a highly negative North Atlantic Oscillation (NAO) which created high temperatures at high altitudes around 4km above sea level, which could explain the melting on the summit. The researchers also pointed out that the drier conditions meant less cloud cover and more sunny days, contributing to speedier melting.

There were issues with the polar jet stream that summer, where it got ‘blocked’ and stuck over Greenland for a while. The researchers used the Greenland Blocking Index (GBI), which, while not trading on the NYSE, does tell you about geopotential height anomalies in the atmosphere over Greenland (yeah, lots of meteorological words in this paper!). All of this makes the atmosphere look pretty guilty.

Jet stream getting funky – temperature anomaly patterns at 600 hectopascals pressure, aka 4000m above sea level with a big red blob over Greenland (from paper)

Sea surface temperatures
Suspect number two is sea surface temperatures. If it was warmer in the ocean, that could have created conditions where the ice sheet melted faster, right? The researchers ran a simulation of the conditions around Greenland for the summer of 2012 and then played around with different levels of sea surface temperature and salinity. It didn’t make more than 1% difference, so they don’t think it was the sea surface. Also, ocean temperatures change more slowly than air temperatures (that’s why the ocean is still so cold even in the middle of summer!), and when they looked at the data for sea surface temperature, it was actually a bit cooler in 2012 than it was in 2011. Not guilty, sea surface temperatures.

Sea surface temperatures (top) and salinity (bottom) both decreasing (from paper)

Cloud patterns
Suspect number three is cloud cover patterns, which the researchers said could be a contributor to the ice sheet melting. However, they don’t have a detailed enough data set to work out how much clouds could have contributed to the melt. Not guilty for now clouds – due to lack of evidence.

Which leaves suspect number one – atmospheric conditions. Guilty! Or, as the paper says, ‘our present results strongly suggest that the main forcing of the extreme Greenland Ice Sheet surface melt in July 2012 was atmospheric, linked with changes in the summer NAO, GBI and polar jet stream’.

Now comes the scary part – it’s the atmosphere that we’ve been conducting an accidental experiment on over the last 200 years by burning fossil fuels. As the researchers point out, the North Atlantic Oscillation has natural variability and patterns, so we could all cross our fingers and hope that the Greenland melting was a one-off anomaly. Given the work that Dr Jennifer Francis has been doing at Rutgers into polar ice melt and how it slows the jet stream and causes it to meander, this may not be a good bet. Combine this with the fact that this level of melting is well beyond ‘the most pessimistic future projections’ and it gets scarier. This kind of melting was not supposed to occur until 2100, or 2050 in the worst case scenarios.

Interestingly, this could also link through to some of the work Jason Box is doing with his DarkSnow project in Greenland, looking at how soot from fires and industry is affecting the melting of Greenland. The paper concludes that the next 5-10 years will show us whether it was a one-off or the beginning of a new normal. Unless we stop burning carbon, it will only be the start of a terrifying new normal.

Playing the Emissions on Shuffle

What do the emission reductions of industrialised nations look like when you count the imports manufactured overseas?

WHO: Glen P. Peters, Center for International Climate and Environmental Research, Oslo, Norway
Jan C. Minx, Department for Sustainable Engineering, and Department for the Economics of Climate Change, Technical University Berlin, Germany
Christopher L. Weberd, Science and Technology Policy Institute, Washington, Civil and Environmental Engineering, Carnegie Mellon University, Pittsburgh, USA
Ottmar Edenhofer, Department for the Economics of Climate Change, Technical University Berlin, Potsdam Institute for Climate Impact Research, Potsdam, Germany

WHAT: Measuring the transfer of CO2 emissions through international trade

WHEN: 24 May 2011

WHERE: Proceedings of the National Academy of Sciences (PNAS) vol. 108 no. 21, May 2011

TITLE: Growth in emission transfers via international trade from 1990 to 2008 (open access)

These researchers have found a problem with the way we count carbon emissions. Countries tend to count only the emissions from industries that emit within their own territorial borders, which means that emissions in the developing world have kept going up, while emissions in the developed world (or first world) have either flattened or dropped, depending on how much your government likes to admit the reality of climate change.

However, most of the emissions from the developing world are to produce goods for places like North America and Europe. So these researchers wanted to work out exactly how much international trade contributed towards global emissions increasing by 39% from 1990 to 2008. Was the increase in emissions due to development in countries like China, or was it a case of wealthy countries just shuffling their manufacturing emissions to another country and continuing to increase consumption rates?

As you might guess (spoiler alert) it’s the latter. Turns out all we’ve been doing is moving most of our industrial manufacturing emissions to developing countries and importing the products back, allowing everyone to say ‘yay, we reduced emissions!’ while the actual amount of carbon being burned continues to increase.

Growth in emissions transferred around the globe – dumping our responsibility on other countries (from paper)

But don’t take my word for it – what does the paper say?

The researchers broke the global economy down into 113 regions covering 95 individual countries and 57 economic sectors. They then looked at all the national and international data they could get on supply chain emissions for producing goods from 1990 to 2008, with extra-detailed analysis for the years 1997, 2001 and 2004. They called it a Time Series with Trade, and it was based on GDP, bilateral trade and emissions statistics (all of which you can generally find at your national statistics office online). The only thing they left out of their analysis was emissions from land use change, because there wasn’t enough data to thoroughly analyse it.
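The accounting idea at the heart of this is simple: a country’s consumption-based emissions are its territorial emissions, minus the emissions embodied in its exports, plus the emissions embodied in its imports. A minimal sketch with made-up numbers (illustrative only, not the paper’s figures):

```python
# Toy consumption-based carbon accounting (all numbers hypothetical).
# territorial: emissions physically released inside a country's borders (Gt CO2)
# embodied exports/imports: emissions embedded in traded goods (Gt CO2)

def consumption_emissions(territorial, embodied_exports, embodied_imports):
    """Reallocate emissions from where goods are made to where they're consumed."""
    return territorial - embodied_exports + embodied_imports

# Hypothetical two-country world: a big manufacturer and a big importer
producer = {"territorial": 7.0, "exports": 1.6, "imports": 0.2}
consumer = {"territorial": 5.5, "exports": 0.2, "imports": 1.6}

producer_cba = consumption_emissions(producer["territorial"], producer["exports"], producer["imports"])
consumer_cba = consumption_emissions(consumer["territorial"], consumer["exports"], consumer["imports"])

# The global total is unchanged; only the allocation between countries moves
total_territorial = producer["territorial"] + consumer["territorial"]
assert abs((producer_cba + consumer_cba) - total_territorial) < 1e-9
```

The point of the re-allocation is exactly what the paper measures: under territorial accounting the producer looks dirty and the consumer looks clean, but under consumption accounting the consumer inherits the emissions for the goods it imports.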

They found that global CO2 emissions from exported goods rose from 4.3 Gigatonnes (Gt) in 1990 to 7.8 Gt of CO2 in 2008, with a big increase in the decade up to 2008. Exports have increased their share of global emissions from 20% to 26% and grew on average by 4.3% per year, which was faster than the global population grew (1.4%), faster than total global CO2 emissions grew (2%) and faster than global GDP grew (3.6%).

The only thing that export emissions didn’t grow faster than was the dollar value of all that trade, which increased by 12% each year. So not only are all those new iPhones costing you a lot of money (and making Apple super wealthy), they’re also burning a lot of carbon.

But the thing the paper points out is that international trade has simply shifted the location of the emissions rather than reducing them – shuffling them around the planet to avoid counting them. The researchers estimate that the transfer of emissions from wealthy countries to developing countries has grown from 0.4 Gt of CO2 in 1990 to 1.6 Gt in 2008 – roughly 8% per year.
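A quick compound-growth check on that quadrupling of emission transfers (assuming smooth annual growth over the 18-year span):

```python
# Annualised growth rate implied by 0.4 Gt CO2 (1990) -> 1.6 Gt CO2 (2008)
start, end, years = 0.4, 1.6, 2008 - 1990

# Compound annual growth rate: (end/start)^(1/years) - 1
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # 8.0% per year
```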

This is an issue, because it means that all of the countries that signed on to Kyoto to reduce their carbon emissions – commitments adding up to around 0.7 Gt of CO2 per year – have simply shifted those emissions through trade to make them someone else’s problem, while continuing to consume stuff at an ever-increasing rate.

More and more stuff (epSos, flickr)

The researchers point out that while China is currently the world’s largest carbon emitter, with the USA at number two, if you counted consumption emissions (meaning you made the USA count the emissions for all the stuff it uses that’s made in China), they’d swap places and the USA would be the world’s largest emitter.

This makes sense if you think it through – have a look around your house at everything that’s made in China. All of that carbon that China is burning, which is destroying their air quality and polluting their cities and people; all of that is to make stuff for you to consume.

If you count the consumption emissions, the emissions reduction of 3% from the developed world becomes an emissions growth of 11%. Oops. Also, the researchers point out that emissions reductions in wealthy countries are often exceeded by the growth of trade emissions.

Emission reductions vs emissions transferred to developing world. Annex B: developed world, non-Annex B: developing world (from paper)

So what does this mean, other than the fact that everyone is trying to avoid having to clean up their own mess?

It means there’s a problem with the way we count emissions from trade vs emissions from consumption. It also means that we’re currently failing to reduce our carbon emissions in any significant way, which puts us on a straight path to 4, 5 or 6oC of global warming, otherwise known as the next mass extinction.

One Size Doesn’t Fit All

Looking at the Nationally Appropriate Mitigation Actions being undertaken by developing countries with the UNFCCC.

WHO: Xander van Tilburg, Sophy Bristow, Frauke Röser, Donovan Escalante, Hanna Fekete, MitigationMomentum Project, Energy research Centre of the Netherlands (ECN)

WHAT: An update on the progress of the NAMA projects under the UNFCCC process

WHEN: June 2013

WHERE: Published on their website MitigationMomentum.org

TITLE: Status Report on Nationally Appropriate Mitigation Actions (NAMAs) (open access)

This week, the UNFCCC (United Nations Framework Convention on Climate Change) is meeting in Bonn to try and make some more progress towards action on climate change (yay!). This paper was released to coincide with the negotiations, and I thought it would be interesting to look at what actually happens on the ground in relation to the high-level negotiations. There will be lots of acronyms, so bear with me and I’ll try and get us there in English.

What is a NAMA?

NAMAs are not this guy (photo: Tamboko the Jaguar, flickr)

Did you say Llama? No, NAMA… In true bureaucratic style, the UN came up with the really forgettable name of NAMA for Nationally Appropriate Mitigation Actions, which can also be called ‘correctly fitting climate jeans’ or even ‘different places are different’. I want to keep calling them llamas (because I lack that maturity) but I promise not to make any bad llama jokes. The main thing you need to remember is that climate mitigation in Alaska is obviously going to be different to climate mitigation in Indonesia because they’re very different locations and economies.

The idea behind NAMAs is a bottom-up approach to the UNFCCC negotiations and the ideas that come out of them (they’re not just talk-fests – they have program and policy ideas too!).

Because the wealthy industrialised countries (also called the First World, Global North or Annex 1 in UN speak) are mostly responsible for the emissions causing climate change, we’re also more responsible for the cleanup. So in 2007, at the Conference of the Parties (COP 13) in Bali, it was decided that developing countries would propose NAMA projects for mitigation work they’d like to do, funded by industrialised countries.

The projects need to be related to sustainable development and are supported through technology, financing and capacity building (training local people). The people running the projects also report back to the UNFCCC so that progress can be monitored (like any project). The first group of NAMAs were submitted to the UNFCCC Secretariat at the Copenhagen COP 15 in 2009.

NAMA projects are only conducted in developing countries because the idea is that it’s going to be easier for those countries to change the way they’re developing towards a low carbon economy, rather than just following in the full carbon burning footsteps of the industrialised world and then having to retrofit low carbon alternatives.

So if they’re going to try and build it right the first time round, what do they do? First, the country comes up with a feasibility study – what do they want to do and is it possible? If it is possible, then they develop the concept to present to the UNFCCC for funding. The concept has to have a mitigation objective and be clear about who is running the project as well as support from the government of the country.

Once they’ve worked out what they’re doing, they start the preparation phase where they work out the costs, the specific support they need to pull off the project and an estimate of how much carbon emissions will be reduced through the project.

Finally, they start the implementation of the project, which is my favourite bit – getting on the ground and getting it done.

NAMA projects by stage (left) and location (right) (from paper)

So far, €100 million has been provided to NAMA projects, and a NAMA facility was launched in December 2012 to help projects with financial and technical support. Most of the projects are related to energy supply, and the majority of them (56%) are based in Latin America.

The funding agreed to was from 2010 until 2012, so a long term financing arrangement will need to be made at this year’s talks, but I think it’s really exciting to see the tangible reality of what the UNFCCC is trying to do.

The first two NAMA projects submitted were from Ethiopia (shifting freight to electric rail) and Mali (energy efficiency and renewable energy supply).

So far, five projects have advanced far enough to receive funding. The projects are between 3-5 years in length, need between €5-15 million in funding, and should be able to start quickly (within 3-12 months) after applying.

The five projects are:

  1. Small scale renewable energy projects in Northern Sumatra, Indonesia with a feed-in tariff for independent power producers (IPPs)
  2. A project to stimulate investment in renewable energy systems in Chile
  3. Waste to energy systems using agricultural waste in Peru (with different approaches tailored to different geographic locations)
  4. Energy conservation and efficiency standards for the building sector in Tunisia
  5. A geothermal energy project in Kenya

There are still details and processes that need to be worked out as the NAMA program progresses, given that one size never fits all for climate mitigation and renewable energy generation. But I really like the idea of locally developed projects that suit the challenges different countries face being implemented on the ground, supported at a high level from the UNFCCC.

Climate Question: Do We Get to Keep Flying?

An analysis of jet fuel alternatives that could be viable in the next decade.

WHO: James I. Hileman, Hsin Min Wong, Pearl E. Donohoo, Malcolm A. Weiss, Ian A. Waitz, Massachusetts Institute of Technology (MIT)
David S. Ortiz, James T. Bartis, RAND Corporation Environment, Energy and Economic Development Program

WHAT: A feasibility study of alternatives to conventional jet fuel for aircraft.

WHEN: 2009

WHERE: Published on both the MIT website and RAND Corporation website

TITLE: Near-Term Feasibility of Alternative Jet Fuels

Last week, I looked at how our transport systems could be carbon free by 2100 and was intrigued by the comment ‘hydro-processed renewable jet fuel made from plant oils or animal fats is likely to be the only biomass-based fuel that could be used as a pure fuel for aviation, but would require various additives in order to be viable as an aviation fuel’.

It made me wonder what was being done about airplane fuel alternatives – or do we not have any, and will I have to give up visiting my family in Australia?

Any other options? (photo: Amy Huva 2013)

I came across this technical report by MIT and the RAND Corporation (apparently RAND stands for Research ANd Development), sat down to read all 150 pages (you’re welcome), and looked at what our options are for fuels we could feasibly use in the next decade.

The paper compared alternative fuels on the basis of compatibility with existing aircraft and infrastructure, production potential, production costs, lifecycle Greenhouse Gas (GHG) emissions, air quality emissions, merit of the fuel as jet fuel vs ground fuel and the maturity of the technology.

The researchers pointed out (quite rightly) that emissions from biofuels need to take into account the carbon emitted through land use changes because if you clear-fell a forest to plant a biofuel crop any carbon you’ve saved by not burning oil has just been invalidated by the carbon emitted from clear-felling the forest.

Deforestation: not helpful. (Image by: Aidenvironment, flickr)

There were five different fuel groups looked at:

  1. Conventional Petroleum
  2. Unconventional Petroleum
  3. Synthetic fuel from natural gas, coal or biomass
  4. Renewable oils
  5. Alcohols

The standard fuel used in North America for aviation is called Jet A and was used as the benchmark for the study. So what did they find?

Conventional Petroleum Fuels

Almost all Jet A fuel comes from crude oil and is kerosene based. The emissions from Jet A are 87.5g of CO2e (CO2 equivalent) per megajoule (MJ) of energy created (g CO2e/MJ). Of that 87.5g, 73.2g comes from the burning of the fuel and there can be a 7% variation on the amount emitted from refining depending on the quality of the crude oil used and the refining process.

The world consumption of jet fuel is estimated at 5 million barrels of oil per day. This number is really hard to wrap your head around, so let me quickly walk you through some math. A barrel of oil is 159L, which means 5 million barrels per day is 795,000,000L of oil burned each day. To pump that volume through a fire hose (359L/minute), you would have to run it non-stop for more than four years. We burn that much in one day.
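For the skeptical, the unit conversion can be checked directly (assuming the 359 L/minute fire hose flow rate used above):

```python
# Sanity check on the fuel-volume comparison
BARREL_LITRES = 159                  # litres in one barrel of oil
barrels_per_day = 5_000_000          # estimated world jet fuel consumption

litres_per_day = barrels_per_day * BARREL_LITRES    # jet fuel burned daily
hose_litres_per_day = 359 * 60 * 24                 # fire hose running 24/7

days_of_hose = litres_per_day / hose_litres_per_day
print(round(days_of_hose), round(days_of_hose / 365, 1))  # ~1538 days, ~4.2 years
```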

Given that a conventional fuel is already oil based and you can’t reduce those carbon emissions, the tweak for this paper was an Ultra Low Sulfur Jet A fuel, which would reduce sulfur emissions from burning the fuel.

While it’s great to reduce the sulfur emissions that cause acid rain, the extra refining needed upped the lifecycle emissions to 89g CO2e/MJ.

Unconventional Petroleum Fuels

Unconventional fuels are things like the Canadian tar sands (or oil sands if you’re their PR people) and Venezuelan Heavy Oil. These oils are dirtier and require more refining to be made into jet fuel. They also require more effort to get out of the ground, and so the lifecycle emissions are 103g CO2e/MJ (with an uncertainty of 5%). The upstream emissions of sourcing and refining the fuel are what add the extra – burning the fuel has the same emissions as Jet A, and the upstream emissions range from 16g CO2e/MJ for open cut mining to 26g CO2e/MJ for in-situ mining.

You can also get shale oil through fracking and refine it to Jet A. Shale-based Jet A also burns the same as Jet A, but the extraction emissions are a whopping 41g CO2e/MJ – roughly double the tar sands extraction emissions – giving overall lifecycle emissions of 114.2g CO2e/MJ.

Fischer-Tropsch Synthetic Fuels

These are fuels derived through the catalysed Fischer-Tropsch process and then refined into a fuel. These fuels are good because they have almost zero sulfur content (and therefore almost zero sulfur emissions). They don’t work as a 100% fuel without an engine refit because of their different aromatic hydrocarbon content, and their energy density is 3% less than Jet A (meaning you’d need 3% more fuel in the tank to go the same distance). However, they blend easily into a 50/50 mix for planes.

You can make FT Synthetic fuel from natural gas which gives you 101g CO2e/MJ emissions, from coal which gives you between 120-195g CO2e/MJ and relies on carbon capture and storage as a technical fix, or from biomass, which has almost zero lifecycle emissions ONLY if you use a waste product as the source and don’t contribute to further land use changes.
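Pulling the lifecycle numbers quoted so far into one place makes the comparison easier. The blend calculation below is a simple energy-weighted average (it ignores the ~3% energy-density penalty mentioned above, and the near-zero biomass figure holds only for waste feedstocks with no land use change):

```python
# Lifecycle emissions intensities as quoted in this summary, in g CO2e per MJ
LIFECYCLE = {
    "Jet A": 87.5,
    "Ultra Low Sulfur Jet A": 89.0,
    "Tar sands": 103.0,
    "Shale oil": 114.2,
    "FT from natural gas": 101.0,
    "FT from coal (no CCS)": 195.0,
    "FT from biomass waste": 0.0,  # near-zero ONLY for waste feedstock, no land use change
}

def blend_intensity(fuel_a, fuel_b, fraction_a=0.5):
    """Energy-weighted average intensity of a two-fuel blend."""
    return fraction_a * LIFECYCLE[fuel_a] + (1 - fraction_a) * LIFECYCLE[fuel_b]

# A 50/50 Jet A / biomass-FT blend roughly halves lifecycle emissions
print(blend_intensity("Jet A", "FT from biomass waste"))  # 43.75
```

The same function shows why a coal-derived blend goes the wrong way: a 50/50 Jet A / coal-FT mix without carbon capture comes out well above straight Jet A.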

Renewable Oils

These are biodiesel or biokerosene which can be made from soybean oil, canola oil, palm oil, coconut oil, animal fats, waste products or halophytes and algae.

Because this paper was looking at fuels that could be commercially used in the next 10 years, they looked at a 5% blend with Jet A fuel to meet freeze point requirements (most renewable oils freeze at too high a temperature for the altitude planes fly at). They found too many safety and freezing point issues with biodiesel or biokerosene, so didn’t calculate the emissions from them as they’re not practical for use.

Another renewable oil is Hydroprocessed Jet Fuel, entertainingly sometimes called ‘Syntroleum’. This is made from plant oils, animal fats or waste grease. Soybean oil without land use emissions would have only 40-80% of the emissions of Jet A, while palm oil would have 30-40%.

Alcohol Fuels

The paper looked at using ethanol (the alcohol we drink) and butanol as replacement fuels. Both have lower energy densities than Jet A, higher volatility (they’re more flammable and explosive) and issues with freezing at cruising altitude. While butanol is slightly safer as a jet fuel than ethanol, the report suggests it’s better used as a ground transport fuel (I assume the better use of ethanol as a drink is implied).

Options for jet fuel alternatives (from paper)

After going through all the options, the researchers found three main alternative fuels that could be commercially implemented over the next decade:

  1. Tar sands oil
  2. Coal-derived FT Synthetic oil
  3. Hydroprocessed Renewable jet fuel

They recommended that when looking to reduce emissions from the transport sector, aviation shouldn’t be treated any differently. While strongly recommending that land use changes be taken into account for biofuels, they also pointed out that limited biofuel resources may be more effective at producing heat and power than being used for transport.

Personally, I don’t find the report very heartening, given that the first two options involve either dirtier oil or really dirty coal when what we need to be doing is reducing our emissions, not changing the form they’re in and still burning them. I’ll be keeping my eye out for any new research into hydroprocessed renewable jet fuels that could use waste products or algae – given the speed at which the oceans are acidifying, there could be a lot of ocean deadzones that are really good at growing algae, which could then be made into jet fuel.

But until then, it looks like there aren’t many options for the continuation of air travel once we start seriously reducing our emissions – it’ll be a really quick way to burn through our remaining carbon budget.

Your Transport – Carbon Free in 2100

Detailed scenarios looking at how all transport of people and goods can be zero carbon by 2100

WHO: L.D.D. Harvey, Department of Geography, University of Toronto, Canada

WHAT: Scenarios across all sectors of transport for people and goods and how they can be zero carbon by the year 2100

WHEN: March 2013

WHERE: Energy Policy, Vol. 54

TITLE: Global climate-oriented transportation scenarios (subs req.)

We need to decarbonise our economy, but what does that actually look like? What do our transit and transport systems look like with zero carbon? Are we all going back to the horse and cart? I don’t think my apartment can fit a horse!

This very, very detailed paper from the University of Toronto looked at what might happen. The general gist is that first we need to work really hard to increase the efficiency of all our transport. Once we’ve got the energy intensity as low as possible on everything, we need to switch the remaining energy requirements over to different fuel sources (biofuels, hydrogen fuel cells, electricity).

For this paper, the globe was divided into ten socio-economic regions with different per capita incomes, activity levels, energy intensities, and potential for future population growth, income growth and energy use. Each region was then analysed for per capita travel by light duty vehicles (cars, SUVs, pickup trucks), air, rail and other modes of transport. To further complicate the calculations, low-growth and high-growth scenarios were examined as well.
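The backbone of this kind of scenario is a Kaya-style decomposition: energy demand = population × per-capita activity × energy intensity, summed over regions and modes. A minimal sketch with invented numbers (these regions, mode shares and intensities are illustrative, not the paper’s data):

```python
# Kaya-style transport energy demand: population x activity x intensity
# All numbers below are invented for illustration, not from the paper.

regions = {
    # region: (population in millions, km travelled per person per year)
    "North": (500, 12_000),
    "South": (3_000, 3_000),
}

modes = {
    # mode: (energy intensity in MJ per passenger-km, share of total travel)
    "car": (1.8, 0.6),
    "rail": (0.4, 0.2),
    "air": (2.0, 0.2),
}

def energy_demand_pj(regions, modes):
    """Total transport energy demand in petajoules (1 PJ = 1e9 MJ)."""
    total_mj = 0.0
    for pop_millions, km_per_person in regions.values():
        person_km = pop_millions * 1e6 * km_per_person
        for intensity, share in modes.values():
            total_mj += person_km * share * intensity
    return total_mj / 1e9

print(round(energy_demand_pj(regions, modes)))  # total PJ per year
```

A decarbonisation scenario then works on each factor separately: lower the intensities (efficiency), shift the mode shares (cars to rail), and swap the remaining megajoules to zero-carbon fuels.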

The data started from 2005 and was extrapolated out to 2100, and if this kind of large scale number crunching really gets you going, all the spreadsheets the researcher used are available online here (Climate-OrientedTransportScenarios) for you to do your own zero carbon transport scenarios (thanks to Dr. Harvey for making this available open access).

Energy demand scenarios (from paper)

Interestingly, growth in per capita travel relative to GDP growth has halted in several industrialised countries, which makes sense when you think about it – beyond a certain point you end up with more money to travel than time to do it in.

In terms of climate change, the paper assumes we’re able to stabilise the CO2 concentration in the atmosphere at 450ppm. The paper also talks a lot about peak oil and the effect it could have on resource prices and the availability of fossil fuels. Given that we need to leave 80% of the known fossil fuel reserves on the planet in the ground, I’m not so sure how much effect peak oil may have, but you never know – we could be suicidal enough to try and burn all of it.

Cars

Improvements need to be made in reducing the weight of cars and improving engine efficiency and aerodynamics. Passenger space will increase so we can transport more people per car, air conditioning will become more efficient (and necessary in some places because of climate change), and hybrid electric cars will replace fossil fuel cars for urban driving. Fuel consumption drops from 10.4L/100km in 2005 to 1-2L/100km (of a biofuel) in 2100.

While I was really hoping the paper would tell me of the demise of ugly giant pickup trucks, sadly it looks like we may keep them and they’ll become hydrogen fuel cell monster trucks.

Buses

Buses will increase engine efficiency and ridership. Many buses are already diesel or electric; diesel engine efficiency will rise to around 50%, and hydrogen fuel cell buses will reach 60%.

Passenger Rail

Trains will be electrified where they can be, and efficient diesel (becoming biofuel) where they can’t be electrified.

Air

The efficiency of planes is expected to increase by 20% from 2000 to 2020, with a 1% efficiency gain every year after that. The International Civil Aviation Organisation (ICAO) has already announced it’s aiming for 2% per year efficiency gains to 2050, so this one isn’t too far from reality. However, the paper points out that this will probably require a radical change in aircraft design, and possibly a switch to biomass-based fuels from plant oils or animal fats beyond that.
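The compounding matters here: a 20% gain by 2020 plus 1% per year afterwards stacks up multiplicatively, not additively. A quick calculation of where that lands by 2100:

```python
# Cumulative aviation efficiency gain: 20% by 2020, then 1%/year to 2100
intensity_2020 = 0.80           # energy per passenger-km in 2020, relative to 2000
years_after = 2100 - 2020       # 80 years of compounding 1% annual gains

intensity_2100 = intensity_2020 * (0.99 ** years_after)
print(f"{1 - intensity_2100:.0%} reduction vs 2000")  # roughly a two-thirds reduction
```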

Freight

Freight trains need to reduce their weight, improve their engine efficiency, develop diesel-electric hybrid drive trains and get clever about load configuration to maximise efficiency. The energy requirement of tractors and other long haul trailers also needs to be reduced.

Marine freight is an interesting one. The paper points out that the majority of the world’s shipping is currently oil, coal and other bulk materials like iron ore. Obviously, none of this will need to be shipped anywhere in a zero carbon world, because we won’t need it. Mostly, marine freight will reduce the energy intensity of ships, and future transport will be made up of 60% container ships, 20% bulk ships, 10% general cargo ships and 10% biofuel supertankers.

Green Scenarios

The paper also looks at some ‘Green Scenarios’ which are the ones where we actually get ourselves into gear seriously to decarbonise (and hopefully stop having the endless debate about whether climate change is ‘real’).

The green scenarios further reduce total passenger travel, with truck and air travel shifted to rail and other modes. There’s also an extra 20% decrease in global freight, which makes me hope people become more minimalist and have less junk in this future scenario (I can dream!).

Initially, the greatest demand for biofuels is from cars, but by 2035 freight is the biggest biofuel user, so maybe we’ve also become cleverer in the way we plan urban areas, with density and rapid transit too? (I think I like this future planet!)

Fuel demand scenarios (from paper)

The paper concludes that we need new urban development with higher density, more walkable, bikable and transit friendly options as well as making energy intensity reductions in all forms of transport and then switching the remaining fossil fuels to hydrogen or biofuel. This will go hand in hand with engine efficiency increases as well as battery technology improvements.

The key thing I took away from this paper is that we need to be doing ALL of this. We can’t just drive an electric car and still have our books from Amazon.com shipped here on an old, inefficient cargo ship belching fossil fuels. We also can’t fix one single transport sector and wash our hands of it saying ‘there- I fixed climate change!’

Climate change will affect everything, regardless of whether we actually do something about it or not. So we need to change the way we do everything to do it without carbon.