IPCC Discovers Infographics – Communicates Climate Change

Working Group II put out their state of the climate for AR5 this March and finally worked out how to communicate climate change.

WHO: The many, many world leading scientific authors of the IPCC – the list is here.

WHAT: Working Group II – the impacts, vulnerability and adaptation group from the IPCC

WHEN: 31 March 2014

WHERE: The IPCC website

TITLE: Climate Change 2014: Impacts, Adaptation, and Vulnerability (open access)

Remember in October 2013, when the first chunk of the IPCC 5th Assessment Report (AR5 for us nerds) was released and I was really snarky about how everyone at the UN speaks a really dense dialect of bureaucrat that almost no-one else can understand and will therefore not bother to read?

Well this time the IPCC got it right! The report from Working Group II – who look at the impacts and vulnerabilities of climate change, and the adaptation humanity will have to do because of it – discovered colours, infographics and language that normal people actually speak, and used them in their summary for policymakers. They even improved the website to make it more user friendly. Round of applause to the IPCC, the UN Foundation and all the communicators who probably spent many hours de-wonkifying the scientific and bureaucratic language.

The IPCC discovers colour and images (from paper)


This means my job this week was much easier, as I don’t even need to translate the 44-page summary for you – but since it’s still 44 pages long, I’ll give you the short version.

This time around, the IPCC deliberately went with a risk management frame for communicating climate impacts, and noted that doing something about climate change isn’t really related to the science, so much as it’s a value judgement on how much we’re willing to roll the dice. They do however helpfully point out that ‘climate change poses risks for human and natural systems’ and that it’s being felt in all countries, on all continents and in all the oceans as well. Sorry to burst your bubble if you thought climate change wasn’t going to get you too.

They even put a glossary up the front so you know what they’re talking about when they use words like ‘hazard’, ‘exposure’, ‘vulnerability’, ‘impacts’, ‘risk’, ‘adaptation’, ‘transformation’ and ‘resilience’. Communication high five, IPCC.

Siku the polar bear (image: Polar Bears International)


Impacts
So what’s happening so far because of climate change then? Well, it’s a long list of nasty stuff: glaciers are melting, there’s drought, and the permafrost is melting and releasing methane. Species are being forced out of their habitats faster than they can move and going extinct, and the IPCC can report with a high level of confidence that we’re causing climate change extinctions at a much faster rate than ever happened in previous natural cycles of climate change.

Plant and animal migration opportunities and how far they could get pushed with climate change (from paper)


Crops are being hurt more by climate extremes than they’re being helped by the extra CO2, and the ability to grow lettuce in Greenland will be a coin toss depending on the quality of the soil at such high latitudes. Climate change is already affecting the yield of wheat, maize, rice and soybean crops.

Climate change is affecting (and will continue to affect) humans too – it’s harming our health through heatwave deaths and increased waterborne diseases. It’s a ‘threat multiplier’, which means it makes stressful situations more dire, like the drought in Syria which was a big factor in the current civil war there.

The authors also point out that vulnerabilities differ because of inequality, which is their nice way of saying that if you’re poor or you live in a poor country, climate change will hit you first. This makes sense from what we’re already seeing of climate impacts and the clean-up from extreme weather disasters – it’s much harder to plan for climate adaptation when you live in a warzone.

Adaptation
After all that depressing news, they follow up with some good news – what we’re doing to adapt to climate change. They point out that adaptation is becoming embedded into planning processes, so areas will be more resilient to changes. Adaptation knowledge is accumulating in both the private and public sectors and is being incorporated into risk management processes.

They do point out though that adaptation and mitigation choices that are made now will affect the severity of climate change for the rest of the century. No pressure or anything, but if we get this wrong all your grandchildren might hate you for it.

Future risks
Then they get into how bad it could get if we do nothing. Low-lying Pacific Islands go underwater (the first one was actually evacuated last weekend), coastal cities get flooded, people die in storms and heatwaves, food runs short in some places, farmers lose their land from drought and desertification and places we are really fond of like the Great Barrier Reef die too.

But even if you don’t care about the plants, animals or people in far away countries, the IPCC isn’t going to let you off the hook. They point out that human influence on the climate system is clear, and it’s the level of danger to humans that we have to manage.

Then they do get a little wonky and come up with a hilarious acronym: RFC which stands for ‘Reasons for Concern’ (bureaucrats have a deep love of acronyms). What are the RFCs and should they be keeping you up at night?

Well it’s your call to lose sleep over it, but you should be worried about losing unique systems (any natural wonder of the world basically), extreme weather, uneven distribution of impacts (even if climate change doesn’t destroy your home city, where do you think all the migrants from the dustbowl will go?), global aggregate impacts (like ocean acidification killing all commercial fisheries), and abrupt irreversible impacts (hello melting Greenland ice sheet!).

Sensibly, they point out that increased warming puts us at a greater risk of ‘severe, pervasive and irreversible impacts’, and that the cost of adapting to all these scary disasters is much cheaper if we mitigate (you know, stop burning carbon).

Sectoral risks
Just in case you still thought that climate change is not going to affect you, your friends and family, your hometown and your favourite holiday location, the IPCC would like to let you know it’s also going to affect your livelihood and your access to food.

We’re going to have more drought and water shortages, and we could see abrupt changes in the Arctic or the Amazon rainforest, causing all kinds of disruption not only to carbon storage, water quality and biodiversity but also to economic activity.

Coastal populations will be threatened by flooding, fisheries could collapse, and ocean acidification has already caused the loss of $10 million worth of scallops in Canada. We’ll probably get more famines, wiping out all the great work charities have done to try to end world hunger, and if that wasn’t bad enough, the report says ‘all aspects of food security are potentially affected by climate change, including food access, utilisation and price stability’. Everything is going to get more expensive and harder to source.

Cities will have more heat stress, flash flooding, landslides, air pollution, drought and water scarcity (the difference being that drought is when you’re short on water for your garden, water scarcity is when you’re short on water for people). Rural areas will have more food and water insecurity and could lose their farms and livelihoods to drought.

And if that laundry list of destruction wasn’t enough for you, here’s what the IPCC says about their worst case scenario projection (which is what will happen with business as usual): ‘by 2100 for the high-emission scenario RCP 8.5 the combination of high temperatures and humidity in some areas for parts of the year is projected to compromise normal human activities including growing food or working outdoors’.

Yeah, business as usual will make it too hot to go outdoors in some places and you won’t be able to grow any food.

Too hot to go outside – business as usual in 2100 on right (from paper)


Building resilience
So now that the IPCC has told us with high levels of certainty that we’re in big trouble and that climate change is going to affect everyone, no matter how much money you have to still import bacon, coffee and avocados, what can we do about it?

Firstly – coordinate across different levels of government for things like flood-proofing and building infrastructure. Use the range of available strategies and actions to make sure communities are reducing their vulnerability – each of the risk bars on the IPCC infographic has a shaded area, which is the amount of risk that can be reduced through adaptation. Make sure planning takes into account diverse interests, circumstances and sociocultural contexts.

Adaptation risk management opportunities for Australia (from paper)


Some of the really hard conversations around climate change in the future are going to be with communities who will need to relocate or will lose their way of life because of climate impacts. These discussions are both really important and really difficult – we should be planning for that.

The report gives a slight nod to fossil fuel subsidies (and the need to remove them) by saying ‘improved resource pricing, charges and subsidies’, which is their way of saying ‘divest, people’.

Also, (and somewhat obviously, but these things need to be said) the success of any adaptation will depend on how much we mitigate. Unless we stop burning carbon, we won’t have anything left we can adapt to – remember, business as usual makes it too hot to go outside and grow food.

So there you have it – the IPCC have kicked a massive goal this time around managing to stop speaking bureaucrat and start communicating with people. Kudos where it is deserved. Working Group III have their report coming out next week, so we’ll see if they can keep up the great work.

In the meantime, let’s stop burning carbon.

Our Fast-Paced Modern Climate

How can we determine dangerous levels of climate change so that we can stay within those limits?

WHO: James Hansen, Makiko Sato, Jeffrey Sachs, Earth Institute, Columbia University, New York, USA
Pushker Kharecha, Earth Institute, Columbia University, New York, and Goddard Institute for Space Studies, NASA, New York, USA
Valerie Masson-Delmotte, Institut Pierre Simon Laplace, Laboratoire des Sciences du Climat et de l’Environnement (CEA-CNRS-UVSQ), Gif-sur-Yvette, France
Frank Ackerman, Synapse Energy Economics, Cambridge, Massachusetts, USA
David J. Beerling, Department of Animal and Plant Sciences, University of Sheffield, South Yorkshire, UK
Paul J. Hearty, Department of Environmental Studies, University of North Carolina, USA
Ove Hoegh-Guldberg, Global Change Institute, University of Queensland, Australia
Shi-Ling Hsu, College of Law, Florida State University, Tallahassee, Florida, USA
Camille Parmesan, Marine Institute, Plymouth University, Plymouth, Devon, UK, Integrative Biology, University of Texas, Austin, Texas, USA
Johan Rockstrom, Stockholm Resilience Center, Stockholm University, Sweden
Eelco J. Rohling, School of Ocean and Earth Science, University of Southampton, Hampshire, UK, and Research School of Earth Sciences, Australian National University, Canberra, ACT, Australia
Pete Smith, University of Aberdeen, Aberdeen, Scotland, United Kingdom
Konrad Steffen, Swiss Federal Institute of Technology, Swiss Federal Research Institute WSL, Zurich, Switzerland
Lise Van Susteren, Center for Health and the Global Environment, Advisory Board, Harvard School of Public Health, Boston, Massachusetts, USA
Karina von Schuckmann, L’Institut Francais de Recherche pour l’Exploitation de la Mer, Ifremer, Toulon, France
James C. Zachos, Earth and Planetary Science, University of California, Santa Cruz, USA

WHAT: Working out what the limit to dangerous climate change is and what the implications are for the amount of carbon we need to not burn.

WHEN: December 2013

WHERE: PLOS One, Vol 8. Issue 12

TITLE: Assessing ‘‘Dangerous Climate Change’’: Required Reduction of Carbon Emissions to Protect Young People, Future Generations and Nature (open access)

This (very) lengthy and detailed paper runs us all through exactly what’s happening with the planet’s climate, what’s making it change so rapidly (spoiler: it’s us) and what objectively we need to do about it. Needless to say, since the lead author is Dr. James Hansen, the godfather of climate science, we would do well to heed his warnings. He knows his stuff; he was doing climate models before I was born (seriously).

Firstly, the paper points out that humans are the main cause of climate change, and then neatly points out that while all 170 signatories to the UN Framework Convention on Climate Change (UNFCCC) have agreed to reduce emissions, so far not only have we not worked out what the limit for ‘dangerous’ climate change is, we’ve also done nothing to fix it except fiddle at the edges.

Epic procrastination fail, humanity.

One planet, different ways to reach zero emissions (Norman Kuring, NASA GSFC, using data from the VIIRS instrument aboard Suomi NPP)


Then, the researchers look at 2°C of warming as a target. While 2°C is a nice, seemingly round number that is comfortably far from our current 0.8°C of warming, the reason it was chosen to be our line in the sand is that it’s the point beyond which ecosystems start collapsing. I have a sneaking suspicion it was also easy to agree on because it was way into the ‘distant’ future, but let’s play nice and believe it was all for rational scientific rigour.

The latest IPCC report says that if we’re going to stay below 2oC, we can burn a total of 1,000GtC (Gigatonnes of carbon). Ever. This means we need to leave fossil fuels in the ground and stop cutting down more trees than we plant.

As has been pointed out in previous papers, the researchers show that burning all the fossil fuels is a really bad idea. A really bad idea in a mass extinction like the dinosaurs kind of way.

So, if we look at all the warming that has happened so far and measure the energy imbalance in the atmosphere, what do we get? First, a step back – energy imbalance. Think of the planet as having an energy budget that you want to stay balanced: energy comes into the atmosphere from the sun, some goes back out, and some stays and keeps us warm and comfy on planet Earth.

Fossil fuels mean that humans have taken a seriously large amount of carbon out of the ground and burned it. The CO2 this releases traps more of the sun’s energy inside our atmosphere than escapes back out, so we’re now out of balance.

What happens when we’re out of balance? Well, so far it hasn’t been pretty. With only 0.8°C of global warming, 98% of Greenland’s surface melted for the first time in recorded history, Arctic sea ice hit new record lows, and the planet has seen more frequent and more extreme storms, floods, typhoons, hurricanes, droughts, fires, algal blooms, glacial melt, and ocean acidification. We’ve had weird storms no-one had ever heard of before, like derechos; we’ve had tropical diseases in new places; and the jet stream over the Northern Hemisphere has been getting ‘stuck’ and dumping more weird weather on us. It’s pretty clear the planet is unwell and that it’s because of us.


If all that terrifying stuff is happening at 0.8°C of warming, what does that make 2°C? Hopefully your answer is ‘horrifying’, because that’s what mine is. Since 2050 (when we’ll arrive at 2°C if we keep going with business as usual) is within my working lifetime, I’ll let you know how horrifying it is when we get there.

More scientific than ‘horrifying’, though: the researchers point out that previous paleoclimate changes, driven by the Earth’s tilt and other slow oscillations, took between 20,000 and 400,000 years to happen. Changes happening at that rate give plants, animals and fish time to relocate and survive. The rate at which we’re changing our modern climate is bad news for things that are not mobile.

How far out of balance are we? The paper estimates that between 2005 and 2010 the planet was 0.58 W/m2 (±0.15 W/m2) out of balance. How much of that was caused by humanity? Well, solar irradiance has actually been declining recently, so it’s pretty much all us.

If we are 0.5 W/m2 out of balance, the researchers calculate that we would need to reduce the CO2 concentration down to 360ppm to have energy balance again (we’re currently at 395ppm). If you include some error in the calculations and we’re 0.75W/m2 out of balance, humanity needs to get CO2 down to a concentration of 345ppm.
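If you’re curious how an energy imbalance maps onto a CO2 concentration, a common back-of-the-envelope tool (not necessarily the exact method the authors used) is the simplified CO2 forcing formula ΔF ≈ 5.35 × ln(C/C0) from Myhre et al. (1998). A quick sketch reproduces the paper’s numbers pretty closely:

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Change in radiative forcing (W/m2) between two CO2 concentrations,
    using the simplified expression dF = 5.35 * ln(C / C0) (Myhre et al. 1998)."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Forcing removed by drawing CO2 down from today's ~395 ppm:
print(round(co2_forcing(395, 360), 2))  # ~0.5 W/m2
print(round(co2_forcing(395, 345), 2))  # ~0.72 W/m2
```

Drawing CO2 down from 395ppm to 360ppm removes about 0.5 W/m2 of forcing, and down to 345ppm removes about 0.72 W/m2 – right in line with the paper’s 0.5 and 0.75 W/m2 imbalance estimates.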

To make planning easier, the researchers suggest we just aim to get and stay below 350ppm.

The paper then runs through all the reasons why letting 2°C happen is a really bad idea: because it will lead to damaging sea level rise (sorry Miami), because change is happening too quickly for many species to survive and more than half the species on the planet could go extinct from too much warming (and yes, if we warm this planet enough, humans could be part of that mass extinction), and because the recovery back to normal temperatures happens on a timescale of millions of years, which is beyond the comprehension of humanity.

So to avoid being the next mass extinction, what do we need to do? First, we need to quantify how quickly fossil fuels need to be totally phased out.

If emissions are reduced to zero in 2015, the world could get back to 350ppm by 2100. If we wait until 2035, it would take until 2300. If we wait until 2055, it would take until the year 3000. So when we start reducing emissions is important.

Reduction scenarios (from paper) BAU: Business As Usual


If we had started reducing emissions in 2005, it would only have taken reductions of 3.5% per year. Since we didn’t do that, if we start now, we need to reduce emissions by 6% a year. If we delay until 2020 it becomes 15% per year, so let’s not procrastinate on this one, humanity. Also keep in mind that the amount considered ‘politically possible’ is currently around 2–3% reductions each year, which means that scientific reality and political delusions are going to collide very soon.

If we reduce our carbon emissions by 6% per year to keep below 350ppm of carbon dioxide by the end of the century, our total carbon budget is going to be 500GtC.

This means we’ve got ~129GtC that we can burn between now and 2050, and another 14GtC left over for 2050-2100. Humanity has already burned through ~370GtC from fossil fuels, so we’ve got to kick this habit quickly.
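The arithmetic behind ‘reduce by 6% a year’ is just a geometric series: if you cut emissions by a fixed fraction every year, total future emissions converge to a finite number. Here’s a rough sketch, assuming current fossil fuel emissions of about 9 GtC per year (my assumption, roughly the early-2010s figure, not a number from the paper):

```python
def cumulative_emissions(current_gtc_per_year, annual_cut, years=200):
    """Total future emissions (GtC) if we cut a fixed fraction of
    emissions every year - a geometric series that converges to
    roughly E0 * (1 - r) / r."""
    total, e = 0.0, current_gtc_per_year
    for _ in range(years):
        e *= (1 - annual_cut)  # this year's emissions after the cut
        total += e
    return total

print(round(cumulative_emissions(9, 0.06)))   # ~141 GtC - close to the ~143 GtC left
print(round(cumulative_emissions(9, 0.035)))  # ~248 GtC - too much; hence 6% is needed now
```

Cutting 6% a year from here keeps total future emissions close to the remaining budget, while the 3.5% rate that would have worked in 2005 now overshoots it badly.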

The paper points out that this means all of our remaining fossil fuel budget can be provided for by current conventional fossil fuels. Therefore, we would require the rapid phase-out of coal and leave all unconventional fossil fuels in the ground. Yes, all of them – all the tar sands, the shale gas, the shale oil, the Arctic oil, the methane hydrates, all of it.

The researchers also say that slow climate feedbacks need to be incorporated into planning, because we’re probably starting to push those limits. Slow feedbacks include things like melting ice sheets (Greenland and Antarctica), deforestation, melting permafrost and methane hydrates.

These things are like climate ‘black swans’ – they’re unquantifiable in that you don’t know when you’ve passed the irreversible tipping point until after you’ve gone beyond it, but things like the ocean no longer being able to absorb most of the carbon we spew into the atmosphere and the rapidly melting permafrost need to be considered in daylight as well as our nightmares now. This is because slow feedbacks can amplify climate changes by 30-50% which puts a big hole in our ‘not burning carbon anymore’ plan.

The paper points out: ‘warming of 2°C to at least the Eemian level could cause major dislocations for civilisation’ which I don’t even need to translate from scientist, because scientists are no longer bothering to pull their punches when explaining how quickly we need to stop burning carbon before we’re really screwed.

So what do we do? The paper makes some suggestions, pointing out that since the science clearly shows what’s happening, the range of responses is also pretty clear.

The first thing is a price on carbon. This paper suggests a carbon ‘fee’ with a border levy for countries that don’t sign up to the fee idea. The fee starts at $15/tonne of carbon and increases by $10/tonne each year. Imports from countries that don’t have the fee get charged at the border, which can then be used for assistance to industries that are exporting to countries without the fee.

They point out that this fee is well below the price of cleaning up our climate carbon mess. If we wanted to sequester 100ppm of CO2 back out of the atmosphere, it would cost ~$500/tonne of carbon. If that was charged to all countries based on their cumulative emissions, it would cost the USA – responsible for 25% of cumulative global emissions – $28 trillion (or about $90,000 per person). Hmmm – expensive.
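You can sanity-check that $28 trillion figure yourself using the standard conversion of roughly 2.12 GtC of carbon per ppm of atmospheric CO2 (that conversion factor is my addition, not spelled out in the paper’s summary):

```python
GTC_PER_PPM = 2.12     # ~2.12 GtC of carbon per ppm of atmospheric CO2 (standard conversion)
COST_PER_TONNE = 500   # USD per tonne of carbon, the paper's sequestration cost estimate

carbon_to_remove = 100 * GTC_PER_PPM                  # 100 ppm drawdown -> ~212 GtC
total_cost = carbon_to_remove * 1e9 * COST_PER_TONNE  # tonnes x dollars per tonne
usa_share = 0.25 * total_cost                         # USA: ~25% of cumulative emissions

print(total_cost / 1e12)  # ~106 trillion USD in total
print(usa_share / 1e12)   # ~26.5 trillion USD for the USA - the paper's $28T ballpark
```

The small gap between ~$26.5 trillion here and the paper’s $28 trillion presumably comes from rounding in the conversion factor and the exact cumulative-emissions share.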

The other things we need in order to get rid of fossil fuels are hybrid renewable smart grids and better efficiency, as well as not just an end to deforestation but ‘reforestation’ – increasing the number of trees on the planet.

There’s a lot of work to be done, but the clearest thing from this paper is the choice we cannot make is to do nothing. So let’s stop burning carbon.

Timing the Carbon Excursion

Trying to fine tune the timeline for the Paleocene-Eocene mass extinction using fossilised mud

WHO: James D. Wright, Department of Earth and Planetary Sciences, Rutgers University
Morgan F. Schallera, Department of Earth and Planetary Sciences, Rutgers University, Department of Geological Sciences, Brown University, Providence, RI

WHAT: Trying to get a more precise timeline of the Paleocene-Eocene thermal maximum (PETM) mass extinction.

WHEN: October 1st, 2013

WHERE: Proceedings of the National Academy of Sciences of the United States of America (PNAS), vol. 110, no. 40, October 2013.

TITLE: Evidence for a rapid release of carbon at the Paleocene-Eocene thermal maximum (subs req.)

My favourite accidental metaphor from this paper is when the researchers talked about a ‘carbon isotope excursion’ that caused the Paleocene-Eocene thermal maximum, which is scientist for carbon being released into the atmosphere causing a mass extinction. However, I couldn’t help thinking of a school bus full of carbon isotopes…

CO2 getting on a school bus??? (image: Hannah Holt, Lightbulb Books)


The Paleocene-Eocene thermal maximum (or PETM for short) occurred around 55.8 million years ago and is considered the best historical comparison to what we’re currently doing to the atmosphere with carbon pollution. There was rapid global warming of around 6°C, which was enough to cause a mass extinction, but from deep-ocean fossilised carbon isotopes the estimate for how long the extinction took ranges from under 750 years up to 30,000 years.

Obviously this is a huge window of time, and unless it can be further narrowed down, not very useful for humans trying to work out how long we have left to do something about climate change. Narrowing the time scale for the mass extinction is what these researchers tried to do.

Firstly, there are several ways the PETM mass extinction could have happened. The possibilities are: release of methane from reservoirs like permafrost or under the ocean, wildfires in peatlands, desiccation of the ocean (the air warming up enough that the ocean starts evaporating), or a meteor impact.

Without working out the timing of the extinction, we can’t work out which mechanism caused it. So these researchers went to an area of the USA called the Salisbury Embayment, which is modern-day Delaware but was shallow ocean 55 million years ago. Because it was shallow ocean that fossilised and is now dry land, it preserves a more detailed record of ancient ocean sediment, whereas sediment still under the ocean keeps getting reworked by currents and so only shows bigger changes.

The researchers looked at sediment cores from Wilson Lake B and Millville which have distinct and rhythmic layers of silty clay through the entire section that corresponds to the carbon excursion into the atmosphere.


Layers of clay in the sediment cores (from paper)

The distinct layers you can see above occur every 1-3cm along the core which allows for a more precise chronology of the mass extinction.

First they ran tests for the amount of 18O (an oxygen isotope) and found cycles of a 1.1‰ change consistently through the core from Wilson Lake B. They also found a similar cycle in the Millville core. They then tried to work out what kind of timescale this could line up with. After running through a few ideas that didn’t fit with other evidence from the period, they worked out that the oxygen cycles were consistent with seasonal changes, which means the sediment cores line up to 2cm of mud per year. This is excitingly precise! It also fits with what researchers know about the build-up of sediment in shallow areas – e.g. the Amazon River can accumulate 10cm/year of mud and clay, so 2cm per year fits the other evidence.
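That 2cm-per-year figure is what turns core depth into a clock. The conversion is trivial, but worth making explicit (the helper below is just illustrative, not from the paper):

```python
def core_years(depth_interval_cm, accumulation_cm_per_year=2.0):
    """Years of elapsed time recorded by a stretch of sediment core,
    assuming a constant accumulation rate (the paper's ~2 cm/yr)."""
    return depth_interval_cm / accumulation_cm_per_year

print(core_years(2))   # -> 1.0: each ~2 cm couplet is one year (one seasonal cycle)
print(core_years(26))  # -> 13.0: 26 cm of core spans 13 years
```

With most deep-ocean cores you’d be lucky to resolve centuries; at 2cm a year, single years become readable.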

Next in the researchers’ whodunit, they sampled the beginning of the carbon release, which was at 273m along the core. To be precise, they sampled from 273.37–273.96m and tested for 13C (a carbon isotope).

They found the carbon isotope ratio (δ13C) in the silt dropped to 3.9‰, then 1.45‰, and then to −2.48‰ over thirteen years. A drop like that in the ocean record is the signature of a big release of carbon into the atmosphere. The concentration of calcium carbonate (CaCO3), which is what shells are made of, also dropped from 6% to 1% within a single layer of sediment. So there was definitely something going on, but what exactly was it?

The beginning of the carbon release is the unknown bit; it’s generally agreed that the recovery from the 6°C of global warming took 150,000 years. The sediment layers show that the release of carbon was almost instantaneous (it happened over 13 years), and after the release it took between 200 and 2,000 years for the ocean and the atmosphere to reach a new equilibrium. This means that after the initial carbon release the atmosphere changed rapidly for the next 2,000 years before it reached the ‘new normal’. From there it took 150,000 years to get back to the original normal from before the carbon release.

Long road to recovery – PETM followed by 2,000 years of change, then a slow slope back to normal temps (from Hansen et al., PNAS 2013); bottom scale in millions of years.

Looking back at what could have produced this order of events in a mass extinction, the only explanation that fits is methane release AND a meteorite. There was a recovery of carbon isotope concentrations in the upper clay, which the researchers say indicates carbon moving from the atmosphere into the ocean, with more carbon movement in the shallow areas than the deep ones. However, the meteorite alone (which accounts for the carbon going from the atmosphere into the ocean) wouldn’t be enough to cause 6°C of global warming. It takes around 3,000 Gigatonnes (Gt) of carbon to change the atmosphere by that much, which is why the researchers think it was a meteorite plus methane release from the ocean adding up to the 3,000Gt.

So what are the lessons for humanity in this great carbon excursion of mass extinction? Firstly, climate change can happen quite quickly. Secondly, once you’ve changed the atmosphere, it’s a long road back to the normal you had before you burned the carbon. Thirdly, it only takes 3,000Gt of carbon to cause a mass extinction.

The last UN IPCC report gave the planet a carbon budget of 1,000Gt of total emissions to keep humanity at a level of ‘safe’ climate change. Currently, the world has burned half of that budget and will blow through the rest of it within the next 30 years. If we burn enough carbon to start releasing methane from permafrost or the oceans, it’s a 150,000-year trek back from a possible mass extinction.
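The ‘blow through the rest of it within 30 years’ claim is easy to sanity-check. Here’s a rough sketch, assuming about 500Gt of budget left, current emissions of about 10Gt of carbon a year, and business-as-usual growth of about 2% a year (all three numbers are my illustrative assumptions, not from the report):

```python
def years_to_exhaust(budget_gt, emissions_gt_per_year, growth_rate):
    """Years until cumulative emissions blow through the budget if
    emissions keep growing at a fixed rate (business as usual)."""
    years, total, e = 0, 0.0, emissions_gt_per_year
    while total < budget_gt:
        total += e
        e *= (1 + growth_rate)
        years += 1
    return years

print(years_to_exhaust(500, 10, 0.02))  # -> 36 years, roughly the report's ~30-year window
```

Nudge the starting emissions or growth rate up a little and the answer lands right on 30 years – either way, it’s within a working lifetime.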

I guess it is true – those who don’t learn from history are doomed to repeat it. How about we learn from history and stop burning carbon?

 

Oct. 29th – caption for Cenozoic era graph changed for clarity (time scale is different as the graph is from a different paper).

Smoking Kills, so does Climate Change

A translation of the IPCC 5th Assessment Report Summary for Policymakers

WHO: The Intergovernmental Panel on Climate Change

WHAT: Summary for policymakers of their 2,000-page 5th Assessment Report (AR5) on the state of the climate and climate science.

WHEN: 27 September 2013

WHERE: On the IPCC website

TITLE: Climate Change 2013: The Physical Science Basis Summary for Policymakers (open access)

There’s a lot of things not to like about the way the IPCC communicates what they do, but for me the main one is that they speak a very specific dialect of bureaucrat that no-one else understands unless they’ve also worked on UN things and speak the same sort of acronym.

The worst bit of this dialect of bureaucrat is the words they use to describe how confident they are that their findings are correct. They probably believe they’re being really clear, however they use language that none of the rest of us would ever use and it means their findings make little sense without their ‘very likely = 90-100% certain’ footnote at the front.

So now that we’ve established that the UN doesn’t speak an understandable form of English, what does the report actually say? It works its way through each of the different climate systems and how they’re changing because humans are burning fossil fuels.

As you can see from this lovely graph, each of the last three decades has been noticeably warmer than the preceding one, and the IPCC are 66% sure that 1983–2012 was the warmest 30-year period in 1,400 years.

Decade by decade average temperatures (Y axis is change in Celsius from base year of 1950) (from paper)


One of the reasons I really like this graph is you can see how the rate of change is speeding up (one of the key dangers with climate change). From 1850 through to around 1980 each decade’s average is touching the box of the average before it, until after the 80s when the heat shoots up much more rapidly.

The report did have this dig for the deniers though: ‘Due to natural variability, trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends’. Which is UN bureaucrat for ‘when you cherry pick data to fit your denier talking points you’re doing it wrong’.

Looking at regional atmospheric trends, the report notes that while things like the Medieval Warm Period did have multi-decadal periods of change, these changes didn’t happen across the whole globe like the warming currently being measured.

In the oceans, the top layer (the top 75m) has warmed by 0.11°C per decade from 1971 to 2010, and more than 60% of the extra energy we’ve trapped in the climate system since 1971 has been stored in that top layer, with another 30% stored in the ocean below 700m.

This extra heat is not just causing thermal expansion, it’s speeding up sea level rise, which the IPCC are 90% certain increased from an average of 1.7mm per year over 1901-2010 to 3.2mm per year in recent decades. This is now happening faster than at any time in the past two millennia. Yes, sea level is rising faster than it has for the last 2,000 years, so you might want to sell your waterfront property sooner rather than later.
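To make those rates a bit more tangible (my unit conversion, not the IPCC’s): a millimetre a year doesn’t sound like much until you stack up a century of it.

```python
# Sea level rise rates from the IPCC summary
old_rate = 1.7  # mm per year, 20th century average
new_rate = 3.2  # mm per year, recent rate

for label, rate in [("old rate", old_rate), ("new rate", new_rate)]:
    cm_per_century = rate * 100 / 10  # 100 years of mm, converted to cm
    print(f"At the {label}: about {cm_per_century:.0f} cm per century")
```

So we’ve gone from roughly 17cm a century to roughly 32cm a century, and that’s before any further speed-up.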

The extra carbon has also made it harder to live in the ocean if you own a shell, because the acidity of the ocean has increased by 26% which makes shells thinner and harder to grow.

On the glaciers and the ice sheets, the IPCC is 90% certain that the rate of melting from Greenland has increased from 34 Gigatonnes (Gt) of ice per year to 215Gt per year after 2002. Yes, increased from 34Gt to 215Gt – it’s melting six times faster now, thanks to us.
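That ‘six times faster’ isn’t hyperbole – it’s just the ratio of the two melt rates the IPCC reports:

```python
# Greenland Ice Sheet melt rates (Gt of ice per year, from the IPCC summary)
rate_before = 34   # earlier period
rate_after = 215   # after 2002

speedup = rate_after / rate_before
print(f"Melting roughly {speedup:.1f} times faster")  # roughly 6.3 times
```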

For Antarctica, the IPCC is 66% certain that the rate of ice loss has increased from 30Gt per year to 147Gt per year, with most of that loss coming from the Northern Peninsula and West Antarctica. Worryingly, that’s the net figure – it already includes the parts of Antarctica that are gaining ice due to natural variability.

And at the North Pole, Santa is going to have to buy himself and his elves some boats or floating pontoons soon, because the IPCC have found ‘multiple lines of evidence support[ing] very substantial Arctic warming since the mid-20th Century’. Sorry Santa!

As for the carbon we’ve been spewing into the atmosphere since the 1850s, well, we’re winning that race too! ‘The atmospheric concentrations of carbon dioxide, methane and nitrous oxide have increased to levels unprecedented in at least the last 800,000 years’. Congratulations humanity – in the last century and a half, we’ve changed the composition of the atmosphere so rapidly that this hasn’t been seen in 800,000 years!

Methane levels have gone up by 150%, and I’m undecided as to whether that means I should eat more beef to stop the cows from farting, or if it means we raised too many cows to be steaks in the first place…

This is the part of the report where we get to the one excellent thing the IPCC did this time around – our carbon budget. I’m not sure whether they realised that committing to reduce certain percentages by certain years from different baselines meant governments could shuffle the numbers to do nothing and make themselves look good at the same time, but either way, the budget is a promising step.

I’ve written about the very excellent work of Kevin Anderson at the Tyndall Centre in the UK before, but the basic deal with a carbon budget is this: it doesn’t matter when we burn the carbon or how fast, all that matters is the total emissions in the end. You can eat half the chocolate bar now, and half the chocolate bar later, but you’re still eating a whole bar.

Our budget to have a 2/3 chance of not going beyond dangerous climate change is 1,000Gt of carbon and so far we’ve burnt 545Gt, so we’re more than halfway there. All of this leads to the conclusion that ‘human influence on the climate system is clear. This is evident from the increasing greenhouse gas concentrations in the atmosphere, positive radiative forcing, observed warming and understanding of the climate system.’
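Chocolate bar budgeting, in code terms (numbers straight from the report):

```python
# IPCC carbon budget for a ~2/3 chance of staying below 2°C of warming
budget = 1000  # Gt of carbon, total allowed
burned = 545   # Gt already burned

remaining = budget - burned
print(f"{burned / budget:.1%} of the budget gone, {remaining} Gt of carbon left")
```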

What observations, you may ask? Scientists have made progress on working out how climate change pumps up extreme weather and makes it worse. They also got the frequency of extreme warm and cold days right – if you live in North America, the hot extremes are winning 10:1 over the cold extremes. Round of applause for science, everyone!

Warming with natural forcing vs human forcing and how it lines up with the observations (from paper)


They’re also 95% sure that more than half of the observed global surface warming since 1951 has come from humanity. So next time there’s a nasty heatwave that’s more frequent than it should be, blame humans.

The report does also point out that even though the heat records are beating the cold records 10:1, this doesn’t mean that snow disproves climate change (sorry Fox News!). There will still be year-to-year and decade-by-decade variability in how our planet cooks, and it won’t be the same across the whole planet. Which sounds to me like we’re being warmed in an uneven microwave. For instance, El Niño and La Niña will still be big influencers over the Pacific and will determine to a great extent the variability in the Pacific North West (yeah, it’s still going to rain a lot, Vancouver).

For those that were fans of the film The Day After Tomorrow, there’s a 66% chance the Atlantic Meridional Circulation will slow down, but only a 10% chance it will undergo an abrupt change or collapse like it did in the film, so you’re not going to be running away from a flash freezing ocean any time this century.

The report then runs through the different scenarios they’ve decided to model that range from ‘we did a lot to reduce carbon emissions’ to ‘we did nothing to reduce carbon emissions and burned all the fossil fuels’. Because this is the IPCC and they had to get EVERYONE to agree on each line of the report (I’m serious, they approved it line by line, which has to be the most painful process I can think of) the scenarios are conservative in their estimations, not measuring tipping points (which are really hard to incorporate anyway). So their ‘worst case scenario’ is only 4.0°C of surface warming by 2100.

Representative Concentration Pathway (RCP) Scenarios from the IPCC AR5


Now, obviously ‘only’ 4°C of climate change by the end of the century is still pretty unbearable. There will still be a lot of hardship, drought, famine, refugee migration and uninhabitable parts of the planet at 4°C. However, getting to 4°C is likely to have triggered tipping points like methane release from permafrost, so 4°C would be a stepping stone towards 6°C even if we ran out of carbon to burn. And 6°C, of course, as you all hear me say frequently, is mass extinction time. It’s also the point at which, even if humanity did survive, you wouldn’t want to live here anymore.

The paper finishes up with a subtle dig at the insanity of relying on geoengineering, pointing out that trying to put shade reflectors into orbit or artificially suck carbon out of the air has a high chance of going horribly wrong. They also point out that if we did manage large scale geoengineering and it then broke down, re-releasing that carbon back into the atmosphere would super-cook the planet really quickly.

The moral of this 36-page ‘summary’ is that it’s us, guys. We’re as certain that we’ve done this as we are that smoking causes cancer. We have burned this carbon, it’s changed the chemistry and energy balance of the atmosphere, and if we don’t stop burning carbon we’re going to cook the whole planet. Seriously. So let’s finally, actually stop burning carbon.

If We Burn All the Fossil Fuels

“The practical concern for humanity is the high climate sensitivity and the eventual climate response that may be reached if all fossil fuels are burned” – Hansen et al. September 2013

WHO: James Hansen, Makiko Sato, The Earth Institute, Columbia University, New York, NY
Gary Russell, NASA Goddard Institute for Space Studies, New York, NY
Pushker Kharecha, The Earth Institute, Columbia University, NASA Goddard Institute for Space Studies, New York, NY

WHAT: Using deep ocean oxygen isotope ratios to work out how sensitive sea levels and surface temperatures are to climate forcing.

WHEN: September 2013

WHERE: Philosophical Transactions of the Royal Society A (Phil Trans R Soc A) Vol. 371, No. 2001

TITLE: Climate sensitivity, sea level and atmospheric carbon dioxide (open access)

Ok, firstly, let us just take a moment to geek out about how awesome science is. This paper has looked at what our planet was like millions of years ago by studying the amounts of different oxygen and carbon isotopes in the shells of foraminifera that have been buried at the bottom of the ocean since they died millions of years ago. Science can tell us not only how old they are by dating the carbon in their fossilised bodies, but also what the temperature was too. That is awesome.

Foraminifera from Japan (Wikimedia commons)


The lead author of this paper – Dr. James Hansen is pretty much the Godfather of climate science. He’s been doing climate models looking at the possible effects of extra carbon in our atmosphere since he basically had to do them by hand in the 1980s before we had the internet. He knows his stuff. And so far, he’s been right with his projections.

The paper (which is a very long read at 25 pages) focuses on the Cenozoic climate, which is the period of time from 65.5 million years ago to present. The Cenozoic is the period after the Cretaceous (so we’re talking mammals here, not dinosaurs) and includes the Palaeocene-Eocene thermal maximum, where the deep ocean was 12°C warmer than today, as well as the cooling from there that led to the formation of the Greenland and Antarctic ice sheets.

The period of time studied by the paper (bottom axis is million years before present) (from paper)


What does this show us? The warming that eventually led to the Palaeocene-Eocene thermal maximum started around 3,000 years before there was a massive carbon release. The researchers think this carbon release was from methane hydrates in the ocean venting, because there was a lag in the warming in the intermediate ocean after the carbon release.

The thermal maximum had global surface temperatures around 5°C warmer than today, and there was about 4,000–7,000 Gigatonnes (Gt) of carbon released into the atmosphere to force that kind of warming.

After this warming happened there were ‘hyperthermal’ events (where the temperature spiked again) as the planet slowly cooled, showing how long the recovery time for the planet was from this greenhouse warmed state.

In the warmed world of the Palaeocene-Eocene maximum, sea levels were probably 120m higher than they are now. The researchers found that there’s a snowball effect with changes in ocean temperatures, where a -1°C difference in deep ocean temperatures was enough to trigger the last ice age, while sea levels were 5–10m higher when temperatures were ‘barely warmer than the Holocene’ (which is us – we live in the Holocene).

The researchers found that during the Pliocene (about 5 million years ago), sea levels were 15m higher than today, which they point out means that the East and West Antarctic ice sheets are likely to be unstable at temperatures we will reach this century from burning fossil fuels.

From the data they then tried to work out what the sensitivity of the atmosphere is to extra carbon. This is important to know, because we’re currently changing the chemical composition of the atmosphere much faster than ever before. The previous greenhouse warming that the planet experienced occurred over millennial time scales – the current rate that we’re pumping carbon into the atmosphere is causing change over only hundreds of years.

To work out how sensitive the climate is to being forced by carbon, the researchers used a simplified model where the atmosphere was split into 24 layers to test the rapid equilibrium responses to forcing.

They wanted to find out if we could be in danger of runaway climate change – the most extreme version of which happened on the planet Venus, where runaway warming amplified by water vapour led to a new stable average temperature of 450°C, the carbon was baked onto the surface of the planet, and all the water evaporated into the sky. Obviously, humanity will want to avoid that one… Good news: there isn’t enough carbon on this planet for humans to accidentally do that to ourselves before the sun does it to us in a billion years or so.


We’ve avoided this for now (NASA NSSDC Photo Gallery)

The researchers then tested the response to doubling and halving the CO2 in the system, from the 1950 concentration of 310ppm of CO2 in the atmosphere. They found that three halvings gives you a ‘snowball Earth’ response of mass glaciations, while in the other direction, somewhere between 1x and 4x CO2 is when all the snow disappears, which speeds up the feedback (because snow reflects heat), making the fast feedback sensitivity 5°C of global warming. For 8-32x CO2 the sensitivity is approximately 8°C, with water vapour feedbacks (what happened on Venus, on a smaller scale).

But what do any of these numbers mean?

As the paper says; ‘the practical concern for humanity is the high climate sensitivity and the eventual climate response that may be reached if all fossil fuels are burned’.

So here’s the lesson we need to learn from the Palaeocene-Eocene thermal maximum. For global warming we can assume that 75% of it is from CO2, and the remaining 25% is from other greenhouse gases like methane and nitrous oxide. If we burn all the fossil fuels we have left in the ground, that’s about 10,000-15,000Gt of carbon that we could put into the atmosphere.

That gives us 5x the CO2 from 1950, or 1,400ppm. This will give us 16°C of global warming. It will be a world where there’s an average temperature of 20°C on land and 30°C at the poles (the current average is 14°C). Keep in mind also that 6°C of warming is generally enough for a mass extinction like the one that took out the dinosaurs.
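As a rough sanity check (my arithmetic, not a calculation from the paper): chaining the paper’s roughly-8°C-per-doubling high-CO2 sensitivity across the number of doublings between 310ppm and 1,400ppm lands in the same ballpark as that 16°C figure.

```python
import math

co2_baseline = 310     # ppm in 1950, the paper's reference point
co2_all_burned = 1400  # ppm if we burn all the fossil fuels

doublings = math.log2(co2_all_burned / co2_baseline)
# ~8°C per doubling is the paper's sensitivity estimate at very high CO2
warming = doublings * 8
print(f"{doublings:.1f} doublings of CO2 -> roughly {warming:.0f}°C of warming")
```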

This will eliminate grain production across most of the globe and seriously increase the amount of water vapour in the air, which means it’s getting more humid (also the water vapour will destroy most of the ozone layer too).

A wet bulb temperature is the temperature with the humidity included. Humans generally live with wet bulb temperatures between 26-27°C, up to 31°C in the tropics. A wet bulb temperature of 35°C or above means the body can’t cool down, and results in ‘lethal hyperthermia’, which is scientist for ‘it’s so hot and sticky that you die from the heat’.
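For the curious, you can estimate a wet bulb temperature from air temperature and relative humidity with Stull’s (2011) empirical approximation – this is my illustration of the concept, not something from Hansen’s paper, and it’s only valid for ordinary near-surface conditions:

```python
import math

def wet_bulb(temp_c, rh_percent):
    """Approximate wet bulb temperature (°C) via Stull (2011),
    given air temperature in °C and relative humidity in %."""
    return (temp_c * math.atan(0.151977 * math.sqrt(rh_percent + 8.313659))
            + math.atan(temp_c + rh_percent)
            - math.atan(rh_percent - 1.676331)
            + 0.00391838 * rh_percent ** 1.5 * math.atan(0.023101 * rh_percent)
            - 4.686035)

# A 35°C day at 80% humidity is already past the ~31°C wet bulb
# upper end that humans currently live with in the tropics
print(round(wet_bulb(35, 80), 1))
```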

Burning all the fossil fuels will result in a planet with wet bulb temperatures routinely above 35°C, which means we’ll have cooked the atmosphere enough that we’ll end up cooking ourselves.

If the climate has a low sensitivity to this kind of forcing, it will take 4.8x CO2 concentrations to cause an unlivable climate. If the climate is more sensitive, it will take less than that to cook ourselves.

Oh, and the other kicker? The Palaeocene-Eocene thermal maximum took millions of years to take place, so the mammals survived by evolving to be smaller. Our climate change is only taking hundreds of years, which is not enough time for any plants or animals to evolve and adapt.

Basically, if we burn all the fossil fuels, we’re all going down and taking the rest of the species on the planet with us, and we really will be the dumbest smart species ever to cause our own extinction.

So far, James Hansen has been correct with his climate projections. So when he says we can’t burn all the fossil fuels because if we do we’ll cook the planet, I say we pay close attention to what he says. Oh, and we should stop burning carbon.

Plan B: Saving Political Face Beyond 2 Degrees

So far the ‘targets and timetables’ approach to keeping climate change below 2°C has done very little to reduce emissions. What happens when we start thinking about giving up the 2°C target?

WHO:  Oliver Geden, German Institute for International and Security Affairs (Stiftung Wissenschaft und Politik)

WHAT: Looking at the ‘politically possible’ in light of our failures to get anywhere near the emissions reductions needed to keep global warming below 2°C.

WHEN: June 2013

WHERE: Online at the German Institute for International and Security Affairs

TITLE:  Modifying the 2°C Target: Climate Policy Objectives in the Contested Terrain of Scientific Policy Advice, Political Preferences, and Rising Emissions (open access)

This paper is all about the realpolitik. At the outset, it points out that in the 20 years since the UN Framework Convention on Climate Change (UNFCCC) was adopted, progress has been ‘modest at best’. Also, in order to keep global emissions from soaring quickly beyond the 2°C limit, significant reductions would be needed in the decade from 2010 to 2020, which is ‘patently unrealistic’.

Ok, so we’ve procrastinated away the most important decades that we had to do something about climate change with minimal impacts on both the economy and the wider environment. What now?

This paper suggests that the best bet might be changing or ‘modifying’ the internationally agreed 2°C target. The author points out (quite rightly) that unrealistic targets signal that you can disregard them with few consequences. For instance, I’m not about to say that I’m going to compete in the next Olympic Marathon, because given I’ve never run a full marathon before, the second I missed a single training session it would obviously be time to give up.

So if the world is going to fail on our 2°C training schedule, what will we aim for instead? Should we just redefine ‘safe’ and ‘catastrophic’ climate change? Should we aim for 2.5°C? Should we aim for short term overshoot in the hopes that future humans will pick up the slack when we’ve kicked the can down the road for them?

The author points out what many people don’t like to notice when their countries are failing on their carbon reduction diets – not only have we already warmed by 0.8°C, but we’ve already baked in another 0.5°C from current emissions, so we’re already really close to 2°C without even starting to give up our fossil fuel habits. And those reductions we’ve all been promising and failing to make (or withdrawing from completely, in Canada’s case)? Yeah, even if we met all those targets, we’d still blow ‘significantly’ past 2°C. Ouch.
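The arithmetic here is brutally simple (numbers from the paper, the sum is mine):

```python
# Warming already locked in vs the 2°C limit
warmed_already = 0.8  # °C observed so far
baked_in = 0.5        # °C committed from emissions already made

committed = warmed_already + baked_in
headroom = 2.0 - committed
print(f"{committed:.1f}°C spoken for, only {headroom:.1f}°C of headroom left")
```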

The emissions gap (from paper)


Another issue – the current top-down UNFCCC approach assumes that once we reach an agreement, effective governance structures can be set up and operating within a matter of years, which is highly unlikely given we can’t even reach an agreement yet.

So what does a ‘more pragmatic stance’ for the EU on climate policy look like if we’re going to collectively blow past 2°C? Will climate policy have any legitimacy?

The author argues that the coming palpable impacts of climate change will soon remove the political possibility of ignoring climate change as an issue while in office (which I for one am looking forward to). He also doesn’t place much faith in the UN process finding a global solution with enough time – if an agreement is reached in 2015, it’s unlikely to be ratified by 2020, at which point the targets from 2015 are obsolete.

One suggestion for the EU is reviewing the numbers for the likelihood of passing 2°C. Currently, humanity is vaguely aiming to have a 50/50 chance of staying below 2°C. If we could roll the dice with slightly higher odds of blowing 2°C, maybe we could buy some time to get our political butts in gear?

That idea puts all the hard work of mitigation on everyone post-2050, at which point we’ll all be dealing with the climate impacts as well as trying to find the time for mitigation.

The other option is to say that 2°C is a ‘benchmark’ (only slightly better than an ‘aspirational target’?) and put our faith in climate inertia, allowing humanity to overshoot on emissions and then ramp up sequestration (negative emissions) to pull back from the brink of the next mass extinction.

The paper does acknowledge that this would implicitly approve a temperature overshoot as well as an emissions overshoot, which could kick the can down the road to 2300 before global temperatures get back below 2°C above what we used to call ‘normal’. Apologies to everyone’s great great great great grandchildren for making you responsible for all of that.

Kicking the can down the road to 2300 (from paper)


The author also acknowledges that overshoot policies will only be accepted by the wider public if they’re convinced that this time governments will actually respect them as limits not to be passed. Previous experience with UNFCCC processes shows that any extra time that can be wrangled through carbon accounting is likely to be procrastinated away as well.

The other option could be a target of 2.5°C or 550ppm of CO2 in the atmosphere, but as the paper points out, the ‘targets and timetables’ policies haven’t worked yet, and it might be time to look more towards feasible ‘policies and measures’.

The problem for me with this paper is that while it’s practical to look at aiming for what humanity can politically achieve in terms of climate policies, redefining what ‘dangerous climate change’ is to fit with realpolitik rather than physics won’t work. Physics doesn’t negotiate – the first law of thermodynamics doesn’t care that there was an economic downturn in 2008 that has made it harder to pass climate legislation.

So yes, we need to think about what is politically possible in the current ‘we can still procrastinate on this’ climate. But we also need to be planning for the tipping point once all the extreme weather adds up to business as usual no longer being feasible. We may be able to ‘postpone the impending failure of the 2°C target’, but we won’t be able to ignore the impacts of climate change.

Greenland Whodunit

“The next 5–10 years will reveal whether or not [the Greenland Ice Sheet melting of] 2012 was a one off/rare event resulting from the natural variability of the North Atlantic Oscillation or part of an emerging pattern of new extreme high melt years.”

WHO: Edward Hanna, Department of Geography, University of Sheffield, Sheffield, UK
 Xavier Fettweis, Laboratory of Climatology, Department of Geography, University of Liège, Belgium
Sebastian H. Mernild, Climate, Ocean and Sea Ice Modelling Group, Computational Physics and Methods, Los Alamos National Laboratory, USA, Glaciology and Climate Change Laboratory, Center for Scientific Studies/Centre de Estudios Cientificos (CECs), Valdivia, Chile
John Cappelen, Danish Meteorological Institute, Data and Climate, Copenhagen, Denmark
Mads H. Ribergaard, Centre for Ocean and Ice, Danish Meteorological Institute, Copenhagen, Denmark
Christopher A. Shuman, Joint Center for Earth Systems Technology, University of Maryland, Baltimore, USA,  Cryospheric Sciences Laboratory, NASA Goddard Space Flight Center, Greenbelt, USA
Konrad Steffen, Swiss Federal Research Institute WSL, Birmensdorf, Switzerland, Institute for Atmosphere and Climate, Swiss Federal Institute of Technology, Zürich, Switzerland, Architecture, Civil and Environmental Engineering, École Polytechnique Fédéral de Lausanne, Switzerland
 Len Wood, School of Marine Science and Engineering, University of Plymouth, Plymouth, UK
Thomas L. Mote, Department of Geography, University of Georgia, Athens, USA

WHAT: Trying to work out the cause of the unprecedented melting of the Greenland Ice Sheet in July 2012

WHEN: 14 June 2013

WHERE: International Journal of Climatology (Int. J. Climatol.) Vol. 33 Iss. 8, June 2013

TITLE: Atmospheric and oceanic climate forcing of the exceptional Greenland ice sheet surface melt in summer 2012 (subs req.)

Science can sometimes be like being a detective (although I would argue it’s cooler) – you’ve got to look at a problem and try and work out how it happened. These researchers set out to do exactly that, trying to work out how the hell it was that 98.6% of the surface of the Greenland ice sheet started melting last summer.

Greenland – July 8th on the left only half melting. July 12th on the right, almost all melting (Image: Nicolo E. DiGirolamo, SSAI/NASA GSFC, and Jesse Allen, NASA Earth Observatory)


For a bit of context, Greenland is the kind of place where the average July temperature in the middle of summer is 2°C and the average summer temperature at the summit of the ice sheet is -13.5°C. Brrr – practically beach weather! So there’s got to be something weird going on for the ice sheet to start melting like that. Who are the suspects?

Atmospheric air conditions
Suspect number one is the atmospheric air conditions. The summer of 2012 was influenced strongly by ‘dominant anti-cyclonic conditions’ which is where warm southerly air moves north and results in warmer and drier conditions. There was also a highly negative North Atlantic Oscillation (NAO) which created high temperatures at high altitudes around 4km above sea level, which could explain the melting on the summit. The researchers also pointed out that the drier conditions meant less cloud cover and more sunny days, contributing to speedier melting.

There were issues with the polar jet stream that summer, where it got ‘blocked’ and stuck over Greenland for a while. The researchers used the Greenland Blocking Index (GBI), which, while not trading on the NYSE, does tell you about anomalies in geopotential height (roughly, how high a given pressure level sits in the atmosphere) over Greenland (yeah, lots of meteorological words in this paper!). All of this makes the atmosphere look pretty guilty.

Jet stream getting funky – temperature anomaly patterns at 600 hectopascals pressure, aka 4000m above sea level with a big red blob over Greenland (from paper)


Sea surface temperatures
Suspect number two is sea surface temperatures. If it was warmer in the ocean, that could have created conditions where the ice sheet melted faster, right? The researchers ran a simulation of the conditions around Greenland for the summer of 2012 and then played around with different sea surface temperatures, as well as levels of salinity. It made less than 1% difference, so they don’t think it was the sea surface. Also, ocean temperatures change more slowly than air temperatures (that’s why the ocean is still so cold even in the middle of summer!), and when they looked at the data for sea surface temperature, it was actually a bit cooler in 2012 than in 2011. Not guilty, sea surface temperatures.

Sea surface temperatures (top) and salinity (bottom) both decreasing (from paper)


Cloud patterns
Suspect number three is cloud cover patterns, which the researchers said could be a contributor to the ice sheet melting. However, they don’t have a detailed enough data set to work out how much clouds could have contributed to the melt. Not guilty for now clouds – due to lack of evidence.

Which leaves suspect number one – atmospheric air conditions. Guilty! Or, as the paper says ‘our present results strongly suggest that the main forcing of the extreme Greenland Ice Sheet surface melt in July 2012 was atmospheric, linked with changes in the summer NAO, GBI and polar jet stream’.

Now comes the scary part – the atmosphere is the thing we’ve been running an accidental experiment on for the last 200 years by burning fossil fuels. As the researchers point out, the North Atlantic Oscillation has its own natural variability and patterns, so we could all cross our fingers and hope that the Greenland melting was a one-off anomaly. Given the work Dr Jennifer Francis has been doing at Rutgers on polar ice melt and how it slows the jet stream and causes it to meander, this may not be a good bet. Combine that with the fact that this level of melting is well beyond ‘the most pessimistic future projections’ and it gets scarier still. This kind of melting was not supposed to occur until 2100, or 2050 in the worst case scenarios.

Interestingly, this could also link through to some of the work Jason Box is doing with his DarkSnow project in Greenland, looking at how soot from fires and industry is affecting the melting of Greenland. The paper concludes that the next 5-10 years will show us whether 2012 was a one-off or the beginning of a new normal. Unless we stop burning carbon, it will only be the start of a terrifying new normal.

Playing the Emissions on Shuffle

What do the emission reductions of industrialised nations look like when you count the imports manufactured overseas?

WHO: Glen P. Peters, Center for International Climate and Environmental Research, Oslo, Norway
Jan C. Minx, Department for Sustainable Engineering, and Department for the Economics of Climate Change, Technical University Berlin, Germany
Christopher L. Weber, Science and Technology Policy Institute, Washington, Civil and Environmental Engineering, Carnegie Mellon University, Pittsburgh, USA
Ottmar Edenhofer, Department for the Economics of Climate Change, Technical University Berlin, Potsdam Institute for Climate Impact Research, Potsdam, Germany

WHAT: Measuring the transfer of CO2 emissions through international trade

WHEN: 24 May 2011

WHERE: Proceedings of the National Academy of Sciences (PNAS) vol. 108 no. 21, May 2011

TITLE: Growth in emission transfers via international trade from 1990 to 2008 (open access)

These researchers have found a problem with the way we count carbon emissions. When we count them, countries tend to count them for industries that emit within their own territorial borders, which means that emissions in the developing world have kept going up, while emissions in the developed world (or first world) have either flattened or dropped, depending on how much your government likes to admit the reality of climate change.

However, most of those developing-world emissions are to produce goods for places like North America and Europe. So these researchers wanted to work out exactly how much international trade contributed to global emissions increasing by 39% from 1990 to 2008. Was the increase due to development in countries like China, or was it a case of wealthy countries just shuffling their manufacturing emissions to another country while continuing to increase consumption?

As you might guess (spoiler alert) it’s the latter. Turns out all we’ve been doing is moving most of our industrial manufacturing emissions to developing countries and importing the products back, allowing everyone to say ‘yay, we reduced emissions!’ while the actual amount of carbon being burned continues to increase.

Growth in emissions transferred around the globe – dumping our responsibility on other countries (from paper)


But don’t take my word for it – what does the paper say?

The researchers took the global economy and broke it down into 113 regions with 95 individual countries and 57 economic sectors. They then looked at all the national and international data they could get on supply chain emissions to produce goods between 1990-2008, as well as doing extra detailed analysis for the years 1997, 2001 and 2004. They called it a Time Series with Trade and it was based on GDP, bilateral trade and emissions statistics (all of which you can generally find at your national statistics office online). The only thing they left out of their analysis was emissions from land use change, because there wasn’t enough data for them to thoroughly analyse it.

They found that global CO2 emissions from exported goods rose from 4.3 Gigatonnes (Gt) in 1990 to 7.8 Gt of CO2 in 2008, with a big increase in the decade up to 2008. Exports have increased their share of global emissions from 20% to 26% and grew on average by 4.3% per year, which was faster than the global population grew (1.4%), faster than total global CO2 emissions grew (2%) and faster than global GDP grew (3.6%).

The only thing that export emissions didn’t grow faster than was the dollar value of all that trade, which increased by 12% each year. So not only are all those new iPhones costing you a lot of money (and making Apple super wealthy), they’re also burning a lot of carbon.

But the thing the paper points out is that international trade has simply shifted the location of the emissions rather than reducing them, shuffling them around the planet to avoid counting them. The researchers estimate that the transfer of emissions from wealthy countries to developing countries has grown by about 17% per year, increasing from 0.4 Gt of CO2 in 1990 to 1.6 Gt in 2008.

This is an issue because it means that the countries that signed on to Kyoto to reduce their carbon emissions (commitments that together amounted to around 0.7 Gt of CO2 in reductions per year) have simply shifted those emissions through trade to make them someone else's problem, while continuing to consume stuff at an ever-increasing rate.

More and more stuff (epSos, flickr)

The researchers point out that while China is currently the world's largest emitter of CO2, with the USA at number two, counting consumption emissions instead (making the USA count the emissions for all the stuff it uses that's made in China) would swap the rankings and make the USA the world's largest emitter.

This makes sense if you think it through – have a look around your house at everything that’s made in China. All of that carbon that China is burning, which is destroying their air quality and polluting their cities and people; all of that is to make stuff for you to consume.

If you count consumption emissions, the 3% emissions reduction reported by the developed world becomes emissions growth of 11%. Oops. The researchers also point out that emissions reductions in wealthy countries are often exceeded by the growth of their trade emissions.
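The accounting swap the researchers describe can be sketched in a couple of lines (the function and the numbers below are hypothetical, purely to illustrate the idea): a country's consumption emissions are what it burns at home, minus the emissions embodied in its exports, plus the emissions embodied in its imports.

```python
def consumption_emissions(production, embodied_in_exports, embodied_in_imports):
    """Consumption-based accounting: hand off what you burn for others,
    take on what others burn for you."""
    return production - embodied_in_exports + embodied_in_imports

# Hypothetical wealthy importer: 5.0 Gt burned at home, 0.5 Gt of that
# embodied in exports, 1.5 Gt embodied in imported goods.
print(consumption_emissions(5.0, 0.5, 1.5))  # 6.0 -- higher than its production total
```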

Emission reductions vs emissions transferred to developing world. Annex B: developed world, non-Annex B: developing world (from paper)

So what does this mean, other than the fact that everyone is trying to avoid having to clean up their own mess?

It means there’s a problem with the way we count production-based vs consumption-based emissions. It also means that we’re currently failing to reduce our carbon emissions in any significant way, which puts us on a straight path to 4, 5 or 6°C of global warming, otherwise known as the next mass extinction.

Pandora’s Permafrost Freezer

What we know about permafrost melt is less than what we don’t know about it. So how do we determine the permafrost contribution to climate change?

WHO: E. A. G. Schuur, S. M. Natali, C. Schädel, University of Florida, Gainesville, FL, USA
B. W. Abbott, F. S. Chapin III, G. Grosse, J. B. Jones, C. L. Ping, V. E. Romanovsky, K. M. Walter Anthony University of Alaska Fairbanks, Fairbanks, AK, USA
W. B. Bowden, University of Vermont, Burlington, VT, USA
V. Brovkin, T. Kleinen, Max Planck Institute for Meteorology, Hamburg, Germany
P. Camill, Bowdoin College, Brunswick, ME, USA
J. G. Canadell, Global Carbon Project CSIRO Marine and Atmospheric Research, Canberra, Australia
J. P. Chanton, Florida State University, Tallahassee, FL, USA
T. R. Christensen, Lund University, Lund, Sweden
P. Ciais, LSCE, CEA-CNRS-UVSQ, Gif-sur-Yvette, France
B. T. Crosby, Idaho State University, Pocatello, ID, USA
C. I. Czimczik, University of California, Irvine, CA, USA
J. Harden, US Geological Survey, Menlo Park, CA, USA
D. J. Hayes, M. P.Waldrop, Oak Ridge National Laboratory, Oak Ridge, TN, USA
G. Hugelius, P. Kuhry, A. B. K. Sannel, Stockholm University, Stockholm, Sweden
J. D. Jastrow, Argonne National Laboratory, Argonne, IL, USA
C. D. Koven, W. J. Riley, Z. M. Subin, Lawrence Berkeley National Lab, Berkeley, CA, USA
G. Krinner, CNRS/UJF-Grenoble 1, LGGE, Grenoble, France
D. M. Lawrence, National Center for Atmospheric Research, Boulder, CO, USA
A. D. McGuire, U.S. Geological Survey, Alaska Cooperative Fish and Wildlife Research Unit, University of Alaska, Fairbanks, AK, USA
J. A. O’Donnell, Arctic Network, National Park Service, Fairbanks, AK, USA
A. Rinke, Alfred Wegener Institute, Potsdam, Germany
K. Schaefer, National Snow and Ice Data Center, Cooperative Institute for Research in Environmental Sciences, University of Colorado, Boulder, CO, USA
J. Sky, University of Oxford, Oxford, UK
C. Tarnocai, AgriFoods, Ottawa, ON, Canada
M. R. Turetsky, University of Guelph, Guelph, ON, Canada
K. P. Wickland, U.S. Geological Survey, Boulder, CO, USA
C. J. Wilson, Los Alamos National Laboratory, Los Alamos, NM, USA
 S. A. Zimov, North-East Scientific Station, Cherskii, Siberia

WHAT: Interviewing and averaging the best estimates by world experts on how much permafrost in the Arctic is likely to melt and how much that will contribute to climate change.

WHEN: 26 March 2013

WHERE: Climatic Change, Vol. 117, Issue 1-2, March 2013

TITLE: Expert assessment of vulnerability of permafrost carbon to climate change (open access!)

We are all told that you should never judge a book by its cover, however I’ll freely admit that I chose to read this paper because the headline in Nature Climate Change was ‘Pandora’s Freezer’ and I just love a clever play on words.

So what’s the deal with permafrost and climate change? Permafrost is the permanently frozen dirt/mud/sludge in the Arctic that often looks like cliffs of chocolate mousse when it’s melting. The melting is the problem: when permafrost thaws, the ancient organic carbon locked inside it starts to decompose and gets released into the atmosphere.

Releasing ancient carbon into the atmosphere is what humans have been doing at an ever greater rate since we worked out that fossilised carbon makes a really efficient energy source, so when the Arctic starts doing that as well, it’s adding to the limited remaining carbon budget our atmosphere has left. Which means melting permafrost has consequences for how much time humanity has left to wean ourselves off our destructive fossil fuel addiction.

Cliffs of chocolate mousse (photo: Mike Beauregard, flickr)

How much time do we have? How much carbon is in those cliffs of chocolate mousse? We’re not sure, and that’s a big problem. Recent estimates suggest there could be as much as 1,700 billion tonnes of carbon stored in Arctic permafrost, which is much higher than earlier estimates from research in the 1990s.

To give that very large number some context, 1,700 billion tonnes can also be called 1,700 Gigatonnes (Gt), which should ring a bell for anyone who read Bill McKibben’s Rolling Stone global warming math article. The article stated that the best current estimate of the carbon budget that gives humanity a shot at keeping global average temperatures below a 2°C increase is 565 Gt of CO2. So if all the permafrost carbon made it into the atmosphere, we’d blow that budget roughly three times over; and that’s comparing tonnes of carbon against tonnes of CO2, which actually understates the overshoot.
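One unit wrinkle worth flagging: the 1,700 Gt permafrost figure is tonnes of carbon, while McKibben's 565 Gt budget is tonnes of CO2. Converting carbon to CO2 multiplies by the molecular-weight ratio 44/12, which makes the gap even more lopsided (a back-of-the-envelope check, ignoring that only a fraction of the permafrost carbon would actually be released):

```python
def carbon_to_co2(gt_carbon):
    """Convert gigatonnes of carbon to gigatonnes of CO2 (44 g/mol vs 12 g/mol)."""
    return gt_carbon * 44.0 / 12.0

permafrost_co2 = carbon_to_co2(1700)
print(round(permafrost_co2))           # ~6233 Gt CO2
print(round(permafrost_co2 / 565, 1))  # ~11x the 565 Gt budget
```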

What this paper did was ask the above long list of experts on soil, soil carbon, permafrost and Arctic research three questions over three different time scales:

  1. How much permafrost is likely to degrade (aka quantitative estimates of surface permafrost degradation)?
  2. How much carbon will it likely release?
  3. How much methane will it likely release?

They included the methane question because methane has short-term ramifications for the atmosphere. Methane only persists in the atmosphere for around a decade (compared to carbon dioxide, a portion of which persists for a thousand-plus years), but it has 33 times the global warming potential (GWP) of CO2 over a 100-year period. Averaged over the century after you’ve released it, one tonne of methane warms as much as 33 tonnes of CO2. This could quickly blow our carbon budgets as we head merrily past 400 parts per million of CO2 in the atmosphere from human forcing.
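That GWP figure turns into a one-line conversion (using the paper's value of 33; other sources quote slightly different GWPs for methane):

```python
GWP_CH4_100YR = 33  # global warming potential of methane over a 100-year window

def ch4_to_co2e(gt_ch4):
    """Express a mass of methane as the equivalent mass of CO2 (100-year basis)."""
    return gt_ch4 * GWP_CH4_100YR

print(ch4_to_co2e(1.0))  # 33.0 -- one tonne of methane counts as 33 tonnes of CO2e
```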

The time periods for each question were: by 2040, with 1.5-2.5°C of Arctic temperature rise (the Arctic warms faster than lower latitudes); by 2100, with 2.0-7.5°C of rise (so from ‘we can possibly deal with this’ to ‘catastrophic climate change’); and by 2300, with temperatures stable after 2100.

The estimates the experts gave were then screened for level of expertise (you don’t want to be asking an atmospheric specialist the soil questions!) and averaged to give an estimate range. For surface loss of permafrost under the highest warming scenario, the results were:

  1. 9-16% loss by 2040
  2. 48-63% loss by 2100
  3. 67-80% loss by 2300

Permafrost melting estimates for each time period over four different emissions scenarios (from paper)

Ouch. If we don’t start doing something serious about reducing our carbon emissions soon, we could be blowing that carbon budget really quickly.

For how much carbon the highest warming scenario may release, the results were:

  1. 19-45 billion tonnes (Gt) of CO2 by 2040
  2. 162-288 Gt CO2 by 2100
  3. 381-616 Gt CO2 by 2300

Hmm. So if we don’t stop burning carbon by 2040, melting permafrost will have taken up to 45 Gt of CO2 out of our atmospheric carbon budget of 565 Gt. Let’s hope we haven’t burned through the rest by then too.

However, if Arctic temperature rises were limited to 2°C by 2100, the CO2 emissions would ‘only’ be:

  1. 6-17 Gt CO2 by 2040
  2. 41-80 Gt CO2 by 2100
  3. 119-200 Gt CO2 by 2300

That’s about a third of the highest warming estimates, but still nothing to breathe a sigh of relief at, given that the 2000-2010 average annual rate of fossil fuel burning was 7.9 Gt per year. So even the low estimate has permafrost releasing more than two years’ worth of global emissions, meaning we’d have to stop burning carbon two years earlier.

When the researchers calculated the expected methane emissions, the estimates were low. However, when they calculated the CO2 equivalent (CO2e) for the methane (methane being 33 times more potent than CO2 over 100 years), they got:

  1. 29-60 Gt CO2e by 2040
  2. 250-463 Gt CO2e by 2100
  3. 572-1004 Gt CO2e by 2300

Thankfully, most of the carbon in the permafrost is expected to be released as the less potent carbon dioxide, but working out the balance between how much methane may be released into the atmosphere vs how much will be carbon dioxide is really crucial for working out global carbon budgets.

The other problem is that most climate models that look at permafrost contributions to climate change do it in a linear manner, where increased temperatures lead directly to an increase in microbes and bacteria and the carbon is released. In reality, permafrost is much more dynamic and non-linear, and therefore more unpredictable, which makes it a pain to put into models. It’s really difficult to predict abrupt thaw processes (like the melt seen across 98% of Greenland’s surface last summer) where ice wedges can melt and the ground can collapse irreversibly.

These kinds of non-linear processes (the really terrifying bit about climate change) made the news this week when it was reported that the Alaskan town of Newtok is likely to wash away by 2017, making the townspeople the first climate refugees from the USA.

The paper points out that one of the key limitations to knowing exactly what the permafrost is going to do is the lack of historical permafrost data. Permafrost is in really remote, hard-to-reach places where people don’t live, because the ground is permanently frozen. People haven’t been going to these places and taking samples, unlike more populated areas that have lengthy and detailed climate records. But if you don’t know how much permafrost was historically there, you can’t tell how fast it’s melting.

The key point from this paper is that even though we’re not sure exactly how much permafrost will contribute to global carbon budgets and temperature rise, this uncertainty alone should not be enough to stall action on climate change.

Yes, there is uncertainty in exactly how badly climate change will affect the biosphere and everything that lives within it, but currently our options range from ‘uncomfortable and we may be able to adapt’ to ‘the next mass extinction’.

So while we’re working out exactly how far we’ve opened the Pandora’s Freezer of permafrost, let’s also stop burning carbon. 

What’s in a Standard Deviation?

“By 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario” Marcott et al.

WHO: Shaun A. Marcott, Peter U. Clark, Alan C. Mix, College of Earth, Ocean, and Atmospheric Sciences, Oregon State University, Corvallis, Oregon, USA
 Jeremy D. Shakun, Department of Earth and Planetary Sciences, Harvard University, Cambridge, Massachusetts, USA.

WHAT: A historical reconstruction of average temperature for the past 11,300 years

WHEN: March 2013

WHERE: Science, Vol. 339 no. 6124, 8 March 2013 pp. 1198-1201

TITLE: A Reconstruction of Regional and Global Temperature for the Past 11,300 Years (subs req.)

We all remember the standard deviation bell curve from high school statistics, where in a population (like your class at school) there will be a distribution of some characteristic, with most people falling around the mean. The usual example you start off with is the height of everyone in your classroom: most people will be around the same height, some will be taller, some shorter.

The further you vary from the mean, the less likely that outcome is, because around 68% of the population fits within the first standard deviation either side of the mean. The important bit to keep in mind when reading about this paper is that three standard deviations on either side of the mean cover 99.7% of all the data. The odds of a data point falling outside three standard deviations are about 0.3% in total, or roughly 0.15% on either side.
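Those coverage numbers aren’t magic; they fall straight out of the normal distribution, and you can check them with nothing but Python’s standard library:

```python
import math

def within_k_sd(k):
    """Fraction of a normal population within k standard deviations of the mean."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(f"within {k} SD: {within_k_sd(k):.2%}")
# within 1 SD: 68.27%, within 2 SD: 95.45%, within 3 SD: 99.73%
```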

The standard deviation bell curve (Wikimedia commons)

What does high school statistics have to do with global temperature reconstructions? Well, it’s always good to see what’s happening in the world within context. Unless we can see comparisons to what has come before, it can be really hard to see what is and isn’t weird when we’re living in the middle of it.

The famous ‘Hockey Stick’ graph that was constructed for the past 1,500 years by eminent climate scientist Michael Mann showed us how weird the current warming trend is compared to recent geologic history. But how does that compare to all of the Holocene period?

Well, we live in unusual times. But first, the details. These researchers used 73 globally distributed temperature records based on a variety of proxies. A proxy is something whose chemical composition has been recording temperature for far longer than our thermometers have existed, such as ice cores, slow-growing trees and marine species like coral; analysing it lets you work out what the temperature would have been. According to the NOAA website, fossil pollen can also be used, which I think is awesome (because it’s kind of like Jurassic Park!).

They used more marine proxies than most other reconstructions (80% of their proxies were marine) because they’re better suited to longer reconstructions. The proxy resolutions ranged from 20 years to 500 years, with a median of 120 years.

They then ran the data through a Monte Carlo randomisation scheme (which is less exotic than it sounds) to try to find any errors. Specifically, they ran a ‘white noise’ data set with a mean of zero to double-check for errors. Then the chemical data was converted into temperature data before it all got stacked together into a weighted mean with a confidence interval. It’s like building a layer cake, but with math!
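A toy version of the stacking idea (this is my sketch of the general Monte Carlo approach, not the authors’ actual pipeline): perturb each proxy value within an assumed uncertainty, average the perturbed records into a ‘stack’, repeat many times, and use the spread of the stacked means as a confidence interval.

```python
import random
import statistics

def stack_with_uncertainty(records, sigma, n_runs=1000, seed=1):
    """records: proxy temperature anomalies (degrees C) for one time slice.
    Perturb each with Gaussian noise of width sigma, average into a stack,
    repeat n_runs times, and return the mean and spread of the stacks."""
    rng = random.Random(seed)
    stacks = []
    for _ in range(n_runs):
        perturbed = [t + rng.gauss(0, sigma) for t in records]
        stacks.append(statistics.fmean(perturbed))
    return statistics.fmean(stacks), statistics.stdev(stacks)

# Five hypothetical proxy anomalies for a single time slice
mean, spread = stack_with_uncertainty([0.2, -0.1, 0.0, 0.3, -0.2], sigma=0.5)
print(f"stacked mean: {mean:.2f} +/- {spread:.2f}")
```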

Interestingly, with their white noise data, they found the model was more accurate over longer time periods. Variability was preserved best at scales of 2,000 years or more, only half was left at the 1,000-year scale, and the variability was gone at scales shorter than 300 years.

They also found that their reconstruction lined up over the final 1,500 years to present with the Mann et al. 2008 reconstruction and was also consistent with Milankovitch cycles (which ironically indicate that without human interference, we’d be heading into the next glacial period right now).

Temperature reconstructions: Marcott et al. in purple (with blue confidence interval), Mann et al. in grey (from paper)

They found that the global mean temperature for 2000-2009 has not yet exceeded the warmest temperatures in the Holocene, which occurred 5,000-10,000 years ago (also written as ‘years BP’, before present). However, we are currently warmer than 82% of the Holocene temperature distribution.

But the disturbing thing in this graph, the thing that made me feel really horrified (and I don’t get horrified by climate change much anymore, because I read so much about it that I’m somewhat desensitised to ‘end of the world’ scenarios), is the rate of change. The paper found that global temperatures have gone from the coldest of the Holocene (the bottom of the purple line, just before it spikes upward) to the warmest within the past century.

We are causing changes to happen so quickly in the earth’s atmosphere that something that would have taken over 11,000 years has just happened in the last 100. We’ve taken a 5,000 year trend of cooling and spiked up to super-heated in record time.

This would be bad enough on its own, but it’s not even the most horrifying thing in this paper. It’s this:

‘by 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario.’ (my emphasis)

Remember how I said to keep in mind that 99.7% of all data points in a population fall within three standard deviations on a bell curve? That’s because we are currently heading off the edge of the chart for weird and unprecedented climate, beyond even the roughly 0.15% chance of occurring without human carbon pollution.

The A1B scenario from the IPCC is the ‘medium worst case’ scenario, and we are currently outstripping it with continuously growing carbon emissions that actually need to be shrinking. We are so far out into the tail of weird occurrences that we’re off the charts of a bell curve.
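To see how far off the chart 5 to 12 standard deviations really is, treat the Holocene distribution as roughly normal (purely for illustration) and compute the one-sided tail probability:

```python
import math

def upper_tail(k):
    """Probability of landing more than k standard deviations above the mean."""
    return 0.5 * math.erfc(k / math.sqrt(2))

for k in (3, 5, 12):
    print(f"beyond {k} SD: {upper_tail(k):.1e}")
# beyond 3 SD: ~1.3e-03, beyond 5 SD: ~2.9e-07, beyond 12 SD: ~1.8e-33
```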

NOAA Mauna Loa CO2 data (396.80ppm at Feb. 2013)

As we continue to fail to reduce our carbon emissions in any meaningful way, we will reach 400 ppm (parts per million) of carbon dioxide in the atmosphere in the next few years. At that point we will truly be in uncharted territory for any time in human history, on a trajectory changing so rapidly that it sits beyond 99.7% of the data for the last 11,300 years. The question for humanity is: are we willing to play roulette with our ability to adapt to this kind of rapidly changing climate?