Send in the Clouds

The influence of clouds is one of the big uncertainties for climate sensitivity.

WHO: Steven C. Sherwood, Climate Change Research Centre and ARC Centre of Excellence for Climate System Science, University of New South Wales, Sydney, Australia
Sandrine Bony, Jean-Louis Dufresne, Laboratoire de Météorologie Dynamique and Institut Pierre Simon Laplace (LMD/IPSL), CNRS, Université Pierre et Marie Curie, Paris, France

WHAT: Measuring evaporation, humidity and temperature change in the lower troposphere

WHEN: 2 January 2014

WHERE: Nature, Vol 505, No. 7481

TITLE: Spread in model climate sensitivity traced to atmospheric convective mixing (subs req)

The big question for many people in climate science is: what happens when we double the concentration of CO2 in our atmosphere? Is it good and plants grow more? Is it bad and we all die in the next mass extinction? Is it somewhere in between and maybe we'll be ok?

Most climate models estimate between 1.5°C and 4.5°C of warming for a doubling of CO2, but as I've blogged before, there's a huge difference between the consequences of 1.5°C and 4.5°C. So how do we narrow this range down for a more accurate estimate?

These scientists from Australia and France decided to measure the drying effect of cloud mixing to try and sort it out.

Cloud feedbacks are one of the unknowns in climate models – clouds are constantly changing, their impact is localised, but overall they have a big effect. They also have different impacts at different levels. High clouds, around 8km above ground (at 400hPa pressure), play a minor role in feedback, because at that level of the atmosphere the temperature isn't changing much – the change is happening in the lower atmosphere, where our carbon pollution is trapping heat (incidentally, warming in the lower atmosphere combined with a lack of warming in the upper atmosphere is one of the ways we know humans are causing current climate change).

Mid-level clouds (3-8km up) produce medium-strength feedbacks, but the low clouds – below 3km (750hPa pressure) – are capable of strong feedback. Because of this, the researchers decided to measure the low clouds.

Obligatory cloud photo (photo: Amy Huva)

Increased evaporation as temperatures warm could lead to more clouds forming; on the other hand, a warmer climate could draw more dry air down to the surface, which stops clouds from forming. Which one is it?

Firstly, cloud mixing. About 2km above sea level there is a cloud boundary where a specific type of cloud mixing takes place. This is called 'lower tropospheric mixing', which sounds to me like a drink I'd order on a summer beach holiday. What it actually means is a mix of wind downdrafts, local evaporation of rain and the transport of shallow clouds. There is large-scale mixing, which happens via specific circulations, and there's small-scale mixing, which is much more varied and localised.

Cloud mixing at the boundary layer (from paper)

The researchers measured each type of mixing (small-scale and large-scale) separately to see whether each would dry out the boundary layer as temperatures increase. They measured the temperature and humidity changes, as well as drying, between the 700hPa and 850hPa pressure levels to work out the type of mixing involved.

Across the 48 different models they looked at, those that got warmer also got drier. For the small-scale mixing, with 4°C of warming, drying increased by 6-30%, compared with only 8% on land. This means that while the land surface is experiencing smaller amounts of drying, the clouds are drying out faster. So if you think the drought is bad – it's worse for the poor clouds.

For the large-scale mixing, the researchers looked at monthly data from 10 different models, and by relating temperature, humidity and drying to climate sensitivity they were able to account for 50% of the variance in cloud feedbacks! Considering they started their research project saying that 'no compelling theory of low cloud amount has yet emerged', I'd say that's a very successful experiment.

The bad news is they also found that with large-scale mixing, drying increased by 5-7% for every degree Celsius of warming. They also found that moisture transport increases strongly with warming, but further research is needed to work out why that happens (yay science! Discovering new things!).

So this means that their models, with more accurate measurements of cloud feedback, show a doubling of CO2 in the atmosphere produces a sensitivity of 4°C of warming, with a lower limit of 3°C. The researchers found that anything less than 3°C of warming did not line up with cloud observations for that concentration of CO2. The researchers point out that you can't rule out something weird happening in nature that could change the process suddenly (those scary tipping points), but the process they found was this:

Lower tropospheric mixing dries out the boundary layer by 5-7% per degree of warming because of stronger vertical water vapour gradients, which lead to surface evaporation increases of 2% per degree of warming.
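
To put numbers on that chain, here's a toy calculation (my framing, not the authors' code) that scales those per-degree responses linearly up to the 4°C scenario mentioned above:

```python
# Toy sketch: linearly scale the paper's per-degree responses.
# The 5-7%/degree drying and 2%/degree evaporation figures are from the
# paper; treating them as linear all the way to 4 degrees is my assumption.
def boundary_layer_response(warming_c, drying_per_c=0.06, evap_per_c=0.02):
    return warming_c * drying_per_c, warming_c * evap_per_c

drying, evap = boundary_layer_response(4.0)
print(f"4C of warming: ~{drying:.0%} more drying, ~{evap:.0%} more evaporation")
# ~24% drying, squarely inside the 6-30% range the warming models showed
```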

While it’s good to know with more certainty the impacts of doubling CO2 in the atmosphere, unfortunately this time it’s not good news. The way to solve this problem? Stop burning carbon before we’ve doubled CO2.

Our Fast-Paced Modern Climate

How can we determine dangerous levels of climate change so that we can stay within those limits?

WHO: James Hansen, Makiko Sato, Jeffrey Sachs, Earth Institute, Columbia University, New York, USA
Pushker Kharecha, Earth Institute, Columbia University, Goddard Institute for Space Studies, NASA, New York, USA
Valerie Masson-Delmotte, Institut Pierre Simon Laplace, Laboratoire des Sciences du Climat et de l’Environnement (CEA-CNRS-UVSQ), Gif-sur-Yvette, France
Frank Ackerman, Synapse Energy Economics, Cambridge, Massachusetts, USA
David J. Beerling, Department of Animal and Plant Sciences, University of Sheffield, South Yorkshire, UK
Paul J. Hearty, Department of Environmental Studies, University of North Carolina, USA
Ove Hoegh-Guldberg, Global Change Institute, University of Queensland, Australia
Shi-Ling Hsu, College of Law, Florida State University, Tallahassee, Florida, USA
Camille Parmesan, Marine Institute, Plymouth University, Plymouth, Devon, UK, Integrative Biology, University of Texas, Austin, Texas, USA
Johan Rockstrom, Stockholm Resilience Center, Stockholm University, Sweden
Eelco J. Rohling, School of Ocean and Earth Science, University of Southampton, Hampshire, UK, Research School of Earth Sciences, Australian National University, Canberra, ACT, Australia
Pete Smith, University of Aberdeen, Aberdeen, Scotland, United Kingdom
Konrad Steffen, Swiss Federal Institute of Technology, Swiss Federal Research Institute WSL, Zurich, Switzerland
Lise Van Susteren, Center for Health and the Global Environment, Advisory Board, Harvard School of Public Health, Boston, Massachusetts, USA
Karina von Schuckmann, L’Institut Francais de Recherche pour l’Exploitation de la Mer, Ifremer, Toulon, France
James C. Zachos, Earth and Planetary Science, University of California, Santa Cruz, USA

WHAT: Working out what the limit to dangerous climate change is and what the implications are for the amount of carbon we need to not burn.

WHEN: December 2013

WHERE: PLOS One, Vol 8. Issue 12

TITLE: Assessing ‘‘Dangerous Climate Change’’: Required Reduction of Carbon Emissions to Protect Young People, Future Generations and Nature (open access)

This (very) lengthy and detailed paper runs us all through exactly what’s happening with the planet’s climate, what’s making it change so rapidly (spoiler: it’s us) and what objectively we need to do about it. Needless to say, since the lead author is Dr. James Hansen, the godfather of climate science, we would do well to heed his warnings. He knows his stuff; he was doing climate models before I was born (seriously).

Firstly, the paper points out that humans are the main cause of climate change, and then neatly points out that while all 170 signatories to the UN Framework Convention on Climate Change (UNFCCC) have agreed to reduce emissions, so far not only have we not worked out what the limit for 'dangerous' climate change is, we've also done nothing to fix it except fiddle at the edges.

Epic procrastination fail, humanity.

One planet, different ways to reach zero emissions (Norman Kuring, NASA GSFC, using data from the VIIRS instrument aboard Suomi NPP)

Then the researchers look at 2°C of warming as a target. In reality, while 2°C is a nice, seemingly round number that is far enough away from our current 0.8°C of warming, the reason it was chosen to be our line in the sand is that it's the point beyond which ecosystems start collapsing. I have a sneaking suspicion it was also easy to agree on because it was way into the 'distant' future, but let's play nice and believe it was all for rational scientific rigour.

The latest IPCC report says that if we’re going to stay below 2oC, we can burn a total of 1,000GtC (Gigatonnes of carbon). Ever. This means we need to leave fossil fuels in the ground and stop cutting down more trees than we plant.

As has been pointed out in previous papers, the researchers show that burning all the fossil fuels is a really bad idea. A really bad idea in a mass extinction like the dinosaurs kind of way.

So, if we look at all the warming that has happened so far and measure the energy imbalance in the atmosphere, what do we get? First, a quick step back – energy imbalance. Think of it as the planet's energy budget, which you want to stay balanced: energy comes into the atmosphere from the sun, some goes back out to space, and some stays and keeps us warm and comfy on planet Earth.

Fossil fuels mean that humans have taken a seriously large amount of carbon out of the ground and burned it. The greenhouse gases released trap extra energy inside our atmosphere, and now we're out of balance.

What happens when we're out of balance? Well, so far it hasn't been pretty. With only 0.8°C of global warming, 97% of Greenland's surface melted for the first time in recorded history, Arctic sea ice hit new record lows, and the planet has seen more frequent and more extreme storms, floods, typhoons, hurricanes, droughts, fires, algal blooms, glacial melt and ocean acidification. We've had weird storms no-one had ever heard of before, like derechos, we've had tropical diseases in new places, and the Jet Stream over the Northern Hemisphere getting 'stuck' and dumping more weird weather on us. It's pretty clear the planet is unwell and that it's because of us.

If all that terrifying stuff is happening at 0.8°C of warming, what does that make 2°C? Hopefully your answer is 'horrifying', because that's what my answer is. Since 2050 (when we'll arrive at 2°C if we keep going business as usual) is within my working lifetime, I'll let you know how horrifying it is when we get there.

More scientific than 'horrifying' though, the researchers point out that previous paleoclimate changes, driven by the Earth's tilt and other slow orbital oscillations, took between 20,000 and 400,000 years to happen. Changes happening at that rate give the plants and animals and fish time to relocate and survive. The rate at which we're changing our modern climate is bad news for things that are not mobile.

How far out of balance are we? The paper estimates that between 2005-2010 the planet was 0.58 W/m² (±0.15 W/m²) out of balance. How much of that was caused by humanity? Well, solar irradiance has been declining over recent decades, so it's pretty much all us.

If we are 0.5 W/m² out of balance, the researchers calculate that we would need to reduce the CO2 concentration down to 360ppm to have energy balance again (we're currently at 395ppm). If you include some error in the calculations and we're 0.75 W/m² out of balance, humanity needs to get CO2 down to a concentration of 345ppm.
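
Those targets can be sanity-checked with the standard simplified CO2 forcing formula, ΔF = 5.35 × ln(C/C0) W/m² – a quick sketch of mine, not the paper's actual calculation:

```python
import math

def co2_forcing(c_ppm, c0_ppm):
    """Approximate radiative forcing (W/m^2) of CO2 at c_ppm relative to c0_ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

current = 395.0  # ppm, roughly the concentration when the paper was written

for imbalance in (0.5, 0.75):  # W/m^2, the two imbalance estimates above
    # Solve 5.35 * ln(current/target) = imbalance for the target concentration
    target = current / math.exp(imbalance / 5.35)
    print(f"{imbalance} W/m^2 out of balance -> target ~{target:.0f} ppm")
# Prints ~360 ppm and ~343 ppm, matching the paper's 360 and 345 figures.
```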

To make planning easier, the researchers suggest we just aim to get and stay below 350ppm.

The paper then runs through all the reasons why letting 2°C happen is a really bad idea: because it will lead to damaging sea level rise (sorry, Miami), because change is happening too quickly for many species to survive, and because more than half the species on the planet could go extinct from too much warming (and yes, if we warm this planet enough, humans could be part of that mass extinction).

Because the recovery back to normal temperatures happens on a timescale of millions of years, which is beyond the comprehension of humanity.

So to avoid being the next mass extinction, what do we need to do? First, we need to quantify how quickly fossil fuels need to be totally phased out.

If emissions are reduced to zero in 2015, the world could get back to 350ppm by 2100. If we wait until 2035, it would take until 2300. If we wait until 2055, it would take until the year 3000. So when we start reducing emissions is important.

Reduction scenarios (from paper) BAU: Business As Usual

If we had started reducing emissions in 2005, it would only have taken reductions of 3.5% per year. Since we didn't do that, if we start now, we need to reduce emissions by 6% a year. If we delay until 2020, it becomes 15% per year – so let's not procrastinate on this one, humanity. Also keep in mind that the amount currently considered 'politically possible' is around 2-3% reductions each year, which means that scientific reality and political delusions are going to collide very soon.
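
The reason the required rate climbs so steeply with delay is geometric: cut emissions by a constant fraction r every year and total future emissions top out at roughly (current emissions)/r. A back-of-envelope sketch, assuming ballpark fossil fuel emissions of ~10GtC per year (my round number, not the paper's):

```python
E0 = 10.0  # GtC per year, an assumed ballpark for current emissions

# Sum of the geometric series E0*(1-r)^t for t = 0, 1, 2, ... is E0/r
for rate in (0.035, 0.06, 0.15):
    print(f"{rate:.1%}/yr cuts -> ~{E0 / rate:.0f} GtC emitted in total")
# 3.5%/yr allows ~286 GtC, 6%/yr ~167 GtC, 15%/yr ~67 GtC: every year of
# delay burns budget, so the required cut rate keeps ratcheting up.
```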

If we reduce our carbon emissions by 6% per year to keep below 350ppm of carbon dioxide by the end of the century, our total carbon budget is going to be 500GtC.

This means we’ve got ~129GtC that we can burn between now and 2050, and another 14GtC left over for 2050-2100. Humanity has already burned through ~370GtC from fossil fuels, so we’ve got to kick this habit quickly.

The paper points out that this means all of our remaining fossil fuel budget can be provided for by current conventional fossil fuels. Therefore, we would require the rapid phase-out of coal and leave all unconventional fossil fuels in the ground. Yes, all of them – all the tar sands, the shale gas, the shale oil, the Arctic oil, the methane hydrates, all of it.

The researchers also say that slow climate feedbacks need to be incorporated into planning, because we’re probably starting to push those limits. Slow feedbacks include things like melting ice sheets (Greenland and Antarctica), deforestation, melting permafrost and methane hydrates.

These things are like climate ‘black swans’ – they’re unquantifiable in that you don’t know when you’ve passed the irreversible tipping point until after you’ve gone beyond it, but things like the ocean no longer being able to absorb most of the carbon we spew into the atmosphere and the rapidly melting permafrost need to be considered in daylight as well as our nightmares now. This is because slow feedbacks can amplify climate changes by 30-50% which puts a big hole in our ‘not burning carbon anymore’ plan.

The paper points out that 'warming of 2°C to at least the Eemian level could cause major dislocations for civilisation', which I don't even need to translate from scientist, because scientists are no longer bothering to pull their punches when explaining how quickly we need to stop burning carbon before we're really screwed.

So what do we do? The paper makes some suggestions, pointing out that since the science clearly shows what’s happening, the range of responses is also pretty clear.

The first thing is a price on carbon. This paper suggests a carbon ‘fee’ with a border levy for countries that don’t sign up to the fee idea. The fee starts at $15/tonne of carbon and increases by $10/tonne each year. Imports from countries that don’t have the fee get charged at the border, which can then be used for assistance to industries that are exporting to countries without the fee.

They point out that this fee is below the price of cleaning up our climate carbon mess. If we wanted to pay to sequester 100ppm of CO2 out of the atmosphere, it would cost ~$500/tonne of carbon. If that was charged to all countries based on their cumulative emissions, the bill for the USA (responsible for 25% of cumulative global emissions) would be $28 trillion, or $90,000 per person. Hmmm – expensive.

The other things we need to get rid of fossil fuels are hybrid renewable smart grids and better efficiency, as well as not only an end to deforestation but 'reforestation' – increasing the number of trees on the planet.

There’s a lot of work to be done, but the clearest thing from this paper is the choice we cannot make is to do nothing. So let’s stop burning carbon.

Timing the Carbon Excursion

Trying to fine tune the timeline for the Paleocene-Eocene mass extinction using fossilised mud

WHO: James D. Wright, Department of Earth and Planetary Sciences, Rutgers University
Morgan F. Schaller, Department of Earth and Planetary Sciences, Rutgers University, Department of Geological Sciences, Brown University, Providence, RI

WHAT: Trying to get a more precise timeline of the Paleocene-Eocene thermal maximum (PETM) mass extinction.

WHEN: October 1st, 2013

WHERE: Proceedings of the National Academy of Sciences of the United States of America (PNAS), vol. 110, no. 40, October 2013.

TITLE: Evidence for a rapid release of carbon at the Paleocene-Eocene thermal maximum (subs req.)

My favourite accidental metaphor from this paper is when the researchers talked about a ‘carbon isotope excursion’ that caused the Paleocene-Eocene thermal maximum, which is scientist for carbon being released into the atmosphere causing a mass extinction. However, I couldn’t help thinking of a school bus full of carbon isotopes…

CO2 getting on a school bus??? (image: Hannah Holt, Lightbulb Books)

The Paleocene-Eocene thermal maximum (or PETM for short) occurred around 55.8 million years ago and is considered the best historical comparison to what we're currently doing to the atmosphere with carbon pollution. There was rapid global warming of around 6°C, which was enough to cause a mass extinction, but estimates from deep-ocean fossilised carbon isotopes of how long the carbon release took range from less than 750 years up to 30,000 years.

Obviously this is a huge window of time, and unless it can be further narrowed down, not very useful for humans trying to work out how long we have left to do something about climate change. Narrowing the time scale for the mass extinction is what these researchers tried to do.

Firstly, there are several ways the PETM mass extinction could have happened. The possibilities are: release of methane from reservoirs like permafrost or under the ocean, wildfires in peatlands, desiccation of the ocean (the air warming up enough that the ocean starts evaporating), or a meteor impact.

Without working out the timing, we can't work out which mechanism caused the extinction. So these researchers went to an area of the USA called the Salisbury Embayment, which is modern-day Delaware but was shallow ocean 55 million years ago. Because it was shallow ocean that fossilised and is now dry land, it holds a more detailed record of ancient ocean sediment, whereas sediment still under the ocean keeps getting reworked by currents and so only shows bigger changes.

The researchers looked at sediment cores from Wilson Lake B and Millville which have distinct and rhythmic layers of silty clay through the entire section that corresponds to the carbon excursion into the atmosphere.

Layers of clay in the sediment cores (from paper)

The distinct layers you can see above occur every 1-3cm along the core which allows for a more precise chronology of the mass extinction.

First they ran tests for the amount of 18O (an oxygen isotope) and found cycles of a 1.1‰ change consistently through the core from Wilson Lake B. They also found a similar cycle in the Millville core. They then tried to work out what kind of time scale this could line up with. After running through a few ideas that didn't fit with other evidence from the period, they worked out that the oxygen cycles were consistent with seasonal changes, which means the sediment cores line up to 2cm of mud per year. This is excitingly precise! It also fits with what researchers know about the build-up of sediment in shallow areas – e.g. the Amazon River can accumulate 10cm/year of mud and clay, so 2cm per year fits the other evidence.
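
Once the accumulation rate is pinned down, converting core depth into elapsed time is simple division. A minimal sketch of the implied age model (assuming the 2cm/yr rate holds constant):

```python
def years_between(depth_top_m, depth_bottom_m, rate_m_per_yr=0.02):
    """Years of mud accumulation between two core depths at a constant rate."""
    return (depth_bottom_m - depth_top_m) / rate_m_per_yr

# The interval sampled below (273.37-273.96m) spans about 30 years of mud;
# the 13-year carbon drop described next sits within it.
print(years_between(273.37, 273.96))  # ~29.5 years
```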

Next in the researchers' whodunit, they sampled the beginning of the carbon release, which was at 273m along the core. To be precise, they sampled from 273.37-273.96m and tested for 13C (a carbon isotope).

They found the carbon isotope ratio (δ13C) in the silt dropped to 3.9‰, then 1.45‰, and then to -2.48‰ over thirteen years. A drop in the ocean's δ13C like this signals a massive release of carbon into the ocean and atmosphere. The concentration of calcium carbonate (CaCO3), which is what shells are made of, also dropped from 6% to 1% within a single layer of sediment. So there was definitely something going on, but what exactly was it?

The beginning of the carbon release is the unknown bit, and it's generally agreed that the recovery from the 6°C of global warming took 150,000 years. The sediment layers show that the release of carbon was almost instantaneous (it happened over 13 years), and after the release it took between 200-2,000 years for the ocean and the atmosphere to reach a new equilibrium. This means that after the initial carbon release the atmosphere changed rapidly for the next 2,000 years before it reached the 'new normal'. From there it then took 150,000 years to get back to the original normal from before the carbon release.

Long road to recovery – PETM followed by 2,000 years of change, then a slow slope back to normal temps (from Hansen et al., PNAS 2013); bottom scale in millions of years.

Looking back at what order of events could have caused a mass extinction like this, the only one that fits is methane release AND a meteorite. There was a recovery of carbon isotope concentrations in the upper clay, which the researchers state indicates carbon moving from the atmosphere into the ocean, with more carbon movement in the shallow areas than the deep ones. However, the meteorite alone (the carbon moving from the atmosphere into the ocean) wouldn't be enough to cause 6°C of global warming. It takes around 3,000 Gigatonnes (Gt) of carbon to change the atmosphere by that much, hence the researchers think it was a meteorite plus methane release from the ocean that together add up to the 3,000Gt.

So what are the lessons for humanity in this great carbon excursion of mass extinction? Firstly, climate change can happen quite quickly. Secondly, once you’ve changed the atmosphere, it’s a long road back to the normal you had before you burned the carbon. Thirdly, it only takes 3,000Gt of carbon to cause a mass extinction.

The last UN IPCC report gave the planet a carbon budget of 1,000Gt of total emissions to keep humanity in a level of ‘safe’ climate change. Currently, the world has burned half of that budget and will blow through the rest of it within the next 30 years. If we burn enough carbon to start releasing methane from permafrost or the oceans, it’s a 150,000 year trek back from a possible mass extinction.

I guess it is true – those who don’t learn from history are doomed to repeat it. How about we learn from history and stop burning carbon?

 

Oct. 29th – caption for Cenozoic era graph changed for clarity (time scale is different as the graph is from a different paper).

Smoking Kills, so does Climate Change

A translation of the IPCC 5th Assessment Report Summary for Policymakers

WHO: The Intergovernmental Panel on Climate Change

WHAT: Summary for policymakers of their 2,000-page 5th Assessment Report (AR5) on the state of the climate and climate science.

WHEN: 27 September 2013

WHERE: On the IPCC website

TITLE: Climate Change 2013: The Physical Science Basis Summary for Policymakers (open access)

There’s a lot of things not to like about the way the IPCC communicates what they do, but for me the main one is that they speak a very specific dialect of bureaucrat that no-one else understands unless they’ve also worked on UN things and speak the same sort of acronym.

The worst bit of this dialect of bureaucrat is the words they use to describe how confident they are that their findings are correct. They probably believe they’re being really clear, however they use language that none of the rest of us would ever use and it means their findings make little sense without their ‘very likely = 90-100% certain’ footnote at the front.

So now that we’ve established that the UN doesn’t speak an understandable form of English, what does the report actually say? It works its way through each of the different climate systems and how they’re changing because humans are burning fossil fuels.

As you can see from this lovely graph, each of the last three decades has been noticeably warmer than the preceding one, and the IPCC are 66% sure that 1983-2012 was the warmest 30-year period in 1,400 years.

Decade by decade average temperatures (Y axis is change in Celsius from base year of 1950) (from paper)

One of the reasons I really like this graph is you can see how the rate of change is speeding up (one of the key dangers with climate change). From 1850 through to around 1980, each decade's average touches the box of the decade before it; after the 80s, the heat shoots up much more rapidly.

The report did have this dig for the deniers though: ‘Due to natural variability, trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends’. Which is UN bureaucrat for ‘when you cherry pick data to fit your denier talking points you’re doing it wrong’.

Looking at regional atmospheric trends, the report notes that while things like the Medieval Warm Period did have multi-decadal periods of change, these changes didn’t happen across the whole globe like the warming currently being measured.

In the oceans, the top layer (the top 75m) has warmed by 0.11°C per decade from 1971 to 2010, and more than 60% of the extra energy we've trapped in the climate system since 1971 has been stored in the upper ocean (above 700m), with another 30% stored in the ocean below 700m.

This extra heat is not just causing thermal expansion, it's speeding up sea level rise, which the IPCC are 90% certain increased from an average of 1.7mm per year (over 1901-2010) to 3.2mm per year (over 1993-2010). This is faster than at any time in the past two millennia. Yes, sea level is rising faster than it has for the last 2,000 years, so you might want to sell your waterfront property sooner rather than later.

The extra carbon has also made it harder to live in the ocean if you own a shell, because the acidity of the ocean has increased by 26% which makes shells thinner and harder to grow.

On the glaciers and the ice sheets, the IPCC is 90% certain that the rate of melting from Greenland has increased from 34 Gigatonnes (Gt) of ice per year to 215Gt of ice per year after 2002. Yes, increased from 34Gt to 215Gt – it's melting six times faster now, thanks to us.

For Antarctica, the IPCC is 66% certain that the rate of ice loss has increased from 30Gt per year to 147Gt per year, with most of that loss coming from the northern Antarctic Peninsula and West Antarctica – and that's the net figure, which already counts the parts of Antarctica gaining ice through natural variability.

And at the North Pole, Santa is going to have to buy himself and his elves some boats or floating pontoons soon, because the IPCC have found 'multiple lines of evidence support[ing] very substantial Arctic warming since the mid-20th Century'. Sorry Santa!

As for the carbon we’ve been spewing into the atmosphere since the 1850s, well, we’re winning that race too! ‘The atmospheric concentrations of carbon dioxide, methane and nitrous oxide have increased to levels unprecedented in at least the last 800,000 years’. Congratulations humanity – in the last century and a half, we’ve changed the composition of the atmosphere so rapidly that this hasn’t been seen in 800,000 years!

Methane levels have gone up by 150%, and I’m undecided as to whether that means I should eat more beef to stop the cows from farting, or if it means we raised too many cows to be steaks in the first place…

This is the part of the report where we get into the one excellent thing the IPCC did this time around – our carbon budget. I’m not sure whether they realised that committing to reduce certain percentages by certain years from different baselines meant that governments were able to shuffle the numbers to do nothing and make themselves look good at the same time, but this is a promising step.

I’ve written about the very excellent work of Kevin Anderson at the Tyndall Centre in the UK before, but the basic deal with a carbon budget is this: it doesn’t matter when we burn the carbon or how fast, all the matters is the total emissions in the end. You can eat half the chocolate bar now, and half the chocolate bar later, but you’re still eating a whole bar.

Our budget to have a 2/3 chance of not going beyond dangerous climate change is 1,000Gt of carbon, and so far we've burnt 545Gt, so we're more than halfway there. All of this leads to the conclusion that 'human influence on the climate system is clear. This is evident from the increasing greenhouse gas concentrations in the atmosphere, positive radiative forcing, observed warming and understanding of the climate system.'
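
A quick aside on that budget arithmetic – 'more than halfway' also tells you roughly how much time is left at today's burn rate (assuming a ballpark ~10GtC per year, my round number, not the IPCC's):

```python
budget, burnt, rate = 1000.0, 545.0, 10.0  # GtC, GtC, assumed GtC per year
years_left = (budget - burnt) / rate
print(years_left)  # ~45 years at a constant rate; fewer if emissions keep growing
```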

What observations, you may ask? Scientists have made progress on working out how climate change pumps up extreme weather and makes it worse. They also got it right for the frequency of extreme warm and cold days, which if you live in North America was the hot extremes winning 10:1 over the cold extremes. Round of applause for science, everyone!

Warming with natural forcing vs human forcing and how it lines up with the observations (from paper)

They’re also 95% sure that more than half of the observed global surface warming from 1951 is from humanity. So next time there’s a nasty heatwave that’s more frequent than it should be, blame humans.

The report does also point out that even though the heat records are beating the cold records 10:1, this doesn't mean that snow disproves climate change (sorry, Fox News!). There will still be year-to-year and decade-by-decade variability in how our planet cooks, and it will not be the same across the whole planet. Which sounds to me like we're being warmed in an uneven microwave. For instance, El Niño and La Niña will still be big influences over the Pacific and will determine to a great extent the variability in the Pacific North West (yeah, it's still going to rain a lot, Vancouver).

For those that were fans of the film The Day After Tomorrow, there's a 66% chance the Atlantic Meridional Overturning Circulation will slow down, but only a 10% chance it will undergo an abrupt change or collapse like it did in the film, so you're not going to be running away from a flash-freezing ocean any time this century.

The report then runs through the different scenarios they've modelled, ranging from 'we did a lot to reduce carbon emissions' to 'we did nothing to reduce carbon emissions and burned all the fossil fuels'. Because this is the IPCC and they had to get EVERYONE to agree on each line of the report (I'm serious, they approved it line by line, which has to be the most painful process I can think of), the scenarios are conservative in their estimations and don't include tipping points (which are really hard to incorporate anyway). So their 'worst case scenario' is only 4.0°C of surface warming by 2100.

Representative Concentration Pathway (RCP) Scenarios from the IPCC AR5

Now, obviously 'only' 4°C of climate change by the end of the century is still pretty unbearable. There will still be a lot of hardship, drought, famine, refugee migration and uninhabitable parts of the planet with 4°C. However, getting to 4°C is likely to trigger tipping points like methane release from permafrost, so 4°C would just be a way station on the road to 6°C even if we ran out of carbon to burn. And 6°C, of course, as you all hear me say frequently, is mass extinction time. It's also the point at which, even if humanity did survive, you wouldn't want to live here anymore.

The paper finishes up with a subtle dig at the insanity of relying on geoengineering, pointing out that trying to put shade reflectors into orbit or artificially suck carbon out of the air has a high chance of going horribly wrong. They also point out that if we did manage large-scale geoengineering and it then broke down, the carbon re-released back into the atmosphere would super-cook the planet really quickly.

The moral of this 36-page 'summary' is that it's us, guys. We're as certain that we've done this as we are that smoking causes cancer. We have burned this carbon, it has changed the energy balance of the atmosphere, and if we don't stop burning carbon we're going to cook the whole planet. Seriously. So let's finally, actually stop burning carbon.

If We Burn All the Fossil Fuels

“The practical concern for humanity is the high climate sensitivity and the eventual climate response that may be reached if all fossil fuels are burned” – Hansen et al. September 2013

WHO: James Hansen, Makiko Sato, The Earth Institute, Columbia University, New York, NY
Gary Russell, NASA Goddard Institute for Space Studies, New York, NY
Pushker Kharecha, The Earth Institute, Columbia University, NASA Goddard Institute for Space Studies, New York, NY

WHAT: Using deep ocean oxygen isotope ratios to determine the sensitivity of climate forcing for sea levels and surface temperatures.

WHEN: September 2013

WHERE: Philosophical Transactions of the Royal Society A (Phil Trans R Soc A) Vol. 371, No. 2001

TITLE: Climate sensitivity, sea level and atmospheric carbon dioxide (open access)

Ok, firstly, let us just take a moment to geek out about how awesome science is. This paper looks at what our planet was like millions of years ago by studying the amounts of different oxygen and carbon isotopes in the shells of foraminifera that have been buried at the bottom of the ocean since they died millions of years ago. Science can tell us not only how old they are by dating the carbon in their fossilised bodies, but also what the temperature was too. That is awesome.

Foraminifera from Japan (Wikimedia commons)

The lead author of this paper – Dr. James Hansen is pretty much the Godfather of climate science. He’s been doing climate models looking at the possible effects of extra carbon in our atmosphere since he basically had to do them by hand in the 1980s before we had the internet. He knows his stuff. And so far, he’s been right with his projections.

The paper (which is a very long read at 25 pages) focuses on the Cenozoic climate, which is the period of time from 65.5 million years ago to present. The Cenozoic is the period after the Cretaceous (so we're talking mammals here, not dinosaurs) and includes the Palaeocene-Eocene thermal maximum, where the deep ocean was 12°C warmer than today, as well as the cooling from there that led to the formation of the Greenland and Antarctic ice sheets.

The period of time studied by the paper (bottom axis is million years before present) (from paper)

What does this show us? The warming that eventually led to the Palaeocene-Eocene thermal maximum started around 3,000 years before there was a massive carbon release. The researchers think this carbon release was from methane hydrates in the ocean venting, because there was a lag in the warming in the intermediate ocean after the carbon release.

The thermal maximum had global surface temperatures around 5°C warmer than today, and it took a release of about 4,000-7,000 Gigatonnes (Gt) of carbon into the atmosphere to force that kind of warming.

After this warming happened there were ‘hyperthermal’ events (where the temperature spiked again) as the planet slowly cooled, showing how long the recovery time for the planet was from this greenhouse warmed state.

In the warmed world of the Palaeocene-Eocene maximum, sea levels were probably 120m higher than they are now. The researchers found that there's a snowball effect with changes in ocean temperatures: a -1°C difference in deep ocean temperatures was enough to trigger the last ice age, while sea levels were 5-10m higher when temperatures were 'barely warmer than the Holocene' (which is us – we live in the Holocene).

The researchers found that during the Pliocene (about 5 million years ago) sea levels were 15m higher than today, which they point out means that the East and West Antarctic ice sheets are likely to be unstable at temperatures we will reach this century from burning fossil fuels.

From the data they then tried to work out what the sensitivity of the atmosphere is to extra carbon. This is important to know, because we’re currently changing the chemical composition of the atmosphere much faster than ever before. The previous greenhouse warming that the planet experienced occurred over millennial time scales – the current rate that we’re pumping carbon into the atmosphere is causing change over only hundreds of years.

To work out how sensitive the climate is to being forced by carbon, the researchers used a simplified model where the atmosphere was split into 24 layers to test the rapid equilibrium responses to forcing.

They wanted to find out if we could be in danger of runaway climate change – the most extreme version of which happened on the planet Venus, where runaway warming amplified by water vapour led to a new stable average temperature of 450°C, the carbon was baked onto the surface of the planet and all the water evaporated into the sky. Obviously, humanity will want to avoid that one… The good news is there isn't enough carbon on this planet for humans to accidentally do that to ourselves before the sun does it to us in a billion years or so.

We've avoided this for now (NASA NSSDC Photo Gallery)

The researchers then tested the response to doubling and halving the CO2 in the system, from the 1950 concentration of 310ppm of CO2 in the atmosphere. They found that three halvings give you a 'snowball Earth' response of mass glaciations, while in the other direction, 1-4x CO2 is when all the snow disappears, which speeds up the feedback (because snow reflects heat), making the fast-feedback sensitivity 5°C of global warming per doubling. For 8-32x CO2 the sensitivity is approximately 8°C, with water vapour feedbacks (what happened on Venus, but on a smaller scale).
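
To put actual concentrations on those multiples, here's a quick sketch using the paper's 1950 baseline of 310ppm:

```python
base = 310.0  # ppm of CO2 in 1950, the paper's reference point

for doublings in (-3, -2, -1, 1, 2, 3, 4, 5):
    factor = 2.0 ** doublings
    print(f"{factor:>6.3f}x CO2 = {base * factor:>7.1f} ppm")
# Three halvings (~39 ppm) is snowball Earth territory; 8-32x CO2
# (2,480-9,920 ppm) is where the stronger ~8C-per-doubling sensitivity applies.
```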

But what do any of these numbers mean?

As the paper says; ‘the practical concern for humanity is the high climate sensitivity and the eventual climate response that may be reached if all fossil fuels are burned’.

So here’s the lesson we need to learn from the Palaeocene-Eocene thermal maximum. For global warming we can assume that 75% of it is from CO2, and the remaining 25% is from other greenhouse gasses like methane and nitrous oxide. If we burn all the fossil fuels we have left in the ground, that’s about 10-15,000Gt of carbon that we could put in the atmosphere.

That gives us 5x the CO2 from 1950, or roughly 1,400ppm. This will give us 16°C of global warming – a world where warming averages 20°C on land and 30°C at the poles (the current global average temperature is 14°C). Keep in mind also that 6°C of warming is generally enough for a mass extinction like the one that got the dinosaurs.

This would eliminate grain production across most of the globe and seriously increase the amount of water vapour in the air, making it much more humid (the water vapour would also destroy most of the ozone layer).

A wet bulb temperature is the temperature with humidity included. Humans generally live with wet bulb temperatures between 26-27°C, up to 31°C in the tropics. A wet bulb temperature of 35°C or above means the body can't cool down and results in 'lethal hyperthermia', which is scientist for 'it's so hot and sticky that you die from the heat'.
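
If you want to play with wet bulb temperatures yourself, here's a sketch using Stull's 2011 empirical approximation (my choice of formula – it's not from the paper), which estimates wet bulb temperature from air temperature and relative humidity near sea level:

```python
import math

def wet_bulb_stull(t_c, rh_pct):
    """Approximate wet bulb temperature in deg C (Stull 2011, valid ~5-99% RH)."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

print(wet_bulb_stull(30, 50))  # ~22C: hot, but the body can still shed heat
print(wet_bulb_stull(40, 70))  # ~35C: at the lethal hyperthermia threshold
```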

Burning all the fossil fuels will result in a planet with wet bulb temperatures routinely above 35oC, which means we’ll have cooked the atmosphere enough that we’ll end up cooking ourselves.

If the climate has a low sensitivity to this kind of forcing, it will take 4.8x CO2 concentrations to cause an unlivable climate. If the climate is more sensitive, it will take less than that to cook ourselves.

Oh, and the other kicker? The Palaeocene-Eocene thermal maximum played out over millennia and took millions of years to recover from, so the mammals survived by evolving to be smaller. Our climate change is only taking hundreds of years, which is not enough time for plants or animals to evolve and adapt.

Basically, if we burn all the fossil fuels, we’re all going down and taking the rest of the species on the planet with us, and we really will be the dumbest smart species ever to cause our own extinction.

So far, James Hansen has been correct with his climate projections. So when he says we can’t burn all the fossil fuels because if we do we’ll cook the planet, I say we pay close attention to what he says. Oh, and we should stop burning carbon.

Vote for last week’s paper!

Remember how I was excited about the possibilities of scaling up the carbon sequestration process outlined in last week’s post from the Proceedings of the National Academy of Sciences in the USA?

Turns out you can vote for it!

I had an email from the lead author of the paper (I send my blog posts to the lead authors when I post them) letting me know that their process has made the finalists of two MIT Climate CoLab ideas. So if you’re excited about the idea of feasibly sequestering carbon dioxide from the oceans being tested out as well, you can vote for them.

The first proposal is for the Geoengineering section called ‘Saving the Planet v2.0‘. The second proposal is for the Electric power sector section called ‘Spontaneous Conversion of Power Plant CO2 to Dissolved Calcium Bicarbonate‘.

Climate CoLab is an online space where people work to try and crowdsource ideas for what to do about climate change. The contest voting closes in 11 days (August 30th) and the winning proposals will be presented at the Crowds & Climate Conference at MIT in November.

So if it takes your fancy, and you’d like to see this project presented at the conference, go forth and vote!

 

Disclosure: I am not affiliated with either the paper or the MIT CoLab project.

Antacid for our Oceans

An electrolysis method that removes CO2 from seawater could be affordably scaled up for commercial carbon sequestration.

WHO: Greg H. Rau, Institute of Marine Sciences, University of California, Santa Cruz, Physical and Life Sciences, Lawrence Livermore National Laboratory, Livermore, CA
Susan A. Carroll, William L. Bourcier, Michael J. Singleton, Megan M. Smith, and Roger D. Aines, Physical and Life Sciences, Lawrence Livermore National Laboratory, Livermore, CA

WHAT: An electrochemical method of sequestering CO2 from sea water using silicate rocks.

WHEN: June 18, 2013

WHERE: Proceedings of the National Academy of Sciences (USA), PNAS vol. 110, no. 25

TITLE: Direct electrolytic dissolution of silicate minerals for air CO2 mitigation and carbon-negative H2 production (open access)

This paper was fun – I got to get my chemistry nerd back on thinking in moles per litre and chemical equations! It almost made me miss university chemistry lectures.

No, not those moles per litre! (IFLS facebook page)

So what do chemistry jokes have to do with carbon sequestration? It's looking increasingly like humanity is going to have to figure out ways to draw carbon back out of the atmosphere or the oceans, because we've procrastinated on reducing our carbon emissions for so long.

There’s two options for this – you can either create a chemical reaction that will draw CO2 out of the air, or you can create a chemical reaction that will draw CO2 out of a solution, and given how quickly the oceans are acidifying, using sea water would be a good idea. The good news is; that’s exactly what these researchers did!

Silicate rock (which is mostly basalt rock) is the most common rock type in the Earth’s crust. It also reacts with CO2 to form stable carbonate and bicarbonate solids (like the bicarbonate soda you bake with). Normally this takes place very slowly through rock weathering, but what if you used it as a process to sequester CO2?

The researchers created a saline water electrolytic cell to test it out. An electrolytic cell is the one you made in school, where you had an anode and a cathode and (generally) two different solutions, and when you put an electric current through it you created a chemical reaction. What these researchers did was put silicate minerals, saline water and CO2 in on one side, and when they added electricity they got bicarbonates, hydrogen, chlorine or oxygen, silicates and salts out.

A basic schematic of the experiment (from paper)

The researchers used an acid/base reaction (remember those from school?!) to speed up the silicate and CO2 reaction, which also works well in an ocean because large differences in pH are produced in saline electrolysis. Are you ready to get really nerdy with me? The chemical equation is this:

Chemical equation for the experiment (from paper)

So how did the experiment go? It worked! They successfully sequestered carbon dioxide with an efficiency of 23-32%, capturing 0.147g of CO2 per kilojoule (kJ) of electricity used.

There are issues around the scaling up of the reaction of course – once the bicarbonate has been created, where do you store it? The paper suggested ocean storage as the bicarbonate solids would be inert (un-reactive). I would hope that a re-use option could be found – has anyone looked into using bicarbonate solids as an eco-building material?

There’s also the issue of needing to power the reaction with electricity. If scaled up, this process would have to make sure it was powered by renewable energy, because burning carbon to sequester carbon gives you zero.

Also, if sea water is used, the main by-product is Cl2 so the researchers point out that while it would be feasible to do this process directly in the ocean, the issue of what to do with all that chlorine would need to be dealt with. The paper suggests using oxygen selective anodes in the electrolysis, or ion-selective membranes around the reaction to keep the chlorine separate from the ocean.

That being said, there are some exciting upsides to this process. The paper points out that the amount of silicate rock in the world ‘dwarf[s] that needed for conversion of all present and future anthropogenic CO2.’ Also, using sea water is an easier way to sequester CO2 rather than air-based methods.

Scaling the method up looks economically feasible too. The researchers estimated that 1.9 MWh (megawatt hours) of energy would be needed per metric tonne of CO2 sequestered. If the waste hydrogen from the process were sold as fuel for hydrogen fuel cells, the price would be $86/tonne of CO2 sequestered. If selling the hydrogen wasn't feasible, it would still only be $154/tonne, which compares very favourably to most current carbon capture and storage feasibility estimates of $600-$1,000/tonne of CO2.
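
That 1.9 MWh figure follows directly from the measured 0.147g of CO2 per kJ – a quick unit-conversion check:

```python
g_per_kj = 0.147                  # the paper's measured sequestration rate
kj_per_tonne = 1e6 / g_per_kj     # kJ needed for one tonne (1,000,000 g) of CO2
print(kj_per_tonne / 3.6e6)       # divide by kJ per MWh -> ~1.9 MWh per tonne
```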

So, like an antacid for the oceans, if this process can be scaled up commercially through research and development, we could have an effective way to not only capture and store carbon, but also reduce ocean acidification. A good-news story indeed – now we just need to stop burning carbon.

CO2 garden steroids

Is additional atmospheric CO2 fertilizing plants? Is this a good thing?

WHO: Randall J. Donohue, Tim R. McVicar, CSIRO Land and Water, Canberra, ACT, Australia
Michael L. Roderick, Research School of Biology and Research School of Earth Sciences, Australian National University, Canberra, ACT; ARC Centre of Excellence for Climate System Science, Sydney, New South Wales, Australia
Graham D. Farquhar, Research School of Biology, Australian National University, Canberra, ACT, Australia.

WHAT: Trying to measure the fertilization effect that increased atmospheric CO2 has on plant growth

WHEN: 19 June 2013

WHERE: Geophysical Research Letters, Vol. 40, Issue 12, June 2013

TITLE: Impact of CO2 fertilization on maximum foliage cover across the globe’s warm, arid environments (subs req.)

Climate deniers and people who don’t want to take action on climate change often say that increased levels of CO2 in our atmosphere will be a good thing, because plants need CO2 to grow, so why not let the plants have all the CO2 they want and watch them grow like gangbusters!?

More CO2 = good? (Chris Gin, flickr)

This is the same as suggesting that because humans need water, I should drink water all the time without pause and it would be awesome (it wouldn't; I would die from water intoxication). Flaws in denier logic aside, these researchers from the CSIRO tried to find out whether there has been an increase in plant growth from increased atmospheric CO2, and whether you can actually measure it.

The researchers looked at warm, arid areas of the world where rain is the limiting factor for plant growth – places like south-western Australia, Southern California and south-eastern Spain. They then looked at plant growth data from 1982-2010, broken into three-year averaged segments to account for the lag between changes in rain patterns and plant growth.

Warm arid areas included in the study (yearly data in grey, 3yr averages in red, from paper)

Then they ran the numbers for what plant growth would have been with constant amounts of rain so they could separate out the effect of CO2 alone.

What they found was that transpiration and photosynthesis are directly coupled to atmospheric CO2, which plays a role in setting the Fx edge (the upper limit of foliage cover). Plant growth increased ~11% between 1982 and 2010 from increased atmospheric CO2 fertilization.

Then, just to make sure they were correct, they looked at the different things that could have influenced that increase. Increased temperatures lowered plant growth (too hot to grow). Plant productivity increased to a certain point under drought conditions (as plants became more water-efficient and drought-tolerant), but that couldn't account for the 11% increase. There was an observed 14% increase in plant growth from a 10% increase in precipitation, but that couldn't account for their numbers either, because they ran the numbers assuming a constant level of rain.

So, as the researchers stated in their paper, this ‘provides a means of directly observing the CO2 fertilization effect as it has historically occurred across the globe’s warm, arid landscapes’.

But does this mean all plants will grow more and we don’t have to worry about climate change anymore? Unfortunately, no.

This only applies to warm arid areas where water is the dominant limit to growth. Also, the other effects of climate change – longer droughts, more extreme storms, longer heatwaves, more extreme bushfires – are likely going to outweigh the positive effect of the increase in growth from CO2 fertilization.

The researchers point out in their conclusion that this research doesn’t simply translate to other environments with different limitations. In a Q&A with the CSIRO when asked whether this research means that climate change is good, the lead author Dr. Donohue stated; ‘this does not mean that climate change is good for the planet.’

So yes, there is a fertilization effect from increased CO2, but no, it doesn’t mean we get to keep burning carbon.

Extreme Overachiever Decade 2001-2010

The World Meteorological Organisation global climate report for last decade shows it winning all the most extreme awards.

WHO: The World Meteorological Organisation (WMO), in conjunction with international experts and meteorological and climate organisations

WHAT: The Global Climate Report for the decade 2001-2010

WHEN: July 2013

WHERE: Online at the WMO website

TITLE: The Global Climate 2001-2010 A Decade of Climate Extremes (open access)

The World Meteorological Organisation (WMO) recently released its wrap-up of the last decade's worth of global weather-related data. Now, as you will all remember, climate is the long-term trend of the weather (generally over a 30-year period), but it's also important to keep track of the decade-to-decade trends, if only because 10 is a nice round number to deal with.

So what did the last decade's report card look like? Turns out 2001-2010 was an overachiever when it came to records and extremes, which is bad news for all of us, really. The last decade was the warmest on record for overall temperatures, and the speed of warming has ramped up as well. The long-term warming trend is 0.062°C/decade, but since 1970 it has sped up to 0.17°C/decade.
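
For the curious, a trend like 0.17°C/decade is just the slope of a straight line fitted to annual temperature anomalies, scaled to ten years – an illustrative sketch with made-up data (not the WMO's code):

```python
import numpy as np

years = np.arange(1970, 2011)
# Hypothetical anomalies rising ~0.17C/decade with some weather noise on top:
rng = np.random.default_rng(0)
anomalies = 0.017 * (years - 1970) + rng.normal(0, 0.1, years.size)

slope_per_year = np.polyfit(years, anomalies, 1)[0]
print(f"trend: {slope_per_year * 10:.2f} C/decade")  # ~0.17
```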

Decade by decade temperature record (from report)

If you only look at land surface temperatures, 2007 held the record for hottest at 0.95°C above average, with 2004 the 'least warm' (there are no long-term cold records happening here) at 0.68°C above average.

If you look at sea surface temperatures, 2003 wins with 0.40°C above average and 2008 was the least warm at 0.26°C above average. The warmest year in the Northern Hemisphere was 2007 at 1.13°C above average, and in the Southern Hemisphere 2005 wins with 0.67°C.

When it comes to rain, it's a mixed bag. There were places that got more rain, there were places with drought, there were more extremes. South America was wetter than normal, Africa was drier than normal. Overall, 2002 was the driest year of the decade and 2010 was the wettest. The problem with rain patterns shifting under climate change is that both the location and the timing change. Instead of slow soaking rains, we get extended dry spells followed by flash flooding.

The patterns of El Niño and La Niña switched back quite a lot during the decade, with El Niño generally creating warmer trends and La Niña creating cooler trends.

El Niño and La Niña trends for sea surface temperatures (from report)

To qualify as an extreme event, an event needs to result in 10 or more people dying, 100 or more people being affected, a declaration of a state of emergency and the need for international assistance, which I think is a pretty high bar. But of course, since the last decade was overachieving, there were 400 disasters of this scale, killing more than 1 million people.

Weather disasters represented 88% of these extreme events, and the damage from them has increased significantly, along with a 20% increase in casualties compared to the previous decade. The extra casualties came from some extreme increases in certain categories, like heatwaves: in 1991-2000, 6,000 people died from heatwaves; in 2001-2010 that jumped to 136,000 people.

The price of extreme weather events has also gone up, with 7,100 hydrometeorological events carrying a price tag of US$1 trillion and resulting in 440,000 deaths over the decade. It's also estimated that 20 million people worldwide were displaced, so this was probably our first decade with a sizable number of climate refugees. Internal displacement will be one of the biggest factors as people move away from the more extreme parts of their country to the places where it still rains (e.g. from Arizona to Oregon).

Tropical storms were a big issue, with the report noting ‘a steady increase in the exposure of OECD countries [to tropical cyclones] is also clear’. It’s nice to see them point out that extreme weather is not just a developing-world problem stemming from a lack of infrastructure to cope – witness the flooding in Germany and Australia last decade.

There was also a special shout-out to my homeland of Australia for the epic heatwave of 2009, during which I experienced 46°C in Melbourne and can attest to it not being fun. Of course, that epic heatwave was beaten by 2013’s new extreme map colour. I also noted that Australia was copping pretty much all of the extreme weather effects over the decade. Ouch.

Australia – can’t catch a break (compiled from report)

Even in the category of ‘cold waves’, where the Northern Hemisphere saw an increase in freak snowstorms, the average winter temperature was still 0.52°C warmer than average.

Last decade also set lots of records in the cryosphere (the frozen part of the planet). 2005-2010 included the five lowest Arctic sea ice extents on record, with the ice declining in extent and volume at a disturbingly rapid rate in what is commonly known as the Arctic Death Spiral. There has been an accelerating loss of mass from the Greenland and Antarctic ice sheets, and a decline in glaciers worldwide.

Arctic Death Spiral (from report)

The World Glacier Monitoring Service described the glacier melt rate and cumulative loss as ‘extraordinary’ and noted that glaciers are currently so far from their equilibrium state that even without further warming they’re going to keep melting. Oh, and last decade won the record for loss of snow cover too.

Declining snow cover (from report)

Sea level also rose faster last decade, at a rate of 3 mm/yr – double the historical average of 1.6 mm/yr. Interestingly, sea level rise is not even across the globe, due to differences in ocean circulation and mass. In a La Niña year, the western Pacific Ocean can be 10-20 cm higher than the decade’s average, but there’s only one way sea levels are going as the water warms and expands and the ice sheets melt – up.
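For a sense of scale, here’s a back-of-the-envelope sketch of what those two rates would add up to if (unrealistically) they simply held steady:

```python
# Back-of-the-envelope only: real sea level rise is accelerating rather than
# constant, so treat these as floors, not forecasts.
for rate_mm_per_yr in (1.6, 3.0):  # historical average vs last decade
    rise_cm_per_century = rate_mm_per_yr * 100 / 10  # mm over 100 yr -> cm
    print(f"{rate_mm_per_yr} mm/yr held for a century = {rise_cm_per_century:.0f} cm")
```

At last decade’s rate that’s 30 cm per century before any further acceleration – and acceleration is exactly what the ice sheet numbers above suggest.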

Sea level rise (from report)

Finally, if all of that wasn’t enough bad news for you, the report looked at greenhouse gas concentrations in the atmosphere and found (surprise, surprise) that CO2 is up, accounting for 64% of the increase in radiative forcing (the thing making our planet warmer) over the past decade and 81% of the increase over the last five years. Methane is responsible for 18% of the increase, and nitrous oxide chips in 6%.

Does it make anyone feel better if I tell you the hole in the ozone layer isn’t getting bigger anymore?

Basically, the world we live in is getting more extreme as it heats up at an ever-increasing rate. Given that these are the changes we’re seeing with a 0.8°C increase in global average temperatures – and that’s from carbon we burnt over a decade ago – how about we stop burning carbon with a little more urgency now?

Your odds just shortened – Aussie Heatwaves

Climate change has increased the chance of extreme heatwaves in Australia by more than five times.

WHO: Sophie C. Lewis and David J. Karoly, School of Earth Sciences and ARC Centre of Excellence for Climate System Science, University of Melbourne, Victoria, Australia

WHAT: Looking at how much human-caused climate change is pushing Australian summers into more extreme heat.

WHEN: July 2013

WHERE: Geophysical Research Letters (DOI: 10.1002/grl.50673), pre-publication release

TITLE: Anthropogenic contributions to Australia’s record summer temperatures of 2013 (subs. req)

There’s some interesting research happening at my alma mater Melbourne University these days (go Melbourne!). Even if you weren’t there to experience the extreme summer of 2012-13 in Australia, I’m sure you all remember the new colour the Australian Bureau of Meteorology had to create for weather maps that maxed out above 50°C, or maybe the new ‘catastrophic’ bushfire rating for climate-fuelled fires that are beyond just an extreme risk.

Extreme Heat in January 2013 (Bureau of Meteorology)

So, to answer that age-old question – ‘exactly how much have we messed this up?’ – these researchers took historical monthly weather patterns, modelled patterns with natural forcings only, and modelled patterns with both natural and human forcings, and matched them all up against what actually happened.

They looked at average monthly mean, maximum and minimum temperatures and found that monthly extremes are increasing faster than daily extremes – that is, extreme months are becoming more common faster than extreme days are.

The historical data they used for the experiment was from 1850 to 2005, with the baseline climate data (what they used as a reference for ‘normal’) being 1911-1940 because 30 years of weather data makes a climate!

They then created experiments with natural forcings only and with both natural and human forcings, and ran exciting statistical tools like a probability density function with a kernel smoothing function, which almost sounds like a super-cool popcorn maker.

To double-check for error, they used the second-hottest summer temperatures as their threshold, to make sure they could pick the human influence out of the randomness that can be the data – thereby deliberately making their findings conservative.

Once they’d run their fun popcorn-sounding numbers, they calculated the FAR – the Fraction of Attributable Risk – which is exactly what it sounds like: the fraction of the risk of something happening that can be attributed to a cause.
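To make that recipe concrete, here’s a minimal sketch of the general approach – kernel-smoothed probability density functions for the two ensembles, then FAR from the tail probabilities above a threshold. All the numbers below are synthetic stand-ins; the paper’s real ensembles come from climate model runs:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# Synthetic stand-ins for the two ensembles of summer-mean temperature
# anomalies (°C). The shift and spread are invented for illustration.
natural_only = rng.normal(0.0, 0.5, 1000)   # natural forcings only
all_forcings = rng.normal(0.6, 0.5, 1000)   # natural + human forcings

threshold = 1.0  # e.g. the second-hottest observed summer anomaly

def exceedance_probability(sample, threshold):
    """Smooth the sample into a PDF with a Gaussian kernel,
    then integrate the tail above the threshold."""
    kde = gaussian_kde(sample)
    return kde.integrate_box_1d(threshold, np.inf)

p_natural = exceedance_probability(natural_only, threshold)
p_human = exceedance_probability(all_forcings, threshold)

far = 1 - p_natural / p_human  # Fraction of Attributable Risk
print(f"P(natural) = {p_natural:.3f}, P(with humans) = {p_human:.3f}, FAR = {far:.2f}")
```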

So if our ‘bad guy’ is human-induced climate change, how much can we blame it for the Australian Angry Summer of 2012-13? Well, a fair bit.

When they compared the numbers, they had 90% confidence that human influences had made extreme heat 2.5 times more likely. When they used the 1976-2005 data and extended the model out to 2020, that rose to a five-fold increase in likelihood.

Extreme heat is ‘substantially more likely’ because of humans burning fossil fuels – which is pretty bold wording from research scientists: when there’s a greater than 90% chance of something, they say ‘very likely’ where most people would say ‘very certain’. In their research, events that should occur naturally once in 16 years were happening once in 6 years with the historical numbers, and once in 2 years with the model run out to 2020. Ouch – summer just got more uncomfortable more often.
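As a quick worked example of how those multipliers cash out: FAR = 1 − P(natural)/P(with human influence), so a 2.5-fold increase in likelihood gives FAR = 1 − 1/2.5 = 0.6, and a five-fold increase gives FAR = 1 − 1/5 = 0.8 – in other words, four-fifths of the risk of a summer like that is on us.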

MTSOfan, flickr

For me, the kicker came when the paper pointed out that the 2012-13 summer in Australia came in a La Niña year. Extreme heat events normally come with an El Niño year – La Niña years are cooler with more rain. So the fact that the Angry Summer occurred in a La Niña year is scary – sea surface temperatures were average or cooler in some places while at the same time the Bureau of Meteorology was scrambling for new map colours.

The paper concludes that their research supports ‘a clear conclusion that anthropogenic climate change had a substantial influence on the extreme summer heat over Australia’ and that these kinds of events are now five times as likely to occur. Welcome to summers on climate change?