Timing the Carbon Excursion

Trying to fine-tune the timeline for the Paleocene-Eocene mass extinction using fossilised mud

WHO: James D. Wright, Department of Earth and Planetary Sciences, Rutgers University
Morgan F. Schaller, Department of Earth and Planetary Sciences, Rutgers University; Department of Geological Sciences, Brown University, Providence, RI

WHAT: Trying to get a more precise timeline of the mass extinction at the Paleocene-Eocene thermal maximum (PETM).

WHEN: October 1st, 2013

WHERE: Proceedings of the National Academy of Sciences of the United States of America (PNAS), vol. 110, no. 40, October 2013.

TITLE: Evidence for a rapid release of carbon at the Paleocene-Eocene thermal maximum (subs req.)

My favourite accidental metaphor from this paper is when the researchers talked about a ‘carbon isotope excursion’ that caused the Paleocene-Eocene thermal maximum, which is scientist for carbon being released into the atmosphere causing a mass extinction. However, I couldn’t help thinking of a school bus full of carbon isotopes…

CO2 getting on a school bus??? (image: Hannah Holt, Lightbulb Books)


The Paleocene-Eocene thermal maximum (or PETM for short) occurred around 55.8 million years ago and is considered the best historical comparison to what we’re currently doing to the atmosphere with carbon pollution. There was rapid global warming of around 6°C, which was enough to cause a mass extinction, but deep-ocean fossilised carbon isotopes only pin down the timing of the carbon release to somewhere between less than 750 years and 30,000 years.

Obviously this is a huge window of time, and unless it can be further narrowed down, not very useful for humans trying to work out how long we have left to do something about climate change. Narrowing the time scale for the mass extinction is what these researchers tried to do.

Firstly, there are several ways the PETM mass extinction could have happened. The possibilities are: release of methane from reservoirs like permafrost or under the ocean, wildfires in peatlands, desiccation of the ocean (the air warming up enough that the ocean starts evaporating), or a meteor impact.

Without working out the timing of the extinction, we can’t work out which mechanism caused it. So these researchers went to an area of the USA called the Salisbury Embayment, which is modern-day Delaware but was shallow ocean 55 million years ago. Because that shallow ocean floor is now dry land, it preserves a more detailed record of ancient sediment, whereas sediment still under the ocean keeps getting reworked by currents and so only shows the bigger changes.

The researchers looked at sediment cores from Wilson Lake B and Millville, which have distinct and rhythmic layers of silty clay through the entire section that correspond to the carbon excursion into the atmosphere.


Layers of clay in the sediment cores (from paper)

The distinct layers you can see above occur every 1-3cm along the core which allows for a more precise chronology of the mass extinction.

First they ran tests for the amount of 18O (an oxygen isotope) and found consistent cycles of a 1.1‰ change through the core from Wilson Lake B. They also found a similar cycle in the Millville core. They then tried to work out what kind of time scale this could line up with. After running through a few ideas that didn’t fit with other evidence from the period, they worked out that the oxygen cycles were consistent with seasonal changes, which means the cores accumulated about 2cm of mud per year. This is excitingly precise! It also fits with what researchers know about the build-up of sediment in shallow areas – e.g. the Amazon River can accumulate 10cm/year of mud and clay, so 2cm per year fits the other evidence.
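If the layers really are annual, the depth-to-time conversion is simple enough to sketch (the 2cm/year rate is from the paper; the code and function name are just my illustration):

```python
# Back-of-envelope chronology: if the rhythmic couplets are annual and
# sediment accumulated at ~2 cm per year, depth along the core converts
# directly to elapsed time.

SEDIMENTATION_RATE_CM_PER_YR = 2.0  # rate worked out in the paper

def core_interval_to_years(top_m: float, bottom_m: float) -> float:
    """Convert a depth interval (metres) to elapsed years at a fixed rate."""
    thickness_cm = (bottom_m - top_m) * 100
    return thickness_cm / SEDIMENTATION_RATE_CM_PER_YR

# The sampled interval at the onset of the carbon release:
print(core_interval_to_years(273.37, 273.96))  # roughly 29.5 years of mud
```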

Next in the researchers’ whodunit, they sampled the beginning of the carbon release, which was at 273m along the core. To be precise, they sampled from 273.37 – 273.96m and tested for 13C (a carbon isotope).

They found the 13C values in the silt dropped to 3.9‰, then 1.45‰ and then to -2.48‰ over thirteen years. A drop in the ocean’s 13C values means carbon is being released into the atmosphere. The concentration of calcium carbonate (CaCO3), which is what shells are made of, also dropped from 6% to 1% within a single layer of sediment. So there was definitely something going on, but what exactly was it?

The beginning of the carbon release is the unknown bit, and it’s generally agreed that the recovery from the 6°C of global warming took 150,000 years. The sediment layers show that the release of carbon was almost instantaneous (it happened over 13 years) and that after the release it took between 200-2,000 years for the ocean and the atmosphere to reach a new equilibrium. This means that after the initial carbon release the atmosphere changed rapidly for the next 2,000 years before it reached the ‘new normal’. From there it took 150,000 years to go back to the original normal from before the carbon release.


Long road to recovery – PETM followed by a slow slope back to normal temps (from Hansen et. al PNAS 2013) bottom scale in millions of years.

Looking back at what could have produced this order of events, the only scenario that fits is methane release AND a meteorite. There was a recovery of carbon isotope values in the upper clay, which the researchers state indicates carbon moving from the atmosphere into the ocean, with more carbon exchange in the shallow areas than the deep ones. However, a meteorite alone (the carbon going from the atmosphere into the ocean) wouldn’t be enough to cause 6°C of global warming. It takes around 3,000 Gigatonnes (Gt) of carbon to change the atmosphere by that much, which is why the researchers think it was a meteorite plus methane release from the ocean that together add up to the 3,000Gt.

So what are the lessons for humanity in this great carbon excursion of mass extinction? Firstly, climate change can happen quite quickly. Secondly, once you’ve changed the atmosphere, it’s a long road back to the normal you had before you burned the carbon. Thirdly, it only takes 3,000Gt of carbon to cause a mass extinction.

The last UN IPCC report gave the planet a carbon budget of 1,000Gt of total emissions to keep humanity in a level of ‘safe’ climate change. Currently, the world has burned half of that budget and will blow through the rest of it within the next 30 years. If we burn enough carbon to start releasing methane from permafrost or the oceans, it’s a 150,000 year trek back from a possible mass extinction.

I guess it is true – those who don’t learn from history are doomed to repeat it. How about we learn from history and stop burning carbon?

 

Oct. 29th – caption for Cenozoic era graph changed for clarity (time scale is different as the graph is from a different paper).

Smoking Kills, so does Climate Change

A translation of the IPCC 5th Assessment Report Summary for Policymakers

WHO: The Intergovernmental Panel on Climate Change

WHAT: Summary for policy makers of their 2000 page 5th Assessment Report (AR5) of the state of the climate and climate science.

WHEN: 27 September 2013

WHERE: On the IPCC website

TITLE: Climate Change 2013: The Physical Science Basis Summary for Policymakers (open access)

There are a lot of things not to like about the way the IPCC communicates what they do, but for me the main one is that they speak a very specific dialect of bureaucrat that no-one else understands unless they’ve also worked on UN things and speak the same sort of acronym.

The worst bit of this dialect of bureaucrat is the words they use to describe how confident they are that their findings are correct. They probably believe they’re being really clear, but they use language that none of the rest of us would ever use, which means their findings make little sense without the ‘very likely = 90-100% certain’ footnote at the front.

So now that we’ve established that the UN doesn’t speak an understandable form of English, what does the report actually say? It works its way through each of the different climate systems and how they’re changing because humans are burning fossil fuels.

As you can see from this lovely graph, each of the last three decades has been noticeably warmer than the preceding one, and the IPCC are 66% sure that 1983-2012 was the warmest 30-year period in 1,400 years.

Decade by decade average temperatures (Y axis is change in Celsius from base year of 1950) (from paper)


One of the reasons I really like this graph is you can see how the rate of change is speeding up (one of the key dangers with climate change). From 1850 through to around 1980 each decade’s average is touching the box of the average before it, until after the 80s when the heat shoots up much more rapidly.

The report did have this dig for the deniers though: ‘Due to natural variability, trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends’. Which is UN bureaucrat for ‘when you cherry pick data to fit your denier talking points you’re doing it wrong’.

Looking at regional atmospheric trends, the report notes that while things like the Medieval Warm Period did have multi-decadal periods of change, these changes didn’t happen across the whole globe like the warming currently being measured.

In the oceans, the top layer (the top 75m) warmed by 0.11°C per decade from 1971 to 2010, and more than 60% of the extra energy we’ve trapped in the climate system since 1971 has been stored in the upper ocean, with another 30% stored in the ocean below 700m.

This extra heat is not just causing thermal expansion, it’s speeding up sea level rise, which the IPCC are 90% certain accelerated from an average of 1.7mm per year over 1901-2010 to 3.2mm per year between 1993 and 2010. This is faster than at any point in the past two millennia. Yes, sea level is rising faster than it has for the last 2,000 years, so you might want to sell your waterfront property sooner rather than later.

The extra carbon has also made it harder to live in the ocean if you own a shell, because the acidity of the ocean has increased by 26% which makes shells thinner and harder to grow.

On the glaciers and the ice sheets, the IPCC is 90% certain that the rate of melting from Greenland has increased from 34 Gigatonnes (Gt) of ice per year to 215Gt of ice per year after 2002. Yes, increased from 34Gt to 215Gt – it’s melting six times faster now, thanks to us.

For Antarctica, the IPCC is 66% certain that the rate of ice loss has increased from 30Gt per year to 147Gt per year, with most of that loss coming from the Northern Peninsula and West Antarctica. Worryingly, that net loss figure already includes the parts of Antarctica that are gaining ice due to natural variability.

And at the North Pole, Santa is going to have to buy himself and his elves some boats or floating pontoons soon, because the IPCC have found ‘multiple lines of evidence support[ing] very substantial Arctic warming since the mid-20th Century’. Sorry Santa!

As for the carbon we’ve been spewing into the atmosphere since the 1850s, well, we’re winning that race too! ‘The atmospheric concentrations of carbon dioxide, methane and nitrous oxide have increased to levels unprecedented in at least the last 800,000 years’. Congratulations humanity – in the last century and a half, we’ve changed the composition of the atmosphere so rapidly that this hasn’t been seen in 800,000 years!

Methane levels have gone up by 150%, and I’m undecided as to whether that means I should eat more beef to stop the cows from farting, or if it means we raised too many cows to be steaks in the first place…

This is the part of the report where we get into the one excellent thing the IPCC did this time around – our carbon budget. I’m not sure whether they realised that committing to reduce certain percentages by certain years from different baselines meant that governments were able to shuffle the numbers to do nothing and make themselves look good at the same time, but this is a promising step.

I’ve written about the very excellent work of Kevin Anderson at the Tyndall Centre in the UK before, but the basic deal with a carbon budget is this: it doesn’t matter when we burn the carbon or how fast, all that matters is the total emissions in the end. You can eat half the chocolate bar now, and half the chocolate bar later, but you’re still eating a whole bar.

Our budget to have a 2/3 chance of not going beyond dangerous climate change is 1,000Gt of carbon and so far we’ve burnt 545Gt, so we’re more than halfway there. All of this leads to the conclusion that ‘human influence on the climate system is clear. This is evident from the increasing greenhouse gas concentrations in the atmosphere, positive radiative forcing, observed warming and understanding of the climate system.’
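As a back-of-envelope check on how fast we’re eating through that budget (the 1,000Gt and 545Gt figures are from the report; the ~10Gt of carbon per year emission rate is my rough assumption, and real emissions are still growing, which shortens the timeline):

```python
# How long the remaining carbon budget lasts at a constant emission rate.
# Budget and amount-burned figures are from the IPCC report; the emission
# rate is an assumed ballpark, not a number from the report.

def years_remaining(budget_gt: float, burned_gt: float, rate_gt_per_yr: float) -> float:
    """Years left at a constant emission rate before the budget is spent."""
    return (budget_gt - burned_gt) / rate_gt_per_yr

print(years_remaining(1000, 545, 10))  # 45.5 years at a constant rate
```

And that’s the optimistic constant-rate version – if emissions keep growing, the budget runs out sooner.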

What observations, you may ask? Scientists have made progress on working out how climate change pumps up extreme weather and makes it worse. They also got it right for the frequency of extreme warm and cold days, which, if you live in North America, meant hot extremes beating cold extremes 10:1. Round of applause for science everyone!

Warming with natural forcing vs human forcing and how it lines up with the observations (from paper)


They’re also 95% sure that more than half of the observed global surface warming since 1951 is from humanity. So next time there’s a nasty heatwave that’s more frequent than it should be, blame humans.

The report does also point out that even though the heat records are beating the cold records 10-1, this doesn’t mean that snow disproves climate change (sorry Fox News!). There will still be year-to-year and decade-by-decade variability in how our planet cooks, and it won’t be the same across the whole planet. Which sounds to me like we’re being warmed in an uneven microwave. For instance, El Niño and La Niña will still be big influences over the Pacific and will determine to a great extent the variability in the Pacific North West (yeah, it’s still going to rain a lot, Vancouver).

For those that were fans of the film The Day After Tomorrow, there’s a 66% chance the Atlantic Meridional Overturning Circulation will slow down, but only a 10% chance it will undergo an abrupt change or collapse like it did in the film, so you’re not going to be running away from a flash-freezing ocean any time this century.

The report then runs through the different scenarios they’ve decided to model, ranging from ‘we did a lot to reduce carbon emissions’ to ‘we did nothing to reduce carbon emissions and burned all the fossil fuels’. Because this is the IPCC and they had to get EVERYONE to agree on each line of the report (I’m serious, they approved it line by line, which has to be the most painful process I can think of), the scenarios are conservative estimates that don’t account for tipping points (which are really hard to incorporate anyway). So their ‘worst case scenario’ is only 4.0°C of surface warming by 2100.

Representative Concentration Pathway (RCP) Scenarios from the IPCC AR5


Now, obviously ‘only’ 4°C of climate change by the end of the century is still pretty unbearable. There will still be a lot of hardship, drought, famine, refugee migration and uninhabitable parts of the planet with 4°C. However, getting to 4°C is likely to have triggered tipping points like methane release from permafrost, so 4°C would just be a waypoint on the road to 6°C even if we ran out of carbon to burn. And 6°C of course, as you all hear me say frequently, is mass extinction time. It’s also the point at which, even if humanity did survive, you wouldn’t want to live here anymore.

The paper finishes up with a subtle dig at the insanity of relying on geoengineering, pointing out that trying to put shade reflectors into orbit or artificially suck carbon out of the air has a high chance of going horribly wrong. They also point out that if we did manage large-scale geoengineering and it then broke down, the re-released carbon going back into the atmosphere would super-cook the planet really quickly.

The moral of this 36 page ‘summary’ is that it’s us, guys. We’re as certain that we’ve done this as we are that smoking causes cancer. We have burned this carbon and it’s changed the composition and energy balance of the atmosphere, and if we don’t stop burning carbon we’re going to cook the whole planet. Seriously. So let’s finally, actually stop burning carbon.

Nemo the Climate Refugee

If you collected all the recent research on marine species and climate change, could you see a pattern of fish and marine species migration?

WHO: Elvira S. Poloczanska, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia
Christopher J. Brown, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia, School of Biological Sciences, The University of Queensland, Australia
William J. Sydeman, Sarah Ann Thompson, Farallon Institute for Advanced Ecosystem Research, Petaluma, California, USA
Wolfgang Kiessling, Museum für Naturkunde, Leibniz Institute for Research on Evolution and Biodiversity, Berlin, Germany, GeoZentrum Nordbayern, Paläoumwelt, Universität Erlangen-Nürnberg, Erlangen, Germany
David S. Schoeman, Faculty of Science, Health and Education, University of the Sunshine Coast, Maroochydore, Queensland, Australia, Department of Zoology, Nelson Mandela Metropolitan University, Port Elizabeth, South Africa
Pippa J. Moore, Centre for Marine Ecosystems Research, Edith Cowan University, Perth, Western Australia, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth UK
Keith Brander, DTU Aqua—Centre for Ocean Life, Technical University of Denmark, Charlottenlund Slot, Denmark
John F. Bruno, Lauren B. Buckley, Department of Biology, The University of North Carolina at Chapel Hill, North Carolina, USA
Michael T. Burrows, Scottish Association for Marine Science, Scottish Marine Institute, Oban, UK
Johnna Holding, Department of Global Change Research, IMEDEA (UIB-CSIC), Instituto Mediterráneo de Estudios Avanzados, Esporles, Mallorca, Spain
Carlos M. Duarte, Department of Global Change Research, IMEDEA (UIB-CSIC), Instituto Mediterráneo de Estudios Avanzados, Esporles, Mallorca, Spain, The UWA Oceans Institute, University of Western Australia,
Benjamin S. Halpern, Carrie V. Kappel, National Center for Ecological Analysis and Synthesis, Santa Barbara, California, USA
Mary I. O’Connor, University of British Columbia, Department of Zoology, Vancouver, Canada
John M. Pandolfi, Australian Research Council Centre of Excellence for Coral Reef Studies, School of Biological Sciences, The University of Queensland, Australia
Camille Parmesan, Integrative Biology, Patterson Laboratories 141, University of Texas, Austin, Texas Marine Institute, A425 Portland Square, Drake Circus, University of Plymouth, Plymouth, UK
Franklin Schwing, Office of Sustainable Fisheries, NOAA Fisheries Service, Maryland, USA
Anthony J. Richardson, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia Centre for Applications in Natural Resource Mathematics (CARM), School of Mathematics and Physics, University of Queensland, Australia

WHAT: A review and synthesis of all available peer reviewed studies of marine species changing under climate change.

WHEN: 4 August 2013

WHERE: Nature Climate Change, August 2013

TITLE: Global imprint of climate change on marine life (subs req.)

This paper, with its laundry list of collaborating authors, must have had an awesome ‘we got published’ party. Then again, when you think about what they did – all that data would have taken forever to number-crunch, so it’s a good thing it was all hands on deck.

So what were they looking at? They were trying to work out if you can see the fingerprint of climate change in the distribution changes of marine species. And to do that, they looked at all the available studies in the peer reviewed literature that were looking at expected changes for fish and other species in the ocean with climate change. Then, they lined up the predictions with the observed results to see what happened, and it turns out we’ve got some frequent travelling fish.

After getting all the studies together, the researchers had 1,735 different observations for everything from phytoplankton to zooplankton to fish and seabirds from 208 studies of 857 different species. They used all of the data they had which included the changes that lined up with climate change projections, the ones that had no changes and the ones that had unexpected changes.

Global marine migration (from paper)


Ocean currents make it easier for marine species to travel long distances than it is for plants and animals on land – there’s only so far a seed can travel from the tree with the wind, after all. However, in this research they found that the average distance of range expansion for marine species was 72km/decade (±13.5km). This doesn’t sound like a lot to a human, but it’s an order of magnitude further than land-based migration averages, and it’s a long way for a mollusc or a starfish to go.

The species chalking up the most frequent flier points were phytoplankton which have been moving 469.9km/decade (±115km) followed by the fish who have been moving 227.5km/decade (±76.9km). Of the 1,735 observations, a whopping 1,092 were moving in the directions expected by climate change.

For each species migration, the researchers looked at what the expected decadal rates of ocean temperature change would have been in the area and found that some groups move early, some wait longer, others are falling behind.

For example, in the Bering Sea (where the Discovery Channel show ‘The Deadliest Catch’ is set), many species rely on the really cold water that stays below 2°C and separates the Arctic and subarctic animals. This cold pool of water has been moving further north as Arctic sea ice melts, but the responses by species are varied. Some are at the leading edge and move early, others don’t. The researchers think this is related to population size, ability to migrate, dependence on habitat (remember how Nemo’s dad didn’t want to leave the reef?), competition for food and other factors.

Clownfish (Wikimedia commons)


I guess it’s similar to when a natural disaster happens in a human area and some families leave, others rebuild and it’s for a whole complicated list of reasons like family, jobs, resources and more. Anyway, back to the fish.

The researchers tested their data for a climate change fingerprint globally. They used a binomial test against 0.5 – the result you would expect if these changes in location were just random variability – and from their test, 83% of the changes had climate change as the dominant driving force.

If they limited their data only to studies that were multi-species, 81% of the changes were still driven by climate change. They ran the data to exclude every bias they could think of and still concluded that it provided ‘convincing evidence that climate change is the primary driver behind the observed biological changes’.
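For the statistically curious, a simple version of that binomial test can be sketched in a few lines of Python. This is the textbook sign test against a 50/50 null, not the authors’ exact analysis (which controls for a lot more), but it shows why 1,092 out of 1,735 observations moving the predicted way is nothing like chance:

```python
# Two-sided binomial (sign) test: if range shifts were random, each
# observation would have a 50/50 chance of moving in the direction climate
# change predicts. Exact arithmetic via Fraction avoids float underflow.
from fractions import Fraction
from math import comb

def binomial_sign_test(consistent: int, total: int) -> float:
    """Two-sided binomial test p-value against a 50/50 chance null."""
    k = max(consistent, total - consistent)  # tails are symmetric for p = 0.5
    tail = Fraction(sum(comb(total, i) for i in range(k, total + 1)), 2 ** total)
    return float(min(Fraction(1), 2 * tail))

print(binomial_sign_test(1092, 1735))  # vanishingly small p-value
```

With a p-value that tiny, “random variability” is off the table as an explanation.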

Binomial test results – if you get 0.5 it’s a random occurrence – this migration is climate caused (from paper)


So there you have it – climate refugees aren’t just land based. Nemo’s going to have to move too.

Vote for last week’s paper!

climate voter

Remember how I was excited about the possibilities of scaling up the carbon sequestration process outlined in last week’s post from the Proceedings of the National Academy of Sciences in the USA?

Turns out you can vote for it!

I had an email from the lead author of the paper (I send my blog posts to the lead authors when I post them) letting me know that their process has made the finalists of two MIT Climate CoLab ideas. So if you’re excited about the idea of feasibly sequestering carbon dioxide from the oceans being tested out as well, you can vote for them.

The first proposal is for the Geoengineering section called ‘Saving the Planet v2.0‘. The second proposal is for the Electric power sector section called ‘Spontaneous Conversion of Power Plant CO2 to Dissolved Calcium Bicarbonate‘.

Climate CoLab is an online space where people work to try and crowdsource ideas for what to do about climate change. The contest voting closes in 11 days (August 30th) and the winning proposals will be presented at the Crowds & Climate Conference at MIT in November.

So if it takes your fancy, and you’d like to see this project presented at the conference, go forth and vote!

 

Disclosure: I am not affiliated with either the paper or the MIT CoLab project.

Antacid for our Oceans

An electrolysis method that removes CO2 from seawater could be affordably scaled up for commercial carbon sequestration.

WHO: Greg H. Rau, Institute of Marine Sciences, University of California, Santa Cruz, Physical and Life Sciences, Lawrence Livermore National Laboratory, Livermore, CA
Susan A. Carroll, William L. Bourcier, Michael J. Singleton, Megan M. Smith, and Roger D. Aines, Physical and Life Sciences, Lawrence Livermore National Laboratory, Livermore, CA

WHAT: An electrochemical method of sequestering CO2 from sea water using silicate rocks.

WHEN: June 18, 2013

WHERE: Proceedings of the National Academy of Sciences (USA), PNAS vol. 110, no. 25

TITLE: Direct electrolytic dissolution of silicate minerals for air CO2 mitigation and carbon-negative H2 production (open access)

This paper was fun – I got to get my chemistry nerd back on thinking in moles per litre and chemical equations! It almost made me miss university chemistry lectures.

No, not those moles per litre! (IFLS facebook page)


So what do chemistry jokes have to do with carbon sequestration? It’s looking increasingly like humanity is going to have to figure out ways to draw carbon out of the atmosphere or the oceans, because we’ve procrastinated on reducing our carbon emissions for so long.

There are two options for this – you can either create a chemical reaction that will draw CO2 out of the air, or one that will draw CO2 out of a solution, and given how quickly the oceans are acidifying, using sea water would be a good idea. The good news is: that’s exactly what these researchers did!

Silicate rock (mostly basalt) is the most common rock type in the Earth’s crust. It also reacts with CO2 to form stable carbonate and bicarbonate solids (like the bicarbonate of soda you bake with). Normally this happens very slowly through rock weathering, but what if you sped it up and used it as a process to sequester CO2?
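For the chemistry-inclined, the classic silicate weathering reaction looks like this (using olivine as an example silicate – this is the textbook weathering reaction, not the paper’s exact cell chemistry):

Mg2SiO4 + 4CO2 + 4H2O → 2Mg²⁺ + 4HCO3⁻ + H4SiO4

One silicate mineral plus four molecules of CO2 ends up as dissolved magnesium and bicarbonate ions – the CO2 gets locked away as bicarbonate instead of floating around as a greenhouse gas.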

The researchers created a saline water electrolytic cell to test it out. An electrolytic cell is the one you made in school where you had an anode and a cathode and two different solutions (generally), and when you put an electric current through it you created a chemical reaction. What these researchers did was put silicate minerals, saline water and CO2 in on one side, and when they added electricity they got bicarbonates, hydrogen, chlorine or oxygen, silicates and salts.

A basic schematic of the experiment (from paper)


The researchers used an acid/base reaction (remember those from school?!) to speed up the silicate and CO2 reaction, which also works well in an ocean because large differences in pH are produced in saline electrolysis. Are you ready to get really nerdy with me? The chemical equation is this:

Chemical equation for the experiment (from paper)


So how did the experiment go? It worked! They successfully sequestered carbon dioxide with an efficiency of 23-32%, capturing 0.147g of CO2 per kilojoule (kJ) of electricity used.

There are issues around the scaling up of the reaction of course – once the bicarbonate has been created, where do you store it? The paper suggested ocean storage as the bicarbonate solids would be inert (un-reactive). I would hope that a re-use option could be found – has anyone looked into using bicarbonate solids as an eco-building material?

There’s also the issue of needing to power the reaction with electricity. If scaled up, this process would have to be powered by renewable energy, because burning carbon to sequester carbon nets you zero.

Also, if sea water is used, the main by-product is Cl2 so the researchers point out that while it would be feasible to do this process directly in the ocean, the issue of what to do with all that chlorine would need to be dealt with. The paper suggests using oxygen selective anodes in the electrolysis, or ion-selective membranes around the reaction to keep the chlorine separate from the ocean.

That being said, there are some exciting upsides to this process. The paper points out that the amount of silicate rock in the world ‘dwarf[s] that needed for conversion of all present and future anthropogenic CO2.’ Also, using sea water is an easier way to sequester CO2 rather than air-based methods.

Scaling the method up is economically feasible too. The researchers estimated that 1.9 MWh (megawatt hours) of energy would be needed per metric tonne of CO2 sequestered. If the waste hydrogen from the process were sold as hydrogen fuel for fuel cells, the price of CO2 sequestered would be $86/tonne. If the hydrogen fuel wasn’t feasible, it would still only be $154/tonne, which compares very favourably to most current carbon capture and storage feasibility estimates of $600-$1000/tonne of CO2.
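Those numbers hang together, too: converting the 0.147g of CO2 per kJ from the experiment into energy per tonne gets you right back to the 1.9 MWh figure (a quick unit conversion of the paper’s numbers, my arithmetic not theirs):

```python
# Unit-conversion sanity check: grams of CO2 per kJ -> MWh per tonne of CO2.
G_PER_KJ = 0.147      # grams of CO2 sequestered per kJ (from the paper)
KJ_PER_MWH = 3.6e6    # 1 MWh = 3.6e9 J = 3.6e6 kJ

tonnes_per_mwh = G_PER_KJ * KJ_PER_MWH / 1e6  # grams -> metric tonnes
mwh_per_tonne = 1 / tonnes_per_mwh
print(round(mwh_per_tonne, 2))  # ~1.89, matching the paper's 1.9 MWh/tonne
```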

So, like an antacid for the oceans, if this process can be scaled up commercially through research and development, we could have an effective way to not only capture and store carbon, but also reduce ocean acidification. A good-news story indeed – now we just need to stop burning carbon.

Extreme Overachiever Decade 2001-2010

The World Meteorological Organisation global climate report for last decade shows it winning all the most extreme awards.

WHO: The World Meteorological Organisation (WMO) in conjunction with these international experts and meteorological and climate organisations.

WHAT: The Global Climate Report for the decade 2001-2010

WHEN: July 2013

WHERE: Online at the WMO website

TITLE: The Global Climate 2001-2010 A Decade of Climate Extremes (open access)

The World Meteorological Organisation (WMO) recently released their wrap-up of the last decade’s worth of global weather-related data. Now, as you will all remember, climate is the long-term trend of the weather (generally over a 30 year period), but it’s also important to keep track of decade-to-decade trends, if only because 10 is a nice round number to deal with.

So what did the last decade’s report card look like? Turns out 2001-2010 was an overachiever when it came to records and extremes, which is bad news for all of us, really. The last decade was the warmest on record for overall temperatures and the speed of warming has amped up as well. The long-term warming trend is 0.062°C/decade, but since 1970 it’s sped up to 0.17°C/decade.

Decade by decade temperature record (from report)


If you only look at land surface temperatures, 2007 held the record for hottest at 0.95°C above average, with 2004 the 'least warm' (there are no long-term cold records happening here) at 0.68°C above average.

If you look at sea surface temperatures, 2003 wins with 0.40°C above average and 2008 was the least warm at 0.26°C above average. The warmest year in the Northern Hemisphere was 2007 at 1.13°C above average, and in the Southern Hemisphere 2005 wins with 0.67°C.

When it comes to rain, it's a mixed bag. There were places that got more rain, there were places with drought, and there were more extremes. South America was wetter than normal, Africa was drier than normal. Overall, 2002 was the driest year of the decade and 2010 was the wettest. The problem with rain patterns shifting under climate change is that both the location and the timing change. Instead of slow soaking rains, it's extremes of dry spells followed by flash flooding.

The patterns of El Niño and La Niña switched back quite a lot during the decade, with El Niño generally creating warmer trends and La Niña creating cooler trends.

El Niño and La Niña trends for sea surface temperatures (from report)


To qualify as an extreme event, an event needs to result in 10 or more deaths, 100 or more people affected, a declaration of a state of emergency and a call for international assistance, which I think is a pretty high bar. But of course, since the last decade was an overachiever, there were 400 disasters of this scale, killing more than 1 million people.

Weather disasters represented 88% of these extreme events, and the damage from them has increased significantly, along with a 20% rise in casualties from the previous decade. The extra casualties came from some extreme increases in certain categories, like heatwaves: in 1991-2000, 6,000 people died from heatwaves; in 2001-2010 that jumped to 136,000 people.

The price of extreme weather events has also gone up, with 7,100 hydrometeorological events carrying a price tag of US$1 trillion and resulting in 440,000 deaths over the decade. It's also estimated that 20 million people worldwide were displaced, so this was probably our first decade with a sizable number of climate refugees. Internal displacement will be one of the biggest factors, as people move away from the more extreme parts of their country to the places where it still rains (e.g. from Arizona to Oregon).

Tropical storms were a big issue, with the report noting 'a steady increase in the exposure of OECD countries [to tropical cyclones] is also clear'. It's nice to see them point out that extreme weather isn't just a developing-world problem caused by a lack of infrastructure; the flooding in Germany and Australia last decade showed that developed countries are exposed too.

There was also a special shout-out to my homeland of Australia for the epic heatwave of 2009, during which I experienced 46°C in Melbourne and can attest to it not being fun. Of course, that epic heatwave was beaten by 2013's new extreme map colour. I also noticed that Australia copped pretty much all of the extreme weather effects over the decade. Ouch.

Australia – can’t catch a break (compiled from report)


Even in the category of 'coldwaves', where the Northern Hemisphere saw an increase in freak snowstorms, the average winter temperature was still 0.52°C warmer than average.

The last decade also set lots of records in the cryosphere (the frozen part of the planet). 2005-2010 saw the five lowest Arctic sea ice extents on record, with both extent and volume declining at a disturbingly rapid rate in what is commonly known as the Arctic Death Spiral. There has been an acceleration of mass loss from the Greenland and Antarctic ice sheets, and glaciers are shrinking worldwide.

Arctic Death Spiral (from report)


The World Glacier Monitoring Service described the glacier melt rate and cumulative loss as 'extraordinary' and noted that glaciers are currently so far from their equilibrium state that even without further warming they're still going to keep melting. Oh, and last decade won the record for loss of snow cover too.

Declining snow cover (from report)


Sea level rose faster last decade, at a rate of 3 mm/yr, double the historical average of 1.6 mm/yr. Interestingly, sea level rise is not even across the globe, due to ocean circulation and mass. In a La Niña year, the Western Pacific Ocean can be 10-20 cm higher than the decade's average, but there's only one way sea levels are going as the water warms and expands and the ice sheets melt – up.

Sea level rise (from report)


Finally, if all of that wasn't enough bad news for you, the report looked at gas concentrations in the atmosphere and found (surprise, surprise) that CO2 is up, accounting for 64% of the increase in radiative forcing (making our planet warmer) over the past decade and 81% of the increase over the last five years. Methane is responsible for 18% of the increase and nitrous oxide chips in 6%.

Does it make anyone feel better if I tell you the hole in the ozone layer isn’t getting bigger anymore?

Basically, the world we live in is getting more extreme as it heats up at an ever-increasing rate. Given that these are the changes we're seeing with a 0.8°C increase in global average temperatures, and that's from carbon we burnt over a decade ago, how about we stop burning carbon with a little more urgency now?

Drought – worse than we thought

Inconsistencies in drought models that don't account for sea surface temperature changes mean that drought in a climate-changed world could be worse than predicted.

WHO: Aiguo Dai, National Center for Atmospheric Research, Boulder, Colorado, USA

WHAT: Looking at the impact of sea surface temperature variability on drought

WHEN: January 2013

WHERE: Nature Climate Change, Vol. 3, No. 1, January 2013

TITLE: Increasing drought under global warming in observations and models (open access)

Climate deniers love it when the models are slightly wrong for predicting future climate changes, and believe me, I’d love it if climate change weren’t so verifiably real and we could all retire and live la dolce vita.

However, that's not reality, and in the case of this paper, where the model doesn't quite line up with the observed changes, it's because things are worse than we previously thought. Oh dear.

Global warming since the 1980s has contributed to an 8% increase in drought-ridden areas in the 2000s. It's led to things like diminished corn crops and the steady draining of underground aquifers in the USA, much of which is currently experiencing persistent drought. The letter L on the map below stands for long-term drought.

Long term drought in the Southwest of the USA (from US Drought Monitor)


What's that got to do with climate models? Well, while the droughts in Southern Europe or my homeland of Australia are due to a lack of rain drying things out, drought can also come from increased evaporation under warmer air temperatures, which the models don't fully take into account.

These droughts are harder to measure because they're related to sea surface temperature changes that take decades and can be hard to identify as a human-forced signal rather than just natural variation. So this researcher compared sea surface temperatures with drought predictions and observed warming to try to work out what is going on.

Predicted changes in soil moisture globally for 1980–2080 (black dots are where 9 out of 11 models agree on data) (from paper)


There were two areas where the models differed from the observed changes – the Sahel area in Africa and the USA.

In the Sahel, the models predicted warming in the North Atlantic Ocean, which would have meant increased rain. What actually happened was large warming in the South Atlantic relative to the North Atlantic, plus steady warming over the Indian Ocean, which meant less rain, not more. Similarly for the USA, the models missed the influence of the Pacific Multidecadal Oscillation, which isn't known to be driven by human climate forcing but switched to a warm phase on the back of above-normal sea surface temperatures.

Top: Observed sea surface temperatures. Bottom: predicted sea surface temperatures (from paper)


These sea surface variations that were missed in some of the previous models have some obvious consequences for planning for the slow pressure cooker of stress that drought is on anyone living through it, let alone trying to make a living from agriculture.

The researcher noted that there were also some differences from the models when looking at sulphate aerosols; however, for the 21st century the signal from greenhouse gases will be much stronger than the one from aerosols, so it shouldn't mess with the data too much.

So what does this all mean? Well, it means that there are both regional and broader trends for drought in a changed climate. The broad patterns are fairly stable ‘because of the large forced trend compared with natural variations’, which is scientist for humans are making a large enough mess out of this to see the evidence clearly.

The paper ends quite bluntly, stating that after the simulations were re-worked to take into account the new sea surface temperature data and other variables, it's only more bad news.

It’s likely to be ‘severe drought conditions by the late half of this century over many densely populated areas such as Europe, the eastern USA, southeast Asia and Brazil. This dire prediction could have devastating impacts on a large number of the population if the model’s regional predictions turn out to be true.’

Yes, a researcher actually used the word ‘dire’ in a scientific paper. Oh, and this was with an intermediate emissions scenario, not the business as usual path we’re currently all on. How about we all agree to stop burning carbon now?

Greenland Whodunit

“The next 5–10 years will reveal whether or not [the Greenland Ice Sheet melting of] 2012 was a one off/rare event resulting from the natural variability of the North Atlantic Oscillation or part of an emerging pattern of new extreme high melt years.”

WHO: Edward Hanna, Department of Geography, University of Sheffield, Sheffield, UK
 Xavier Fettweis, Laboratory of Climatology, Department of Geography, University of Liège, Belgium
Sebastian H. Mernild, Climate, Ocean and Sea Ice Modelling Group, Computational Physics and Methods, Los Alamos National Laboratory, USA, Glaciology and Climate Change Laboratory, Center for Scientific Studies/Centre de Estudios Cientificos (CECs), Valdivia, Chile
John Cappelen, Danish Meteorological Institute, Data and Climate, Copenhagen, Denmark
Mads H. Ribergaard, Centre for Ocean and Ice, Danish Meteorological Institute, Copenhagen, Denmark
Christopher A. Shuman, Joint Center for Earth Systems Technology, University of Maryland, Baltimore, USA,  Cryospheric Sciences Laboratory, NASA Goddard Space Flight Center, Greenbelt, USA
Konrad Steffen, Swiss Federal Research Institute WSL, Birmensdorf, Switzerland, Institute for Atmosphere and Climate, Swiss Federal Institute of Technology, Zürich, Switzerland, Architecture, Civil and Environmental Engineering, École Polytechnique Fédéral de Lausanne, Switzerland
 Len Wood, School of Marine Science and Engineering, University of Plymouth, Plymouth, UK
Thomas L. Mote, Department of Geography, University of Georgia, Athens, USA

WHAT: Trying to work out the cause of the unprecedented melting of the Greenland Ice Sheet in July 2012

WHEN: 14 June 2013

WHERE: International Journal of Climatology (Int. J. Climatol.) Vol. 33 Iss. 8, June 2013

TITLE: Atmospheric and oceanic climate forcing of the exceptional Greenland ice sheet surface melt in summer 2012 (subs req.)

Science can sometimes be like being a detective (although I would argue it's cooler) – you've got to look at a problem and work out how it happened. These researchers set out to do exactly that: to work out how the hell 98.6% of the ice sheet on Greenland started melting last summer.

Greenland – July 8th on the left only half melting. July 12th on the right, almost all melting (Image: Nicolo E. DiGirolamo, SSAI/NASA GSFC, and Jesse Allen, NASA Earth Observatory)


For a bit of context, Greenland is the kind of place where the average July temperature in the middle of summer is 2°C, and the average summer temperature at the summit of the ice sheet is -13.5°C. Brrr – practically beach weather! So there's got to be something weird going on for the ice sheet to start melting like that. Who are the suspects?

Atmospheric air conditions
Suspect number one is the atmospheric air conditions. The summer of 2012 was strongly influenced by 'dominant anti-cyclonic conditions', where warm southerly air moves north and results in warmer, drier conditions. There was also a highly negative North Atlantic Oscillation (NAO), which created high temperatures at high altitudes, around 4 km above sea level, which could explain the melting at the summit. The researchers also pointed out that the drier conditions meant less cloud cover and more sunny days, contributing to speedier melting.

There were issues with the polar jet stream that summer, where it got 'blocked' and stuck over Greenland for a while. The researchers used the Greenland Blocking Index (GBI), which, while not trading on the NYSE, does tell you about anomalies in geopotential height (yeah, lots of meteorological words in this paper!). All of this makes the atmosphere look pretty guilty.

Jet stream getting funky – temperature anomaly patterns at 600 hectopascals pressure, aka 4000m above sea level with a big red blob over Greenland (from paper)


Sea surface temperatures
Suspect number two is sea surface temperatures. If the ocean were warmer, that could have created conditions where the ice sheet melted faster, right? The researchers ran a simulation of the conditions around Greenland for the summer of 2012 and then played around with different sea surface temperatures and salinity levels. It made no more than 1% difference, so they don't think sea surface temperatures were the culprit. Also, ocean temperatures change more slowly than air temperatures (that's why the ocean is still so cold even in the middle of summer!), and when they looked at the data, sea surface temperatures were actually a bit cooler in 2012 than in 2011. Not guilty, sea surface temperatures.

Sea surface temperatures (top) and salinity (bottom) both decreasing (from paper)


Cloud patterns
Suspect number three is cloud cover patterns, which the researchers said could be a contributor to the ice sheet melting. However, they don't have a detailed enough data set to work out how much clouds could have contributed to the melt. Not guilty for now, clouds – due to lack of evidence.

Which leaves suspect number one – atmospheric air conditions. Guilty! Or, as the paper says ‘our present results strongly suggest that the main forcing of the extreme Greenland Ice Sheet surface melt in July 2012 was atmospheric, linked with changes in the summer NAO, GBI and polar jet stream’.

Now comes the scary part – it's the atmosphere that we've been conducting an accidental experiment on over the last 200 years by burning fossil fuels. As the researchers point out, the North Atlantic Oscillation has natural variability and patterns, so we could all cross our fingers and hope that the Greenland melting was a one-off anomaly. Given the work Dr Jennifer Francis has been doing at Rutgers on polar ice melt and how it slows the jet stream and causes it to meander, this may not be a good bet. Combine this with the fact that this level of melting is well beyond 'the most pessimistic future projections' and it gets scarier still. This kind of melting was not supposed to occur until 2100, or 2050 in the worst-case scenarios.

Interestingly, this could also link to some of the work Jason Box is doing with his DarkSnow project in Greenland, looking at how soot from fires and industry is affecting the melting. The paper concludes that the next 5-10 years will show us whether 2012 was a one-off or the beginning of a new normal. Unless we stop burning carbon, it will only be the start of a terrifying new normal.

Much ado about phosphorus

‘Life can multiply until all the phosphorus has gone and then there is an inexorable halt which nothing can prevent’ – Isaac Asimov, 1974

WHO: K. Ashley, D. Mavinic, Department of Civil Engineering, Faculty of Applied Science, University of British Columbia, Vancouver, BC, Canada
D. Cordell, Institute for Sustainable Futures, University of Technology, Sydney, Australia

WHAT: A brief history of phosphorus use by humans and ideas on how we can prevent the global food security risk of ‘Peak Phosphorus’

WHEN: 8 April 2011

WHERE: Chemosphere Vol. 84 (2011) 737–746

TITLE: A brief history of phosphorus: From the philosopher’s stone to nutrient recovery and reuse (subs req.)

Phosphorus can be found on the right-hand side of your periodic table, one row down, directly underneath nitrogen. It's one of those funny elements that we all need to live, survive and grow things, but it's also highly reactive, very explosive and toxic.

It's in our DNA – while the A, G, C and T bases pair up to form the rungs of the double helix, the sides of the ladder are held together by phosphodiester bonds. Phosphorus is literally helping to hold us together.

Phosphodiester bonds in DNA (from Introduction to DNA structure, Richard B. Hallick, U Arizona)


Phosphorus can be pretty easily extracted from human urine, which is what German alchemist Henning Brandt did in the 1660s in an attempt to create the Philosopher's Stone, which would be able to turn base metals into gold. No seriously – apparently he was committed enough to the idea to distill 50 buckets of his own pee!

What do alchemy, DNA, and human pee have to do with a scientific paper? Well these researchers were looking at how we’ve previously used phosphorus, why it is that we’re now running out of it and what we can learn from history to try and avoid a global food security risk.

Phosphorus comes in three forms – white, black and red. The phosphorus mined for fertilizer today is apatite rock containing P2O5, which generally took 10-15 million years to form. However, in traditional short-term human thinking, the fact that the rocks take that long to form didn't stop people from mining them and treating phosphorus as an 'endless' resource (just like oil, coal, forests, oceans etc.).

The paper states that phosphorus was originally used for 'highly questionable medicinal purposes' but doesn't detail what kinds of wacky things it was used for (boo!). Given the properties of white phosphorus – highly reactive and flammable when exposed to air, prone to spontaneous combustion, and poisonous to humans – the mind boggles as to what 'medicinal' uses it had.

The major use of phosphorus is as an agricultural fertilizer, which pre-industrialisation was achieved through the recycling of human waste and sewage. However, with 2.5 million people living in Victorian-era London, the problem of excess human waste became unmanageable and led to all kinds of nasty things, like cholera and the 'Great Stink' of the Thames in 1858, which was so bad it shut down Parliament.

This led to what was called the ‘Sanitary Revolution’ aka the invention of flush toilets and plumbing on a large scale. This fundamentally changed the phosphorus cycle – from a closed loop of localised use and reuse to a more linear system as the waste was taken further away.

After the Second World War, the use of mined mineral phosphorus really took off – its use as a fertilizer rose sixfold between 1950 and 2000 – and modern agricultural processes are now dependent on phosphorus-based fertilizers. This has led to major phosphorus leakage into waterways and oceans via agricultural runoff, creating eutrophication and ocean dead zones.

Eutrophication in the sea of Azov, south of the Ukraine  (SeaWiFS Project, NASA/Goddard Space Flight Center, and ORBIMAGE)


The problem here is that we've switched from a closed-loop system, where the waste from the farmhouse goes onto the farmyard and all the phosphorus gets recycled, to a linear system where the phosphorus gets mined, used as fertilizer, and then largely runs off into the ocean. It's not even a very efficient system – only a fifth of the phosphorus mined for food production actually ends up in the food we eat.

The problem we're now facing is the long-term ramification of this new system: phosphorus has become a scarce global resource, and we've been forced to start mining rocks with lower-quality phosphorus, higher rates of contaminants and more difficult access. We're down to the tar sands equivalent of minable phosphorus, most of which is found in only five countries: Morocco, China, the USA, Jordan and South Africa. Maybe they can be the next OPEC-style cartel, for phosphorus?

Peak phosphorus is likely to happen somewhere between 2030 and 2040, which is where the scary link to climate change comes in. The researchers cheerfully call phosphorus shortage the 'companion show-stopper to climate change', by which they mean that soils will start to run out of the nutrients they need at about the same time that extended droughts from climate change are diminishing crop yields and we're scrambling to feed about 9 billion people.

Basically, a phosphorus shortage is something we can easily avoid through better and more efficient nutrient recycling, but it's something that will kick us in the ass once we're already struggling to deal with the consequences of climate change. The paper states that we need to start re-thinking our 'western style' of sewage treatment to better recover water, heat, energy, carbon, nitrogen and phosphorus from our waste systems. Thankfully, this doesn't mean having to return to a middle-ages style of living – it means having cities that are innovative about their municipal systems (I was surprised to find out that sewage treatment is one of the most expensive and energy-intensive parts of public infrastructure).

The False Creek Neighbourhood Energy Utility in Vancouver


In Vancouver, we’re already starting to do that with the waste cogeneration system at Science World and the False Creek Neighbourhood Energy Utility that produces energy from sewer heat.

It’s pretty logical; we need to re-close the loop on phosphorus use and we need to do it sensibly before our failure to stop burning carbon means ‘Peak Phosphorus’ becomes the straw that breaks the camel’s proverbial back.

Pandora’s Permafrost Freezer

What we know about permafrost melt is less than what we don’t know about it. So how do we determine the permafrost contribution to climate change?

WHO: E. A. G. Schuur, S. M. Natali, C. Schädel, University of Florida, Gainesville, FL, USA
B. W. Abbott, F. S. Chapin III, G. Grosse, J. B. Jones, C. L. Ping, V. E. Romanovsky, K. M. Walter Anthony University of Alaska Fairbanks, Fairbanks, AK, USA
W. B. Bowden, University of Vermont, Burlington, VT, USA
V. Brovkin, T. Kleinen, Max Planck Institute for Meteorology, Hamburg, Germany
P. Camill, Bowdoin College, Brunswick, ME, USA
J. G. Canadell, Global Carbon Project CSIRO Marine and Atmospheric Research, Canberra, Australia
J. P. Chanton, Florida State University, Tallahassee, FL, USA
T. R. Christensen, Lund University, Lund, Sweden
P. Ciais, LSCE, CEA-CNRS-UVSQ, Gif-sur-Yvette, France
B. T. Crosby, Idaho State University, Pocatello, ID, USA
C. I. Czimczik, University of California, Irvine, CA, USA
J. Harden, US Geological Survey, Menlo Park, CA, USA
D. J. Hayes, M. P.Waldrop, Oak Ridge National Laboratory, Oak Ridge, TN, USA
G. Hugelius, P. Kuhry, A. B. K. Sannel, Stockholm University, Stockholm, Sweden
J. D. Jastrow, Argonne National Laboratory, Argonne, IL, USA
C. D. Koven, W. J. Riley, Z. M. Subin, Lawrence Berkeley National Lab, Berkeley, CA, USA
G. Krinner, CNRS/UJF-Grenoble 1, LGGE, Grenoble, France
D. M. Lawrence, National Center for Atmospheric Research, Boulder, CO, USA
A. D. McGuire, U.S. Geological Survey, Alaska Cooperative Fish and Wildlife Research Unit, University of Alaska, Fairbanks, AK, USA
J. A. O’Donnell, Arctic Network, National Park Service, Fairbanks, AK, USA
A. Rinke, Alfred Wegener Institute, Potsdam, Germany
K. Schaefer, National Snow and Ice Data Center, Cooperative Institute for Research in Environmental Sciences, University of Colorado, Boulder, CO, USA
J. Sky, University of Oxford, Oxford, UK
C. Tarnocai, AgriFoods, Ottawa, ON, Canada
M. R. Turetsky, University of Guelph, Guelph, ON, Canada
K. P. Wickland, U.S. Geological Survey, Boulder, CO, USA
C. J. Wilson, Los Alamos National Laboratory, Los Alamos, NM, USA
 S. A. Zimov, North-East Scientific Station, Cherskii, Siberia

WHAT: Interviewing world experts and averaging their best estimates of how much Arctic permafrost is likely to melt and how much that will contribute to climate change.

WHEN: 26 March 2013

WHERE: Climatic Change, Vol. 117, Issue 1-2, March 2013

TITLE: Expert assessment of vulnerability of permafrost carbon to climate change (open access!)

We are all told that you should never judge a book by its cover, however I’ll freely admit that I chose to read this paper because the headline in Nature Climate Change was ‘Pandora’s Freezer’ and I just love a clever play on words.

So what’s the deal with permafrost and climate change? Permafrost is the solid, permanently frozen dirt/mud/sludge in the Arctic that often looks like cliffs of chocolate mousse when it’s melting. The fact that it’s melting is the problem, because when it melts, the carbon gets disturbed and moved around and released into the atmosphere.

Releasing ancient carbon into the atmosphere is what humans have been doing at an ever-greater rate since we worked out that fossilised carbon makes a really efficient energy source, so when the Arctic starts doing it too, it eats into the limited carbon budget our atmosphere has left. Which means melting permafrost has consequences for how much time humanity has left to wean ourselves off our destructive fossil fuel addiction.

Cliffs of chocolate mousse (photo: Mike Beauregard, flickr)


How much time do we have? How much carbon is in those cliffs of chocolate mousse? We're not sure, and that's a big problem. Recent research estimates there could be as much as 1,700 billion tonnes of carbon stored in Arctic permafrost, which is much higher than earlier estimates from research in the 1990s.

To give that very large number some context: 1,700 billion tonnes can also be called 1,700 Gigatonnes (Gt), which should ring a bell for anyone who read Bill McKibben's Rolling Stone global warming math article. The article stated that the best current estimate for humanity to have a shot at keeping global average temperatures below a 2°C increase is a carbon budget of 565 Gt. So if all the permafrost melted, we'd have blown that budget three times over.

What this paper did, was ask the above long list of experts on soil, carbon in soil, permafrost and Arctic research three questions over three different time scales.

  1. How much permafrost is likely to degrade (aka quantitative estimates of surface permafrost degradation)
  2. How much carbon it will likely release
  3. How much methane it will likely release

They included the methane question because methane has short-term ramifications for the atmosphere. Methane 'only' stays in the atmosphere for around 12 years (compared to carbon dioxide, which hangs around for centuries to millennia), but it has 33 times the global warming potential (GWP) of CO2 over a 100-year period. So averaged over the first hundred years after you've released it, one tonne of methane is as bad as 33 tonnes of CO2. This could quickly blow our carbon budgets as we head merrily past 400 parts per million of CO2 in the atmosphere from human forcing.
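The CO2-equivalent conversion is just multiplication. A quick sketch, using the GWP-100 value of 33 that the paper uses:

```python
# Convert a methane release into CO2-equivalent tonnes using a 100-year
# global warming potential (GWP-100) of 33, the value used in the paper.
GWP_CH4_100YR = 33

def co2_equivalent(methane_tonnes: float) -> float:
    """CO2-equivalent tonnes for a given methane release (GWP-100 basis)."""
    return methane_tonnes * GWP_CH4_100YR

# One tonne of methane counts as 33 tonnes of CO2 over the first century:
print(co2_equivalent(1.0))  # -> 33.0
```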

The time periods for each question were: by 2040, with a 1.5-2.5°C Arctic temperature rise (the Arctic warms faster than lower latitudes); by 2100, with a 2.0-7.5°C rise (so from 'we can possibly deal with this' to 'catastrophic climate change'); and by 2300, with temperatures stable after 2100.

The estimates the experts gave were then screened for level of expertise (you don't want to be asking an atmospheric specialist the soil questions!) and averaged to give an estimate range. For surface loss of permafrost under the highest warming scenario, the results were:

  1. 9-16% loss by 2040
  2. 48-63% loss by 2100
  3. 67-80% loss by 2300

Permafrost melting estimates for each time period over four different emissions scenarios (from paper)


Ouch. If we don’t start doing something serious about reducing our carbon emissions soon, we could be blowing that carbon budget really quickly.

For how much carbon the highest warming scenario may release, the results were:

  1. 19-45 billion tonnes (Gt) of CO2 by 2040
  2. 162-288 Gt of CO2 by 2100
  3. 381-616 Gt of CO2 by 2300

Hmm. So if we don’t stop burning carbon by 2040, melting permafrost will have taken 45Gt of CO2 out of our atmospheric carbon budget of 565Gt. Let’s hope we haven’t burned through the rest by then too.

However, if Arctic temperature rises were limited to 2°C by 2100, the CO2 emissions would 'only' be:

  1. 6-17 Gt of CO2 by 2040
  2. 41-80 Gt of CO2 by 2100
  3. 119-200 Gt of CO2 by 2300

That's about a third of the highest warming estimates, but still nothing to breathe a sigh of relief at, given that the 2000-2010 average rate of fossil fuel burning was 7.9 Gt per year. So even the low estimate has permafrost releasing more than two years' worth of global emissions, meaning we'd have to stop burning carbon two years earlier.
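That 'more than two years' figure comes from straightforward division, using the numbers as quoted above:

```python
# How many years of global fossil fuel emissions does the low-end
# permafrost estimate for 2040 represent? Figures as quoted in the post.
permafrost_low_2040 = 17.0  # Gt of CO2, low end of the 2°C-limited estimate
annual_emissions = 7.9      # Gt per year, 2000-2010 average quoted above

years_equivalent = permafrost_low_2040 / annual_emissions
print(f"~{years_equivalent:.1f} years of global emissions")  # a bit over 2 years
```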

When the researchers calculated the expected methane emissions, the estimates were low. However, when they converted the methane to CO2 equivalent (CO2e) – methane being 33 times more potent than CO2 over 100 years – they got:

  1. 29-60 Gt of CO2e by 2040
  2. 250-463 Gt of CO2e by 2100
  3. 572-1004 Gt of CO2e by 2300

Thankfully, most of the carbon in the permafrost is expected to be released as the less potent carbon dioxide, but working out the balance between how much will come out as methane and how much as carbon dioxide is crucial for working out global carbon budgets.

The other problem is that most climate models that look at permafrost contributions to climate change do it in a linear fashion, where increased temperatures lead directly to an increase in microbes and bacteria and the carbon is released. In reality, permafrost is much more dynamic, non-linear and therefore unpredictable, which makes it a pain to put into models. It's really difficult to predict abrupt thaw processes (as was seen over 98% of Greenland last summer), where ice wedges can melt and the ground can collapse irreversibly.

These kinds of non-linear processes (the really terrifying bit about climate change) made the news this week when it was reported that the Alaskan town of Newtok is likely to wash away by 2017, making the townspeople the first climate refugees from the USA.

The paper points out that one of the key limitations to knowing exactly what the permafrost is going to do is the lack of historical permafrost data. Permafrost is in really remote, hard-to-get-to places where people don't live, because the ground is permanently frozen. People haven't been going to these places and taking samples, unlike more populated areas with lengthy and detailed climate records. But if you don't know how much permafrost was historically there, you can't tell how fast it's melting.

The key point from this paper is that even though we’re not sure exactly how much permafrost will contribute to global carbon budgets and temperature rise, this uncertainty alone should not be enough to stall action on climate change.

Yes, there is uncertainty in exactly how badly climate change will affect the biosphere and everything that lives within it, but currently our options range from ‘uncomfortable and we may be able to adapt’ to ‘the next mass extinction’.

So while we’re working out exactly how far we’ve opened the Pandora’s Freezer of permafrost, let’s also stop burning carbon.