Timing the Carbon Excursion

Trying to fine-tune the timeline of the Paleocene-Eocene mass extinction using fossilised mud

WHO: James D. Wright, Department of Earth and Planetary Sciences, Rutgers University
Morgan F. Schaller, Department of Earth and Planetary Sciences, Rutgers University; Department of Geological Sciences, Brown University, Providence, RI

WHAT: Trying to get a more precise timeline of the mass extinction at the Paleocene-Eocene thermal maximum (PETM).

WHEN: October 1st, 2013

WHERE: Proceedings of the National Academy of Sciences of the United States of America (PNAS), vol. 110, no. 40, October 2013.

TITLE: Evidence for a rapid release of carbon at the Paleocene-Eocene thermal maximum (subs req.)

My favourite accidental metaphor from this paper is when the researchers talked about a ‘carbon isotope excursion’ that caused the Paleocene-Eocene thermal maximum, which is scientist for carbon being released into the atmosphere causing a mass extinction. However, I couldn’t help thinking of a school bus full of carbon isotopes…

CO2 getting on a school bus??? (image: Hannah Holt, Lightbulb Books)

The Paleocene-Eocene thermal maximum (or PETM for short) occurred around 55.8 million years ago and is considered the best historical comparison to what we’re currently doing to the atmosphere with carbon pollution. There was rapid global warming of around 6°C, which was enough to cause a mass extinction, but based on fossilised carbon isotopes from deep-ocean sediment, estimates of how long the extinction took range from under 750 years to 30,000 years.

Obviously this is a huge window of time, and unless it can be further narrowed down, not very useful for humans trying to work out how long we have left to do something about climate change. Narrowing the time scale for the mass extinction is what these researchers tried to do.

Firstly, there are several ways the PETM mass extinction could have happened: release of methane from reservoirs like permafrost or under the ocean, wildfires in peatlands, desiccation of the ocean (the air warming up enough that the ocean starts evaporating), or a meteor impact.

Without working out the timing of the extinction, we can’t work out which mechanism caused it. So these researchers went to an area of the USA called the Salisbury Embayment, which is modern-day Delaware but was shallow ocean 55 million years ago. Because that shallow-ocean sediment fossilised and is now dry land, it preserves a more detailed record than sediment still under the ocean, which keeps getting reworked by currents and so only shows bigger changes.

The researchers looked at sediment cores from Wilson Lake B and Millville, which have distinct and rhythmic layers of silty clay through the entire section that corresponds to the carbon excursion into the atmosphere.

Layers of clay in the sediment cores (from paper)

The distinct layers you can see above occur every 1–3 cm along the core, which allows for a more precise chronology of the mass extinction.

First they ran tests for the amount of 18O (an oxygen isotope) and found consistent cycles of a 1.1% change through the core from Wilson Lake B, and a similar cycle in the Millville core. They then tried to work out what kind of time scale this could line up with. After running through a few ideas that didn’t fit other evidence from the period, they worked out that the oxygen cycles were consistent with seasonal changes, which means the cores accumulated about 2 cm of mud per year. This is excitingly precise! It also fits with what researchers know about the build-up of sediment in shallow areas – e.g. the Amazon River can accumulate 10 cm/year of mud and clay, so 2 cm per year fits the other evidence.

Next in the researchers’ whodunit, they sampled the beginning of the carbon release, which was at 273 m along the core. To be precise, they sampled from 273.37–273.96 m and tested for 13C (a carbon isotope).
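To get a feel for how precise a 2 cm/year chronology is, here’s a quick back-of-envelope sketch (my own arithmetic, not the paper’s): at that deposition rate, the sampled stretch of core represents roughly 30 years of mud.

```python
# Rough sketch: converting a stretch of sediment core into years, assuming
# the ~2 cm/year deposition rate derived from the seasonal oxygen cycles.
SEDIMENTATION_RATE_M_PER_YR = 0.02  # 2 cm of mud per year

def core_interval_years(top_m, bottom_m, rate=SEDIMENTATION_RATE_M_PER_YR):
    """Years of deposition represented by a core interval (depths in metres)."""
    return abs(bottom_m - top_m) / rate

# The sampled interval at the onset of the carbon release:
print(round(core_interval_years(273.37, 273.96), 1))  # ~29.5 years of mud
```

So the 59 cm they sampled covers only a few decades, which is what makes a 13-year event resolvable at all.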

They found the carbon isotope values in the silt dropped to 3.9%, then 1.45%, and then to −2.48% over thirteen years. A drop in the ocean’s carbon isotope values means carbon is being released into the atmosphere. The concentration of calcium carbonate (CaCO3), which is what shells are made of, also dropped from 6% to 1% within a single layer of sediment. So there was definitely something going on, but what exactly was it?

The beginning of the carbon release is the unknown bit, and it’s generally agreed that the recovery from the 6°C of global warming took 150,000 years. The sediment layers show that the release of carbon was almost instantaneous (it happened over 13 years), and after the release it took between 200 and 2,000 years for the ocean and the atmosphere to reach a new equilibrium. This means that after the initial carbon release the atmosphere changed rapidly for the next 2,000 years before it reached the ‘new normal’. From there it took 150,000 years to get back to the original normal from before the carbon release.

Long road to recovery – PETM followed by a slow slope back to normal temps (from Hansen et al., PNAS 2013); bottom scale in millions of years.

Looking back at what order of events could have caused this mass extinction, the only combination that fits is methane release AND a meteorite. There was a recovery of carbon isotope values in the upper clay, which the researchers state indicates carbon moving from the atmosphere into the ocean, with more carbon movement in the shallow areas than the deep ones. However, the meteorite alone (which accounts for the carbon going from the atmosphere into the ocean) wouldn’t be enough to cause 6°C of global warming. It takes around 3,000 gigatonnes (Gt) of carbon to change the atmosphere by that much, which is why the researchers think it was a meteorite plus methane release from the ocean that added up to the 3,000 Gt.

So what are the lessons for humanity in this great carbon excursion of mass extinction? Firstly, climate change can happen quite quickly. Secondly, once you’ve changed the atmosphere, it’s a long road back to the normal you had before you burned the carbon. Thirdly, it only takes 3,000 Gt of carbon to cause a mass extinction.

The last UN IPCC report gave the planet a carbon budget of 1,000 Gt of total emissions to keep humanity at a level of ‘safe’ climate change. Currently, the world has burned half of that budget and will blow through the rest of it within the next 30 years. If we burn enough carbon to start releasing methane from permafrost or the oceans, it’s a 150,000-year trek back from a possible mass extinction.

I guess it is true – those who don’t learn from history are doomed to repeat it. How about we learn from history and stop burning carbon?


Oct. 29th – caption for Cenozoic era graph changed for clarity (time scale is different as the graph is from a different paper).

Smoking Kills, so does Climate Change

A translation of the IPCC 5th Assessment Report Summary for Policymakers

WHO: The Intergovernmental Panel on Climate Change

WHAT: Summary for policymakers of their 2,000-page 5th Assessment Report (AR5) on the state of the climate and climate science.

WHEN: 27 September 2013

WHERE: On the IPCC website

TITLE: Climate Change 2013: The Physical Science Basis Summary for Policymakers (open access)

There are a lot of things not to like about the way the IPCC communicates what they do, but for me the main one is that they speak a very specific dialect of bureaucrat that no-one else understands unless they’ve also worked on UN things and speak the same sort of acronym.

The worst bit of this dialect of bureaucrat is the words they use to describe how confident they are that their findings are correct. They probably believe they’re being really clear, but they use language none of the rest of us would ever use, which means their findings make little sense without the ‘very likely = 90–100% certain’ footnote at the front.

So now that we’ve established that the UN doesn’t speak an understandable form of English, what does the report actually say? It works its way through each of the different climate systems and how they’re changing because humans are burning fossil fuels.

As you can see from this lovely graph, each of the last three decades has been noticeably warmer than the preceding one, and the IPCC are 66% sure that 1983–2012 was the warmest 30-year period in 1,400 years.

Decade by decade average temperatures (Y axis is change in Celsius from base year of 1950) (from paper)

One of the reasons I really like this graph is you can see how the rate of change is speeding up (one of the key dangers with climate change). From 1850 through to around 1980 each decade’s average is touching the box of the average before it, until after the 80s when the heat shoots up much more rapidly.

The report did have this dig for the deniers though: ‘Due to natural variability, trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends’. Which is UN bureaucrat for ‘when you cherry pick data to fit your denier talking points you’re doing it wrong’.

Looking at regional atmospheric trends, the report notes that while things like the Medieval Warm Period did have multi-decadal periods of change, these changes didn’t happen across the whole globe like the warming currently being measured.

In the oceans, the top layer (the top 75 m) has warmed by 0.11°C per decade from 1971 to 2010, and more than 60% of the extra energy we’ve pumped into the climate system since 1971 has been stored in the upper ocean, with another 30% stored in the ocean below 700 m.

This extra heat is not just causing thermal expansion, it’s speeding up sea level rise, which the IPCC are 90% certain increased from an average of 1.7 mm per year over 1901–2010 to 3.2 mm per year over 1993–2010. This is faster than at any time in the past two millennia. Yes, sea level is rising faster than it has for the last 2,000 years, so you might want to sell your waterfront property sooner rather than later.

The extra carbon has also made it harder to live in the ocean if you own a shell, because the acidity of the ocean has increased by 26% which makes shells thinner and harder to grow.

On the glaciers and the ice sheets, the IPCC is 90% certain that the rate of melting from Greenland has increased from 34 gigatonnes (Gt) of ice per year to 215 Gt of ice per year after 2002. Yes, increased from 34 Gt to 215 Gt – it’s melting roughly six times faster now, thanks to us.
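That ‘six times faster’ figure is easy to sanity-check yourself (my arithmetic, not the IPCC’s):

```python
# Sanity check of the 'six times faster' claim for Greenland ice loss,
# using the IPCC AR5 rates quoted above.
rate_before = 34   # Gt of ice per year, before 2002
rate_after = 215   # Gt of ice per year, after 2002

speedup = rate_after / rate_before
print(round(speedup, 1))  # ~6.3x faster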

For Antarctica, the IPCC is 66% certain that the rate of ice loss has increased from 30 Gt per year to 147 Gt per year, with most of that loss coming from the Northern Peninsula and West Antarctica. Worryingly, that’s the net figure – it already includes the parts of Antarctica that are gaining ice due to natural variability.

And at the North Pole, Santa is going to have to buy himself and his elves some boats or floating pontoons soon, because the IPCC have found ‘multiple lines of evidence support[ing] very substantial Arctic warming since the mid-20th Century’. Sorry Santa!

As for the carbon we’ve been spewing into the atmosphere since the 1850s, well, we’re winning that race too! ‘The atmospheric concentrations of carbon dioxide, methane and nitrous oxide have increased to levels unprecedented in at least the last 800,000 years’. Congratulations humanity – in the last century and a half, we’ve changed the composition of the atmosphere so rapidly that this hasn’t been seen in 800,000 years!

Methane levels have gone up by 150%, and I’m undecided as to whether that means I should eat more beef to stop the cows from farting, or if it means we raised too many cows to be steaks in the first place…

This is the part of the report where we get into the one excellent thing the IPCC did this time around – our carbon budget. I’m not sure whether they realised that committing to reduce certain percentages by certain years from different baselines meant that governments were able to shuffle the numbers to do nothing and make themselves look good at the same time, but this is a promising step.

I’ve written about the very excellent work of Kevin Anderson at the Tyndall Centre in the UK before, but the basic deal with a carbon budget is this: it doesn’t matter when we burn the carbon or how fast, all that matters is the total emissions in the end. You can eat half the chocolate bar now and half the chocolate bar later, but you’re still eating a whole bar.

Our budget for a 2/3 chance of not going beyond dangerous climate change is 1,000 Gt of carbon, and so far we’ve burnt 545 Gt, so we’re more than halfway there. All of this leads to the conclusion that ‘human influence on the climate system is clear. This is evident from the increasing greenhouse gas concentrations in the atmosphere, positive radiative forcing, observed warming and understanding of the climate system.’
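The carbon budget arithmetic is worth spelling out (a quick sketch of my own, using the AR5 numbers above):

```python
# The AR5 carbon budget arithmetic, sketched out.
budget_gt = 1000  # Gt of carbon for a ~2/3 chance of staying 'safe'
burned_gt = 545   # Gt of carbon already emitted

remaining_gt = budget_gt - burned_gt
percent_used = burned_gt * 100 / budget_gt
print(remaining_gt, percent_used)  # 455 Gt left; 54.5% already burned
```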

What observations, you may ask? Scientists have made progress on working out how climate change pumps up extreme weather and makes it worse. They also got the frequency of extreme warm and cold days right, which if you live in North America means hot extremes beating cold extremes 10:1. Round of applause for science, everyone!

Warming with natural forcing vs human forcing and how it lines up with the observations (from paper)

They’re also 95% sure that more than half of the observed global surface warming since 1951 is down to humanity. So next time there’s a nasty heatwave that’s more frequent than it should be, blame humans.

The report does also point out that even though the heat records are beating the cold records 10:1, this doesn’t mean that snow disproves climate change (sorry, Fox News!). There will still be year-to-year and decade-by-decade variability in how our planet cooks, and it won’t be the same across the whole planet – which sounds to me like we’re being warmed in an uneven microwave. For instance, El Niño and La Niña will still be big influences over the Pacific and will determine to a great extent the variability in the Pacific Northwest (yeah, it’s still going to rain a lot, Vancouver).

For those who were fans of the film The Day After Tomorrow, there’s a 66% chance the Atlantic Meridional Overturning Circulation will slow down, but only a 10% chance it will undergo an abrupt change or collapse like it did in the film, so you’re not going to be running away from a flash-freezing ocean any time this century.

The report then runs through the different scenarios they’ve decided to model, ranging from ‘we did a lot to reduce carbon emissions’ to ‘we did nothing to reduce carbon emissions and burned all the fossil fuels’. Because this is the IPCC and they had to get EVERYONE to agree on each line of the report (I’m serious, they approved it line by line, which has to be the most painful process I can think of), the scenarios are conservative in their estimates and don’t incorporate tipping points (which are really hard to model anyway). So their ‘worst case scenario’ is only 4.0°C of surface warming by 2100.

Representative Concentration Pathway (RCP) Scenarios from the IPCC AR5

Now, obviously ‘only’ 4°C of climate change by the end of the century is still pretty unbearable. There will still be a lot of hardship, drought, famine, refugee migration and uninhabitable parts of the planet at 4°C. However, getting to 4°C is likely to trigger tipping points like methane release from permafrost, so 4°C would only be a waypoint on the road to 6°C even if we ran out of carbon to burn. And 6°C, as you all hear me say frequently, is mass extinction time. It’s also the point at which, even if humanity did survive, you wouldn’t want to live here anymore.

The paper finishes up with a subtle dig at the insanity of relying on geoengineering, pointing out that trying to put shade reflectors into orbit or artificially suck carbon out of the air has a high chance of going horribly wrong. They also point out that if we did manage large-scale geoengineering and it then broke down, the carbon re-released into the atmosphere would super-cook the planet really quickly.

The moral of this 36-page ‘summary’ is that it’s us, guys. We’re as certain that we’ve done this as we are that smoking causes cancer. We have burned this carbon, it has changed the energy balance of the atmosphere, and if we don’t stop burning carbon we’re going to cook the whole planet. Seriously. So let’s finally, actually stop burning carbon.

Nemo the Climate Refugee

If you collected all the recent research on marine species and climate change, could you see a pattern of fish and marine species migration?

WHO: Elvira S. Poloczanska, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia
Christopher J. Brown, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia, School of Biological Sciences, The University of Queensland, Australia
William J. Sydeman, Sarah Ann Thompson, Farallon Institute for Advanced Ecosystem Research, Petaluma, California, USA
Wolfgang Kiessling, Museum für Naturkunde, Leibniz Institute for Research on Evolution and Biodiversity, Berlin, Germany, GeoZentrum Nordbayern, Paläoumwelt, Universität Erlangen-Nürnberg, Erlangen, Germany
David S. Schoeman, Faculty of Science, Health and Education, University of the Sunshine Coast, Maroochydore, Queensland, Australia, Department of Zoology, Nelson Mandela Metropolitan University, Port Elizabeth, South Africa
Pippa J. Moore, Centre for Marine Ecosystems Research, Edith Cowan University, Perth, Western Australia, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth UK
Keith Brander, DTU Aqua—Centre for Ocean Life, Technical University of Denmark, Charlottenlund Slot, Denmark
John F. Bruno, Lauren B. Buckley, Department of Biology, The University of North Carolina at Chapel Hill, North Carolina, USA
Michael T. Burrows, Scottish Association for Marine Science, Scottish Marine Institute, Oban, UK
Johnna Holding, Department of Global Change Research, IMEDEA (UIB-CSIC), Instituto Mediterráneo de Estudios Avanzados, Esporles, Mallorca, Spain
Carlos M. Duarte, Department of Global Change Research, IMEDEA (UIB-CSIC), Instituto Mediterráneo de Estudios Avanzados, Esporles, Mallorca, Spain; The UWA Oceans Institute, University of Western Australia
Benjamin S. Halpern, Carrie V. Kappel, National Center for Ecological Analysis and Synthesis, Santa Barbara, California, USA
Mary I. O’Connor, University of British Columbia, Department of Zoology, Vancouver, Canada
John M. Pandolfi, Australian Research Council Centre of Excellence for Coral Reef Studies, School of Biological Sciences, The University of Queensland, Australia
Camille Parmesan, Integrative Biology, Patterson Laboratories 141, University of Texas, Austin, Texas; Marine Institute, A425 Portland Square, Drake Circus, University of Plymouth, Plymouth, UK
Franklin Schwing, Office of Sustainable Fisheries, NOAA Fisheries Service, Maryland, USA
Anthony J. Richardson, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia; Centre for Applications in Natural Resource Mathematics (CARM), School of Mathematics and Physics, University of Queensland, Australia

WHAT: A review and synthesis of all available peer reviewed studies of marine species changing under climate change.

WHEN: 4 August 2013

WHERE: Nature Climate Change, August 2013

TITLE: Global imprint of climate change on marine life (subs req.)

This paper, with its laundry list of collaborating authors, must have had an awesome ‘we got published’ party. Mind you, all that data would have taken forever to number-crunch, so it’s a good thing it was all hands on deck.

So what were they looking at? They were trying to work out whether you can see the fingerprint of climate change in the distribution changes of marine species. To do that, they looked at all the available peer-reviewed studies of expected changes for fish and other ocean species under climate change. Then they lined up the predictions with the observed results to see what happened, and it turns out we’ve got some frequent-travelling fish.

After getting all the studies together, the researchers had 1,735 different observations of everything from phytoplankton to zooplankton to fish and seabirds, drawn from 208 studies of 857 different species. They used all of the data they had, which included the changes that lined up with climate change projections, the ones with no changes, and the ones with unexpected changes.

Global marine migration (from paper)

Ocean currents make it easier for marine species to travel longer distances than plants and animals on land – there’s only so far a seed can travel from the tree on the wind, after all. However, in this research they found that the average distance of range expansion for marine species was 72 km/decade (±13.5 km). This doesn’t sound like a lot to a human, but it’s an order of magnitude further than land-based migration averages, and it’s a long way for a mollusc or a starfish to go.

The species chalking up the most frequent flyer points were phytoplankton, which have been moving 469.9 km/decade (±115 km), followed by fish, moving 227.5 km/decade (±76.9 km). Of the 1,735 observations, a whopping 1,092 were moving in the direction expected from climate change.

For each species migration, the researchers looked at what the expected decadal rates of ocean temperature change would have been in the area and found that some groups move early, some wait longer and others fall behind.

For example, in the Bering Sea (where the Discovery Channel show ‘The Deadliest Catch’ is set), many species rely on the really cold water of less than 2°C that separates the Arctic and subarctic animals. This cold pool of water has been moving further north as the Arctic sea ice melts, but the responses by species are varied. Some are at the leading edge and move early, others don’t. The researchers think this is related to population size, ability to migrate, dependence on habitat (remember how Nemo’s dad didn’t want to leave the reef?), competition for food, and other factors.

Clownfish (Wikimedia commons)

I guess it’s similar to when a natural disaster happens in a human area and some families leave, others rebuild and it’s for a whole complicated list of reasons like family, jobs, resources and more. Anyway, back to the fish.

The researchers then tested their data for a global climate change fingerprint. They used a binomial test against 0.5 – the result you would get if these changes in location were just random variability – and found that 83% of the changes had climate change as the dominant driving force.

If they limited the data to multi-species studies only, 81% of the changes were still driven by climate change. They re-ran the data to exclude every bias they could think of and still concluded that it provided ‘convincing evidence that climate change is the primary driver behind the observed biological changes’.
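For the statistically curious, the idea behind a binomial test against 0.5 can be sketched in a few lines of Python (a hedged illustration of the method, not the authors’ actual code). It asks: if each observation were equally likely to move either way, how likely is it that 1,092 of 1,735 would line up with climate change predictions by chance alone?

```python
from fractions import Fraction
from math import comb

def binom_sf(k, n):
    """Exact P(X >= k) for a fair-coin binomial: the chance that k or more
    of n observations would point the same way by random variability."""
    hits = sum(comb(n, i) for i in range(k, n + 1))
    return Fraction(hits, 2**n)

# 1,092 of the 1,735 observations moved the way climate change predicts.
p_value = binom_sf(1092, 1735)
print(float(p_value))  # vanishingly small: this is not random variability
```

The exact fraction is astronomically tiny, which is why the authors can be so blunt about climate change being the driver.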

Binomial test results – if you get 0.5 it’s a random occurrence – this migration is climate caused (from paper)

So there you have it – climate refugees aren’t just land based. Nemo’s going to have to move too.

Vote for last week’s paper!

climate voter

Remember how I was excited about the possibilities of scaling up the carbon sequestration process outlined in last week’s post from the Proceedings of the National Academy of Sciences in the USA?

Turns out you can vote for it!

I had an email from the lead author of the paper (I send my blog posts to the lead authors when I post them) letting me know that their process has made the finals of two MIT Climate CoLab contests. So if you’re excited about the idea of feasibly sequestering carbon dioxide from the oceans being tested out, you can vote for them.

The first proposal is in the Geoengineering section, called ‘Saving the Planet v2.0’. The second is in the Electric power sector section, called ‘Spontaneous Conversion of Power Plant CO2 to Dissolved Calcium Bicarbonate’.

Climate CoLab is an online space where people work to try and crowdsource ideas for what to do about climate change. The contest voting closes in 11 days (August 30th) and the winning proposals will be presented at the Crowds & Climate Conference at MIT in November.

So if it takes your fancy, and you’d like to see this project presented at the conference, go forth and vote!


Disclosure: I am not affiliated with either the paper or the MIT CoLab project.

Antacid for our Oceans

An electrolysis method that removes CO2 from seawater could be affordably scaled up for commercial carbon sequestration.

WHO: Greg H. Rau, Institute of Marine Sciences, University of California, Santa Cruz, Physical and Life Sciences, Lawrence Livermore National Laboratory, Livermore, CA
Susan A. Carroll, William L. Bourcier, Michael J. Singleton, Megan M. Smith, and Roger D. Aines, Physical and Life Sciences, Lawrence Livermore National Laboratory, Livermore, CA

WHAT: An electrochemical method of sequestering CO2 from sea water using silicate rocks.

WHEN: June 18, 2013

WHERE: Proceedings of the National Academy of Sciences (USA), PNAS vol. 110, no. 25

TITLE: Direct electrolytic dissolution of silicate minerals for air CO2 mitigation and carbon-negative H2 production (open access)

This paper was fun – I got to get my chemistry nerd back on, thinking in moles per litre and chemical equations! It almost made me miss university chemistry lectures.

No, not those moles per litre! (IFLS facebook page)

So what do chemistry jokes have to do with carbon sequestration? It’s looking increasingly like humanity will have to figure out ways to draw carbon out of the atmosphere or the oceans, because we’ve procrastinated on reducing our carbon emissions for so long.

There are two options for this: you can either create a chemical reaction that draws CO2 out of the air, or one that draws CO2 out of a solution – and given how quickly the oceans are acidifying, using sea water would be a good idea. The good news is, that’s exactly what these researchers did!

Silicate rock (mostly basalt) is the most common rock type in the Earth’s crust. It also reacts with CO2 to form stable carbonate and bicarbonate solids (like the bicarbonate of soda you bake with). Normally this happens very slowly through rock weathering, but what if you used it as a process to sequester CO2?

The researchers created a saline-water electrolytic cell to test it out. An electrolytic cell is the one you made in school, with an anode and a cathode and (generally) two different solutions; when you put an electric current through it, you create a chemical reaction. These researchers put silicate minerals, saline water and CO2 in on one side, and when they added electricity they got bicarbonates, hydrogen, chlorine or oxygen, silicates and salts out.

A basic schematic of the experiment (from paper)

The researchers used an acid/base reaction (remember those from school?!) to speed up the silicate–CO2 reaction, which also works well in an ocean because saline electrolysis produces large differences in pH. Are you ready to get really nerdy with me? The chemical equation is this:

Chemical equation for the experiment (from paper)

So how did the experiment go? It worked! They successfully sequestered carbon dioxide with an efficiency of 23–32%, capturing 0.147 g of CO2 per kilojoule (kJ) of electricity used.

There are issues around scaling up the reaction, of course – once the bicarbonate has been created, where do you store it? The paper suggests ocean storage, as the bicarbonate solids would be inert (un-reactive). I would hope a re-use option could be found – has anyone looked into using bicarbonate solids as an eco-building material?

There’s also the issue of needing to power the reaction with electricity. If scaled up, this process would have to be powered by renewable energy, because burning carbon to sequester carbon gets you nowhere.

Also, if sea water is used, the main by-product is Cl2, so the researchers point out that while it would be feasible to run this process directly in the ocean, the issue of what to do with all that chlorine would need to be dealt with. The paper suggests using oxygen-selective anodes in the electrolysis, or ion-selective membranes around the reaction to keep the chlorine separate from the ocean.

That being said, there are some exciting upsides to this process. The paper points out that the amount of silicate rock in the world ‘dwarf[s] that needed for conversion of all present and future anthropogenic CO2.’ Also, sequestering CO2 from sea water is easier than air-based methods.

Scaling the method up looks economically feasible too. The researchers estimated that 1.9 MWh (megawatt hours) of energy would be needed per metric tonne of CO2 sequestered. If the waste hydrogen from the process were sold as fuel for hydrogen fuel cells, the price of CO2 sequestered would be $86/tonne. Without the hydrogen sales, it would still only be $154/tonne, which compares very favourably to most current carbon capture and storage feasibility estimates of $600–$1,000/tonne of CO2.
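As a sanity check (my own arithmetic, not the paper’s), the 0.147 g of CO2 per kJ figure from the experiment lines up nicely with the quoted 1.9 MWh per tonne:

```python
# Does 0.147 g of CO2 per kJ of electricity line up with ~1.9 MWh/tonne?
G_PER_KJ = 0.147      # sequestration yield reported in the paper
KJ_PER_MWH = 3.6e6    # 1 MWh = 3.6 billion joules = 3.6 million kJ

kj_per_tonne = 1e6 / G_PER_KJ          # grams in a tonne / yield per kJ
mwh_per_tonne = kj_per_tonne / KJ_PER_MWH
print(round(mwh_per_tonne, 2))  # ~1.89 MWh per tonne, matching the paper
```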

So, like an antacid for the oceans, if this process can be scaled up commercially through research and development, we could have an effective way to not only capture and store carbon, but also reduce ocean acidification. A good-news story indeed – now we just need to stop burning carbon.

Extreme Overachiever Decade 2001-2010

The World Meteorological Organisation’s global climate report for the last decade shows it winning all the most extreme awards.

WHO: The World Meteorological Organisation (WMO), in conjunction with international experts and meteorological and climate organisations.

WHAT: The Global Climate Report for the decade 2001-2010

WHEN: July 2013

WHERE: Online at the WMO website

TITLE: The Global Climate 2001-2010 A Decade of Climate Extremes (open access)

The World Meteorological Organisation (WMO) recently released its wrap-up of the last decade’s worth of global weather-related data. Now, as you will all remember, climate is the long-term trend of the weather (generally over a 30-year period), but it’s also important to keep track of the decade-to-decade trends, if only because 10 is a nice round number to deal with.

So what did the last decade’s report card look like? Turns out 2001–2010 was an overachiever when it came to records and extremes, which is bad news for all of us, really. The last decade was the warmest on record for overall temperatures, and the speed of warming has ramped up as well. The long-term warming trend is 0.062°C/decade, but since 1970 it’s sped up to 0.17°C/decade.
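Those two trend numbers are worth putting side by side (a quick calculation of my own):

```python
# How much has warming sped up? Comparing the WMO's long-term trend with
# the post-1970 trend (both in degrees C per decade).
long_term_trend = 0.062
post_1970_trend = 0.17

ratio = post_1970_trend / long_term_trend
print(round(ratio, 1))  # ~2.7x faster than the long-term average
```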

Decade by decade temperature record (from report)

If you only look at land surface temperatures, 2007 holds the record for hottest at 0.95°C above average, with 2004 the 'least warm' (there are no long-term cold records happening here) at 0.68°C above average.

If you look at sea surface temperatures, 2003 wins with 0.40°C above average and 2008 was the least warm at 0.26°C above average. The warmest year in the Northern Hemisphere was 2007 at 1.13°C above average, and in the Southern Hemisphere 2005 wins with 0.67°C.

When it comes to rain, it’s a mixed bag. There were places that got more rain, there were places with drought, there were more extremes. South America was wetter than normal, Africa was drier than normal. Overall, 2002 was the driest year of the decade and 2010 was the wettest. The problem with rain patterns changing with climate change is that the location and the time frame changes. Instead of slow soaking rains, it’s extremes of dry spells followed by flash flooding.

The patterns of El Niño and La Niña switched back and forth quite a lot during the decade, with El Niño generally creating warmer trends and La Niña cooler ones.

El Niño and La Niña trends for sea surface temperatures (from report)

To qualify as extreme, an event needs to result in 10 or more people dying, 100 or more people being affected, a declaration of a state of emergency and the need for international assistance, which I think is a pretty high bar. But of course, since the last decade was overachieving, there were 400 disasters of this scale that killed more than 1 million people.

Weather disasters represented 88% of these extreme events, and the damage from them has increased significantly, along with a 20% rise in casualties from the previous decade. The extra casualties came from some extreme increases in certain categories like heatwaves: in 1991-2000, 6,000 people died from heatwaves; in 2001-2010, that jumped to 136,000 people.
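To put that heatwave jump in perspective, a quick calculation from the report's two numbers:

```python
# Heatwave deaths per decade, from the report
heatwave_deaths = {"1991-2000": 6_000, "2001-2010": 136_000}
factor = heatwave_deaths["2001-2010"] / heatwave_deaths["1991-2000"]
print(f"Heatwave deaths increased roughly {factor:.0f}-fold decade over decade")  # ~23-fold
```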

The price of extreme weather events has also gone up, with 7,100 hydrometeorological events carrying a price tag of US$1 trillion and resulting in 440,000 deaths over the decade. It's also estimated that 20 million people worldwide were displaced, so this was probably our first decade with a sizable number of climate refugees. Internal displacement will be one of the biggest factors, as people move away from the more extreme parts of their country to the places where it still rains (e.g. from Arizona to Oregon).

Tropical storms were a big issue, with the report noting 'a steady increase in the exposure of OECD countries [to tropical cyclones] is also clear'. It's nice to see them point out that issues around extreme weather are not just a developing-world problem caused by a lack of infrastructure to deal with them, as shown by the flooding in Germany and Australia last decade.

There was also a special shout-out to my homeland of Australia, for the epic heatwave of 2009, where I experienced 46°C in Melbourne and can attest to it not being fun. Of course, that epic heatwave was beaten by 2013's new extreme map colour. However, I noted that Australia got pretty much every kind of extreme weather effect over the decade. Ouch.

Australia – can't catch a break (compiled from report)

Even in the category of 'coldwaves', where the Northern Hemisphere saw an increase in freak snowstorms, the average temperature for the winter was still 0.52°C warmer than average.

Last decade also set lots of records in the cryosphere (the frozen part of the planet). Arctic sea ice has been declining in extent and volume at a disturbingly rapid rate, in what is commonly known as the Arctic Death Spiral, and 2005-2010 included the five lowest Arctic sea ice extents on record. There's been an acceleration in the loss of mass from the Greenland and Antarctic ice sheets, and a decrease in glaciers globally.

Arctic Death Spiral (from report)

The World Glacier Monitoring Service describes the glacier melt rate and cumulative loss as ‘extraordinary’ and noted that glaciers are currently so far away from their equilibrium state that even without further warming they’re still going to keep melting. Oh, and last decade won the record for loss of snow cover too.

Declining snow cover (from report)

Sea level also rose faster last decade, at a rate of 3 mm/yr, nearly double the historical average of 1.6 mm/yr. Interestingly, sea level rise is not even across the globe, due to ocean circulation and mass. In a La Niña year, the Western Pacific Ocean can be 10-20 cm higher than the average for the decade, but there's only one way sea levels are going as the water warms and expands and the ice sheets melt – up.
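A quick sketch with the report's two rates; note that 'double' is a slight round-up, and the century projection is my own naive straight-line extrapolation (the rate itself appears to be accelerating, so this is a lower bound at best):

```python
# Sea level rise rates, from the report
historical_rate = 1.6  # mm/yr, long-term average
recent_rate = 3.0      # mm/yr, last decade
ratio = recent_rate / historical_rate
print(f"Recent rise is {ratio:.2f}x the historical rate")  # 1.88x

# Naive straight-line extrapolation over a century (mm -> cm), my assumption:
century_rise_cm = recent_rate * 100 / 10
print(f"At 3 mm/yr, a century adds ~{century_rise_cm:.0f} cm")  # ~30 cm
```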

Sea level rise (from report)

Finally, if all of that wasn't enough bad news for you – the report looked at greenhouse gas concentrations in the atmosphere and found (surprise, surprise) that CO2 is up, accounting for 64% of the increase in radiative forcing (making our planet warmer) over the past decade and 81% of the increase in the last five years. Methane is responsible for 18% of the increase and nitrous oxide chips in 6%.
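Quick arithmetic on those shares, assuming the three gases listed are measured against the same total (my assumption – the remainder would be other greenhouse gases such as halocarbons):

```python
# Shares of the increase in radiative forcing over the decade, from the report
forcing_shares = {"CO2": 64, "CH4 (methane)": 18, "N2O (nitrous oxide)": 6}  # percent
other = 100 - sum(forcing_shares.values())
print(f"Other greenhouse gases account for the remaining {other}%")  # 12%
```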

Does it make anyone feel better if I tell you the hole in the ozone layer isn’t getting bigger anymore?

Basically, the world we live in is getting more extreme as it heats up at an ever-increasing rate. Given that these are the changes we're seeing with a 0.8°C increase in global average temperatures, and that's from carbon we burnt over a decade ago, how about we stop burning carbon with a little more urgency now?

Drought – worse than we thought

Inconsistencies in drought models that don't account for sea surface temperature changes mean that drought in a climate-changed world could be worse than predicted.

WHO: Aiguo Dai, National Center for Atmospheric Research, Boulder, Colorado, USA

WHAT: Looking at the impact of sea surface temperature variability on drought

WHEN: January 2013

WHERE: Nature Climate Change, Vol. 3, No. 1, January 2013

TITLE: Increasing drought under global warming in observations and models (open access)

Climate deniers love it when the models are slightly wrong for predicting future climate changes, and believe me, I’d love it if climate change weren’t so verifiably real and we could all retire and live la dolce vita.

However, that’s not reality, and in the case of this paper, where the model doesn’t quite line up with the observed changes, it’s because things are worse than we previously thought. Oh dear.

Global warming since the 1980s has contributed to an 8% increase in drought-ridden areas in the 2000s. It’s led to things like diminished corn crops and the steady draining of underground aquifers in the USA, much of which is currently experiencing persistent drought. The letter L on the map below stands for long-term drought.

Long term drought in the Southwest of the USA (from US Drought Monitor)

What’s that got to do with climate models? Well, while the droughts in Southern Europe or my homeland of Australia are due to lack of rain drying things out, drought can also come from increased evaporation caused by warmer air temperatures, which the models don’t fully take into account.

These droughts are harder to measure because they’re related to sea surface temperature changes that play out over decades and can be hard to identify as a human-forced signal rather than just natural variation. So this researcher compared sea surface temperatures with drought predictions and observed warming to try and work out what is going on.

Predicted changes in soil moisture globally for 1980–2080 (black dots are where 9 out of 11 models agree on data) (from paper)

There were two areas where the models differed from the observed changes – the Sahel area in Africa and the USA.

In the Sahel, the models predicted there would be warming in the North Atlantic Ocean, which would lead to increased rain. What actually happened was large warming in the South Atlantic Ocean compared to the North Atlantic, and steady warming over the Indian Ocean, which meant less rain, not more. Similarly, for the predicted patterns in the USA, the Pacific multidecadal oscillation isn’t known to be influenced by human climate forcing, yet it switched to a warm phase with above-normal sea surface temperatures.

Top: Observed sea surface temperatures. Bottom: predicted sea surface temperatures (from paper)

These sea surface variations that were missed in some of the previous models have some obvious consequences for planning for the slow pressure cooker of stress that drought is on anyone living through it, let alone trying to make a living from agriculture.

The researcher noted that there were also some differences from the models when looking at sulphate aerosols; however, for the 21st century the signal from greenhouse gases will be much stronger than that from aerosols, so they shouldn’t mess with the data too much.

So what does this all mean? Well, it means that there are both regional and broader trends for drought in a changed climate. The broad patterns are fairly stable ‘because of the large forced trend compared with natural variations’, which is scientist for humans are making a large enough mess out of this to see the evidence clearly.

The paper ends quite bluntly, stating that after re-working the simulations to take into account the new data for sea surface temperature and other variables, it’s only more bad news.

It’s likely to be ‘severe drought conditions by the late half of this century over many densely populated areas such as Europe, the eastern USA, southeast Asia and Brazil. This dire prediction could have devastating impacts on a large number of the population if the model’s regional predictions turn out to be true.’

Yes, a researcher actually used the word ‘dire’ in a scientific paper. Oh, and this was with an intermediate emissions scenario, not the business as usual path we’re currently all on. How about we all agree to stop burning carbon now?