Greenland Whodunit

“The next 5–10 years will reveal whether or not [the Greenland Ice Sheet melting of] 2012 was a one off/rare event resulting from the natural variability of the North Atlantic Oscillation or part of an emerging pattern of new extreme high melt years.”

WHO: Edward Hanna, Department of Geography, University of Sheffield, Sheffield, UK
 Xavier Fettweis, Laboratory of Climatology, Department of Geography, University of Liège, Belgium
Sebastian H. Mernild, Climate, Ocean and Sea Ice Modelling Group, Computational Physics and Methods, Los Alamos National Laboratory, USA, Glaciology and Climate Change Laboratory, Center for Scientific Studies/Centre de Estudios Cientificos (CECs), Valdivia, Chile
John Cappelen, Danish Meteorological Institute, Data and Climate, Copenhagen, Denmark
Mads H. Ribergaard, Centre for Ocean and Ice, Danish Meteorological Institute, Copenhagen, Denmark
Christopher A. Shuman, Joint Center for Earth Systems Technology, University of Maryland, Baltimore, USA,  Cryospheric Sciences Laboratory, NASA Goddard Space Flight Center, Greenbelt, USA
Konrad Steffen, Swiss Federal Research Institute WSL, Birmensdorf, Switzerland, Institute for Atmosphere and Climate, Swiss Federal Institute of Technology, Zürich, Switzerland, Architecture, Civil and Environmental Engineering, École Polytechnique Fédéral de Lausanne, Switzerland
 Len Wood, School of Marine Science and Engineering, University of Plymouth, Plymouth, UK
Thomas L. Mote, Department of Geography, University of Georgia, Athens, USA

WHAT: Trying to work out the cause of the unprecedented melting of the Greenland Ice Sheet in July 2012

WHEN: 14 June 2013

WHERE: International Journal of Climatology (Int. J. Climatol.) Vol. 33 Iss. 8, June 2013

TITLE: Atmospheric and oceanic climate forcing of the exceptional Greenland ice sheet surface melt in summer 2012 (subs req.)

Science can sometimes be like detective work (although I would argue it’s cooler) – you’ve got to look at a problem and work out how it happened. These researchers set out to do exactly that: work out how the hell 98.6% of the surface of Greenland’s ice sheet started melting last summer.

Greenland – July 8th on the left only half melting. July 12th on the right, almost all melting (Image: Nicolo E. DiGirolamo, SSAI/NASA GSFC, and Jesse Allen, NASA Earth Observatory)

For a bit of context, Greenland is the kind of place where the average July temperature in the middle of summer is 2°C and the average summer temperature at the summit of the ice sheet is -13.5°C. Brrr – practically beach weather! So there’s got to be something weird going on for the ice sheet to start melting like that. Who are the suspects?

Atmospheric air conditions
Suspect number one is the atmospheric air conditions. The summer of 2012 was influenced strongly by ‘dominant anti-cyclonic conditions’ which is where warm southerly air moves north and results in warmer and drier conditions. There was also a highly negative North Atlantic Oscillation (NAO) which created high temperatures at high altitudes around 4km above sea level, which could explain the melting on the summit. The researchers also pointed out that the drier conditions meant less cloud cover and more sunny days, contributing to speedier melting.

There were issues with the polar jet stream that summer: it got ‘blocked’ and stuck over Greenland for a while. The researchers used the Greenland Blocking Index (GBI), which, while not trading on the NYSE, does tell you about anomalies in geopotential height – roughly, the altitude of a given atmospheric pressure level (yeah, lots of meteorological words in this paper!). All of this makes the atmosphere look pretty guilty.

Jet stream getting funky – temperature anomaly patterns at 600 hectopascals pressure, aka 4000m above sea level with a big red blob over Greenland (from paper)

Sea surface temperatures
Suspect number two is sea surface temperatures. If it was warmer in the ocean, that could have created conditions where the ice sheet melted faster, right? The researchers ran a simulation of the conditions around Greenland for the summer of 2012 and then played around with different sea surface temperatures and salinity levels. It made no more than 1% difference, so they don’t think it was the sea surface. Also, ocean temperatures change more slowly than air temperatures (that’s why the ocean is still so cold even in the middle of summer!), and when they looked at the data for sea surface temperature, it was actually a bit cooler in 2012 than it was in 2011. Not guilty, sea surface temperatures.

Sea surface temperatures (top) and salinity (bottom) both decreasing (from paper)

Cloud patterns
Suspect number three is cloud cover patterns, which the researchers said could be a contributor to the ice sheet melting. However, they don’t have a detailed enough data set to work out how much clouds could have contributed to the melt. Not guilty for now, clouds – due to lack of evidence.

Which leaves suspect number one – atmospheric air conditions. Guilty! Or, as the paper says ‘our present results strongly suggest that the main forcing of the extreme Greenland Ice Sheet surface melt in July 2012 was atmospheric, linked with changes in the summer NAO, GBI and polar jet stream’.

Now comes the scary part – it’s the atmosphere that we’ve been conducting an accidental experiment on over the last 200 years by burning fossil fuels. As the researchers point out, the North Atlantic Oscillation has natural variability and patterns, so we could all cross our fingers and hope that the Greenland melting was a one-off anomaly. Given the work that Dr Jennifer Francis has been doing at Rutgers on polar ice melt and how it slows the jet stream and causes it to meander, this may not be a good bet. Combine this with the fact that this level of melting is well beyond ‘the most pessimistic future projections’ and it’s getting scarier. This kind of melting was not supposed to occur until 2100, or 2050 in the worst case scenarios.

Interestingly, this could also link through to some of the work Jason Box is doing with his DarkSnow project in Greenland, looking at how soot from fires and industry is affecting the melting of Greenland. The paper concludes that the next 5-10 years will show us whether 2012 was a one-off or the beginning of a new normal. Unless we stop burning carbon, it will only be the start of a terrifying new normal.

Playing the Emissions on Shuffle

What do the emission reductions of industrialised nations look like when you count the imports manufactured overseas?

WHO: Glen P. Peters, Center for International Climate and Environmental Research, Oslo, Norway
Jan C. Minx, Department for Sustainable Engineering, and Department for the Economics of Climate Change, Technical University Berlin, Germany
Christopher L. Weber, Science and Technology Policy Institute, Washington, Civil and Environmental Engineering, Carnegie Mellon University, Pittsburgh, USA
Ottmar Edenhofer, Department for the Economics of Climate Change, Technical University Berlin, Potsdam Institute for Climate Impact Research, Potsdam, Germany

WHAT: Measuring the transfer of CO2 emissions through international trade

WHEN: 24 May 2011

WHERE: Proceedings of the National Academy of Sciences (PNAS) vol. 108 no. 21, May 2011

TITLE: Growth in emission transfers via international trade from 1990 to 2008 (open access)

These researchers have found a problem with the way we count carbon emissions. Countries tend to count the emissions from industries that emit within their own territorial borders, which means that emissions in the developing world have kept going up, while emissions in the developed world (or first world) have either flattened or dropped, depending on how much your government likes to admit the reality of climate change.

However, many of the emissions in the developing world come from producing goods for places like North America and Europe. So these researchers wanted to work out exactly how much international trade contributed towards global emissions increasing by 39% from 1990 to 2008. Was the increase in emissions due to development in countries like China, or was it a case of wealthy countries just shuffling their manufacturing emissions to another country and continuing to increase consumption rates?

As you might guess (spoiler alert) it’s the latter. Turns out all we’ve been doing is moving most of our industrial manufacturing emissions to developing countries and importing the products back, allowing everyone to say ‘yay, we reduced emissions!’ while the actual amount of carbon being burned continues to increase.

Growth in emissions transferred around the globe – dumping our responsibility on other countries (from paper)

But don’t take my word for it – what does the paper say?

The researchers took the global economy and broke it down into 113 regions with 95 individual countries and 57 economic sectors. They then looked at all the national and international data they could get on supply chain emissions to produce goods from 1990 to 2008, as well as doing extra detailed analysis for the years 1997, 2001 and 2004. They called it a Time Series with Trade, and it was based on GDP, bilateral trade and emissions statistics (all of which you can generally find at your national statistics office online). The only thing they left out of their analysis was emissions from land use change, because there wasn’t enough data for them to thoroughly analyse it.

They found that global CO2 emissions from exported goods rose from 4.3 Gigatonnes (Gt) in 1990 to 7.8 Gt of CO2 in 2008, with a big increase in the decade up to 2008. Exports have increased their share of global emissions from 20% to 26% and grew on average by 4.3% per year, which was faster than the global population grew (1.4%), faster than total global CO2 emissions grew (2%) and faster than global GDP grew (3.6%).

The only thing that export emissions didn’t grow faster than was the dollar value of all that trade, which increased by 12% each year. So not only are all those new iPhones costing you a lot of money (and making Apple super wealthy), they’re also burning a lot of carbon.

But the thing the paper points out is that international trade has simply shifted the location of the emissions, rather than reducing them – shuffling them around the planet to avoid counting them. The researchers estimate that the transfer of emissions from wealthy countries to developing countries has grown by about 17% per year, increasing from 0.4 Gt of CO2 in 1990 to 1.6 Gt in 2008.

This is an issue, because it means that all of the countries that signed on to Kyoto to reduce their carbon emissions – most of which promised around 0.7 Gt CO2 reduction per year – have simply shifted those emissions through trade to make them someone else’s problem, while continuing to consume stuff at an ever increasing rate.

More and more stuff (epSos, flickr)

The researchers point out that while China is currently the world’s largest emitter of carbon emissions, with the USA at number two, if you counted consumption emissions (meaning you made the USA count the emissions for all the stuff they use that’s made in China), they’d swap places and the USA would be the world’s largest emitter.

This makes sense if you think it through – have a look around your house at everything that’s made in China. All of that carbon that China is burning, which is destroying their air quality and polluting their cities and people; all of that is to make stuff for you to consume.

If you count the consumption emissions, the emissions reduction of 3% from the developed world becomes an emissions growth of 11%. Oops. Also, the researchers point out that emissions reductions in wealthy countries are often exceeded by the growth of trade emissions.
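The accounting shuffle driving these numbers is simple enough to sketch in a few lines of code. The figures below are invented for illustration (they are not from the paper); the formula is the standard consumption-based accounting identity:

```python
# Illustrative sketch of territorial vs consumption-based carbon accounting.
# The figures below are invented for illustration – they are NOT from the paper.

def consumption_emissions(territorial, embodied_exports, embodied_imports):
    """Consumption-based emissions: what a country emits at home,
    minus the emissions embodied in what it exports,
    plus the emissions embodied in what it imports."""
    return territorial - embodied_exports + embodied_imports

# A hypothetical wealthy importer country (all values in Gt CO2)
territorial = 5.0        # emitted inside its own borders
exports = 0.5            # embodied in goods it sells abroad
imports = 1.6            # embodied in goods it buys from abroad

print(round(consumption_emissions(territorial, exports, imports), 2))  # 6.1
```

Swap in real territorial and trade-embodied figures and you get the USA/China rank swap described above: big importers look worse, big exporters look better.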

Emission reductions vs emissions transferred to developing world. Annex B: developed world, non-Annex B: developing world (from paper)

So what does this mean, other than the fact that everyone is trying to avoid having to clean up their own mess?

It means there’s a problem with the way we count emissions from trade vs emissions from consumption. It also means that we’re currently failing to reduce our carbon emissions in any significant way, which puts us on a straight path to 4, 5 or 6°C of global warming, otherwise known as the next mass extinction.

Climate Question: Do We Get to Keep Flying?

An analysis of jet fuel alternatives that could be viable in the next decade.

WHO: James I. Hileman, Hsin Min Wong, Pearl E. Donohoo, Malcolm A. Weiss, Ian A. Waitz, Massachusetts Institute of Technology (MIT)
David S. Ortiz, James T. Bartis, RAND Corporation Environment, Energy and Economic Development Program

WHAT: A feasibility study of alternatives to conventional jet fuel for aircraft.

WHEN: 2009

WHERE: Published on both the MIT website and RAND Corporation website

TITLE: Near-Term Feasibility of Alternative Jet Fuels

Last week, I looked at how our transport systems could be carbon free by 2100 and was intrigued by the comment ‘hydro-processed renewable jet fuel made from plant oils or animal fats is likely to be the only biomass-based fuel that could be used as a pure fuel for aviation, but would require various additives in order to be viable as an aviation fuel’.

It made me wonder what was being done for airplane fuel alternatives, or do we not have any alternatives and will I have to give up visiting my family in Australia?

Any other options? (photo: Amy Huva 2013)

I came across this technical report by MIT and the RAND Corporation (apparently RAND stands for Research ANd Development) and sat down to read all 150 pages (you’re welcome) to see which fuels we could feasibly use in the next decade.

The paper compared alternative fuels on the basis of compatibility with existing aircraft and infrastructure, production potential, production costs, lifecycle Greenhouse Gas (GHG) emissions, air quality emissions, merit of the fuel as jet fuel vs ground fuel and the maturity of the technology.

The researchers pointed out (quite rightly) that emissions from biofuels need to take into account the carbon emitted through land use changes, because if you clear-fell a forest to plant a biofuel crop, any carbon you’ve saved by not burning oil has just been invalidated by the carbon emitted from clear-felling the forest.

Deforestation: not helpful. (Image by: Aidenvironment, flickr)

The paper looked at five different fuel groups:

  1. Conventional Petroleum
  2. Unconventional Petroleum
  3. Synthetic fuel from natural gas, coal or biomass
  4. Renewable oils
  5. Alcohols

The standard fuel used in North America for aviation is called Jet A and was used as the benchmark for the study. So what did they find?

Conventional Petroleum Fuels

Almost all Jet A fuel comes from crude oil and is kerosene based. The emissions from Jet A are 87.5g of CO2e (CO2 equivalent) per megajoule (MJ) of energy created (g CO2e/MJ). Of that 87.5g, 73.2g comes from the burning of the fuel and there can be a 7% variation on the amount emitted from refining depending on the quality of the crude oil used and the refining process.
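As a quick sanity check on those numbers, the upstream share (extraction, refining and transport) is just the gap between the lifecycle total and the combustion figure:

```python
# Quick check on the Jet A lifecycle numbers quoted above (all in g CO2e/MJ).
total_lifecycle = 87.5   # full lifecycle ("well-to-wake") emissions
combustion = 73.2        # emitted when the fuel is actually burned

upstream = total_lifecycle - combustion            # extraction + refining + transport
print(round(upstream, 1))                          # 14.3 g CO2e/MJ upstream
print(round(100 * upstream / total_lifecycle))     # ~16% of the total
```

So roughly a sixth of Jet A’s climate impact happens before the fuel ever reaches the plane, which is the part that varies with crude quality and refining process.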

The world consumption of jet fuel is estimated at 5 million barrels of oil per day. This number is really hard to wrap your head around, so let me quickly walk you through some math. A barrel of oil is 159 L, which means 5 million barrels per day is 795,000,000 L of oil burned each day. To pump that volume through a fire hose (359 L/minute), you would have to run it non-stop for over four years (yes, YEARS). We burn that much in one day.
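If you want to check my math, the conversion goes like this, using the barrel size and fire hose flow rate quoted above:

```python
# Checking the jet fuel volume math, using the barrel size and
# fire hose flow rate quoted in the text.
BARREL_LITRES = 159
barrels_per_day = 5_000_000
hose_litres_per_minute = 359

litres_per_day = barrels_per_day * BARREL_LITRES
print(litres_per_day)          # 795000000 litres burned every single day

minutes_of_spraying = litres_per_day / hose_litres_per_minute
years = minutes_of_spraying / (60 * 24 * 365.25)
print(round(years, 1))         # ~4.2 years of continuous fire-hosing
```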

Given that a conventional fuel is already oil based and you can’t reduce those carbon emissions, the tweak for this paper was an Ultra Low Sulfur Jet A fuel, which would reduce sulfur emissions from burning the fuel.

While it’s great to reduce the sulfur emissions that cause acid rain, the extra refining needed upped the lifecycle emissions to 89g CO2e/MJ.

Unconventional Petroleum Fuels

Unconventional fuels are things like the Canadian tar sands (or oil sands if you’re their PR people) and Venezuelan Heavy Oil. These oils are dirtier and require more refining to be made into jet fuel. They also require more effort to get out of the ground, and so the lifecycle emissions are 103g CO2e/MJ (with an uncertainty of 5%). The upstream emissions of sourcing and refining the fuel are what add the extra – burning the fuel has the same emissions as Jet A, and the upstream emissions range from 16g CO2e/MJ for open cut mining to 26g CO2e/MJ for in-situ mining.

You can also refine shale oil into Jet A. Shale-based Jet A burns the same as Jet A too, but the extraction emissions are a whopping 41g CO2e/MJ – double the tar sands extraction emissions – giving overall lifecycle emissions of 114.2g CO2e/MJ.

Fischer-Tropsch Synthetic Fuels

These are fuels derived through the catalysed Fischer-Tropsch process and then refined into a fuel. These fuels are good because they have almost zero sulfur content (and therefore almost zero sulfur emissions). They don’t work as a 100% fuel without an engine refit because of the different aromatic hydrocarbon content, and the energy density is 3% less than Jet A (meaning you’d need 3% more fuel in the tank to go the same distance as Jet A fuel). However, it does combine easily to make a 50/50 blend for planes.

You can make FT Synthetic fuel from natural gas which gives you 101g CO2e/MJ emissions, from coal which gives you between 120-195g CO2e/MJ and relies on carbon capture and storage as a technical fix, or from biomass, which has almost zero lifecycle emissions ONLY if you use a waste product as the source and don’t contribute to further land use changes.
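Putting those FT numbers next to the Jet A baseline of 87.5g CO2e/MJ makes the comparison starker. A rough sketch using the figures quoted above (I’m assuming the low end of the coal range is the with-CCS case, which the text implies but doesn’t state outright):

```python
# Comparing the FT feedstock lifecycle figures quoted above against
# the Jet A baseline. Assumption: the low end of the coal range is the
# carbon-capture-and-storage case.
JET_A = 87.5  # g CO2e/MJ

feedstocks = {
    "natural gas": 101,
    "coal, no CCS (upper bound)": 195,
    "coal, with CCS (lower bound)": 120,
    "biomass from waste": 0,  # "almost zero" lifecycle emissions
}

for name, emissions in feedstocks.items():
    change = 100 * (emissions - JET_A) / JET_A
    print(f"{name}: {change:+.0f}% vs Jet A")
```

In other words, only the waste-biomass route actually cuts emissions; gas and coal FT fuels are 15% to over 120% dirtier than the fuel they’d replace.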

Renewable Oils

These are biodiesel or biokerosene which can be made from soybean oil, canola oil, palm oil, coconut oil, animal fats, waste products or halophytes and algae.

Because this paper was looking at fuels that could be commercially used in the next 10 years, they looked at a 5% blend with Jet A fuel to meet freeze point requirements (most renewable oils freeze at too high a temperature for the altitude planes fly at). They found too many safety and freezing point issues with biodiesel or biokerosene, so didn’t calculate the emissions from them as they’re not practical for use.

Another renewable oil is Hydroprocessed Jet Fuel entertainingly sometimes called ‘Syntroleum’. This is made from plant oils, animal fats or waste grease. Soybean oil without land use emissions would have only 40-80% of the emissions of Jet A, while palm oil would have 30-40% the emissions of Jet A.

Alcohol Fuels

The paper looked at using ethanol (the alcohol we drink) and butanol as replacement fuels. They both have lower energy densities than Jet A, higher volatility (being flammable and explosive) and issues with freezing at cruising altitude. While butanol is slightly safer to use as a jet fuel than ethanol, the report suggests it’s better used as a ground transport fuel than a jet fuel (I assume the better use of ethanol as a drink is implied).

Options for jet fuel alternatives (from paper)

After going through all the options, the researchers found that the three main alternative fuels that could be commercially implemented over the next decade are:

  1. Tar sands oil
  2. Coal-derived FT Synthetic oil
  3. Hydroprocessed Renewable jet fuel

They recommended that, when looking to reduce emissions from the transport sector, aviation shouldn’t be treated any differently. While strongly recommending that land use changes be taken into account for biofuels, they also pointed out that aviation may not even be the best use for them – limited biofuel resources may be more effective producing heat and power than fuelling transport.

Personally, I don’t find the report very heartening given that the first two options involve either dirtier oil or really dirty coal when what we need to be doing is reducing our emissions, not changing the form they’re in and still burning them. I’ll be keeping my eye out for any new research into hydroprocessed renewable jet fuels that could use waste products or algae – given the speed that oceans are acidifying, there could be a lot of ocean deadzones that are really good at growing algae and could be used as a jet fuel.

But until then, it looks like there aren’t many options for the continuation of air travel once we start seriously reducing our emissions – flights will be a really quick way to burn through our remaining carbon budget.

Pandora’s Permafrost Freezer

What we know about permafrost melt is less than what we don’t know about it. So how do we determine the permafrost contribution to climate change?

WHO: E. A. G. Schuur, S. M. Natali, C. Schädel, University of Florida, Gainesville, FL, USA
B. W. Abbott, F. S. Chapin III, G. Grosse, J. B. Jones, C. L. Ping, V. E. Romanovsky, K. M. Walter Anthony University of Alaska Fairbanks, Fairbanks, AK, USA
W. B. Bowden, University of Vermont, Burlington, VT, USA
V. Brovkin, T. Kleinen, Max Planck Institute for Meteorology, Hamburg, Germany
P. Camill, Bowdoin College, Brunswick, ME, USA
J. G. Canadell, Global Carbon Project CSIRO Marine and Atmospheric Research, Canberra, Australia
J. P. Chanton, Florida State University, Tallahassee, FL, USA
T. R. Christensen, Lund University, Lund, Sweden
P. Ciais, LSCE, CEA-CNRS-UVSQ, Gif-sur-Yvette, France
B. T. Crosby, Idaho State University, Pocatello, ID, USA
C. I. Czimczik, University of California, Irvine, CA, USA
J. Harden, US Geological Survey, Menlo Park, CA, USA
D. J. Hayes, M. P. Waldrop, Oak Ridge National Laboratory, Oak Ridge, TN, USA
G. Hugelius, P. Kuhry, A. B. K. Sannel, Stockholm University, Stockholm, Sweden
J. D. Jastrow, Argonne National Laboratory, Argonne, IL, USA
C. D. Koven, W. J. Riley, Z. M. Subin, Lawrence Berkeley National Lab, Berkeley, CA, USA
G. Krinner, CNRS/UJF-Grenoble 1, LGGE, Grenoble, France
D. M. Lawrence, National Center for Atmospheric Research, Boulder, CO, USA
A. D. McGuire, U.S. Geological Survey, Alaska Cooperative Fish and Wildlife Research Unit, University of Alaska, Fairbanks, AK, USA
J. A. O’Donnell, Arctic Network, National Park Service, Fairbanks, AK, USA
A. Rinke, Alfred Wegener Institute, Potsdam, Germany
K. Schaefer, National Snow and Ice Data Center, Cooperative Institute for Research in Environmental Sciences, University of Colorado, Boulder, CO, USA
J. Sky, University of Oxford, Oxford, UK
C. Tarnocai, AgriFoods, Ottawa, ON, Canada
M. R. Turetsky, University of Guelph, Guelph, ON, Canada
K. P. Wickland, U.S. Geological Survey, Boulder, CO, USA
C. J. Wilson, Los Alamos National Laboratory, Los Alamos, NM, USA
 S. A. Zimov, North-East Scientific Station, Cherskii, Siberia

WHAT: Interviewing and averaging the best estimates by world experts on how much permafrost in the Arctic is likely to melt and how much that will contribute to climate change.

WHEN: 26 March 2013

WHERE: Climatic Change, Vol. 117, Issue 1-2, March 2013

TITLE: Expert assessment of vulnerability of permafrost carbon to climate change (open access!)

We are all told that you should never judge a book by its cover, however I’ll freely admit that I chose to read this paper because the headline in Nature Climate Change was ‘Pandora’s Freezer’ and I just love a clever play on words.

So what’s the deal with permafrost and climate change? Permafrost is the solid, permanently frozen dirt/mud/sludge in the Arctic that often looks like cliffs of chocolate mousse when it’s melting. The fact that it’s melting is the problem, because when it thaws, microbes start breaking down the previously frozen organic matter and the carbon gets released into the atmosphere.

Releasing ancient carbon into the atmosphere is what humans have been doing at an ever greater rate since we worked out that fossilised carbon makes a really efficient energy source, so when the Arctic starts doing that as well, it’s adding to the limited remaining carbon budget our atmosphere has left. Which means melting permafrost has consequences for how much time humanity has left to wean ourselves off our destructive fossil fuel addiction.

Cliffs of chocolate mousse (photo: Mike Beauregard, flickr)

How much time do we have? How much carbon is in those cliffs of chocolate mousse? We’re not sure. And that’s a big problem. Recent research estimates there could be as much as 1,700 billion tonnes of carbon stored in permafrost in the Arctic, which is much higher than earlier estimates from research in the 1990s.

To give that very large number some context, 1,700 billion tonnes can also be called 1,700 Gigatonnes (Gt), which should ring a bell for anyone who read Bill McKibben’s Rolling Stone global warming math article. The article stated that the best current estimate for humanity to have a shot at keeping global average temperatures below a 2°C increase is a carbon budget of 565 Gt of CO2. The permafrost store is measured in tonnes of carbon, which becomes roughly 6,200 Gt of CO2 if it all ends up in the atmosphere – so if all the permafrost melted, we’d blow that budget many times over.

What this paper did was ask the long list above of experts on soil, carbon in soil, permafrost and Arctic research three questions, each over three different time scales.

  1. How much permafrost is likely to degrade (aka quantitative estimates of surface permafrost degradation)
  2. How much carbon it will likely release
  3. How much methane it will likely release

They included the methane question because methane has short-term ramifications for the atmosphere. Methane only hangs around for a decade or so (compared to carbon dioxide’s 1,000-plus years), but while it counts, it has 33 times the global warming potential (GWP) of CO2 over a 100-year period. So averaged over the hundred years after you’ve released it, one tonne of methane is as bad as 33 tonnes of CO2. This could quickly blow our carbon budgets as we head merrily past 400 parts per million of CO2 in the atmosphere from human forcing.
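That conversion is worth spelling out, since the paper’s methane results further down are reported this way:

```python
# The methane accounting used in the paper: over a 100-year window,
# one tonne of CH4 traps as much heat as 33 tonnes of CO2.
GWP100_CH4 = 33

def ch4_to_co2e(ch4_tonnes):
    """Convert a mass of methane to its 100-year CO2 equivalent."""
    return ch4_tonnes * GWP100_CH4

print(ch4_to_co2e(1))    # 33
print(ch4_to_co2e(0.5))  # 16.5
```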

The time periods for each question were: by 2040 with 1.5-2.5°C Arctic temperature rise (the Arctic warms faster than lower latitudes); by 2100 with between 2.0-7.5°C temperature rise (so from ‘we can possibly deal with this’ to ‘catastrophic climate change’); and by 2300 with temperatures stable after 2100.

The estimates the experts gave were then screened for level of expertise (you don’t want to be asking an atmospheric specialist the soil questions!) and averaged to give an estimate range. For surface loss of permafrost under the highest warming scenario, the results were:

  1. 9-16% loss by 2040
  2. 48-63% loss by 2100
  3. 67-80% loss by 2300

Permafrost melting estimates for each time period over four different emissions scenarios (from paper)

Ouch. If we don’t start doing something serious about reducing our carbon emissions soon, we could be blowing that carbon budget really quickly.

For how much carbon the highest warming scenario may release, the results were:

  1. 19-45 billion tonnes (Gt) of CO2 by 2040
  2. 162-288Gt CO2 by 2100
  3. 381-616Gt CO2 by 2300

Hmm. So if we don’t stop burning carbon by 2040, melting permafrost will have taken 45 Gt of CO2 out of our atmospheric carbon budget of 565 Gt. Let’s hope we haven’t burned through the rest by then too.

However, if Arctic temperature rises were limited to 2°C by 2100, the CO2 emissions would ‘only’ be:

  1. 6-17Gt CO2 by 2040
  2. 41-80Gt CO2 by 2100
  3. 119-200Gt CO2 by 2300

That’s about a third of the highest warming estimates, but still nothing to breathe a sigh of relief at, given that the 2000-2010 average rate of fossil fuel burning was 7.9 Gt of carbon (roughly 29 Gt of CO2) per year. So even the low scenario has permafrost releasing several years’ worth of global emissions by 2300 – carbon we then don’t get to burn ourselves, meaning we’d have to stop burning fossil fuels that much earlier.
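To put those release numbers against our burn rate, here’s the arithmetic, with the unit trap flagged: the 7.9 Gt/yr figure is usually quoted in tonnes of carbon, so it needs multiplying by 44/12 (the molar mass ratio of CO2 to carbon) before comparing it with the CO2 estimates above:

```python
# How many years of global fossil fuel emissions do the low-warming
# permafrost estimates represent? Unit trap: the 2000-2010 burn rate of
# ~7.9 Gt/yr is tonnes of CARBON, so convert to CO2 (x 44/12) first.
ANNUAL_GTC = 7.9
annual_gtco2 = ANNUAL_GTC * 44 / 12
print(round(annual_gtco2))    # ~29 Gt CO2 per year

# Low ends of the 2°C-scenario estimates quoted above (Gt CO2)
for year, low_gtco2 in [(2040, 6), (2100, 41), (2300, 119)]:
    print(year, round(low_gtco2 / annual_gtco2, 1), "years of global emissions")
```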

When the researchers calculated the expected methane emissions, the estimates were low. However, when they calculated the CO2 equivalent (CO2e) for the methane (methane being 33 times more potent than CO2 over 100 years), they got:

  1. 29-60Gt CO2e by 2040
  2. 250-463Gt CO2e by 2100
  3. 572-1004Gt CO2e by 2300

Thankfully, most of the carbon in the permafrost is expected to be released as the less potent carbon dioxide, but working out the balance between how much methane may be released into the atmosphere vs how much will be carbon dioxide is really crucial for working out global carbon budgets.

The other problem is that most climate models that look at permafrost contributions to climate change do it in a linear manner where increased temps lead directly to an increase in microbes and bacteria and the carbon is released. In reality, permafrost is much more dynamic and non-linear and therefore more unpredictable, which makes it a pain to put into models. It’s really difficult to predict abrupt thaw processes (as was seen over 98% of Greenland last summer) where ice wedges can melt and the ground could collapse irreversibly.

These kinds of non-linear processes (the really terrifying bit about climate change) made the news this week when it was reported that the Alaskan town of Newtok is likely to wash away by 2017, making the townspeople the first climate refugees from the USA.

The paper points out that one of the key limitations to knowing exactly what the permafrost is going to do is the lack of historical permafrost data. Permafrost is in really remote hard to get to places where people don’t live because the ground is permanently frozen. People haven’t been going to these places and taking samples unlike more populated areas that have lengthy and detailed climate records. But if you don’t know how much permafrost was historically there, you can’t tell how fast it’s melting.

The key point from this paper is that even though we’re not sure exactly how much permafrost will contribute to global carbon budgets and temperature rise, this uncertainty alone should not be enough to stall action on climate change.

Yes, there is uncertainty in exactly how badly climate change will affect the biosphere and everything that lives within it, but currently our options range from ‘uncomfortable and we may be able to adapt’ to ‘the next mass extinction’.

So while we’re working out exactly how far we’ve opened the Pandora’s Freezer of permafrost, let’s also stop burning carbon. 

Crash Diets and Carbon Detoxes: Irreversible Climate Change

Much of the changes humans are causing in our atmosphere today will be largely irreversible for the rest of the millennium.

WHO: Susan Solomon, Chemical Sciences Division, Earth System Research Laboratory, National Oceanic and Atmospheric Administration, Boulder, Colorado, USA
Gian-Kasper Plattner, Institute of Biogeochemistry and Pollutant Dynamics, Zurich, Switzerland
Reto Knutti, Institute for Atmospheric and Climate Science, Zurich, Switzerland,
Pierre Friedlingstein, Institut Pierre Simon Laplace/Laboratoire des Sciences du Climat et de  l’Environnement, Unité Mixte de Recherche à l’Energie Atomique – Centre National de la Recherche Scientifique–Université Versailles Saint-Quentin, Commissariat a l’Energie Atomique-Saclay, l’Orme des Merisiers, France

WHAT: Looking at the long term effects of climate pollution to the year 3000

WHEN: 10 February 2009

WHERE: Proceedings of the National Academy of Sciences of the USA (PNAS), vol. 106, no. 6 (2009)

TITLE: Irreversible climate change due to carbon dioxide emissions

Stopping climate change often involves the metaphor of ‘turning down the thermostat’ of the heater in your house; the heater gets left on too high for too long, you turn the thermostat back down, the room cools down, we are all happy.

This seems to also be the way many people think about climate change – we’ve put too much carbon pollution in the atmosphere for too long, so all we need to do is stop it, and the carbon dioxide will disappear like fog burning off in the morning.

Except it won’t. This paper is from 2009, but I came across it recently while reading around the internet. It looks at the long-term effects of climate change and found that the effects of CO2 emissions can still be felt for 1,000 years after we stop polluting. Bummer. So much for that last-minute carbon detox that politicians seem to be betting on. Turns out it won’t do much.

The researchers defined ‘irreversible’ in this paper as 1,000 years, running their models out to just beyond the year 3000, because 1,000 years is more than 10 human generations. Geologically, it’s not forever, but from our human point of view it pretty much is.

So what’s going to keep happening because we can’t give up fossil fuels today that your great-great-great-great-great-great-great-great-great-great grandkid is going to look back on and say ‘well that was stupid’?

The paper looked at the three most detailed and well known effects: atmospheric temperatures, precipitation patterns and sea level rise. Other long term impacts will be felt through Arctic sea ice melt, flooding and heavy rainfall, permafrost melt, hurricanes and the loss of glaciers and snowpack. However, the impacts with the most detailed models and greatest body of research were the ones chosen for this paper (which also excluded the potential for geo-engineering because it’s still very uncertain and unknown).

Our first problem is going to be temperature increases, because temperatures rise as CO2 accumulates in the atmosphere. But even if we turned off emissions completely (practically unfeasible, but the cleanest way to model the long-term effects), temperatures would remain constant to within about 0.5°C until the year 3000.

Why does this occur? Why does the temperature not go back down just as quickly once we stop feeding it more CO2? Because CO2 stays in the atmosphere for a much longer time than other greenhouse gases. As the paper says: ‘current emissions of major non-carbon dioxide greenhouse gases such as methane or nitrous oxide are significant for climate change in the next few decades or century, but these gases do not persist over time in the same way as carbon dioxide.’

Temperature changes to the year 3000 with different CO2 concentration peaks (from paper)

Our next problem is changing precipitation patterns, which are governed by the Clausius–Clapeyron relation describing phase transitions in matter. What the relation tells us is that as temperature increases, the atmosphere holds more water vapour, which changes how that vapour is transported through the atmosphere, changing the hydrological cycle.
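
The Clausius–Clapeyron relation works out to roughly 6–7% more water vapour per degree of warming near everyday temperatures. A quick sketch using the August–Roche–Magnus approximation (a standard empirical fit, not something from the paper):

```python
import math

def saturation_vapour_pressure(t_celsius):
    """August-Roche-Magnus approximation for saturation vapour
    pressure over water, in hPa (an empirical fit)."""
    return 6.1094 * math.exp(17.625 * t_celsius / (t_celsius + 243.04))

# How much more water vapour can air hold per degree of warming?
increase = saturation_vapour_pressure(21.0) / saturation_vapour_pressure(20.0) - 1
print(f"~{increase:.1%} more water vapour per degree near 20 C")
```

Run it and you get an increase in the mid-six-percent range per degree, which is why a warmer atmosphere shifts the whole hydrological cycle.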

The paper notes that these patterns are already emerging, consistent with the models, in the Southwest of the USA and the Mediterranean. They found that dry seasons will become approximately 10% drier for each degree of warming, so the Southwest of the USA is expected to be approximately 10% drier with 2°C of global warming. As a comparison, the Dust Bowl of the 1930s was 10% drier over two decades. Given that many climate scientists (and the World Bank) think we’ve already reached the point where 2°C of warming is inevitable, it seems like Arizona is going to become a pretty uncomfortable place to live.

Additionally, even if we managed to peak at 450 ppm of CO2, irreversible decreases in dry-season precipitation of ~8–10% would be expected in large areas of Europe, Western Australia and North America.

Dry season getting drier around the world (from paper)

Finally, the paper looked at sea level rise, which is a triple whammy. The first issue is that warming causes seawater to expand (aka thermal expansion), which raises sea level. The second is that ocean mixing through currents will continue, carrying warmth into deeper water and continuing the thermal expansion. Thirdly, melting icecaps on land add new volume to the ocean.

The paper estimates the eventual sea level rise from thermal expansion of warming water at 20–60 cm per degree of warming. Additionally, the loss of glaciers and small icecaps will give us ~20–70 cm of sea level rise too, so we’re looking at 40–130 cm of sea level rise even before we start counting Greenland (which is melting faster than most estimates anyway).
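
Those quoted ranges combine by simple interval addition (lows with lows, highs with highs); this is just the blog’s arithmetic restated, not a new estimate:

```python
# Sea level rise ranges quoted in the paper (metres)
thermal_expansion = (0.20, 0.60)   # per degree of warming
glaciers_icecaps = (0.20, 0.70)    # eventual loss of glaciers and small icecaps

# Adding independent ranges: lows add together, highs add together
low = thermal_expansion[0] + glaciers_icecaps[0]
high = thermal_expansion[1] + glaciers_icecaps[1]
print(f"Combined: {low * 100:.0f}-{high * 100:.0f} cm, before counting Greenland")
```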

Sea level rise from thermal expansion only with different CO2 concentration peaks (from paper)

What does all of this mean? Well, firstly it means you should check how far above sea level your house is, and you may want to hold off on that ski cabin too, given all the melting snowpack.

More importantly though, it means that any last-minute ‘saves the day’ Hollywood-style plans for reversing climate change as the proverbial clock counts down to zero are misguided and wrong. The climate pollution we are spewing into the atmosphere at ever greater rates today will linger as a carbon hangover for humanity for the next 1,000 years or so. Within human time scales, the changes we are causing to our atmosphere are irreversible.

So naturally, we should stop burning carbon now.

What’s in a Standard Deviation?

“By 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario” Marcott et al.

WHO: Shaun A. Marcott, Peter U. Clark, Alan C. Mix, College of Earth, Ocean, and Atmospheric Sciences, Oregon State University, Corvallis, Oregon, USA
 Jeremy D. Shakun, Department of Earth and Planetary Sciences, Harvard University, Cambridge, Massachusetts, USA.

WHAT: A historical reconstruction of average temperature for the past 11,300 years

WHEN: March 2013

WHERE: Science, Vol. 339 no. 6124, 8 March 2013 pp. 1198-1201

TITLE: A Reconstruction of Regional and Global Temperature for the Past 11,300 Years (subs req.)

We all remember the standard deviation bell curve from high school statistics: in a population (like your class at school) there will be a distribution of some trait, with most people falling near the mean. The usual example you start off with is the height of everyone in your classroom – most people will be around the same height, some will be taller, some shorter.

The further a value is from the mean, the less likely it is to occur, because around 68% of the population falls within one standard deviation either side of the mean. The important bit to keep in mind when reading about this paper is that three standard deviations either side of the mean cover 99.7% of all the data. The odds of a data point falling more than three standard deviations from the mean are roughly 0.1% on either side.
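
Those coverage figures fall straight out of the normal distribution; a quick check with the error function (standard statistics, nothing specific to the paper):

```python
import math

def coverage_within(k_sigma):
    """Fraction of a normal population within k standard deviations of the mean."""
    return math.erf(k_sigma / math.sqrt(2))

for k in (1, 2, 3):
    print(f"within {k} sigma: {coverage_within(k):.1%}")

# Tail beyond 3 sigma, split across both sides of the curve
print(f"beyond 3 sigma: {1 - coverage_within(3):.2%} total")
```

This prints roughly 68.3%, 95.4% and 99.7% for one, two and three standard deviations, with about a quarter of a percent left in the two tails combined.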

The standard deviation bell curve (Wikimedia commons)

What does high school statistics have to do with global temperature reconstructions? Well, it’s always good to see what’s happening in the world within context. Unless we can see comparisons to what has come before, it can be really hard to see what is and isn’t weird when we’re living in the middle of it.

The famous ‘Hockey Stick’ graph, constructed for the past 1,500 years by eminent climate scientist Michael Mann, showed us how weird the current warming trend is compared to recent geologic history. But how does it compare to the whole of the Holocene?

Well, we live in unusual times. But first, the details. These researchers used 73 globally distributed temperature records based on various different proxies. A proxy means looking at the chemical composition of something that has been around longer than our thermometers, to work out what the temperature would have been. This can be done with ice cores, slow-growing trees and marine species like coral. According to the NOAA website, fossil pollen can also be used, which I think is awesome (because it’s kind of like Jurassic Park!).

They used more marine proxies than most other reconstructions (80% of their proxies were marine) because they’re better suited for longer reconstructions. The resolutions for the proxies ranged from 20 years to 500 years and the median resolution was 120 years.

They then ran the data through a Monte Carlo randomisation scheme (which is less exotic than it sounds) to quantify the uncertainty. Specifically, they ran a ‘white noise’ data set with a mean of zero through the same procedure, to check how the method behaves on known input. Then the chemical data was converted into temperature data before it all got stacked together into a weighted mean with a confidence interval. It’s like building a layer cake, but with math!
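
A toy version of that stacking idea, entirely illustrative – the proxy count matches the paper’s 73 records, but the trend and noise level are invented for the sketch:

```python
import random

random.seed(42)

def stack_proxies(n_proxies=73, n_times=100, noise=0.5):
    """Toy Monte Carlo stack: perturb each proxy with white noise
    (mean zero), then average across proxies at each time step."""
    true_signal = [0.01 * t for t in range(n_times)]  # invented warming trend
    stacked = []
    for t in range(n_times):
        perturbed = [true_signal[t] + random.gauss(0, noise) for _ in range(n_proxies)]
        stacked.append(sum(perturbed) / n_proxies)
    return true_signal, stacked

signal, stacked = stack_proxies()
max_err = max(abs(s - x) for s, x in zip(signal, stacked))
print(f"largest deviation of the stack from the underlying signal: {max_err:.3f}")
```

The point of the sketch: averaging many noisy proxies recovers the underlying signal, because zero-mean noise largely cancels across the stack.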

Interestingly, with their white noise data, they found the reconstruction was more faithful over longer time periods. Variability was preserved best at periods of 2,000 years or more, only half survived at a 1,000-year scale, and variability at periods shorter than 300 years was lost entirely.

They also found that their reconstruction lined up over the final 1,500 years to present with the Mann et al. 2008 reconstruction and was also consistent with Milankovitch cycles (which ironically indicate that without human interference, we’d be heading into the next glacial period right now).

Temperature reconstructions Marcott et al. in purple (with blue confidence interval) Mann et al. in grey (from paper)

They found that the global mean temperature for 2000–2009 has not yet exceeded the warmest temperatures of the Holocene, which occurred 5,000–10,000 years BP (before present). However, we are currently warmer than 82% of the Holocene distribution.

But the disturbing thing in this graph that made me feel really horrified (and I don’t get horrified by climate change much anymore, because I read so much on it that I’m somewhat de-sensitised to ‘end of the world’ scenarios) is the rate of change. The paper found that global temperatures have swung from the coldest of the Holocene (the bottom of the purple line before it spikes up suddenly) to among the warmest within the past century alone.

We are causing changes to happen so quickly in the earth’s atmosphere that something that would have taken over 11,000 years has just happened in the last 100. We’ve taken a 5,000 year trend of cooling and spiked up to super-heated in record time.

This would be bad enough on its own, but it’s not even the most horrifying thing in this paper. It’s this:

‘by 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario.’ (my emphasis)

Remember how I said keep in mind that 99.7% of all data points in a population are within three standard deviations on a bell curve? That’s because we are currently heading off the edge of the chart for weird and unprecedented climate, beyond even the 0.1% chance of occurring without human carbon pollution.

The A1B scenario by the IPCC is the ‘medium worst case scenario’ which we are currently outstripping through our continuously growing carbon emissions, which actually need to be shrinking. We are so far out into the tail of weird occurrences that it’s off the charts of a bell curve.

NOAA Mauna Loa CO2 data (396.80ppm at Feb. 2013)

As we continue to fail to reduce our carbon emissions in any meaningful way, we will reach 400 ppm (parts per million) of carbon dioxide in the atmosphere in the next few years. At that point we will truly be in uncharted territory for any time in human history, on a trajectory changing so rapidly it is off the charts, beyond 99.7% of the data for the last 11,300 years. The question for humanity is: are we willing to play roulette with our ability to adapt to this kind of rapidly changing climate?

Carbon in Your Supply Chain

How will a real price on carbon affect supply chains and logistics?

WHO: Justin Bull (PhD Candidate, Faculty of Forestry, University of British Columbia, Canada)
Graham Kissack (Communications, Environment and Sustainability Consultant, Mill Bay, Canada)
Christopher Elliott (Forest Carbon Initiative, WWF International, Gland, Switzerland)
Robert Kozak (Professor, Faculty of Forestry, University of British Columbia, Canada)
Gary Bull (Associate Professor, Faculty of Forestry, University of British Columbia, Canada)

WHAT: Looking at how a price on carbon can affect supply chains, with the example of magazine printing

WHEN: 2011

WHERE: Journal of Forest Products Business Research, Vol. 8, Article 2, 2011

TITLE: Carbon’s Potential to Reshape Supply Chains in Paper and Print: A Case Study (membership req)

Forestry is an industry that’s been doing it tough in the face of rapidly changing markets for a while. From the Clayoquot Sound protests of the 1990s to stop clearcutting, to the growing realisation that deforestation is one of the leading contributors to climate change, it’s the kind of industry where you either innovate or you don’t survive.

Which makes this paper – a case study of how monetising carbon has the potential to re-shape supply chains and make them low carbon – really interesting. From the outset, the researchers recognise where our planet is heading, stating that ‘any business that emits carbon will [have to] pay for its emissions’.

To look at the potential for low carbon supply chains, the paper follows the production and printing of a magazine in North America – measuring the carbon emissions from cutting down the trees, turning the trees into paper, transporting materials at each stage of the process, and the printing itself.

Trees to magazines (risa ikead, flickr)

They did not count the emissions of the distribution process or any carbon emissions related to disposal after it was read by the consumer because these had too many uncertainties in the data. However, they worked with the companies that were involved in the process to try and get the most accurate picture of the process they possibly could.

The researchers found that the majority of the carbon is emitted in the paper manufacturing process (41%) as the paper went from a tree on Vancouver Island, was shipped as fibre to Port Alberni in a truck, manufactured into paper and then shipped by truck and barge to Richmond and then by train to the printing press in Merced, California.

Activity                                                    Carbon emissions (kg CO2/ADt)   Percentage of total
Harvesting, road-building, felling, transport to sawmills   55 kg                           12%
Sawmilling into dimensional and residual products           45 kg                           10%
Transport of chips to mill                                   8 kg                            2%
Paper manufacturing process                                185 kg                           41%
Transportation to print facility                           127 kg                           28%
Printing process                                            36 kg                            8%
Total                                                      456 kg                          100%

Supply Chain Emissions (Table 1. Reproduced verbatim from hardcopy)
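
The percentages follow directly from the per-stage totals; a quick check using the numbers as quoted (stage names shortened for readability):

```python
# Per-stage emissions in kg CO2 per air-dried tonne, as quoted in the paper
stages = {
    "Harvesting and transport to sawmills": 55,
    "Sawmilling": 45,
    "Transport of chips to mill": 8,
    "Paper manufacturing": 185,
    "Transport to print facility": 127,
    "Printing": 36,
}
total = sum(stages.values())
for name, kg in stages.items():
    print(f"{name}: {kg} kg ({kg / total:.0%})")
print(f"Total: {total} kg")
```

The stages sum to 456 kg, and paper manufacturing’s 185 kg rounds to the 41% share reported in the study.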

The case study showed that upstream suppliers consume more energy than downstream suppliers; however, downstream suppliers are the most visible to consumers. That poses a challenge when trying to get the larger emitters to minimise their carbon footprint, as there’s less likelihood of consumer pressure on lesser-known organisations.

That being said, there can be major economic incentives for companies to minimise their carbon footprint, given that Burlington Northern Santa Fe Railways (who shipped the paper from Canada to the printing press in California in this study) spent approximately $4.6 billion on diesel fuel in 2008 (the data year for the case study).

Given that California implemented a carbon cap-and-trade market at the end of 2012, and that growing awareness of the urgency of reducing our carbon emissions rapidly and significantly means the price of carbon is likely to rise, $4.6 billion in diesel costs could rapidly escalate for a company like BNSF. If part or all of their transport could be switched to clean energy, the company would save itself significant amounts as polluting fossil fuel sources are phased out.

The companies in this study were very aware of these issues, which is encouraging. They agreed that carbon and sustainability will be considered more closely in the future and that carbon has the potential to change the value of existing industrial assets as corporations who are ‘carbon-efficient’ may become preferred suppliers.

The researchers identified three types of carbon-related risk that companies could face: regulatory risk, financial risk and market access risk. The innovative businesses that will thrive in a low carbon 21st-century economy will be thinking about and preparing for an economy that doesn’t burn carbon for fuel, or where burning carbon is no longer profitable.

I really liked the paper’s example of financial risk in the bond market ‘where analysts are projecting a premium on corporate bonds for new coal fired power plants’, meaning it will be harder for companies to raise money to further pollute our atmosphere. This is especially important given that Deutsche Bank and Standard and Poors put the fossil fuel industry on notice last week saying that easy finance for their fossil fuels party is rapidly coming to an end.

Of course, no-one ever wants to believe that the boom times are coming to an end. But the companies that think ahead of the curve and innovate to reduce their carbon risk instead of going hell for leather on fossil fuels will be the ones that succeed in the long run.

Hot Air Balloon – Heat Emissions in London

Detailed measurements of the thermal pollution in Greater London and a look at the long term trend.

WHO: Mario Iamarino, Dipartimento di Ingegneria e Fisica dell’Ambiente, Università degli Studi della Basilicata, Potenza, Italy
Sean Beevers,  Environmental Research Group, King’s College London, London, UK
C. S. B. Grimmond, Environmental Monitoring and Modelling Group, Department of Geography, King’s College London, London, UK

WHAT: Measuring the heat pollution of London, resolved in both the space and the time in which the emissions occur.

WHEN: July 2011 (online version)

WHERE: International Journal of Climatology (Int. J. Climatol.) Vol. 32, Issue 11, September 2012

TITLE: High-resolution (space, time) anthropogenic heat emissions: London 1970–2025 (subs. req)

Cities occupy a rather unique position in our world when we think about climate change. They are the source of the majority of emissions, but thanks to their dense nature they are also the places where the greatest innovations and fastest emissions reductions can occur.

If you want to be able to harness the creativity and innovation of cities to be able to reduce emissions, you need to know what emissions are coming from where and when. You need data first.

That’s where these scientists from Italy and the UK stepped in to work out exactly how much waste heat the city of London was emitting. Waste heat is an issue because it contributes to the Urban Heat Island effect where the temperatures in cities can be higher than the surrounding area, which can be deadly in summer heat waves and lead to greater power usage.

Something interesting that I hadn’t considered before is that concentrated emissions can also change the atmospheric chemistry in a localised area.

The researchers took some very detailed data from 2005–2008 for the city of London and broke it down into as accurate a picture as they could. For buildings, they looked at the differences between residential and commercial uses and the different ways each emits heat.

They looked at transport data using traffic congestion figures, the London Atmospheric Emissions Inventory, and data from several acronym-happy government departments to work out what kinds of vehicles were emitting waste heat at what times of day, and at what temperature each type of fuel combusts. They didn’t count trains (underground or overground), because they are mostly electric (and some of them recover waste heat from braking), or small boats, which didn’t emit enough to be counted.

They looked at the heat emitted by people at various stages of activity, age and time of day assuming a 1:1 ratio of females to males. They even looked at the standard metabolic rates of people to work out how much heat a person emits exercising, resting or sleeping!

Data! London waste heat emissions (from paper)

What all that number and formula crunching told them was that the total average anthropogenic heat flux for London was 10.9 W/m² (watts per square metre). That works out to about 150 terawatt-hours of waste energy (in the form of heat) per year – as a comparison, that’s all the electricity used in Sweden in 2010.
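
That flux-to-energy conversion roughly checks out if you take Greater London’s area as about 1,600 km² – my assumed figure, since the paper defines its own study area:

```python
flux_w_per_m2 = 10.9      # average anthropogenic heat flux from the paper
area_m2 = 1_600e6         # assumed Greater London area, ~1,600 km^2
hours_per_year = 365 * 24

# energy (Wh) = power (W) x time (h); divide by 1e12 for terawatt-hours
energy_twh = flux_w_per_m2 * area_m2 * hours_per_year / 1e12
print(f"~{energy_twh:.0f} TWh of waste heat per year")
```

With that assumed area the result lands close to the 150 TWh quoted above.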

Of that total, 80% came from buildings, with 42% being domestic residences and 38% industrial buildings. The next biggest source of heat was transport, at 15% of the total. Within the transport category, personal cars were the biggest contributor (64% of the transport portion).

Human body heat only contributed 5.1% of the total (0.55 W/m²), so maybe they’re not doing enough exercise in London? The data had seasonal peaks and valleys – winter releases more waste heat than summer, because heating systems are more widespread in winter than air conditioners are in summer. The industrial building emissions were concentrated in the core of the City of London (especially Canary Wharf, where there are many new centrally heated and cooled high-rise offices), while domestic building emissions were spread much more widely around central London.

Building heat emissions domestic (left) and industrial (right) (from paper)

Once they had all the data from 2005 to 2008, they considered trends from 1970 projected out to 2025, to see how great a role heat emissions could play in climate change. Using data from the Department of Energy and Climate Change, the London Atmospheric Emissions Inventory (LAEI) and data on population dynamics and traffic patterns, they estimated that all contributors to heat emissions would increase unless the UK greenhouse gas reduction targets are fully implemented.

The reduction target is an 80% cut by 2050 (against the baseline of 1990 emissions). Since the research indicates that buildings are the biggest heat emitters (and are therefore burning the most energy to stay at the right temperature), meeting that target will require a big push on building efficiency.

The paper notes that if the Low Carbon Transition Plan for London is implemented, average waste heat emissions for London will drop to 9 W/m² by 2025, but in the central City of London the best reductions achievable are likely only back to 2008 levels, due to the expected growth in the area.

So what does any of this mean? It means London now has the data to know where they can find efficiencies that can complement their greenhouse gas mitigation programs. Because that’s the thing about combating climate change – it’s not a ‘one problem, one solution’ kind of issue. We need to do as many different things as possible all at once to try and add up to the levels of decarbonising that the world needs to be doing to avoid catastrophic climate change. So go forth London, take this data and use it well!

Unprecedented: Melting Before Our Eyes

The volume of Arctic sea ice is reducing faster than the area of sea ice, further speeding the arctic death spiral.

WHO:  Seymour W. Laxon, Katharine A. Giles, Andy L. Ridout, Duncan J. Wingham, Rosemary Willatt, Centre for Polar Observation and Modelling, Department of Earth Sciences, University College London, London, UK
Robert Cullen, Malcolm Davidson, European Space Agency, Noordwijk, The Netherlands
Ron Kwok, Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California, USA.
Axel Schweiger, Jinlun Zhang, Polar Science Center, Applied Physics Laboratory, University of Washington, Seattle, Washington, USA
Christian Haas, Department of Earth and Space Science and Engineering, York University, Toronto, Canada.
Stefan Hendricks, Alfred Wegener Institute for Polar and Marine Research, Bremerhaven, Germany
Richard Krishfield, Woods Hole Oceanographic Institution, Woods Hole, Massachusetts, USA
Nathan Kurtz, School of Computer, Math, and Natural Sciences, Morgan State University, Baltimore, Maryland, USA.
Sinead Farrell, Earth System Science Interdisciplinary Center, University of Maryland, Maryland, USA.

WHAT: Measuring the volume of polar ice melt

WHEN: February 2013 (online pre-published version)

WHERE: Geophysical Research Letters (American Geophysical Union), 2013, doi: 10.1002/grl.50193

TITLE:  CryoSat-2 estimates of Arctic sea ice thickness and volume (subs req.)

Much has been written about the Arctic Death Spiral of sea ice melting each spring and summer, with many researchers attempting to model and predict exactly how fast the sea ice is melting and when we will get the horrifying reality of an ice-free summer Arctic.

But is it just melting at the edges? Or is the thickness, and therefore the volume of sea ice being reduced as well? That’s what these researchers set out to try and find out using satellite data from CryoSat-2 (Science with satellites!).

The researchers used satellite radar altimeter measurements of sea ice thickness, and then compared their results with measured in-situ data as well as other Arctic sea ice models.

A loss of volume in Arctic sea ice signifies changes in the heat exchange between the ice, ocean and atmosphere. Most global climate models predict a decrease in sea ice volume of 3.4% per decade, which is larger than the predicted 2.4% per decade decline in sea ice area.
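
Since volume is area times average thickness, those two rates also imply a thinning trend. A rough back-of-envelope (my inference from the quoted rates, not a figure from the paper):

```python
volume_decline_per_decade = 0.034   # 3.4% per decade (modelled)
area_decline_per_decade = 0.024     # 2.4% per decade (modelled)

# volume = area x thickness, so the remaining fractions divide out
thickness_factor = (1 - volume_decline_per_decade) / (1 - area_decline_per_decade)
print(f"implied thickness decline: ~{1 - thickness_factor:.1%} per decade")
```

In other words, if volume is shrinking faster than area, the ice must also be getting thinner, at roughly one percent per decade under these model numbers.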

Sea ice area minimum from September 2012 (image: NASA/Goddard Space Flight Center Scientific Visualization Studio)

The researchers ran their numbers for ice volume in winter 2010/11 and winter 2011/12, and then used the recorded data sets to check the accuracy of their satellites (calibration for my fellow science nerds).

The most striking thing they found was a much greater loss of ice thickness in the north of Greenland and the Canadian Archipelago. Additionally, they found that the first year ice was thinner in autumn, which made it harder to catch up to average thickness during the winter, and made greater melting easier in summer.

Interestingly, they found that there was more ice growth in the winters of 2010–12 (7,500 km³) than in 2003–08 (5,000 km³), which makes for an extra 36 cm of ice growth over the winter. Unfortunately the increased summer melt far outstrips the extra growth, so it’s not adding to the overall thickness of the sea ice.
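
Those two numbers are consistent if you spread the extra growth over roughly 7 million km² of ice – my back-of-envelope area assumption, since the paper computes this properly from its own grid:

```python
extra_volume_km3 = 7_500 - 5_000   # additional winter ice growth, 2010-12 vs 2003-08
assumed_ice_area_km2 = 7.0e6       # assumed ice-covered area (my rough figure)

# spread the extra volume over the area to get an average thickness change
extra_thickness_m = extra_volume_km3 * 1e9 / (assumed_ice_area_km2 * 1e6)
print(f"~{extra_thickness_m * 100:.0f} cm of extra winter ice growth")
```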

For the period 2010–12, the satellite-measured rate of decline in autumn sea ice volume was 60% greater than the decline predicted using PIOMAS (Panarctic Ice Ocean Modeling and Assimilation System). Most researchers, seeing results like that, might hope there’s an error; however, when checked against the recorded data, the CryoSat-2 measurements were accurate to within 0.1 metres. So while astounding, the 60% greater-than-expected loss of sea ice thickness is pretty spot on.

The researchers think that lower ice thickness at the end of winter (February and March) could be contributing to the scarily low September minimums of the arctic death spiral. The greatest risk, though, is that the ever-increasing melt rate of Arctic ice could take the climate beyond a tipping point, where climate change becomes both irreversible and uncontrollable in ways we are unable to adapt to.

Visualisation of reduction in arctic sea ice thickness (from Andy Lee Robinson, via ClimateProgress)

So as usual, my remedy for all of this is: stop burning carbon.

When the Party’s Over: Permian Mass Extinction

“The implication of our study is that elevated CO2 is sufficient to lead to inhospitable conditions for marine life and excessively high temperatures over land would contribute to the demise of terrestrial life.”

WHO: Jeffrey T. Kiehl, Christine A. Shields, Climate Change Research Section, National Center for Atmospheric Research, Boulder, Colorado, USA

WHAT: A complex climate model of atmospheric, ocean and land conditions at the Permian mass extinction 251 million years ago to look at CO2 concentrations and their effect.

WHEN: September 2005

WHERE: Geological Society of America, Geology vol. 33 no. 9, September 2005

TITLE: Climate simulation of the latest Permian: Implications for mass extinction

The largest mass extinction on Earth occurred approximately 251 million years ago, at the end of the Permian geologic era. Almost 95% of all ocean species and 70% of land species died, and research has shown that the probable cause of this extinction was carbon dioxide levels.

As the saying goes: those who do not learn from history are doomed to repeat it. So let’s see what happened to the planet 251 million years ago and work out how we humans can avoid doing it to ourselves at high speed.

This research paper from 2005 built the first comprehensive climate model of the Permian extinction, meaning the model was sophisticated enough to include the interaction between the land and the oceans (as distinct from ‘uncoupled’ models that just look at one or the other, not how they affect each other).

The researchers used the CCSM3 climate model, housed at the National Center for Atmospheric Research (NCAR), which is one of the major climate models currently used by the IPCC to project how our climate may change with increasing atmospheric carbon pollution (or emissions reduction). They set up the model with ‘realistic boundary conditions’ for things like ocean layers (25 ocean layers, for those playing at home), atmospheric resolution and energy system balance. They then ran the simulation for 900 years under current conditions and found it matched observed atmospheric data.

Then, they made their model Permian, which meant taking CO2 concentrations and increasing them from our current 397 ppm to 3,550 ppm, the estimated CO2 concentration at the end of the Permian era.

What did ramping up the CO2 in this manner do for the planet’s living conditions in the model? It increased the global average temperature to a very high 23.5°C (the historic global average for the Holocene, our current era, is 14°C).
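
As a sanity check, that jump is roughly what a textbook logarithmic rule of thumb would predict for such a CO2 increase. This is purely my illustration – the paper’s coupled model does far more than this – and the ~3°C-per-doubling sensitivity is an assumed, commonly cited value:

```python
import math

co2_now_ppm = 397                  # roughly present-day CO2, as quoted
co2_permian_ppm = 3_550            # estimated end-Permian CO2
sensitivity_per_doubling = 3.0     # assumed warming per CO2 doubling, deg C

# warming scales with the number of CO2 doublings (logarithmic response)
doublings = math.log2(co2_permian_ppm / co2_now_ppm)
warming = doublings * sensitivity_per_doubling
print(f"{doublings:.1f} doublings -> ~{warming:.0f} C of warming on a ~14 C baseline")
```

About three doublings at ~3°C each gives roughly 9–10°C of warming on top of the 14°C baseline, in the same ballpark as the model’s 23.5°C.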

Oceans
Changing the CO2 concentrations so dramatically in the model made the global average ocean surface temperature 4°C warmer than current conditions. Looking at all the ocean layers in the model, the water was warmer at depth as well, with some areas 3,000 m below sea level measuring 4.5–5°C where they are currently near freezing.

The greatest warming in the oceans occurred at higher latitudes, where ocean temperatures were modelled at 8°C warmer than present, while equatorial tropical oceans were not substantially warmer. The oceans were also much saltier than they currently are.

The big problem for everything that called the ocean home at the end of the Permian era is the slowing of ocean circulation and mixing. Currently, dense salty water cools at the poles and sinks, oxygenating and mixing with deeper water, allowing complex organisms to grow and live. If this slows down, which it did in this model, it has serious consequences for all ocean residents.

Current ocean circulation patterns (NOAA, Wikimedia commons)


Their Permian model showed ocean overturning circulation of around 10 Sv (1 Sv = one million cubic metres per second), whereas current ocean overturning circulation is around 15–23 Sv. The researchers think the ocean currents could have slowed down enough to create anoxic oceans, also known as ‘ocean dead zones’ or ‘Canfield Oceans’, and state that this set the stage for a large-scale marine die-off.

Land
If the end of the Permian was pretty nasty for ocean residents, how did it fare for land-dwellers? What happened to the tetrapods of Gondwanaland? Well, it looked very different from how Earth looks today.

Permian land mass (Wikimedia commons)


There were deciduous forests at high latitudes, and the elevated CO2 in the model was the dominant reason for warm, ice-free polar regions (which also hindered ocean circulation). Land surface temperatures were between 10 and 40°C warmer than they are today. In their model, dry sub-tropical climates like the Mediterranean or Los Angeles and Southern California were much hotter, with average daily minimum temperatures around 51°C. Yes, Los Angeles, your overnight low could be 51°C.

Understandably, the authors state that ‘these extreme daily temperature maxima in these regions could contribute to a decrease in terrestrial flora and fauna’, which is scientist for ‘it’s so damn hot outside nothing except cacti can grow’.

All of these changes were run over a 2,700-year period in the model, which, if you take the 2005 CO2 concentration of 379ppm as your base, is an increase of about 1.17ppm per year.

This is the important bit to remember if we’re going to learn from history and not go the way of the Permian residents. Our current rate of increase in CO2 concentrations is 2ppm per year, which means we are on a super-speed path to mass extinction. If we continue with business as usual, which climate blogger Joe Romm has aptly renamed ‘business as suicide’, we will hit end-Permian CO2 concentrations, and potentially the next mass extinction, in around 1,500 years.
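The arithmetic behind those two rates is simple enough to check yourself. A back-of-the-envelope sketch, using only the ppm figures quoted above (379ppm in 2005, 397ppm today, 3,550ppm at the end of the Permian):

```python
# End-Permian CO2 concentration estimated in the paper (ppm)
PERMIAN_PPM = 3550.0

# The model's ramp: from the 2005 baseline of 379 ppm up to
# Permian levels over the 2,700-year simulation period
model_rate = (PERMIAN_PPM - 379.0) / 2700.0
print(f"Model ramp rate: {model_rate:.2f} ppm/yr")  # ~1.17 ppm/yr

# Our trajectory: ~2 ppm/yr starting from today's 397 ppm
years_to_permian = (PERMIAN_PPM - 397.0) / 2.0
print(f"Years to Permian-level CO2 at 2 ppm/yr: about {years_to_permian:.0f}")
```

Which lands at roughly 1,600 years, hence the “around 1,500 years” figure above: we are currently adding CO2 almost twice as fast as the model’s extinction-level ramp.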

Where humanity is headed (from Royal Society Publishing)


All we need to do to guarantee this outcome for all of humanity is keep the status quo and keep burning fossil fuels. The entire sum of humans as a species on this planet will then be a tiny geological blip: we turned up, became the most successful invasive species on the globe, burned everything in sight, and kept burning it even when we knew it was killing us.

However, I think this part of the paper’s conclusion should give most of us pause for thought:

 ‘Given the sensitivity of ocean circulation to high-latitude warming, it is hypothesized that some critical level of high-latitude warming was reached where connection of surface waters to the deep ocean was dramatically reduced, thus leading to a shutdown of marine biologic activity, which in turn would have led to increased atmospheric CO2 and accelerated warming.’

As a species, if we are going to survive, we need to make sure we do not push past any of those critical levels of warming or tipping points. Which means we need to stop burning carbon as fast as possible. Otherwise, T-Rex will have outlasted us as a species by about two million years, which would be kinda embarrassing.