Crash Diets and Carbon Detoxes: Irreversible Climate Change

Many of the changes humans are causing in our atmosphere today will be largely irreversible for the rest of the millennium.

WHO: Susan Solomon, Chemical Sciences Division, Earth System Research Laboratory, National Oceanic and Atmospheric Administration, Boulder, Colorado, USA
Gian-Kasper Plattner, Institute of Biogeochemistry and Pollutant Dynamics, Zurich, Switzerland
Reto Knutti, Institute for Atmospheric and Climate Science, Zurich, Switzerland
Pierre Friedlingstein, Institut Pierre Simon Laplace/Laboratoire des Sciences du Climat et de l’Environnement, Unité Mixte de Recherche à l’Energie Atomique – Centre National de la Recherche Scientifique – Université Versailles Saint-Quentin, Commissariat à l’Energie Atomique-Saclay, l’Orme des Merisiers, France

WHAT: Looking at the long term effects of climate pollution to the year 3000

WHEN: 10 February 2009

WHERE: Proceedings of the National Academy of Sciences of the USA (PNAS), vol. 106, no. 6 (2009)

TITLE: Irreversible climate change due to carbon dioxide emissions

Talk of stopping climate change often leans on the metaphor of ‘turning down the thermostat’ of the heater in your house: the heater gets left on too high for too long, you turn the thermostat back down, the room cools off, and we are all happy.

This seems to also be the way many people think about climate change – we’ve put too much carbon pollution in the atmosphere for too long, so all we need to do is stop it, and the carbon dioxide will disappear like fog burning off in the morning.

Except it won’t. This paper is from 2009, but I came across it recently while reading around the internet. It looks at the long term effects of climate change and found that the effects of CO2 emissions can still be felt 1,000 years after we stop polluting. Bummer. So much for that last minute carbon detox that politicians seem to be betting on. Turns out it won’t do much.

The researchers defined ‘irreversible’ in this paper as 1,000 years, taking their projections to just beyond the year 3000. Against a human life span, 1,000 years is more than 10 generations. Geologically it’s not forever, but from our human point of view it pretty much is.

So what’s going to keep happening because we can’t give up fossil fuels today that your great-great-great-great-great-great-great-great-great-great grandkid is going to look back on and say ‘well that was stupid’?

The paper looked at the three most detailed and well known effects: atmospheric temperatures, precipitation patterns and sea level rise. Other long term impacts will be felt through Arctic sea ice melt, flooding and heavy rainfall, permafrost melt, hurricanes and the loss of glaciers and snowpack. However, the impacts with the most detailed models and greatest body of research were the ones chosen for this paper (which also excluded the potential for geo-engineering because it’s still very uncertain and unknown).

Our first problem is going to be temperature increases, because temperatures rise as CO2 accumulates in the atmosphere. Even if we turned off emissions completely (practically unfeasible, but the cleanest way to model the long term effects), temperatures would remain roughly constant, within about 0.5°C, until the year 3000.

Why does this occur? Why does the temperature not go back down just as quickly once we stop feeding it more CO2? Because CO2 stays in the atmosphere for a much longer time than other greenhouse gases. As the paper says: ‘current emissions of major non-carbon dioxide greenhouse gases such as methane or nitrous oxide are significant for climate change in the next few decades or century, but these gases do not persist over time in the same way as carbon dioxide.’

Temperature changes to the year 3000 with different CO2 concentration peaks (from paper)

Our next problem is changing precipitation patterns, which can be described by the Clausius-Clapeyron relation from the physics of phase transitions. What the relation tells us is that as temperature increases, the atmosphere can hold more water vapour, which changes how that vapour is transported through the atmosphere and so changes the hydrological cycle.
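
To put a rough number on that, here’s a minimal sketch (my own illustration, not a calculation from the paper) using Bolton’s common approximation for saturation vapour pressure. It lands near the ballpark figure climate scientists usually quote: roughly 7% more water vapour per degree of warming.

```python
# Rough illustration of the Clausius-Clapeyron effect: how much more water
# vapour saturated air can hold per degree of warming, using Bolton's (1980)
# approximation for saturation vapour pressure over liquid water.
import math

def saturation_vapour_pressure(temp_c):
    """Saturation vapour pressure in hPa at temperature temp_c (degrees C)."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

for t in (10.0, 20.0, 30.0):
    increase = saturation_vapour_pressure(t + 1.0) / saturation_vapour_pressure(t) - 1.0
    print(f"At {t:.0f} C, +1 C of warming means ~{increase * 100:.1f}% more water vapour")
```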

The paper notes that these patterns are already happening, consistent with the models, in the Southwest of the USA and the Mediterranean. They found that dry seasons will become approximately 10% drier for each degree of warming, and the Southwest of the USA is expected to be approximately 10% drier with 2°C of global warming. As a comparison, the Dust Bowl of the 1930s was 10% drier over two decades. Given that many climate scientists (and the World Bank) think we’ve already reached the point where 2°C of warming is inevitable, it seems like Arizona is going to become a pretty uncomfortable place to live.

Additionally, if we managed to peak at 450ppm of CO2, irreversible decreases in precipitation of ~8-10% in the dry season would be expected in large areas of Europe, Western Australia and North America.

Dry season getting drier around the world (from paper)

Finally, the paper looked at sea level rise, which is a triple whammy. The first issue is that seawater expands as it warms (aka thermal expansion), which raises sea level. The second is that ocean mixing through currents will continue, carrying that warming (and the thermal expansion that comes with it) into deeper water. Thirdly, melting ice caps on land add new water to the ocean.

The paper estimates that the eventual sea level rise from thermal expansion of warming water is 20–60 cm per degree of warming. Additionally, the loss of glaciers and small ice caps will give us roughly another 20–70 cm of sea level rise, so we’re looking at 40–130 cm of sea level rise even before we start counting Greenland (which is melting faster than most estimates predicted anyway).
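
The arithmetic behind that 40–130 cm figure is just the two ranges added together – a tiny sketch for completeness (the ranges are the paper’s, the code is mine):

```python
# Back-of-envelope combination of the two sea level terms quoted above
# (both in centimetres); Greenland is deliberately excluded, as in the post.
thermal_expansion = (20, 60)   # cm per degree of warming (range from paper)
glaciers_icecaps = (20, 70)    # cm from glaciers and small ice caps (range from paper)

low = thermal_expansion[0] + glaciers_icecaps[0]
high = thermal_expansion[1] + glaciers_icecaps[1]
print(f"Committed sea level rise (excluding Greenland): {low}-{high} cm")
```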

Sea level rise from thermal expansion only with different CO2 concentration peaks (from paper)

What does all of this mean? Well firstly it means you should check how far above sea level your house is and you may want to hold off on that ski cabin with all the melting snowpack as well.

More importantly though, it means that any last minute, ‘save the day’, Hollywood-style plan for reversing climate change as the proverbial clock counts down to zero is misguided and wrong. The climate pollution that we are spewing into the atmosphere at ever greater rates today will linger as a carbon hangover for humanity for the next 1,000 years or so. Within human time scales, the changes that we are causing to our atmosphere are irreversible.

So naturally, we should stop burning carbon now.

What’s in a Standard Deviation?

“By 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario” Marcott et al.

WHO: Shaun A. Marcott, Peter U. Clark, Alan C. Mix, College of Earth, Ocean, and Atmospheric Sciences, Oregon State University, Corvallis, Oregon, USA
 Jeremy D. Shakun, Department of Earth and Planetary Sciences, Harvard University, Cambridge, Massachusetts, USA.

WHAT: A historical reconstruction of average temperature for the past 11,300 years

WHEN: March 2013

WHERE: Science, Vol. 339 no. 6124, 8 March 2013 pp. 1198-1201

TITLE: A Reconstruction of Regional and Global Temperature for the Past 11,300 Years (subs req.)

We all remember the bell curve from high school statistics: in a population (like your class at school) most measurements of something fall close to the mean. The usual example to start with is the height of everyone in your classroom – most people will be around the same height, some will be taller, some shorter.

The further you get from the mean, the less likely a value becomes: around 68% of the population fits within one standard deviation either side of the mean. The important bit to keep in mind when reading about this paper is that a bell curve is usually drawn with three standard deviations on either side of the mean, which covers 99.7% of all the data. The odds of a data point falling more than three standard deviations from the mean are about 0.1% on either side.
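
If you want to check those percentages yourself, here’s a quick sketch (my own, nothing from the paper) using SciPy’s normal distribution:

```python
# How much of a normal distribution sits within 1, 2 and 3 standard
# deviations of the mean, and how unlikely a point beyond each threshold is.
from scipy.stats import norm

for k in (1, 2, 3):
    inside = norm.cdf(k) - norm.cdf(-k)
    outside_one_side = norm.sf(k)  # survival function: probability above +k sigma
    print(f"within +/-{k} sigma: {inside * 100:.1f}%   "
          f"beyond +{k} sigma: {outside_one_side * 100:.2f}%")
```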

The standard deviation bell curve (Wikimedia commons)

What does high school statistics have to do with global temperature reconstructions? Well, it’s always good to see what’s happening in the world within context. Unless we can see comparisons to what has come before, it can be really hard to see what is and isn’t weird when we’re living in the middle of it.

The famous ‘Hockey Stick’ graph that was constructed for the past 1,500 years by eminent climate scientist Michael Mann showed us how weird the current warming trend is compared to recent geologic history. But how does that compare to all of the Holocene period?

Well, we live in unusual times. But first, the details. These researchers used 73 globally distributed temperature records based on various proxies. A proxy is something that has been around far longer than our thermometers – an ice core, a slow-growing tree, a marine species like coral – whose chemical composition lets us work out what the temperature would have been. According to the NOAA website, fossil pollen can also be used, which I think is awesome (because it’s kind of like Jurassic Park!).

They used more marine proxies than most other reconstructions (80% of their proxies were marine) because they’re better suited for longer reconstructions. The resolutions for the proxies ranged from 20 years to 500 years and the median resolution was 120 years.

They then ran the data through a Monte Carlo randomisation scheme (which is less exotic than it sounds) to quantify the uncertainties. Specifically, they also ran a ‘white noise’ data set with a mean of zero through the same procedure as a check. Then the chemical data was converted into temperature data before it all got stacked together into a weighted mean with a confidence interval. It’s like building a layer cake, but with math!

Interestingly, the white noise test showed the reconstruction is more faithful over longer time periods. Variability was preserved best at periods of 2,000 years or more, only about half survived at the 1,000-year scale, and variability at periods shorter than 300 years was essentially gone.
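
To get a feel for why that happens, here’s a toy sketch (my own, emphatically not the authors’ actual Monte Carlo scheme) that buries a sinusoidal ‘climate signal’ in noisy, imperfectly dated proxies sampled every ~120 years and stacks them. The exact numbers won’t match the paper’s, but the qualitative pattern – short-period wiggles get smeared out, long-period ones survive – comes through.

```python
# Toy illustration of why a stack of coarse, imperfectly dated proxies keeps
# long-period variability but smears out short-period wiggles. A unit-amplitude
# sine of a given period is sampled every ~120 years with dating errors and
# proxy noise, 73 such records are stacked, and we ask how much of the
# signal's variability survives.
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(0, 11300)   # yearly grid spanning the Holocene record
resolution = 120              # median proxy resolution quoted above (years)
dating_error = 60             # assumed 1-sigma age uncertainty (years, my guess)
n_proxies = 73                # number of records in the stack

for period in (300, 1000, 2000):
    signal = np.sin(2 * np.pi * years / period)
    stack = np.zeros(years.size)
    for _ in range(n_proxies):
        offset = rng.integers(0, resolution)
        sample_years = years[offset::resolution]
        # the value assigned to each sample age actually reflects the climate
        # at a slightly different (mis-dated) time, plus measurement noise
        true_years = sample_years + rng.normal(0, dating_error, sample_years.size)
        samples = np.sin(2 * np.pi * true_years / period) + rng.normal(0, 1, sample_years.size)
        stack += np.interp(years, sample_years, samples)
    stack /= n_proxies
    retained = stack.std() / signal.std()
    print(f"{period:5d}-year cycle: ~{retained * 100:.0f}% of its variability survives the stack")
```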

They also found that their reconstruction lined up over the final 1,500 years to present with the Mann et al. 2008 reconstruction and was also consistent with Milankovitch cycles (which ironically indicate that without human interference, we’d be heading into the next glacial period right now).

Temperature reconstructions: Marcott et al. in purple (with blue confidence interval), Mann et al. in grey (from paper)

They found that the global mean temperature for 2000-2009 has not yet exceeded the warmest temperatures in the Holocene, which occurred 5,000 – 10,000 years ago (or BP – before present). However, we are currently warmer than 82% of the Holocene distribution.

But the disturbing thing in this graph that made me feel really horrified (and I don’t get horrified by climate change much anymore, because I read so much on it that I’m somewhat de-sensitised to the ‘end of the world’ scenarios) is the rate of change. The paper found that within the past century, global temperatures have gone from near the coldest of the Holocene (the bottom of the purple line before it spikes up suddenly) to near the warmest.

We are causing changes to happen so quickly in the earth’s atmosphere that something that would have taken over 11,000 years has just happened in the last 100. We’ve taken a 5,000 year trend of cooling and spiked up to super-heated in record time.

This would be bad enough on its own, but it’s not even the most horrifying thing in this paper. It’s this:

‘by 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario.’ (my emphasis)

Remember how I said keep in mind that 99.7% of all data points in a population are within three standard deviations on a bell curve? That’s because we are currently heading off the edge of the chart for weird and unprecedented climate, beyond even the 0.1% chance of occurring without human carbon pollution.
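
For scale, assuming a perfect normal distribution (an idealisation – the real Holocene temperature distribution isn’t exactly normal), here’s how quickly the odds vanish as you head out into the tail:

```python
# How far off the bell curve "5 to 12 standard deviations" really is:
# the chance of a normally distributed value landing that far above the mean.
from scipy.stats import norm

for k in (3, 5, 12):
    print(f"P(value > mean + {k} sigma) = {norm.sf(k):.3g}")
```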

The IPCC’s A1B scenario is a ‘medium worst case’ scenario, and we are currently outstripping it with carbon emissions that keep growing when they actually need to be shrinking. We are so far out into the tail of weird occurrences that it’s off the charts of a bell curve.

NOAA Mauna Loa CO2 data (396.80ppm at Feb. 2013)

As we continue to fail to reduce our carbon emissions in any meaningful way, we will reach 400ppm (parts per million) of carbon dioxide in the atmosphere in the next few years. At that point we will truly be in uncharted territory for any time in human history, on a trajectory that is changing so rapidly it sits beyond 99.7% of the data for the last 11,300 years. The question for humanity is: are we willing to play roulette with our ability to adapt to this kind of rapidly changing climate?

Carbon in Your Supply Chain

How will a real price on carbon affect supply chains and logistics?

WHO: Justin Bull (PhD Candidate, Faculty of Forestry, University of British Columbia, Canada)
Graham Kissack (Communications Environment and Sustainability Consultant, Mill Bay, Canada)
Christopher Elliott (Forest Carbon Initiative, WWF International, Gland, Switzerland)
Robert Kozak (Professor, Faculty of Forestry, University of British Columbia, Canada)
Gary Bull (Associate Professor, Faculty of Forestry, University of British Columbia, Canada)

WHAT: Looking at how a price on carbon can affect supply chains, with the example of magazine printing

WHEN: 2011

WHERE: Journal of Forest Products Business Research, Vol. 8, Article 2, 2011

TITLE: Carbon’s Potential to Reshape Supply Chains in Paper and Print: A Case Study (membership req)

Forestry is an industry that’s been doing it tough in the face of rapidly changing markets for a while. From the Clayoquot Sound protests of the 1990s against clearcutting practices to the growing realisation that deforestation is one of the leading contributors to climate change, it’s the kind of industry where you either innovate or you don’t survive.

Which makes this paper – a case study into how monetising carbon has the potential to re-shape supply chains and make them low carbon – really interesting. From the outset, the researchers recognise where our planet is heading with climate change, stating that ‘any business that emits carbon will [have to] pay for its emissions’.

To look at the potential for low carbon supply chains, the paper takes the example of producing and printing a magazine in North America – measuring the carbon emitted in cutting down the trees, turning the trees into paper, transporting materials at each stage of the process, and printing.

Trees to magazines (risa ikead, flickr)

They did not count the emissions from the distribution process, or any emissions related to disposal after the magazine was read, because these had too many uncertainties in the data. However, they worked with the companies involved at each stage to get the most accurate picture of the process they possibly could.

The researchers found that the majority of the carbon is emitted in the paper manufacturing process (41%). The paper started as a tree on Vancouver Island, was trucked as fibre to Port Alberni, manufactured into paper, shipped by truck and barge to Richmond, and then sent by train to the printing press in Merced, California.

Activity                                                     Carbon emissions (kg CO2/ADt)   Percentage of total
Harvesting, road-building, felling, transport to sawmills    55                              12%
Sawmilling into dimensional and residual products            45                              10%
Transport of chips to mill                                     8                              2%
Paper manufacturing process                                  185                              41%
Transportation to print facility                             127                              28%
Printing process                                              36                               8%
Total                                                        456                             100%

Supply Chain Emissions (Table 1, reproduced from hardcopy)
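
As a quick check on the table, the percentage shares can be recomputed from the per-tonne figures (my own snippet, not from the paper):

```python
# Recomputing the percentage shares in Table 1 from the per-tonne emissions figures.
emissions_kg = {
    "Harvesting, road-building, felling, transport to sawmills": 55,
    "Sawmilling into dimensional and residual products": 45,
    "Transport of chips to mill": 8,
    "Paper manufacturing process": 185,
    "Transportation to print facility": 127,
    "Printing process": 36,
}

total = sum(emissions_kg.values())  # 456 kg CO2 per air-dried tonne
for activity, kg in emissions_kg.items():
    print(f"{activity:<60s} {kg:>4d} kg  {kg / total * 100:4.0f}%")
print(f"{'Total':<60s} {total:>4d} kg   100%")
```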

The case study showed that upstream suppliers consume more energy than downstream suppliers; however, downstream suppliers are the most visible to consumers. That poses a challenge when trying to get the larger emitters to minimise their carbon footprint, because there’s less likelihood of consumer pressure on lesser-known organisations.

That being said, there can be major economic incentives for companies to try and minimise their carbon footprint, given that Burlington Northern Santa Fe Railway (which shipped the paper from Canada to the printing press in California in this study) spent approximately $4.6 billion on diesel fuel in 2008 (the data year for the case study).

California implemented a carbon cap-and-trade market at the end of 2012, and growing awareness of the urgency of reducing our carbon emissions rapidly and significantly means the price of carbon is likely to increase, so $4.6 billion in diesel costs could escalate quickly for a company like BNSF. If part or all of its transport could be switched to clean energy, the company would save itself significant amounts as polluting fossil fuel sources are phased out.

The companies in this study were very aware of these issues, which is encouraging. They agreed that carbon and sustainability will be considered more closely in the future, and that carbon has the potential to change the value of existing industrial assets as corporations that are ‘carbon-efficient’ may become preferred suppliers.

The researchers identified three types of carbon-related risk that companies could face: regulatory risk, financial risk and market access risk. The innovative businesses that will thrive in a low carbon 21st century economy will be thinking about and preparing for an economy that doesn’t burn carbon for fuel, or where burning carbon is no longer profitable.

I really liked the paper’s example of financial risk in the bond market, ‘where analysts are projecting a premium on corporate bonds for new coal fired power plants’, meaning it will be harder for companies to raise money to further pollute our atmosphere. This is especially important given that Deutsche Bank and Standard & Poor’s put the fossil fuel industry on notice last week, saying that easy finance for their fossil fuels party is rapidly coming to an end.

Of course, no-one ever wants to believe that the boom times are coming to an end. But the companies that think ahead of the curve and innovate to reduce their carbon risk instead of going hell for leather on fossil fuels will be the ones that succeed in the long run.

Hot Air Balloon – Heat Emissions in London

Detailed measurements of the thermal pollution in Greater London and a look at the long term trend.

WHO: Mario Iamarino, Dipartimento di Ingegneria e Fisica dell’Ambiente, Università degli Studi della Basilicata, Potenza, Italy
Sean Beevers,  Environmental Research Group, King’s College London, London, UK
C. S. B. Grimmond, Environmental Monitoring and Modelling Group, Department of Geography, King’s College London, London, UK

WHAT: Measuring London’s heat pollution in terms of both where and when the emissions occur.

WHEN: July 2011 (online version)

WHERE: International Journal of Climatology (Int. J. Climatol.) Vol. 32, Issue 11, September 2012

TITLE: High-resolution (space, time) anthropogenic heat emissions: London 1970–2025 (subs. req)

Cities have a rather unique position in our world when we think about climate change. They are the source of the majority of emissions, but thanks to their density they are also the places where the greatest innovations and the fastest emissions reductions can happen.

If you want to harness the creativity and innovation of cities to reduce emissions, you need to know which emissions are coming from where, and when. You need data first.

That’s where these scientists from Italy and the UK stepped in to work out exactly how much waste heat the city of London was emitting. Waste heat is an issue because it contributes to the Urban Heat Island effect where the temperatures in cities can be higher than the surrounding area, which can be deadly in summer heat waves and lead to greater power usage.

Something interesting that I hadn’t considered before is that concentrated emissions can also change the atmospheric chemistry in a localised area.

The researchers took some very detailed data from 2005 to 2008 for the city of London and broke it down into as accurate a picture as they could. They looked at heat from buildings, distinguishing residential from commercial buildings and the different ways each type of use emits heat.

They looked at transport data using traffic congestion, the London Atmospheric Emissions Inventory, and data from several acronym-happy government departments to work out which kinds of vehicles were emitting waste heat at which times of day, and at what temperature each type of fuel combusts. They didn’t count trains (underground or overground), because they are mostly electric (and some of them recover waste heat from braking), or small boats, which didn’t emit enough to be counted.

They looked at the heat emitted by people at various stages of activity, age and time of day assuming a 1:1 ratio of females to males. They even looked at the standard metabolic rates of people to work out how much heat a person emits exercising, resting or sleeping!

Data! London waste heat emissions (from paper)

What all that number and formula crunching told them was that the total average anthropogenic heat flux for London was 10.9 W/m² (watts per square metre). Over a year, that works out to about 150 terawatt-hours of waste energy (in the form of heat), which as a comparison is all of the electricity used in Sweden in 2010.
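
That conversion is worth a sanity check. The sketch below does the flux-times-area-times-time arithmetic, assuming Greater London covers roughly 1,572 km² (my assumption for illustration; the paper uses its own study domain):

```python
# Sanity check on the headline number: an average flux of 10.9 W per square
# metre over Greater London (area taken here as roughly 1,572 km^2) sustained
# for a full year.
flux_w_per_m2 = 10.9
area_m2 = 1_572e6          # ~1,572 km^2 in square metres (my assumption)
hours_per_year = 8_760

energy_twh = flux_w_per_m2 * area_m2 * hours_per_year / 1e12
print(f"Waste heat: about {energy_twh:.0f} TWh per year")   # ~150 TWh
```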

Of that total, 80% came from buildings, with 42% from domestic residences and 38% from industrial buildings. The next biggest source of heat was transport, at 15% of the 10.9 W/m². Within the transport category, personal cars were the biggest contributor (64% of the transport portion).

Human body heat only contributed 5.1% of the total (0.55 W/m²), so maybe they’re not doing enough exercise in London? The data also had peaks and valleys – winter releases more waste heat than summer, because heating systems in winter are more widespread than air conditioners in summer. The industrial building emissions were concentrated in the core of the City of London (especially Canary Wharf, where there are many new centrally heated/cooled high rise offices), while the domestic building emissions were spread much more widely around central London.

Building heat emissions domestic (left) and industrial (right) (from paper)

Once they had all the data from 2005 to 2008, they considered trends from 1970 projected out to 2025 to see how great a role heat emissions could play in climate change. Using data from the Department of Energy and Climate Change, the London Atmospheric Emissions Inventory (LAEI) and data on population dynamics and traffic patterns, they estimated that all contributors to heat emissions would increase unless the UK greenhouse gas reduction targets are fully implemented.

The reduction targets call for an 80% cut by 2050 (against the baseline of 1990 emissions). Since the research shows that buildings are the biggest heat emitters (and are therefore burning the most energy to stay at the right temperature), meeting those targets will require a big push on building efficiency.

The paper notes that if the Low Carbon Transition Plan for London is implemented, the average waste heat emissions for London will drop to 9 W/m² by 2025, but in the central City of London the best reductions are likely to only get back to 2008 levels, due to the expected growth in the area.

So what does any of this mean? It means London now has the data to know where they can find efficiencies that can complement their greenhouse gas mitigation programs. Because that’s the thing about combating climate change – it’s not a ‘one problem, one solution’ kind of issue. We need to do as many different things as possible all at once to try and add up to the levels of decarbonising that the world needs to be doing to avoid catastrophic climate change. So go forth London, take this data and use it well!

Unprecedented: Melting Before Our Eyes

The volume of Arctic sea ice is shrinking faster than the area of sea ice, further speeding the Arctic death spiral.

WHO:  Seymour W. Laxon, Katharine A. Giles, Andy L. Ridout, Duncan J. Wingham, Rosemary Willatt, Centre for Polar Observation and Modelling, Department of Earth Sciences, University College London, London, UK
Robert Cullen, Malcolm Davidson, European Space Agency, Noordwijk, The Netherlands
Ron Kwok, Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California, USA.
Axel Schweiger, Jinlun Zhang, Polar Science Center, Applied Physics Laboratory, University of Washington, Seattle, Washington, USA
Christian Haas, Department of Earth and Space Science and Engineering, York University, Toronto, Canada.
Stefan Hendricks, Alfred Wegener Institute for Polar and Marine Research, Bremerhaven, Germany
Richard Krishfield, Woods Hole Oceanographic Institution, Woods Hole, Massachusetts, USA
Nathan Kurtz, School of Computer, Math, and Natural Sciences, Morgan State University, Baltimore, Maryland, USA.
Sinead Farrell, Earth System Science Interdisciplinary Center, University of Maryland, Maryland, USA.

WHAT: Measuring the volume of polar ice melt

WHEN: February 2013 (online pre-published version)

WHERE: Geophysical Research Letters (American Geophysical Union), 2013, doi: 10.1002/grl.50193

TITLE:  CryoSat-2 estimates of Arctic sea ice thickness and volume (subs req.)

Much has been written about the Arctic Death Spiral of sea ice melting each spring and summer, with many researchers attempting to model and predict exactly how fast the sea ice is melting and when we will get the horrifying reality of an ice-free summer Arctic.

But is it just melting at the edges? Or is the thickness, and therefore the volume of sea ice being reduced as well? That’s what these researchers set out to try and find out using satellite data from CryoSat-2 (Science with satellites!).

The researchers used satellite radar altimeter measurements of sea ice thickness, and then compared their results with measured in-situ data as well as other Arctic sea ice models.

A loss of volume in Arctic sea ice is a signifier of changes in the heat exchange between the ice, ocean and atmosphere. Most global climate models predict a decrease in sea ice volume of 3.4% per decade, which is larger than the predicted 2.4% per decade decrease in sea ice area.
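
Since thickness is just volume divided by area, those two trends imply the ice should also be getting thinner by roughly 1% per decade. A two-line sketch (mine, not the paper’s) makes the arithmetic explicit:

```python
# Since thickness = volume / area, a faster decline in volume than in area
# implies the ice is also getting thinner. Exact ratio and small-rate approximation.
volume_trend = -0.034   # fractional change per decade (model-projected)
area_trend = -0.024     # fractional change per decade (model-projected)

exact_thickness_trend = (1 + volume_trend) / (1 + area_trend) - 1
print(f"Implied thickness trend: ~{exact_thickness_trend * 100:.1f}% per decade "
      f"(approximately {100 * (volume_trend - area_trend):.1f}%)")
```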

Sea ice area minimum from September 2012 (image: NASA/Goddard Space Flight Center Scientific Visualization Studio)

The researchers ran their numbers for ice volume in winter 2010/11 and winter 2011/12, and then used recorded data sets to check the accuracy of the satellite (calibration, for my fellow science nerds).

The most striking thing they found was a much greater loss of ice thickness in the north of Greenland and the Canadian Archipelago. Additionally, they found that first-year ice was thinner in autumn, which made it harder for it to catch up to average thickness during the winter and made greater melting easier in summer.

Interestingly, they found that there was additional ice growth in winter between 2010-12 (7,500km3) compared to 2003-08 (5,000km3), which makes for an extra 36cm of ice growth in the winter. Unfortunately the increased summer melt is much greater than the extra growth, so it’s not adding to the overall thickness of the sea ice.

For the period of 2010–12, the satellite-measured rate of decline in autumn sea ice volume was 60% greater than the decline predicted using PIOMAS (Panarctic Ice Ocean Modeling and Assimilation System). Most researchers seeing results like that might hope there’s an error; however, when measured against the recorded data, the CryoSat-2 data was accurate to within 0.1 metres. So while astounding, the 60% greater than expected loss of sea ice is pretty spot on.

The researchers think that lower ice thickness at the end of winter in February and March could be contributing to the scarily low September minimums in the Arctic death spiral, but the greatest risk here is that the ever-increasing melt rate of Arctic ice could take the climate beyond a tipping point where climate change is both irreversible and uncontrollable in a way we are unable to adapt to.

Visualisation of reduction in Arctic sea ice thickness (from Andy Lee Robinson, via ClimateProgress)

So as usual, my remedy for all of this is: stop burning carbon.

When the Party’s Over: Permian Mass Extinction

“The implication of our study is that elevated CO2 is sufficient to lead to inhospitable conditions for marine life and excessively high temperatures over land would contribute to the demise of terrestrial life.”

WHO: Jeffrey T. Kiehl, Christine A. Shields, Climate Change Research Section, National Center for Atmospheric Research, Boulder, Colorado, USA

WHAT: A complex climate model of atmospheric, ocean and land conditions at the Permian mass extinction 251 million years ago to look at CO2 concentrations and their effect.

WHEN: September 2005

WHERE: Geological Society of America, Geology vol. 33 no. 9, September 2005

TITLE: Climate simulation of the latest Permian: Implications for mass extinction

The largest mass extinction on Earth occurred approximately 251 million years ago at the end of the Permian geologic period. Almost 95% of all ocean species and 70% of land species died, and research has shown that elevated carbon dioxide levels were probably what caused this extinction.

As the saying goes: those who do not learn from history are doomed to repeat it. So let’s see what happened to the planet 251 million years ago and work out how we humans can avoid doing it to ourselves at high speed.

This research paper from 2005 produced the first comprehensive climate model of the Permian extinction, which means their model was sophisticated enough to include the interaction between the land and the oceans (as distinct from ‘uncoupled’ models that just looked at one or the other, and not how they affected each other).

The researchers used the CCSM3 climate model that is housed at the National Center for Atmospheric Research (NCAR) and is one of the major climate models currently used by the IPCC to project how our climate may change with increasing atmospheric carbon pollution (or emission reduction). They organised their model to have ‘realistic boundary conditions’ for things like ocean layers (25 ocean layers for those playing at home), atmospheric resolution and energy system balance. They then ran the simulation for 900 years under current conditions and checked that it matched observed atmospheric data.

Then they made their model Permian, which meant increasing CO2 concentrations from our current 397 ppm to 3,550 ppm, the estimated CO2 concentration at the end of the Permian.

What did ramping up the CO2 in this manner do for the planet’s living conditions in the model? It pushed the global average temperature to a very high 23.5°C (the historic global average temperature for the Holocene, our current epoch, is 14°C).

Oceans
Changing the CO2 concentrations so dramatically in the model made the global average ocean surface temperature about 4°C warmer than current conditions. Looking at all the ocean layers in their model, the water was warmer in deeper areas as well, with some areas at depths of 3,000 m below sea level measuring 4.5–5°C where they are currently near freezing.

The greatest warming in the oceans occurred at higher latitudes, where ocean temps were modelled at 8°C warmer than present, while equatorial tropical oceans were not substantially warmer. The oceans were also much saltier than they currently are.

The big problem for all of the things that called the ocean home at the end of the Permian era is the slowing of ocean circulation and mixing. Currently, dense salty water cools at the poles and sinks, oxygenating and mixing with deeper water allowing complex organisms to grow and live. If this slows down, which it did in this model, it has serious consequences for all ocean residents.

Current ocean circulation patterns (NOAA, Wikimedia commons)

Their Permian model showed ocean overturning circulation of around 10 Sv (a Sverdrup is a million cubic metres per second), whereas current ocean overturning circulation is around 15–23 Sv. The researchers think the ocean currents could have slowed down enough to create anoxic oceans, also known as ‘ocean dead zones’ or ‘Canfield oceans’, and stated that this set the stage for a large-scale marine die-off.

Land
If the end of the Permian was pretty nasty for ocean residents, how did it fare for land-dwellers? What happened to the tetrapods of Gondwanaland? Well, the land looked really different from how Earth looks today.

Permian land mass (Wikimedia commons)

There were deciduous forests at high latitudes, and the elevated CO2 in the model was the dominant reason for warm, ice-free polar regions (which also hindered ocean circulation). Land surface temperatures were between 10 and 40°C warmer than they are today. In their model, dry sub-tropical climates like the Mediterranean or Los Angeles and Southern California were much hotter, with average daily minimum temperatures around 51°C. Yes, Los Angeles, your overnight low could be 51°C.

Understandably, the authors state that ‘these extreme daily temperature maxima in these regions could contribute to a decrease in terrestrial flora and fauna’, which is scientist for ‘it’s so damn hot outside nothing except cacti can grow’.

All of these changes were run over a 2,700-year period in the model, which, taking the 2005 CO2 concentration of 379 ppm as your base, is an increase of 1.17 ppm per year.

This is the important bit to remember if we’re going to learn from history and not go the way of the Permian residents. Our current rate of increase in CO2 concentrations is 2 ppm per year, which means we are on a super-fast path to mass extinction. If we continue with business as usual, which has been aptly renamed ‘business as suicide’ by climate blogger Joe Romm, we will be at the end of the next mass extinction in around 1,500 years.
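
Here’s that arithmetic spelled out (the concentrations are the ones quoted above; the code is just my back-of-envelope check):

```python
# The rate comparison spelled out: the Permian ramp in the model versus
# today's measured rate of CO2 increase.
permian_peak_ppm = 3550      # end-Permian CO2 concentration in the model
co2_2005_ppm = 379           # 2005 concentration used as the baseline above
co2_now_ppm = 397            # roughly the 2013 Mauna Loa value quoted earlier
model_years = 2700           # length of the ramp in the model

permian_rate = (permian_peak_ppm - co2_2005_ppm) / model_years
print(f"Permian-model rate: {permian_rate:.2f} ppm/year")          # ~1.17 ppm/year

current_rate = 2.0           # ppm/year, today's observed rate of increase
years_to_permian_levels = (permian_peak_ppm - co2_now_ppm) / current_rate
print(f"At {current_rate} ppm/year we hit Permian CO2 levels in ~{years_to_permian_levels:.0f} years")
```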

Where humanity is headed (from Royal Society Publishing)

All we need to do to guarantee this outcome for all of humanity is keep the status quo and keep burning fossil fuels. Then the entire sum of humans as a species on this planet will be a tiny geological blip in which we turned up, became the most successful invasive species on the globe, burned everything in sight, and kept burning it even when we knew it was killing us.

However, I think this part from the paper’s conclusion should give most of us pause for thought:

 ‘Given the sensitivity of ocean circulation to high-latitude warming, it is hypothesized that some critical level of high-latitude warming was reached where connection of surface waters to the deep ocean was dramatically reduced, thus leading to a shutdown of marine biologic activity, which in turn would have led to increased atmospheric CO2 and accelerated warming.’

As a species, if we are going to survive, we need to make sure we do not go past any of those critical levels of warming or tipping points. Which means we need to stop burning carbon as fast as possible. Otherwise, T. rex will have outlasted us as a species by about two million years, which would be kinda embarrassing.

Hot Enough Yet? Warming in Western North America

How much and in what ways has the western part of North America warmed from climate change between 1950 and 2005?

WHO: Evan L. J. Booth, James M. Byrne and Dan L. Johnson, Water and Environmental Sciences, University of Lethbridge, Alberta, Canada

WHAT: Collating all of the weather station data from North America west of the Mississippi River and looking at the long term trends.

WHEN: 13 December 2012

WHERE: International Journal of Climatology (Int. J. Climatol.) Vol. 32, Issue 15 (2012)

TITLE: Climatic changes in western North America, 1950–2005

As we all know, climate change is a global problem with regionally specific impacts. How the climate changes will depend on what your local climate was originally like. So how much has the western end of North America changed from 1950 to 2005? That’s what these researchers in Alberta set out to discover.

Firstly, for this research paper, ‘western North America’ is more than just the Pacific Northwest. The researchers went with everything west of the Mississippi River in the USA and everything west of Manitoba in Canada, which is pretty diverse in terms of climate, ranging from desert to mountains to prairies.

Climate study regions (from paper) and Google maps (for geographic locations)

The researchers wanted to look at 50 years’ worth of data so that they could remove natural variations like El Niño and La Niña years as well as the Pacific Decadal Oscillation, and just focus on the human-caused effects of warming (aka anthropogenic warming, for those who like big words).

The researchers looked at several different climate indicators. They counted the number of frost days, the length of the growing season, number of warm days and warm nights, number of wet days (>5mm of rain), very wet days (wetter than >95% of all the other days), daily rain intensity, and annual rain totals.
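
To give a flavour of what computing indicators like these involves, here’s a simplified sketch with made-up data. The temperature thresholds are my own shorthand; the paper uses the standard climate-index definitions, which differ in detail.

```python
# A simplified sketch of pulling a few climate indicators out of one year of
# daily station data. Thresholds marked as placeholders are my own shorthand.
import numpy as np

def climate_indicators(daily_min_c, daily_max_c, daily_rain_mm):
    """Count a handful of indicators for one year of daily data."""
    wet = daily_rain_mm[daily_rain_mm > 0]
    return {
        "frost_days": int(np.sum(daily_min_c < 0)),            # nights below freezing
        "warm_days": int(np.sum(daily_max_c > 25)),            # 25 C is a placeholder threshold
        "warm_nights": int(np.sum(daily_min_c > 15)),          # 15 C is a placeholder threshold
        "wet_days": int(np.sum(daily_rain_mm > 5)),            # >5 mm, as in the post
        "very_wet_days": int(np.sum(daily_rain_mm > np.percentile(wet, 95))),
        "annual_rain_mm": round(float(daily_rain_mm.sum()), 1),
    }

# Example with a year of made-up data:
rng = np.random.default_rng(1)
tmin = rng.normal(5, 8, 365)
tmax = tmin + rng.uniform(5, 15, 365)
rain = rng.exponential(4, 365) * (rng.random(365) < 0.4)
print(climate_indicators(tmin, tmax, rain))
```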

They noted that while climate change will definitely increase the intensity of the hydrological cycle, the future trends in rain (where it will fall, how much there will be) are much more difficult to predict. However, the overall finding was that rain is increasing in the Pacific Northwest (sorry, Vancouverites!) more than in other areas.

Another interesting thing they found was that natural variability in climate may be masking the effects of climate change in Canada more than in the USA, given the large number of extreme weather events recently observed in the US compared with relatively few in Canada. Which doesn’t mean we’re getting off scot-free, Canada – it means climate change is coming for us later!

There were 490 weather stations that contributed data to the paper, which made for some massive number crunching, on top of the fact that they had to develop an additional computer program to convert all the US measurements into metric (Dear USA, please join the rest of the world and go metric!).

As you can see from the map above, the area was broken down into six different regions and analysed for climate trends. The results were:

Pacific Northwest

The Pacific Northwest saw a significant decrease in frost days, at a rate of 2.4 days per decade, and a significant increase in warm nights. There was a general increasing trend for all the other measurements – the growing season was extended, and all the rain indicators went up (yeah, winter really is getting wetter, Vancouver). The researchers noted that the Pacific Northwest has experienced ‘significant warming’ over the 50-year period, and that the reduction in frost days has severe consequences for pine beetle infestations and increasing wildfires.

Frost days: significantly decreasing (from paper)

The one exception to the rule was Oregon, where a significant warm and dry patch in the southern part of the state is a sign of the Californian desert climate moving north as the temperature increases.

Rain totals: Dry patch over Oregon (from paper)

Northwest Plains (Wyoming, Montana, Alberta, Saskatchewan)

The Northwest Plains saw a significant decrease in frost days, at a rate of 0.16 days per year. There was a significant increase in the number of warm days and warm nights, with an increase in all other factors. While the increases in growing season and precipitation are beneficial so far, the researchers noted that continued warming will have detrimental effects on soil moisture and that earlier spring runoff will pose challenges for water management.

Humid Continental Plains (North Dakota, South Dakota, Nebraska, Iowa, Minnesota, Manitoba)

Changes in this area were more extreme than in the Pacific Northwest or the Northwest Plains. There were significant increases in all indicators except for frost days, which saw a significant decrease. The researchers were concerned to note that warm nights are outnumbering cool nights in the continental plains by 5:1. The average rainfall is increasing by 0.11 mm per year, which will eventually have serious consequences given that, as the paper notes, most farmland and urban areas in the continental plains are located on flood plains.

Gulf (Texas, Oklahoma, Kansas, Missouri, Arkansas, Louisiana)

The Gulf States saw the most significant increase in rain totals with annual averages going up by 2.8mm per year. There were also significant increases in the number of warm nights, wet days and rain intensity. While there was a significant decrease in the growing season length, it was most pronounced in the northern states and possibly linked to the significant decrease in frost days. Interestingly, there was a significant decrease in warm days, which the researchers think could be linked to the increase in rain (more clouds = less sunlight beating down on you).

American Southwest (Utah, Colorado, Arizona, New Mexico)

Climatically, this one is a real mixed bag, going from amazing ski mountains all the way to New Mexican desert. However, there were still some overarching trends. There were significant increases in warm days and nights, rain totals, rain intensity and wet days. There was a significant decrease in frost days, and an increase in very wet days and the growing season. The paper noted that while the increase in rain in the Southwest is currently positive, growing extremes in temperature and the evaporation associated with them will likely negate this factor in the future.

One large concern was the decrease in frost days, given that much of the flow from the Colorado River comes from snowmelt, which was wonderfully understated as:

‘While best management practices may be able to mitigate the risk of widespread system failure, current levels of development in arid areas of the region may be unsustainable.’

This is scientist for ‘you either deal with this now, or something’s going to give in a really nasty way later’. Or, as one of my favourite climate bloggers, Joe Romm, says: ‘Hell and High Water’, which will bring us the next Dust Bowl.

California-Nevada

The final segment of western North America had significant decreases in frost days (as did all of western North America), significant increases in warm nights, and increases in all the other indicators. This may seem milder; however, the researchers warn that California saw substantial warming with only a slight increase in precipitation. That will be deadly as climate change continues. As the paper states:

‘A decline in the availability of water supplies may make the current intensive agriculture industry in California’s Central Valley unsustainable in the long term.’

Did you hear that? It’s the sound of your favourite Napa wine grapes shrivelling and dying in the heat.

The end of irrigated agriculture in California? (photo: flickr)

So what does this all mean? Well, long story short, it means that while we here in Canada aren’t experiencing the worst of the climate extremes so far, and while each region of North America will change in its own way based on its local climate, we haven’t seen anything yet.

The long term trends are pretty clear for most areas (or at least statistically significant) and the consequences for communities and industries aren’t good. And that’s even before you start to think about non-linear climate responses and ecosystem tipping points. So, for the sake of the wine in Napa, the skiing in the Rockies, the agriculture in the prairies and the people who call New Mexico home, let’s stop burning carbon.