Crash Diets and Carbon Detoxes: Irreversible Climate Change

Many of the changes humans are causing in our atmosphere today will be largely irreversible for the rest of the millennium.

WHO: Susan Solomon, Chemical Sciences Division, Earth System Research Laboratory, National Oceanic and Atmospheric Administration, Boulder, Colorado, USA
Gian-Kasper Plattner, Institute of Biogeochemistry and Pollutant Dynamics, ETH Zurich, Zurich, Switzerland
Reto Knutti, Institute for Atmospheric and Climate Science, ETH Zurich, Zurich, Switzerland
Pierre Friedlingstein, Institut Pierre Simon Laplace/Laboratoire des Sciences du Climat et de l’Environnement, Unité Mixte de Recherche Commissariat à l’Energie Atomique – Centre National de la Recherche Scientifique – Université Versailles Saint-Quentin, Commissariat à l’Energie Atomique-Saclay, l’Orme des Merisiers, France

WHAT: Looking at the long-term effects of climate pollution out to the year 3000

WHEN: 10 February 2009

WHERE: Proceedings of the National Academy of Sciences of the USA (PNAS), vol. 106, no. 6 (2009)

TITLE: Irreversible climate change due to carbon dioxide emissions

Stopping climate change often gets described with the metaphor of ‘turning down the thermostat’ of the heater in your house: the heater gets left on too high for too long, you turn the thermostat back down, the room cools, and we are all happy.

This seems to also be the way many people think about climate change – we’ve put too much carbon pollution in the atmosphere for too long, so all we need to do is stop it, and the carbon dioxide will disappear like fog burning off in the morning.

Except it won’t. This paper, which is from 2009 but which I came across recently while reading around the internet, looks at the long-term effects of climate change and finds that the effects of CO2 emissions can still be felt 1,000 years after we stop polluting. Bummer. So much for that last-minute carbon detox that politicians seem to be betting on. Turns out it won’t do much.

The researchers defined ‘irreversible’ in this paper as lasting 1,000 years, to just beyond the year 3000, because over a human life span 1,000 years is more than 10 generations. Geologically it’s not forever, but from our human point of view it pretty much is.

So, because we can’t give up fossil fuels today, what’s going to keep happening that your great-great-great-great-great-great-great-great-great-great grandkid is going to look back on and say ‘well, that was stupid’?

The paper looked at the three best-understood and most thoroughly modelled effects: atmospheric temperatures, precipitation patterns and sea level rise. Other long-term impacts will be felt through Arctic sea ice melt, flooding and heavy rainfall, permafrost melt, hurricanes and the loss of glaciers and snowpack. However, the impacts with the most detailed models and the greatest body of research were the ones chosen for this paper (which also excluded the potential for geo-engineering, because it’s still very uncertain and unknown).

Our first problem is going to be temperature, because temperatures rise as CO2 accumulates in the atmosphere, but even if we turned off emissions completely (which is practically unfeasible, but works best for modelling the long-term effects), temperatures would remain roughly constant, within about 0.5°C, until the year 3000.

Why does this occur? Why doesn’t the temperature go back down just as quickly once we stop feeding the atmosphere more CO2? Because CO2 stays in the atmosphere for a much longer time than other greenhouse gases. As the paper says: ‘current emissions of major non-carbon dioxide greenhouse gases such as methane or nitrous oxide are significant for climate change in the next few decades or century, but these gases do not persist over time in the same way as carbon dioxide.’
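
To get a feel for that persistence, here’s a toy calculation using one common parameterisation of how a pulse of CO2 decays: the Bern carbon cycle impulse response quoted in IPCC reports (the coefficients below come from that published fit, not from this paper):

```python
import math

def co2_fraction_remaining(t_years):
    """Fraction of a CO2 pulse still airborne after t_years, using the
    Bern carbon cycle fit quoted in IPCC AR4. The constant 0.217 term is
    the kicker: about a fifth of the pulse effectively never leaves on
    human time scales."""
    return (0.217
            + 0.259 * math.exp(-t_years / 172.9)
            + 0.338 * math.exp(-t_years / 18.51)
            + 0.186 * math.exp(-t_years / 1.186))

for years in (10, 100, 1000):
    print(f"after {years:4d} years: {co2_fraction_remaining(years):.0%} still airborne")
# Roughly 66% after a decade, ~36% after a century, ~22% after a millennium.
```

Methane, by contrast, has an atmospheric lifetime of around a decade, which is why the quote singles CO2 out.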

Temperature changes to the year 3000 with different CO2 concentration peaks (from paper)

Our next problem is changing precipitation patterns, which follow from the Clausius-Clapeyron relation describing phase transitions in matter. What that relation tells us is that as temperature increases, the atmosphere can hold more water vapour (roughly 7% more per degree of warming), which changes how the vapour is transported through the atmosphere, changing the hydrological cycle.
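
As a rough illustration, here’s that relation in code, using the standard Bolton (1980) approximation for saturation vapour pressure (a textbook formula, not something from this paper):

```python
import math

def saturation_vapour_pressure(temp_c):
    """Saturation vapour pressure in hPa (Bolton 1980 approximation
    to the Clausius-Clapeyron relation)."""
    return 6.112 * math.exp(17.67 * temp_c / (temp_c + 243.5))

# How much more water vapour can air hold per degree of warming?
for t in (10, 15, 20, 25):
    gain = saturation_vapour_pressure(t + 1) / saturation_vapour_pressure(t) - 1
    print(f"{t}°C -> {t + 1}°C: +{gain:.1%} water vapour capacity")
# Roughly +6-7% per degree across typical surface temperatures.
```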

The paper notes that these patterns are already happening, consistent with the models, in the Southwest of the USA and the Mediterranean. They found that dry seasons will become approximately 10% drier for each degree of warming, so the Southwest of the USA is expected to be around 10% drier with 2°C of global warming. As a comparison, the Dust Bowl of the 1930s was 10% drier over two decades. Given that many climate scientists (and the World Bank) think we’ve already reached the point where 2°C of warming is inevitable, it seems like Arizona is going to become a pretty uncomfortable place to live.

Additionally, even if we manage to peak at 450ppm of CO2, irreversible decreases in dry-season precipitation of ~8-10% would be expected across large areas of Europe, Western Australia and North America.

Dry season getting drier around the world (from paper)

Finally, the paper looked at sea level rise, which is a triple whammy. The first issue is that warming causes seawater to expand (aka thermal expansion), which raises sea level. The second is that ocean mixing through currents will continue, spreading the warming, and with it the thermal expansion, into deeper water. Thirdly, melting ice caps and glaciers on land add new volume to the ocean.

The paper estimates the eventual sea level rise from thermal expansion alone at 20-60cm per degree of warming. Additionally, the loss of glaciers and small ice caps will give us another ~20-70cm of sea level rise, so we’re looking at 40-130cm of sea level rise before we even start counting Greenland (which is melting faster than most estimates anyway).
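
That range is simple arithmetic, which we can sketch out (the 1°C of warming is my illustrative assumption; the paper runs many scenarios):

```python
# Ranges quoted above, in metres
thermal_per_degree = (0.20, 0.60)   # thermal expansion, per °C of warming
glaciers_icecaps = (0.20, 0.70)     # glaciers and small ice caps

warming = 1.0  # °C, an illustrative assumption
low = thermal_per_degree[0] * warming + glaciers_icecaps[0]
high = thermal_per_degree[1] * warming + glaciers_icecaps[1]
print(f"Committed sea level rise: {low:.2f}-{high:.2f} m, excluding Greenland")
# -> 0.40-1.30 m, i.e. the 40-130cm quoted above
```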

Sea level rise from thermal expansion only, with different CO2 concentration peaks (from paper)

What does all of this mean? Well, firstly, it means you should check how far above sea level your house is, and you may want to hold off on that ski cabin too, what with all the melting snowpack.

More importantly though, it means that any last-minute ‘save the day’ Hollywood-style plans for reversing climate change as the proverbial clock counts down to zero are misguided and wrong. The climate pollution we are spewing into the atmosphere at ever greater rates today will linger as a carbon hangover for humanity for the next 1,000 years or so. Within human time scales, the changes we are causing to our atmosphere are irreversible.

So naturally, we should stop burning carbon now.

Admin Post

Yesterday, much to my excitement, the climate scientist Michael Mann retweeted my most recent blog post, sending my page views to a new record and attracting several climate deniers who tried to comment about how ‘climate change isn’t real’.

On this blog, I aim to bring peer reviewed climate science to a wider audience, so that we can better understand the challenges facing humanity caused by the burning of fossil fuels.

While there are certainly many aspects of climate mitigation that require much discussion and debate, what does not require debate is whether the science of climate change is real.

Comments that try to prove climate change is not real (thus attempting to disprove the overwhelming consensus of scientific research) will not make it through moderation.

And now for something less serious, here’s one of my favourite climate videos of all time, from Australia.

What’s in a Standard Deviation?

“By 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario” Marcott et al.

WHO: Shaun A. Marcott, Peter U. Clark, Alan C. Mix, College of Earth, Ocean, and Atmospheric Sciences, Oregon State University, Corvallis, Oregon, USA
Jeremy D. Shakun, Department of Earth and Planetary Sciences, Harvard University, Cambridge, Massachusetts, USA

WHAT: A historical reconstruction of average temperature for the past 11,300 years

WHEN: March 2013

WHERE: Science, Vol. 339 no. 6124, 8 March 2013 pp. 1198-1201

TITLE: A Reconstruction of Regional and Global Temperature for the Past 11,300 Years (subs req.)

We all remember the standard deviation bell curve from high school statistics, where in a population (like your class at school) there will be a distribution of some attribute, with most people falling near the mean. The usual example you start off with is the height of everyone in your classroom: most people will be around the same height, some will be taller, some shorter.

The further a value is from the mean, the less likely it is to occur, because around 68% of the population fits within the first standard deviation either side of the mean. The important bit to keep in mind when reading about this paper is that a bell curve is usually drawn with three standard deviations on either side of the mean, which covers 99.7% of all the data. The odds of a data point falling more than three standard deviations above the mean are about 0.1% (and the same below).
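
If you want to check those coverage numbers yourself, the maths fits in a few lines (pure textbook statistics, nothing specific to this paper):

```python
import math

def fraction_within(k):
    """Fraction of a normal distribution within k standard deviations of the mean."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3, 5):
    inside = fraction_within(k)
    print(f"within ±{k} sigma: {inside:.5%}; in the tails: {1 - inside:.1e}")
# ±1: ~68%, ±2: ~95%, ±3: ~99.7%
# ±5 (the low end of where we're heading by 2100, as we'll see below):
# all but ~6 in 10 million
```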

The standard deviation bell curve (Wikimedia Commons)

What does high school statistics have to do with global temperature reconstructions? Well, it’s always good to see what’s happening in the world within context. Unless we can see comparisons to what has come before, it can be really hard to see what is and isn’t weird when we’re living in the middle of it.

The famous ‘Hockey Stick’ graph, constructed for the past 1,500 years by eminent climate scientist Michael Mann, showed us how weird the current warming trend is compared to recent geologic history. But how does it compare to the whole of the Holocene?

Well, we live in unusual times. But first, the details. The researchers used 73 globally distributed temperature records with a variety of different proxies as their data. A proxy means looking at the chemical composition of something that has been around far longer than our thermometers, to work out what the temperature would have been. This can be done with ice cores, slow-growing trees and marine species like coral. According to the NOAA website, fossil pollen can also be used, which I think is awesome (because it’s kind of like Jurassic Park!).

They used more marine proxies than most other reconstructions (80% of their proxies were marine) because marine records are better suited to longer reconstructions. The resolution of the proxies ranged from 20 years to 500 years, with a median of 120 years.

They then ran the data through a Monte Carlo randomisation scheme (which is less exotic than it sounds) to quantify the uncertainties. Specifically, they also ran a ‘white noise’ data set with a mean of zero through the same procedure, to double-check how much real variability the method preserves. Then the chemical data was converted into temperature data before it all got stacked together into a weighted mean with a confidence interval. It’s like building a layer cake, but with math!
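
To make the stacking step concrete, here’s a minimal sketch of a Monte Carlo stack, using made-up proxy records on a shared time grid (the real method also perturbs the dating of each record and weights the records, so treat this as the flavour, not the recipe):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical stand-ins for the paper's 73 proxy records: each row is one
# record's temperature anomaly on a common 120-year grid, each with a
# one-sigma temperature uncertainty.
n_records, n_times = 73, 95  # ~11,300 years at 120-year resolution
proxies = rng.normal(0.0, 0.3, size=(n_records, n_times))
sigma = np.full(n_records, 0.3)

# Perturb every data point within its record's uncertainty many times,
# average across records each time, then take percentiles of the ensemble.
n_draws = 1000
stacks = np.empty((n_draws, n_times))
for i in range(n_draws):
    perturbed = proxies + rng.normal(0.0, sigma[:, None], size=proxies.shape)
    stacks[i] = perturbed.mean(axis=0)

mean_stack = stacks.mean(axis=0)
low, high = np.percentile(stacks, [2.5, 97.5], axis=0)  # 95% confidence band
print(mean_stack[:3], low[:3], high[:3])
```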

Interestingly, the white noise test showed the method was more faithful over longer time periods. Variability was preserved best at periods of 2,000 years or more, only half was left at the 1,000-year scale, and the variability was gone at periods shorter than 300 years.

They also found that their reconstruction lined up with the Mann et al. 2008 reconstruction over the most recent 1,500 years, and was also consistent with Milankovitch cycles (which ironically indicate that, without human interference, we’d be heading into the next glacial period right now).

Temperature reconstructions: Marcott et al. in purple (with blue confidence interval), Mann et al. in grey (from paper)

They found that the global mean temperature for 2000-2009 has not yet exceeded the warmest temperatures of the Holocene, which occurred 5,000-10,000 years before present (BP). However, we are currently warmer than 82% of the Holocene distribution.

But the disturbing thing in this graph that made me feel really horrified (and I don’t get horrified by climate change much anymore, because I read so much of it that I’m somewhat desensitised to ‘end of the world’ scenarios) is the rate of change. The paper found that within the past century, global temperatures have risen from near the coldest of the Holocene (the bottom of the purple line before it spikes up suddenly) to near the warmest.

We are causing changes in the Earth’s atmosphere so quickly that a swing which would naturally take over 11,000 years has just happened in the last 100. We’ve taken a 5,000-year cooling trend and spiked up to super-heated in record time.

This would be bad enough on its own, but it’s not even the most horrifying thing in this paper. It’s this:

‘by 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario.’ (my emphasis)

Remember how I said to keep in mind that 99.7% of all data points in a population fall within three standard deviations on a bell curve? That’s because we are currently heading off the edge of that chart into weird and unprecedented climate, beyond even the 0.1% chance of occurring without human carbon pollution.

The A1B scenario from the IPCC is the ‘medium worst case scenario’, and we are currently outstripping it with our continuously growing carbon emissions, which actually need to be shrinking. We are so far out in the tail of weird occurrences that we’re off the charts of a bell curve.

NOAA Mauna Loa CO2 data (396.80ppm at Feb. 2013)

As we continue to fail to reduce our carbon emissions in any meaningful way, we will reach 400ppm (parts per million) of carbon dioxide in the atmosphere in the next few years. At that point we will truly be in uncharted territory for any time in human history, on a trajectory changing so rapidly that it sits beyond 99.7% of the data for the last 11,300 years. The question for humanity is: are we willing to play roulette with our ability to adapt to this kind of rapidly changing climate?

Carbon in Your Supply Chain

How will a real price on carbon affect supply chains and logistics?

WHO: Justin Bull (PhD Candidate, Faculty of Forestry, University of British Columbia, Canada)
Graham Kissack (Communications Environment and Sustainability Consultant, Mill Bay, Canada)
Christopher Elliott (Forest Carbon Initiative, WWF International, Gland, Switzerland)
Robert Kozak (Professor, Faculty of Forestry, University of British Columbia, Canada)
Gary Bull (Associate Professor, Faculty of Forestry, University of British Columbia, Canada)

WHAT: Looking at how a price on carbon can affect supply chains, with the example of magazine printing

WHEN: 2011

WHERE: Journal of Forest Products Business Research, Vol. 8, Article 2, 2011

TITLE: Carbon’s Potential to Reshape Supply Chains in Paper and Print: A Case Study (membership req)

Forestry is an industry that’s been doing it tough in the face of rapidly changing markets for a while. From the Clayoquot Sound protests of the 1990s against clearcutting practices, to the growing realisation that deforestation is one of the leading contributors to climate change, it’s the kind of industry where you either innovate or you don’t survive.

Which makes this paper – a case study of how monetising carbon has the potential to re-shape supply chains and make them low carbon – really interesting. From the outset, the researchers recognise where our planet is heading under climate change, stating that ‘any business that emits carbon will [have to] pay for its emissions’.

To look at the potential for low carbon supply chains, the paper follows the production and printing of a magazine in North America, measuring the carbon emissions from cutting down the trees, turning the trees into paper, transporting materials at each stage of the process, and printing.

Trees to magazines (risa ikead, flickr)

They did not count the emissions of the distribution process, or any carbon emissions related to disposal after the magazine was read, because these had too many uncertainties in the data. However, they worked with the companies involved at each step to get the most accurate picture of the process they possibly could.

The researchers found that the majority of the carbon is emitted in the paper manufacturing process (41%), as the fibre went from a tree on Vancouver Island, was trucked to Port Alberni, manufactured into paper, then shipped by truck and barge to Richmond and by train to the printing press in Merced, California.

Activity                                                     Carbon emissions (kg CO2/ADt)    Percentage of total
Harvesting, road-building, felling, transport to sawmills                55                          12%
Sawmilling into dimensional and residual products                         45                          10%
Transport of chips to mill                                                 8                           2%
Paper manufacturing process                                              185                          41%
Transportation to print facility                                         127                          28%
Printing process                                                          36                           8%
Total                                                                    456                         100%

Supply Chain Emissions (Table 1, reproduced from the hardcopy)
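
As a quick sanity check, the percentage column is just each activity’s share of the 456kg total:

```python
emissions_kg = {
    "Harvesting and transport to sawmills": 55,
    "Sawmilling": 45,
    "Transport of chips to mill": 8,
    "Paper manufacturing": 185,
    "Transport to print facility": 127,
    "Printing": 36,
}
total = sum(emissions_kg.values())  # 456 kg CO2 per air-dried tonne of paper
for activity, kg in emissions_kg.items():
    print(f"{activity:38s} {kg:4d} kg  {kg / total:5.1%}")
```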

The case study showed that upstream suppliers consume more energy than downstream suppliers; however, downstream suppliers are the most visible to consumers. That poses a challenge when trying to get the larger emitters to minimise their carbon footprint, as there’s less likelihood of consumer pressure on lesser-known organisations.

That being said, there can be major economic incentives for companies to minimise their carbon footprint, given that Burlington Northern Santa Fe Railway (which shipped the paper from Canada to the printing press in California in this study) spent approximately $4.6 billion on diesel fuel in 2008 (the data year for the case study).

Given that California implemented a carbon cap-and-trade market at the end of 2012, and that the price of carbon is likely to increase with growing awareness of the urgency of reducing emissions rapidly and significantly, $4.6 billion in diesel costs could escalate quickly for a company like BNSF. If part or all of that transport energy could be switched to clean sources, the company would save itself significant amounts as polluting fossil fuels are phased out.

The companies in this study were very aware of these issues, which is encouraging. They agreed that carbon and sustainability will be considered more closely in the future, and that carbon has the potential to change the value of existing industrial assets, as corporations that are ‘carbon-efficient’ may become preferred suppliers.

The researchers identified three types of carbon-related risk that companies could face: regulatory risk, financial risk and market access risk. The innovative businesses that thrive in a low carbon 21st-century economy will be the ones thinking about and preparing for an economy that doesn’t burn carbon for fuel, or where burning carbon is no longer profitable.

I really liked the paper’s example of financial risk in the bond market, ‘where analysts are projecting a premium on corporate bonds for new coal fired power plants’, meaning it will be harder for companies to raise money to further pollute our atmosphere. This is especially important given that Deutsche Bank and Standard & Poor’s put the fossil fuel industry on notice last week, saying that easy finance for their fossil fuel party is rapidly coming to an end.

Of course, no-one ever wants to believe that the boom times are coming to an end. But the companies that think ahead of the curve and innovate to reduce their carbon risk instead of going hell for leather on fossil fuels will be the ones that succeed in the long run.

Hot Air Balloon – Heat Emissions in London

Detailed measurements of the thermal pollution in Greater London and a look at the long term trend.

WHO: Mario Iamarino, Dipartimento di Ingegneria e Fisica dell’Ambiente, Università degli Studi della Basilicata, Potenza, Italy
Sean Beevers, Environmental Research Group, King’s College London, London, UK
C. S. B. Grimmond, Environmental Monitoring and Modelling Group, Department of Geography, King’s College London, London, UK

WHAT: Measuring the heat pollution of London with regard to both where and when the emissions occur.

WHEN: July 2011 (online version)

WHERE: International Journal of Climatology (Int. J. Climatol.) Vol. 32, Issue 11, September 2012

TITLE: High-resolution (space, time) anthropogenic heat emissions: London 1970–2025 (subs. req)

Cities occupy a rather unique position in our world when we think about climate change. They are the source of the majority of emissions, but because of their dense nature they are also the places where the greatest innovations can happen and reductions in emissions can occur fastest.

If you want to harness the creativity and innovation of cities to reduce emissions, you need to know which emissions are coming from where, and when. You need data first.

That’s where these scientists from Italy and the UK stepped in, to work out exactly how much waste heat the city of London emits. Waste heat is an issue because it contributes to the urban heat island effect, where temperatures in cities can be higher than in the surrounding area, which can be deadly in summer heat waves and leads to greater power usage.

Something interesting that I hadn’t considered before is that concentrated emissions can also change the atmospheric chemistry in a localised area.

The researchers took some very detailed data from 2005-2008 for the city of London and broke it down into as accurate a picture as they could. For buildings, they looked at the differences between residential and commercial buildings and the different ways each building use emits heat.

They looked at transport data using traffic congestion figures, the London Atmospheric Emissions Inventory, and data from several acronym-happy government departments to work out what kinds of vehicles emit waste heat at what times of day, and at what temperature each type of fuel combusts. They didn’t count trains (underground or overground), because they are mostly electric (and some of them recover waste heat from braking), or small boats, which didn’t emit enough to be counted.

They looked at the heat emitted by people at various stages of activity, ages and times of day, assuming a 1:1 ratio of females to males. They even used standard metabolic rates to work out how much heat a person emits exercising, resting or sleeping!

Data! London waste heat emissions (from paper)

What all that number and formula crunching told them was that the total average anthropogenic heat flux for London was 10.9 W/m² (watts per square metre). That works out to about 150 terawatt hours of waste energy (in the form of heat) a year, which, as a comparison, is all of the electricity used in Sweden in 2010.
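
To see how the flux and the energy figure connect, here’s the back-of-envelope conversion. Greater London’s area of roughly 1,600 km² is my round number, not the paper’s:

```python
flux_w_m2 = 10.9            # average anthropogenic heat flux, W/m²
area_m2 = 1.6e9             # Greater London, ~1,600 km² (my assumption)
hours_per_year = 365.25 * 24

power_w = flux_w_m2 * area_m2                 # ~17 GW of continuous waste heat
energy_twh = power_w * hours_per_year / 1e12  # watt-hours -> terawatt hours
print(f"{power_w / 1e9:.1f} GW continuous ≈ {energy_twh:.0f} TWh per year")
# ≈ 153 TWh/year, in line with the ~150 TWh quoted above
```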

Of that total, 80% came from buildings, with 42% from domestic residences and 38% from industrial buildings. The next biggest source of heat was transport, at 15% of the total. Within the transport category, personal cars were the biggest contributor (64% of the transport portion).

Human heat contributed only 5.1% of the total (0.55 W/m²), so maybe they’re not doing enough exercise in London? The data also had peaks and valleys: winter releases more waste heat than summer, because heating systems in winter are more widespread than air conditioners in summer. The industrial building emissions were concentrated in the core of the City of London (especially Canary Wharf, where there are many new centrally heated and cooled high-rise offices), while the domestic building emissions were spread much more widely around central London.
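
That human share roughly checks out against a back-of-envelope estimate; the population and per-person heat output here are my round numbers, not the paper’s:

```python
population = 8.0e6       # Greater London residents (my rough assumption)
watts_per_person = 100   # average metabolic heat output (my rough assumption)
area_m2 = 1.6e9          # ~1,600 km², as before

human_flux = population * watts_per_person / area_m2
print(f"Human heat flux ≈ {human_flux:.2f} W/m²")
# ≈ 0.50 W/m², close to the paper's 0.55 W/m²
```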

Building heat emissions: domestic (left) and industrial (right) (from paper)

Once they had all the data from 2005 to 2008, they looked at trends from 1970 projected out to 2025, to see how great a role heat emissions could play in climate change. Using data from the Department of Energy and Climate Change, the London Atmospheric Emissions Inventory (LAEI), and data on population dynamics and traffic patterns, they estimated that all contributors to heat emissions would increase unless the UK greenhouse gas reduction targets are fully implemented.

The reduction target is an 80% cut by 2050 (against the baseline of 1990 emissions). Since the research indicates that buildings are the biggest heat emitters (and are therefore burning the most energy to stay at the right temperature), meeting those targets will require big increases in building efficiency.

The paper notes that if the Low Carbon Transition Plan for London is implemented, average waste heat emissions for London will drop to 9 W/m² by 2025, but in the central City of London the best reductions are likely only to bring emissions back to 2008 levels, due to the expected growth in the area.

So what does any of this mean? It means London now has the data to find efficiencies that can complement its greenhouse gas mitigation programs. Because that’s the thing about combating climate change: it’s not a ‘one problem, one solution’ kind of issue. We need to do as many different things as possible, all at once, to add up to the level of decarbonisation the world needs to avoid catastrophic climate change. So go forth, London, take this data and use it well!