Agreeing to Zero

If the UN climate negotiations were to actually produce an agreement in 2015 to replace the Kyoto Protocol, what would it look like?

WHO: Erik Haites, Margaree Consultants, Toronto, Canada
Farhana Yamin, University College London, Chatham House
Niklas Höhne, Ecofys, Wageningen University

WHAT: The possible legal elements of a new UN agreement on climate change mitigation (which UN members have promised to try to negotiate by 2015)

WHEN: 13 October 2013

WHERE: Institute for Sustainable Development and International Relations (IDDRI), Paris

TITLE: Possible Elements of a 2015 Legal Agreement on Climate Change (open access)

Let’s take a walk down fantasy lane for a moment, since the UNFCCC climate change talks are happening in Warsaw over the next two weeks, and imagine that the process begins to work. I’ll take off my cynical and sarcastic hat, and we can imagine together what the world might look like if nations sent negotiators with some actual power to commit their countries to reducing their carbon emissions.

Realistically, as any Canadian who has paid passing attention to the news can tell you, the Harper Government here in Canada is currently spending a lot of time and money muzzling and silencing their scientists who might talk about inconvenient things like climate change. So we know very little will come of these current negotiations. But I digress – as I said, let’s imagine, because that’s what this paper is doing.

These researchers looked at the current negotiations and figured that since all 194 countries that are part of the UN climate negotiations have agreed to negotiate a new legally binding agreement by 2015 to be implemented in 2020, maybe we should sit down and work out what the best one would look like? They helpfully point out that unless there’s an overarching plan, the negotiations get bogged down in procedural details, which is true given that Russia held up almost the entire negotiations on deforestation prevention financing in Bonn this year over the agenda.

Yes, Russia held up two weeks of negotiations over the agenda for the meeting (golf clap for Russia).

So to avoid negotiation grandstanding over the formatting of the agenda, what should the game plan be?

The researchers start by aiming high, stating ‘the climate regime needs to move out of continuous negotiation and into a framework of continuous implementation’ which would be awesome. They suggest we need a hybrid approach – avoiding the ‘top down’ regulation from above that tells countries how they can do things, allowing a ‘bottom up’ approach where countries get to choose how they want to reduce their emissions.

Elements of a 2015 agreement (from paper)

They also suggest we end the current split between developed and developing countries that has been standard so far in the negotiations (generally Annex 1 countries are the western industrialised countries while non-Annex 1 countries are the developing countries).

Instead of sitting down and trying to hash out exactly what each country is going to do to reduce their carbon emissions, the researchers say they should all agree to have net zero carbon emissions by 2050. Everyone. We all agree to zero by 2050 and then it’s up to each nation to work out how.

Using the words ‘net zero’ gives countries that believe in the commercial viability of carbon capture and storage (CCS) the option to use that technology and still have zero emissions, but most importantly it leaves the details up to each country.

One planet, different ways to reach zero emissions (Norman Kuring, NASA GSFC, using data from the VIIRS instrument aboard Suomi NPP)

The simplicity of this idea means that the only agreement needed for 2015 is ‘we all agree to have zero emissions by 2050 and agree that progress will be monitored by the UNFCCC’ (although the UN will say it in their very wordy form of bureaucrat). That makes the idea quite appealing and possibly even achievable.

So let’s imagine that everyone agreed that we should all have net zero carbon emissions by 2050. How would the UNFCCC monitor this?

It will need several things – ways to measure reductions, ways to enforce reductions, ways to avoid free-riding and commitments from each country that get upgraded every four years.

The idea these researchers have is that once everyone signs on, each country is then responsible for showing how they’re going to reduce their emissions. They’ll submit these national plans to the UN, and the UN will either approve them or send them back saying they’re not good enough. Each country will also have to prove it’s doing its fair share to reduce emissions (time to pony up the reductions EU, UK, Canada, US, Australia and others!).

The plans will then be reported on every year with the exception of countries that produce less than 0.1% of global emissions. Something that might be surprising to you – 100 countries fall under that threshold! So those countries would only have to report on their progress to zero emissions every five years.
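That reporting rule is simple enough to sketch in code. Here’s a minimal toy version of it – the rule (annual reporting above 0.1% of global emissions, five-yearly below it) is from the paper, but the country names and emission shares below are invented examples, not real figures:

```python
# Toy version of the proposed reporting-frequency rule:
# countries at or above 0.1% of global emissions report annually,
# everyone else reports every five years.
def reporting_interval_years(share_of_global_emissions: float) -> int:
    """Return how often (in years) a country must report its progress."""
    return 1 if share_of_global_emissions >= 0.1 else 5

# Hypothetical emission shares (percent of global total) for illustration.
example_shares = {"A": 14.5, "B": 0.3, "C": 0.08, "D": 0.02}
schedule = {c: reporting_interval_years(s) for c, s in example_shares.items()}
```

With roughly 100 real countries falling under the threshold, most of the world would be on the five-year cycle.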

In order to make the process complete, the agreement would need to include everything – air travel emissions, shipping emissions (and who is responsible for those emissions – the country of origin or the country that ordered the stuff?). There will also need to be more money for climate adaptation, because we are already in the age of climate impacts, which will involve wealthier countries coughing up the $100 billion each year they promised and haven’t delivered on.

Oh, and of course the agreement needs to be legally binding because everyone knows voluntary commitments are like buying a gym membership as a New Year’s resolution.

Now of course, my first question when reading about an internationally agreed, legally binding commitment to reduce carbon emissions to zero by 2050 through rolling four-year periods that automatically ramp up each time is whether the unicorns are included or extra.

Complimentary unicorns? (Gordon Ednie, flickr)

However, the researchers rightly point out that with currently available technologies and some political will, it’s possible to cut carbon emissions by 90% by 2050. They list sensible and achievable things to get us there like net zero carbon electricity by 2040, totally recyclable products by 2050, zero energy buildings by 2025, decarbonised and electrified passenger transit by 2040, hydrofluorocarbon (HFC) phase-out by 2030, zero food in landfill by 2025 and the end of deforestation by 2025.

Even better, they’ve got a list of things we can do before the agreement kicks in for 2020, like: removing the billions of dollars currently spent each year subsidising fossil fuels, better energy standards and air pollution standards, regulation of shipping and aviation emissions, and incentives for early mitigation. Sounds simple, no?

They also recommend incentives for countries that are beating their reduction targets as well as recognition for companies and other organisations within countries that are doing their part too.

This idea is great, if we could get past the sticking point of getting countries to send negotiators who would actually have the power to agree to a legally binding agreement (this year Australia didn’t bother to send their Minister for the Environment, because the new Prime Minister doesn’t believe in climate change).

But if we did all agree that global emissions should be zero by 2050 (which they need to be to preserve a livable climate), this paper outlines a pretty good idea of what it could look like.

Timing the Carbon Excursion

Trying to fine-tune the timeline for the Paleocene-Eocene mass extinction using fossilised mud

WHO: James D. Wright, Department of Earth and Planetary Sciences, Rutgers University
Morgan F. Schaller, Department of Earth and Planetary Sciences, Rutgers University; Department of Geological Sciences, Brown University, Providence, RI

WHAT: Trying to get a more precise timeline of the mass extinction at the Paleocene-Eocene thermal maximum (PETM).

WHEN: October 1st, 2013

WHERE: Proceedings of the National Academy of Sciences of the United States of America (PNAS), vol. 110, no. 40, October 2013.

TITLE: Evidence for a rapid release of carbon at the Paleocene-Eocene thermal maximum (subs req.)

My favourite accidental metaphor from this paper is when the researchers talked about a ‘carbon isotope excursion’ that caused the Paleocene-Eocene thermal maximum, which is scientist for carbon being released into the atmosphere causing a mass extinction. However, I couldn’t help thinking of a school bus full of carbon isotopes…

CO2 getting on a school bus??? (image: Hannah Holt, Lightbulb Books)

The Paleocene-Eocene thermal maximum (or PETM for short) occurred around 55.8 million years ago and is considered the best historical comparison to what we’re currently doing to the atmosphere with carbon pollution. There was rapid global warming of around 6°C, which was enough to cause a mass extinction, but estimates from deep-ocean fossilised carbon isotopes for how long the extinction took range from less than 750 years up to 30,000 years.

Obviously this is a huge window of time, and unless it can be further narrowed down, not very useful for humans trying to work out how long we have left to do something about climate change. Narrowing the time scale for the mass extinction is what these researchers tried to do.

Firstly, there are several ways the PETM mass extinction could have happened. The possibilities are: release of methane from reservoirs like permafrost or under the ocean, wildfires in peatlands, desiccation of the ocean (the air warming up enough that the ocean starts evaporating), or a meteor impact.

Without knowing the timing of the extinction, we can’t work out which of these caused it. So these researchers went to an area of the USA called the Salisbury Embayment, which is modern-day Delaware but was shallow ocean 55 million years ago. Because that shallow-ocean sediment fossilised on what is now dry land, it preserves a more detailed record than sediment still under the ocean, which keeps getting reworked by currents and so only records bigger changes.

The researchers looked at sediment cores from Wilson Lake B and Millville which have distinct and rhythmic layers of silty clay through the entire section that corresponds to the carbon excursion into the atmosphere.

Layers of clay in the sediment cores (from paper)

The distinct layers you can see above occur every 1-3cm along the core which allows for a more precise chronology of the mass extinction.

First they ran tests for the amount of 18O (an oxygen isotope) and found cycles of a 1.1% change consistently through the core from Wilson Lake B. They also found a similar cycle in the Millville core. They then tried to work out what kind of time scale this could line up with. After running a few different ideas that didn’t fit with other evidence from the period, they worked out that the oxygen cycles were consistent with seasonal changes, which means the sediment cores line up to 2cm of mud per year. This is excitingly precise! It also fits with what researchers know about the build-up of sediment in shallow areas – e.g. the Amazon River can accumulate 10cm/year of mud and clay, so 2cm per year fits the other evidence.
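That 2cm/year rate is what turns core depth into elapsed time. A minimal sketch of the conversion, assuming a constant sedimentation rate (which real cores only ever approximate):

```python
# Convert a depth interval in a sediment core into elapsed years,
# using the constant deposition rate the researchers worked out.
SEDIMENTATION_CM_PER_YEAR = 2.0

def depth_interval_to_years(top_m: float, bottom_m: float) -> float:
    """Years of time represented by a core interval between two depths (metres)."""
    interval_cm = abs(bottom_m - top_m) * 100  # metres -> centimetres
    return interval_cm / SEDIMENTATION_CM_PER_YEAR

# The sampled onset interval from the paper (273.37 m to 273.96 m)
# works out to roughly 30 years of mud.
years = depth_interval_to_years(273.37, 273.96)
```

So a 59cm slice of core covers about three decades, which is why individual 1–3cm layers can resolve events down to single years.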

Next in the researchers’ whodunit, they sampled the beginning of the carbon release, which was at 273m along the core. To be precise, they sampled from 273.37 – 273.96m and tested for 13C (a carbon isotope).

They found the carbon isotope values in the silt dropped to 3.9%, then 1.45% and then to -2.48% over thirteen years. A drop in the ocean’s carbon isotope values means carbon is being released into the atmosphere. The concentration of calcium carbonate (CaCO3), which is what shells are made of, also dropped from 6% to 1% within a single layer of sediment. So there was definitely something going on, but what exactly was it?

The beginning of the carbon release is the unknown bit, and it’s generally agreed that the recovery from the 6°C of global warming took 150,000 years. The sediment layers show that the release of carbon was almost instantaneous (it happened over 13 years), and after the release it took between 200-2,000 years for the ocean and the atmosphere to reach a new equilibrium. This means that after the initial carbon release the atmosphere changed rapidly for the next 2,000 years before it reached the ‘new normal’. From there it then took 150,000 years to go back to the original normal from before the carbon release.

Long road to recovery – PETM followed by 2,000 years of change, then a slow slope back to normal temps (from Hansen et al. PNAS 2013; bottom scale in millions of years)

Looking back at what order of events could have caused this for a mass extinction, the only one that fits is methane release AND a meteorite. There was a recovery of carbon isotope concentrations in the upper clay, which the researchers state indicates carbon moving from the atmosphere into the ocean, with more of that exchange happening in the shallow areas than the deep ones. However, the meteorite alone wouldn’t be enough to cause 6°C of global warming. It takes around 3,000 Gigatonnes (Gt) of carbon to change the atmosphere by that much, which is why the researchers think it was a meteorite plus methane release from the ocean that together add up to the 3,000Gt.

So what are the lessons for humanity in this great carbon excursion of mass extinction? Firstly, climate change can happen quite quickly. Secondly, once you’ve changed the atmosphere, it’s a long road back to the normal you had before you burned the carbon. Thirdly, it only takes 3,000Gt of carbon to cause a mass extinction.

The last UN IPCC report gave the planet a carbon budget of 1,000Gt of total emissions to keep humanity within a level of ‘safe’ climate change. Currently, the world has burned half of that budget and will blow through the rest of it within the next 30 years. If we burn enough carbon to start releasing methane from permafrost or the oceans, it’s a 150,000-year trek back from a possible mass extinction.
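The arithmetic behind that 30-year claim is worth unpacking. A sketch using the round numbers in the paragraph above (1,000Gt budget, roughly half already burned; the emission rates fed to the function are illustrative, not measured figures):

```python
# Rough carbon-budget arithmetic from the IPCC round numbers quoted above.
TOTAL_BUDGET_GT = 1000.0    # total emissions budget for 'safe' climate change
ALREADY_BURNED_GT = 500.0   # roughly half the budget, per the text

remaining_gt = TOTAL_BUDGET_GT - ALREADY_BURNED_GT

def years_left(annual_emissions_gt: float) -> float:
    """Years until the remaining budget is exhausted at a constant emission rate."""
    return remaining_gt / annual_emissions_gt

# Blowing through the rest within ~30 years implies an average rate of about:
implied_rate_gt_per_year = remaining_gt / 30  # roughly 16.7 Gt per year
```

The point of the budget framing is exactly this simplicity: one subtraction and one division tell you how long you’ve got at any assumed burn rate.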

I guess it is true – those who don’t learn from history are doomed to repeat it. How about we learn from history and stop burning carbon?


Oct. 29th – caption for Cenozoic era graph changed for clarity (time scale is different as the graph is from a different paper).

Smoking Kills, so does Climate Change

A translation of the IPCC 5th Assessment Report Summary for Policymakers

WHO: The Intergovernmental Panel on Climate Change

WHAT: Summary for policymakers of their 2,000-page 5th Assessment Report (AR5) on the state of the climate and climate science.

WHEN: 27 September 2013

WHERE: On the IPCC website

TITLE: Climate Change 2013: The Physical Science Basis Summary for Policymakers (open access)

There are a lot of things not to like about the way the IPCC communicates what they do, but for me the main one is that they speak a very specific dialect of bureaucrat that no-one else understands unless they’ve also worked on UN things and speak the same sort of acronym.

The worst bit of this dialect of bureaucrat is the words they use to describe how confident they are that their findings are correct. They probably believe they’re being really clear, however they use language that none of the rest of us would ever use and it means their findings make little sense without their ‘very likely = 90-100% certain’ footnote at the front.
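For reference, that footnote’s calibrated language maps onto probability ranges like this – the ranges below are the IPCC’s own standard likelihood scale, written out as a plain lookup table:

```python
# The IPCC's calibrated likelihood language, as probability ranges (percent).
IPCC_LIKELIHOOD = {
    "virtually certain": (99, 100),
    "very likely": (90, 100),
    "likely": (66, 100),
    "about as likely as not": (33, 66),
    "unlikely": (0, 33),
    "very unlikely": (0, 10),
    "exceptionally unlikely": (0, 1),
}

def at_least_percent(term: str) -> int:
    """Minimum probability implied by an IPCC likelihood term."""
    return IPCC_LIKELIHOOD[term][0]
```

So when you read ‘very likely’ below, mentally substitute ‘at least 90% certain’, and when you read ‘likely’, ‘at least 66% certain’.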

So now that we’ve established that the UN doesn’t speak an understandable form of English, what does the report actually say? It works its way through each of the different climate systems and how they’re changing because humans are burning fossil fuels.

As you can see from this lovely graph, each of the last three decades has been noticeably warmer than the preceding one, and the IPCC are 66% sure that 1983-2012 was the warmest 30-year period in 1,400 years.

Decade by decade average temperatures (Y axis is change in Celsius from base year of 1950) (from paper)

One of the reasons I really like this graph is you can see how the rate of change is speeding up (one of the key dangers with climate change). From 1850 through to around 1980 each decade’s average is touching the box of the average before it, until after the 80s when the heat shoots up much more rapidly.

The report did have this dig for the deniers though: ‘Due to natural variability, trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends’. Which is UN bureaucrat for ‘when you cherry pick data to fit your denier talking points you’re doing it wrong’.

Looking at regional atmospheric trends, the report notes that while things like the Medieval Warm Period did have multi-decadal periods of change, these changes didn’t happen across the whole globe like the warming currently being measured.

In the oceans, the top layer (the top 75m) has warmed by 0.11°C per decade from 1971 to 2010, and more than 60% of the extra energy we’ve trapped in the climate system since 1971 has been stored in the upper ocean, with another 30% being stored in the ocean below 700m.

This extra heat is not just causing thermal expansion, it’s speeding up sea level rise, which the IPCC are 90% certain has accelerated from an average of 1.7mm per year over 1901-2010 to 3.2mm per year since 1993. Sea level is now rising faster than it has for the last 2,000 years, so you might want to sell your waterfront property sooner, rather than later.

The extra carbon has also made it harder to live in the ocean if you own a shell, because the acidity of the ocean has increased by 26% which makes shells thinner and harder to grow.

On the glaciers and the ice sheets, the IPCC is 90% certain that the rate of melting from Greenland has increased from 34 Gigatonnes (Gt) of ice per year to 215Gt per year after 2002. Yes, increased from 34Gt to 215Gt – it’s melting six times faster now thanks to us.

For Antarctica, the IPCC is 66% certain that the rate of ice loss has increased from 30Gt per year to 147Gt per year, with most of that loss coming from the Northern Peninsula and West Antarctica. Worryingly, that’s the net figure – it already includes the parts of Antarctica that are gaining ice due to natural variability.

And at the North Pole, Santa is going to have to buy himself and his elves some boats or floating pontoons soon, because the IPCC have found ‘multiple lines of evidence support[ing] very substantial Arctic warming since the mid-20th Century’. Sorry Santa!

As for the carbon we’ve been spewing into the atmosphere since the 1850s, well, we’re winning that race too! ‘The atmospheric concentrations of carbon dioxide, methane and nitrous oxide have increased to levels unprecedented in at least the last 800,000 years’. Congratulations humanity – in the last century and a half, we’ve changed the composition of the atmosphere so rapidly that this hasn’t been seen in 800,000 years!

Methane levels have gone up by 150%, and I’m undecided as to whether that means I should eat more beef to stop the cows from farting, or if it means we raised too many cows to be steaks in the first place…

This is the part of the report where we get into the one excellent thing the IPCC did this time around – our carbon budget. I’m not sure whether they realised that committing to reduce certain percentages by certain years from different baselines meant that governments were able to shuffle the numbers to do nothing and make themselves look good at the same time, but this is a promising step.

I’ve written about the very excellent work of Kevin Anderson at the Tyndall Centre in the UK before, but the basic deal with a carbon budget is this: it doesn’t matter when we burn the carbon or how fast, all that matters is the total emissions in the end. You can eat half the chocolate bar now, and half the chocolate bar later, but you’re still eating a whole bar.

Our budget to have a 2/3 chance of not going beyond dangerous climate change is 1,000Gt of carbon and so far we’ve burnt 545Gt, so we’re more than halfway there. All of this leads to the conclusion that ‘human influence on the climate system is clear. This is evident from the increasing greenhouse gas concentrations in the atmosphere, positive radiative forcing, observed warming and understanding of the climate system.’

What observations you may ask? Scientists have made progress on working out how climate change pumps extreme weather up and makes it worse. They also got it right for the frequency of extreme warm and cold days, which if you live in North America was the hot extremes winning 10:1 over the cold extremes. Round of applause for science everyone!

Warming with natural forcing vs human forcing and how it lines up with the observations (from paper)

They’re also 95% sure that more than half of the observed global surface warming from 1951 is from humanity. So next time there’s a nasty heatwave that’s more frequent than it should be, blame humans.

The report does also point out though that even though the heat records are beating the cold records 10-1, this doesn’t mean that snow disproves climate change (sorry Fox News!). There will still be year-to-year and decade-by-decade variability in how our planet cooks, and it will not be the same across the whole planet. Which sounds to me like we’re being warmed in an uneven microwave. For instance, El Niño and La Niña will still be big influencers over the Pacific and will determine to a great extent the variability in the Pacific North West (yeah, it’s still going to rain a lot Vancouver).

For those that were fans of the film The Day After Tomorrow, there’s a 66% chance the Atlantic Meridional Overturning Circulation will slow down, but only a 10% chance it will undergo an abrupt change or collapse like it did in the film, so you’re not going to be running away from a flash-freezing ocean any time this century.

The report then runs through the different scenarios they’ve decided to model, which range from ‘we did a lot to reduce carbon emissions’ to ‘we did nothing to reduce carbon emissions and burned all the fossil fuels’. Because this is the IPCC and they had to get EVERYONE to agree on each line of the report (I’m serious, they approved it line by line, which has to be the most painful process I can think of), the scenarios are conservative in their estimations and don’t include tipping points (which are really hard to incorporate anyway). So their ‘worst case scenario’ is only 4.0°C of surface warming by 2100.

Representative Concentration Pathway (RCP) Scenarios from the IPCC AR5

Now, obviously ‘only’ 4°C of climate change by the end of the century is still pretty unbearable. There will still be a lot of hardship, drought, famine, refugee migration and uninhabitable parts of the planet at 4°C. However, getting to 4°C is likely to have triggered tipping points like methane release from permafrost, so 4°C would be a staging post on the way to 6°C even if we ran out of carbon to burn. And 6°C of course, as you all hear me say frequently, is mass extinction time. It’s also the point at which, even if humanity did survive, you wouldn’t want to live here anymore.

The paper finishes up with a subtle dig at the insanity of relying on geoengineering, pointing out that trying to put shade reflectors into orbit or artificially suck carbon out of the air has a high chance of going horribly wrong. They also point out that if we did manage large-scale geoengineering and it then broke down, re-releasing that carbon back into the atmosphere would super-cook the planet really quickly.

The moral of this 36-page ‘summary’ is that it’s us, guys. We’re as certain that we’ve done this as we are that smoking causes cancer. We have burned this carbon, it has changed the chemistry and energy balance of the atmosphere, and if we don’t stop burning carbon we’re going to cook the whole planet. Seriously. So let’s finally, actually stop burning carbon.

Reviewing Our Budget

In the lead up to the IPCC 5th Assessment Report next week, let’s review the Unburnable Carbon report and remind ourselves how much carbon we have left to burn.

WHO: James Leaton, Nicola Ranger, Bob Ward, Luke Sussams, and Meg Brown, Carbon Tracker Initiative

WHAT: Measuring the amount of capital, assets and infrastructure that is currently overvalued and will be stranded or wasted when we act on climate change.

WHEN: 2013

WHERE: On the Carbon Tracker website

TITLE: Unburnable Carbon 2013: Wasted capital and stranded assets (open access)

As I’m sure all of you RtS readers are aware (and excited about!), the IPCC are releasing the first part of their 5th Assessment Report on Friday September 27th and then slowly drip-feeding us chapter by chapter over the next year.

This is exciting for climate nerds like me because the last IPCC report came out in 2007, so it was looking at the state of climate science in a very different world – before the 2008 financial crash, before the weather started getting seriously weird and going haywire, before 98% of Greenland’s surface melted one summer, the Arctic Death Spiral, the failure of the 2009 Copenhagen talks…. yeah, a lot has happened since 2007.

So, in preparation for when this international ‘State of the Climate’ report comes out, I thought it would be good to look at Carbon Tracker’s Unburnable Carbon report from this year to remind ourselves of the budget of carbon we have left that we can spew into the atmosphere and still have a chance of not totally cooking the climate.

The Carbon Tracker report looks at two different budgets – if we want to have an 80% chance of not going beyond a certain amount of global warming, and if we want to have a 50% chance of not going beyond a certain amount of global warming. Given that we haven’t done much to lower global carbon emissions yet, I think we’ll push to a 50/50 chance of cooking our habitat (humans are great at procrastinating after all), but feel free to be optimistic and look at the 80% option.

Carbon budget from now until 2050 (from paper)

If we start from the assumption that humanity will act to save ourselves and keep global warming at 2°C or less with a 50/50 chance, we have 1,075 Gigatonnes (Gt) of CO2 left to burn over the next 37 years.

Now, you might ask – what about carbon capture and storage? Everyone says that technology is going to be big! The Carbon Tracker people ran those numbers. The 2015 estimate for carbon capture and storage (CCS) is 2.25 million tonnes of CO2 being sequestered across 16 projects. The idealised scenario for CCS is that it will be able to sequester around 8Gt of CO2 each year, which Carbon Tracker worked out would mean 3,800 projects operating by 2050, and would only reduce the above carbon budgets by 125Gt. It definitely isn’t a ‘get out of jail free and burn the fossil fuels’ card.
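The scale of that CCS gap is clearer with the division done explicitly. A sketch using the report’s round numbers quoted above; the per-project averages are implied by those numbers, not figures from the report itself:

```python
# CCS scale-up arithmetic from the Carbon Tracker numbers quoted above.
CURRENT_CAPTURE_T = 2.25e6   # tonnes of CO2 per year, 2015 estimate
CURRENT_PROJECTS = 16
IDEALISED_CAPTURE_T = 8e9    # tonnes of CO2 per year in the idealised scenario
IDEALISED_PROJECTS = 3800

# Implied average capture per project, now vs the idealised 2050 scenario.
per_project_now = CURRENT_CAPTURE_T / CURRENT_PROJECTS        # ~140,000 t/year each
per_project_2050 = IDEALISED_CAPTURE_T / IDEALISED_PROJECTS   # ~2.1 Mt/year each

# The idealised scenario needs ~240x as many projects,
# each capturing roughly 15x more than today's average project.
scale_up_projects = IDEALISED_PROJECTS / CURRENT_PROJECTS
scale_up_per_project = per_project_2050 / per_project_now
```

In other words, even the optimistic scenario means multiplying both the number of projects and the size of each project by an order of magnitude or two, to shave a mere 125Gt off the budget.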

Speaking of burning all the fossil fuels – how much do we have left? The World Energy Outlook, released by the International Energy Agency each year, estimated in 2012 (the 2013 report will be released in November this year) that there were total assets equivalent to 2,860Gt of CO2 in the world. This is enough carbon to cook the atmosphere beyond 3°C and probably into the next mass extinction.

The report rightly points out that if we assume we want to save a livable climate and keep within the above carbon budgets, then between 65% and 80% of the listed reserves of fossil fuel companies cannot be burned. It’s simple math: 2,860 is more than double the budget for keeping under 2°C with a 50/50 chance of blowing past that temperature.
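That ‘simple math’ can be written out in a few lines. This is a sketch of the report’s own back-of-envelope logic using the figures above – the 50/50 budget is from the report, and tighter budgets (like the 80%-chance case) push the unburnable share higher:

```python
# Unburnable-carbon arithmetic from the Carbon Tracker figures above.
LISTED_RESERVES_GT = 2860.0  # total listed fossil fuel reserves, as Gt of CO2
BUDGET_50_50_GT = 1075.0     # budget for a 50/50 chance of staying under 2C

unburnable_gt = LISTED_RESERVES_GT - BUDGET_50_50_GT
unburnable_fraction = unburnable_gt / LISTED_RESERVES_GT  # ~62% on this budget
# More cautious budgets push this towards the 65-80% range quoted in the report.
```

Even on the most generous budget, well over half of everything on the books has to stay in the ground.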

But enough about trying not to cook the atmosphere – how about the important things – like what does it mean for financial markets?

Carbon Tracker looked at the capital expenditure of publicly listed fossil fuel companies to work out how much money is being spent trying to find new reserves of fossil fuels. Those new reserves will add to the list we can’t burn and are therefore over-valued, because the market valuation assumes they will be burned and a profit will be made from burning them.

In 2012, the 200 listed fossil fuel companies spent $674 billion on capital expenditure. $593 billion of that was spent looking for more oil and gas, while $81 billion was spent looking for more coal. If this kind of spending continues (if the companies don’t admit that there is going to be an end to carbon pollution), over the next decade $6.74 trillion could be wasted looking for fossil fuels that have to stay in the ground.
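The decade figure is just the 2012 spending run forward unchanged. A sketch of that extrapolation, using only the numbers quoted above (it’s a straight-line assumption, not a projection of actual future budgets):

```python
# Capital-expenditure extrapolation from the 2012 figures quoted above.
CAPEX_2012_BN = 674.0  # $billion spent by the 200 listed companies in 2012
OIL_GAS_BN = 593.0     # of which, oil and gas exploration
COAL_BN = 81.0         # of which, coal exploration

# Sanity check: the oil/gas and coal split adds up to the total.
assert OIL_GAS_BN + COAL_BN == CAPEX_2012_BN

# Ten years of the same annual spend, converted to $trillion.
decade_total_tn = CAPEX_2012_BN * 10 / 1000  # $6.74 trillion
```

One multiplication is the whole argument: a decade of business-as-usual exploration spending is $6.74 trillion chasing reserves that can’t be burned.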

As the authors say: ‘this has profound implications for asset owners with significant holdings in fossil fuel stocks’ because what investors are being sold is the lie that these reserves can be profitably sold and burned.

There’s also a lot of risk associated with this. Over the last two years, the amount of carbon being traded on the New York Stock Exchange has increased by 37% and in London it’s increased by 7%. This means that similar to the sub-prime loan crisis that precipitated the 2008 financial crash, all investors are exposed to carbon risk through the over-valuation of fossil fuel companies.

Map of oil, gas and coal reserves listed on world stock exchanges (from paper)

There’s a risk of carbon not being properly managed as a risk within stock portfolios, which could create a carbon bubble that bursts as carbon starts being constrained. There’s also the risk of stranded assets, where fossil fuel companies sink all the capital expenditure into their projects only to find they can’t burn the carbon, and there was no point in building the mine, gas well or oil platform in the first place.

The report states: ‘investors need to challenge the continued pursuit of potentially unprofitable projects before costs are sunk’. This makes sense also because oil and gas are becoming harder to get at (tarsands, tight oil, gas fracking, Arctic drilling), so the cost is going up and the profit margins are getting squeezed, unless the price of oil keeps climbing. This means that fossil fuels are going to increasingly become challenged by lower cost lower carbon options, because mining for sunshine is really not that hard.

So if we agree that we’ll stop burning carbon before we’ve cooked the atmosphere and therefore that means that 80% of the world’s fossil fuel reserves need to stay in the ground, what does it mean for the fossil fuel companies themselves?

Well, they may have some debt problems on the horizon. The report points out that even without a global climate change agreement, the coal industry in the USA is looking increasingly shaky just from new air quality regulations. If business models can unravel that quickly, these companies may have problems refinancing their debts when they mature in the near future (which is also a risk for investors). Any company with tar sands exposure will also have stranded asset risk, because the business model relies on high oil prices to be profitable.

Basically, they show that traditional business models are no longer viable in energy markets that are moving towards decarbonisation, and that investors are going to need different information to manage carbon risk in their portfolios.

In the final section of the report, Carbon Tracker gives a list of recommendations for investors, policy makers, finance ministers, financial regulators, analysts and ratings agencies for how we can avoid this financial carbon bubble. The recommendations include better regulation, shareholder engagement and resolutions, fossil fuel divestment, and better risk definition.

The full list of Carbon Tracker recommendations (click to embiggen) (from paper)

For what it’s worth, my recommendations would be to remove fossil fuel subsidies, stop looking for new reserves of carbon that we can’t burn and price carbon pollution. And as usual, stop burning carbon.

If We Burn All the Fossil Fuels

“The practical concern for humanity is the high climate sensitivity and the eventual climate response that may be reached if all fossil fuels are burned” – Hansen et al. September 2013

WHO: James Hansen, Makiko Sato, The Earth Institute, Columbia University, New York, NY
Gary Russell, NASA Goddard Institute for Space Studies, New York, NY
Pushker Kharecha, The Earth Institute, Columbia University, NASA Goddard Institute for Space Studies, New York, NY

WHAT: Using deep ocean oxygen isotope ratios to determine how sensitive sea levels and surface temperatures are to climate forcing.

WHEN: September 2013

WHERE: Philosophical Transactions of the Royal Society A (Phil Trans R Soc A) Vol. 371, No. 2001

TITLE: Climate sensitivity, sea level and atmospheric carbon dioxide (open access)

Ok, firstly, let us just take a moment to geek out about how awesome science is. This paper has looked at what our planet was like millions of years ago by studying the amounts of different oxygen and carbon isotopes in the shells of foraminifera that have been buried at the bottom of the ocean since they died, millions of years ago. Science can tell us not only how old they are, by dating the carbon in their fossilised bodies, but also what the temperature was when they were alive. That is awesome.

Foraminifera from Japan (Wikimedia commons)
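For the geeks who want to see the trick in action, here’s a toy sketch (my illustration, not code from the paper) of how δ18O works: you measure how enriched a shell is in heavy oxygen relative to a standard, and a calibration curve (this one is the classic Epstein-style carbonate equation; several exist) converts that into the temperature of the water the shell grew in.

```python
# Toy sketch of the delta-18-O paleothermometer (illustrative, not from the paper).

def delta_18o(ratio_sample, ratio_standard):
    """delta-18-O in per mil: how enriched in heavy oxygen (18O)
    the sample is relative to a standard."""
    return (ratio_sample / ratio_standard - 1.0) * 1000.0

def carbonate_temperature(delta_calcite, delta_water=0.0):
    """Classic Epstein-style carbonate calibration (one of several):
    warmer water -> less 18O locked into the shell -> lower delta -> higher T (deg C)."""
    d = delta_calcite - delta_water
    return 16.5 - 4.3 * d + 0.14 * d ** 2

# A shell slightly enriched in 18O relative to the standard:
d = delta_18o(0.0020074, 0.0020052)        # ~1.1 per mil
print(round(carbonate_temperature(d), 1))  # ~12 deg C
```

The real analysis also has to correct for how much water was locked up in ice sheets (which shifts the δ18O of seawater itself), which is exactly why the paper can read both temperature and ice volume out of the same sediment cores.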

The lead author of this paper – Dr. James Hansen – is pretty much the Godfather of climate science. He’s been running climate models of the possible effects of extra carbon in our atmosphere since the 1980s, when he basically had to do them by hand, before we had the internet. He knows his stuff. And so far, he’s been right with his projections.

The paper (which is a very long read at 25 pages) focuses on the Cenozoic climate, the period from 65.5 million years ago to the present. The Cenozoic is the period after the Cretaceous (so we’re talking mammals here, not dinosaurs) and includes the Palaeocene-Eocene thermal maximum, when the deep ocean was 12°C warmer than today, as well as the cooling from there that led to the formation of the Greenland and Antarctic ice sheets.

The period of time studied by the paper (bottom axis is million years before present) (from paper)

What does this show us? The warming that eventually led to the Palaeocene-Eocene thermal maximum started around 3,000 years before a massive carbon release. The researchers think the carbon came from methane hydrates venting from the ocean, because there was a lag in warming in the intermediate ocean after the release.

The thermal maximum had global surface temperatures around 5°C warmer than today, and it took about 4,000-7,000 Gigatonnes (Gt) of carbon released into the atmosphere to force that kind of warming.

After this warming happened there were ‘hyperthermal’ events (where the temperature spiked again) as the planet slowly cooled, showing how long the recovery time for the planet was from this greenhouse warmed state.

In the warmed world of the Palaeocene-Eocene maximum, sea levels were probably 120m higher than they are now. The researchers found there’s a snowball effect with changes in ocean temperature: a −1°C difference in deep ocean temperature was enough to trigger the last ice age, while sea levels were 5-10m higher when temperatures were ‘barely warmer than the Holocene’ (which is us – we live in the Holocene).

The researchers found that during the Pliocene (about 5 million years ago), sea levels were 15m higher than today, which they point out means the East and West Antarctic ice sheets are likely to be unstable at temperatures we will reach this century by burning fossil fuels.

From the data they then tried to work out what the sensitivity of the atmosphere is to extra carbon. This is important to know, because we’re currently changing the chemical composition of the atmosphere much faster than ever before. The previous greenhouse warming that the planet experienced occurred over millennial time scales – the current rate that we’re pumping carbon into the atmosphere is causing change over only hundreds of years.

To work out how sensitive the climate is to being forced by carbon, the researchers used a simplified model where the atmosphere was split into 24 layers to test the rapid equilibrium responses to forcing.

They wanted to find out whether we could be in danger of runaway climate change – the most extreme version of which happened on Venus, where runaway warming amplified by water vapour led to a new stable average temperature of 450°C: the carbon was baked onto the surface of the planet and all the water evaporated into the sky. Obviously, humanity will want to avoid that one… The good news is there isn’t enough carbon on this planet for humans to accidentally do that to ourselves before the sun does it to us in a billion years or so.


We’ve avoided this for now (NASA NSSDC Photo Gallery)

The researchers then tested the response to doubling and halving the CO2 in the system, from the 1950 concentration of 310ppm. They found that three successive halvings of CO2 give you a ‘snowball Earth’ response of mass glaciations, while in the other direction, somewhere between 1-4x CO2 all the snow disappears, which speeds up the warming (because snow reflects heat), making the fast-feedback sensitivity 5°C of global warming. For 8-32x CO2 the sensitivity is approximately 8°C with water vapour feedbacks (what happened on Venus, on a smaller scale).
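The reason everything gets counted in doublings and halvings is that CO2 forcing is logarithmic: each doubling adds roughly the same push. Here’s a quick sketch using the standard simplified forcing formula (the Myhre et al. 5.35·ln(C/C0) approximation – my addition, not the 24-layer model the authors actually ran):

```python
import math

def co2_forcing(c_ppm, c0_ppm=310.0):
    """Simplified CO2 radiative forcing in W/m^2 (Myhre et al. approximation),
    relative to the paper's 1950 baseline of 310 ppm."""
    return 5.35 * math.log(c_ppm / c0_ppm)

# Every doubling adds the same ~3.7 W/m^2 of forcing:
for factor in (0.5, 1, 2, 4, 8):
    print(f"{factor:>4}x CO2: {co2_forcing(310 * factor):+.2f} W/m^2")
```

Multiply a forcing by a climate sensitivity parameter (the thing the paper is trying to pin down) and you get an equilibrium temperature change – which is why the sensitivities above are quoted per doubling range.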

But what do any of these numbers mean?

As the paper says; ‘the practical concern for humanity is the high climate sensitivity and the eventual climate response that may be reached if all fossil fuels are burned’.

So here’s the lesson we need to learn from the Palaeocene-Eocene thermal maximum. For global warming we can assume that 75% comes from CO2 and the remaining 25% from other greenhouse gases like methane and nitrous oxide. If we burn all the fossil fuels we have left in the ground, that’s about 10,000-15,000 Gt of carbon that we could put into the atmosphere.

That gives us 5x the CO2 of 1950, or about 1,400ppm, which would mean around 16°C of global warming. It would be a world with an average temperature of 20°C on land and 30°C at the poles (the current average is 14°C). Keep in mind that 6°C of warming is generally enough for a mass extinction like the one that wiped out the dinosaurs.

This would eliminate grain production across most of the globe and seriously increase the amount of water vapour in the air, making it far more humid (and the water vapour would destroy most of the ozone layer too).

A wet bulb temperature is the temperature with humidity factored in. Humans generally live with wet bulb temperatures between 26-27°C, up to 31°C in the tropics. A wet bulb temperature of 35°C or above means the body can’t cool itself down, resulting in ‘lethal hyperthermia’ – which is scientist-speak for ‘it’s so hot and sticky that you die from the heat’.
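If you want to play with wet bulb temperatures yourself, here’s a sketch using Stull’s (2011) empirical approximation – my choice of formula, not one used in the paper – which estimates wet bulb temperature from air temperature and relative humidity at sea-level pressure:

```python
import math

def wet_bulb_stull(t_c, rh_pct):
    """Stull (2011) empirical wet-bulb approximation.
    t_c: air temperature in deg C; rh_pct: relative humidity in percent.
    Accurate to roughly 0.3 deg C at sea-level pressure."""
    return (t_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(t_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

# 40 deg C at 60% humidity is already approaching the lethal ~35 deg C threshold:
print(round(wet_bulb_stull(40.0, 60.0), 1))  # ~33 deg C
```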

Burning all the fossil fuels will result in a planet with wet bulb temperatures routinely above 35°C, which means we’ll have cooked the atmosphere enough that we’ll end up cooking ourselves.

If the climate has a low sensitivity to this kind of forcing, it will take 4-8x CO2 concentrations to cause an unlivable climate. If the climate is more sensitive, it will take less than that to cook ourselves.

Oh, and the other kicker? The Palaeocene-Eocene thermal maximum played out over many thousands of years, so the mammals survived by evolving to be smaller. Our climate change is only taking hundreds of years, which is not enough time for plants or animals to evolve and adapt.

Basically, if we burn all the fossil fuels, we’re all going down and taking the rest of the species on the planet with us, and we really will be the dumbest smart species ever to cause our own extinction.

So far, James Hansen has been correct with his climate projections. So when he says we can’t burn all the fossil fuels because if we do we’ll cook the planet, I say we pay close attention to what he says. Oh, and we should stop burning carbon.

Nemo the Climate Refugee

If you collected all the recent research on marine species and climate change, could you see a pattern of fish and marine species migration?

WHO: Elvira S. Poloczanska, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia
Christopher J. Brown, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia, School of Biological Sciences, The University of Queensland, Australia
William J. Sydeman, Sarah Ann Thompson, Farallon Institute for Advanced Ecosystem Research, Petaluma, California, USA
Wolfgang Kiessling, Museum für Naturkunde, Leibniz Institute for Research on Evolution and Biodiversity, Berlin, Germany, GeoZentrum Nordbayern, Paläoumwelt, Universität Erlangen-Nürnberg, Erlangen, Germany
David S. Schoeman, Faculty of Science, Health and Education, University of the Sunshine Coast, Maroochydore, Queensland, Australia, Department of Zoology, Nelson Mandela Metropolitan University, Port Elizabeth, South Africa
Pippa J. Moore, Centre for Marine Ecosystems Research, Edith Cowan University, Perth, Western Australia, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth UK
Keith Brander, DTU Aqua—Centre for Ocean Life, Technical University of Denmark, Charlottenlund Slot, Denmark
John F. Bruno, Lauren B. Buckley, Department of Biology, The University of North Carolina at Chapel Hill, North Carolina, USA
Michael T. Burrows, Scottish Association for Marine Science, Scottish Marine Institute, Oban, UK
Johnna Holding, Department of Global Change Research, IMEDEA (UIB-CSIC), Instituto Mediterráneo de Estudios Avanzados, Esporles, Mallorca, Spain
Carlos M. Duarte, Department of Global Change Research, IMEDEA (UIB-CSIC), Instituto Mediterráneo de Estudios Avanzados, Esporles, Mallorca, Spain, The UWA Oceans Institute, University of Western Australia,
Benjamin S. Halpern, Carrie V. Kappel, National Center for Ecological Analysis and Synthesis, Santa Barbara, California, USA
Mary I. O’Connor, University of British Columbia, Department of Zoology, Vancouver, Canada
John M. Pandolfi, Australian Research Council Centre of Excellence for Coral Reef Studies, School of Biological Sciences, The University of Queensland, Australia
Camille Parmesan, Integrative Biology, Patterson Laboratories 141, University of Texas, Austin, Texas Marine Institute, A425 Portland Square, Drake Circus, University of Plymouth, Plymouth, UK
Franklin Schwing, Office of Sustainable Fisheries, NOAA Fisheries Service, Maryland, USA
Anthony J. Richardson, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia Centre for Applications in Natural Resource Mathematics (CARM), School of Mathematics and Physics, University of Queensland, Australia

WHAT: A review and synthesis of all available peer reviewed studies of marine species changing under climate change.

WHEN: 4 August 2013

WHERE: Nature Climate Change, August 2013

TITLE: Global imprint of climate change on marine life (subscription required)

This paper, with its laundry list of collaborating authors, must have had an awesome ‘we got published’ party. And when you think about what they did, all that data would have taken forever to number-crunch, so it’s a good thing it was all hands on deck.

So what were they looking at? They were trying to work out whether you can see the fingerprint of climate change in the distribution changes of marine species. To do that, they gathered all the available peer-reviewed studies of expected changes for fish and other ocean species under climate change. Then they lined up the predictions with the observed results, and it turns out we’ve got some frequent travelling fish.

After getting all the studies together, the researchers had 1,735 observations of everything from phytoplankton to zooplankton to fish and seabirds, from 208 studies of 857 different species. They used all the data they had, including changes that lined up with climate change projections, cases with no change, and unexpected changes.

Global marine migration (from paper)

Ocean currents make it easier for marine species to travel long distances than it is for plants and animals on land – there’s only so far a seed can travel from the tree on the wind, after all. In this research they found that the average range expansion for marine species was 72 km/decade (±13.5 km). That doesn’t sound like much to a human, but it’s an order of magnitude further than land-based migration averages, and it’s a long way for a mollusc or a starfish to go.

The species chalking up the most frequent flier points were phytoplankton, which have been moving 469.9 km/decade (±115 km), followed by fish, which have been moving 227.5 km/decade (±76.9 km). Of the 1,735 observations, a whopping 1,092 were moving in the directions expected under climate change.

For each species migration, the researchers looked at what the expected decadal rates of ocean temperature change would have been in the area and found that some groups move early, some wait longer, others are falling behind.

For example, in the Bering Sea (where the Discovery Channel show ‘The Deadliest Catch’ is set), many species rely on a pool of really cold water, less than 2°C, that separates the Arctic and subarctic animals. This cold pool has been moving further north as Arctic sea ice melts, but species’ responses vary: some are at the leading edge and move early, others don’t. The researchers think this relates to population size, ability to migrate, dependence on habitat (remember how Nemo’s dad didn’t want to leave the reef?), competition for food, and more.

Clownfish (Wikimedia commons)

I guess it’s similar to when a natural disaster happens in a human area and some families leave, others rebuild and it’s for a whole complicated list of reasons like family, jobs, resources and more. Anyway, back to the fish.

The researchers tested their data for a climate change fingerprint globally. They used a binomial test against 0.5 – the result you would expect if the changes in location were just random variability – and found that for 83% of the changes, climate change was the dominant driving force.

When they limited the data to multi-species studies only, 81% of the changes were still consistent with climate change as the driver. They ran the data to exclude every bias they could think of, and still concluded that it provided ‘convincing evidence that climate change is the primary driver behind the observed biological changes’.

Binomial test results – if you get 0.5 it’s a random occurrence – this migration is climate caused (from paper)
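The test itself is easy to reproduce with the paper’s headline numbers (a standard-library sketch – the authors presumably used a stats package): if the direction of movement were a coin flip, how likely is it that 1,092 of 1,735 observations land on the ‘climate change’ side?

```python
from fractions import Fraction
from math import comb

def binom_sf(k, n):
    """Exact P(X >= k) for X ~ Binomial(n, 1/2), using rationals so the
    tiny tail probability doesn't underflow to zero in floating point."""
    tail = sum(comb(n, i) for i in range(k, n + 1))
    return float(Fraction(tail, 2 ** n))

# 1,092 of 1,735 observations moved the way climate change predicts:
p_value = binom_sf(1092, 1735)
print(p_value < 1e-20)  # True: essentially impossible as random variability
```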

So there you have it – climate refugees aren’t just land based. Nemo’s going to have to move too.

100% Australian Renewable

What does 100% renewable electricity for the whole of Australia look like?

WHO: The Australian Energy Market Operator, commissioned by the Australian Federal Government

WHAT: Modelling for what a 100% renewable national electricity grid for Australia would look like.

WHEN: July 2013

WHERE: Online at the Department of Climate Change website


The Australian Department of Climate Change (yes, they have one!) commissioned the Australian Energy Market Operator in conjunction with CSIRO and ROAM Consulting to model what a national energy market would look like in 2030 and 2050 with 100% renewable electricity. Oh, and when they say ‘national’ they mean the more densely populated East Coast of the country (sorry WA and NT…)

The ‘national’ energy market (from paper)

They looked at two different scenarios – the first was rapid deployment of renewable technology with moderate electricity demand growth (i.e. including energy efficiency gains), and the second was moderate deployment of renewable technology with high demand growth (no efficiency gains).

They ran both scenarios for getting our act together by 2030 and procrastinating until 2050 to see what might happen.

Given that this is a government document, it comes with many caveats (of course!). There are uncertainties (always): CSIRO says bioenergy is feasible; other groups say it’s not. The costs don’t include transitional factors (which change over time), the cost of land acquisition, or stranded fossil fuel assets and infrastructure. Phew.

They also pointed out the obvious (someone has to say it I guess) that because this is looking at 100% renewable electricity it does not look at nuclear, natural gas or coal with carbon capture and storage. This is a fossil free zone people!

Ok, so what did they look at? They took data from the 2012 Australian Technology Assessment by the Australian Government Bureau of Resources and Energy Economics, and using that looked at what demand might look like in 2030 and 2050, and calculated the approximate costs.

Their findings in a nutshell: a renewable system needs more storage (you can’t put solar in a pile like coal to burn later), is more diverse and distributed, needs an expanded transmission network, and in Australia will be driven primarily by solar.

Depending on when Australia does it, it will cost somewhere between $219 billion and $332 billion to build. No surprises that it’s cheaper to do it now, not to mention the stranded infrastructure and assets you save by starting the transition early – it’s cheaper, after all, not to build the coal terminal if you’re only going to use it for a short time.

Cost calculations for Scenario 1 (rapid deployment) and Scenario 2 (moderate deployment) (from paper)

They included energy consumption by electric vehicles (EVs) as well as the reduction of demand from rooftop solar. Interestingly, rooftop solar will dramatically change the makeup of a national energy grid. Currently the energy grid is summer peaking, which means more power is used in summer (for things like air conditioners when it’s seriously hot outside). With the uptake of rooftop solar, the grid will become winter peaking, because demand decreases in summer when everyone’s solar panels are doing great.

They ran the numbers to make sure a renewable power grid is as reliable as the current power grid, which is 99.998% reliable, and made sure the technologies they picked are either currently commercially available, or projected to be available soon.
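To put that 99.998% in perspective, a quick back-of-envelope calculation (my arithmetic, not AEMO’s): the standard allows at most 0.002% of demand to go unserved, which over a year works out to about ten minutes.

```python
# How much unserved demand does 99.998% reliability allow per year?
HOURS_PER_YEAR = 24 * 365
unserved_fraction = 1 - 0.99998              # 0.002%
minutes_unserved = unserved_fraction * HOURS_PER_YEAR * 60
print(round(minutes_unserved, 1))  # ~10.5 minutes of average demand per year
```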

They found that capital costs are the main factor: once renewable power is installed, it costs almost nothing to run, because you don’t have to feed it fossil fuels. There are maintenance costs, but all power stations have maintenance costs.

Battery storage wasn’t found to be economically viable at the scale required, given that a renewable grid needs 100-130% excess capacity, so storage would come from solar thermal, pumped hydro, biogas or biomass. The paper noted that geothermal (which Australia has a fair bit of) and biomass can be used much like conventional baseload power. Concentrated solar thermal is a newer technology still being developed, so its scale-up potential isn’t fully known yet, but it’s working well in Spain so far.

The space required (for the solar panels and wind turbines) is between 2,400 and 5,000 km², which is small change in a country of 7.7 million km² that is mostly desert. So people won’t need to worry about wind turbines being put forcibly in their backyards, unless they want them (can I have one? They’re pretty!).

The most economic spread of renewables balanced transmission against generation: a combination of remote sites with higher transmission costs and local sites with lower generation capacity.

Transmission possibilities (from paper)

The sticking point was meeting evening demand – when everyone comes home from work, turns the lights on, starts cooking dinner and plugs the EV in in the garage. The paper pointed out that workplace charging stations could encourage charging cars during the day, but it also ran scenarios where the demand shortfall is met by dispatching biogas. The same applies to weeks when renewable output is low (a week of low wind, or of overcast weather).

Meeting demand shortfall by dispatching biogas and biomass (from paper)
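The logic here can be sketched as a toy merit-order dispatch (hypothetical numbers and function names, nothing from the AEMO model): variable renewables serve demand first, and dispatchable biogas fills whatever gap is left, like the evening peak after solar output drops off.

```python
def dispatch(demand_mw, variable_mw, biogas_capacity_mw):
    """One interval of a toy merit-order dispatch.
    Returns (variable_used, biogas_used, unserved) in MW."""
    variable_used = min(demand_mw, variable_mw)       # near-free energy goes first
    shortfall = demand_mw - variable_used
    biogas_used = min(shortfall, biogas_capacity_mw)  # dispatchable fills the gap
    return variable_used, biogas_used, shortfall - biogas_used

# Midday (solar-rich) vs evening (demand up, solar gone) - hypothetical MW figures:
print(dispatch(25_000, 30_000, 8_000))  # (25000, 0, 0)
print(dispatch(32_000, 26_000, 8_000))  # (26000, 6000, 0)
```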

Long story short, the future is hybrid renewable systems.

Breakdown of each technology for the different scenarios (from paper)

There is no single technology that can replace the energy density of fossil fuels, but a hybrid grid can. Diversifying both the technology and geography of the power grid will not only allow for 100% renewable generation, it will also build resilience.

As climate change makes extreme weather events more common, having a distributed power system will help avoid mass blackouts. It will be better for everyone’s health (living near a coal mine or a coal power station is NOT good for you) and it will slow the rate at which we’re cooking the planet. Sounds good to me.