Reviewing Our Budget

In the lead-up to the IPCC 5th Assessment Report next week, let’s review the Unburnable Carbon report and remind ourselves how much carbon we have left to burn.

WHO: James Leaton, Nicola Ranger, Bob Ward, Luke Sussams, and Meg Brown, Carbon Tracker Initiative

WHAT: Measuring the amount of capital, assets and infrastructure that is currently overvalued and will be stranded or wasted when we act on climate change.

WHEN: 2013

WHERE: On the Carbon Tracker website

TITLE: Unburnable Carbon 2013: Wasted capital and stranded assets (open access)

As I’m sure all of you RtS readers are aware (and excited about!), the IPCC are releasing the first part of their 5th Assessment Report on Friday September 27th and then slowly drip-feeding us the rest chapter by chapter over the next year.

This is exciting for climate nerds like me because the last IPCC report came out in 2007, so it was looking at the state of climate science in a very different world – before the 2008 financial crash, before the weather started getting seriously weird, before 98% of Greenland’s ice sheet surface experienced melting in a single summer, the Arctic Death Spiral, the failure of the 2009 Copenhagen talks… yeah, a lot has happened since 2007.

So, in preparation for when this international ‘State of the Climate’ report comes out, I thought it would be good to look at Carbon Tracker’s Unburnable Carbon report from this year to remind ourselves of the budget of carbon we have left to spew into the atmosphere while still having a chance of not totally cooking the climate.

The Carbon Tracker report looks at two different budgets – if we want to have an 80% chance of not going beyond a certain amount of global warming, and if we want to have a 50% chance of not going beyond a certain amount of global warming. Given that we haven’t done much to lower global carbon emissions yet, I think we’ll push to a 50/50 chance of cooking our habitat (humans are great at procrastinating after all), but feel free to be optimistic and look at the 80% option.

Carbon budget from now until 2050 (from paper)

If we start from the assumption that humanity will act to save ourselves and keep global warming at 2°C or less with a 50/50 chance, we have 1,075 Gigatonnes (Gt) of CO2 left to burn over the next 37 years.

Now, you might ask – what about carbon capture and storage? Everyone says that technology is going to be big! The Carbon Tracker people ran those numbers. The 2015 estimate for carbon capture and storage (CCS) is 2.25 million tonnes of CO2 being sequestered across 16 projects. The idealised scenario for CCS is that it will be able to sequester around 8Gt of CO2 each year, which Carbon Tracker worked out would mean 3,800 projects operating by 2050, and would only reduce the above carbon budgets by 125Gt. It definitely isn’t a ‘get out of jail free and burn the fossil fuels’ card.
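If you want to sanity-check that 125Gt figure, here’s a back-of-envelope version (my own rough sketch, assuming deployment ramps up linearly to the idealised rate – not Carbon Tracker’s actual model):

```python
# Rough sketch (my assumption: CCS ramps linearly from ~zero in 2013
# to the idealised 8 Gt CO2/yr by 2050 -- not Carbon Tracker's model).
years = 37                            # 2013 -> 2050
peak_rate = 8                         # Gt CO2/yr at full idealised deployment
cumulative = 0.5 * peak_rate * years  # area under a linear ramp (a triangle)
print(cumulative)                     # 148.0 -- same ballpark as the report's 125 Gt
```

Either way you slice it, that’s barely a tenth of the carbon budget.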

Speaking of burning all the fossil fuels – how much do we have left? The World Energy Outlook, which the International Energy Agency releases each year, estimated in 2012 (the 2013 report will be released in November this year) that the world’s fossil fuel reserves are equivalent to 2,860Gt of CO2. This is enough carbon to cook the atmosphere beyond 3°C and probably into the next mass extinction.

The report rightly points out that if we assume we want to save a livable climate and keep within the above carbon budgets, then between 65-80% of all the listed reserves of fossil fuel companies cannot be burned. It’s simple math: 2,860Gt is more than two and a half times the budget for keeping under 2°C with a 50/50 chance of blowing past that temperature.
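You can check that napkin math yourself (figures as quoted above):

```python
reserves = 2860  # Gt CO2 in the world's fossil fuel reserves (IEA estimate)
budget = 1075    # Gt CO2 budget for a 50/50 chance of staying under 2 degrees C
print(round(reserves / budget, 2))      # 2.66 -- well over double the budget
print(round(1 - budget / reserves, 2))  # 0.62 -- roughly two thirds unburnable
```

That two-thirds figure is for total reserves; the report’s 65-80% range applies specifically to listed company reserves under its different budget scenarios.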

But enough about trying not to cook the atmosphere – how about the important things – like what does it mean for financial markets?

Carbon Tracker looked at the capital expenditure of publicly listed fossil fuel companies to work out how much money is being spent finding new reserves of fossil fuels that will only add to the pile we can’t burn. These companies are therefore being over-valued, because the market valuation assumes the reserves will be burned and a profit will be made from burning them.

In 2012, the 200 listed fossil fuel companies spent $674 billion on capital expenditure. $593 billion of that was spent looking for more oil and gas, while $81 billion was spent looking for more coal. If this kind of spending continues (if the companies don’t admit that there is going to be an end to carbon pollution), over the next decade $6.74 trillion could be wasted looking for fossil fuels that have to stay in the ground.
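That decade figure is just 2012’s spending projected flat for ten years – a simplifying assumption, since actual spending would grow or shrink:

```python
annual_capex = 674e9                     # US$ spent on capital expenditure in 2012
wasted = 10 * annual_capex               # a flat decade at 2012 rates (assumption)
print(f"${wasted / 1e12:.2f} trillion")  # $6.74 trillion
```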

As the authors say: ‘this has profound implications for asset owners with significant holdings in fossil fuel stocks’ because what investors are being sold is the lie that these reserves can be profitably sold and burned.

There’s also a lot of risk associated with this. Over the last two years, the amount of carbon being traded on the New York Stock Exchange has increased by 37%, and in London it’s increased by 7%. This means that, similar to the sub-prime loan crisis that precipitated the 2008 financial crash, all investors are exposed to carbon risk through the over-valuation of fossil fuel companies.

Map of oil, gas and coal reserves listed on world stock exchanges (from paper)

There’s a risk of carbon not being properly managed as a risk within stock portfolios, which could create a carbon bubble that bursts as carbon starts being constrained. And there’s the risk of stranded assets, where the fossil fuel companies sink all that capital expenditure into their projects only to find they can’t burn the carbon, and there was no point in building the mine/gas well/oil platform in the first place.

The report states: ‘investors need to challenge the continued pursuit of potentially unprofitable projects before costs are sunk’. This makes sense also because oil and gas are becoming harder to get at (tar sands, tight oil, gas fracking, Arctic drilling), so costs are going up and profit margins are getting squeezed unless the price of oil keeps climbing. This means fossil fuels are going to be increasingly challenged by lower cost, lower carbon options, because mining for sunshine is really not that hard.

So if we agree that we’ll stop burning carbon before we’ve cooked the atmosphere, and that therefore up to 80% of the world’s fossil fuel reserves need to stay in the ground, what does it mean for the fossil fuel companies themselves?

Well, they may have some debt problems on the horizon. The report points out that even without a global climate change agreement, the coal industry in the USA is looking increasingly shaky just from new air quality regulations. If business models can unravel that quickly, these companies may have problems refinancing their debts when they mature in the near future (which is also a risk for investors). The report also points out that any company with tar sands exposure has stranded asset risk, because that business model relies on high oil prices to be profitable.

Basically, they show that traditional business models are no longer viable in energy markets that will be moving towards decarbonisation, and that investors are going to need different information to manage the carbon risk in their portfolios.

In the final section of the report, Carbon Tracker gives a list of recommendations for investors, policy makers, finance ministers, financial regulators, analysts and ratings agencies for how we can avoid this financial carbon bubble. The recommendations include better regulation, shareholder engagement and resolutions, fossil fuel divestment, and better risk definition.

The full list of Carbon Tracker recommendations (click to embiggen) (from paper)

For what it’s worth, my recommendations would be to remove fossil fuel subsidies, stop looking for new reserves of carbon that we can’t burn and price carbon pollution. And as usual, stop burning carbon.

If We Burn All the Fossil Fuels

“The practical concern for humanity is the high climate sensitivity and the eventual climate response that may be reached if all fossil fuels are burned” – Hansen et al. September 2013

WHO: James Hansen, Makiko Sato, The Earth Institute, Columbia University, New York, NY
Gary Russell, NASA Goddard Institute for Space Studies, New York, NY
Pushker Kharecha, The Earth Institute, Columbia University, NASA Goddard Institute for Space Studies, New York, NY

WHAT: Using deep ocean oxygen isotope ratios to determine the sensitivity of climate forcing for sea levels and surface temperatures.

WHEN: September 2013

WHERE: Philosophical Transactions of the Royal Society A (Phil Trans R Soc A) Vol. 371, No. 2001

TITLE: Climate sensitivity, sea level and atmospheric carbon dioxide (open access)

Ok, firstly, let us just take a moment to geek out about how awesome science is. This paper looked at what our planet was like millions of years ago by studying the ratios of different oxygen and carbon isotopes in the shells of foraminifera that have been buried at the bottom of the ocean since they died millions of years ago. Science can tell us not only how old those shells are, but also what the temperature was when they formed. That is awesome.

Foraminifera from Japan (Wikimedia commons)

The lead author of this paper – Dr. James Hansen – is pretty much the Godfather of climate science. He’s been running climate models looking at the possible effects of extra carbon in our atmosphere since the 1980s, when he basically had to do them by hand because there was no internet. He knows his stuff. And so far, he’s been right with his projections.

The paper (which is a very long read at 25 pages) focuses on the Cenozoic climate, which is the period of time from 65.5 million years ago to present. The Cenozoic is the period after the Cretaceous (so we’re talking mammals here, not dinosaurs) and includes the Palaeocene-Eocene thermal maximum, where the deep ocean was 12°C warmer than today, as well as the cooling from there that led to the formation of the Greenland and Antarctic ice sheets.

The period of time studied by the paper (bottom axis is million years before present) (from paper)

What does this show us? The warming that eventually led to the Palaeocene-Eocene thermal maximum started around 3,000 years before there was a massive carbon release. The researchers think this carbon release was from methane hydrates in the ocean venting, because there was a lag in the warming in the intermediate ocean after the carbon release.

The thermal maximum had global surface temperatures around 5°C warmer than today, and there was about 4,000-7,000 Gigatonnes (Gt) of carbon released into the atmosphere to force that kind of warming.

After this warming happened there were ‘hyperthermal’ events (where the temperature spiked again) as the planet slowly cooled, showing how long the recovery time for the planet was from this greenhouse warmed state.

In the warmed world of the Palaeocene-Eocene maximum, sea levels were probably 120m higher than they are now. The researchers found that there’s a snowball effect with changes in ocean temperatures, where a -1°C difference in deep ocean temperatures was enough to trigger the last ice age, while sea levels were 5-10m higher when temperatures were ‘barely warmer than the Holocene’ (which is us – we live in the Holocene).

The researchers found that during the Pliocene (about 5 million years ago) sea levels were 15m higher than today, which they point out means that the East and West Antarctic ice sheets are likely to be unstable at temperatures we will reach this century from burning fossil fuels.

From the data they then tried to work out what the sensitivity of the atmosphere is to extra carbon. This is important to know, because we’re currently changing the chemical composition of the atmosphere much faster than ever before. The previous greenhouse warming that the planet experienced occurred over millennial time scales – the current rate that we’re pumping carbon into the atmosphere is causing change over only hundreds of years.

To work out how sensitive the climate is to being forced by carbon, the researchers used a simplified model where the atmosphere was split into 24 layers to test the rapid equilibrium responses to forcing.

They wanted to find out if we could be in danger of runaway climate change – the most extreme version of which happened on the planet Venus, where runaway warming amplified by water vapour led to a new stable average temperature of 450°C, the carbon was baked onto the surface of the planet and all the water evaporated into the sky. Obviously, humanity will want to avoid that one… The good news is there isn’t enough carbon on this planet for humans to accidentally do that to ourselves before the sun does it to us in a billion years or so.

We’ve avoided this for now (NASA NSSDC Photo Gallery)

The researchers then tested the response to doubling and halving the CO2 in the system, starting from the 1950 concentration of 310ppm of CO2 in the atmosphere. They found that three halvings give you a ‘snowball Earth’ response of mass glaciations, while in the other direction, 1-4x CO2 is when all the snow disappears, which speeds up the feedback (because snow reflects heat), making the fast feedback sensitivity 5°C of global warming. For 8-32x CO2 the sensitivity is approximately 8°C with water vapour feedbacks (what happened on Venus, but on a smaller scale).

But what do any of these numbers mean?

As the paper says; ‘the practical concern for humanity is the high climate sensitivity and the eventual climate response that may be reached if all fossil fuels are burned’.

So here’s the lesson we need to learn from the Palaeocene-Eocene thermal maximum. For global warming we can assume that 75% of it is from CO2, and the remaining 25% is from other greenhouse gases like methane and nitrous oxide. If we burn all the fossil fuels we have left in the ground, that’s about 10,000-15,000Gt of carbon that we could put in the atmosphere.

That gives us 5x the CO2 from 1950, or 1,400ppm. This will give us 16°C of global warming. It will be a world with warming of around 20°C on land and 30°C at the poles (for context, the current global average temperature is 14°C). Keep in mind also that 6°C of warming is generally enough for a mass extinction like the one that ended the dinosaurs.
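Here’s a rough reconstruction of that arithmetic – my own back-of-envelope using the paper’s per-doubling sensitivity, not the paper’s full calculation:

```python
import math

c0, c = 310, 1400                  # ppm CO2: 1950 level vs 'burn it all' level
doublings = math.log(c / c0, 2)    # ~2.2 doublings of CO2
per_doubling = 5.0                 # deg C per doubling (fast feedback, 1-4x range)
co2_warming = doublings * per_doubling
total = co2_warming / 0.75         # CO2 is ~75% of the forcing, per the paper
print(round(total, 1))             # 14.5 -- the same ballpark as the paper's 16
```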

That kind of warming would eliminate grain production across most of the globe and seriously increase the amount of water vapour in the air, making everything more humid (the extra water vapour would also destroy most of the ozone layer).

A wet bulb temperature is the temperature with the humidity included. Humans generally live with wet bulb temperatures between 26-27°C, up to 31°C in the tropics. A wet bulb temperature of 35°C or above means the body can’t cool down, and results in ‘lethal hyperthermia’, which is scientist for ‘it’s so hot and sticky that you die from the heat’.
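If you want to play with wet bulb temperatures yourself, Stull’s (2011) empirical approximation gets you close using just air temperature and relative humidity (this formula is my addition, not from the Hansen paper, and it assumes sea-level pressure):

```python
import math

def wet_bulb(t, rh):
    """Stull (2011) wet bulb approximation: t in deg C, rh in %."""
    return (t * math.atan(0.151977 * math.sqrt(rh + 8.313659))
            + math.atan(t + rh) - math.atan(rh - 1.676331)
            + 0.00391838 * rh ** 1.5 * math.atan(0.023101 * rh)
            - 4.686035)

print(round(wet_bulb(30, 50), 1))  # ~22.3 -- muggy but survivable
print(round(wet_bulb(40, 70), 1))  # ~34.9 -- right at the lethal ~35 threshold
```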

Burning all the fossil fuels will result in a planet with wet bulb temperatures routinely above 35°C, which means we’ll have cooked the atmosphere enough that we’ll end up cooking ourselves.

If the climate has a low sensitivity to this kind of forcing, it will take 4.8x CO2 concentrations to cause an unlivable climate. If the climate is more sensitive, it will take less than that to cook ourselves.

Oh, and the other kicker? The Palaeocene-Eocene thermal maximum played out over millennia, so the mammals survived by evolving to be smaller. Our climate change is taking only hundreds of years, which is not enough time for plants and animals to evolve and adapt.

Basically, if we burn all the fossil fuels, we’re all going down and taking the rest of the species on the planet with us, and we really will be the dumbest smart species ever to cause our own extinction.

So far, James Hansen has been correct with his climate projections. So when he says we can’t burn all the fossil fuels because if we do we’ll cook the planet, I say we pay close attention to what he says. Oh, and we should stop burning carbon.

Nemo the Climate Refugee

If you collected all the recent research on marine species and climate change, could you see a pattern of fish and marine species migration?

WHO: Elvira S. Poloczanska, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia
Christopher J. Brown, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia, School of Biological Sciences, The University of Queensland, Australia
William J. Sydeman, Sarah Ann Thompson, Farallon Institute for Advanced Ecosystem Research, Petaluma, California, USA
Wolfgang Kiessling, Museum für Naturkunde, Leibniz Institute for Research on Evolution and Biodiversity, Berlin, Germany, GeoZentrum Nordbayern, Paläoumwelt, Universität Erlangen-Nürnberg, Erlangen, Germany
David S. Schoeman, Faculty of Science, Health and Education, University of the Sunshine Coast, Maroochydore, Queensland, Australia, Department of Zoology, Nelson Mandela Metropolitan University, Port Elizabeth, South Africa
Pippa J. Moore, Centre for Marine Ecosystems Research, Edith Cowan University, Perth, Western Australia, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth UK
Keith Brander, DTU Aqua—Centre for Ocean Life, Technical University of Denmark, Charlottenlund Slot, Denmark
John F. Bruno, Lauren B. Buckley, Department of Biology, The University of North Carolina at Chapel Hill, North Carolina, USA
Michael T. Burrows, Scottish Association for Marine Science, Scottish Marine Institute, Oban, UK
Johnna Holding, Department of Global Change Research, IMEDEA (UIB-CSIC), Instituto Mediterráneo de Estudios Avanzados, Esporles, Mallorca, Spain
Carlos M. Duarte, Department of Global Change Research, IMEDEA (UIB-CSIC), Instituto Mediterráneo de Estudios Avanzados, Esporles, Mallorca, Spain, The UWA Oceans Institute, University of Western Australia,
Benjamin S. Halpern, Carrie V. Kappel, National Center for Ecological Analysis and Synthesis, Santa Barbara, California, USA
Mary I. O’Connor, University of British Columbia, Department of Zoology, Vancouver, Canada
John M. Pandolfi, Australian Research Council Centre of Excellence for Coral Reef Studies, School of Biological Sciences, The University of Queensland, Australia
Camille Parmesan, Integrative Biology, Patterson Laboratories 141, University of Texas, Austin, Texas, USA; Marine Institute, A425 Portland Square, Drake Circus, University of Plymouth, Plymouth, UK
Franklin Schwing, Office of Sustainable Fisheries, NOAA Fisheries Service, Maryland, USA
Anthony J. Richardson, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia; Centre for Applications in Natural Resource Mathematics (CARM), School of Mathematics and Physics, University of Queensland, Australia

WHAT: A review and synthesis of all available peer reviewed studies of marine species changing under climate change.

WHEN: 4 August 2013

WHERE: Nature Climate Change, August 2013

TITLE: Global imprint of climate change on marine life (subs req.)

This paper, with its laundry list of collaborating authors, must have had an awesome ‘we got published’ party. However, when you think about what they did – all that data would have taken forever to crunch, so it’s a good thing it was all hands on deck.

So what were they looking at? They were trying to work out whether you can see the fingerprint of climate change in the distribution changes of marine species. To do that, they gathered all the available peer reviewed studies of expected changes for fish and other ocean species under climate change. Then they lined up the predictions with the observed results, and it turns out we’ve got some frequent travelling fish.

After getting all the studies together, the researchers had 1,735 different observations of everything from phytoplankton to zooplankton to fish and seabirds, from 208 studies of 857 different species. They used all of the data they had, which included the changes that lined up with climate change projections, the ones that had no changes, and the ones that had unexpected changes.

Global marine migration (from paper)

Ocean currents make it easier for marine species to travel long distances than it is for plants and animals on land – there’s only so far a seed can travel from the tree with the wind, after all. In this research they found that the average distance of range expansion for marine species was 72km/decade (±13.5km). This doesn’t sound like a lot to a human, but it’s an order of magnitude further than land based migration averages, and it’s a long way for a mollusc or a starfish to go.

The species chalking up the most frequent flier points were the phytoplankton, which have been moving 469.9km/decade (±115km), followed by the fish, who have been moving 227.5km/decade (±76.9km). Of the 1,735 observations, a whopping 1,092 (around 63%) were moving in the directions expected with climate change.

For each species migration, the researchers looked at what the expected decadal rate of ocean temperature change would have been in the area, and found that some groups move early, some wait longer, and others fall behind.

For example, in the Bering Sea (where the Discovery Channel show ‘The Deadliest Catch’ is set), many species rely on the really cold water of less than 2°C that separates the Arctic and subarctic animals. This cold pool of water has been moving further north as Arctic sea ice melts, but the responses by species are varied. Some are at the leading edge and move early, others don’t. The researchers think this is related to population size, ability to migrate, dependence on habitat (remember how Nemo’s dad didn’t want to leave the reef?), competition for food, and other factors.

Clownfish (Wikimedia commons)

I guess it’s similar to when a natural disaster hits a human community – some families leave, others rebuild, and it’s for a whole complicated list of reasons like family, jobs, resources and more. Anyway, back to the fish.

The researchers tested their data for a climate change fingerprint globally. They used a binomial test against 0.5, which is the result you would get if these changes in location were just random variability. From their test, 83% of the changes had climate change as the dominant driving force.

If they limited the data to multi-species studies only, 81% of the changes were still driven by climate change. They ran the data to exclude every bias they could think of, and still concluded that it provided ‘convincing evidence that climate change is the primary driver behind the observed biological changes’.
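For the curious, here’s what a binomial test against 0.5 looks like in practice – a sketch using the headline counts quoted above, not the paper’s per-category breakdown:

```python
from scipy.stats import binomtest

# 1,092 of 1,735 observations moved in the direction climate change predicts.
# If movements were random, each direction would be a 50/50 coin flip.
result = binomtest(1092, n=1735, p=0.5, alternative='greater')
print(result.pvalue)  # vanishingly small -- consistent movement, not random noise
```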

Binomial test results – if you get 0.5 it’s a random occurrence – this migration is climate caused (from paper)

So there you have it – climate refugees aren’t just land based. Nemo’s going to have to move too.

100% Australian Renewable

What does 100% renewable electricity for the whole of Australia look like?

WHO: The Australian Energy Market Operator, commissioned by the Australian Federal Government

WHAT: Modelling for what a 100% renewable national electricity grid for Australia would look like.

WHEN: July 2013

WHERE: Online at the Department of Climate Change website

TITLE: 100 PER CENT RENEWABLES STUDY – MODELLING OUTCOMES (open access)

The Australian Department of Climate Change (yes, they have one!) commissioned the Australian Energy Market Operator, in conjunction with CSIRO and ROAM Consulting, to model what a national energy market would look like in 2030 and 2050 with 100% renewable electricity. Oh, and when they say ‘national’ they mean the more densely populated East Coast of the country (sorry WA and NT…).

The ‘national’ energy market (from paper)

They looked at two different scenarios – the first was rapid deployment of renewable technology with moderate electricity demand growth (i.e. including energy efficiency gains), and the second was moderate deployment of renewable technology with high demand growth (no efficiency gains).

They ran both scenarios for getting our act together by 2030 and procrastinating until 2050 to see what might happen.

Given that this is a government document, it comes with many caveats (of course!). There are uncertainties (always): CSIRO says bioenergy is feasible, other groups say it’s not so feasible. The costs don’t include transitional factors (and change over time), the costs of land acquisition, or stranded fossil fuel assets and infrastructure. Phew.

They also pointed out the obvious (someone has to say it, I guess): because this is looking at 100% renewable electricity, it does not consider nuclear, natural gas, or coal with carbon capture and storage. This is a fossil-free zone, people!

Ok, so what did they look at? They took data from the 2012 Australian Energy Technology Assessment by the Australian Government Bureau of Resources and Energy Economics, used it to project what demand might look like in 2030 and 2050, and calculated the approximate costs.

Their findings in a nutshell are that a renewable system needs more storage (you can’t put solar in a pile like coal to burn later), is a more diverse and distributed system, needs an expanded transmission network and will be primarily driven in Australia by solar.

Depending on when Australia does it, it will cost somewhere between $219 billion and $332 billion to build. No surprises that it’s cheaper to do it now, not to mention the stranded infrastructure and assets you avoid by starting the transition now. It’s cheaper, after all, not to build the coal terminal if you’re only going to use it for a short period of time.

Cost calculations for Scenario 1 (rapid deployment) and Scenario 2 (moderate deployment) (from paper)

They included energy consumption by electric vehicles (EVs) as well as the reduction in demand from rooftop solar. Interestingly, rooftop solar will dramatically change the makeup of a national energy grid. Currently the grid is summer peaking, which means more power is used in summer (for things like air conditioners when it’s seriously hot outside). With the uptake of rooftop solar, the grid will become winter peaking, because grid demand decreases in summer when everyone’s solar panels are doing great.

They ran the numbers to make sure a renewable power grid is as reliable as the current power grid, which is 99.998% reliable, and made sure the technologies they picked are either currently commercially available, or projected to be available soon.

They found that capital costs are the main factor, given that once renewable power is installed, it costs almost nothing to run, because you don’t have to feed it fossil fuels to go. There are maintenance costs, but all power stations have maintenance costs.

Battery storage wasn’t found to be economically viable once scaled up, given that a renewable grid needs 100-130% excess capacity. So storage would be in solar thermal, pumped hydro, biogas or biomass. The paper noted that geothermal (which Australia has a fair bit of) and biomass are similar to current standard baseload power in the way they can be used. Concentrated solar thermal is still a new technology under development, so its scale-up potential is not fully known yet, but it’s working well in Spain so far.

The space required to do this (to put the solar panels on and the wind turbines in) is between 2,400-5,000km2, which is small change in a country that has 7.7 million km2 and is mostly desert. So people won’t need to worry about wind turbines being put forcibly in their backyards, unless they want them (can I have one? They’re pretty!).

The most economic spread of renewables for transmission costs was a combination of remote generation (with higher transmission costs) and local generation (with lower energy generation capacity).

Transmission possibilities (from paper)

The sticking point was meeting evening demand – when everyone comes home from work and turns the lights on and starts cooking dinner and plugs in their EV in the garage. The paper pointed out that work-based charging stations could promote charging your car during the day, but also ran scenarios where the demand shortfall could be met by biogas. This also applied for weeks where the storage capacity of the renewables was low (a week of low wind or a week of overcast weather).

Meeting demand shortfall by dispatching biogas and biomass (from paper)

Long story short, the future is hybrid renewable systems.

Breakdown of each technology for the different scenarios (from paper)

There is no single technology that can replace the energy density of fossil fuels, but a hybrid grid can. Diversifying both the technology and geography of the power grid will not only allow for 100% renewable generation, it will also build resilience.

As extreme weather events become more common with climate change, having a distributed power system will help avoid mass blackouts. It will be better for everyone’s health (living near a coal mine or a coal power station is NOT good for your health) and it will slow the rate at which we’re cooking the planet. Sounds good to me.

Vote for last week’s paper!


Remember how I was excited about the possibilities of scaling up the carbon sequestration process outlined in last week’s post from the Proceedings of the National Academy of Sciences in the USA?

Turns out you can vote for it!

I had an email from the lead author of the paper (I send my blog posts to the lead authors when I post them) letting me know that their process has made the finals of two MIT Climate CoLab contests. So if you’re excited about the idea of feasibly sequestering carbon dioxide from the oceans being tested out as well, you can vote for them.

The first proposal is in the Geoengineering section, called ‘Saving the Planet v2.0’. The second proposal is in the Electric power sector section, called ‘Spontaneous Conversion of Power Plant CO2 to Dissolved Calcium Bicarbonate’.

Climate CoLab is an online space where people work to crowdsource ideas for what to do about climate change. The contest voting closes in 11 days (August 30th) and the winning proposals will be presented at the Crowds & Climate Conference at MIT in November.

So if it takes your fancy, and you’d like to see this project presented at the conference, go forth and vote!


Disclosure: I am not affiliated with either the paper or the MIT CoLab project.

Antacid for our Oceans

An electrolysis method that removes CO2 from seawater could be affordably scaled up for commercial carbon sequestration.

WHO: Greg H. Rau, Institute of Marine Sciences, University of California, Santa Cruz, Physical and Life Sciences, Lawrence Livermore National Laboratory, Livermore, CA
Susan A. Carroll, William L. Bourcier, Michael J. Singleton, Megan M. Smith, and Roger D. Aines, Physical and Life Sciences, Lawrence Livermore National Laboratory, Livermore, CA

WHAT: An electrochemical method of sequestering CO2 from sea water using silicate rocks.

WHEN: June 18, 2013

WHERE: Proceedings of the National Academy of Sciences (USA), PNAS vol. 110, no. 25

TITLE: Direct electrolytic dissolution of silicate minerals for air CO2 mitigation and carbon-negative H2 production (open access)

This paper was fun – I got to get my chemistry nerd back on thinking in moles per litre and chemical equations! It almost made me miss university chemistry lectures.

No, not those moles per litre! (IFLS facebook page)

So what do chemistry jokes have to do with carbon sequestration? Well, it’s looking increasingly like humanity is going to have to figure out ways to draw carbon back out of the atmosphere or the oceans, because we’ve procrastinated on reducing our carbon emissions for so long.

There are two options for this – you can either create a chemical reaction that will draw CO2 out of the air, or you can create a chemical reaction that will draw CO2 out of a solution, and given how quickly the oceans are acidifying, using sea water would be a good idea. The good news is, that’s exactly what these researchers did!

Silicate rock (mostly basalt) is the most common rock type in the Earth’s crust. It also reacts with CO2 to form stable carbonate and bicarbonate solids (like the bicarbonate of soda you bake with). Normally this takes place very slowly through rock weathering, but what if you used it as a process to sequester CO2?
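For flavour, the textbook weathering reaction for a calcium silicate looks something like this (the paper’s actual electrolytic pathway, shown in the equation figure below, is more involved):

```
CaSiO3 + 2CO2 + H2O -> Ca(2+) + 2HCO3(-) + SiO2
```

The CO2 ends up locked away as dissolved bicarbonate instead of floating around as a greenhouse gas.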

The researchers created a saline water electrolytic cell to test it out. An electrolytic cell is the one you made in school, where you had an anode and a cathode and (generally) two different solutions, and when you put an electric current through it you created a chemical reaction. What these researchers did was put silicate minerals, saline water and CO2 in on one side, and when they added electricity they got bicarbonates, hydrogen, chlorine or oxygen, silicates and salts.

A basic schematic of the experiment (from paper)

The researchers used an acid/base reaction (remember those from school?!) to speed up the silicate and CO2 reaction, which also works well in an ocean because large differences in pH are produced in saline electrolysis. Are you ready to get really nerdy with me? The chemical equation is this:

Chemical equation for the experiment (from paper)

So how did the experiment go? It worked! They successfully sequestered carbon dioxide with an efficiency of 23-32%, sequestering 0.147g of CO2 per kilojoule (kJ) of electricity used.

There are issues around scaling up the reaction, of course – once the bicarbonate has been created, where do you store it? The paper suggested ocean storage, as the bicarbonate solids would be inert (un-reactive). I would hope that a re-use option could be found – has anyone looked into using bicarbonate solids as an eco-building material?

There’s also the issue of needing to power the reaction with electricity. If scaled up, this process would have to be powered by renewable energy, because burning carbon to sequester carbon nets you zero.

Also, if sea water is used, the main by-product is Cl2, so the researchers point out that while it would be feasible to do this process directly in the ocean, the issue of what to do with all that chlorine would need to be dealt with. The paper suggests using oxygen selective anodes in the electrolysis, or ion-selective membranes around the reaction, to keep the chlorine separate from the ocean.

That being said, there are some exciting upsides to this process. The paper points out that the amount of silicate rock in the world ‘dwarf[s] that needed for conversion of all present and future anthropogenic CO2.’ Also, using sea water to sequester CO2 is easier than air-based methods.

Scaling the method up looks economically feasible too. The researchers estimated that 1.9MWh (megawatt hours) of power would be needed per metric tonne of CO2 sequestered. If the waste hydrogen from the process were sold as fuel for hydrogen fuel cells, the price would be $86/tonne of CO2 sequestered. If selling the hydrogen wasn’t feasible, it would still only be $154/tonne, which compares very favourably to most current carbon capture and storage feasibility estimates of $600-$1,000/tonne of CO2.
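Those two numbers are consistent with each other, which you can verify in a couple of lines:

```python
g_per_kj = 0.147                       # g CO2 sequestered per kJ (from the paper)
kj_per_tonne = 1e6 / g_per_kj          # kJ of electricity needed per tonne of CO2
mwh_per_tonne = kj_per_tonne / 3.6e6   # 1 MWh = 3.6 million kJ
print(round(mwh_per_tonne, 2))         # 1.89 -- matching the paper's ~1.9 MWh/tonne
```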

So, like an antacid for the oceans, if this process can be scaled up commercially through research and development, we could have an effective way to not only capture and store carbon, but also reduce ocean acidification. A good-news story indeed – now we just need to stop burning carbon.

Plan B: Saving Political Face Beyond 2 Degrees

So far the ‘targets and timetables’ approach to keeping climate change below 2°C has done very little to reduce emissions. What happens when we start thinking about giving up the 2°C target?

WHO:  Oliver Geden, German Institute for International and Security Affairs (Stiftung Wissenschaft und Politik)

WHAT: Looking at the ‘politically possible’ in light of our failures to get anywhere near the emissions reductions needed to keep global warming below 2°C.

WHEN: June 2013

WHERE: Online at the German Institute for International and Security Affairs

TITLE:  Modifying the 2°C Target: Climate Policy Objectives in the Contested Terrain of Scientific Policy Advice, Political Preferences, and Rising Emissions (open access)

This paper is all about the realpolitik. At the outset, it points out that in the 20 years since the UN framework on climate change (UNFCCC) was adopted, progress has been ‘modest at best’. Also, in order to keep global emissions from soaring quickly beyond the 2°C limit, significant reductions would be needed in the decade between 2010-2020, which is ‘patently unrealistic’.

Ok, so we’ve procrastinated away the most important decades that we had to do something about climate change with minimal impacts on both the economy and the wider environment. What now?

This paper suggests that the best bet might be changing or ‘modifying’ the internationally agreed 2°C target. The author points out (quite rightly) that unrealistic targets signal that you can disregard them with few consequences. For instance, I’m not about to say I’m going to compete in the next Olympic marathon, because the second I missed a single training session it would obviously be time to give up, given I’ve never run a full marathon before.

So if the world is going to fail on our 2°C training schedule, what will we aim for instead? Should we just aim for redefining ‘safe’ and ‘catastrophic’ climate change? Should we aim for 2.5°C? Should we aim for short term overshoot in the hopes that future humans will pick up the slack when we’ve kicked the can down the road for them?

The author points out what many people don’t like to notice when their countries are failing on their carbon reduction diets – not only have we already warmed by 0.8°C, but we’ve already baked in another 0.5°C from current emissions, so we’re already really close to 2°C without even starting to give up our fossil fuel habits. Also, those reductions we’ve all been promising to make and failing to make (or withdrawing from completely in Canada’s case)? Yeah, if we met all those targets, we’d still blow ‘significantly’ past 2°C. Ouch.

The emissions gap (from paper)

Another issue – the current top-down UNFCCC approach assumes that once we reach an agreement, effective governance structures can be set up and operating within a matter of years, which is highly unlikely given we can’t even reach an agreement yet.

So what does a ‘more pragmatic stance’ for the EU on climate policy look like if we’re going to collectively blow past 2°C? Will climate policy have any legitimacy?

The author argues that the coming palpable impacts of climate change will soon remove the political possibility of ignoring climate change as an issue while in office (which I for one am looking forward to). He also doesn’t place much faith in the UN process finding a global solution with enough time – if an agreement is reached in 2015, it’s unlikely to be ratified by 2020, at which point the targets from 2015 are obsolete.

One suggestion for the EU is reviewing the numbers for the likelihood of passing 2°C. Currently, humanity is vaguely aiming to have a 50/50 chance of staying below 2°C. If we could roll the dice with slightly higher odds of blowing 2°C, maybe we could buy some time to get our political butts in gear?

That idea puts all the hard work of mitigation on everyone post-2050, at which point we’ll all be dealing with the climate impacts as well as trying to find the time for mitigation.

The other option is to say that 2°C is a ‘benchmark’ (only slightly better than an ‘aspirational target’?) and put our faith in climate inertia allowing humanity to overshoot on emissions and then increase the amount of sequestration (negative emissions) to pull back from the brink of the next mass extinction.

The paper does acknowledge that this will implicitly approve a temperature overshoot as well as an emissions overshoot, which could possibly kick the can down the road to 2300 before global temperatures are below 2°C above what we used to call ‘normal’. Apologies to everyone’s great great great great grandchildren for making you responsible for all of that.

Kicking the can down the road to 2300 (from paper)

The author also acknowledges that overshoot policies will only be accepted by the wider public if they’re convinced that this time governments will actually respect them as limits not to be passed. Previous experience with the UNFCCC processes show that any extra time that can be wrangled through carbon accounting is likely to be procrastinated away as well.

The other option could be a target of 2.5°C or 550ppm of CO2 in the atmosphere, but as the paper points out, the ‘targets and timetables’ policies haven’t worked yet, and it might be time to look more towards feasible ‘policies and measures’.

The problem for me with this paper is that while it’s practical to look at aiming for what humanity can politically achieve in terms of climate policies, redefining what ‘dangerous climate change’ is to fit with realpolitik rather than physics won’t work. Physics doesn’t negotiate – the first law of thermodynamics doesn’t care that there was an economic downturn in 2008 that has made it harder to pass climate legislation.

So yes, we need to think about what is politically possible in the current ‘we can still procrastinate on this’ climate. But we also need to be planning for the tipping point once all the extreme weather adds up to business as usual no longer being feasible. We may be able to ‘postpone the impending failure of the 2°C target’, but we won’t be able to ignore the impacts of climate change.

CO2 garden steroids

Is additional atmospheric CO2 fertilizing plants? Is this a good thing?

WHO: Randall J. Donohue, Tim R. McVicar, CSIRO Land and Water, Canberra, ACT, Australia
Michael L. Roderick, Research School of Biology and Research School of Earth Sciences, Australian National University, Canberra; Australian Research Council Centre of Excellence for Climate System Science, Sydney, New South Wales, Australia.
Graham D. Farquhar, Research School of Biology, Australian National University, Canberra, ACT, Australia.

WHAT: Trying to measure the fertilization effect that increased atmospheric CO2 is having on plant growth.

WHEN: 19 June 2013

WHERE: Geophysical Research Letters, Vol. 40, Issue 12, June 2013

TITLE: Impact of CO2 fertilization on maximum foliage cover across the globe’s warm, arid environments (subs req.)

Climate deniers and people who don’t want to take action on climate change often say that increased levels of CO2 in our atmosphere will be a good thing, because plants need CO2 to grow, so why not let the plants have all the CO2 they want and watch them grow like gangbusters!?

More CO2= good? (Chris Gin, flickr)

This is the same as suggesting that because humans need water, I should drink water all the time without pause and that would be awesome (it wouldn’t; I would die from water intoxication). Flaws in denier logic aside, these researchers from the CSIRO tried to find out whether increased atmospheric CO2 is boosting plant growth, and whether that boost can actually be measured.

The researchers looked at warm, arid areas of the world where rain is the limiting factor for plant growth – places like South Western Australia, Southern California and South Eastern Spain. They then looked at the plant growth data from 1982-2010, broken into three-year averaged segments to account for the lag time between changes in rain patterns and plant growth.

Warm arid areas included in the study (yearly data in grey, 3yr averages in red, from paper)

Then they ran the numbers for what plant growth would have been with constant amounts of rain so they could separate out the effect of CO2 alone.

What they found was that transpiration and photosynthesis are directly coupled to atmospheric CO2, which plays a role in setting the upper limit of plant growth (the Fx edge). Plant growth increased ~11% between 1982 and 2010 from the increased levels of atmospheric CO2.

Then, just to make sure they were correct, they looked at the different things that could have influenced that increase. Increased temperatures lowered plant growth (too hot to grow). Plant productivity increased to a certain point under drought conditions (as plants became more water efficient and drought tolerant), but that couldn’t account for the 11% increase. There was an observed 14% increase in plant growth from a 10% increase in precipitation, but that couldn’t account for their numbers either because they ran them for a constant level of rain.

So, as the researchers stated in their paper, this ‘provides a means of directly observing the CO2 fertilization effect as it has historically occurred across the globe’s warm, arid landscapes’.

But does this mean all plants will grow more and we don’t have to worry about climate change anymore? Unfortunately, no.

This only applies to warm arid areas where water is the dominant limit on growth. Also, the other effects of climate change – longer droughts, more extreme storms, longer heatwaves, more extreme bushfires – are likely to outweigh the positive effect of the increase in growth from CO2 fertilization.

The researchers point out in their conclusion that this research doesn’t simply translate to other environments with different limitations. In a Q&A with the CSIRO, when asked whether this research means that climate change is good, the lead author Dr. Donohue stated: ‘this does not mean that climate change is good for the planet.’

So yes, there is a fertilization effect from increased CO2, but no, it doesn’t mean we get to keep burning carbon.

Extreme Overachiever Decade 2001-2010

The World Meteorological Organisation global climate report for last decade shows it winning all the most extreme awards.

WHO: The World Meteorological Organisation (WMO), in conjunction with international experts and meteorological and climate organisations.

WHAT: The Global Climate Report for the decade 2001-2010

WHEN: July 2013

WHERE: Online at the WMO website

TITLE: The Global Climate 2001-2010 A Decade of Climate Extremes (open access)

The World Meteorological Organisation (WMO) recently released their wrap-up of the last decade’s worth of global weather related data. Now, as you will all remember, climate is the long term trend of the weather (generally over a 30 year period), but it’s also important to keep track of the decade to decade trends, if only because 10 is a nice round number to deal with.

So what did the last decade’s report card look like? Turns out 2001-2010 was an overachiever when it came to records and extremes, which is bad news for all of us, really. The last decade was the warmest on record for overall temperatures and the speed of warming has amped up as well. The long term warming trend is 0.062°C/decade, but since 1970 it’s sped up to 0.17°C/decade.

Decade by decade temperature record (from report)

If you only look at land surface temperatures, 2007 held the record for hottest at 0.95°C above average, with 2004 the ‘least warm’ (there are no long term cold records happening here) at 0.68°C above average.

If you look at sea surface temperatures, 2003 wins with 0.40°C above average and 2008 was the least warm at 0.26°C above average. The warmest year in the Northern Hemisphere was 2007 at 1.13°C above average and in the Southern Hemisphere 2005 wins with 0.67°C.

When it comes to rain, it’s a mixed bag. There were places that got more rain, there were places with drought, and there were more extremes. South America was wetter than normal, Africa was drier than normal. Overall, 2002 was the driest year of the decade and 2010 was the wettest. The problem with rain patterns changing under climate change is that the locations and time frames change: instead of slow soaking rains, you get extended dry spells followed by flash flooding.

The patterns of El Niño and La Niña switched back and forth quite a lot during the decade, with El Niño generally creating warmer trends and La Niña creating cooler trends.

El Niño and La Niña trends for sea surface temperatures (from report)

To qualify as an extreme event, an event needs to result in 10 or more people dying, 100 or more people being affected, a declaration of a state of emergency, or the need for international assistance, which I think is a pretty high bar. But of course, since the last decade was overachieving, there were 400 disasters of this scale, and they killed more than 1 million people.

Weather disasters represented 88% of these extreme events, and the damage from them has increased significantly, along with a 20% increase in casualties from the previous decade. The extra casualties came from some extreme increases in certain categories, like heatwaves: in 1991-2000, 6,000 people died from heatwaves; in 2001-2010 that jumped to 136,000 people.

The price of extreme weather events has also gone up, with 7,100 hydrometeorological events carrying a price tag of US$1 trillion and resulting in 440,000 deaths over the decade. It’s also estimated that 20 million people worldwide were displaced, so this was probably our first decade with a sizable number of climate refugees. Internal displacement will be one of the biggest factors as people move away from the more extreme parts of their country to the places where it still rains (e.g. from Arizona to Oregon).

Tropical storms were a big issue, with the report noting ‘a steady increase in the exposure of OECD countries [to tropical cyclones] is also clear’. It’s nice to see them point out that extreme weather is not just a developing world problem of lacking the infrastructure to cope, as the flooding in Germany and Australia last decade showed.

There was also a special shout-out to my homeland of Australia, for the epic heatwave of 2009, during which I experienced 46°C in Melbourne and can attest to it not being fun. Of course, that epic heatwave was beaten by 2013’s new extreme map colour. I also noted that Australia was getting pretty much all of the extreme weather effects over the decade. Ouch.

Australia – can’t catch a break (compiled from report)

Even for the category of ‘coldwaves’, where the Northern Hemisphere saw an increase in freak snowstorms, the average temperature for the winter was still +0.52°C warmer than average.

Last decade also set lots of records in the cryosphere (the frozen part of the planet). 2005-2010 had the five lowest Arctic sea ice extents on record, with the ice declining in extent and volume at a disturbingly rapid rate in what is commonly known as the Arctic Death Spiral. There’s been an acceleration of mass loss from the Greenland and Antarctic ice sheets, and a decrease in all global glaciers.

Arctic Death Spiral (from report)

The World Glacier Monitoring Service described the glacier melt rate and cumulative loss as ‘extraordinary’ and noted that glaciers are currently so far from their equilibrium state that even without further warming they’re still going to keep melting. Oh, and last decade won the record for loss of snow cover too.

Declining snow cover (from report)

Sea level rose faster last decade, at a rate of 3mm/yr, nearly double the historical average of 1.6mm/yr. Interestingly, sea level rise is not even across the globe due to ocean circulation and mass. In a La Niña year, the Western Pacific Ocean can be 10-20cm higher than the average for the decade, but there’s only one way sea levels are going as the water warms and expands and the ice sheets melt – up.

Sea level rise (from report)

Finally, if all of that wasn’t enough bad news for you – the report looked at the gas concentrations in the atmosphere and found (surprise, surprise) that CO2 is up, and accounts for 64% of the increase in radiative forcing (making our planet warmer) over the past decade and 81% of the increase over the last five years. Methane is responsible for 18% of the increase and nitrous oxide chips in 6%.

Does it make anyone feel better if I tell you the hole in the ozone layer isn’t getting bigger anymore?

Basically the world we live in is getting more extreme as it heats up at an ever increasing rate. Given that these are the changes we’re seeing with a 0.8°C increase in global average temperatures and that’s from carbon we burnt over a decade ago, how about we stop burning carbon with a little more urgency now?

Your odds just shortened – Aussie Heatwaves

Climate change has increased the chance of extreme heatwaves in Australia by more than five times.

WHO: Sophie C. Lewis and David J. Karoly, School of Earth Sciences and ARC Centre of Excellence for Climate System Science, University of Melbourne, Victoria, Australia

WHAT: Looking at how much human-caused climate change is pushing Australian summers into more extreme heat.

WHEN: July 2013

WHERE: Geophysical Research Letters (DOI: 10.1002/grl.50673), pre-publication release

TITLE: Anthropogenic contributions to Australia’s record summer temperatures of 2013 (subs. req)

There’s some interesting research happening at my alma mater, Melbourne University, these days (go Melbourne!). Even if you weren’t there to experience the extreme summer of 2012-13 in Australia, I’m sure you all remember the new colour the Australian Bureau of Meteorology had to create for its weather maps when forecasts maxed out above 50°C, or maybe the new bushfire rating of ‘catastrophic’ for the climate-fuelled fires that are beyond just an extreme risk.

Extreme Heat in January 2013 (Bureau of Meteorology)

So, to answer that age-old question – ‘exactly how much have we messed this up?’ – these researchers took the historical monthly weather patterns, modelled patterns with natural forcing only, and patterns with both natural and human forcing, and matched those up with what actually happened.

They looked at the average monthly mean temperatures, maximum temperatures and minimum temperatures, and found that the monthly extremes are increasing faster than the daily extremes – that is, extreme months are becoming more common faster than extreme days are.

The historical data they used for the experiment was from 1850 to 2005, with the baseline climate data (what they used as a reference for ‘normal’) being 1911-1940 because 30 years of weather data makes a climate!

They then created experiments for the data with natural forcing only, and with natural and human forcing, and ran exciting statistical functions like a probability density function with a kernel smoothing function, which almost sounds like a super-cool popcorn maker.
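If you’ve never met a kernel-smoothed probability density function, here’s a toy version of the idea – made-up numbers for illustration, nothing like the study’s actual data:

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(42)
natural = rng.normal(0.0, 0.5, 1000)  # summer anomalies, natural forcing only
forced = rng.normal(0.4, 0.5, 1000)   # summer anomalies, natural + human forcing

# Kernel smoothing turns each cloud of model samples into a smooth density
# curve, whose warm tails you can then compare for extreme events.
kde_natural, kde_forced = gaussian_kde(natural), gaussian_kde(forced)
threshold = 1.5  # an arbitrarily 'extreme' summer anomaly
p_natural = kde_natural.integrate_box_1d(threshold, np.inf)
p_forced = kde_forced.integrate_box_1d(threshold, np.inf)
print(p_forced / p_natural)  # extremes become several times more likely
```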

To double check for error, they also used the second hottest summer temperatures to make sure they could pick the human influence out of the randomness in the data, thereby deliberately making their findings conservative.

Once they’d run their fun popcorn-sounding numbers, they calculated the FAR – the Fraction of Attributable Risk, which is exactly what it sounds like: the fraction of the risk of something happening that can be attributed to a cause.

So if our ‘bad guy’ is human-induced climate change, how much can we blame it for the Australian Angry Summer of 2012-13? Well, a fair bit.

When they compared the numbers, they had 90% confidence that there was a 2.5 times increase in extreme heat from human influences. When they compared 1976-2005 data and extended the model out to 2020, that increased to a 5 times greater likelihood.

Extreme heat is ‘substantially more likely’ because of humans burning fossil fuels, which are pretty bold words from research scientists – when there’s a greater than 90% chance of something they say ‘very likely’ where most people would say ‘very certain’. In their research, events that should have been occurring 1-in-16 years naturally were happening 1-in-6 years with the historical numbers and 1-in-2 years with the model out to 2020. Ouch – summer just got more uncomfortable more often.
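And here’s the FAR arithmetic itself, using the return periods quoted above (a toy version – it won’t reproduce the paper’s exact 2.5 and 5 times figures, which come from the full model comparison):

```python
# FAR = 1 - P(event | natural forcing) / P(event | natural + human forcing)
p_natural = 1 / 16   # a ~1-in-16-year event under natural forcing only
p_observed = 1 / 6   # ~1-in-6 years in the historical record
p_2020 = 1 / 2       # ~1-in-2 years in the model run out to 2020

print(round(1 - p_natural / p_observed, 2))  # 0.62 of the risk attributable to us
print(round(p_observed / p_natural, 1))      # 2.7x more likely historically
print(round(p_2020 / p_natural, 1))          # 8.0x more likely out to 2020
```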

MTSOfan, flickr

For me, the kicker came when the paper pointed out that the 2012-13 summer in Australia came in a La Niña year. Extreme heat events normally come with an El Niño year – La Niña years are cooler with more rain. So the fact that the Angry Summer occurred in a La Niña year is scary – sea surface temperatures were average or cooler in some places while at the same time the Bureau of Meteorology was scrambling for new map colours.

The paper concludes that their research supports ‘a clear conclusion that anthropogenic climate change had a substantial influence on the extreme summer heat over Australia’ and that these kinds of events are now five times as likely to occur. Welcome to summers on climate change?