Show me the Money: Adaptation Finance

Cities and poor people living in cities are often forgotten in climate adaptation planning. How can we fix that?

WHO: Barry Smith, International Institute for Environment and Development (IIED) Climate Change and Human Settlements Group
Donald Brown, Independent Researcher
David Dodman, Senior Researcher, IIED Human Settlements and Climate Change Group; Team Leader, Cities and Climate Change

WHAT: A look at how adaptation finance can better serve poor communities, especially in cities

WHEN: January 2014

WHERE: IIED website

TITLE: Reconfiguring urban adaptation finance (open access working paper)

Climate change adaptation is expensive – more expensive than mitigation, as the USA discovered last year when it broke its record for billion-dollar disasters and spent more on disaster recovery than on health and education combined. Unfortunately, the people most vulnerable to climate extremes are poor people, who tend to live in less secure areas with worse infrastructure, and who have less access to finance and legal protection and less ability to move when a storm wipes out their neighbourhood.

Billion dollar disasters – they’re a growth industry right now (from Jeff Masters, Weather Underground)

This paper looks at adaptation in cities in developing countries that have large informal settlements like slums, but as Hurricanes Katrina and Sandy both showed, these issues are going to affect populations around the world, regardless of where you live.

However, in developing countries the issue is on a larger scale. As the authors of this paper point out, slums are often built in hazard-prone areas, and because cities don’t want to admit they have problems with poverty, slum residents often don’t get counted and get ignored when climate adaptation is planned.

Issues with infrastructure and utilities – Kibera slum in Nairobi (Novartis, flickr)

Currently, urban adaptation finance involves many international organisations (the top-down approach) as well as local ones. The international players are international NGOs, the World Bank and the UN through the UNFCCC (UN Framework Convention on Climate Change). One of the risks with this system is that instead of cities and countries getting new funding for adaptation, they could just get old funding ‘re-labelled’ as adaptation funding, which won’t work. The authors point out that adaptation funding is required to be new and additional, which makes sense because climate impacts are also new and additional. Regular investments funded through international channels also need to be ‘climate proofed’, which translates roughly as: because climate change will affect everything, it needs to be built into all decision-making processes.

There is definitely an important role here for NGOs, making sure that climate change gets mainstreamed into decision-making – at which point it’s finally not just me being the weirdo at the party asking people how far above sea level their home is.

No seriously, how far above sea level is your house? (Nick Hewson, flickr)

UNFCCC processes are (unsurprisingly) labyrinthine and difficult to get money out of. There is a Special Climate Change Fund (shortened to SCCF, because everything is an acronym at the UN) which holds $239 million, but so far spending from it has been slow. This is because the fund’s governing council represents the polluting countries (that’s us – the industrialised developed countries responsible for most climate change) more than the developing countries, giving the polluters an effective veto over the money.

The UNFCCC also has an Adaptation Fund, which is financed by a 2% levy on Clean Development Mechanism projects. It’s more transparent, developing countries have more of a seat at the table and the process also includes NGOs; however, the fund is comparatively smaller at $151 million.

The World Bank has a Pilot Program for Climate Resilience (PPCR), which has a pot of $1 billion for climate adaptation. The World Bank has only handed out $8 million so far, so there’s some kind of disconnect happening there. The authors suggest it’s because project planning processes are required to take between 3 and 18 months. That’s fine for a project where everyone has already agreed what you’re working towards, but when you’re doing climate change adaptation and you need to sit down with a community and explain why people might need to leave the place their families have lived for generations because sea level rise will inundate it, that’s a longer, more difficult conversation.

National systems for funding climate adaptation are more successful at getting money on the ground and working towards change. Some governments have been early leaders in climate adaptation funds, most notably Rwanda, which started its National Climate and Environment Fund in 2005. Way to get ahead of the pack, Rwanda! The fund was recently granted £22.5 million from the British Department for International Development, which makes it the largest climate fund in Africa.

National adaptation funds also give municipalities and local/regional governments better access to funding. Which makes sense – it’s easier to apply for funding through an organisation you know that speaks the same language than to try to work out where you fit within the World Bank or the UN. However, the authors point out that strengthening the capacity of local organisations working on adaptation in cities could actually be the most effective way to spend adaptation money. Why? Because cities are on the ground and control the things that can make an impact, like transit and sea walls and pipes and power lines and garbage collection and disposal – things that are too local for an international organisation to deal with.

For example, microfinance has been a huge success as a ‘bottom up’ approach to funding adaptation and capacity in communities. Female-led savings groups in poor communities have linked together with groups like Urban Poor Fund International, Slum Dwellers International and the International Institute for Environment and Development (the organisation that commissioned this paper).

By their powers combined, these groups have coordinated pooled savings funds across 464 cities, with successes across the globe, including in Malawi. The Malawi Homeless People’s Federation formed a Centre for Community Organisation and Development, which coordinates and helps local savings groups that generally comprise 30-70 members of a neighbourhood. They also organise ‘exchange visits’, where people from an established savings group go to a different neighbourhood to explain the program and help set up a new savings group, creating stronger community links within the cities. There are now savings groups across 28 urban and rural centres in Malawi, with over $148,000 in savings, which is phenomenal.

The challenge, then, is to connect these two processes – the local and sub-national projects with connections in their communities are the successful ones, while the international organisations are the ones with all the adaptation funding. How can we get them talking to each other so we can start working towards more resilient communities?

The authors suggest the biggest problem is how difficult the international programs are to navigate: there’s a mismatch between the fiduciary responsibilities and level of reporting required at the international level and the capacity of poor communities to complete that reporting. Unless you speak bureaucrat, it can be almost impossible to fill out those forms. International NGO and aid programs also often work at a very high level and are often not directly accountable to the communities they work in.

So what should we do? The authors suggest re-tooling the international aid process a bit – focusing on smaller grants ($1,000-$3,000 goes a long way in small communities), treating cities and on-the-ground climate adaptation as key points of intervention, and encouraging private investment options that include social impact bonds. It’s going to take work, but since we haven’t stopped burning carbon yet, we’re going to need to do it.

The Coffee Grower’s Paradox

Do climate change adaptation programs just shift climate vulnerability?

WHO: Aaron Atteridge and Elise Remling, Stockholm Environment Institute, Stockholm, Sweden

WHAT: Looking at the example of Colombian coffee growers to see if climate change adaptation projects shift vulnerability instead of fixing it.

WHEN: December 2013

WHERE: The Stockholm Environment Institute website

TITLE: The Indirect Effects of Adaptation: Pathways for Vulnerability Redistribution in the Colombian Coffee Sector (open access)

I love coffee, and one of the things that really terrifies me about climate change (other than the devastating droughts, diseases, storms and lack of food) is that I might be forced to give up my daily caffeine habit. So naturally, I was interested to find out whether programs trying to promote adaptation in the coffee industry in Colombia (home of the world’s best quality Arabica beans) were hindering more than they were helping.

Coffee – we love it (photos: Amy Huva)

These researchers set out to look at the unintended consequences of adaptation – are we fixing the problem, or just making it someone else’s problem?

First, they needed to work out what could be vulnerable to change (not just climate change). The increasing globalisation of the world means that your coffee farm is no longer your own little island – it is affected by global prices, supply chains, and the connections between the people involved in all of those activities.

The researchers narrowed it down to five different categories:

  1. Natural capital (aka the ecosystem services you get from trees purifying your air, water feeding your soil etc.)
  2. Physical capital (how many coffee plants do you own? How many tractors does that require?)
  3. Social capital (strong family ties and community ties in the farming industry)
  4. Financial capital (how easy is it for the farmer to access capital to expand/improve their business?)
  5. Human capital (how educated is the farmer? How many local educated workers are nearby she can employ?)

From those categories, the researchers made a case study from interviews conducted with people in the coffee industry in Colombia in 2012. They interviewed farmers, buyers, representatives from the National Coffee Federation (Federación Nacional de Cafeteros), people at the national coffee research institute, federal and local government representatives as well as coffee pickers employed in the industry.

Aside from the fact that Colombia has a national institute dedicated to studying coffee – the most exciting thing I’ve heard all week (can I go visit? How do I donate to fund their research? Can I be a coffee scientist too?) – the Colombian coffee industry is surprisingly large.

Coffee has been grown in Colombia for around 200 years and the industry is still overwhelmingly run by small landholders, with 96% of farms being 1.6 hectares or smaller. The National Coffee Federation not only represents over 500,000 farmers, it also has a national fund, set up similarly to Norway’s oil royalty future fund, which uses profits from the coffee industry to benefit Colombians. Around 4 million people in Colombia make a living from the coffee industry, and between 1964 and 2012 the country hosted 170 different international development projects.

I knew my coffee habit was a good thing!

Coffee science/tastings (photo: Amy Huva)

From their interviews, the researchers worked out that the sources of vulnerability in the industry were pretty much the same as any other industry – price and price volatility, access to markets, access to finance, harvest and yield changes, production costs and knowledge of alternative/more efficient practices.

This means that many of the pressures Colombian coffee farmers are facing are exactly the same problems many other industries face, like changes in market preference (organic coffee production possibly outstripping demand), or international development aid being used on things that are not a priority for the farmers. This led the researchers to conclude that unexpected flow-on effects are pretty much the norm, not the exception.

The difference for coffee growers in Colombia is the direct effect climate change will have on them as the prime coffee growing conditions move to higher altitudes where the soil is not as fertile.

The key risk for adaptation programs looking at climate change is that farmers in areas that are becoming marginal for growing coffee will get ignored or abandoned, when they’re the ones who really need the help to either diversify their crops or find new livelihoods as climate change really bites.

So the moral of the story seems to be that while climate change is going to affect the way we all do things, if we’re going to try and help with adaptation, we can’t forget the marginal growers.

Stinking Hot Down Under

The numbers are in: 2013 was Australia’s hottest year since records began.

WHO: Will Steffen, Australian National University, Canberra Australia

WHAT: A report on how many heat records were broken in Australia last year

WHEN: January 2014

WHERE: The Climate Council website

TITLE: Off the Charts: 2013 Australia’s Hottest Year (open access)

Last week in Australia it was stinking hot. I was texting with my brother (who is doing a PhD in meteorology, so is also a massive nerd like me) about the heatwave forecast, and we came up with a new term for an overnight minimum temperature that’s too high. We decided that an overnight low of 28.6°C should be called a ‘lower maximum’, because that’s too hot to sleep in.

Unfortunately, this is the new normal for Australia, as the Australian Climate Council has just shown in excellent infographic form. Last year was a banner year for Australia, breaking all kinds of heat records and posting the hottest average temperature since record keeping started in 1910.

It looks pretty, but it’s so hot the sand burns your feet. Mornington Peninsula, Vic (photo: Amy Huva)

Now, I know we Australians are a competitive people who always like to win, but breaking these records is not so much fun.

Nationally, the records broken were:

  • Highest average temperature across the country: 1.20°C above the 1961-90 baseline years
  • Highest mean maximum temperature across the country: 1.45°C above the baseline years
  • Highest mean minimum temperature across the country: 0.94°C above the baseline years
  • Hottest January on record
  • Hottest summer on record (Dec 2012-Feb 2013)
  • Hottest winter day on record: 29.92°C on August 31st
  • Hottest September on record: 2.75°C above the baseline
  • Hottest spring on record
  • Hottest December on record

Locally, some of the notable records were:

  • South Australia broke its spring monthly average temperature record by 5.39°C
  • New South Wales broke its spring monthly average temperature record by 4.68°C
  • Alice Springs had its hottest October day ever: 42.6°C
  • Canberra’s October was 2.5°C above average
  • West Kimberley in Western Australia was a shocking 4°C above average for October

Sea surface temperatures hit record highs for January and February 2013, and of the 21 days Australia has ever recorded with a country-wide average temperature above 39°C, 8 occurred in 2013 – 7 of them consecutively in January 2013! Remember in the news when Australia had to create a new colour on their temperature maps? That was then.

Even worse, this extreme heat was not pumped up with the influence of El Niño, which normally makes years warmer. The year had no strong El Niño or La Niña effect, so it was a climate-changed year.

Since 1950, heat records in Australia have outnumbered cold records at a rate of 3:1, and in true Australian style, we’ve exceeded expectations and broken all kinds of records. This is the new normal, and it’s only going to get worse unless we stop burning carbon.

Infographic by the Climate Council.

Send in the Clouds

The influence of clouds is one of the big uncertainties for climate sensitivity.

WHO: Steven C. Sherwood, Climate Change Research Centre and ARC Centre of Excellence for Climate System Science, University of New South Wales, Sydney, Australia
Sandrine Bony, Jean-Louis Dufresne, Laboratoire de Météorologie Dynamique and Institut Pierre Simon Laplace (LMD/IPSL), CNRS, Université Pierre et Marie Curie, Paris, France

WHAT: Measuring evaporation, humidity and temperature change in the lower troposphere

WHEN: 2 January 2014

WHERE: Nature, Vol 505, No. 7481

TITLE: Spread in model climate sensitivity traced to atmospheric convective mixing (subs req)

The big question for many people in climate science is what happens when we double the concentration of CO2 in our atmosphere? Is it good and plants grow more? Is it bad and we all die in the next mass extinction? Is it somewhere in between and maybe we’ll be ok?

Most climate models estimate between 1.5°C and 4.5°C of warming for a doubling of CO2, but as I’ve blogged before, there’s a huge difference between the consequences of 1.5°C and 4.5°C. So how do we narrow this down for a more accurate model?

These scientists from Australia and France decided to measure the drying effect of cloud mixing to try and sort it out.

Cloud feedbacks are one of the unknowns in climate models – they’re constantly changing, their impact is localised, but overall they have a big impact. They also have different impacts at different levels. High clouds, which are 8km above ground (or at 400hPa pressure) play a minor role in feedback, because at that level of the atmosphere, the temperature isn’t changing that much – it’s happening in the lower atmosphere because that’s where we’re spewing all our carbon pollution (incidentally, warming in the lower atmosphere and lack of warming in the upper atmosphere is one of the ways we know humans are causing current climate change).

Mid-level clouds (from 3km-8km) have medium levels of climate feedback, but the low clouds – below 3km (750hPa pressure) – are capable of strong feedback. Because of this, these researchers decided to measure the low clouds.

Obligatory cloud photo (photo: Amy Huva)

More evaporation as the temperature warms could lead to more clouds forming; on the other hand, a warmer climate could mean more dry air is drawn down to the surface, which doesn’t allow clouds to form. Which one is it?

Firstly, cloud mixing. About 2km above sea level there is a cloud boundary where a specific type of cloud mixing takes place, called ‘lower tropospheric mixing’ – which sounds to me like a drink I’d order on a summer beach holiday. What it actually means is a mix of downdrafts of wind, local evaporation of rain and the transport of shallow clouds. There is large-scale mixing, which happens via specific circulations, and there’s small-scale mixing, which is much more varied and localised.

Cloud mixing at the boundary layer (from paper)

The researchers measured each type of mixing (small scale and large scale) separately to see whether they would dry out with increased temperatures or not. They measured the temperature and humidity change as well as drying between 700hPa and 850hPa pressure to work out the method of mixing involved.

Across the 48 different models they looked at, those that got warmer also got drier. For the small-scale mixing, with 4°C of warming the drying increased by 6-30%, compared with only 8% on land. This means that while the land surface is experiencing smaller amounts of drying, clouds are drying out faster. So if you think the drought is bad – it’s worse for the poor clouds.

For the large-scale mixing, the researchers looked at monthly data from 10 different models, and by relating temperature, humidity and drying to climate sensitivity they were able to account for 50% of the variance in cloud feedbacks! Considering they started their research project saying that ‘no compelling theory of low cloud amount has yet emerged’, I’d say that’s a very successful experiment.

The bad news: they also found with large-scale mixing that for every degree Celsius of warming, drying increased by 5-7%. They found too that moisture transport increases strongly with warming, but further research will be needed to work out why that happens (yay science! Discovering new things!).

So this means that in their models, with more accurate measurement of cloud feedbacks, a doubling of CO2 in the atmosphere gives a sensitivity of 4°C of warming, with a lower limit of 3°C. The researchers found that anything less than 3°C of warming did not line up with cloud observations for that concentration of CO2. They point out that you can’t rule out something weird happening in nature that could change the process suddenly (those scary tipping points), but the process they found was this:

Lower tropospheric mixing dries out the boundary layer by 5-7% per degree of warming because of stronger vertical water vapour gradients, which leads to surface evaporation increases of 2% per degree of warming.

While it’s good to know with more certainty the impacts of doubling CO2 in the atmosphere, unfortunately this time it’s not good news. The way to solve this problem? Stop burning carbon before we’ve doubled CO2.

Our Fast-Paced Modern Climate

How can we determine dangerous levels of climate change so that we can stay within those limits?

WHO: James Hansen, Makiko Sato, Jeffrey Sachs, Earth Institute, Columbia University, New York, USA
Pushker Kharecha, Earth Institute, Columbia University; Goddard Institute for Space Studies, NASA, New York, USA
Valerie Masson-Delmotte, Institut Pierre Simon Laplace, Laboratoire des Sciences du Climat et de l’Environnement (CEA-CNRS-UVSQ), Gif-sur-Yvette, France
Frank Ackerman, Synapse Energy Economics, Cambridge, Massachusetts, USA
David J. Beerling, Department of Animal and Plant Sciences, University of Sheffield, South Yorkshire, UK
Paul J. Hearty, Department of Environmental Studies, University of North Carolina, USA
Ove Hoegh-Guldberg, Global Change Institute, University of Queensland, Australia
Shi-Ling Hsu, College of Law, Florida State University, Tallahassee, Florida, USA
Camille Parmesan, Marine Institute, Plymouth University, Plymouth, Devon, UK, Integrative Biology, University of Texas, Austin, Texas, USA
Johan Rockstrom, Stockholm Resilience Center, Stockholm University, Sweden
Eelco J. Rohling, School of Ocean and Earth Science, University of Southampton, Hampshire, UK; Research School of Earth Sciences, Australian National University, Canberra, ACT, Australia
Pete Smith, University of Aberdeen, Aberdeen, Scotland, United Kingdom
Konrad Steffen, Swiss Federal Institute of Technology, Swiss Federal Research Institute WSL, Zurich, Switzerland
Lise Van Susteren, Center for Health and the Global Environment, Advisory Board, Harvard School of Public Health, Boston, Massachusetts, USA
Karina von Schuckmann, L’Institut Francais de Recherche pour l’Exploitation de la Mer, Ifremer, Toulon, France
James C. Zachos, Earth and Planetary Science, University of California, Santa Cruz, USA

WHAT: Working out what the limit to dangerous climate change is and what the implications are for the amount of carbon we need to not burn.

WHEN: December 2013

WHERE: PLOS One, Vol 8. Issue 12

TITLE: Assessing ‘‘Dangerous Climate Change’’: Required Reduction of Carbon Emissions to Protect Young People, Future Generations and Nature (open access)

This (very) lengthy and detailed paper runs us all through exactly what’s happening with the planet’s climate, what’s making it change so rapidly (spoiler: it’s us) and what objectively we need to do about it. Needless to say, since the lead author is Dr. James Hansen, the godfather of climate science, we would do well to heed his warnings. He knows his stuff; he was doing climate models before I was born (seriously).

Firstly, the paper points out that humans are the main cause of climate change, and then also neatly points out that while all 170 signatories to the UN Framework Convention on Climate Change (UNFCCC) have agreed to reduce emissions, so far not only have we not worked out what the limit for ‘dangerous’ climate change is, we’ve also done nothing to fix it except fiddle at the edges.

Epic procrastination fail, humanity.

One planet, different ways to reach zero emissions (Norman Kuring, NASA GSFC, using data from the VIIRS instrument aboard Suomi NPP)

Then, the researchers look at 2°C of warming as a target. In reality, while 2°C is a nice, seemingly round number that is far enough away from our current 0.8°C of warming, the reason it was chosen to be our line in the sand is that it’s the point beyond which ecosystems start collapsing. I have a sneaking suspicion it was also easy to agree on because it was way into the ‘distant’ future, but let’s play nice and believe it was all rational scientific rigour.

The latest IPCC report says that if we’re going to stay below 2oC, we can burn a total of 1,000GtC (Gigatonnes of carbon). Ever. This means we need to leave fossil fuels in the ground and stop cutting down more trees than we plant.

As has been pointed out in previous papers, the researchers show that burning all the fossil fuels is a really bad idea. A really bad idea in a mass extinction like the dinosaurs kind of way.

So, if we look at all the warming that has happened so far and measure the energy imbalance in the atmosphere, what do we get? Firstly a step back – energy imbalance. This is your energy budget where you always want it to be constant. Energy comes into the atmosphere from the sun, some goes back out, some stays and keeps us warm and comfy on planet Earth.

Fossil fuels mean that humans have taken a seriously large amount of carbon out of the ground and burned it. Releasing that carbon into the atmosphere traps extra heat, so more energy stays inside our atmosphere than goes back out, and we’re out of balance.

What happens when we’re out of balance? Well, so far it hasn’t been pretty. With only 0.8°C of global warming, 98% of Greenland’s surface melted for the first time in recorded history, Arctic sea ice hit new record lows, and the planet has seen more frequent and more extreme storms, floods, typhoons, hurricanes, droughts, fires, algal blooms, glacial melt and ocean acidification. We’ve had weird storms no-one had ever heard of before, like derechos; we’ve had tropical diseases in new places; and the Jet Stream over the Northern Hemisphere keeps getting ‘stuck’ and dumping more weird weather on us. It’s pretty clear the planet is unwell, and that it’s because of us.

If all that terrifying stuff is happening at 0.8°C of warming, what does that make 2°C? Hopefully your answer is ‘horrifying’, because that’s what my answer is. Since 2050 (when we’ll arrive at 2°C if we keep going business as usual) is within my working lifetime, I’ll let you know how horrifying it is when we get there.

More scientific than ‘horrifying’, though: the researchers point out that previous paleoclimate changes, driven by the Earth’s tilt and other slow oscillations, took between 20,000 and 400,000 years to happen. Changes happening at that rate give the plants and animals and fish time to relocate and survive. The rate at which we’re changing our modern climate is bad news for things that are not mobile.

How far out of balance are we? The paper estimates that between 2005-2010 the planet was 0.58 W/m² (±0.15 W/m²) out of balance. How much of that was caused by humanity? Well, solar irradiance has been declining lately, so it’s pretty much all us.

If we are 0.5 W/m² out of balance, the researchers calculate that we would need to reduce the CO2 concentration down to 360ppm to have energy balance again (we’re currently at 395ppm). If you include some error in the calculations and we’re 0.75 W/m² out of balance, humanity needs to get CO2 down to a concentration of 345ppm.

To make planning easier, the researchers suggest we just aim to get and stay below 350ppm.
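
If you want to see roughly where those targets come from, here’s a minimal sketch using the standard simplified expression for CO2 radiative forcing, ΔF = 5.35 ln(C/C0) W/m² (Myhre et al. 1998). To be clear, this formula is an outside approximation, not the paper’s own model, but it lands very close to the paper’s numbers:

```python
import math

# Standard simplified expression for CO2 radiative forcing (Myhre et al. 1998):
# delta_F = 5.35 * ln(C / C0) W/m^2. An outside approximation, not the paper's
# own model, used here only as a back-of-envelope check.

def co2_target_for_balance(current_ppm: float, imbalance_w_m2: float) -> float:
    """CO2 concentration that removes `imbalance_w_m2` of forcing."""
    return current_ppm * math.exp(-imbalance_w_m2 / 5.35)

for imbalance in (0.5, 0.75):
    target = co2_target_for_balance(395.0, imbalance)
    print(f"{imbalance} W/m^2 out of balance -> target ~{target:.0f} ppm")

# Prints roughly 360 ppm and 343 ppm - close to the paper's 360/345 ppm targets.
```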

The paper then runs through all the reasons why letting 2°C happen is a really bad idea. Because it will lead to damaging sea level rise (sorry, Miami). Because change is happening too quickly for many species to survive, and more than half the species on the planet could go extinct from too much warming (and yes, if we warm this planet enough, humans could be part of that mass extinction).

Because the recovery back to normal temperatures happens on a timescale of millions of years which is beyond the comprehension of humanity.

So to avoid being the next mass extinction, what do we need to do? First, we need to quantify how quickly fossil fuels need to be totally phased out.

If emissions are reduced to zero in 2015, the world could get back to 350ppm by 2100. If we wait until 2035, it would take until 2300. If we wait until 2055, it would take until the year 3000. So when we start reducing emissions is important.

Reduction scenarios (from paper) BAU: Business As Usual

If we had started reducing emissions in 2005, it would only have taken reductions of 3.5% per year. Since we didn’t do that, if we start now, we need to reduce emissions by 6% a year. If we delay until 2020, it becomes 15% per year – so let’s not procrastinate on this one, humanity. Also keep in mind that the rate currently considered ‘politically possible’ is around 2-3% of reductions each year, which means that scientific reality and political delusions are going to collide very soon.

If we reduce our carbon emissions by 6% per year to keep below 350ppm of carbon dioxide by the end of the century, our total carbon budget is going to be 500GtC.

This means we’ve got ~129GtC that we can burn between now and 2050, and another 14GtC left over for 2050-2100. Humanity has already burned through ~370GtC from fossil fuels, so we’ve got to kick this habit quickly.
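
A quick way to see why the start date matters so much: with emissions cut by a constant fraction each year, total future emissions form a geometric series summing to roughly (current rate) ÷ (cut rate). Here’s a minimal sketch – the 10 GtC/yr starting rate is my own rough assumption for current global emissions, not a number from the paper:

```python
# Cumulative future emissions under a constant fractional cut each year.
# E0 = 10 GtC/yr is an assumed current global emission rate (illustration
# only, not a figure from the paper).

E0 = 10.0  # GtC per year (assumption)

for rate in (0.035, 0.06, 0.15):
    total = E0 * sum((1 - rate) ** year for year in range(500))
    print(f"{rate:.1%} cuts per year -> ~{total:.0f} GtC still emitted")

# ~286 GtC at 3.5%/yr, ~167 GtC at 6%/yr, ~67 GtC at 15%/yr. The 6%/yr
# figure lands in the same ballpark as the paper's ~143 GtC (129 + 14)
# left to burn.
```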

The paper points out that this means all of our remaining fossil fuel budget can be provided for by current conventional fossil fuels. Therefore, we would require the rapid phase-out of coal and leave all unconventional fossil fuels in the ground. Yes, all of them – all the tar sands, the shale gas, the shale oil, the Arctic oil, the methane hydrates, all of it.

The researchers also say that slow climate feedbacks need to be incorporated into planning, because we’re probably starting to push those limits. Slow feedbacks include things like melting ice sheets (Greenland and Antarctica), deforestation, melting permafrost and methane hydrates.

These things are like climate ‘black swans’ – they’re unquantifiable in that you don’t know when you’ve passed the irreversible tipping point until after you’ve gone beyond it, but things like the ocean no longer being able to absorb most of the carbon we spew into the atmosphere and the rapidly melting permafrost need to be considered in daylight as well as our nightmares now. This is because slow feedbacks can amplify climate changes by 30-50% which puts a big hole in our ‘not burning carbon anymore’ plan.

The paper points out: ‘warming of 2°C to at least the Eemian level could cause major dislocations for civilisation’, which I don’t even need to translate from scientist, because scientists are no longer bothering to pull their punches when explaining how quickly we need to stop burning carbon before we’re really screwed.

So what do we do? The paper makes some suggestions, pointing out that since the science clearly shows what’s happening, the range of responses is also pretty clear.

The first thing is a price on carbon. This paper suggests a carbon ‘fee’ with a border levy for countries that don’t sign up to the fee idea. The fee starts at $15/tonne of carbon and increases by $10/tonne each year. Imports from countries that don’t have the fee get charged at the border, and the revenue can then be used to assist industries that are exporting to countries without the fee.

They point out that this fee is well below the price of cleaning up our climate carbon mess. If we wanted to pay to sequester 100ppm of CO2 out of the atmosphere, it would cost ~$500/tonne of carbon. If that was charged to all countries based on their cumulative emissions, it would cost $28 trillion (or $90,000 per person) for the USA, which is responsible for 25% of cumulative global emissions. Hmm – expensive.
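
Here’s a rough sanity check of that price tag. The 2.12 GtC-per-ppm conversion is a standard figure rather than one quoted in the paper, and the population number is my own assumption:

```python
# Sanity-checking the clean-up bill with standard conversions (not from the
# paper itself).

GTC_PER_PPM = 2.12          # gigatonnes of carbon per ppm of atmospheric CO2
ppm_to_remove = 100
cost_per_tonne = 500        # USD per tonne of carbon sequestered
us_share = 0.25             # US fraction of cumulative emissions
us_population = 316e6       # ~2013 US population (assumption)

total_tonnes = ppm_to_remove * GTC_PER_PPM * 1e9
global_cost = total_tonnes * cost_per_tonne
us_cost = global_cost * us_share

print(f"Global clean-up bill: ${global_cost/1e12:.0f} trillion")
print(f"US share: ${us_cost/1e12:.1f} trillion, "
      f"${us_cost/us_population:,.0f} per person")

# ~$106 trillion globally; ~$27 trillion and ~$84,000 per person for the USA -
# the same ballpark as the paper's $28 trillion / $90,000 figures.
```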

The other things we need to get rid of fossil fuels are hybrid renewable smart grids and better efficiency, as well as not just an end to deforestation but ‘reforestation’ – increasing the number of trees on the planet.

There’s a lot of work to be done, but the clearest thing from this paper is the choice we cannot make is to do nothing. So let’s stop burning carbon.

Agreeing to Zero

If the UN climate negotiations were to actually produce an agreement in 2015 to replace the Kyoto Accord, what would it look like?

WHO: Erik Haites, Margaree Consultants, Toronto, Canada
Farhana Yamin, University College London, Chatham House
Niklas Höhne, Ecofys, Wageningen University

WHAT: The possible legal elements of a 2015 agreement by the UN on climate change mitigation (which they’ve promised to try and do by 2015)

WHEN: 13 October 2013

WHERE: Institute for Sustainable Development and International Relations, Paris, France

TITLE:  Possible Elements of a 2015 Legal Agreement on Climate Change (open access)

Let’s take a walk down fantasy lane for a moment, since the UNFCCC climate change talks are happening in Warsaw over the next two weeks, and imagine that the process begins to work. I’ll take off my cynical and sarcastic hat, and we can imagine together what the world might look like if nations sent negotiators with some actual power to commit their countries to reducing their carbon emissions.

Realistically, as any Canadian who has paid passing attention to the news can tell you, the Harper Government here in Canada is currently spending a lot of time and money muzzling and silencing its scientists who might talk about inconvenient things like climate change, so we know very little will come of the current negotiations. But I digress – as I said, let’s imagine, because that’s what this paper is doing.

These researchers looked at the current negotiations and figured that since all 194 countries that are part of the UN climate negotiations have agreed to negotiate a new legally binding agreement by 2015, to be implemented in 2020, maybe we should sit down and work out what the best one would look like? They helpfully point out that unless there’s an overarching plan, negotiations get bogged down in procedural details – which is true, given that Russia held up almost the entire negotiations on deforestation prevention financing in Bonn this year over the agenda.

Yes, Russia held up two weeks of negotiations over the agenda for the meeting (golf clap for Russia).

So to avoid negotiation grandstanding over the formatting of the agenda, what should the game plan be?

The researchers start by aiming high, stating that ‘the climate regime needs to move out of continuous negotiation and into a framework of continuous implementation’, which would be awesome. They suggest a hybrid approach – avoiding ‘top down’ regulation that tells countries how they can do things, and allowing a ‘bottom up’ approach where countries get to choose how they want to reduce their emissions.

Elements of a 2015 agreement (from paper)

They also suggest we end the current split between developed and least developed countries that has been standard so far in the negotiations (generally Annex 1 countries are the western industrialised countries while non-Annex 1 countries are the developing countries/third world).

Instead of sitting down and trying to hash out exactly what each country is going to do to reduce their carbon emissions, the researchers say they should all agree to have net zero carbon emissions by 2050. Everyone. We all agree to zero by 2050 and then it’s up to each nation to work out how.

Using the words ‘net zero’ allows countries that believe in the commercial viability of carbon capture and storage (CCS) to use that technology and still have zero emissions, but most importantly it leaves the details up to each country.

One planet, different ways to reach zero emissions (Norman Kuring, NASA GSFC, using data from the VIIRS instrument aboard Suomi NPP)

The simplicity of this idea means that the only agreement needed for 2015 is ‘we all agree to have zero emissions by 2050 and agree that progress will be monitored by the UNFCCC’ (although the UN will say it in their very wordy form of bureaucrat). That simplicity makes the idea quite appealing and possibly(?) achievable.

So let’s imagine that everyone agreed that we should all have net zero carbon emissions by 2050. How would the UNFCCC monitor this?

It will need several things – ways to measure reductions, ways to enforce reductions, ways to avoid free-riding and commitments from each country that get upgraded every four years.

The idea these researchers have is that once everyone signs on, each country is then responsible for showing how it’s going to reduce its emissions. Countries will submit these national plans to the UN, and the UN will either approve them or send them back saying they’re not good enough. Each country will also have to prove it’s doing its fair share to reduce emissions (time to pony up the reductions EU, UK, Canada, US, Australia and others!).

The plans will then be reported on every year, with the exception of countries that produce less than 0.1% of global emissions. Something that might surprise you: 100 countries fall under that threshold! Those countries would only have to report on their progress towards zero emissions every five years.

In order to make the process complete, the agreement would need to include everything – air travel emissions, shipping emissions (and who is responsible for those emissions – the country of origin or the country that ordered the stuff?). There will also need to be more money for climate adaptation, because we are already in the age of climate impacts, which will involve wealthier countries coughing up the $100 billion each year they promised and haven’t delivered.

Oh, and of course the agreement needs to be legally binding because everyone knows voluntary commitments are like buying a gym membership as a New Year’s resolution.

Now of course, my first question when reading about an internationally agreed, legally binding commitment to reduce carbon emissions to zero by 2050, through rolling four-year commitment periods that automatically ramp up each time, is whether the unicorns are included or cost extra.

Complimentary unicorns? (Gordon Ednie, flickr)

However, the researchers rightly point out that with currently available technologies and some political will, it’s possible to cut 90% of carbon emissions by 2050. They list sensible and achievable things to get us there, like net zero carbon electricity by 2040, totally recyclable products by 2050, zero energy buildings by 2025, decarbonised and electrified passenger transit by 2040, hydrofluorocarbon (HFC) phase-out by 2030, zero food in landfill by 2025 and the end of deforestation by 2025.

Even better, they’ve got a list of things we can do before the agreement kicks in in 2020, like removing the billions of dollars currently spent each year subsidising fossil fuels, better energy standards and air pollution standards, regulation of shipping and aviation emissions, and incentives for early mitigation. Sounds simple, no?

They also recommend incentives for countries that are beating their reduction targets as well as recognition for companies and other organisations within countries that are doing their part too.

This idea is great, if we can get past the sticking point of getting countries to send negotiators who actually have the power to agree to a legally binding agreement (this year Australia didn’t bother to send its Minister for the Environment, because the new Prime Minister doesn’t believe in climate change).

But if we did all agree that global emissions should be zero by 2050 (which they need to be to preserve a livable climate), this paper outlines a pretty good idea of what it could look like.

Timing the Carbon Excursion

Trying to fine tune the timeline for the Paleocene-Eocene mass extinction using fossilised mud

WHO: James D. Wright, Department of Earth and Planetary Sciences, Rutgers University
Morgan F. Schaller, Department of Earth and Planetary Sciences, Rutgers University; Department of Geological Sciences, Brown University, Providence, RI

WHAT: Trying to get a more precise timeline of the Paleocene-Eocene thermal maximum (PETM) mass extinction.

WHEN: October 1st, 2013

WHERE: Proceedings of the National Academy of Sciences of the United States of America (PNAS), vol. 110, no. 40, October 2013.

TITLE: Evidence for a rapid release of carbon at the Paleocene-Eocene thermal maximum (subs req.)

My favourite accidental metaphor from this paper is when the researchers talked about a ‘carbon isotope excursion’ that caused the Paleocene-Eocene thermal maximum, which is scientist for carbon being released into the atmosphere causing a mass extinction. However, I couldn’t help thinking of a school bus full of carbon isotopes…

CO2 getting on a school bus??? (image: Hannah Holt, Lightbulb Books)

The Paleocene-Eocene thermal maximum (or PETM for short) occurred around 55.8 million years ago and is considered the best historical comparison to what we’re currently doing to the atmosphere with carbon pollution. There was rapid global warming of around 6°C, which was enough to cause a mass extinction, but estimates from deep ocean fossilised carbon isotopes for how long the carbon release took range from less than 750 years up to 30,000 years.

Obviously this is a huge window of time, and unless it can be further narrowed down, not very useful for humans trying to work out how long we have left to do something about climate change. Narrowing the time scale for the mass extinction is what these researchers tried to do.

Firstly, there are several ways the PETM mass extinction could have happened. The possibilities are: release of methane from reservoirs like permafrost or under the ocean; wildfires in peatlands; desiccation of the ocean (the air warming up enough that the ocean starts evaporating); or meteor impact.

Without working out the timing of the extinction, we can’t work out which mechanism caused it. So these researchers went to an area of the USA called the Salisbury Embayment – modern-day Delaware, which was shallow ocean 55 million years ago. Because that fossilised shallow-ocean sediment is now on dry land, it preserves a more detailed record; ocean sediment still under the ocean keeps getting reworked by currents, so it only shows bigger changes.

The researchers looked at sediment cores from Wilson Lake B and Millville which have distinct and rhythmic layers of silty clay through the entire section that corresponds to the carbon excursion into the atmosphere.

Layers of clay in the sediment cores (from paper)

The distinct layers you can see above occur every 1-3cm along the core, which allows for a more precise chronology of the mass extinction.

First they ran tests for the amount of 18O (an oxygen isotope) and found consistent cycles of a 1.1‰ change through the core from Wilson Lake B, and a similar cycle in the Millville core. They then tried to work out what kind of time scale this could line up with. After running through a few ideas that didn’t fit with other evidence from the period, they worked out that the oxygen cycles were consistent with seasonal changes, which means the sediment cores line up to 2cm of mud per year. This is excitingly precise! It also fits what researchers know about the build-up of sediment in shallow areas – e.g. the Amazon River can accumulate 10cm/year of mud and clay, so 2cm per year fits the other evidence.

Next in the researchers’ whodunit, they sampled the beginning of the carbon release, which was at 273m along the core. To be precise, they sampled from 273.37-273.96m and tested for 13C (a carbon isotope).

They found the carbon isotope value in the silt dropped to 3.9‰, then 1.45‰ and then to -2.48‰ over thirteen years. A drop like that in the ocean means carbon is being released into the atmosphere. The concentration of calcium carbonate (CaCO3) – what shells are made of – also dropped from 6% to 1% within a single layer of sediment. So there was definitely something going on, but what exactly was it?
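
To get a feel for the chronology, here’s a minimal depth-to-time conversion using the 2cm-per-year sedimentation rate the authors derived from the seasonal 18O cycles (the 26cm figure is just the paper’s 13 years re-expressed at that rate):

```python
# Convert core depth to time using the authors' derived sedimentation rate.
SEDIMENT_RATE_M_PER_YR = 0.02   # 2 cm of mud laid down per year

def years_of_record(top_m: float, bottom_m: float) -> float:
    """Years of history recorded between two depths in the core."""
    return abs(bottom_m - top_m) / SEDIMENT_RATE_M_PER_YR

# The sampled window at the onset of the carbon release:
print(years_of_record(273.37, 273.96))  # ~30 years of record in 59 cm of core
# The 13-year carbon isotope drop corresponds to about 26 cm of core:
print(13 * SEDIMENT_RATE_M_PER_YR)      # 0.26 m
```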

The beginning of the carbon release is the unknown bit; it’s generally agreed that the recovery from the 6°C of global warming took 150,000 years. The sediment layers show that the release of carbon was almost instantaneous (it happened over 13 years), and after the release it took between 200 and 2,000 years for the ocean and the atmosphere to reach a new equilibrium. This means that after the initial carbon release the atmosphere changed rapidly for the next 2,000 years before it reached the ‘new normal’. From there it took 150,000 years to get back to the original normal from before the carbon release.

Long road to recovery – PETM followed by 2,000 years of change, then a slow slope back to normal temps (from Hansen et al., PNAS 2013; bottom scale in millions of years)

Looking back at what order of events could have caused this mass extinction, the only one that fits is methane release AND a meteorite. There was a recovery of carbon isotope values in the upper clay, which the researchers say indicates carbon moving from the atmosphere into the ocean, with more carbon movement in the shallow areas than the deep ones. However, the meteorite alone (which accounts for the carbon going from the atmosphere into the ocean) wouldn’t be enough to cause 6°C of global warming. It takes around 3,000 gigatonnes (Gt) of carbon to change the atmosphere by that much, which is why the researchers think it was a meteorite plus methane release from the ocean that together add up to the 3,000Gt.

So what are the lessons for humanity in this great carbon excursion of mass extinction? Firstly, climate change can happen quite quickly. Secondly, once you’ve changed the atmosphere, it’s a long road back to the normal you had before you burned the carbon. Thirdly, it only takes 3,000Gt of carbon to cause a mass extinction.

The last UN IPCC report gave the planet a carbon budget of 1,000Gt of total emissions to keep humanity within a level of ‘safe’ climate change. Currently, the world has burned half of that budget and will blow through the rest of it within the next 30 years. If we burn enough carbon to start releasing methane from permafrost or the oceans, it’s a 150,000-year trek back from a possible mass extinction.

I guess it is true – those who don’t learn from history are doomed to repeat it. How about we learn from history and stop burning carbon?

If We Burn All the Fossil Fuels

“The practical concern for humanity is the high climate sensitivity and the eventual climate response that may be reached if all fossil fuels are burned” – Hansen et al. September 2013

WHO: James Hansen, Makiko Sato, The Earth Institute, Columbia University, New York, NY
Gary Russell, NASA Goddard Institute for Space Studies, New York, NY
Pushker Kharecha, The Earth Institute, Columbia University, NASA Goddard Institute for Space Studies, New York, NY

WHAT: Using deep ocean oxygen isotope ratios to determine how sensitively sea levels and surface temperatures respond to climate forcing.

WHEN: September 2013

WHERE: Philosophical Transactions of the Royal Society A (Phil Trans R Soc A) Vol. 371, No. 2001

TITLE: Climate sensitivity, sea level and atmospheric carbon dioxide (open access)

Ok, firstly, let us just take a moment to geek out about how awesome science is. This paper looks at what our planet was like millions of years ago by studying the amounts of different oxygen and carbon isotopes in the shells of foraminifera that have been buried at the bottom of the ocean since they died millions of years ago. Science can tell us not only how old they are, by dating the carbon in their fossilised bodies, but also what the temperature was. That is awesome.

Foraminifera from Japan (Wikimedia commons)

The lead author of this paper – Dr. James Hansen is pretty much the Godfather of climate science. He’s been doing climate models looking at the possible effects of extra carbon in our atmosphere since he basically had to do them by hand in the 1980s before we had the internet. He knows his stuff. And so far, he’s been right with his projections.

The paper (which is a very long read at 25 pages) focuses on the Cenozoic climate, which is the period from 65.5 million years ago to the present. The Cenozoic is the period after the Cretaceous (so we’re talking mammals here, not dinosaurs) and includes the Palaeocene-Eocene thermal maximum, when the deep ocean was 12°C warmer than today, as well as the cooling from there that led to the formation of the Greenland and Antarctic ice sheets.

The period of time studied by the paper (bottom axis is million years before present) (from paper)

What does this show us? The warming that eventually led to the Palaeocene-Eocene thermal maximum started around 3,000 years before there was a massive carbon release. The researchers think this carbon release was from methane hydrates venting from the ocean, because there was a lag in the warming of the intermediate ocean after the carbon release.

The thermal maximum had global surface temperatures around 5°C warmer than today, and it took about 4,000-7,000 gigatonnes (Gt) of carbon released into the atmosphere to force that kind of warming.

After this warming happened there were ‘hyperthermal’ events (where the temperature spiked again) as the planet slowly cooled, showing how long the recovery time for the planet was from this greenhouse warmed state.

In the warmed world of the Palaeocene-Eocene maximum, sea levels were probably 120m higher than they are now. The researchers found there’s a snowball effect with changes in ocean temperatures: a -1°C difference in deep ocean temperatures was enough to trigger the last ice age, while sea levels were 5-10m higher when temperatures were ‘barely warmer than the Holocene’ (which is us – we live in the Holocene).

The researchers found that during the Pliocene (about 5 million years ago) sea levels were 15m higher than today, which they point out means that the East and West Antarctic ice sheets are likely to be unstable at temperatures we will reach this century from burning fossil fuels.

From the data, they then tried to work out how sensitive the atmosphere is to extra carbon. This is important to know, because we’re currently changing the chemical composition of the atmosphere much faster than ever before. The previous greenhouse warming the planet experienced occurred over millennial time scales – the current rate at which we’re pumping carbon into the atmosphere is causing change over only hundreds of years.

To work out how sensitive the climate is to being forced by carbon, the researchers used a simplified model where the atmosphere was split into 24 layers to test the rapid equilibrium responses to forcing.

They wanted to find out if we could be in danger of runaway climate change – the most extreme version of which happened on the planet Venus, where runaway warming amplified by water vapour led to a new stable average temperature of 450°C, the carbon baked onto the surface of the planet and all the water evaporated into the sky. Obviously, humanity will want to avoid that one… The good news is there isn’t enough carbon on this planet for humans to accidentally do that to ourselves before the sun does it to us in a billion years or so.

We’ve avoided this for now (NASA NSSDC Photo Gallery)

The researchers then tested the response to doubling and halving the CO2 in the system, starting from the 1950 concentration of 310ppm. They found that three halvings give you a ‘snowball Earth’ response of mass glaciations, while in the other direction, somewhere between 1-4x CO2 all the snow disappears, which speeds up the feedback (because snow reflects heat), making the fast-feedback sensitivity 5°C of global warming. For 8-32x CO2 the sensitivity is approximately 8°C, with water vapour feedbacks (what happened on Venus, at a smaller scale).
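
To see why the paper counts in doublings and halvings: each doubling of CO2 adds roughly the same amount of radiative forcing. Here’s a minimal sketch using the standard simplified expression ΔF = 5.35 ln(C/C0) W/m² (Myhre et al. 1998) – an outside approximation, not the paper’s 24-layer model:

```python
import math

# Forcing relative to the 1950 baseline for each halving/doubling of CO2,
# using the standard simplified log expression (not the paper's model).

BASELINE_PPM = 310.0   # the 1950 concentration used in the paper

for multiple in (0.125, 0.25, 0.5, 1, 2, 4, 8, 16, 32):
    forcing = 5.35 * math.log(multiple)
    print(f"{multiple:>6}x CO2 ({BASELINE_PPM * multiple:>7.1f} ppm): "
          f"{forcing:+6.1f} W/m^2")

# Three halvings (0.125x) remove ~11 W/m^2 -> snowball Earth territory;
# five doublings (32x) add ~18.5 W/m^2 - and each doubling adds the same
# ~3.7 W/m^2 step.
```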

But what do any of these numbers mean?

As the paper says; ‘the practical concern for humanity is the high climate sensitivity and the eventual climate response that may be reached if all fossil fuels are burned’.

So here’s the lesson we need to learn from the Palaeocene-Eocene thermal maximum. For global warming, we can assume that 75% of it comes from CO2 and the remaining 25% from other greenhouse gases like methane and nitrous oxide. If we burn all the fossil fuels we have left in the ground, that’s about 10-15,000Gt of carbon we could put in the atmosphere.

That gives us 5x the CO2 of 1950, or 1,400ppm, which will give us 16°C of global warming. It will be a world with an average temperature of 20°C on land and 30°C at the poles (the current average is 14°C). Keep in mind that 6°C of warming is generally enough for a mass extinction like the dinosaurs.

This would eliminate grain production across most of the globe and seriously increase the amount of water vapour in the air, which means it gets more humid (the extra water vapour would also destroy most of the ozone layer).

A wet bulb temperature is the temperature with the humidity included. Humans generally live with wet bulb temperatures between 26-27°C, up to 31°C in the tropics. A wet bulb temperature of 35°C or above means the body can’t cool down, which results in ‘lethal hyperthermia’ – scientist for it’s so hot and sticky that you die from the heat.
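
To get a feel for when heat plus humidity crosses that lethal 35°C line, here’s a minimal sketch using the Stull (2011) empirical wet-bulb approximation – an outside formula, not something from this paper:

```python
import math

# Approximate wet-bulb temperature from air temperature (C) and relative
# humidity (%), using the Stull (2011) empirical fit. Valid roughly for
# 5-99% RH and -20 to 50 C.

def wet_bulb_stull(temp_c: float, rh_percent: float) -> float:
    return (temp_c * math.atan(0.151977 * math.sqrt(rh_percent + 8.313659))
            + math.atan(temp_c + rh_percent)
            - math.atan(rh_percent - 1.676331)
            + 0.00391838 * rh_percent ** 1.5 * math.atan(0.023101 * rh_percent)
            - 4.686035)

for temp, rh in [(30, 50), (35, 75), (40, 70), (45, 50)]:
    print(f"{temp} C at {rh}% humidity -> "
          f"wet bulb ~{wet_bulb_stull(temp, rh):.1f} C")

# 40 C at 70% humidity already gives ~34.9 C wet bulb - right at the edge
# of survivability.
```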

Burning all the fossil fuels will result in a planet with wet bulb temperatures routinely above 35oC, which means we’ll have cooked the atmosphere enough that we’ll end up cooking ourselves.

If the climate has a low sensitivity to this kind of forcing, it will take 4.8x CO2 concentrations to cause an unlivable climate. If the climate is more sensitive, it will take less than that to cook ourselves.

Oh, and the other kicker? The changes around the Palaeocene-Eocene thermal maximum played out over thousands of years, so the mammals survived by evolving to be smaller. Our climate change is only taking hundreds of years, which is not enough time for any plants or animals to evolve and adapt.

Basically, if we burn all the fossil fuels, we’re all going down and taking the rest of the species on the planet with us, and we really will be the dumbest smart species ever to cause our own extinction.

So far, James Hansen’s climate projections have been correct. So when he says we can’t burn all the fossil fuels because we’ll cook the planet if we do, I say we pay close attention. Oh, and we should stop burning carbon.

Nemo the Climate Refugee

If you collected all the recent research on marine species and climate change, could you see a pattern of fish and marine species migration?

WHO: Elvira S. Poloczanska, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia
Christopher J. Brown, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia, School of Biological Sciences, The University of Queensland, Australia
William J. Sydeman, Sarah Ann Thompson, Farallon Institute for Advanced Ecosystem Research, Petaluma, California, USA
Wolfgang Kiessling, Museum für Naturkunde, Leibniz Institute for Research on Evolution and Biodiversity, Berlin, Germany, GeoZentrum Nordbayern, Paläoumwelt, Universität Erlangen-Nürnberg, Erlangen, Germany
David S. Schoeman, Faculty of Science, Health and Education, University of the Sunshine Coast, Maroochydore, Queensland, Australia, Department of Zoology, Nelson Mandela Metropolitan University, Port Elizabeth, South Africa
Pippa J. Moore, Centre for Marine Ecosystems Research, Edith Cowan University, Perth, Western Australia, Institute of Biological, Environmental and Rural Sciences, Aberystwyth University, Aberystwyth UK
Keith Brander, DTU Aqua—Centre for Ocean Life, Technical University of Denmark, Charlottenlund Slot, Denmark
John F. Bruno, Lauren B. Buckley, Department of Biology, The University of North Carolina at Chapel Hill, North Carolina, USA
Michael T. Burrows, Scottish Association for Marine Science, Scottish Marine Institute, Oban, UK
Johnna Holding, Department of Global Change Research, IMEDEA (UIB-CSIC), Instituto Mediterráneo de Estudios Avanzados, Esporles, Mallorca, Spain
Carlos M. Duarte, Department of Global Change Research, IMEDEA (UIB-CSIC), Instituto Mediterráneo de Estudios Avanzados, Esporles, Mallorca, Spain, The UWA Oceans Institute, University of Western Australia
Benjamin S. Halpern, Carrie V. Kappel, National Center for Ecological Analysis and Synthesis, Santa Barbara, California, USA
Mary I. O’Connor, University of British Columbia, Department of Zoology, Vancouver, Canada
John M. Pandolfi, Australian Research Council Centre of Excellence for Coral Reef Studies, School of Biological Sciences, The University of Queensland, Australia
Camille Parmesan, Integrative Biology, Patterson Laboratories 141, University of Texas, Austin, Texas, Marine Institute, A425 Portland Square, Drake Circus, University of Plymouth, Plymouth, UK
Franklin Schwing, Office of Sustainable Fisheries, NOAA Fisheries Service, Maryland, USA
Anthony J. Richardson, Climate Adaptation Flagship, CSIRO Marine and Atmospheric Research, Brisbane, Queensland, Australia, Centre for Applications in Natural Resource Mathematics (CARM), School of Mathematics and Physics, University of Queensland, Australia

WHAT: A review and synthesis of all available peer reviewed studies of marine species changing under climate change.

WHEN: 4 August 2013

WHERE: Nature Climate Change, August 2013

TITLE: Global imprint of climate change on marine life (subscription required)

This paper, with its laundry list of collaborating authors, must have had an awesome ‘we got published’ party. And when you think about what they did, they earned it – all that data would have taken forever to crunch, so it’s a good thing it was all hands on deck.

So what were they looking at? They were trying to work out whether you can see the fingerprint of climate change in the shifting distributions of marine species. To do that, they gathered every available peer-reviewed study of expected changes for fish and other ocean species under climate change. Then they lined the predictions up against the observed results to see what happened – and it turns out we’ve got some frequent-travelling fish.

Once all the studies were together, the researchers had 1,735 observations covering everything from phytoplankton to zooplankton to fish and seabirds, drawn from 208 studies of 857 species. They used all of the data: the changes that lined up with climate change projections, the ones that showed no change, and the ones that changed in unexpected ways.

Global marine migration (from paper)

Ocean currents make it easier for marine species to travel long distances than it is for plants and animals on land – there’s only so far a seed can travel from the tree on the wind, after all. Even so, this research found that the average range expansion for marine species was 72km/decade (±13.5km). That doesn’t sound like much to a human, but it’s an order of magnitude further than land-based migration averages, and it’s a long way for a mollusc or a starfish to go.

The species chalking up the most frequent flyer points were phytoplankton, which have been moving 469.9km/decade (±115km), followed by fish at 227.5km/decade (±76.9km). Of the 1,735 observations, a whopping 1,092 were moving in the directions expected under climate change.

For each species migration, the researchers looked at the expected decadal rate of ocean temperature change in the area and found that some groups move early, some wait longer, and others are falling behind.

For example, in the Bering Sea (where the Discovery Channel show ‘The Deadliest Catch’ is set), many species rely on a pool of really cold water – less than 2°C – that separates the Arctic and subarctic animals. This cold pool has been moving further north as Arctic sea ice melts, but the responses by species are varied. Some are at the leading edge and move early, others don’t. The researchers think this comes down to population size, ability to migrate, dependence on habitat (remember how Nemo’s dad didn’t want to leave the reef?), competition for food and other factors.

Clownfish (Wikimedia commons)

I guess it’s similar to when a natural disaster hits a human settlement: some families leave, others rebuild, and the decision rests on a whole complicated list of reasons like family, jobs, resources and more. Anyway, back to the fish.

The researchers then tested their data for a global climate change fingerprint. They used a binomial test against 0.5 – the result you would expect if these changes in location were just random variability – and found that 83% of the changes had climate change as the dominant driving force.

When they limited the data to multi-species studies only, 81% of the changes were still driven by climate change. They re-ran the data to exclude every bias they could think of, and still concluded that it provided ‘convincing evidence that climate change is the primary driver behind the observed biological changes’.
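For the statistically inclined, here’s roughly what a test like that looks like in practice – a sketch using the headline numbers above (my own illustration; the paper’s actual analysis was run per taxonomic group and with extensive bias checks):

```python
from scipy.stats import binomtest

# 1,092 of 1,735 observations moved in the direction expected under
# climate change; test against p=0.5, the coin-flip rate you'd expect
# from random variability alone
result = binomtest(1092, n=1735, p=0.5, alternative='greater')
print(result.pvalue)  # vanishingly small -> very unlikely to be chance
```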

Binomial test results – if you get 0.5 it’s a random occurrence – this migration is climate caused (from paper)

So there you have it – climate refugees aren’t just land based. Nemo’s going to have to move too.

Antacid for our Oceans

An electrolysis method that removes CO2 from seawater could be affordably scaled up for commercial carbon sequestration.

WHO: Greg H. Rau, Institute of Marine Sciences, University of California, Santa Cruz, Physical and Life Sciences, Lawrence Livermore National Laboratory, Livermore, CA
Susan A. Carroll, William L. Bourcier, Michael J. Singleton, Megan M. Smith, and Roger D. Aines, Physical and Life Sciences, Lawrence Livermore National Laboratory, Livermore, CA

WHAT: An electrochemical method of sequestering CO2 from sea water using silicate rocks.

WHEN: June 18, 2013

WHERE: Proceedings of the National Academy of Sciences (USA), PNAS vol. 110, no. 25

TITLE: Direct electrolytic dissolution of silicate minerals for air CO2 mitigation and carbon-negative H2 production (open access)

This paper was fun – I got to get my chemistry nerd back on, thinking in moles per litre and chemical equations! It almost made me miss university chemistry lectures.

No, not those moles per litre! (IFLS facebook page)

So what do chemistry jokes have to do with carbon sequestration? Well, it’s looking increasingly likely that humanity will have to figure out ways to draw carbon back out of the atmosphere or the oceans, because we’ve procrastinated on reducing our carbon emissions for so long.

There are two options for this: you can either create a chemical reaction that draws CO2 out of the air, or one that draws CO2 out of a solution – and given how quickly the oceans are acidifying, using sea water would be a good idea. The good news is, that’s exactly what these researchers did!

Silicate rock (mostly basalt) is the most common rock type in the Earth’s crust. It also reacts with CO2 to form stable carbonates and bicarbonates (like the bicarbonate of soda you bake with). Normally this happens very slowly through rock weathering – but what if you harnessed it as a process to sequester CO2?
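To give a rough idea of the chemistry – using wollastonite (CaSiO3), the textbook example of a silicate, rather than the specific minerals in the paper – the weathering reaction looks like this:

CaSiO3 + 2CO2 + H2O → Ca2+ + 2HCO3- + SiO2

The CO2 ends up locked away in bicarbonate ions instead of floating free in the air or acidifying the water.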

The researchers built a saline water electrolytic cell to test it out. An electrolytic cell is the one you made in school: an anode and a cathode in (usually) two different solutions, where running an electric current through it drives a chemical reaction. These researchers put silicate minerals, saline water and CO2 in on one side, added electricity, and got out bicarbonates, hydrogen, chlorine or oxygen, silicates and salts.

A basic schematic of the experiment (from paper)

The researchers used an acid/base reaction (remember those from school?!) to speed up the silicate-CO2 reaction, which also works well in an ocean setting because saline electrolysis produces large differences in pH. Are you ready to get really nerdy with me? The chemical equation is this:

Chemical equation for the experiment (from paper)

So how did the experiment go? It worked! They successfully sequestered carbon dioxide with an efficiency of 23-32%, capturing 0.147g of CO2 per kilojoule (kJ) of electricity used.

There are issues around scaling up the reaction, of course – once the bicarbonate has been created, where do you store it? The paper suggests ocean storage, since the bicarbonate solids would be inert (un-reactive). I would hope a re-use option could be found – has anyone looked into using bicarbonate solids as an eco-building material?

There’s also the issue of needing to power the reaction with electricity. If scaled up, this process would have to be powered by renewable energy, because burning carbon to sequester carbon nets out to zero.

Also, if sea water is used, the main by-product is chlorine gas (Cl2), so the researchers point out that while it would be feasible to run this process directly in the ocean, the question of what to do with all that chlorine would need to be solved first. The paper suggests using oxygen-selective anodes in the electrolysis, or ion-selective membranes around the reaction, to keep the chlorine separate from the ocean.

That being said, there are some exciting upsides to this process. The paper points out that the amount of silicate rock in the world ‘dwarf[s] that needed for conversion of all present and future anthropogenic CO2’. And working from sea water makes sequestering CO2 easier than air-based capture methods.

Scaling the method up looks economically feasible too. The researchers estimated that 1.9MWh (megawatt hours) of electricity would be needed per metric tonne of CO2 sequestered. If the waste hydrogen from the process were sold as fuel for hydrogen fuel cells, the cost would be $86/tonne of CO2 sequestered; without the hydrogen sales it would still only be $154/tonne, which compares very favourably with most current carbon capture and storage feasibility estimates of $600-$1,000/tonne of CO2.
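Those two figures – 0.147g of CO2 per kJ and 1.9MWh per tonne – line up, which you can check with some quick back-of-envelope arithmetic (my own sanity check, not from the paper):

```python
# Energy needed per tonne of CO2, from the quoted 0.147 g CO2 per kJ
g_per_kj = 0.147
kj_per_tonne = 1_000_000 / g_per_kj   # grams in a tonne / grams per kJ
mwh_per_tonne = kj_per_tonne / 3.6e6  # 1 MWh = 3.6 million kJ
print(round(mwh_per_tonne, 2))        # ~1.89 -> matches the ~1.9 MWh quoted
```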

So, like an antacid for the oceans, if this process can be scaled up commercially through research and development, we could have an effective way to not only capture and store carbon, but also reduce ocean acidification. A good-news story indeed – now we just need to stop burning carbon.