Drought – worse than we thought

Inconsistencies between observations and drought models that don't account for sea surface temperature changes mean that drought in a climate-changed world could be worse than predicted.

WHO: Aiguo Dai, National Center for Atmospheric Research, Boulder, Colorado, USA

WHAT: Looking at the impact of sea surface temperature variability on drought

WHEN: January 2013

WHERE: Nature Climate Change, Vol. 3, No. 1, January 2013

TITLE: Increasing drought under global warming in observations and models (open access)

Climate deniers love it when the models used to predict future climate change are slightly wrong, and believe me, I'd love it if climate change weren't so verifiably real and we could all retire and live la dolce vita.

However, that's not reality, and in the case of this paper, where the model doesn't quite line up with the observed changes, it's because things are worse than we previously thought. Oh dear.

Global warming since the 1980s has contributed to an 8% increase in drought-ridden areas in the 2000s. It’s led to things like diminished corn crops and the steady draining of underground water aquifers in the USA, much of which is currently experiencing persistent drought. The letter L on the map below stands for long term drought.

Long term drought in the Southwest of the USA (from US Drought Monitor)

What's that got to do with climate models? Well, while the droughts in Southern Europe or my homeland of Australia are due to lack of rain drying things out, drought can also be caused by increased evaporation from warmer air temperatures, which the models don't fully take into account.

These droughts are harder to measure because they’re related to sea surface temperature changes that take decades and can be hard to identify as a human forced signal rather than just natural variations. So this researcher compared sea surface temperatures with drought predictions and observed warming to try and work out what is going on.

Predicted changes in soil moisture globally for 1980–2080 (black dots are where 9 out of 11 models agree on data) (from paper)

There were two areas where the models differed from the observed changes – the Sahel area in Africa and the USA.

In the Sahel, the models predicted warming in the North Atlantic Ocean, which would lead to increased rain. What actually happened was large warming in the South Atlantic relative to the North Atlantic and steady warming over the Indian Ocean, which meant less rain, not more. Similarly, the predicted patterns for the USA missed the influence of the Pacific Multidecadal Oscillation, which isn't known to be driven by human climate forcing but had switched to a warm phase with above-normal sea surface temperatures.

Top: Observed sea surface temperatures. Bottom: predicted sea surface temperatures (from paper)

These sea surface variations that were missed in some of the previous models have some obvious consequences for planning for the slow pressure cooker of stress that drought is on anyone living through it, let alone trying to make a living from agriculture.

The researcher noted that there were also some differences from the models when looking at sulphate aerosols; however, for the 21st century the signal from greenhouse gases will be much stronger than that from aerosols, so it shouldn't mess with the data too much.

So what does this all mean? Well, it means that there are both regional and broader trends for drought in a changed climate. The broad patterns are fairly stable 'because of the large forced trend compared with natural variations', which is scientist-speak for 'humans are making a large enough mess of this to see the evidence clearly'.

The paper ends quite bluntly, stating that after re-working the simulations to take into account the new data for sea surface temperatures and other variables, there's only more bad news.

It’s likely to be ‘severe drought conditions by the late half of this century over many densely populated areas such as Europe, the eastern USA, southeast Asia and Brazil. This dire prediction could have devastating impacts on a large number of the population if the model’s regional predictions turn out to be true.’

Yes, a researcher actually used the word ‘dire’ in a scientific paper. Oh, and this was with an intermediate emissions scenario, not the business as usual path we’re currently all on. How about we all agree to stop burning carbon now?

Greenland Whodunit

“The next 5–10 years will reveal whether or not [the Greenland Ice Sheet melting of] 2012 was a one off/rare event resulting from the natural variability of the North Atlantic Oscillation or part of an emerging pattern of new extreme high melt years.”

WHO: Edward Hanna, Department of Geography, University of Sheffield, Sheffield, UK
Xavier Fettweis, Laboratory of Climatology, Department of Geography, University of Liège, Belgium
Sebastian H. Mernild, Climate, Ocean and Sea Ice Modelling Group, Computational Physics and Methods, Los Alamos National Laboratory, USA, and Glaciology and Climate Change Laboratory, Center for Scientific Studies/Centro de Estudios Científicos (CECs), Valdivia, Chile
John Cappelen, Danish Meteorological Institute, Data and Climate, Copenhagen, Denmark
Mads H. Ribergaard, Centre for Ocean and Ice, Danish Meteorological Institute, Copenhagen, Denmark
Christopher A. Shuman, Joint Center for Earth Systems Technology, University of Maryland, Baltimore, USA, and Cryospheric Sciences Laboratory, NASA Goddard Space Flight Center, Greenbelt, USA
Konrad Steffen, Swiss Federal Research Institute WSL, Birmensdorf, Switzerland; Institute for Atmosphere and Climate, Swiss Federal Institute of Technology, Zürich, Switzerland; and Architecture, Civil and Environmental Engineering, École Polytechnique Fédérale de Lausanne, Switzerland
Len Wood, School of Marine Science and Engineering, University of Plymouth, Plymouth, UK
Thomas L. Mote, Department of Geography, University of Georgia, Athens, USA

WHAT: Trying to work out the cause of the unprecedented melting of the Greenland Ice Sheet in July 2012

WHEN: 14 June 2013

WHERE: International Journal of Climatology (Int. J. Climatol.) Vol. 33 Iss. 8, June 2013

TITLE: Atmospheric and oceanic climate forcing of the exceptional Greenland ice sheet surface melt in summer 2012 (subs req.)

Science can sometimes be like being a detective (although I would argue it’s cooler) – you’ve got to look at a problem and try and work out how it happened. These researchers set out to do exactly that to try and work out how the hell it was that 98.6% of the ice sheet on Greenland started melting last summer.

Greenland – July 8th on the left only half melting. July 12th on the right, almost all melting (Image: Nicolo E. DiGirolamo, SSAI/NASA GSFC, and Jesse Allen, NASA Earth Observatory)

For a bit of context, Greenland is the kind of place where the average July temperature in the middle of summer is 2°C and the average summer temperature at the summit of the ice sheet is -13.5°C. Brrr – practically beach weather! So there's got to be something weird going on for the ice sheet to start melting like that. Who are the suspects?

Atmospheric air conditions
Suspect number one is the atmospheric air conditions. The summer of 2012 was influenced strongly by ‘dominant anti-cyclonic conditions’ which is where warm southerly air moves north and results in warmer and drier conditions. There was also a highly negative North Atlantic Oscillation (NAO) which created high temperatures at high altitudes around 4km above sea level, which could explain the melting on the summit. The researchers also pointed out that the drier conditions meant less cloud cover and more sunny days, contributing to speedier melting.

There were issues with the polar jet stream that summer, where it got 'blocked' and stuck over Greenland for a while. The researchers used the Greenland Blocking Index (GBI), which while not trading on the NYSE, does tell you about height anomalies at certain geopotential levels – effectively, how strongly high pressure is parked over Greenland (yeah, lots of meteorological words in this paper!). All of this makes the atmosphere look pretty guilty.

Jet stream getting funky – temperature anomaly patterns at 600 hectopascals pressure, aka 4000m above sea level with a big red blob over Greenland (from paper)

Sea surface temperatures
Suspect number two is sea surface temperatures. If the ocean was warmer, that could have created conditions where the ice sheet melted faster, right? The researchers ran a simulation of the conditions around Greenland for the summer of 2012 and then played around with different sea surface temperatures, as well as levels of salinity. It made less than 1% difference, so they don't think it was the sea surface. Also, ocean temperatures change more slowly than air temperatures (that's why the ocean is still so cold even in the middle of summer!), and when they looked at the data, sea surface temperatures were actually a bit cooler in 2012 than in 2011. Not guilty, sea surface temperatures.

Sea surface temperatures (top) and salinity (bottom) both decreasing (from paper)

Cloud patterns
Suspect number three is cloud cover patterns, which the researchers said could be a contributor to the ice sheet melting. However, they don’t have a detailed enough data set to work out how much clouds could have contributed to the melt. Not guilty for now clouds – due to lack of evidence.

Which leaves suspect number one – atmospheric air conditions. Guilty! Or, as the paper says ‘our present results strongly suggest that the main forcing of the extreme Greenland Ice Sheet surface melt in July 2012 was atmospheric, linked with changes in the summer NAO, GBI and polar jet stream’.

Now comes the scary part – it's the atmosphere that we've been conducting an accidental experiment on over the last 200 years by burning fossil fuels. As the researchers point out, the North Atlantic Oscillation has natural variability and patterns, so we could all cross our fingers and hope that the Greenland melting was a one-off anomaly. Given the work that Dr Jennifer Francis has been doing at Rutgers into polar ice melt and how it slows the jet stream and causes it to meander, this may not be a good bet. Combine this with the fact that this level of melting is well beyond 'the most pessimistic future projections' and it's getting scarier. This kind of melting wasn't supposed to occur until 2100, or 2050 in the worst case scenarios.

Interestingly, this could also link to some of the work Jason Box is doing with his DarkSnow project in Greenland, looking at how soot from fires and industry is affecting the melting of Greenland. The paper concludes that the next 5–10 years will show us whether it was a one-off or the beginning of a new normal. Unless we stop burning carbon, it will only be the start of a terrifying new normal.

Playing the Emissions on Shuffle

What do the emission reductions of industrialised nations look like when you count the imports manufactured overseas?

WHO: Glen P. Peters, Center for International Climate and Environmental Research, Oslo, Norway
Jan C. Minx, Department for Sustainable Engineering, and Department for the Economics of Climate Change, Technical University Berlin, Germany
Christopher L. Weber, Science and Technology Policy Institute, Washington, Civil and Environmental Engineering, Carnegie Mellon University, Pittsburgh, USA
Ottmar Edenhofer, Department for the Economics of Climate Change, Technical University Berlin, Potsdam Institute for Climate Impact Research, Potsdam, Germany

WHAT: Measuring the transfer of CO2 emissions through international trade

WHEN: 24 May 2011

WHERE: Proceedings of the National Academy of Sciences (PNAS) vol. 108 no. 21, May 2011

TITLE: Growth in emission transfers via international trade from 1990 to 2008 (open access)

These researchers have found a problem with the way we count carbon emissions. Countries tend to count emissions from industries that emit within their own territorial borders, which means that counted emissions in the developing world have kept going up, while emissions in the developed world (or first world) have either flattened or dropped, depending on how much your government likes to admit the reality of climate change.

However, most of the emissions from the developing world are to produce goods for places like North America and Europe. So these researchers wanted to work out exactly how much international trade contributed towards global emissions increasing by 39% from 1990 to 2008. Was the increase in emissions due to development in countries like China, or was it a case of wealthy countries just shuffling their manufacturing emissions to another country and continuing to increase consumption rates?

As you might guess (spoiler alert) it’s the latter. Turns out all we’ve been doing is moving most of our industrial manufacturing emissions to developing countries and importing the products back, allowing everyone to say ‘yay, we reduced emissions!’ while the actual amount of carbon being burned continues to increase.

Growth in emissions transferred around the globe – dumping our responsibility on other countries (from paper)

But don’t take my word for it – what does the paper say?

The researchers took the global economy and broke it down into 113 regions covering 95 individual countries and 57 economic sectors. They then looked at all the national and international data they could get on supply chain emissions for producing goods between 1990 and 2008, as well as doing extra detailed analysis for the years 1997, 2001 and 2004. They called it a Time Series with Trade, and it was based on GDP, bilateral trade and emissions statistics (all of which you can generally find at your national statistics office online). The only thing they left out of their analysis was emissions from land use change, because there wasn't enough data for them to thoroughly analyse it.

They found that global CO2 emissions from exported goods rose from 4.3 Gigatonnes (Gt) in 1990 to 7.8 Gt of CO2 in 2008, with a big increase in the decade up to 2008. Exports have increased their share of global emissions from 20% to 26% and grew on average by 4.3% per year, which was faster than the global population grew (1.4%), faster than total global CO2 emissions grew (2%) and faster than global GDP grew (3.6%).

The only thing that export emissions didn’t grow faster than was the dollar value of all that trade, which increased by 12% each year. So not only are all those new iPhones costing you a lot of money (and making Apple super wealthy), they’re also burning a lot of carbon.

But the thing the paper points out is that international trade has simply shifted the location of the emissions rather than reducing them – shuffling them around the planet to avoid counting them. The researchers estimate that the transfer of emissions from wealthy countries to developing countries has grown by 17% per year, from 0.4 Gt of CO2 in 1990 to 1.6 Gt in 2008.

This is an issue, because it means that all of the countries that signed on to Kyoto to reduce their carbon emissions – most of which promised around 0.7 Gt CO2 reduction per year – have simply shifted those emissions through trade to make them someone else’s problem, while continuing to consume stuff at an ever increasing rate.

More and more stuff (epSos, flickr)

The researchers point out that while China is currently the world’s largest emitter of carbon emissions, with the USA at number two, if you counted consumption emissions (meaning you made the USA count the emissions for all the stuff they use that’s made in China), they’d swap places and the USA would be the world’s largest emitter.

This makes sense if you think it through – have a look around your house at everything that’s made in China. All of that carbon that China is burning, which is destroying their air quality and polluting their cities and people; all of that is to make stuff for you to consume.

If you count the consumption emissions, the emissions reduction of 3% from the developed world becomes an emissions growth of 11%. Oops. Also, the researchers point out that emissions reductions in wealthy countries are often exceeded by the growth of trade emissions.
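The accounting shift the researchers describe boils down to a simple identity: consumption-based emissions = territorial emissions − emissions embodied in exports + emissions embodied in imports. Here's a minimal sketch with made-up figures (not numbers from the paper):

```python
def consumption_emissions(territorial, exported, imported):
    """Reassign traded emissions from the producer to the consumer."""
    return territorial - exported + imported

# Illustrative (made-up) figures in Gt CO2 for two trading partners.
wealthy = {"territorial": 6.0, "exported": 0.5, "imported": 1.6}
developing = {"territorial": 7.0, "exported": 1.6, "imported": 0.5}

wealthy_c = consumption_emissions(**wealthy)
developing_c = consumption_emissions(**developing)

# The global total is unchanged -- only the attribution moves.
total_territorial = wealthy["territorial"] + developing["territorial"]
assert abs((wealthy_c + developing_c) - total_territorial) < 1e-9

print(f"Wealthy: {wealthy['territorial']} Gt territorial -> {wealthy_c:.1f} Gt consumption")
print(f"Developing: {developing['territorial']} Gt territorial -> {developing_c:.1f} Gt consumption")
```

On these toy numbers the wealthy country's tally rises from 6.0 to 7.1 Gt while the developing country's falls from 7.0 to 5.9 Gt – exactly the kind of swap the paper describes for the USA and China.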

Emission reductions vs emissions transferred to developing world. Annex B: developed world, non-Annex B: developing world (from paper)

So what does this mean, other than the fact that everyone is trying to avoid having to clean up their own mess?

It means there's a problem with the way we count emissions from trade vs emissions from consumption. It also means that we're currently failing to reduce our carbon emissions in any significant way, which puts us on a straight path to 4, 5 or 6°C of global warming, otherwise known as the next mass extinction.

One Size Doesn’t Fit All

Looking at the Nationally Appropriate Mitigation Actions being undertaken by developing countries with the UNFCCC.

WHO: Xander van Tilburg, Sophy Bristow, Frauke Röser, Donovan Escalante, Hanna Fekete, MitigationMomentum Project, Energy research Centre of the Netherlands (ECN)

WHAT: An update on the progress of the NAMA projects under the UNFCCC process

WHEN: June 2013

WHERE: Published on their website MitigationMomentum.org

TITLE: Status Report on Nationally Appropriate Mitigation Actions (NAMAs) (open access)

This week, the UNFCCC (United Nations Framework Convention on Climate Change) is meeting in Bonn to try and make some more progress towards action on climate change (yay!). This paper is one of those released to coincide with the negotiations, and I thought it would be interesting to look at what actually happens on the ground in relation to the high-level negotiations. There will be lots of acronyms, so bear with me and I'll try to get us there in English.

What is a NAMA?

NAMAs are not this guy (photo: Tamboko the Jaguar, flickr)

Did you say Llama? No, NAMA… In true bureaucratic style, the UN came up with the really forgettable name of NAMA for Nationally Appropriate Mitigation Actions, which can also be called ‘correctly fitting climate jeans’ or even ‘different places are different’. I want to keep calling them llamas (because I lack that maturity) but I promise not to make any bad llama jokes. The main thing you need to remember is that climate mitigation in Alaska is obviously going to be different to climate mitigation in Indonesia because they’re very different locations and economies.

The idea with NAMAs is a bottom-up approach to the UNFCCC negotiations and the ideas that come out of them (they're not just talk-fests – they have program and policy ideas too!).

Because the wealthy industrialised countries (also called the First World, Global North or Annex 1 in UN speak) are mostly responsible for the emissions causing climate change, we are also then more responsible for the cleanup. So in 2007 at the Conference of the Parties (COP 13) in Bali, it was decided that NAMA projects should be created by developing countries for mitigation projects they’d like to do which would be funded by industrialised countries.

The projects need to be related to sustainable development and are supported through technology, financing and capacity building (training local people). The people running the projects also report back to the UNFCCC so that progress can be monitored (like any project). The first group of NAMAs were submitted to the UNFCCC Secretariat at the Copenhagen COP 15 in 2009.

NAMA projects are only conducted in developing countries because the idea is that it’s going to be easier for those countries to change the way they’re developing towards a low carbon economy, rather than just following in the full carbon burning footsteps of the industrialised world and then having to retrofit low carbon alternatives.

So if they're going to try and build it right the first time round, what do they do? First, the country comes up with a feasibility study – what do they want to do and is it possible? If it is possible, then they develop the concept to present to the UNFCCC for funding. The concept has to have a mitigation objective, be clear about who is running the project, and have support from the government of the country.

Once they’ve worked out what they’re doing, they start the preparation phase where they work out the costs, the specific support they need to pull off the project and an estimate of how much carbon emissions will be reduced through the project.

Finally, they start the implementation of the project, which is my favourite bit – getting on the ground and getting it done.

NAMA projects by stage (left) and location (right) (from paper)

So far, €100 million has been provided to NAMA projects, and a NAMA facility was launched in December 2012 to help the projects with financial and technical support. Most of the projects are related to energy supply, and the majority of them (56%) are based in Latin America.

The funding agreed to was from 2010 until 2012, so a long term financing arrangement will need to be made at this year’s talks, but I think it’s really exciting to see the tangible reality of what the UNFCCC is trying to do.

The first two NAMA projects submitted were from Ethiopia (shifting freight to electric rail) and Mali (energy efficiency and renewable energy supply).

So far, five projects have advanced far enough to receive funding. The projects are between 3–5 years in length, need between €5 million and €15 million in funding, and should be able to start quickly (within 3–12 months) after applying.

The five projects are:

  1. Small scale renewable energy projects in Northern Sumatra, Indonesia with a feed-in tariff for independent power producers (IPPs)
  2. A project to stimulate investment in renewable energy systems in Chile
  3. Waste to energy systems using agricultural waste in Peru (with different approaches tailored to different geographic locations)
  4. Energy conservation and efficiency standards for the building sector in Tunisia
  5. A geothermal energy project in Kenya

There are still details and processes that need to be worked out as the NAMA program progresses, given that one size never fits all for climate mitigation and renewable energy generation. But I really like the idea of locally developed projects that suit the challenges different countries face being implemented on the ground, supported at a high level from the UNFCCC.

Climate Question: Do We Get to Keep Flying?

An analysis of jet fuel alternatives that could be viable in the next decade.

WHO: James I. Hileman, Hsin Min Wong, Pearl E. Donohoo, Malcolm A. Weiss, Ian A. Waitz, Massachusetts Institute of Technology (MIT)
David S. Ortiz, James T. Bartis, RAND Corporation Environment, Energy and Economic Development Program

WHAT: A feasibility study of alternatives to conventional jet fuel for aircraft.

WHEN: 2009

WHERE: Published on both the MIT website and RAND Corporation website

TITLE: Near-Term Feasibility of Alternative Jet Fuels

Last week, I looked at how our transport systems could be carbon free by 2100 and was intrigued by the comment ‘hydro-processed renewable jet fuel made from plant oils or animal fats is likely to be the only biomass-based fuel that could be used as a pure fuel for aviation, but would require various additives in order to be viable as an aviation fuel’.

It made me wonder what was being done for airplane fuel alternatives, or do we not have any alternatives and will I have to give up visiting my family in Australia?

Any other options? (photo: Amy Huva 2013)

I came across this technical report by MIT and the RAND Corporation (apparently RAND stands for Research ANd Development) and sat down to read all 150 pages (you're welcome) to see what our options are for fuels we could feasibly use in the next decade.

The paper compared alternative fuels on the basis of compatibility with existing aircraft and infrastructure, production potential, production costs, lifecycle Greenhouse Gas (GHG) emissions, air quality emissions, merit of the fuel as jet fuel vs ground fuel and the maturity of the technology.

The researchers pointed out (quite rightly) that emissions from biofuels need to take into account the carbon emitted through land use changes, because if you clear-fell a forest to plant a biofuel crop, any carbon you've saved by not burning oil has just been invalidated by the carbon emitted from clear-felling the forest.

Deforestation: not helpful. (Image by: Aidenvironment, flickr)

There were five different fuel groups looked at:

  1. Conventional Petroleum
  2. Unconventional Petroleum
  3. Synthetic fuel from natural gas, coal or biomass
  4. Renewable oils
  5. Alcohols

The standard fuel used in North America for aviation is called Jet A and was used as the benchmark for the study. So what did they find?

Conventional Petroleum Fuels

Almost all Jet A fuel comes from crude oil and is kerosene based. The emissions from Jet A are 87.5g of CO2e (CO2 equivalent) per megajoule (MJ) of energy created (g CO2e/MJ). Of that 87.5g, 73.2g comes from the burning of the fuel and there can be a 7% variation on the amount emitted from refining depending on the quality of the crude oil used and the refining process.

The world consumption of jet fuel is estimated at 5 million barrels per day of oil. This number is really hard to wrap your head around, so let me quickly walk you through some math. A barrel of oil is 159 L, which means 5 million barrels per day is 795,000,000 L of oil burned each day. To deliver that volume of water, you would have to run a fire hose (359 L/minute) non-stop for over four years (yes, YEARS). We burn that much in one day.
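Running the conversion through code, with the barrel volume and fire-hose flow rate as given above:

```python
# Convert world jet fuel consumption into a fire-hose comparison.
BARRELS_PER_DAY = 5_000_000
LITRES_PER_BARREL = 159
HOSE_LITRES_PER_MIN = 359  # fire-hose flow rate used in the text

litres_per_day = BARRELS_PER_DAY * LITRES_PER_BARREL

# How long would the hose have to run to deliver one day's worth?
minutes_of_hose = litres_per_day / HOSE_LITRES_PER_MIN
years_of_hose = minutes_of_hose / (60 * 24 * 365.25)

print(f"{litres_per_day:,} L of jet fuel burned per day")
print(f"A fire hose would take {years_of_hose:.1f} years to deliver that volume")
```

That works out to 795,000,000 L per day, or roughly 4.2 years of continuous fire-hose flow for a single day's fuel burn.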

Given that a conventional fuel is already oil based and you can’t reduce those carbon emissions, the tweak for this paper was an Ultra Low Sulfur Jet A fuel, which would reduce sulfur emissions from burning the fuel.

While it's great to reduce the sulfur emissions that cause acid rain, the extra refining needed upped the lifecycle emissions to 89 g CO2e/MJ.

Unconventional Petroleum Fuels

Unconventional fuels are things like the Canadian tar sands (or oil sands if you’re their PR people) and Venezuelan Heavy Oil. These oils are dirtier and require more refining to be made into jet fuel. They also require more effort to get out of the ground, and so the lifecycle emissions are 103g CO2e/MJ (with an uncertainty of 5%). The upstream emissions of sourcing and refining the fuel are what add the extra – burning the fuel has the same emissions as Jet A, and the upstream emissions range from 16g CO2e/MJ for open cut mining to 26g CO2e/MJ for in-situ mining.

You can also get shale oil through the process of fracking and refine it to Jet A. Shale based Jet A also burns the same as Jet A, but the extraction emissions are a whopping 41g CO2e/MJ which is double the tar sands extraction emissions, giving an overall 114.2g CO2e/MJ lifecycle emissions.

Fischer-Tropsch Synthetic Fuels

These are fuels derived through the catalysed Fischer-Tropsch process and then refined into a fuel. These fuels are good because they have almost zero sulfur content (and therefore almost zero sulfur emissions). They don't work as a 100% fuel without an engine refit because of the different aromatic hydrocarbon content, and the energy density is 3% less than Jet A (meaning you'd need 3% more fuel in the tank to go the same distance as Jet A fuel). However, it does combine easily to make a 50/50 blend for planes.

You can make FT synthetic fuel from natural gas, which gives you 101 g CO2e/MJ; from coal, which gives you between 120–195 g CO2e/MJ and relies on carbon capture and storage as a technical fix; or from biomass, which has almost zero lifecycle emissions ONLY if you use a waste product as the source and don't contribute to further land use changes.
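To see how the options quoted so far stack up, here's a quick comparison against the Jet A baseline (the g CO2e/MJ figures are the ones quoted above; for coal I've taken the top of the 120–195 range, and the percentages are just arithmetic on those figures):

```python
# Lifecycle GHG intensity of the fuels discussed, in g CO2e per MJ.
JET_A = 87.5  # conventional Jet A baseline

lifecycle = {
    "Ultra Low Sulfur Jet A": 89.0,
    "Tar sands / heavy oil": 103.0,
    "Shale oil": 114.2,
    "FT from natural gas": 101.0,
    "FT from coal (no CCS)": 195.0,  # top of the 120-195 range quoted above
}

for fuel, intensity in lifecycle.items():
    change = (intensity / JET_A - 1) * 100
    print(f"{fuel:24s} {intensity:6.1f} g CO2e/MJ ({change:+.0f}% vs Jet A)")
```

The punchline: every fossil alternative discussed so far is the same as or dirtier than Jet A, from +2% for ultra-low-sulfur fuel up to more than double for coal-derived FT fuel without carbon capture.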

Renewable Oils

These are biodiesel or biokerosene which can be made from soybean oil, canola oil, palm oil, coconut oil, animal fats, waste products or halophytes and algae.

Because this paper was looking at fuels that could be commercially used in the next 10 years, they looked at a 5% blend with Jet A fuel to meet freeze point requirements (most renewable oils freeze at too high a temperature for the altitude planes fly at). They found too many safety and freezing point issues with biodiesel or biokerosene, so didn’t calculate the emissions from them as they’re not practical for use.

Another renewable oil is Hydroprocessed Jet Fuel entertainingly sometimes called ‘Syntroleum’. This is made from plant oils, animal fats or waste grease. Soybean oil without land use emissions would have only 40-80% of the emissions of Jet A, while palm oil would have 30-40% the emissions of Jet A.

Alcohol Fuels

The paper looked at using ethanol (the alcohol we drink) and butanol as replacement fuels. They both had lower energy densities than Jet A, higher volatility (being flammable and explosive) and issues with freezing at cruising altitude. While butanol is slightly safer to use as a jet fuel than ethanol, the report suggests it's better used as a ground transport fuel than a jet fuel (I assume the better use of ethanol as a drink is implied).

Options for jet fuel alternatives (from paper)

After going through all the options, the researchers found that the three main options for alternative fuels that could be commercially implemented over the next decade are:

  1. Tar sands oil
  2. Coal-derived FT Synthetic oil
  3. Hydroprocessed Renewable jet fuel

They recommended that when looking to reduce emissions from the transport sector, aviation shouldn't be treated any differently. While strongly recommending that land use changes be taken into account for biofuels, they also pointed out that aviation use should be weighed carefully, as limited biofuel resources may be more effective at producing heat and power than at powering transport.

Personally, I don't find the report very heartening, given that the first two options involve either dirtier oil or really dirty coal when what we need to be doing is reducing our emissions, not changing the form they're in and still burning them. I'll be keeping my eye out for any new research into hydroprocessed renewable jet fuels that could use waste products or algae – given the speed at which the oceans are acidifying, there could be a lot of ocean dead zones that are really good at growing algae that could be used as jet fuel.

But until then, it looks like there aren't many options for the continuation of air travel once we start seriously reducing our emissions – flights will be a really quick way to burn through our remaining carbon budget.

Your Transport – Carbon Free in 2100

Detailed scenarios looking at how all transport of people and goods can be zero carbon by 2100

WHO: L.D.D. Harvey, Department of Geography, University of Toronto, Canada

WHAT: Scenarios across all sectors of transport for people and goods and how they can be zero carbon by the year 2100

WHEN: March 2013

WHERE: Energy Policy, Vol. 54

TITLE: Global climate-oriented transportation scenarios (subs req.)

We need to decarbonise our economy, but what does that actually look like? What do our transit and transport systems look like with zero carbon? Are we all going back to the horse and cart? I don’t think my apartment can fit a horse!

This very very detailed paper from the University of Toronto looked at what might happen, and the general gist of it is that first we need to work really hard to increase the efficiency of all our transport. Once we’ve gotten the energy intensity as low as possible on everything, we need to switch the remaining energy requirements over to different fuel sources (bio fuels, hydrogen fuel cells, electric).

For this paper, the globe was divided into ten socio-economic regions with different per capita incomes, activity levels, energy intensities, and potential for future population growth, income growth and energy use. Each region was then analysed for per capita travel by light duty vehicles (cars, SUVs, pickup trucks), air, rail and other modes of transport. To further complicate the calculations, both low growth and high growth scenarios were examined.

The data was baselined at 2005 and extrapolated out to 2100. If this kind of large scale number crunching really gets you going, all the spreadsheets the researcher used are available online here (Climate-OrientedTransportScenarios) for you to run your own zero carbon transport scenarios (thanks to Dr. Harvey for making this available open access).

Energy demand scenarios (from paper)

Interestingly, growth in per capita travel relative to GDP growth has halted in several industrialised countries, which makes sense when you think about it – beyond a certain point you end up with more money to travel than time to do it in.

In terms of climate change, the paper assumes we’re able to stabilise the CO2 concentration in the atmosphere at 450ppm. The paper also talks a lot about peak oil and the effect it could have on resource prices and the availability of fossil fuels as fuel. Given that we need to leave 80% of the known fossil fuel reserves on the planet in the ground, I’m not so sure how much effect peak oil may have, but you never know – we could be suicidal enough to try and burn all of it.

Cars

Improvements need to be made in reducing the weight of cars and improving engine efficiency and aerodynamics. Passenger space will increase so we can transport more people per car, air conditioning becomes more efficient (and necessary in some places because of climate change) and hybrid electric cars replace fossil fuel cars for urban driving. Fuel consumption drops from 10.4L/100km in 2005 to 1-2L/100km (of a biofuel) in 2100.
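
A quick sanity check on those fuel consumption numbers (a sketch in Python; the 1.5 L/100km figure is just my midpoint of the paper's 1-2 L/100km range):

```python
# Rough arithmetic on the drop in car fuel consumption quoted above.
consumption_2005 = 10.4  # L/100km, 2005 fleet average from the paper
consumption_2100 = 1.5   # L/100km, midpoint of the paper's 1-2 range

improvement_factor = consumption_2005 / consumption_2100
print(round(improvement_factor, 1))  # roughly a 7x improvement
```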

While I was really hoping the paper would tell me of the demise of ugly giant pickup trucks, sadly it looks like we may keep them and they’ll become hydrogen fuel cell monster trucks.

Buses

Buses will increase engine efficiency and ridership. Many buses are already diesel or electric, but diesel engine efficiency will rise to around 50%, and hydrogen fuel cell buses will reach 60% engine efficiency.

Passenger Rail

Trains will be electrified where they can be, and efficient diesel (becoming biofuel) where they can’t be electrified.

Air

The efficiency of planes is expected to increase by 20% from 2000 to 2020, with a 1% efficiency gain every year after that. The International Civil Aviation Organisation (ICAO) has already announced it's aiming for 2% per year efficiency gains to 2050, so this one isn't too far from reality. However, the paper points out that this will probably require a radical change in aircraft design, and a possible switch to plant oil or animal fat biomass-based fuels beyond that.
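
Those per-year gains compound, which is worth seeing in numbers (a sketch; the 2020 baseline for the ICAO figure is my assumption, not stated in the article):

```python
# Compounding the aviation efficiency figures quoted above.
# The paper: +20% over 2000-2020, then 1% per year after that.
# ICAO target: 2% per year, here assumed to run 2020-2050.
def efficiency_multiplier(rate, years):
    """Cumulative efficiency multiplier after compounding `rate` for `years`."""
    return (1 + rate) ** years

paper_2050 = 1.20 * efficiency_multiplier(0.01, 30)  # paper's path, 2000-2050
icao_2050 = efficiency_multiplier(0.02, 30)          # ICAO target, 2020-2050

print(round(paper_2050, 2), round(icao_2050, 2))  # ~1.62 vs ~1.81
```

So the paper's assumed path reaches roughly a 1.6x efficiency multiplier by 2050, while the steeper ICAO target implies about 1.8x over its own 30-year run.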

Freight

Freight trains need to reduce their weight, improve their engine efficiency, develop diesel-electric hybrid drive trains and get clever about load configuration to maximise efficiency. The energy requirement of tractors and other long haul trailers also needs to be reduced.

Marine freight is an interesting one. The paper points out that the majority of the world's shipping is currently oil, coal and other bulk materials like iron ore – none of which will need to be shipped anywhere in a zero carbon world. Mostly, marine freight will reduce the energy intensity of ships, and future transport will be made up of 60% container ships, 20% bulk ships, 10% general cargo ships and 10% biofuel supertankers.

Green Scenarios

The paper also looks at some ‘Green Scenarios’ which are the ones where we actually get ourselves into gear seriously to decarbonise (and hopefully stop having the endless debate about whether climate change is ‘real’).

The green scenarios assume further reductions in total passenger travel, with truck and air travel shifted to rail and other modes. There's also an extra 20% decrease in global freight, which makes me hope people become more minimalist and have less junk in this future scenario (I can dream!).

Initially, the greatest demand for biofuels comes from cars, but by 2035 freight is the biggest biofuel user, so maybe we've also become cleverer about planning urban areas with density and rapid transit? (I think I like this future planet!)

Fuel demand scenarios (from paper)

The paper concludes that we need new urban development with higher density, more walkable, bikable and transit friendly options as well as making energy intensity reductions in all forms of transport and then switching the remaining fossil fuels to hydrogen or biofuel. This will go hand in hand with engine efficiency increases as well as battery technology improvements.

The key thing I took away from this paper is that we need to be doing ALL of this. We can't just drive an electric car and still have our books from Amazon.com shipped here on an old, inefficient cargo ship belching fossil fuels. We also can't fix one single transport sector and wash our hands of it, saying 'there – I fixed climate change!'

Climate change will affect everything, regardless of whether we actually do something about it or not. So we need to change the way we do everything to do it without carbon.

Much ado about phosphorus

‘Life can multiply until all the phosphorus has gone and then there is an inexorable halt which nothing can prevent’ – Isaac Asimov, 1974

WHO: K. Ashley, D. Mavinic, Department of Civil Engineering, Faculty of Applied Science, University of British Columbia, Vancouver, BC, Canada
D. Cordell, Institute for Sustainable Futures, University of Technology, Sydney, Australia

WHAT: A brief history of phosphorus use by humans and ideas on how we can prevent the global food security risk of ‘Peak Phosphorus’

WHEN: 8 April 2011

WHERE: Chemosphere Vol. 84 (2011) 737–746

TITLE: A brief history of phosphorus: From the philosopher’s stone to nutrient recovery and reuse (subs req.)

Phosphorus can be found on the right hand side of your periodic table on the second row down underneath Nitrogen. It’s one of those funny elements that we all need to live and survive and grow things, but is also highly reactive, very explosive and toxic.

It’s in our DNA – in the AGCT bases that connect to form the double helix structure of DNA, the sides of the ladder are held together by phosphodiester bonds. Phosphorus is literally helping to hold us together.

Phosphodiester bonds in DNA (from Introduction to DNA structure, Richard B. Hallick, U Arizona)

Phosphorus can be pretty easily extracted from human urine, which was what German alchemist Henning Brandt did in the 1660s in an attempt to create the Philosopher’s Stone which would be able to turn base metals into gold. No seriously, apparently he was committed enough to the idea to distill 50 buckets of his own pee to do this!

What do alchemy, DNA, and human pee have to do with a scientific paper? Well these researchers were looking at how we’ve previously used phosphorus, why it is that we’re now running out of it and what we can learn from history to try and avoid a global food security risk.

Phosphorus comes in three forms – white, black and red. The phosphorus mined for fertilizer today is apatite rock containing P2O5, which has generally taken 10-15 million years to form. However, in traditional short term human thinking, the fact that the rocks take that long to form didn't stop people from mining them and treating phosphorus as an 'endless' resource (just like oil, coal, forests, oceans etc.).

The paper states that originally phosphorus was used for 'highly questionable medicinal purposes', but doesn't detail what kinds of whacky things it was used for (boo!). Given the properties of white phosphorus – it's highly reactive and flammable when exposed to air, can spontaneously combust and is poisonous to humans – the mind boggles as to what 'medicinal' uses phosphorus had.

The major use of phosphorus is as an agricultural fertilizer, a need that was met pre-industrialisation through the recycling of human waste and sewage. However, with 2.5 million people living in Victorian-era London, the problem of excess human waste became unmanageable and led to all kinds of nasty things like cholera and the 'Great Stink' of the Thames in 1858, which was so bad that it shut down Parliament.

This led to what was called the ‘Sanitary Revolution’ aka the invention of flush toilets and plumbing on a large scale. This fundamentally changed the phosphorus cycle – from a closed loop of localised use and reuse to a more linear system as the waste was taken further away.

After the Second World War, the use of mined mineral phosphorus really took off – use of phosphorus as a fertilizer rose sixfold between 1950 and 2000 – and modern agricultural processes are now dependent on phosphorus based fertilizers. This has led to major phosphorus leakage into waterways and oceans from agricultural runoff, creating eutrophication and ocean deadzones.

Eutrophication in the sea of Azov, south of the Ukraine (SeaWiFS Project, NASA/Goddard Space Flight Center, and ORBIMAGE)

The problem here is that we've switched from a closed loop system, where waste from the farm house goes into the farm yard and all the phosphorus is recycled, to a linear system where phosphorus gets mined, used as fertilizer, and largely runs off into the ocean. It's not even a very efficient system – only a fifth of the phosphorus mined for food production actually ends up in the food we eat.

The problem we're now facing is the long term ramifications of this new system: phosphorus has become a scarce global resource, and we've been forced to start mining rocks that have lower quality phosphorus and higher rates of contaminants, and are more difficult to access. We're down to the tar sands equivalent of minable phosphorus, most of which is found in only five countries: Morocco, China, the USA, Jordan and South Africa. Maybe they can be the next OPEC-style cartel, for phosphorus?

Peak phosphorus is likely to happen somewhere between 2030 and 2040, which is where the scary link to climate change comes in. The researchers cheerfully call phosphorus shortages the ‘companion show-stopper to climate change’, by which they mean that soils will start to run out of the nutrients they need at about the same time that extended droughts from climate change will be diminishing crop yields and we’ll have about 9 billion people to scramble to feed.

Basically, a phosphorus shortage is something that we can easily avoid through better and more efficient nutrient recycling, but it’s something that will kick us in the ass once we’re already struggling to deal with the consequences of climate change. The paper states that we need to start re-thinking our ‘western style’ of sewage treatment to better recover water, heat, energy, carbon, nitrogen and phosphorus from our waste systems. This doesn’t mean (thankfully) having to return to a middle ages style of living – it means having cities that are innovative enough about their municipal systems (I was surprised to find out that sewage treatment is one of the most expensive and energy intensive parts of public infrastructure).

The False Creek Neighbourhood Energy Utility in Vancouver

In Vancouver, we’re already starting to do that with the waste cogeneration system at Science World and the False Creek Neighbourhood Energy Utility that produces energy from sewer heat.

It’s pretty logical; we need to re-close the loop on phosphorus use and we need to do it sensibly before our failure to stop burning carbon means ‘Peak Phosphorus’ becomes the straw that breaks the camel’s proverbial back.

Pandora’s Permafrost Freezer

What we know about permafrost melt is less than what we don’t know about it. So how do we determine the permafrost contribution to climate change?

WHO: E. A. G. Schuur, S. M. Natali, C. Schädel, University of Florida, Gainesville, FL, USA
B. W. Abbott, F. S. Chapin III, G. Grosse, J. B. Jones, C. L. Ping, V. E. Romanovsky, K. M. Walter Anthony University of Alaska Fairbanks, Fairbanks, AK, USA
W. B. Bowden, University of Vermont, Burlington, VT, USA
V. Brovkin, T. Kleinen, Max Planck Institute for Meteorology, Hamburg, Germany
P. Camill, Bowdoin College, Brunswick, ME, USA
J. G. Canadell, Global Carbon Project CSIRO Marine and Atmospheric Research, Canberra, Australia
J. P. Chanton, Florida State University, Tallahassee, FL, USA
T. R. Christensen, Lund University, Lund, Sweden
P. Ciais, LSCE, CEA-CNRS-UVSQ, Gif-sur-Yvette, France
B. T. Crosby, Idaho State University, Pocatello, ID, USA
C. I. Czimczik, University of California, Irvine, CA, USA
J. Harden, US Geological Survey, Menlo Park, CA, USA
D. J. Hayes, M. P.Waldrop, Oak Ridge National Laboratory, Oak Ridge, TN, USA
G. Hugelius, P. Kuhry, A. B. K. Sannel, Stockholm University, Stockholm, Sweden
J. D. Jastrow, Argonne National Laboratory, Argonne, IL, USA
C. D. Koven, W. J. Riley, Z. M. Subin, Lawrence Berkeley National Lab, Berkeley, CA, USA
G. Krinner, CNRS/UJF-Grenoble 1, LGGE, Grenoble, France
D. M. Lawrence, National Center for Atmospheric Research, Boulder, CO, USA
A. D. McGuire, U.S. Geological Survey, Alaska Cooperative Fish and Wildlife Research Unit, University of Alaska, Fairbanks, AK, USA
J. A. O’Donnell, Arctic Network, National Park Service, Fairbanks, AK, USA
A. Rinke, Alfred Wegener Institute, Potsdam, Germany
K. Schaefer, National Snow and Ice Data Center, Cooperative Institute for Research in Environmental Sciences, University of Colorado, Boulder, CO, USA
J. Sky, University of Oxford, Oxford, UK
C. Tarnocai, AgriFoods, Ottawa, ON, Canada
M. R. Turetsky, University of Guelph, Guelph, ON, Canada
K. P. Wickland, U.S. Geological Survey, Boulder, CO, USA
C. J. Wilson, Los Alamos National Laboratory, Los Alamos, NM, USA
S. A. Zimov, North-East Scientific Station, Cherskii, Siberia

WHAT: Interviewing and averaging the best estimates by world experts on how much permafrost in the Arctic is likely to melt and how much that will contribute to climate change.

WHEN: 26 March 2013

WHERE: Climatic Change, Vol. 117, Issue 1-2, March 2013

TITLE: Expert assessment of vulnerability of permafrost carbon to climate change (open access!)

We are all told that you should never judge a book by its cover, however I’ll freely admit that I chose to read this paper because the headline in Nature Climate Change was ‘Pandora’s Freezer’ and I just love a clever play on words.

So what’s the deal with permafrost and climate change? Permafrost is the solid, permanently frozen dirt/mud/sludge in the Arctic that often looks like cliffs of chocolate mousse when it’s melting. The fact that it’s melting is the problem, because when it melts, the carbon gets disturbed and moved around and released into the atmosphere.

Releasing ancient carbon into the atmosphere is what humans have been doing at an ever greater rate since we worked out that fossilised carbon makes a really efficient energy source, so when the Arctic starts doing that as well, it’s adding to the limited remaining carbon budget our atmosphere has left. Which means melting permafrost has consequences for how much time humanity has left to wean ourselves off our destructive fossil fuel addiction.

Cliffs of chocolate mousse (photo: Mike Beauregard, flickr)

How much time do we have? How much carbon is in those cliffs of chocolate mousse? We're not sure, and that's a big problem. Recent research estimates there could be as much as 1,700 billion tonnes of carbon stored in permafrost in the Arctic, much higher than earlier estimates from research in the 1990s.

To give that very large number some context, 1,700 billion tonnes can also be called 1,700 Gigatonnes, which should ring a bell for anyone who read Bill McKibben's Rolling Stone global warming math article. The article stated that the best current estimate of the carbon budget that gives humanity a shot at keeping global average temperatures below a 2°C increase is 565Gt. So if all the permafrost melted, we'd have blown that budget roughly three times over.
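
The arithmetic there is worth spelling out (a trivial sketch using the two figures quoted above):

```python
# Comparing the permafrost carbon store with the carbon budget quoted above.
permafrost_store = 1700  # Gt, upper estimate of carbon stored in Arctic permafrost
carbon_budget = 565      # Gt, the budget cited in McKibben's article

print(round(permafrost_store / carbon_budget, 1))  # ~3.0 times the budget
```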

What this paper did was ask the above long list of experts on soil, carbon in soil, permafrost and Arctic research three questions over three different time scales.

  1. How much permafrost is likely to degrade? (aka quantitative estimates of surface permafrost degradation)
  2. How much carbon will it likely release?
  3. How much methane will it likely release?

They included the methane question because methane has short term ramifications for the atmosphere. Methane 'only' stays in the atmosphere for around 100 years (compared to carbon dioxide's 1,000-plus years), but it has 33 times the global warming potential (GWP) of CO2 over a 100 year period. So for the first hundred years after you've released it, one tonne of methane is as bad as 33 tonnes of CO2. This could quickly blow our carbon budgets as we head merrily past 400 parts per million of CO2 in the atmosphere from human forcing.
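
Translating methane into CO2 equivalents is just a multiplication by the GWP; a minimal sketch using the 33x figure quoted above:

```python
# CO2-equivalent conversion for methane over a 100-year horizon.
GWP_CH4_100YR = 33  # global warming potential of methane, as quoted above

def co2_equivalent(methane_gt):
    """Gt of methane expressed as Gt of CO2e over 100 years."""
    return methane_gt * GWP_CH4_100YR

print(co2_equivalent(1.0))  # 1 Gt of methane counts as 33 Gt CO2e
```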

The time periods for each question were: by 2040 with a 1.5-2.5°C Arctic temperature rise (the Arctic warms faster than lower latitudes); by 2100 with a 2.0-7.5°C rise (so from 'we can possibly deal with this' to 'catastrophic climate change'); and by 2300 with temperatures stable after 2100.

The estimates the experts gave were then screened for level of expertise (you don't want to be asking an atmospheric specialist the soil questions!) and averaged to give an estimate range. For surface loss of permafrost under the highest warming scenario, the results were:

  1. 9-16% loss by 2040
  2. 48-63% loss by 2100
  3. 67-80% loss by 2300

Permafrost melting estimates for each time period over four different emissions scenarios (from paper)

Ouch. If we don’t start doing something serious about reducing our carbon emissions soon, we could be blowing that carbon budget really quickly.

For how much carbon the highest warming scenario may release, the results were:

  1. 19-45 billion tonnes (Gt) CO2 by 2040
  2. 162-288Gt CO2 by 2100
  3. 381-616Gt CO2 by 2300

Hmm. So if we don’t stop burning carbon by 2040, melting permafrost will have taken 45Gt of CO2 out of our atmospheric carbon budget of 565Gt. Let’s hope we haven’t burned through the rest by then too.

However, if Arctic temperature rises were limited to 2°C by 2100, the CO2 emissions would 'only' be:

  1. 6-17Gt CO2 by 2040
  2. 41-80Gt CO2 by 2100
  3. 119-200Gt CO2 by 2300

That's about a third of the highest warming estimates, but still nothing to breathe a sigh of relief at, given that the 2000-2010 average annual rate of fossil fuel burning was 7.9Gt per year. So even the low warming estimate has permafrost releasing more than two years' worth of global emissions, meaning we'd have to stop burning carbon two years earlier.
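
As a rough check on that 'two years' worth' claim (taking 17 Gt, the top of the 2040 range under the lower warming scenario, as my reading of the low estimate):

```python
# Years of global fossil fuel emissions represented by 2040 permafrost CO2.
annual_emissions = 7.9  # Gt CO2 per year, 2000-2010 average quoted above
permafrost_2040 = 17    # Gt CO2 by 2040, low-warming scenario upper bound

print(round(permafrost_2040 / annual_emissions, 1))  # ~2.2 years of emissions
```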

When the researchers calculated the expected methane emissions, the estimates were low. However, when they calculated the CO2 equivalent (CO2e) of the methane (methane being 33 times more potent than CO2 over 100 years), they got:

  1. 29-60Gt CO2e by 2040
  2. 250-463Gt CO2e by 2100
  3. 572-1004Gt CO2e by 2300

Thankfully, most of the carbon in the permafrost is expected to be released as the less potent carbon dioxide, but working out the balance between how much methane may be released into the atmosphere vs how much will be carbon dioxide is really crucial for working out global carbon budgets.

The other problem is that most climate models that look at permafrost contributions to climate change do so in a linear manner, where increased temperatures lead directly to more microbial and bacterial activity and the carbon is released. In reality, permafrost is much more dynamic, non-linear and therefore unpredictable, which makes it a pain to put into models. It's really difficult to predict abrupt thaw processes (like the surface melt seen across 98% of Greenland last summer), where ice wedges can melt and the ground can collapse irreversibly.

These kinds of non-linear processes (the really terrifying bit about climate change) made the news this week when it was reported that the Alaskan town of Newtok is likely to wash away by 2017, making the townspeople the first climate refugees from the USA.

The paper points out that one of the key limitations to knowing exactly what the permafrost is going to do is the lack of historical permafrost data. Permafrost is in really remote, hard-to-reach places where people don't live, because the ground is permanently frozen. People haven't been going to these places and taking samples, unlike more populated areas with lengthy and detailed climate records. But if you don't know how much permafrost was historically there, you can't tell how fast it's melting.

The key point from this paper is that even though we’re not sure exactly how much permafrost will contribute to global carbon budgets and temperature rise, this uncertainty alone should not be enough to stall action on climate change.

Yes, there is uncertainty in exactly how badly climate change will affect the biosphere and everything that lives within it, but currently our options range from ‘uncomfortable and we may be able to adapt’ to ‘the next mass extinction’.

So while we’re working out exactly how far we’ve opened the Pandora’s Freezer of permafrost, let’s also stop burning carbon. 

Wind Power Kicks Fossil Power Butt

What if you ran the numbers for wind power replacing all fossil fuel and nuclear electricity in Canada? How could it work? How much would it cost?

WHO:  L.D. Danny Harvey, Department of Geography, University of Toronto, Canada

WHAT: Mapping and calculating the potential for wind electricity to completely replace fossil fuel and nuclear electricity in Canada

WHEN: February 1st, 2013

WHERE: Energy Vol. 50, 1 February 2013

TITLE: The potential of wind energy to largely displace existing Canadian fossil fuel and nuclear electricity generation (subs req.)

As a kid, I really loved the TV series Captain Planet. I used to play it in the school yard with my friends and I always wanted to be the one with the wind power. Mostly because my favourite colour is blue, but also because I thought the girl with the wind power was tough.

Go Planet! Combining the power of wind, water, earth, fire and heart (Wikimedia commons)

What’s my childhood got to do with this scientific paper? Well, what if you looked at the Canadian Wind Energy Atlas and worked out whether we could harness the power of wind in Canada to replace ALL fossil fuel and nuclear electricity? How would you do it? How much would it cost? That’s what this researcher set out to discover (in the only paper I’ve written about yet that has a single author!)

Refreshingly, the introduction to the paper has what I like to call real talk about climate change. He points out that the last time global average temperatures increased by 1°C, sea levels were 6.6-9.4m higher, which means 'clearly, large and rapid reductions in emissions of CO2 and other greenhouse gases are required on a worldwide basis'.

Electricity accounts for about 25% of global greenhouse gas emissions, and while there have been studies in the US and Europe looking at the spacing of wind farms to reduce variability for large scale electricity generation, no-one had looked at Canada yet.

So how does Canada stack up? Really well. In fact, the paper found that Canada has equivalent wind energy available for many times the current demand for electricity!

The researcher looked at onshore and offshore wind at 30m, 50m and 80m above the ground for each season to calculate the average wind speed and power generation. Taking into account the wake effect of other turbines, and eliminating areas that can't have wind farms – cities, mountains above 1,600m elevation (to avoid wind farms on the Rocky Mountains), shorelines (to avoid wind farms on your beach) and wetlands – the paper broke the Wind Energy Atlas map into cells.

For calculating wind farm potential there are generally three options: you can maximise the electricity production, maximise the capacity factor, or minimise the cost of the electricity. The paper looked at all three and found that the best overall option (which in some cases also gives a better average cost) was to aim for maximum capacity factor.

Using wind data and electricity demand data from 2007, the researcher ran the numbers. In 2007, the total capacity of fossil fuel and nuclear electricity was 49.0GW (gigawatts), generating 249.8TWh (terawatt hours). This is 40% of Canada's total national electricity capacity of 123.9GW, or 616.3TWh of generation.
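
Those two numbers imply an average capacity factor, which you can check in a couple of lines (a sketch; 8,760 is just the number of hours in a year):

```python
# Implied average capacity factor of Canada's 2007 fossil/nuclear fleet.
capacity_gw = 49.0       # GW of fossil fuel and nuclear capacity (2007)
generation_twh = 249.8   # TWh actually generated in 2007
hours_per_year = 8760

max_possible_twh = capacity_gw * hours_per_year / 1000  # GWh -> TWh
capacity_factor = generation_twh / max_possible_twh
print(round(capacity_factor, 2))  # ~0.58, i.e. the fleet ran at ~58% of capacity
```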

To deal with the issue of wind power being intermittent, the paper noted that there's already storage capacity for several years' worth of electricity through hydro in Quebec and Manitoba, as well as many other options for handling supply-demand mismatches (which this paper doesn't address), making a national wind electricity grid feasible.

To run the numbers, the country was split into 5 sectors. Starting with the sector with the greatest wind energy potential, the numbers were run until a combination was found where the wind energy in each sector met the national fossil fuel and nuclear requirements.

Wind farms required in each sector to provide enough electricity to completely replace the fossil fuel and nuclear power used in 2007 (from paper)

Once the researcher worked out that you could power the whole country’s fossil fuel and nuclear electricity with the wind energy from any sector, he looked at minimising costs and meeting the demand required for each province.

He looked at what size of wind farm would be needed, and then calculated the costs for infrastructure (building the turbines) as well as transmission (getting the electricity from the farm to the demand). Some offshore wind in BC, Hudson Bay, and Newfoundland and Labrador, combined with some onshore wind in the prairies and Quebec and that’s all we need.

The cost recovery for the investment in infrastructure was calculated over 20 years for the turbines and 40 years for the transmission lines. The paper found that minimising transmission line distance resulted in the largest wasted generation in winter but the smallest in summer; overall, however, the best method was to aim for maximising the capacity factor of the wind farms.

But the important question – how much would your power cost? On average, 5-7 cents per kWh (kilowatt hour), which is on par with the 7c/kWh that BC Hydro currently charges in Vancouver. Extra bonus – wind power comes without needing to mine coal or store radioactive nuclear waste for millions of years!

Estimated wind power costs for Canada (from paper)

Some more food for thought – the researcher noted that the estimated cost of coal fired electricity with (still unproven) carbon capture and storage technology is likely to be around 9c/kWh, while the current cost of nuclear generated electricity is between 10-23c/kWh. Also, the technical capacity factor of turbines is likely to increase as the technology rapidly improves, which will reduce the cost of producing wind electricity even further.

This is all great news – Canada has the wind energy and the potential to build a new industry to not only wean ourselves off the fossil fuels that are damaging and destabilising our atmosphere, but to export that knowledge as well. We can be an energy superpower for 21st Century fuels, not fossil fuels. I say let’s do it!

Crash Diets and Carbon Detoxes: Irreversible Climate Change

Much of the changes humans are causing in our atmosphere today will be largely irreversible for the rest of the millennium.

WHO: Susan Solomon, Chemical Sciences Division, Earth System Research Laboratory, National Oceanic and Atmospheric Administration, Boulder, Colorado, USA
Gian-Kasper Plattner, Institute of Biogeochemistry and Pollutant Dynamics, Zurich, Switzerland
Reto Knutti, Institute for Atmospheric and Climate Science, Zurich, Switzerland
Pierre Friedlingstein, Institut Pierre Simon Laplace/Laboratoire des Sciences du Climat et de  l’Environnement, Unité Mixte de Recherche à l’Energie Atomique – Centre National de la Recherche Scientifique–Université Versailles Saint-Quentin, Commissariat a l’Energie Atomique-Saclay, l’Orme des Merisiers, France

WHAT: Looking at the long term effects of climate pollution to the year 3000

WHEN: 10 February 2009

WHERE: Proceedings of the National Academy of Sciences of the USA (PNAS), vol. 106, no. 6 (2009)

TITLE: Irreversible climate change due to carbon dioxide emissions

Stopping climate change often involves the metaphor of ‘turning down the thermostat’ of the heater in your house; the heater gets left on too high for too long, you turn the thermostat back down, the room cools down, we are all happy.

This seems to also be the way many people think about climate change – we’ve put too much carbon pollution in the atmosphere for too long, so all we need to do is stop it, and the carbon dioxide will disappear like fog burning off in the morning.

Except it won’t. This paper, which is from 2009 but I came across it recently while reading around the internet, looks at the long term effects of climate change and found that for CO2 emissions, the effects can still be felt for 1,000 years after we stop polluting. Bummer. So much for that last minute carbon detox that politicians seem to be betting on. Turns out it won’t do much.

The researchers defined 'irreversible' in this paper as 1,000 years, taking their projections to just beyond the year 3000, because over a human life span 1,000 years is more than 10 generations. Geologically it's not forever, but from our human point of view it pretty much is forever.

So what’s going to keep happening because we can’t give up fossil fuels today that your great-great-great-great-great-great-great-great-great-great grandkid is going to look back on and say ‘well that was stupid’?

The paper looked at the three most detailed and well known effects: atmospheric temperatures, precipitation patterns and sea level rise. Other long term impacts will be felt through Arctic sea ice melt, flooding and heavy rainfall, permafrost melt, hurricanes and the loss of glaciers and snowpack. However, the impacts with the most detailed models and greatest body of research were the ones chosen for this paper (which also excluded the potential for geo-engineering because it’s still very uncertain and unknown).

Our first problem is going to be temperature increases, because temperatures rise as CO2 accumulates in the atmosphere. But even if we turned off emissions completely (practically unfeasible, but the cleanest way to model the long term effects), temperatures would stay within about 0.5°C of their peak until the year 3000.

Why does this occur? Why does the temperature not go back down just as quickly once we stop feeding it more CO2? Because CO2 stays in the atmosphere for a much longer time than other greenhouse gases. As the paper says: ‘current emissions of major non-carbon dioxide greenhouse gases such as methane or nitrous oxide are significant for climate change in the next few decades or century, but these gases do not persist over time in the same way as carbon dioxide.’

Temperature changes to the year 3000 with different CO2 concentration peaks (from paper)

Our next problem is changing precipitation patterns, which are governed by the Clausius-Clapeyron relation, the physics of phase transitions in matter. What it tells us is that as temperature increases, the atmosphere holds more water vapour, which changes how that vapour is transported through the atmosphere, changing the hydrological cycle.
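To make that concrete, here is a minimal sketch of the Clausius-Clapeyron scaling using the standard constant-latent-heat approximation (the physical constants and the ~15 °C reference temperature are my own illustrative inputs, not numbers from the paper):

```python
import math

L_V = 2.5e6   # latent heat of vaporisation of water, J/kg
R_V = 461.5   # specific gas constant for water vapour, J/(kg K)
E_S0 = 611.0  # saturation vapour pressure at 273.15 K, Pa

def saturation_vapour_pressure(temp_k):
    """Clausius-Clapeyron with constant latent heat: e_s in Pa at temp_k kelvin."""
    return E_S0 * math.exp((L_V / R_V) * (1.0 / 273.15 - 1.0 / temp_k))

T = 288.15  # roughly the global mean surface temperature, ~15 °C
increase = saturation_vapour_pressure(T + 1.0) / saturation_vapour_pressure(T) - 1.0
print(f"~{increase * 100:.1f}% more water vapour the air can hold per extra degree")
```

The answer comes out at roughly 6–7% more water vapour per degree of warming, which is why a warmer atmosphere moves the hydrological cycle around so much.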

The paper notes that these patterns are already happening, consistent with the models, in the Southwest of the USA and the Mediterranean. They found that dry seasons will become approx. 10% drier for each degree of warming, and the Southwest of the USA is expected to be approx. 10% drier with 2°C of global warming. As a comparison, the Dust Bowl of the 1930s was 10% drier over two decades. Given that many climate scientists (and the World Bank) think that we’ve already reached the point where 2°C of warming is inevitable, it seems like Arizona is going to become a pretty uncomfortable place to live.

Additionally, if we managed to peak at 450ppm of CO2, irreversible decreases in dry season precipitation of ~8-10% would be expected in large areas of Europe, Western Australia and North America.
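The paper’s approx. 10% per degree rule of thumb makes for an easy back-of-envelope calculation; this is a crude linear scaling I’m adding for illustration, not the paper’s actual model (and how dry a particular region gets also depends on how its local warming compares to the global average):

```python
def dry_season_change(warming_degrees, drying_rate=0.10):
    """Fractional change in dry-season precipitation, using the paper's
    rough rule of thumb of ~10% drier per degree of warming."""
    return -drying_rate * warming_degrees

# One degree of warming already gives roughly Dust Bowl levels of drying (~10%)
print(f"{dry_season_change(1.0):+.0%}")
```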

Dry season getting drier around the world (from paper)

Finally, the paper looked at sea level rise, which is a triple whammy. The first issue is that water expands as it warms (aka thermal expansion), which raises sea level. The second is that ocean mixing through currents will keep carrying that heat into the deep ocean, continuing the warming and the thermal expansion. Thirdly, melting glaciers and icecaps on land add new water to the ocean.

The paper estimates that the eventual sea level rise from thermal expansion of warming water is 20-60cm per degree of warming. Additionally, the loss of glaciers and small icecaps will give us ~20-70cm of sea level rise too, so we’re looking at 40-130cm of sea level rise even before we start counting Greenland (which is melting faster than most estimates anyway).
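Combining those ranges is simple arithmetic; a quick sketch (the assumption that the two contributions just add, and the one degree of warming, are mine for illustration):

```python
# Ranges in cm from the paper: thermal expansion is per degree of warming,
# glaciers and small icecaps are a total eventual contribution.
thermal_expansion_per_degree = (20, 60)  # cm per degree
glaciers_and_icecaps = (20, 70)          # cm

warming = 1.0  # degrees of warming; hypothetical, for illustration
low = thermal_expansion_per_degree[0] * warming + glaciers_and_icecaps[0]
high = thermal_expansion_per_degree[1] * warming + glaciers_and_icecaps[1]
print(f"{low:.0f}-{high:.0f} cm of sea level rise, before counting Greenland")
```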

Sea level rise from thermal expansion only with different CO2 concentration peaks (from paper)

What does all of this mean? Well firstly it means you should check how far above sea level your house is and you may want to hold off on that ski cabin with all the melting snowpack as well.

More importantly though, it means that any last minute ‘saves the day’ Hollywood-style plans for reversing climate change as the proverbial clock counts down to zero are misguided and wrong. The climate pollution we are spewing into the atmosphere at ever greater rates today will hang over humanity as a carbon hangover for the next 1,000 years or so. Within human time scales, the changes that we are causing to our atmosphere are irreversible.

So naturally, we should stop burning carbon now.