Playing the Emissions on Shuffle

What do the emission reductions of industrialised nations look like when you count the imports manufactured overseas?

WHO: Glen P. Peters, Center for International Climate and Environmental Research, Oslo, Norway
Jan C. Minx, Department for Sustainable Engineering, and Department for the Economics of Climate Change, Technical University Berlin, Germany
Christopher L. Weber, Science and Technology Policy Institute, Washington, and Civil and Environmental Engineering, Carnegie Mellon University, Pittsburgh, USA
Ottmar Edenhofer, Department for the Economics of Climate Change, Technical University Berlin, Potsdam Institute for Climate Impact Research, Potsdam, Germany

WHAT: Measuring the transfer of CO2 emissions through international trade

WHEN: 24 May 2011

WHERE: Proceedings of the National Academy of Sciences (PNAS) vol. 108 no. 21, May 2011

TITLE: Growth in emission transfers via international trade from 1990 to 2008 (open access)

These researchers have found a problem with the way we count carbon emissions. Countries tend to count only the emissions of industries within their own territorial borders, which means that emissions in the developing world have kept going up, while emissions in the developed world (or first world) have either flattened or dropped, depending on how much your government likes to admit the reality of climate change.

However, most of those developing world emissions come from producing goods for places like North America and Europe. So these researchers wanted to work out exactly how much international trade contributed to global emissions increasing by 39% from 1990 – 2008. Was the increase in emissions due to development in countries like China, or was it a case of wealthy countries just shuffling their manufacturing emissions to another country and continuing to increase consumption rates?

As you might guess (spoiler alert) it’s the latter. Turns out all we’ve been doing is moving most of our industrial manufacturing emissions to developing countries and importing the products back, allowing everyone to say ‘yay, we reduced emissions!’ while the actual amount of carbon being burned continues to increase.

Growth in emissions transferred around the globe – dumping our responsibility on other countries (from paper)

But don’t take my word for it – what does the paper say?

The researchers took the global economy and broke it down into 113 regions, covering 95 individual countries and 57 economic sectors. They then looked at all the national and international data they could get on supply chain emissions for goods produced between 1990-2008, with extra detailed analysis for the years 1997, 2001 and 2004. They called it a Time Series with Trade, and it was based on GDP, bilateral trade and emissions statistics (all of which you can generally find at your national statistics office online). The only thing they left out of their analysis was emissions from land use change, because there wasn’t enough data for them to analyse it thoroughly.

They found that global CO2 emissions from exported goods rose from 4.3 Gigatonnes (Gt) in 1990 to 7.8 Gt in 2008, with a big increase in the decade up to 2008. Exports increased their share of global emissions from 20% to 26%, growing on average by 4.3% per year: faster than the global population grew (1.4%), faster than total global CO2 emissions grew (2%) and faster than global GDP grew (3.6%).

The only thing that export emissions didn’t grow faster than was the dollar value of all that trade, which increased by 12% each year. So not only are all those new iPhones costing you a lot of money (and making Apple super wealthy), they’re also burning a lot of carbon.

But the thing the paper points out is that international trade has simply shifted the location of the emissions rather than reducing them – shuffling them around the planet to avoid counting them. The researchers estimate that the transfer of emissions from wealthy countries to developing countries grew by 17% per year, increasing from 0.4 Gt of CO2 in 1990 to 1.6 Gt in 2008.

This is an issue because it means the countries that signed on to Kyoto to reduce their carbon emissions – commitments adding up to around 0.7 Gt of CO2 per year – have simply shifted those emissions through trade to make them someone else’s problem, while continuing to consume stuff at an ever increasing rate.

More and more stuff (epSos, flickr)

The researchers point out that while China is currently the world’s largest carbon emitter, with the USA at number two, if you counted consumption emissions (meaning you made the USA count the emissions for all the stuff they use that’s made in China), they’d swap places and the USA would be the world’s largest emitter.

This makes sense if you think it through – have a look around your house at everything that’s made in China. All of that carbon that China is burning, which is destroying their air quality and polluting their cities and people; all of that is to make stuff for you to consume.

If you count the consumption emissions, the emissions reduction of 3% from the developed world becomes an emissions growth of 11%. Oops. Also, the researchers point out that emissions reductions in wealthy countries are often exceeded by the growth of trade emissions.

Emission reductions vs emissions transferred to developing world. Annex B: developed world, non-Annex B: developing world (from paper)

So what does this mean, other than the fact that everyone is trying to avoid having to clean up their own mess?

It means there’s a problem with the way we count emissions from trade vs emissions from consumption. It also means that we’re currently failing to reduce our carbon emissions in any significant way, which puts us on a straight path to 4, 5 or 6°C of global warming, otherwise known as the next mass extinction.

Climate Question: Do We Get to Keep Flying?

An analysis of jet fuel alternatives that could be viable in the next decade.

WHO: James I. Hileman, Hsin Min Wong, Pearl E. Donohoo, Malcolm A. Weiss, Ian A. Waitz, Massachusetts Institute of Technology (MIT)
David S. Ortiz, James T. Bartis, RAND Corporation Environment, Energy and Economic Development Program

WHAT: A feasibility study of alternatives to conventional jet fuel for aircraft.

WHEN: 2009

WHERE: Published on both the MIT website and RAND Corporation website

TITLE: Near-Term Feasibility of Alternative Jet Fuels

Last week, I looked at how our transport systems could be carbon free by 2100 and was intrigued by the comment ‘hydro-processed renewable jet fuel made from plant oils or animal fats is likely to be the only biomass-based fuel that could be used as a pure fuel for aviation, but would require various additives in order to be viable as an aviation fuel’.

It made me wonder: what is being done about alternative airplane fuels? Or do we not have any alternatives, and will I have to give up visiting my family in Australia?

Any other options? (photo: Amy Huva 2013)

I came across this technical report by MIT and the RAND Corporation (apparently RAND stands for Research ANd Development) and sat down to read all 150 pages (you’re welcome) to see what fuels we could feasibly use in the next decade.

The paper compared alternative fuels on the basis of compatibility with existing aircraft and infrastructure, production potential, production costs, lifecycle Greenhouse Gas (GHG) emissions, air quality emissions, merit of the fuel as jet fuel vs ground fuel and the maturity of the technology.

The researchers pointed out (quite rightly) that emissions from biofuels need to take into account the carbon emitted through land use changes, because if you clear-fell a forest to plant a biofuel crop, any carbon you’ve saved by not burning oil has just been cancelled out by the carbon emitted from clear-felling the forest.

Deforestation: not helpful. (Image by: Aidenvironment, flickr)

The paper looked at five different fuel groups:

  1. Conventional Petroleum
  2. Unconventional Petroleum
  3. Synthetic fuel from natural gas, coal or biomass
  4. Renewable oils
  5. Alcohols

The standard fuel used in North America for aviation is called Jet A and was used as the benchmark for the study. So what did they find?

Conventional Petroleum Fuels

Almost all Jet A fuel comes from crude oil and is kerosene based. The lifecycle emissions of Jet A are 87.5g of CO2e (CO2 equivalent) per megajoule (MJ) of energy created (g CO2e/MJ). Of that 87.5g, 73.2g comes from burning the fuel, and the amount emitted from refining can vary by 7% depending on the quality of the crude oil used and the refining process.

The world’s consumption of jet fuel is estimated at 5 million barrels of oil per day. This number is really hard to wrap your head around, so let me quickly walk you through some math. A barrel of oil is 159L, which means 5 million barrels per day is 795,000,000L of oil burned each day. To move that much liquid, you would have to run a fire hose (359L/minute) non-stop for over four years (yes, YEARS). We burn that much in one day.
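If you want to check that arithmetic yourself, a few lines of Python will do it (the figures are the ones quoted above):

```python
# Back-of-envelope check on the jet fuel numbers quoted above
barrels_per_day = 5_000_000          # estimated world jet fuel consumption
litres_per_barrel = 159              # standard oil barrel

litres_per_day = barrels_per_day * litres_per_barrel
print(f"{litres_per_day:,} litres burned per day")    # 795,000,000

# How long would a fire hose at 359 L/minute take to move one day's worth?
hose_rate = 359                      # litres per minute
years = litres_per_day / hose_rate / (60 * 24 * 365)
print(f"~{years:.1f} years of continuous flow")       # ~4.2 years
```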

Given that a conventional fuel is already oil based and you can’t reduce those carbon emissions, the tweak this paper looked at was an Ultra Low Sulfur Jet A fuel, which would reduce the sulfur emitted when the fuel burns.

While it’s great to reduce the sulfur emissions that cause acid rain, the extra refining needed pushed the lifecycle emissions up to 89g CO2e/MJ.

Unconventional Petroleum Fuels

Unconventional fuels are things like the Canadian tar sands (or oil sands if you’re their PR people) and Venezuelan Heavy Oil. These oils are dirtier and require more refining to be made into jet fuel. They also require more effort to get out of the ground, so the lifecycle emissions are 103g CO2e/MJ (with an uncertainty of 5%). The upstream emissions of sourcing and refining the fuel are what add the extra: burning the fuel emits the same as Jet A, while the upstream emissions range from 16g CO2e/MJ for open cut mining to 26g CO2e/MJ for in-situ extraction.

You can also get shale oil through fracking and refine it into Jet A. Shale-based Jet A also burns the same as Jet A, but the extraction emissions are a whopping 41g CO2e/MJ, roughly double the tar sands extraction emissions, giving overall lifecycle emissions of 114.2g CO2e/MJ.

Fischer-Tropsch Synthetic Fuels

These are fuels derived through the catalysed Fischer-Tropsch process and then refined into a fuel. These fuels are good because they have almost zero sulfur content (and therefore almost zero sulfur emissions). They don’t work as a 100% fuel without an engine refit because of their different aromatic hydrocarbon content, and their energy density is 3% less than Jet A (meaning you’d need 3% more fuel in the tank to go the same distance as Jet A fuel). However, they do combine easily into a 50/50 blend for planes.

You can make FT synthetic fuel from natural gas, which gives you 101g CO2e/MJ; from coal, which gives you between 120-195g CO2e/MJ and relies on carbon capture and storage as a technical fix; or from biomass, which has almost zero lifecycle emissions ONLY if you use a waste product as the source and don’t contribute to further land use changes.

Renewable Oils

These are biodiesel or biokerosene which can be made from soybean oil, canola oil, palm oil, coconut oil, animal fats, waste products or halophytes and algae.

Because this paper was looking at fuels that could be commercially used in the next 10 years, they looked at a 5% blend with Jet A fuel to meet freeze point requirements (most renewable oils freeze at too high a temperature for the altitudes planes fly at). They found too many safety and freezing point issues with biodiesel and biokerosene, so they didn’t calculate the emissions from them, as they’re not practical for use.

Another renewable oil is Hydroprocessed Jet Fuel, entertainingly sometimes called ‘Syntroleum’. This is made from plant oils, animal fats or waste grease. Soybean oil without land use emissions would have only 40-80% of the emissions of Jet A, while palm oil would have 30-40% of the emissions of Jet A.

Alcohol Fuels

The paper looked at using ethanol (the alcohol we drink) and butanol as replacement fuels. They both have lower energy densities than Jet A, higher volatility (being flammable and explosive) and issues with freezing at cruising altitude. While butanol is slightly safer to use as a jet fuel than ethanol, the report suggests it’s better used as a ground transport fuel than a jet fuel (I assume the better use of ethanol, as a drink, is implied).
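To make the comparison easier to eyeball, here’s a quick Python summary of the lifecycle numbers quoted above, relative to the Jet A baseline (the coal figure takes the top of the quoted 120-195 range):

```python
# Lifecycle GHG intensities quoted in the report, in g CO2e per MJ
jet_a = 87.5
fuels = {
    "Ultra Low Sulfur Jet A": 89.0,
    "Tar sands / heavy oil": 103.0,
    "Shale oil": 114.2,
    "FT synthetic (natural gas)": 101.0,
    "FT synthetic (coal, no CCS)": 195.0,  # top of the 120-195 range
}
for name, g_per_mj in fuels.items():
    print(f"{name:28s} {g_per_mj:6.1f} g CO2e/MJ ({g_per_mj / jet_a - 1:+.0%} vs Jet A)")
```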

Options for jet fuel alternatives (from paper)

After going through all the options, the researchers found that the three main alternative fuels that could be commercially implemented over the next decade are:

  1. Tar sands oil
  2. Coal-derived FT Synthetic oil
  3. Hydroprocessed Renewable jet fuel

They recommended that when looking to reduce emissions from the transport sector, aviation shouldn’t be treated any differently. While strongly recommending that land use changes be taken into account for biofuels, they also pointed out that limited biofuel resources may be more effective at producing heat and power than at fuelling transport, so even the aviation use of biofuels should be looked at carefully.

Personally, I don’t find the report very heartening, given that the first two options involve either dirtier oil or really dirty coal, when what we need to be doing is reducing our emissions, not changing the form they’re in and still burning them. I’ll be keeping my eye out for any new research into hydroprocessed renewable jet fuels that could use waste products or algae – given the speed at which the oceans are acidifying, there could be a lot of ocean dead zones that are really good at growing algae that could be used as jet fuel.

But until then, it looks like there aren’t many options for the continuation of air travel once we start seriously reducing our emissions – flying will be a really quick way to burn through our remaining carbon budget.

Wind Power Kicks Fossil Power Butt

What if you ran the numbers for wind power replacing all fossil fuel and nuclear electricity in Canada? How could it work? How much would it cost?

WHO:  L.D. Danny Harvey, Department of Geography, University of Toronto, Canada

WHAT: Mapping and calculating the potential for wind electricity to completely replace fossil fuel and nuclear electricity in Canada

WHEN: February 1st, 2013

WHERE: Energy Vol. 50, 1 February 2013

TITLE: The potential of wind energy to largely displace existing Canadian fossil fuel and nuclear electricity generation (subs req.)

As a kid, I really loved the TV series Captain Planet. I used to play it in the school yard with my friends and I always wanted to be the one with the wind power. Mostly because my favourite colour is blue, but also because I thought the girl with the wind power was tough.

Go Planet! Combining the power of wind, water, earth, fire and heart (Wikimedia commons)

What’s my childhood got to do with this scientific paper? Well, what if you looked at the Canadian Wind Energy Atlas and worked out whether we could harness the power of wind in Canada to replace ALL fossil fuel and nuclear electricity? How would you do it? How much would it cost? That’s what this researcher set out to discover (in the only paper I’ve written about yet that has a single author!)

Refreshingly, the introduction to the paper has what I like to call real talk about climate change. He points out that the last time global average temperatures increased by 1°C, sea levels were 6.6 – 9.4m higher, which means ‘clearly, large and rapid reductions in emissions of CO2 and other greenhouse gases are required on a worldwide basis’.

Electricity accounts for about 25% of global greenhouse gas emissions, and while there have been studies in the US and Europe looking at the spacing of wind farms to reduce variability for large scale electricity generation, no-one had looked at Canada yet.

So how does Canada stack up? Really well. In fact, the paper found that Canada has enough wind energy available to cover current electricity demand many times over!

The researcher looked at onshore and offshore wind at 30m, 50m and 80m above the ground for each season to calculate the average wind speed and power generation. The paper took the Wind Energy Atlas and broke the map into cells, taking into account the wake effect of other turbines and eliminating areas that can’t have wind farms, like cities, mountains above 1,600m elevation (to avoid wind farms on the Rocky Mountains), shorelines (to avoid wind farms on your beach) and wetlands.

For calculating wind farm potential there are generally three options: you can maximise the electricity production, maximise the capacity factor, or minimise the cost of the electricity. The paper looked at all three options and found that the best overall option (which gives you a better average cost in some cases) was to maximise the capacity factor.

Using wind data and electricity demand data from 2007, the researcher ran the numbers. In 2007, the total capacity of fossil fuel and nuclear electricity was 49.0GW (gigawatts), generating 249.8TWh (terawatt hours). This is 40% of the total national electricity capacity for Canada of 123.9GW, or 616.3TWh of generation.
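The paper doesn’t quote capacity factors, but you can back out the implied fleet averages from those two pairs of numbers (my arithmetic, not the paper’s):

```python
# Implied average capacity factors from the 2007 numbers above
HOURS_PER_YEAR = 8760

def capacity_factor(twh_generated: float, gw_capacity: float) -> float:
    """Generation as a fraction of what the capacity could produce flat out."""
    return (twh_generated * 1000) / (gw_capacity * HOURS_PER_YEAR)

print(f"Fossil + nuclear: {capacity_factor(249.8, 49.0):.0%}")   # ~58%
print(f"All of Canada:    {capacity_factor(616.3, 123.9):.0%}")  # ~57%
```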

To deal with the issue of wind power being intermittent, the paper noted that Quebec and Manitoba already have hydro storage capacity for several years’ worth of electricity, as well as many other options for supply-demand mismatches (which this paper doesn’t address), making a national wind electricity grid feasible.

To run the numbers, the country was split into 5 sectors. Starting with the sector with the greatest wind energy potential, the numbers were run until a combination was found where the wind energy across the sectors met the national fossil fuel and nuclear requirements.

Wind farms required in each sector to provide enough electricity to completely replace the fossil fuel and nuclear power used in 2007 (from paper)

Once the researcher worked out that you could cover the whole country’s fossil fuel and nuclear electricity with the wind energy from any sector, he looked at minimising costs and meeting the demand required for each province.

He looked at what size of wind farm would be needed, and then calculated the costs for infrastructure (building the turbines) as well as transmission (getting the electricity from the farm to the demand). Some offshore wind in BC, Hudson Bay, and Newfoundland and Labrador, combined with some onshore wind in the prairies and Quebec and that’s all we need.

The cost recovery for the investment was calculated over 20 years for the turbines and 40 years for the transmission lines. The paper found that minimising transmission line distance resulted in the largest waste generation in winter but the smallest in summer; overall, the best method was to maximise the capacity factor of the wind farms.

But the important question – how much would your power cost? On average, 5-7 cents per kWh (kilowatt hour), which is on par with the 7c/kWh that BC Hydro currently charges in Vancouver. Extra bonus – wind power comes without needing to mine coal or store radioactive nuclear waste for millions of years!

Estimated wind power costs for Canada (from paper)

Some more food for thought: the researcher noted that the estimated cost of coal fired electricity with (still unproven) carbon capture and storage technology is likely to be around 9c/kWh, while the current cost of nuclear generated electricity is between 10-23c/kWh. Also, the capacity factor of turbines is likely to increase as the technology rapidly improves, which will further reduce the cost of producing wind electricity.

This is all great news – Canada has the wind energy and the potential to build a new industry to not only wean ourselves off the fossil fuels that are damaging and destabilising our atmosphere, but to export that knowledge as well. We can be an energy superpower for 21st Century fuels, not fossil fuels. I say let’s do it!

Crash Diets and Carbon Detoxes: Irreversible Climate Change

Many of the changes humans are causing in our atmosphere today will be largely irreversible for the rest of the millennium.

WHO: Susan Solomon, Chemical Sciences Division, Earth System Research Laboratory, National Oceanic and Atmospheric Administration, Boulder, Colorado, USA
Gian-Kasper Plattner, Institute of Biogeochemistry and Pollutant Dynamics, Zurich, Switzerland
Reto Knutti, Institute for Atmospheric and Climate Science, Zurich, Switzerland,
Pierre Friedlingstein, Institut Pierre Simon Laplace/Laboratoire des Sciences du Climat et de l’Environnement, Unité Mixte de Recherche à l’Energie Atomique – Centre National de la Recherche Scientifique – Université Versailles Saint-Quentin, Commissariat à l’Energie Atomique-Saclay, l’Orme des Merisiers, France

WHAT: Looking at the long term effects of climate pollution to the year 3000

WHEN: 10 February 2009

WHERE: Proceedings of the National Academy of Sciences of the USA (PNAS), vol. 106, no. 6 (2009)

TITLE: Irreversible climate change due to carbon dioxide emissions

Stopping climate change often involves the metaphor of ‘turning down the thermostat’ of the heater in your house: the heater gets left on too high for too long, you turn the thermostat back down, the room cools down, we are all happy.

This seems to also be the way many people think about climate change – we’ve put too much carbon pollution in the atmosphere for too long, so all we need to do is stop it, and the carbon dioxide will disappear like fog burning off in the morning.

Except it won’t. This paper (from 2009, but I came across it recently while reading around the internet) looks at the long term effects of climate change, and found that the effects of CO2 emissions can still be felt 1,000 years after we stop polluting. Bummer. So much for that last minute carbon detox that politicians seem to be betting on. Turns out it won’t do much.

The researchers defined ‘irreversible’ in this paper as 1,000 years, to just beyond the year 3000, because over a human life span, 1,000 years is more than 10 generations. Geologically it’s not forever, but from our human point of view it pretty much is forever.

So what’s going to keep happening because we can’t give up fossil fuels today that your great-great-great-great-great-great-great-great-great-great grandkid is going to look back on and say ‘well that was stupid’?

The paper looked at the three most detailed and well known effects: atmospheric temperatures, precipitation patterns and sea level rise. Other long term impacts will be felt through Arctic sea ice melt, flooding and heavy rainfall, permafrost melt, hurricanes and the loss of glaciers and snowpack. However, the impacts with the most detailed models and greatest body of research were the ones chosen for this paper (which also excluded the potential for geo-engineering because it’s still very uncertain and unknown).

Our first problem is going to be temperature increases, because temperatures rise as CO2 accumulates in the atmosphere. Yet even if we turned off emissions completely (which is practically unfeasible, but works best to model the long term effects), temperatures would remain roughly constant, within about 0.5°C, until the year 3000.

Why does this occur? Why doesn’t the temperature go back down just as quickly once we stop feeding it more CO2? Because CO2 stays in the atmosphere for a much longer time than other greenhouse gases. As the paper says: ‘current emissions of major non-carbon dioxide greenhouse gases such as methane or nitrous oxide are significant for climate change in the next few decades or century, but these gases do not persist over time in the same way as carbon dioxide.’

Temperature changes to the year 3000 with different CO2 concentration peaks (from paper)

Our next problem is changing precipitation patterns, which are governed by the Clausius-Clapeyron relation describing phase transitions in matter. What the relation tells us is that as temperature increases, atmospheric water vapour increases, which changes how the vapour is transported through the atmosphere, changing the hydrological cycle.
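For the curious, the relation itself is compact. In its standard textbook form (my addition; the paper doesn’t spell it out here), with $e_s$ the saturation vapour pressure, $L_v$ the latent heat of vaporisation (~2.5 × 10^6 J/kg) and $R_v$ the gas constant for water vapour (461.5 J/kg/K):

$$\frac{1}{e_s}\frac{\mathrm{d}e_s}{\mathrm{d}T} = \frac{L_v}{R_v T^2} \approx \frac{2.5 \times 10^{6}}{461.5 \times 288^2} \approx 0.065\ \mathrm{K}^{-1}$$

So near the surface (T ≈ 288 K), each degree of warming lets the air hold roughly 6-7% more water vapour.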

The paper notes that these patterns are already happening, consistent with the models, in the Southwest of the USA and the Mediterranean. They found that dry seasons will become approximately 10% drier for each degree of warming, and the Southwest of the USA is expected to be approximately 10% drier with 2°C of global warming. As a comparison, the Dust Bowl of the 1930s was 10% drier over two decades. Given that many climate scientists (and the World Bank) think we’ve already reached the point where 2°C of warming is inevitable, it seems Arizona is going to become a pretty uncomfortable place to live.

Additionally, even if we managed to peak at 450ppm of CO2, irreversible dry season precipitation decreases of ~8-10% would be expected in large areas of Europe, Western Australia and North America.

Dry season getting drier around the world (from paper)

Finally, the paper looked at sea level rise, which is a triple-whammy. The first issue is that water expands as it warms (aka thermal expansion), which increases sea level. The second is that ocean mixing through currents will continue, carrying the warming (and the thermal expansion) down into deeper water. Thirdly, melting icecaps on land contribute new volume to the ocean.

The paper estimates that the eventual sea level rise from thermal expansion of warming water is 20 – 60cm per degree of warming. Additionally, the loss of glaciers and small icecaps will give us ~20 – 70cm of sea level rise too, so we’re looking at 40 – 130cm of sea level rise even before we start counting Greenland (which is melting faster than most estimates anyway).

Sea level rise from thermal expansion only with different CO2 concentration peaks (from paper)

What does all of this mean? Well firstly it means you should check how far above sea level your house is and you may want to hold off on that ski cabin with all the melting snowpack as well.

More importantly though, it means that any last minute ‘saves the day’ Hollywood-style plans for reversing climate change as the proverbial clock counts down to zero are misguided and wrong. The climate pollution that we are spewing into the atmosphere at ever greater rates today will continue to be a carbon hangover for humanity for the next 1000 years or so. Within human time scales, the changes that we are causing to our atmosphere are irreversible.

So naturally, we should stop burning carbon now.

What’s in a Standard Deviation?

“By 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario” Marcott et al.

WHO: Shaun A. Marcott, Peter U. Clark, Alan C. Mix, College of Earth, Ocean, and Atmospheric Sciences, Oregon State University, Corvallis, Oregon, USA
 Jeremy D. Shakun, Department of Earth and Planetary Sciences, Harvard University, Cambridge, Massachusetts, USA.

WHAT: A historical reconstruction of average temperature for the past 11,300 years

WHEN: March 2013

WHERE: Science, Vol. 339 no. 6124, 8 March 2013 pp. 1198-1201

TITLE: A Reconstruction of Regional and Global Temperature for the Past 11,300 Years (subs req.)

We all remember the standard deviation bell curve from high school statistics: in a population (like your class at school) there will be a distribution of some trait, with most people falling around the mean. The usual example you start off with is the height of everyone in your classroom – most people will be around the same height, some will be taller, some shorter.

The further you vary from the mean, the less likely that value is to occur, because around 68% of the population fits into the first standard deviation either side of the mean. However, the important bit you need to keep in mind when reading about this paper is that a bell curve is normally drawn out to three standard deviations on either side of the mean, which covers 99.7% of all the data. The odds of a data point falling more than three standard deviations from the mean are about 0.1% on either side.

The standard deviation bell curve (Wikimedia commons)

What does high school statistics have to do with global temperature reconstructions? Well, it’s always good to see what’s happening in the world within context. Unless we can see comparisons to what has come before, it can be really hard to see what is and isn’t weird when we’re living in the middle of it.

The famous ‘Hockey Stick’ graph that was constructed for the past 1,500 years by eminent climate scientist Michael Mann showed us how weird the current warming trend is compared to recent geologic history. But how does that compare to all of the Holocene period?

Well, we live in unusual times. But first, the details. These researchers used 73 globally distributed temperature records based on various different proxies. A proxy means looking at the chemical composition of something that has been around for longer than our thermometers to work out what the temperature would have been. This can be done with ice cores, slow growing trees and marine species like coral. According to the NOAA website, fossil pollen can also be used, which I think is awesome (because it’s kind of like Jurassic Park!).

They used more marine proxies than most other reconstructions (80% of their proxies were marine) because they’re better suited for longer reconstructions. The resolutions for the proxies ranged from 20 years to 500 years and the median resolution was 120 years.

They then ran the data through a Monte Carlo randomisation scheme (which is less exotic than it sounds) to quantify the uncertainty, and ran a ‘white noise’ data set with a mean of zero to double check for errors. Then the chemical data was converted into temperature data before it all got stacked together into a weighted mean with a confidence interval. It’s like building a layer cake, but with math!
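Here’s a minimal Python sketch of the general idea – my toy illustration, not the authors’ actual procedure (which also perturbs each record’s age model and weights the records):

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy stand-ins for the 73 proxy records: temperature anomaly series with
# a shared 1-sigma measurement uncertainty (the real records all differ)
n_proxies, n_steps, n_draws = 73, 113, 1000   # 113 steps of ~100 years each
records = [rng.normal(0.0, 1.0, n_steps) for _ in range(n_proxies)]
sigma = 0.5

# Monte Carlo: re-draw every record within its uncertainty, stack to a mean,
# and repeat many times to build a distribution of possible reconstructions
stacks = np.empty((n_draws, n_steps))
for i in range(n_draws):
    perturbed = [r + rng.normal(0.0, sigma, n_steps) for r in records]
    stacks[i] = np.mean(perturbed, axis=0)

reconstruction = stacks.mean(axis=0)                  # the stacked mean
lo, hi = np.percentile(stacks, [2.5, 97.5], axis=0)   # confidence envelope
```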

Interestingly, with their white noise data, they found the reconstruction was more faithful over longer time periods. Variability was preserved best at periods of 2,000 years or more, only half was left at the 1,000 year scale, and variability at periods shorter than 300 years was gone.

They also found that their reconstruction lined up over the final 1,500 years to present with the Mann et al. 2008 reconstruction and was also consistent with Milankovitch cycles (which ironically indicate that without human interference, we’d be heading into the next glacial period right now).

Temperature reconstructions: Marcott et al. in purple (with blue confidence interval), Mann et al. in grey (from paper)

They found that the global mean temperature for 2000-2009 has not yet exceeded the warmest temperatures in the Holocene, which occurred 5,000 – 10,000 years BP (before present). However, we are currently warmer than 82% of the Holocene distribution.

But the disturbing thing in this graph that made me feel really horrified (and I don’t get horrified by climate change much anymore, because I read so much on it that I’m somewhat de-sensitised to ‘end of the world’ scenarios) is the rate of change. The paper found that global temperatures have gone from close to the coldest of the Holocene (the bottom of the purple line before it spikes up suddenly) to near the warmest, all within the past century.

We are causing changes to happen so quickly in the earth’s atmosphere that something that would have taken over 11,000 years has just happened in the last 100. We’ve taken a 5,000 year trend of cooling and spiked up to super-heated in record time.

This would be bad enough on its own, but it’s not even the most horrifying thing in this paper. It’s this:

‘by 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario.’ (my emphasis)

Remember how I said to keep in mind that 99.7% of all data points in a population fall within three standard deviations on a bell curve? That’s because we are currently heading off the edge of the chart for weird and unprecedented climate, beyond even the 0.1% chance of occurring without human carbon pollution.
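To see just how far off the chart 5 to 12 standard deviations is, here’s a quick check with scipy (my illustration; a normal distribution is only an approximation for climate variability):

```python
from scipy.stats import norm

# Probability of landing at least this many standard deviations above the mean
for sd in (1, 2, 3, 5, 12):
    print(f"{sd:>2} sigma: {norm.sf(sd):.2e}")
# 3 sigma -> 1.35e-03 (the ~0.1% tail mentioned above)
# 5 sigma -> 2.87e-07, and 12 sigma -> 1.78e-33: effectively impossible
# without a forced change to the underlying climate
```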

The IPCC’s A1B scenario is the ‘medium worst case scenario’, and we are currently outstripping it with our continuously growing carbon emissions – emissions that actually need to be shrinking. We are so far out into the tail of weird occurrences that it’s off the charts of a bell curve.

NOAA Mauna Loa CO2 data (396.80ppm at Feb. 2013)

As we continue to fail to reduce our carbon emissions in any meaningful way, we will reach 400ppm (parts per million) of carbon dioxide in the atmosphere in the next few years. At that point we will truly be in uncharted territory for any time in human history, on a trajectory that is so rapidly changing as to be off the charts beyond 99.7% of the data for the last 11,300 years. The question for humanity is; are we willing to play roulette with our ability to adapt to this kind of rapidly changing climate?

Carbon in Your Supply Chain

How will a real price on carbon affect supply chains and logistics?

WHO: Justin Bull (PhD Candidate, Faculty of Forestry, University of British Columbia, Canada)
Graham Kissack (Communications Environment and Sustainability Consultant, Mill Bay, Canada)
Christopher Elliott (Forest Carbon Initiative, WWF International, Gland, Switzerland)
Robert Kozak (Professor, Faculty of Forestry, University of British Columbia, Canada)
Gary Bull (Associate Professor, Faculty of Forestry, University of British Columbia, Canada)

WHAT: Looking at how a price on carbon can affect supply chains, with the example of magazine printing

WHEN: 2011

WHERE: Journal of Forest Products Business Research, Vol. 8, Article 2, 2011

TITLE: Carbon’s Potential to Reshape Supply Chains in Paper and Print: A Case Study (membership req)

Forestry is an industry that’s been doing it tough in the face of rapidly changing markets for a while. From the Clayoquot Sound protests of the 1990s to stop clearcutting practices, to the growing realisation that deforestation is one of the leading contributors to climate change, it’s the kind of industry where you either innovate or you don’t survive.

Which makes this paper – a case study of how monetising carbon has the potential to re-shape supply chains and make them low carbon – really interesting. From the outset, the researchers recognise where our planet is heading under climate change, stating ‘any business that emits carbon will [have to] pay for its emissions’.

To look at the potential for low carbon supply chains, the paper follows the production and printing of a magazine in North America, measuring the carbon emissions from cutting down the trees, turning the trees into paper, transporting materials at each stage of the process, and the printing process.

Trees to magazines (risa ikead, flickr)

They did not count the emissions of the distribution process or any carbon emissions related to disposal after it was read by the consumer because these had too many uncertainties in the data. However, they worked with the companies that were involved in the process to try and get the most accurate picture of the process they possibly could.

The researchers found that the majority of the carbon is emitted in the paper manufacturing process (41%), as the paper went from a tree on Vancouver Island, was shipped as fibre to Port Alberni by truck, manufactured into paper, shipped by truck and barge to Richmond, and then taken by train to the printing press in Merced, California.

Activity                                                     Carbon Emissions (kg CO2/ADt)   Percentage of Total
Harvesting, road-building, felling, transport to sawmills    55kg                            12%
Sawmilling into dimensional and residual products            45kg                            10%
Transport of chips to mill                                   8kg                             2%
Paper manufacturing process                                  185kg                           41%
Transportation to print facility                             127kg                           28%
Printing process                                             36kg                            8%
Total                                                        456kg                           100%

Supply Chain Emissions (Table 1, reproduced from hardcopy; ADt = air-dried tonne)
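As a quick consistency check on Table 1 (my arithmetic, not from the paper), the stage emissions add up to the 456kg total and reproduce the rounded percentages:

```python
# Stage emissions from Table 1, in kg CO2 per air-dried tonne (ADt) of paper
stages = {
    "Harvesting & transport to sawmills": 55,
    "Sawmilling": 45,
    "Transport of chips to mill": 8,
    "Paper manufacturing": 185,
    "Transportation to print facility": 127,
    "Printing": 36,
}
total = sum(stages.values())
print(total)  # 456, matching the table's total
for name, kg in stages.items():
    print(f"{name:36s} {kg / total:.0%}")   # 12%, 10%, 2%, 41%, 28%, 8%
```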

The case study showed that upstream suppliers consume more energy than downstream suppliers; however, downstream suppliers are the most visible to consumers. This poses a challenge when trying to get the larger emitters to minimise their carbon footprint, as there’s less likelihood of consumer pressure on lesser known organisations.

That being said, there can be major economic incentives for companies to try and minimise their carbon footprint, given that Burlington Northern Santa Fe Railways (who shipped the paper from Canada to the printing press in California in this study) spent approximately $4.6 billion on diesel fuel in 2008 (the data year for the case study).

Given that California implemented a carbon cap and trade market at the end of 2012, and that growing awareness of the urgency of reducing our carbon emissions means the price of carbon is likely to increase, $4.6 billion in diesel costs could rapidly escalate for a company like BNSF. If part or all of their transport could be switched to clean energy, the company would save itself significant amounts as polluting fossil fuel sources are phased out.

The companies in this study were very aware of these issues, which is encouraging. They agreed that carbon and sustainability will be considered more closely in the future and that carbon has the potential to change the value of existing industrial assets as corporations who are ‘carbon-efficient’ may become preferred suppliers.

The researchers identified three types of carbon-related risk that companies could face: regulatory risk, financial risk and market access risk. The innovative businesses that will thrive in a low carbon 21st century economy will be thinking about and preparing to operate in an economy that doesn’t burn carbon for fuel, or where burning carbon is no longer profitable.

I really liked the paper’s example of financial risk in the bond market, ‘where analysts are projecting a premium on corporate bonds for new coal fired power plants’, meaning it will be harder for companies to raise money to further pollute our atmosphere. This is especially important given that Deutsche Bank and Standard and Poor’s put the fossil fuel industry on notice last week, saying that easy finance for their fossil fuels party is rapidly coming to an end.

Of course, no-one ever wants to believe that the boom times are coming to an end. But the companies that think ahead of the curve and innovate to reduce their carbon risk instead of going hell for leather on fossil fuels will be the ones that succeed in the long run.

Hot Air Balloon – Heat Emissions in London

Detailed measurements of the thermal pollution in Greater London and a look at the long term trend.

WHO: Mario Iamarino, Dipartimento di Ingegneria e Fisica dell’Ambiente, Università degli Studi della Basilicata, Potenza, Italy
Sean Beevers,  Environmental Research Group, King’s College London, London, UK
C. S. B. Grimmond, Environmental Monitoring and Modelling Group, Department of Geography, King’s College London, London, UK

WHAT: Measuring the heat pollution of London with regard to both where and when the emissions occur

WHEN: July 2011 (online version)

WHERE: International Journal of Climatology (Int. J. Climatol.) Vol. 32, Issue 11, September 2012

TITLE: High-resolution (space, time) anthropogenic heat emissions: London 1970–2025 (subs. req)

Cities have a rather unique position in our world when we think about climate change. They are the source of the majority of emissions, but due to their dense nature they are also the places where the greatest innovations and emissions reductions can happen fastest.

If you want to harness the creativity and innovation of cities to reduce emissions, you need to know which emissions are coming from where, and when. You need data first.

That’s where these scientists from Italy and the UK stepped in to work out exactly how much waste heat the city of London was emitting. Waste heat is an issue because it contributes to the Urban Heat Island effect where the temperatures in cities can be higher than the surrounding area, which can be deadly in summer heat waves and lead to greater power usage.

Something interesting that I hadn’t considered before is that concentrated emissions can also change the atmospheric chemistry in a localised area.

The researchers looked at some very detailed data from 2005-2008 for the city of London and broke it down into as accurate a picture as they could. They looked at heat from buildings, separating residential from commercial buildings and the different ways each building use emits heat.

They looked at transport data using traffic congestion figures, the London Atmospheric Emissions Inventory, and data from several acronym-happy government departments to work out what kinds of vehicles emit waste heat at what times of day, and at what temperature each type of fuel combusts. They didn’t count trains (underground or overground), because they are mostly electric (and some of them recover waste heat from braking), or small boats, which didn’t emit enough to be counted.

They looked at the heat emitted by people at various stages of activity, age and time of day assuming a 1:1 ratio of females to males. They even looked at the standard metabolic rates of people to work out how much heat a person emits exercising, resting or sleeping!

Data! London waste heat emissions (from paper)

What all that number and formula crunching told them was that the total average anthropogenic heat flux for London was 10.9 Wm-2 (watts per square metre). That works out to 150 terawatt hours of waste energy (in the form of heat) per year, which, as a comparison, is all of the electricity used in Sweden in 2010.
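You can roughly verify that conversion yourself; the area figure here is my assumption for Greater London (about 1,572 km²), not a number from the paper:

```python
# Does 10.9 W/m2 really come to ~150 TWh a year over Greater London?
flux = 10.9                # average anthropogenic heat flux, W per square metre
area_m2 = 1_572e6          # ~1,572 km2: assumed Greater London area
hours_per_year = 8760

twh_per_year = flux * area_m2 * hours_per_year / 1e12
print(f"{twh_per_year:.0f} TWh per year")   # ~150 TWh
```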

Of that total, 80% came from buildings with 42% being domestic residences and 38% being industrial buildings. The next biggest source of heat was from transport, at 15% of the 10.9 Wm-2. Of the transport category, personal cars were the biggest contributor (64% of the transport portion).

Human heat only contributed 5.1% of the total (0.55 Wm-2), so maybe they’re not doing enough exercise in London? The data showed peaks and valleys of heat loss: winter releases more waste heat than summer, because heating systems for winter are more widespread than air conditioners in summer. The industrial building emissions were concentrated in the core of the City of London (especially Canary Wharf, where there are many new centrally heated/cooled high rise offices), while the domestic building emissions were much more spread around the centre of London.

Building heat emissions domestic (left) and industrial (right) (from paper)

Once they had all the data from 2005 to 2008, they considered trends from 1970 projected out to 2025 to see how great a role heat emissions could play in climate change. Using data from the Department of Energy and Climate Change, the London Atmospheric Emissions Inventory (LAEI) and data on population dynamics and traffic patterns, they estimated that all contributors to heat emissions would increase unless the UK greenhouse gas reduction targets are fully implemented.

The reduction target is an 80% cut by 2050 (against the baseline of 1990 emissions). Since the research indicates that buildings are the biggest heat emitters (and are therefore burning the most energy to keep at the right temperature), there is a great need to increase building efficiency to meet those targets.

The paper notes that if the Low Carbon Transition Plan for London is implemented, average waste heat emissions for London will drop to 9 Wm-2 by 2025, but in the central City of London, the best emissions reductions are likely to only get back to 2008 levels, due to the expected growth in the area.

So what does any of this mean? It means London now has the data to know where they can find efficiencies that can complement their greenhouse gas mitigation programs. Because that’s the thing about combating climate change – it’s not a ‘one problem, one solution’ kind of issue. We need to do as many different things as possible all at once to try and add up to the levels of decarbonising that the world needs to be doing to avoid catastrophic climate change. So go forth London, take this data and use it well!

Unprecedented: Melting Before Our Eyes

The volume of Arctic sea ice is shrinking faster than its area, further speeding the Arctic death spiral.

WHO:  Seymour W. Laxon, Katharine A. Giles, Andy L. Ridout, Duncan J. Wingham, Rosemary Willatt, Centre for Polar Observation and Modelling, Department of Earth Sciences, University College London, London, UK
Robert Cullen, Malcolm Davidson, European Space Agency, Noordwijk, The Netherlands
Ron Kwok, Jet Propulsion Laboratory, California Institute of Technology, Pasadena, California, USA.
Axel Schweiger, Jinlun Zhang, Polar Science Center, Applied Physics Laboratory, University of Washington, Seattle, Washington, USA
Christian Haas, Department of Earth and Space Science and Engineering, York University, Toronto, Canada.
Stefan Hendricks, Alfred Wegener Institute for Polar and Marine Research, Bremerhaven, Germany
Richard Krishfield, Woods Hole Oceanographic Institution, Woods Hole, Massachusetts, USA
Nathan Kurtz, School of Computer, Math, and Natural Sciences, Morgan State University, Baltimore, Maryland, USA.
Sinead Farrell, Earth System Science Interdisciplinary Center, University of Maryland, Maryland, USA.

WHAT: Measuring the volume of polar ice melt

WHEN: February 2013 (online pre-published version)

WHERE: Geophysical Research Letters (American Geophysical Union), 2013, doi: 10.1002/grl.50193

TITLE:  CryoSat-2 estimates of Arctic sea ice thickness and volume (subs req.)

Much has been written about the Arctic Death Spiral of sea ice melting each spring and summer, with many researchers attempting to model and predict exactly how fast the sea ice is melting and when we will get the horrifying reality of an ice-free summer Arctic.

But is it just melting at the edges? Or is the thickness, and therefore the volume of sea ice being reduced as well? That’s what these researchers set out to try and find out using satellite data from CryoSat-2 (Science with satellites!).

The researchers used satellite radar altimeter measurements of sea ice thickness, and then compared their results with measured in-situ data as well as other Arctic sea ice models.

A loss of volume in Arctic sea ice signifies changes in the heat exchange between the ice, ocean and atmosphere. Most global climate models predict a decrease in sea ice volume of 3.4% per decade, which is larger than the predicted 2.4% per decade decrease in sea ice area.
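Since volume is area times thickness, those two percentage trends imply a thickness trend of roughly their difference; a toy illustration of that reasoning (mine, not the paper’s):

```python
# Volume = area x thickness, so for small changes the percentage trends add:
# volume trend (%/decade) ~ area trend + thickness trend
volume_trend = -3.4   # % per decade, model-predicted volume decline
area_trend = -2.4     # % per decade, model-predicted area decline

thickness_trend = volume_trend - area_trend
print(f"Implied thickness trend: {thickness_trend:.1f}% per decade")  # ~ -1.0
```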

Sea ice area minimum from September 2012 (image: NASA/Goddard Space Flight Center Scientific Visualization Studio)

The researchers ran their numbers for ice volume in winter 2010/11 and winter 2011/12, and then used the recorded data sets to check the accuracy of their satellites (calibration for my fellow science nerds).

The most striking thing they found was a much greater loss of ice thickness in the north of Greenland and the Canadian Archipelago. Additionally, they found that the first year ice was thinner in autumn, which made it harder to catch up to average thickness during the winter, and made greater melting easier in summer.

Interestingly, they found that there was more ice growth in the winters of 2010-12 (7,500km3) than in 2003-08 (5,000km3), which makes for an extra 36cm of ice growth over the winter. Unfortunately the increased summer melt is much greater than the extra growth, so it’s not adding to the overall thickness of the sea ice.

For the period of 2010-12, the satellite measured rate of decline in autumn sea ice was 60% greater than the decline predicted using PIOMAS (Panarctic Ice Ocean Modeling and Assimilation System). Most researchers, on seeing results like that, might hope there’s an error; however, when measured against the recorded data, the CryoSat-2 data was accurate to within 0.1 metres. So while astounding, the 60% greater than expected loss of sea ice thickness is pretty spot on.

The researchers think that lower ice thickness at the end of winter, in February and March, could be contributing to the scarily low September minimums of the Arctic death spiral. But the greatest risk is that the ever increasing melt rate of Arctic ice could take the climate beyond a tipping point, where climate change becomes both irreversible and uncontrollable in a way we are unable to adapt to.

Visualisation of reduction in Arctic sea ice thickness (from Andy Lee Robinson, via ClimateProgress)

So as usual, my remedy for all of this is: stop burning carbon.

When the Party’s Over: Permian Mass Extinction

“The implication of our study is that elevated CO2 is sufficient to lead to inhospitable conditions for marine life and excessively high temperatures over land would contribute to the demise of terrestrial life.”

WHO: Jeffrey T. Kiehl, Christine A. Shields, Climate Change Research Section, National Center for Atmospheric Research, Boulder, Colorado, USA

WHAT: A complex climate model of atmospheric, ocean and land conditions at the Permian mass extinction 251 million years ago to look at CO2 concentrations and their effect.

WHEN: September 2005

WHERE: Geological Society of America, Geology vol. 33 no. 9, September 2005

TITLE: Climate simulation of the latest Permian: Implications for mass extinction

The largest mass extinction on earth occurred approximately 251 million years ago, at the end of the Permian geologic era. Almost 95% of all ocean species and 70% of land species died, and research has shown that what probably drove this extinction was soaring carbon dioxide levels.

As the saying goes, those who do not learn from history are doomed to repeat it, so let’s see what happened to the planet 251 million years ago and work out how we humans can avoid doing it to ourselves at high speed.

This research paper from 2005 ran the first comprehensive climate model of the Permian extinction, which means their model was sophisticated enough to include the interaction between the land and the oceans (as distinct from ‘uncoupled’ models that just look at one or the other and not how they affect each other).

The researchers used the CCSM3 climate model, which is housed at the National Center for Atmospheric Research (NCAR) and is one of the major climate models used by the IPCC to project how our climate may change with increasing atmospheric carbon pollution (or emission reductions). They set up their model with ‘realistic boundary conditions’ for things like ocean layers (25 of them, for those playing at home), atmospheric resolution and energy system balance. They then ran the simulation for 900 years under current conditions and matched it with observed atmospheric conditions, getting all of their data points correct against observed data.

Then, they made their model Permian, which meant taking CO2 concentrations and increasing them from our current 397ppm to 3,550ppm, the estimated CO2 concentration at the end of the Permian era.

What did ramping up the CO2 in this manner do to the planet’s living conditions in the model? It increased the global average temperature to a very high 23.5°C (the historic global average temperature for the Holocene, our current era, is 14°C).

Oceans
Changing the CO2 concentrations so dramatically made the model’s global average ocean surface temperature 4°C warmer than current conditions. Looking at all the ocean layers in their model, the water was warmer in deeper areas as well, with some areas at depths of 3000m below sea level measuring 4.5-5°C where they are currently near freezing.

The greatest warming in the oceans occurred at higher latitudes, where ocean temps were modelled at 8°C warmer than present, while equatorial tropical oceans were not substantially warmer. The oceans were also much saltier than they currently are.

The big problem for all of the things that called the ocean home at the end of the Permian era is the slowing of ocean circulation and mixing. Currently, dense salty water cools at the poles and sinks, oxygenating and mixing with deeper water allowing complex organisms to grow and live. If this slows down, which it did in this model, it has serious consequences for all ocean residents.

Current ocean circulation patterns (NOAA, Wikimedia commons)

Their Permian model measured ocean overturning circulation at around 10 Sv (million cubic metres per second), whereas current ocean overturning circulation is around 15-23 Sv. The researchers think the ocean currents could have slowed down enough to create anoxic oceans, which are also known as ‘ocean dead zones’ or ‘Canfield Oceans’, and stated that this set the stage for a large-scale marine die-off.

Land
If the end of the Permian was pretty nasty for ocean residents, how did it fare for land-dwellers? What happened to the tetrapods of Gondwanaland? Well, the earth looked really different to how it looks today.

Permian land mass (Wikimedia commons)

There were deciduous forests at high latitudes, and the elevated CO2 in the model was the dominant reason for warm, ice free Polar Regions (which also hindered ocean circulation). Land surface temperatures were between 10 – 40°C warmer than they are today. In their model, dry sub-tropical climates like the Mediterranean or Los Angeles and Southern California were much hotter, with the average daily minimum temperatures around 51°C. Yes, Los Angeles, your overnight low could be 51°C.

Understandably, the authors state that ‘these extreme daily temperature maxima in these regions could contribute to a decrease in terrestrial flora and fauna’, which is scientist for ‘it’s so damn hot outside nothing except cacti can grow’.

All of these changes were run over a 2,700 year period in the model, which, if you take the 2005 CO2 concentration of 379ppm as your base, is an increase of 1.17ppm per year.

This is the important bit to remember if we’re going to learn from history and not go the way of the Permian residents. Our current rate of increase in CO2 concentration is 2ppm per year, which means we are on a super speed path to mass extinction. If we continue with business as usual, which has been aptly renamed ‘business as suicide’ by climate blogger Joe Romm, we will be at the end of the next mass extinction in around 1,500 years.
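The arithmetic behind those two rates is easy to check (397ppm is the present-day figure quoted earlier in this piece):

```python
# The Permian CO2 ramp vs today's ramp, using the concentrations quoted above
permian_ppm, base_2005_ppm = 3550, 379
model_years = 2700

print(f"Permian ramp: {(permian_ppm - base_2005_ppm) / model_years:.2f} ppm/year")  # ~1.17

current_rate = 2.0   # ppm per year today
print(f"Years to Permian levels at today's rate: {(permian_ppm - 397) / current_rate:.0f}")
# ~1,576 years, i.e. the 'around 1,500 years' above
```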

Where humanity is headed (from Royal Society Publishing)

All we need to do to guarantee this outcome is keep the status quo and keep burning fossil fuels. The entire sum of humanity as a species on this planet will then be a tiny geological blip where we turned up, became the most successful invasive species on the globe, burned everything in sight, and kept burning it even when we knew it was killing us.

However, I think this part from the paper’s conclusion should give most of us pause for thought:

 ‘Given the sensitivity of ocean circulation to high-latitude warming, it is hypothesized that some critical level of high-latitude warming was reached where connection of surface waters to the deep ocean was dramatically reduced, thus leading to a shutdown of marine biologic activity, which in turn would have led to increased atmospheric CO2 and accelerated warming.’

As a species, if we are going to survive we need to make sure we do not go past any of those critical levels of warming or tipping points, which means we need to stop burning carbon as fast as possible. Otherwise, T. rex will have outlasted us as a species by about two million years, which would be kinda embarrassing.

Sleepwalking off a Cliff: Can we Avoid Global Collapse?

‘Without significant pressure from the public demanding action, we fear there is little chance of changing course fast enough to forestall disaster’
Drs. Paul and Anne Ehrlich

WHO: Paul R. Ehrlich, Anne H. Ehrlich, Department of Biology, Stanford University, California, USA

WHAT: An ‘invited perspective’ from the Royal Society of London for Improving Natural Knowledge (the Royal Society) on the future of humanity following the election of Dr. Paul Ehrlich to the fellowship of the Royal Society.

WHEN: 26 January 2013

WHERE: Proceedings of the Royal Society, Biological Sciences (Proc. R. Soc. B) 280, January 2013

TITLE: Can a collapse of global civilization be avoided?

Dr. Paul Ehrlich has been warning humanity about the dangers of exceeding the planet’s carrying capacity for decades. He first wrote about the dangers of over-population in his 1968 book The Population Bomb, and now following his appointment to the fellowship of the Royal Society, he and his wife have written what I can only describe as a broad and sweeping essay on the challenges that currently face humanity (which you should all click the link and read as well).

When you think about it, we’re living in a unique period of time. We are at the beginning of the next mass extinction on this planet, which is something that only happens every hundred million years or so. And since humans are the driving force of this extinction, we are also in control of how far we let it go. So the question is: will we save ourselves, or will we sleepwalk off the cliff?

Drs. Ehrlich describe the multiple pressures currently facing the planet and its inhabitants as a perfect storm of challenges. Not only is there the overarching threat multiplier of climate change, which will make all of our existing problems harder to deal with, but we also face concurrent challenges: the loss of ecosystem services and biodiversity from mass extinction, land degradation, the global spread of toxic chemicals, ocean acidification, infectious diseases and antibiotic resistance, and resource depletion (especially of ground water) with its subsequent resource conflicts.

Wow. That’s quite the laundry list of problems we’ve got. Of course, all these issues interact not only with the biosphere; they also interact with human socio-economic systems, including overpopulation, overconsumption and our current unequal global economic system.

If you haven’t heard the term ‘carrying capacity’ before, it’s the limit any system can sustain before things start going wrong – for instance, if you put 10 people in a 4-person hot tub, it will overflow because you’ve exceeded its carrying capacity.
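For those who like their hot tub metaphors formalised: ecologists usually capture carrying capacity with the logistic growth equation, where growth tapers to zero as a population approaches the capacity K, and turns negative if the population overshoots it. Here’s a minimal sketch; the numbers are made up for illustration and are not from the Ehrlichs’ paper:

```python
# Logistic growth: dN/dt = r * N * (1 - N / K)
# Growth stalls as N approaches the carrying capacity K, and goes
# negative if N overshoots K. Illustrative numbers, not from the paper.

def logistic_step(n, r=0.02, k=10.0, dt=1.0):
    """Advance population n by one year of logistic growth."""
    return n + r * n * (1 - n / k) * dt

n = 12.0  # start *above* the carrying capacity of 10 -- overshoot
for year in range(0, 301, 100):
    print(f"year {year}: population {n:.2f}")
    for _ in range(100):
        n = logistic_step(n)
```

The point of starting the sketch above K: once a population has overshot its carrying capacity, the same equation drags it back down whether or not anyone plans for it, which is more or less the Ehrlichs’ warning.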

The bad news is we’ve already exceeded the planet’s carrying capacity. For the planet to sustainably house its current 7 billion people, we would need an extra half a planet to provide for everyone. If we wanted all 7 billion of us to over-consume at the living standards of the USA, we would need between 4 and 5 extra planets to provide for everyone. Better get searching, NASA!
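Those planet counts also imply something about relative consumption levels. A quick sketch of the inference; the Earth counts come from the paper as quoted above, while the ratio is my own back-of-the-envelope reading, not a figure the authors state:

```python
# What the planet counts imply about relative consumption.
# Earth counts are from the text above; the ratio is my inference.

EARTHS_NOW = 1.5           # our planet plus 'an extra half a planet'
EARTHS_US_LEVEL = 1 + 4.5  # our planet plus '4-5 extra planets' (midpoint)

ratio = EARTHS_US_LEVEL / EARTHS_NOW
print(f"US-level consumption is roughly {ratio:.1f}x the global average")
# Output: roughly 3.7x
```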

The Andromeda Galaxy (photo: ESA/NASA/JPL-Caltech/NHSC)

The next problem is that a global collapse could be triggered by any one of the above issues, with cascading effects, although Drs. Ehrlich think the biggest key will be feeding everyone (which I’ve written about before), because the social unrest triggered by mass famine would make dealing with all the other problems almost impossible.

So what do we need to do? We need to restructure our energy sources and remove fossil fuel use from agriculture, although Drs. Ehrlich point out that peaking fossil fuel use by 2020 and halving it by 2050 will be difficult. There’s also the awkward ethics of knowingly continuing to run a lethal yet profitable business; hence the highly funded climate denial campaigns trying to keep the party running for Big Oil a little longer, which will get in the way of change.

The global spread of toxic compounds can only be managed and minimised as best we can. Similarly, we don’t have many answers for the spread of infectious and tropical diseases, or for the increasing antibiotic resistance, that will come with climate change.

Helpfully, Drs. Ehrlich point out that the fastest way to cause a global collapse would be any kind of nuclear conflict, even what they refer to as a ‘regional conflict’ like one between India and Pakistan. But even without nuclear warfare (which I hope never happens!), 6 metres of sea level rise would displace around 400 million people.

One of the most important things we can do right now to help humanity survive a bit longer on this planet is population control. We need fewer people on this planet (and not just because I dislike screaming children in cafes and on airplanes), and Drs. Ehrlich think that instead of asking ‘how can we feed 9.6 billion people in 2050?’, scientists should be asking ‘how can we humanely make sure it’s only 8.6 billion people in 2050?’

How can we do that? Firstly, we need to push back against what they refer to as the ‘endarkenment’: the rise of religious fundamentalism that rejects enlightenment ideas like freedom of thought, democracy, separation of church and state, and basing beliefs on empirical evidence. That rejection leads to climate change denial, failure to act on biodiversity loss, and opposition to the use of contraceptives.

And why do we need to push back against people who refuse to base their beliefs on empirical evidence? Because the fastest and easiest way to control population growth is female emancipation. Drs. Ehrlich point out that giving women everywhere full rights, education and opportunities as well as giving everyone on the planet access to safe contraception and abortion is the best way to control population growth (you know, letting people choose whether they’d like children).

More importantly, Drs. Ehrlich want the world to develop a new way of thinking systematically about these problems, which they call ‘foresight intelligence’. Since societies rarely manage to mobilise around slow-moving threats the way they do around immediate ones, we need new mechanisms for greater cooperation between people, because we are not going to succeed as a species if we don’t work together.

They’d like to see the development of steady-state economics, which would do away with fables such as ‘technological innovation will save us’. They’d also like to see natural scientists working with social scientists to look at the dynamics of social movements, sustainability and equality, and to scale up the places where that kind of work is already happening.

They point out that our current methods of governance are inadequate to meet the challenges we face, and that we need to work with developing nations that are currently looking to reproduce western nations’ ‘success’ of industrialisation, so that they can instead become leaders of the new economy, because playing catch-up will lead to global collapse.

Do Drs. Ehrlich believe that we can avoid a global collapse of civilisation? They think we still can, but only if we get fully into gear and work together now, because unless we restructure our way of doing things, nature will do it for us. It’s your call, humanity – shall we get going, or will we sleepwalk our species off the cliff?