Extreme Overachiever Decade 2001-2010

The World Meteorological Organisation’s global climate report for the last decade shows it winning all the ‘most extreme’ awards.

WHO: The World Meteorological Organisation (WMO), in conjunction with international experts and meteorological and climate organisations.

WHAT: The Global Climate Report for the decade 2001-2010

WHEN: July 2013

WHERE: Online at the WMO website

TITLE: The Global Climate 2001-2010 A Decade of Climate Extremes (open access)

The World Meteorological Organisation (WMO) recently released their wrap-up of the last decade’s worth of global weather-related data. Now, as you will all remember, climate is the long-term trend of the weather (generally over a 30 year period), but it’s also important to keep track of the decade-to-decade trends, if only because 10 is a nice round number to deal with.

So what did the last decade’s report card look like? Turns out 2001-2010 was an overachiever when it came to records and extremes, which is bad news for all of us, really. The last decade was the warmest on record for overall temperatures, and the speed of warming has ramped up as well: the long-term warming trend is 0.062°C/decade, but since 1970 it has sped up to 0.17°C/decade.

Decade by decade temperature record (from report)

If you only look at land surface temperatures, 2007 held the record for hottest at 0.95°C above average, with 2004 the ‘least warm’ (there are no long-term cold records happening here) at 0.68°C above average.

If you look at sea surface temperatures, 2003 wins with 0.40°C above average and 2008 was the least warm at 0.26°C above average. The warmest year in the Northern Hemisphere was 2007 at 1.13°C above average, and in the Southern Hemisphere 2005 wins with 0.67°C.

When it comes to rain, it’s a mixed bag. There were places that got more rain, there were places with drought, there were more extremes. South America was wetter than normal, Africa was drier than normal. Overall, 2002 was the driest year of the decade and 2010 was the wettest. The problem with rain patterns changing with climate change is that the location and the time frame changes. Instead of slow soaking rains, it’s extremes of dry spells followed by flash flooding.

The patterns of El Niño and La Niña switched back quite a lot during the decade, with El Niño generally creating warmer trends and La Niña creating cooler trends.

El Niño and La Niña trends for sea surface temperatures (from report)

To qualify as an extreme event, an event needs to result in 10 or more people dying, 100 or more people being affected, a declaration of a state of emergency and the need for international assistance – which I think is a pretty high bar. But of course, since the last decade was overachieving, there were 400 disasters of this scale, which killed more than 1 million people.

Weather disasters represented 88% of these extreme events, and the damage from these events has increased significantly, along with a 20% increase in casualties from the previous decade. The extra casualties came from some extreme increases in certain categories like heatwaves: in 1991-2000, 6,000 people died from heatwaves; in 2001-2010 that jumped to 136,000 people.

The price of extreme weather events has also gone up, with 7,100 hydrometeorological events carrying a price tag of US$1 trillion and resulting in 440,000 deaths over the decade. It’s also estimated that 20 million people worldwide were displaced, so this was probably our first decade with a sizable number of climate refugees. Internal displacement will be one of the biggest factors as people move away from the more extreme parts of their country to the places where it still rains (eg. from Arizona to Oregon).

Tropical storms were a big issue, with the report noting ‘a steady increase in the exposure of OECD countries [to tropical cyclones] is also clear’. It’s nice to see them point out that issues around extreme weather are not just a developing-world problem stemming from a lack of infrastructure to deal with them – as shown by the flooding in Germany and Australia last decade.

There was also a special shout-out to my homeland of Australia, for the epic heatwave of 2009, where I experienced 46°C in Melbourne and can attest to it not being fun. Of course, that epic heatwave was beaten by 2013’s new extreme map colour. I also noted that Australia was getting pretty much all of the extreme weather effects over the decade. Ouch.

Australia – can’t catch a break (compiled from report)

Even in the category of ‘coldwaves’, where the Northern Hemisphere saw an increase in freak snowstorms, the average winter temperature was still 0.52°C warmer than average.

Last decade was also setting lots of records in the cryosphere (the frozen part of the planet). The five lowest Arctic sea ice extents on record all occurred in 2005-2010 – sea ice has been declining in extent and volume at a disturbingly rapid rate in what is commonly known as the Arctic Death Spiral. There’s been an acceleration of mass loss from the Greenland and Antarctic ice sheets and a decrease in glaciers globally.

Arctic Death Spiral (from report)

The World Glacier Monitoring Service describes the glacier melt rate and cumulative loss as ‘extraordinary’, noting that glaciers are currently so far from their equilibrium state that even without further warming they’re still going to keep melting. Oh, and last decade won the record for loss of snow cover too.

Declining snow cover (from report)

Sea level rose faster last decade, at a rate of 3 mm/yr – nearly double the historical average of 1.6 mm/yr. Interestingly, sea level rise is not even across the globe, due to ocean circulation and mass distribution. In a La Niña year, the Western Pacific Ocean can be 10-20cm higher than the decadal average, but there’s only one way sea levels are going as the water warms and expands and the ice sheets melt – up.

Sea level rise (from report)

Finally, if all of that wasn’t enough bad news for you, the report looked at gas concentrations in the atmosphere and found (surprise, surprise) that CO2 is up, accounting for 64% of the increase in radiative forcing (what makes our planet warmer) over the past decade and 81% of the increase over the last five years. Methane is responsible for 18% of the increase and nitrous oxide chips in 6%.

Does it make anyone feel better if I tell you the hole in the ozone layer isn’t getting bigger anymore?

Basically, the world we live in is getting more extreme as it heats up at an ever-increasing rate. Given that these are the changes we’re seeing with a 0.8°C increase in global average temperatures – and that’s from carbon we burnt over a decade ago – how about we stop burning carbon with a little more urgency now?

Your odds just shortened – Aussie Heatwaves

Climate change has increased the chance of extreme heatwaves in Australia by more than five times.

WHO: Sophie C. Lewis and David J. Karoly, School of Earth Sciences and ARC Centre of Excellence for Climate System Science, University of Melbourne, Victoria, Australia

WHAT: Looking at how much influence human-caused climate changes are pushing Australian summers into more extreme heat.

WHEN: July 2013

WHERE: Geophysical Research Letters (DOI: 10.1002/grl.50673), pre-publication release

TITLE: Anthropogenic contributions to Australia’s record summer temperatures of 2013 (subs. req)

There’s some interesting research happening at my alma mater, Melbourne University, these days (go Melbourne!). Even if you weren’t there to experience the extreme summer of 2012-13 in Australia, I’m sure you all remember the new colour the Australian Bureau of Meteorology had to create for the weather maps when they maxed out above 50°C, or maybe the new bushfire rating of ‘catastrophic’ for the climate-fuelled fires that are beyond just an extreme risk?

Extreme Heat in January 2013 (Bureau of Meteorology)

So, to answer that age-old question ‘exactly how much have we messed this up?’, these researchers looked at historical monthly weather patterns, weather patterns with natural forcing only, and patterns with both natural and human forcing, and matched those up with what actually happened.

They looked at the average monthly mean temperatures, maximum temperatures and minimum temperatures, and found that monthly extremes are increasing faster than daily extremes – that is, there are more months that are extreme overall than there are individual days of extremes.

The historical data they used for the experiment was from 1850 to 2005, with the baseline climate data (what they used as a reference for ‘normal’) being 1911-1940 because 30 years of weather data makes a climate!

They then created experiments for the data with natural forcing only, with natural and human forcing and ran exciting statistical functions like a probability density function with a kernel smoothing function that almost sounds like a super-cool popcorn maker.
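
A minimal sketch of the kernel-smoothing idea behind that popcorn-sounding method: place a Gaussian ‘bump’ on every data point and average them into a smooth probability density function. The sample anomalies and bandwidth below are invented for illustration – this is not the authors’ code.

```python
import math

def gaussian_kde(samples, bandwidth):
    """Return a smoothed probability density function built by placing
    a Gaussian bump on every sample and averaging the bumps."""
    norm = len(samples) * bandwidth * math.sqrt(2 * math.pi)
    def density(x):
        return sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                   for s in samples) / norm
    return density

# Hypothetical summer temperature anomalies (degrees C above average)
pdf = gaussian_kde([0.1, 0.3, -0.2, 0.5], bandwidth=0.3)
```

With a curve like this for the natural-only world and another for the natural-plus-human world, you can read off how much more probable an extreme season has become.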

To double check for error, they used the second hottest summer temperatures to make sure they could pick out the human influences from the randomness that can be the data, thereby deliberately making their findings conservative.

Once they’d run their fun popcorn-sounding numbers, they calculated the FAR – Fraction of Attributable Risk – which is exactly what it sounds like: the fraction of the risk of something happening that can be attributed to a cause.

So if our ‘bad guy’ is human-induced climate change, how much can we blame it for the Australian Angry Summer of 2012-13? Well, a fair bit.

When they compared the numbers, they had 90% confidence that there was a 2.5 times increase in extreme heat from human influences. When they compared the 1976-2005 data and extended the model out to 2020, that increased again to a 5 times greater likelihood.

Extreme heat is ‘substantially more likely’ because of humans burning fossil fuels – pretty bold words from research scientists: when there’s a greater than 90% chance of something, they say ‘very likely’ where most people would say ‘certain’. In their research, events that should occur 1-in-16 years naturally were happening 1-in-6 years in the historical data and 1-in-2 years in the model out to 2020. Ouch – summer just got more uncomfortable, more often.
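
A back-of-envelope version of the FAR calculation, using the return periods quoted above – my own sketch of the standard formula, not the authors’ code:

```python
def fraction_attributable_risk(p_natural, p_forced):
    """FAR = 1 - P(event | natural-only world) / P(event | forced world):
    the share of today's risk that wouldn't exist without the forcing."""
    return 1 - p_natural / p_forced

# 1-in-16 years naturally vs 1-in-6 (historical) and 1-in-2 (2020 model)
print(fraction_attributable_risk(1 / 16, 1 / 6))  # ≈ 0.625
print(fraction_attributable_risk(1 / 16, 1 / 2))  # ≈ 0.875
```

In other words, roughly 60-90% of the risk of a summer like that one is down to the human influence, depending on which comparison you use.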

MTSOfan, flickr

For me, the kicker came when the paper pointed out that the 2012-13 summer in Australia came in a La Niña year. Extreme heat events normally come with an El Niño year – La Niña years are cooler with more rain. So the fact that the Angry Summer occurred in a La Niña year is scary – sea surface temperatures were average or cooler in some places while at the same time the Bureau of Meteorology was scrambling for new map colours.

The paper concludes that their research supports ‘a clear conclusion that anthropogenic climate change had a substantial influence on the extreme summer heat over Australia’ and that these kinds of events are now five times as likely to occur. Welcome to summers on climate change?

Greenland Whodunit

“The next 5–10 years will reveal whether or not [the Greenland Ice Sheet melting of] 2012 was a one off/rare event resulting from the natural variability of the North Atlantic Oscillation or part of an emerging pattern of new extreme high melt years.”

WHO: Edward Hanna, Department of Geography, University of Sheffield, Sheffield, UK
 Xavier Fettweis, Laboratory of Climatology, Department of Geography, University of Liège, Belgium
Sebastian H. Mernild, Climate, Ocean and Sea Ice Modelling Group, Computational Physics and Methods, Los Alamos National Laboratory, USA, Glaciology and Climate Change Laboratory, Center for Scientific Studies/Centre de Estudios Cientificos (CECs), Valdivia, Chile
John Cappelen, Danish Meteorological Institute, Data and Climate, Copenhagen, Denmark
Mads H. Ribergaard, Centre for Ocean and Ice, Danish Meteorological Institute, Copenhagen, Denmark
Christopher A. Shuman, Joint Center for Earth Systems Technology, University of Maryland, Baltimore, USA,  Cryospheric Sciences Laboratory, NASA Goddard Space Flight Center, Greenbelt, USA
Konrad Steffen, Swiss Federal Research Institute WSL, Birmensdorf, Switzerland, Institute for Atmosphere and Climate, Swiss Federal Institute of Technology, Zürich, Switzerland, Architecture, Civil and Environmental Engineering, École Polytechnique Fédéral de Lausanne, Switzerland
 Len Wood, School of Marine Science and Engineering, University of Plymouth, Plymouth, UK
Thomas L. Mote, Department of Geography, University of Georgia, Athens, USA

WHAT: Trying to work out the cause of the unprecedented melting of the Greenland Ice Sheet in July 2012

WHEN: 14 June 2013

WHERE: International Journal of Climatology (Int. J. Climatol.) Vol. 33 Iss. 8, June 2013

TITLE: Atmospheric and oceanic climate forcing of the exceptional Greenland ice sheet surface melt in summer 2012 (subs req.)

Science can sometimes be like being a detective (although I would argue it’s cooler) – you’ve got to look at a problem and try to work out how it happened. These researchers set out to do exactly that: to work out how the hell 98.6% of the surface of the Greenland ice sheet started melting last summer.

Greenland – July 8th on the left only half melting. July 12th on the right, almost all melting (Image: Nicolo E. DiGirolamo, SSAI/NASA GSFC, and Jesse Allen, NASA Earth Observatory)

For a bit of context, Greenland is the kind of place where the average July temperature in the middle of summer is 2°C, and the average summer temperature at the summit of the ice sheet is -13.5°C. Brrr – practically beach weather! So there’s got to be something weird going on for the ice sheet to start melting like that. Who are the suspects?

Atmospheric air conditions
Suspect number one is the atmospheric air conditions. The summer of 2012 was strongly influenced by ‘dominant anti-cyclonic conditions’, where warm southerly air moves north, resulting in warmer and drier conditions. There was also a highly negative North Atlantic Oscillation (NAO), which created high temperatures at high altitudes around 4km above sea level – which could explain the melting at the summit. The researchers also pointed out that the drier conditions meant less cloud cover and more sunny days, contributing to speedier melting.

There were issues with the polar jet stream that summer, where it got ‘blocked’ and stuck over Greenland for a while. The researchers used the Greenland Blocking Index (GBI), which, while not trading on the NYSE, does measure geopotential height anomalies over Greenland (yeah, lots of meteorological words in this paper!). All of this makes the atmosphere look pretty guilty.

Jet stream getting funky – temperature anomaly patterns at 600 hectopascals pressure, aka 4000m above sea level with a big red blob over Greenland (from paper)

Sea surface temperatures
Suspect number two is sea surface temperatures. If it was warmer in the ocean, that could have created conditions where the ice sheet melted faster, right? The researchers ran a simulation of the conditions around Greenland for the summer of 2012 and then played around with different sea surface temperatures as well as salinity levels. It didn’t make more than a 1% difference, so they don’t think it was the sea surface. Also, ocean temperatures change more slowly than air temperatures (that’s why the ocean is still so cold even in the middle of summer!), and when they looked at the data, sea surface temperature was actually a bit cooler in 2012 than in 2011. Not guilty, sea surface temperatures.

Sea surface temperatures (top) and salinity (bottom) both decreasing (from paper)

Cloud patterns
Suspect number three is cloud cover patterns, which the researchers said could be a contributor to the ice sheet melting. However, they don’t have a detailed enough data set to work out how much clouds could have contributed to the melt. Not guilty for now clouds – due to lack of evidence.

Which leaves suspect number one – atmospheric air conditions. Guilty! Or, as the paper says ‘our present results strongly suggest that the main forcing of the extreme Greenland Ice Sheet surface melt in July 2012 was atmospheric, linked with changes in the summer NAO, GBI and polar jet stream’.

Now comes the scary part – it’s the atmosphere that we’ve been conducting an accidental experiment on over the last 200 years by burning fossil fuels. As the researchers point out, the North Atlantic Oscillation has natural variability and patterns, so we could all cross our fingers and hope that the Greenland melting was a one-off anomaly. Given the work Dr Jennifer Francis has been doing at Rutgers on polar ice melt and how it slows the jet stream and causes it to meander, this may not be a good bet. Combine this with the fact that this level of melting is well beyond ‘the most pessimistic future projections’ and it gets scarier. This kind of melting was not supposed to occur until 2100, or 2050 in the worst-case scenarios.

Interestingly, this could also link to the work Jason Box is doing with his DarkSnow project in Greenland, looking at how soot from fires and industry is affecting Greenland’s melting. The paper concludes that the next 5-10 years will show us whether this was a one-off or the beginning of a new normal. Unless we stop burning carbon, it will only be the start of a terrifying new normal.

Crash Diets and Carbon Detoxes: Irreversible Climate Change

Many of the changes humans are causing in our atmosphere today will be largely irreversible for the rest of the millennium.

WHO: Susan Solomon, Chemical Sciences Division, Earth System Research Laboratory, National Oceanic and Atmospheric Administration, Boulder, Colorado, USA
Gian-Kasper Plattner, Institute of Biogeochemistry and Pollutant Dynamics, Zurich, Switzerland
Reto Knutti, Institute for Atmospheric and Climate Science, Zurich, Switzerland,
Pierre Friedlingstein, Institut Pierre Simon Laplace/Laboratoire des Sciences du Climat et de  l’Environnement, Unité Mixte de Recherche à l’Energie Atomique – Centre National de la Recherche Scientifique–Université Versailles Saint-Quentin, Commissariat a l’Energie Atomique-Saclay, l’Orme des Merisiers, France

WHAT: Looking at the long term effects of climate pollution to the year 3000

WHEN: 10 February 2009

WHERE: Proceedings of the National Academy of Sciences of the USA (PNAS), vol. 106, no. 6 (2009)

TITLE: Irreversible climate change due to carbon dioxide emissions

Stopping climate change often involves the metaphor of ‘turning down the thermostat’ of the heater in your house; the heater gets left on too high for too long, you turn the thermostat back down, the room cools down, we are all happy.

This seems to also be the way many people think about climate change – we’ve put too much carbon pollution in the atmosphere for too long, so all we need to do is stop it, and the carbon dioxide will disappear like fog burning off in the morning.

Except it won’t. This paper – which is from 2009, but I came across it recently while reading around the internet – looks at the long-term effects of climate change and found that the effects of CO2 emissions can still be felt for 1,000 years after we stop polluting. Bummer. So much for that last-minute carbon detox politicians seem to be betting on. Turns out it won’t do much.

The researchers defined ‘irreversible’ in this paper as 1,000 years – to just beyond the year 3000 – because over a human life span, 1,000 years is more than 10 generations. Geologically it’s not forever, but from our human point of view it pretty much is.

So what’s going to keep happening because we can’t give up fossil fuels today that your great-great-great-great-great-great-great-great-great-great grandkid is going to look back on and say ‘well that was stupid’?

The paper looked at the three most detailed and well known effects: atmospheric temperatures, precipitation patterns and sea level rise. Other long term impacts will be felt through Arctic sea ice melt, flooding and heavy rainfall, permafrost melt, hurricanes and the loss of glaciers and snowpack. However, the impacts with the most detailed models and greatest body of research were the ones chosen for this paper (which also excluded the potential for geo-engineering because it’s still very uncertain and unknown).

Our first problem is going to be temperature increases, because temperatures rise with increased CO2 accumulation in the atmosphere. Even if we turned off emissions completely (practically unfeasible, but the best way to model the long-term effects), temperatures would remain roughly constant – within about 0.5°C – until the year 3000.

Why does this occur? Why does the temperature not go back down just as quickly once we stop feeding it more CO2? Because CO2 stays in the atmosphere for a much longer time than other greenhouse gases. As the paper says: ‘current emissions of major non-carbon dioxide greenhouse gases such as methane or nitrous oxide are significant for climate change in the next few decades or century, but these gases do not persist over time in the same way as carbon dioxide.’
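
You can put rough numbers on that persistence using the Bern-style carbon cycle impulse response function from the IPCC AR4 – those published coefficients approximate what fraction of a CO2 pulse is still airborne after a given time. Treat this as an illustration of the general point, not the paper’s own model.

```python
import math

def co2_fraction_remaining(t_years):
    """Fraction of an emitted CO2 pulse still in the atmosphere after
    t years, per the Bern carbon cycle fit used in IPCC AR4 GWPs."""
    return (0.217
            + 0.259 * math.exp(-t_years / 172.9)
            + 0.338 * math.exp(-t_years / 18.51)
            + 0.186 * math.exp(-t_years / 1.186))

print(round(co2_fraction_remaining(100), 2))   # ≈ 0.36
print(round(co2_fraction_remaining(1000), 2))  # ≈ 0.22
```

So even a millennium after a pulse of CO2, roughly a fifth of it is still up there warming the planet – which is why turning off the thermostat doesn’t cool the room.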

Temperature changes to the year 3000 with different CO2 concentration peaks (from paper)

Our next problem is changing precipitation patterns, which can be described by the Clausius-Clapeyron relation governing phase transitions in matter. What it tells us is that as temperature increases, the atmosphere can hold more water vapour, which changes how that vapour is transported through the atmosphere – changing the hydrological cycle.
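
As a rough illustration of what Clausius-Clapeyron means in practice, here’s a common empirical fit (Bolton’s approximation) for how much water vapour saturated air can hold as temperature rises. The example temperatures are arbitrary – this is a sanity-check sketch, not anything from the paper.

```python
import math

def saturation_vapour_pressure(t_celsius):
    """Approximate saturation vapour pressure (hPa) via an empirical
    fit to the Clausius-Clapeyron relation (Bolton 1980)."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

# The atmosphere holds roughly 7% more water vapour per degree of warming
for t in (15, 16, 17):
    print(t, round(saturation_vapour_pressure(t), 2))
```

That ~7%-per-degree increase in moisture capacity is part of why warming shifts rainfall toward fewer, heavier downpours with drier stretches in between.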

The paper notes that these patterns are already happening, consistent with the models, in the Southwest of the USA and the Mediterranean. They found that dry seasons will become approximately 10% drier for each degree of warming, and the Southwest of the USA is expected to be approximately 10% drier with 2°C of global warming. As a comparison, the Dust Bowl of the 1930s was 10% drier over two decades. Given that many climate scientists (and the World Bank) think we’ve already reached the point where 2°C of warming is inevitable, it seems Arizona is going to become a pretty uncomfortable place to live.

Additionally, if we managed to peak at 450ppm of CO2, irreversible decreases in precipitation of ~8-10% in the dry season would be expected in large areas of Europe, Western Australia and North America.

Dry season getting dryer around the world (from paper)

Finally, the paper looked at sea level rise, which is a triple whammy. The first issue is that warming causes seawater to expand (aka thermal expansion), which raises sea level. The second is that ocean mixing through currents will continue, spreading the warming – and the thermal expansion – into deeper water. Thirdly, melting ice on land adds new volume of water to the ocean.

The paper estimates that the eventual sea level rise from thermal expansion alone is 20-60cm per degree of warming. Additionally, the loss of glaciers and small icecaps will give us ~20-70cm of sea level rise, so we’re looking at 40-130cm of sea level rise before we even start counting Greenland (which is melting faster than most estimates anyway).

Sea level rise from thermal expansion only with different CO2 concentration peaks (from paper)

What does all of this mean? Well firstly it means you should check how far above sea level your house is and you may want to hold off on that ski cabin with all the melting snowpack as well.

More importantly though, it means that any last minute ‘saves the day’ Hollywood-style plans for reversing climate change as the proverbial clock counts down to zero are misguided and wrong. The climate pollution that we are spewing into the atmosphere at ever greater rates today will continue to be a carbon hangover for humanity for the next 1000 years or so. Within human time scales, the changes that we are causing to our atmosphere are irreversible.

So naturally, we should stop burning carbon now.

What’s in a Standard Deviation?

“By 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario” Marcott et al.

WHO: Shaun A. Marcott, Peter U. Clark, Alan C. Mix, College of Earth, Ocean, and Atmospheric Sciences, Oregon State University, Corvallis, Oregon, USA
 Jeremy D. Shakun, Department of Earth and Planetary Sciences, Harvard University, Cambridge, Massachusetts, USA.

WHAT: A historical reconstruction of average temperature for the past 11,300 years

WHEN: March 2013

WHERE: Science, Vol. 339 no. 6124, 8 March 2013 pp. 1198-1201

TITLE: A Reconstruction of Regional and Global Temperature for the Past 11,300 Years (subs req.)

We all remember the standard deviation bell curve from high school statistics; where in a population (like your class at school) there will be a distribution of something with most people falling around the mean. The usual one you start off with is looking at the height of everyone in your classroom – most people will be around the same height, some will be taller, some shorter.

The further you get from the mean, the less likely a data point is to occur, because around 68% of the population falls within one standard deviation either side of the mean. However, the important bit to keep in mind when reading about this paper is that the bell curve is usually drawn with three standard deviations either side of the mean, which covers 99.7% of all the data. The odds of a data point falling more than three standard deviations from the mean are about 0.3% in total – roughly 0.15% on either side.
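
Those 68% and 99.7% figures aren’t magic numbers – you can get them straight from the normal distribution with the error function. A quick sanity check (mine, not the paper’s):

```python
import math

def within_k_sigma(k):
    """Probability that a normally distributed value lands within
    k standard deviations of the mean."""
    return math.erf(k / math.sqrt(2))

for k in (1, 2, 3):
    print(k, round(within_k_sigma(k), 4))
# 1 → 0.6827, 2 → 0.9545, 3 → 0.9973
```

Keep those three numbers in mind when the paper starts talking about temperatures 5 to 12 standard deviations out.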

The standard deviation bell curve (Wikimedia commons)

What does high school statistics have to do with global temperature reconstructions? Well, it’s always good to see what’s happening in the world within context. Unless we can see comparisons to what has come before, it can be really hard to see what is and isn’t weird when we’re living in the middle of it.

The famous ‘Hockey Stick’ graph that was constructed for the past 1,500 years by eminent climate scientist Michael Mann showed us how weird the current warming trend is compared to recent geologic history. But how does that compare to all of the Holocene period?

Well, we live in unusual times. But first, the details. These researchers used 73 globally distributed temperature records with various different proxies for their data. A proxy means looking at the chemical composition of something that has been around for longer than our thermometers to work out what the temperature would have been. This can be done with ice cores, slow-growing trees and marine species like coral. According to the NOAA website, fossil pollen can also be used, which I think is awesome (because it’s kind of like Jurassic Park!).

They used more marine proxies than most other reconstructions (80% of their proxies were marine) because they’re better suited for longer reconstructions. The resolutions for the proxies ranged from 20 years to 500 years and the median resolution was 120 years.

They then ran the data through a Monte Carlo randomisation scheme (which is less exotic than it sounds) to account for uncertainty. Specifically, they ran a ‘white noise’ data set with a mean of zero to double-check for any errors. Then the chemical data was converted into temperature data before it all got stacked together into a weighted mean with a confidence interval. It’s like building a layer cake, but with math!
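
A toy version of that Monte Carlo stacking idea – perturb each proxy value within an assumed uncertainty, average into a stack, and repeat many times to see how stable the stack is. The records and uncertainty below are invented for illustration; the real reconstruction also perturbs the dating of each proxy.

```python
import random
import statistics

def monte_carlo_stack(records, uncertainty, n_draws=2000):
    """Stack proxy temperature anomalies into a mean estimate,
    propagating per-record uncertainty by random perturbation."""
    draws = []
    for _ in range(n_draws):
        perturbed = [r + random.gauss(0, uncertainty) for r in records]
        draws.append(statistics.mean(perturbed))
    # Centre of the stack and its spread (a crude confidence interval)
    return statistics.mean(draws), statistics.stdev(draws)

random.seed(42)
mean, spread = monte_carlo_stack([0.12, 0.05, -0.03, 0.20], 0.1)
```

The spread of the stacked means is what becomes the shaded confidence band around the reconstruction.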

Interestingly, with their white noise data, they found the model was more accurate over longer time periods. Variability was preserved best at 2,000 years or more, only half was left at a 1,000-year scale, and the variability was gone below 300 years.

They also found that their reconstruction lined up over the final 1,500 years to present with the Mann et al. 2008 reconstruction and was also consistent with Milankovitch cycles (which ironically indicate that without human interference, we’d be heading into the next glacial period right now).

Temperature reconstructions Marcott et al. in purple (with blue confidence interval) Mann et al. in grey (from paper)

They found that the global mean temperature for 2000-2009 has not yet exceeded the warmest temperatures of the Holocene, which occurred 5,000-10,000 years BP (before present). However, we are currently warmer than 82% of the Holocene distribution.

But the disturbing thing in this graph that made me feel really horrified (and I don’t get horrified by climate change much anymore, because I read so much on it that I’m somewhat desensitised to ‘end of the world’ scenarios) is the rate of change. The paper found that global temperatures have gone from near the coldest of the Holocene (the bottom of the purple bit before it spikes up suddenly) to near the warmest within the past century.

We are causing changes to happen so quickly in the earth’s atmosphere that something that would have taken over 11,000 years has just happened in the last 100. We’ve taken a 5,000 year trend of cooling and spiked up to super-heated in record time.

This would be bad enough on its own, but it’s not even the most horrifying thing in this paper. It’s this:

‘by 2100, global average temperatures will probably be 5 to 12 standard deviations above the Holocene temperature mean for the A1B scenario.’ (my emphasis)

Remember how I said keep in mind that 99.7% of all data points in a population are within three standard deviations on a bell curve? That’s because we are currently heading off the edge of the chart for weird and unprecedented climate, beyond even the 0.1% chance of occurring without human carbon pollution.

The A1B scenario by the IPCC is the ‘medium worst case scenario’ which we are currently outstripping through our continuously growing carbon emissions, which actually need to be shrinking. We are so far out into the tail of weird occurrences that it’s off the charts of a bell curve.

NOAA Mauna Loa CO2 data (396.80ppm at Feb. 2013)

As we continue to fail to reduce our carbon emissions in any meaningful way, we will reach 400ppm (parts per million) of carbon dioxide in the atmosphere in the next few years. At that point we will truly be in uncharted territory for any time in human history, on a trajectory so rapidly changing as to be off the charts, beyond 99.7% of the data for the last 11,300 years. The question for humanity is: are we willing to play roulette with our ability to adapt to this kind of rapidly changing climate?

Too Hot in Texas

New modelling of climate change effects on mosquito populations in the United States has surprising results – it might get too hot in summer even for the mosquitoes

WHO: R A Erickson, S M Presley, Department of Environmental Toxicology, and Institute of Environmental and Human Health, Texas Tech University, Lubbock, Texas
K Hayhoe, Department of Political Science, Texas Tech University, Lubbock, Texas
L J S Allen, Institute of Environmental and Human Health, and Department of Mathematics and Statistics, Texas Tech University, Lubbock, Texas
K R Long, Department of Mathematics and Statistics, Texas Tech University, Lubbock, Texas
S B Cox, Department of Environmental Toxicology, and Institute of Environmental and Human Health, Texas Tech University, Lubbock, Texas and Research and Testing Laboratory, LLC, Lubbock, Texas

WHAT: Population modelling for the Asian Tiger mosquito which carries dengue fever under two climate change scenarios

WHEN: 5 July 2012

WHERE: Environmental Research Letters, Vol. 7, No. 3 (July-Sept 2012)

TITLE: Potential impacts of climate change on the ecology of dengue and its mosquito vector the Asian tiger mosquito (Aedes albopictus)

This group of researchers in Texas decided it would be interesting to take different climate change emissions scenarios from the IPCC and see what effect climate change might have on everybody’s ‘friend’, the Asian Tiger mosquito. For those of you who haven’t met it, the Asian Tiger mosquito is a species that carries dengue fever, which makes you very sick. So, understandably, how climate change affects the population and spread of this mosquito is pretty important.

The Asian Tiger mosquito is not your friend (Wikipedia)

The researchers ran their model for three localised areas in the US – Lubbock TX (where their university is), Atlanta GA and, to look at the potential geographical spread of the mosquito, Chicago IL.

Many of the predicted consequences of climate change are already happening decades ahead of schedule; one of them is the expansion of the tropical belt by around 2-4.8o of latitude since 1979. This wasn’t expected to occur until 2100, which means mosquitoes could be moving north faster than previously predicted.

The climate scenarios used were the A1FI (high emissions) and B1 (medium emissions) from the IPCC Special Report on Emissions Scenarios, which relate to approximately 970ppm (A1FI) and 550ppm (B1) of CO2 in the atmosphere. To give some context for those numbers, we’re currently sitting at 391ppm. 550ppm is where feedback loops have already kicked in and there are large ocean ‘dead zones’ where there’s not enough oxygen for plant and animal life. 970ppm is the IPCC’s ‘worst case scenario’ where there is mass biodiversity loss and a high likelihood of mass extinction events.

IPCC Emissions Scenarios A1FI (above) and B1 (below)

Anyway, back to mosquitoes. The researchers used three of the world’s best and most detailed climate models: the CM3 model from the UK’s Hadley Centre, the National Centre for Atmospheric Research model in Colorado, and the National Oceanic and Atmospheric Administration’s CM2.1 model. They took the mean temperature data from their three locations and combined it with the climate models to work out what average temperatures might look like under each scenario. Then they applied those conditions to mosquito populations to see what might change.
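
The paper’s actual population model is a detailed system of equations fitted to mosquito biology, but the shape of the idea – a daily growth rate that peaks at an optimum temperature and flips to die-off above a lethal threshold – can be sketched with entirely made-up numbers (t_opt, t_max, r_max and the seasonal temperature curve below are all illustrative assumptions, not values from the paper):

```python
from math import sin, pi

def growth_rate(temp_c, t_opt=28.0, t_max=38.0, r_max=0.3):
    """Hypothetical daily growth rate: peaks at t_opt, falls to zero 10
    degrees either side of it, and flips to die-off at the lethal t_max."""
    if temp_c >= t_max:
        return -0.2  # mid-summer die-off above the lethal threshold
    return r_max * (1.0 - ((temp_c - t_opt) / (t_max - t_opt)) ** 2)

def simulate_year(warming=0.0, carrying_capacity=1e6):
    """Daily logistic population update driven by an idealised seasonal
    temperature cycle, shifted up by a uniform warming offset."""
    pop, trajectory = 1.0, []
    for day in range(365):
        temp = 20.0 + 15.0 * sin(2 * pi * (day - 80) / 365) + warming
        r = growth_rate(temp)
        if r >= 0:
            pop = pop + r * pop * (1.0 - pop / carrying_capacity)
        else:
            pop = pop * (1.0 + r)  # straight decline during die-off
        pop = max(pop, 1.0)  # a small reservoir of mosquitoes always survives
        trajectory.append(pop)
    return trajectory

base = simulate_year(warming=0.0)
warm = simulate_year(warming=4.0)
print(f"baseline midsummer population: {base[171]:,.0f}")
print(f"+4 degrees midsummer population: {warm[171]:,.0f}")
```

With no warming, midsummer sits below the lethal threshold and the population keeps growing; add a few degrees and midsummer crosses it, producing exactly the kind of mid-season crash the paper found for Lubbock and Atlanta.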

What they found was very interesting, and not what the researchers had originally expected. While the population size and duration of the mosquito season in Chicago increased across the board along with the potential dengue fever outbreak size, in Lubbock and Atlanta the mid-summer temperatures got too hot even for the mosquitoes.

Chicago (left) and Lubbock (right) with mid and end of century predictions. Chicago has an increase in mosquito population while Lubbock has a noticeable mid-summer die-off of mosquitoes (from paper)

While the mosquito season in Lubbock started earlier and had a potential for greater dengue fever outbreaks, the super-hot summer temperatures under both of the climate change scenarios modelled led to mosquitoes dying and a reduction in potential dengue fever outbreaks. This could have many social and health policy ramifications in the areas studied and also shows that the local level effects of climate change may manifest in ways we haven’t previously thought of.

Humans are notoriously difficult to predict, and we don’t know yet what humanity will do about climate change in the near future. Combine that with natural systems and feedbacks that are highly integrated and complex, and a change in one process may trigger changes in another, seemingly unrelated one.

However, complexity doesn’t mean that models aren’t relevant or useful, or that the proverbial baby should be thrown out with the bathwater. Models give us a range of possibilities to plan for and allow humans the opportunity to act in our own long term best interests.

Currently, we’re not acting for our long term wellbeing: humanity is burning carbon at a rate that matches or beats the A1FI high emissions scenario, which very probably leads to mass extinction, including humans. Which means that now would be the time to stop burning fossil fuels – before Texas becomes so scorching hot that even the mosquitoes die from the mid-summer heat.

Improved Drought Prediction: Now With Six Soil Layers

Predicting the severity of drought using multiple indices

WHO: Liu Sun, Scott W. Mitchell (Department of Geography and Environmental Studies, Geomatics and Landscape Ecology Research Laboratory, Carleton University, Ottawa, Ontario, Canada)
Andrew Davidson (Department of Geography and Environmental Studies, Carleton University, Ottawa; National Land and Water Information Service, Agriculture and Agri-Food Canada, Ottawa, Ontario)

WHAT: Improving the accuracy of drought prediction in the Canadian prairies

WHEN: September 2012

WHERE: International Journal of Climatology, Vol 32, Issue 11, September 2012

TITLE: Multiple drought indices for agricultural drought risk assessment on the Canadian prairies (subs req)

Are you tired of your drought prediction methods using only two layers of soil structure to track moisture? Sick of having to work with constants when you’d much rather be using dynamically calculated values? Well, this paper is for you.

Drought is going to be a big issue with climate change as rainfall patterns change and move. Agricultural yields are not increasing as quickly as the world’s population, but people still need to eat.

Drought is going to affect all of us as extreme weather increases from climate change, whether it’s through increased food prices (I’m still upset about bacon), local water restrictions (stop hosing down concrete – stop it now), local ecosystems being stressed or climate refugees from newly arid areas. This is one of the great ironies about climate change – you can’t negotiate with or spin physics. The laws of physics aren’t going to change because of some slick advertising campaign trying to prop up a floundering status-quo, and climate change isn’t going to avoid you if you ignore it.

In terms of drought modelling and prediction, each method currently in use predicts drought in a slightly different way, which means they can’t easily be compared. For this paper, the researchers modified the original Palmer Drought Severity Index to include more variable data: they accounted for six soil layers and used a new evaporation calculation, and instead of using constant numbers for the characteristics of the climate, they allowed each of those to be calculated too. For most people this means a giant math headache from all the extra calculations, but by allowing for greater variability they also allowed for greater sensitivity and accuracy in their model. The new model was then tested for accuracy against the Palmer Drought Severity Index, Standardised Precipitation Index and Palmer Moisture Anomaly Index methods.
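
To see why more soil layers buy you more realism, here’s a toy bucket-style water balance – emphatically a sketch, not the paper’s modified Palmer index, and the six layer capacities are made-up numbers:

```python
def step_soil_moisture(layers, capacities, precip, pet):
    """One time step of a toy multi-layer soil water balance (illustrative
    only; the paper's modified Palmer index is far more detailed).
    All values in mm: precipitation fills layers from the top and the excess
    percolates downward; evaporative demand (pet) dries the top layers first."""
    layers = list(layers)
    water = precip
    for i, cap in enumerate(capacities):  # recharge, top layer down
        added = min(cap - layers[i], water)
        layers[i] += added
        water -= added
    runoff = water  # whatever the whole profile cannot hold runs off
    demand = pet
    for i in range(len(layers)):  # drying, top layer down
        drawn = min(layers[i], demand)
        layers[i] -= drawn
        demand -= drawn
    return layers, runoff

# six layers with made-up capacities: a wet month followed by evaporation
capacities = [25, 25, 50, 50, 75, 75]
layers, runoff = step_soil_moisture([0] * 6, capacities, precip=100, pet=30)
print(layers, runoff)  # → [0, 20, 50, 0, 0, 0] 0
```

With only two layers you can’t distinguish ‘dry at the surface but wet at depth’ from ‘dry all the way down’, which is exactly the distinction that matters for drought severity.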

For any of these models to work, they need approximately 30 years of monthly weather data (temperature, rain, etc.). This paper looked at 1976-2003, as that was the period with the most consistent data for the area being studied (the Canadian Prairies).

Then they got into the serious math, using all kinds of things like a ‘thin plate smoothing spline’ surface-fitting method to remove noise from the data, and a linear regression to remove yield differences caused by improving agricultural practices, allowing them to look only at the data that was climate-affected.
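
The yield-detrending step can be sketched generically: fit an ordinary least squares line to yield-versus-year, then subtract the fitted trend so only the year-to-year wobble remains (the numbers below are invented for illustration, not from the paper):

```python
def detrend(years, yields):
    """Fit an ordinary least squares line to yield-vs-year and subtract it,
    leaving the residuals that (ideally) reflect climate rather than
    steadily improving farming technology."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(yields) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, yields))
             / sum((x - mean_x) ** 2 for x in years))
    intercept = mean_y - slope * mean_x
    return [y - (intercept + slope * x) for x, y in zip(years, yields)]

# a perfectly linear yield series detrends to all zeros
print(detrend([2000, 2001, 2002, 2003, 2004], [10.0, 10.5, 11.0, 11.5, 12.0]))
```

Anything left over after the trend is removed is the signal the researchers actually wanted: the good and bad years that the weather, not the technology, produced.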

The different models: red dot indicating the new model. Spot on for most, slightly under for some (from paper)

It went pretty well: their predictions were more accurate than the other standard drought prediction methods, except for extreme drought, which their model under-predicted. This is possibly because there weren’t many data points with extreme drought in the previous 30 years, so as extreme weather becomes more common under climate change, their model will probably become more accurate. They also found that the model is more accurate for arid locations, as flooding messes up the model.

As the extreme, unpredictable realities of climate change start to affect everyone in the next decade or so, this drought prediction model will likely be very useful. Predicting the extremes as best we can is going to become an essential tool for preventing massive crop failures as well as loss of human lives.