Going to Computational Extremes on Weather Extremes
by Bill Chameides | September 13th, 2011
posted by Erica Rowell (Editor)
Using the search for intelligent life to search for climate answers?
There are lots of reasons for being concerned about global warming. Sea-level rise, growing stress on ecosystems due to shifting seasons and temperature ranges…. (Just last week a paper in the journal Geophysical Research Letters reported that much of the globe in the latter half of the 20th century experienced an upward trend in minimum nighttime temperatures and that the pattern of this increase matched what models predict from increased concentrations of greenhouse gases.)
Climate Change and Extreme Weather
But for many the most worrisome aspect of global warming is the spectre of an increasing severity and frequency of weather extremes — more droughts and wildfires, hurricanes, snowstorms, downpours, etc.
This year has provided Americans with a grim reminder of just how vulnerable we are to these types of events:
- The record-breaking drought that has devastated southern states, especially Texas, and that shows no signs of dissipating;
- Record-setting wildfires;
- Historic flooding besetting the Midwest (see also here); and
- The Northeast’s record flooding in the wakes of Hurricanes Irene and Lee (which in some cases has wiped out the fall harvest).
These events and more have combined to put 2011 into the No. 1 slot for the most “billion dollar” weather disasters to hit the U.S. homeland since 1980, when tracking of these events began. (More on this here.) Tragically, lives as well as dollars were lost in these events.
OK, so extreme weather is one of the more threatening aspects of our existence on the planet. But what does this have to do with global warming?
One can point out that in a general sense global warming might enhance the intensity and frequency of extreme weather, but that does not mean that any individual event, no matter how extreme, was conclusively due to a long-term rise in global temperatures. After all, one of the hallmarks of weather is its variability and seeming unpredictability.
In fact, weather is so unpredictable that forecasters are rarely definitive — notice how weather predictions are couched in statistical terms? Rare is the pronouncement that “tomorrow it will rain”; more common is something along the lines of “there’s a 70 percent chance of thunderstorms tomorrow.” Similarly, even for an extremely extreme or rare event, there’s almost always a nonzero probability that it is part of the natural variability of the system and not related to global warming.
Climate Scientists Take a Page From the Weatherman’s Book
All this seems to put us in an untenable position: One of the most serious threats of global warming is an increase in extreme weather events, but it would seem that except in a general way we are and will always be unable to assess whether global warming is having any effect on these events.
At least that was the thinking until recently. Now those clever little climate scientists, taking a page out of the weather forecasters’ playbook, are taking a probabilistic approach to better assess what is behind a given extreme weather event. Rather than trying to determine if such-and-such a hurricane was caused by global warming (which is essentially impossible), they’re assessing how much more likely said hurricane is as a result of global warming. It’s an approach called “fractional attribution.”
One of the first studies to adopt this approach was undertaken by Peter Stott of the UK Met Office’s Hadley Centre. In 2004 Stott and his co-authors reported in the journal Nature on Europe’s record-breaking heat wave of 2003, the devastating scorcher of a month that led to some 35,000 heat-related deaths. The authors used a series of four climate simulations to assess the relative probabilities of extreme heat waves in Europe, both with and without human-caused greenhouse gas emissions. They concluded that “it is very likely (confidence level >90%) that human influence has at least doubled the risk of a heatwave exceeding” the extremes experienced in 2003.
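The “at least doubled the risk” statement maps onto a standard quantity in this literature, the fraction of attributable risk (FAR): compare the probability of the event in a climate without human influence against the probability with it. A minimal sketch of the arithmetic, using made-up probabilities rather than Stott’s actual values:

```python
# Fraction of attributable risk (FAR) for an extreme event.
# p_nat: probability of exceeding the threshold in a "natural" climate
# p_ant: the same probability with anthropogenic greenhouse gases included
# The probabilities below are illustrative placeholders, not Stott et al.'s.

def fraction_attributable_risk(p_nat: float, p_ant: float) -> float:
    """FAR = 1 - p_nat / p_ant; FAR >= 0.5 means the risk at least doubled."""
    return 1.0 - p_nat / p_ant

p_nat = 0.001   # say, a 1-in-1000-year heat wave without human influence
p_ant = 0.0025  # the same threshold exceeded more often with warming

far = fraction_attributable_risk(p_nat, p_ant)
print(f"risk ratio = {p_ant / p_nat:.1f}, FAR = {far:.2f}")
# A risk ratio of 2.5 gives FAR = 0.60: in this toy case, more than half
# of the event's risk would be attributable to human influence.
```

Note that FAR never says the event was caused by warming — only what fraction of its likelihood is attributable to it.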
For Better Assessments, Scientists Collect More and More Data
More recently scientists have gone even further in their pursuit of answers. Rather than depending on the limited number of climate simulations possible with in-house computational facilities, scientists have begun to harness the computing power of idle machines across the Internet to carry out thousands of individual climate simulations. (This approach, known as distributed computing, is probably most famous from the SETI@home project, which searches for signs of extraterrestrial intelligence using the otherwise untapped processing power of tens of thousands of computers hooked up to the Internet.) By integrating across all of these individual simulations, it is possible to get a more robust assessment of the probability that human influence contributed to a given event.
Last February the journal Nature published two papers along these lines. The first, by Seung-Ki Min of Environment Canada and colleagues, compared extreme precipitation observations from the Northern Hemisphere over the latter half of the 20th century with close to 40 model simulations to assess whether emissions of greenhouse gases from human activities change the frequency of these types of events. The model simulations showed an increasing trend in intense precipitation events over the study period that was consistent with the observed trend, indicating that our activities are a factor driving the increase (more on this here).
The other paper, by Pardeep Pall of the University of Oxford and colleagues, looked at the likelihood of global warming contributing to an extreme flooding event that occurred in the United Kingdom in 2000. The researchers ran thousands of simulations of the weather conditions leading up to the flooding with and without anthropogenic greenhouse gases. Using a probabilistic analysis, the authors found that the same flood was more likely to occur with global warming than without:
“[I]n nine out of ten cases our model results indicate that twentieth-century anthropogenic greenhouse gas emissions increased the risk of floods occurring in England and Wales in autumn 2000 by more than 20%, and in two out of three cases by more than 90%.”
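The logic behind an ensemble comparison like Pall’s can be sketched simply: run many simulations with and without anthropogenic emissions, count how often each ensemble exceeds the flood threshold, and compare the two frequencies. The ensemble sizes and flood counts below are invented for illustration; they are not Pall et al.’s actual numbers:

```python
# Estimate how much anthropogenic emissions increased flood risk by
# counting threshold exceedances in two large simulation ensembles.
# All counts here are illustrative placeholders, not the Pall et al. data.

def exceedance_probability(n_floods: int, n_runs: int) -> float:
    """Fraction of ensemble members in which the flood threshold is exceeded."""
    return n_floods / n_runs

# Hypothetical ensembles: one with year-2000 greenhouse gas levels,
# one with those emissions removed (a "natural" counterfactual climate).
p_anthro = exceedance_probability(n_floods=90, n_runs=2000)   # 4.5%
p_natural = exceedance_probability(n_floods=60, n_runs=2000)  # 3.0%

increase = (p_anthro / p_natural - 1) * 100
print(f"risk increased by {increase:.0f}%")  # 50% in this toy example
```

Because each ensemble member is an independent roll of the weather dice, repeating the comparison across many resampled subsets is what yields statements like “in nine out of ten cases” rather than a single point estimate.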
Now, as reported last week in Nature, scientists propose to develop a more formal program using fractional attribution under the acronym ACE for Attribution of Climate-related Events.
I find this to be an interesting development, not only because of the insights that will be gained about the relationship between climate and extreme weather, but also because of the democratization of climate science — with this, almost anyone could have a climate simulation running on his or her own computer. But there is an obvious limitation — the results are based on climate models, which are not especially well designed to simulate events on the spatial scales relevant to most extreme events.
Will fractional attribution settle the controversy surrounding the human contribution to extreme weather events? Maybe, but I’m not yet extremely confident.