Extreme tornado outbreaks have become more common, says study
Most death and destruction inflicted by tornadoes in North America occurs during outbreaks: large-scale weather events that can last one to three days and span huge regions. The largest ever recorded happened in 2011. It spawned 363 tornadoes across the United States and Canada, killing more than 350 people and causing $11 billion in damage.
Now, a new study shows that the average number of tornadoes in these outbreaks has risen since 1954, and that the chance of extreme outbreaks, tornado factories like the one in 2011, has also increased.
The study's authors said they do not know what is driving the changes. “The science is still open,” said lead author Michael Tippett, a climate and weather researcher at Columbia University's School of Engineering and Applied Science and Columbia's Data Science Institute. “It could be global warming, but our usual tools, the observational record and computer models, are not up to the task of answering this question yet.” Tippett points out that many scientists expect the frequency of atmospheric conditions favorable to tornadoes to increase in a warmer climate, but even today, the right conditions don't guarantee a tornado will occur. In any case, he said, “When it comes to tornadoes, almost everything terrible that happens, happens in outbreaks. If outbreaks contain more tornadoes on average, then the likelihood they'll cause damage somewhere increases.”
The results are expected to help insurance and reinsurance companies better understand the risks posed by outbreaks, which can also generate damaging hail and straight-line winds. Over the last 10 years, the industry has covered an average of $12.5 billion in insured losses each year, according to Willis Re, a global reinsurance advisor that helped sponsor the research. The article appears this week in the journal Nature Communications.
Every year, North America sees dozens of tornado outbreaks. Some are small and may give rise to only a few twisters; others, such as the so-called “super outbreaks” of 1974 and 2011, can generate hundreds. In the simplest terms, the intensity of each tornado is ranked on a zero-to-five scale (the Fujita, and later the Enhanced Fujita, scale). The lower gradations cause only light damage, while the top ones, like the twister that tore through Joplin, Missouri, in 2011, can tear the bark off trees, rip houses from their foundations, and turn cars into missiles.
For this study, the authors calculated the mean number of tornadoes per outbreak for each year as well as the variance, or scatter, around this mean. They found that while the total number of tornadoes rated F/EF1 and higher each year hasn't increased, the average number per outbreak has, rising from about 10 to about 15 since the 1950s.
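As a rough illustration of that bookkeeping (a minimal sketch, not the authors' code), the following assumes each outbreak is summarized by its year and its count of F/EF1-and-stronger tornadoes, and computes the per-year mean and variance of tornadoes per outbreak:

    # Illustrative sketch only, not the study's code. Assumes each outbreak
    # record is a (year, tornado_count) pair for tornadoes rated F/EF1 or higher.
    from collections import defaultdict
    from statistics import mean, pvariance

    def per_year_stats(outbreaks):
        """Return {year: (mean tornadoes per outbreak, variance)} from (year, count) pairs."""
        counts_by_year = defaultdict(list)
        for year, count in outbreaks:
            counts_by_year[year].append(count)
        return {year: (mean(counts), pvariance(counts))
                for year, counts in counts_by_year.items()}

    # Example with made-up numbers: three outbreaks in 1955, three in 2010.
    demo = [(1955, 8), (1955, 12), (1955, 9), (2010, 10), (2010, 30), (2010, 6)]
    print(per_year_stats(demo))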
The study was coauthored by Joel Cohen, director of the Laboratory of Populations, which is based jointly at Rockefeller University and Columbia's Earth Institute. Cohen called the results “truly remarkable.”
The analysis showed that as the mean number of tornadoes per outbreak rose, the variance around that mean rose four times faster. While the mean rose by a factor of 1.5 over the last 60 years, the variance rose by a factor of more than 5, or 1.5 x 1.5 x 1.5 x 1.5 (1.5 raised to the fourth power). This kind of relationship between variance and mean has a name in statistics: Taylor's power law of scaling.
“We have seen [Taylor's power law] in the distribution of stars in a galaxy, in death rates in countries, the population density of Norway, securities trading, oak trees in New York and many other cases,” Cohen says. “But this is the first time anyone has shown that it applies to scaling in tornado statistics.”
The exponent in Taylor's law (in this case, 4) can be a measure of clustering, Cohen says. If there's no clustering, that is, if tornadoes occur at random, then Taylor's law has an exponent of 1; if there's clustering, the exponent is greater than 1. “In most ecological applications, the Taylor exponent seldom exceeds 2. To have an exponent of 4 is truly exceptional. It means that when it rains, it really, really, really pours,” says Cohen.
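In practice, Taylor's exponent can be estimated as the slope of a straight-line fit between the logarithm of the yearly variance and the logarithm of the yearly mean. The sketch below, which uses invented numbers rather than the study's data, shows how such a fit recovers an exponent of 4 when the variance grows as the fourth power of the mean:

    # Illustrative sketch of Taylor's power law, variance ~ a * mean**b.
    # The yearly (mean, variance) pairs below are invented for demonstration;
    # the slope of a log-log fit estimates the exponent b (about 4 in the study).
    import math

    def taylor_exponent(means, variances):
        """Least-squares slope of log(variance) against log(mean)."""
        xs = [math.log(m) for m in means]
        ys = [math.log(v) for v in variances]
        n = len(xs)
        x_bar, y_bar = sum(xs) / n, sum(ys) / n
        num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
        den = sum((x - x_bar) ** 2 for x in xs)
        return num / den

    # Synthetic data obeying variance = 0.002 * mean**4, so the fit should return about 4.
    means = [10, 11, 12, 13, 14, 15]
    variances = [0.002 * m ** 4 for m in means]
    print(round(taylor_exponent(means, variances), 2))  # -> 4.0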
Extreme outbreaks have become more frequent because of two factors, Tippett said. First, the average number of tornadoes per outbreak has gone up; second, the rapidly increasing variance, or variability, means that numbers well above the average are more common.
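The intuition can be made concrete with a toy calculation. The sketch below uses a simple normal approximation and invented numbers, not the paper's statistics, to show that when the mean shifts up a little and the variance grows a lot, outbreak counts far above the average stop being vanishingly rare:

    # Toy illustration only, not the paper's model. sigma is the standard
    # deviation, so the variance is sigma**2. The second call roughly mirrors
    # the reported changes: mean up by a factor of 1.5, variance up about fivefold.
    import math

    def tail_prob(threshold, mu, sigma):
        """P(X > threshold) for a normal variable with mean mu and std. dev. sigma."""
        return 0.5 * math.erfc((threshold - mu) / (sigma * math.sqrt(2)))

    # Chance of an outbreak producing more than 40 tornadoes:
    print(tail_prob(40, mu=10, sigma=8))    # about 0.00009: very rare
    print(tail_prob(40, mu=15, sigma=18))   # about 0.08: far more likely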
Tippett was concerned that the findings could be artifacts of tornado observational data, which are based on eyewitness accounts and known to have problems with consistency and accuracy. To get around this, he re-ran his calculations after replacing the historical tornado data with environmental proxies for tornado occurrence and for the number of tornadoes per occurrence. These provide an independent, albeit imperfect, measure of tornado activity. The results were very nearly identical.
As for whether the climate is the cause, Tippett said, “The scientific community has thought a great deal about how the frequency of future weather and climate extremes may change in a warming climate. The simplest change to understand is a shift of the entire distribution, but increases in variability, or variance, are possible as well. With tornadoes, we're seeing both of those mechanisms at play.”
“This paper helps begin to answer one of the fundamental questions to which I'd like to know the answer,” says Harold Brooks of the U.S. National Oceanic and Atmospheric Administration's National Severe Storms Laboratory. “If tornadoes are being concentrated into more big days, what effect does that have on their impacts compared to when they were less concentrated?”
“The findings are very relevant to insurance companies that are writing business in multiple states, especially in the Midwest,” says Prasad Gunturi, senior vice president at Willis Re, who leads the company's catastrophe model research and evaluation activities for North America. “Overall growth in the economy means more buildings and infrastructure are in harm's way,” said Gunturi. “When you combine that increased exposure with outbreaks that generate more tornadoes across state lines, and that could be getting more extreme in general, it means more loss to the economy and to insurance portfolios.”
Insurance companies have contracts with reinsurance companies, and these contracts look similar to the ones people have for home and car insurance, though for much higher amounts. The new results will help companies ensure that contracts are written at an appropriate level and that the risks posed by outbreaks are better characterized, said Brooks.
“One big question raised by this work, and one we're working on now, is what in the climate system has been behind this increase in outbreak severity,” said Tippett.
###
This research was also supported by grants from Columbia's Research Initiatives for Science and Engineering, the Office of Naval Research, NOAA's Climate Program Office and the U.S. National Science Foundation.
The paper, “Tornado outbreak variability follows Taylor's power law of fluctuation scaling and increases dramatically with severity,” is available from the authors.
Scientist contacts:
Michael Tippett 212-851-5936 mkt14@columbia.edu
Joel Cohen 212-327-8883 cohen@mail.rockefeller.edu
More information: Kevin Krajick, Senior editor, science news, The Earth Institute kkrajick@ei.columbia.edu 212-854-9729
The Earth Institute, Columbia University mobilizes the sciences, education and public policy to achieve a sustainable earth.