Supercomputers and high-speed data connections play a crucial role when researchers re-create tornadoes and thunderstorms to better understand the dynamics of these powerful natural phenomena.
Leigh Orf of CIMSS, the Cooperative Institute for Meteorological Satellite Studies at the University of Wisconsin-Madison, leads a group of researchers who specialise in re-creating the meteorological events that lead up to tornado formation.
Built on real-world observational data, the computer simulations unveil the inner workings of these monstrous events in unprecedented detail.
According to a report in the leading international HPC magazine Primeur, Leigh Orf’s most recent simulation re-creates the El Reno tornado, the most powerful of several tornadoes that devastated parts of Oklahoma in May 2011.
On May 24, the El Reno tornado, rated EF-5, the strongest category on the Enhanced Fujita scale, remained on the ground for two hours and carved a 63-mile path of destruction, causing extensive property damage and loss of life.
According to Primeur, Leigh Orf designed the high-resolution simulation using archived data from a short-term operational model forecast, in the form of an atmospheric sounding: a vertical profile of temperature, air pressure, wind speed and moisture.
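A sounding like the one described above is essentially a list of measurements at successive heights. The sketch below illustrates the idea as a simple data structure; the field names and values are hypothetical, not the actual format used by Orf's group.

```python
from dataclasses import dataclass

@dataclass
class SoundingLevel:
    """One level of a vertical atmospheric profile (hypothetical schema)."""
    height_m: float        # height above ground, metres
    temperature_c: float   # air temperature, degrees Celsius
    pressure_hpa: float    # air pressure, hectopascals
    wind_speed_ms: float   # wind speed, metres per second
    humidity_pct: float    # relative humidity, percent

# A sounding is a list of levels, ordered from the surface upward
# (illustrative numbers only).
sounding = [
    SoundingLevel(0,    28.0, 1000.0,  5.0, 65.0),
    SoundingLevel(1000, 20.5,  900.0, 12.0, 70.0),
    SoundingLevel(3000,  8.0,  700.0, 18.0, 40.0),
]

surface = sounding[0]
print(f"Surface: {surface.temperature_c} °C at {surface.pressure_hpa} hPa")
```

Feeding such a profile into a storm model defines the environment in which the simulated thunderstorm develops.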
Among other things, the simulation shows the formation of several small “mini-tornadoes” (misocyclones) leading up to the main tornado, as well as the build-up of the streamwise vorticity current (SVC).
The SVC is a helically flowing “tube” of rain-cooled air that is drawn into the updraft that helps drive the powerful storm, Leigh Orf told Primeur.
“It’s believed that this is a crucial part in maintaining the unusually strong storm, but interestingly, the SVC never makes contact with the tornado. Rather, it flows up and around it,” he said.
The SVC is one of many factors that create the conditions allowing tornadoes to form, a process known as tornadogenesis. These parameters are highly unstable and interact in complex ways that are very difficult to re-create in a simulation.
Adding to the complexity, simulations that can yield new knowledge about how these extreme weather phenomena form require real-life data on weather conditions immediately prior to tornado formation, and such data are obviously difficult and dangerous to obtain.
Also, powerful computing resources are needed to process the enormous amounts of data. For this purpose Leigh Orf uses the Blue Waters supercomputer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign.
As well as supercomputing power, Leigh relies on high-speed networks to move data around.
“I do move a lot of data from Blue Waters to other machines, such as my office machine and a local Lustre share, using Globus (a service that lets scientists transfer large amounts of data simply, quickly and securely, ed.). I’d say about half of my analysis is done that way. I regularly use a piece of software called VAPOR, which requires the files, often spanning several TB, to be local to the machine you are using.
“I regularly get some very nice transfer rates from Blue Waters in Urbana, Illinois to my office in Madison, Wisconsin – over 100 MB/s.
“On the other hand I do as much visualization and analysis on Blue Waters as I can since it has a huge disk array and then I don’t have to deal with moving TB of data around.”
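To put those figures in perspective, a back-of-the-envelope calculation shows why Orf prefers to analyse data in place when he can: even at the quoted 100 MB/s, a multi-terabyte transfer takes many hours. The numbers below are illustrative only.

```python
def transfer_hours(size_tb: float, rate_mb_per_s: float) -> float:
    """Rough transfer time in hours, ignoring protocol overhead and retries."""
    size_mb = size_tb * 1_000_000  # 1 TB = 10^6 MB (decimal units)
    return size_mb / rate_mb_per_s / 3600

# Several TB at the ~100 MB/s rate quoted above:
print(f"{transfer_hours(5, 100):.1f} hours")  # prints "13.9 hours"
```

Doubling the transfer rate halves this time, which is why high-speed research networks matter as much as the supercomputer itself.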
Leigh Orf and his collaborators are now working to further refine the tornado simulation model to shed more light on these dangerous and unpredictable weather phenomena.
For the full story, please go to Primeur Magazine.