The biggest swarm of tornadoes ever recorded—148 in all—rumbled across the U.S. Midwest and South on 3–4 April 1974. A network of radars operated by NOAA’s National Weather Service caught many of the tornadic circulations, and dozens of watches and warnings issued by the service gave many people ample time to seek cover. Still, more than 300 people died in what became known as the Jumbo Outbreak.
Severe weather research was still in its infancy at the time, and NCAR was one of its incubators. A small group of researchers, some of them involved with the National Hail Research Experiment, or NHRE (see page 18), turned their attention to supercells, the long-lived thunderstorms that spawned the most violent tornadoes. Would it be possible to zero in on the storm scale and generate a supercell inside a computer model?
Two young scientists—Joseph Klemp (NCAR) and Robert Wilhelmson (University of Illinois at Urbana-Champaign)—gave it a try. Until then, most weather modeling had simulated motions and processes spanning thousands of miles. On a smaller scale, modelers at NCAR and elsewhere had made some progress in simulating severe thunderstorms in two dimensions, an approach that worked well for the linear nature of nontornadic squall lines.
However, supercells were more complex and three-dimensional, a portrait that emerged from a wealth of satellite and radar observations as well as still and moving imagery. Photos and film came from the first research-based storm chase projects, launched in the 1970s in Texas and Oklahoma, while data from the hail project helped scientists develop conceptual models that underscored the 3-D nature of storms. Nobody had yet tried to turn these concepts into software, though. To model a supercell realistically, Klemp and Wilhelmson would need to build a 3-D model with a domain large enough to encompass the storm and its environs, but detailed enough to capture important features within the thunderstorm.
The computational challenge was immense. Temperature, moisture, and wind near thunderstorms can vary sharply across small distances and short time scales. Klemp and Wilhelmson pared the computing demands in carefully chosen ways, drawing on basic research in such areas as turbulence and cloud physics. Technology also gave Klemp and Wilhelmson a major boost with the 1977 arrival of NCAR’s first supercomputer, the Cray-1 (see page 21).
Among other things, the Klemp-Wilhelmson model captured the development of a splitting supercell, one that breaks into left- and right-moving pieces under the influence of powerful upper-level winds. Over the subsequent years, as universities gained computing prowess, dozens of scientists adopted the Klemp-Wilhelmson techniques, and the field of severe-storm modeling came into its own. Today, both scientists serve as senior leaders at the same institutions from which they forged a classic NCAR-university collaboration.
Wilhelmson looks forward to tornado simulation using models with resolutions as fine as 10 meters (33 feet), compared to the 2 kilometers (1.2 miles) used in his earliest simulations with Klemp. “We didn’t understand at the time the impact we would have on the modeling community,” Wilhelmson observes. “What we did know, however, is that we had a great time working together.”
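The jump from 2-kilometer to 10-meter resolution is easy to underestimate. A rough back-of-the-envelope sketch (the domain size and vertical spacings here are illustrative assumptions, not figures from the article) shows how quickly the grid-point count grows as the spacing shrinks:

```python
# Illustrative only: grid-cell counts for a hypothetical storm-scale domain
# at different horizontal/vertical resolutions. Domain dimensions and the
# 500 m coarse vertical spacing are assumptions for the sake of arithmetic.

def grid_points(domain_km, depth_km, dx_m, dz_m=500):
    """Cells in a square domain of side domain_km, depth depth_km,
    with horizontal spacing dx_m and vertical spacing dz_m (meters)."""
    nx = int(domain_km * 1000 / dx_m)   # points along each horizontal axis
    nz = int(depth_km * 1000 / dz_m)    # points in the vertical
    return nx * nx * nz

# Hypothetical 100 km x 100 km x 20 km storm environment
coarse = grid_points(100, 20, 2000)            # ~2 km spacing, early runs
fine = grid_points(100, 20, 10, dz_m=10)       # ~10 m spacing throughout

print(coarse)        # 50 * 50 * 40 = 100,000 cells
print(fine)          # 10,000 * 10,000 * 2,000 = 200 billion cells
print(fine // coarse)  # two-million-fold increase in grid points alone
```

And because finer grids also force shorter time steps, the true cost multiplier is larger still, which is why storm-scale modeling has always tracked the leading edge of supercomputing.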
"I think we were all surprised by how well the communications worked."
—Paul Markowski, Pennsylvania State University
The spring of 2009 saw what was perhaps the largest array of vehicles ever deployed for an atmospheric study. Although the Great Plains were oddly devoid of severe weather for weeks, the second Verification of the Origins of Rotation in Tornadoes Experiment (VORTEX2 or V2) captured one twister in unprecedented detail, as well as a number of potentially tornadic thunderstorms that never made the grade.
More than 50 vehicles were on the road from early May to mid-June, spanning a study area that stretched from Texas to southern Minnesota. On each “go” morning, participants headed out to a potentially stormy area, eventually zeroing in on a county-sized target where a tornadic supercell was deemed possible over a several-hour window. As they stayed in touch through multiple cellphone networks and a Web-based chat program, the teams formed a dense set of observing platforms as the storm passed. Each evening the groups reconvened, typically hundreds of miles from the day’s starting point. As NCAR participant Timothy Lim observed, “It’s hard to get your laundry done when you don’t know what state you’ll be in by nightfall.”
Major advances in technology pushed V2 far beyond the bounds of the first VORTEX study, conducted in 1994–95. Several sets of mobile Doppler radars were deployed, each with its own specifications and strengths, along with an array of other observing systems. A tornado on 5 June in Goshen County, Wyoming, was sampled at close range as frequently as every few seconds, yielding an unprecedented data set.
Nontornadic cases also drew keen interest. Models have trouble depicting the small pools of cool, moist air that descend from severe storms and appear to help trigger tornadoes. V2 gathered wind, temperature, and moisture data from a number of these cold pools, including several from storms that developed strong rotation but no tornadoes.
Even after a second season of sampling in 2010—and a total of more than 20 tornadoes documented—much more work lies ahead. “We’re not expecting ‘Eureka!’ data,” says NCAR visiting scientist Joshua Wurman (Center for Severe Weather Research), whose mobile radars had already sampled more than 140 tornadoes before V2. “If there were low-hanging fruit, it would have been plucked already. The features we’re looking for are subtle, and they’ll only come out through an arduous and complicated synthesis of data.”