The quest to understand turbulence

New branches of NCAR research sometimes emerge through the advent of new technology, a change in national priorities, or a disastrous weather event. Other research topics have threaded their way through the center’s entire half-century history. Such is the case with turbulence in the atmosphere, magnetosphere, and oceans, a subject in which each advance in understanding has been hard won.

Several fruitful developments in turbulence research at NCAR in the 1990s were built on a foundation dating back to the 1960s. Associate director Philip Thompson founded an informal Turbulence Club, and NCAR’s Cecil Leith and Jackson Herring were prime contributors to turbulence research. Douglas Lilly and James Deardorff developed a numerical technique that came to be known as large-eddy simulation (LES), which Deardorff premiered at a 1969 symposium. “Jim’s numerical results agreed amazingly well with the detailed measurements of turbulent channel flow made over many years,” says John Wyngaard (Pennsylvania State University), who studied turbulence at NCAR in the 1980s. “The fluid engineers must have wondered, ‘Who are these guys?’ ”

NCAR’s Chin-Hoh Moeng collaborated with Wyngaard to expand the range of LES techniques. “LES is now our most powerful numerical tool for turbulence research,” Moeng says. In less than a week of run time, she can simulate turbulence with thousands of times more grid points than models could accommodate in the early 1990s. Scientists have used LES to shed light on the flow through vast sheets of marine stratocumulus clouds, as well as the dynamics that drive severe thunderstorms on Earth and magnetic storms in and near the Sun.
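At the heart of any LES code is a subgrid-scale closure that models the motions too small for the grid to resolve. The sketch below illustrates the classic Smagorinsky closure, one common choice (this is not NCAR’s code; the constant, grid spacing, and two-dimensional restriction are illustrative). The subgrid eddy viscosity is taken as (C_s·Δ)² times the magnitude of the resolved strain rate:

```python
import numpy as np

def smagorinsky_viscosity(u, v, dx, cs=0.17):
    """Smagorinsky subgrid eddy viscosity nu_t = (cs*dx)**2 * |S| on a
    uniform 2-D grid, where |S| = sqrt(2 * S_ij * S_ij) and S_ij is the
    resolved strain-rate tensor. Arrays are indexed u[j, i] (j along y)."""
    dudy, dudx = np.gradient(u, dx)   # gradients along axis 0 (y) and axis 1 (x)
    dvdy, dvdx = np.gradient(v, dx)
    s11 = dudx
    s22 = dvdy
    s12 = 0.5 * (dudy + dvdx)
    strain_mag = np.sqrt(2.0 * (s11**2 + s22**2 + 2.0 * s12**2))
    return (cs * dx) ** 2 * strain_mag
```

For a uniform shear flow (u proportional to y, v = 0) the viscosity is constant everywhere, which makes the closure easy to sanity-check before applying it to real turbulent fields.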

Examining a 1992 incident involving clear-air turbulence, NCAR scientists used a high-resolution model to study horizontally aligned vortex tubes that swept eastward (left to right) from Colorado’s Front Range mountains. (Visualization courtesy NCAR/CISL/VETS.)

By the 1990s, these disparate lines of work—plus regular workshops on turbulence held since 1974—had an institutional home: the NCAR Geophysical Turbulence Program. GTP has served as a loose-knit framework by which scientists at NCAR, universities, and federal laboratories can share notes and meet regularly. One 1999 symposium attracted more than 100 mathematicians, atmospheric scientists, and oceanographers—a large chunk of the world’s leading turbulence researchers.

“In many ways, our workshops and summer schools are the core of the program, along with the development of community tools,” says Annick Pouquet, who joined NCAR in 2000 and has headed GTP ever since. “We make a special effort to choose topics that reflect the universality of turbulence and bring together scientists from many disciplines.”

Even with a shot of computational adrenaline, turbulence modelers in the 1990s couldn’t ignore real-world data. “History shows us that the path to truth in turbulence has never strayed far from observations,” says Wyngaard. One study, led by Robert Kerr, Terry Clark, and William Hall, used an NCAR supercomputer to successfully portray bursts of clear-air turbulence that tore an engine off a DC-8 cargo jet near Denver on 9 December 1992. When pushed to its highest resolution, the model revealed something previously unseen: narrow tubes of circulation, extending east from the Rocky Mountains (see graphic), in which wind speeds changed by as much as 90 miles per hour (40 meters per second) over a mere eighth of a mile (200 meters). “The pilot didn’t expect to hit what he hit,” said Kerr.

At that time, pilots received only subjective reports of choppy air from other pilots. New warning systems, produced at NCAR starting in the 1990s, arose from a solid grounding in research. With support from the Federal Aviation Administration, NCAR’s Larry Cornman and colleagues crafted software that uses aircraft themselves as instruments—converting the up-and-down bouncing detected by onboard sensors into a measure of the currents buffeting the plane. This system is now an international standard, and radar-based techniques for spotting turbulence are also gaining ground.
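The core idea behind such in-situ systems is to treat the aircraft’s own motion as the measurement: the stronger the turbulence, the larger the fluctuations recorded by onboard accelerometers. The following is a deliberately simplified stand-in (the windowing, sampling rate, and RMS metric are illustrative, not the actual FAA/NCAR algorithm) that converts a series of vertical-acceleration samples into a per-window roughness measure:

```python
import numpy as np

def vertical_gust_metric(accel_g, sample_hz=16.0, window_s=10.0):
    """Running RMS of detrended vertical acceleration (in units of g),
    computed over non-overlapping windows -- a crude proxy for the
    turbulence intensity a given stretch of flight subjected the aircraft to."""
    n = int(sample_hz * window_s)              # samples per window
    a = np.asarray(accel_g, dtype=float)
    a = a - np.mean(a)                         # remove the steady ~1-g component
    m = len(a) // n                            # number of complete windows
    windows = a[: m * n].reshape(m, n)
    return np.sqrt(np.mean(windows ** 2, axis=1))
```

A smooth stretch of flight yields a metric near zero, while a bumpy stretch stands out immediately; real systems additionally account for aircraft type and flight condition so that reports from different planes are comparable.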


Today — Tracking turbulent flow: the visceral and the virtual


"NCAR is at the vanguard of research on atmospheric turbulence."

—Harindra "Joe" Fernando, University of Notre Dame

Just as the view in a smashed mirror can become unrecognizable, researchers have long struggled to see what happens when large atmospheric eddies break up into smaller circulations. Huge amounts of energy are dissipated along the way, across scales as small as a few millimeters. Conversely, small-scale motions may coalesce to influence large-scale behavior. This problem—often popularized as the “butterfly effect”—was identified in the 1960s by frequent NCAR visitor Edward Lorenz (Massachusetts Institute of Technology) as a hindrance to predicting weather more than a few days in advance.
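Lorenz demonstrated this sensitivity with a deliberately stripped-down three-variable convection model, now known as the Lorenz (1963) system. The sketch below (using a simple forward-Euler integrator with an illustrative step size) shows the hallmark behavior: two trajectories that start a hundred-millionth of a unit apart eventually diverge to completely different states.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz (1963) system."""
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return np.array([x + dt * dx, y + dt * dy, z + dt * dz])

def trajectory(state, steps):
    """Integrate the system from an initial state, returning all points."""
    out = [np.array(state, dtype=float)]
    for _ in range(steps):
        out.append(lorenz_step(out[-1]))
    return np.array(out)

# Two nearly identical initial conditions: z differs by one part in 10^8.
a = trajectory([1.0, 1.0, 1.0], 5000)
b = trajectory([1.0, 1.0, 1.0 + 1e-8], 5000)
# The separation grows roughly exponentially until it saturates at the
# size of the attractor, which is why small errors cap forecast skill.
```

The same exponential error growth, scaled up to the real atmosphere, is what limits deterministic weather prediction to roughly two weeks.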

One time-tested way to catch turbulence in action is with aircraft-mounted probes, a technique pioneered by NCAR’s Donald Lenschow and James Telford in the 1960s. Today, the NSF/NCAR C-130 and Gulfstream-V aircraft boast systems that can measure turbulence on scales as fine as 5 meters (16 feet).

Closer to terra firma, an NCAR-led, NSF-funded study in 2007 examined the turbulence in air currents flowing within and just above the canopy of a California walnut orchard. The project followed a similar NSF study in 2004—led by NCAR, the Woods Hole Oceanographic Institution, and Pennsylvania State University—that sampled winds just above the ocean surface and documented wind-wave interactions.

Taking their lead from observational data, modelers at NCAR and elsewhere are generating increasingly detailed portraits of turbulence in action. A burst of progress occurred in 2006 with a simulation of electric currents in magnetized fluids, such as those found in Earth’s magnetosphere and the Sun’s corona. Produced by Annick Pouquet and Pablo Mininni (NCAR) and David Montgomery (Dartmouth College), the simulations depicted elegantly coiled cylinders similar to the rows of curls found in Kelvin-Helmholtz clouds (see below).

High-resolution modeling of electromagnetic currents in the presence of turbulent flow, such as those observed in the solar corona and Earth’s magnetosphere, depicts “roll-ups” of current sheets (the cylindrical features at the center and top of the graphic at left). These roll-ups are produced by instabilities similar to those responsible for Kelvin-Helmholtz clouds (right). (Visualization on left courtesy NCAR/CISL/IMAGe; photo on right by Ben Foster, NCAR.)

“These are structures that you wouldn’t discern by just looking at the physical equations,” notes NCAR’s Douglas Nychka. He leads the applied mathematics institute at NCAR that includes GTP as well as the Turbulence Numerics Team, a group of mathematicians and software engineers hashing out strategies for portraying turbulence in models.

In the last decade, NCAR’s turbulence modeling has aided the U.S. departments of Homeland Security and Defense in simulating how winds might loft hazardous chemicals close to the Pentagon, New York, and other potential targets of terrorism. Fine-scale modeling is critical when it comes to depicting the flow of airborne toxins, whether released intentionally or inadvertently. Software now incorporates turbulence within a weather model that can map atmospheric conditions over areas ranging from an entire region (such as the mid-Atlantic) to a single building.