The GPS revolution

Atmospheric research made enormous gains in the 1990s through the growth of high-speed data exchange facilitated by the Internet. At the same time, another byproduct of government research—the Global Positioning System—was bringing its own benefits to the field. This constellation of satellites, originally created for military use, allowed researchers to pinpoint the behavior of Earth’s atmosphere, oceans, and surface with new precision.

One of the first technologies transformed by GPS was the venerable balloon-borne radiosonde. Weather balloons launched each day from fixed observing sites had been a standby of atmospheric monitoring since the 1930s. As each balloon rose, a transmitter sent back data on temperature and moisture, while a separate set of navigation signals allowed scientists to track the instruments’ location and infer the speed of the winds pushing the balloon onward.

Photo of a man holding a small yellow balloon attached to a dropsonde
Terry Hock led technology development for the NCAR GPS dropsonde, which descends beneath a specialized parachute.

Starting in the late 1980s, an NCAR mobile system made it possible to launch radiosondes wherever a van could be driven. The system’s value grew further when NCAR began using radiosondes that received GPS signals. Since the instruments’ rise could now be tracked with far more precision, wind speeds could be calculated at many more points along the vertical path. “The combination of GPS resolution and mobile launch platforms quickly became a powerful tool and one of our most requested,” says NCAR facility manager Stephen Cohn. “We could go to the weather, rather than waiting for it to come to us.”
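The wind calculation described above follows from a simple idea: the balloon drifts with the wind, so successive position fixes give the mean wind in the layer between them. A minimal sketch (the function name, coordinates, and sampling interval are illustrative, not NCAR's actual processing code):

```python
import math

def wind_from_fixes(fix1, fix2, dt):
    """Estimate horizontal wind (m/s) from two GPS fixes of a balloon.

    fix1, fix2: (latitude_deg, longitude_deg) of successive position fixes.
    dt: seconds between fixes.
    The balloon is assumed to drift with the wind, so its horizontal
    displacement over dt approximates the mean wind in that layer.
    """
    R = 6371000.0  # mean Earth radius, m
    lat1, lon1 = map(math.radians, fix1)
    lat2, lon2 = map(math.radians, fix2)
    # east/north displacement on a locally flat Earth (valid for small dt)
    dx = R * math.cos((lat1 + lat2) / 2) * (lon2 - lon1)  # east, m
    dy = R * (lat2 - lat1)                                # north, m
    u, v = dx / dt, dy / dt       # wind components, m/s
    return u, v, math.hypot(u, v)

# Two fixes one second apart, drifting slightly eastward: ~10 m/s wind
u, v, speed = wind_from_fixes((40.0, -105.0), (40.0, -104.99988), 1.0)
```

The finer the GPS fixes, the smaller `dt` can be, which is why GPS tracking yielded wind estimates at many more heights than earlier navigation methods.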

Perhaps the biggest benefits from GPS-enhanced wind data are in monitoring tropical cyclones, where a hurricane’s fierce gusts can be gauged through dropwindsondes. Like radiosondes in reverse, dropsondes hang from parachutes, tracking conditions as they descend from hurricane-hunting aircraft to the sea. Prior to the GPS era, winds could be estimated about every 300 meters (1,000 feet) along the downward path of a dropsonde.

That vertical resolution sharpened to roughly 10 meters (33 feet) with the advent of a GPS-based dropsonde funded by NOAA and developed by NCAR in the early 1990s. Manufactured by Vaisala, thousands of the new GPS sondes were deployed from Air Force and NOAA flights starting in 1996, just as the Atlantic entered a series of active hurricane seasons. NCAR engineer Terry Hock, who had led the sonde development, said, “It’s almost like having an instrument that never existed before. We’re now measuring quantities down to the very surface of the ocean.”

“In terms of bang-for-the-buck, the NCAR GPS dropsonde program has been a stunning success,” says Kerry Emanuel (Massachusetts Institute of Technology). “A great deal of what we have learned about hurricanes in the last few decades has come from measurement using the GPS sondes, and my own research on surface fluxes in hurricanes deduced from field observations would not have been possible without the new sondes.”

There was one catch: the GPS-boosted sondes yielded data only along their pathways. What many scientists craved was a way to map the full global atmosphere in three dimensions, beyond the limits of individual sondes and the constraints of satellite-borne instruments often hobbled by clouds.

Diagram of part of Earth and satellites
As radio signals from GPS satellites pass through Earth’s atmosphere they are bent and slowed before reaching an occulting low-Earth orbiting (LEO) satellite. By measuring these effects, scientists can infer much about the atmosphere along the signal path.

In 1995, a shoebox-sized GPS sensor—sent to space on a microsatellite not much larger than a suitcase—laid the foundation for atmospheric monitoring that was truly 3-D. The twist in UCAR’s multiagency GPS/Meteorology (GPS/MET) project was that GPS signals themselves could serve as weather instruments to observe the atmosphere. The space-borne sensor intercepted GPS signals that cut through the atmosphere at a shallow angle, using a method called radio occultation (see graphic, next page). As the atmosphere becomes more dense or more moist, the radio signal is slowed and bent slightly. Scientists can use the signal deviations to infer atmospheric density and, in turn, electron densities, temperature, and moisture.
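The inversion rests on a standard link between refractivity and atmospheric state. A minimal sketch using the widely quoted Smith-Weintraub relation (the constants are the textbook values, not anything specific to GPS/MET's processing):

```python
def refractivity(P_hPa, T_K, e_hPa):
    """Atmospheric refractivity N (in 'N-units') from the Smith-Weintraub
    relation:  N = 77.6 * P/T + 3.73e5 * e/T**2

    P_hPa: total pressure (hPa); T_K: temperature (K);
    e_hPa: water vapor partial pressure (hPa).
    The first ("dry") term reflects density; the second ("wet") term
    reflects moisture. The refractive index is n = 1 + N * 1e-6.
    Occultation retrievals invert measured bending angles to get N along
    the ray path, then solve relations like this one for T and moisture.
    """
    return 77.6 * P_hPa / T_K + 3.73e5 * e_hPa / T_K**2

# Surface-like conditions: 1013 hPa, 288 K, 10 hPa water vapor pressure
N = refractivity(1013.0, 288.0, 10.0)   # roughly 318 N-units
n = 1 + N * 1e-6                        # refractive index just above 1
```

Because n departs from 1 by only a few hundred parts per million, the bending is tiny, which is why the precision of GPS timing was essential to making the technique work.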

Less than a month after its launch, the GPS/MET sensor yielded the first profile of Earth’s atmosphere based on its data. Roughly 9,000 more profiles were constructed during the instrument’s two-year active life. The sensor served as a prototype for a much more ambitious network of six microsatellites launched in 2006 (see page 44).

Inspiration for GPS/MET came from past work at NASA’s Jet Propulsion Laboratory and Stanford University that used radio occultation to analyze the atmospheres of Jupiter, Venus, and Mars. The project also marshaled expertise from the University NAVSTAR Consortium, a group of universities involved in GPS-related work. Now independent, UNAVCO was based at UCAR from 1992 to 2003.

“I’ve talked with many of the inventors and developers of the GPS system, and none of them anticipated that it could be used with such high accuracy,” said principal investigator Randolph “Stick” Ware (now with the firm Radiometrics) a few months after the 1996 success.

UCAR president Richard Anthes not only championed GPS-MET; he was one of the project’s principal investigators. “It is rare that a scientist or administrator gets the chance to participate in a first-ever breakthrough that makes a revolutionary advance in Earth sciences and operations,” says Anthes.


Today — A COSMIC success looks ahead

Photo of Lidia Cucurull

"We cannot afford to lose radio occultation data."

—Lidia Cucurull, NOAA

The GPS/MET experiment (page 42) showed that a single instrument in space could intercept GPS radio signals and provide useful data through radio occultation (RO). With support from NSF, NOAA, NASA, and DOD, UCAR collaborated with Taiwan’s National Space Program Office (now the National Space Organization) to produce a six-satellite system now yielding one of the world’s most complete 3-D databases on the global atmosphere.

COSMIC—the Constellation Observing System for Meteorology, Ionosphere, and Climate—is called FORMOSAT in Taiwan, which paid about 80% of the $100 million tab for the project. Launched on 14 April 2006, the system produces between 1,500 and 2,500 atmospheric profiles every day. Each is available within hours from the UCAR-based COSMIC Data Analysis and Archive Center. More than 1,000 users in more than 50 countries had registered to access COSMIC data by mid-2010.

Color visualization of satellites in space
This artist’s illustration shows two of the six microsatellites in COSMIC, which is providing a major boost to the quality and quantity of data needed to improve global weather forecasts, climate monitoring, and space weather monitoring. (Illustration courtesy Orbital Sciences Corporation.)

Among COSMIC’s most intensive users are several of the world’s largest operational forecasting centers, where the data are helping enhance the starting-point conditions for weather forecasting models. At NOAA’s National Centers for Environmental Prediction, temperature biases in models have been reduced and the skill of four-day model forecasts improved by eight hours. “The benefits from adding COSMIC into the operational data stream have been very significant,” says Lidia Cucurull, NOAA’s GPS RO program scientist.

Life in space does have its hazards. COSMIC planners are keeping a wary eye on thousands of bits of space debris so that the microsatellite orbits can be adjusted to avoid collisions. Age takes its toll as well: COSMIC’s projected lifespan extends only through 2011, though some satellites may gather useful data for several more years. Taiwan and NOAA are now planning a follow-on mission, COSMIC II, that could launch in 2014, with the number of satellites expected to increase from 6 to 12 and the number of daily profiles jumping to 8,000 or more.