Weather Research

Solar energy gets boost from new forecasting system

BOULDER, Colo. — A cutting-edge forecasting system developed by a national team of scientists offers the potential to save the solar energy industry hundreds of millions of dollars through improved forecasts of the atmosphere.

The new system, known as Sun4Cast™, has been in development for three years by the National Center for Atmospheric Research (NCAR) in collaboration with government labs, universities, utilities, and commercial firms across the country. Funded by the U.S. Department of Energy SunShot Initiative, the system greatly improves predictions of clouds and other atmospheric conditions that influence the amount of energy generated by solar arrays.

After testing Sun4Cast at multiple sites, the research team has determined that it can be up to 50 percent more accurate than current solar power forecasts. This improved accuracy will enable utilities to deploy solar energy more reliably and inexpensively, reducing the need to purchase energy on the spot market.

The amount of energy gathered by solar panels — such as these in Colorado's San Luis Valley — is influenced by factors including the position and types of clouds, the amount of snow on the ground, and relative humidity. The new Sun4Cast system greatly improves solar irradiance predictions, enabling utilities to deploy solar energy more reliably and inexpensively. (©UCAR. Photo by Sue Ellen Haupt, NCAR. This image is freely available for media & nonprofit use.)

As a result, utilities across the United States may be able to save an estimated $455 million through 2040 as they use more solar energy, according to an analysis by NCAR economist Jeffrey Lazo.

NCAR, which does not provide operational forecasts, makes the technology available so it can be adapted by utilities or private forecasting companies.
The research is being highlighted in more than 20 peer-reviewed papers.

"These results can help enable the nation's expanding use of solar energy," said Sue Ellen Haupt, director of NCAR's Weather Systems and Assessment Program, who led the research team. "More accurate predictions are vital for making solar energy more reliable and cost effective."

The work builds on NCAR's expertise in highly detailed atmospheric prediction, including the design of an advanced wind energy forecasting system.

"This type of research and development is important because it contributes to the reduction in costs for solar and wind energy and makes it easier for utilities to integrate renewables into the electrical grid," said William Mahoney, Deputy Director of NCAR's Research Applications Laboratory. "When it comes to balancing demand for power with supply, it's vital to be able to predict sources of energy as accurately as possible."

Xcel Energy is already beginning to use the system to forecast conditions at several of its main solar facilities.

"Our previous experience with the National Center for Atmospheric Research in developing a wind forecasting system has saved millions of dollars and has been highly beneficial for our customers," said Drake Bartlett, senior trading analyst for Xcel Energy – Colorado. "It is our sincere hope and belief that we will see positive atmospheric forecasting results for predicting solar generation as well, again to the benefit of our Xcel Energy customers."

Energy forecasts out to 72 hours

Using a combination of advanced computer models, atmospheric observations, and artificial intelligence techniques, Sun4Cast provides 0- to 6-hour nowcasts of solar irradiance and the resulting power production for specific solar facilities at 15-minute intervals.
This enables utilities to continuously anticipate the amount of available solar energy.

In addition, forecasts extend out to 72 hours, allowing utility officials to make decisions in advance for balancing solar with other sources of energy.

Solar irradiance is notoriously difficult to predict. It is affected not just by the locations and types of clouds, but also by a myriad of other atmospheric conditions, such as the amount of dust and other particles in the air, relative humidity, and air pollution. Further complicating the forecast, freshly fallen snow, nearby steep mountainsides, or even passing cumulus clouds can reflect sunlight in a way that increases the amount of energy produced by solar panels.

To design a system to forecast solar energy output, NCAR and its partners drew on an array of observing instruments, including satellites, radars, and sky imagers; specialized software; and mathematical and artificial intelligence techniques. Central to Sun4Cast is a new computer model of the atmosphere that simulates solar irradiance based on meteorological conditions.
Called WRF-Solar™, the model is derived from the NCAR-based Weather Research and Forecasting (WRF) model, which is widely used by meteorological agencies worldwide.

The team tested the system in geographically diverse areas, including Long Island, New York; the Colorado mountains; and coastal California.

"We have to provide utilities with confidence that the system maintains a high degree of accuracy year-round in very different types of terrain," said Branko Kosovic, NCAR Program Manager for Renewable Energy.

In addition to aiding the solar power industry, the work can also improve weather forecasting in general through improved cloud prediction.

NCAR's numerous partners on the project in the public and private sectors included:

Government labs: National Renewable Energy Laboratory, Brookhaven National Laboratory, the National Oceanic and Atmospheric Administration's Earth System Research Laboratory, and other NOAA facilities;

Universities: The Pennsylvania State University, Colorado State University, University of Hawaii, and University of Washington;

Utilities: Long Island Power Authority, New York Power Authority, Public Service Company of Colorado, Sacramento Municipal Utility District (SMUD), Southern California Edison, and the Hawaiian Electric Company;

Independent system operators: New York ISO, Xcel Energy, SMUD, California ISO, and Hawaiian Electric; and

Commercial forecast providers: Schneider Electric, Atmospheric and Environmental Research, Global Weather Corporation, MDA Information Systems, and Solar Consulting Services.

Computing time was provided by the New York State Department of Economic Development's Division of Science, Technology and Innovation on an IBM Blue Gene supercomputer at Brookhaven National Laboratory. Researchers also performed computing at the NCAR-Wyoming Supercomputing Center and the DOE National Energy Research Scientific Computing Center.

About the SunShot Initiative

The U.S.
Department of Energy SunShot Initiative is a collaborative national effort that aggressively drives innovation to make solar energy fully cost-competitive with traditional energy sources before the end of the decade. Through SunShot, the Energy Department supports efforts by private companies, universities, and national laboratories to drive down the cost of solar electricity to $0.06 per kilowatt-hour.
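The forecast cadence described earlier — nowcasts at 15-minute intervals out to 6 hours, plus forecasts extending to 72 hours — can be made concrete with a short sketch. The hourly cadence beyond the nowcast window is an assumption for illustration; the article does not state the step size of the longer-range forecasts.

```python
from datetime import timedelta

def sun4cast_style_lead_times(nowcast_hours=6, nowcast_step_min=15,
                              forecast_hours=72, forecast_step_min=60):
    """Generate forecast lead times in the cadence described in the
    article: 15-minute steps out to 6 hours, then hourly steps (an
    assumed cadence) out to 72 hours."""
    leads = []
    t = 0
    while t <= nowcast_hours * 60:          # 0:00, 0:15, ..., 6:00
        leads.append(timedelta(minutes=t))
        t += nowcast_step_min
    t = nowcast_hours * 60 + forecast_step_min
    while t <= forecast_hours * 60:         # 7:00, 8:00, ..., 72:00
        leads.append(timedelta(minutes=t))
        t += forecast_step_min
    return leads

leads = sun4cast_style_lead_times()
print(len(leads))   # 25 nowcast steps plus 66 hourly steps = 91
```

A utility consuming such a feed would refresh the fine-grained 0- to 6-hour portion continuously while using the longer leads for day-ahead scheduling decisions.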

US taps NCAR technology for new water resources forecasts

BOULDER, Colo. — As the National Oceanic and Atmospheric Administration (NOAA) this month launches a comprehensive system for forecasting water resources in the United States, it is turning to technology developed by the National Center for Atmospheric Research (NCAR) and its university and agency collaborators.

WRF-Hydro, a powerful NCAR-based computer model, is the first nationwide operational system to provide continuous predictions of water levels and potential flooding in rivers and streams from coast to coast. NOAA's new Office of Water Prediction selected it last year as the core of the agency's new National Water Model.

"WRF-Hydro gives us a continuous picture of all of the waterways in the contiguous United States," said NCAR scientist David Gochis, who helped lead its development. "By generating detailed forecast guidance that is hours to weeks ahead, it will help officials make more informed decisions about reservoir levels and river navigation, as well as alerting them to dangerous events like flash floods."

WRF-Hydro (WRF stands for Weather Research and Forecasting) is part of a major Office of Water Prediction initiative to bolster U.S. capabilities in predicting and managing water resources. By teaming with NCAR and the research community, NOAA's National Water Center is developing a new national water intelligence capability, enabling better impacts-based forecasts for management and decision making.

The new WRF-Hydro computer model simulates streams and other aspects of the hydrologic system in far more detail than previously possible. (Image by NOAA Office of Water Prediction.)

Unlike past streamflow models, which provided forecasts every few hours and only for specific points along major river systems, WRF-Hydro provides continuous forecasts for millions of points along rivers, streams, and their tributaries across the contiguous United States.
To accomplish this, it simulates the entire hydrologic system — including snowpack, soil moisture, local ponded water, and evapotranspiration — and rapidly generates output on some of the nation's most powerful supercomputers.

WRF-Hydro was developed in collaboration with NOAA and university and agency scientists through the Consortium of Universities for the Advancement of Hydrologic Science, the U.S. Geological Survey, Israel Hydrologic Service, and Baron Advanced Meteorological Services. Funding came from NOAA, NASA, and the National Science Foundation, which is NCAR's sponsor.

"WRF-Hydro is a perfect example of the transition from research to operations," said Antonio (Tony) J. Busalacchi, president of the University Corporation for Atmospheric Research, which manages NCAR on behalf of the National Science Foundation (NSF). "It builds on the NSF investment in basic research in partnership with other agencies, helps to accelerate collaboration with the larger research community, and culminates in support of a mission agency such as NOAA. The use of WRF-Hydro in an operational setting will also allow for feedback from operations to research. In the end this is a win-win situation for all parties involved, chief among them the U.S. taxpayers."

"Through our partnership with NCAR and the academic and federal water community, we are bringing the state of the science in water forecasting and prediction to bear operationally," said Thomas Graziano, director of NOAA's new Office of Water Prediction at the National Weather Service.

Filling in the water picture

The continental United States has a vast network of rivers and streams, from major navigable waterways such as the Mississippi and Columbia to the remote mountain brooks flowing from the high Adirondacks into the Hudson River.
The levels and flow rates of these watercourses have far-reaching implications for water availability, water quality, and public safety.

Until now, however, it has not been possible to predict conditions at all points in the nation's waterways. Instead, computer models have produced a limited picture by incorporating observations from about 4,000 gauges, generally on the country's bigger rivers. Smaller streams and channels are largely left out of these forecast models, and stretches of major rivers for tens of miles are often not predicted — meaning that schools, bridges, and even entire towns can be vulnerable to unexpected changes in river levels.

To fill in the picture, NCAR scientists have worked for the past several years with their colleagues within NOAA, other federal agencies, and universities to combine a range of atmospheric, hydrologic, and soil data into a single forecasting system.

The resulting National Water Model, based on WRF-Hydro, simulates current and future conditions on rivers and streams at points two miles apart across the contiguous United States. Along with an hourly analysis of current hydrologic conditions, the National Water Model generates three predictions: an hourly 0- to 15-hour short-range forecast, a daily 0- to 10-day medium-range forecast, and a daily 0- to 30-day long-range water resource forecast.

The National Water Model predictions using WRF-Hydro offer a wide array of benefits for society. They will help local, state, and federal officials better manage reservoirs, improve navigation along major rivers, plan for droughts, anticipate water quality problems caused by lower flows, and monitor ecosystems for issues such as whether conditions are favorable for fish spawning.
By providing a national view, the model will also help the Federal Emergency Management Agency deploy resources more effectively in cases of simultaneous emergencies, such as a hurricane on the Gulf Coast and flooding in California.

"We've never had such a comprehensive system before," Gochis said. "In some ways, the value of this is a blank page yet to be written."

A broad spectrum of observations

WRF-Hydro is a powerful forecasting system that incorporates advanced meteorological and streamflow observations, including data from nearly 8,000 U.S. Geological Survey streamflow gauges across the country. Using advanced mathematical techniques, the model then simulates current and future conditions for millions of points on every significant river, stream, tributary, and catchment in the United States.

In time, scientists will add additional observations to the model, including snowpack conditions, lake and reservoir levels, subsurface flows, soil moisture, and land-atmosphere interactions such as evapotranspiration, the process by which water in soil, plants, and other land surfaces evaporates into the atmosphere.

Over the last year, scientists have demonstrated the accuracy of WRF-Hydro by comparing its simulations to observations of streamflow, snowpack, and other variables. They will continue to assess and expand the system as the National Water Model begins operational forecasts.

NCAR scientists maintain and update the open-source code of WRF-Hydro, which is available to the academic community and others. WRF-Hydro is widely used by researchers, both to better understand water resources and floods in the United States and in other countries such as Norway, Germany, Romania, Turkey, and Israel, and to project the possible impacts of climate change.

"At any point in time, forecasts from the new National Water Model have the potential to impact 300 million people," Gochis said.
"What NOAA and its collaborator community are doing is trying to usher in a new era of bringing better physics and better data into forecast models for improving situational awareness and hydrologic decision making."

Collaborators
Baron Advanced Meteorological Services
Consortium of Universities for the Advancement of Hydrologic Science
Israel Hydrologic Service
National Center for Atmospheric Research
National Oceanic and Atmospheric Administration
U.S. Geological Survey

Funders
National Science Foundation
National Aeronautics and Space Administration
National Oceanic and Atmospheric Administration

NCAR weather ensemble offers glimpse at forecasting's future

July 1, 2016 | Last spring, scientists at the National Center for Atmospheric Research (NCAR) flipped the switch on a first-of-its-kind weather forecasting system. For more than a year, NCAR's high-resolution, real-time ensemble forecasting system has been ingesting 50,000 to 70,000 observations every six hours and creating a whopping 90,000 weather maps each day.

The system has become a favorite among professional forecasters and casual weather wonks: Typically more than 200 people check out the site each day, with more than a thousand coming during major weather events. During this experimental period, the NCAR ensemble has also become a popular source of guidance within the National Weather Service, where it has already been referenced several hundred times by forecasters at more than 50 different offices.

But perhaps more important, the data accumulated from running the system daily — and there is lots of it — is being used by researchers at universities across the country to study a range of topics, from predicting hail size to anticipating power outages for utilities.

"We wanted to demonstrate that a real-time system of this scale was feasible," said NCAR scientist Craig Schwartz. "But it's also a research project that can help the community learn more about the predictability of different kinds of weather events."

Schwartz is a member of the team that designed and operates the system, along with NCAR colleagues Glen Romine, Ryan Sobash, and Kate Fossell.

This animation shows the forecast for accumulated snowfall made by each of the NCAR ensemble's 10 members for the 48-hour period beginning on Jan. 22, 2016. In the run-up to the blizzard, which ultimately dropped more than 30 inches of snow on parts of the Mid-Atlantic, more than 1,000 people visited the NCAR ensemble's website. (©UCAR.
This animation is freely available for media & nonprofit use.)

Testing a unique tool

NCAR's high-resolution ensemble forecasting system is unique in the country for a couple of reasons, both of which are revealed in its name: It's an ensemble, and it's high resolution.

Instead of producing a single forecast, the system produces an "ensemble" of 10 forecasts, each with slightly different (but equally likely) starting conditions. The degree to which the forecasts look the same or different tells scientists something about the probability that a weather event, like rain, hail, or wind, will actually occur. By comparing the actual outcomes to the forecasted probabilities, scientists can study the predictability of particular weather events under different circumstances.

The forecasting system's high resolution (the grid points are just 3 kilometers apart) allows it to simulate small-scale weather phenomena, like the creation of individual storms from convection — the process of moist, warm air rising and then condensing into clouds.

The combination of fine grid spacing and ensemble predictions in the NCAR system offers a sneak peek at what the future of weather forecasting might look like, and weather researchers across the country have noticed. Cliff Mass, a professor of atmospheric sciences at the University of Washington whose specialty is forecasting, said: "It's extremely important for the United States to have a convection-allowing ensemble system to push our forecasting capabilities forward. We were delighted that NCAR demonstrated that this could be done."

'The cat's meow'

The treasure trove of accruing weather data generated by running the NCAR ensemble is already being used by researchers both at NCAR and in the broader community.
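The ensemble-to-probability idea described above — the share of members forecasting an event serves as the event's probability — can be sketched in a few lines. This is a toy illustration of the general technique, not NCAR's actual post-processing code, and the 10-member forecast array below is invented.

```python
import numpy as np

def event_probability(member_forecasts, threshold):
    """Fraction of ensemble members predicting the event.

    member_forecasts: array of shape (n_members, n_grid_points),
    e.g. forecast precipitation amounts at each grid point.
    threshold: event definition, e.g. precipitation >= 10 mm.
    Returns the per-grid-point probability (0 to 1).
    """
    forecasts = np.asarray(member_forecasts)
    return (forecasts >= threshold).mean(axis=0)

# Ten hypothetical members forecasting rainfall (mm) at three grid points
members = np.array([
    [12.0, 0.5, 3.0],
    [11.0, 0.0, 2.0],
    [ 9.0, 1.0, 4.0],
    [15.0, 0.0, 1.0],
    [10.5, 2.0, 0.0],
    [13.0, 0.0, 5.0],
    [ 8.0, 0.5, 2.5],
    [14.0, 0.0, 3.5],
    [10.0, 1.5, 1.0],
    [12.5, 0.0, 2.0],
])
probs = event_probability(members, threshold=10.0)
print(probs)  # heavy rain likely only at the first grid point
```

When the members agree (probabilities near 0 or 1), the atmosphere is in a predictable state; wide disagreement flags exactly the low-predictability situations the team wants to study.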
Jim Steenburgh, for instance, is a researcher at the University of Utah who is using the system to understand the predictability of mountain snowstorms.

"NCAR's ensemble not only permits the 'formation' of clouds, it can also capture the topography of the western United States," he said. "The mountains control the weather to some degree, so you need to be able to resolve the mountains' effects on precipitation."

Steenburgh has also been using the ensemble with his students. "We're teaching the next generation of weather forecasters," he said. "In the future, these high-resolution ensemble forecasts will be the tools they need to use, and this gives them early, hands-on experience."

Like Steenburgh, Lance Bosart, an atmospheric researcher at the University at Albany, State University of New York, has used the ensemble both in his own research — studying the variability of convective events — and with his students. He said having 10 members in the ensemble forecast helps students easily see the great spread of possibilities, and the visual emphasis of the user interface makes it easy for students to absorb the information.

"What makes it an invaluable tool is the graphical display," he said. "It's visually compelling. You don't have to take a lot of time to explain what you're looking at; you can get right into explaining the science. I like to say it's the cat's meow."

Setting an example

The NCAR ensemble is also enabling the researchers running it to further their own research. "We're collecting statistics on the misfit between the model predictions and observations, and then we're trying to use that to improve our model physics," Romine said.

The ensemble project is also teaching the team about the strengths and weaknesses of the way they've chosen to kick off, or "initialize," each of the ensemble members. "The NCAR ensemble happens to produce a pretty good forecast, but we realize there are some shortcomings," Schwartz said.
"For example, if we were trying to make the best forecast in the world, we would probably not be initializing the model the way we are. But then we wouldn't learn as much from a research perspective."

The NCAR ensemble began as a yearlong trial, but the project is continuing to run for now. The team would like to keep the system online until next summer, but they don't yet have the computing resources they need to run it past September.

If the system does continue to run, the researchers who are using it say there's still more that they and their students can learn from the project. And if not, there's loads of data already collected that is still waiting to be mined. In any case, Mass says NCAR's ensemble has been a valuable project. "It set a really good example for the nation," he said.

Community members interested in collaborating or helping support the NCAR ensemble project are encouraged to contact the team at ensemble@ucar.edu.

Writer/contact: Laura Snider, Senior Science Writer and Public Information Officer

Sizing up cyclones

UPDATE: 2016 SEASONAL FORECAST

The NCAR-based Engineering for Climate Extremes Partnership (ECEP) has issued its first seasonal forecast using the Cyclone Damage Potential index. The forecast is for a hurricane season with higher-than-average potential to cause damage. This year's forecasted seasonal CDP is 5.7, compared to an average seasonal CDP of 3.7 for the years 1981–2010. For more details, visit the ECEP website.

May 18, 2016 | In early July 2005, Hurricane Dennis, a Category 3 storm on the Saffir-Simpson Hurricane Wind Scale, was bearing down on the Gulf Coast. Anyone paying attention would have been forgiven for having a foreboding sense of déjà vu. Just 10 months earlier, another Category 3 storm, Hurricane Ivan, had followed a strikingly similar track, making landfall just west of Gulf Shores, Alabama. Ivan ravaged the region, ultimately causing an estimated $18.8 billion in damages. But Dennis, despite roaring ashore in practically the same neighborhood, caused only $2.5 billion in damages — less than one-seventh that caused by Ivan.

The fact that two Category 3 hurricanes making similar landfall less than one year apart had such different impacts illustrates a weakness in the Saffir-Simpson scale, the system most commonly used by weather forecasters to categorize hurricane risk.

Scientists at the National Center for Atmospheric Research (NCAR) — in collaboration with global insurance broker Willis — have developed a new index that they suspect can do a better job of quantifying a hurricane's ability to cause destruction.
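For reference, the Saffir-Simpson categories mentioned above map directly from sustained wind speed. A minimal helper encoding the National Hurricane Center's published thresholds (the Category 1 and 5 bounds appear later in this article; the intermediate bounds — Category 2: 96–110 mph, Category 3: 111–129 mph, Category 4: 130–156 mph — are from the NHC scale) might look like:

```python
def saffir_simpson_category(sustained_wind_mph):
    """Map 1-minute sustained wind (mph) to a Saffir-Simpson category.

    Returns 0 for winds below hurricane strength (< 74 mph).
    Thresholds follow the National Hurricane Center's scale.
    """
    if sustained_wind_mph < 74:
        return 0   # tropical storm or depression, not a hurricane
    if sustained_wind_mph <= 95:
        return 1   # "some damage"
    if sustained_wind_mph <= 110:
        return 2
    if sustained_wind_mph <= 129:
        return 3   # e.g. Ivan and Dennis at landfall
    if sustained_wind_mph <= 156:
        return 4
    return 5       # "catastrophic damage"

print(saffir_simpson_category(120))  # prints 3
```

Note what the function cannot distinguish: two storms with identical sustained winds land in the same category regardless of size or forward speed, which is exactly the Ivan-versus-Dennis problem the CDP was designed to address.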
The Cyclone Damage Potential index (CDP) rates storms on a scale of 1 to 10, with 10 being the greatest potential for destruction. A prototype app that will allow risk managers to easily use the CDP to identify local damage potential is already available and will be among the first tools included in the beta version of the NCAR-based Global Risk, Resilience, and Impacts Toolbox when it is released later this year.

Infrared satellite imagery of Hurricane Ivan (left) and Hurricane Dennis (right). Both storms were rated Category 3, both made landfall in almost the same area, and yet they caused vastly different amounts of damage. Click to enlarge. (Images courtesy NOAA.)

Moving beyond wind speed

On the frequently used Saffir-Simpson Hurricane Wind Scale, hurricanes are placed in one of five categories based on their sustained wind speeds. On the low end, Category 1 storms have sustained winds between 74 and 95 mph and are expected to cause "some damage." On the high end, Category 5 storms have sustained winds of 157 mph or higher and are expected to cause "catastrophic damage." Because the Saffir-Simpson scale relies solely on sustained wind speeds, it does not take into account all the characteristics of a storm linked to its destructive power.

"Hurricane wind damage is driven by more than simply wind speed," said James Done, one of three NCAR scientists who worked on the CDP. "The hurricane's size and forward speed also are important. A large, slowly moving hurricane that repeatedly batters an area with high winds can cause greater total damage than a smaller, faster hurricane that blows quickly through a region."

Damage caused to a marina in New Orleans by Hurricane Katrina. Katrina would have received a CDP rating of 6.6, compared to a 5.0 for Hurricane Ivan and a 2.4 for Hurricane Dennis. (Image courtesy NOAA.
Click here for high resolution.)

For example, the critical difference between Ivan and Dennis turned out to be hurricane size, according to a study of the storms by Done and Jeffrey Czajkowski at the University of Pennsylvania's Wharton Risk Management and Decision Processes Center.

To create the CDP, the scientists incorporated hurricane size and forward speed into their index, along with sustained winds. To determine the relative importance of each, the team used hurricane damage statistics from the hundreds of offshore oil and gas facilities that pepper the northern Gulf of Mexico. Because the facilities are spread more or less evenly across the region, their exposure to hurricanes is approximately the same. Damage differences from storm to storm can therefore be attributed to differences in the storms themselves. The CDP does not predict actual damage — which could vary markedly, depending on where (or if) a hurricane makes landfall — but instead predicts the storm's potential.

When applied to past hurricanes, the index was able to discern a difference between Ivan, which would have scored 5.0 on the CDP prior to landfall, and the much smaller Dennis, which would have earned a 2.4. The CDP rating for Hurricane Katrina, which ultimately caused more than $80 billion in damages in 2005, would have been 6.6.

"The value of the index is in comparing current storms with storms from the past," Done said. "For example, if a hurricane is approaching New Orleans, you can compare its CDP with Hurricane Katrina's CDP and get a fuller picture of how much damage the storm is likely to cause."

The CDP project was led by NCAR scientist Greg Holland, along with NCAR colleagues Done, Ming Ge, and Willis collaborator Rowan Douglas.

From today's storm to tomorrow's climate

In its original form, the CDP can be easily applied in real time to existing hurricanes.
But Done also wanted to find a way to examine how hurricane damage might change in the future, especially as the climate warms. The question of how climate change may influence hurricanes has been difficult to answer, in part because global climate models are typically not able to "see" the small-scale details of individual storms. Though some scientists have run climate models at a resolution fine enough to study hurricane formation, the effort requires so much computing power that it hasn't been practical to replicate on a large scale.

To skirt this problem, hurricane researchers have looked for links between hurricane activity and phenomena that climate models can see — for example, the sea surface temperatures of ocean basins.

"People have used large-scale variables to infer tropical cyclone activity for decades," Done said. "I've done a similar thing, but instead of predicting how many hurricanes will form, I'm predicting hurricane damage potential."

To make this "climate" version of the CDP, Done — together with NCAR colleagues Debasish PaiMazumder and Erin Towler, and Indian Space Research Organization collaborator Chandra Kishtawal — looked for variables in the large-scale environment that could be correlated with the three variables used for the original CDP: sustained winds, size, and storm speed.

The team found that "steering flow," the large-scale winds that carry a hurricane along its track, is correlated with forward speed. They also found that relative sea surface temperature — the difference between temperatures in the Atlantic and Pacific ocean basins — is linked to seasonal average hurricane intensity and size. This is because relative sea surface temperatures affect wind speeds higher up in the atmosphere, which in turn affect hurricane formation.

The result is an index that can spit out a damage potential rating for a season, a year, or even longer, without needing to predict how many individual storms might form.
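The storm-level index described earlier combines sustained wind, size, and forward speed. The published CDP formula is not reproduced in this article, so the sketch below is a purely illustrative stand-in: the cubic wind term reflects that wind power scales roughly with the cube of wind speed, while the reference values (120 mph, 100 km, 10 mph) and weights are arbitrary choices for illustration, not the fitted coefficients NCAR derived from Gulf of Mexico damage data.

```python
def toy_damage_potential(max_wind_mph, size_km, forward_speed_mph):
    """Illustrative damage-potential index (NOT the published CDP).

    Grows with the cube of sustained wind, grows with storm size,
    and shrinks with forward speed (slow storms batter one area
    longer). Output is clipped to the 1-10 range used by the CDP.
    """
    wind_term = (max_wind_mph / 120.0) ** 3        # reference wind, assumed
    size_term = size_km / 100.0                    # reference size, assumed
    duration_term = 10.0 / max(forward_speed_mph, 1.0)
    raw = 2.0 * wind_term * (1.0 + size_term) * duration_term
    return min(max(raw, 1.0), 10.0)

# A large, slow storm outscores a small, fast one with identical winds,
# mirroring the Ivan (large) vs. Dennis (small) contrast in the article:
print(toy_damage_potential(120, size_km=200, forward_speed_mph=8))
print(toy_damage_potential(120, size_km=60, forward_speed_mph=18))
```

The design point is the one Done makes in the article: once size and translation speed enter the index, two storms of the same Saffir-Simpson category can receive very different ratings.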
Such forecasts are of interest to large reinsurance companies, like Willis Re and others.

"This technique enables us to translate our climate model simulations into information about extreme events that's critical for businesses and policy makers," Done said.

Writer/contact: Laura Snider, Senior Science Writer and Public Information Officer

Funders
Research Partnership to Secure Energy for America
Willis Re

Collaborators
Engineering for Climate Extremes Partnership
Willis Re

A 3D window into a tornado

This simulation was created by NCAR scientist George Bryan to visualize what goes on inside a tornado. The animation is the "high swirl" version in a series that goes from low to medium to high. Click to enlarge. (Courtesy George Bryan, NCAR. This image is freely available for media & nonprofit use.)

May 17, 2016 | What's really going on inside a tornado? How fast are the strongest winds, and what are the chances that any given location will experience them when a tornado passes by? Due to the difficulties of measuring wind speeds in tornadoes, scientists don't have answers to these questions. However, a collaborative project between researchers at the University of Miami and NCAR has been seeking clues with new, highly detailed computer simulations of tornado wind fields.

The simulations can be viewed in a series of animations, created by NCAR scientist George Bryan, that provide a 3D window into the evolving wind fields of idealized tornadoes at different rates of rotation. In the "high-swirl" animation shown here, which depicts a powerful tornado with 200-plus mph winds, the purple tubelike structures depict the movements of rapidly rotating vortices. Near-surface winds are represented by colors ranging from light blue (less than 20 meters per second, or 45 mph) to deep red (more than 100 meters per second, or 224 mph). The vortices and winds are contained within a condensation cloud that rises more than 500 meters (1,640 feet) above the surface.

Such visualizations can help atmospheric scientists better understand the structures of tornadoes, as well as the shifting location and strength of maximum wind speeds. Bryan also uses them in presentations to meteorology students. "When you make these 3D visualizations and then animate them, they give you a sense of how the flow evolves and how the turbulence changes," Bryan said.
"These are details you don't see by just looking at a photograph."

For example, he learned from the visualization that the rotating tubes tilt backward against the flow at higher altitudes. These are the kinds of details that can eventually help scientists better understand these complex storms. The information is also critical for public safety officials and engineers.

"If you're an engineer and designing a building, you want to know details like how much greater is peak wind over average wind in a tornado," Bryan said. "We'll get questions from engineers asking about the details of wind gusts in those purple tubes."

Bryan is collaborating on the simulations with Dave Nolan, chair of Miami's Department of Atmospheric Sciences.

To create the animation, Bryan used innovative NCAR software that enables researchers in the atmospheric and related sciences to analyze and interpret results from large computer models. VAPOR (Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers) is an interactive 3D visualization environment for both animations and still-frame images. The open-source software can be downloaded and used on personal computers. VAPOR was developed at NCAR in partnership with the University of California at Davis and Ohio State University. Funding comes from the National Science Foundation and the Korea Institute of Science and Technology Information.

Writer/contact: David Hosansky

Funder
National Science Foundation

Collaborator
University of Miami

Ocean temps predict U.S. heat waves 50 days out, study finds

BOULDER — The formation of a distinct pattern of sea surface temperatures in the middle of the North Pacific Ocean can predict an increased chance of summertime heat waves in the eastern half of the United States up to 50 days in advance, according to a new study led by a scientist at the National Center for Atmospheric Research (NCAR).  The pattern is a contrast of warmer-than-average water butting up against cooler-than-average seas. When it appears, the odds that extreme heat will strike during a particular week—or even on a particular day—can more than triple, depending on how well-formed the pattern is. The research is being published in the journal Nature Geoscience. "Summertime heat waves are among the deadliest weather events, and they can have big impacts on farming, energy use, and other critical aspects of society," said Karen McKinnon, a postdoctoral researcher at NCAR and the lead author of the study. "If we can give city planners and farmers a heads up that extreme heat is on the way, we might be able to avoid some of the worst consequences." The research was largely funded by the National Science Foundation, NCAR's sponsor. In addition to McKinnon, the research team includes Andrew Rhines, of the University of Washington; Martin Tingley, of Pennsylvania State University; and Peter Huybers, of Harvard University. A fingerprint on the ocean For the study, the scientists divided the country into regions that tend to experience extreme heat at the same time. The scientists then focused on the largest of the resulting blocks: a swath that stretches across much of the Midwest and up the East Coast, encompassing both important agricultural areas and heavily populated cities.  Top: Sea surface temperature anomalies in the mid-latitude Pacific 50 days in advance of June 29, 2012. 
The pattern inside the green box resembled the Pacific Extreme Pattern, indicating that there would be an increase in the odds of a heat wave in the eastern half of the United States at the end of June. (Image courtesy of Karen McKinnon, NCAR. This image is freely available for media & nonprofit use.) Bottom: June 29, 2012, was the hottest day of the year in the eastern United States. The hot temperatures in late June and early July were part of an extraordinarily hot summer that saw three heat waves strike the country. (Map courtesy of the National Weather Service's Weather Prediction Center.) The research team looked to see if there was a relationship between global sea surface temperature anomalies—waters that are warmer or cooler than average—and extreme heat in the eastern half of the U.S. Right away, a pattern popped out in the middle of the Pacific, above about 20 degrees North latitude. The scientists found that the particular configuration of ocean water temperatures, which they named the Pacific Extreme Pattern, not only appeared when it was already hot in the eastern U.S. but also tended to form in advance of that heat. "Whatever mechanism ultimately leads to the heat wave also leaves a fingerprint of sea surface temperature anomalies behind," McKinnon said. Improving seasonal forecasts To test how well that fingerprint could predict future heat, the scientists used data collected from 1,613 weather stations across the eastern U.S. between 1982 and 2015, as well as daily sea surface temperatures for the same time period. The scientists defined extreme heat in the eastern U.S. as a summertime day when the temperature readings from the warmest 5 percent of weather stations in the region were at least 6.5 degrees Celsius (11.7 degrees Fahrenheit) hotter than average. The scientists looked only at extreme heat during that region’s 60 hottest days of the year: June 24 through Aug. 22. 
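One plausible reading of that extreme-heat definition can be sketched in a few lines of Python. The helper name and the toy station anomalies below are invented for illustration; this is not the study's actual code or data.

```python
# Hedged sketch of the extreme-heat metric described above: a day counts
# as "extreme" when even the coolest station among the hottest 5 percent
# of stations is at least 6.5 C above its long-term average.

def is_extreme_day(anomalies_c, frac=0.05, threshold_c=6.5):
    """anomalies_c: one day's temperature anomaly (deg C) per station."""
    ranked = sorted(anomalies_c, reverse=True)
    top = ranked[: max(1, int(len(ranked) * frac))]  # warmest 5% of stations
    return min(top) >= threshold_c

# 100 toy stations: 94 near normal, 6 far above average
print(is_extreme_day([1.0] * 94 + [8.0] * 6))   # prints True
print(is_extreme_day([1.0] * 100))              # prints False
```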
The researchers "hindcasted" each year in the dataset to see if they could retrospectively predict extreme heat events—or lack of those events—during that year's summer, using only data collected during the other years as a guide. At 50 days out, the scientists were able to predict an increase in the odds—from about 1 in 6 to about 1 in 4—that a heat wave would strike somewhere in the eastern U.S. during a given week. At 30 days out or closer, the scientists were able to predict an increase in the odds—to better than 1 in 2 for a particularly well-formed pattern—that a heat wave would strike on a particular day. This new technique could improve existing seasonal forecasts, which do not focus on predicting daily extremes. Seasonal forecasts typically predict whether an entire summer is expected to be warmer than normal, normal, or cooler than normal. For example, the seasonal forecast issued for the summer of 2012 predicted normal heat for the Northeast and Midwest. But, the summer ended up being especially hot, thanks to three major heat waves that struck in late June, mid-July, and late July. When the science team used the Pacific Extreme Pattern to hindcast 2012, they were able to determine as early as mid-May that there were increased odds of extremely hot days occurring in late June. The hottest day of the summer of 2012, as measured by the technique used for this study, was June 29, when the warmest 5 percent of weather stations recorded temperatures that were 10.4 degrees Celsius (18.7 degrees Fahrenheit) above average. "We found that we could go back as far as seven weeks and still predict an increase in the odds of future heat waves," McKinnon said. “What’s exciting about this is the potential for long-range predictions of individual heat waves that gives society far more notice than current forecasts.” Looking ahead Scientists do not yet know why the fingerprint on sea surface temperatures in the Pacific predicts heat in the eastern U.S. 
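The leave-one-year-out hindcast logic described above can be illustrated with synthetic data. The pattern-strength variable, the threshold, and the resulting odds below are all invented for the sketch; they are not the study's values or method details.

```python
# Toy leave-one-year-out hindcast in the spirit of the study: for each
# target year, estimate the odds of a heat wave given a well-formed
# pattern, training only on the remaining years. All data are synthetic.
import random

random.seed(42)

# (pattern strength 50 days ahead, did a heat wave follow?) per year,
# with the heat-wave chance rising as the pattern strengthens.
records = {}
for y in range(1982, 2016):
    strength = random.random()
    records[y] = (strength, random.random() < 0.1 + 0.5 * strength)

def hindcast_odds(target_year, threshold=0.5):
    """P(heat wave | strong pattern), trained on every year but the target."""
    train = [rec for y, rec in records.items() if y != target_year]
    strong = [hw for s, hw in train if s >= threshold]
    return sum(strong) / len(strong)

base_rate = sum(hw for _, hw in records.values()) / len(records)
print(f"climatological weekly rate: {base_rate:.2f}")
print(f"given a strong pattern (hindcast for 2012): {hindcast_odds(2012):.2f}")
```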
It could be that the sea surface temperatures themselves kick off weather patterns that cause the heat. Or it could be that they are both different results of the same phenomenon, but one does not cause the other. To learn more about how the two are connected, McKinnon is working with colleagues at NCAR to use sophisticated computer models to try to tease apart what is really happening. The study's findings also point toward the possibility that the Pacific Extreme Pattern, or a different oceanic fingerprint, could be used to forecast other weather events far in advance, including cooler-than-average days and extreme rainfall events. “The results suggest that the state of the mid-latitude ocean may be a previously overlooked source of predictability for summer weather,” McKinnon said.
About the article
Title: Long-lead predictions of eastern United States hot days from Pacific sea surface temperatures
Authors: Karen McKinnon, Andrew Rhines, Martin Tingley, and Peter Huybers
Journal: Nature Geoscience
Writer: Laura Snider, senior science writer and public information officer

Potential Zika virus risk estimated for 50 U.S. cities

BOULDER – Key factors that can combine to produce a Zika virus outbreak are expected to be present in a number of U.S. cities during peak summer months, new research shows. The Aedes aegypti mosquito, which is spreading the virus in much of Latin America and the Caribbean, will likely be increasingly abundant across much of the southern and eastern United States as the weather warms, according to a new study led by mosquito and disease experts at the National Center for Atmospheric Research (NCAR). Summertime weather conditions are favorable for populations of the mosquito along the East Coast as far north as New York City and across the southern tier of the country as far west as Phoenix and Los Angeles, according to computer simulations conceived and run by researchers at NCAR and the NASA Marshall Space Flight Center. Spring and fall conditions can support low to moderate populations of the Aedes aegypti mosquito in more southern regions of its U.S. range. Wintertime weather is too cold for the species outside southern Florida and southern Texas, the study found. By analyzing travel patterns from countries and territories with Zika outbreaks, the research team further concluded that cities in southern Florida and impoverished areas in southern Texas may be particularly vulnerable to local virus transmission. Many U.S. cities face potential risk in summer of low, moderate, or high populations of the mosquito species that transmits Zika virus (colored circles). The mosquito has been observed in parts of the United States (shaded portion of map) and can establish populations in additional cities because of favorable summertime meteorological conditions. In addition, Zika risk may be elevated in cities with more air travelers arriving from Latin America and the Caribbean (larger circles). For a high-resolution map, click here or on the image. (Image based on data mapped by Olga Wilhelmi, NCAR GIS program. This image is freely available for media & nonprofit use.) 
"This research can help us anticipate the timing and location of possible Zika virus outbreaks in certain U.S. cities,” said NCAR scientist Andrew Monaghan, the lead author of the study. “While there is much we still don’t know about the dynamics of Zika virus transmission, understanding where the Aedes aegypti mosquito can survive in the U.S. and how its abundance fluctuates seasonally may help guide mosquito control efforts and public health preparedness.” “Even if the virus is transmitted here in the continental U.S., a quick response can reduce its impact,” added NCAR scientist Mary Hayden, a medical anthropologist and co-author of the study. Although the study does not include a specific prediction for this year, the authors note that long-range forecasts for this summer point to a 40–45 percent chance of warmer-than-average temperatures over most of the continental United States. Monaghan said this could lead to increased suitability for Aedes aegypti in much of the South and East, although above-normal temperatures would be less favorable for the species in the hottest regions of Texas, Arizona, and California. Monaghan stressed that, even if Zika establishes a toehold in the mainland United States, it is unlikely to spread as widely as in Latin America and the Caribbean. This is partly because a higher percentage of Americans live and work in air-conditioned and largely sealed homes and offices. The study is being published today in the peer-reviewed journal PLOS Currents Outbreaks. It was funded by the National Institutes of Health, NASA, and the National Science Foundation, which is NCAR’s sponsor. It was co-authored by scientists at NASA, North Carolina State University, Maricopa County Environmental Services Vector Control Division, University of Arizona, and Durham University. Spreading rapidly First identified in Uganda in 1947, the Zika virus has moved through tropical regions of the world over the past decade. 
It was introduced into Brazil last year and spread explosively across Latin America and the Caribbean, with more than 20 countries now facing outbreaks. About 80 percent of infected people do not have significant symptoms, and most of the rest suffer relatively mild flu- or cold-like symptoms that generally clear up in about a week. However, scientists are investigating correlations between contracting the disease during pregnancy and microcephaly, a rare birth defect characterized by an abnormally small head and brain damage. To determine the potential risk in the mainland United States, the research team ran two computer models that simulated the effect of meteorological conditions on a mosquito’s entire lifecycle (egg, larval, pupal, and adult stages) in 50 cities in or near the known range of the species. Monaghan and several team members have studied Aedes aegypti for years because it also carries the viruses that cause dengue and chikungunya. Generally, the mosquitoes need warm and relatively stable temperatures, as well as water-filled containers such as buckets, barrels, or tires, for their eggs to hatch. Once a mosquito bites an infected person, it also needs to live long enough – probably a week or more, depending on ambient temperatures – for the virus to travel from the mosquito's mid-gut to its salivary glands. Once in the saliva, the virus can then be transmitted by the mosquito biting another person. The study results show that, as springtime weather warms, the potential abundance of the mosquito begins to increase in April in the Southeast and some Arizona cities. By June, nearly all of the 50 cities studied have the potential for at least low-to-moderate abundance, and most eastern cities are suitable for moderate-to-high abundance. Conditions become most suitable for mosquito populations in July, August, and September, although the peak times vary by city. Weather conditions in southern and western cities remain suitable as late as November. 
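As a rough illustration of how temperature gates this lifecycle, here is a minimal suitability classifier. The cutoff temperatures below are assumptions chosen for the sketch; the study's two models simulate the egg, larval, pupal, and adult stages explicitly rather than using simple thresholds.

```python
# Assumed-threshold sketch of temperature-limited Aedes aegypti
# abundance potential. Illustrative only; not the study's calibrated
# lifecycle models.

def abundance_potential(mean_temp_c):
    """Classify a monthly-mean temperature (deg C) into abundance potential."""
    if mean_temp_c < 10:
        return "none"        # too cold for eggs/larvae to develop
    if mean_temp_c < 18:
        return "low"
    if mean_temp_c < 24:
        return "moderate"
    if mean_temp_c < 35:
        return "high"
    return "moderate"        # extreme heat again suppresses survival

# Toy monthly means for a hypothetical southeastern city
for month, t in [("Jan", 8), ("Apr", 17), ("Jul", 28), ("Oct", 20)]:
    print(month, abundance_potential(t))
```

A shape like this, with abundance rising through spring, peaking in midsummer, and falling off in late autumn, mirrors the seasonal cycle the simulations found for most of the 50 cities.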
Even some cities where the Aedes aegypti mosquito has not been detected, such as St. Louis and Denver, have suitable midsummer weather conditions for the species if it were introduced via transport of used tires or by other human activities, according to the computer models. The researchers stressed that additional factors outside the scope of the study could affect populations of the species, such as mosquito control efforts, competition with other mosquito species, and the extent to which eggs can survive in borderline temperatures. The study noted that northern cities could become vulnerable if a related species of mosquito that is more tolerant of cold temperatures, Aedes albopictus, begins to carry the virus. This animation shows the varying extent to which meteorological conditions can favor populations of the Aedes aegypti mosquito, which transmits the Zika virus, in 50 U.S. cities throughout the year. Red dots represent high-abundance conditions, orange represents medium-to-high, yellow represents low-to-medium, and gray represents no significant mosquito population. (Animation based on data from Andrew Monaghan, NCAR. This image is freely available for media & nonprofit use.) Factoring in travel, poverty In addition to looking at meteorological conditions, the researchers studied two other key variables that could influence the potential for Zika outbreaks: travel from Zika-affected areas and socioeconomic conditions in states that may face abundant mosquito populations. To analyze air travel, the team estimated the number of passengers arriving into U.S. cities on direct flights from airports in 22 Latin American countries and territories listed on the Centers for Disease Control and Prevention’s Zika travel advisory as of January 29. Cities that had both high potential numbers of Aedes aegypti and a large volume of air travelers included Miami, Houston, and Orlando. 
Since the scientists were able to obtain passenger numbers for direct flights only, they could not estimate the number of passengers continuing on to smaller cities. They noted that the summertime peak in air travel coincides with the peak season in mosquito abundance. The study also estimated that nearly five times as many people cross the U.S.-Mexico border per month as arrive by air in all 50 cities. This could indicate a high potential for transmission in border areas from Texas to California, although the Zika virus has not been widely reported in northern Mexico. Those border areas, as well as other parts of the South where the mosquitoes are expected to be abundant, have a high percentage of households living below the poverty line, according to 2014 U.S. Census data analyzed by the research team. Lower-income residents can be more exposed to mosquito bites if they live in houses without air conditioning or have torn or missing screens that enable mosquitoes to enter their homes more easily. However, Aedes aegypti populations tend to thrive in densely populated urban areas, while some of the most impoverished areas are rural. “The results of this study are a step toward providing information to the broader scientific and public health communities on the highest risk areas for Zika emergence in the United States,” said Kacey Ernst, an epidemiologist at the University of Arizona and co-author of the study. “We hope that others will build on this work as more information becomes available. 
All areas with an environment suitable to the establishment of Aedes aegypti should be working to enhance surveillance strategies to monitor the Aedes aegypti populations and human populations for disease emergence.” “This research highlights the complex set of human and environmental factors that determine whether a mosquito-borne disease is carried from one area to another, and how severely it affects different human populations,” said Sarah Ruth, program director in the National Science Foundation’s Division of Atmospheric and Geospace Sciences. “By integrating information on weather, travel patterns, mosquito biology, and human behavior, the project team has improved our ability to forecast, deal with, and possibly even prevent future outbreaks of Zika and other serious diseases.”
About the article
Title: On the seasonal occurrence and abundance of the Zika virus vector mosquito Aedes aegypti in the contiguous United States
Authors: Andrew Monaghan, Cory Morin, Daniel Steinhoff, Olga Wilhelmi, Mary Hayden, Dale Quattrochi, Michael Reiskind, Alun Lloyd, Kirk Smith, Christopher Schmidt, Paige Scalf, and Kacey Ernst
Journal: PLOS Currents Outbreaks

COSMIC turns 10: Microsatellites reveal atmospheric properties in 3D

A constellation of six small satellites — rocketed into space a decade ago — has made outsized contributions to our ability to forecast severe weather events, track climate change, and understand space weather. April 15 at 01:40 UTC marks the 10-year anniversary of the launch of the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC). The project is a partnership between Taiwan, where it is called FORMOSAT-3, and the United States. Over its lifetime, COSMIC has proven to be an extremely cost-effective way to gather an abundance of atmospheric data, including three-dimensional profiles of temperature, humidity, and pressure, as well as electron density in the ionosphere. The broadcast by Taiwan's National Space Organization showing the launch of COSMIC in April 2006.  "The results have been outstanding," said Rick Anthes, one of COSMIC's founding scientists and president emeritus of the University Corporation for Atmospheric Research (UCAR), which manages COSMIC in the U.S. "COSMIC has significantly improved weather prediction and expanded our knowledge of the atmosphere around the world. It has exceeded all expectations." In fact, COSMIC has been such a success that the two countries are collaborating on a follow-on mission, known as COSMIC-2 in the U.S. and FORMOSAT-7 in Taiwan. "I am pleased to say that FORMOSAT-3/COSMIC has much exceeded the original planned 2-year mission goal and continues to operate into the 10-year anniversary and beyond," said Guey-Shin Chang, director general of Taiwan's National Space Organization. "FORMOSAT-7/COSMIC-2 is moving along very well and is on target to launch the first six satellites within a year to replace the decaying COSMIC satellites." A second set of six satellites is scheduled to launch in 2019. A fruitful partnership The roots of COSMIC stretch back half a century and across tens of millions of miles to NASA's Mariner 4 mission to Mars. 
The mission, launched in 1965, was the first to use a technique known as radio occultation to probe a planet's atmosphere. Radio occultation measures the degree to which a radio signal bends and slows as it passes through the atmosphere — the greater the bend and delay, the denser the atmosphere. Receivers on the COSMIC satellites measure the bend and delay of radio signals emitted by GPS satellites to infer information about the atmosphere. Image courtesy of the COSMIC Program. Click to enlarge. NASA went on to use the technique to explore the atmospheres of several other planets, but radio occultation was not used to gather information about our own atmosphere until three decades later. A UCAR radio occultation experiment named GPS/MET changed that. Launched on a single satellite in 1995, GPS/MET demonstrated for the first time that the radio signals already being sent out by existing GPS satellites could be harnessed to record Earth's atmospheric density, a measurement that can be further broken down into information about temperature, humidity, and pressure.  The success of that proof-of-concept experiment fueled interest in launching a constellation of small, relatively inexpensive satellites that could provide similar measurements, but many more of them and over the entire globe. To turn the idea into a reality, UCAR partnered with Taiwan's National Space Organization.  "They really set an example for how a satellite project should be executed," said Ying-Hwa "Bill" Kuo, the longtime director of UCAR's COSMIC Program. "The budget was really tight, but they were very disciplined and were able to complete the project on time and without cost overruns." Each finished COSMIC satellite looks like an oversized pancake with two flipped up solar panels and weighs about 140 pounds. The budget for COSMIC, including the first two years of operation, was $100 million. Taiwan covered 80 percent of the cost. 
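The quantity that the measured bending and delay ultimately constrain is atmospheric refractivity, which can then be decomposed into temperature, pressure, and moisture. As a sketch of that physics (not COSMIC's actual retrieval chain), the widely used Smith-Weintraub relation links refractivity to those variables:

```python
# Refractivity N from the Smith-Weintraub relation, the standard bridge
# between radio occultation retrievals and atmospheric state. Example
# values are illustrative.

def refractivity(p_hpa, t_k, e_hpa):
    """N = 77.6*(P/T) + 3.73e5*(e/T^2); P, e in hPa, T in kelvin."""
    return 77.6 * p_hpa / t_k + 3.73e5 * e_hpa / t_k**2

# Moist near-surface air vs. dry upper troposphere
n_surface = refractivity(1013.0, 288.0, 10.0)
n_upper = refractivity(300.0, 230.0, 0.05)
print(f"near surface: N ~ {n_surface:.0f}; upper troposphere: N ~ {n_upper:.0f}")
```

The large water vapor term in the lower troposphere is one reason occultation profiles are so informative about moisture there.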
In the United States, the leading sponsor of the project is the National Science Foundation (NSF). Other U.S. partners include the National Oceanic and Atmospheric Administration, the Air Force, the Office of Naval Research, and NASA, which developed the radio occultation receivers used on the satellites. "Unlike most proposed projects where less is achieved than is promised, COSMIC produced much more than proposed and even expected by its proponents," said Jay Fein, who managed COSMIC for NSF when the project was coming together. "Many people and institutions helped to make COSMIC an extraordinary success." A COSMIC microsatellite during testing. Just over 6 inches deep, the satellite contains a GPS occultation receiver, a Tiny Ionospheric Photometer, and a tri-band beacon to relay data to ground stations. Photo still from video. Photo courtesy of Taiwan's National Space Organization. A wealth of data After a successful launch, the data came flooding back from COSMIC. The constellation took more than 2,500 soundings — vertical profiles of the atmosphere — per day globally. Instantly, these high-quality data were provided free of charge in real time to atmospheric researchers and forecasters around the world, who wasted little time taking advantage of them. “We have been integrating COSMIC-1 data into our numerical weather prediction models since 2007,” said Louis W. Uccellini, director of NOAA’s National Weather Service. “Its unique characteristics of high accuracy and vertical resolution complement other satellite observations used in the models. With the recent upgrade of our supercomputers, we're now able to process more data than ever, including the increased observations expected from COSMIC-2 next year." 
At its peak, COSMIC was among the top five observing systems in the world when measured by ability to boost the accuracy of numerical weather predictions, according to studies by the European Centre for Medium-Range Weather Forecasts and other international weather prediction centers. "COSMIC has conclusively demonstrated the scientific and practical value of radio occultation and established the technique as an essential part of the future global observing system," Anthes said. Accurate and precise Not only has COSMIC provided a huge volume of atmospheric observations, but those observations have also proven to be highly accurate — so much so that COSMIC data have been used over the past decade to uncover and correct instrument bias in other types of satellite observations. These biases arise as satellite sensors that detect microwave and infrared radiation degrade over time due to relentless exposure to the Sun's rays.  Scientists also have found that data collected by COSMIC are especially useful for forecasting tropical cyclones, including typhoons and hurricanes. COSMIC is able to provide critical observations of water vapor, the fuel that drives tropical cyclones, in high vertical resolution, which means scientists can determine how much water is present at what height in the atmosphere. Other observing systems over the oceans — as well as weather prediction models — often underestimate reservoirs of moisture stored in the lower troposphere, the layer of the atmosphere closest to Earth's surface where tropical cyclones form. COSMIC, on the other hand, provides valuable information to weather prediction models on how much water is available to fuel a forming cyclone.  The upcoming launch of the first phase of COSMIC-2 will increase radio occultation measurements over the tropics, where typhoons and hurricanes are born. "COSMIC-2 has the potential to revolutionize tropical cyclone forecasts," said Kuo, who is now the director of UCAR Community Programs. 
"Not only will COSMIC-2 provide more data from the tropics, but the data will be higher quality as well, thanks to an improved antenna on the new satellites." More than 140 scientists, engineers, and other experts in radio occultation and its use from 20 countries are gathering in Taiwan for an international conference in Taipei from March 9-11 to celebrate the 10th anniversary of the launch. 

NCAR weather modeling system aids Antarctic rescue effort

Forecast products generated by AMPS for an Antarctic rescue mission. From top: flight-level winds, surface temperature, and cloud base heights. Click to enlarge. Images courtesy of Jordan Powers, NCAR. As the U.S. Antarctic Program began mobilizing a specially equipped ski plane this week to rescue a stranded group of researchers, they asked for some key help from the National Center for Atmospheric Research. They needed a good forecast. "Antarctica has some really bad weather. Every day is different, but often not in a good way," said NCAR scientist Jordan Powers. "We provide model weather guidance to the U.S. Antarctic Program on an ongoing basis. And when these emergencies come up, they often ask us for specialized forecast products, too." This time, the emergency is related to a group of more than 30 people who are stranded at an Australian research station. The group was originally scheduled to return home on the Aurora Australis icebreaker. But the ship broke free of its moorings during a blizzard on Tuesday night (U.S. time) and ran aground. The U.S. Antarctic Program plans to send a ski-equipped LC-130 aircraft to pick up the group and fly them to Australia's Casey research station, where they can catch another flight home.  To ensure the safety of the rescue flight, the U.S. Antarctic Program asked NCAR to provide information on expected winds at 18,000 and 24,000 feet—the altitude where the LC-130 is expected to fly.  The forecast model system created by NCAR and used in Antarctica is called the Antarctic Mesoscale Prediction System (AMPS). AMPS was zoomed in on the section of the continent where the plane would be flying and is providing the information meteorologists need to make a good regional forecast. AMPS has been used in Antarctica since September 2000. About every year and a half, NCAR has been called on to help with a rescue, Powers said. 
"It's relatively quick to provide the specialized forecast products now because we've been doing these for so long," Powers said.  AMPS is run on Erebus, a supercomputer funded by the National Science Foundation and maintained by NCAR's Computational and Information Systems Laboratory.
Writer/contact: Laura Snider, Senior Science Writer and Public Information Officer

Southwest dries as wet weather systems become more rare

BOULDER—The weather patterns that typically bring moisture to the southwestern United States are becoming more rare, an indication that the region is sliding into the drier climate state predicted by global models, according to a new study. "A normal year in the Southwest is now drier than it once was," said Andreas Prein, a postdoctoral researcher at the National Center for Atmospheric Research (NCAR) who led the study. "If you have a drought nowadays, it will be more severe because our base state is drier." Climate models generally agree that human-caused climate change will push the southwestern United States to become drier. And in recent years, the region has been stricken by drought. But linking model predictions to changes on the ground is challenging. In the new study—published online today in the journal Geophysical Research Letters, a publication of the American Geophysical Union—NCAR researchers grapple with the root cause of current drying in the Southwest to better understand how it might be connected to a warming climate. Subtle shift yields dramatic impact For the study, the researchers analyzed 35 years of data to identify common weather patterns—arrangements of high and low pressure systems that determine where it's likely to be sunny and clear or cloudy and wet, among other things. They identified a dozen patterns that are typical for the weather activity in the contiguous United States and then looked to see whether those patterns were becoming more or less frequent. Weather systems that typically bring moisture to the southwestern United States are forming less often, resulting in a drier climate across the region. This map depicts the portion of overall changes in precipitation across the United States that can be attributed to these changes in weather system frequency. The gray dots represent areas where the results are statistically significant. (Map courtesy of Andreas Prein, NCAR. This image is freely available for media & nonprofit use. 
Click image to enlarge.) "The weather types that are becoming more rare are the ones that bring a lot of rain to the southwestern United States," Prein said. "Because only a few weather patterns bring precipitation to the Southwest, those changes have a dramatic impact." The Southwest is especially vulnerable to any additional drying. The region, already the most arid in the country, is home to a quickly growing population that is putting tremendous stress on its limited water resources. “Prolonged drought has many adverse effects,” said Anjuli Bamzai, program director in the National Science Foundation’s Division of Atmospheric and Geospace Sciences, which funded the research, “so understanding regional precipitation trends is vital for the well-being of society. These researchers demonstrate that subtle shifts in large-scale weather patterns over the past three decades or so have been the dominant factor in precipitation trends in the southwestern United States.” The study also found an opposite, though smaller, effect in the Northeast, where some of the weather patterns that typically bring moisture to the region are increasing. “Understanding how changing weather pattern frequencies may impact total precipitation across the U.S. is particularly relevant to water resource managers as they contend with issues such as droughts and floods, and plan future infrastructure to store and disperse water,” said NCAR scientist Mari Tye, a co-author of the study. The climate connection The three patterns that tend to bring the most wet weather to the Southwest all involve low pressure centered in the North Pacific just off the coast of Washington, typically during the winter. Between 1979 and 2014, such low-pressure systems formed less and less often. The associated persistent high pressure in that area over recent years is a main driver of the devastating California drought. 
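Weather-typing analyses of this general kind are often done by clustering daily pressure fields into recurring patterns and then counting how often each pattern occurs. The miniature k-means below, run on two-number toy "fields," is an assumed illustration of that idea, not the study's actual method or data.

```python
# Toy weather typing: cluster daily (North Pacific, Southwest) pressure
# anomalies into two recurring patterns with a tiny k-means, then count
# how often each pattern occurs. Synthetic data; illustrative only.
import random

random.seed(0)

# 50 "wet type" days (Pacific low) and 50 "dry type" days (Pacific high)
days = ([(-8 + random.gauss(0, 2), 3 + random.gauss(0, 2)) for _ in range(50)]
        + [(6 + random.gauss(0, 2), -2 + random.gauss(0, 2)) for _ in range(50)])

def kmeans(points, init, iters=10):
    centers = list(init)
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for p in points:
            nearest = min(range(len(centers)),
                          key=lambda i: (p[0] - centers[i][0]) ** 2
                                        + (p[1] - centers[i][1]) ** 2)
            clusters[nearest].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # move each center to its cluster's mean
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers, clusters

centers, clusters = kmeans(days, init=[days[0], days[50]])
print("pattern frequencies:", [len(c) for c in clusters])
```

In the real study, a trend in these per-year frequencies, with the wet-type patterns becoming rarer, is what signals the drying.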
This shift toward higher pressure in the North Pacific is consistent with climate model runs, which predict that a belt of higher average pressure that now sits closer to the equator will move north. This high-pressure belt is created as air that rises over the equator moves poleward and then descends back toward the surface. The sinking air causes generally drier conditions over the region and inhibits the development of rain-producing systems. Many of the world's deserts, including the Sahara, are found in such regions of sinking air, which typically lie around 30 degrees latitude on either side of the equator. Climate models project that these zones will move further poleward. The result is a generally drier Southwest. While climate change is a plausible explanation for the change in frequency, the authors caution that the study does not prove a connection. To examine this potential connection further, they are studying climate model data for evidence of similar changes in future weather pattern frequencies. "As temperatures increase, the ground becomes drier and the transition into drought happens more rapidly,” said NCAR scientist Greg Holland, a co-author of the study. “In the Southwest the decreased frequency of rainfall events has further extended the period and intensity of these droughts.” The study was funded in part by the National Science Foundation, NCAR's sponsor, and the Research Partnership to Secure Energy for America. Other co-authors of the study include NCAR scientists Roy Rasmussen and Martyn Clark.
About the article
Title: Running dry: The U.S. Southwest’s drift into a drier climate state
Authors: Andreas F. Prein, Gregory J. Holland, Roy M. Rasmussen, Martyn P. Clark, and Mari R. Tye
Publication: Geophysical Research Letters
