Weather Research

Past tornado experiences shape perception of risk

The following is a news release from the Society for Risk Analysis about a study led by NCAR scientist Julie Demuth.

With much of the central plains and Midwest now entering peak tornado season, the impact of these potentially devastating weather events will be shaped in large part by how individuals think about and prepare for them. A new study published in Risk Analysis: An International Journal shows that people's past experiences with tornadoes inform how they approach this type of extreme weather in the future, including their perception of the risk.

Led by Julie Demuth, a scientist at the National Center for Atmospheric Research, the study, "Explicating experience: Development of a valid scale of past hazard experience for tornadoes," characterized and measured people's past tornado experiences to determine their impact on the perceived risks of future tornadoes. A better understanding of these factors can help mitigate future societal harm, for instance, by improving risk communication campaigns that encourage preparation for hazardous weather events.

The results indicate that people's risk perceptions are strongly influenced by a memorable past tornado experience that contributes to unwelcome thoughts, feelings, and disruption, which ultimately increase one's fear, dread, worry, and depression. Also, the more experiences people have with tornadoes, and the more personalized those experiences, the more likely they are to believe their homes (versus the larger geographic area of their city or town) will be damaged by a tornado within the next 10 years.

A tornado in Oklahoma. People's past experiences with tornadoes shape how they perceive risk when new storms threaten. (Image courtesy NOAA.)

In the context of this study, Demuth defines "past tornado experience" as "the perceptions one acquires about the conditions associated with or impacts of a prior tornado event.
Such perceptions are gained by the occurrence of a tornado threat and/or event; directly by oneself or indirectly through others; and at different points throughout the duration of the threat and event."

The study was conducted through two surveys distributed to a random sample of residents in tornado-prone areas of the U.S. during the spring and fall of 2014. The first survey evaluated an initial set of items measuring experiences, and the second was used to re-evaluate the experience items and to measure tornado risk perceptions. The sample sizes for the two surveys were 144 and 184, respectively.

Since tornado experiences can occur at any time throughout one's life, and in multiplicity, the survey items measured both one's most memorable tornado experience and his or her multiple experiences. A factor analysis of the survey items yielded four factors that make up the memorable experience dimensions:

- Risk awareness: information pertaining to the possibility of a specific tornado hazard occurring, as well as threat-related social cues from both people and the media.
- Risk personalization: one's protective behavioral and emotional responses, as well as visual, auditory, and tactile sensations experienced during the tornado.
- Personal intrusive impacts: ways that one is personally affected by an experience, including intangible, unpleasant thoughts and feelings from the experience.
- Vicarious troubling impacts: others' tangible impacts, verbal accounts of their experiences, and intangible intrusive impacts. The "others" are people known personally by the respondent. Although all the items in this factor reflect others' accounts of a tornado experience, the respondent experiences these aspects by hearing about or witnessing them.

The factor analysis revealed two factors contributing to the multiple experience dimensions: common threat and impact communication, and negative emotional responses.
The first factor captures one's personal experience with receiving common types of information (e.g., sirens) about tornado threats and tornado-related news. The second factor captures the amount of experience a respondent has with fearing for their own life and a loved one's life, and with worrying about their property, due to a tornado.

Individuals' past tornado experiences are multifaceted and nuanced, with each of the above six dimensions exerting a different influence on tornado risk perceptions. These dimensions have not been previously analyzed, particularly the intangible aspects: feelings, thoughts, and emotions.

"This research can help meteorologists who provide many essential, skillful risk messages in the form of forecasts, watches, and warnings when tornadoes (and other hazardous weather) threaten. This research can help meteorologists recognize the many ways that people's past tornado experiences shape what they think and do, in addition to the weather forecasts they receive," said Demuth.

The Society for Risk Analysis is a multidisciplinary, interdisciplinary, scholarly, international society that provides an open forum for all those interested in risk analysis. SRA was established in 1980 and has published Risk Analysis: An International Journal, the leading scholarly journal in the field, continuously since 1981. For more information, visit

A record winter during the American Revolution almost put independence on ice

December 18, 2017 | The seasonal forecasts are in for this winter, and they generally indicate relatively average conditions across much of the country's midsection, with wetter-than-normal weather likely in the North and dryness in the South.

Continuing to improve these longer-term forecasts can help communities and businesses prepare for particular weather patterns — and possibly even save lives. In fact, a good seasonal forecast could even have made a difference during a critical moment in the American Revolution.

This National Park Service painting portrays conditions at the Continental Army's New Jersey encampment in the winter of 1779-80, with a hospital hut in the foreground. (Image from Morristown National Historical Park.)

No East Coast season on record was colder than the winter of 1779-80. All of the saltwater inlets, harbors, and sounds of the Atlantic coastal plain froze over from Canada to North Carolina, remaining closed to navigation for a month or more — the only time that has happened in recorded history. The winter happened to occur during the height of the Revolutionary War. George Washington and his soldiers were greeted by a foot of snow already on the ground in November 1779 when they began arriving at their winter quarters outside Morristown, New Jersey.

The ensuing winter months almost cost the young nation its independence. The Continental Army was hammered by repeated snowstorms, including a blizzard in early January that dumped four feet of snow. Many of the soldiers lacked coats, shirts, shoes, and even food. As the winter wore on, the soldiers became more embittered and mutinous than during the storied but milder winter two years earlier at Valley Forge.
If not for help from surrounding communities during the winter of 1779-80, they might have deserted or even starved to death, potentially changing the course of history.

Could such a winter be predicted today?

Longer-term forecasting in the two-week to three-month range is one of the most difficult challenges in meteorology. These subseasonal to seasonal forecasts, while providing general guidance, still lack much precision. This winter, for example, the National Oceanic and Atmospheric Administration (NOAA) predicts that the chances are roughly equal for conditions that are wetter or drier than normal across large swaths of the mid-Atlantic. Even a forecast of a wetter-than-average winter could play out in many ways, from a series of light rains to a couple of blockbuster snowstorms.

The National Oceanic and Atmospheric Administration's forecast for wintertime precipitation, released in October, projects drier-than-normal conditions in the South and wetter-than-normal conditions in parts of the North. Click here for an analysis of the forecast by NOAA's Climate Prediction Center, as well as a forecast map that includes Alaska and Hawaii.

To add more detail to such forecasts, scientists are working to better understand the links between U.S. weather patterns and large-scale atmospheric and oceanic conditions, such as El Niño and the North Atlantic Oscillation (more popularly known as the "polar vortex" when it ushers in cold weather). Recognizing the importance of such research, Congress in April passed the Weather Research and Forecasting Innovation Act, a major weather bill that calls for more work on subseasonal to seasonal prediction.

If the modern understanding of the atmosphere and oceans had existed during the American Revolution, perhaps Washington and his soldiers could have taken more precautions.
The next time the nation is threatened by an unusually severe winter, better forecasts may make it possible to prepare.

"Scientists are gaining new insights into the entire Earth system in ways that will lead to predictions of weather patterns weeks, months, or even more than a year in advance," said UCAR President Antonio Busalacchi. "History shows this type of intelligence can be critical to national security, as well as to businesses and vulnerable communities."

Writer/contact: David Hosansky, Manager of Media Relations

Groundbreaking data set gives unprecedented look at future weather

Dec. 7, 2017 | How will weather change in the future? It's been remarkably difficult to say, but researchers are now making important headway, thanks in part to a groundbreaking new data set at the National Center for Atmospheric Research (NCAR).

Scientists know that a warmer and wetter atmosphere will lead to major changes in our weather. But pinning down exactly how weather — such as thunderstorms, midwinter cold snaps, hurricanes, and mountain snowstorms — will evolve in the coming decades has proven a difficult challenge, constrained by the sophistication of models and the capacity of computers.

Now, a rich new data set is giving scientists an unprecedented look at the future of weather. Nicknamed CONUS 1 by its creators, the data set is enormous. To generate it, the researchers ran the NCAR-based Weather Research and Forecasting model (WRF) at an extremely high resolution (4 kilometers, or about 2.5 miles) across the entire contiguous United States (sometimes known as "CONUS") for a total of 26 simulated years: half in the current climate and half in the future climate expected if society continues on its current trajectory. The project took more than a year to run on the Yellowstone supercomputer at the NCAR-Wyoming Supercomputing Center. The result is a trove of data that allows scientists to explore in detail how today's weather would look in a warmer, wetter atmosphere.

CONUS 1, which was completed last year and made easily accessible through NCAR's Research Data Archive this fall, has already spawned nearly a dozen papers that explore everything from changes in rainfall intensity to changes in mountain snowpack.

"This was a monumental effort that required a team of researchers with a broad range of expertise — from climate experts and meteorologists to social scientists and data specialists — and years of work," said NCAR senior scientist Roy Rasmussen, who led the project.
"This is the kind of work that's difficult to do in a typical university setting but that NCAR can take on and make available to other researchers."

Other principal project collaborators at NCAR are Changhai Liu and Kyoko Ikeda. A number of additional NCAR scientists lent expertise to the project, including Mike Barlage, Andrew Newman, Andreas Prein, Fei Chen, Martyn Clark, Jimy Dudhia, Trude Eidhammer, David Gochis, Ethan Gutmann, Gregory Thompson, and David Yates. Collaborators from the broader community include Liang Chen, Sopan Kurkute, and Yanping Li (University of Saskatchewan), and Aiguo Dai (SUNY Albany).

Climate and weather research coming together

Climate models and weather models have historically operated on different scales, both in time and space. Climate scientists are interested in large-scale changes that unfold over decades, and the models they've developed help them nail down long-term trends such as increasing surface temperatures, rising sea levels, and shrinking sea ice. Climate models are typically low resolution, with grid points often separated by 100 kilometers (about 60 miles). The advantage of such coarse resolution is that these models can be run globally for decades or centuries into the future with the available supercomputing resources. The downside is that they lack the detail to capture features that influence local atmospheric events, such as land surface topography, which drives mountain weather, or the small-scale circulation of warm air rising and cold air sinking that sparks a summertime thunderstorm.

Weather models, on the other hand, have higher resolution, take into account atmospheric microphysics, such as cloud formation, and can simulate weather fairly realistically.
It's not practical to run them for long periods of time or globally, however — supercomputers are not yet up to the task. As scientific understanding of climate change has deepened, the need to merge these disparate scales has become more pressing, to gain a better understanding of how global atmospheric warming will affect local weather patterns.

"The climate community and the weather community are really starting to come together," Rasmussen said. "At NCAR, we have both climate scientists and weather scientists, we have world-class models, and we have access to state-of-the-art supercomputing resources. This allowed us to create a data set that offers scientists a chance to start answering important questions about the influence of climate change on weather."

Weather models are typically run at a much higher resolution than climate models, allowing them to more accurately capture precipitation. This figure compares the average annual precipitation from a 13-year run of the Weather Research and Forecasting (WRF) model (left) with the average annual precipitation from running the global Community Earth System Model (CESM) to simulate the climate between 1976 and 2005 (right). (©UCAR. This figure is courtesy of Kyoko Ikeda. It is freely available for media & nonprofit use.)

Today's weather in a warmer, wetter future

To create the data set, the research team used WRF to simulate weather across the contiguous United States between 2000 and 2013. They initialized the model using a separate "reanalysis" data set constructed from observations. When compared with radar images of actual weather during that time period, the results were excellent.

"We weren't sure how good a job the model would do, but the climatology of real storms and the simulated storms was very similar," Rasmussen said.

With confidence that WRF could accurately simulate today's weather, the scientists ran the model for a second 13-year period, using the same reanalysis data but with a few changes.
Notably, the researchers increased the temperature of the background climate conditions by about 5 degrees Celsius (9 degrees Fahrenheit), the end-of-century (2080–2100) temperature increase predicted by averaging 19 leading climate models under a business-as-usual greenhouse gas emissions scenario. They also increased the water vapor in the atmosphere by the corresponding amount, since physics dictates that a warmer atmosphere can hold more moisture.

The result is a data set that examines how weather events from the recent past — including named hurricanes and other distinctive weather events — would look in our expected future climate. The data have already proven to be a rich resource for people interested in how individual types of weather will respond to climate change. Will the squall lines of intense thunderstorms that rake across the country's midsection get more intense, more frequent, and larger? (Yes, yes, and yes.) Will snowpack in the West get deeper, shallower, or disappear? (It depends on the location: The high-elevation Rockies are much less vulnerable to the warming climate than the coastal ranges.) Other scientists have already used the CONUS 1 data set to examine changes to rain-on-snow events, the speed of snowmelt, and more.

Running the Weather Research and Forecasting (WRF) model at 4-kilometer resolution over the contiguous United States produced realistic simulations of precipitation. Above, average annual precipitation from WRF for the years 2000-2013 (left) compared to the PRISM dataset for the same period (right). PRISM is based on observations. (©UCAR. This figure is courtesy of Kyoko Ikeda. It is freely available for media & nonprofit use.)

Pinning down changes in storm track

While the new data set offers a unique opportunity to delve into changes in weather, it also has limitations.
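The perturbation approach described above is often called "pseudo-global warming": warm the background state by a fixed increment and scale the water vapor so that relative humidity is preserved, following the Clausius-Clapeyron relation (saturation vapor pressure rises roughly 6-7 percent per degree Celsius of warming). A minimal sketch of that bookkeeping, assuming a standard Magnus approximation for saturation vapor pressure; the function names and numbers are illustrative, not taken from the actual CONUS workflow:

```python
import math

def saturation_vapor_pressure(t_celsius):
    """Approximate saturation vapor pressure (hPa) via the Magnus
    formula, adequate for typical tropospheric temperatures."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

def pseudo_global_warming(t_celsius, vapor_mixing_ratio, delta_t=5.0):
    """Warm the background state by delta_t degrees C and scale the
    water vapor mixing ratio (g/kg) so relative humidity is unchanged."""
    t_future = t_celsius + delta_t
    scale = (saturation_vapor_pressure(t_future) /
             saturation_vapor_pressure(t_celsius))
    return t_future, vapor_mixing_ratio * scale

# Example: a 20 C air parcel holding 10 g/kg of water vapor.
t2, q2 = pseudo_global_warming(20.0, 10.0)
print(round(t2, 1), round(q2, 2))  # 5 C warmer, roughly a third more vapor
```

Applied to every grid point of the reanalysis, this kind of transformation yields a "same weather, warmer background" boundary condition, which is exactly why the large-scale storm tracks stay fixed between the two CONUS 1 simulations.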
Importantly, it does not reflect how the warming climate might shift large-scale weather patterns, like the typical track most storms take across the United States. Because the same reanalysis data set is used to kick off both the current and future climate simulations, the large-scale weather patterns remain the same in both scenarios.

To remedy this, the scientists are already working on a new simulation, nicknamed CONUS 2. Instead of using the reanalysis data set — which was built from actual observations — to kick off the modeling run of the present-day climate, the scientists will use weather extracted from a simulation by the NCAR-based Community Earth System Model (CESM). For the future climate run, the scientists will again take the weather patterns from a CESM simulation — this time for the year 2100 — and feed the information into WRF. The finished data set, which will cover two 20-year periods, will likely take another year of supercomputing time to complete, this time on the newer and more powerful Cheyenne system at the NCAR-Wyoming Supercomputing Center. When complete, CONUS 2 will help scientists understand how expected changes in storm tracks will affect local weather across the country.

Scientists are already eagerly awaiting the data from the new runs, which could start in early 2018. But even that data set will have limitations. One of the greatest may be that it will rely on a single run of CESM. Another NCAR-based project ran the model 40 times from 1920 to 2100 with only minor changes to the model's starting conditions, showing researchers how the natural chaos of the atmosphere can cause the climate to look quite different from simulation to simulation. Still, a single run of CESM will let scientists make comparisons between CONUS 1 and CONUS 2, allowing them to pinpoint how possible storm track changes would alter local weather.
And CONUS 2 can also be compared to other efforts that downscale global simulations to study how regional areas will be affected by climate change, providing insight into the pros and cons of different research approaches.

"This is a new way to look at climate change that allows you to examine the phenomenology of weather and answer the question, 'What will today's weather look like in a future climate?'" Rasmussen said. "This is the kind of detailed, realistic information that water managers, city planners, farmers, and others can understand, and it helps them plan for the future."

Get the data: High Resolution WRF Simulations of the Current and Future Climate of North America, DOI: 10.5065/D6V40SXP

Studies that relied on the CONUS data set:
- Extreme downpours could increase fivefold across parts of the U.S.
- Slower snowmelt in a warming world
- North American storm clusters could produce 80 percent more rain

Writer/contact: Laura Snider, Senior Science Writer

North American storm clusters could produce 80 percent more rain

BOULDER, Colo. — Major clusters of summertime thunderstorms in North America will grow larger, more intense, and more frequent later this century in a changing climate, unleashing far more rain and posing a greater threat of flooding across wide areas, new research concludes.

The study, by scientists at the National Center for Atmospheric Research (NCAR), builds on previous work showing that storms are becoming more intense as the atmosphere warms. In addition to higher rainfall rates, the new research finds that the volume of rainfall from damaging storms known as mesoscale convective systems (MCSs) will increase by as much as 80 percent across the continent by the end of this century, deluging entire metropolitan areas or sizable portions of states.

"The combination of more intense rainfall and the spreading of heavy rainfall over larger areas means that we will face a higher flood risk than previously predicted," said NCAR scientist Andreas Prein, the study's lead author. "If a whole catchment area gets hammered by high rain rates, that creates a much more serious situation than a thunderstorm dropping intense rain over parts of the catchment."

"This implies that the flood guidelines which are used in planning and building infrastructure are probably too conservative," he added.

The research team drew on extensive computer modeling that realistically simulates MCSs and thunderstorms across North America to examine what will happen if emissions of greenhouse gases continue unabated. The study will be published Nov. 20 in the journal Nature Climate Change. It was funded by the National Science Foundation, which is NCAR's sponsor, and by the U.S. Army Corps of Engineers.

Hourly rain rate averages for the 40 most extreme summertime mesoscale convective systems (MCSs) in the current (left) and future climate of the mid-Atlantic region.
New research shows that MCSs will generate substantially higher maximum rain rates over larger areas by the end of the century if society continues a "business as usual" approach of emitting greenhouse gases. (©UCAR. Image by Andreas Prein, NCAR. This image is freely available for media & nonprofit use.)

A warning signal

Thunderstorms and other heavy rainfall events are estimated to cause more than $20 billion of economic losses annually in the United States, the study notes. Particularly damaging, and often deadly, are MCSs: clusters of thunderstorms that can extend for many dozens of miles and last for hours, producing flash floods, debris flows, landslides, high winds, and/or hail. The persistent storms over Houston in the wake of Hurricane Harvey were an example of an unusually powerful and long-lived MCS.

Storms have become more intense in recent decades, and a number of scientific studies have shown that this trend is likely to continue as temperatures continue to warm. The reason, in large part, is that the atmosphere can hold more water as it gets warmer, thereby generating heavier rain.

A study by Prein and co-authors last year used high-resolution computer simulations of current and future weather, finding that the number of summertime storms that produce extreme downpours could increase fivefold across parts of the United States by the end of the century. In the new study, Prein and his co-authors focused on MCSs, which are responsible for much of the major summertime flooding east of the Continental Divide. They investigated not only how the storms' rainfall intensity will change in future climates, but also how their size, movement, and rainfall volume may evolve.

Analyzing the same dataset of computer simulations and applying a special storm-tracking algorithm, they found that the number of severe MCSs in North America more than tripled by the end of the century.
Moreover, maximum rainfall rates became 15 to 40 percent heavier, and intense rainfall reached farther from the storm's center. As a result, severe MCSs increased throughout North America, particularly in the northeastern and mid-Atlantic states, as well as parts of Canada, where they are currently uncommon.

The research team also looked at the potential effect of particularly powerful MCSs on the densely populated Eastern Seaboard. They found, for example, that at the end of the century, intense MCSs over an area the size of New York City could drop 60 percent more rain than a severe present-day system. That amount is equivalent to adding six times the annual discharge of the Hudson River on top of a current extreme MCS in that area.

"This is a warning signal that says the floods of the future are likely to be much greater than what our current infrastructure is designed for," Prein said. "If you have a slow-moving storm system that aligns over a densely populated area, the result can be devastating, as could be seen in the impact of Hurricane Harvey on Houston."

This satellite image loop shows an MCS developing over West Virginia on June 23, 2016. The resulting storms caused widespread flooding, killing more than 20 people. MCSs are responsible for much of the major flooding east of the Continental Divide during warm weather months. (Image by NOAA National Weather Service, Aviation Weather Center.)

Intensive modeling

Advances in computer modeling and more powerful supercomputing facilities are enabling climate scientists to begin examining the potential influence of a changing climate on convective storms such as thunderstorms, building on previous studies that looked more generally at regional precipitation trends. For the new study, Prein and his co-authors turned to a dataset created by running the NCAR-based Weather Research and Forecasting (WRF) model over North America at a resolution of 4 kilometers (about 2.5 miles).
That resolution is sufficiently fine to simulate MCSs. The intensive modeling, by NCAR scientists and study co-authors Roy Rasmussen, Changhai Liu, and Kyoko Ikeda, required a year to run on the Yellowstone system at the NCAR-Wyoming Supercomputing Center.

The team used an algorithm developed at NCAR to identify and track simulated MCSs. They compared simulations of the storms at the beginning of the century, from 2000 to 2013, with observations of actual MCSs during the same period and showed that the modeled storms are statistically identical to real MCSs. The scientists then used the dataset and algorithm to examine how MCSs may change by the end of the century in a climate that is approximately 5 degrees Celsius (9 degrees Fahrenheit) warmer than in the pre-industrial era — the temperature increase expected if greenhouse gas emissions continue unabated.

About the paper

Title: Increased rainfall volume from future convective storms in the US
Authors: Andreas F. Prein, Changhai Liu, Kyoko Ikeda, Stanley B. Trier, Roy M. Rasmussen, Greg J. Holland, Martyn P. Clark
Journal: Nature Climate Change
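The identification step of a storm-tracking algorithm like the one described above can be illustrated with a toy example: threshold a gridded rain-rate field, group contiguous wet cells into candidate storm objects, and keep only objects large enough to count as an organized system. This is a simplified sketch; the threshold, minimum size, and function name are hypothetical choices, not those of the actual NCAR algorithm, which also links objects across time steps.

```python
from collections import deque

def find_storm_objects(rain_rate, threshold=10.0, min_cells=4):
    """Label contiguous grid cells whose rain rate (mm/h) meets or
    exceeds `threshold`; return each object as a list of (row, col)
    cells, discarding objects smaller than `min_cells`."""
    rows, cols = len(rain_rate), len(rain_rate[0])
    seen = [[False] * cols for _ in range(rows)]
    objects = []
    for r in range(rows):
        for c in range(cols):
            if seen[r][c] or rain_rate[r][c] < threshold:
                continue
            # Breadth-first search over 4-connected neighbors.
            cells, queue = [], deque([(r, c)])
            seen[r][c] = True
            while queue:
                cr, cc = queue.popleft()
                cells.append((cr, cc))
                for nr, nc in ((cr - 1, cc), (cr + 1, cc),
                               (cr, cc - 1), (cr, cc + 1)):
                    if (0 <= nr < rows and 0 <= nc < cols
                            and not seen[nr][nc]
                            and rain_rate[nr][nc] >= threshold):
                        seen[nr][nc] = True
                        queue.append((nr, nc))
            if len(cells) >= min_cells:
                objects.append(cells)
    return objects

# A tiny 5x5 rain-rate field with one large and one isolated wet patch.
field = [
    [0, 12, 14,  0, 0],
    [0, 11, 13,  0, 0],
    [0,  0,  0,  0, 0],
    [0,  0,  0, 15, 0],
    [0,  0,  0,  0, 0],
]
print(len(find_storm_objects(field)))  # only the 4-cell patch survives
```

Applied hour by hour to model output, objects that overlap between consecutive time steps can then be stitched into storm tracks, which is what makes statistics like size, movement, and rainfall volume possible.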

UCAR Congressional Briefing: Moving research to industry

WASHINGTON — Federally funded scientific advances are enabling the multibillion-dollar weather industry to deliver increasingly targeted forecasts to consumers and businesses, strengthening the economy and providing the nation with greater resilience to natural disasters, experts said today at a congressional briefing.

The panel of experts, representing universities, federally funded labs, and the private sector, said continued government investment in advanced computer modeling, observing tools, and other basic research provides the foundation for improved forecasts. The nonprofit University Corporation for Atmospheric Research (UCAR) sponsored the briefing.

"Thanks to a quiet revolution in modern weather prediction, we can all use forecasts to make decisions in ways that wouldn't have been possible just 10 years ago," said Rebecca Morss, a senior scientist with the National Center for Atmospheric Research (NCAR) and deputy director of the center's Mesoscale and Microscale Meteorology Lab. "Now we are looking to the next revolution, which includes giving people longer lead times and communicating risk as effectively as possible."

Fuqing Zhang, a professor of meteorology and statistics at Pennsylvania State University, highlighted the ways that scientists are advancing their understanding of hurricanes and other storms with increasingly detailed observations and computer modeling. Researchers at Penn State, for example, fed data from the new National Oceanic and Atmospheric Administration GOES-R satellite into NOAA's powerful FV3 model to generate an experimental forecast of Hurricane Harvey that simulated its track and intensity.

"The future of weather forecasting is very promising," said Zhang, who is also the director of the Penn State Center for Advanced Data Assimilation and Predictability Techniques.
"With strategic investments in observations, modeling, data assimilation, and supercomputing, we will see some remarkable achievements."

Mary Glackin, director of science and forecast operations for The Weather Company, an IBM business, said the goal of the weather industry is to help consumers and businesses make better decisions, both by providing its own forecasts and by forwarding alerts from the National Weather Service. The Weather Company is currently adapting a powerful research weather model based at NCAR, the Model for Prediction Across Scales (MPAS), for use in worldwide, real-time forecasts.

The NCAR-based Model for Prediction Across Scales simulates the entire globe while enabling scientists to zoom in on areas of interest. It is one of the key tools for improving forecasts in the future. (©UCAR. This image is freely available for media & nonprofit use.)

"We have a weather and climate enterprise that we can be extremely proud of as a nation, but it's not where it should be," Glackin said. "Weather affects every consumer and business, and the public-private partnership can play a pivotal role in providing better weather information that is critically needed."

Antonio Busalacchi, president of UCAR, emphasized the benefits of partnerships across the academic, public, and private sectors. He said that research investments by the National Science Foundation, NOAA, and other federal agencies are critical for improving forecasts that will better protect vulnerable communities and strengthen the economy.

"These essential collaborations between government agencies, universities, and private companies are driving landmark advances in weather forecasting," Busalacchi said.
"The investments that taxpayers are making in basic research are paying off many times over by keeping our nation safer and more prosperous."

The briefing was the latest in a series of UCAR Congressional Briefings that draw on expertise from UCAR's university consortium and public-private partnerships to provide insights into critical topics in the Earth system sciences. Past briefings have focused on wildfires, predicting space weather, aviation weather safety, the state of the Arctic, hurricane prediction, potential impacts of El Niño, and new advances in water forecasting.

New climate forecasts for watersheds - and the water sector

Nov. 10, 2017 | Water managers and streamflow forecasters can now access bi-weekly, monthly, and seasonal precipitation and temperature forecasts that are broken down by individual watersheds, thanks to a research partnership between the National Center for Atmospheric Research (NCAR) and the University of Colorado Boulder (CU Boulder). The project is sponsored by the National Oceanic and Atmospheric Administration (NOAA) through the Modeling, Analysis, Predictions, and Projections program.

Operational climate forecasts for subseasonal to seasonal time scales are currently provided by the NOAA Climate Prediction Center and other sources. The forecasts usually take the form of national contour maps (example) and gridded datasets at a relatively coarse geographic resolution. Some forecast products are broken down further, based on state boundaries or on climate divisions, which average two per state; others are summarized for major cities. But river forecasters and water managers grapple with climate variability and trends in the particular watersheds within their service areas, which do not align directly with the boundaries of existing forecast areas. A forecast that directly describes predicted conditions inside an individual watershed would be extremely valuable to these users for making decisions in their management areas, such as how much water to release or store in critical reservoirs, and when.

To bridge this gap, the NCAR–CU Boulder research team has developed a new prototype prediction system that maps climate forecasts to watershed boundaries over the contiguous United States in real time. The system is currently running at NCAR, with real-time forecasts and analyses available on a demonstration website.

"We are trying to improve the accessibility and relevance of climate predictions for streamflow forecasting groups and water managers," said NCAR scientist Andy Wood, who co-leads the project.
"We can’t solve all the scientific challenges of climate prediction, but we can make it easier for a person thinking about climate and water in a river basin — such as the Gunnison, or the Yakima, or the Potomac — to find and download operational climate information that has been tailored to that basin’s observed variability."

The project is funded by NOAA, and the scientists plan to hand off successful components of the system for experimental operational evaluation within the NOAA National Weather Service. Collaborators include scientists from the NOAA Climate Prediction Center and partners from the major federal water agencies: the U.S. Army Corps of Engineers and the Bureau of Reclamation.

This screenshot of the S2S Climate Outlooks for Watersheds website shows forecasted temperature anomalies for watersheds across the contiguous United States. As users scroll across different watersheds, they get more precise information. In this screenshot from early November 2017, the forecast shows that, over the next one to two weeks, the Colorado Headwaters watershed is expected to be 1.2 degrees warmer than normal. Visit the website to learn more. (©UCAR. This image is freely available for media & nonprofit use.)

Beyond the standard weather forecast

Precipitation and temperature forecasts that extend beyond the typical 7- to 10-day window can be useful to water managers making a number of important decisions about how best to regulate supplies. For instance, during a wet water year, when snowpack is high and reservoirs are fuller than usual, the relative warmth or coolness of the coming spring can affect how quickly the snow melts.
Good spring season forecasts allow water managers to plan in advance for how best to manage the resulting runoff. For water systems in drought, such as California's during 2012–2015, early outlooks on whether the winter rainy season will help alleviate the drought or exacerbate it can help water utilities strategize ways of meeting the year's water demands.

Historically, making these kinds of longer-term predictions accurately has been highly challenging. But in recent years, scientists have improved their skill at subseasonal and seasonal climate prediction. NOAA's National Centers for Environmental Prediction plays a key role, both running an in-house modeling system — the Climate Forecast System, version 2 (CFSv2) — and leading an effort called the North American Multi-Model Ensemble (NMME). These model-based forecasts help inform the official NOAA climate forecasts, which also draw on other tools and expert judgment.

NMME combines forecasts from seven different climate models based in the U.S. and Canada to form a super-ensemble of climate predictions that extend up to 10 months into the future. The combination of the different forecasts is often more accurate than the forecast from any single model. Temperature forecasts from the combined system, in particular, are notably more accurate than they were 10 years ago, Wood said, partly due to their representation of observed warming trends. Even with these new tools, however, predicting seasonal precipitation beyond the first month continues to be a major challenge.

The NCAR–CU Boulder project makes use of both the CFSv2 and NMME forecasts. From CFSv2, it generates predictions for bi-weekly periods (weeks 1-2, 2-3, and 3-4) that are updated daily; from the NMME, it derives longer-term forecasts (months 1, 2, and 3, and season 1) that are updated monthly.
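At its simplest, the multi-model combination behind a super-ensemble like NMME is an average of each model's forecast, optionally weighted by past performance. The sketch below illustrates the idea only; the model names and anomaly values are invented for this example and are not actual NMME output.

```python
import numpy as np

# Hypothetical monthly temperature-anomaly forecasts (deg C) from several
# models for one watershed and one target month (values invented).
model_forecasts = {
    "model_a": 1.4,
    "model_b": 0.8,
    "model_c": 1.1,
    "model_d": 0.5,
}

def super_ensemble_mean(forecasts, weights=None):
    """Combine single-model forecasts into one multi-model forecast.

    With no weights, every model counts equally -- the simplest form of
    multi-model combination, which often beats any single model because
    independent model errors partially cancel.
    """
    values = np.array(list(forecasts.values()), dtype=float)
    if weights is None:
        return float(values.mean())
    w = np.array([weights[m] for m in forecasts], dtype=float)
    return float(np.sum(w * values) / np.sum(w))

print(super_ensemble_mean(model_forecasts))  # equal-weight mean: 0.95
```

In practice, operational combinations are more sophisticated (per-model skill weighting, bias removal, probabilistic output), but the error-cancellation intuition is the same.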
The scientists currently map these forecasts to 202 major watersheds in the contiguous U.S.

Analyzing forecast skill

The resulting watershed-specific forecasts are available in real time on the project's interactive website, which also provides information about their accuracy and reliability.

"It's important for users to be able to check on the quality of the forecasts," said Sarah Baker, a doctoral student in the Civil, Environmental, and Architectural Engineering Department at CU Boulder. "We're able to use hindcasts, which are long records of past forecasts, to analyze and describe the skill of the current forecasts."

Baker, who also works for the Bureau of Reclamation, has been building the prototype system under the supervision of Wood and her academic adviser, CU Professor Balaji Rajagopalan. The researchers are also using analyses of forecast accuracy and reliability to begin correcting for systematic biases in the forecasts — such as consistently over-predicting springtime rains in one watershed or under-predicting summertime heat in another.

The project team has presented its work at a number of water-oriented meetings in the western U.S. Water managers, operators, and researchers from agencies such as the Bureau of Reclamation and utilities such as the Southern Nevada Water Authority, which manages water for Las Vegas, have expressed interest in the new forecast products.

"This project has great potential to provide climate outlook information that is more relevant for hydrologists and the water sector. It will be critical to connect with stakeholders or possible users of the forecasts so that their needs can continue to help shape this type of information product," said NOAA's Andrea Ray.
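Hindcast-based skill checks of the kind Baker describes boil down to comparing an archive of past forecasts against what was later observed. A minimal sketch of two common diagnostics, mean bias and correlation, using made-up numbers rather than actual project hindcasts:

```python
import numpy as np

# Hypothetical hindcast archive for one watershed: forecast vs. observed
# monthly temperature anomalies (deg C) over ten past years (invented data).
forecast = np.array([0.5, 1.2, -0.3, 0.8, 1.5, -0.6, 0.2, 1.0, -0.1, 0.9])
observed = np.array([0.2, 1.0, -0.5, 0.4, 1.1, -0.9, 0.0, 0.7, -0.4, 0.6])

# Mean bias: a systematic offset that can be subtracted from new forecasts.
bias = float(np.mean(forecast - observed))

# Correlation: does the forecast vary in step with what actually happened?
skill = float(np.corrcoef(forecast, observed)[0, 1])

# The simplest bias correction removes the systematic offset.
corrected = forecast - bias

print(round(bias, 2), round(skill, 2))
```

Here the toy forecasts run consistently warm, so subtracting the mean bias leaves them centered on the observations; the correlation summarizes how much of the year-to-year variation the forecasts capture. Operational post-processing goes further, but this is the core of what a hindcast archive makes possible.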
Ray leads an effort funded by NIDIS, the National Integrated Drought Information System, to identify tools and information such as this for a NOAA online Water Resources Monitor and Outlook that would also help connect stakeholders to climate and water information.

In the coming year, the research team will implement statistical post-processing methods to improve the accuracy of the forecasts. They will also investigate the prediction of extreme climate events at the watershed scale.

Contact
Andy Wood, NCAR Research Applications Laboratory

Collaborators
CU Boulder
NCAR
NOAA
U.S. Army Corps of Engineers
Bureau of Reclamation

Funder
NOAA's Modeling, Applications, Predictions and Projections Climate Testbed program

NCAR|UCAR hurricane experts available to explain storm behavior, potential impacts

BOULDER, Colo. — As Hurricane Harvey takes aim at Texas, scientists at the National Center for Atmospheric Research (NCAR) and its managing organization, the University Corporation for Atmospheric Research (UCAR), are closely watching the storm and testing high-resolution computer models.

Hurricane experts are available to explain issues such as:

How we can better predict the possible impacts of hurricanes, including wind damage, flooding, and subsequent spread of disease-bearing mosquitoes;

How people respond to hurricane forecast and warning messages and how risk communication can be improved;

Whether climate change is affecting hurricanes and what we can expect in the future;

The importance of improving weather models to safeguard life and property.

Antonio Busalacchi, UCAR president (please contact David Hosansky for interview requests)
An expert on ocean-atmosphere interactions, Busalacchi has testified before Congress on the importance of improving the nation's weather forecasting capabilities to better protect life and property, bolster the economy, and strengthen national security. He has firsthand experience with storms along the Gulf Coast as a part-time New Orleans resident, and he is a member of the Gulf Research Program Advisory Board of the National Academy of Sciences.

Christopher Davis, director, NCAR Mesoscale and Microscale Meteorology Laboratory, 303-497-8990
Davis studies the weather systems that lead to hurricanes and other heavy rainfall events. His expertise includes hurricane prediction and how computer models can be improved to better forecast storms. His NCAR weather lab is running experimental computer simulations of Hurricane Harvey.

James Done, NCAR scientist, 303-497-8209
Done led development of the innovative Cyclone Damage Potential (CDP) index, which quantifies a hurricane's ability to cause destruction on a scale of 1 to 10.
It can also be used to examine the damage potential for cyclones in the future as the climate warms.

David Gochis, NCAR scientist, 303-497-2809
An expert in hydrometeorology, Gochis studies the causes of floods and how to better predict them. He helped develop pioneering software that is at the core of the National Water Model. The National Oceanic and Atmospheric Administration Office of Water Prediction uses this model to provide a continuous picture of all the waterways in the contiguous United States and alert officials to potentially dangerous floods.

Matthew Kelsch, UCAR hydrometeorologist, 303-497-8309
Kelsch has studied some of the biggest U.S. flood events connected to hurricanes and tropical storms. He trains scientists and forecasters from around the world on emerging hydrology and weather topics.

Rebecca Morss, NCAR scientist, 303-497-8172
Morss studies the predictability of hurricane-related hazards, including storm surge and inland flooding, as well as hurricane and flood risk communication and evacuation decision making.

Kevin Trenberth, NCAR senior scientist, 303-497-1318
Trenberth is an expert on the global climate system. He has been at the forefront of scientists examining the potential influence of climate change on the intensity of tropical storms and hurricanes and the increased widespread flooding that they cause.

Jeff Weber, UCAR meteorologist, 303-497-8676
As an expert on hurricanes and severe weather in general, Weber closely monitors the behavior of individual storms and the larger atmospheric and oceanic conditions that influence them.

UCAR collaboration with The Weather Company to improve weather forecasts worldwide

BOULDER, Colo. — The University Corporation for Atmospheric Research (UCAR) today announced a new collaboration with The Weather Company, an IBM business, to improve global weather forecasting. The partnership brings together cutting-edge computer modeling developed at the National Center for Atmospheric Research (NCAR) with The Weather Company's meteorological science and IBM's advanced computing equipment.

"This is a major public-private partnership that will advance weather prediction and generate significant benefits for businesses making critical decisions based on weather forecasts," said UCAR President Antonio J. Busalacchi. "We are gratified that taxpayer investments in the development of weather models are now helping U.S. industries compete in the global marketplace."

UCAR, a nonprofit consortium of 110 universities focused on research and training in the atmospheric and related Earth system sciences, manages NCAR on behalf of the National Science Foundation.

With the new agreement, The Weather Company will develop a global forecast model based on the Model for Prediction Across Scales (MPAS), an innovative software platform developed by NCAR and the Los Alamos National Laboratory.

The Model for Prediction Across Scales (MPAS) enables forecasters to combine a global view of the atmosphere with a higher-resolution view of a particular region, such as North America. (©UCAR. This image is freely available for media & nonprofit use.)

MPAS offers a unique way of simulating the global atmosphere while providing users with more flexibility when focusing on specific regions of interest. Unlike traditional three-dimensional models that calculate atmospheric conditions at multiple points within a block-shaped grid, it uses a hexagonal mesh resembling a honeycomb that can be stretched wide in some regions and compressed for higher resolution in others.
This enables forecasters to simultaneously capture far-flung atmospheric conditions that can influence local weather, as well as small-scale features such as vertical wind shear that can affect thunderstorms and other severe weather.

Drawing on the computational power of GPUs — graphics processing units — such as those used in a powerful new generation of IBM supercomputers, and on the expertise of NCAR and The Weather Company, the new collaboration is designed to push the capabilities of MPAS to yield more accurate forecasts with longer lead times. The results of NCAR's work will be freely available to the meteorological community. Businesses, from airlines to retailers, as well as the general public, stand to benefit.

Mary Glackin, head of weather science and operations for The Weather Company, said, "As strong advocates for science, we embrace strong public-private collaborations that understand the value science brings to society, such as our continued efforts with UCAR to advance atmospheric and computational sciences."

"Thanks to research funded by the National Science Foundation and other federal agencies, society is on the cusp of a new era in weather prediction, with more precise short-range forecasts as well as longer-term forecasts of seasonal weather patterns," Busalacchi said. "These forecasts are important for public health and safety, as well as enabling companies to leverage economic opportunities in ways that were never possible before."

About The Weather Company

The Weather Company, an IBM Business, helps people make informed decisions and take action in the face of weather. The company offers weather data and insights to millions of consumers, as well as thousands of marketers and businesses, via Weather's API, its business solutions division, and its own digital products from The Weather Channel and Weather Underground.

Offshore wind turbines vulnerable to Category 5 hurricane gusts

NCAR scientist George Bryan is a co-author of a new study appearing in the journal Geophysical Research Letters. The following is an excerpt from a news release by the University of Colorado Boulder.

Offshore wind turbines built according to current standards may not be able to withstand the powerful gusts of a Category 5 hurricane, creating potential risk for any such turbines built in hurricane-prone areas, new University of Colorado Boulder-led research shows.

The study, which was conducted in collaboration with the National Center for Atmospheric Research in Boulder, Colorado, and the U.S. Department of Energy's National Renewable Energy Laboratory in Golden, Colorado, highlights the limitations of current turbine design and could provide guidance for manufacturers and engineers looking to build more hurricane-resilient turbines in the future.

Offshore wind-energy development in the U.S. has ramped up in recent years, with projects either under consideration or already underway in most Atlantic coastal states from Maine to the Carolinas, as well as on the West Coast and in the Great Lakes. The country's first utility-scale offshore wind farm, consisting of five turbines, began commercial operation in December 2016 off the coast of Rhode Island.

Turbine design standards are governed by the International Electrotechnical Commission (IEC); for offshore turbines, no specific guidelines for hurricane-force winds exist. Offshore turbines can be built larger than land-based turbines, however, owing to a manufacturer's ability to transport larger molded components such as blades via freighter rather than over land by rail or truck.

Read the full news release.

Warmer temperatures cause decline in key runoff measure

BOULDER, Colo. — Since the mid-1980s, the percentage of precipitation that becomes streamflow in the Upper Rio Grande watershed has fallen more steeply than at any point in at least 445 years, according to a new study led by the National Center for Atmospheric Research (NCAR). While this decline was driven in part by the transition from an unusually wet period to an unusually dry period, rising temperatures deepened the trend, the researchers said.

The study paints a detailed picture of how temperature has affected the runoff ratio — the share of snow and rain that actually makes it into the river — over time, and the findings could help improve water supply forecasts for the Rio Grande, which is a source of water for an estimated 5 million people. The results also suggest that runoff ratios in the Upper Rio Grande and other neighboring snow-fed watersheds, such as the Colorado River Basin, could decline further as the climate continues to warm.

Sandhill cranes in the San Luis Valley of Colorado. The mountains ringing the valley form the headwaters of the Rio Grande, which flows south into New Mexico and along the border between Texas and Mexico. (Photo courtesy of the National Park Service.)

"The most important variable for predicting streamflow is how much it has rained or snowed," said NCAR scientist Flavio Lehner, lead author of the study. "But when we looked back hundreds of years, we found that temperature has also had an important influence — which is not currently factored into water supply forecasts.
We believe that incorporating temperature in future forecasts will increase their accuracy, not only in general but also in the face of climate change."

The study, published in the journal Geophysical Research Letters, was funded by the Bureau of Reclamation, the Army Corps of Engineers, the National Oceanic and Atmospheric Administration (NOAA), and the National Science Foundation, which is NCAR's sponsor. Co-authors of the paper are Eugene Wahl, of NOAA; Andrew Wood, of NCAR; and Douglas Blatchford and Dagmar Llewellyn, both of the Bureau of Reclamation.

Over-predicting water supply

Born in the Rocky Mountains of southern Colorado, the Rio Grande cuts south across New Mexico before hooking east and forming the border between Texas and Mexico. Snow piles up on the peaks surrounding the headwaters throughout the winter, and in spring the snowpack begins to melt and feed the river.

The resulting streamflow is used both by farmers and by cities, including Albuquerque, New Mexico, and El Paso, Texas, and water users depend on the annual water supply forecasts to determine who gets how much of the river. The forecast is also used to determine whether additional water needs to be imported from the San Juan River, on the other side of the Continental Divide, or pumped from groundwater.

Current operational streamflow forecasts depend on estimates of the amount of snow and rain that have fallen in the basin, and they assume that a particular amount of precipitation and snowpack will always yield a particular amount of streamflow. In recent years, those forecasts have tended to over-predict how much water will be available, leading to over-allocation of the river. In an effort to understand this changing dynamic, Lehner and his colleagues investigated how the relationship between precipitation and streamflow, known as the runoff ratio, has evolved over time.

Precipitation vs. streamflow: Tree rings tell a new story

The scientists used tree ring-derived streamflow data from outside the Upper Rio Grande basin to reconstruct estimates of precipitation within the watershed stretching back to 1571. Then they combined this information with a separate streamflow reconstruction within the basin for the same period. Because these two reconstructions were independent, the research team could also estimate the runoff ratio for each year: the higher the ratio, the greater the share of precipitation that was actually converted into streamflow.

"For the first time, we were able to take these two quantities and use them to reconstruct runoff ratios over the past 445 years," Wahl said.

They found that the runoff ratio varies significantly from year to year and even decade to decade. The biggest factor associated with this variation was precipitation. When it snows less over the mountains in the headwaters of the Rio Grande, not only is less water available to become streamflow, but the runoff ratio also decreases. In other words, a smaller percentage of the snowpack becomes streamflow during drier years.

But the scientists also found that another factor affected the runoff ratio: temperature. Over the last few centuries, the runoff ratio was reduced when temperatures were warmer. And the influence of temperature strengthened during drier years: When the snowpack was shallow, warm temperatures reduced the runoff ratio more than when the snowpack was deep, further exacerbating drought conditions. The low runoff ratios seen in dry years were two and a half to three times more likely when temperatures were also warmer.

"The effect of temperature on runoff ratio is relatively small compared to precipitation," Lehner said. "But because its greatest impact is when conditions are dry, a warmer year can make an already bad situation much worse."

A number of factors may explain the influence of temperature on runoff ratio.
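The runoff ratio at the center of the study is a simple quantity: annual streamflow divided by annual precipitation over the basin. The sketch below uses invented water-year totals, not the study's 445-year reconstructions, to show how paired precipitation and streamflow series yield a runoff-ratio series.

```python
# Hypothetical water-year totals for a snow-fed basin, expressed as
# basin-average depths in mm (values invented for illustration).
precipitation = [500.0, 620.0, 410.0, 550.0, 380.0]  # rain + snow
streamflow    = [150.0, 220.0,  95.0, 170.0,  80.0]  # runoff reaching the river

# Runoff ratio: the share of precipitation that becomes streamflow.
runoff_ratio = [q / p for q, p in zip(streamflow, precipitation)]

for year, rr in zip(range(2011, 2016), runoff_ratio):
    print(year, round(rr, 3))
```

Note that the two driest years in this toy series also have the lowest ratios, mirroring the study's finding that a smaller share of precipitation becomes streamflow in dry years.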
When it's warmer, plants take up more water from the soil and more water can evaporate directly into the air. Additionally, warmer temperatures can cause snow to melt earlier in the season, when the days are shorter and the angle of the sun is lower. This makes the snow melt more slowly, allowing the meltwater to linger in the soil and giving plants added opportunity to use it.

The extensive reconstruction of historical runoff ratio in the Upper Rio Grande also revealed that the decline in runoff ratio over the last three decades is unprecedented in the historical record. The 1980s were an unusually wet period for the Upper Rio Grande, while the 2000s and 2010s have been unusually dry. Pair that with an increase in temperatures over the same period, and the decline in runoff ratio between 1986 and 2015 was unlike any other stretch of that length in the last 445 years.

The graph shows changes to the runoff ratio in the Upper Rio Grande over time. (Image courtesy Flavio Lehner, NCAR.)

Upgrading the old approaches

This new understanding of how temperature influences runoff ratio could help improve water supply forecasts, which do not currently consider whether the upcoming months are expected to be hotter or cooler than average. The authors are now assessing the value of incorporating seasonal temperature forecasts into water supply forecasts to account for these temperature influences. The study complements a multi-year NCAR project, funded by the Bureau of Reclamation and the Army Corps of Engineers, that is evaluating prospects for enhancing seasonal streamflow forecasts for reservoir management.

"Forecast users and stakeholders are increasingly raising questions about the reliability of forecasting techniques if climate is changing our hydrology," said Wood, who led the effort. "This study helps us think about ways to upgrade one of our oldest approaches — statistical water supply forecasting — to respond to recent trends in temperature.
Our current challenge is to find ways to make sure the lessons of this work can benefit operational streamflow forecasts."

Because the existing forecasting models were calibrated on conditions in the late 1980s and 1990s, it's not surprising that they over-predicted streamflow in the drier period since 2000, Lehner said.

"These statistical models often assume that the climate is stable," Lehner said. "It's an assumption that sometimes works, but statistical forecasting techniques will struggle with any strong changes in hydroclimatology from decade to decade, such as the one we have just experienced."

Lehner is a Postdoc Applying Climate Expertise (PACE) fellow; the PACE fellowship is part of the Cooperative Programs for the Advancement of Earth System Science (CPAESS), a community program of the University Corporation for Atmospheric Research (UCAR).

About the article

Title: Assessing recent declines in Upper Rio Grande River runoff efficiency from a paleoclimate perspective
Authors: Flavio Lehner, Eugene R. Wahl, Andrew W. Wood, Douglas B. Blatchford, and Dagmar Llewellyn
Journal: Geophysical Research Letters, DOI: 10.1002/2017GL073253

Writer: Laura Snider, Senior Science Writer and Public Information Officer

