Weather Research

NCAR-based climate model gets a significant upgrade

BOULDER, Colo. — The National Center for Atmospheric Research (NCAR) has released an updated version of its flagship climate model that includes a host of new capabilities — from a much more realistic representation of Greenland's evolving ice sheet, to the ability to model in detail how crops interact with the larger Earth system, to the addition of wind-driven waves on the model's ocean surface.

The Community Earth System Model version 2 (CESM2) is an open-source community computer model largely funded by the National Science Foundation, which is NCAR's sponsor, and the U.S. Department of Energy's Office of Science.

Released publicly last week, CESM2 builds on a succession of climate models, each cutting edge for its day, stretching back decades to a time when the software simulated only atmospheric circulation. By comparison, CESM2 includes interactions among the land, ocean, atmosphere, land ice, and sea ice, representing the many important ways the different parts of the Earth system interact.

"The breadth of the science questions we can tackle just significantly expanded; that's very exciting to me," said Jean-François Lamarque, who until recently led the effort to develop CESM2. "Every time we release a new model we're providing a better tool to do the science. It's a more complicated tool, but the world is very complicated."

The new capabilities of CESM2 include:

- An atmospheric model component that incorporates significant improvements to its representations of turbulence and convection, opening the way for analysis of how these small-scale processes can affect the climate.

- An improved ability to simulate modes of tropical variability that can span seasons and affect global weather patterns, including extreme precipitation over the western United States. These more realistic representations will allow researchers to better understand those connections and could lead to improved seasonal predictions.
- A land ice sheet model component for Greenland that can simulate the complex way the ice sheet moves — sluggish in the middle and much more quickly near the coast — and does a better job of simulating calving of the ice into the ocean.

- A global crop model component that can simulate both how cropland affects regional climate, including the impacts of increased irrigation, and how the changing climate will affect crop productivity. The component also allows scientists to explore the impacts of increased use of fertilizers and greater concentrations of atmospheric carbon dioxide, which can spur plant growth.

- A wave model component that simulates how wind creates waves on the ocean, an important mechanism for mixing of the upper ocean, which in turn affects how well the model represents sea surface temperatures.

- An updated river model component that simulates surface flows across hillsides and into tributaries before entering the main river channel. It also simulates the speed of water as it moves through the channel, along with water depth.

- A new set of infrastructure utilities that provide many new capabilities for easier portability, case generation and user customization, testing functionality, and greatly increased robustness and flexibility.

A full list of updates with more technical descriptions can be found at

This image from a global CESM2 historical simulation shows key aspects of the Arctic climate system. The speed at which simulated glacier ice flows over Greenland is represented, with warmer colors indicating faster speeds. The September 2005 sea ice concentration is depicted in grayscale, with white indicating higher ice concentrations. The time series of September mean sea ice extent simulated by CESM2 is in good agreement with the satellite observations provided by the National Snow and Ice Data Center for the late 20th century and early 21st century, with both showing the recent sea ice decline. (©UCAR.
Image courtesy of Alice DuVivier, Gunter Leguy, and Ryan Johnson/NCAR. This image is freely available for media & nonprofit use.)

Community-driven, continuously improved

Work on CESM2 began in earnest about five years ago, but scientists began tinkering with how to improve the model as soon as CESM1 was released in 2010. It's no different with CESM2.

"We've already started to think about what we can improve for CESM3," Lamarque said. "We know, for example, that we want to make the ocean model better to expand the kind of scientific questions it can be used to answer."

Collaboration and input from the broader Earth system science community have always been at the heart of the complex model development facilitated by NCAR. For example, the land model component of the new CESM2 tapped the expertise of more than 50 researchers at 16 different institutions.

CESM, which is freely available, is an important tool for Earth system science researchers across the United States and around the globe who are studying everything from the predictability of seasonal droughts to accelerating sea level rise. The NCAR-based model is one of about a dozen leading climate models worldwide that scientists use to research the changing climate and contribute what they find to the Intergovernmental Panel on Climate Change.

Because the Earth system is so complicated, and computing resources are so limited, the computer models used to simulate how Earth's climate behaves use a mix of equations that actually represent the physics, biology, and chemistry behind the processes that unfold in the Earth system — from evaporation to ozone formation to deforestation to sea ice melt — and "parameterizations," which simplify small-scale processes and estimate their impacts.

"CESM2 is representing much more of the physics than past models, and we are doing a much better job of it," said CESM Chief Scientist Gokhan Danabasoglu, who is now leading the model development effort.
"There are numerous new capabilities in all component models as well as significant infrastructure improvements for flexibility and easier portability.”These improved equations allow the model to do an even better job replicating the real world."The model is our lab — the only laboratory we get when studying the climate," Lamarque said. "So it has to be close enough to the real world to be relevant."

Measuring snowfall in Antarctica

Antarctica is one of the snowiest and windiest places on Earth, making it difficult for researchers to measure the amount of snow that is falling onto, and then becoming part of, the Antarctic ice sheets. A team of researchers has set up the first advanced suite of snowfall measuring tools in this harsh environment. (Image: Scott Landolt.)

June 5, 2018 | Less than a year ago, an iceberg the size of the state of Delaware calved off an Antarctic ice shelf into the Weddell Sea, taking the final step in the slow but inevitable march of the ice sheet toward the ocean.

Observers have long tracked the ice sheets to the ocean's edge, but little is known about the origins of these frozen formations. Scientists are interested in determining how much snowfall is feeding the ice sheets, and whether that amount offsets the frozen water lost to the ocean each year. For researchers studying the impacts of global climate change on the polar ice sheets, this data could help predict the future of Antarctica's ice — and the world's coastlines. According to the National Snow and Ice Data Center, if the Antarctic Ice Sheet ever fully melted without being replenished, global sea level would rise about 200 feet (60 meters).

"We have a good idea of how much ice is being lost," said NCAR scientist Scott Landolt. "What we don't have a good handle on is how much snow is accumulating to offset the ice loss."

To find the answers, Landolt and his team have just set up the first suite of advanced snowfall measuring tools in Antarctica. The goal is to get a picture of the snowfall budget feeding the ice sheets that hold approximately 90 percent of the world's ice. The project is funded by the National Science Foundation, which is NCAR's sponsor.

Going to the snowiest landscape on Earth was the research trip of a lifetime for Landolt, who has been conducting snowfall measurements in the United States for 20 years.
"Antarctica is the holy grail of snowfall measurement," he said.Measuring falling snowWhen snow falls in Antarctica, a variety of factors can compress it into ice that then slowly crawls toward the ocean. To understand how much snow is contributing to the ice sheet, researchers must measure the amount of water actually frozen in the snowflakes, referred to as the liquid water equivalent. The liquid water equivalent is very challenging to measure, and until now there have been few attempts by Antarctic researchers to do so.The challenge arises as snow falls. It is lighter than rain and therefore more likely to fall along a path dictated by blowing wind. To make things more difficult, once a snowflake reaches the ground it could then be picked up by the wind and blown for miles before finally settling down.With collaborators Mark Seefeldt and Andrew Monaghan, both at the University of Colorado Boulder, Landolt had to design a system that can make the distinction between falling snow and blowing snow. The system also had to withstand hurricane-force winds and temperatures well below -40 degrees (Fahrenheit/Celsius). "We have refined our techniques over the years doing local research on snowfall measurement, and we know the cutting edge of technology for doing this," said Landolt. "Now we are going to take on the real challenge of doing this in ridiculously windy and ridiculously cold conditions and try to get meaningful data out of that."A snowfall data collecting station includes a bucket on a mass balance inside a precipitation gauge, surrounded by two concentric wind shields to help guide the snow into the mouth of the gauge. (Image: Scott Landolt.)Handling extreme conditionsIn November 2017, Landolt and his team, which included technical support from UNAVCO, set up a suite of specially tailored snow measurement tools at four precipitation monitoring stations in Antarctica. 
The tools measure snow height, snow mass (to determine the liquid water equivalent), snowflake size, wind speed, and the number of snowflakes that fall in a given amount of time. The researchers also set up a webcam to keep an eye on their weather stations. The entire suite runs on just three watts of power, roughly what it takes to run a small light bulb.

Snow mass is the key measurement that helps researchers estimate the liquid water equivalent going into the ice sheet. These data are collected using a bucket inside a precipitation gauge that sits on a very sensitive mass balance. The precipitation gauge is encircled by two concentric wind shields designed to slow the wind, minimizing gusts that can blow the snow around and over the bucket. The shields help direct the snow to fall into the gauge. Once snow has accumulated in the bucket, the researchers can measure its mass and convert it to the liquid water equivalent.

Preliminary measurements from the sites are already revealing more of Antarctica's story. Landolt's team noticed that after a snowfall event, the measurement devices show signs of snow sublimating — going directly from solid snowflakes to water vapor. Landolt pointed to an example of a snowfall that accumulated 0.2 inches of liquid water equivalent, but then slowly sublimated away over the course of two weeks. This could mean that less Antarctic snow is feeding the ice sheets than previously thought.

"This is one cool thing about the project that we did not set out to measure but we will report on," said Landolt. "There have been estimates of how much snow is sublimating, but to my knowledge, nobody has actually really measured this before."

The team has two more trips to the Antarctic sites scheduled: a maintenance trip later this year, and a tentative sensor removal trip in 2019.
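The conversion from measured snow mass to liquid water equivalent described above is simple arithmetic: divide the caught mass by the density of water and the gauge's collection area. The sketch below illustrates it; the gauge diameter used in the example is an illustrative assumption, not the actual instrument's specification.

```python
import math

WATER_DENSITY = 1000.0  # kg/m^3

def liquid_water_equivalent_mm(snow_mass_kg, orifice_diameter_m):
    """Depth of liquid water (mm) implied by snow mass caught in a gauge.

    depth = mass / (density * collection_area), converted to millimeters.
    """
    area = math.pi * (orifice_diameter_m / 2.0) ** 2
    depth_m = snow_mass_kg / (WATER_DENSITY * area)
    return depth_m * 1000.0

# Example: 0.1 kg of snow caught in a 20-cm-diameter gauge
# corresponds to about 3.18 mm of liquid water.
print(round(liquid_water_equivalent_mm(0.1, 0.2), 2))  # -> 3.18
```

The hard part in Antarctica is not this arithmetic but ensuring the mass in the bucket represents falling snow rather than blowing snow — hence the wind shields.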
If the data from the weather stations is sound, the next step would be to integrate the technology into existing Antarctic weather stations. Once the data is analyzed, the team expects this work to help climate scientists adjust models of climate change and sea level rise.

"If this works, the goal would be to get more stations across Antarctica, which would contribute to more accurate snowfall modeling," said Landolt. "There are worldwide impacts to the research we are doing."

The team of researchers set up four precipitation measuring stations in Antarctica, complete with a suite of tools to measure snow height, snow mass, snowflake size, wind speed, and the number of snowflakes falling in a given time period. (Image: Scott Landolt.)

Collaborators:
Andrew Monaghan, University of Colorado Boulder
Mark Seefeldt, University of Colorado Boulder
Nikko Bayou, UNAVCO
Spenser Niebuhr, UNAVCO
Thomas Nylen, UNAVCO

Funder:
National Science Foundation

Writer/Contact:
Alexandra Branscombe

Hurricanes: A bit stronger, a bit slower, and a lot wetter in a warmer climate

BOULDER, Colo. — Scientists have published a detailed analysis of how 22 recent hurricanes would change if they instead formed near the end of this century. While each storm's transformation would be unique, on balance, the hurricanes would become a little stronger, a little slower moving, and a lot wetter.

In one example, Hurricane Ike — which killed more than 100 people and devastated parts of the U.S. Gulf Coast in 2008 — could have 13 percent stronger winds, move 17 percent slower, and be 34 percent wetter if it formed in a future, warmer climate.

Other storms could become slightly weaker (like Hurricane Ernesto) or move slightly faster (like Hurricane Gustav). None would become drier. The rainfall rate of simulated future storms in the study increased by an average of 24 percent.

The study, led by the National Center for Atmospheric Research (NCAR) and published in the Journal of Climate, compares high-resolution computer simulations of more than 20 historical, named Atlantic storms with a second set of simulations that are identical except for a warmer, wetter climate consistent with the average outcome of scientific projections for the end of this century.

"Our research suggests that future hurricanes could drop significantly more rain," said NCAR scientist Ethan Gutmann, who led the study. "Hurricane Harvey demonstrated last year just how dangerous that can be."

Harvey produced more than four feet of rain in some locations, breaking records and causing devastating flooding across the Houston area.

The research was funded by the National Science Foundation, which is NCAR's sponsor, and by DNV GL (Det Norske Veritas Germanischer Lloyd), a global quality assurance and risk management company.

This infographic shows how 22 named storms would change if they formed at the end of this century instead of toward the beginning. Each individual storm changed in a unique way, with some getting weaker or faster instead of stronger or slower. All became wetter.
(©UCAR. Graphic by Simmi Sinha. This image is freely available for media & nonprofit use.)

Tapping a vast data set to see storms

With more people and businesses relocating near the coasts, the potential influence of climate change on hurricanes has significant implications for public safety and the economy. Last year's hurricane season, which caused an estimated $215 billion in losses according to Munich RE, was the costliest on record.

"This study shows that the number of strong hurricanes, as a percent of total hurricanes each year, may increase," said Ed Bensman, a program director in the National Science Foundation's Division of Atmospheric and Geospace Sciences, which supported the study. "With increased development along coastlines, that has important implications for future storm damage."

It's been challenging for scientists to study how hurricanes may change in the future as the climate continues to warm. Most climate models, which are typically run on a global scale over decades or centuries, are not run at a high enough resolution to "see" hurricanes.

Most weather models, on the other hand, are run at a high enough resolution to accurately represent hurricanes, but they generally are not used to simulate long-term changes in climate because of the computational cost.

For the current study, the researchers took advantage of a massive new data set created at NCAR by running the Weather Research and Forecasting (WRF) model at high resolution (4 kilometers, or about 2.5 miles) over the contiguous United States for two 13-year periods.
The simulations took about a year to run at the NCAR-Wyoming Supercomputing Center in Cheyenne. The first set of model runs simulates the weather as it unfolded between 2000 and 2013, and the second simulates the same weather patterns, but in a climate that is about 5 degrees Celsius (9 degrees Fahrenheit) hotter — the amount of warming expected by the end of the century if greenhouse gas emissions continue unabated.

The scientists created an algorithm to detect and track hurricanes within the vast amount of data. They identified 22 named storms that appear with very similar tracks in both the historical and future simulations, allowing them to be more easily compared.

As a group, the storms in the future simulation had 6 percent stronger average hourly maximum wind speeds than those in the past. They also moved at a 9 percent slower speed and had a 24 percent higher average hourly maximum rainfall rate. Average storm radius did not change.

But each storm was unique.

"Some past studies have also run WRF at a high resolution to study the impact of climate change on hurricanes, but those studies have tended to look at a single storm, like Sandy or Katrina," Gutmann said. "What we find looking at more than 20 storms is that some change one way, while others change in a different way. There is so much variability that you can't just study one storm and then extrapolate to all storms."

Still, there was one consistent feature across storms: They all produced more rain.

While the study sheds light on how a particular storm might look in a warmer climate, it doesn't provide insight into how global warming might affect storm genesis.
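The study does not spell out its detection-and-tracking algorithm in this release, but a common generic approach — sketched below as an assumption, not the authors' actual method — is to flag the deepest sea-level-pressure minimum below a threshold at each time step and then link centers across time steps by proximity.

```python
PRESSURE_THRESHOLD = 990.0  # hPa; candidate storm centers (assumed value)
MAX_STEP_DISTANCE = 2.0     # max center movement per time step, grid units

def find_center(pressure_field):
    """Return (row, col) of the deepest low below the threshold,
    or None if no grid point qualifies."""
    best = None
    for i, row in enumerate(pressure_field):
        for j, p in enumerate(row):
            if p < PRESSURE_THRESHOLD and (best is None or p < best[0]):
                best = (p, i, j)
    return None if best is None else (best[1], best[2])

def track(fields):
    """Link one storm center across consecutive pressure fields,
    discarding links that jump farther than MAX_STEP_DISTANCE."""
    path = []
    for field in fields:
        center = find_center(field)
        if center is None:
            continue
        if path:
            di = center[0] - path[-1][0]
            dj = center[1] - path[-1][1]
            if (di * di + dj * dj) ** 0.5 > MAX_STEP_DISTANCE:
                continue  # too far to be the same storm
        path.append(center)
    return path

# Toy example: a low deepening and drifting one grid cell per step.
t0 = [[1000, 1000, 1000], [1000, 985, 1000], [1000, 1000, 1000]]
t1 = [[1000, 1000, 1000], [1000, 1000, 982], [1000, 1000, 1000]]
print(track([t0, t1]))  # -> [(1, 1), (1, 2)]
```

Matching the same 22 storms in the historical and future runs then reduces to comparing the tracks produced from each simulation.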
That's because the hurricanes analyzed in this study formed outside of the region simulated by WRF and passed into the WRF simulation as fully formed storms.

Other research has suggested that fewer storms may form in the future due to increasing atmospheric stability or greater high-level wind shear, though the storms that do form are apt to be stronger.

"It's possible that in a future climate, large-scale atmospheric changes would make it so that some of these storms might never be able to form," Gutmann said. "But from this study we get an idea of what we can expect from the storms that do form."

The study co-authors include NCAR scientists Roy Rasmussen, Changhai Liu, Kyoko Ikeda, Cindy Bruyere, and James Done, as well as Luca Garrè, Peter Friis-Hansen, and Vidyunmala Veldore, all of DNV GL.

About the article

Title: Changes in Hurricanes from a 13-Yr Convection-Permitting Pseudo-Global Warming Simulation

Authors: Ethan D. Gutmann, Roy M. Rasmussen, Changhai Liu, Kyoko Ikeda, Cindy L. Bruyere, James M. Done, Luca Garrè, Peter Friis-Hansen, and Vidyunmala Veldore

Journal: Journal of Climate, DOI: 10.1175/JCLI-D-17-0391.1

Writer: Laura Snider, Senior Science Writer

UCAR, Skymet team up to improve forecasts in India

BOULDER, Colo. — A major new partnership between the University Corporation for Atmospheric Research (UCAR) and Skymet Weather Services will provide people across India with more detailed and accurate forecasts.

The $1 million agreement will enable Skymet to use a customized version of the DICast® system, cutting-edge automated weather prediction technology developed at the National Center for Atmospheric Research (NCAR). DICast uses advanced statistical techniques to blend output from different weather models with observations and statistical datasets, generating dynamically tuned predictions for specific sites that are more accurate than those based on a single model and traditional statistical approaches.

This will result in improved forecasts for residents throughout India, including tens of millions of farmers, business executives, and emergency officials. The information will help strengthen business competitiveness while providing vulnerable communities with early warning of floods and other disasters.

"This investment by Skymet is a commitment to the science enabled by UCAR and the role it plays in protecting lives and property in the United States and around the world," said UCAR President Antonio Busalacchi. "We look forward to a long-term and fruitful relationship with Skymet, producing broad benefits to the people of India."

Skymet, based outside New Delhi, provides weather forecasts and graphics to Indian media organizations and customized predictions to the nation's agricultural, insurance, energy, and other vital sectors of its fast-growing economy.

Skymet founder and CEO Jatin Singh (left) and UCAR President Antonio Busalacchi shake hands after signing an agreement that will lead to improved weather forecasts across India. (©UCAR. Photo by Carlye Calvin.
This image is freely available for media & nonprofit use.)

"We're honored to work with UCAR on this very important endeavor, which will enable us to bring the most advanced products and weather services to the people of India," said Jatin Singh, Skymet's founder and CEO. "This agreement provides us access to the best technology in the world. The forecasts that result will save lives and increase the productivity of farmers and businesses."

Singh added that he looks forward to future collaborations with UCAR.

William Mahoney, director of the NCAR lab that developed DICast, said the five-year agreement will enable his team of scientists and engineers to focus on improving forecasts of tropical weather patterns, including monsoons. DICast has been applied globally, but has primarily been used for forecasts in midlatitude regions.

"This opens up research opportunities for better understanding tropical weather patterns," Mahoney said. "It will enable us to expand our support of the U.S. and global weather industry and provide additional benefits to society."

James Cowie, an NCAR software engineer who helped develop DICast, added that Skymet's extensive observations of atmospheric conditions throughout India will provide important information for future forecasts.

"Skymet has a huge weather observing network in India, and we plan to incorporate their data into the DICast system," Cowie said. "This will improve both short- and long-range forecasts of weather conditions and will enable the forecasts to zoom in on scales of less than 1 kilometer."

Founded in 2003, Skymet is India's largest weather monitoring and agriculture risk solutions company.

DICast is a registered trademark of the University Corporation for Atmospheric Research.
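The core idea behind blending model output with recent verification data, as described above, can be sketched generically: weight each model's forecast by its recent skill at a site. The inverse-error weighting below is a simple stand-in for illustration only — it is not the DICast algorithm, whose statistical machinery is far more sophisticated.

```python
def blend_forecasts(forecasts, recent_errors):
    """Blend site forecasts from several models, weighting each model by
    the inverse of its recent mean absolute error at that site
    (smaller recent error -> larger weight).

    forecasts     : {model_name: forecast_value}
    recent_errors : {model_name: mean absolute error over recent verifications}
    """
    weights = {m: 1.0 / (recent_errors[m] + 1e-9) for m in forecasts}
    total = sum(weights.values())
    return sum(forecasts[m] * weights[m] for m in forecasts) / total

# Example: model A has verified twice as accurately as model B lately,
# so A's forecast gets twice the weight in the blend.
fcst = blend_forecasts({"A": 20.0, "B": 26.0}, {"A": 1.0, "B": 2.0})
print(round(fcst, 1))  # -> 22.0
```

Because the weights are recomputed as new verification data arrive, the blend adapts over time — the "dynamically tuned" behavior the article describes.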

Record-breaking ocean heat fueled Hurricane Harvey

BOULDER, Colo. — In the weeks before Hurricane Harvey tore across the Gulf of Mexico and plowed into the Texas coast in August 2017, the Gulf's waters were warmer than at any time on record, according to a new analysis led by the National Center for Atmospheric Research (NCAR).

These hotter-than-normal conditions supercharged the storm, fueling it with vast stores of moisture, the authors found. When it stalled near the Houston area, the resulting rains broke precipitation records and caused devastating flooding.

"We show, for the first time, that the volume of rain over land corresponds to the amount of water evaporated from the unusually warm ocean," said lead author Kevin Trenberth, an NCAR senior scientist. "As climate change continues to heat the oceans, we can expect more supercharged storms like Harvey."

Despite a busy 2017 hurricane season, Hurricane Harvey was more or less isolated in location and time, traveling solo over relatively undisturbed waters in the Gulf of Mexico. This gave Trenberth and his colleagues an opportunity to study in detail how the storm fed off the heat stored in that 930-mile-wide ocean basin.

The team compared temperatures in the upper 160 meters (525 feet) of the Gulf before and after the storm using data collected by Argo, a network of autonomous floats that measure temperature as they move up and down in the water. To measure rainfall over land, the scientists took advantage of a new NASA-based international satellite mission, dubbed Global Precipitation Measurement.

The study appears in the journal Earth's Future, a publication of the American Geophysical Union. It was funded by the U.S. Department of Energy and by the National Science Foundation, which is NCAR's sponsor.
Other co-authors of the paper are Yongxin Zhang and John Fasullo, also of NCAR; Lijing Cheng, of the Chinese Academy of Sciences; and Peter Jacobs, of George Mason University.

An image of Hurricane Harvey taken by the GOES-16 satellite as the storm collided with the Texas coast. (Image courtesy NASA.)

Matching evaporation and rain

As hurricanes move over the ocean, their strong winds strafe the sea surface, making it easier for water to evaporate. Evaporation also requires energy from heat, and the warmer the temperatures are in the upper ocean and at the ocean surface, the more energy is available.

As the storm progresses over the ocean, evaporating water as it goes, it leaves a cold wake in its path. In the case of Hurricane Harvey, the scientists found the cold wake was not very cold. So much heat was available in the upper layer of the ocean that, as the surface was cooled by the storm, heat from below welled up, rewarming the surface waters and continuing to feed the storm.

The near-surface ocean temperature before the storm's passage was upward of 30 degrees Celsius (86 degrees Fahrenheit), and after passage it was still around 28.5 C (83 F). Sea surface temperatures above 26 C (79 F) are typically needed for a hurricane to continue to grow.

Even after Harvey made landfall, its arms reached out over the ocean, continuing to draw strength (and water) from the still-warm Gulf.

"The implication is that the warmer oceans increased the risk of greater hurricane intensity and duration," Trenberth said. "While we often think of hurricanes as atmospheric phenomena, it's clear that the oceans play a critical role and will shape future storms as the climate changes."

The scientists were able to measure the total loss in ocean heat, mostly due to evaporation, as the storm moved over the Gulf. They also measured the latent heat released over land as the water vapor turned back into liquid water and fell as rain.
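Both sides of that bookkeeping — heat taken from the ocean by evaporation and heat released when the vapor rains out — come down to multiplying a water mass by the latent heat of vaporization. The sketch below illustrates the rainfall side of the ledger; the depth and area values are illustrative, not the study's figures.

```python
L_V = 2.5e6             # latent heat of vaporization (J/kg)
WATER_DENSITY = 1000.0  # kg/m^3

def latent_heat_joules(rain_depth_m, area_m2):
    """Latent heat released when this depth of rain falls over this area.

    energy = L_v * (density * depth * area)
    """
    rain_mass_kg = WATER_DENSITY * rain_depth_m * area_m2
    return L_V * rain_mass_kg

# Example: 0.5 m of rain (roughly Harvey's heaviest totals) over a
# 100 km x 100 km region releases on the order of 1e19 joules.
energy = latent_heat_joules(0.5, 1e5 * 1e5)
print(f"{energy:.2e} J")
```

Evaporative heat loss from the ocean is computed the same way from the evaporated mass, which is what lets the two measurements be compared directly.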
They then compared those two measurements and found that they corresponded.

The study highlights the increased threat of future supercharged hurricanes due to climate change, Trenberth said.

"We know this threat exists, and yet in many cases, society is not adequately planning for these storms," Trenberth said. "I believe there is a need to increase resilience with better building codes, flood protection, and water management, and we need to prepare for contingencies, including planning evacuation routes and how to deal with power cuts."

About the study

Title: Hurricane Harvey Links to Ocean Heat Content and Climate Change Adaptation

Authors: Kevin E. Trenberth, Lijing Cheng, Peter Jacobs, Yongxin Zhang, and John Fasullo

Journal: Earth's Future. DOI: 10.1029/2018EF000825

Writer: Laura Snider, Senior Science Writer

UCAR Congressional Briefing: Subseasonal to seasonal forecasts

WASHINGTON — Federal investments in atmospheric and oceanic research are ushering in major advances in longer-term weather prediction, enabling private companies to provide their clients with valuable forecasts of weather patterns weeks to months in advance, experts said today at a congressional briefing.

A panel of scientists representing universities and the private sector agreed that continued government investment in advanced computer modeling, observing tools, and supercomputers is critical for progress in forecasting on longer time scales. The nonprofit University Corporation for Atmospheric Research (UCAR) sponsored the briefing.

Subseasonal to seasonal forecasts are predictions of regional weather patterns from two weeks to two years in advance, such as the likelihood of unusually dry or stormy conditions. Improving such forecasts is a national priority, emphasized in the Weather Research and Forecasting Innovation Act that Congress passed last year.

The panelists said the key to long-term forecasts is increased understanding of the role of the oceans and of ocean-atmosphere patterns such as El Niño.

"Ocean conditions change more slowly than the atmosphere, and that longer memory allows us to predict weather patterns on longer time scales," said Ben Kirtman, a professor of atmospheric sciences at the University of Miami Rosenstiel School of Marine and Atmospheric Science. "How temperatures evolve below the ocean surface, and how the atmosphere and the ocean exchange heat and moisture and momentum — these processes are particularly important when you want to make subseasonal to seasonal forecasts."

Gokhan Danabasoglu, chief scientist of the Community Earth System Model at the National Center for Atmospheric Research, said advanced computer models that incorporate observations of ocean conditions are increasingly being used for longer-term prediction.
For example, forecasters are able to produce increasingly accurate month-ahead outlooks of temperatures over North America, and researchers have even been able to generate a 10-year forecast of Arctic sea ice conditions, which is important to shipping companies.

"Better models and more detailed information about ocean conditions lead to better predictions," Danabasoglu said. "We are seeing promising results in longer-term predictions that can be highly beneficial to society."

Such forecasts are providing critical intelligence to the $100 billion livestock industry, said Chad McNutt, principal and co-founder of Livestock Wx, which provides livestock producers with advanced weather and climate information. He explained that cattle producers need advance information about temperature and precipitation patterns that affect winter wheat and other crops that cattle graze on. His clients also want to know the timing of insect infestations that affect cattle health.

"Agriculture sectors like the cattle industry need sustained support for research into improved subseasonal to seasonal forecasting," McNutt said. "The forecasts have real economic implications for producers."

Alicia Karspeck, climate scientist and associate director of research partnerships with Jupiter Technology Systems, Inc., said private companies need high-quality, accessible, and continuous data from federal agencies to create long-term prediction products. Jupiter relies on federally funded observations and computer modeling to provide its clients with customized climate and weather risk analytics on timescales of weeks to decades.

"Federal funding for climate research, observations, and computing creates real value for the private sector, helping us deliver high-quality forecast products to our customers," Karspeck said.
"Our company understands that the pipeline from scientific discovery to useful and marketable products relies on a vibrant, well-resourced research sector."Antonio Busalacchi, president of UCAR, emphasized the close partnerships among government agencies, universities, and private companies as they work to improve long-range forecasts."These collaborations are enabling us to better understand the entire Earth system in ways that will allow society to prepare for weather patterns weeks to months in advance," Busalacchi said. "Accurate subseasonal to seasonal forecasts will help to safeguard lives and property as well as benefit every economic sector."The briefing was the latest in a series of UCAR Congressional Briefings that draw on expertise from UCAR's university consortium and public-private partnerships to provide insights into critical topics in the Earth system sciences. Past briefings have focused on moving advances in Earth science research to industry, predicting wildfires, forecasting space weather, tools that improve aviation weather safety, the state of the Arctic, hurricane prediction, potential impacts of El Niño, and new advances in water resources forecasting.

Past tornado experience shapes perception of risk

The following is a news release from the Society for Risk Analysis about a study led by NCAR scientist Julie Demuth.

With much of the central plains and Midwest now entering peak tornado season, the impact of these potentially devastating weather events will be shaped in large part by how individuals think about and prepare for them. A new study published in Risk Analysis: An International Journal shows that people's past experiences with tornadoes inform how they approach this type of extreme weather in the future, including their perception of the risk.

Led by Julie Demuth, a scientist from the National Center for Atmospheric Research, the study, "Explicating experience: Development of a valid scale of past hazard experience for tornadoes," characterized and measured people's past tornado experiences to determine their impact on the perceived risks of future tornadoes. A better understanding of these factors can help mitigate future societal harm, for instance, by improving risk communication campaigns that encourage preparation for hazardous weather events.

The results indicate that people's risk perceptions are highly influenced by a memorable past tornado experience that contributes to unwelcome thoughts, feelings, and disruption, which ultimately increase one's fear, dread, worry, and depression. Also, the more experiences people have with tornadoes, and the more personalized those experiences, the more likely they are to believe their homes (versus the larger geographic area of their city or town) will be damaged by a tornado within the next 10 years.

A tornado in Oklahoma. People's past experiences with tornadoes shape how they perceive risk when new storms threaten. (Image courtesy NOAA.)

In the context of this study, Demuth defines 'past tornado experience' as "the perceptions one acquires about the conditions associated with or impacts of a prior tornado event.
Such perceptions are gained by the occurrence of a tornado threat and/or event; directly by oneself or indirectly through others; and at different points throughout the duration of the threat and event."

The study was conducted through two surveys distributed to a random sample of residents in tornado-prone areas of the U.S. during the spring and fall of 2014. The first survey evaluated an initial set of items measuring experiences, and the second was used to re-evaluate the experience items and to measure tornado risk perceptions. The sample sizes for the two surveys were 144 and 184, respectively.

Since tornado experiences can occur at any time throughout one's life, and in multiplicity, the survey items measured both one's most memorable tornado experience and one's multiple experiences. A factor analysis of the survey items yielded four factors that make up the memorable experience dimensions:

Risk awareness: information pertaining to the possibility of a specific tornado hazard occurring, as well as threat-related social cues from both people and the media.

Risk personalization: one's protective behavioral and emotional responses, as well as visual, auditory, and tactile sensations experienced during the tornado.

Personal intrusive impacts: ways that one is personally affected by an experience, including intangible, unpleasant thoughts and feelings from the experience.

Vicarious troubling impacts: others' tangible impacts and verbal accounts of their experiences and intangible intrusive impacts. The "others" are people known personally by the responding individual. Although all the items in this factor reflect others' accounts of a tornado experience, the respondent experiences these aspects by hearing about or witnessing them.

The factor analysis revealed two factors contributing to the multiple experience dimensions: common threat and impact communication, and negative emotional responses.
The first factor captures one's personal experience with receiving common types of information (e.g., sirens) about tornado threats and tornado-related news. The second factor captures the amount of experience a respondent has with fearing for their own life or a loved one's life, and worrying about their property, due to a tornado.

Individuals' past tornado experiences are multifaceted and nuanced, with each of the above six dimensions exerting a different influence on tornado risk perceptions. These dimensions have not been previously analyzed, particularly the intangible aspects: feelings, thoughts, and emotions.

"This research can help meteorologists who provide many essential, skillful risk messages in the form of forecasts, watches, and warnings when tornadoes (and other hazardous weather) threaten. This research can help meteorologists recognize the many ways that people's past tornado experiences shape what they think and do, in addition to the weather forecasts they receive," states Demuth.

The Society for Risk Analysis is a multidisciplinary, interdisciplinary, scholarly, international society that provides an open forum for all those interested in risk analysis. SRA was established in 1980 and has published Risk Analysis: An International Journal, the leading scholarly journal in the field, continuously since 1981. For more information, visit
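The dimension-finding step described above is a standard exploratory factor analysis: correlated survey items are assumed to be driven by a smaller number of latent traits, which are recovered from the eigenstructure of the item correlation matrix. The sketch below illustrates the idea on synthetic Likert-style data; the item count, factor labels, and responses are illustrative assumptions, not the actual survey items or data from the paper.

```python
# Minimal sketch of exploratory factor analysis on synthetic survey data.
# Everything here is hypothetical except the sample size (the study's
# second survey had n = 184).
import numpy as np

rng = np.random.default_rng(0)
n_respondents, n_items = 184, 8

# Fabricate two latent experience dimensions (think "risk awareness" and
# "personal intrusive impacts") that drive the observed item responses.
latent = rng.normal(size=(n_respondents, 2))
loadings_true = rng.normal(size=(2, n_items))
responses = latent @ loadings_true + rng.normal(scale=0.5,
                                                size=(n_respondents, n_items))

# Principal-factor extraction: eigendecompose the item correlation matrix
# and keep components with eigenvalues > 1 (the common Kaiser criterion).
corr = np.corrcoef(responses, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]
n_factors = int(np.sum(eigvals > 1.0))

# Estimated loadings: items loading heavily on the same factor are read
# as measuring the same underlying experience dimension.
loadings_est = eigvecs[:, :n_factors] * np.sqrt(eigvals[:n_factors])
print(n_factors, loadings_est.shape)
```

In practice researchers would also rotate the loadings (e.g., varimax) and judge interpretability, which is how named dimensions such as "vicarious troubling impacts" emerge.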

A record winter during the American Revolution almost put independence on ice

December 18, 2017 | The seasonal forecasts are in for this winter, and they generally indicate relatively average conditions across much of the country's midsection, with wetter-than-normal weather likely in the North and dryness in the South.

Continuing to improve these longer-term forecasts can help communities and businesses prepare for particular weather patterns — and possibly even save lives. In fact, a good seasonal forecast could even have made a difference during a critical moment in the American Revolution.

This National Park Service painting portrays conditions at the Continental Army's New Jersey encampment in the winter of 1779-80, with a hospital hut in the foreground. (Image from Morristown National Historic Park.)

No East Coast season on record was colder than the winter of 1779-80. All of the saltwater inlets, harbors, and sounds of the Atlantic coastal plain froze over from Canada to North Carolina, remaining closed to navigation for a month or more for the only time in recorded history.

The winter happened to occur during the height of the Revolutionary War. George Washington and his soldiers were greeted by a foot of snow that already lay on the ground in November 1779 when they began arriving at their winter quarters outside Morristown, New Jersey.

The ensuing winter months almost cost the young nation its independence. The Continental Army was hammered by repeated snowstorms, including a blizzard in early January that dumped four feet of snow. Many of the soldiers lacked coats, shirts, shoes, and even food. As the winter wore on, the soldiers became more embittered and mutinous than during the storied but milder winter two years earlier at Valley Forge.
If not for help from surrounding communities during the winter of 1779-80, they might have deserted or even starved to death, potentially changing the course of history.

Could such a winter be predicted today? Longer-term forecasting in the two-week to three-month range is one of the most difficult challenges in meteorology. These subseasonal to seasonal forecasts, while providing general guidance, still lack much precision.

This winter, for example, the National Oceanic and Atmospheric Administration (NOAA) predicts that the chances are roughly equal for conditions that are wetter or drier than normal across large swaths of the mid-Atlantic. Even a forecast of a wetter-than-average winter could play out in many ways, from a series of light rains to a couple of blockbuster snowstorms.

NOAA's forecast for wintertime precipitation, released in October, projects drier-than-normal conditions in the South and wetter-than-normal conditions in parts of the North. Click here for an analysis of the forecast by NOAA's Climate Prediction Center, as well as a forecast map that includes Alaska and Hawaii.

To add more detail to such forecasts, scientists are working to better understand the links between U.S. weather patterns and large-scale atmospheric and oceanic conditions, such as El Niño and the North Atlantic Oscillation (more popularly known as the "polar vortex" when it ushers in cold weather). Recognizing the importance of such research, Congress in April passed the Weather Research and Forecasting Innovation Act, a major weather bill that calls for more work on subseasonal to seasonal prediction.

If the modern understanding of the atmosphere and oceans had existed during the American Revolution, perhaps Washington and his soldiers could have taken more precautions.
The next time the nation is threatened by an unusually severe winter, better forecasts may make it possible to prepare.

"Scientists are gaining new insights into the entire Earth system in ways that will lead to predictions of weather patterns weeks, months, or even more than a year in advance," said UCAR President Antonio Busalacchi. "History shows this type of intelligence can be critical to national security, as well as to businesses and vulnerable communities."

Writer/contact: David Hosansky, Manager of Media Relations

Groundbreaking data set gives unprecedented look at future weather

Dec. 7, 2017 | How will weather change in the future? It's been remarkably difficult to say, but researchers are now making important headway, thanks in part to a groundbreaking new data set at the National Center for Atmospheric Research (NCAR).

Scientists know that a warmer and wetter atmosphere will lead to major changes in our weather. But pinning down exactly how weather — such as thunderstorms, midwinter cold snaps, hurricanes, and mountain snowstorms — will evolve in the coming decades has proven a difficult challenge, constrained by the sophistication of models and the capacity of computers.

Now, a rich new data set is giving scientists an unprecedented look at the future of weather. Nicknamed CONUS 1 by its creators, the data set is enormous. To generate it, the researchers ran the NCAR-based Weather Research and Forecasting model (WRF) at an extremely high resolution (4 kilometers, or about 2.5 miles) across the entire contiguous United States (sometimes known as "CONUS") for a total of 26 simulated years: half in the current climate and half in the future climate expected if society continues on its current trajectory.

The project took more than a year to run on the Yellowstone supercomputer at the NCAR-Wyoming Supercomputing Center. The result is a trove of data that allows scientists to explore in detail how today's weather would look in a warmer, wetter atmosphere. CONUS 1, which was completed last year and made easily accessible through NCAR's Research Data Archive this fall, has already spawned nearly a dozen papers that explore everything from changes in rainfall intensity to changes in mountain snowpack.

"This was a monumental effort that required a team of researchers with a broad range of expertise — from climate experts and meteorologists to social scientists and data specialists — and years of work," said NCAR senior scientist Roy Rasmussen, who led the project.
"This is the kind of work that's difficult to do in a typical university setting but that NCAR can take on and make available to other researchers."

Other principal project collaborators at NCAR are Changhai Liu and Kyoko Ikeda. A number of additional NCAR scientists lent expertise to the project, including Mike Barlage, Andrew Newman, Andreas Prein, Fei Chen, Martyn Clark, Jimy Dudhia, Trude Eidhammer, David Gochis, Ethan Gutmann, Gregory Thompson, and David Yates. Collaborators from the broader community include Liang Chen, Sopan Kurkute, and Yanping Li (University of Saskatchewan), and Aiguo Dai (SUNY Albany).

Climate and weather research coming together

Climate models and weather models have historically operated on different scales, both in time and space. Climate scientists are interested in large-scale changes that unfold over decades, and the models they've developed help them nail down long-term trends such as increasing surface temperatures, rising sea levels, and shrinking sea ice.

Climate models are typically low resolution, with grid points often separated by 100 kilometers (about 60 miles). The advantage of such coarse resolution is that these models can be run globally for decades or centuries into the future with the available supercomputing resources. The downside is that they lack the detail to capture features that influence local atmospheric events, such as land surface topography, which drives mountain weather, or the small-scale circulation of warm air rising and cold air sinking that sparks a summertime thunderstorm.

Weather models, on the other hand, have higher resolution, take into account atmospheric microphysics, such as cloud formation, and can simulate weather fairly realistically.
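A quick back-of-envelope calculation shows why the 100-kilometer versus 4-kilometer gap matters so much for computing cost. The domain size and cost scaling below are illustrative assumptions for a roughly CONUS-sized region, not figures from the article.

```python
# Compare horizontal grid-cell counts for a ~5,000 km x 3,000 km domain
# (roughly the contiguous U.S.) at climate-model and weather-model spacing.
# Domain dimensions are an assumption for illustration.

def horizontal_cells(domain_km_x: int, domain_km_y: int, dx_km: int) -> int:
    """Number of horizontal grid cells at uniform spacing dx_km."""
    return (domain_km_x // dx_km) * (domain_km_y // dx_km)

coarse = horizontal_cells(5000, 3000, 100)  # typical climate-model spacing
fine = horizontal_cells(5000, 3000, 4)      # WRF spacing used for CONUS 1

print(coarse)          # 1500 cells
print(fine)            # 937500 cells
print(fine // coarse)  # 625x more cells

# Total cost grows even faster than the cell count: numerical stability
# (the CFL condition) forces the time step to shrink with the grid
# spacing, so refining 100 km -> 4 km multiplies the work per simulated
# day by roughly (100 / 4) ** 3, on the order of 15,000x.
```

That cubic scaling is why a 4-kilometer run over one country for 26 years took over a year of supercomputer time, while global climate models at coarse resolution can simulate centuries.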
It's not practical to run them for long periods of time or globally, however — supercomputers are not yet up to the task.

As scientific understanding of climate change has deepened, the need to merge these disparate scales has become more pressing, in order to gain a better understanding of how global atmospheric warming will affect local weather patterns.

"The climate community and the weather community are really starting to come together," Rasmussen said. "At NCAR, we have both climate scientists and weather scientists, we have world-class models, and we have access to state-of-the-art supercomputing resources. This allowed us to create a data set that offers scientists a chance to start answering important questions about the influence of climate change on weather."

Weather models are typically run at a much higher resolution than climate models, allowing them to more accurately capture precipitation. This figure compares the average annual precipitation from a 13-year run of the Weather Research and Forecasting (WRF) model (left) with the average annual precipitation from running the global Community Earth System Model (CESM) to simulate the climate between 1976 and 2005 (right). (©UCAR. This figure is courtesy of Kyoko Ikeda. It is freely available for media & nonprofit use.)

Today's weather in a warmer, wetter future

To create the data set, the research team used WRF to simulate weather across the contiguous United States between 2000 and 2013. They initiated the model using a separate "reanalysis" data set constructed from observations. When compared with radar images of actual weather during that time period, the results were excellent.

"We weren't sure how good a job the model would do, but the climatology of real storms and the simulated storms was very similar," Rasmussen said.

With confidence that WRF could accurately simulate today's weather, the scientists ran the model for a second 13-year period, using the same reanalysis data but with a few changes.
Notably, the researchers increased the temperature of the background climate conditions by about 5 degrees Celsius (9 degrees Fahrenheit), the end-of-the-century temperature increase predicted by averaging 19 leading climate models under a business-as-usual greenhouse gas emissions scenario (2080-2100). They also increased the water vapor in the atmosphere by the corresponding amount, since physics dictates that a warmer atmosphere can hold more moisture.

The result is a data set that examines how weather events from the recent past — including named hurricanes and other distinctive weather events — would look in our expected future climate.

The data have already proven to be a rich resource for people interested in how individual types of weather will respond to climate change. Will the squall lines of intense thunderstorms that rake across the country's midsection get more intense, more frequent, and larger? (Yes, yes, and yes.) Will snowpack in the West get deeper, shallower, or disappear? (It depends on the location: The high-elevation Rockies are much less vulnerable to the warming climate than the coastal ranges.) Other scientists have already used the CONUS 1 data set to examine changes to rain-on-snow events, the speed of snowmelt, and more.

Running the Weather Research and Forecasting (WRF) model at 4-kilometer resolution over the contiguous United States produced realistic simulations of precipitation. Above, average annual precipitation from WRF for the years 2000-2013 (left) compared to the PRISM dataset for the same period (right). PRISM is based on observations. (©UCAR. This figure is courtesy of Kyoko Ikeda. It is freely available for media & nonprofit use.)

Pinning down changes in storm track

While the new data set offers a unique opportunity to delve into changes in weather, it also has limitations.
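The "corresponding amount" of extra water vapor follows from the Clausius-Clapeyron relation: saturation vapor pressure rises roughly 6-7 percent per degree Celsius of warming, so holding relative humidity fixed while warming the background state increases moisture by that factor. A minimal sketch of the calculation, using the standard August-Roche-Magnus approximation (the baseline temperature is an illustrative assumption; the exact perturbation method in CONUS 1 is more involved than this):

```python
import math

def saturation_vapor_pressure_hpa(t_celsius: float) -> float:
    """August-Roche-Magnus approximation for saturation vapor pressure
    over water, in hPa."""
    return 6.112 * math.exp(17.67 * t_celsius / (t_celsius + 243.5))

t0 = 15.0  # assumed baseline surface temperature, deg C
dt = 5.0   # the ~5 deg C end-of-century warming applied in CONUS 1

e0 = saturation_vapor_pressure_hpa(t0)
e1 = saturation_vapor_pressure_hpa(t0 + dt)

# Holding relative humidity fixed, actual vapor content scales with
# saturation vapor pressure: roughly 6-7% more moisture per deg C.
increase = (e1 / e0 - 1) * 100
print(f"{increase:.0f}% more water vapor for {dt:.0f} C of warming")
```

Compounding 6-7 percent per degree over 5 degrees yields on the order of a third more atmospheric moisture, which is the physical basis for the heavier rainfall the simulations produce.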
Importantly, it does not reflect how the warming climate might shift large-scale weather patterns, like the typical track most storms take across the United States. Because the same reanalysis data set is used to kick off both the current and future climate simulations, the large-scale weather patterns remain the same in both scenarios.

To remedy this, the scientists are already working on a new simulation, nicknamed CONUS 2. Instead of using the reanalysis data set — which was built from actual observations — to kick off the modeling run of the present-day climate, the scientists will use weather extracted from a simulation by the NCAR-based Community Earth System Model (CESM). For the future climate run, the scientists will again take the weather patterns from a CESM simulation — this time for the year 2100 — and feed the information into WRF.

The finished data set, which will cover two 20-year periods, will likely take another year of supercomputing time to complete, this time on the newer and more powerful Cheyenne system at the NCAR-Wyoming Supercomputing Center. When complete, CONUS 2 will help scientists understand how expected future storm track changes will affect local weather across the country.

Scientists are already eagerly awaiting the data from the new runs, which could start in early 2018. But even that data set will have limitations. One of the greatest may be that it will rely on a single run of CESM. Another NCAR-based project ran the model 40 times from 1920 to 2100 with only minor changes to the model's starting conditions, showing researchers how the natural chaos of the atmosphere can cause the climate to look quite different from simulation to simulation.

Still, a single run of CESM will let scientists make comparisons between CONUS 1 and CONUS 2, allowing them to pinpoint the effects of possible storm track changes on local weather.
And CONUS 2 can also be compared to other efforts that downscale global simulations to study how regional areas will be affected by climate change, providing insight into the pros and cons of different research approaches.

"This is a new way to look at climate change that allows you to examine the phenomenology of weather and answer the question, 'What will today's weather look like in a future climate?'" Rasmussen said. "This is the kind of detailed, realistic information that water managers, city planners, farmers, and others can understand, and it helps them plan for the future."

Get the data

High Resolution WRF Simulations of the Current and Future Climate of North America, DOI: 10.5065/D6V40SXP

Studies that relied on the CONUS data set

Extreme downpours could increase fivefold across parts of the U.S.
Slower snowmelt in a warming world
North American storm clusters could produce 80 percent more rain

Writer/contact: Laura Snider, Senior Science Writer

North American storm clusters could produce 80 percent more rain

BOULDER, Colo. — Major clusters of summertime thunderstorms in North America will grow larger, more intense, and more frequent later this century in a changing climate, unleashing far more rain and posing a greater threat of flooding across wide areas, new research concludes.

The study, by scientists at the National Center for Atmospheric Research (NCAR), builds on previous work showing that storms are becoming more intense as the atmosphere is warming. In addition to higher rainfall rates, the new research finds that the volume of rainfall from damaging storms known as mesoscale convective systems (MCSs) will increase by as much as 80 percent across the continent by the end of this century, deluging entire metropolitan areas or sizable portions of states.

"The combination of more intense rainfall and the spreading of heavy rainfall over larger areas means that we will face a higher flood risk than previously predicted," said NCAR scientist Andreas Prein, the study's lead author. "If a whole catchment area gets hammered by high rain rates, that creates a much more serious situation than a thunderstorm dropping intense rain over parts of the catchment. This implies that the flood guidelines which are used in planning and building infrastructure are probably too conservative," he added.

The research team drew on extensive computer modeling that realistically simulates MCSs and thunderstorms across North America to examine what will happen if emissions of greenhouse gases continue unabated. The study will be published Nov. 20 in the journal Nature Climate Change. It was funded by the National Science Foundation, which is NCAR's sponsor, and by the U.S. Army Corps of Engineers.

Hourly rain rate averages for the 40 most extreme summertime mesoscale convective systems (MCSs) in the current (left) and future climate of the mid-Atlantic region.
New research shows that MCSs will generate substantially higher maximum rain rates over larger areas by the end of the century if society continues a "business as usual" approach to emitting greenhouse gases. (©UCAR. Image by Andreas Prein, NCAR. This image is freely available for media & nonprofit use.)

A warning signal

Thunderstorms and other heavy rainfall events are estimated to cause more than $20 billion of economic losses annually in the United States, the study notes. Particularly damaging, and often deadly, are MCSs: clusters of thunderstorms that can extend for many dozens of miles and last for hours, producing flash floods, debris flows, landslides, high winds, and/or hail. The persistent storms over Houston in the wake of Hurricane Harvey were an example of an unusually powerful and long-lived MCS.

Storms have become more intense in recent decades, and a number of scientific studies have shown that this trend is likely to continue as temperatures continue to warm. The reason, in large part, is that the atmosphere can hold more water as it gets warmer, thereby generating heavier rain.

A study by Prein and co-authors last year used high-resolution computer simulations of current and future weather, finding that the number of summertime storms that produce extreme downpours could increase by five times across parts of the United States by the end of the century. In the new study, Prein and his co-authors focused on MCSs, which are responsible for much of the major summertime flooding east of the Continental Divide. They investigated not only how the storms' rainfall intensity will change in future climates, but also how their size, movement, and rainfall volume may evolve.

Analyzing the same dataset of computer simulations and applying a special storm-tracking algorithm, they found that the number of severe MCSs in North America more than tripled by the end of the century.
Moreover, maximum rainfall rates became 15 to 40 percent heavier, and intense rainfall reached farther from the storm's center. As a result, severe MCSs increased throughout North America, particularly in the northeastern and mid-Atlantic states, as well as parts of Canada, where they are currently uncommon.

The research team also looked at the potential effect of particularly powerful MCSs on the densely populated Eastern Seaboard. They found, for example, that at the end of the century, intense MCSs over an area the size of New York City could drop 60 percent more rain than a severe present-day system. That amount is equivalent to adding six times the annual discharge of the Hudson River on top of a current extreme MCS in that area.

"This is a warning signal that says the floods of the future are likely to be much greater than what our current infrastructure is designed for," Prein said. "If you have a slow-moving storm system that aligns over a densely populated area, the result can be devastating, as could be seen in the impact of Hurricane Harvey on Houston."

This satellite image loop shows an MCS developing over West Virginia on June 23, 2016. The resulting storms caused widespread flooding, killing more than 20 people. MCSs are responsible for much of the major flooding east of the Continental Divide during warm weather months. (Image by NOAA National Weather Service, Aviation Weather Center.)

Intensive modeling

Advances in computer modeling and more powerful supercomputing facilities are enabling climate scientists to begin examining the potential influence of a changing climate on convective storms such as thunderstorms, building on previous studies that looked more generally at regional precipitation trends.

For the new study, Prein and his co-authors turned to a dataset created by running the NCAR-based Weather Research and Forecasting (WRF) model over North America at a resolution of 4 kilometers (about 2.5 miles).
That resolution is sufficiently fine-scale to simulate MCSs. The intensive modeling, by NCAR scientists and study co-authors Roy Rasmussen, Changhai Liu, and Kyoko Ikeda, required a year to run on the Yellowstone system at the NCAR-Wyoming Supercomputing Center.

The team used an algorithm developed at NCAR to identify and track simulated MCSs. They compared simulations of the storms at the beginning of the century, from 2000 to 2013, with observations of actual MCSs during the same period and showed that the modeled storms are statistically identical to real MCSs. The scientists then used the dataset and algorithm to examine how MCSs may change by the end of the century in a climate that is approximately 5 degrees Celsius (9 degrees Fahrenheit) warmer than in the pre-industrial era — the temperature increase expected if greenhouse gas emissions continue unabated.

About the paper

Title: Increased rainfall volume from future convective storms in the US
Authors: Andreas F. Prein, Changhai Liu, Kyoko Ikeda, Stanley B. Trier, Roy M. Rasmussen, Greg J. Holland, Martyn P. Clark
Journal: Nature Climate Change

