Computer Modeling

US taps NCAR technology for new water resources forecasts

BOULDER, Colo. — As the National Oceanic and Atmospheric Administration (NOAA) this month launches a comprehensive system for forecasting water resources in the United States, it is turning to technology developed by the National Center for Atmospheric Research (NCAR) and its university and agency collaborators.

WRF-Hydro, a powerful NCAR-based computer model, is the first nationwide operational system to provide continuous predictions of water levels and potential flooding in rivers and streams from coast to coast. NOAA's new Office of Water Prediction selected it last year as the core of the agency's new National Water Model.

"WRF-Hydro gives us a continuous picture of all of the waterways in the contiguous United States," said NCAR scientist David Gochis, who helped lead its development. "By generating detailed forecast guidance hours to weeks ahead, it will help officials make more informed decisions about reservoir levels and river navigation, as well as alert them to dangerous events like flash floods."

WRF-Hydro (WRF stands for Weather Research and Forecasting) is part of a major Office of Water Prediction initiative to bolster U.S. capabilities in predicting and managing water resources. By teaming with NCAR and the research community, NOAA's National Water Center is developing a new national water intelligence capability, enabling better impacts-based forecasts for management and decision making.

The new WRF-Hydro computer model simulates streams and other aspects of the hydrologic system in far more detail than previously possible. (Image by NOAA Office of Water Prediction.)

Unlike past streamflow models, which provided forecasts every few hours and only for specific points along major river systems, WRF-Hydro provides continuous forecasts for millions of points along rivers, streams, and their tributaries across the contiguous United States. To accomplish this, it simulates the entire hydrologic system — including snowpack, soil moisture, local ponded water, and evapotranspiration — and rapidly generates output on some of the nation's most powerful supercomputers.

WRF-Hydro was developed in collaboration with NOAA and university and agency scientists through the Consortium of Universities for the Advancement of Hydrologic Science, the U.S. Geological Survey, Israel Hydrologic Service, and Baron Advanced Meteorological Services. Funding came from NOAA, NASA, and the National Science Foundation, which is NCAR's sponsor.

"WRF-Hydro is a perfect example of the transition from research to operations," said Antonio (Tony) J. Busalacchi, president of the University Corporation for Atmospheric Research, which manages NCAR on behalf of the National Science Foundation (NSF). "It builds on the NSF investment in basic research in partnership with other agencies, helps to accelerate collaboration with the larger research community, and culminates in support of a mission agency such as NOAA. The use of WRF-Hydro in an operational setting will also allow for feedback from operations to research. In the end this is a win-win situation for all parties involved, chief among them the U.S. taxpayers."
taxpayers.""Through our partnership with NCAR and the academic and federal water community, we are bringing the state of the science in water forecasting and prediction to bear operationally," said Thomas Graziano, director of NOAA’s new Office of Water Prediction at the National Weather Service.Filling in the water pictureThe continental United States has a vast network of rivers and streams, from major navigable waterways such as the Mississippi and Columbia to the remote mountain brooks flowing from the high Adirondacks into the Hudson River. The levels and flow rates of these watercourses have far-reaching implications for water availability, water quality, and public safety.Until now, however, it has not been possible to predict conditions at all points in the nation's waterways. Instead, computer models have produced a limited picture by incorporating observations from about 4,000 gauges, generally on the country's bigger rivers. Smaller streams and channels are largely left out of these forecast models, and stretches of major rivers for tens of miles are often not predicted — meaning that schools, bridges, and even entire towns can be vulnerable to unexpected changes in river levels.To fill in the picture, NCAR scientists have worked for the past several years with their colleagues within NOAA, other federal agencies, and universities to combine a range of atmospheric, hydrologic, and soil data into a single forecasting system.The resulting National Water Model, based on WRF-Hydro, simulates current and future conditions on rivers and streams along points two miles apart across the contiguous United States. Along with an hourly analysis of current hydrologic conditions, the National Water Model generates three predictions: an hourly 0- to 15-hour short-range forecast, a daily 0- to 10-day medium-range forecast, and a daily 0- to 30-day long-range water resource forecast.The National Water Model predictions using WRF-Hydro offer a wide array of benefits for society. They will help local, state, and federal officials better manage reservoirs, improve navigation along major rivers, plan for droughts, anticipate water quality problems caused by lower flows, and monitor ecosystems for issues such as whether conditions are favorable for fish spawning. By providing a national view, this will also help the Federal Emergency Management Agency deploy resources more effectively in cases of simultaneous emergencies, such as a hurricane in the Gulf Coast and flooding in California."We've never had such a comprehensive system before," Gochis said. "In some ways, the value of this is a blank page yet to be written."A broad spectrum of observationsWRF-Hydro is a powerful forecasting system that incorporates advanced meteorological and streamflow observations, including data from nearly 8,000 U.S. Geological Survey streamflow gauges across the country. 
A broad spectrum of observations

WRF-Hydro is a powerful forecasting system that incorporates advanced meteorological and streamflow observations, including data from nearly 8,000 U.S. Geological Survey streamflow gauges across the country. Using advanced mathematical techniques, the model then simulates current and future conditions for millions of points on every significant river, stream, tributary, and catchment in the United States.

In time, scientists will add additional observations to the model, including snowpack conditions, lake and reservoir levels, subsurface flows, soil moisture, and land-atmosphere interactions such as evapotranspiration, the process by which water in soil, plants, and other land surfaces evaporates into the atmosphere.

Scientists over the last year have demonstrated the accuracy of WRF-Hydro by comparing its simulations to observations of streamflow, snowpack, and other variables. They will continue to assess and expand the system as the National Water Model begins operational forecasts.

NCAR scientists maintain and update the open-source code of WRF-Hydro, which is available to the academic community and others. WRF-Hydro is widely used by researchers, both to better understand water resources and floods in the United States and in other countries such as Norway, Germany, Romania, Turkey, and Israel, and to project the possible impacts of climate change.

"At any point in time, forecasts from the new National Water Model have the potential to impact 300 million people," Gochis said. "What NOAA and its collaborator community are doing is trying to usher in a new era of bringing in better physics and better data into forecast models for improving situational awareness and hydrologic decision making."

Collaborators
Baron Advanced Meteorological Services
Consortium of Universities for the Advancement of Hydrologic Science
Israel Hydrologic Service
National Center for Atmospheric Research
National Oceanic and Atmospheric Administration
U.S. Geological Survey

Funders
National Science Foundation
National Aeronautics and Space Administration
National Oceanic and Atmospheric Administration

NCAR weather ensemble offers glimpse at forecasting's future

July 1, 2016 | Last spring, scientists at the National Center for Atmospheric Research (NCAR) flipped the switch on a first-of-its-kind weather forecasting system. For more than a year, NCAR's high-resolution, real-time ensemble forecasting system has been ingesting 50,000 to 70,000 observations every six hours and creating a whopping 90,000 weather maps each day.

The system has become a favorite among professional forecasters and casual weather wonks: typically more than 200 people check out the site each day, with more than a thousand coming during major weather events. During this experimental period, the NCAR ensemble has also become a popular source of guidance within the National Weather Service, where it has already been referenced several hundred times by forecasters at more than 50 different offices. But perhaps more important, the data accumulated from running the system daily — and there is lots of it — is being used by researchers at universities across the country to study a range of topics, from predicting hail size to anticipating power outages for utilities.

"We wanted to demonstrate that a real-time system of this scale was feasible," said NCAR scientist Craig Schwartz. "But it's also a research project that can help the community learn more about the predictability of different kinds of weather events."

Schwartz is a member of the team that designed and operates the system, along with NCAR colleagues Glen Romine, Ryan Sobash, and Kate Fossell.

This animation shows the forecast for accumulated snowfall made by each of the NCAR ensemble's 10 members for the 48-hour period beginning on Jan. 22, 2016. In the run-up to the blizzard, which ultimately dropped more than 30 inches of snow on parts of the Mid-Atlantic, more than 1,000 people visited the NCAR ensemble's website. (©UCAR. This animation is freely available for media & nonprofit use.)

Testing a unique tool

NCAR's high-resolution ensemble forecasting system is unique in the country for a couple of reasons, both of which are revealed in its name: it's an ensemble, and it's high resolution.

Instead of producing a single forecast, the system produces an "ensemble" of 10 forecasts, each with slightly different (but equally likely) starting conditions. The degree to which the forecasts look the same or different tells scientists something about the probability that a weather event, like rain, hail, or wind, will actually occur. By comparing the actual outcomes to the forecasted probabilities, scientists can study the predictability of particular weather events under different circumstances.

The forecasting system's high resolution (the grid points are just 3 kilometers apart) allows it to simulate small-scale weather phenomena, like the creation of individual storms from convection — the process of moist, warm air rising and then condensing into clouds.

The combination of fine grid spacing and ensemble predictions in the NCAR system offers a sneak peek at what the future of weather forecasting might look like, and weather researchers across the country have noticed.
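The probabilistic idea behind the ensemble can be illustrated in a few lines of Python: run several equally plausible forecasts and count how many of them produce a given event. The numbers below are invented for illustration and are not actual NCAR ensemble output.

```python
import numpy as np

# Hypothetical 24-hour precipitation totals (mm) at one grid point
# from a 10-member ensemble; values are invented for illustration.
members = np.array([12.0, 30.5, 8.2, 25.1, 40.3, 18.7, 22.0, 5.4, 33.9, 27.6])

threshold_mm = 25.0  # event of interest: more than 25 mm in 24 hours

# Forecast probability = fraction of members exceeding the threshold.
prob = np.mean(members > threshold_mm)
print(f"P(precip > {threshold_mm} mm) = {prob:.0%}")

# Ensemble spread (standard deviation) is one rough measure of confidence:
# small spread means the members agree; large spread means low predictability.
print(f"ensemble mean = {members.mean():.1f} mm, spread = {members.std():.1f} mm")
```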
Cliff Mass, a professor of atmospheric sciences at the University of Washington whose specialty is forecasting, said: "It's extremely important for the United States to have a convection-allowing ensemble system to push our forecasting capabilities forward. We were delighted that NCAR demonstrated that this could be done."

'The cat's meow'

The treasure trove of accruing weather data generated by running the NCAR ensemble is already being used by researchers both at NCAR and in the broader community. Jim Steenburgh, for instance, is a researcher at the University of Utah who is using the system to understand the predictability of mountain snowstorms.

"NCAR's ensemble not only permits the 'formation' of clouds, it can also capture the topography of the western United States," he said. "The mountains control the weather to some degree, so you need to be able to resolve the mountains' effects on precipitation."

Steenburgh has also been using the ensemble with his students. "We're teaching the next generation of weather forecasters," he said. "In the future, these high-resolution ensemble forecasts will be the tools they need to use, and this gives them early, hands-on experience."

Like Steenburgh, Lance Bosart, an atmospheric researcher at the University at Albany, State University of New York, has used the ensemble both in his own research — studying the variability of convective events — and with his students. He said having 10 members in the ensemble forecast helps students easily see the great spread of possibilities, and the visual emphasis of the user interface makes it easy for students to absorb the information.

"What makes it an invaluable tool is the graphical display," he said. "It's visually compelling. You don't have to take a lot of time to explain what you're looking at; you can get right into explaining the science. I like to say it's the cat's meow."

Setting an example

The NCAR ensemble is also enabling the researchers running it to further their own research. "We're collecting statistics on the misfit between the model predictions and observations, and then we're trying to use that to improve our model physics," Romine said.

The ensemble project is also teaching the team about the strengths and weaknesses of the way they've chosen to kick off, or "initialize," each of the ensemble members. "The NCAR ensemble happens to produce a pretty good forecast, but we realize there are some shortcomings," Schwartz said. "For example, if we were trying to make the best forecast in the world, we would probably not be initializing the model the way we are. But then we wouldn't learn as much from a research perspective."

The NCAR ensemble began as a yearlong trial, but the project is continuing to run for now. The team would like to keep the system online until next summer, but they don't yet have the computing resources they need to run it past September.

If the system does continue to run, the researchers who are using it say there's still more that they and their students can learn from the project. And if not, there are loads of data already collected that are still waiting to be mined. In any case, Mass says NCAR's ensemble has been a valuable project. "It set a really good example for the nation," he said.

Community members interested in collaborating or helping support the NCAR ensemble project are encouraged to contact the team at ensemble@ucar.edu.

Writer/contact
Laura Snider, Senior Science Writer and Public Information Officer

Climate modeling 101: Explanations without equations

A new book breaks down climate models into easy-to-understand concepts. (Photo courtesy Springer.)

June 21, 2016 | Climate scientists tell us it's going to get hotter. How much it rains and where it rains is likely to shift. Sea level rise is apt to accelerate. Oceans are on their way to becoming more acidic and less oxygenated. Floods, droughts, storms, and other extreme weather events are projected to change in frequency or intensity.

But how do they know what they know? For climate scientists, numerical models are the tools of the trade. But for the layperson — and even for scientists in other fields — climate models can seem mysterious. What does "numerical" even mean? Do climate models take other things besides the atmosphere into account? How do scientists know if a model is any good?*

Two experts in climate modeling, Andrew Gettelman of the National Center for Atmospheric Research and Richard Rood of the University of Michigan, have your answers and more, free of charge. In a new open-access book, "Demystifying Climate Models," the pair lay out the fundamentals. In 282 pages, the scientists explain the basics of climate science, how that science is translated into a climate model, and what those models can tell us (as well as what they can't) — all without using a single equation.

*Find the answers on pages 8, 13, and 161, respectively, of the book.

AtmosNews sat down with Gettelman to learn more about the book, which anyone can download at http://www.demystifyingclimate.org.

NCAR scientist Andrew Gettelman has written a new book on climate modeling with Richard Rood of the University of Michigan. (Courtesy photo. This image is freely available for media & nonprofit use.)

What was the motivation to write this book?

There isn't really another book that sets out the philosophy and structure of models. There are textbooks, but inside you'll find a lot of physics and chemistry: information about momentum equations, turbulent fluxes — which is useful if you want to build your own model. And then there are books on climate change for the layperson, and they devote maybe a paragraph to climate modeling. There's not much in the middle. This book provides an introduction for the beginning grad student, or someone in another field who is interested in using model output, or anyone who is just curious how climate works and how we simulate it.

What are some of the biggest misperceptions about climate models that you hear?

One is that people say climate models are based on uncertain science. But that's not true at all. If we didn't know the science, my cellphone wouldn't work. Radios wouldn't work. GPS wouldn't work. That's because the energy that warms the Earth, which radiates from the Sun and is absorbed and re-emitted by Earth's surface — and also by greenhouse gases in the atmosphere — is part of the same spectrum of radiation that makes up radio waves. If we didn't understand electromagnetic waves, we couldn't have created the technology we rely on today. The same is true for the science that underlies other aspects of climate models. (Learn more on page 38 of the book.)

But we don't understand everything, right?

We have understood the basic physics for hundreds of years. The last piece of it, the discovery that carbon dioxide warms the atmosphere, was put in place in the late 19th and early 20th centuries. Everything else — the laws of motion, the laws of thermodynamics — was all worked out between the 17th and 19th centuries. (Learn more on page 39 of the book.)
We do still have uncertainty in our modeling systems. A big part of this book is about how scientists understand that uncertainty and actually embrace it as part of their work. If you know what you don't know and why, you can use that to better understand the whole climate system.

Can we ever eliminate the uncertainty?

Not entirely. In our book, we break down uncertainty into three categories: model uncertainty (How good are the models at reflecting how the Earth really works?), initial condition uncertainty (How well do we understand what the Earth system looks like right now?), and scenario uncertainty (What will future emissions look like?).

To better understand, it might help to think about the uncertainty that would be involved if you had a computer model that could simulate making a pizza. Instead of trying to figure out what Earth's climate would look like in 50 or 100 years, this model would predict what your pizza would look like when it was done.

The first thing you want to know is how well the model reflects the reality of how a pizza is made. For example, does the model take into account all the ingredients you need to make the pizza, and how they will each evolve? The cheese melts, the dough rises, and the pepperoni shrinks. How well can the model approximate each of those processes? This is model uncertainty.

The second thing you'd want to know is whether you can input all the pizza's "initial conditions" into the model. Some initial conditions — like how many pepperoni slices are on the pizza and where — are easy to observe, but others are not. For example, kneading the pizza dough creates small pockets of air, but you don't know exactly where they are. When the dough is heated, the air expands and forms big bubbles in the crust. If you can't tell the model where the air pockets are, it can't accurately predict where the crust bubbles will form when the pizza is baked. The same is true for a climate model. Some parts of the Earth, like the deep oceans and the polar regions, are not easy to observe with enough detail, leaving scientists to estimate what the conditions there are like and leading to the second type of uncertainty in the model results.

Finally, the pizza-baking model also has to deal with "scenario uncertainty," because it doesn't know how long the person baking the pizza will keep it in the oven, or at what temperature. Without understanding the choices the human will make, the model can't say for sure if the dough will be soft, crispy, or burnt. With climate models, over long periods of time, like a century, we've found that this scenario uncertainty is actually the dominant one. In other words, we don't know how much carbon dioxide humans around the world are going to emit in the years and decades to come, and it turns out that that's what matters most. (Learn more about uncertainty on page 10 of the book.)

Any other misperceptions you frequently hear?

People always say, "If we can't predict the weather next week, how can we know what the climate will be like in 50 years?" Generally speaking, we can't perfectly predict the weather because we don't have a full understanding of all the current conditions. We don't have observations for every grid point on a weather model or for large parts of the ocean, for example. But climate is not concerned with the exact weather on a particular day 50 or 100 years from now. Climate is the statistical distribution of weather, not a particular point on that distribution.
Climate prediction is focused on the statistics of this distribution, and those statistics are governed by conservation of energy and mass on long time scales, something we do understand. (Learn more on page 6 of the book. Read more common misperceptions at http://www.demystifyingclimate.org/misperceptions.)

Did you learn anything about climate modeling while working on the book?

My background is the atmosphere. I sat down and wrote the whole section on the atmosphere in practically one sitting. But I had to learn about the other aspects of models, the ocean and the land, which work really differently. The atmosphere has only one boundary, a bottom boundary. We just have to worry about how it interacts with mountains and other bumps on the surface. But the ocean has three hard boundaries: the bottom and the sides, like a giant rough bathtub. It also has a boundary with the atmosphere on the top. Those boundaries really change how the ocean moves. And the land is completely different because it doesn't move at all. Writing this book really gave me a new appreciation for some of the subtleties of other parts of the Earth system and the ways my colleagues model them. (Learn more on page 13 of the book.)

What was the most fun part of writing the book for you?

I think having to force myself to think in terms of analogies that are understandable to a variety of people. I can describe a model using a whole bunch of words most people don't use every day, like "flux." It was a fun challenge to come up with words that would accurately describe the models and the science but that were accessible to everyone.
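The weather-versus-climate distinction Gettelman describes can be illustrated with a toy calculation: any single simulated "day" is noisy and hard to pin down, but the statistics of many days are stable. The numbers below are synthetic and purely illustrative, not output from any climate model.

```python
import random

random.seed(42)

def july_day_temp():
    # A pretend daily July temperature: a fixed "climate" mean of 30 C
    # plus unpredictable day-to-day "weather" noise.
    return 30.0 + random.gauss(0, 5)

# Any single day is uncertain...
print("one simulated day:", round(july_day_temp(), 1), "C")

# ...but the statistics of many days converge toward the underlying climate.
days = [july_day_temp() for _ in range(10_000)]
print("mean of 10,000 days:", round(sum(days) / len(days), 2), "C")
```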

UCAR to support EarthCube: Cyberinfrastructure will advance science

BOULDER – EarthCube, a landmark initiative to develop new technological and computational capabilities for geosciences research, will be supported by the University Corporation for Atmospheric Research (UCAR) under a new agreement with the National Science Foundation (NSF).

Created by NSF in 2011, EarthCube aims to help researchers across the geosciences, from meteorology to seismology, better understand our planet in ways that can strengthen societal resilience to natural events. More than 2,500 EarthCube contributors – including scientists, educators, and information professionals – work together on the creation of a common cyberinfrastructure for researchers to collect, access, analyze, share, and visualize all forms of data and related resources.

"EarthCube offers the promise to advance geoscience research by creating and delivering critical new capabilities," said UCAR scientist Mohan Ramamurthy, principal investigator and project director of the new EarthCube office at UCAR.

"This is a great opportunity for UCAR to leverage its successful track record in managing large scientific projects that advance our understanding of the planet," said Michael Thompson, interim UCAR president. "The EarthCube project offers the potential to significantly benefit society by helping scientists use the power of diverse big datasets to better understand and predict the natural events, from severe storms to solar disturbances, that affect all of us."

EarthCube is designed to foster collaborations across the geosciences. The technology helps scientists in different disciplines better understand the far-reaching influences of natural events, such as how major storms like Sandy (above) affect coastal and inland flooding. This unique view of Sandy was generated with NCAR's VAPOR visualization software, based on detailed computer modeling. (©UCAR. Visualization by Alan Norton, NCAR, based on research by NCAR scientists Mel Shapiro and Thomas Galarneau. This image is freely available for media & nonprofit use.)

UCAR will administer the day-to-day operations of EarthCube under the three-year, $2.8 million agreement with NSF. The EarthCube science support office, currently funded through an NSF grant to the Arizona Geological Survey in Tucson, Arizona, will move to UCAR's Boulder offices starting this month.

EarthCube is designed to help researchers across the geosciences address the challenges of understanding and predicting the complexity of the Earth system, from the geology and topography to the water cycle, atmosphere, and space environment of the planet. This approach is critical for improved understanding of the environment and for better safeguarding society. In order to better predict the potential effects of a landfalling hurricane on inland mudslides, for example, scientists from multiple disciplines, including meteorology, hydrology, geography, and geology, need a common platform to work together to collect observations, ingest them into advanced computer models of the Earth system, and analyze and interpret the resulting data.

"The EarthCube Science Support Office will help us find and share the data geoscientists collect and use to answer critical science questions about the Earth," said Eva Zanzerkia, program director in NSF's Division of Earth Sciences.

Ramamurthy said UCAR is well positioned to help EarthCube meet its goals, since UCAR provides technological support to the geosciences community, including its 109 member universities.
UCAR has been involved with EarthCube since NSF launched the initiative.

"Currently researchers are spending an enormous amount of time on routine tasks because there is no data system, database, or data infrastructure where they can get all the information they need in some kind of a uniform way from a single interface," Ramamurthy said. "If EarthCube can facilitate the integration of data from multiple domains in a way that is easier and faster, and if there is interoperability in terms of standards for data to be input into a common environment, then integration becomes more easily possible."

UCAR is a nonprofit consortium of more than 100 member colleges and universities focused on research and training in the atmospheric and related Earth system sciences. UCAR's primary activity is managing the National Center for Atmospheric Research (NCAR) on behalf of NSF, NCAR's sponsor. UCAR also oversees a variety of education and scientific support activities under the umbrella of the UCAR Community Programs, which will administer EarthCube.

A 3D window into a tornado

This simulation was created by NCAR scientist George Bryan to visualize what goes on inside a tornado. The animation is the "high swirl" version in a series that goes from low to medium to high swirl. (Courtesy George Bryan, NCAR. This image is freely available for media & nonprofit use.)

May 17, 2016 | What's really going on inside a tornado? How fast are the strongest winds, and what are the chances that any given location will experience them when a tornado passes by?

Due to the difficulties of measuring wind speeds in tornadoes, scientists don't have answers to these questions. However, a collaborative project between researchers at the University of Miami and NCAR has been seeking clues with new, highly detailed computer simulations of tornado wind fields.

The simulations can be viewed in a series of animations, created by NCAR scientist George Bryan, that provide a 3D window into the evolving wind fields of idealized tornadoes at different rates of rotation. In the high-swirl animation shown here, which depicts a powerful tornado with 200-plus mph winds, the purple tubelike structures depict the movements of rapidly rotating vortices. Near-surface winds are represented by colors ranging from light blue (less than 20 meters per second, or 45 mph) to deep red (more than 100 meters per second, or 224 mph). The vortices and winds are contained within a condensation cloud that rises more than 500 meters (1,640 feet) above the surface.

Such visualizations can help atmospheric scientists better understand the structures of tornadoes, as well as the shifting location and strength of maximum wind speeds. Bryan also uses them in presentations to meteorology students.

"When you make these 3D visualizations and then animate them, they give you a sense of how the flow evolves and how the turbulence changes," Bryan said. "These are details you don't see by just looking at a photograph."

For example, he learned from the visualization that the rotating tubes tilt backward against the flow at higher altitudes. These are the kinds of details that can eventually help scientists better understand these complex storms. The information is also critical for public safety officials and engineers.

"If you're an engineer and designing a building, you want to know details like how much greater is peak wind over average wind in a tornado," Bryan said. "We'll get questions from engineers asking about the details of wind gusts in those purple tubes."

Bryan is collaborating on the simulations with Dave Nolan, chair of Miami's Department of Atmospheric Sciences.

To create the animation, Bryan used innovative NCAR software that enables researchers in the atmospheric and related sciences to analyze and interpret results from large computer models. VAPOR (Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers) is an interactive 3D visualization environment for both animations and still-frame images. The open-source software can be downloaded and used on personal computers.

VAPOR was developed at NCAR in partnership with the University of California at Davis and Ohio State University. Funding comes from the National Science Foundation and the Korea Institute of Science and Technology Information.

Writer/contact
David Hosansky

Funder
National Science Foundation

Collaborator
University of Miami

Wrangling observations into models

April 4, 2016 | If scientists could directly measure the properties of all the water throughout the world's oceans, they wouldn't need help from NCAR scientist Alicia Karspeck. But since large expanses of the oceans are beyond the reach of observing instruments, Karspeck's work is critical for those who want estimates of temperature, salinity, and other properties of water around the globe. Scientists need these estimates to better understand the world's climate system and how it is changing.

"It's painstaking work, but my hope is it will lead to major advances in climate modeling and long-term prediction," Karspeck said.

She is one of a dozen or so researchers at NCAR who spend their days on data assimilation, a field that is becoming increasingly important for the geosciences and other areas of research. Broadly speaking, data assimilation is any method of enabling computer models to utilize relevant observations. Part science and part art, it involves figuring out how to get available measurements — which may be sparse, tightly clustered, or irregularly scattered — into models that tend to simplify the world by breaking it into gridded boxes. Commonly used in weather forecasting, the technique can improve simulations and help scientists predict future events with more confidence. It can also identify deficiencies in both models and observations.

As models have become more powerful and observations more numerous, the technique has become so critical that NCAR last year launched a Data Assimilation Program to better leverage expertise across its seven labs.

"Activities in data assimilation have grown well beyond traditional applications in numerical weather prediction for the atmosphere and now span across NCAR's laboratories," said NCAR Director Jim Hurrell. "The Data Assimilation Program is designed to enhance data assimilation research at NCAR, while at the same time serving the broader U.S. research community."

Scientists are using data assimilation techniques to input a range of North American observations into experimental, high-resolution U.S. forecasts. These real-time ensemble forecasts are publicly available while they're being tested. (©UCAR. This image is freely available for media & nonprofit use.)

Improving prediction

Created by the NCAR Directorate, the Data Assimilation Program is designed to advance prediction of events ranging from severe weather and floods to air pollution outbreaks and peaks in the solar cycle. One of its goals is to encourage collaborations among data assimilation experts at NCAR and the larger research community. For example, scientists in several labs are joining forces to apply data assimilation methods to satellite measurements to create a database of global winds and other atmospheric properties. This database will then be used for a broad range of climate and weather studies.

The program also provides funding to hire postdocs at NCAR to focus on data assimilation projects, as well as for a software engineer to support such activities.
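At its core, an assimilation step blends the model's prior ("background") estimate with an observation, weighting each by its expected error. The sketch below shows that weighted update for a single variable with invented numbers; it is a simplified illustration of the general idea, not NCAR's actual code, and real systems apply it to millions of variables at once.

```python
# Background (model) estimate of temperature at one grid point, with its
# error variance, plus an observation with its own error variance.
# All numbers are invented for illustration.
x_background = 15.0    # degrees C, from the previous model forecast
var_background = 4.0   # (deg C)^2

y_observation = 17.0   # degrees C, e.g., from a surface station
var_observation = 1.0  # (deg C)^2

# Weight on the observation: more trust goes to whichever source
# has the smaller error variance.
gain = var_background / (var_background + var_observation)

x_analysis = x_background + gain * (y_observation - x_background)
var_analysis = (1.0 - gain) * var_background

print(f"analysis = {x_analysis:.2f} C (weight on observation = {gain:.2f})")
print(f"analysis error variance = {var_analysis:.2f}")
```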
"By bringing money to the table, we're building up data assimilation capability across NCAR," said NCAR Senior Scientist Chris Snyder, who coordinates the Data Assimilation Program. "This is critical because data assimilation provides a framework to scientists throughout the atmospheric and related sciences who need to assess where the uncertainties are and how a given observation can help."

NCAR Senior Scientist Jeff Anderson, who oversees the Data Assimilation Research Testbed (DART), says that data assimilation has become central for the geosciences. DART is a software environment that helps researchers develop data assimilation methods and apply them to observations and various computer models.

"I think the Data Assimilation Program is a huge win for NCAR and the entire atmospheric sciences community," Anderson said. "The scientific method is about taking observations of the world and making sense of them, and data assimilation is fundamental for applying the scientific method to the geosciences as well as to other research areas."

From oceans to Sun

Here are examples of how data assimilation is advancing our understanding of atmospheric and related processes from ocean depths to the Sun's interior:

Oceans. Karspeck is using data assimilation to estimate water properties and currents throughout the world's oceans. This is a computationally demanding task that requires feeding observations into the NCAR-based Community Earth System Model, simulating several days of ocean conditions on the Yellowstone supercomputer, and using those results to update the conditions in the model and run another simulation. The good news: the resulting simulations match well with historical records, indicating that the data assimilation approach is working. "My goal is to turn this into a viable system for researchers," Karspeck said.

Air quality. Atmospheric chemists at NCAR are using data assimilation of satellite observations to improve air quality models that currently draw on limited surface observations of pollutants. For example, assimilating satellite observations would show the effect of emissions from a wildfire in Montana on downwind air quality, such as in Chicago. "We've done a lot of work to speed up the processing time and the results are promising," said NCAR scientist Helen Worden. "The model simulations after assimilating satellite carbon monoxide data are much closer to actual air quality conditions."

Weather forecasting. Data assimilation is helping scientists diagnose problems with weather models. For example, why do models consistently overpredict or underpredict temperatures near the surface? Using data assimilation, NCAR scientist Josh Hacker discovered that models incorrectly simulate the transfer of heat from the ground into the atmosphere. "With data assimilation, you're repeatedly confronting the model with observations, so you can very quickly see how things go wrong," he said.

Solar cycle. Scientists believe the 11-year solar cycle is driven by mysterious processes deep below the Sun's surface, such as the movements of cells of plasma between the Sun's lower latitudes and poles. To understand the causes of the cycle and ultimately predict it, they are turning to data assimilation to augment observations of magnetic fields and plasma flow at the Sun's surface and feed the resulting information into a computer model of subsurface processes. "We are matching surface conditions to the model, such as the pattern and speed of the plasma flows and evolving magnetic fields," said NCAR scientist Mausumi Dikpati.

Capturing data. In addition to helping scientists improve models, the new Data Assimilation Program is also fostering discussions about observations. NCAR Senior Scientist Wen-Chau Lee and colleagues who are experts in gathering observations are conferring with computer modelers over how to process the data so that models can readily ingest it. One challenge, for example, is that radars may take observations every 150 meters, whereas the models often have a resolution of 1 to 3 kilometers. Inputting the radar observations into the models requires advanced quality control techniques, including coordinate transformation (converting observation coordinates to model coordinates) and data thinning (reducing the density of observations while retaining the basic information).

"We are modifying our quality control procedures to make sure that the flow of data is smooth," Lee said. "With data assimilation, the first word is 'data,'" he added. "Without data, without observations, there is no assimilation."
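One common way to handle the density mismatch Lee describes is to average all observations that fall within a model grid cell, an approach sometimes called "superobbing." The one-dimensional sketch below uses invented numbers and is an illustration of the idea only, not NCAR's actual quality control procedure.

```python
import numpy as np

# Invented example: radar observations every 150 m along a ray, thinned
# onto a model grid with 3-km cells by averaging within each cell.
obs_spacing_m = 150.0
grid_spacing_m = 3000.0

distance_m = np.arange(0.0, 30_000.0, obs_spacing_m)       # 200 range gates
reflectivity = 20.0 + 10.0 * np.sin(distance_m / 5000.0)   # fake observations

# Assign each observation to a model grid cell, then average per cell.
cell_index = (distance_m // grid_spacing_m).astype(int)
n_cells = cell_index.max() + 1
thinned = np.array([reflectivity[cell_index == i].mean() for i in range(n_cells)])

print(f"{distance_m.size} raw observations -> {thinned.size} thinned values")
```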
Writer/contact
David Hosansky, Manager of Media Relations

Funders
NCAR Directorate
National Science Foundation
Additional funding agencies for specific projects

The quest to predict severe weather sooner

January 26, 2016 | Weather forecasts have become increasingly reliable thanks to improvements over the past several decades in computer modeling and observational equipment. However, when it comes to severe weather, that reliability typically begins to deteriorate beyond a two-day forecast.

To provide an accurate severe weather outlook three or more days in advance, forecasters need to capture the fine-scale behavior of clouds, vertical wind shear, and other local processes, as well as the global atmospheric conditions surrounding the local region of interest. Regional models examine fine-scale conditions at high resolution, but they have a difficult time with accuracy at the boundaries between the area of interest and the surrounding region. Errors in these so-called boundary regions can distort the results for the target area. Simulating the entire globe at high resolution would help, but that takes an exorbitant amount of computing time.

MPAS's variable mesh enables smooth transitions from higher resolution (over North America in this example) to coarser resolution over the rest of the globe. (©UCAR. This image is freely available for media & nonprofit use.)

A global software platform called the Model for Prediction Across Scales, or MPAS, aims to resolve those issues. It offers a new way of simulating the atmosphere while providing scientists with more flexibility when focusing on regional conditions. Its development comes at a time when the U.S. National Weather Service wants to increase the lead time and accuracy of forecasts of severe storms, including hurricanes, tornadoes, and flash floods, so communities can be better prepared.

Unlike traditional three-dimensional models that calculate atmospheric conditions at multiple points within a block-shaped grid, MPAS uses a hexagonal mesh resembling a soccer ball or honeycomb that can be stretched wide or compressed for higher resolution as needed. "The mesh allows for a smooth transition between areas of coarse and fine resolution, with the goal of eliminating boundary distortions," said NCAR Senior Scientist William Skamarock, one of the developers of MPAS.

Look globally as well as locally

Vertical wind shear, or the change of winds with height, is a critical factor in determining thunderstorm severity. MPAS is able to simulate vertical wind shear at higher resolutions over local areas of interest, as well as cloud behavior and other processes vital to severe weather prediction.

Ocean currents and many other global factors also can alter weather quickly. Global forecasts produced by the National Oceanic and Atmospheric Administration (NOAA) go out to 16 days; for tropical cyclones and hurricanes it's five days, but accuracy declines for the extended forecasts. "For some weather events, such as tropical cyclones, what's going on at the other side of the globe can influence the forecast for your region," Skamarock said. So forecasters need to portray the global environment surrounding a region that's under threat.

Jointly developed at NCAR and the Los Alamos National Laboratory in New Mexico, MPAS is being groomed especially to improve regional and global weather forecasts, climate modeling, and atmospheric chemistry research, such as regional air-quality forecasts. Last July, MPAS was selected by NOAA as one of the finalists to become the National Weather Service's next-generation global weather model. The decision is expected later this year. "The fact that MPAS is a finalist is an expression of confidence in the model's capabilities," Skamarock said.
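Some rough, back-of-the-envelope arithmetic helps show why the variable mesh matters: covering the entire globe at storm-resolving resolution requires far more grid columns than refining only the region of interest. The spacings and areas below are illustrative assumptions, not an official MPAS configuration.

```python
# Back-of-the-envelope comparison of grid-column counts; all numbers are
# rough illustrations, not an actual MPAS mesh specification.
EARTH_SURFACE_KM2 = 5.1e8  # roughly 510 million square kilometers

def columns(area_km2, spacing_km):
    """Approximate number of mesh columns covering an area at a given spacing."""
    return area_km2 / spacing_km**2

uniform_3km = columns(EARTH_SURFACE_KM2, 3.0)

# Variable mesh: ~3-km cells over a North-America-sized region (~25 million km2),
# coarsening to ~15-km cells over the rest of the globe.
refined_km2 = 2.5e7
variable = columns(refined_km2, 3.0) + columns(EARTH_SURFACE_KM2 - refined_km2, 15.0)

print(f"uniform 3-km globe     : ~{uniform_3km:.1e} columns")
print(f"variable 3/15-km mesh  : ~{variable:.1e} columns")
print(f"the variable mesh uses roughly {uniform_3km / variable:.0f}x fewer columns")
```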
In tests, MPAS has performed well in predicting springtime thunderstorms and other severe weather over the Great Plains. It also has produced realistic simulations of certain tropical cyclones, including Hurricane Sandy of 2012. However, along with other U.S. models, it missed on 2015's Hurricane Joaquin.

Longer lead times ahead

NOAA has reported that MPAS provided realistic, fine-scale detail for Hurricane Sandy in 2012 and for 2013 springtime weather over the continental U.S., including the tornado that struck Moore, Okla. "MPAS also did reasonably well in providing five-day forecasts during a NOAA hazardous weather experiment last May," Skamarock said.

MPAS's 48-hour forecast for July 8, 2015, accurately predicted heavy rain for northern Texas and much of Oklahoma. Abilene wound up getting 8.25 inches, its wettest day since record keeping started in 1885. (©UCAR. This image is freely available for media & nonprofit use.)

In spring 2015, MPAS also won high marks for the accuracy of its three-day forecasts that helped guide research aircraft missions during a major field campaign to study nighttime thunderstorms on the Great Plains, called PECAN (Plains Elevated Convection at Night). NCAR Project Scientist Stan Trier, who worked as a forecaster on the PECAN campaign, said the MPAS forecasts were usually the first he would look at for planning purposes because MPAS was the only model that had the resolution to indicate possible storm structures beyond 48 hours. Then, as the time to make decisions on overnight field operations approached, he would update these earlier forecasts with new information produced by shorter-range, high-resolution models.

"There were multiple situations where MPAS did quite well at these longer time ranges," Trier said. "Forecasts with two to three days of lead time are less accurate than one-day forecasts. This is expected. But overall, I would definitely say that MPAS was a useful part of the PECAN forecasting process."

Most recently, MPAS has been tested in Antarctica and during the 2015 tropical cyclone season in the Atlantic and Pacific oceans. It also is being used as a component within the NCAR-based Community Earth System Model for long-term climate prediction, and has been tested at the Taiwan Typhoon and Flood Research Institute to predict severe weather events in that country.

Even if MPAS emerges as the National Weather Service's next-generation weather model, there will still be a role for the Weather Research and Forecasting (WRF) platform hosted by NCAR. WRF, an open-source model used widely around the world, is especially adept at local and regional weather prediction in the mid-latitudes. And while MPAS's variable-mesh design reduces computing requirements, as a global model it still uses more computing resources than WRF.

"With MPAS, we want to predict severe thunderstorms with a mesh spacing of a few kilometers," Skamarock said. "That takes a lot of computer power."

Writer/contact
Jeff Smith, Science Writer and Public Information Officer

NCAR announces powerful new supercomputer for scientific discovery

BOULDER—The National Center for Atmospheric Research (NCAR) announced today that it has selected its next supercomputer for advancing atmospheric and Earth science, following a competitive open procurement process. The new machine will help scientists lay the groundwork for improved predictions of a range of phenomena, from hour-by-hour risks associated with thunderstorm outbreaks to the timing of the 11-year solar cycle and its potential impacts on GPS and other sensitive technologies.

The new system, named Cheyenne, will be installed this year at the NCAR-Wyoming Supercomputing Center (NWSC) and become operational at the beginning of 2017.

Cheyenne will be built by Silicon Graphics International Corp. (SGI) in conjunction with centralized file system and data storage components provided by DataDirect Networks (DDN). The SGI high-performance computer will be a 5.34-petaflop system, meaning it can carry out 5.34 quadrillion calculations per second. It will be capable of more than 2.5 times the amount of scientific computing performed by Yellowstone, the current NCAR supercomputer.

Funded by the National Science Foundation and the state of Wyoming through an appropriation to the University of Wyoming, Cheyenne will be a critical tool for researchers across the country studying climate change, severe weather, geomagnetic storms, seismic activity, air quality, wildfires, and other important geoscience topics. Since the supercomputing facility in Wyoming opened its doors in 2012, more than 2,200 scientists from more than 300 universities and federal labs have used its resources.

Six clips of scientific visualizations created with the help of the Yellowstone supercomputer.

"We're excited to bring more supercomputing power to the scientific community," said Anke Kamrath, director of operations and services at NCAR's Computational and Information Systems Laboratory. "Whether it's the threat of solar storms or a heightened risk in certain severe weather events, this new system will help lead to improved predictions and strengthen society's resilience to potential disasters."

"Researchers at the University of Wyoming will make great use of the new system as they continue their work into better understanding such areas as the surface and subsurface flows of water and other liquids, cloud processes, and the design of wind energy plants," said William Gern, vice president of research and economic development at the University of Wyoming. "UW's relationship with NCAR through the NWSC has greatly strengthened our scientific computing and data-centric research. It's helping us introduce the next generation of scientists and engineers to these endeavors."

The NWSC is located in Cheyenne, and the name of the new system was chosen to honor the support that it has received from the people of that city. It also commemorates the upcoming 150th anniversary of the city, which was founded in 1867 and named for the American Indian Cheyenne nation.

Increased power, greater efficiency

The new data storage system for Cheyenne will be integrated with NCAR's existing GLADE file system. The DDN storage will provide an initial capacity of 20 petabytes, expandable to 40 petabytes with the addition of extra drives. This, combined with the current 16 petabytes of GLADE, will total 36 petabytes of high-speed storage.
The new DDN system also will transfer data at the rate of 200 gigabytes per second, which is more than twice as fast as the current file system's rate of 90 gigabytes per second.

The system will include powerful Intel Xeon processors, whose performance will be augmented through optimization work that has been done by NCAR and the University of Colorado Boulder. NCAR and the university performed this work through their participation in the Intel Parallel Computing Centers program.

Even with its increased power, Cheyenne will be three times more energy efficient (in floating point operations per second, or flops, per watt) than Yellowstone, its predecessor, which is itself highly efficient. "The new system will have a peak computation rate of over 3 billion calculations per second for every watt of power consumed," said NCAR's Irfan Elahi, project manager of Cheyenne and section manager for high-end supercomputing services.

Scientists used the Yellowstone supercomputer to develop this 3-D rendering of a major thunderstorm in July 2011 that caused flooding in Fourmile Canyon west of Boulder. The colors show conditions in the clouds, including ice particles (light blue), graupel (orange), snow (pink), rain (blue), and water (grey). (Image by David Gochis, NCAR. This image is freely available for media & nonprofit use.)

More detailed predictions

High-performance computers such as Cheyenne allow researchers to run increasingly detailed models that simulate complex processes and how they might unfold in the future. These predictions give resource managers and policy experts valuable information for planning ahead and mitigating risk. Some of the areas in which Cheyenne is expected to accelerate research include the following:

Streamflow. Year-ahead predictions of streamflows and associated reservoir levels at a greater level of detail will provide water managers, farmers, and other decision makers with vital information about likely water availability and the potential for drought or flood impacts.

Severe weather. By conducting multiple simultaneous runs (or ensembles) of high-resolution forecast models, scientists will lay the groundwork for more specific predictions of severe weather events, such as the probability that a cluster of intense thunderstorms with the risk of hail or flooding will strike a county at a particular hour.

Solar energy. Specialized models of solar irradiance and cloud cover will be run more frequently and at higher resolution, producing research that will help utilities predict how much energy will be generated by major solar arrays hours to days in advance.

Regional climate change. Scientists will conduct multiple simulations with detailed climate models, predicting how particular regions around the world will experience changing patterns of precipitation and temperature, along with potential impacts from sea level rise, streamflow, and runoff.

Decadal prediction. Ensembles of detailed climate models will also help scientists predict the likelihood of certain climate patterns over a 10-year period, such as the risk of drought for a certain region or changes in Arctic sea ice extent.

Air quality. Scientists will be able to simulate the movement and evolution of air pollutants in far more detail, thereby better understanding the potential health effects of particular types of emissions and working toward improved forecasts of air quality.
Subsurface flows. More accurate and detailed models will enable researchers to better simulate the subsurface flows of water, oil, and gas, leading to a greater understanding of these resources.

Solar storms. Innovative, three-dimensional models of the Sun will lay the groundwork for predictions of the timing and strength of the Sun's 11-year cycle as well as for days-ahead forecasts of solar disturbances that can generate geomagnetic storms in Earth's upper atmosphere.

"Supercomputing is vital to NCAR's scientific research and applications, giving us a virtual laboratory in which we run experiments that would otherwise be impractical or impossible to do," said NCAR Director James Hurrell. "Cheyenne will be a key component of the research infrastructure of the United States through its provision of supercomputing specifically tailored for the atmospheric, geospace, and related sciences. The capabilities of this new system will be central to the continued improvement of our ability to understand and predict changes in weather, climate, air quality, and space weather, as well as their impacts on people, ecosystems, and society."

This series of images, based on a research project run on the Yellowstone supercomputer, shows order and chaos in the Sun's interior dynamo. Turbulent plasma motions (image a) generate a tangled web of magnetic field lines, with opposing "wreaths" of magnetism pointing east (red) or west (blue). Images b and c provide a better look at the magnetic wreaths. (Images by Kyle Augustson, NCAR. This image is freely available for media & nonprofit use.)

Cheyenne Quick Facts

Key features of the new Cheyenne supercomputer system:
5.34-petaflop SGI ICE XA Cluster with Intel "Broadwell" processors
More than 4,000 compute nodes
20% of the compute nodes have 128 GB memory; the remaining ~80% have 64 GB memory
313 terabytes (TB) of total memory
Mellanox EDR InfiniBand high-speed interconnect
Partial 9D Enhanced Hypercube interconnect topology
SUSE Linux Enterprise Server operating system
Altair PBS Professional Workload Manager
Intel Parallel Studio XE compiler suite
SGI Management Center & SGI Development Suite
Mellanox Unified Fabric Manager

The new Cheyenne supercomputer and the existing file system are complemented by a new centralized parallel file system and data storage components.

Key features of the new data storage system:
Four DDN SFA14KX systems
20 petabytes of usable file system space (can be expanded to 40 petabytes by adding drives)
200 GB per second aggregate I/O bandwidth
3,360 × 8-TB NL SAS drives
48 × 800-GB mixed-use SSD drives for metadata
24 × NSD (Network Shared Disk) servers
Red Hat Enterprise Linux operating system
IBM GPFS (General Parallel File System)
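For context, the efficiency figure Elahi quotes can be combined with the system's peak speed to estimate the implied power draw at full load. The calculation below uses only the numbers given in this article and is a rough illustration, not an official power specification.

```python
# Implied power draw from the figures quoted in the article; a rough
# illustration only, not an official specification for Cheyenne.
peak_flops = 5.34e15             # 5.34 petaflops peak
efficiency_flops_per_watt = 3e9  # "over 3 billion calculations per second" per watt

implied_watts = peak_flops / efficiency_flops_per_watt
print(f"implied power at peak: ~{implied_watts / 1e6:.1f} megawatts")
```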

NCAR to develop wildland fire prediction system for Colorado

BOULDER – The state of Colorado is turning to the National Center for Atmospheric Research (NCAR) to establish the country's most advanced system for predicting wildland fire behavior, including where and how quickly the blazes will spread.

Developed in response to legislation that Gov. John Hickenlooper signed in May, the new agreement finalized this month creates an innovative research and development partnership to generate real-time, 18-hour forecasts of active wildfires in the state. NCAR will work with the Colorado Division of Fire Prevention and Control's new Center of Excellence for Advanced Technology Aerial Firefighting in Rifle to design and develop the system and begin testing it as early as next year.

"This technology represents the next generation of wildland fire prediction," said NCAR science manager William Mahoney, who worked with state officials on developing the new agreement. "It will capture some of the critical feedbacks between large fires and the local weather, which often result in extreme fire behaviors that threaten lives and property. Colorado is using homegrown technology to lead the nation in wildland fire prediction."

The experimental forecast products will draw on powerful NCAR computer simulations and newly available satellite measurements of fires, obtained with a technique developed at the University of Maryland. They will also incorporate observations from Colorado's Multi-Mission Aircraft.

The Division of Fire Prevention and Control's Center of Excellence is "excited to be working with NCAR to develop this stakeholder-driven technology," said Center of Excellence Director Melissa Lineberger. She added that the technology will be particularly valuable to Colorado because it is being developed with stakeholder input and firefighters' needs in mind.

NCAR scientist Janice Coen used the CAWFE modeling system to create this simulation of the 2013 Yarnell Hill fire in Arizona, which killed 19 firefighters. (©UCAR. This image is freely available for media & nonprofit use.)

The system will provide unprecedented detail about interactions between weather and fire, which can create dangers for firefighters on the ground as well as for firefighting aircraft. It will build on a specialized computer model that was developed at NCAR with support from the National Science Foundation, NASA, and the Federal Emergency Management Agency. Once the system is fully developed and operational, it will be run by the Colorado Division of Fire Prevention and Control.

Tackling a major threat

Wildland fires are highly damaging in Colorado, taking the lives of firefighters and local residents, devastating large areas, and causing hundreds of millions of dollars in damage. Insurance claims from a single blaze, the 2012 Waldo Canyon Fire, totaled more than $450 million.

To better protect Colorado, state Rep. Tracy Kraft-Tharp (D-Arvada) and state Sen. Ellen Roberts (R-Durango) sponsored legislation earlier this year to fund development of the forecasting system. "This is a revolutionary early-warning system that will better safeguard all of us for years to come," Kraft-Tharp said.

The lessons learned from the Colorado system are expected to yield benefits for fighting wildfires across the western United States.

Capturing fire weather

Despite the lives and economic costs at stake, the techniques currently available for anticipating fire behavior remain similar to those of past decades.
Typically, firefighters infer how fast the edge of a fire will expand based on terrain, fuels, and a measurement or estimate of the winds. But this approach cannot capture changes associated with the interaction of fire and weather. To accurately forecast a wildland fire in detail, a computer model has to simulate highly localized winds that drive the flames. Adding to the complexity, a major blaze alters local weather, creating winds within the fire that may be more than 10 times stronger than those outside. These internal winds can contribute to potentially deadly accelerations, increases in intensity, unexpected shifts in direction, or splits in which the flames go in multiple directions. This interplay between fire and weather is particularly pronounced in Colorado and other western states, where clouds produce strong outflows and winds can rush down mountainsides and vary from one valley to the next. Wildfire is a major concern across Colorado and many other states across the western U.S. (Fire over Camp Pendleton, California, October 23, 2007. U.S. Marine Corps photo by Lance Cpl. Albert F. Hunt, via Wikimedia Commons.) To tackle this problem, the Colorado forecasting system will use a breakthrough computer model developed by NCAR scientist Janice Coen, who has studied wildland fires for more than 20 years. NCAR’s CAWFE® modeling system (derived from Coupled Atmosphere-Wildland Fire Environment) combines weather prediction with fire behavior simulations to capture the highly complex interplay of fire and weather. By restarting the model every few hours with the latest satellite and aircraft observations of an active fire—a process known as cycling—Coen and her research partner, University of Maryland professor Wilfrid Schroeder, have shown that it is possible to accurately predict the course of a blaze over the next one to two days. They can keep refreshing the model, making it possible to simulate the entire lifetime of even a very long-lived fire, from ignition to extinction. “Even though fires are complex and rapidly changing and often described as unpredictable, much of a fire event can be foreseen by this more sophisticated model,” Coen said. WriterDavid Hosansky, Manager of Media Relations
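The cycling workflow described above can be illustrated with a short, purely schematic Python sketch. This is not CAWFE code: the coupled atmosphere-fire model is replaced by a single toy "fire size" number, and the functions (toy_growth_rate, observe_fire_size, run_forecast) are hypothetical stand-ins invented for this illustration. The sketch only shows the pattern of restarting a forecast every few hours from the latest observation and pushing each restart out 18 hours.

import random

CYCLE_HOURS = 6        # assumed interval between restarts ("every few hours")
FORECAST_HOURS = 18    # forecast length mentioned in the article

def toy_growth_rate(wind_speed_ms):
    # Hypothetical growth of burned area (km^2 per hour) that rises with wind speed.
    return 0.2 + 0.05 * wind_speed_ms

def observe_fire_size(true_size_km2):
    # Stand-in for a satellite or aircraft observation of the fire, with some noise.
    return max(0.0, true_size_km2 + random.gauss(0.0, 0.5))

def run_forecast(start_size_km2, wind_speed_ms, hours):
    # Integrate the toy model forward one hour at a time from the observed state.
    sizes = [start_size_km2]
    for _ in range(hours):
        sizes.append(sizes[-1] + toy_growth_rate(wind_speed_ms))
    return sizes

def main():
    true_size = 1.0  # km^2: the "real" fire we only see through noisy observations
    for cycle in range(4):
        obs = observe_fire_size(true_size)  # ingest the latest observation
        forecast = run_forecast(obs, wind_speed_ms=8.0, hours=FORECAST_HOURS)
        print(f"cycle {cycle}: restart at {obs:.1f} km^2, "
              f"+{FORECAST_HOURS} h forecast: {forecast[-1]:.1f} km^2")
        # Advance the "real" fire to the next restart time before cycling again.
        true_size += CYCLE_HOURS * toy_growth_rate(9.0)

if __name__ == "__main__":
    main()

In the real system, each restart would reinitialize a full coupled atmosphere-fire simulation from the latest fire perimeter rather than a single number; the loop structure, not the physics, is the point of the sketch.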

Looking at swells in 3D

November 10, 2015 | Scientists have long been interested in studying how winds influence ocean waves. NCAR Senior Scientist Peter Sullivan wanted to examine the relationship in reverse: How do waves affect the atmosphere?

"Most people focus on winds influencing waves because it’s the easiest to study in a laboratory," Sullivan said. "But nature works the other way, too."

The result is a striking 3D animation showing the influence of ocean waves on the air above. While strong winds from storms create waves on the ocean surface, those waves don’t simply stop. They travel away from the storm, sometimes thousands of miles, to areas with lighter winds. There, the waves, which have become swells, influence the atmosphere. Understanding how this happens provides insight into global weather and climate patterns, as well as into major storms such as hurricanes, which draw energy from the ocean. "With a hurricane, the first thing you see is the fast-moving swell created by it," Sullivan said.

Simulating turbulence

Sullivan simulated what would happen if a spectrum of small and big waves moved into an area with light winds. He used a technique called large-eddy simulation, first developed by NCAR scientists in the 1960s to account for turbulence in computer models. The NCAR-Wyoming Supercomputing Center’s Yellowstone system crunched a billion data points, and NCAR software engineer Scott Pearse later rendered a two-minute visualization from telling chunks of the simulation. "It’s the data that was actually beautiful," Pearse said.

"The ocean wave fields and atmosphere couple, leaving an imprint on each other, and the visualization illustrates some of the complexity of what goes on—in 3D," Sullivan said. Small waves impose a drag on the air, while fast, big waves provide thrust, pushing the air forward. "You can see wave signatures in the atmosphere over a large vertical extent," Sullivan said. (Watch for some of these wave-driven winds aloft between the 1:30 and 2:00 marks in the video.)

Sullivan has made a simplified version of the new software code available, and a number of students are using it for their graduate work. Pearse created the visualization using VAPOR, an NCAR software tool that serves as a visualization and analysis platform for ocean, atmosphere, and solar researchers.

When he vacationed in Florida after doing initial work on the project, Pearse found he had gained a new perspective on the relationship between the air and the ocean. "I saw the waves differently," he said. "Now that I know that atmosphere-ocean interaction is a two-way street, I find myself wondering what the air is doing whenever I see moving water."

About the study

A journal publication describing the numerical algorithm and further results from the large-eddy simulations is available at:

Sullivan, P. P., J. C. McWilliams, and E. G. Patton, 2014: Large-eddy simulation of marine boundary layers above a spectrum of moving waves. Journal of the Atmospheric Sciences, doi:10.1175/JAS-D-14-0095.1.

Writer/Contact: Jeff Smith, Science Writer and Public Information Officer

Funders: Physical Oceanography Program at the Office of Naval Research; National Science Foundation
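As a back-of-the-envelope companion to the drag-versus-thrust point above, the short Python sketch below uses the standard deep-water dispersion relation (phase speed c = gT / 2π) and the conventional "wave age" ratio c/U to show why long-period swell typically outruns light winds while short wind waves do not. None of this comes from the study's code, and the 1.2 wave-age threshold is a common rule of thumb quoted here only for illustration.

import math

G = 9.81  # gravitational acceleration, m/s^2

def deep_water_phase_speed(period_s):
    # Deep-water dispersion relation: c = g * T / (2 * pi).
    return G * period_s / (2.0 * math.pi)

def wave_age(period_s, wind_speed_ms):
    # Ratio of wave phase speed to the near-surface wind speed.
    return deep_water_phase_speed(period_s) / wind_speed_ms

if __name__ == "__main__":
    wind = 5.0  # m/s: the lighter winds far from the storm that made the waves
    for period in (3.0, 8.0, 15.0):  # short wind waves through long-period swell
        c = deep_water_phase_speed(period)
        age = wave_age(period, wind)
        if age > 1.2:  # rule-of-thumb threshold for waves outrunning the wind
            regime = "swell outruns the wind and can push the air (thrust)"
        else:
            regime = "wind outruns the waves, which act as a drag"
        print(f"T = {period:4.1f} s, c = {c:5.1f} m/s, c/U = {age:4.1f} -> {regime}")

For a 5 m/s wind, a 3-second wind wave moves more slowly than the air above it, while 8- and 15-second swells move two to five times faster, which is the situation the simulation explores when storm-generated waves arrive in a region of light winds.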
