Data & Data Analysis

Double the data: Putting weather observations in the cloud increases access

May 1, 2018 | Meteorologists and other users accessed more than twice as much U.S. weather radar data after the National Oceanic and Atmospheric Administration (NOAA) partnered with Amazon and the University Corporation for Atmospheric Research (UCAR) to make the data available in the cloud.

The collaboration is part of the NOAA Big Data Project. Launched in 2015, the project aims to make NOAA's vast storehouse of environmental data easier to access, with the hope that both the public and industry will find ways to capitalize on it and spur new economic growth. Partners in the project include Amazon Web Services, Google Cloud Platform, IBM, Microsoft, and the Open Commons Consortium.

The first dataset chosen for the project was from the Next Generation Weather Radar (NEXRAD) Weather Surveillance Radar system, and the first collaborator to put that dataset in the cloud was Amazon. The entire archive of NEXRAD data, which stretches back to 1991, is now available via NEXRAD on AWS, and Amazon has partnered with UCAR's Unidata program to update the database in real time and to provide the tools users need to make sense of the data.

"As a leading provider of geoscience data to universities across the country, it made perfect sense to partner with Amazon to explore how cloud computing can expand our reach and provide new capabilities to our users," said Unidata Director Mohan Ramamurthy. "The success of this project has given us a chance to be a part of what the future of data delivery will look like."

In a paper published recently in the Bulletin of the American Meteorological Society, the collaborators describe the project's early accomplishments. Chief among them: users are now accessing 2.3 times more NEXRAD Level II data. This data, collected from 160 sites and updated approximately every 5 minutes, characterizes precipitation and winds across the U.S. and is an important input for weather forecasts.

Prior to the NOAA Big Data Project, archived Level II NEXRAD data was stored only at NOAA's National Centers for Environmental Information (NCEI). While this archive represents a critical dataset for researchers interested in large-scale analysis of how weather patterns may have changed over time, in practice it was very difficult to use, in part because of its size — about 270 terabytes — and in part because of the time and cost of obtaining the files. The authors of the new study estimate that it would have cost $203,310 and taken 540 days for NCEI to fulfill one researcher's request for the entire NEXRAD Level II archive.

Now, scientists can access the data on the Amazon cloud at no charge. As a result, about 80 percent of the NEXRAD data accessed by users comes from the cloud, and only about 20 percent comes from NCEI. Jeff Weber, who leads Unidata's part of the project, said that this kind of easy, open access to geoscience data "removes the friction" of doing the science.

Moving forward, Weber envisions having satellite data and weather model output available alongside the radar data. "Once we're able to bring all these components together in the cloud, I think we're going to see a huge leap in the science," he said. "When you make it easier for scientists to just focus on the science — and not worry about accessing and storing huge amounts of data — breakthroughs are bound to happen."

The collaboration with Amazon has allowed Unidata to test how scientists will respond to data stored in the cloud.
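For readers who want to try this, the sketch below pulls a few archived Level II files from the public NEXRAD on AWS bucket using Python's boto3. The bucket name and its year/month/day/site key layout are assumptions to verify against the current NEXRAD on AWS documentation.

    # A minimal sketch of pulling archived NEXRAD Level II files from the
    # public NEXRAD on AWS bucket. Bucket name and YYYY/MM/DD/SITE key layout
    # are assumptions to verify against the current NEXRAD on AWS listing.
    import boto3
    from botocore import UNSIGNED
    from botocore.config import Config

    # The archive is public, so anonymous (unsigned) requests are enough.
    s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

    bucket = "noaa-nexrad-level2"
    prefix = "2015/05/15/KTLX/"  # one day from the Oklahoma City radar

    resp = s3.list_objects_v2(Bucket=bucket, Prefix=prefix)
    for obj in resp.get("Contents", [])[:5]:
        print(obj["Key"], obj["Size"])

    # Download one volume scan for local analysis (e.g., with Py-ART or MetPy).
    first_key = resp["Contents"][0]["Key"]
    s3.download_file(bucket, first_key, first_key.split("/")[-1])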
Part of Unidata's long-term vision is to take data that it now pushes out to users across the country and move it instead to a cloud platform. Aside from the benefit of scientists being able to access the data from anywhere, moving data to the cloud also addresses the reality that increasingly complex and high-resolution models and observational instruments are straining the physical capacity to deliver data to individual research institutions.

"The data volumes are growing exponentially — they are growing so fast that we can't just keep pushing all of the data out to our users," Ramamurthy said. "But just putting data out there in the cloud isn't enough either. We need to train our community on how to access that data and provide them the tools they need in the cloud to make it as easy as possible to use the data that's stored there."

About the article
Title: Unlocking the Potential of NEXRAD Data through NOAA's Big Data Partnership
Authors: Steve Ansari, Stephen Del Greco, Edward Kearns, Otis Brown, Scott Wilkins, Mohan Ramamurthy, Jeff Weber, Ryan May, Jed Sundwall, Jeff Layton, Ariel Gold, Adam Pasch, and Valliappa Lakshmanan
Journal: Bulletin of the American Meteorological Society, DOI: 10.1175/BAMS-D-16-0021.1
Writer/contact: Laura Snider, Senior Science Writer

Groundbreaking data set gives unprecedented look at future weather

Dec. 7, 2017 | How will weather change in the future? It's been remarkably difficult to say, but researchers are now making important headway, thanks in part to a groundbreaking new data set at the National Center for Atmospheric Research (NCAR).

Scientists know that a warmer and wetter atmosphere will lead to major changes in our weather. But pinning down exactly how weather — such as thunderstorms, midwinter cold snaps, hurricanes, and mountain snowstorms — will evolve in the coming decades has proven a difficult challenge, constrained by the sophistication of models and the capacity of computers.

Now a rich new data set is giving scientists an unprecedented look at the future of weather. Nicknamed CONUS 1 by its creators, the data set is enormous. To generate it, the researchers ran the NCAR-based Weather Research and Forecasting model (WRF) at an extremely high resolution (4 kilometers, or about 2.5 miles) across the entire contiguous United States (sometimes known as "CONUS") for a total of 26 simulated years: half in the current climate and half in the future climate expected if society continues on its current trajectory.

The project took more than a year to run on the Yellowstone supercomputer at the NCAR-Wyoming Supercomputing Center. The result is a trove of data that allows scientists to explore in detail how today's weather would look in a warmer, wetter atmosphere.

CONUS 1, which was completed last year and made easily accessible through NCAR's Research Data Archive this fall, has already spawned nearly a dozen papers that explore everything from changes in rainfall intensity to changes in mountain snowpack.

"This was a monumental effort that required a team of researchers with a broad range of expertise — from climate experts and meteorologists to social scientists and data specialists — and years of work," said NCAR senior scientist Roy Rasmussen, who led the project. "This is the kind of work that's difficult to do in a typical university setting but that NCAR can take on and make available to other researchers."

Other principal project collaborators at NCAR are Changhai Liu and Kyoko Ikeda. A number of additional NCAR scientists lent expertise to the project, including Mike Barlage, Andrew Newman, Andreas Prein, Fei Chen, Martyn Clark, Jimy Dudhia, Trude Eidhammer, David Gochis, Ethan Gutmann, Gregory Thompson, and David Yates. Collaborators from the broader community include Liang Chen, Sopan Kurkute, and Yanping Li (University of Saskatchewan) and Aiguo Dai (SUNY Albany).

Climate and weather research coming together

Climate models and weather models have historically operated on different scales, both in time and space. Climate scientists are interested in large-scale changes that unfold over decades, and the models they've developed help them nail down long-term trends such as increasing surface temperatures, rising sea levels, and shrinking sea ice.

Climate models are typically low resolution, with grid points often separated by 100 kilometers (about 60 miles). The advantage of such coarse resolution is that these models can be run globally for decades or centuries into the future with the available supercomputing resources.
The downside is that they lack the detail needed to capture features that influence local atmospheric events, such as land surface topography, which drives mountain weather, or the small-scale circulation of warm air rising and cold air sinking that sparks a summertime thunderstorm.

Weather models, on the other hand, have higher resolution, take into account atmospheric microphysics such as cloud formation, and can simulate weather fairly realistically. It's not practical to run them for long periods of time or globally, however — supercomputers are not yet up to the task.

As scientific understanding of climate change has deepened, the need to merge these disparate scales has become more pressing, so that scientists can better understand how global atmospheric warming will affect local weather patterns.

"The climate community and the weather community are really starting to come together," Rasmussen said. "At NCAR, we have both climate scientists and weather scientists, we have world-class models, and we have access to state-of-the-art supercomputing resources. This allowed us to create a data set that offers scientists a chance to start answering important questions about the influence of climate change on weather."

Weather models are typically run at much higher resolution than climate models, allowing them to more accurately capture precipitation. The figure compares the average annual precipitation from a 13-year run of the Weather Research and Forecasting (WRF) model (left) with the average annual precipitation from running the global Community Earth System Model (CESM) to simulate the climate between 1976 and 2005 (right). (©UCAR. This figure is courtesy of Kyoko Ikeda. It is freely available for media & nonprofit use.)

Today's weather in a warmer, wetter future

To create the data set, the research team used WRF to simulate weather across the contiguous United States between 2000 and 2013. They initiated the model using a separate "reanalysis" data set constructed from observations. When compared with radar images of actual weather during that time period, the results were excellent.

"We weren't sure how good a job the model would do, but the climatology of real storms and the simulated storms was very similar," Rasmussen said.

With confidence that WRF could accurately simulate today's weather, the scientists ran the model for a second 13-year period, using the same reanalysis data but with a few changes. Notably, the researchers increased the temperature of the background climate conditions by about 5 degrees Celsius (9 degrees Fahrenheit), the end-of-century (2080–2100) temperature increase predicted by averaging 19 leading climate models under a business-as-usual greenhouse gas emissions scenario. They also increased the water vapor in the atmosphere by the corresponding amount, since physics dictates that a warmer atmosphere can hold more moisture.

The result is a data set that examines how weather events from the recent past — including named hurricanes and other distinctive weather events — would look in our expected future climate.

The data have already proven to be a rich resource for people interested in how individual types of weather will respond to climate change. Will the squall lines of intense thunderstorms that rake across the country's midsection get more intense, more frequent, and larger? (Yes, yes, and yes.) Will snowpack in the West get deeper, shallower, or disappear?
(It depends on the location: the high-elevation Rockies are much less vulnerable to the warming climate than the coastal ranges.) Other scientists have already used the CONUS 1 data set to examine changes to rain-on-snow events, the speed of snowmelt, and more.

Running the Weather Research and Forecasting (WRF) model at 4-kilometer resolution over the contiguous United States produced realistic simulations of precipitation. Above, average annual precipitation from WRF for the years 2000–2013 (left) compared to the observation-based PRISM dataset for the same period (right). (©UCAR. This figure is courtesy of Kyoko Ikeda. It is freely available for media & nonprofit use.)

Pinning down changes in storm track

While the new data set offers a unique opportunity to delve into changes in weather, it also has limitations. Importantly, it does not reflect how the warming climate might shift large-scale weather patterns, like the typical track most storms take across the United States. Because the same reanalysis data set is used to kick off both the current and future climate simulations, the large-scale weather patterns remain the same in both scenarios.

To remedy this, the scientists are already working on a new simulation, nicknamed CONUS 2. Instead of using the reanalysis data set — which was built from actual observations — to kick off the modeling run of the present-day climate, the scientists will use weather extracted from a simulation by the NCAR-based Community Earth System Model (CESM). For the future climate run, the scientists will again take the weather patterns from a CESM simulation — this time for the year 2100 — and feed the information into WRF.

The finished data set, which will cover two 20-year periods, will likely take another year of supercomputing time to complete, this time on the newer and more powerful Cheyenne system at the NCAR-Wyoming Supercomputing Center. When complete, CONUS 2 will help scientists understand how expected changes in storm tracks will affect local weather across the country.

Scientists are already eagerly awaiting the data from the new runs, which could start in early 2018. But even that data set will have limitations. One of the greatest may be that it will rely on a single run of CESM. Another NCAR-based project ran the model 40 times from 1920 to 2100 with only minor changes to the model's starting conditions, showing researchers how the natural chaos of the atmosphere can cause the climate to look quite different from simulation to simulation.

Still, a single run of CESM will let scientists make comparisons between CONUS 1 and CONUS 2, allowing them to pinpoint how possible storm track changes will play out in local weather. And CONUS 2 can also be compared to other efforts that downscale global simulations to study how regional areas will be affected by climate change, providing insight into the pros and cons of different research approaches.

"This is a new way to look at climate change that allows you to examine the phenomenology of weather and answer the question, 'What will today's weather look like in a future climate?'" Rasmussen said.
"This is the kind of detailed, realistic information that water managers, city planners, farmers, and others can understand and helps them plan for the future."Get the dataHigh Resolution WRF Simulations of the Current and Future Climate of North America, DOI: 10.5065/D6V40SXPStudies that relied on the CONUS data setExtreme downpours could increase fivefold across parts of the U.S.Slower snowmelt in a warming worldNorth American storm clusters could produce 80 percent more rainWriter/contact:Laura Snider, Senior Science Writer

From GOES-16 to the world

March 6, 2017 | As atmospheric scientists around the world look forward to seeing extraordinarily detailed images from the new GOES-16 satellite, the University Corporation for Atmospheric Research (UCAR) and the National Center for Atmospheric Research (NCAR) are preparing for central roles in disseminating the satellite's data.

The first of a series of next-generation National Oceanic and Atmospheric Administration (NOAA) satellites, GOES-16 was launched in November and is expected to become fully operational late this year. It will immediately improve weather forecasts with its rapid, high-resolution views of hurricanes, thunderstorms, and other severe events, as well as provide a breakthrough lightning mapping system and more detailed monitoring of geomagnetic disturbances caused by the Sun.

"Scientists are rightfully excited because this is a revolutionary system," said Mohan Ramamurthy, director of UCAR's Unidata Program. "It's going to truly transform weather forecasting and research."

GOES-16 captured this view of the mid-Atlantic and New England states on Jan. 15. (Image by National Oceanic and Atmospheric Administration.)

Data from GOES-16 will be transmitted to a new downlink facility at the NCAR Mesa Lab. Unidata, which provides data, software tools, and support to enhance Earth system science education and research, will then make that data widely available. As the only free, open-access source of real-time GOES data, Unidata has become indispensable to scientists as well as to operational forecasters in regions that lack their own downlink facilities, such as parts of Latin America.

In addition, NCAR's Earth Observing Laboratory (EOL) will produce customized data products from GOES-16 to support field campaigns. EOL currently uses observations from GOES satellites and other sources to help scientists make critical decisions as they're taking measurements in the field.

More data than ever

For years, NCAR and UCAR have provided real-time data from a series of NOAA satellites known as GOES (Geostationary Operational Environmental Satellite). These satellites, which provide views of the Americas and adjoining ocean regions, are part of a global network of satellites whose observations are shared by forecasters and researchers worldwide.

But the advantages of GOES-16 also create new challenges. The satellite has three times as many spectral channels as its predecessors, each with four times the resolution. It can scan the entire Western Hemisphere every 15 minutes and simultaneously generate images of severe weather every 30-60 seconds. All this data will amount to about 1 terabyte per day, more than 100 times the amount produced by an existing GOES satellite. And even more can be expected when NOAA launches additional advanced GOES satellites in coming years.

Thanks to a NOAA grant, UCAR and NCAR have installed a direct broadcast receiving station to receive the data, as well as the computers and electronics needed to process and transmit it. In addition to Unidata and EOL, NCAR's Research Applications Laboratory helps operate the downlink facilities for existing GOES satellites and relies on satellite data for the development of specialized forecasting products.

The volume of information means that Unidata will continue to move toward making data available in the cloud.
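As an illustration of what that cloud-hosted access can look like, the sketch below lists and opens one Advanced Baseline Imager file from the public noaa-goes16 bucket that later became available through NOAA's open-data partnerships. The bucket name and its product/year/day-of-year/hour key layout are assumptions to verify, and the example relies on the s3fs, xarray, and h5netcdf packages.

    # A sketch of reading one GOES-16 Advanced Baseline Imager file from the
    # public "noaa-goes16" bucket. Bucket name and product/YYYY/DDD/HH key
    # layout are assumptions to verify against the current open-data listing.
    import s3fs
    import xarray as xr

    fs = s3fs.S3FileSystem(anon=True)  # public bucket, anonymous access

    # CONUS-sector Cloud and Moisture Imagery for one hour of one day.
    keys = fs.ls("noaa-goes16/ABI-L2-CMIPC/2019/001/00/")
    print(len(keys), "files in this hour; first:", keys[0])

    # Open one scan and inspect the imagery field without downloading it all.
    with fs.open(keys[0]) as f:
        ds = xr.open_dataset(f, engine="h5netcdf")  # file-like objects need h5netcdf
        print(ds["CMI"].attrs.get("long_name"), ds["CMI"].shape)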
Unidata will store GOES-16 data for about 10 days and is in discussions with Amazon over long-term storage options.

EOL will customize GOES-16 observations for worldwide field projects, which advance understanding of Earth system science, including weather, climate, and air quality. Such projects deploy teams of scientists with aircraft, ships, ground-based instruments, and other tools. They rely on detailed forecasts and real-time updates about evolving atmospheric conditions.

"The data from GOES-16 will provide invaluable information for flight planning and decision making during field projects," said EOL director Vanda Grubišić. "This will enable scientists to gather additional observations, further advancing our understanding of the atmosphere and related aspects of the Earth system."

EOL will also include the GOES data in its field catalog, along with measurements from field campaigns and other observations. This catalog is widely used by scientists when analyzing results from past campaigns or planning new ones.

Other scientists say they are looking forward to the new capabilities that GOES-16 offers.

"The observations collected by the Geostationary Lightning Mapper on GOES-16 have the potential to help advance our understanding of hurricanes and their intensity changes," said Kristen Corbosiero, a professor in the Department of Atmospheric and Environmental Sciences at the University at Albany, SUNY. "Being able to access this data through Unidata will streamline and expedite our research."

In Costa Rica, agencies are planning to use the GOES-16 data from Unidata for weather forecasting and research. In addition, the data will help with monitoring water levels for hydropower to avoid possible power cuts during the dry season, as well as with observing volcanic ash that can affect aviation and farming near San Jose.

"Several institutions will be using the new GOES-16 data in ways that will help safeguard society from potential natural disasters as well as avoid energy shortages," said Marcial Garbanzo Salas, an atmospheric sciences professor at the Universidad de Costa Rica (University of Costa Rica). "This is extremely important to us, and we're very pleased that Unidata will be making it available."

Writer/contact: David Hosansky, Media Relations Manager
Funder: National Oceanic and Atmospheric Administration

New apps set atmospheric data spinning in 3D

February 6, 2017 | Students of microbiology can grow bacteria in petri dishes to better understand their subject. Paleontology students have fossils, and chemistry students have beakers bubbling with reactions. But students of the atmospheric and related sciences are often left with something much less tangible: data, and lots of it.

The Meteo AR app uses augmented-reality techniques to make atmospheric science data more accessible to the public. (©UCAR. This animation is freely available for media & nonprofit use.)

Data sets in the atmospheric sciences cover everything from observations made by weather balloons to satellite measurements of cloud cover to output from climate model runs.

Now the National Center for Atmospheric Research (NCAR) is helping make those data less abstract and more concrete — a little closer to a rock sample and a little further from a computer file. The result is two apps, one using virtual-reality and one using augmented-reality techniques, that create 3D visualizations of data sets on a globe that students can move around and view from different perspectives. Meteo VR (Virtual Reality) and Meteo AR (Augmented Reality) are available for use on iPhone, iPad, and Android devices. They were developed by NCAR's Computational and Information Systems Lab (CISL).

"The goal is to make our data more accessible to the public, especially to students," said Tim Scheitlin, a senior software engineer at CISL's Visualization Lab. "We think it's a fun way to start a dialogue about atmospheric science. If people can get excited about using the app, then maybe they'll start asking questions that will lead to a deeper understanding."

The 'wow' factor and beyond

The Meteo AR app takes advantage of the camera on a personal device. When the camera's pointed at an image from a visualization — of sea surface temperature anomalies during an El Niño, or of the inner workings of a hurricane, for example — the visualization pops up onto a 3D globe that can be spun around with a finger. The Meteo VR app requires a virtual reality headset, such as Google Cardboard, and allows the user to "fly around" the globe to look at the projected data set from any angle.

Development of the two apps was led by Nihanth Cherukuru, a doctoral student at Arizona State University. He came to NCAR last summer as part of CISL's Summer Internships in Parallel Computational Science (SIParCS) program, which strives "to make a long-term, positive impact on the quality and diversity of the workforce needed to use and operate 21st century supercomputers."

Cherukuru said one of the challenges of the project was to wrestle the vast amounts of data into a format that wouldn't crash a handheld device. "Mobile phones are tiny devices and the atmospheric data can be really huge," Cherukuru said. "We needed to take that data and trim it down. We created a single image for each timestamp and then we made animations to reduce the computational burden on the phones."

While Cherukuru has returned to Arizona State after his SIParCS internship, he is still working with the Visualization Lab. The goal is to expand the apps' capabilities, perhaps, for example, by having users click on parts of the data to get more information.

"There's kind of a 'wow' factor you get when you first use the app," Scheitlin said. "Our goal is to get past that and make it as educational as we can."

Download the apps
Meteo AR: for iPhone or iPad; for Android
Meteo VR: for iPhone or iPad; for Android

Writer/contact: Laura Snider, Senior Science Writer
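The data-trimming step Cherukuru describes can be mimicked in a few lines: pre-render each timestep of a gridded field to an image so the device only ever loads textures. The sketch below uses synthetic data and matplotlib; it is purely illustrative, not the apps' actual pipeline.

    # A toy version of the data-reduction step described above: pre-render
    # each timestep of a gridded field to a small image so a phone handles
    # only textures, never raw model output. Illustrative, not the apps' code.
    import numpy as np
    import matplotlib.pyplot as plt

    # Stand-in for a real dataset: 24 timesteps of a 180 x 360 global field.
    rng = np.random.default_rng(0)
    data = rng.normal(size=(24, 180, 360)).cumsum(axis=0)  # smooth-ish evolution

    for t, field in enumerate(data):
        fig, ax = plt.subplots(figsize=(3.6, 1.8), dpi=100)
        ax.imshow(field, cmap="coolwarm", origin="lower")
        ax.axis("off")
        fig.savefig(f"frame_{t:03d}.png", bbox_inches="tight", pad_inches=0)
        plt.close(fig)  # each frame is tens of KB vs ~500 KB of float64 data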

Forecast for big data: Mostly cloudy

May 31, 2016 | The rise of big data has big implications for the advancement of science. It also has big implications for the clogging of bandwidth. The growing deluge of geoscience data is in danger of maxing out the existing capacity to deliver that information to researchers. In response, scientific institutions are experimenting with storing data in the cloud, where researchers can readily get the relatively small portion of the data they actually need.

Helping blaze the way is Unidata, which partnered with Amazon Web Services last year to make Next Generation Weather Radar (NEXRAD) data from the National Oceanic and Atmospheric Administration (NOAA) available in the cloud in near real time. The project is one of the ways Unidata, a community program of the University Corporation for Atmospheric Research (UCAR), is exploring what the future of data access may look like.

"One of the roles we play at Unidata is to see where the information technology world is going and monitor the new technologies that can advance science," said Unidata Director Mohan Ramamurthy. "In the last 10 years, we've watched the cloud computing environment mature. It's become robust and reliable enough that it now makes sense for the scientific community to begin to adopt it."

Inside an Amazon Web Services data center. (Photo courtesy Amazon.)

The data deluge

Since 1984, Unidata has been delivering geoscience data in near real time to researchers who want it. Today, Unidata also offers those scientists tools they can use to analyze and visualize the data. In 2008, Unidata's servers delivered 2.7 terabytes of data a day to 170 institutions. Just five years later, the program was providing 13 terabytes — or the equivalent of about 4.5 million digital photos — a day to 263 institutions. Today, Unidata is delivering about 33 terabytes of data a day. And the volume is only expected to grow. For example, NOAA's new weather satellite, GOES-R (Geostationary Operational Environmental Satellite R-Series), is scheduled to launch in October. When GOES-R is up and running, it alone will produce a whopping 3.5 terabytes of data a day.

"We've been pushing out data for 30-plus years here at Unidata," said Jeff Weber, who is heading up Unidata's collaboration with Amazon. "What we're finding now is that the volume of available data is just getting to be too large. We can't keep putting more and more data into the pipe and pushing it out — there are physical constraints."

The physical constraints are not just on Unidata's side. Many universities and other institutions that rely on Unidata do not have the local bandwidth to handle a huge increase in the incoming stream of data. To address the problem, Unidata decided a few years ago to begin transitioning its services to the cloud — a network of servers hosted on the Internet that allows users to access and process data from anywhere.

The vision is to create a future where scientists could go to the cloud, access the data they need, and then use cloud-based tools to process and analyze that data. At the end of their projects, scientists would download only their finished products: a map or graph, perhaps, or the results from a statistical analysis.

"With cloud computing, you can bring all your science and the analytic tools you use to the data, rather than the old paradigm of bringing the data to your tools," Ramamurthy said.

'Navigating the waters'

These advantages were part of the motivation behind the U.S.
Department of Commerce's announcement last spring that NOAA would collaborate with Amazon, Google, IBM, Microsoft, and the Open Commons Consortium with the goal of "unleashing its vast resources of environmental data" using cloud computing.

A NEXRAD data product available to researchers through Unidata. (Image courtesy Unidata.)

Amazon Web Services was one of the first out of the gate on the NOAA Big Data Project, uploading the full archive of NEXRAD data to the cloud last summer. But to figure out how to continue to feed the archive with near-real-time observations and to help make sense of the data — how people might want to use it and what kinds of tools they would need — Amazon turned to Unidata.

"It made a lot of sense for Unidata to partner with Amazon and vice versa," Ramamurthy said. "They wanted expertise in atmospheric science data. We wanted an opportunity to introduce cloud-based data services to our community and raise awareness about what it can do."

The scientific community is perhaps more hesitant to rely on the cloud than other user groups. Datasets are the lifeblood of many research projects, and knowing that the data are stored locally offers a sense of security for many scientists, Ramamurthy said. Losing access to some data could nullify years of work. But the truth is that the data are likely more secure in the cloud than on a local hard drive, Ramamurthy said. "Mirroring" by multiple cloud servers means that data are always backed up.

If the Amazon project, and the NOAA Big Data Project in general, are successful in winning scientists over, it could go a long way toward helping Unidata make its own transition to the cloud. Unidata will be studying and learning from the project, including how to make a business model that will work, with an eye toward its own future.

"We're navigating the waters to find out what works and what doesn't so we can report back to the National Science Foundation," Weber said. "We want to see how this paradigm shift might play out — if it makes sense, if it doesn't, or if it makes sense in a few ways but not others."

Writer/contact: Laura Snider, Senior Science Writer and Public Information Officer
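The "bring your tools to the data" idea is easiest to see with server-side subsetting, for example over OPeNDAP, a Unidata-developed remote-access protocol: the server slices the dataset, and only the requested piece crosses the network. The sketch below uses a hypothetical URL and variable names.

    # The requested slice is subset on the server; only it crosses the network.
    # The URL and variable names below are placeholders, not a real endpoint.
    import xarray as xr

    url = "https://example-thredds-server/thredds/dodsC/model/output.nc"  # hypothetical

    ds = xr.open_dataset(url)  # lazy: reads metadata only, not the data

    # One variable, one region, one week -- even if the archive is terabytes.
    subset = ds["temperature"].sel(
        lat=slice(35, 45),
        lon=slice(255, 265),
        time=slice("2016-05-01", "2016-05-07"),
    )
    local = subset.load()  # the actual (small) transfer happens here
    print(local.nbytes / 1e6, "MB transferred")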

UCAR to support EarthCube: Cyberinfrastructure will advance science

BOULDER – EarthCube, a landmark initiative to develop new technological and computational capabilities for geosciences research, will be supported by the University Corporation for Atmospheric Research (UCAR) under a new agreement with the National Science Foundation (NSF).

Created by NSF in 2011, EarthCube aims to help researchers across the geosciences, from meteorology to seismology, better understand our planet in ways that can strengthen societal resilience to natural events. More than 2,500 EarthCube contributors – including scientists, educators, and information professionals – work together on the creation of a common cyberinfrastructure for researchers to collect, access, analyze, share, and visualize all forms of data and related resources.

"EarthCube offers the promise to advance geoscience research by creating and delivering critical new capabilities," said UCAR scientist Mohan Ramamurthy, principal investigator and project director of the new EarthCube office at UCAR.

"This is a great opportunity for UCAR to leverage its successful track record in managing large scientific projects that advance our understanding of the planet," said Michael Thompson, interim UCAR president. "The EarthCube project offers the potential to significantly benefit society by helping scientists use the power of diverse big datasets to better understand and predict the natural events, from severe storms to solar disturbances, that affect all of us."

EarthCube is designed to foster collaborations across the geosciences. The technology helps scientists in different disciplines better understand the far-reaching influences of natural events, such as how major storms like Sandy (above) affect coastal and inland flooding. This unique view of Sandy was generated with NCAR's VAPOR visualization software, based on detailed computer modeling. (©UCAR. Visualization by Alan Norton, NCAR, based on research by NCAR scientists Mel Shapiro and Thomas Galarneau. This image is freely available for media & nonprofit use.)

UCAR will administer the day-to-day operations of EarthCube under the three-year, $2.8 million agreement with NSF. The EarthCube science support office, currently funded through an NSF grant to the Arizona Geological Survey in Tucson, Arizona, will move to UCAR's Boulder offices starting this month.

EarthCube is designed to help researchers across the geosciences address the challenges of understanding and predicting the complexity of the Earth system, from the geology and topography to the water cycle, atmosphere, and space environment of the planet. This approach is critical for improved understanding of the environment and better safeguarding society. In order to better predict the potential effects of a landfalling hurricane on inland mudslides, for example, scientists from multiple disciplines, including meteorology, hydrology, geography, and geology, need a common platform to work together to collect observations, ingest them into advanced computer models of the Earth system, and analyze and interpret the resulting data.

"The EarthCube Science Support Office will help us find and share the data geoscientists collect and use to answer critical science questions about the Earth," said Eva Zanzerkia, program director in NSF's Division of Earth Sciences.

Ramamurthy said UCAR is well positioned to help EarthCube meet its goals, since UCAR provides technological support to the geosciences community, including its 109 member universities.
UCAR has been involved with EarthCube since NSF launched the initiative.

"Currently researchers are spending an enormous amount of time on routine tasks because there is no data system, database, or data infrastructure where they can get all the information they need in some kind of a uniform way from a single interface," Ramamurthy said. "If EarthCube can facilitate the integration of data from multiple domains in a way that is easier and faster, and if there is interoperability in terms of standards for data to be input into a common environment, then integration becomes more easily possible."

UCAR is a nonprofit consortium of more than 100 member colleges and universities focused on research and training in the atmospheric and related Earth system sciences. UCAR's primary activity is managing the National Center for Atmospheric Research (NCAR) on behalf of NSF, NCAR's sponsor. UCAR also oversees a variety of education and scientific support activities under the umbrella of the UCAR Community Programs, which will administer EarthCube.

Wrangling observations into models

April 4, 2016 | If scientists could directly measure the properties of all the water throughout the world's oceans, they wouldn't need help from NCAR scientist Alicia Karspeck. But since large expanses of the oceans are beyond the reach of observing instruments, Karspeck's work is critical for those who want estimates of temperature, salinity, and other properties of water around the globe. Scientists need these estimates to better understand the world's climate system and how it is changing.

"It's painstaking work, but my hope is it will lead to major advances in climate modeling and long-term prediction," Karspeck said.

She is one of a dozen or so researchers at NCAR who spend their days on data assimilation, a field that is becoming increasingly important for the geosciences and other areas of research. Broadly speaking, data assimilation is any method of enabling computer models to utilize relevant observations. Part science and part art, it involves figuring out how to get available measurements — which may be sparse, tightly clustered, or irregularly scattered — into models that tend to simplify the world by breaking it into gridded boxes.

Commonly used in weather forecasting, the technique can improve simulations and help scientists predict future events with more confidence. It can also identify deficiencies in both models and observations. As models have become more powerful and observations more numerous, the technique has become so critical that NCAR last year launched a Data Assimilation Program to better leverage expertise across its seven labs.

"Activities in data assimilation have grown well beyond traditional applications in numerical weather prediction for the atmosphere and now span across NCAR's laboratories," said NCAR Director Jim Hurrell. "The Data Assimilation Program is designed to enhance data assimilation research at NCAR, while at the same time serving the broader U.S. research community."

Scientists are using data assimilation techniques to input a range of North American observations into experimental, high-resolution U.S. forecasts. These real-time ensemble forecasts are publicly available while they're being tested. (©UCAR. This image is freely available for media & nonprofit use.)

Improving prediction

Created by the NCAR Directorate, the Data Assimilation Program is designed to advance prediction of events ranging from severe weather and floods to air pollution outbreaks and peaks in the solar cycle. One of its goals is to encourage collaborations among data assimilation experts at NCAR and the larger research community. For example, scientists in several labs are joining forces to apply data assimilation methods to satellite measurements to create a database of global winds and other atmospheric properties. This database will then be used for a broad range of climate and weather studies.

The program also provides funding to hire postdocs at NCAR to focus on data assimilation projects, as well as for a software engineer to support such activities.

"By bringing money to the table, we're building up data assimilation capability across NCAR," said NCAR Senior Scientist Chris Snyder, who coordinates the Data Assimilation Program.
"This is critical because data assimilation provides a framework to scientists throughout the atmospheric and related sciences who need to assess where the uncertainties are and how a given observation can help."

NCAR Senior Scientist Jeff Anderson, who oversees the Data Assimilation Research Testbed (DART), says that data assimilation has become central to the geosciences. DART is a software environment that helps researchers develop data assimilation methods and use observations with various computer models.

"I think the Data Assimilation Program is a huge win for NCAR and the entire atmospheric sciences community," Anderson said. "The scientific method is about taking observations of the world and making sense of them, and data assimilation is fundamental for applying the scientific method to the geosciences as well as to other research areas."

From oceans to Sun

Here are examples of how data assimilation is advancing our understanding of atmospheric and related processes, from ocean depths to the Sun's interior:

Oceans. Karspeck is using data assimilation to estimate water properties and currents throughout the world's oceans. This is a computationally demanding task that requires feeding observations into the NCAR-based Community Earth System Model, simulating several days of ocean conditions on the Yellowstone supercomputer, and using those results to update the conditions in the model and run another simulation. The good news: the resulting simulations match well with historical records, indicating that the data assimilation approach is working. "My goal is to turn this into a viable system for researchers," Karspeck said.

Air quality. Atmospheric chemists at NCAR are using data assimilation of satellite observations to improve air quality models that currently draw on limited surface observations of pollutants. For example, assimilating satellite observations would show the effect of emissions from a wildfire in Montana on downwind air quality, such as in Chicago. "We've done a lot of work to speed up the processing time, and the results are promising," said NCAR scientist Helen Worden. "The model simulations after assimilating satellite carbon monoxide data are much closer to actual air quality conditions."

Weather forecasting. Data assimilation is helping scientists diagnose problems with weather models. For example, why do models consistently overpredict or underpredict temperatures near the surface? Using data assimilation, NCAR scientist Josh Hacker discovered that models incorrectly simulate the transfer of heat from the ground into the atmosphere. "With data assimilation, you're repeatedly confronting the model with observations, so you can very quickly see how things go wrong," he said.

Solar cycle. Scientists believe the 11-year solar cycle is driven by mysterious processes deep below the Sun's surface, such as the movements of cells of plasma between the Sun's lower latitudes and poles. To understand the causes of the cycle and ultimately predict it, they are turning to data assimilation to augment observations of magnetic fields and plasma flow at the Sun's surface and feed the resulting information into a computer model of subsurface processes. "We are matching surface conditions to the model, such as the pattern and speed of the plasma flows and evolving magnetic fields," said NCAR scientist Mausumi Dikpati.

Capturing data. In addition to helping scientists improve models, the new Data Assimilation Program is also fostering discussions about observations.
NCAR senior scientist Wen-Chau Lee and colleagues who are experts in gathering observations are conferring with computer modelers over how to process the data so the models can readily ingest it. One challenge, for example, is that radars may take observations every 150 meters, whereas the models often have a resolution of 1-3 kilometers. Inputting the radar observations into the models requires advanced quality control techniques, including coordinate transformation (modifying coordinates from observations to the models) and data thinning (reducing the density of observations while retaining the basic information).

"We are modifying our quality control procedures to make sure that the flow of data is smooth," Lee said. "With data assimilation, the first word is 'data,'" he added. "Without data, without observations, there is no assimilation."

Writer/contact: David Hosansky, Manager of Media Relations
Funders: NCAR Directorate, National Science Foundation, and additional funding agencies for specific projects
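To make the "confronting the model with observations" loop concrete, here is a minimal stochastic ensemble Kalman filter update for a single observed variable, the textbook flavor of the technique. It is a sketch for intuition, not DART's algorithms.

    # A minimal stochastic ensemble Kalman filter update for one observed
    # scalar: a textbook sketch of the technique, not DART's algorithms.
    import numpy as np

    rng = np.random.default_rng(42)

    # Forecast ensemble: 20 model states of a 3-variable system (say T, u, v).
    ens = rng.normal(loc=[280.0, 5.0, -2.0], scale=[2.0, 1.0, 1.0], size=(20, 3))

    obs, obs_var = 283.0, 1.0      # one temperature observation and its error
    H = np.array([1.0, 0.0, 0.0])  # observation operator: picks out variable 0

    hx = ens @ H  # what each ensemble member predicts for the observation
    cov_xy = (ens - ens.mean(0)).T @ (hx - hx.mean()) / (len(ens) - 1)
    gain = cov_xy / (hx.var(ddof=1) + obs_var)  # Kalman gain, shape (3,)

    # Nudge every member toward its own perturbed copy of the observation;
    # unobserved variables shift too, via their sampled covariance with T.
    perturbed_obs = obs + rng.normal(0.0, np.sqrt(obs_var), size=len(ens))
    ens += np.outer(perturbed_obs - hx, gain)

    print("analysis mean:", ens.mean(0))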

Earth Science Week 2015: NCAR visualizes Earth, air, fire & water

October 12, 2015 | We're excited it's Earth Science Week, and even more excited about this year's theme — visualizing Earth systems — because it happens to be one of the things NCAR does best. NCAR visualizations cover the spectrum, from Earth to air to fire to water.

Clockwise from top left: EARTH (ground movement for an earthquake in California), AIR (wind trajectories during a marine cyclone), FIRE (behavior of a Colorado wildfire), and WATER (sea surface temperature anomalies during El Niño and La Niña).

Scientists across NCAR and at collaborating universities create visualizations to help make sense of their research, often with the help of the Computational and Information Systems Lab (CISL). CISL houses the VisLab (the Scientific Visualization Services Group), VAPOR (the Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers group), and NCL (the NCAR Command Language group). These teams of software engineers and other professionals are resources for scientists who want to make their research come alive.

Earth Science Week was launched by the American Geosciences Institute in 1998. #EarthSciWeek 2015 runs from Oct. 11 through Oct. 18.

Writer/contact: Laura Snider, Senior Science Writer and Public Information Officer

Watch 2015 and 1997 El Niños build, side by side

September 3, 2015 | The El Niño brewing in the tropical Pacific is on track to become one of the strongest such events in recorded history and may even warm its way past the historic 1997-98 El Niño. While it's too early to say if the current El Niño will live up to the hype, this new NCAR visualization comparing sea surface temperatures in the tropical Pacific in 1997 to those in 2015 gives a revealing glimpse into the similarities, and differences, between the two events. Sea surface temperatures are key to gauging the strength of an El Niño, which is marked by warmer-than-average waters.

Even if this year's El Niño goes on to take the title for strongest recorded event, there's no guarantee that the impacts on weather around the world will be the same as they were in 1997-98. Like snowflakes, each El Niño is unique. Still, experts are pondering whether a strong El Niño might ease California's unrelenting drought, cause heatwaves in Australia, cut coffee production in Uganda, and impact the food supply for Peruvian vicuñas.

This video animation was created by Matt Rehme at NCAR's Visualization Lab, part of the Computational & Information Systems Lab. It uses the latest data from the National Oceanic and Atmospheric Administration. Rehme had previously created a similar visualization of the 1997-98 El Niño. When comparisons between this year's El Niño and that event began flying around, he decided to make a second animation and compare the two.

"I was a little shocked just how closely 2015 resembles 1997 visually," Rehme said.

More on El Niño
El Niño, La Niña & ENSO FAQ
Here comes El Niño — but what exactly is it?
El Niño or La Nada? The great forecast challenge of 2014
¡Hola, La Nada! What happens when El Niño and La Niña take a break?

Writer/contact: Laura Snider
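For the curious, the calculation behind visualizations like this one can be sketched briefly: subtract a monthly climatology from gridded sea surface temperatures, then average the anomalies over the Niño 3.4 box, a standard gauge of El Niño strength. The file path and variable names below are placeholders for any gridded monthly SST dataset.

    # Anomalies relative to a monthly climatology, averaged over the Nino 3.4
    # box (5S-5N, 170W-120W). "monthly_sst.nc" and the dimension names are
    # placeholders for any gridded monthly SST dataset.
    import xarray as xr

    ds = xr.open_dataset("monthly_sst.nc")  # placeholder: dims (time, lat, lon)
    sst = ds["sst"]

    # One mean per calendar month over a 30-year base period.
    clim = sst.sel(time=slice("1981", "2010")).groupby("time.month").mean("time")
    anom = sst.groupby("time.month") - clim

    # Assumes south-to-north latitudes and 0-360 longitudes.
    nino34 = anom.sel(lat=slice(-5, 5), lon=slice(190, 240)).mean(["lat", "lon"])
    print(nino34.sel(time="1997-12"))  # the 1997-98 event peaked near +2.4 C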

NCAR "STEPs" up rain, flood research

August 26, 2015 | While many people take advantage of the sunshine this time of year, NCAR scientist Rita Roberts seeks out storms.

Roberts is leading an experiment this summer along the Front Range to improve short-term forecasts of heavy rainfall and flash floods, particularly over complex terrain. The tests, which also took place last summer, are part of NCAR's Short Term Explicit Prediction (STEP) program. The project is pioneering in that it runs several meteorological and hydrological models at the same time, combined with advanced data analysis. Funding comes from the National Science Foundation.

Radar images of precipitation (top), with computer model outputs (below) of rainfall accumulation and short-term rainfall predictions for specific areas. STEP uses meteorological and hydrological models, combined with advanced data analysis, to improve short-term forecasts of rain and flash floods. (©UCAR. This image is freely available for media & nonprofit use.)

"The system captures in real time where storms are forming and where they are dissipating," said Roberts, who has analyzed high-impact weather events for NCAR since 1982. "It's about improving predictions of heavy rainfall and flash flooding."

STEP's rainfall forecast will be tested during the monsoon season next spring in Taiwan, which has a mix of plains and rugged mountains similar to the Front Range. Individual components of STEP are being tested elsewhere, while the complete system is drawing interest from weather forecast offices in other countries.

The forecast models currently used by weather forecasters don't always provide accurate rainfall rates, and the models have difficulty pinpointing the exact location where heavy rainfall will occur. Studies show atmospheric conditions can change rapidly, resulting in large shifts of weather.

STEP's goal is to provide accurate rainfall and streamflow forecasts up to a day out, with particular emphasis on nowcasting exactly where heavy rainfall will be in the next few hours using information that is updated continuously. Such short-term forecasts are critical to providing warnings to communities so they can reduce fatalities, injuries, and economic damage from rainstorms, floods, and other extreme weather events. Roberts said she believes STEP also could be used to provide motorists with real-time alerts about areas to avoid because of rain and potential flooding.

Said Jenny Sun, chair of the STEP program: "We've talked with weather forecasters who tell us their biggest challenge is to forecast heavy rainfall — and the biggest impact to the community is flooding."

The STEP test along the Front Range combines:
- Data from 17 radar stations and other observational equipment
- High-resolution rainfall forecasts from the NCAR-based Weather Research and Forecasting (WRF) system
- Auto-nowcaster, a software program that projects the evolution of storms and rainfall over the next 10 minutes to 1 hour
- WRF-Hydro, which generates streamflow predictions in a 0- to 12-hour timeframe

Citizen participation a key

STEP produces digital maps with a resolution of one square kilometer that show how much rain has fallen in the past two hours and how much is expected in the next hour.
More complex maps show how atmospheric conditions and stream levels are changing.

Rainfall forecasts are evaluated in part through an extensive network of rain gauges run by the Community Collaborative Rain, Hail and Snow Network (CoCoRaHS), a nonprofit network of citizen volunteers.

Leaders of various aspects of STEP include NCAR scientists Barbara Brown, Dave Gochis, and Jim Wilson.

NCAR scientists Jenny Sun and Rita Roberts examine radar images of heavy rainfall along the Front Range. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

The auto-nowcaster component has been tested in Texas; Florida; and Washington, D.C. WRF-Hydro, a hydrological modeling extension package that can operate independently or coupled with the WRF atmospheric model, is being integrated into the National Weather Service's new National Water Center and is expected to start running in real time next May. Sun said NCAR scientists also are collaborating with a Japanese electric power research institute, the Beijing Meteorological Bureau, and Panasonic's weather solutions unit, which has offices in Colorado and North Carolina.

The project wouldn't be possible without advances in the ability to observe how three-dimensional phenomena in the atmosphere evolve over time. NCAR's powerful Yellowstone supercomputer in Wyoming crunches the data. Since scientists can't measure all atmospheric conditions at any given moment, the STEP program takes uncertainties into account by using NCAR's ensemble modeling approach, led by Morris Weisman and Glen Romine. This way a range of equally likely conditions can be simulated.

Roberts said the STEP team is planning to conduct another real-time test along the Front Range next summer. "Our goal," she said, "is to keep improving the capability and accuracy of the system."

Writer/contact: Jeff Smith
Funder: National Science Foundation
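The extrapolation idea at the core of nowcasting systems like the auto-nowcaster can be illustrated in a few lines: estimate a storm motion vector and advect the current radar field forward, assuming unchanged intensity. The sketch below is a toy; real systems also handle storm initiation, growth, and decay.

    # A toy Lagrangian-persistence nowcast: advect the current radar field
    # along an estimated storm motion vector, assuming unchanged intensity.
    # Real systems also model storm initiation, growth, and decay.
    import numpy as np

    # Stand-in radar field: 1-km grid, rain rate in mm/h, one storm cell.
    rain = np.zeros((200, 200))
    rain[80:100, 40:60] = 20.0

    # Estimated motion in grid cells per 10-minute step; operational systems
    # derive this by cross-correlating successive radar scans.
    motion = (2, 5)

    def nowcast(field, motion, steps):
        """Shift the field by motion * steps (wraps at edges in this toy)."""
        dy, dx = motion[0] * steps, motion[1] * steps
        return np.roll(np.roll(field, dy, axis=0), dx, axis=1)

    in_one_hour = nowcast(rain, motion, steps=6)  # 6 x 10 min = 60 minutes
    row, col = np.unravel_index(in_one_hour.argmax(), in_one_hour.shape)
    print("storm cell projected near row", row, "col", col)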
