Food security report wins USDA award

BOULDER, Colo. — A comprehensive report warning of the impacts of climate change on the world's food security has won a top U.S. Department of Agriculture (USDA) award.

"Climate Change, Global Food Security, and the U.S. Food System," with co-authors from the National Center for Atmospheric Research (NCAR), provides an overview of recent research in climate change and agriculture. It warns that warmer temperatures and altered precipitation patterns can threaten food production, disrupt transportation systems, and degrade food safety, among other impacts, and that the world's poor and those living in tropical regions are particularly vulnerable.

Michael Scuse, USDA acting deputy secretary (center), with members of the team of experts who produced the award-winning report, "Climate Change, Global Food Security, and the U.S. Food System." Those pictured are (back row from left): William Easterling (The Pennsylvania State University), Edward Carr (Clark University), and Peter Backlund (Colorado State University); front row from left: Rachel Melnick (USDA), Margaret Walsh (USDA), Scuse, Moffat Ngugi (U.S. Agency for International Development/USDA), and Karen Griggs (NCAR). (Photo by USDA.)

The USDA this month named it the winner of the 2016 Abraham Lincoln Honor Award for Increasing Global Food Security. The Abraham Lincoln Honor Award is the most prestigious USDA award presented by the Secretary of Agriculture, recognizing noteworthy accomplishments that significantly contribute to the advancement of the USDA's strategic goals, mission objectives, and overall management excellence.

The report was produced as part of a collaboration between NCAR, the USDA, and the University Corporation for Atmospheric Research (UCAR), which manages NCAR on behalf of the National Science Foundation. It was written by 32 experts from 19 federal, academic, nongovernmental, intergovernmental, and private organizations in the United States, Argentina, Britain, and Thailand.
The authors included three NCAR scientists, as well as eight experts affiliated with UCAR member universities.

"This award highlights the importance of addressing climate change in order to maintain the progress the world has made on food security in recent decades," said NCAR program director Lawrence Buja, who helped oversee production of the report. "Scientists will continue to study this critical issue and work with decision makers to co-develop the information they need about potential climate impacts on future production, distribution, and other aspects of our U.S. and global food systems."

Published under the auspices of the U.S. Global Change Research Program, the report focuses on identifying climate change impacts on global food security through 2100. The authors emphasize that food security — the ability of people to obtain and use sufficient amounts of safe and nutritious food — will be affected by several factors in addition to climate change, such as technological advances, increases in population, the distribution of wealth, and changes in eating habits.

"Climate change has a myriad of potential impacts, especially on food, water, and energy systems," said UCAR President Antonio J. Busalacchi. "I commend the authors of this report for clearly analyzing this very complex issue in the agriculture sector, which has implications for all of society, from the least developed nations to the most advanced economies."

Report authors

Molly Brown, University of Maryland*
John Antle, Oregon State University*
Peter Backlund, Colorado State University*
Edward Carr, Clark University
Bill Easterling, Pennsylvania State University*
Margaret Walsh, USDA Office of the Chief Economist/Climate Change Program Office
Caspar Ammann, NCAR
Witsanu Attavanich, Kasetsart University
Chris Barrett, Cornell University*
Marc Bellemare, University of Minnesota*
Violet Dancheck, U.S. Agency for International Development
Chris Funk, U.S. Geological Survey
Kathryn Grace, University of Utah*
John Ingram, University of Oxford
Hui Jiang, USDA Foreign Agricultural Service
Hector Maletta, Universidad de Buenos Aires
Tawny Mata, USDA/American Association for the Advancement of Science
Anthony Murray, USDA-Economic Research Service
Moffatt Ngugi, U.S. Agency for International Development/USDA Foreign Agricultural Service
Dennis Ojima, Colorado State University*
Brian O'Neill, NCAR
Claudia Tebaldi, NCAR

*UCAR member university

Report project team

Lawrence Buja, NCAR
Karen Griggs, NCAR

UCAR congressional briefing highlights flood, drought prediction

WASHINGTON — The nation is poised to make major advances in "water intelligence" with more detailed forecasts of floods, streamflow, and potential drought conditions, a panel of experts said at a congressional briefing today.

The briefing, sponsored by the University Corporation for Atmospheric Research (UCAR), highlighted the new National Water Model, a comprehensive system for forecasting water resources from coast to coast. The technology underpinning the model, launched last month by the National Oceanic and Atmospheric Administration (NOAA), was developed by the National Center for Atmospheric Research (NCAR) and its collaborators at universities, the National Science Foundation and other federal agencies, and the private sector.

"The new forecast model is really a quantum leap forward and will help safeguard Americans from major floods and other precipitation events," said UCAR President Antonio J. Busalacchi, who introduced the panel. "It bridges the gap between research and operations, generating real-time forecasts to help vulnerable communities and protect lives and property."

UCAR manages NCAR on behalf of the National Science Foundation.

"Through a series of partnerships, it's possible to provide consistent, high-resolution, integrated water analyses, predictions, and data to address critical unmet information and service gaps," said Edward Clark, director of the Geo-Intelligence Office of Water Prediction at the NOAA National Water Center.

Scientists generated this inundation forecast during Houston-area flooding earlier this year in a demonstration of advanced computer modeling technology. (©UCAR. Image by David Gochis, NCAR.
This image is freely available for media & nonprofit use.)

Unlike past streamflow models, which provided forecasts every few hours and only for specific points along major river systems, the new system continuously simulates conditions at 2.7 million locations along rivers, streams, and their tributaries across the contiguous United States. It paves the way for the biggest improvement in flood forecasting in the nation's history.

"The National Water Model provides a different way of thinking about continental hydrology by providing a view of a connected plumbing network from the mountains to the ocean," said panelist Richard Hooper, executive director of the Consortium of Universities for the Advancement of Hydrologic Science (CUAHSI). "Previously, hydrologists had considered river basins as discrete units rather than this river-continuum approach. This change in view opens up new areas of research that will improve our ability to predict not just floods but other aspects of water resources, including water quality and the impacts of droughts."

Thanks to ongoing research, the National Water Model is expected to provide increasingly detailed street-level forecasts, inundation maps, and additional features such as water quality forecasts. Scientists are working on incorporating more processes, such as soil saturation and the amount of water drawn up by vegetation.

"By dramatically increasing the geographic coverage as well as the lead times for forecasts, the National Water Model is ushering in a new era in flood and flash flood forecasting," said John McHenry, chief scientist of advanced meteorological systems for Baron Services. "Business, industry, and the general public will benefit through reduction in lost lives and property."

The panelists emphasized the importance of water resources to the major sectors of the U.S. economy. They warned that the nation is facing myriad water-related challenges, ranging from growing demand to increasingly costly floods and droughts.
Meeting those challenges will require continued coordination among research organizations, universities, the private sector, and federal, state, and local agencies.

"Beyond developing a new computer model, we're building a community by sharing resources, tools, and ideas," said NCAR scientist David Gochis. "The scientists are engaging with practitioners and decision makers to make the system as usable as possible."

The development team at NCAR worked with scientists at NOAA, the U.S. Geological Survey, and universities to adapt WRF-Hydro to serve as the first version of the National Water Model.

The panelists also discussed the need for better water intelligence among diverse communities across the country. For example, Ryan Emanuel, associate professor in North Carolina State University's Department of Forestry and Environmental Resources, noted that indigenous tribes across the nation are particularly vulnerable to drought and flooding for a range of cultural, historical, and economic reasons.

"Indigenous peoples across the United States are diverse, but one common theme is that water is sacred," said Emanuel, a member of the Lumbee Tribe of North Carolina. "It's not only critical for life, but it is life itself. Beyond the tools, the models, and the management lies the knowledge of the original inhabitants of this nation that water binds us all to a common fate."

The event is the latest in a series of UCAR congressional briefings about critical topics in the Earth system sciences. Past briefings have focused on predicting space weather, aviation weather safety, the state of the Arctic, hurricane prediction, and potential impacts of El Niño.
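Hooper's "connected plumbing network" framing, in which the continent's waterways are treated as one linked system rather than as discrete basins, can be pictured with a toy sketch. The reach names and runoff values below are invented for illustration; this is not the National Water Model's actual routing scheme, which covers millions of reaches with full hydrologic physics.

```python
# Toy river network: each reach drains into exactly one downstream reach
# (None marks the outlet). Names and runoff values are hypothetical.
downstream = {"brook_a": "creek", "brook_b": "creek", "creek": "river", "river": None}
local_runoff = {"brook_a": 1.2, "brook_b": 0.8, "creek": 0.5, "river": 2.0}  # m^3/s

def accumulate_flow(network, runoff):
    """Propagate local runoff downstream so each reach's flow includes
    everything draining into it from upstream."""
    flow = dict(runoff)
    remaining = dict(network)
    while remaining:
        for reach, dest in list(remaining.items()):
            # A reach can be finalized once no unprocessed reach drains into it.
            if reach not in set(remaining.values()):
                if dest is not None:
                    flow[dest] += flow[reach]
                del remaining[reach]
    return flow

flows = accumulate_flow(downstream, local_runoff)
print(flows["river"])  # outlet flow: 1.2 + 0.8 + 0.5 + 2.0, about 4.5 m^3/s
```

In the basin-by-basin view, the two brooks and the creek would be forecast independently; in the continuum view, a change in runoff on either brook immediately shows up in the river's outlet flow.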

NSF/NCAR research plane assisting with U.S. hurricane forecasts

BOULDER, Colo. — As the peak of hurricane season approaches, U.S. forecasters are deploying a high-altitude research aircraft operated by the National Center for Atmospheric Research (NCAR) to fly over and around storms to take critical observations.

The NSF/NCAR Gulfstream-V readies for takeoff on a mission to study tropical storms. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

The deployment this week of the Gulfstream-V (G-V) aircraft is the result of a partnership between the National Science Foundation (NSF), which owns the plane, and the National Oceanic and Atmospheric Administration (NOAA), which issues forecasts. The NSF/NCAR G-V will take to the skies to support hurricane forecasts through October 12, while NOAA's Gulfstream-IV (G-IV) undergoes unscheduled maintenance.

"It's critical to have detailed measurements of the atmosphere around a hurricane in order to ensure that forecasts are as accurate as possible," said Antonio (Tony) J. Busalacchi, president of the University Corporation for Atmospheric Research, which manages NCAR on behalf of NSF. "NCAR and its research partners have a proven track record of improving predictions of dangerous storms. Consistent with our role of managing NCAR, we take very seriously our ability and responsibility to share our advanced resources in support of NOAA's mission to protect life and property."

"NSF is pleased that NCAR, using the G-V, is able to assist in this potentially lifesaving activity," said Roger Wakimoto, assistant director of the NSF Directorate for Geosciences. "The data gathered will help refine future hurricane forecasts."

Outfitted for critical observations

The NSF/NCAR G-V can fly at high altitudes and deploy the same specialized sensors as the NOAA G-IV. These sensors take critical observations of atmospheric conditions for the NOAA National Hurricane Center. Studies show that such observations improve hurricane track forecasts in the U.S.
global weather model (called the GFS) by about 15 percent during the 24 to 48 hours before landfall. Research also demonstrates that these data increase the accuracy of hurricane intensity forecasts.

To take the observations, the NSF/NCAR G-V has been outfitted with the Airborne Vertical Atmospheric Profiling System (AVAPS). The system releases parachute-borne sensors, known as GPS dropsondes, that measure ambient temperature, pressure, humidity, wind speed, and wind direction at different altitudes as they fall through the atmosphere. Dropsondes were first developed at NCAR in the 1970s with NSF funding and have since been regularly updated. NOAA was an early adopter of the dropsondes for hurricane surveillance missions and research, and the development of the AVAPS system design in the 1990s was motivated in part by the capabilities of the NOAA G-IV.

The NSF/NCAR G-V, which is available for flights over both the Atlantic and Pacific, will fly above a hurricane or other major storm at altitudes of up to 45,000 feet, as well as around the storm's edges. Its dropsonde launch system and software are similar to those of the NOAA G-IV.

NCAR pilots will guide the aircraft on pre-planned flight tracks, dropping sondes approximately every 15 minutes. Data from the sondes will be processed by a NOAA technician onboard the plane, then sent to the Global Telecommunications System for immediate inclusion in hurricane forecast models.

"It is a special privilege for us to be able to help out our colleagues at NOAA by deploying the NSF/NCAR G-V in the hurricane surveillance missions this season," said Vanda Grubišić, director of NCAR's Earth Observing Laboratory, which operates the G-V. "Our Research Aviation Facility crews look forward to working with their NOAA colleagues and collecting important data in support of their mission."
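As a small illustration of the kind of data a dropsonde returns, the sketch below converts reported wind speed and direction into eastward (u) and northward (v) components, the form forecast models assimilate. The sample profile values are invented; real AVAPS records contain many more fields plus quality-control flags.

```python
import math

def wind_components(speed_ms, direction_deg):
    """Convert meteorological wind (direction = where the wind blows FROM,
    in degrees clockwise from north) into u (eastward) and v (northward)."""
    rad = math.radians(direction_deg)
    return -speed_ms * math.sin(rad), -speed_ms * math.cos(rad)

# Hypothetical dropsonde levels: (pressure hPa, temperature C, wind m/s, direction deg)
profile = [
    (200.0, -55.2, 48.0, 270.0),
    (500.0, -8.6, 31.5, 250.0),
    (850.0, 17.3, 22.0, 230.0),
]
for pressure, temp_c, speed, direction in profile:
    u, v = wind_components(speed, direction)
    print(f"{pressure:6.1f} hPa  T={temp_c:6.1f} C  u={u:6.1f}  v={v:6.1f} m/s")
```

For example, a wind reported as blowing from 270 degrees (due west) becomes a purely eastward u component, which is the convention the sign flips in the formula encode.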

Solar energy gets boost from new forecasting system

BOULDER, Colo. — A cutting-edge forecasting system developed by a national team of scientists offers the potential to save the solar energy industry hundreds of millions of dollars through improved forecasts of the atmosphere.

The new system, known as Sun4Cast™, was developed over three years by the National Center for Atmospheric Research (NCAR) in collaboration with government labs, universities, utilities, and commercial firms across the country. Funded by the U.S. Department of Energy SunShot Initiative, the system greatly improves predictions of clouds and other atmospheric conditions that influence the amount of energy generated by solar arrays.

After testing Sun4Cast at multiple sites, the research team has determined that it can be up to 50 percent more accurate than current solar power forecasts. This improved accuracy will enable utilities to deploy solar energy more reliably and inexpensively, reducing the need to purchase energy on the spot market.

The amount of energy gathered by solar panels — such as these in Colorado's San Luis Valley — is influenced by factors including the position and types of clouds, the amount of snow on the ground, and relative humidity. The new Sun4Cast system greatly improves solar irradiance predictions, enabling utilities to deploy solar energy more reliably and inexpensively. (©UCAR. Photo by Sue Ellen Haupt, NCAR. This image is freely available for media & nonprofit use.)

As a result, utilities across the United States may be able to save an estimated $455 million through 2040 as they use more solar energy, according to an analysis by NCAR economist Jeffrey Lazo.

NCAR, which does not provide operational forecasts, makes the technology available so it can be adapted by utilities or private forecasting companies.
The research is being highlighted in more than 20 peer-reviewed papers.

"These results can help enable the nation's expanding use of solar energy," said Sue Ellen Haupt, director of NCAR's Weather Systems and Assessment Program, who led the research team. "More accurate predictions are vital for making solar energy more reliable and cost effective."

The work builds on NCAR's expertise in highly detailed atmospheric prediction, including the design of an advanced wind energy forecasting system.

"This type of research and development is important because it contributes to the reduction in costs for solar and wind energy and makes it easier for utilities to integrate renewables into the electrical grid," said William Mahoney, deputy director of NCAR's Research Applications Laboratory. "When it comes to balancing demand for power with supply, it's vital to be able to predict sources of energy as accurately as possible."

Xcel Energy is already beginning to use the system to forecast conditions at several of its main solar facilities.

"Our previous experience with the National Center for Atmospheric Research in developing a wind forecasting system has saved millions of dollars and has been highly beneficial for our customers," said Drake Bartlett, senior trading analyst for Xcel Energy – Colorado. "It is our sincere hope and belief that we will see positive atmospheric forecasting results for predicting solar generation as well, again to the benefit of our Xcel Energy customers."

Energy forecasts out to 72 hours

Using a combination of advanced computer models, atmospheric observations, and artificial intelligence techniques, Sun4Cast provides 0- to 6-hour nowcasts of solar irradiance and the resulting power production for specific solar facilities at 15-minute intervals.
This enables utilities to continuously anticipate the amount of available solar energy.

In addition, forecasts extend out to 72 hours, allowing utility officials to make decisions in advance for balancing solar with other sources of energy.

Solar irradiance is notoriously difficult to predict. It is affected not just by the locations and types of clouds, but also by a myriad of other atmospheric conditions, such as the amount of dust and other particles in the air, relative humidity, and air pollution. Further complicating the forecast, freshly fallen snow, nearby steep mountainsides, or even passing cumulus clouds can reflect sunlight in a way that increases the amount of energy produced by solar panels.

To design a system to forecast solar energy output, NCAR and its partners drew on an array of observing instruments, including satellites, radars, and sky imagers; specialized software; and mathematical and artificial intelligence techniques. Central to Sun4Cast is a new computer model of the atmosphere that simulates solar irradiance based on meteorological conditions.
Called WRF-Solar™, the model is derived from the NCAR-based Weather Research and Forecasting (WRF) model, which is widely used by meteorological agencies worldwide.

The team tested the system in geographically diverse areas, including Long Island, New York; the Colorado mountains; and coastal California.

"We have to provide utilities with confidence that the system maintains a high degree of accuracy year-round in very different types of terrain," said Branko Kosovic, NCAR program manager for Renewable Energy.

In addition to aiding the solar power industry, the work can also improve weather forecasting in general because of improved cloud prediction.

NCAR's numerous partners on the project in the public and private sectors included:

Government labs: National Renewable Energy Laboratory, Brookhaven National Laboratory, the National Oceanic and Atmospheric Administration's Earth System Research Laboratory, and other NOAA facilities;
Universities: The Pennsylvania State University, Colorado State University, University of Hawaii, and University of Washington;
Utilities: Long Island Power Authority, New York Power Authority, Public Service Company of Colorado, Sacramento Municipal Utility District (SMUD), Southern California Edison, and the Hawaiian Electric Company;
Independent system operators: New York ISO, Xcel Energy, SMUD, California ISO, and Hawaiian Electric; and
Commercial forecast providers: Schneider Electric, Atmospheric and Environmental Research, Global Weather Corporation, MDA Information Systems, and Solar Consulting Services.

Computing time was provided by the New York State Department of Economic Development's Division of Science, Technology and Innovation on an IBM Blue Gene supercomputer at Brookhaven National Laboratory. Researchers also performed computing at the NCAR-Wyoming Supercomputing Center and the DOE National Energy Research Scientific Computing Center.

About the SunShot Initiative

The U.S.
Department of Energy SunShot Initiative is a collaborative national effort that aggressively drives innovation to make solar energy fully cost-competitive with traditional energy sources before the end of the decade. Through SunShot, the Energy Department supports efforts by private companies, universities, and national laboratories to drive down the cost of solar electricity to $0.06 per kilowatt-hour.
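One way to picture how a 0- to 6-hour nowcast and a longer-range numerical weather prediction (NWP) forecast can be combined into a single seamless product is a lead-time-weighted blend: observation-driven nowcasts dominate at short lead times, and the NWP forecast takes over as lead time grows. This is a generic sketch of that idea, not Sun4Cast's actual blending algorithm; the crossover time and irradiance values are invented for illustration.

```python
def blend_irradiance(nowcast_wm2, nwp_wm2, lead_hours, crossover_hours=6.0):
    """Linearly shift weight from the observation-driven nowcast (best at
    short lead times) to the NWP forecast as lead time grows. Inputs in W/m^2."""
    w = min(max(lead_hours / crossover_hours, 0.0), 1.0)  # 0 = all nowcast, 1 = all NWP
    return (1.0 - w) * nowcast_wm2 + w * nwp_wm2

# Forecast steps every 15 minutes out to 6 hours, matching the cadence above
leads = [i * 0.25 for i in range(25)]
blended = [blend_irradiance(620.0, 540.0, t) for t in leads]
```

At lead time zero the blend equals the nowcast (620 W/m² here); by the 6-hour crossover it equals the NWP value (540 W/m²), so there is no jump where one forecast source hands off to the other.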

US taps NCAR technology for new water resources forecasts

BOULDER, Colo. — As the National Oceanic and Atmospheric Administration (NOAA) this month launches a comprehensive system for forecasting water resources in the United States, it is turning to technology developed by the National Center for Atmospheric Research (NCAR) and its university and agency collaborators.

WRF-Hydro, a powerful NCAR-based computer model, is the first nationwide operational system to provide continuous predictions of water levels and potential flooding in rivers and streams from coast to coast. NOAA's new Office of Water Prediction selected it last year as the core of the agency's new National Water Model.

"WRF-Hydro gives us a continuous picture of all of the waterways in the contiguous United States," said NCAR scientist David Gochis, who helped lead its development. "By generating detailed forecast guidance that is hours to weeks ahead, it will help officials make more informed decisions about reservoir levels and river navigation, as well as alerting them to dangerous events like flash floods."

WRF-Hydro (WRF stands for Weather Research and Forecasting) is part of a major Office of Water Prediction initiative to bolster U.S. capabilities in predicting and managing water resources. By teaming with NCAR and the research community, NOAA's National Water Center is developing a new national water intelligence capability, enabling better impacts-based forecasts for management and decision making.

The new WRF-Hydro computer model simulates streams and other aspects of the hydrologic system in far more detail than previously possible. (Image by NOAA Office of Water Prediction.)

Unlike past streamflow models, which provided forecasts every few hours and only for specific points along major river systems, WRF-Hydro provides continuous forecasts for millions of points along rivers, streams, and their tributaries across the contiguous United States.
To accomplish this, it simulates the entire hydrologic system — including snowpack, soil moisture, local ponded water, and evapotranspiration — and rapidly generates output on some of the nation's most powerful supercomputers.

WRF-Hydro was developed in collaboration with NOAA and university and agency scientists through the Consortium of Universities for the Advancement of Hydrologic Science, the U.S. Geological Survey, Israel Hydrologic Service, and Baron Advanced Meteorological Services. Funding came from NOAA, NASA, and the National Science Foundation, which is NCAR's sponsor.

"WRF-Hydro is a perfect example of the transition from research to operations," said Antonio (Tony) J. Busalacchi, president of the University Corporation for Atmospheric Research, which manages NCAR on behalf of the National Science Foundation (NSF). "It builds on the NSF investment in basic research in partnership with other agencies, helps to accelerate collaboration with the larger research community, and culminates in support of a mission agency such as NOAA. The use of WRF-Hydro in an operational setting will also allow for feedback from operations to research. In the end this is a win-win situation for all parties involved, chief among them the U.S. taxpayers."

"Through our partnership with NCAR and the academic and federal water community, we are bringing the state of the science in water forecasting and prediction to bear operationally," said Thomas Graziano, director of NOAA's new Office of Water Prediction at the National Weather Service.

Filling in the water picture

The continental United States has a vast network of rivers and streams, from major navigable waterways such as the Mississippi and Columbia to the remote mountain brooks flowing from the high Adirondacks into the Hudson River.
The levels and flow rates of these watercourses have far-reaching implications for water availability, water quality, and public safety.

Until now, however, it has not been possible to predict conditions at all points in the nation's waterways. Instead, computer models have produced a limited picture by incorporating observations from about 4,000 gauges, generally on the country's bigger rivers. Smaller streams and channels are largely left out of these forecast models, and stretches of major rivers tens of miles long are often not predicted — meaning that schools, bridges, and even entire towns can be vulnerable to unexpected changes in river levels.

To fill in the picture, NCAR scientists have worked for the past several years with their colleagues within NOAA, other federal agencies, and universities to combine a range of atmospheric, hydrologic, and soil data into a single forecasting system.

The resulting National Water Model, based on WRF-Hydro, simulates current and future conditions on rivers and streams along points two miles apart across the contiguous United States. Along with an hourly analysis of current hydrologic conditions, the National Water Model generates three predictions: an hourly 0- to 15-hour short-range forecast, a daily 0- to 10-day medium-range forecast, and a daily 0- to 30-day long-range water resource forecast.

The National Water Model predictions using WRF-Hydro offer a wide array of benefits for society. They will help local, state, and federal officials better manage reservoirs, improve navigation along major rivers, plan for droughts, anticipate water quality problems caused by lower flows, and monitor ecosystems for issues such as whether conditions are favorable for fish spawning.
By providing a national view, the model will also help the Federal Emergency Management Agency deploy resources more effectively in cases of simultaneous emergencies, such as a hurricane on the Gulf Coast and flooding in California.

"We've never had such a comprehensive system before," Gochis said. "In some ways, the value of this is a blank page yet to be written."

A broad spectrum of observations

WRF-Hydro is a powerful forecasting system that incorporates advanced meteorological and streamflow observations, including data from nearly 8,000 U.S. Geological Survey streamflow gauges across the country. Using advanced mathematical techniques, the model then simulates current and future conditions for millions of points on every significant river, stream, tributary, and catchment in the United States.

In time, scientists will add additional observations to the model, including snowpack conditions, lake and reservoir levels, subsurface flows, soil moisture, and land-atmosphere interactions such as evapotranspiration, the process by which water in soil, plants, and other land surfaces evaporates into the atmosphere.

Scientists over the last year have demonstrated the accuracy of WRF-Hydro by comparing its simulations to observations of streamflow, snowpack, and other variables. They will continue to assess and expand the system as the National Water Model begins operational forecasts.

NCAR scientists maintain and update the open-source code of WRF-Hydro, which is available to the academic community and others. WRF-Hydro is widely used by researchers, both to better understand water resources and floods in the United States and in other countries such as Norway, Germany, Romania, Turkey, and Israel, and to project the possible impacts of climate change.

"At any point in time, forecasts from the new National Water Model have the potential to impact 300 million people," Gochis said.
"What NOAA and its collaborator community are doing is trying to usher in a new era of bringing in better physics and better data into forecast models for improving situational awareness and hydrologic decision making."

Collaborators

Baron Advanced Meteorological Services
Consortium of Universities for the Advancement of Hydrologic Science
Israel Hydrologic Service
National Center for Atmospheric Research
National Oceanic and Atmospheric Administration
U.S. Geological Survey

Funders

National Science Foundation
National Aeronautics and Space Administration
National Oceanic and Atmospheric Administration
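The National Water Model's output cadence described above (an hourly analysis plus short-, medium-, and long-range forecasts) can be summarized in a small configuration sketch. The key and field names are illustrative only, not NOAA's actual product schema.

```python
# National Water Model forecast cycles, as described in the article.
# Dictionary keys and field names are hypothetical, for illustration.
FORECAST_CYCLES = {
    "analysis":     {"issued_every_h": 1,  "horizon_h": 0},       # current conditions
    "short_range":  {"issued_every_h": 1,  "horizon_h": 15},      # hourly, 0-15 hours
    "medium_range": {"issued_every_h": 24, "horizon_h": 10 * 24}, # daily, 0-10 days
    "long_range":   {"issued_every_h": 24, "horizon_h": 30 * 24}, # daily, 0-30 days
}

def issuances_per_day(cycle):
    """How many times a given product is generated each day."""
    return 24 // FORECAST_CYCLES[cycle]["issued_every_h"]

for name, cfg in FORECAST_CYCLES.items():
    print(f"{name}: {issuances_per_day(name)}x/day, out to {cfg['horizon_h']} h")
```

The trade-off the table encodes is typical of operational forecasting: the shorter the horizon, the more frequently the product is refreshed.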

Climate change already accelerating sea level rise, study finds

BOULDER, Colo. — Greenhouse gases are already having an accelerating effect on sea level rise, but the impact has so far been masked by the cataclysmic 1991 eruption of Mount Pinatubo in the Philippines, according to a new study led by the National Center for Atmospheric Research (NCAR).

Satellite observations, which began in 1993, indicate that the rate of sea level rise has held fairly steady at about 3 millimeters per year. But the expected acceleration due to climate change is likely hidden in the satellite record because of a happenstance of timing: The record began soon after the Pinatubo eruption, which temporarily cooled the planet, causing sea levels to drop.

The new study finds that the lower starting point effectively distorts the calculation of sea level rise acceleration for the last couple of decades. The study lends support to climate model projections, which show the rate of sea level rise escalating over time as the climate warms. The findings were published today in the open-access Nature journal Scientific Reports.

Mount Pinatubo's caldera on June 22, 1991. (Image courtesy USGS.)

"When we used climate model runs designed to remove the effect of the Pinatubo eruption, we saw the rate of sea level rise accelerating in our simulations," said NCAR scientist John Fasullo, who led the study. "Now that the impacts of Pinatubo have faded, this acceleration should become evident in the satellite measurements in the coming decade, barring another major volcanic eruption."

Study co-author Steve Nerem, from the University of Colorado Boulder, added: "This study shows that large volcanic eruptions can significantly impact the satellite record of global average sea level change. So we must be careful to consider these effects when we look for the effects of climate change in the satellite-based sea level record."

The findings have implications for the extent of sea level rise this century and may be useful to coastal communities planning for the future.
In recent years, decision makers have debated whether these communities should make plans based on the steady rate of sea level rise measured in recent decades or based on the accelerated rate expected in the future by climate scientists.

The study was funded by NASA, the U.S. Department of Energy, and the National Science Foundation, which is NCAR's sponsor.

Reconstructing a pre-Pinatubo world

Climate change triggers sea level rise in a couple of ways: by warming the ocean, which causes the water to expand, and by melting glaciers and ice sheets, which drain into the ocean and increase its volume. In recent decades, the pace of warming and melting has accelerated, and scientists have expected to see a corresponding increase in the rate of sea level rise. But analysis of the relatively short satellite record has not borne that out.

To investigate, Fasullo, Nerem, and Benjamin Hamlington of Old Dominion University worked to pin down how quickly sea levels were rising in the decades before the satellite record began. Prior to the launch of the international TOPEX/Poseidon satellite mission in late 1992, sea level was mainly measured using tide gauges. While records from some gauges stretch back to the 18th century, variations in measurement technique and location mean that the pre-satellite record is best used to get a ballpark estimate of global mean sea level.

Mount Pinatubo erupting in 1991. (Image courtesy USGS.)

To complement the historic record, the research team used a dataset produced by running the NCAR-based Community Earth System Model 40 times with slightly different — but historically plausible — starting conditions. The resulting simulations characterize the range of natural variability in the factors that affect sea levels. The model was run on the Yellowstone system at the NCAR-Wyoming Supercomputing Center.

A separate set of model runs that omitted volcanic aerosols — particles spewed into the atmosphere by an eruption — was also assessed.
By comparing the two sets of runs, the scientists were able to pick out a signal (in this case, the impact of Mount Pinatubo's eruption) from the noise (natural variations in ocean temperature and other factors that affect sea level).

"You can't do it with one or two model runs—or even three or four," Fasullo said. "There's just too much accompanying climate noise to understand precisely what the effect of Pinatubo was. We could not have done it without large numbers of runs."

Using models to understand observations

Analyzing the simulations, the research team found that Pinatubo's eruption caused the oceans to cool and sea levels to drop by about 6 millimeters immediately before TOPEX/Poseidon began recording observations.

As the sunlight-blocking aerosols from Mount Pinatubo dissipated in the simulations, sea levels began to slowly rebound to pre-eruption levels. This rebound swamped the acceleration caused by the warming climate and made the rate of sea level rise higher in the mid- to late 1990s than it would otherwise have been.

This higher-than-normal rate of sea level rise in the early part of the satellite record makes it appear that the rate of sea level rise has not accelerated over time and may actually have decreased somewhat. In fact, according to the study, if the Pinatubo eruption had not occurred—leaving sea level at a higher starting point in the early 1990s—the satellite record would have shown a clear acceleration.

"The satellite record is unable to account for everything that happened before the first satellite was launched," Fasullo said.
"This study is a great example of how computer models can give us the historical context that's needed to understand some of what we're seeing in the satellite record."Understanding whether the rate of sea level rise is accelerating or remaining constant is important because it drastically changes what sea levels might look like in 20, 50, or 100 years.“These scientists have disentangled the major role played by the 1991 volcanic eruption of Mt. Pinatubo on trends in global mean sea level,” said Anjuli Bamzai, program director in the National Science Foundation’s Division of Atmospheric and Geospace Sciences, which funded the research.  “This research is vital as society prepares for the potential effects of climate change."Because the study's findings suggest that acceleration due to climate change is already under way, the acceleration should become evident in the satellite record in the coming decade, Fasullo said.Since the original TOPEX/Poseidon mission, other satellites have been launched—Jason-1 in 2001 and Jason-2 in 2008—to continue tracking sea levels. The most recent satellite, Jason-3, launched on Jan. 17 of this year."Sea level rise is potentially one of the most damaging impacts of climate change, so it's critical that we understand how quickly it will rise in the future," Fasullo said. "Measurements from Jason-3 will help us evaluate what we've learned in this study and help us better plan for the future."The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.The graph shows how sea level rises and falls as ocean heat content fluctuates. 
After volcanic eruptions, the Earth cools and, in turn, the heat content in the ocean drops, ultimately lowering sea level.

The solid blue line is the average sea level rise of climate model simulations that include volcanic eruptions. The green line is the average from model simulations with the effect of volcanic eruptions removed, and it shows a smooth acceleration in the rate of sea level rise due to climate change.

The blue line between the start of the satellite record and present day makes a relatively straight line — just as we see from actual satellite observations during that time — indicating that the rate of sea level rise has not accelerated. But in the future, barring another major volcanic eruption, scientists expect sea level to follow the gray dotted line, which is on the same accelerating path as the green line below it. (©UCAR. This graph is freely available for media & nonprofit use.)

About the article

Title: Is the detection of accelerated sea level rise imminent?
Authors: J. T. Fasullo, R. S. Nerem, and B. Hamlington
Journal: Scientific Reports, DOI: 10.1038/srep31245
Funders: NASA, the National Science Foundation, and the U.S. Department of Energy
Collaborators: University of Colorado Boulder (UCAR member), Old Dominion University (UCAR member)
Writer: Laura Snider, Senior Science Writer and Public Information Officer

UCAR maintains A+ long-term credit rating

BOULDER — The A+ long-term bond rating for the University Corporation for Atmospheric Research (UCAR) has been affirmed by the credit rating agency Standard & Poor's (S&P).

The A+ rating reflects UCAR's role as a leading organization supporting atmospheric and earth-system science, and its ability to increase its financial strength, S&P stated in the report last month.

UCAR, a consortium of more than 100 colleges and universities, manages the National Center for Atmospheric Research (NCAR) under sponsorship by the National Science Foundation (NSF). The organization has an annual budget of more than $200 million.

The Anthes Building in Boulder houses UCAR's administrative staff. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

In its report, S&P cited a number of UCAR's strengths: financial flexibility, stable membership, a longstanding relationship with NSF, manageable debt, and solid operating performance.

Melissa Miller, UCAR vice president of finance and administration, said the organization works hard to maintain a high credit rating, which translates into lower costs for its funders. Bonds have been issued over the years to procure and equip facilities.

"UCAR is vigilant in taking the necessary steps to ensure continued sound fiscal management amid a frequently changing financial landscape," Miller said.

Expanding Antarctic sea ice linked to natural variability

BOULDER — The recent trend of increasing Antarctic sea ice extent — seemingly at odds with climate model projections — can largely be explained by a natural climate fluctuation, according to a new study led by the National Center for Atmospheric Research (NCAR).

The study offers evidence that the negative phase of the Interdecadal Pacific Oscillation (IPO), which is characterized by cooler-than-average sea surface temperatures in the tropical eastern Pacific, has created favorable conditions for additional Antarctic sea ice growth since 2000.

The findings, published in the journal Nature Geoscience, may resolve a longstanding mystery: Why is Antarctic sea ice expanding when climate change is causing the world to warm? The study's authors also suggest that sea ice may begin to shrink as the IPO switches to a positive phase.

"The climate we experience during any given decade is some combination of naturally occurring variability and the planet's response to increasing greenhouse gases," said NCAR scientist Gerald Meehl, lead author of the study. "It's never all one or the other, but the combination, that is important to understand."

Study co-authors include Julie Arblaster of NCAR and Monash University in Australia, Cecilia Bitz of the University of Washington, Christine Chung of the Australian Bureau of Meteorology, and NCAR scientist Haiyan Teng. The study was funded by the U.S. Department of Energy and by the National Science Foundation, which sponsors NCAR.

On Sept. 19, 2014, the five-day average of Antarctic sea ice extent exceeded 20 million square kilometers (about 7.7 million square miles) for the first time since 1979, according to the National Snow and Ice Data Center. The red line shows the average maximum extent from 1979-2014. (Image courtesy NASA's Scientific Visualization Studio/Cindy Starr)

Expanding ice

The sea ice surrounding Antarctica has been slowly increasing in area since the satellite record began in 1979.
But the rate of increase rose nearly fivefold between 2000 and 2014, following the IPO transition to a negative phase in 1999.

The new study finds that when the IPO changes phase, from positive to negative or vice versa, it touches off a chain reaction of climate impacts that may ultimately affect sea ice formation at the bottom of the world.

When the IPO transitions to a negative phase, the sea surface temperatures in the tropical eastern Pacific become somewhat cooler than average when measured over a decade or two. These sea surface temperatures, in turn, change tropical precipitation, which drives large-scale changes to the winds that extend all the way down to Antarctica. The ultimate impact is a deepening of a low-pressure system off the coast of Antarctica known as the Amundsen Sea Low. Winds generated on the western flank of this system blow sea ice northward, away from Antarctica, helping to enlarge the extent of sea ice coverage.

“Compared to the Arctic, global warming causes only weak Antarctic sea ice loss, which is why the IPO can have such a striking effect in the Antarctic," said Bitz. "There is no comparable natural variability in the Arctic that competes with global warming.”

Sifting through simulations

To test if these IPO-related impacts were sufficient to cause the growth in sea ice extent observed between 2000 and 2014, the scientists first examined 262 climate simulations created by different modeling groups from around the world.

When all of those simulations are averaged, the natural variability cancels itself out. For example, simulations with a positive IPO offset those with a negative IPO. What remains is the expected impact of human-caused climate change: a decline in Antarctic sea ice extent.

But for this study, the scientists were not interested in the average. Instead, they wanted to find individual members that correctly characterized the natural variability between 2000 and 2014, including the negative phase of the IPO.
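The member-selection step described above can be sketched as a filter over an ensemble: averaging everything cancels natural variability, while keeping only the members whose simulated tropical Pacific trend matches the observed negative-IPO phase retains both the forced response and the observed variability. This is a toy sketch; the trend values and tolerance are invented, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(1)
n_sims = 262

# Hypothetical 2000-2014 eastern tropical Pacific SST trends (deg C per decade)
# for each simulation. The ensemble mean is positive (forced warming), but
# individual members scatter widely because of internal variability.
sst_trends = rng.normal(0.15, 0.20, n_sims)

observed_trend = -0.20   # a negative-IPO phase, as in the observations
tolerance = 0.10

# Keep only the members whose internal variability happens to line up
# with what the real climate system did over 2000-2014.
matching = np.flatnonzero(np.abs(sst_trends - observed_trend) < tolerance)
```

In the study itself, this kind of screening left 10 of the 262 simulations, and all of them reproduced the observed sea ice expansion.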
The team discovered 10 simulations that met the criteria, and all of them showed an increase in Antarctic sea ice extent across all seasons.

"When all the models are taken together, the natural variability is averaged out, leaving only the shrinking sea ice caused by global warming," Arblaster said. "But the model simulations that happen to sync up with the observed natural variability capture the expansion of the sea ice area. And we were able to trace these changes to the equatorial eastern Pacific in our model experiments."

Scientists suspect that in 2014, the IPO began to change from negative to positive. That would indicate an upcoming period of warmer eastern Pacific Ocean surface temperatures on average, though year-to-year temperatures may go up or down, depending on El Niño/La Niña conditions. Accordingly, the trend of increasing Antarctic sea ice extent may also change in response.

"As the IPO transitions to positive, the increase of Antarctic sea ice extent should slow and perhaps start to show signs of retreat when averaged over the next 10 years or so," Meehl said.

About the article

Title: Antarctic sea-ice expansion between 2000 and 2014 driven by tropical Pacific decadal climate variability
Authors: Gerald A. Meehl, Julie M. Arblaster, Cecilia M. Bitz, Christine T. Y. Chung, and Haiyan Teng
Publication: Nature Geoscience, DOI: 10.1038/NGEO2751
Writer: Laura Snider, Senior Science Writer and Public Information Officer

Future summers could regularly be hotter than the hottest on record

BOULDER — In 50 years, summers across most of the globe could regularly be hotter than any summer experienced so far by people alive today, according to a study by scientists at the National Center for Atmospheric Research (NCAR).

If climate change continues on its current trajectory, the probability that any summer between 2061 and 2080 will be warmer than the hottest on record is 80 percent across the world's land areas, excluding Antarctica, which was not studied. If greenhouse gas emissions are reduced, however, that probability drops to 41 percent, according to the study.

"Extremely hot summers always pose a challenge to society," said NCAR scientist Flavio Lehner, lead author of the study. "They can increase the risk for health issues, but can also damage crops and deepen droughts. Such summers are a true test of our adaptability to rising temperatures."

The study, which is available online, is part of an upcoming special issue of the journal Climatic Change that will focus on quantifying the benefits of reducing greenhouse gas emissions. The research was funded by the U.S. National Science Foundation (NSF) and the Swiss National Science Foundation.

If greenhouse gas emissions remain unabated, virtually every summer between 2061 and 2080 could be hotter than any in the historical record. (Image is in the public domain.)

Simulating a range of summers

The research team, which includes NCAR scientists Clara Deser and Benjamin Sanderson, used two existing sets of model simulations to investigate what future summers might look like. Both had been created by running the NCAR-based Community Earth System Model 15 times, with one assuming that greenhouse gas emissions remain unabated and the other assuming that society reduces emissions. The Community Earth System Model is funded by NSF and the U.S. Department of Energy. The simulations were run on the Yellowstone system at the NCAR-Wyoming Supercomputing Center.
By using simulations that were created by running the same model multiple times, with only tiny differences in the initial starting conditions, the scientists could examine the range of summertime temperatures we might expect in the future for the "business-as-usual" and reduced-emissions scenarios.

"This is the first time that the risk of record summer heat and its dependence on the rate of greenhouse gas emissions has been so comprehensively evaluated from a large set of simulations with a single state-of-the-art climate model," Deser said.

The scientists compared the results to summertime temperatures recorded between 1920 and 2014, as well as to 15 sets of simulated summertime temperatures for the same historic period. By simulating past summers — instead of relying solely on observations — the scientists established a large range of temperatures that could have occurred naturally under the same conditions, including greenhouse gas concentrations and volcanic eruptions.

"Instead of just comparing the future to 95 summers from the past, the models give us the opportunity to create more than 1,400 possible past summers," Lehner said. "The result is a more comprehensive and robust look at what should be considered natural variability and what can be attributed to climate change."

Emissions cuts could yield big benefits

The scientists found that between 2061 and 2080, summers in large parts of North and South America, central Europe, Asia, and Africa have a greater than 90 percent chance of being warmer than any summer in the historic record if emissions continue unabated. This means that virtually every summer would be as warm as the hottest to date.
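The exceedance calculation behind these percentages can be sketched directly: pool the roughly 1,400 simulated historical summers to find the record, then count how often simulated future summers beat it. The temperatures below are invented for one hypothetical location; only the ensemble sizes mirror those in the article:

```python
import numpy as np

rng = np.random.default_rng(2)
n_members, hist_years, future_years = 15, 95, 20

# Hypothetical summer-mean temperatures (deg C) for one location
historical = rng.normal(20.0, 0.8, (n_members, hist_years))   # 1920-2014
unabated = rng.normal(25.0, 0.8, (n_members, future_years))   # 2061-2080, business as usual
reduced = rng.normal(23.0, 0.8, (n_members, future_years))    # 2061-2080, with mitigation

record = historical.max()   # hottest of 15 x 95 = 1,425 simulated past summers

# Chance that a future summer is warmer than the hottest summer on record
p_unabated = (unabated > record).mean()
p_reduced = (reduced > record).mean()
```

Averaged over global land areas, the study's corresponding numbers were 80 percent for the unabated scenario and 41 percent with reduced emissions.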
In some regions, the likelihood of summers being warmer than any in the historical record remained less than 50 percent, but in those places — including Alaska, the central U.S., Scandinavia, Siberia, and continental Australia — summer temperatures naturally vary a great deal, making it more difficult to detect the impact of climate change.

Reducing emissions would lower the global probability that future summers will be hotter than any in the past, but the benefits would not be spread uniformly. In some regions, including the U.S. East Coast and large parts of the tropics, the probability would remain above 90 percent even if emissions were reduced.

But it would be a sizable boon for other regions of the world. Parts of Brazil, central Europe, and eastern China would see a reduction of more than 50 percentage points in the chance that future summers would be hotter than the historic range. Since these areas are densely inhabited, a large part of the global population would benefit significantly from climate change mitigation.

“We've thought of climate change as 'global warming,' but what matters is how this overall warming affects conditions that hit people where they live,” said Eric DeWeaver, program director in NSF’s Division of Atmospheric and Geospace Sciences, which funds NCAR. “Extreme temperatures pose risks to people around the globe. These scientists show the power of ensembles of simulations for understanding how these risks depend on the level of greenhouse gas emissions.”

Lehner recently published another study looking at the overlay of population on warming projections. “It's often overlooked that the majority of the world's population lives in regions that will see a comparably fast rise in temperatures," he said.

About the article

Title: Future risk of record-breaking summer temperatures and its mitigation
Authors: Flavio Lehner, Clara Deser, and Benjamin M. Sanderson
Publication: Climatic Change, DOI: 10.1007/s10584-016-1616-2
Writer: Laura Snider, Senior Science Writer

3D-printed weather stations fill gaps in developing world

BOULDER — Scientists have successfully installed the first wave of low-cost weather stations that are designed to provide critically needed information to farmers and other residents in developing countries.

The stations are built largely with 3D-printed parts that can be easily replaced if they wear out in the field. They were created by weather experts at the National Center for Atmospheric Research (NCAR) and its managing entity, the University Corporation for Atmospheric Research (UCAR).

The first five stations, newly installed in Zambia, are beginning to transmit information about temperature, rainfall, winds, and other weather parameters. These measurements and the resulting forecasts can provide weather information for local subsistence farmers deciding when to plant and fertilize crops. They can also alert communities about floods and other potential disasters.

A newly installed weather station at the Salvation Army's College of Biomedical Sciences in Chikankata, Zambia. The sensor on the left (with the funnel) is a specially designed tipping-bucket rain gauge; the vertical, vented cylinder on the vertical arm of the station is a radiation shield containing temperature, humidity, and pressure sensors; and the horizontal cylinder protruding out the back contains a single-board computer. A wind vane (left), solar light sensor (middle), and three-cup anemometer (right) are mounted on the upper arm. The station is powered by a single solar panel and a backup battery. (©UCAR. Photo by Martin Steinson. This image is freely available for media & nonprofit use.)

"It’s a major opportunity to provide weather information that farmers have never had before," said NCAR scientist Paul Kucera, one of the project leaders. "This can literally make the difference when it comes to being able to feed their families."

The scientists will next explore the need for low-cost weather stations in other developing countries. The project is funded by the U.S.
Agency for International Development's Office of Foreign Disaster Assistance and the U.S. National Weather Service.

“The bottom line is that 3D printing will help to save lives,” said Sezin Tokar, a hydrometeorologist with USAID. “Not only can they provide countries with the ability to more accurately monitor for weather-related disasters, the data they produce can also help reduce the economic impact of disasters.”

Lack of observations

Like many developing countries, Zambia does not have detailed forecasts, partly because weather stations are scarce. The density of stations in Africa is eight times lower than recommended by the World Meteorological Organization. Building out a network can be prohibitively expensive, with a single commercial weather station often costing $10,000 to $20,000, plus ongoing funding for maintenance and replacing worn-out parts.

To fill this need, UCAR and NCAR scientists have worked for years to come up with a weather station that is cheap and easy to fix, and can be adapted to the needs of the host country. The resulting stations are constructed out of plastic parts that are custom designed and can be run off a 3D printer, along with off-the-shelf sensors and a basic, credit card-sized computer developed for schoolchildren. Total cost: about $300 per station. Best of all, the host country can easily print replacement parts.

"If you want a different kind of wind direction gauge or anemometer, or you just need to replace a broken part, you can just print it out yourself," said project co-lead Martin Steinson of UCAR. "Our role is to make this as accessible as possible. This is entirely conceived as an open-source project."
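As one concrete illustration of the open-source design, the kind of processing the station's single-board computer performs is simple: convert raw sensor pulses into calibrated values and package them for transmission. The calibration constant and record fields below are hypothetical, not the project's actual data format:

```python
import json
import time

MM_PER_TIP = 0.2  # hypothetical calibration: rainfall (mm) per tipping-bucket pulse

def rainfall_mm(tip_count):
    """Convert tipping-bucket pulses into accumulated rainfall (mm)."""
    return tip_count * MM_PER_TIP

def build_record(station_id, tips, temp_c, rh_pct, pressure_hpa):
    """Package one observation as JSON for transmission over a wireless link."""
    return json.dumps({
        "station": station_id,
        "time": int(time.time()),
        "rain_mm": rainfall_mm(tips),
        "temp_c": temp_c,
        "rh_pct": rh_pct,
        "pressure_hpa": pressure_hpa,
    })
```

Because both the hardware designs and software like this are open, a host country can modify the calibration or record format to suit its own sensors and networks.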
Building out a network

Working with the Zambian Meteorological Department and other agencies, Kucera and Steinson installed the first stations earlier this year—three next to radio stations that will broadcast the information to local communities, one by a rural hospital, and one by the headquarters of the meteorological department.

The meteorological office will take over the project later this year, with a goal of building out a network of 100 weather stations across Zambia. It will also have the 3D printers, materials, and training to maintain or upgrade the network.

The weather station measurements are accessible to local meteorologists and also transmitted over wireless networks in real time to NCAR. After all the weather stations have been installed, scientists will develop a system of one- to three-day regional forecasts for Zambia using the NCAR-based Weather Research and Forecasting (WRF) computer model. The forecasts, in addition to helping farmers and other residents, can also alert the country to the threat of impending floods or other weather-related disasters. The system will ultimately be transferred to the Zambian Meteorological Department to run the forecasts.

"The objective of the project is to transfer the technology so this will be run by Zambia," Kucera said.

Once the technology has been established in Zambia, Kucera and Steinson will turn to other regions that need additional weather stations, such as Africa or the Caribbean. In addition to improving local forecasts, the additional observations can eventually make a difference for forecasts globally, because computer models everywhere will have additional information about the atmosphere.

"We’re hearing a lot of interest in using this technology in other countries," Kucera said. "It’s really quite a return on investment."

Writer: David Hosansky, Manager of Media Relations

