
Solar energy gets boost from new forecasting system

BOULDER, Colo. — A cutting-edge forecasting system developed by a national team of scientists offers the potential to save the solar energy industry hundreds of millions of dollars through improved forecasts of the atmosphere.

The new system, known as Sun4Cast™, has been in development for three years by the National Center for Atmospheric Research (NCAR) in collaboration with government labs, universities, utilities, and commercial firms across the country. Funded by the U.S. Department of Energy SunShot Initiative, the system greatly improves predictions of clouds and other atmospheric conditions that influence the amount of energy generated by solar arrays.

After testing Sun4Cast at multiple sites, the research team has determined that it can be up to 50 percent more accurate than current solar power forecasts. This improved accuracy will enable utilities to deploy solar energy more reliably and inexpensively, reducing the need to purchase energy on the spot market.

The amount of energy gathered by solar panels — such as these in Colorado's San Luis Valley — is influenced by factors including the position and types of clouds, the amount of snow on the ground, and relative humidity. The new Sun4Cast system greatly improves solar irradiance predictions, enabling utilities to deploy solar energy more reliably and inexpensively. (©UCAR. Photo by Sue Ellen Haupt, NCAR. This image is freely available for media & nonprofit use.)

As a result, utilities across the United States may be able to save an estimated $455 million through 2040 as they use more solar energy, according to an analysis by NCAR economist Jeffrey Lazo.

NCAR, which does not provide operational forecasts, makes the technology available so it can be adapted by utilities or private forecasting companies.
The research is being highlighted in more than 20 peer-reviewed papers.

"These results can help enable the nation's expanding use of solar energy," said Sue Ellen Haupt, director of NCAR's Weather Systems and Assessment Program, who led the research team. "More accurate predictions are vital for making solar energy more reliable and cost effective."

The work builds on NCAR's expertise in highly detailed atmospheric prediction, including the design of an advanced wind energy forecasting system.

"This type of research and development is important because it contributes to the reduction in costs for solar and wind energy and makes it easier for utilities to integrate renewables into the electrical grid," said William Mahoney, deputy director of NCAR's Research Applications Laboratory. "When it comes to balancing demand for power with supply, it's vital to be able to predict sources of energy as accurately as possible."

Xcel Energy is already beginning to use the system to forecast conditions at several of its main solar facilities.

"Our previous experience with the National Center for Atmospheric Research in developing a wind forecasting system has saved millions of dollars and has been highly beneficial for our customers," said Drake Bartlett, senior trading analyst for Xcel Energy – Colorado. "It is our sincere hope and belief that we will see positive atmospheric forecasting results for predicting solar generation as well, again to the benefit of our Xcel Energy customers."

Energy forecasts out to 72 hours

Using a combination of advanced computer models, atmospheric observations, and artificial intelligence techniques, Sun4Cast provides 0- to 6-hour nowcasts of solar irradiance and the resulting power production for specific solar facilities at 15-minute intervals.
This enables utilities to continuously anticipate the amount of available solar energy. In addition, forecasts extend out to 72 hours, allowing utility officials to make decisions in advance for balancing solar with other sources of energy.

Solar irradiance is notoriously difficult to predict. It is affected not just by the locations and types of clouds, but also by a myriad of other atmospheric conditions, such as the amount of dust and other particles in the air, relative humidity, and air pollution. Further complicating the forecast, freshly fallen snow, nearby steep mountainsides, or even passing cumulus clouds can reflect sunlight in a way that increases the amount of energy produced by solar panels.

To design a system to forecast solar energy output, NCAR and its partners drew on an array of observing instruments, including satellites, radars, and sky imagers; specialized software; and mathematical and artificial intelligence techniques. Central to Sun4Cast is a new computer model of the atmosphere that simulates solar irradiance based on meteorological conditions.
Called WRF-Solar™, the model is derived from the NCAR-based Weather Research and Forecasting (WRF) model, which is widely used by meteorological agencies worldwide.

The team tested the system in geographically diverse areas, including Long Island, New York; the Colorado mountains; and coastal California.

"We have to provide utilities with confidence that the system maintains a high degree of accuracy year-round in very different types of terrain," said Branko Kosovic, NCAR program manager for Renewable Energy.

In addition to aiding the solar power industry, the work can also improve weather forecasting in general because of improved cloud prediction.

NCAR's numerous partners on the project in the public and private sectors included:

Government labs: National Renewable Energy Laboratory, Brookhaven National Laboratory, the National Oceanic and Atmospheric Administration's Earth System Research Laboratory, and other NOAA facilities
Universities: The Pennsylvania State University, Colorado State University, University of Hawaii, and University of Washington
Utilities: Long Island Power Authority, New York Power Authority, Public Service Company of Colorado, Sacramento Municipal Utility District (SMUD), Southern California Edison, and the Hawaiian Electric Company
Independent system operators: New York ISO, Xcel Energy, SMUD, California ISO, and Hawaiian Electric
Commercial forecast providers: Schneider Electric, Atmospheric and Environmental Research, Global Weather Corporation, MDA Information Systems, and Solar Consulting Services

Computing time was provided by the New York State Department of Economic Development's Division of Science, Technology and Innovation on an IBM Blue Gene supercomputer at Brookhaven National Laboratory. Researchers also performed computing at the NCAR-Wyoming Supercomputing Center and the DOE National Energy Research Scientific Computing Center.

About the SunShot Initiative

The U.S. Department of Energy SunShot Initiative is a collaborative national effort that aggressively drives innovation to make solar energy fully cost-competitive with traditional energy sources before the end of the decade. Through SunShot, the Energy Department supports efforts by private companies, universities, and national laboratories to drive down the cost of solar electricity to $0.06 per kilowatt-hour.
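The release describes Sun4Cast as blending computer models, observations, and artificial intelligence techniques into a consensus forecast. Sun4Cast's actual algorithms are not shown here, but one common consensus idea, weighting each forecast input by its historical skill and blending, can be sketched as follows. All input names, numbers, and the synthetic "truth" series are purely illustrative, not NCAR code or data.

```python
import numpy as np

# Hypothetical irradiance forecasts (W/m^2) for the next 6 hours at
# 15-minute intervals (24 steps), from three illustrative inputs.
rng = np.random.default_rng(0)
truth = 600 + 150 * np.sin(np.linspace(0, np.pi, 24))  # synthetic verifying irradiance
forecasts = {
    "satellite_nowcast": truth + rng.normal(0, 30, 24),
    "nwp_model":         truth + rng.normal(0, 60, 24),
    "persistence":       np.full(24, truth[0]),
}

# Weight each input by its inverse mean-squared error (here measured against
# the same synthetic truth for simplicity; a real system would use past
# cases), then form the weighted consensus.
weights = {name: 1.0 / np.mean((f - truth) ** 2) for name, f in forecasts.items()}
total = sum(weights.values())
blend = sum((w / total) * forecasts[name] for name, w in weights.items())

rmse = lambda f: float(np.sqrt(np.mean((f - truth) ** 2)))
for name, f in forecasts.items():
    print(f"{name}: RMSE {rmse(f):.1f}")
print(f"blend: RMSE {rmse(blend):.1f}")
```

The skill-weighted blend generally scores at least as well as its weakest member, which is why consensus techniques are popular for irradiance and wind power forecasting.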

US taps NCAR technology for new water resources forecasts

BOULDER, Colo. — As the National Oceanic and Atmospheric Administration (NOAA) this month launches a comprehensive system for forecasting water resources in the United States, it is turning to technology developed by the National Center for Atmospheric Research (NCAR) and its university and agency collaborators.

WRF-Hydro, a powerful NCAR-based computer model, is the first nationwide operational system to provide continuous predictions of water levels and potential flooding in rivers and streams from coast to coast. NOAA's new Office of Water Prediction selected it last year as the core of the agency's new National Water Model.

"WRF-Hydro gives us a continuous picture of all of the waterways in the contiguous United States," said NCAR scientist David Gochis, who helped lead its development. "By generating detailed forecast guidance that is hours to weeks ahead, it will help officials make more informed decisions about reservoir levels and river navigation, as well as alerting them to dangerous events like flash floods."

WRF-Hydro (WRF stands for Weather Research and Forecasting) is part of a major Office of Water Prediction initiative to bolster U.S. capabilities in predicting and managing water resources. By teaming with NCAR and the research community, NOAA's National Water Center is developing a new national water intelligence capability, enabling better impacts-based forecasts for management and decision making.

The new WRF-Hydro computer model simulates streams and other aspects of the hydrologic system in far more detail than previously possible. (Image by NOAA Office of Water Prediction.)

Unlike past streamflow models, which provided forecasts every few hours and only for specific points along major river systems, WRF-Hydro provides continuous forecasts at millions of points along rivers, streams, and their tributaries across the contiguous United States.
To accomplish this, it simulates the entire hydrologic system — including snowpack, soil moisture, local ponded water, and evapotranspiration — and rapidly generates output on some of the nation's most powerful supercomputers.

WRF-Hydro was developed in collaboration with NOAA and university and agency scientists through the Consortium of Universities for the Advancement of Hydrologic Science, the U.S. Geological Survey, Israel Hydrologic Service, and Baron Advanced Meteorological Services. Funding came from NOAA, NASA, and the National Science Foundation, which is NCAR's sponsor.

"WRF-Hydro is a perfect example of the transition from research to operations," said Antonio (Tony) J. Busalacchi, president of the University Corporation for Atmospheric Research, which manages NCAR on behalf of the National Science Foundation (NSF). "It builds on the NSF investment in basic research in partnership with other agencies, helps to accelerate collaboration with the larger research community, and culminates in support of a mission agency such as NOAA. The use of WRF-Hydro in an operational setting will also allow for feedback from operations to research. In the end this is a win-win situation for all parties involved, chief among them the U.S. taxpayers."

"Through our partnership with NCAR and the academic and federal water community, we are bringing the state of the science in water forecasting and prediction to bear operationally," said Thomas Graziano, director of NOAA's new Office of Water Prediction at the National Weather Service.

Filling in the water picture

The continental United States has a vast network of rivers and streams, from major navigable waterways such as the Mississippi and Columbia to the remote mountain brooks flowing from the high Adirondacks into the Hudson River.
The levels and flow rates of these watercourses have far-reaching implications for water availability, water quality, and public safety.

Until now, however, it has not been possible to predict conditions at all points in the nation's waterways. Instead, computer models have produced a limited picture by incorporating observations from about 4,000 gauges, generally on the country's bigger rivers. Smaller streams and channels are largely left out of these forecast models, and stretches of major rivers for tens of miles are often not predicted — meaning that schools, bridges, and even entire towns can be vulnerable to unexpected changes in river levels.

To fill in the picture, NCAR scientists have worked for the past several years with their colleagues within NOAA, other federal agencies, and universities to combine a range of atmospheric, hydrologic, and soil data into a single forecasting system.

The resulting National Water Model, based on WRF-Hydro, simulates current and future conditions on rivers and streams at points two miles apart across the contiguous United States. Along with an hourly analysis of current hydrologic conditions, the National Water Model generates three predictions: an hourly 0- to 15-hour short-range forecast, a daily 0- to 10-day medium-range forecast, and a daily 0- to 30-day long-range water resource forecast.

The National Water Model predictions using WRF-Hydro offer a wide array of benefits for society. They will help local, state, and federal officials better manage reservoirs, improve navigation along major rivers, plan for droughts, anticipate water quality problems caused by lower flows, and monitor ecosystems for issues such as whether conditions are favorable for fish spawning.
By providing a national view, this will also help the Federal Emergency Management Agency deploy resources more effectively in cases of simultaneous emergencies, such as a hurricane on the Gulf Coast and flooding in California.

"We've never had such a comprehensive system before," Gochis said. "In some ways, the value of this is a blank page yet to be written."

A broad spectrum of observations

WRF-Hydro is a powerful forecasting system that incorporates advanced meteorological and streamflow observations, including data from nearly 8,000 U.S. Geological Survey streamflow gauges across the country. Using advanced mathematical techniques, the model then simulates current and future conditions for millions of points on every significant river, stream, tributary, and catchment in the United States.

In time, scientists will add additional observations to the model, including snowpack conditions, lake and reservoir levels, subsurface flows, soil moisture, and land-atmosphere interactions such as evapotranspiration, the process by which water in soil, plants, and other land surfaces evaporates into the atmosphere.

Scientists over the last year have demonstrated the accuracy of WRF-Hydro by comparing its simulations to observations of streamflow, snowpack, and other variables. They will continue to assess and expand the system as the National Water Model begins operational forecasts.

NCAR scientists maintain and update the open-source code of WRF-Hydro, which is available to the academic community and others. WRF-Hydro is widely used by researchers, both to better understand water resources and floods in the United States and other countries such as Norway, Germany, Romania, Turkey, and Israel, and to project the possible impacts of climate change.

"At any point in time, forecasts from the new National Water Model have the potential to impact 300 million people," Gochis said.
"What NOAA and its collaborator community are doing is trying to usher in a new era of bringing better physics and better data into forecast models for improving situational awareness and hydrologic decision making."

Collaborators:
Baron Advanced Meteorological Services
Consortium of Universities for the Advancement of Hydrologic Science
Israel Hydrologic Service
National Center for Atmospheric Research
National Oceanic and Atmospheric Administration
U.S. Geological Survey

Funders:
National Science Foundation
National Aeronautics and Space Administration
National Oceanic and Atmospheric Administration
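The release describes three forecast products with different cycles and horizons: an hourly 0- to 15-hour short-range forecast, a daily 0- to 10-day medium-range forecast, and a daily 0- to 30-day long-range forecast. A minimal sketch of how those product definitions translate into hourly valid times (the layout and function names are illustrative, not the National Water Model's actual configuration):

```python
from datetime import datetime, timedelta

# The three National Water Model forecast products described in the release.
products = {
    "short_range":  {"cycle": timedelta(hours=1), "horizon": timedelta(hours=15)},
    "medium_range": {"cycle": timedelta(days=1),  "horizon": timedelta(days=10)},
    "long_range":   {"cycle": timedelta(days=1),  "horizon": timedelta(days=30)},
}

def valid_times(start, horizon, step=timedelta(hours=1)):
    """Hourly valid times from a cycle start time out to the forecast horizon."""
    times, t = [], start + step
    while t <= start + horizon:
        times.append(t)
        t += step
    return times

cycle_start = datetime(2016, 8, 16, 0, 0)
for name, p in products.items():
    times = valid_times(cycle_start, p["horizon"])
    print(f"{name}: issued every {p['cycle']}, {len(times)} hourly steps, "
          f"last valid {times[-1]}")
```

The short-range product yields 15 hourly steps per cycle, the medium-range 240, and the long-range 720, which conveys the scale of output the model must generate every day for millions of river points.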

Climate change already accelerating sea level rise, study finds

BOULDER, Colo. — Greenhouse gases are already having an accelerating effect on sea level rise, but the impact has so far been masked by the cataclysmic 1991 eruption of Mount Pinatubo in the Philippines, according to a new study led by the National Center for Atmospheric Research (NCAR).

Satellite observations, which began in 1993, indicate that the rate of sea level rise has held fairly steady at about 3 millimeters per year. But the expected acceleration due to climate change is likely hidden in the satellite record because of a happenstance of timing: The record began soon after the Pinatubo eruption, which temporarily cooled the planet, causing sea levels to drop.

The new study finds that the lower starting point effectively distorts the calculation of sea level rise acceleration for the last couple of decades. The study lends support to climate model projections, which show the rate of sea level rise escalating over time as the climate warms. The findings were published today in the open-access Nature journal Scientific Reports.

Mount Pinatubo's caldera on June 22, 1991. (Image courtesy USGS.)

"When we used climate model runs designed to remove the effect of the Pinatubo eruption, we saw the rate of sea level rise accelerating in our simulations," said NCAR scientist John Fasullo, who led the study. "Now that the impacts of Pinatubo have faded, this acceleration should become evident in the satellite measurements in the coming decade, barring another major volcanic eruption."

Study co-author Steve Nerem, from the University of Colorado Boulder, added: "This study shows that large volcanic eruptions can significantly impact the satellite record of global average sea level change. So we must be careful to consider these effects when we look for the effects of climate change in the satellite-based sea level record."

The findings have implications for the extent of sea level rise this century and may be useful to coastal communities planning for the future.
In recent years, decision makers have debated whether these communities should make plans based on the steady rate of sea level rise measured in recent decades or based on the accelerated rate expected in the future by climate scientists.

The study was funded by NASA, the U.S. Department of Energy, and the National Science Foundation, which is NCAR's sponsor.

Reconstructing a pre-Pinatubo world

Climate change triggers sea level rise in a couple of ways: by warming the ocean, which causes the water to expand, and by melting glaciers and ice sheets, which drain into the ocean and increase its volume. In recent decades, the pace of warming and melting has accelerated, and scientists have expected to see a corresponding increase in the rate of sea level rise. But analysis of the relatively short satellite record has not borne that out.

To investigate, Fasullo, Nerem, and Benjamin Hamlington of Old Dominion University worked to pin down how quickly sea levels were rising in the decades before the satellite record began. Prior to the launch of the international TOPEX/Poseidon satellite mission in late 1992, sea level was mainly measured using tide gauges. While records from some gauges stretch back to the 18th century, variations in measurement technique and location mean that the pre-satellite record is best used to get a ballpark estimate of global mean sea level.

Mount Pinatubo erupting in 1991. (Image courtesy USGS.)

To complement the historic record, the research team used a dataset produced by running the NCAR-based Community Earth System Model 40 times with slightly different—but historically plausible—starting conditions. The resulting simulations characterize the range of natural variability in the factors that affect sea levels. The model was run on the Yellowstone system at the NCAR-Wyoming Supercomputing Center.

A separate set of model runs that omitted volcanic aerosols — particles spewed into the atmosphere by an eruption — was also assessed.
By comparing the two sets of runs, the scientists were able to pick out a signal (in this case, the impact of Mount Pinatubo's eruption) from the noise (natural variations in ocean temperature and other factors that affect sea level).

"You can't do it with one or two model runs—or even three or four," Fasullo said. "There's just too much accompanying climate noise to understand precisely what the effect of Pinatubo was. We could not have done it without large numbers of runs."

Using models to understand observations

Analyzing the simulations, the research team found that Pinatubo's eruption caused the oceans to cool and sea levels to drop by about 6 millimeters immediately before TOPEX/Poseidon began recording observations. As the sunlight-blocking aerosols from Mount Pinatubo dissipated in the simulations, sea levels began to slowly rebound to pre-eruption levels. This rebound swamped the acceleration caused by the warming climate and made the rate of sea level rise higher in the mid- to late 1990s than it would otherwise have been.

This higher-than-normal rate of sea level rise in the early part of the satellite record makes it appear that the rate of sea level rise has not accelerated over time and may actually have decreased somewhat. In fact, according to the study, if the Pinatubo eruption had not occurred—leaving sea level at a higher starting point in the early 1990s—the satellite record would have shown a clear acceleration.

"The satellite record is unable to account for everything that happened before the first satellite was launched," Fasullo said.
"This study is a great example of how computer models can give us the historical context that's needed to understand some of what we're seeing in the satellite record."

Understanding whether the rate of sea level rise is accelerating or remaining constant is important because it drastically changes what sea levels might look like in 20, 50, or 100 years.

"These scientists have disentangled the major role played by the 1991 volcanic eruption of Mt. Pinatubo on trends in global mean sea level," said Anjuli Bamzai, program director in the National Science Foundation's Division of Atmospheric and Geospace Sciences, which funded the research. "This research is vital as society prepares for the potential effects of climate change."

Because the study's findings suggest that acceleration due to climate change is already under way, the acceleration should become evident in the satellite record in the coming decade, Fasullo said. Since the original TOPEX/Poseidon mission, other satellites have been launched—Jason-1 in 2001 and Jason-2 in 2008—to continue tracking sea levels. The most recent satellite, Jason-3, launched on Jan. 17 of this year.

"Sea level rise is potentially one of the most damaging impacts of climate change, so it's critical that we understand how quickly it will rise in the future," Fasullo said. "Measurements from Jason-3 will help us evaluate what we've learned in this study and help us better plan for the future."

The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

The graph shows how sea level rises and falls as ocean heat content fluctuates.
After volcanic eruptions, the Earth cools and, in turn, the heat content in the ocean drops, ultimately lowering sea level. The solid blue line is the average sea level rise of climate model simulations that include volcanic eruptions. The green line is the average from model simulations with the effect of volcanic eruptions removed, and it shows a smooth acceleration in the rate of sea level rise due to climate change.

The blue line between the start of the satellite record and present day makes a relatively straight line — just as we see from actual satellite observations during that time — indicating that the rate of sea level rise has not accelerated. But in the future, barring another major volcanic eruption, scientists expect sea level to follow the gray dotted line, which is on the same accelerating path as the green line below it. (©UCAR. This graph is freely available for media & nonprofit use.)

About the article

Title: Is the detection of sea level rise imminent?
Authors: J.T. Fasullo, R.S. Nerem, and B. Hamlington
Journal: Scientific Reports, DOI: 10.1038/srep31245

Funders:
NASA
National Science Foundation
U.S. Department of Energy

Collaborators:
University of Colorado Boulder (UCAR member)
Old Dominion University (UCAR member)

Writer: Laura Snider, Senior Science Writer and Public Information Officer
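The ensemble-differencing idea behind the study — average many runs that include the volcanic forcing, average many runs that omit it, and difference the two means so that internal variability cancels — can be sketched with synthetic numbers. Everything below is illustrative (the "signal" shape, noise level, and year range are made up), not CESM output:

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1985, 2006)
n_runs = 40  # the study used 40-member ensembles

# A made-up forced signal: a ~6 mm sea level drop after the 1991 eruption,
# recovering over a few years (mm, relative to the no-eruption baseline).
signal = np.where((years >= 1991) & (years < 1996),
                  -6.0 * np.exp(-(years - 1991) / 2.0), 0.0)

# Each run = shared forced signal + independent internal variability (noise).
with_volcano = signal + rng.normal(0, 4, (n_runs, years.size))
no_volcano   = rng.normal(0, 4, (n_runs, years.size))

# Differencing the ensemble means cancels most of the internal variability,
# leaving an estimate of the eruption's effect on sea level.
pinatubo_effect = with_volcano.mean(axis=0) - no_volcano.mean(axis=0)
print("estimated 1991 drop (mm): %.1f" % pinatubo_effect[years == 1991][0])
```

With only one or two runs per set, the ±4 mm year-to-year noise would overwhelm the ~6 mm signal; with 40 runs per set the noise in the mean difference shrinks by roughly a factor of six, which is the point Fasullo makes about needing large numbers of runs.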

1967 solar storm nearly took U.S. to brink of war

Delores Knipp, a senior research associate at NCAR's High Altitude Observatory, is the lead author of a new paper published in the journal Space Weather. Knipp is also a research professor at the University of Colorado Boulder, a UCAR member institution. This is an excerpt from a news release by the American Geophysical Union.

Aug. 9, 2016 | A solar storm that jammed radar and radio communications at the height of the Cold War could have led to a disastrous military conflict if not for the U.S. Air Force's budding efforts to monitor the sun's activity, a new study finds.

A view of the Sun on May 23, 1967, in a narrow visible wavelength of light called Hydrogen-alpha. The bright region in the top center shows the area where the large flare occurred. (Image courtesy of the National Solar Observatory historical archive.)

On May 23, 1967, the Air Force prepared aircraft for war, thinking the nation's surveillance radars in polar regions were being jammed by the Soviet Union. Just in time, military space weather forecasters conveyed information about the solar storm's potential to disrupt radar and radio communications. The planes remained on the ground and the U.S. avoided a potential nuclear weapon exchange with the Soviet Union, according to the new research.

Retired U.S. Air Force officers involved in forecasting and analyzing the storm collectively describe the event publicly for the first time in a new paper accepted for publication in Space Weather, a journal of the American Geophysical Union. The storm's potential impact on society was largely unknown until these individuals came together to share their stories, said Delores Knipp, a space physicist at the University of Colorado in Boulder, a senior research associate at the National Center for Atmospheric Research, and lead author of the new study.
Knipp will give a presentation about the event on August 10, 2016, at NCAR's High Altitude Observatory in Boulder, Colorado. Hear Delores Knipp discuss the new paper during a live webcast on Wednesday, Aug. 10, at 3 p.m. Mountain Time.

The storm is a classic example of how geoscience and space research are essential to U.S. national security, she said.

"Had it not been for the fact that we had invested very early on in solar and geomagnetic storm observations and forecasting, the impact [of the storm] likely would have been much greater," Knipp said. "This was a lesson learned in how important it is to be prepared."

Read the full release at AGU.

About the article

Title: The May 1967 Great Storm and Radio Disruption Event: Extreme Space Weather and Extraordinary Responses
Authors: D.J. Knipp, A.C. Ramsay, E.D. Beard, A.L. Boright, W.B. Cade, I.M. Hewins, R. McFadden, W.F. Denig, L.M. Kilcommons, M.A. Shea, and D.F. Smart
Journal: Space Weather, DOI: 10.1002/2016SW001423

The comet that disappeared: What happened to ISON?

July 25, 2016 | On Thanksgiving Day in 2013, solar scientists, astronomers, and amateur skywatchers alike pointed their instruments at the Sun and waited. Comet ISON, a bright ball of frozen matter from the earliest days of the solar system, was inbound from the Oort Cloud at the edge of the solar system and expected to pierce the Sun's corona on Nov. 28. Scientists were expecting quite a show.

But instead of a brilliant cosmic display, there was … nothing.

An enhanced image of Comet ISON taken by the Hubble Space Telescope in May 2013. (Image courtesy NASA.)

"The first thing we did was make sure that we had definitely seen nothing," said Paul Bryans, a solar scientist at the National Center for Atmospheric Research (NCAR), who was looking for the comet using NASA's Solar Dynamics Observatory. "We did image processing just to make sure nothing was there, and it wasn't. But that's not necessarily a boring result. That can tell us something."

And it has. Bryans and colleague Dean Pesnell, of the NASA Goddard Space Flight Center, recently published a study that sheds light on the mystery of Comet ISON.

"We think that the most likely thing that happened is that Comet ISON broke up before it got really close to the Sun," said Bryans, a researcher at NCAR's High Altitude Observatory.

Solar scientists like Bryans are interested in comets like ISON because they can act as probes into the mysterious solar corona. How they behave on their journey past the Sun can offer insight into the corona's composition and the behavior of the Sun's magnetic field.

Watching and waiting

Sun-grazing comets are not that unusual, but they're usually too small to live through the encounter. Larger comets, like Comet Lovejoy, which sailed through the Sun's corona in December 2011, can survive brushes with the Sun. But they burn off a large part of their masses in the process, sometimes leaving a dazzling trail of extreme ultraviolet emissions in their wake.
Comet ISON, first spotted more than a year before it reached the Sun, was thought to be large enough to survive the trip. The comet was very bright, a sign that it might also be quite large.

This animation shows Comet ISON, Mercury, and Earth from Nov. 20 to Nov. 25, 2013. The Sun sits right of the field of view. The images were taken by the Heliospheric Imager on NASA's STEREO mission. (Animation courtesy of NASA/STEREO.)

When NASA observatories failed to see a showy trail from Comet ISON — or any trail at all — scientists were left wondering what happened. In a study published in 2014, researchers hypothesized that Comet ISON did not emit extreme ultraviolet radiation like Comet Lovejoy because ISON passed further away from the Sun.

In the new study, published in The Astrophysical Journal, Bryans and Pesnell challenge those conclusions. Using data collected by the Solar Dynamics Observatory, the researchers compared ISON to Lovejoy, systematically evaluating how the conditions might have differed for the two comets — including the density of the solar atmosphere, the Sun's magnetic field, and the size of the comets — as well as how those differences might have affected the comets' emissions of extreme ultraviolet radiation.

"Using Lovejoy as a benchmark, we took each factor in turn," Bryans said. "The fact that ISON was further away from the Sun than Lovejoy would have made a difference, but it was not a large enough difference to explain why we saw nothing from ISON."

Instead, the study finds that ISON's fizzle is best explained by the comet's size. The researchers estimate that ISON's radius was at least a factor of four smaller than Lovejoy's.

'Dust and rubble'

If Bryans and Pesnell are correct, it means that estimates of Comet ISON's size before it reached the Sun were too large. Comet size is correlated to brightness, but other factors can affect brightness as well.
In ISON's case, scientists believe the comet was making its first trip around the Sun, which means that it was still packed with highly volatile matter that had not yet burned off. This matter could make the comet appear brighter for its size than a comet that had already traveled once past the Sun.

"On a comet's first passage past the sun, it has all this cold, icy stuff on the outside of it that burns off easily and looks really bright," Bryans said.

But even if the comet was bright because of its size, the scientists believe it's likely that the comet broke into pieces before entering the Sun's corona.

"It's possible by the time it made its closest approach to the Sun, it was just a pile of dust and rubble," Bryans said.

About the article

Title: On the absence of EUV emission from Comet C/2012 S1 (ISON)
Authors: Paul Bryans and W. Dean Pesnell
Publication: The Astrophysical Journal, DOI: 10.3847/0004-637X/822/2/77
Funders: NASA
Collaborators: NASA Goddard Space Flight Center
Writer/contact: Laura Snider, Senior Science Writer and Public Information Officer

Expanding Antarctic sea ice linked to natural variability

BOULDER — The recent trend of increasing Antarctic sea ice extent — seemingly at odds with climate model projections — can largely be explained by a natural climate fluctuation, according to a new study led by the National Center for Atmospheric Research (NCAR).

The study offers evidence that the negative phase of the Interdecadal Pacific Oscillation (IPO), which is characterized by cooler-than-average sea surface temperatures in the tropical eastern Pacific, has created favorable conditions for additional Antarctic sea ice growth since 2000. The findings, published in the journal Nature Geoscience, may resolve a longstanding mystery: Why is Antarctic sea ice expanding when climate change is causing the world to warm? The study's authors also suggest that sea ice may begin to shrink as the IPO switches to a positive phase.

"The climate we experience during any given decade is some combination of naturally occurring variability and the planet's response to increasing greenhouse gases," said NCAR scientist Gerald Meehl, lead author of the study. "It's never all one or the other, but the combination, that is important to understand."

Study co-authors include Julie Arblaster of NCAR and Monash University in Australia, Cecilia Bitz of the University of Washington, Christine Chung of the Australian Bureau of Meteorology, and NCAR scientist Haiyan Teng. The study was funded by the U.S. Department of Energy and by the National Science Foundation, which sponsors NCAR.

On Sept. 19, 2014, the five-day average of Antarctic sea ice extent exceeded 20 million square kilometers (about 7.7 million square miles) for the first time since 1979, according to the National Snow and Ice Data Center. The red line shows the average maximum extent from 1979-2014. (Image courtesy NASA's Scientific Visualization Studio/Cindy Starr)

Expanding ice

The sea ice surrounding Antarctica has been slowly increasing in area since the satellite record began in 1979.
But the rate of increase rose nearly fivefold between 2000 and 2014, following the IPO transition to a negative phase in 1999.

The new study finds that when the IPO changes phase, from positive to negative or vice versa, it touches off a chain reaction of climate impacts that may ultimately affect sea ice formation at the bottom of the world. When the IPO transitions to a negative phase, the sea surface temperatures in the tropical eastern Pacific become somewhat cooler than average when measured over a decade or two. These sea surface temperatures, in turn, change tropical precipitation, which drives large-scale changes to the winds that extend all the way down to Antarctica. The ultimate impact is a deepening of a low-pressure system off the coast of Antarctica known as the Amundsen Sea Low. Winds generated on the western flank of this system blow sea ice northward, away from Antarctica, helping to enlarge the extent of sea ice coverage.

"Compared to the Arctic, global warming causes only weak Antarctic sea ice loss, which is why the IPO can have such a striking effect in the Antarctic," said Bitz. "There is no comparable natural variability in the Arctic that competes with global warming."

Sifting through simulations

To test whether these IPO-related impacts were sufficient to cause the growth in sea ice extent observed between 2000 and 2014, the scientists first examined 262 climate simulations created by different modeling groups from around the world. When all of those simulations are averaged, the natural variability cancels itself out. For example, simulations with a positive IPO offset those with a negative IPO. What remains is the expected impact of human-caused climate change: a decline in Antarctic sea ice extent.

But for this study, the scientists were not interested in the average. Instead, they wanted to find individual members that correctly characterized the natural variability between 2000 and 2014, including the negative phase of the IPO.
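The cancellation of natural variability in an ensemble mean can be sketched with a few lines of Python on entirely synthetic data. All numbers here (trend sizes, noise level) are illustrative assumptions, not values from the study; only the ensemble size of 262 comes from the article.

```python
import random

random.seed(42)

YEARS = 15        # 2000-2014
N_MEMBERS = 262   # ensemble size, as in the study
FORCED = -0.02    # hypothetical forced sea-ice trend per year (decline)
IPO = 0.05        # hypothetical IPO-related trend per year

def member(phase):
    """One synthetic run: forced decline plus an IPO-like trend
    (negative phase -> ice growth), plus small random noise."""
    return [(FORCED - phase * IPO) * yr + random.gauss(0, 0.05)
            for yr in range(YEARS)]

# Half the members happen to sit in a positive IPO phase, half negative.
ensemble = [member(+1 if i % 2 == 0 else -1) for i in range(N_MEMBERS)]

def change(series):
    return series[-1] - series[0]

# Ensemble mean: opposite-phase members cancel, isolating the forced decline.
mean = [sum(m[yr] for m in ensemble) / N_MEMBERS for yr in range(YEARS)]
print(f"ensemble-mean change: {change(mean):+.3f}")      # negative (decline)

# A single negative-phase member can nonetheless show growth,
# like the 10 simulations the team selected.
print(f"one negative-phase member: {change(ensemble[1]):+.3f}")  # positive
```

The mean over many members recovers only the forced trend, while an individual member whose phase matches the observed IPO can run in the opposite direction, which is why the researchers searched for matching members rather than relying on the average.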
The team discovered 10 simulations that met the criteria, and all of them showed an increase in Antarctic sea ice extent across all seasons.

"When all the models are taken together, the natural variability is averaged out, leaving only the shrinking sea ice caused by global warming," Arblaster said. "But the model simulations that happen to sync up with the observed natural variability capture the expansion of the sea ice area. And we were able to trace these changes to the equatorial eastern Pacific in our model experiments."

Scientists suspect that in 2014, the IPO began to change from negative to positive. That would indicate an upcoming period of warmer eastern Pacific Ocean surface temperatures on average, though year-to-year temperatures may go up or down, depending on El Niño/La Niña conditions. Accordingly, the trend of increasing Antarctic sea ice extent may also change in response.

"As the IPO transitions to positive, the increase of Antarctic sea ice extent should slow and perhaps start to show signs of retreat when averaged over the next 10 years or so," Meehl said.

About the article

Title: Antarctic sea-ice expansion between 2000 and 2014 driven by tropical Pacific decadal climate variability
Authors: Gerald A. Meehl, Julie M. Arblaster, Cecilia M. Bitz, Christine T. Y. Chung, and Haiyan Teng
Publication: Nature Geoscience, DOI: 10.1038/NGEO2751

Writer: Laura Snider, Senior Science Writer and Public Information Officer
