The rate of sea level rise is accelerating, a new study finds

NCAR scientist John Fasullo is a co-author of a new study appearing in the Proceedings of the National Academy of Sciences. The study finds that the rate of sea level rise is accelerating. The following is an excerpt from a news release by the Cooperative Institute for Research in Environmental Sciences.

February 13, 2018 | Global sea level rise is not cruising along at a steady 3 mm per year — it's accelerating a little every year, according to a new study that harnessed 25 years of satellite data to calculate that the rate is increasing by about 0.08 mm/year every year. That could mean an annual rate of sea level rise of 10 mm/year, or even more, by 2100.

"This acceleration, driven mainly by accelerated melting in Greenland and Antarctica, has the potential to double the total sea level rise by 2100 as compared to projections that assume a constant rate — to more than 60 cm instead of about 30," said lead author Steve Nerem, a scientist at the Cooperative Institute for Research in Environmental Sciences. "And this is almost certainly a conservative estimate," he added. "Our extrapolation assumes that sea level continues to change in the future as it has over the last 25 years. Given the large changes we are seeing in the ice sheets today, that's not likely."

If the oceans continue to change at this pace, sea level will rise 65 cm (26 inches) by 2100 — enough to cause significant problems for coastal cities, according to the new assessment by Nerem and several colleagues from CU Boulder, the University of South Florida, NASA Goddard Space Flight Center, Old Dominion University, and the National Center for Atmospheric Research. The team, driven to understand and better predict Earth's response to a warming world, published their work today in the journal Proceedings of the National Academy of Sciences.

Rising concentrations of greenhouse gases in Earth's atmosphere increase the temperature of air and water, which causes sea level to rise in two ways. First, warmer water expands, and this "thermal expansion" of the oceans has contributed about half of the 7 cm of global mean sea level rise we've seen over the last 25 years, Nerem said. Second, melting land ice flows into the ocean, also increasing sea level across the globe.

These increases were measured using satellite altimeter measurements since 1992, including the U.S./European TOPEX/Poseidon, Jason-1, Jason-2, and Jason-3 satellite missions. But detecting acceleration is challenging, even in such a long record. Episodes like volcanic eruptions can create variability: the eruption of Mount Pinatubo in 1991 decreased global mean sea level just before the TOPEX/Poseidon satellite launch, for example. In addition, global sea level can fluctuate due to climate patterns such as El Niños and La Niñas (the opposing phases of the El Niño Southern Oscillation, or ENSO), which influence ocean temperature and global precipitation patterns.

Read the full news release here.
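The figures quoted above follow from a simple constant-acceleration extrapolation. The sketch below (not the authors' code) reproduces them; the 2005 reference year is an assumption, chosen as roughly the midpoint of the 1993-2018 altimeter record.

```python
# Minimal sketch, not the study's code: constant-acceleration extrapolation of
# global mean sea level from the figures quoted above (about 3 mm/yr rate and
# about 0.08 mm/yr^2 acceleration). REF_YEAR is an assumed baseline.
RATE_MM_PER_YR = 3.0       # approximate rate over the satellite era
ACCEL_MM_PER_YR2 = 0.08    # estimated acceleration of that rate
REF_YEAR = 2005            # assumed midpoint of the 1993-2018 record

def projected_rate(year):
    """Rate of rise (mm/yr) in a given year, assuming constant acceleration."""
    return RATE_MM_PER_YR + ACCEL_MM_PER_YR2 * (year - REF_YEAR)

def projected_rise_mm(year):
    """Total rise (mm) relative to REF_YEAR under the same assumption."""
    t = year - REF_YEAR
    return RATE_MM_PER_YR * t + 0.5 * ACCEL_MM_PER_YR2 * t ** 2

print(f"Rate in 2100: {projected_rate(2100):.1f} mm/yr")        # about 10 mm/yr
print(f"Rise by 2100: {projected_rise_mm(2100) / 10:.0f} cm")   # about 65 cm
```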

Drier and wetter: The future of precipitation variability

January 17, 2018 | Precipitation variability — the swing from dry to wet and back again — will continue to increase across the majority of the world's land area as the climate warms, according to a new study led by scientists at the National Center for Atmospheric Research.

The researchers expect precipitation variability to become greater from day to day, year to year, and even decade to decade. The new research, published in the Nature journal Scientific Reports, provides results from sophisticated computer simulations that predict that there will be both more droughts and more floods within the same areas as the climate warms. The findings are relevant for water managers who need to make long-range plans.

"When it's dry, it will be drier. When it's wet, it will be wetter — in the same place," said NCAR scientist Angeline Pendergrass, lead author of the study. "There will be a broader range of conditions that will become 'normal.'"

The research was funded by the National Science Foundation, which is NCAR's sponsor, and by the U.S. Department of Energy.

As the climate continues to warm, the range of precipitation that is "normal" in a particular place is likely to grow, meaning a single location can become both wetter and drier. The image on the left shows a flood in Colorado. The image on the right shows a drought in Texas. (Images courtesy the U.S. Department of Defense and U.S. Department of Agriculture.)

New tools to study changes in precipitation

Historically, changes in precipitation variability have been difficult to pin down because the amount of rain or snow a particular region gets can vary a great deal naturally. But in recent years, the availability of large ensembles of climate model runs has allowed scientists to begin separating some of the more subtle impacts of climate change from the natural chaos in the climate system. These ensembles may include 30 or 40 runs of a single climate model over the same time period with slightly different, but equally plausible, initial conditions.

Pendergrass and her colleagues, NCAR scientists Flavio Lehner, Clara Deser, and Benjamin Sanderson, along with ETH-Zürich professor Reto Knutti, took a closer look at precipitation variability using large ensembles of runs from the NCAR-based Community Earth System Model (CESM) and from the Geophysical Fluid Dynamics Laboratory (GFDL) climate model. They also looked at a collection of individual runs taken from many different climate models, known as the Coupled Model Intercomparison Project Phase 5, or CMIP5.

The team found that precipitation variability will likely increase substantially over two-thirds of the world's land areas by the end of the century if greenhouse gas emissions continue unabated. They also found that, on average, variability increases 4 to 5 percent over land per degree Celsius of warming and that variability increases across all time scales, from days to decades.

"This increase in variability is arising due to more moisture in the atmosphere and a weakening of global atmospheric circulation," Pendergrass said. "That's important because it means that changes in precipitation variability are not just linked to changes in El Niño and La Niña events, as some previous work implied."

Helping water managers plan for the future

Pendergrass hopes the study's findings will be used by water managers in their future planning.
Models used today by water managers often assume that the change in precipitation variability in the future will track with the expected increase in average precipitation. But the new study finds that the increase in precipitation variability will outstrip the increase in average precipitation, which means that water managers may be miscalculating the magnitude of future swings from wet to dry or vice versa.

"Water managers may be underestimating how much heavy events — floods or droughts — will change," Pendergrass said.

About the article
Title: Precipitation variability increases in a warmer climate
Authors: Pendergrass, A. G., R. Knutti, F. Lehner, C. Deser, and B. M. Sanderson
Journal: Scientific Reports, DOI: 10.1038/s41598-017-17966-y
Writer/contact: Laura Snider, Senior Science Writer
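The large-ensemble approach described above can be illustrated with a toy calculation: compute the interannual variability for each ensemble member in a present-day and a future epoch, then average across members so that internally generated noise largely cancels. The sketch below uses made-up data and arbitrary array shapes; it is not the study's analysis code.

```python
# Toy illustration of the large-ensemble idea, not the study's analysis:
# estimate the forced change in interannual precipitation variability by
# computing each member's standard deviation in two epochs and averaging
# across members. The synthetic data and shapes below are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n_members, n_years = 40, 30   # e.g., a 40-member ensemble, 30-year epochs

# Stand-in annual-mean precipitation (mm/day) for one land grid point.
present = rng.gamma(shape=4.0, scale=0.50, size=(n_members, n_years))
future = rng.gamma(shape=4.0, scale=0.55, size=(n_members, n_years))  # wider swings

sigma_present = present.std(axis=1).mean()   # ensemble-mean interannual variability
sigma_future = future.std(axis=1).mean()
change = 100 * (sigma_future - sigma_present) / sigma_present
print(f"Forced change in variability: {change:+.1f}%")
```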

The climate secrets of southern clouds

BOULDER, Colo. — This month, an international team of scientists will head to the remote Southern Ocean for six weeks to tackle one of the region's many persistent mysteries: its clouds.

What they discover will be used to improve climate models, which routinely underestimate the amount of solar radiation reflected back into space by clouds in the region. Accurately simulating the amount of radiation that is absorbed or reflected on Earth is key to calculating how much the globe is warming.

The field campaign, called the Southern Ocean Clouds, Radiation, Aerosol Transport Experimental Study, or SOCRATES, could also help scientists understand the very nature of how clouds interact with aerosols — particles suspended in the atmosphere that can be from either natural or human-made sources. Aerosols can spur cloud formation, change cloud structure, and affect precipitation, all of which affect the amount of solar radiation that is reflected.

During the mission, which will run from mid-January through February, the scientists will collect data from a bevy of advanced instruments packed onboard an aircraft and a ship, both of which are specially designed for scientific missions.

"SOCRATES will allow for some of the best observations of clouds, aerosols, radiation, and precipitation that have ever been collected over the Southern Ocean," said Greg McFarquhar, a principal investigator and the director of the University of Oklahoma Cooperative Institute for Mesoscale Meteorological Studies (CIMMS). "These data will provide us with critical insight into the physics of cloud formation in the region, information we can use to improve global climate models."

The U.S. portion of SOCRATES is largely funded by the National Science Foundation (NSF).

"The Southern Ocean is famously remote and stormy and it's hard to imagine a worse place to do a field campaign. But a vast, stormy ocean is a great laboratory for studying clouds, and it's clear from our models that we have a lot to learn about them," said Eric DeWeaver, program director for Climate and Large-Scale Dynamics in NSF's Geosciences Directorate.

"I'm excited about this campaign because I think it will answer some fundamental questions about clouds and their dependence on atmospheric conditions," DeWeaver said. "We'll be able to use this information to understand cloud behavior closer to home and how clouds are likely to adjust to changing climatic conditions."

Critical observing and logistical support for SOCRATES is being provided by the Earth Observing Laboratory (EOL) at the National Center for Atmospheric Research (NCAR). Other U.S. principal investigators are based at the University of Washington. The Australian portion of SOCRATES is largely funded by the country's government through the Australian Marine National Facility, which is owned and operated by CSIRO.

A supercooled mystery

McFarquhar and his colleagues think the reason that climate models are not accurately capturing the amount of radiation reflected by clouds above the Southern Ocean is that they may not be correctly predicting the composition of the clouds. In particular, the models may not be producing enough supercooled water — droplets that stay liquid even when the temperature is below freezing.

One possible explanation for the problem is the way models represent how clouds interact with aerosols, a process that affects the amount of supercooled water in a cloud.
These representations were developed from atmospheric observations, largely in the Northern Hemisphere, where most of the world's population lives. But the atmosphere over the Northern Hemisphere — even over the Arctic — contains many more pollutants, including aerosols, than the atmosphere over the Southern Ocean, which is relatively pristine.

"We don't know how appropriate the representations of these processes are for the Southern Hemisphere," McFarquhar said. "SOCRATES will give us an opportunity to observe these cloud-aerosol interactions and see how much they differ, if at all, from those in the Northern Hemisphere."

Flying through hazardous clouds

The NSF/NCAR HIAPER Gulfstream V has been modified to serve as a flying laboratory. (©UCAR. This figure is freely available for media & nonprofit use.)

For the SOCRATES field campaign, observations will be taken from the NSF/NCAR High-performance Instrumented Airborne Platform for Environmental Research, or HIAPER, a highly modified Gulfstream V aircraft, and the R/V Investigator, an Australian deep-ocean research vessel.

"Much of what we currently know about Southern Ocean cloud, aerosol, and precipitation properties comes from satellite-based estimates, which are uncertain and have undergone few comparisons against independent data," said co-investigator Roger Marchand, a scientist at the University of Washington. "The data collected during SOCRATES will also enable us to evaluate current satellite data over the Southern Ocean, as well as potentially help in the design of better satellite-based techniques."

The research aircraft will be based out of Hobart, Tasmania, and will make about 16 flights over the Southern Ocean during the course of the campaign. The many high-tech instruments on board will measure the size and distribution of cloud droplets, ice crystals, and aerosols, as well as record the temperature, winds, air pressure, and other standard atmospheric variables.

The instruments include NCAR's HIAPER Cloud Radar (HCR) and High Spectral Resolution Lidar (HSRL). The wing-mounted HCR is able to "see" inside clouds and characterize the droplets within, while the HSRL can measure air molecules and aerosols. Together, the two highly advanced instruments will give scientists a more complete picture of the wide range of particles in the atmosphere above the Southern Ocean.

The nature of the research — flying a plane in search of supercooled water — presents some challenges with aircraft icing.

"Oftentimes, the cleaner the air, the more probable large drops and severe icing conditions become," said Cory Wolff, the NCAR project manager who is overseeing aircraft operations for SOCRATES. "We have a number of precautions we're taking to mitigate that risk."

First, a mission coordinator whose sole job is to monitor icing conditions will join each flight. Second, the design of the flights themselves will help the crew anticipate icing conditions before they have to fly through them. On the flight south from Tasmania, the HIAPER GV will fly high above the clouds — and the icing danger.
During that leg of the flight, the scientists will collect information about the clouds below, both with onboard radar and lidar as well as with dropsondes — small instrument packages released from the aircraft. With that information, the scientists can determine whether it's safe to pilot the aircraft through the clouds on the return trip, collecting detailed information about the cloud composition.

Sailing the stormiest seas

The Australian R/V Investigator will take measurements of the atmosphere and ocean during its six-week voyage. (Image courtesy CSIRO.)

The measurements taken from the sky will be complemented by data collected from instruments on board the Australian R/V Investigator, including the NCAR Integrated Sounding System. The ISS gathers extensive data by using a radar wind profiler, surface meteorology sensors, and a balloon-borne radiosonde sounding system. The team will launch soundings every six hours, and sometimes more often, throughout the campaign.

"Observations from the ship will help us understand the background state of the atmosphere — how it's behaving," said NCAR scientist Bill Brown, who traveled to Australia in late November to prepare the ISS for the voyage.

The ship will be deployed for the entire six weeks and will face its own challenges, notably the notorious roughness of the Southern Ocean, sometimes called the stormiest place on Earth.

"There are no land masses to break up the winds down there," Brown said. "So the ocean can be quite rough."

SOCRATES investigators will also draw on measurements from another Australian ship, the R/V Aurora Australis, as it travels between Tasmania and Antarctica on resupply missions, as well as observations from buoys and some land-based instruments on Macquarie Island.

"I am excited that we will have such a comprehensive suite of observations," McFarquhar said. "If we just had the cloud observations we wouldn't have the appropriate context. If we just had the aerosols and measurements below the clouds, we wouldn't be able to understand the complete picture."

For more about the SOCRATES campaign, visit the project website.

Collaborating institutions:
Australian Antarctic Division
Australian Bureau of Meteorology
Australian Department of Environment and Energy
Colorado State University
Cooperative Institute for Mesoscale Meteorological Studies
CSIRO
Karlsruhe Institute of Technology
Monash University
National Center for Atmospheric Research
National Science Foundation
NorthWest Research Associates
Queensland University of Technology
University of California San Diego
University of Colorado Boulder
University of Illinois at Urbana-Champaign
University of Melbourne
University of Oklahoma
University of Washington

Groundbreaking data set gives unprecedented look at future weather

Dec. 7, 2017 | How will weather change in the future? It's been remarkably difficult to say, but researchers are now making important headway, thanks in part to a groundbreaking new data set at the National Center for Atmospheric Research (NCAR).

Scientists know that a warmer and wetter atmosphere will lead to major changes in our weather. But pinning down exactly how weather — such as thunderstorms, midwinter cold snaps, hurricanes, and mountain snowstorms — will evolve in the coming decades has proven a difficult challenge, constrained by the sophistication of models and the capacity of computers.

Now, a rich, new data set is giving scientists an unprecedented look at the future of weather. Nicknamed CONUS 1 by its creators, the data set is enormous. To generate it, the researchers ran the NCAR-based Weather Research and Forecasting model (WRF) at an extremely high resolution (4 kilometers, or about 2.5 miles) across the entire contiguous United States (sometimes known as "CONUS") for a total of 26 simulated years: half in the current climate and half in the future climate expected if society continues on its current trajectory.

The project took more than a year to run on the Yellowstone supercomputer at the NCAR-Wyoming Supercomputing Center. The result is a trove of data that allows scientists to explore in detail how today's weather would look in a warmer, wetter atmosphere.

CONUS 1, which was completed last year and made easily accessible through NCAR's Research Data Archive this fall, has already spawned nearly a dozen papers that explore everything from changes in rainfall intensity to changes in mountain snowpack.

"This was a monumental effort that required a team of researchers with a broad range of expertise — from climate experts and meteorologists to social scientists and data specialists — and years of work," said NCAR senior scientist Roy Rasmussen, who led the project. "This is the kind of work that's difficult to do in a typical university setting but that NCAR can take on and make available to other researchers."

Other principal project collaborators at NCAR are Changhai Liu and Kyoko Ikeda. A number of additional NCAR scientists lent expertise to the project, including Mike Barlage, Andrew Newman, Andreas Prein, Fei Chen, Martyn Clark, Jimy Dudhia, Trude Eidhammer, David Gochis, Ethan Gutmann, Gregory Thompson, and David Yates. Collaborators from the broader community include Liang Chen, Sopan Kurkute, and Yanping Li (University of Saskatchewan); and Aiguo Dai (SUNY Albany).

Climate and weather research coming together

Climate models and weather models have historically operated on different scales, both in time and space. Climate scientists are interested in large-scale changes that unfold over decades, and the models they've developed help them nail down long-term trends such as increasing surface temperatures, rising sea levels, and shrinking sea ice.

Climate models are typically low resolution, with grid points often separated by 100 kilometers (about 60 miles). The advantage to such coarse resolution is that these models can be run globally for decades or centuries into the future with the available supercomputing resources.
The downside is that they lack the detail to capture features that influence local atmospheric events, such as land surface topography, which drives mountain weather, or the small-scale circulation of warm air rising and cold air sinking that sparks a summertime thunderstorm.

Weather models, on the other hand, have higher resolution, take into account atmospheric microphysics, such as cloud formation, and can simulate weather fairly realistically. It's not practical to run them for long periods of time or globally, however — supercomputers are not yet up to the task.

As scientific understanding of climate change has deepened, the need has become more pressing to merge these disparate scales to gain a better understanding of how global atmospheric warming will affect local weather patterns.

"The climate community and the weather community are really starting to come together," Rasmussen said. "At NCAR, we have both climate scientists and weather scientists, we have world-class models, and we have access to state-of-the-art supercomputing resources. This allowed us to create a data set that offers scientists a chance to start answering important questions about the influence of climate change on weather."

Weather models are typically run at a much higher resolution than climate models, allowing them to more accurately capture precipitation. This figure compares the average annual precipitation from a 13-year run of the Weather Research and Forecasting (WRF) model (left) with the average annual precipitation from running the global Community Earth System Model (CESM) to simulate the climate between 1976 and 2005 (right). (©UCAR. This figure is courtesy Kyoko Ikeda. It is freely available for media & nonprofit use.)

Today's weather in a warmer, wetter future

To create the data set, the research team used WRF to simulate weather across the contiguous United States between 2000 and 2013. They initiated the model using a separate "reanalysis" data set constructed from observations. When compared with radar images of actual weather during that time period, the results were excellent.

"We weren't sure how good a job the model would do, but the climatology of real storms and the simulated storms was very similar," Rasmussen said.

With confidence that WRF could accurately simulate today's weather, the scientists ran the model for a second 13-year period, using the same reanalysis data but with a few changes. Notably, the researchers increased the temperature of the background climate conditions by about 5 degrees Celsius (9 degrees Fahrenheit), the end-of-the-century temperature increase predicted by averaging 19 leading climate models under a business-as-usual greenhouse gas emissions scenario (2080–2100). They also increased the water vapor in the atmosphere by the corresponding amount, since physics dictates that a warmer atmosphere can hold more moisture.

The result is a data set that examines how weather events from the recent past — including named hurricanes and other distinctive weather events — would look in our expected future climate.

The data have already proven to be a rich resource for people interested in how individual types of weather will respond to climate change. Will the squall lines of intense thunderstorms that rake across the country's midsection get more intense, more frequent, and larger? (Yes, yes, and yes.) Will snowpack in the West get deeper, shallower, or disappear?
(Depends on the location: The high-elevation Rockies are much less vulnerable to the warming climate than the coastal ranges.) Other scientists have already used the CONUS 1 data set to examine changes to rainfall-on-snow events, the speed of snowmelt, and more.

Running the Weather Research and Forecasting (WRF) model at 4-kilometer resolution over the contiguous United States produced realistic simulations of precipitation. Above, average annual precipitation from WRF for the years 2000-2013 (left) compared to the PRISM dataset for the same period (right). PRISM is based on observations. (©UCAR. This figure is courtesy Kyoko Ikeda. It is freely available for media & nonprofit use.)

Pinning down changes in storm track

While the new data set offers a unique opportunity to delve into changes in weather, it also has limitations. Importantly, it does not reflect how the warming climate might shift large-scale weather patterns, like the typical track most storms take across the United States. Because the same reanalysis data set is used to kick off both the current and future climate simulations, the large-scale weather patterns remain the same in both scenarios.

To remedy this, the scientists are already working on a new simulation, nicknamed CONUS 2. Instead of using the reanalysis data set — which was built from actual observations — to kick off the modeling run of the present-day climate, the scientists will use weather extracted from a simulation by the NCAR-based Community Earth System Model (CESM). For the future climate run, the scientists will again take the weather patterns from a CESM simulation — this time for the year 2100 — and feed the information into WRF.

The finished data set, which will cover two 20-year periods, will likely take another year of supercomputing time to complete, this time on the newer and more powerful Cheyenne system at the NCAR-Wyoming Supercomputing Center. When complete, CONUS 2 will help scientists understand how expected future storm track changes will affect local weather across the country.

Scientists are already eagerly awaiting the data from the new runs, which could start in early 2018. But even that data set will have limitations. One of the greatest may be that it will rely on a single run from CESM. Another NCAR-based project ran the model 40 times from 1920 to 2100 with only minor changes to the model's starting conditions, showing researchers how the natural chaos of the atmosphere can cause the climate to look quite different from simulation to simulation.

Still, a single run of CESM will let scientists make comparisons between CONUS 1 and CONUS 2, allowing them to pinpoint possible storm track changes in local weather. And CONUS 2 can also be compared to other efforts that downscale global simulations to study how regional areas will be affected by climate change, providing insight into the pros and cons of different research approaches.

"This is a new way to look at climate change that allows you to examine the phenomenology of weather and answer the question, 'What will today's weather look like in a future climate?'" Rasmussen said.
"This is the kind of detailed, realistic information that water managers, city planners, farmers, and others can understand and helps them plan for the future."Get the dataHigh Resolution WRF Simulations of the Current and Future Climate of North America, DOI: 10.5065/D6V40SXPStudies that relied on the CONUS data setExtreme downpours could increase fivefold across parts of the U.S.Slower snowmelt in a warming worldNorth American storm clusters could produce 80 percent more rainWriter/contact:Laura Snider, Senior Science Writer

New model reveals origins of the Sun's seasons

December 4, 2017 | The solar seasons — which change every six to 18 months from "bursty" to quiet, and vice versa — have been realistically simulated for the first time in a computer model of the Sun's shear layer beneath the turbulent outer shell, an advance that promises the ability to predict these seasonal fluctuations nearly a year in advance.

Solar seasons, discovered just a few years ago, are periods of greater or lesser solar activity. In what scientists have dubbed the bursty season, sunspots and the flares that can accompany them are more common. In the quiet season they are fewer and farther between.

These seasons are superimposed on the approximately 11-year solar cycle, when the Sun transitions from solar minimum (fewer sunspots) to solar maximum (more sunspots) and back again. The seasons serve to amplify — or dampen — the Sun's background state.

Now, scientists at the National Center for Atmospheric Research have simulated these seasons in a sophisticated solar model, discovering for the first time the physical mechanisms at their root. The new research, led by NCAR scientist Mausumi Dikpati, was published earlier this month in the Nature journal Scientific Reports.

Already, Dikpati and colleagues are working on using the model — fed with observations of magnetic fields on the front and back sides of the Sun — to make predictions of seasonal changes up to a year in advance. Such predictions are valuable because the major solar flares and coronal mass ejections that are more likely to occur during the bursty season can cause havoc on Earth, scrambling radio communications, damaging satellites, disabling power grids, and imperiling astronauts.

"Right now, space weather forecasters issue at most a one-day warning — sometimes just a few hours — that a coronal mass ejection might cause a damaging geomagnetic storm here on Earth," Dikpati said. "Having a model that captures the physical mechanisms behind the Sun's seasons can better equip scientists to forecast these storms."

NASA's Solar Dynamics Observatory captured a solar flare exploding from the face of the Sun on April 17, 2016. (Image courtesy NASA.)

A back-and-forth energy exchange

In the new study, the scientists find that the solar seasons owe their origin to the interaction between two phenomena tied to the Sun's magnetic fields: Rossby waves and differential rotation.

Rossby waves, only recently discovered in observations of the Sun, are large-scale planetary waves that can also be found in Earth's atmosphere and oceans. Differential rotation refers to the fact that the Sun's equator rotates more quickly than its poles. This difference allows the solar magnetic field to twist and tangle, sometimes combining into ropes of magnetic field lines that can burst from the Sun's surface.

Dikpati and her colleagues found that when the Rossby waves are tilted in a particular direction, they can feed on energy from the Sun's differential rotation. Once the Rossby waves have extracted all the available energy, the waves begin to straighten and feed energy back to the differential rotation, eventually tilting in the opposite direction. Then the cycle repeats.

This back-and-forth exchange of energy marks the changing of the solar seasons. The Sun's bursty season coincides with the period when Rossby waves have their maximum energy. During these times, the Rossby waves deform the surface of the Sun's shear layer into bulges and depressions.
When the bulges coincide with a rope of magnetic field lines, they provide an opportunity for those magnetic field lines to more easily break through the Sun's surface, often creating flares and coronal mass ejections, including very strong ones that affect Earth.

Bursty seasons — no matter whether they occur during a solar cycle that is stronger or weaker than normal — contain the most dangerous space weather events. For example, one of the strongest solar storms ever observed was generated in July 2012 during the current solar cycle, which is considered weak. The solar storm narrowly missed hitting Earth. If it had, solar scientists say that the impact on our modern, technology-driven society could have been devastating.

"The Sun is remarkably complex, and this modeling effort has given us some insight into the structures of the seemingly chaotic magnetic field," Dikpati said. "More complex modeling, with assimilation of more observations, will allow us to continue to work on improving prediction of dangerous solar storms."

The research was funded by the National Science Foundation, NCAR's sponsor. The model simulations for the study were run on both the Yellowstone and Cheyenne supercomputers at the NCAR-Wyoming Supercomputing Center. Other co-authors of the study are Paul Cally (Monash University Clayton in Australia), Scott McIntosh (NCAR), and Eyal Heifetz (Tel Aviv University in Israel).

A new team has formed to work with Dikpati on using the model for prediction. The team includes Yuhong Fan, Scott McIntosh, Lisa Upton, Jeff Anderson, and Nancy Collins (all of NCAR); Aimee Norton (Stanford University); Marty Snow (University of Colorado Boulder); and Doug Biesecker (National Oceanic and Atmospheric Administration).

About the article
Title: The Origin of the "Seasons" in Space Weather
Authors: Mausumi Dikpati, Paul S. Cally, Scott W. McIntosh, and Eyal Heifetz
Journal: Scientific Reports, DOI: 10.1038/s41598-017-14957-x
Writer/contact: Laura Snider, Senior Science Writer

North American storm clusters could produce 80 percent more rain

BOULDER, Colo. — Major clusters of summertime thunderstorms in North America will grow larger, more intense, and more frequent later this century in a changing climate, unleashing far more rain and posing a greater threat of flooding across wide areas, new research concludes.

The study, by scientists at the National Center for Atmospheric Research (NCAR), builds on previous work showing that storms are becoming more intense as the atmosphere is warming. In addition to higher rainfall rates, the new research finds that the volume of rainfall from damaging storms known as mesoscale convective systems (MCSs) will increase by as much as 80 percent across the continent by the end of this century, deluging entire metropolitan areas or sizable portions of states.

"The combination of more intense rainfall and the spreading of heavy rainfall over larger areas means that we will face a higher flood risk than previously predicted," said NCAR scientist Andreas Prein, the study's lead author. "If a whole catchment area gets hammered by high rain rates, that creates a much more serious situation than a thunderstorm dropping intense rain over parts of the catchment."

"This implies that the flood guidelines which are used in planning and building infrastructure are probably too conservative," he added.

The research team drew on extensive computer modeling that realistically simulates MCSs and thunderstorms across North America to examine what will happen if emissions of greenhouse gases continue unabated. The study will be published Nov. 20 in the journal Nature Climate Change. It was funded by the National Science Foundation, which is NCAR's sponsor, and by the U.S. Army Corps of Engineers.

Hourly rain rate averages for the 40 most extreme summertime mesoscale convective systems (MCSs) in the current (left) and future (right) climate of the mid-Atlantic region. New research shows that MCSs will generate substantially higher maximum rain rates over larger areas by the end of the century if society continues a "business as usual" approach of emitting greenhouse gases. (©UCAR, Image by Andreas Prein, NCAR. This image is freely available for media & nonprofit use.)

A warning signal

Thunderstorms and other heavy rainfall events are estimated to cause more than $20 billion of economic losses annually in the United States, the study notes. Particularly damaging, and often deadly, are MCSs: clusters of thunderstorms that can extend for many dozens of miles and last for hours, producing flash floods, debris flows, landslides, high winds, and/or hail. The persistent storms over Houston in the wake of Hurricane Harvey were an example of an unusually powerful and long-lived MCS.

Storms have become more intense in recent decades, and a number of scientific studies have shown that this trend is likely to continue as temperatures continue to warm. The reason, in large part, is that the atmosphere can hold more water as it gets warmer, thereby generating heavier rain.

A study by Prein and co-authors last year used high-resolution computer simulations of current and future weather, finding that the number of summertime storms that produce extreme downpours could increase by five times across parts of the United States by the end of the century. In the new study, Prein and his co-authors focused on MCSs, which are responsible for much of the major summertime flooding east of the Continental Divide.
They investigated not only how their rainfall intensity will change in future climates, but also how their size, movement, and rainfall volume may evolve.

Analyzing the same dataset of computer simulations and applying a special storm-tracking algorithm, they found that the number of severe MCSs in North America more than tripled by the end of the century. Moreover, maximum rainfall rates became 15 to 40 percent heavier, and intense rainfall reached farther from the storm's center. As a result, severe MCSs increased throughout North America, particularly in the northeastern and mid-Atlantic states, as well as parts of Canada, where they are currently uncommon.

The research team also looked at the potential effect of particularly powerful MCSs on the densely populated Eastern Seaboard. They found, for example, that at the end of the century, intense MCSs over an area the size of New York City could drop 60 percent more rain than a severe present-day system. That amount is equivalent to adding six times the annual discharge of the Hudson River on top of a current extreme MCS in that area.

"This is a warning signal that says the floods of the future are likely to be much greater than what our current infrastructure is designed for," Prein said. "If you have a slow-moving storm system that aligns over a densely populated area, the result can be devastating, as could be seen in the impact of Hurricane Harvey on Houston."

This satellite image loop shows an MCS developing over West Virginia on June 23, 2016. The storm caused widespread flooding that killed more than 20 people. MCSs are responsible for much of the major flooding east of the Continental Divide during warm weather months. (Image by NOAA National Weather Service, Aviation Weather Center.)

Intensive modeling

Advances in computer modeling and more powerful supercomputing facilities are enabling climate scientists to begin examining the potential influence of a changing climate on convective storms such as thunderstorms, building on previous studies that looked more generally at regional precipitation trends.

For the new study, Prein and his co-authors turned to a dataset created by running the NCAR-based Weather Research and Forecasting (WRF) model over North America at a resolution of 4 kilometers (about 2.5 miles). That is sufficiently fine-scale resolution to simulate MCSs. The intensive modeling, by NCAR scientists and study co-authors Roy Rasmussen, Changhai Liu, and Kyoko Ikeda, required a year to run on the Yellowstone system at the NCAR-Wyoming Supercomputing Center.

The team used an algorithm developed at NCAR to identify and track simulated MCSs. They compared simulations of the storms at the beginning of the century, from 2000 to 2013, with observations of actual MCSs during the same period and showed that the modeled storms are statistically identical to real MCSs.

The scientists then used the dataset and algorithm to examine how MCSs may change by the end of the century in a climate that is approximately 5 degrees Celsius (9 degrees Fahrenheit) warmer than in the pre-industrial era — the temperature increase expected if greenhouse gas emissions continue unabated.

About the paper
Title: Increased rainfall volume from future convective storms in the US
Authors: Andreas F. Prein, Changhai Liu, Kyoko Ikeda, Stanley B. Trier, Roy M. Rasmussen, Greg J. Holland, Martyn P. Clark
Journal: Nature Climate Change
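The identify-and-track step described above can be illustrated with a toy example: flag grid cells above a rain-rate threshold, group contiguous cells into storm features, and link features that overlap between consecutive time steps. The sketch below is only a schematic of that general approach, with made-up thresholds and synthetic data; it is not the NCAR tracking algorithm.

```python
# Toy sketch of threshold-and-track storm identification (not the NCAR
# algorithm): label contiguous heavy-rain regions in each hourly field and
# link regions that overlap between consecutive hours. The threshold, minimum
# size, and synthetic fields are assumptions for illustration only.
import numpy as np
from scipy import ndimage

RAIN_THRESHOLD = 10.0   # mm/hr, assumed heavy-rain cutoff
MIN_CELLS = 25          # assumed minimum feature size in grid cells

def label_features(rain_field):
    """Label contiguous regions above the threshold, dropping small features."""
    labeled, n_features = ndimage.label(rain_field >= RAIN_THRESHOLD)
    for lab in range(1, n_features + 1):
        if np.count_nonzero(labeled == lab) < MIN_CELLS:
            labeled[labeled == lab] = 0
    return labeled

def link_features(prev_labels, curr_labels):
    """Pair feature IDs in consecutive time steps that overlap in space."""
    overlap = (prev_labels > 0) & (curr_labels > 0)
    return {(int(p), int(c)) for p, c in zip(prev_labels[overlap], curr_labels[overlap])}

# Example: one synthetic storm that moves east between two hourly fields.
hour1 = np.zeros((100, 100))
hour1[40:60, 30:50] = 20.0                 # a 20x20-cell block of heavy rain
hour2 = np.roll(hour1, shift=8, axis=1)    # the same feature, shifted eastward
print(link_features(label_features(hour1), label_features(hour2)))  # {(1, 1)}
```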

New research could predict La Niña drought years in advance

NCAR Senior Scientist Clara Deser is a co-author of two new studies, published this month in the journal Geophysical Research Letters, that examine the impacts and predictability of La Niña. The following excerpt is from a news release by the University of Texas at Austin, a UCAR Member.

Nov. 16, 2017 | Two new studies from The University of Texas at Austin have significantly improved scientists' ability to predict the strength and duration of droughts caused by La Niña – a recurrent cooling pattern in the tropical Pacific Ocean. Their findings, which predict that the current La Niña is likely to stretch into a second year, could help scientists know years in advance how a particular La Niña event is expected to evolve.

"Some La Niña events last two years, and predicting them is extremely challenging," said Pedro DiNezio, a research associate at the University of Texas Institute for Geophysics (UTIG).

The studies were published in November in the journal Geophysical Research Letters. DiNezio and UTIG Research Associate Yuko Okumura were authors on both studies and collaborated with scientists from the National Center for Atmospheric Research (NCAR). UTIG is a research unit of the UT Jackson School of Geosciences.

The southern United States, including parts of eastern Texas, regularly experiences warm and dry winters caused by La Niña. Therefore, predicting La Niña's evolution, particularly its duration, is key.

Read the full news release.

UCAR Congressional Briefing: Moving research to industry

WASHINGTON — Federally funded scientific advances are enabling the multibillion-dollar weather industry to deliver increasingly targeted forecasts to consumers and businesses, strengthening the economy and providing the nation with greater resilience to natural disasters, experts said today at a congressional briefing.

The panel of experts, representing universities, federally funded labs, and the private sector, said continued government investment in advanced computer modeling, observing tools, and other basic research provides the foundation for improved forecasts. The nonprofit University Corporation for Atmospheric Research (UCAR) sponsored the briefing.

"Thanks to a quiet revolution in modern weather prediction, we can all use forecasts to make decisions in ways that wouldn't have been possible just 10 years ago," said Rebecca Morss, a senior scientist with the National Center for Atmospheric Research (NCAR) and deputy director of the center's Mesoscale and Microscale Meteorology Lab. "Now we are looking to the next revolution, which includes giving people longer lead times and communicating risk as effectively as possible."

Fuqing Zhang, a professor of meteorology and statistics at Pennsylvania State University, highlighted the ways that scientists are advancing their understanding of hurricanes and other storms with increasingly detailed observations and computer modeling. Researchers at Penn State, for example, fed data from the new National Oceanic and Atmospheric Administration GOES-R satellite into NOAA's powerful FV3 model to generate an experimental forecast of Hurricane Harvey that simulated its track and intensity.

"The future of weather forecasting is very promising," said Zhang, who is also the director of the Penn State Center for Advanced Data Assimilation and Predictability Techniques. "With strategic investments in observations, modeling, data assimilation, and supercomputing, we will see some remarkable achievements."

Mary Glackin, director of science and forecast operations for The Weather Company, an IBM business, said the goal of the weather industry is to help consumers and businesses make better decisions, both by providing its own forecasts and by forwarding alerts from the National Weather Service. The Weather Company currently is adapting a powerful research weather model based at NCAR, the Model for Prediction Across Scales (MPAS), for use in worldwide, real-time forecasts.

The NCAR-based Model for Prediction Across Scales simulates the entire globe while enabling scientists to zoom in on areas of interest. It is one of the key tools for improving forecasts in the future. (©UCAR. This image is freely available for media & nonprofit use.)

"We have a weather and climate enterprise that we can be extremely proud of as a nation, but it's not where it should be," Glackin said. "Weather affects every consumer and business, and the public-private partnership can play a pivotal role in providing better weather information that is critically needed."

Antonio Busalacchi, president of UCAR, emphasized the benefits of partnerships across the academic, public, and private sectors. He said that research investments by the National Science Foundation, NOAA, and other federal agencies are critical for improving forecasts that will better protect vulnerable communities and strengthen the economy.

"These essential collaborations between government agencies, universities, and private companies are driving landmark advances in weather forecasting," Busalacchi said.
"The investments that taxpayers are making in basic research are paying off many times over by keeping our nation safer and more prosperous."The briefing was the latest in a series of UCAR Congressional Briefings that draw on expertise from UCAR's university consortium and public-private partnerships to provide insights into critical topics in the Earth system sciences. Past briefings have focused on wildfires, predicting space weather, aviation weather safety, the state of the Arctic, hurricane prediction, potential impacts of El Niño, and new advances in water forecasting.

New climate forecasts for watersheds - and the water sector

Nov. 10, 2017 | Water managers and streamflow forecasters can now access bi-weekly, monthly, and seasonal precipitation and temperature forecasts that are broken down by individual watersheds, thanks to a research partnership between the National Center for Atmospheric Research (NCAR) and the University of Colorado Boulder (CU Boulder). The project is sponsored by the National Oceanic and Atmospheric Administration (NOAA) through the Modeling, Applications, Predictions, and Projections program.

Operational climate forecasts for subseasonal to seasonal time scales are currently provided by the NOAA Climate Prediction Center and other sources. The forecasts usually take the form of national contour maps and gridded datasets at a relatively coarse geographic resolution. Some forecast products are broken down further, based on state boundaries or on climate divisions, which average two per state; others are summarized for major cities.

But river forecasters and water managers grapple with climate variability and trends in the particular watersheds within their service areas, which do not align directly with the boundaries of existing forecast areas. A forecast that directly describes predicted conditions inside an individual watershed would be extremely valuable to these users for making decisions in their management areas, such as how much water to release or store in critical reservoirs and when.

To bridge this gap, the NCAR–CU Boulder research team has developed a new prototype prediction system that maps climate forecasts to watershed boundaries over the contiguous United States in real time. The system is currently running at NCAR, with real-time forecasts and analyses available on a demonstration website.

"We are trying to improve the accessibility and relevance of climate predictions for streamflow forecasting groups and water managers," said NCAR scientist Andy Wood, who co-leads the project. "We can't solve all the scientific challenges of climate prediction, but we can make it easier for a person thinking about climate and water in a river basin — such as the Gunnison, or the Yakima, or the Potomac — to find and download operational climate information that has been tailored to that basin's observed variability."

The project is funded by NOAA, and the scientists plan to hand off successful components of the system for experimental operational evaluation within the NOAA National Weather Service. Collaborators include scientists from the NOAA Climate Prediction Center and partners from the major federal water agencies: the U.S. Army Corps of Engineers and the Bureau of Reclamation.

This screenshot of the S2S Climate Outlooks for Watersheds website shows forecasted temperature anomalies for watersheds across the contiguous United States. As users scroll across different watersheds, they get more precise information. In this screenshot from early November 2017, the forecast shows that, over the next one to two weeks, the Colorado Headwaters watershed is expected to be 1.2 degrees warmer than normal. Visit the website to learn more. (©UCAR. This image is freely available for media & nonprofit use.)

Beyond the standard weather forecast

Precipitation and temperature forecasts that extend beyond the typical 7- to 10-day window can be useful to water managers making a number of important decisions about how to best regulate supplies.
For instance, during a wet water year, when snowpack is high and reservoirs are more full than usual, the relative warmth or coolness of the coming spring can affect how quickly the snow melts. Good spring season forecasts allow water managers to plan in advance for how to best manage the resulting runoff.

For water systems in drought, such as California's during 2012–2015, early outlooks on whether the winter rainy season will help alleviate the drought or exacerbate it can help water utilities strategize ways of meeting the year's water demands.

Historically, making these kinds of longer-term predictions accurately has been highly challenging. But in recent years, scientists have improved their skill at subseasonal and seasonal climate prediction. NOAA's National Centers for Environmental Prediction plays a key role, both running an in-house modeling system — the Climate Forecast System, version 2 (CFSv2) — and leading an effort called the North American Multi-Model Ensemble (NMME). These model-based forecasts help inform the NOAA official climate forecasts, which also include other tools and expert judgment.

NMME combines forecasts from seven different climate models based in the U.S. and Canada to form a super-ensemble of climate predictions that extend up to 10 months into the future. The combination of the different forecasts is often more accurate than the forecast from any single model. Temperature forecasts, in particular, from the combined system are notably more accurate than they were 10 years ago, Wood said, partly due to their representation of observed warming trends. Even with these new tools, however, predicting seasonal precipitation beyond the first month continues to be a major challenge.

The NCAR–CU Boulder project makes use of both the CFSv2 and NMME forecasts. It generates predictions for bi-weekly periods (weeks 1-2, 2-3, and 3-4) from CFSv2 that are updated daily, and longer-term forecasts derived from the NMME (months 1, 2, 3, and season 1) that are updated monthly. The scientists currently map these forecasts to 202 major watersheds in the contiguous U.S.

Analyzing forecast skill

The resulting watershed-specific forecasts are available in real time on the project's interactive website, which also provides information about their accuracy and reliability.

"It's important for users to be able to check on the quality of the forecasts," said Sarah Baker, a doctoral student in the Civil, Environmental, and Architectural Engineering Department at CU Boulder. "We're able to use hindcasts, which are long records of past forecasts, to analyze and describe the skill of the current forecasts."

Baker, who also works for the Bureau of Reclamation, has been building the prototype system under the supervision of Wood and her academic adviser, CU Professor Balaji Rajagopalan. The researchers are also using analyses of forecast accuracy and reliability to begin correcting for systematic biases — such as consistently over-predicting springtime rains in one watershed or under-predicting summertime heat in another — in the forecasts.

The project team has presented the project at a number of water-oriented meetings in the western U.S.
Water managers, operators, and researchers from agencies such as the Bureau of Reclamation and utilities such as the Southern Nevada Water Authority, which manages water for Las Vegas, have expressed interest in the new forecast products.

"This project has great potential to provide climate outlook information that is more relevant for hydrologists and the water sector. It will be critical to connect with stakeholders or possible users of the forecasts so that their needs can continue to help shape this type of information product," said NOAA's Andrea Ray. Ray leads an effort funded by NIDIS, the National Integrated Drought Information System, to identify tools and information such as this for a NOAA online Water Resources Monitor and Outlook that would also help connect stakeholders to climate and water information.

In the coming year, the research team will implement statistical post-processing methods to improve the accuracy of the forecasts. They will also investigate the prediction of extreme climate events at the watershed scale.

Contact
Andy Wood, NCAR Research Applications Laboratory

Website
http://hydro.rap.ucar.edu/s2s

Collaborators
CU Boulder
NCAR
NOAA
U.S. Army Corps of Engineers
Bureau of Reclamation

Funder
NOAA's Modeling, Applications, Predictions and Projections Climate Testbed program
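As a rough illustration of the core regridding step such a system needs, the sketch below collapses a gridded forecast-anomaly field to one value per watershed using area-weighted averages over fractional-coverage masks. It is only a hedged sketch under assumed inputs, not the NCAR–CU Boulder code; the grid values, cell areas, and basin names are placeholders.

```python
# Hedged sketch of mapping a gridded forecast to watershed averages (not the
# NCAR-CU Boulder system): each watershed gets the area-weighted mean of the
# forecast anomaly over the grid cells it covers. Grid values, cell areas,
# and fractional-coverage masks below are made-up placeholders.
import numpy as np

def watershed_means(anomaly_grid, cell_area, masks):
    """Return {watershed id: area-weighted mean anomaly}.

    anomaly_grid : 2-D array of forecast anomalies (e.g., deg C)
    cell_area    : 2-D array of grid-cell areas (e.g., km^2)
    masks        : dict mapping watershed id -> 2-D array of fractional coverage (0-1)
    """
    means = {}
    for basin, frac in masks.items():
        weights = frac * cell_area
        means[basin] = float((anomaly_grid * weights).sum() / weights.sum())
    return means

# Example: a tiny 3x3 grid covering two partially overlapping basins.
anom = np.array([[1.5, 1.2, 0.8],
                 [1.1, 0.9, 0.4],
                 [0.7, 0.3, 0.1]])
area = np.full((3, 3), 100.0)   # equal-area cells, km^2
masks = {
    "colorado_headwaters": np.array([[1.0, 1.0, 0.0], [1.0, 0.5, 0.0], [0.0, 0.0, 0.0]]),
    "gunnison":            np.array([[0.0, 0.0, 0.0], [0.0, 0.5, 1.0], [0.0, 1.0, 1.0]]),
}
print(watershed_means(anom, area, masks))
```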
