Climate & Climate Change

The rate of sea level rise is accelerating, a new study finds

NCAR scientist John Fasullo is a co-author of a new study appearing in the Proceedings of the National Academy of Sciences. The study finds that the rate of sea level rise is accelerating. The following is an excerpt from a news release by the Cooperative Institute for Research in Environmental Sciences.

February 13, 2018 | Global sea level rise is not cruising along at a steady 3 mm per year; it's accelerating a little every year, according to a new study that harnessed 25 years of satellite data to calculate that the rate is increasing by about 0.08 mm/year every year — which could mean an annual rate of sea level rise of 10 mm/year, or even more, by 2100.

"This acceleration, driven mainly by accelerated melting in Greenland and Antarctica, has the potential to double the total sea level rise by 2100 as compared to projections that assume a constant rate — to more than 60 cm instead of about 30," said lead author Steve Nerem, a scientist at the Cooperative Institute for Research in Environmental Sciences. "And this is almost certainly a conservative estimate," he added. "Our extrapolation assumes that sea level continues to change in the future as it has over the last 25 years. Given the large changes we are seeing in the ice sheets today, that's not likely."

If the oceans continue to change at this pace, sea level will rise 65 cm (26 inches) by 2100 — enough to cause significant problems for coastal cities, according to the new assessment by Nerem and several colleagues from CU Boulder, the University of South Florida, NASA Goddard Space Flight Center, Old Dominion University, and the National Center for Atmospheric Research. The team, driven to understand and better predict Earth's response to a warming world, published their work today in the journal Proceedings of the National Academy of Sciences.

Rising concentrations of greenhouse gases in Earth's atmosphere increase the temperature of air and water, which causes sea level to rise in two ways.
First, warmer water expands, and this "thermal expansion" of the oceans has contributed about half of the 7 cm of global mean sea level rise we've seen over the last 25 years, Nerem said. Second, melting land ice flows into the ocean, also increasing sea level across the globe.

These increases were measured using satellite altimeter data collected since 1992 by the U.S./European TOPEX/Poseidon, Jason-1, Jason-2, and Jason-3 missions. But detecting acceleration is challenging, even in such a long record. Episodes like volcanic eruptions can create variability: the eruption of Mount Pinatubo in 1991 decreased global mean sea level just before the TOPEX/Poseidon launch, for example. In addition, global sea level can fluctuate due to climate patterns such as El Niño and La Niña events (the opposing phases of the El Niño–Southern Oscillation, or ENSO), which influence ocean temperature and global precipitation patterns.

Read the full news release here.
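The headline numbers follow from a simple quadratic extrapolation: a rate that starts near 3 mm/year and grows by about 0.08 mm/year each year, integrated out to 2100. A minimal sketch (the 2005 reference year and the 0.084 mm/yr² coefficient are assumptions for illustration; the article rounds the acceleration to about 0.08):

```python
RATE_2005 = 3.0  # global mean sea level rise near 2005, mm/yr (approximate)
ACCEL = 0.084    # acceleration, mm/yr per year ("about 0.08" in the article)

def rate(years_after_2005):
    """Extrapolated rate of rise in mm/yr, assuming constant acceleration."""
    return RATE_2005 + ACCEL * years_after_2005

def total_rise(years_after_2005):
    """Cumulative rise in mm: the integral of the linearly growing rate."""
    t = years_after_2005
    return RATE_2005 * t + 0.5 * ACCEL * t ** 2

t_2100 = 2100 - 2005
print(f"rate in 2100: {rate(t_2100):.1f} mm/yr")          # about 11 mm/yr
print(f"rise by 2100: {total_rise(t_2100) / 10:.0f} cm")  # about 66 cm
```

With the acceleration term, the cumulative rise (roughly 66 cm under these assumptions) is more than double the roughly 28 cm that a constant 3 mm/year would give, consistent with the study's "more than 60 cm instead of about 30."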

Drier and wetter: The future of precipitation variability

January 17, 2018 | Precipitation variability — the swing from dry to wet and back again — will continue to increase across the majority of the world's land area as the climate warms, according to a new study led by scientists at the National Center for Atmospheric Research. The researchers expect precipitation variability to become greater from day to day, year to year, and even decade to decade.

The new research, published in the Nature journal Scientific Reports, presents results from sophisticated computer simulations predicting that there will be both more droughts and more floods within the same areas as the climate warms. The findings are relevant for water managers who need to make long-range plans.

"When it's dry, it will be drier. When it's wet, it will be wetter — in the same place," said NCAR scientist Angeline Pendergrass, lead author of the study. "There will be a broader range of conditions that will become 'normal.'"

The research was funded by the National Science Foundation, which is NCAR's sponsor, and by the U.S. Department of Energy.

As the climate continues to warm, the range of precipitation that is "normal" in a particular place is likely to grow, meaning a single location can become both wetter and drier. The image on the left shows a flood in Colorado. The image on the right shows a drought in Texas. (Images courtesy the U.S. Department of Defense and U.S. Department of Agriculture.)

New tools to study changes in precipitation

Historically, changes in precipitation variability have been difficult to pin down because the amount of rain or snow a particular region gets can vary a great deal naturally. But in recent years, the availability of large ensembles of climate model runs has allowed scientists to begin separating some of the more subtle impacts of climate change from the natural chaos in the climate system.
These ensembles may include 30 or 40 runs of a single climate model over the same time period with slightly different, but equally plausible, initial conditions.

Pendergrass and her colleagues — NCAR scientists Flavio Lehner, Clara Deser, and Benjamin Sanderson, along with ETH Zürich professor Reto Knutti — took a closer look at precipitation variability using large ensembles of runs from the NCAR-based Community Earth System Model (CESM) and from the Geophysical Fluid Dynamics Laboratory (GFDL) climate model. They also looked at a collection of individual runs from many different climate models, known as the Coupled Model Intercomparison Project Phase 5, or CMIP5.

The team found that precipitation variability will likely increase substantially over two-thirds of the world's land areas by the end of the century if greenhouse gas emissions continue unabated. They also found that, on average, variability increases 4 to 5 percent over land per degree Celsius of warming, and that variability increases across all time scales, from days to decades.

"This increase in variability is arising due to more moisture in the atmosphere and a weakening of global atmospheric circulation," Pendergrass said. "That's important because it means that changes in precipitation variability are not just linked to changes in El Niño and La Niña events, as some previous work implied."

Helping water managers plan for the future

Pendergrass hopes the study's findings will be used by water managers in their future planning.
Models used today by water managers often assume that the change in precipitation variability in the future will track with the expected increase in average precipitation. But the new study finds that the increase in precipitation variability will outstrip the increase in average precipitation, which means that water managers may be miscalculating the magnitude of future swings from wet to dry or vice versa.

"Water managers may be underestimating how much heavy events — floods or droughts — will change," Pendergrass said.

About the article
Title: Precipitation variability increases in a warmer climate
Authors: Pendergrass, A. G., R. Knutti, F. Lehner, C. Deser, and B. M. Sanderson
Journal: Scientific Reports, DOI: 10.1038/s41598-017-17966-y
Writer/contact: Laura Snider, Senior Science Writer
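The mismatch the study highlights can be put in rough numbers. The 4 to 5 percent per degree Celsius figure for variability comes from the study; the roughly 2 percent per degree increase in mean precipitation used for contrast below is an assumed, typical model-derived value, not a number from this paper:

```python
VARIABILITY_PER_DEG_C = 0.045  # 4-5% per deg C over land (study's average)
MEAN_PRECIP_PER_DEG_C = 0.02   # ~2% per deg C, an assumed typical value

def fractional_increase(scaling_per_deg_c, warming_deg_c):
    """Simple linear scaling of a fractional change with warming."""
    return scaling_per_deg_c * warming_deg_c

warming = 3.0  # degrees C, an example end-of-century scenario
print(f"variability:        +{fractional_increase(VARIABILITY_PER_DEG_C, warming):.1%}")
print(f"mean precipitation: +{fractional_increase(MEAN_PRECIP_PER_DEG_C, warming):.1%}")
```

Under these assumptions the swings grow about twice as fast as the average, which is why planning tools that tie future variability to the mean would understate the extremes.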

The climate secrets of southern clouds

BOULDER, Colo. — This month, an international team of scientists will head to the remote Southern Ocean for six weeks to tackle one of the region's many persistent mysteries: its clouds. What they discover will be used to improve climate models, which routinely underestimate the amount of solar radiation reflected back into space by clouds in the region. Accurately simulating the amount of radiation that is absorbed or reflected on Earth is key to calculating how much the globe is warming.

The field campaign, called the Southern Ocean Clouds, Radiation, Aerosol Transport Experimental Study, or SOCRATES, could also help scientists understand the very nature of how clouds interact with aerosols — particles suspended in the atmosphere that can come from either natural or human-made sources. Aerosols can spur cloud formation, change cloud structure, and affect precipitation, all of which affect the amount of solar radiation that is reflected.

During the mission, which will run from mid-January through February, the scientists will collect data from a bevy of advanced instruments packed onboard an aircraft and a ship, both of which are specially designed for scientific missions.

"SOCRATES will allow for some of the best observations of clouds, aerosols, radiation, and precipitation that have ever been collected over the Southern Ocean," said Greg McFarquhar, a principal investigator and the director of the University of Oklahoma Cooperative Institute for Mesoscale Meteorological Studies (CIMMS). "These data will provide us with critical insight into the physics of cloud formation in the region, information we can use to improve global climate models."

The U.S. portion of SOCRATES is largely funded by the National Science Foundation (NSF).

"The Southern Ocean is famously remote and stormy, and it's hard to imagine a worse place to do a field campaign.
But a vast, stormy ocean is a great laboratory for studying clouds, and it's clear from our models that we have a lot to learn about them," said Eric DeWeaver, program director for Climate and Large-Scale Dynamics in NSF's Directorate for Geosciences.

"I'm excited about this campaign because I think it will answer some fundamental questions about clouds and their dependence on atmospheric conditions," DeWeaver said. "We'll be able to use this information to understand cloud behavior closer to home and how clouds are likely to adjust to changing climatic conditions."

Critical observing and logistical support for SOCRATES is being provided by the Earth Observing Laboratory (EOL) at the National Center for Atmospheric Research (NCAR). Other U.S. principal investigators are based at the University of Washington. The Australian portion of SOCRATES is largely funded by the country's government through the Australian Marine National Facility, which is owned and operated by CSIRO.

A supercooled mystery

McFarquhar and his colleagues think the reason climate models are not accurately capturing the amount of radiation reflected by clouds above the Southern Ocean is that they may not be correctly predicting the composition of the clouds. In particular, the models may not be producing enough supercooled water — droplets that stay liquid even when the temperature is below freezing. One possible explanation for the problem is the way models represent how clouds interact with aerosols, a process that affects the amount of supercooled water in a cloud.
These representations were developed from atmospheric observations made largely in the Northern Hemisphere, where most of the world's population lives. But the atmosphere over the Northern Hemisphere — even over the Arctic — contains many more pollutants, including aerosols, than the relatively pristine atmosphere over the Southern Ocean.

"We don't know how appropriate the representations of these processes are for the Southern Hemisphere," McFarquhar said. "SOCRATES will give us an opportunity to observe these cloud-aerosol interactions and see how much they differ, if at all, from those in the Northern Hemisphere."

Flying through hazardous clouds

The NSF/NCAR HIAPER Gulfstream V has been modified to serve as a flying laboratory. (©UCAR. This figure is freely available for media & nonprofit use.)

For the SOCRATES field campaign, observations will be taken from the NSF/NCAR High-performance Instrumented Airborne Platform for Environmental Research, or HIAPER, a highly modified Gulfstream V aircraft, and from the R/V Investigator, an Australian deep-ocean research vessel.

"Much of what we currently know about Southern Ocean cloud, aerosol, and precipitation properties comes from satellite-based estimates, which are uncertain and have undergone few comparisons against independent data," said co-investigator Roger Marchand, a scientist at the University of Washington. "The data collected during SOCRATES will also enable us to evaluate current satellite data over the Southern Ocean, as well as potentially help in the design of better satellite-based techniques."

The research aircraft will be based out of Hobart, Tasmania, and will make about 16 flights over the Southern Ocean during the course of the campaign.
The many high-tech instruments on board will measure the size and distribution of cloud droplets, ice crystals, and aerosols, as well as record temperature, winds, air pressure, and other standard atmospheric variables. The instruments include NCAR's HIAPER Cloud Radar (HCR) and High Spectral Resolution Lidar (HSRL). The wing-mounted HCR is able to "see" inside clouds and characterize the droplets within, while the HSRL can measure air molecules and aerosols. Together, the two highly advanced instruments will give scientists a more complete picture of the wide range of particles in the atmosphere above the Southern Ocean.

The nature of the research — flying a plane in search of supercooled water — presents some challenges with aircraft icing.

"Oftentimes, the cleaner the air, the more probable large drops and severe icing conditions become," said Cory Wolff, the NCAR project manager who is overseeing aircraft operations for SOCRATES. "We have a number of precautions we're taking to mitigate that risk."

First, a mission coordinator whose sole job is to monitor icing conditions will join each flight. Second, the design of the flights themselves will help the crew anticipate icing conditions before they have to fly through them. On the flight south from Tasmania, the HIAPER GV will fly high above the clouds — and the icing danger. During that leg of the flight, the scientists will collect information about the clouds below, both with onboard radar and lidar and with dropsondes — small instrument packages released from the aircraft. With that information, the scientists can determine whether it's safe to pilot the aircraft through the clouds on the return trip, collecting detailed information about cloud composition.

Sailing the stormiest seas

The Australian R/V Investigator will take measurements of the atmosphere and ocean during its six-week voyage.
(Image courtesy CSIRO.)

The measurements taken from the sky will be complemented by data collected from instruments on board the Australian R/V Investigator, including the NCAR Integrated Sounding System (ISS). The ISS gathers extensive data using a radar wind profiler, surface meteorology sensors, and a balloon-borne radiosonde sounding system. The team will launch soundings every six hours, and sometimes more often, throughout the campaign.

"Observations from the ship will help us understand the background state of the atmosphere — how it's behaving," said NCAR scientist Bill Brown, who traveled to Australia in late November to prepare the ISS for the voyage.

The ship will be deployed for the entire six weeks and will face its own challenges, notably the notorious roughness of the Southern Ocean, sometimes called the stormiest place on Earth. "There are no land masses to break up the winds down there," Brown said. "So the ocean can be quite rough."

SOCRATES investigators will also draw on measurements from another Australian ship, the R/V Aurora Australis, as it travels between Tasmania and Antarctica on resupply missions, as well as observations from buoys and some land-based instruments on Macquarie Island.

"I am excited that we will have such a comprehensive suite of observations," McFarquhar said. "If we just had the cloud observations, we wouldn't have the appropriate context.
If we just had the aerosols and measurements below the clouds, we wouldn't be able to understand the complete picture."

For more about the SOCRATES campaign, visit the project website.

Collaborating institutions:
Australian Antarctic Division
Australian Bureau of Meteorology
Australian Department of Environment and Energy
Colorado State University
Cooperative Institute for Mesoscale Meteorological Studies
CSIRO
Karlsruhe Institute of Technology
Monash University
National Center for Atmospheric Research
National Science Foundation
NorthWest Research Associates
Queensland University of Technology
University of California San Diego
University of Colorado Boulder
University of Illinois at Urbana-Champaign
University of Melbourne
University of Oklahoma
University of Washington

Groundbreaking data set gives unprecedented look at future weather

Dec. 7, 2017 | How will weather change in the future? It's been remarkably difficult to say, but researchers are now making important headway, thanks in part to a groundbreaking new data set at the National Center for Atmospheric Research (NCAR).

Scientists know that a warmer and wetter atmosphere will lead to major changes in our weather. But pinning down exactly how weather — such as thunderstorms, midwinter cold snaps, hurricanes, and mountain snowstorms — will evolve in the coming decades has proven a difficult challenge, constrained by the sophistication of models and the capacity of computers.

Now, a rich new data set is giving scientists an unprecedented look at the future of weather. Nicknamed CONUS 1 by its creators, the data set is enormous. To generate it, the researchers ran the NCAR-based Weather Research and Forecasting model (WRF) at an extremely high resolution (4 kilometers, or about 2.5 miles) across the entire contiguous United States (sometimes known as "CONUS") for a total of 26 simulated years: half in the current climate and half in the future climate expected if society continues on its current trajectory.

The project took more than a year to run on the Yellowstone supercomputer at the NCAR-Wyoming Supercomputing Center. The result is a trove of data that allows scientists to explore in detail how today's weather would look in a warmer, wetter atmosphere. CONUS 1, which was completed last year and made easily accessible through NCAR's Research Data Archive this fall, has already spawned nearly a dozen papers that explore everything from changes in rainfall intensity to changes in mountain snowpack.

"This was a monumental effort that required a team of researchers with a broad range of expertise — from climate experts and meteorologists to social scientists and data specialists — and years of work," said NCAR senior scientist Roy Rasmussen, who led the project.
"This is the kind of work that's difficult to do in a typical university setting but that NCAR can take on and make available to other researchers."

Other principal project collaborators at NCAR are Changhai Liu and Kyoko Ikeda. A number of additional NCAR scientists lent expertise to the project, including Mike Barlage, Andrew Newman, Andreas Prein, Fei Chen, Martyn Clark, Jimy Dudhia, Trude Eidhammer, David Gochis, Ethan Gutmann, Gregory Thompson, and David Yates. Collaborators from the broader community include Liang Chen, Sopan Kurkute, and Yanping Li (University of Saskatchewan) and Aiguo Dai (SUNY Albany).

Climate and weather research coming together

Climate models and weather models have historically operated on different scales, both in time and space. Climate scientists are interested in large-scale changes that unfold over decades, and the models they've developed help them nail down long-term trends such as increasing surface temperatures, rising sea levels, and shrinking sea ice. Climate models are typically low resolution, with grid points often separated by 100 kilometers (about 60 miles). The advantage of such coarse resolution is that these models can be run globally for decades or centuries into the future with the available supercomputing resources. The downside is that they lack the detail to capture features that influence local atmospheric events, such as land surface topography, which drives mountain weather, or the small-scale circulation of warm air rising and cold air sinking that sparks a summertime thunderstorm.

Weather models, on the other hand, have higher resolution, take into account atmospheric microphysics such as cloud formation, and can simulate weather fairly realistically.
It's not practical to run them for long periods of time or globally, however — supercomputers are not yet up to the task. As scientific understanding of climate change has deepened, the need to merge these disparate scales has become more pressing, to gain a better understanding of how global atmospheric warming will affect local weather patterns.

"The climate community and the weather community are really starting to come together," Rasmussen said. "At NCAR, we have both climate scientists and weather scientists, we have world-class models, and we have access to state-of-the-art supercomputing resources. This allowed us to create a data set that offers scientists a chance to start answering important questions about the influence of climate change on weather."

Weather models are typically run at a much higher resolution than climate models, allowing them to more accurately capture precipitation. This figure compares the average annual precipitation from a 13-year run of the Weather Research and Forecasting (WRF) model (left) with the average annual precipitation from running the global Community Earth System Model (CESM) to simulate the climate between 1976 and 2005 (right). (©UCAR. This figure is courtesy Kyoko Ikeda. It is freely available for media & nonprofit use.)

Today's weather in a warmer, wetter future

To create the data set, the research team used WRF to simulate weather across the contiguous United States between 2000 and 2013. They initiated the model using a separate "reanalysis" data set constructed from observations. When compared with radar images of actual weather during that time period, the results were excellent. "We weren't sure how good a job the model would do, but the climatology of real storms and the simulated storms was very similar," Rasmussen said.

With confidence that WRF could accurately simulate today's weather, the scientists ran the model for a second 13-year period, using the same reanalysis data but with a few changes.
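The "few changes" amount to what modelers call a pseudo-global-warming perturbation: warm the background state, then scale its moisture so that relative humidity stays roughly fixed. The article does not give the formula used for the moisture step, so the sketch below uses the Tetens approximation to the Clausius-Clapeyron relation as an assumed stand-in:

```python
import math

def saturation_vapor_pressure(t_c):
    """Saturation vapor pressure in hPa (Tetens approximation)."""
    return 6.112 * math.exp(17.67 * t_c / (t_c + 243.5))

def moisture_scaling(t_c, delta_t=5.0):
    """Factor by which water vapor must grow to keep relative humidity
    fixed when the temperature warms by delta_t degrees C."""
    return saturation_vapor_pressure(t_c + delta_t) / saturation_vapor_pressure(t_c)

# For air at 15 deg C, a 5-degree warming raises the water-holding
# capacity by roughly 37 percent (about 6-7 percent per degree).
print(f"{moisture_scaling(15.0):.2f}")
```

This is why "a warmer atmosphere can hold more moisture" translates into a specific water vapor increase for a given warming, rather than an arbitrary one.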
Notably, the researchers increased the temperature of the background climate conditions by about 5 degrees Celsius (9 degrees Fahrenheit), the end-of-the-century temperature increase predicted by averaging 19 leading climate models under a business-as-usual greenhouse gas emissions scenario (2080–2100). They also increased the water vapor in the atmosphere by the corresponding amount, since physics dictates that a warmer atmosphere can hold more moisture. The result is a data set that examines how weather events from the recent past — including named hurricanes and other distinctive weather events — would look in our expected future climate.

The data have already proven to be a rich resource for people interested in how individual types of weather will respond to climate change. Will the squall lines of intense thunderstorms that rake across the country's midsection get more intense, more frequent, and larger? (Yes, yes, and yes.) Will snowpack in the West get deeper, shallower, or disappear? (It depends on the location: The high-elevation Rockies are much less vulnerable to the warming climate than the coastal ranges.) Other scientists have already used the CONUS 1 data set to examine changes to rain-on-snow events, the speed of snowmelt, and more.

Running the Weather Research and Forecasting (WRF) model at 4-kilometer resolution over the contiguous United States produced realistic simulations of precipitation. Above, average annual precipitation from WRF for the years 2000–2013 (left) compared to the PRISM dataset for the same period (right). PRISM is based on observations. (©UCAR. This figure is courtesy Kyoko Ikeda. It is freely available for media & nonprofit use.)

Pinning down changes in storm track

While the new data set offers a unique opportunity to delve into changes in weather, it also has limitations.
Importantly, it does not reflect how the warming climate might shift large-scale weather patterns, like the typical track most storms take across the United States. Because the same reanalysis data set is used to kick off both the current and future climate simulations, the large-scale weather patterns remain the same in both scenarios.

To remedy this, the scientists are already working on a new simulation, nicknamed CONUS 2. Instead of using the reanalysis data set — which was built from actual observations — to kick off the modeling run of the present-day climate, the scientists will use weather extracted from a simulation by the NCAR-based Community Earth System Model (CESM). For the future climate run, the scientists will again take the weather patterns from a CESM simulation — this time for the year 2100 — and feed the information into WRF.

The finished data set, which will cover two 20-year periods, will likely take another year of supercomputing time to complete, this time on the newer and more powerful Cheyenne system at the NCAR-Wyoming Supercomputing Center. When complete, CONUS 2 will help scientists understand how expected future changes in storm tracks will affect local weather across the country.

Scientists are already eagerly awaiting the data from the new runs, which could start in early 2018. But even that data set will have limitations. One of the greatest may be that it will rely on a single run from CESM. Another NCAR-based project ran the model 40 times from 1920 to 2100 with only minor changes to the model's starting conditions, showing researchers how the natural chaos of the atmosphere can cause the climate to look quite different from simulation to simulation. Still, a single run of CESM will let scientists make comparisons between CONUS 1 and CONUS 2, allowing them to pinpoint possible storm track changes in local weather.
And CONUS 2 can also be compared to other efforts that downscale global simulations to study how regional areas will be affected by climate change, providing insight into the pros and cons of different research approaches.

"This is a new way to look at climate change that allows you to examine the phenomenology of weather and answer the question, 'What will today's weather look like in a future climate?'" Rasmussen said. "This is the kind of detailed, realistic information that water managers, city planners, farmers, and others can understand, and it helps them plan for the future."

Get the data
High Resolution WRF Simulations of the Current and Future Climate of North America, DOI: 10.5065/D6V40SXP

Studies that relied on the CONUS data set
Extreme downpours could increase fivefold across parts of the U.S.
Slower snowmelt in a warming world
North American storm clusters could produce 80 percent more rain

Writer/contact: Laura Snider, Senior Science Writer

North American storm clusters could produce 80 percent more rain

BOULDER, Colo. — Major clusters of summertime thunderstorms in North America will grow larger, more intense, and more frequent later this century in a changing climate, unleashing far more rain and posing a greater threat of flooding across wide areas, new research concludes.

The study, by scientists at the National Center for Atmospheric Research (NCAR), builds on previous work showing that storms are becoming more intense as the atmosphere warms. In addition to higher rainfall rates, the new research finds that the volume of rainfall from damaging storms known as mesoscale convective systems (MCSs) will increase by as much as 80 percent across the continent by the end of this century, deluging entire metropolitan areas or sizable portions of states.

"The combination of more intense rainfall and the spreading of heavy rainfall over larger areas means that we will face a higher flood risk than previously predicted," said NCAR scientist Andreas Prein, the study's lead author. "If a whole catchment area gets hammered by high rain rates, that creates a much more serious situation than a thunderstorm dropping intense rain over parts of the catchment."

"This implies that the flood guidelines which are used in planning and building infrastructure are probably too conservative," he added.

The research team drew on extensive computer modeling that realistically simulates MCSs and thunderstorms across North America to examine what will happen if emissions of greenhouse gases continue unabated. The study will be published Nov. 20 in the journal Nature Climate Change. It was funded by the National Science Foundation, which is NCAR's sponsor, and by the U.S. Army Corps of Engineers.

Hourly rain rate averages for the 40 most extreme summertime mesoscale convective systems (MCSs) in the current (left) and future (right) climate of the mid-Atlantic region.
New research shows that MCSs will generate substantially higher maximum rain rates over larger areas by the end of the century if society continues a "business as usual" approach to emitting greenhouse gases. (©UCAR. Image by Andreas Prein, NCAR. This image is freely available for media & nonprofit use.)

A warning signal

Thunderstorms and other heavy rainfall events are estimated to cause more than $20 billion of economic losses annually in the United States, the study notes. Particularly damaging, and often deadly, are MCSs: clusters of thunderstorms that can extend for many dozens of miles and last for hours, producing flash floods, debris flows, landslides, high winds, and/or hail. The persistent storms over Houston in the wake of Hurricane Harvey were an example of an unusually powerful and long-lived MCS.

Storms have become more intense in recent decades, and a number of scientific studies have shown that this trend is likely to continue as temperatures continue to warm. The reason, in large part, is that the atmosphere can hold more water as it gets warmer, thereby generating heavier rain.

A study by Prein and co-authors last year used high-resolution computer simulations of current and future weather, finding that the number of summertime storms producing extreme downpours could increase fivefold across parts of the United States by the end of the century. In the new study, Prein and his co-authors focused on MCSs, which are responsible for much of the major summertime flooding east of the Continental Divide. They investigated not only how the storms' rainfall intensity will change in future climates, but also how their size, movement, and rainfall volume may evolve.

Analyzing the same dataset of computer simulations and applying a special storm-tracking algorithm, they found that the number of severe MCSs in North America more than tripled by the end of the century.
Moreover, maximum rainfall rates became 15 to 40 percent heavier, and intense rainfall reached farther from the storm's center. As a result, severe MCSs increased throughout North America, particularly in the northeastern and mid-Atlantic states, as well as parts of Canada, where they are currently uncommon.

The research team also looked at the potential effect of particularly powerful MCSs on the densely populated Eastern Seaboard. They found, for example, that at the end of the century, intense MCSs over an area the size of New York City could drop 60 percent more rain than a severe present-day system. That amount is equivalent to adding six times the annual discharge of the Hudson River on top of a current extreme MCS in that area.

"This is a warning signal that says the floods of the future are likely to be much greater than what our current infrastructure is designed for," Prein said. "If you have a slow-moving storm system that aligns over a densely populated area, the result can be devastating, as could be seen in the impact of Hurricane Harvey on Houston."

This satellite image loop shows an MCS developing over West Virginia on June 23, 2016. The resulting storms caused widespread flooding, killing more than 20 people. MCSs are responsible for much of the major flooding east of the Continental Divide during warm weather months. (Image by NOAA National Weather Service, Aviation Weather Center.)

Intensive modeling

Advances in computer modeling and more powerful supercomputing facilities are enabling climate scientists to begin examining the potential influence of a changing climate on convective storms such as thunderstorms, building on previous studies that looked more generally at regional precipitation trends. For the new study, Prein and his co-authors turned to a dataset created by running the NCAR-based Weather Research and Forecasting (WRF) model over North America at a resolution of 4 kilometers (about 2.5 miles).
That resolution is fine enough to simulate MCSs. The intensive modeling, by NCAR scientists and study co-authors Roy Rasmussen, Changhai Liu, and Kyoko Ikeda, required a year to run on the Yellowstone system at the NCAR-Wyoming Supercomputing Center.

The team used an algorithm developed at NCAR to identify and track simulated MCSs. They compared simulations of the storms at the beginning of the century, from 2000 to 2013, with observations of actual MCSs during the same period and showed that the modeled storms are statistically identical to real MCSs.

The scientists then used the dataset and algorithm to examine how MCSs may change by the end of the century in a climate that is approximately 5 degrees Celsius (9 degrees Fahrenheit) warmer than in the pre-industrial era — the temperature increase expected if greenhouse gas emissions continue unabated.

About the paper
Title: Increased rainfall volume from future convective storms in the US
Authors: Andreas F. Prein, Changhai Liu, Kyoko Ikeda, Stanley B. Trier, Roy M. Rasmussen, Greg J. Holland, Martyn P. Clark
Journal: Nature Climate Change
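The storm-identification-and-tracking step described above can be illustrated with a simplified sketch: label contiguous heavy-rain regions in each model time step, then link them across steps by spatial overlap. This is not the NCAR algorithm itself; the rain-rate threshold, toy grids, and "largest overlap wins" matching rule are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def track_storms(rain_t0, rain_t1, threshold=10.0):
    """Label contiguous heavy-rain regions (a crude stand-in for MCS
    footprints) in two consecutive precipitation fields (mm/hr) and
    match them across time steps by spatial overlap."""
    labels0, n0 = ndimage.label(rain_t0 >= threshold)
    labels1, _ = ndimage.label(rain_t1 >= threshold)
    matches = {}
    for storm in range(1, n0 + 1):
        # t1 labels lying under this storm's t0 footprint
        under = labels1[labels0 == storm]
        under = under[under > 0]
        if under.size:
            # the storm continues as whichever t1 region overlaps it most
            matches[storm] = int(np.bincount(under).argmax())
    return matches

# toy example: one storm drifting one grid column eastward
f0 = np.zeros((5, 5)); f0[1:3, 1:3] = 20.0
f1 = np.zeros((5, 5)); f1[1:3, 2:4] = 20.0
print(track_storms(f0, f1))  # {1: 1}
```

A real tracker would also enforce minimum size and duration criteria before counting a feature as an MCS, since by definition these systems persist for hours over many dozens of miles.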

Investing in climate observations would generate major returns

November 14, 2017 | A major new paper by more than two dozen climate experts concludes that a well-designed climate observing system could deliver trillions of dollars in benefits while providing decision makers with the information they need in coming decades to protect public health and the economy.

"We are on the threshold of a new era in prediction, drawing on our knowledge of the entire Earth system to strengthen societal resilience to potential climate and weather disasters," said Antonio Busalacchi, president of the University Corporation for Atmospheric Research and one of the co-authors. "Strategic investments in observing technologies will pay for themselves many times over by protecting life and property, promoting economic growth, and providing needed intelligence to decision makers."

Elizabeth Weatherhead, a scientist with the Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder, is the lead author of the new paper, published last week in Earth's Future. The co-authors include two scientists associated with the National Center for Atmospheric Research: Jeffrey Lazo and Kevin Trenberth.

The scientists urge that investments focus on tackling seven grand challenges. These include predicting extreme weather and climate shifts, the role of clouds and circulation in regulating climate, regional sea level change and coastal impacts, understanding the consequences of melting ice, and feedback loops involving carbon cycling.

For more about the paper, see the CIRES news release.

New climate forecasts for watersheds - and the water sector

Nov. 10, 2017 | Water managers and streamflow forecasters can now access bi-weekly, monthly, and seasonal precipitation and temperature forecasts that are broken down by individual watersheds, thanks to a research partnership between the National Center for Atmospheric Research (NCAR) and the University of Colorado Boulder (CU Boulder). The project is sponsored by the National Oceanic and Atmospheric Administration (NOAA) through the Modeling, Applications, Predictions, and Projections program.

Operational climate forecasts for subseasonal to seasonal time scales are currently provided by the NOAA Climate Prediction Center and other sources. The forecasts usually take the form of national contour maps and gridded datasets at a relatively coarse geographic resolution. Some forecast products are broken down further, based on state boundaries or on climate divisions, which average two per state; others are summarized for major cities. But river forecasters and water managers grapple with climate variability and trends in the particular watersheds within their service areas, which do not align directly with the boundaries of existing forecast areas. A forecast that directly describes predicted conditions inside an individual watershed would be extremely valuable to these users for making decisions in their management areas, such as how much water to release or store in critical reservoirs and when.

To bridge this gap, the NCAR–CU Boulder research team has developed a new prototype prediction system that maps climate forecasts to watershed boundaries over the contiguous United States in real time. The system is currently running at NCAR, with real-time forecasts and analyses available on a demonstration website.

"We are trying to improve the accessibility and relevance of climate predictions for streamflow forecasting groups and water managers," said NCAR scientist Andy Wood, who co-leads the project.
"We can’t solve all the scientific challenges of climate prediction, but we can make it easier for a person thinking about climate and water in a river basin — such as the Gunnison, or the Yakima, or the Potomac — to find and download operational climate information that has been tailored to that basin’s observed variability."

The project is funded by NOAA, and the scientists plan to hand off successful components of the system for experimental operational evaluation within the NOAA National Weather Service. Collaborators include scientists from the NOAA Climate Prediction Center and partners from the major federal water agencies: the U.S. Army Corps of Engineers and the Bureau of Reclamation.

This screenshot of the S2S Climate Outlooks for Watersheds website shows forecast temperature anomalies for watersheds across the contiguous United States. As users scroll across different watersheds, they get more precise information. In this screenshot from early November 2017, the forecast shows that, over the next one to two weeks, the Colorado Headwaters watershed is expected to be 1.2 degrees warmer than normal. Visit the website to learn more. (©UCAR. This image is freely available for media & nonprofit use.)

Beyond the standard weather forecast

Precipitation and temperature forecasts that extend beyond the typical 7- to 10-day window can be useful to water managers making a number of important decisions about how to best regulate supplies. For instance, during a wet water year, when snowpack is high and reservoirs are more full than usual, the relative warmth or coolness of the coming spring can affect how quickly the snow melts.
Good spring season forecasts allow water managers to plan in advance for how to best manage the resulting runoff. For water systems in drought, such as California's during 2012–2015, early outlooks on whether the winter rainy season will help alleviate the drought or exacerbate it can help water utilities strategize ways of meeting the year’s water demands.

Historically, making these kinds of longer-term predictions accurately has been highly challenging. But in recent years, scientists have improved their skill at subseasonal and seasonal climate prediction. NOAA’s National Centers for Environmental Prediction plays a key role, both running an in-house modeling system — the Climate Forecast System, version 2 (CFSv2) — and leading an effort called the North American Multi-Model Ensemble (NMME). These model-based forecasts help inform the NOAA official climate forecasts, which also include other tools and expert judgment.

NMME combines forecasts from seven different climate models based in the U.S. and Canada to form a super-ensemble of climate predictions that extend up to 10 months into the future. The combination of the different forecasts is often more accurate than the forecast from any single model. Temperature forecasts, in particular, from the combined system are notably more accurate than they were 10 years ago, Wood said, partly due to their representation of observed warming trends. Even with these new tools, however, predicting seasonal precipitation beyond the first month continues to be a major challenge.

The NCAR–CU Boulder project makes use of both the CFSv2 and NMME forecasts. It generates predictions for bi-weekly periods (weeks 1-2, 2-3, and 3-4) from CFSv2 that are updated daily and longer-term forecasts derived from the NMME (months 1, 2, 3, and season 1) that are updated monthly.
The scientists currently map these forecasts to 202 major watersheds in the contiguous U.S.

Analyzing forecast skill

The resulting watershed-specific forecasts are available in real time on the project's interactive website, which also provides information about their accuracy and reliability.

"It's important for users to be able to check on the quality of the forecasts," said Sarah Baker, a doctoral student in the Civil, Environmental, and Architectural Engineering Department at CU Boulder. "We're able to use hindcasts, which are long records of past forecasts, to analyze and describe the skill of the current forecasts."

Baker, who also works for the Bureau of Reclamation, has been building the prototype system under the supervision of Wood and her academic adviser, CU Professor Balaji Rajagopalan. The researchers are also using analyses of forecast accuracy and reliability to begin correcting for systematic biases — such as consistently over-predicting springtime rains in one watershed or under-predicting summertime heat in another — in the forecasts.

The project team has presented the project at a number of water-oriented meetings in the western U.S. Water managers, operators, and researchers from agencies such as the Bureau of Reclamation and utilities such as the Southern Nevada Water Authority, which manages water for Las Vegas, have expressed interest in the new forecast products.

"This project has great potential to provide climate outlook information that is more relevant for hydrologists and the water sector. It will be critical to connect with stakeholders or possible users of the forecasts so that their needs can continue to help shape this type of information product," said NOAA’s Andrea Ray.
Ray leads an effort funded by NIDIS, the National Integrated Drought Information System, to identify tools and information such as this for a NOAA online Water Resources Monitor and Outlook that would also help connect stakeholders to climate and water information.

In the coming year, the research team will implement statistical post-processing methods to improve the accuracy of the forecasts. They will also investigate the prediction of extreme climate events at the watershed scale.

Contact: Andy Wood, NCAR Research Applications Laboratory
Partners: CU Boulder, NCAR, NOAA, U.S. Army Corps of Engineers, Bureau of Reclamation
Funder: NOAA's Modeling, Applications, Predictions and Projections Climate Testbed program
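Conceptually, two building blocks of such a system are simple: mapping a gridded forecast onto a watershed is an area-weighted average of the grid cells inside its boundary, and hindcast-based skill assessment is, at its core, a correlation between past forecast anomalies and observed anomalies. A minimal sketch with illustrative toy data, not the project's actual code or grids:

```python
import numpy as np

def watershed_mean(grid, mask, cell_area):
    """Area-weighted mean of a gridded forecast field over one watershed.
    grid: 2-D forecast anomalies; mask: boolean watershed membership;
    cell_area: 2-D grid-cell areas (these shrink toward the poles)."""
    weights = cell_area * mask
    return float((grid * weights).sum() / weights.sum())

def anomaly_correlation(hindcasts, observations):
    """Correlation of past forecast anomalies with observed anomalies,
    a common summary of forecast skill (1 = perfect, 0 = no skill)."""
    h = hindcasts - hindcasts.mean()
    o = observations - observations.mean()
    return float((h * o).sum() / np.sqrt((h * h).sum() * (o * o).sum()))

# toy 3x3 grid; the watershed covers the left two columns
grid = np.array([[1.0, 1.0, 5.0],
                 [2.0, 2.0, 5.0],
                 [3.0, 3.0, 5.0]])
mask = np.array([[True, True, False]] * 3)
area = np.ones((3, 3))
print(watershed_mean(grid, mask, area))  # 2.0
```

Bias correction of the kind described above (consistent over- or under-prediction) would then amount to adjusting each watershed's forecast by the mean hindcast error for that season and location.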

New approach to geoengineering simulations is significant step forward

BOULDER, Colo. — Using a sophisticated computer model, scientists have demonstrated for the first time that a new research approach to geoengineering could potentially be used to limit Earth’s warming to a specific target while reducing some of the risks and concerns identified in past studies, including uneven cooling of the globe.

The scientists developed a specialized algorithm for an Earth system model that varies the amount and location of geoengineering — in this case, injections of sulfur dioxide high into the atmosphere — that would in theory be needed, year to year, to effectively cap warming. They caution, however, that more research is needed to determine if this approach would be practical, or even possible, in the real world.

The findings from the new research, led by scientists from the National Center for Atmospheric Research (NCAR), Pacific Northwest National Laboratory (PNNL), and Cornell University, represent a significant step forward in the field of geoengineering. Still, there are many questions that need to be answered about sulfur dioxide injections, including how this type of engineering might alter regional precipitation patterns and the extent to which such injections would damage the ozone layer. The possibility of a global geoengineering effort to combat warming also raises serious governance and ethical concerns.

"This is a major milestone and offers promise of what might be possible in the future,” said NCAR scientist Yaga Richter, one of the lead authors. “But it is just the beginning; there is a lot more research that needs to be done."

Past modeling studies have typically sought to answer the question "What happens if we do geoengineering?" The results of those studies have described the outcomes — both positive and negative — of injecting a predetermined amount of sulfates into the atmosphere, often right at Earth's equator.
But they did not attempt to specify the outcome they hoped to achieve at the outset. In a series of new studies, the researchers turned the question around, instead asking, "How might geoengineering be used to meet specific climate objectives?"

"We have really shifted the question, and by doing so, found that we can better understand what geoengineering may be able to achieve," Richter said.

The research findings are detailed in a series of papers published in a special issue of the Journal of Geophysical Research – Atmospheres.

Mimicking a volcano

In theory, geoengineering — large-scale interventions designed to modify the climate — could take many forms, from launching orbiting solar mirrors to fertilizing carbon-hungry ocean algae. For this research, the team studied one much-discussed approach: injecting sulfur dioxide into the upper atmosphere, above the cloud layer.

The idea of combating global warming with these injections is inspired by history's most massive volcanic eruptions. When volcanoes erupt, they loft sulfur dioxide high into the atmosphere, where it's chemically converted into light-scattering sulfate particles called aerosols. These sulfates, which can linger in the atmosphere for a few years, are spread around the Earth by stratospheric winds, forming a reflective layer that cools the planet. To mimic these effects, sulfur dioxide could be injected directly into the stratosphere, perhaps with the help of high-flying aircraft.

But while the injections would counter global warming, they would not address all the problems associated with climate change, and they would likely have their own negative side effects. For example, the injections would not offset ocean acidification, which is linked directly to carbon dioxide emissions. Geoengineering also could result in significant disruptions in rainfall patterns as well as delays in healing the ozone hole.
Moreover, once geoengineering began, if society wanted to avoid a rapid and drastic increase in temperature, the injections would need to continue until mitigation efforts were sufficient to cap warming on their own. There would also likely be significant international governance challenges that would have to be overcome before a geoengineering program could be implemented.

"For decision makers to accurately weigh the pros and cons of geoengineering against those of human-caused climate change, they need more information," said PNNL scientist Ben Kravitz, also a lead author of the studies. "Our goal is to better understand what geoengineering can do — and what it cannot."

Modeling the complex chemistry

For the new studies, the scientists used the NCAR-based Community Earth System Model with its extended atmospheric component, the Whole Atmosphere Community Climate Model (WACCM). WACCM includes detailed chemistry and physics of the upper atmosphere and was recently updated to simulate stratospheric aerosol evolution from source gases, including geoengineering.

"It was critical for this study that our model be able to accurately capture the chemistry in the atmosphere so we could understand how quickly sulfur dioxide would be converted into aerosols and how long those aerosols would stick around," said NCAR scientist Michael Mills, also a lead author. "Most global climate models do not include this interactive atmospheric chemistry."

The scientists also significantly improved how the model simulates tropical stratospheric winds, which change direction every few years.
Accurately representing these winds is critical to understanding how aerosols are blown around the planet. The scientists successfully tested their model by seeing how well it could simulate the massive 1991 eruption of Mount Pinatubo, including the amount and rate of aerosol formation, as well as how those aerosols were transported around the globe and how long they stayed in the atmosphere.

Then the scientists began to explore the impacts of injecting sulfur dioxide at different latitudes and altitudes. From past studies, the scientists knew that sulfates injected only at the equator affect Earth unevenly: over-cooling the tropics and under-cooling the poles. This is especially problematic since climate change is warming the Arctic at a faster rate. Climate change is also causing the Northern Hemisphere to warm more quickly than the Southern Hemisphere.

The researchers used the model to study 14 possible injection sites at seven different latitudes and two different altitudes — something never before tried in geoengineering research. They found that they could spread the cooling more evenly across the globe by choosing injection sites on either side of the equator.

The simulations on the left represent how global temperatures are expected to change if greenhouse gas emissions continue on a "business as usual" trajectory. The simulations on the right show how temperature could be stabilized in a model by injecting sulfur dioxide high into the atmosphere at four separate locations. Because greenhouse gases are being emitted at the same rate in the simulations on the left and the right, stopping geoengineering would result in a drastic spike in global temperatures. (©UCAR. This image is freely available for media & nonprofit use.)
Meeting multiple objectives

The researchers then pieced together all their work into a single model simulation with specific objectives: to limit average global warming to 2020 levels through the end of the century and to minimize the difference in cooling between the equator and the poles as well as between the northern and southern hemispheres.

They gave the model four choices of injection sites — at 15 degrees and 30 degrees North and South in latitude — and then implemented an algorithm that determines, for each year, the best injection sites and the quantity of sulfur dioxide needed at those sites. The model's ability to reformulate the amount of geoengineering needed each year, based on that year's conditions, also allowed the simulation to respond to natural fluctuations in the climate.

The model successfully kept the surface temperatures near 2020 levels against a background of increasing greenhouse gas emissions that would be consistent with a business-as-usual scenario. The algorithm’s ability to choose injection sites cooled the Earth more evenly than in previous studies, because it could inject more sulfur dioxide in regions that were warming too quickly and less in areas that had over-cooled.

However, by the end of the century, the amount of sulfur dioxide that would need to be injected each year to offset human-caused global warming would be enormous: almost five times the amount spewed into the air by Mount Pinatubo on June 15, 1991.

Flipping the research question

"The results demonstrate that it is possible to flip the research question that's been guiding geoengineering studies and not just explore what geoengineering does but see it as a design problem,” said Doug MacMartin, a scientist at Cornell and the California Institute of Technology.
“When we see it in that light, we can then start to develop a strategy for how to meet society’s objectives."

In the current series of studies, adjusting the geoengineering plan just once a year allowed the researchers to keep the average global temperature to 2020 levels in a given year, but regional temperatures — as well as seasonal temperature changes — were sometimes cooler or hotter than desired. So next steps could include exploring the possibility of making more frequent adjustments at a different choice of injection locations.

The scientists are already working on a new study to help them understand the possible impacts geoengineering might have on regional phenomena, such as the Asian monsoons.

"We are still a long way from understanding all the interactions in the climate system that could be triggered by geoengineering, which means we don’t yet understand the full range of possible side effects," said NCAR scientist Simone Tilmes, a lead author. "But climate change also poses risks.
Continuing research into geoengineering is critical to assess benefits and side effects and to inform decision makers and society."

The research was funded by the Defense Advanced Research Projects Agency and the National Science Foundation, NCAR's sponsor. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation or the Defense Advanced Research Projects Agency.

About the papers
Titles:
Radiative and chemical response to interactive stratospheric sulfate aerosols in fully coupled CESM1(WACCM), DOI: 10.1002/2017JD027006
Sensitivity of aerosol distribution and climate response to stratospheric SO2 injection locations, DOI: 10.1002/2017JD026888
Stratospheric Dynamical Response and Ozone Feedbacks in the Presence of SO2 Injections, DOI: 10.1002/2017JD026912
The climate response to stratospheric aerosol geoengineering can be tailored using multiple injection locations, DOI: 10.1002/2017JD026868
First simulations of designing stratospheric sulfate aerosol geoengineering to meet multiple simultaneous climate objectives, DOI: 10.1002/2017JD026874
Authors: B. Kravitz, D. MacMartin, M. J. Mills, J. H. Richter, and S. Tilmes
Co-authors: F. Vitt, J. J. Tribbia, J.-F. Lamarque
Journal: Journal of Geophysical Research – Atmospheres
Data access: All the data from the experiments are available on the Earth System Grid.
Writer/contact: Laura Snider, Senior Science Writer
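The year-by-year adjustment described in these studies is, at heart, a feedback-control problem: measure how far the simulated climate is from its target, then update the injections accordingly. The toy loop below uses a proportional-integral control rule and a drastically simplified climate (a constant warming trend, linear cooling per unit of SO2); every gain and sensitivity is invented for illustration and is not a value used with the Earth system model.

```python
def run_feedback(years=80, target=0.0, kp=2.0, ki=1.0):
    """Each 'year', warm the climate by an assumed greenhouse trend,
    cool it in proportion to the current SO2 injection rate, then use
    a proportional-integral rule to set next year's injection."""
    temp = 0.0        # global-mean temperature anomaly vs. target (deg C)
    injection = 0.0   # Tg SO2 per year
    integral = 0.0
    history = []
    for _ in range(years):
        ghg_warming = 0.05            # assumed trend, deg C per year
        cooling = 0.01 * injection    # assumed deg C of cooling per Tg SO2
        temp += ghg_warming - cooling
        error = temp - target
        integral += error
        # injections cannot be negative: aerosols only cool
        injection = max(0.0, kp * error + ki * integral)
        history.append((temp, injection))
    return history

history = run_feedback()
final_temp, final_injection = history[-1]
```

Two qualitative features of the studies show up even in this caricature: the controller holds temperature near the target despite the relentless warming trend, and it can only do so by sustaining ever-present injections, mirroring the finding that stopping geoengineering would trigger a rapid temperature spike.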

Future volcanic eruptions could cause more climate disruption

BOULDER, Colo. — Major volcanic eruptions in the future have the potential to affect global temperatures and precipitation more dramatically than in the past because of climate change, according to a new study led by the National Center for Atmospheric Research (NCAR).

The study authors focused on the cataclysmic eruption of Indonesia's Mount Tambora in April 1815, which is thought to have triggered the so-called "year without a summer" in 1816. They found that if a similar eruption occurred in the year 2085, temperatures would plunge more deeply, although not enough to offset the future warming associated with climate change. The increased cooling after a future eruption would also disrupt the water cycle more severely, decreasing the amount of precipitation that falls globally.

The reason for the difference in climate response between 1815 and 2085 is tied to the oceans, which are expected to become more stratified as the planet warms, and therefore less able to moderate the climate impacts caused by volcanic eruptions.

"We discovered that the oceans play a very large role in moderating, while also lengthening, the surface cooling induced by the 1815 eruption," said NCAR scientist John Fasullo, lead author of the new study. "The volcanic kick is just that — it's a cooling kick that lasts for a year or so. But the oceans change the timescale. They act to not only dampen the initial cooling but also to spread it out over several years."

The research was published today in the journal Nature Communications. The work was funded in part by the National Science Foundation, NCAR's sponsor. Other funders include NASA and the U.S. Department of Energy. The study co-authors are Robert Tomas, Samantha Stevenson, Bette Otto-Bliesner, and Esther Brady, all of NCAR, as well as Eugene Wahl, of the National Oceanic and Atmospheric Administration.

An aerial view of Mount Tambora's caldera, formed during the 1815 eruption. (Image credit: Wikipedia.)
A detailed look at a deadly past

Mount Tambora's eruption, the largest in the past several centuries, spewed a huge amount of sulfur dioxide into the upper atmosphere, where it turned into sulfate particles called aerosols. The layer of light-reflecting aerosols cooled Earth, setting in motion a chain of reactions that led to an extremely cold summer in 1816, especially across Europe and the northeast of North America. The "year without a summer" is blamed for widespread crop failure and disease, causing more than 100,000 deaths globally.

To better understand and quantify the climate effects of Mount Tambora's eruption, and to explore how those effects might differ for a future eruption if climate change continues on its current trajectory, the research team turned to a sophisticated computer model developed by scientists from NCAR and the broader community. The scientists looked at two sets of simulations from the Community Earth System Model (CESM). The first was taken from the CESM Last Millennium Ensemble Project, which simulates Earth's climate from the year 850 through 2005, including volcanic eruptions in the historic record. The second set, which assumes that greenhouse gas emissions continue unabated, was created by running CESM forward and repeating a hypothetical Mount Tambora eruption in 2085.

The historical model simulations revealed that two countervailing processes helped regulate Earth's temperature after Tambora's eruption. As aerosols in the stratosphere began blocking some of the Sun's heat, this cooling was intensified by an increase in the amount of land covered by snow and ice, which reflected heat back to space. At the same time, the oceans served as an important counterbalance. As the surface of the oceans cooled, the colder water sank, allowing warmer water to rise and release more heat into the atmosphere.
By the time the oceans themselves had cooled substantially, the aerosol layer had begun to dissipate, allowing more of the Sun's heat to again reach Earth's surface. At that point, the ocean took on the opposite role, keeping the atmosphere cooler, since the oceans take much longer to warm back up than land.

"In our model runs, we found that Earth actually reached its minimum temperature the following year, when the aerosols were almost gone," Fasullo said. "It turns out the aerosols did not need to stick around for an entire year to still have a year without a summer in 1816, since by then the oceans had cooled substantially."

The oceans in a changed climate

When the scientists studied how the climate in 2085 would respond to a hypothetical eruption that mimicked Mount Tambora's, they found that Earth would experience a similar increase in land area covered by snow and ice. However, the ocean's ability to moderate the cooling would be diminished substantially in 2085. As a result, the magnitude of Earth's surface cooling could be as much as 40 percent greater in the future. The scientists caution, however, that the exact magnitude is difficult to quantify since they had only a relatively small number of simulations of the future eruption.

The reason for the change has to do with a more stratified ocean. As the climate warms, sea surface temperatures increase. The warmer water at the ocean's surface is then less able to mix with the colder, denser water below. In the model runs, this increase in ocean stratification meant that the water that was cooled after the volcanic eruption became trapped at the surface instead of mixing deeper into the ocean, reducing the heat released into the atmosphere.

The scientists also found that the future eruption would have a larger effect on rainfall than the historical eruption of Mount Tambora.
Cooler sea surface temperatures decrease the amount of water that evaporates into the atmosphere and, therefore, also decrease global average precipitation. Though the study found that Earth's response to a Tambora-like eruption would be more acute in the future than in the past, the scientists note that the average surface cooling caused by the 2085 eruption (about 1.1 degrees Celsius) would not be nearly enough to offset the warming caused by human-induced climate change (about 4.2 degrees Celsius by 2085).

Study co-author Otto-Bliesner said, "The response of the climate system to the 1815 eruption of Indonesia's Mount Tambora gives us a perspective on potential surprises for the future, but with the twist that our climate system may respond much differently."

About the article
Title: The amplifying influence of increased ocean stratification on a future year without a summer
Authors: J.T. Fasullo, R. Tomas, S. Stevenson, B. Otto-Bliesner, E. Brady, and E. Wahl
Journal: Nature Communications, DOI: 10.1038/s41467-017-01302-z
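The damping and delay that Fasullo describes can be reproduced qualitatively with a toy two-box energy-balance model: a short-lived negative aerosol forcing cools a surface layer that exchanges heat with a sluggish deep ocean. All parameter values below are illustrative round numbers, not values from the study's Earth system simulations.

```python
import math

def volcanic_response(years=10.0, dt=0.01, ocean_coupling=True):
    """Integrate a two-box energy balance under a volcanic forcing pulse
    (exponential decay with a ~1-year e-folding time, as for stratospheric
    aerosols). Returns the peak surface cooling and the year it occurs."""
    C_s, C_d = 8.0, 100.0  # surface / deep-ocean heat capacities (W yr m^-2 K^-1)
    lam = 1.2              # climate feedback parameter (W m^-2 K^-1)
    gamma = 0.7            # surface-to-deep heat exchange (W m^-2 K^-1)
    F0, tau = -4.0, 1.0    # peak forcing (W m^-2) and its decay time (yr)
    T_s = T_d = 0.0
    min_T, min_t, t = 0.0, 0.0, 0.0
    while t < years:
        forcing = F0 * math.exp(-t / tau)
        exchange = gamma * (T_s - T_d) if ocean_coupling else 0.0
        T_s += dt * (forcing - lam * T_s - exchange) / C_s  # surface layer
        T_d += dt * exchange / C_d                          # deep ocean
        if T_s < min_T:
            min_T, min_t = T_s, t
        t += dt
    return min_T, min_t
```

Running this with and without the deep-ocean exchange term shows the ocean's moderating role: coupling reduces the peak surface cooling, and in both cases the coldest year arrives well after the aerosol forcing has largely decayed, echoing the finding that the temperature minimum came the year after the eruption.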

Climate change could decrease Sun's ability to disinfect lakes, coastal waters

October 20, 2017 | One of the largely unanticipated impacts of a changing climate may be a decline in sunlight's ability to disinfect lakes, rivers, and coastal waters, possibly leading to an increase in waterborne pathogens and the diseases they can cause in humans and wildlife.

A new study published in the journal Scientific Reports outlines how a rise in the amount of organic matter washed into bodies of water can stunt the ability of pathogen-killing ultraviolet rays from the Sun to penetrate the water's surface. Scientists have already measured an increase in "browning" of the world's waters, a phenomenon caused by more organic matter washing in from the surrounding land. This trend is expected to continue as a warming climate leads to more extreme rainfall and thawing permafrost, both of which contribute to the problem.

In the new study, led by Miami University in Ohio, researchers analyzed water samples and used a model based at the National Center for Atmospheric Research (NCAR) to quantify, for the first time, the impact of dissolved organic matter on the potential for UV radiation from the Sun to kill pathogens in the water.

"Much of the research emphasis up to this point has been on the browning itself, not the ecological consequences," said lead author Craig Williamson, an ecologist at Miami University. "We were able to determine that in some cases, browning is decreasing the ability of sunlight to disinfect water by a factor of 10. This could have serious implications for drinking water supplies and coastal fisheries across the globe."

The study was an outgrowth of collaboration among multiple scientists from different disciplines who serve on the United Nations Environment Programme Environmental Effects Assessment Panel (UNEP EEAP).
The data collection and modeling used in this study were funded by multiple grants from the National Science Foundation, NCAR's sponsor.

Quantifying the impacts

For the study, Williamson and his colleagues relied on water samples collected from lakes around the world, from Pennsylvania and Wisconsin to Chile and New Zealand. The water samples were tested to determine how much dissolved organic matter each contained, as well as the wavelengths of light — including ultraviolet wavelengths — absorbed by that organic matter.

The pristine waters of Lake Tahoe's Sand Harbor contrast with the brown water in the lake's Star Harbor, where people and boats are active. Dissolved organic matter from human activity and from heavy rains can cloud the water and reduce solar disinfection. (Photo courtesy Andrew Tucker.)

Then NCAR scientist Sasha Madronich used this information, as well as modeling results from the NCAR Tropospheric Ultraviolet-Visible (TUV) model, to calculate the solar inactivation potential (SIP) for each lake. SIP is an index of the expected disinfecting power of UV light in a particular body of water, based on its dissolved organic matter and other characteristics. The TUV model — which simulates how UV light is scattered and absorbed as it passes through Earth's atmosphere — was used to determine how much UV light hits the surface of the lakes throughout the year. Madronich also analyzed reflection and refraction off each lake's surface to calculate how much light penetrates the lakes and then, finally, how deeply it reaches.

Because scientists already have some understanding of which wavelengths of UV light do the most damage to which waterborne pathogens, the scientists were able to use the model output to calculate the SIP for each lake.
In some cases, they also calculated this measure of expected disinfecting power for different parts of, or different time periods in, the same lake.

The results allowed the scientists to quantify the impacts of dissolved organic matter. For example, the summertime SIP for one lake in northeastern Pennsylvania — which, along with other regional lakes, has undergone significant browning in recent decades — dropped by about half between 1994 and 2015.

In California's Lake Tahoe, the SIP can be as much as ten times lower at Tahoe Meeks Bay, an area at the lake's edge that is heavily used by humans and has a much higher level of dissolved organic matter, than in the relatively pristine center of the lake.

The scientists also showed how the SIP can dramatically decrease after a heavy rainfall event, using water samples collected from the region where the Manitowoc River flows into Lake Michigan, which supplies drinking water to more than 10 million people. Modeling based on samples taken before and after a strong storm on June 21, 2011, showed that the SIP may have dropped by as much as 22 percent because of the extra dissolved organic matter washed into the area by that single storm.

Additionally, the results for all lakes showed a significantly stronger SIP during the summer, when the Sun is higher in the sky, than during the winter. Lakes at higher elevations also had higher SIPs throughout the year.

Collaborating across disciplines

The study highlights possible challenges for water supply managers and public health workers as the climate continues to warm and extreme precipitation events become more common. Not only does an increase in dissolved organic matter make it more difficult for sunlight to disinfect bodies of water, it also makes it harder for water treatment plants to work effectively, Williamson said.
In the United States, 12 million to 19 million people already become ill from waterborne pathogens annually.

The research also underscores the importance of working across scientific disciplines to fully understand the impacts of climate change across the Earth system, said Madronich, who is an atmospheric chemist.

"What happens in the atmosphere affects what happens in lakes," he said. "These are not separate compartments of the world. These things are all connected."

Other co-authors of the study are Aparna Lal and Robyn Lucas (The Australian National University), Richard Zepp (Environmental Protection Agency), Erin Overholt (Miami University), Kevin Rose (Rensselaer Polytechnic Institute), Geoffrey Schladow (University of California, Davis), and Julia Lee-Taylor (NCAR).

About the article

Title: Climate change-induced increases in precipitation are reducing the potential for solar ultraviolet radiation to inactivate pathogens in surface waters

Authors: Craig E. Williamson, Sasha Madronich, Aparna Lal, Richard G. Zepp, Robyn M. Lucas, Erin P. Overholt, Kevin C. Rose, S. Geoffrey Schladow, and Julia Lee-Taylor

Journal: Scientific Reports, DOI: 10.1038/s41598-017-13392-2

Writer/contact: Laura Snider, Senior Science Writer

