Climate & Climate Change

Reconciling Paris Agreement goals for temperature, emissions

As society faces the challenge of limiting warming to no more than 2 degrees Celsius, new research finds an apparent contradiction: Achieving that goal doesn't necessarily require cutting greenhouse gas emissions to zero, as called for in the Paris Agreement. But under certain conditions, even zero emissions might not be enough.

The Paris Agreement, a global effort to respond to the threats of human-caused climate change, stipulates that warming be limited to between 1.5 degrees C (2.7 degrees Fahrenheit) and 2 degrees C (3.6 degrees F). It also stipulates that countries achieve net-zero greenhouse gas emissions in the second half of this century. But the relationship between the two — is the emissions goal sufficient or even necessary to meet the temperature goal? — has not been well understood.

In a new study published in the journal Nature Climate Change, scientists used a computer model to analyze a variety of possible future scenarios to better understand how emissions reductions and temperature targets are connected. The study, published March 26, was led by Katsumasa Tanaka at the National Institute for Environmental Studies in Japan and co-authored by Brian O'Neill at the U.S. National Center for Atmospheric Research.

"What we found is that the two goals do not always go hand in hand," Tanaka said. "If we meet temperature targets without first overshooting them, we don't have to reduce greenhouse gas emissions to zero. But if we do reduce emissions to zero, we still might not meet the temperature targets if we don't reduce emissions quickly enough."

The team also found that whether temperatures overshoot the target temporarily has a critical impact on the scale of emissions reductions needed.

"If we overshoot the temperature target, we do have to reduce emissions to zero. But that won't be enough," Tanaka said. "We'll have to go further and make emissions significantly negative to bring temperatures back down to the target by the end of the century."

The research was supported by the Environment Research and Technology Development Fund (2-1702) of the Environmental Restoration and Conservation Agency in Japan and by the U.S. National Science Foundation, NCAR's sponsor.

Drafted in 2015, the Paris Agreement has been ratified by more than 170 countries. President Donald Trump announced last year the intention to withdraw the United States from the agreement.

Modeling the problem from both sides

For the study, the researchers used a simplified integrated assessment model that takes into account the physical connections between greenhouse gases and global mean temperature in the climate as well as the economic costs of emissions reductions.

"We investigated the consistency between the Paris targets in two ways. First we asked, what happens if you just meet the temperature target in a least-cost way? What would emissions look like?" said O'Neill, an NCAR senior scientist. "Then we said, let's just meet the emissions goal and see what kind of temperatures you get."

The team generated 10 different scenarios. They found that Earth's warming could be stabilized at 1.5 or 2 degrees C — without overshooting the goal — by drastically cutting emissions in the short term. For example, total greenhouse gas emissions would need to be slashed by about 80 percent by 2033 to hit the 1.5-degree target or by about two-thirds by 2060 to meet the 2-degree target.
In both these cases, emissions could then flatten out without ever falling to zero.Due to the difficulty of making such steep cuts, the scientists also looked at scenarios in which the temperature was allowed to temporarily overshoot the targets, returning to 1.5 or 2 degrees by the end of the century. In the 1.5-degree overshoot scenario, emissions fall to zero by 2070 and then stay negative for the rest of the century. (Negative emissions require activities that draw down carbon dioxide from the atmosphere.) For the 2-degree temporary overshoot scenario, emissions fall to zero in 2085 and also become negative, but for a shorter period of time.On the flip side, the scientists also looked at scenarios where they set the emissions levels instead of the temperature. In those cases, they analyzed what would happen if emissions were reduced to zero around mid-century (2060) or at the end of the century (2100). In the first case, the global temperature peaked around the 2-degree target and then declined. But in the second case, the temperature rose above 2 degrees around 2043 and stayed there for a century or more."The timing of when emissions are reduced really matters," O'Neill said. "We could meet the goal set out in the Paris Agreement of reducing emissions to zero in the second half of the century and still wildly miss the temperature targets in the same agreement if we wait to take action."The new study is part of a growing body of research that seeks to better understand and define what it will take to comply with the Paris Agreement. For example, another recent study — led by Tom Wigley, a climate scientist at the University of Adelaide who holds an honorary appointment at NCAR — also looks at the quantity and timing of emissions cuts needed to stabilize global temperature rise at 1.5 or 2 degrees above preindustrial levels. This work focuses in particular on implications for emissions of carbon dioxide, the main component of the broader greenhouse gas emissions category that makes up the Paris emissions target.O'Neill and Tanaka believe their work might be useful as countries begin to report the progress they've made reducing their emissions and adjust their goals. These periods of reporting and readjusting, known as global stocktakes, are formalized as part of the Paris Agreement and occur every five years."Our study and others may help provide countries with a clearer understanding of what work needs to be done to meet the goals laid out in the agreement. We believe that the Paris Agreement needs this level of scientific interpretation," Tanaka said.
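The study itself relies on a detailed integrated assessment model, but the basic tension between the two goals can be illustrated with a deliberately crude emissions-to-temperature toy model. The sketch below is not the model used in the research; every parameter is a round-number assumption chosen only to show qualitatively why the timing of cuts, not just their eventual depth, controls end-of-century warming.

```python
# A minimal, illustrative emissions-to-temperature toy model (not the model used
# in the study). All parameter values are round-number assumptions.
import numpy as np

def run_scenario(emissions_gtc_per_yr, years, c0=400.0, c_pre=280.0,
                 airborne_fraction=0.45, decay_time=100.0,
                 ecs=3.0, response_time=40.0):
    """Step a two-equation toy climate forward in time.

    emissions_gtc_per_yr : annual CO2 emissions (GtC/yr)
    c0  : starting CO2 concentration (ppm)
    ecs : assumed equilibrium warming per CO2 doubling (deg C)
    Returns CO2 concentration (ppm) and warming above preindustrial (deg C).
    """
    ppm_per_gtc = 0.47 * airborne_fraction   # crude conversion, GtC emitted -> ppm retained
    conc = np.empty(len(years))
    temp = np.empty(len(years))
    c, t = c0, 1.0                           # assume ~1 deg C of warming already realized
    for i, e in enumerate(emissions_gtc_per_yr):
        c += e * ppm_per_gtc - (c - c_pre) / decay_time   # uptake relaxes CO2 toward preindustrial
        t_eq = ecs * np.log2(c / c_pre)                   # equilibrium warming for this CO2 level
        t += (t_eq - t) / response_time                   # ocean thermal inertia: slow relaxation
        conc[i], temp[i] = c, t
    return conc, temp

years = np.arange(2020, 2101)
# Two stylized pathways: rapid cuts made early versus the same deep cuts made late.
early = np.interp(years, [2020, 2040, 2100], [10.0, 2.0, 2.0])
late = np.interp(years, [2020, 2080, 2100], [10.0, 10.0, 0.0])
for name, path in [("early cuts", early), ("late cuts", late)]:
    _, t = run_scenario(path, years)
    print(f"{name}: warming in 2100 = {t[-1]:.2f} deg C")
```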

Cutting greenhouse gas emissions would help spare cities worldwide from rising seas

BOULDER, Colo. — Coastal cities worldwide would face a reduced threat from sea level rise if society reduced greenhouse gas emissions, with especially significant benefits for New York and other U.S. East Coast cities, new research indicates.

The study, by scientists at the National Center for Atmospheric Research (NCAR), used a powerful computer model to tease out the ways that winds and currents in a warming world push ocean water around, lifting it in some regions and lowering it in others. The scientists examined how these variations in sea level rise would change under two conditions: if emissions continue on their current trajectory, or if they are sharply reduced.

The results showed that, if society can implement cuts soon on emissions of carbon dioxide and other heat-trapping gases, the projected increases in sea level around the globe would be significantly less toward the end of the century. This would help coastal cities in much of the world as they try to fend off rising waters, with the benefits most pronounced for cities on parts of the Atlantic and Indian oceans.

Projected sea level rise for major cities worldwide will vary significantly later this century, depending on whether society continues to increase emissions of greenhouse gases at the current rate (a scenario known as RCP 8.5) or begins to sharply reduce them (RCP 4.5). Some cities, such as New York and London, would see particularly pronounced benefits if society cuts emissions. For more details on the range of projected sea level rise for major cities, see the table below. (Graphic by Simmi Sinha, ©UCAR. This graphic is freely available for media & nonprofit use.)

"Mitigating greenhouse gases will reduce sea level rise later this century, with some regions seeing especially significant benefits," said NCAR scientist Aixue Hu, the lead author of the new study. "As city officials prepare for sea level rise, they can factor in the compounding effect of local conditions, which are due to the winds and currents that cause internal variability in the oceans."

Hu and his co-author, NCAR scientist Susan Bates, caution that the modeling study presents an incomplete picture, because it does not include runoff from melting ice sheets and glaciers — two factors that scientists are just now incorporating into computer models. Instead, it simulates the influence of climate change on variations in sea level worldwide to reveal which coastlines will benefit most from emission reductions associated with the additional heat absorbed by the ocean.

The study, published this month in the journal Nature Communications, was funded by the U.S. Department of Energy and by the National Science Foundation, which is NCAR's sponsor.

Global changes with local impacts

Sea level rise is one of the most consequential impacts of climate change, threatening to swamp low-lying islands and major coastal cities.
Sea levels in some regions are expected to rise by several feet by the end of this century, due to a combination of melting ice sheets and glaciers (which account for about two-thirds of sea level rise) along with thermal expansion, or ocean waters expanding as they warm (which accounts for the remaining one-third).To study how changes in emissions would affect global sea level rise and local variations, Hu and Bates used two sets of computer simulations that are based on two different greenhouse gas scenarios.In the business-as-usual scenario, with emissions from human activity continuing to increase at current rates, global temperatures by late this century would rise by about 5.4 degrees Fahrenheit (3 degrees Celsius) over late 20th century levels. In the moderate mitigation scenario, with society taking steps to reduce greenhouse gases, warming would be held to about 3.2 degrees F (1.8 degrees C).The scientists found that reducing greenhouse gas emissions would not significantly restrain sea level rise for the next two decades. The reason, in part, has to do with the inertia of the climate system (once heat enters the oceans, it is retained for a period of time). In addition, winds and currents are naturally variable from year to year, pushing ocean water in different directions and making it hard to discern the full impact of planet-scale warming over the span of a decade or two.But the scientists found that later in the century, from 2061 to 2080, reduced emissions would have a significant impact across almost the entire world. The simulations showed that the extent of mean global sea level rise from thermal heat expansion (but not runoff from melting ice) was reduced by about 25 percent, from about 17.8 centimeters (7 inches) in the business-as-usual scenario to 13.2 centimeters (5.2 inches) in the moderate mitigation scenario.Locally, winds and currents make a differenceFor some cities, the benefits of the lower-emission scenario would be especially significant. New York City, where sea levels this century are expected to rise more than almost anywhere else in the world, would see a difference of 9.8 centimeters (3.9 inches). Other cities that would see a greater-than-average reduction include Boston (9.3 cm/3.7 in), London (8.3 cm/3.3 in), Dar es Salaam (6.8 cm/2.7 in), Miami (6.5 cm/2.6 in), and Mumbai (5.8 cm/2.3 in).On the other hand, some cities in South America (such as Buenos Aires), Asia (such as Bangkok and Jakarta), Australia (such as Melbourne), and the west coast of North America (such as Vancouver and San Francisco) would see lower-than-average benefits. And reducing greenhouse gases would have no statistically significant effect on sea level rise along the western coasts of Australia and the Philippines.The reason for the local differences in sea level rise has to do with the influence (or lack thereof) of a changing climate on major currents and on atmosphere-ocean interactions around the globe.In the northern Atlantic, for example, warming temperatures are expected to weaken the Gulf Stream that transports warmer water from the subtropics to the Arctic. The powerful current draws water away from much of the east coast of the United States, and scientists have warned that a weakening current would send those waters back toward the coastline and significantly raise sea levels. 
If actions taken by society resulted in reduced emissions, the Gulf Stream would be less affected and, therefore, sea level rise in the north Atlantic would be less substantial.

In contrast, the currents in some other ocean basins appear to be less sensitive to climate change. Across much of the Pacific, for example, sea levels are influenced by the Pacific Decadal Oscillation, a phenomenon related to winds and sea surface temperatures. Although climate change is affecting winds and causing sea surface temperatures to rise in the Pacific, it is not disrupting currents there as much as it is in the northern Atlantic. As a result, climate change mitigation that reduces thermal expansion would generally have a less significant effect on Pacific sea levels.

The study also found greater variations in future sea level rise in different regions, including some cities where local sea levels are influenced by the Pacific Decadal Oscillation or by an Atlantic climate pattern known as the North Atlantic Oscillation. As a result, the projected sea level rise in the model varied more for London and Tokyo than for New York.

"City planners in some places will be able to make decisions based on more certain sea level projections, but for other places it's going to be more difficult to know what the sea levels will be," Bates said.

About the paper
Title: Internal climate variability and projected future regional steric and dynamic sea level rise
Authors: Aixue Hu and Susan Bates
Journal: Nature Communications

New research estimates the extent to which sea level rise would be reduced for major cities worldwide by later this century if society cuts greenhouse gas emissions. These tables incorporate projections based on the thermal expansion of ocean water as well as on the localized impacts of winds and currents, but they do not include additional sea level rise caused by the melting of ice sheets and glaciers. (Data produced by Aixue Hu and Susan Bates, NCAR. Graphic by Simmi Sinha, UCAR. This graphic is freely available for media & nonprofit use.)
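For readers who want to check the arithmetic behind the city comparisons above, the short sketch below simply restates the numbers quoted in the article — the global mean reduction in thermal-expansion sea level rise (from about 17.8 cm to 13.2 cm) and the per-city benefits — and compares them. No additional data are involved.

```python
# Compare each city's quoted mitigation benefit (RCP 4.5 vs. RCP 8.5, late century,
# thermal expansion and circulation only) with the global mean reduction.
city_benefit_cm = {
    "New York": 9.8, "Boston": 9.3, "London": 8.3,
    "Dar es Salaam": 6.8, "Miami": 6.5, "Mumbai": 5.8,
}
global_mean_benefit_cm = 17.8 - 13.2   # about 4.6 cm

print(f"Global mean reduction: {global_mean_benefit_cm:.1f} cm "
      f"({100 * global_mean_benefit_cm / 17.8:.0f}% of the business-as-usual rise)")
for city, benefit in city_benefit_cm.items():
    ratio = benefit / global_mean_benefit_cm
    print(f"{city}: {benefit:.1f} cm, about {ratio:.1f}x the global-mean benefit")
```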

The rate of sea level rise is accelerating, a new study finds

NCAR scientist John Fasullo is a co-author of a new study appearing in the Proceedings of the National Academy of Sciences. The study finds that the rate of sea level rise is accelerating. The following is an excerpt from a news release by the Cooperative Institute for Research in Environmental Sciences.

February 13, 2018 | Global sea level rise is not cruising along at a steady 3 mm per year. It's accelerating a little every year, according to a new study that harnessed 25 years of satellite data to calculate that the rate is increasing by about 0.08 mm/year every year — which could mean an annual rate of sea level rise of 10 mm/year, or even more, by 2100.

"This acceleration, driven mainly by accelerated melting in Greenland and Antarctica, has the potential to double the total sea level rise by 2100 as compared to projections that assume a constant rate — to more than 60 cm instead of about 30," said lead author Steve Nerem, a scientist at the Cooperative Institute for Research in Environmental Sciences. "And this is almost certainly a conservative estimate," he added. "Our extrapolation assumes that sea level continues to change in the future as it has over the last 25 years. Given the large changes we are seeing in the ice sheets today, that's not likely."

If the oceans continue to change at this pace, sea level will rise 65 cm (26 inches) by 2100 — enough to cause significant problems for coastal cities, according to the new assessment by Nerem and several colleagues from CU Boulder, the University of South Florida, NASA Goddard Space Flight Center, Old Dominion University, and the National Center for Atmospheric Research. The team, driven to understand and better predict Earth's response to a warming world, published their work today in the journal Proceedings of the National Academy of Sciences.

Rising concentrations of greenhouse gases in Earth's atmosphere increase the temperature of air and water, which causes sea level to rise in two ways. First, warmer water expands, and this "thermal expansion" of the oceans has contributed about half of the 7 cm of global mean sea level rise we've seen over the last 25 years, Nerem said. Second, melting land ice flows into the ocean, also increasing sea level across the globe.

These increases were measured using satellite altimeter measurements since 1992, including the U.S./European TOPEX/Poseidon, Jason-1, Jason-2, and Jason-3 satellite missions. But detecting acceleration is challenging, even in such a long record. Episodes like volcanic eruptions can create variability: the eruption of Mount Pinatubo in 1991 decreased global mean sea level just before the TOPEX/Poseidon satellite launch, for example. In addition, global sea level can fluctuate due to climate patterns such as El Niños and La Niñas (the opposing phases of the El Niño Southern Oscillation, or ENSO), which influence ocean temperature and global precipitation patterns.

Read the full news release here.
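The extrapolation described in the release amounts to simple constant-acceleration arithmetic. The back-of-the-envelope sketch below, which is not the paper's formal statistical method, reproduces the quoted figures from the roughly 3 mm/yr rate and 0.08 mm/yr-per-year acceleration, measured from about the middle of the 25-year satellite record.

```python
# Back-of-the-envelope version of the constant-acceleration extrapolation.
def projected_rise_mm(years_ahead, rate_mm_per_yr=3.0, accel_mm_per_yr2=0.08):
    """Sea level rise after `years_ahead` years under constant acceleration."""
    return rate_mm_per_yr * years_ahead + 0.5 * accel_mm_per_yr2 * years_ahead**2

t = 2100 - 2005                        # years from roughly the middle of the satellite record
rate_2100 = 3.0 + 0.08 * t             # the rate keeps climbing toward ~10 mm/yr or more
print(f"Rate in 2100: about {rate_2100:.1f} mm/yr")
print(f"Rise by 2100: about {projected_rise_mm(t) / 10:.1f} cm")          # roughly 65 cm
print(f"Constant-rate rise for comparison: about {3.0 * t / 10:.1f} cm")  # roughly 30 cm
```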

Drier and wetter: The future of precipitation variability

January 17, 2018 | Precipitation variability — the swing from dry to wet and back again — will continue to increase across the majority of the world's land area as the climate warms, according to a new study led by scientists at the National Center for Atmospheric Research.

The researchers expect precipitation variability to become greater from day to day, year to year, and even decade to decade. The new research, published in the Nature journal Scientific Reports, provides results from sophisticated computer simulations that predict that there will be both more droughts and more floods within the same areas as the climate warms. The findings are relevant for water managers who need to make long-range plans.

"When it's dry, it will be drier. When it's wet, it will be wetter — in the same place," said NCAR scientist Angeline Pendergrass, lead author of the study. "There will be a broader range of conditions that will become 'normal.'"

The research was funded by the National Science Foundation, which is NCAR's sponsor, and by the U.S. Department of Energy.

As the climate continues to warm, the range of precipitation that is "normal" in a particular place is likely to grow, meaning a single location can become both wetter and drier. The image on the left shows a flood in Colorado. The image on the right shows a drought in Texas. (Images courtesy the U.S. Department of Defense and U.S. Department of Agriculture.)

New tools to study changes in precipitation

Historically, changes in precipitation variability have been difficult to pin down because the amount of rain or snow a particular region gets can vary a great deal naturally.

But in recent years, the availability of large ensembles of climate model runs has allowed scientists to begin separating some of the more subtle impacts of climate change from the natural chaos in the climate system. These ensembles may include 30 or 40 runs of a single climate model over the same time period with slightly different, but equally plausible, initial conditions.
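The way such an ensemble is used can be illustrated with a small synthetic example: compute a variability statistic for each member separately, then average across members so that the forced change stands out from each run's internal noise. The sketch below uses made-up data and round-number assumptions; it shows only the logic, not the study's actual analysis.

```python
# Illustrative large-ensemble logic with synthetic annual precipitation totals.
import numpy as np

rng = np.random.default_rng(0)
n_members, n_years = 40, 30

def annual_precip(mean_mm, sigma_mm):
    """Fake annual precipitation totals (mm), one row per ensemble member."""
    return rng.normal(mean_mm, sigma_mm, size=(n_members, n_years))

# Pretend late-century precipitation is slightly wetter and noticeably more variable.
present = annual_precip(mean_mm=800.0, sigma_mm=80.0)
future = annual_precip(mean_mm=830.0, sigma_mm=92.0)

# Year-to-year variability for each member, then the ensemble mean of that statistic.
var_present = present.std(axis=1, ddof=1)
var_future = future.std(axis=1, ddof=1)

change_pct = 100.0 * (var_future.mean() - var_present.mean()) / var_present.mean()
spread_pct = 100.0 * np.std(var_future - var_present, ddof=1) / var_present.mean()
print(f"Forced change in year-to-year variability: {change_pct:+.1f}% "
      f"(member-to-member spread about {spread_pct:.1f}%)")
```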
Pendergrass and her colleagues, NCAR scientists Flavio Lehner, Clara Deser, and Benjamin Sanderson, along with ETH-Zürich professor Reto Knutti, took a closer look at precipitation variability using large ensembles of runs from the NCAR-based Community Earth System Model (CESM) and from the Geophysical Fluid Dynamics Laboratory (GFDL) climate model. They also looked at a collection of individual runs taken from many different climate models, known as the Coupled Model Intercomparison Project Phase 5, or CMIP5.

The team found that precipitation variability will likely increase substantially over two-thirds of the world's land areas by the end of the century if greenhouse gas emissions continue unabated. They also found that, on average, variability increases 4 to 5 percent over land per degree Celsius of warming and that variability increases across all time scales, from days to decades.

"This increase in variability is arising due to more moisture in the atmosphere and a weakening of global atmospheric circulation," Pendergrass said. "That's important because it means that changes in precipitation variability are not just linked to changes in El Niño and La Niña events, as some previous work implied."

Helping water managers plan for the future

Pendergrass hopes the study's findings will be used by water managers in their future planning. Models used today by water managers often assume that the change in precipitation variability in the future will track with the expected increase in average precipitation.

But the new study finds that the increase in precipitation variability will outstrip the increase in average precipitation, which means that water managers may be miscalculating the magnitude of future swings from wet to dry or vice versa.

"Water managers may be underestimating how much heavy events — floods or droughts — will change," Pendergrass said.

About the article
Title: Precipitation variability increases in a warmer climate
Authors: Pendergrass, A. G., R. Knutti, F. Lehner, C. Deser, and B. M. Sanderson
Journal: Scientific Reports, DOI: 10.1038/s41598-017-17966-y
Writer/contact: Laura Snider, Senior Science Writer

The climate secrets of southern clouds

BOULDER, Colo. — This month, an international team of scientists will head to the remote Southern Ocean for six weeks to tackle one of the region's many persistent mysteries: its clouds.What they discover will be used to improve climate models, which routinely underestimate the amount of solar radiation reflected back into space by clouds in the region. Accurately simulating the amount of radiation that is absorbed or reflected on Earth is key to calculating how much the globe is warming.The field campaign, called the Southern Ocean Clouds, Radiation, Aerosol Transport Experimental Study, or SOCRATES, could also help scientists understand the very nature of how clouds interact with aerosols — particles suspended in the atmosphere that can be from either natural or human-made sources. Aerosols can spur cloud formation, change cloud structure, and affect precipitation, all of which affect the amount of solar radiation that is reflected.During the mission, which will run from mid-January through February, the scientists will collect data from a bevy of advanced instruments packed onboard an aircraft and a ship, both of which are specially designed for scientific missions."SOCRATES will allow for some of the best observations of clouds, aerosols, radiation, and precipitation that have ever been collected over the Southern Ocean," said Greg McFarquhar, a principal investigator and the director of the University of Oklahoma Cooperative Institute for Mesoscale Meteorological Studies (CIMMS). "These data will provide us with critical insight into the physics of cloud formation in the region, information we can use to improve global climate models."The U.S. portion of SOCRATES is largely funded by the National Science Foundation (NSF).“The Southern Ocean is famously remote and stormy and it's hard to imagine a worse place to do a field campaign. But a vast, stormy ocean is a great laboratory for studying clouds, and it's clear from our models that we have a lot to learn about them,” said Eric DeWeaver, program director for Climate and Large-Scale Dynamics in NSF’s Geoscience directorate."I'm excited about this campaign because I think it will answer some fundamental questions about clouds and their dependence on atmospheric conditions," DeWeaver said. "We'll be able to use this information to understand cloud behavior closer to home and how clouds are likely to adjust to changing climatic conditions."Critical observing and logistical support for SOCRATES is being provided by the Earth Observing Laboratory (EOL) at the National Center for Atmospheric Research (NCAR). Other U.S. principal investigators are based at the University of Washington.The Australian portion of SOCRATES is largely funded by the country's government through the Australian Marine National Facility, which is owned and operated by CSIRO.A supercooled mysteryMcFarquhar and his colleagues think the reason that climate models are not accurately capturing the amount of radiation reflected by clouds above the Southern Ocean is because they may not be correctly predicting the composition of the clouds. In particular, the models may not be producing enough supercooled water — droplets that stay liquid even when the temperature is below freezing.One possible explanation for the problem is the way models represent how clouds interact with aerosols, a process that affects the amount of supercooled water in a cloud. 
These representations were developed from atmospheric observations, largely in the Northern Hemisphere, where most of the world's population lives.But the atmosphere over the Northern Hemisphere — even over the Arctic — contains many more pollutants, including aerosols, than the atmosphere over the Southern Ocean, which is relatively pristine."We don't know how appropriate the representations of these processes are for the Southern Hemisphere," McFarquhar said. "SOCRATES will give us an opportunity to observe these cloud-aerosol interactions and see how much they differ, if at all, from those in the Northern Hemisphere."Flying through hazardous cloudsThe NSF/NCAR HIAPER Gulfstream V has been modified to serve as a flying laboratory. (©UCAR. This figure is freely available for media & nonprofit use.)For the SOCRATES field campaign, observations will be taken from the NSF/NCAR High-performance Instrumented Airborne Platform for Environmental Research, or HIAPER, a highly modified Gulfstream V aircraft, and the R/V Investigator, an Australian deep-ocean research vessel."Much of what we currently know about Southern Ocean cloud, aerosol, and precipitation properties comes from satellite-based estimates, which are uncertain and have undergone few comparisons against independent data," said co-investigator Roger Marchand, a scientist at the University of Washington. "The data collected during SOCRATES will also enable us to evaluate current satellite data over the Southern Ocean, as well as potentially help in the design of better satellite-based techniques."The research aircraft will be based out of Hobart, Tasmania, and will make about 16 flights over the Southern Ocean during the course of the campaign. The many high-tech instruments on board will measure the size and distribution of cloud droplets, ice crystals, and aerosols, as well as record the temperature, winds, air pressure, and other standard atmospheric variables.The instruments include NCAR's HIAPER Cloud Radar (HCR) and High Spectral Resolution Lidar (HSRL). The wing-mounted HCR is able to "see" inside clouds and characterize the droplets within, while the HSRL can measure air molecules and aerosols. Together, the two highly advanced instruments will give scientists a more complete picture of the wide range of particles in the atmosphere above the Southern Ocean.The nature of the research — flying a plane in search of supercooled water —presents some challenges with aircraft icing."Oftentimes, the cleaner the air, the more probable large drops and severe icing conditions become," said Cory Wolff, the NCAR project manager who is overseeing aircraft operations for SOCRATES. "We have a number of precautions we're taking to mitigate that risk."First, a mission coordinator whose sole job is to monitor icing conditions will join each flight. Second, the design of the flights themselves will help the crew anticipate icing conditions before they have to fly through them. On the flight south from Tasmania, the HIAPER GV will fly high above the clouds — and the icing danger. 
During that leg of the flight, the scientists will collect information about the clouds below, both with onboard radar and lidar as well as with dropsondes — small instrument packages released from the aircraft.

With that information, the scientists can determine whether it's safe to pilot the aircraft through the clouds on the return trip, collecting detailed information about the cloud composition.

Sailing the stormiest seas

The Australian R/V Investigator will take measurements of the atmosphere and ocean during its six-week voyage. (Image courtesy CSIRO.)

The measurements taken from the sky will be complemented by data collected from instruments on board the Australian R/V Investigator, including the NCAR Integrated Sounding System. The ISS gathers extensive data by using a radar wind profiler, surface meteorology sensors, and a balloon-borne radiosonde sounding system. The team will launch soundings every six hours, and sometimes more often, throughout the campaign.

"Observations from the ship will help us understand the background state of the atmosphere — how it's behaving," said NCAR scientist Bill Brown, who traveled to Australia in late November to prepare the ISS for the voyage.

The ship will be deployed for the entire six weeks and will face its own challenges, notably the notorious roughness of the Southern Ocean, sometimes called the stormiest place on Earth.

"There are no land masses to break up the winds down there," Brown said. "So the ocean can be quite rough."

SOCRATES investigators will also draw on measurements from another Australian ship, the R/V Aurora Australis, as it travels between Tasmania and Antarctica on resupply missions, as well as observations from buoys and some land-based instruments on Macquarie Island.

"I am excited that we will have such a comprehensive suite of observations," McFarquhar said. "If we just had the cloud observations we wouldn't have the appropriate context. If we just had the aerosols and measurements below the clouds, we wouldn't be able to understand the complete picture."

For more about the SOCRATES campaign, visit the project website.

Collaborating institutions:
Australian Antarctic Division
Australian Bureau of Meteorology
Australian Department of Environment and Energy
Colorado State University
Cooperative Institute for Mesoscale Meteorological Studies
CSIRO
Karlsruhe Institute of Technology
Monash University
National Center for Atmospheric Research
National Science Foundation
NorthWest Research Associates
Queensland University of Technology
University of California San Diego
University of Colorado Boulder
University of Illinois at Urbana-Champaign
University of Melbourne
University of Oklahoma
University of Washington

Groundbreaking data set gives unprecedented look at future weather

Dec. 7, 2017 | How will weather change in the future? It's been remarkably difficult to say, but researchers are now making important headway, thanks in part to a groundbreaking new data set at the National Center for Atmospheric Research (NCAR).Scientists know that a warmer and wetter atmosphere will lead to major changes in our weather. But pinning down exactly how weather — such as thunderstorms, midwinter cold snaps, hurricanes, and mountain snowstorms — will evolve in the coming decades has proven a difficult challenge, constrained by the sophistication of models and the capacity of computers.Now, a rich, new data set is giving scientists an unprecedented look at the future of weather. Nicknamed CONUS 1 by its creators, the data set is enormous. To generate it, the researchers ran the NCAR-based Weather Research and Forecasting model (WRF) at an extremely high resolution (4 kilometers, or about 2.5 miles) across the entire contiguous United States (sometimes known as "CONUS") for a total of 26 simulated years: half in the current climate and half in the future climate expected if society continues on its current trajectory.The project took more than a year to run on the Yellowstone supercomputer at the NCAR-Wyoming Supercomputing Center. The result is a trove of data that allows scientists to explore in detail how today's weather would look in a warmer, wetter atmosphere.CONUS 1, which was completed last year and made easily accessible through NCAR's Research Data Archive this fall, has already spawned nearly a dozen papers that explore everything from changes in rainfall intensity to changes in mountain snowpack."This was a monumental effort that required a team of researchers with a broad range of expertise — from climate experts and meteorologists to social scientists and data specialists — and years of work," said NCAR senior scientist Roy Rasmussen, who led the project. "This is the kind of work that's difficult to do in a typical university setting but that NCAR can take on and make available to other researchers."Other principal project collaborators at NCAR are Changhai Liu and Kyoko Ikeda. A number of additional NCAR scientists lent expertise to the project, including Mike Barlage, Andrew Newman, Andreas Prein, Fei Chen, Martyn Clark, Jimy Dudhia, Trude Eidhammer, David Gochis, Ethan Gutmann, Gregory Thompson, and David Yates. Collaborators from the broader community include Liang Chen, Sopan Kurkute, and Yanping Li (University of Saskatchewan); and Aiguo Dai (SUNY Albany).Climate and weather research coming togetherClimate models and weather models have historically operated on different scales, both in time and space. Climate scientists are interested in large-scale changes that unfold over decades, and the models they've developed help them nail down long-term trends such as increasing surface temperatures, rising sea levels, and shrinking sea ice.Climate models are typically low resolution, with grid points often separated by 100 kilometers (about 60 miles). The advantage to such coarse resolution is that these models can be run globally for decades or centuries into the future with the available supercomputing resources. 
The downside is that they lack the detail to capture features that influence local atmospheric events, such as land surface topography, which drives mountain weather, or the small-scale circulation of warm air rising and cold air sinking that sparks a summertime thunderstorm.

Weather models, on the other hand, have higher resolution, take into account atmospheric microphysics, such as cloud formation, and can simulate weather fairly realistically. It's not practical to run them for long periods of time or globally, however — supercomputers are not yet up to the task.

As scientific understanding of climate change has deepened, the need has become more pressing to merge these disparate scales to gain better understanding of how global atmospheric warming will affect local weather patterns.

"The climate community and the weather community are really starting to come together," Rasmussen said. "At NCAR, we have both climate scientists and weather scientists, we have world-class models, and we have access to state-of-the-art supercomputing resources. This allowed us to create a data set that offers scientists a chance to start answering important questions about the influence of climate change on weather."

Weather models are typically run with a much higher resolution than climate models, allowing them to more accurately capture precipitation. This figure compares the average annual precipitation from a 13-year run of the Weather Research and Forecasting (WRF) model (left) with the average annual precipitation from running the global Community Earth System Model (CESM) to simulate the climate between 1976 and 2005 (right). (©UCAR. This figure is courtesy Kyoko Ikeda. It is freely available for media & nonprofit use.)

Today's weather in a warmer, wetter future

To create the data set, the research team used WRF to simulate weather across the contiguous United States between 2000 and 2013. They initiated the model using a separate "reanalysis" data set constructed from observations. When compared with radar images of actual weather during that time period, the results were excellent.

"We weren't sure how good a job the model would do, but the climatology of real storms and the simulated storms was very similar," Rasmussen said.

With confidence that WRF could accurately simulate today's weather, the scientists ran the model for a second 13-year period, using the same reanalysis data but with a few changes. Notably, the researchers increased the temperature of the background climate conditions by about 5 degrees Celsius (9 degrees Fahrenheit), the end-of-the-century temperature increase predicted by averaging 19 leading climate models under a business-as-usual greenhouse gas emissions scenario (2080–2100). They also increased the water vapor in the atmosphere by the corresponding amount, since physics dictates that a warmer atmosphere can hold more moisture.
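A rough sketch of that perturbation step, under simplifying assumptions (a spatially uniform 5-degree increment and the standard roughly 7 percent per degree Celsius moisture scaling; the function and variable names are invented for illustration and are not the project's code):

```python
# Minimal "warm the driving conditions and moisten them consistently" sketch.
import numpy as np

CC_SCALING_PER_DEGC = 0.07   # ~7% more water vapor capacity per deg C of warming

def perturb_boundary_conditions(temperature_k, specific_humidity_kg_kg, delta_t_k=5.0):
    """Return warmed temperature and correspondingly moistened humidity fields."""
    warmed_t = temperature_k + delta_t_k
    moistened_q = specific_humidity_kg_kg * (1.0 + CC_SCALING_PER_DEGC) ** delta_t_k
    return warmed_t, moistened_q

# Toy example: a single grid cell at 288 K holding 8 g/kg of water vapor.
t_now = np.array([288.0])
q_now = np.array([0.008])
t_future, q_future = perturb_boundary_conditions(t_now, q_now)
print(f"Temperature: {t_now[0]:.1f} K -> {t_future[0]:.1f} K")
print(f"Specific humidity: {1000 * q_now[0]:.2f} g/kg -> {1000 * q_future[0]:.2f} g/kg "
      f"(+{100 * ((1 + CC_SCALING_PER_DEGC)**5 - 1):.0f}%)")
```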
The result is a data set that examines how weather events from the recent past — including named hurricanes and other distinctive weather events — would look in our expected future climate.

The data have already proven to be a rich resource for people interested in how individual types of weather will respond to climate change. Will the squall lines of intense thunderstorms that rake across the country's midsection get more intense, more frequent, and larger? (Yes, yes, and yes.) Will snowpack in the West get deeper, shallower, or disappear? (Depends on the location: The high-elevation Rockies are much less vulnerable to the warming climate than the coastal ranges.) Other scientists have already used the CONUS 1 data set to examine changes to rainfall-on-snow events, the speed of snowmelt, and more.

Running the Weather Research and Forecasting (WRF) model at 4-kilometer resolution over the contiguous United States produced realistic simulations of precipitation. Above, average annual precipitation from WRF for the years 2000-2013 (left) compared to the PRISM dataset for the same period (right). PRISM is based on observations. (©UCAR. This figure is courtesy Kyoko Ikeda. It is freely available for media & nonprofit use.)

Pinning down changes in storm track

While the new data set offers a unique opportunity to delve into changes in weather, it also has limitations. Importantly, it does not reflect how the warming climate might shift large-scale weather patterns, like the typical track most storms take across the United States. Because the same reanalysis data set is used to kick off both the current and future climate simulations, the large-scale weather patterns remain the same in both scenarios.

To remedy this, the scientists are already working on a new simulation, nicknamed CONUS 2.

Instead of using the reanalysis data set — which was built from actual observations — to kick off the modeling run of the present-day climate, the scientists will use weather extracted from a simulation by the NCAR-based Community Earth System Model (CESM). For the future climate run, the scientists will again take the weather patterns from a CESM simulation — this time for the year 2100 — and feed the information into WRF.

The finished data set, which will cover two 20-year periods, will likely take another year of supercomputing time to complete, this time on the newer and more powerful Cheyenne system at the NCAR-Wyoming Supercomputing Center. When complete, CONUS 2 will help scientists understand how expected future storm track changes will affect local weather across the country.

Scientists are already eagerly awaiting the data from the new runs, which could start in early 2018. But even that data set will have limitations. One of the greatest may be that it will rely on a single run from CESM. Another NCAR-based project ran the model 40 times from 1920 to 2100 with only minor changes to the model's starting conditions, showing researchers how the natural chaos of the atmosphere can cause the climate to look quite different from simulation to simulation.

Still, a single run of CESM will let scientists make comparisons between CONUS 1 and CONUS 2, allowing them to pinpoint possible storm track changes in local weather. And CONUS 2 can also be compared to other efforts that downscale global simulations to study how regional areas will be affected by climate change, providing insight into the pros and cons of different research approaches.

"This is a new way to look at climate change that allows you to examine the phenomenology of weather and answer the question, 'What will today's weather look like in a future climate?'" Rasmussen said.
"This is the kind of detailed, realistic information that water managers, city planners, farmers, and others can understand and helps them plan for the future."Get the dataHigh Resolution WRF Simulations of the Current and Future Climate of North America, DOI: 10.5065/D6V40SXPStudies that relied on the CONUS data setExtreme downpours could increase fivefold across parts of the U.S.Slower snowmelt in a warming worldNorth American storm clusters could produce 80 percent more rainWriter/contact:Laura Snider, Senior Science Writer

North American storm clusters could produce 80 percent more rain

BOULDER, Colo. — Major clusters of summertime thunderstorms in North America will grow larger, more intense, and more frequent later this century in a changing climate, unleashing far more rain and posing a greater threat of flooding across wide areas, new research concludes.

The study, by scientists at the National Center for Atmospheric Research (NCAR), builds on previous work showing that storms are becoming more intense as the atmosphere is warming. In addition to higher rainfall rates, the new research finds that the volume of rainfall from damaging storms known as mesoscale convective systems (MCSs) will increase by as much as 80 percent across the continent by the end of this century, deluging entire metropolitan areas or sizable portions of states.

"The combination of more intense rainfall and the spreading of heavy rainfall over larger areas means that we will face a higher flood risk than previously predicted," said NCAR scientist Andreas Prein, the study's lead author. "If a whole catchment area gets hammered by high rain rates, that creates a much more serious situation than a thunderstorm dropping intense rain over parts of the catchment."

"This implies that the flood guidelines which are used in planning and building infrastructure are probably too conservative," he added.

The research team drew on extensive computer modeling that realistically simulates MCSs and thunderstorms across North America to examine what will happen if emissions of greenhouse gases continue unabated.

The study will be published Nov. 20 in the journal Nature Climate Change. It was funded by the National Science Foundation, which is NCAR's sponsor, and by the U.S. Army Corps of Engineers.

Hourly rain rate averages for the 40 most extreme summertime mesoscale convective systems (MCSs) in the current (left) and future climate of the mid-Atlantic region. New research shows that MCSs will generate substantially higher maximum rain rates over larger areas by the end of the century if society continues a "business as usual" approach of emitting greenhouse gases. (©UCAR, Image by Andreas Prein, NCAR. This image is freely available for media & nonprofit use.)

A warning signal

Thunderstorms and other heavy rainfall events are estimated to cause more than $20 billion of economic losses annually in the United States, the study notes. Particularly damaging, and often deadly, are MCSs: clusters of thunderstorms that can extend for many dozens of miles and last for hours, producing flash floods, debris flows, landslides, high winds, and/or hail. The persistent storms over Houston in the wake of Hurricane Harvey were an example of an unusually powerful and long-lived MCS.

Storms have become more intense in recent decades, and a number of scientific studies have shown that this trend is likely to continue as temperatures continue to warm. The reason, in large part, is that the atmosphere can hold more water as it gets warmer, thereby generating heavier rain.

A study by Prein and co-authors last year used high-resolution computer simulations of current and future weather, finding that the number of summertime storms that produce extreme downpours could increase by five times across parts of the United States by the end of the century. In the new study, Prein and his co-authors focused on MCSs, which are responsible for much of the major summertime flooding east of the Continental Divide.
They investigated not only how their rainfall intensity will change in future climates, but also how their size, movement, and rainfall volume may evolve.

Analyzing the same dataset of computer simulations and applying a special storm-tracking algorithm, they found that the number of severe MCSs in North America more than tripled by the end of the century. Moreover, maximum rainfall rates became 15 to 40 percent heavier, and intense rainfall reached farther from the storm's center. As a result, severe MCSs increased throughout North America, particularly in the northeastern and mid-Atlantic states, as well as parts of Canada, where they are currently uncommon.

The research team also looked at the potential effect of particularly powerful MCSs on the densely populated Eastern Seaboard. They found, for example, that at the end of the century, intense MCSs over an area the size of New York City could drop 60 percent more rain than a severe present-day system. That amount is equivalent to adding six times the annual discharge of the Hudson River on top of a current extreme MCS in that area.

"This is a warning signal that says the floods of the future are likely to be much greater than what our current infrastructure is designed for," Prein said. "If you have a slow-moving storm system that aligns over a densely populated area, the result can be devastating, as could be seen in the impact of Hurricane Harvey on Houston."

This satellite image loop shows an MCS developing over West Virginia on June 23, 2016. The storm caused widespread flooding, killing more than 20 people. MCSs are responsible for much of the major flooding east of the Continental Divide during warm weather months. (Image by NOAA National Weather Service, Aviation Weather Center.)

Intensive modeling

Advances in computer modeling and more powerful supercomputing facilities are enabling climate scientists to begin examining the potential influence of a changing climate on convective storms such as thunderstorms, building on previous studies that looked more generally at regional precipitation trends.

For the new study, Prein and his co-authors turned to a dataset created by running the NCAR-based Weather Research and Forecasting (WRF) model over North America at a resolution of 4 kilometers (about 2.5 miles). That is sufficiently fine-scale resolution to simulate MCSs. The intensive modeling, by NCAR scientists and study co-authors Roy Rasmussen, Changhai Liu, and Kyoko Ikeda, required a year to run on the Yellowstone system at the NCAR-Wyoming Supercomputing Center.

The team used an algorithm developed at NCAR to identify and track simulated MCSs. They compared simulations of the storms at the beginning of the century, from 2000 to 2013, with observations of actual MCSs during the same period and showed that the modeled storms are statistically identical to real MCSs.

The scientists then used the dataset and algorithm to examine how MCSs may change by the end of the century in a climate that is approximately 5 degrees Celsius (9 degrees Fahrenheit) warmer than in the pre-industrial era — the temperature increase expected if greenhouse gas emissions continue unabated.

About the paper
Title: Increased rainfall volume from future convective storms in the US
Authors: Andreas F. Prein, Changhai Liu, Kyoko Ikeda, Stanley B. Trier, Roy M. Rasmussen, Greg J. Holland, Martyn P. Clark
Journal: Nature Climate Change
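The storm-identification and tracking step described above can be illustrated generically: flag grid cells above a heavy-rain threshold, group contiguous cells into storm objects, discard small objects, and link objects across consecutive hours by footprint overlap. The sketch below is a simplified stand-in, not the NCAR algorithm used in the study, and the threshold and size values are arbitrary assumptions.

```python
# Generic storm-object identification and hour-to-hour linking sketch.
import numpy as np
from scipy import ndimage

RAIN_THRESH_MM_HR = 10.0   # heavy-rain threshold (assumed value)
MIN_CELLS = 50             # minimum object size in grid cells (assumed value)

def find_storm_objects(rain_rate_mm_hr):
    """Label contiguous regions of heavy rain and drop the small ones."""
    labels, n = ndimage.label(rain_rate_mm_hr >= RAIN_THRESH_MM_HR)
    keep = [i for i in range(1, n + 1) if np.sum(labels == i) >= MIN_CELLS]
    return np.where(np.isin(labels, keep), labels, 0)

def link_objects(labels_prev, labels_now):
    """Match storm objects between consecutive hours by footprint overlap."""
    links = {}
    for obj in np.unique(labels_now[labels_now > 0]):
        overlap = labels_prev[labels_now == obj]
        overlap = overlap[overlap > 0]
        if overlap.size:
            # Assign to the previous-hour object it overlaps most.
            links[int(obj)] = int(np.bincount(overlap).argmax())
    return links   # e.g. {3: 1} means object 3 continues object 1 from the prior hour
```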

Investing in climate observations would generate major returns

November 14, 2017 | A major new paper by more than two dozen climate experts concludes that a well-designed climate observing system could deliver trillions of dollars in benefits while providing decision makers with the information they need in coming decades to protect public health and the economy.

"We are on the threshold of a new era in prediction, drawing on our knowledge of the entire Earth system to strengthen societal resilience to potential climate and weather disasters," said Antonio Busalacchi, president of the University Corporation for Atmospheric Research and one of the co-authors. "Strategic investments in observing technologies will pay for themselves many times over by protecting life and property, promoting economic growth, and providing needed intelligence to decision makers."

Elizabeth Weatherhead, a scientist with the Cooperative Institute for Research in Environmental Sciences at the University of Colorado Boulder, is the lead author of the new paper, published last week in Earth's Future. The co-authors include two scientists associated with the National Center for Atmospheric Research: Jeffrey Lazo and Kevin Trenberth.

The scientists urge that investments focus on tackling seven grand challenges. These include predicting extreme weather and climate shifts, the role of clouds and circulation in regulating climate, regional sea level change and coastal impacts, understanding the consequences of melting ice, and feedback loops involving carbon cycling.

For more about the paper, see the CIRES news release.

New climate forecasts for watersheds - and the water sector

Nov. 10, 2017 | Water managers and streamflow forecasters can now access bi-weekly, monthly, and seasonal precipitation and temperature forecasts that are broken down by individual watersheds, thanks to a research partnership between the National Center for Atmospheric Research (NCAR) and the University of Colorado Boulder (CU Boulder). The project is sponsored by the National Oceanic and Atmospheric Administration (NOAA) through the Modeling, Applications, Predictions, and Projections program.Operational climate forecasts for subseasonal to seasonal time scales are currently provided by the NOAA Climate Prediction Center and other sources. The forecasts usually take the form of national contour maps (example) and gridded datasets at a relatively coarse geographic resolution. Some forecast products are broken down further, based on state boundaries or on climate divisions, which average two per state; others are summarized for major cities. But river forecasters and water managers grapple with climate variability and trends in the particular watersheds within their service areas, which do not align directly with the boundaries of existing forecast areas. A forecast that directly describes predicted conditions inside an individual watershed would be extremely valuable to these users for making decisions in their management areas, such as how much water to release or store in critical reservoirs and when.To bridge this gap, the NCAR–CU Boulder research team has developed a new prototype prediction system that maps climate forecasts to watershed boundaries over the contiguous United States in real time. The system is currently running at NCAR, with real-time forecasts and analyses available on a demonstration website."We are trying to improve the accessibility and relevance of climate predictions for streamflow forecasting groups and water managers," said NCAR scientist Andy Wood, who co-leads the project. "We can’t solve all the scientific challenges of climate prediction, but we can make it easier for a person thinking about climate and water in a river basin — such as the Gunnison, or the Yakima, or the Potomac — to find and download operational climate information that has been tailored to that basin’s observed variability."The project is funded by NOAA, and the scientists plan to hand off successful components of the system for experimental operational evaluation within the NOAA National Weather Service.  Collaborators include scientists from the NOAA Climate Prediction Center and partners from the major federal water agencies: the U.S. Army Corps of Engineers and the Bureau of Reclamation.This screenshot of the S2S Climate Outlooks for Watersheds website shows forecasted temperature anomalies for watersheds across the contiguous United States. As users scroll across different watersheds, they get more precise information. In this screenshot from early November 2017, the forecast is showing that, over the next one to two weeks, the Colorado Headwaters watershed is expected to be 1.2 degrees warmer than normal. Visit the website to learn more. (©UCAR. This image is freely available for media & nonprofit use.)  Beyond the standard weather forecastPrecipitation and temperature forecasts that extend beyond the typical 7- to 10-day window can be useful to water managers making a number of important decisions about how to best regulate supplies. 
For instance, during a wet water year, when snowpack is high and reservoirs are more full than usual, the relative warmth or coolness of the coming spring can affect how quickly the snow melts. Good spring season forecasts allow water managers to plan in advance for how to best manage the resulting runoff.

For water systems in drought, such as California's during 2012–2015, early outlooks on whether the winter rainy season will help alleviate the drought or exacerbate it can help water utilities strategize ways of meeting the year's water demands.

Historically, making these kinds of longer-term predictions accurately has been highly challenging. But in recent years, scientists have improved their skill at subseasonal and seasonal climate prediction. NOAA's National Centers for Environmental Prediction plays a key role, both running an in-house modeling system — the Climate Forecast System, version 2 (CFSv2) — and leading an effort called the North American Multi-Model Ensemble (NMME). These model-based forecasts help inform the NOAA official climate forecasts, which also include other tools and expert judgment.

NMME combines forecasts from seven different climate models based in the U.S. and Canada to form a super-ensemble of climate predictions that extend up to 10 months into the future. The combination of the different forecasts is often more accurate than the forecast from any single model. Temperature forecasts, in particular, from the combined system are notably more accurate than they were 10 years ago, Wood said, partly due to their representation of observed warming trends. Even with these new tools, however, predicting seasonal precipitation beyond the first month continues to be a major challenge.

The NCAR–CU Boulder project makes use of both the CFSv2 and NMME forecasts. It generates predictions for bi-weekly periods (weeks 1-2, 2-3, and 3-4) from CFSv2 that are updated daily and longer-term forecasts derived from the NMME (months 1, 2, 3, and season 1) that are updated monthly. The scientists currently map these forecasts to 202 major watersheds in the contiguous U.S.

Analyzing forecast skill

The resulting watershed-specific forecasts are available in real-time on the project's interactive website, which also provides information about their accuracy and reliability.

"It's important for users to be able to check on the quality of the forecasts," said Sarah Baker, a doctoral student in the Civil, Environmental, and Architectural Engineering Department at CU Boulder. "We're able to use hindcasts, which are long records of past forecasts, to analyze and describe the skill of the current forecasts."

Baker, who also works for the Bureau of Reclamation, has been building the prototype system under the supervision of Wood and her academic adviser, CU Professor Balaji Rajagopalan. The researchers are also using analyses of forecast accuracy and reliability to begin correcting for systematic biases — such as consistently over-predicting springtime rains in one watershed or under-predicting summertime heat in another — in the forecasts.

The project team has presented the project at a number of water-oriented meetings in the western U.S.
Water managers, operators, and researchers from agencies such as the Bureau of Reclamation and utilities such as the Southern Nevada Water Authority, which manages water for Las Vegas, have expressed interest in the new forecast products.

"This project has great potential to provide climate outlook information that is more relevant for hydrologists and the water sector. It will be critical to connect with stakeholders or possible users of the forecasts so that their needs can continue to help shape this type of information product," said NOAA's Andrea Ray. Ray leads an effort funded by NIDIS, the National Integrated Drought Information System, to identify tools and information such as this for a NOAA online Water Resources Monitor and Outlook that would also help connect stakeholders to climate and water information.

In the coming year, the research team will implement statistical post-processing methods to improve the accuracy of the forecasts. They will also investigate the prediction of extreme climate events at the watershed scale.

Contact
Andy Wood, NCAR Research Applications Laboratory

Website
http://hydro.rap.ucar.edu/s2s

Collaborators
CU Boulder
NCAR
NOAA
U.S. Army Corps of Engineers
Bureau of Reclamation

Funder
NOAA's Modeling, Applications, Predictions and Projections Climate Testbed program
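The central step of the prototype — translating a gridded climate forecast into a single number for each watershed — is essentially an area-weighted average over the grid cells that fall inside each watershed boundary. The sketch below illustrates only that step, with an invented weight table standing in for the real geospatial processing; it is not the project's code.

```python
# Area-weighted average of a gridded forecast anomaly over watershed boundaries.
import numpy as np

def watershed_means(anomaly_grid, weights_by_watershed):
    """Area-weighted forecast anomaly for each watershed.

    anomaly_grid         : 2-D array of forecast anomalies (e.g., deg C)
    weights_by_watershed : {name: list of ((row, col), area_fraction)} with
                           fractions summing to 1 for each watershed
    """
    means = {}
    for name, cells in weights_by_watershed.items():
        means[name] = sum(anomaly_grid[r, c] * w for (r, c), w in cells)
    return means

# Toy example on a 3x3 grid of temperature anomalies.
grid = np.array([[0.5, 0.8, 1.1],
                 [0.9, 1.2, 1.4],
                 [1.0, 1.3, 1.6]])
weights = {"Colorado Headwaters": [((1, 1), 0.6), ((1, 2), 0.3), ((2, 2), 0.1)]}
print(watershed_means(grid, weights))   # roughly {'Colorado Headwaters': 1.3}
```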

New approach to geoengineering simulations is significant step forward

BOULDER, Colo. — Using a sophisticated computer model, scientists have demonstrated for the first time that a new research approach to geoengineering could potentially be used to limit Earth's warming to a specific target while reducing some of the risks and concerns identified in past studies, including uneven cooling of the globe.

The scientists developed a specialized algorithm for an Earth system model that varies the amount and location of geoengineering — in this case, injections of sulfur dioxide high into the atmosphere — that would in theory be needed, year to year, to effectively cap warming. They caution, however, that more research is needed to determine whether this approach would be practical, or even possible, in the real world.

The findings from the new research, led by scientists from the National Center for Atmospheric Research (NCAR), Pacific Northwest National Laboratory (PNNL), and Cornell University, represent a significant step forward in the field of geoengineering. Still, many questions about sulfur dioxide injections remain to be answered, including how this type of engineering might alter regional precipitation patterns and the extent to which such injections would damage the ozone layer. The possibility of a global geoengineering effort to combat warming also raises serious governance and ethical concerns.

"This is a major milestone and offers promise of what might be possible in the future," said NCAR scientist Yaga Richter, one of the lead authors. "But it is just the beginning; there is a lot more research that needs to be done."

Past modeling studies have typically sought to answer the question "What happens if we do geoengineering?" The results of those studies have described the outcomes — both positive and negative — of injecting a predetermined amount of sulfates into the atmosphere, often right at Earth's equator. But they did not specify at the outset the outcome they hoped to achieve.

In a series of new studies, the researchers turned the question around, instead asking, "How might geoengineering be used to meet specific climate objectives?"

"We have really shifted the question, and by doing so, found that we can better understand what geoengineering may be able to achieve," Richter said.

The research findings are detailed in a series of papers published in a special issue of the Journal of Geophysical Research – Atmospheres.

Mimicking a volcano

In theory, geoengineering — large-scale interventions designed to modify the climate — could take many forms, from launching orbiting solar mirrors to fertilizing carbon-hungry ocean algae. For this research, the team studied one much-discussed approach: injecting sulfur dioxide into the upper atmosphere, above the cloud layer.

The idea of combating global warming with these injections is inspired by history's most massive volcanic eruptions. When volcanoes erupt, they loft sulfur dioxide high into the atmosphere, where it is chemically converted into light-scattering sulfate particles called aerosols. These sulfates, which can linger in the atmosphere for a few years, are spread around the Earth by stratospheric winds, forming a reflective layer that cools the planet.

To mimic these effects, sulfur dioxide could be injected directly into the stratosphere, perhaps with the help of high-flying aircraft.
But while the injections would counter global warming, they would not address all the problems associated with climate change, and they would likely have negative side effects of their own.

For example, the injections would not offset ocean acidification, which is linked directly to carbon dioxide emissions. Geoengineering also could significantly disrupt rainfall patterns and delay the healing of the ozone hole. Moreover, once geoengineering began, the injections would need to continue until mitigation efforts were sufficient to cap warming on their own; otherwise, society would face a rapid and drastic increase in temperature.

There would also likely be significant international governance challenges to overcome before a geoengineering program could be implemented.

"For decision makers to accurately weigh the pros and cons of geoengineering against those of human-caused climate change, they need more information," said PNNL scientist Ben Kravitz, also a lead author of the studies. "Our goal is to better understand what geoengineering can do — and what it cannot."

Modeling the complex chemistry

For the new studies, the scientists used the NCAR-based Community Earth System Model with its extended atmospheric component, the Whole Atmosphere Community Climate Model (WACCM). WACCM includes detailed chemistry and physics of the upper atmosphere and was recently updated to simulate the evolution of stratospheric aerosols from source gases, including geoengineering injections.

"It was critical for this study that our model be able to accurately capture the chemistry in the atmosphere so we could understand how quickly sulfur dioxide would be converted into aerosols and how long those aerosols would stick around," said NCAR scientist Michael Mills, also a lead author. "Most global climate models do not include this interactive atmospheric chemistry."

The scientists also significantly improved how the model simulates tropical stratospheric winds, which change direction every few years. Accurately representing these winds is critical to understanding how aerosols are blown around the planet.

The scientists successfully tested their model by seeing how well it could simulate the massive 1991 eruption of Mount Pinatubo, including the amount and rate of aerosol formation, how those aerosols were transported around the globe, and how long they stayed in the atmosphere.

Then the scientists began to explore the impacts of injecting sulfur dioxide at different latitudes and altitudes. From past studies, they knew that sulfates injected only at the equator affect Earth unevenly, over-cooling the tropics and under-cooling the poles. This is especially problematic because climate change is warming the Arctic at a faster rate than the rest of the globe, and the Northern Hemisphere more quickly than the Southern Hemisphere.

The researchers used the model to study 14 possible injection sites, at seven different latitudes and two different altitudes — something never before tried in geoengineering research. They found that they could spread the cooling more evenly across the globe by choosing injection sites on either side of the equator.
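To give a concrete sense of how the site-selection problem can be framed, here is a minimal sketch: if a model provides an estimated cooling pattern per unit of injection at each candidate site, the injection rates that come closest to producing an even cooling across latitudes can be found with a constrained least-squares fit. The latitude bands, response patterns, site list, and solver below are illustrative assumptions, not the method used in the published studies.

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical cooling response patterns (deg C of cooling per Tg SO2 per year)
# across nine latitude bands for four candidate injection sites: 30S, 15S, 15N, 30N.
lat_bands = np.linspace(-80, 80, 9)
rng = np.random.default_rng(0)
patterns = rng.uniform(0.05, 0.3, size=(len(lat_bands), 4))   # placeholder values

# Desired outcome: roughly 1 deg C of cooling in every latitude band.
target = np.ones(len(lat_bands))

# Non-negative least squares: injection rates cannot be negative.
rates, residual = nnls(patterns, target)

for site, rate in zip(["30S", "15S", "15N", "30N"], rates):
    print(f"site {site}: {rate:.1f} Tg SO2 per year")
```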
[Image caption: The simulations on the left represent how global temperatures are expected to change if greenhouse gas emissions continue on a "business as usual" trajectory. The simulations on the right show how temperature could be stabilized in a model by injecting sulfur dioxide high into the atmosphere at four separate locations. Because greenhouse gases are being emitted at the same rate in both sets of simulations, stopping geoengineering would result in a drastic spike in global temperatures. ©UCAR]

Meeting multiple objectives

The researchers then pieced together all their work into a single model simulation with specific objectives: to limit average global warming to 2020 levels through the end of the century and to minimize the difference in cooling between the equator and the poles, as well as between the northern and southern hemispheres.

They gave the model four choices of injection sites — at 15 degrees and 30 degrees North and South in latitude — and then implemented an algorithm that determines, for each year, the best injection sites and the quantity of sulfur dioxide needed at those sites. Because the algorithm reformulates the amount of geoengineering needed each year, based on that year's conditions, the simulation could also respond to natural fluctuations in the climate.

The model successfully kept surface temperatures near 2020 levels against a background of increasing greenhouse gas emissions consistent with a business-as-usual scenario. The algorithm's ability to choose injection sites cooled the Earth more evenly than in previous studies, because it could inject more sulfur dioxide in regions that were warming too quickly and less in areas that had over-cooled.

However, by the end of the century, the amount of sulfur dioxide that would need to be injected each year to offset human-caused global warming would be enormous: almost five times the amount spewed into the air by Mount Pinatubo on June 15, 1991.

Flipping the research question

"The results demonstrate that it is possible to flip the research question that's been guiding geoengineering studies and not just explore what geoengineering does but see it as a design problem," said Doug MacMartin, a scientist at Cornell and the California Institute of Technology. "When we see it in that light, we can then start to develop a strategy for how to meet society's objectives."

In the current series of studies, adjusting the geoengineering plan just once a year allowed the researchers to keep the average global temperature at 2020 levels in a given year, but regional temperatures — as well as seasonal temperature changes — were sometimes cooler or hotter than desired. So next steps could include making adjustments more frequently or choosing different injection locations.

The scientists are already working on a new study to help them understand the possible impacts geoengineering might have on regional phenomena, such as the Asian monsoons.

"We are still a long way from understanding all the interactions in the climate system that could be triggered by geoengineering, which means we don't yet understand the full range of possible side effects," said NCAR scientist Simone Tilmes, a lead author. "But climate change also poses risks. Continuing research into geoengineering is critical to assess benefits and side effects and to inform decision makers and society."
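As a schematic illustration of the year-by-year adjustment described above, the toy sketch below uses simple feedback to raise or lower a single annual injection rate based on how far a highly simplified global mean temperature has drifted from its target. The warming trend, cooling efficiency, and feedback gains are invented numbers for illustration only; the studies' actual algorithm manages multiple temperature objectives and multiple injection sites within a full Earth system model.

```python
import numpy as np

# Toy climate: each year the global mean temperature anomaly (deg C) rises with
# greenhouse gas forcing and falls with the cooling from that year's sulfate injection.
def step_climate(temp, injection_tg, rng):
    warming_per_year = 0.03        # invented business-as-usual warming rate
    cooling_per_tg = 0.01          # invented cooling per Tg of SO2 injected that year
    natural_variability = rng.normal(0, 0.05)
    return temp + warming_per_year - cooling_per_tg * injection_tg + natural_variability

rng = np.random.default_rng(0)
target = 0.0                       # hold temperature at its starting (2020) level
temp, integral = 0.0, 0.0
kp, ki = 20.0, 5.0                 # invented proportional and integral feedback gains

for year in range(2020, 2100):
    error = temp - target          # how far the world has drifted from the target
    integral += error
    # Inject more when the planet is too warm, less when it has over-cooled;
    # the injection rate can never be negative.
    injection = max(0.0, kp * error + ki * integral)
    temp = step_climate(temp, injection, rng)
    if year % 20 == 0:
        print(year, round(temp, 2), round(injection, 1))
```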
The research was funded by the Defense Advanced Research Projects Agency and the National Science Foundation, NCAR's sponsor. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation or the Defense Advanced Research Projects Agency.

About the papers

Titles:
Radiative and chemical response to interactive stratospheric sulfate aerosols in fully coupled CESM1(WACCM), DOI: 10.1002/2017JD027006
Sensitivity of aerosol distribution and climate response to stratospheric SO2 injection locations, DOI: 10.1002/2017JD026888
Stratospheric Dynamical Response and Ozone Feedbacks in the Presence of SO2 Injections, DOI: 10.1002/2017JD026912
The climate response to stratospheric aerosol geoengineering can be tailored using multiple injection locations, DOI: 10.1002/2017JD026868
First simulations of designing stratospheric sulfate aerosol geoengineering to meet multiple simultaneous climate objectives, DOI: 10.1002/2017JD026874

Authors: B. Kravitz, D. MacMartin, M. J. Mills, J. H. Richter, and S. Tilmes
Co-authors: F. Vitt, J. J. Tribbia, J.-F. Lamarque
Journal: Journal of Geophysical Research – Atmospheres

Data access: All data from the experiments are available on the Earth System Grid at https://www.earthsystemgrid.org/dataset/ucar.cgd.ccsm4.so2_geoeng.html or http://dx.doi.org/10.5065/D6X63KMM, and at https://www.earthsystemgrid.org/dataset/ucar.cgd.ccsm4.so2_ctl_fb.html or http://dx.doi.org/10.5065/D6PC313T.

Writer: Laura Snider, Senior Science Writer
