Climate & Climate Change

Days of record-breaking heat ahead

BOULDER, Colo. — If society continues to pump greenhouse gases into the atmosphere at the current rate, Americans later this century will have to endure, on average, about 15 daily maximum temperature records for every time that the mercury notches a record low, new research indicates.

That ratio of record highs to record lows could also turn out to be much higher if the pace of emissions increases and produces even more warming, according to the study led by scientists at the National Center for Atmospheric Research (NCAR). Over the last decade, in contrast, the ratio of record high temperatures to record lows has averaged about two to one.

"More and more frequently, climate change will affect Americans with record-setting heat," said NCAR senior scientist Gerald Meehl, lead author of the new paper. "An increase in average temperatures of a few degrees may not seem like much, but it correlates with a noticeable increase in days that are hotter than any in the record, and nights that will remain warmer than we've ever experienced in the past."

The United States has experienced unusual warmth lately, as indicated by this July 22, 2016, weather map showing much of the country facing highs in the 90s and 100s and lows in the 70s. New research indicates that more record high temperatures may be in store. (Weather map by the National Oceanic and Atmospheric Administration's Weather Prediction Center.)

The 15-to-1 ratio of record highs to lows is based on temperatures across the continental United States increasing by slightly more than 3 degrees Celsius (5.4 degrees Fahrenheit) above recent years, which is about the amount of warming expected to occur with the current pace of greenhouse gas emissions.

The new research appears this week in the "Proceedings of the National Academy of Sciences." It was funded by the Department of Energy (DOE) and the National Science Foundation (NSF), which is NCAR's sponsor.
The study was coauthored by NCAR scientist Claudia Tebaldi and by Dennis Adams-Smith, a scientist previously at Climate Central and now at the National Oceanic and Atmospheric Administration's Geophysical Fluid Dynamics Laboratory.

Hotter days

In a 2009 study, Meehl and colleagues found that the ratio of record daily high temperatures to record daily low temperatures has steadily increased since the 1970s as average temperatures over the United States have warmed. Computer models at that time indicated that the ratio could continue to increase during this century, although the research team looked into just one scenario of future emissions. The scientists also found that the models were overstating the ratio of record highs to record lows in recent years, compared to observations.

By digging further into the issue and analyzing why the models differed from observations, Meehl and his co-authors have now produced a better calibrated projection of future record-breaking daily highs across the U.S. They based their projections on the average temperature increase over the continental United States, rather than on a particular scenario of future emissions.

By about 2065, for example, U.S. temperatures will rise by an average of slightly more than 3 degrees C (5.4 degrees F) if society maintains a "business as usual" increase in the emission of greenhouse gases. Under such a scenario, the ratio of record daily high temperatures to record daily lows will likely be about 15 to 1, although it could range anywhere from 7 to 1 up to 22 to 1, the study found.

If temperatures increase even more this century, the ratio of record highs to record lows will jump substantially. For example, if temperatures climb more than 4 degrees C (7.2 degrees F), Americans could experience about 38 record highs for every record low.
Such an outcome could occur if society does not make any efforts to mitigate the production of greenhouse gases.

"Every degree of warming makes a substantial amount of difference, with the ratio of record highs to record lows becoming much greater," Meehl said. "Even with much warmer temperatures on average, we will still have winter and we will still get record cold temperatures, but the numbers of those will be really small compared to record high maximums."

If temperatures were not warming, Meehl said, the ratio of record highs to record lows would average out to about one to one. Instead, record high temperatures have already become a common occurrence in much of the country. The ratio of record highs to lows has averaged about 2 to 1 over the first decade of the 21st century, but there is considerable year-to-year variation. The ratio was about 5 to 1 in 2012, dropping to about 1 to 1 in 2013 and 2014, then rising to almost 3 to 1 in 2015. The unusual warmth of 2016, resulting from both climate change and natural patterns such as El Niño, has led to 24,519 record daily maximums vs. 3,970 record daily minimums — a ratio of about 6 to 1.

Precipitation and the warm 1930s

A key part of the study involved pinpointing why the models in the 2009 study were simulating somewhat more daily record high maximum temperatures than recent observations showed, even though the models agreed well with the observed decrease in record low minimums. The authors focused on two sets of simulations conducted on the NCAR-based Community Climate System Model (version 4), which is funded by DOE and NSF and developed by climate scientists across the country.

Their analysis uncovered two reasons for the disparity between the computer models and observations. First, the models tended to underestimate precipitation.
Because the air is cooled by precipitation and the resulting evapotranspiration — the release of moisture from the land and plants back to the atmosphere — the models' tendency to create an overly dry environment led to more record high temperatures.

Second, the original study in 2009 only went back to the 1950s. For the new study, the research team also analyzed temperatures in the 1930s and 1940s, which is as far back as accurate recordkeeping allows. Because the Dust Bowl days of the 1930s were unusually warm, with many record-setting high temperatures, those records proved more difficult to break in subsequent years, even as temperatures warmed. Even taking the warm 1930s into account, however, both the model-simulated and observed ratios of record highs to record lows have been increasing.

"The steady increase in the record ratio is an immediate and stark reminder of how our temperatures have been shifting and continue to do so, reaching unprecedented highs and fewer record lows," said Tebaldi. "These changes pose adaptation challenges to both human and natural systems. Only a substantial mitigation of greenhouse gas emissions may stop this increase, or at least slow down its pace."

About the article

Title: "US daily temperature records past, present, and future"
Authors: Gerald A. Meehl, Claudia Tebaldi, and Dennis Adams-Smith
Journal: Proceedings of the National Academy of Sciences
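The record counting behind these ratios is simple to illustrate: a given day sets a record high (or low) when its maximum temperature exceeds (or falls below) every previous value observed on that calendar date. A minimal Python sketch with synthetic data (the trend size, noise level, and 60-year span here are illustrative assumptions, not values from the study):

```python
import random

def count_records(series_by_year):
    """Count record daily highs and lows across years.

    series_by_year: list of years, each a list of 365 daily max temperatures.
    A day sets a record high (low) when it beats every prior year's value
    for that same calendar day.
    """
    n_days = len(series_by_year[0])
    highs = list(series_by_year[0])   # first year initializes the records
    lows = list(series_by_year[0])
    record_highs = record_lows = 0
    for year in series_by_year[1:]:
        for d in range(n_days):
            if year[d] > highs[d]:
                highs[d] = year[d]
                record_highs += 1
            if year[d] < lows[d]:
                lows[d] = year[d]
                record_lows += 1
    return record_highs, record_lows

# Synthetic example: 60 years of noisy temperatures with a warming trend.
random.seed(0)
years = [[20 + 0.05 * y + random.gauss(0, 3) for _ in range(365)]
         for y in range(60)]
highs, lows = count_records(years)
print(highs, lows)
```

In a stationary climate the two counts come out statistically equal, the one-to-one ratio Meehl describes; adding a warming trend makes record highs increasingly outnumber record lows.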

Applying indigenous and Western knowledge to environmental research

November 3, 2016 | Native American researchers, students, and community members will partner with Western science organizations to help shape mutually beneficial research projects as part of a two-year National Science Foundation grant awarded recently to the University Corporation for Atmospheric Research. UCAR manages the National Center for Atmospheric Research (NCAR) under sponsorship by NSF.

The project marks a milestone in collaborations between NCAR|UCAR and Native American partners to increase the presence of indigenous perspectives and participants in geoscience research. It also comes at a time when indigenous people are among the hardest hit by climate change, with several communities forming America's first wave of climate refugees.

Aimed at building research partnerships between Native American and Western scientists, the NCAR|UCAR project has two supporting goals: broadening career paths for Native American students interested in Earth system science, and increasing the cultural sensitivity of Western scientists. Other partners in the project include the NCAR-based Rising Voices program, Haskell Indian Nations University, the University of Arizona's Biosphere 2, Michigan State University, and the UCAR-based GLOBE (Global Learning and Observations to Benefit the Environment) citizen science program.

"It's an exciting opportunity for both young indigenous scientists and scientists at NCAR and Biosphere 2," said Carolyn Brinkworth, NCAR director of Diversity, Education, and Outreach, and principal investigator of the project.
"It's also a very different way of thinking about the science — truly integrating indigenous and traditional Western practices to benefit all of our partners."

For example, she noted, indigenous communities can contribute important information about climate change by bringing generations of knowledge and experience with resource management and environmental and ecological processes.

Students attending the Rising Voices workshop in Waimea, Hawaii, in 2016 visited a food garden planted according to traditional Hawaiian techniques to learn about climate change and phenology — the study of the seasonality of plants and animals. (Photo courtesy Craig Elevitch.)

The pilot project is one of 37 awarded nationwide as part of a new NSF program called INCLUDES (Inclusion across the Nation of Communities of Learners of Underrepresented Discoverers in Engineering and Science). The program aspires to make careers in science, technology, engineering, and mathematics (STEM) more accessible to underserved populations.

Two students from tribal colleges and universities will be selected to become interns in UCAR's SOARS program (Significant Opportunities in Atmospheric Research and Science). The students will join research teams composed of mentors from NCAR, Biosphere 2, and their home communities to co-develop their research projects.

One of the project partners, the four-year-old Rising Voices program, has brought social and physical scientists and engineers together with Native American community members to build bonds that lead to research collaboration.

"The INCLUDES project will actualize many topics we've been talking about in Rising Voices," said Heather Lazrus, an NCAR environmental anthropologist and Rising Voices co-founder.
"The project will create a pathway for the students to become engaged in atmospheric sciences at a young age through a citizen science component, and then help keep them engaged for the long haul."

The GLOBE citizen science component will help the SOARS students reach out to their communities through a number of activities, especially with middle- and high-school students. The project also will connect community youth with undergraduate programs at Haskell and the University of Arizona.

As it does for all its interns, SOARS will provide multiple mentors to help the Native American students develop their research, computer modeling, scientific communication, and professional skills. SOARS Director Rebecca Haacker said the internship program has brought in students from Haskell before. "But this will enable us to expand our relationship with indigenous students, and it's nice to see the student internships being part of this larger effort."

The mentors will be supported with cultural training by Michigan State University professor Kyle Powys Whyte, who is also a member of Rising Voices. "We don't want a situation of Western scientists working with Native Americans without any preparation," Brinkworth said. "We want the Western scientists to be introduced to the students' culture, their ways of thinking, their ways of working."

The plan is for two SOARS interns to be selected by early 2017 and to participate in research projects over the summer. In a second phase, NSF plans to bring together all the pilot projects two years from now with the goal of building out a comprehensive "Alliance" program.

Brinkworth said that when she saw the request for proposals, she thought NCAR was uniquely positioned, in part because of Rising Voices, which has strengthened relationships among participating scientists and Native American communities. She hopes the new pilot project and the lessons to be learned will become a template for other efforts.
"We are trying to produce a model for other Western scientific organizations that want to partner with indigenous scientists and communities," she said.

Writer/contact
Jeff Smith, Science Writer and Public Information Officer

40 Earths: NCAR's Large Ensemble reveals staggering climate variability

Sept. 29, 2016 | Over the last century, Earth's climate has had its natural ups and downs. Against the backdrop of human-caused climate change, fluctuating atmosphere and ocean circulation patterns have caused the melting of Arctic sea ice to sometimes speed up and sometimes slow down, for example. And the back-and-forth formation of El Niño and La Niña events in the Pacific has caused some parts of the world to get wetter or drier while others get warmer or cooler, depending on the year.

But what if the sequence of variability that actually occurred over the last century was just one way that Earth's climate story could have plausibly unfolded? What if tiny — even imperceptible — changes in Earth's atmosphere had kicked off an entirely different sequence of naturally occurring climate events?

"It's the proverbial butterfly effect," said Clara Deser, a senior climate scientist at the National Center for Atmospheric Research (NCAR). "Could a butterfly flapping its wings in Mexico set off these little motions in the atmosphere that cascade into large-scale changes to atmospheric circulation?"

To explore the possible impact of minuscule perturbations to the climate — and gain a fuller understanding of the range of climate variability that could occur — Deser and her colleague Jennifer Kay, an assistant professor at the University of Colorado Boulder and an NCAR visiting scientist, led a project to run the NCAR-based Community Earth System Model (CESM) 40 times from 1920 forward to 2100.
With each simulation, the scientists modified the model's starting conditions ever so slightly by adjusting the global atmospheric temperature by less than one-trillionth of one degree, touching off a unique and chaotic chain of climate events. The result, called the CESM Large Ensemble, is a staggering display of Earth climates that could have been, along with a rich look at future climates that could potentially be.

"We gave the temperature in the atmosphere the tiniest tickle in the model — you could never measure it — and the resulting diversity of climate projections is astounding," Deser said. "It's been really eye-opening for people."

The dataset generated during the project, which is freely available, has already proven to be a tremendous resource for researchers across the globe who are interested in how natural climate variability and human-caused climate change interact. In a little over a year, about 100 peer-reviewed scientific journal articles have used data from the CESM Large Ensemble.

Winter temperature trends (in degrees Celsius) for North America between 1963 and 2012 for each of 30 members of the CESM Large Ensemble. The variations in warming and cooling among the 30 members illustrate the far-reaching effects of natural variability superimposed on human-induced climate change. The ensemble mean (EM; bottom, second image from right) averages out the natural variability, leaving only the warming trend attributed to human-caused climate change. The image at bottom right (OBS) shows actual observations from the same time period. By comparing the ensemble mean to the observations, the science team was able to parse how much of the warming over North America was due to natural variability and how much was due to human-caused climate change. Read the full study in the American Meteorological Society's Journal of Climate. (© 2016 AMS.)
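The ensemble logic can be sketched with a toy model (not CESM): each member adds chaotic "internal variability" to the same forced warming trend, starting from an initial state nudged by about a trillionth of a unit. Averaging the members largely cancels the variability and recovers the forced signal. Here the logistic map stands in, purely by assumption, for the chaotic dynamics; the trend and member count are illustrative:

```python
def member(years, perturbation):
    """One toy 'climate' realization: a linear forced warming trend plus
    chaotic variability from a logistic map whose initial state is nudged
    by a tiny amount (the 'butterfly')."""
    x = 0.5 + perturbation           # imperceptible nudge to the initial state
    series = []
    for t in range(years):
        x = 3.9 * x * (1 - x)        # chaotic logistic map step
        forced = 0.02 * t            # steady forced warming trend
        series.append(forced + (x - 0.5))
    return series

n_members, years = 40, 100
members = [member(years, i * 1e-12) for i in range(n_members)]

# Ensemble mean: internal variability largely cancels, the trend remains.
mean = [sum(m[t] for m in members) / n_members for t in range(years)]
print(round(mean[-1], 2))   # ends near the forced value 0.02 * 99
```

Although the members start within a trillionth of each other, the chaotic map amplifies the difference until their final-year values scatter widely, while the 40-member average stays close to the prescribed trend.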
A community effort

Running a complex climate model like the CESM several dozen times takes a vast amount of computing resources, which makes such projects rare and difficult to pull off. With that in mind, Deser and Kay wanted to make sure that the data resulting from the Large Ensemble were as useful as possible. To do that, they queried scientists from across the community who might make use of the project results — oceanographers, geochemists, atmospheric scientists, biologists, socioeconomic researchers — about what they really wanted.

"It took a village to make this ensemble happen and for it to be useful to and usable by the broad climate community," Kay said. "The result is a large number of ensemble members, in a state-of-the-art climate model, with outputs asked for by the community, that is publicly available and relatively easy to access — it's no wonder it's getting so much use."

Scientists have so far relied on the CESM Large Ensemble to study everything from oxygen levels in the ocean to potential geoengineering scenarios to possible changes in the frequency of moisture-laden atmospheric rivers making landfall. In fact, so many researchers have found the Large Ensemble so useful that Kay and Deser were honored with the 2016 CESM Distinguished Achievement Award, which recognizes significant contributions to the climate modeling community. The award citation noted the pair was chosen because "the Large Ensemble represents one of NCAR's most significant contributions to the U.S. climate research community. … At a scientific level, the utility of the Large Ensemble cannot be overstated."

The power of multiple runs: Looking forward — and backward

Clearly, the CESM Large Ensemble is useful for looking forward: What is the range of possible futures we might expect in the face of a changing climate? How much warmer will summers become? When will summer Arctic sea ice disappear?
How will climate change affect ocean life?

But the Large Ensemble is also an extremely valuable tool for understanding our past. This vast storehouse of data helps scientists evaluate observations and put them in context: How unusual is a particular heat wave? Is a recent change in rainfall patterns the result of global warming, or could it be from solely natural causes?

With only a single model run, scientists are limited in what they can conclude when an observation doesn't match up with a model's projection. For example, if the Arctic sea ice extent were to expand even though the model projected a decline, what would that mean? Is the physics underlying the model wrong? Or does the model incorrectly capture the natural variability? In other words, if you ran the model more times, with slightly different starting conditions, would one of the model runs correctly project the growth in sea ice?

The Large Ensemble helps answer that question. Armed with 40 different simulations, scientists can characterize the range of historic natural variability. With this information, they can determine whether observations fit within the envelope of natural variability outlined in the model, instead of comparing them to a single run. Creating an envelope of what can be considered natural also makes it possible to see when the signal of human-caused climate change has pushed an observation beyond the natural variability.

The Large Ensemble can also clarify the climate change "signal" in the model.
That's because averaging together the 40 ensemble members can effectively cancel out the natural variability — a La Niña in one model run might cancel out an El Niño in another, for example — leaving behind only the changes due to climate change.

"This new ability to separate natural internal variability from externally driven trends is absolutely critical for moving forward our understanding of climate and climate change," said Galen McKinley, a professor of atmospheric and oceanic sciences at the University of Wisconsin–Madison. McKinley used the Large Ensemble — which she called a "transformative tool" — to study changes in the ocean's ability to take up carbon dioxide in a warming climate.

The two components of the climate system

The CESM Large Ensemble is not the first ensemble of climate simulations, though it is perhaps the most comprehensive and widely used. Scientists have long understood that it makes sense to look at more than one model run. Frequently, however, scientists have done this by comparing simulations from different climate models, collectively called a multi-model ensemble. This method gives a feel for the diversity of possible outcomes, but it doesn't allow researchers to determine why two model simulations might differ: Is it because the models themselves represent the physics of the Earth system differently? Or is it because the models have different representations of the natural variability or different sensitivities to changing carbon dioxide concentrations?

The Large Ensemble helps resolve this dilemma. Because each member is run using the same model, the differences between runs can be attributed to differences in natural variability alone. The Large Ensemble also offers context for comparing simulations in a multi-model ensemble.
If the simulations appear to disagree about what the future may look like — but they still fit within the envelope of natural variability characterized by the Large Ensemble — that could be a clue that the models do not actually disagree on the fundamentals. Instead, they may just be representing different sequences of natural variability.

This ability to put model results in context is important, not just for scientists but for policy makers, according to Noah Diffenbaugh, a climate scientist at Stanford University who has used the Large Ensemble in several studies, including one that looks at the contribution of climate change to the recent, severe California drought.

"It's pretty common for real-world decision makers to look at the different simulations from different models, and throw up their hands and say, 'These models don't agree so I can't make decisions,'" he said. "In reality, it may not be that the models are disagreeing. Instead, we may be seeing the actual uncertainty of the climate system. There is some amount of natural uncertainty that we can't reduce — that information is really important for making robust decisions, and the Large Ensemble is giving us a window that we haven't had before."

Deser agrees that it's important to communicate to the public that, in the climate system, there will always be this "irreducible" uncertainty. "We're always going to have these two components to the climate system: human-induced changes and natural variability. You always have to take both into account," Deser said. "In the future, it will all depend on how the human-induced component is either offset — or augmented — by the sequence of natural variability that unfolds."

About the article

Title: "The Community Earth System Model (CESM) Large Ensemble Project: A Community Resource for Studying Climate Change in the Presence of Internal Climate Variability"
Authors: J. E. Kay, C. Deser, A. Phillips, A. Mai, C. Hannay, G. Strand, J. M. Arblaster, S. C. Bates, G. Danabasoglu, J. Edwards, M. Holland, P. Kushner, J.-F. Lamarque, D. Lawrence, K. Lindsay, A. Middleton, E. Munoz, R. Neale, K. Oleson, L. Polvani, and M. Vertenstein
Journal: Bulletin of the American Meteorological Society, DOI: 10.1175/BAMS-D-13-00255.1
Funders: National Science Foundation, U.S. Department of Energy

In the news: Stories about research using the CESM Large Ensemble

Causes of California drought linked to climate change, Stanford scientists say (Stanford University, UCAR Member)
The difficulty of predicting an ice-free Arctic (University of Colorado Boulder, UCAR Member)
Widespread loss of ocean oxygen to become noticeable in 2030s (NCAR)
Cornell Scientist Predicts Climate Change Will Prompt Earlier Spring Start Date (Cornell University, UCAR Member)
The 2-degree goal and the question of geoengineering (NCAR)
New climate model better predicts changes to ocean-carbon sink (University of Wisconsin Madison, UCAR Member)
Future summers could regularly be hotter than the hottest on record (NCAR)
Extreme-Weather Winters Becoming More Common (Stanford, UCAR Member)
More frequent extreme precipitation ahead for western North America (Pacific Northwest National Laboratory)
Cloudy With A Chance of Warming (University of Colorado Boulder, UCAR Member)
Climate change already accelerating sea level rise, study finds (NCAR)
Less ice, more water in Arctic Ocean by 2050s, new CU-Boulder study finds (University of Colorado Boulder, UCAR Member)
California 2100: More frequent and more severe droughts and floods likely (Pacific Northwest National Laboratory)
Searing heat waves detailed in study of future climate (NCAR)
Did climate change, El Nino make Texas floods worse? (Utah State University, UCAR Member)

Writer/contact:
Laura Snider, Senior Science Writer and Public Information Officer

Food security report wins USDA award

BOULDER, Colo. — A comprehensive report warning of the impacts of climate change on the world's food security has won a top U.S. Department of Agriculture (USDA) award.

"Climate Change, Global Food Security, and the U.S. Food System," with co-authors from the National Center for Atmospheric Research (NCAR), provides an overview of recent research in climate change and agriculture. It warns that warmer temperatures and altered precipitation patterns can threaten food production, disrupt transportation systems, and degrade food safety, among other impacts, and that the world's poor and those living in tropical regions are particularly vulnerable.

Michael Scuse, USDA acting deputy secretary (center), with members of the team of experts who produced the award-winning report, "Climate Change, Global Food Security, and the U.S. Food System." Those pictured are (back row from left): William Easterling (The Pennsylvania State University), Edward Carr (Clark University), and Peter Backlund (Colorado State University); front row from left: Rachel Melnick (USDA), Margaret Walsh (USDA), Scuse, Moffat Ngugi (U.S. Agency for International Development/USDA), and Karen Griggs (NCAR). (Photo by USDA.)

The USDA this month named it the winner of the 2016 Abraham Lincoln Honor Award for Increasing Global Food Security. The Abraham Lincoln Honor Award is the most prestigious USDA award presented by the Secretary of Agriculture, recognizing noteworthy accomplishments that significantly contribute to the advancement of the USDA's strategic goals, mission objectives, and overall management excellence.

The report was produced as part of a collaboration between NCAR, the USDA, and the University Corporation for Atmospheric Research (UCAR), which manages NCAR on behalf of the National Science Foundation. It was written by 32 experts from 19 federal, academic, nongovernmental, intergovernmental, and private organizations in the United States, Argentina, Britain, and Thailand.
The authors included three NCAR scientists, as well as eight experts affiliated with UCAR member universities.

"This award highlights the importance of addressing climate change in order to maintain the progress the world has made on food security in recent decades," said NCAR program director Lawrence Buja, who helped oversee production of the report. "Scientists will continue to study this critical issue and work with decision makers to co-develop the information they need about potential climate impacts on future production, distribution, and other aspects of our U.S. and global food systems."

Published under the auspices of the U.S. Global Change Research Program, the report focuses on identifying climate change impacts on global food security through 2100. The authors emphasize that food security — the ability of people to obtain and use sufficient amounts of safe and nutritious food — will be affected by several factors in addition to climate change, such as technological advances, increases in population, the distribution of wealth, and changes in eating habits.

"Climate change has a myriad of potential impacts, especially on food, water, and energy systems," said UCAR President Antonio J. Busalacchi. "I commend the authors of this report for clearly analyzing this very complex issue in the agriculture sector, which has implications for all of society, from the least developed nations to the most advanced economies."

Report authors
Molly Brown, University of Maryland*
John Antle, Oregon State University*
Peter Backlund, Colorado State University*
Edward Carr, Clark University
Bill Easterling, Pennsylvania State University*
Margaret Walsh, USDA Office of the Chief Economist/Climate Change Program Office
Caspar Ammann, NCAR
Witsanu Attavanich, Kasetsart University
Chris Barrett, Cornell University*
Marc Bellemare, University of Minnesota*
Violet Dancheck, U.S. Agency for International Development
Chris Funk, U.S. Geological Survey
Kathryn Grace, University of Utah*
John Ingram, University of Oxford
Hui Jiang, USDA Foreign Agricultural Service
Hector Maletta, Universidad de Buenos Aires
Tawny Mata, USDA/American Association for the Advancement of Science
Anthony Murray, USDA Economic Research Service
Moffatt Ngugi, U.S. Agency for International Development/USDA Foreign Agricultural Service
Dennis Ojima, Colorado State University*
Brian O'Neill, NCAR
Claudia Tebaldi, NCAR

*UCAR member university

Report project team
Lawrence Buja, NCAR
Karen Griggs, NCAR

Atmospheric rivers come into focus with high-res climate model

A high-resolution climate model based at the National Center for Atmospheric Research (NCAR) is able to accurately capture the ribbons of moist air that sometimes escape the sodden tropics and flow toward the drier mid-latitudes, allowing scientists to investigate how "atmospheric rivers" may change as the climate warms.

These rivers in the sky can unleash drenching rains when they crash onto land. Because these downpours can alleviate droughts and also cause damaging floods, scientists are keenly interested in how their frequency, intensity, or path may be altered by climate change. But standard-resolution climate models have had difficulty realistically simulating atmospheric rivers and their impacts.

In a pair of studies published this summer in the journal Geophysical Research Letters, NCAR scientists Christine Shields and Jeffrey Kiehl tested whether a high-resolution climate model could do a better job. They found that a version of the NCAR-based Community Climate System Model 4.0 (CCSM4) with a resolution twice as high as normal does a good job of capturing the frequency with which atmospheric rivers made landfall over the last century, as well as their locations and associated storms.

Satellite images of water vapor over the oceans show atmospheric rivers known as the Pineapple Express hitting the U.S. West Coast in 2006 (top), 2009 (middle), and 2004 (bottom). (Images courtesy of NOAA.)

Looking forward, the model projects that storms on the U.S. West Coast associated with a type of atmospheric river called the Pineapple Express, which sweeps moisture in from Hawaii, could linger and become more intense if greenhouse gas emissions are not mitigated. The studies also find that future changes to atmospheric rivers in general — including a possible increase in the number that make landfall in Southern California — will likely depend on how jet streams change in a warming world.

"Atmospheric rivers play an extremely important role in the Earth's water cycle. At any latitude, they account for only 10 percent of the air but they transport as much as 90 percent of the water that is moving from the tropics toward the poles," Kiehl said. "Understanding atmospheric rivers is critical to understanding how the entire climate system works."

The how and why of future changes

Atmospheric rivers were first discovered in the 1990s, and much of the early research focused on understanding their detailed structure and the dynamics of how they form. "We've gotten to a point in the science where we're able to track atmospheric rivers and detect them fairly well, and we can make some general statements about duration, intensity, and the precipitation associated with them," Shields said. "So the next step is really trying to understand how they might change in the future and, then, why they are changing."

Shields and Kiehl suspected that the high-resolution version of the CCSM4 would be useful for answering those questions for a couple of reasons. Because the model has a resolution of about 50 kilometers (31 miles), it does a better job of capturing narrower phenomena, like the rivers. It also represents the complex terrain on the land surface that can trigger the atmospheric rivers to release rain or snow.
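Detection schemes in this line of work typically flag long, narrow corridors of strong vertically integrated vapor transport (IVT) in model or reanalysis fields. The sketch below is a generic stand-in, not the algorithm from the Shields and Kiehl papers: the 250 kg per meter per second threshold is a conventional choice in the atmospheric river literature, and a simple minimum cell count substitutes for a true length criterion.

```python
from collections import deque

IVT_THRESHOLD = 250.0   # kg/m/s; a conventional choice, not the papers' value
MIN_CELLS = 4           # toy stand-in for a ~2,000 km length criterion

def detect_ars(ivt):
    """Flag connected regions of strong moisture transport in a 2D IVT grid.

    Returns a list of regions (each a list of (row, col) cells) whose size
    meets MIN_CELLS, found by breadth-first flood fill over the thresholded
    grid with 4-way connectivity.
    """
    rows, cols = len(ivt), len(ivt[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if ivt[r][c] >= IVT_THRESHOLD and not seen[r][c]:
                region, queue = [], deque([(r, c)])
                seen[r][c] = True
                while queue:
                    i, j = queue.popleft()
                    region.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < rows and 0 <= nj < cols
                                and not seen[ni][nj]
                                and ivt[ni][nj] >= IVT_THRESHOLD):
                            seen[ni][nj] = True
                            queue.append((ni, nj))
                if len(region) >= MIN_CELLS:
                    regions.append(region)
    return regions

# A toy 5x6 grid with one river-like filament of high IVT along row 2.
grid = [[100.0] * 6 for _ in range(5)]
for j in range(1, 6):
    grid[2][j] = 400.0
print(len(detect_ars(grid)))  # prints 1: one elongated region detected
```

Finer grids resolve these filaments, and the terrain they strike, more faithfully, which is the motivation for the half-degree model runs described here.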
As the rivers plow into the mountains of California, for example, they're forced higher into the atmosphere, where the moisture condenses and falls to the ground.

As they'd hoped, the model did a better job than a standard-resolution climate model at representing both the atmospheric rivers and their interactions with terrain. This allowed them to run the model forward to get a look at what the rivers might do in the future if human-caused climate change continues unabated. What they found is that how — and why — atmospheric rivers change depends on the area of the world.

"Changes to atmospheric rivers in the future track with what the jets are doing," Shields said. "And that depends on your region."

For example, the scientists found that the atmospheric rivers that hit California were influenced by changes to the subtropical jet, while atmospheric rivers that hit the United Kingdom were influenced by the polar jet.

While understanding these connections gives scientists important insight into what factors may affect atmospheric rivers in the future, it's still a challenge to project how atmospheric rivers may actually change. That's because climate models tend to disagree about how jets will shift regionally as the climate warms.

In the future, Shields and Kiehl plan to expand their analysis to other parts of the world, including the Iberian Peninsula.

"The climate change picture and what's going to happen to these atmospheric rivers really matter," Shields said. "They are a critical component of the hydrology in many places in the world."

About the papers
Titles: "Simulating the Pineapple Express in the half degree Community Climate System Model, CCSM4" and "Atmospheric River Landfall-Latitude Changes in Future Climate Simulations"
Authors: Christine A. Shields and Jeffrey T. Kiehl
Journal: Geophysical Research Letters, DOIs: 10.1002/2016GL069476 and 10.1002/2016GL070470
Funders: National Science Foundation, U.S. Department of Energy
Writer/contact: Laura Snider, Senior Science Writer and Public Information Officer
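As Shields notes above, scientists can now track and detect atmospheric rivers fairly well. A common family of detection algorithms flags long, narrow corridors of high integrated vapor transport (IVT). The sketch below is a toy illustration of that idea, not the method used in these studies; the threshold, grid, and aspect-ratio test are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def find_atmospheric_rivers(ivt, threshold=250.0, min_aspect=2.0):
    """Flag long, narrow corridors of high integrated vapor transport (IVT).

    ivt: 2D array of IVT magnitude (kg m^-1 s^-1) on a lat/lon grid.
    Returns a boolean mask of grid cells belonging to river-like features.
    """
    mask = ivt > threshold
    labeled, nfeatures = ndimage.label(mask)  # connected high-IVT regions
    rivers = np.zeros_like(mask)
    for i in range(1, nfeatures + 1):
        ys, xs = np.where(labeled == i)
        length = max(ys.ptp(), xs.ptp()) + 1  # longest bounding-box side
        width = min(ys.ptp(), xs.ptp()) + 1   # shortest bounding-box side
        if length / width >= min_aspect:      # keep only elongated features
            rivers |= labeled == i
    return rivers

# Synthetic example: a narrow band of high IVT in an otherwise quiet field.
field = np.full((40, 40), 100.0)
field[5:35, 10:12] = 400.0  # 30 cells long, 2 cells wide
print(int(find_atmospheric_rivers(field).sum()))  # prints 60
```

Real detection schemes add geographic and seasonal criteria, but the core step (threshold, then shape test) is the same.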

The 2-degree goal and the question of geoengineering

Sept. 7, 2016 | With world leaders agreeing to try to limit the increase in global temperatures, scientists at the National Center for Atmospheric Research (NCAR) are taking a look at whether geoengineering the climate could counter enough warming to help meet that goal. In a new study, the scientists found that if society doesn't make steep cuts in greenhouse gas emissions in the next couple of decades, injections of planet-cooling sulfates into the atmosphere could theoretically limit warming to 2 degrees Celsius (3.6 degrees Fahrenheit) above preindustrial levels. But such geoengineering would mean a sustained effort stretching over more than a century and a half, and it would fail to prevent certain aspects of climate change.

"One thing that surprised me about this study is how much geoengineering it would take to stay within 2 degrees if we don't start reducing greenhouse gases soon," said NCAR scientist Simone Tilmes, the lead author.

For the study, the research team focused on the potential impacts of geoengineering on temperatures, the drying of land surfaces, and Arctic sea ice. They did not examine possible adverse environmental consequences, such as potential damage to the ozone layer. The sulfate injections also would not alleviate the impact of carbon dioxide emissions on ocean acidification.

The research was published in the journal Geophysical Research Letters.

Meeting an ambitious target

Representatives of 195 nations negotiated last fall's Paris Agreement, which sets an ambitious target of capping global warming at no more than 2 degrees. Scientists have found, however, that such a target will be extremely difficult to achieve. It would require society to begin dramatically reducing emissions of carbon dioxide and other greenhouse gases within a few years.
Efforts to develop new technologies that could draw down carbon dioxide from the atmosphere would also be needed.

Volcanic eruptions spew sulfates into the air, which can block incoming sunlight and have a cooling effect on the planet. One type of proposed geoengineering would rely on a similar method: injecting sulfates high in the atmosphere to try to cool the Earth. (Image courtesy of USGS.)

The new study examined a scenario in which emissions continue growing at current rates until about 2040, when warming would reach 2 degrees. The authors found that, even if society then adopted an aggressive approach to reducing emissions and was able to begin drawing down carbon dioxide from the atmosphere, warming would reach 3 degrees by the end of the century.

So they explored an additional possibility: injecting sulfate particles, like those emitted during volcanic eruptions, into the stratosphere. This approach to geoengineering, which is untested but has generated discussion for several years, would theoretically counter global warming because the sulfates would block incoming sunlight and shade the planet. This is why large volcanic eruptions can have a planet-cooling effect.

The research team estimated that society would need to keep injecting sulfates for 160 years to stay within the target of 2 degrees. This would require a peak rate of 18 megatons of sulfur dioxide per year, or about 1.5 times the amount emitted by the massive eruption of Mount Pinatubo in 1991.

A different climate

Even so, the climate would be noticeably altered under this scenario. Extreme hot days with geoengineering would be about twice as frequent in North America and other regions compared to present-day conditions. (In comparison, they would be about five to six times more frequent without geoengineering.)
Summertime Arctic sea ice would retreat significantly with geoengineering, whereas it would disappear altogether if society relied solely on reducing carbon dioxide in the atmosphere after 2040. Precipitation patterns would also change with geoengineering, causing drying in some regions.

"If society doesn't act quickly on emissions, we may be facing more uncertain methods like geoengineering to keep temperatures from going over the 2-degree target," Tilmes said. "But even with geoengineering, we'd still be looking at a climate that's different than today's."

For the study, Tilmes and her colleagues used a pair of computer models: the NCAR-based Community Earth System Model and the Integrated Science Assessment Model at the University of Illinois. These enabled the authors to simulate climate conditions with different levels of greenhouse gases as well as stratospheric sulfates.

The research was supported by the National Science Foundation and the Department of Energy.

About the article
Title: Climate impacts of geoengineering in a delayed mitigation scenario
Authors: Simone Tilmes, Benjamin Sanderson, and Brian O'Neill
Journal: Geophysical Research Letters, DOI: 10.1002/2016gl070122
Funders: National Science Foundation, U.S. Department of Energy
Writer/contact: David Hosansky, Manager of Media Relations

Climate change already accelerating sea level rise, study finds

BOULDER, Colo. — Greenhouse gases are already having an accelerating effect on sea level rise, but the impact has so far been masked by the cataclysmic 1991 eruption of Mount Pinatubo in the Philippines, according to a new study led by the National Center for Atmospheric Research (NCAR).

Satellite observations, which began in 1993, indicate that the rate of sea level rise has held fairly steady at about 3 millimeters per year. But the expected acceleration due to climate change is likely hidden in the satellite record because of a happenstance of timing: The record began soon after the Pinatubo eruption, which temporarily cooled the planet, causing sea levels to drop.

The new study finds that the lower starting point effectively distorts the calculation of sea level rise acceleration for the last couple of decades. The study lends support to climate model projections, which show the rate of sea level rise escalating over time as the climate warms. The findings were published today in the open-access Nature journal Scientific Reports.

Mount Pinatubo's caldera on June 22, 1991. (Image courtesy USGS.)

"When we used climate model runs designed to remove the effect of the Pinatubo eruption, we saw the rate of sea level rise accelerating in our simulations," said NCAR scientist John Fasullo, who led the study. "Now that the impacts of Pinatubo have faded, this acceleration should become evident in the satellite measurements in the coming decade, barring another major volcanic eruption."

Study co-author Steve Nerem, from the University of Colorado Boulder, added: "This study shows that large volcanic eruptions can significantly impact the satellite record of global average sea level change. So we must be careful to consider these effects when we look for the effects of climate change in the satellite-based sea level record."

The findings have implications for the extent of sea level rise this century and may be useful to coastal communities planning for the future.
In recent years, decision makers have debated whether these communities should make plans based on the steady rate of sea level rise measured in recent decades or on the accelerated rate expected in the future by climate scientists.

The study was funded by NASA, the U.S. Department of Energy, and the National Science Foundation, which is NCAR's sponsor.

Reconstructing a pre-Pinatubo world

Climate change triggers sea level rise in a couple of ways: by warming the ocean, which causes the water to expand, and by melting glaciers and ice sheets, which drain into the ocean and increase its volume. In recent decades, the pace of warming and melting has accelerated, and scientists have expected to see a corresponding increase in the rate of sea level rise. But analysis of the relatively short satellite record has not borne that out.

To investigate, Fasullo, Nerem, and Benjamin Hamlington of Old Dominion University worked to pin down how quickly sea levels were rising in the decades before the satellite record began. Prior to the launch of the international TOPEX/Poseidon satellite mission in late 1992, sea level was mainly measured using tide gauges. While records from some gauges stretch back to the 18th century, variations in measurement technique and location mean that the pre-satellite record is best used to get a ballpark estimate of global mean sea level.

Mount Pinatubo erupting in 1991. (Image courtesy USGS.)

To complement the historical record, the research team used a dataset produced by running the NCAR-based Community Earth System Model 40 times with slightly different — but historically plausible — starting conditions. The resulting simulations characterize the range of natural variability in the factors that affect sea levels. The model was run on the Yellowstone system at the NCAR-Wyoming Supercomputing Center.

A separate set of model runs that omitted volcanic aerosols — particles spewed into the atmosphere by an eruption — was also assessed.
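The logic of comparing paired ensembles, one with volcanic aerosols and one without, can be sketched in a few lines. This is a toy illustration with made-up numbers, not the study's actual data or model output: the trend, dip, and noise levels below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_runs, n_years = 40, 25
years = np.arange(n_years)

# Hypothetical global-mean sea level anomalies (mm): a steady background
# rise plus large internal variability, with a decaying volcanic dip
# imposed on one set of runs only.
trend = 3.0 * years                                   # ~3 mm/yr background rise
volcanic = -6.0 * np.exp(-years / 5.0)                # eruption-driven dip that fades
noise = lambda: rng.normal(0, 10, (n_runs, n_years))  # per-run internal variability

with_volcano = trend + volcanic + noise()
no_volcano = trend + noise()

# A single pair of runs is dominated by noise; the difference of the
# ensemble means is not, so the imposed volcanic signal emerges.
signal = with_volcano.mean(axis=0) - no_volcano.mean(axis=0)
print(signal[0])  # compare with the imposed -6 mm initial dip
```

The noise in each ensemble mean shrinks roughly as one over the square root of the number of runs, which is why, as Fasullo says below, a handful of runs is not enough.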
By comparing the two sets of runs, the scientists were able to pick out a signal (in this case, the impact of Mount Pinatubo's eruption) from the noise (natural variations in ocean temperature and other factors that affect sea level).

"You can't do it with one or two model runs — or even three or four," Fasullo said. "There's just too much accompanying climate noise to understand precisely what the effect of Pinatubo was. We could not have done it without large numbers of runs."

Using models to understand observations

Analyzing the simulations, the research team found that Pinatubo's eruption caused the oceans to cool and sea levels to drop by about 6 millimeters immediately before TOPEX/Poseidon began recording observations. As the sunlight-blocking aerosols from Mount Pinatubo dissipated in the simulations, sea levels began to slowly rebound to pre-eruption levels. This rebound swamped the acceleration caused by the warming climate and made the rate of sea level rise higher in the mid- to late 1990s than it would otherwise have been.

This higher-than-normal rate of sea level rise in the early part of the satellite record makes it appear that the rate of sea level rise has not accelerated over time and may actually have decreased somewhat. In fact, according to the study, if the Pinatubo eruption had not occurred — leaving sea level at a higher starting point in the early 1990s — the satellite record would have shown a clear acceleration.

"The satellite record is unable to account for everything that happened before the first satellite was launched," Fasullo said.
"This study is a great example of how computer models can give us the historical context that's needed to understand some of what we're seeing in the satellite record."Understanding whether the rate of sea level rise is accelerating or remaining constant is important because it drastically changes what sea levels might look like in 20, 50, or 100 years.“These scientists have disentangled the major role played by the 1991 volcanic eruption of Mt. Pinatubo on trends in global mean sea level,” said Anjuli Bamzai, program director in the National Science Foundation’s Division of Atmospheric and Geospace Sciences, which funded the research.  “This research is vital as society prepares for the potential effects of climate change."Because the study's findings suggest that acceleration due to climate change is already under way, the acceleration should become evident in the satellite record in the coming decade, Fasullo said.Since the original TOPEX/Poseidon mission, other satellites have been launched—Jason-1 in 2001 and Jason-2 in 2008—to continue tracking sea levels. The most recent satellite, Jason-3, launched on Jan. 17 of this year."Sea level rise is potentially one of the most damaging impacts of climate change, so it's critical that we understand how quickly it will rise in the future," Fasullo said. "Measurements from Jason-3 will help us evaluate what we've learned in this study and help us better plan for the future."The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.The graph shows how sea level rises and falls as ocean heat content fluctuates. 
After volcanic eruptions, the Earth cools and, in turn, the heat content in the ocean drops, ultimately lowering sea level. The solid blue line is the average sea level rise of climate model simulations that include volcanic eruptions. The green line is the average from model simulations with the effect of volcanic eruptions removed, and it shows a smooth acceleration in the rate of sea level rise due to climate change. The blue line between the start of the satellite record and present day makes a relatively straight line — just as we see from actual satellite observations during that time — indicating that the rate of sea level rise has not accelerated. But in the future, barring another major volcanic eruption, scientists expect sea level to follow the gray dotted line, which is on the same accelerating path as the green line below it. Click to enlarge. (©UCAR. This graph is freely available for media & nonprofit use.)

About the article
Title: Is the detection of sea level rise imminent?
Authors: J.T. Fasullo, R.S. Nerem, and B. Hamlington
Journal: Scientific Reports, DOI: 10.1038/srep31245
Funders: NASA, National Science Foundation, U.S. Department of Energy
Collaborators: University of Colorado Boulder (UCAR member), Old Dominion University (UCAR member)
Writer: Laura Snider, Senior Science Writer and Public Information Officer

Expanding Antarctic sea ice linked to natural variability

BOULDER — The recent trend of increasing Antarctic sea ice extent — seemingly at odds with climate model projections — can largely be explained by a natural climate fluctuation, according to a new study led by the National Center for Atmospheric Research (NCAR). The study offers evidence that the negative phase of the Interdecadal Pacific Oscillation (IPO), which is characterized by cooler-than-average sea surface temperatures in the tropical eastern Pacific, has created favorable conditions for additional Antarctic sea ice growth since 2000.

The findings, published in the journal Nature Geoscience, may resolve a longstanding mystery: Why is Antarctic sea ice expanding when climate change is causing the world to warm? The study's authors also suggest that sea ice may begin to shrink as the IPO switches to a positive phase.

"The climate we experience during any given decade is some combination of naturally occurring variability and the planet's response to increasing greenhouse gases," said NCAR scientist Gerald Meehl, lead author of the study. "It's never all one or the other but the combination that is important to understand."

Study co-authors include Julie Arblaster of NCAR and Monash University in Australia, Cecilia Bitz of the University of Washington, Christine Chung of the Australian Bureau of Meteorology, and NCAR scientist Haiyan Teng. The study was funded by the U.S. Department of Energy and by the National Science Foundation, which sponsors NCAR.

On Sept. 19, 2014, the five-day average of Antarctic sea ice extent exceeded 20 million square kilometers (about 7.7 million square miles) for the first time since 1979, according to the National Snow and Ice Data Center. The red line shows the average maximum extent from 1979-2014. (Image courtesy NASA's Scientific Visualization Studio/Cindy Starr)

Expanding ice

The sea ice surrounding Antarctica has been slowly increasing in area since the satellite record began in 1979.
But the rate of increase rose nearly fivefold between 2000 and 2014, following the IPO transition to a negative phase in 1999. The new study finds that when the IPO changes phase, from positive to negative or vice versa, it touches off a chain reaction of climate impacts that may ultimately affect sea ice formation at the bottom of the world.

When the IPO transitions to a negative phase, the sea surface temperatures in the tropical eastern Pacific become somewhat cooler than average when measured over a decade or two. These sea surface temperatures, in turn, change tropical precipitation, which drives large-scale changes to the winds that extend all the way down to Antarctica. The ultimate impact is a deepening of a low-pressure system off the coast of Antarctica known as the Amundsen Sea Low. Winds generated on the western flank of this system blow sea ice northward, away from Antarctica, helping to enlarge the extent of sea ice coverage.

"Compared to the Arctic, global warming causes only weak Antarctic sea ice loss, which is why the IPO can have such a striking effect in the Antarctic," said Bitz. "There is no comparable natural variability in the Arctic that competes with global warming."

Sifting through simulations

To test whether these IPO-related impacts were sufficient to cause the growth in sea ice extent observed between 2000 and 2014, the scientists first examined 262 climate simulations created by different modeling groups from around the world. When all of those simulations are averaged, the natural variability cancels itself out. For example, simulations with a positive IPO offset those with a negative IPO. What remains is the expected impact of human-caused climate change: a decline in Antarctic sea ice extent.

But for this study, the scientists were not interested in the average. Instead, they wanted to find individual members that correctly characterized the natural variability between 2000 and 2014, including the negative phase of the IPO.
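The ensemble-selection step described above can be sketched with synthetic numbers. Everything below is an illustrative assumption (the trends, the noise, and the selection threshold), not the study's data; it only shows why averaging all members cancels natural variability while selecting IPO-negative-like members does not.

```python
import numpy as np

rng = np.random.default_rng(42)
n_members = 262

# Hypothetical per-member 2000-2014 trends: tropical Pacific SST trend
# (deg C/decade) and Antarctic sea ice extent trend (10^3 km^2/yr).
# Imposed toy relationship: a cooler tropical Pacific favors sea ice
# growth, on top of a forced decline plus member-to-member noise.
sst_trend = rng.normal(0.0, 0.15, n_members)   # internal, IPO-like variability
ice_trend = -20.0 - 150.0 * sst_trend + rng.normal(0, 15, n_members)

# Averaging all members cancels the natural variability, leaving the
# imposed forced decline.
print(ice_trend.mean())

# Keep only members whose tropical Pacific cooled, as observed
# (an IPO-negative-like criterion); their mean ice trend is positive.
selected = ice_trend[sst_trend < -0.1]
print(selected.mean())
```

The study's actual criteria for matching the observed 2000-2014 variability are more involved, but the contrast between the all-member mean and the selected-member mean is the core of the argument.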
The team discovered 10 simulations that met the criteria, and all of them showed an increase in Antarctic sea ice extent across all seasons.

"When all the models are taken together, the natural variability is averaged out, leaving only the shrinking sea ice caused by global warming," Arblaster said. "But the model simulations that happen to sync up with the observed natural variability capture the expansion of the sea ice area. And we were able to trace these changes to the equatorial eastern Pacific in our model experiments."

Scientists suspect that in 2014, the IPO began to change from negative to positive. That would indicate an upcoming period of warmer eastern Pacific Ocean surface temperatures on average, though year-to-year temperatures may go up or down, depending on El Niño/La Niña conditions. Accordingly, the trend of increasing Antarctic sea ice extent may also change in response.

"As the IPO transitions to positive, the increase of Antarctic sea ice extent should slow and perhaps start to show signs of retreat when averaged over the next 10 years or so," Meehl said.

About the article
Title: Antarctic sea-ice expansion between 2000 and 2014 driven by tropical Pacific decadal climate variability
Authors: Gerald A. Meehl, Julie M. Arblaster, Cecilia M. Bitz, Christine T. Y. Chung, and Haiyan Teng
Publication: Nature Geoscience, DOI: 10.1038/NGEO2751
Writer: Laura Snider, Senior Science Writer and Public Information Officer

Capping warming at 2 degrees

June 27, 2016 | Even if countries adhere to the Paris climate agreement hammered out last fall, capping global warming at 2 degrees Celsius would likely require net zero greenhouse gas emissions by 2085 and substantial negative emissions over the long term, according to an in-depth analysis by scientists at the National Center for Atmospheric Research (NCAR).

More than 100 parties to the Paris Agreement submitted pledges to the United Nations Framework Convention on Climate Change outlining their individual commitments to cutting greenhouse gas emissions by 2025 or 2030. The new study finds that, even if all the countries follow through on their commitments, steeper cuts would be necessary after 2030 to stay below 2 degrees of warming. And by the end of the century, total emissions would need to become negative, meaning more greenhouse gases would be removed from the air than are emitted into the atmosphere.

These negative emissions would need to reach net minus 15 gigatons of "carbon dioxide equivalent," a measure that tabulates the global warming potential of all types of greenhouse gases in relation to carbon dioxide, according to model simulations created for the study. Worldwide, yearly greenhouse gas emissions now equal about 50 gigatons of carbon dioxide equivalent.

"The emissions targets in the Paris Agreement are an important first step, and it's known that additional action will be required to meet the goal of limiting warming to 2 degrees," said NCAR scientist Benjamin Sanderson, lead author of the study. "This paper provides details of what the next steps would need to look like in order to actually hit that target."

The study, published in Geophysical Research Letters, a journal of the American Geophysical Union, was funded by the U.S. Department of Energy and by the National Science Foundation, NCAR's sponsor.

This graph represents eight possible pathways that society could take to have a two-in-three chance of limiting warming to 2 degrees Celsius.
The blue line represents our current emissions trajectory. The red line represents the path that society will be on if countries adhere to the Paris Agreement. The gray lines represent other possibilities, all of which require more stringent emissions cuts in the near term but fewer negative emissions later. Click to enlarge. (©UCAR. This image is freely available for media & nonprofit use.)

Small changes now equal big benefits later

Even before the Paris Agreement was finished, it was clear that the pledged emissions cuts by 2030 would not be sufficient on their own to meet the target of limiting warming to 2 degrees. This study gives a comprehensive look at the possible paths society could take to have a two-in-three chance of staying below the target.

"We created a wide range of possible global emissions pathways that would allow us to have a decent shot at limiting warming to two degrees," said Sanderson. "We found that very small increases in the rate at which we cut greenhouse gases now could lead to very large decreases in the amount of negative emissions we need later."

Achieving negative emissions in the future would require the massive deployment of technologies, still largely hypothetical, that draw down greenhouse gases from the atmosphere. That makes it difficult to know how capable society will be of implementing large-scale carbon removal in the future.

Sanderson and his colleagues, NCAR scientists Brian O'Neill and Claudia Tebaldi, also found that it is still possible to stay below 2 degrees of warming without net negative emissions, but doing so would require near-term cuts that are much more aggressive than those proposed in the Paris Agreement.

About the article
Title: What would it take to achieve the Paris temperature targets?
Authors: Benjamin M. Sanderson, Brian C. O'Neill, and Claudia Tebaldi
Journal: Geophysical Research Letters
Writer/contact: Laura Snider, Senior Science Writer
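The "carbon dioxide equivalent" bookkeeping described above weights each gas by its global warming potential (GWP) relative to carbon dioxide. A minimal sketch follows; the GWP values are the commonly cited IPCC AR5 100-year figures, and the emissions numbers are purely illustrative, not the study's inputs.

```python
# 100-year global warming potentials (IPCC AR5, dimensionless, CO2 = 1).
GWP100 = {"co2": 1, "ch4": 28, "n2o": 265}

def co2_equivalent(emissions_gt):
    """Convert a dict of emissions (gigatons of each gas) to Gt CO2-equivalent."""
    return sum(GWP100[gas] * amount for gas, amount in emissions_gt.items())

# Illustrative (not actual) annual emissions by mass of each gas:
emissions = {"co2": 37.0, "ch4": 0.37, "n2o": 0.01}
print(co2_equivalent(emissions))  # 37 + 0.37*28 + 0.01*265 = 50.01 Gt CO2e
```

Note how small masses of methane and nitrous oxide contribute a large share of the total, which is why the measure is tracked in CO2-equivalent rather than raw tonnage.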

Climate modeling 101: Explanations without equations

A new book breaks down climate models into easy-to-understand concepts. (Photo courtesy Springer.)

June 21, 2016 | Climate scientists tell us it's going to get hotter. How much it rains and where it rains is likely to shift. Sea level rise is apt to accelerate. Oceans are on their way to becoming more acidic and less oxygenated. Floods, droughts, storms, and other extreme weather events are projected to change in frequency or intensity.

But how do they know what they know? For climate scientists, numerical models are the tools of the trade. But for the layperson — and even for scientists in other fields — climate models can seem mysterious. What does "numerical" even mean? Do climate models take other things besides the atmosphere into account? How do scientists know if a model is any good?*

Two experts in climate modeling, Andrew Gettelman of the National Center for Atmospheric Research and Richard Rood of the University of Michigan, have your answers and more, free of charge. In a new open-access book, "Demystifying Climate Models," the pair lay out the fundamentals. In 282 pages, the scientists explain the basics of climate science, how that science is translated into a climate model, and what those models can tell us (as well as what they can't) — all without using a single equation.

*Find the answers on pages 8, 13, and 161, respectively, of the book.

AtmosNews sat down with Gettelman to learn more about the book, which anyone can download at http://www.demystifyingclimate.org.

NCAR scientist Andrew Gettelman has written a new book on climate modeling with Richard Rood of the University of Michigan. (Courtesy photo. This image is freely available for media & nonprofit use.)

What was the motivation to write this book?

There isn't really another book that sets out the philosophy and structure of models.
There are textbooks, but inside you'll find a lot of physics and chemistry: information about momentum equations, turbulent fluxes — which is useful if you want to build your own model. And then there are books on climate change for the layperson, and they devote maybe a paragraph to climate modeling. There's not much in the middle. This book provides an introduction for the beginning grad student, or someone in another field who is interested in using model output, or anyone who is just curious how climate works and how we simulate it.

What are some of the biggest misperceptions about climate models that you hear?

One is that people say climate models are based on uncertain science. But that's not true at all. If we didn't know the science, my cellphone wouldn't work. Radios wouldn't work. GPS wouldn't work. That's because the energy that warms the Earth, which radiates from the Sun and is absorbed and re-emitted by Earth's surface — and also by greenhouse gases in the atmosphere — is part of the same spectrum of radiation that makes up radio waves. If we didn't understand electromagnetic waves, we couldn't have created the technology we rely on today. The same is true for the science that underlies other aspects of climate models. (Learn more on page 38 of the book.)

But we don't understand everything, right?

We have understood the basic physics for hundreds of years. The last piece of it, the discovery that carbon dioxide warms the atmosphere, was put in place in the late 19th and early 20th centuries. Everything else — the laws of motion, the laws of thermodynamics — was worked out between the 17th and 19th centuries. (Learn more on page 39 of the book.)

We do still have uncertainty in our modeling systems. A big part of this book is about how scientists understand that uncertainty and actually embrace it as part of their work. If you know what you don't know and why, you can use that to better understand the whole climate system.
Can we ever eliminate the uncertainty?

Not entirely. In our book, we break down uncertainty into three categories: model uncertainty (How good are the models at reflecting how the Earth really works?), initial condition uncertainty (How well do we understand what the Earth system looks like right now?), and scenario uncertainty (What will future emissions look like?)

To better understand, it might help to think about the uncertainty that would be involved if you had a computer model that could simulate making a pizza. Instead of trying to figure out what Earth's climate would look like in 50 or 100 years, this model would predict what your pizza would look like when it was done.

The first thing you want to know is how well the model reflects the reality of how a pizza is made. For example, does the model take into account all the ingredients you need to make the pizza, and how they will each evolve? The cheese melts, the dough rises, and the pepperoni shrinks. How well can the model approximate each of those processes? This is model uncertainty.

The second thing you'd want to know is whether you can input all the pizza's "initial conditions" into the model. Some initial conditions — like how many pepperoni slices are on the pizza and where — are easy to observe, but others are not. For example, kneading the pizza dough creates small pockets of air, but you don't know exactly where they are. When the dough is heated, the air expands and forms big bubbles in the crust. If you can't tell the model where the air pockets are, it can't accurately predict where the crust bubbles will form when the pizza is baked. The same is true for a climate model. Some parts of the Earth, like the deep oceans and the polar regions, are not easy to observe in enough detail, leaving scientists to estimate what the conditions there are like and leading to the second type of uncertainty in the model results.
Finally, the pizza-baking model also has to deal with "scenario uncertainty," because it doesn't know how long the person baking the pizza will keep it in the oven, or at what temperature. Without understanding the choices the human will make, the model can't say for sure if the dough will be soft, crispy, or burnt. With climate models, over long periods of time, like a century, we've found that this scenario uncertainty is actually the dominant one. In other words, we don't know how much carbon dioxide humans around the world are going to emit in the years and decades to come, and it turns out that that's what matters most. (Learn more about uncertainty on page 10 of the book.)

Any other misperceptions you frequently hear?

People always say, "If we can't predict the weather next week, how can we know what the climate will be like in 50 years?" Generally speaking, we can't perfectly predict the weather because we don't have a full understanding of all the current conditions. We don't have observations for every grid point on a weather model or for large parts of the ocean, for example. But climate is not concerned with the exact weather on a particular day 50 or 100 years from now. Climate is the statistical distribution of weather, not a particular point on that distribution. Climate prediction is focused on the statistics of this distribution, and that is governed by conservation of energy and mass on long time scales, something we do understand. (Learn more on page 6 of the book. Read more common misperceptions at http://www.demystifyingclimate.org/misperceptions.)

Did you learn anything about climate modeling while working on the book?

My background is the atmosphere. I sat down and wrote the whole section on the atmosphere in practically one sitting. But I had to learn about the other aspects of models, the ocean and the land, which work really differently. The atmosphere has only one boundary, a bottom boundary.
We just have to worry about how it interacts with mountains and other bumps on the surface. But the ocean has three hard boundaries: the bottom and the sides, like a giant rough bathtub. It also has a boundary with the atmosphere on the top. Those boundaries really change how the ocean moves. And the land is completely different because it doesn't move at all. Writing this book really gave me a new appreciation for some of the subtleties of other parts of the Earth system and the ways my colleagues model them. (Learn more on page 13 of the book.)

What was the most fun part of writing the book for you?

I think having to force myself to think in terms of analogies that are understandable to a variety of people. I can describe a model using a whole bunch of words most people don't use every day, like "flux." It was a fun challenge to come up with words that would accurately describe the models and the science but that were accessible to everyone.
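Gettelman's point that climate is the statistical distribution of weather, not a particular point on it, can be illustrated with a toy simulation. All numbers here are illustrative assumptions: two runs share the same slow trend but have independent day-to-day weather noise, so individual days diverge while long-run statistics agree.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_daily_temps(warming_per_year=0.03, years=30, base=15.0):
    """Toy daily temperatures (deg C): a slow warming trend plus large
    day-to-day weather noise. All parameter values are illustrative."""
    days = years * 365
    t = np.arange(days) / 365.0
    return base + warming_per_year * t + rng.normal(0, 5.0, days)

run1 = simulate_daily_temps()
run2 = simulate_daily_temps()

# Any single day differs a lot between runs (weather is unpredictable)...
print(abs(run1[1000] - run2[1000]))    # typically several degrees
# ...but the 30-year climate statistics agree closely.
print(abs(run1.mean() - run2.mean()))  # typically a few hundredths of a degree
```

The day-to-day noise averages out over decades, which is why the long-run mean is far more predictable than any individual day, just as the conservation-of-energy argument above suggests.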
