Weather Research

NCAR weather ensemble offers glimpse at forecasting's future

July 1, 2016 | Last spring, scientists at the National Center for Atmospheric Research (NCAR) flipped the switch on a first-of-its-kind weather forecasting system. For more than a year, NCAR's high-resolution, real-time ensemble forecasting system has been ingesting 50,000 to 70,000 observations every six hours and creating a whopping 90,000 weather maps each day.

The system has become a favorite among professional forecasters and casual weather wonks: Typically more than 200 people check out the site each day, with more than a thousand coming during major weather events. During this experimental period, the NCAR ensemble has also become a popular source of guidance within the National Weather Service, where it has already been referenced several hundred times by forecasters at more than 50 different offices. But perhaps more important, the data accumulated from running the system daily — and there is lots of it — is being used by researchers at universities across the country to study a range of topics, from predicting hail size to anticipating power outages for utilities.

"We wanted to demonstrate that a real-time system of this scale was feasible," said NCAR scientist Craig Schwartz. "But it's also a research project that can help the community learn more about the predictability of different kinds of weather events."

Schwartz is a member of the team that designed and operates the system, along with NCAR colleagues Glen Romine, Ryan Sobash, and Kate Fossell.

This animation shows the forecast for accumulated snowfall made by each of the NCAR ensemble's 10 members for the 48-hour period beginning on Jan. 22, 2016. In the run-up to the blizzard, which ultimately dropped more than 30 inches of snow on parts of the Mid-Atlantic, more than 1,000 people visited the NCAR ensemble's website. (©UCAR. This animation is freely available for media & nonprofit use.)

Testing a unique tool

NCAR's high-resolution ensemble forecasting system is unique in the country for a couple of reasons, both of which are revealed in its name: It's an ensemble, and it's high resolution.

Instead of producing a single forecast, the system produces an "ensemble" of 10 forecasts, each with slightly different (but equally likely) starting conditions. The degree to which the forecasts look the same or different tells scientists something about the probability that a weather event, like rain, hail, or wind, will actually occur. By comparing the actual outcomes to the forecasted probabilities, scientists can study the predictability of particular weather events under different circumstances.

The forecasting system's high resolution (the grid points are just 3 kilometers apart) allows it to simulate small-scale weather phenomena, like the creation of individual storms from convection — the process of moist, warm air rising and then condensing into clouds. The combination of fine grid spacing and ensemble predictions in the NCAR system offers a sneak peek at what the future of weather forecasting might look like, and weather researchers across the country have noticed.

Cliff Mass, a professor of atmospheric sciences at the University of Washington whose specialty is forecasting, said: "It's extremely important for the United States to have a convection-allowing ensemble system to push our forecasting capabilities forward. We were delighted that NCAR demonstrated that this could be done."

'The cat's meow'

The treasure trove of accruing weather data generated by running the NCAR ensemble is already being used by researchers both at NCAR and in the broader community.
Jim Steenburgh, for instance, is a researcher at the University of Utah who is using the system to understand the predictability of mountain snowstorms. "NCAR's ensemble not only permits the 'formation' of clouds, it can also capture the topography of the western United States," he said. "The mountains control the weather to some degree, so you need to be able to resolve the mountains' effects on precipitation."

Steenburgh has also been using the ensemble with his students. "We're teaching the next generation of weather forecasters," he said. "In the future, these high-resolution ensemble forecasts will be the tools they need to use, and this gives them early, hands-on experience."

Like Steenburgh, Lance Bosart, an atmospheric researcher at the University at Albany, State University of New York, has used the ensemble both in his own research — studying the variability of convective events — and with his students. He said having 10 members in the ensemble forecast helps students easily see the spread of possibilities, and the visual emphasis of the user interface makes it easy for students to absorb the information.

"What makes it an invaluable tool is the graphical display," he said. "It's visually compelling. You don't have to take a lot of time to explain what you're looking at; you can get right into explaining the science. I like to say it's the cat's meow."

Setting an example

The NCAR ensemble is also enabling the researchers running it to further their own research. "We're collecting statistics on the misfit between the model predictions and observations, and then we're trying to use that to improve our model physics," Romine said.

The ensemble project is also teaching the team about the strengths and weaknesses of the way they've chosen to kick off, or "initialize," each of the ensemble members. "The NCAR ensemble happens to produce a pretty good forecast, but we realize there are some shortcomings," Schwartz said. "For example, if we were trying to make the best forecast in the world, we would probably not be initializing the model the way we are. But then we wouldn't learn as much from a research perspective."

The NCAR ensemble began as a yearlong trial, but the project is continuing to run for now. The team would like to keep the system online until next summer, but they don't yet have the computing resources they need to run it past September. If the system does continue to run, the researchers who are using it say there's still more that they and their students can learn from the project. And if not, there's loads of data already collected that are still waiting to be mined.

In any case, Mass says NCAR's ensemble has been a valuable project. "It set a really good example for the nation," he said.

Community members interested in collaborating or helping support the NCAR ensemble project are encouraged to contact the team.

Writer/contact: Laura Snider, Senior Science Writer and Public Information Officer
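The article's description of how ensemble agreement maps to probability — 10 equally likely forecasts, with the event probability read off from how many members predict the event — can be sketched in a few lines. This is an illustrative example only (the snowfall values and threshold are invented), not NCAR's actual code:

```python
# Illustrative sketch: estimating the probability of a weather event
# from an ensemble of equally likely forecasts. Each member's forecast
# here is a hypothetical accumulated-snowfall value in inches at one point.

def event_probability(member_forecasts, threshold):
    """Fraction of ensemble members predicting the event (e.g., snowfall
    meeting or exceeding a threshold). With 10 equally likely members,
    each member that predicts the event adds 10% to the probability."""
    hits = sum(1 for f in member_forecasts if f >= threshold)
    return hits / len(member_forecasts)

# Ten members with slightly different starting conditions diverge into a
# spread of outcomes; their degree of agreement is the forecast probability.
members = [28.0, 31.5, 25.2, 33.1, 29.8, 22.4, 30.6, 27.3, 34.0, 26.9]
prob_heavy_snow = event_probability(members, threshold=30.0)  # 4 of 10 members
print(f"P(snowfall >= 30 in) = {prob_heavy_snow:.0%}")  # prints "P(snowfall >= 30 in) = 40%"
```

A tight clustering of members signals a confident forecast; a wide spread like the one above signals genuine uncertainty, which is exactly the information a single deterministic forecast cannot provide.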

Sizing up cyclones

UPDATE: 2016 SEASONAL FORECAST

The NCAR-based Engineering for Climate Extremes Partnership (ECEP) has issued its first seasonal forecast using the Cyclone Damage Potential index. The forecast is for a hurricane season with higher-than-average potential to cause damage. This year's forecasted seasonal CDP is 5.7, compared to an average seasonal CDP of 3.7 for the years 1981–2010. For more details, visit the ECEP website.

May 18, 2016 | In early July 2005, Hurricane Dennis, a Category 3 storm on the Saffir-Simpson Hurricane Wind Scale, was bearing down on the Gulf Coast. Anyone paying attention would have been forgiven for having a foreboding sense of déjà vu. Just 10 months earlier, another Category 3 storm, Hurricane Ivan, had followed a strikingly similar track, making landfall just west of Gulf Shores, Alabama. Ivan ravaged the region, ultimately causing an estimated $18.8 billion in damages. But Dennis, despite roaring ashore in practically the same neighborhood, caused only $2.5 billion in damages—less than one-seventh that caused by Ivan.

The fact that two Category 3 hurricanes making similar landfall less than one year apart had such different impacts illustrates a weakness in the Saffir-Simpson scale, the system most commonly used by weather forecasters to categorize hurricane risk. Scientists at the National Center for Atmospheric Research (NCAR)—in collaboration with global insurance broker Willis—have developed a new index that they suspect can do a better job of quantifying a hurricane's ability to cause destruction.
The Cyclone Damage Potential index (CDP) rates storms on a scale of 1 to 10, with 10 being the greatest potential for destruction. A prototype for an app that will allow risk managers to easily use the CDP to identify local damage potential is already available and will be among the first tools included in the beta version of the NCAR-based Global Risk, Resilience, and Impacts Toolbox when it is released later this year.

Infrared satellite imagery of Hurricane Ivan (left) and Hurricane Dennis (right). Both storms were rated Category 3, both made landfall in almost the same area, and yet they caused vastly different amounts of damage. Click to enlarge. (Images courtesy NOAA.)

Moving beyond wind speed

On the frequently used Saffir-Simpson Hurricane Wind Scale, hurricanes are placed in one of five categories, based on their sustained wind speeds. On the low end, Category 1 storms have sustained winds between 74 and 95 mph and are expected to cause "some damage." On the high end, Category 5 storms have sustained winds of 157 mph or higher and are expected to cause "catastrophic damage." Because the Saffir-Simpson scale relies solely on sustained wind speeds, it does not take into account all the characteristics of a storm linked to its destructive power.

"Hurricane wind damage is driven by more than simply wind speed," said James Done, one of three NCAR scientists who worked on the CDP. "The hurricane's size and forward speed also are important. A large, slowly moving hurricane that repeatedly batters an area with high winds can cause greater total damage than a smaller, faster hurricane that blows quickly through a region."

Damage caused to a marina in New Orleans by Hurricane Katrina. Katrina would have received a CDP rating of 6.6, compared to a 5.0 for Hurricane Ivan and a 2.4 for Hurricane Dennis. (Image courtesy NOAA. Click here for high resolution.)

For example, the critical difference between Ivan and Dennis turned out to be hurricane size, according to a study of the storms by Done and Jeffrey Czajkowski at the University of Pennsylvania's Wharton Risk Management and Decision Processes Center.

To create the CDP, the scientists incorporated hurricane size and forward speed into their index, along with sustained winds. To determine the relative importance of each, the team used hurricane damage statistics from the hundreds of offshore oil and gas facilities that pepper the northern Gulf of Mexico. Because the facilities are spread more or less evenly across the region, their exposure to hurricanes is approximately the same. Damage differences from storm to storm can therefore be attributed to differences in the storms themselves. The CDP does not predict actual damage – which could vary markedly, depending on where (or if) a hurricane makes landfall – but instead predicts the storm's potential to cause damage.

When applying the CDP to past hurricanes, the index was able to discern a difference between Ivan, which would have scored 5.0 on the CDP prior to landfall, and the much smaller Dennis, which would have earned a 2.4. The CDP rating for Hurricane Katrina, which ultimately caused more than $80 billion in damages in 2005, would have been 6.6.

"The value of the index is in comparing current storms with storms from the past," Done said. "For example, if a hurricane is approaching New Orleans, you can compare its CDP with Hurricane Katrina's CDP and get a fuller picture of how much damage the storm is likely to cause."

The CDP project was led by NCAR scientist Greg Holland, along with NCAR colleagues Done, Ming Ge, and Willis collaborator Rowan Douglas.

From today's storm to tomorrow's climate

In its original form, the CDP can be easily applied in real time to existing hurricanes.
But Done also wanted to find a way to examine how hurricane damage might change in the future, especially as the climate warms. The question of how climate change may influence hurricanes has been difficult to answer, in part because global climate models are typically not able to "see" the small-scale details of individual storms. Though some scientists have run climate models at a resolution that is fine enough to study hurricane formation, the effort requires so much computing power that it hasn't been practical to replicate on a large scale.

To skirt this problem, hurricane researchers have looked for links between hurricane activity and phenomena that climate models can see—for example, the sea surface temperatures of ocean basins. "People have used large-scale variables to infer tropical cyclone activity for decades," Done said. "I've done a similar thing, but instead of predicting how many hurricanes will form, I'm predicting hurricane damage potential."

To make this "climate" version of the CDP, Done – together with NCAR colleagues Debasish PaiMazumder and Erin Towler, and Indian Space Research Organization collaborator Chandra Kishtawal – looked for variables in the large-scale environment that could be correlated to the three variables used for the original CDP: sustained winds, size, and storm speed.

The team found that "steering flow," the large-scale winds that carry a hurricane along, is correlated with forward speed. They also found that relative sea surface temperature – the difference between temperatures in the Atlantic and Pacific ocean basins – is linked to seasonal average hurricane intensity and size. This is because relative sea surface temperatures affect wind speeds higher up in the atmosphere, which in turn affect hurricane formation. The result is an index that can produce a damage potential rating for a season, a year, or even longer, without needing to predict how many individual storms might form.
Such forecasts are of interest to large reinsurance companies, like Willis Re and others. "This technique enables us to translate our climate model simulations into information about extreme events that's critical for businesses and policy makers," Done said.

Writer/contact: Laura Snider, Senior Science Writer and Public Information Officer
Funders: Research Partnership to Secure Energy for America; Willis Re
Collaborators: Engineering for Climate Extremes Partnership; Willis Re
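The article describes the CDP as combining sustained wind speed, storm size, and forward speed into a 1–10 rating, but does not give the published formula. The sketch below is therefore a hypothetical toy index with invented weights, shown only to make the qualitative relationships concrete: damage potential grows strongly with wind, increases with size, and increases as forward speed drops (a slow storm batters a location longer):

```python
# Hypothetical illustration only -- NOT the published CDP formula.
# The normalizations and weights below are invented for demonstration.

def toy_damage_potential(wind_mph, size_miles, forward_speed_mph):
    """Toy damage-potential rating on a 1-10 scale.

    Wind damage grows roughly with the cube of wind speed; larger storms
    expose more area to high winds; slower storms prolong the battering.
    """
    wind_term = (wind_mph / 160.0) ** 3                 # normalized, cubic in wind
    size_term = size_miles / 200.0                      # larger storms hit more area
    duration_term = 10.0 / max(forward_speed_mph, 1.0)  # slower = longer exposure
    raw = wind_term * size_term * duration_term
    return max(1.0, min(10.0, 10.0 * raw))              # clamp onto the 1-10 scale

# Two storms with identical Category 3 winds, like Ivan and Dennis,
# can score very differently once size and forward speed are included:
large_slow = toy_damage_potential(wind_mph=120, size_miles=250, forward_speed_mph=8)
small_fast = toy_damage_potential(wind_mph=120, size_miles=80, forward_speed_mph=20)
print(f"large, slow storm: {large_slow:.1f}; small, fast storm: {small_fast:.1f}")
```

The point of the exercise mirrors the Ivan/Dennis comparison: a wind-only scale assigns both storms the same category, while an index that also weighs size and translation speed separates them.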

A 3D window into a tornado

This simulation was created by NCAR scientist George Bryan to visualize what goes on inside a tornado. The animation is the "high swirl" version in a series that goes from low, to medium, to high. Click to enlarge. (Courtesy George Bryan, NCAR. This image is freely available for media & nonprofit use.)

May 17, 2016 | What's really going on inside a tornado? How fast are the strongest winds, and what are the chances that any given location will experience them when a tornado passes by? Due to the difficulties of measuring wind speeds in tornadoes, scientists don't have answers to these questions. However, a collaborative project between researchers at the University of Miami and NCAR has been seeking clues with new, highly detailed computer simulations of tornado wind fields.

The simulations can be viewed in a series of animations, created by NCAR scientist George Bryan, that provide a 3D window into the evolving wind fields of idealized tornadoes at different rates of rotation. In the high-swirl animation shown here, which depicts a powerful tornado with 200-plus mph winds, the purple tubelike structures depict the movements of rapidly rotating vortices. Near-surface winds are represented by colors ranging from light blue (less than 20 meters per second, or 45 mph) to deep red (more than 100 meters per second, or 224 mph). The vortices and winds are contained within a condensation cloud that rises more than 500 meters (1,640 feet) above the surface.

Such visualizations can help atmospheric scientists better understand the structures of tornadoes, as well as the shifting location and strength of maximum wind speeds. Bryan also uses them in presentations to meteorology students. "When you make these 3D visualizations and then animate them, they give you a sense of how the flow evolves and how the turbulence changes," Bryan said.
"These are details you don't see by just looking at a photograph." For example, he learned from the visualization that the rotating tubes tilt backward against the flow at higher altitudes. These are the kinds of details that can eventually help scientists better understand these complex storms.

The information is also critical for public safety officials and engineers. "If you're an engineer designing a building, you want to know details like how much greater peak wind is than average wind in a tornado," Bryan said. "We'll get questions from engineers asking about the details of wind gusts in those purple tubes."

Bryan is collaborating on the simulations with Dave Nolan, chair of Miami's Department of Atmospheric Sciences. To create the animation, Bryan used innovative NCAR software that enables researchers in the atmospheric and related sciences to analyze and interpret results from large computer models. VAPOR (Visualization and Analysis Platform for Ocean, Atmosphere, and Solar Researchers) is an interactive 3D visualization environment for both animations and still-frame images. The open-source software can be downloaded and used on personal computers. VAPOR was developed at NCAR in partnership with the University of California at Davis and Ohio State University. Funding comes from the National Science Foundation and the Korea Institute of Science and Technology Information.

Writer/contact: David Hosansky
Funder: National Science Foundation
Collaborator: University of Miami

Ocean temps predict U.S. heat waves 50 days out, study finds

BOULDER — The formation of a distinct pattern of sea surface temperatures in the middle of the North Pacific Ocean can predict an increased chance of summertime heat waves in the eastern half of the United States up to 50 days in advance, according to a new study led by a scientist at the National Center for Atmospheric Research (NCAR).

The pattern is a contrast of warmer-than-average water butting up against cooler-than-average seas. When it appears, the odds that extreme heat will strike during a particular week—or even on a particular day—can more than triple, depending on how well-formed the pattern is. The research is being published in the journal Nature Geoscience.

"Summertime heat waves are among the deadliest weather events, and they can have big impacts on farming, energy use, and other critical aspects of society," said Karen McKinnon, a postdoctoral researcher at NCAR and the lead author of the study. "If we can give city planners and farmers a heads up that extreme heat is on the way, we might be able to avoid some of the worst consequences."

The research was largely funded by the National Science Foundation, NCAR's sponsor. In addition to McKinnon, the research team includes Andrew Rhines, of the University of Washington; Martin Tingley, of Pennsylvania State University; and Peter Huybers, of Harvard University.

A fingerprint on the ocean

For the study, the scientists divided the country into regions that tend to experience extreme heat at the same time. The scientists then focused on the largest of the resulting blocks: a swath that stretches across much of the Midwest and up the East Coast, encompassing both important agricultural areas and heavily populated cities.

Top: Sea surface temperature anomalies in the mid-latitude Pacific 50 days in advance of June 29, 2012.
The pattern inside the green box resembled the Pacific Extreme Pattern, indicating that there would be an increase in the odds of a heat wave in the eastern half of the United States at the end of June. (Image courtesy of Karen McKinnon, NCAR. This image is freely available for media & nonprofit use.) Bottom: June 29, 2012, was the hottest day of the year in the eastern United States. The hot temperatures in late June and early July were part of an extraordinarily hot summer that saw three heat waves strike the country. (Map courtesy of the National Weather Service's Weather Prediction Center.)

The research team looked to see if there was a relationship between global sea surface temperature anomalies—waters that are warmer or cooler than average—and extreme heat in the eastern half of the U.S. Right away, a pattern popped out in the middle of the Pacific, above about 20 degrees North latitude. The scientists found that the particular configuration of ocean water temperatures, which they named the Pacific Extreme Pattern, was not only present when it was already hot in the eastern U.S., but also tended to form in advance of that heat.

"Whatever mechanism ultimately leads to the heat wave also leaves a fingerprint of sea surface temperature anomalies behind," McKinnon said.

Improving seasonal forecasts

To test how well that fingerprint could predict future heat, the scientists used data collected from 1,613 weather stations across the eastern U.S. between 1982 and 2015, as well as daily sea surface temperatures for the same time period. The scientists defined extreme heat in the eastern U.S. as a summertime day when the temperature readings from the warmest 5 percent of weather stations in the region were at least 6.5 degrees Celsius (11.7 degrees Fahrenheit) hotter than average. The scientists only looked at extreme heat during the region's 60 hottest days of the year: June 24 through Aug. 22.
The researchers "hindcasted" each year in the dataset to see if they could retrospectively predict extreme heat events—or the lack of them—during that year's summer, using only data collected during the other years as a guide. At 50 days out, the scientists were able to predict an increase in the odds—from about 1 in 6 to about 1 in 4—that a heat wave would strike somewhere in the eastern U.S. during a given week. At 30 days out or closer, the scientists were able to predict an increase in the odds—to better than 1 in 2 for a particularly well-formed pattern—that a heat wave would strike on a particular day.

This new technique could improve existing seasonal forecasts, which do not focus on predicting daily extremes. Seasonal forecasts typically predict whether an entire summer is expected to be warmer than normal, normal, or cooler than normal. For example, the seasonal forecast issued for the summer of 2012 predicted normal heat for the Northeast and Midwest. But the summer ended up being especially hot, thanks to three major heat waves that struck in late June, mid-July, and late July. When the science team used the Pacific Extreme Pattern to hindcast 2012, they were able to determine as early as mid-May that there were increased odds of extremely hot days occurring in late June. The hottest day of the summer of 2012, as measured by the technique used for this study, was June 29, when the warmest 5 percent of weather stations recorded temperatures that were 10.4 degrees Celsius (18.7 degrees Fahrenheit) above average.

"We found that we could go back as far as seven weeks and still predict an increase in the odds of future heat waves," McKinnon said. "What's exciting about this is the potential for long-range predictions of individual heat waves that gives society far more notice than current forecasts."

Looking ahead

Scientists do not yet know why the fingerprint on sea surface temperatures in the Pacific predicts heat in the eastern U.S.
It could be that the sea surface temperatures themselves kick off weather patterns that cause the heat. Or it could be that both are different results of the same phenomenon, and one does not cause the other. To learn more about how the two are connected, McKinnon is working with colleagues at NCAR to use sophisticated computer models to tease apart what is really happening.

The study's findings also point toward the possibility that the Pacific Extreme Pattern, or a different oceanic fingerprint, could be used to forecast other weather events far in advance, including cooler-than-average days and extreme rainfall events. "The results suggest that the state of the mid-latitude ocean may be a previously overlooked source of predictability for summer weather," McKinnon said.

About the article
Title: Long-lead predictions of eastern United States hot days from Pacific sea surface temperatures
Authors: Karen McKinnon, Andrew Rhines, Martin Tingley, and Peter Huybers
Journal: Nature Geoscience

Writer: Laura Snider, Senior Science Writer and Public Information Officer
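The study's extreme-heat criterion, as described above, lends itself to a short sketch: a day counts as extremely hot when the warmest 5 percent of the region's weather stations are at least 6.5 degrees Celsius above their average. Exactly how the study aggregates those warmest stations is an assumption here (this sketch uses their mean anomaly), and the station values are invented:

```python
# Sketch of the extreme-heat-day criterion described in the article.
# Aggregation choice (mean of the warmest stations) is an assumption.

def is_extreme_heat_day(station_anomalies_c, top_fraction=0.05, threshold_c=6.5):
    """True if the mean anomaly of the warmest `top_fraction` of stations
    meets or exceeds `threshold_c` degrees Celsius above average."""
    ranked = sorted(station_anomalies_c, reverse=True)
    n_top = max(1, int(len(ranked) * top_fraction))
    hottest_mean = sum(ranked[:n_top]) / n_top
    return hottest_mean >= threshold_c

# 100 hypothetical station anomalies: most near normal, five running very hot.
anomalies = [1.0] * 95 + [7.0, 7.5, 8.0, 6.8, 9.1]
print(is_extreme_heat_day(anomalies))  # prints "True": the 5 warmest average 7.68 C
```

Defining the event this way, over the warmest tail of stations rather than a regional mean, is what lets the study count a heat wave even when much of the region stays near normal.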

Potential Zika virus risk estimated for 50 U.S. cities

BOULDER – Key factors that can combine to produce a Zika virus outbreak are expected to be present in a number of U.S. cities during peak summer months, new research shows.

The Aedes aegypti mosquito, which is spreading the virus in much of Latin America and the Caribbean, will likely be increasingly abundant across much of the southern and eastern United States as the weather warms, according to a new study led by mosquito and disease experts at the National Center for Atmospheric Research (NCAR). Summertime weather conditions are favorable for populations of the mosquito along the East Coast as far north as New York City and across the southern tier of the country as far west as Phoenix and Los Angeles, according to computer simulations conceived and run by researchers at NCAR and the NASA Marshall Space Flight Center. Spring and fall conditions can support low to moderate populations of the Aedes aegypti mosquito in more southern regions of its U.S. range. Wintertime weather is too cold for the species outside southern Florida and southern Texas, the study found.

By analyzing travel patterns from countries and territories with Zika outbreaks, the research team further concluded that cities in southern Florida and impoverished areas in southern Texas may be particularly vulnerable to local virus transmission.

Many U.S. cities face potential risk in summer of low, moderate, or high populations of the mosquito species that transmits Zika virus (colored circles). The mosquito has been observed in parts of the United States (shaded portion of map) and can establish populations in additional cities because of favorable summertime meteorological conditions. In addition, Zika risk may be elevated in cities with more air travelers arriving from Latin America and the Caribbean (larger circles). For a high-resolution map, click here or on the image. (Image based on data mapped by Olga Wilhelmi, NCAR GIS program. This image is freely available for media & nonprofit use.)
"This research can help us anticipate the timing and location of possible Zika virus outbreaks in certain U.S. cities," said NCAR scientist Andrew Monaghan, the lead author of the study. "While there is much we still don't know about the dynamics of Zika virus transmission, understanding where the Aedes aegypti mosquito can survive in the U.S. and how its abundance fluctuates seasonally may help guide mosquito control efforts and public health preparedness."

"Even if the virus is transmitted here in the continental U.S., a quick response can reduce its impact," added NCAR scientist Mary Hayden, a medical anthropologist and co-author of the study.

Although the study does not include a specific prediction for this year, the authors note that long-range forecasts for this summer point to a 40–45 percent chance of warmer-than-average temperatures over most of the continental United States. Monaghan said this could lead to increased suitability for Aedes aegypti in much of the South and East, although above-normal temperatures would be less favorable for the species in the hottest regions of Texas, Arizona, and California. Monaghan stressed that, even if Zika establishes a toehold in the mainland United States, it is unlikely to spread as widely as in Latin America and the Caribbean. This is partly because a higher percentage of Americans live and work in air-conditioned and largely sealed homes and offices.

The study is being published today in the peer-reviewed journal PLOS Currents Outbreaks. It was funded by the National Institutes of Health, NASA, and the National Science Foundation, which is NCAR's sponsor. It was co-authored by scientists at NASA, North Carolina State University, Maricopa County Environmental Services Vector Control Division, University of Arizona, and Durham University.

Spreading rapidly

First identified in Uganda in 1947, the Zika virus has moved through tropical regions of the world over the past decade.
It was introduced into Brazil last year and spread explosively across Latin America and the Caribbean, with more than 20 countries now facing epidemics. About 80 percent of infected people do not have significant symptoms, and most of the rest suffer relatively mild flu- or cold-like symptoms that generally clear up in about a week. However, scientists are investigating correlations between contracting the disease during pregnancy and microcephaly, a rare birth defect characterized by an abnormally small head and brain damage.

To determine the potential risk in the mainland United States, the research team ran two computer models that simulated the effect of meteorological conditions on the mosquito's entire lifecycle (egg, larval, pupal, and adult stages) in 50 cities in or near the known range of the species. Monaghan and several team members have studied Aedes aegypti for years because it also carries the viruses that cause dengue and chikungunya. Generally, the mosquitoes need warm and relatively stable temperatures, as well as water-filled containers such as buckets, barrels, or tires, for their eggs to hatch. Once a mosquito bites an infected person, it also needs to live long enough – probably a week or more, depending on ambient temperatures – for the virus to travel from the mosquito's mid-gut to its salivary glands. Once in the saliva, the virus can be transmitted when the mosquito bites another person.

The study results show that, as springtime weather warms, the potential abundance of the mosquito begins to increase in April in the Southeast and some Arizona cities. By June, nearly all of the 50 cities studied have the potential for at least low-to-moderate abundance, and most eastern cities are suitable for moderate-to-high abundance. Conditions become most suitable for mosquito populations in July, August, and September, although the peak times vary by city. Weather conditions in southern and western cities remain suitable as late as November.
Even some cities where the Aedes aegypti mosquito has not been detected, such as St. Louis and Denver, have suitable midsummer weather conditions for the species if it were introduced via transport of used tires or by other human activities, according to the computer models. The researchers stressed that additional factors outside the scope of the study could affect populations of the species, such as mosquito control efforts, competition with other mosquito species, and the extent to which eggs can survive in borderline temperatures. The study noted that northern cities could become vulnerable if a related species of mosquito that is more tolerant of cold temperatures, Aedes albopictus, begins to carry the virus.

This animation shows the varying extent to which meteorological conditions can favor populations of the Aedes aegypti mosquito, which transmits the Zika virus, in 50 U.S. cities throughout the year. Red dots represent high-abundance conditions, orange represents medium-to-high, yellow represents low-to-medium, and gray represents no significant mosquito population. (Animation based on data from Andrew Monaghan, NCAR. This image is freely available for media & nonprofit use.)

Factoring in travel, poverty

In addition to looking at meteorological conditions, the researchers studied two other key variables that could influence the potential for Zika outbreaks: travel from Zika-affected areas and socioeconomic conditions in states that may face abundant mosquito populations. To analyze air travel, the team estimated the number of passengers arriving in U.S. cities on direct flights from airports in 22 Latin American countries and territories listed on the Centers for Disease Control and Prevention's Zika travel advisory as of January 29. Cities that had both high potential numbers of Aedes aegypti and a large volume of air travelers included Miami, Houston, and Orlando.
Since the scientists were able to obtain passenger numbers for direct flights only, they could not estimate the number of passengers continuing on to smaller cities. They noted that the summertime peak in air travel coincides with the peak season in mosquito abundance. The study also estimated that nearly five times as many people cross the U.S.-Mexico border per month as arrive by air in all 50 cities. This could indicate a high potential for transmission in border areas from Texas to California, although the Zika virus has not been widely reported in northern Mexico. Those border areas, as well as other parts of the South where the mosquitoes are expected to be abundant, have a high percentage of households living below the poverty line, according to 2014 U.S. Census data analyzed by the research team. Lower-income residents can be more exposed to mosquito bites if they live in houses without air conditioning or have torn or missing screens that enable mosquitoes to enter their homes more easily. However, Aedes aegypti populations tend to thrive in densely populated urban areas, while some of the most impoverished areas are rural.

“The results of this study are a step toward providing information to the broader scientific and public health communities on the highest risk areas for Zika emergence in the United States,” said Kacey Ernst, an epidemiologist at the University of Arizona and co-author of the study. “We hope that others will build on this work as more information becomes available. 
All areas with an environment suitable to the establishment of Aedes aegypti should be working to enhance surveillance strategies to monitor the Aedes aegypti populations and human populations for disease emergence.”

“This research highlights the complex set of human and environmental factors that determine whether a mosquito-borne disease is carried from one area to another, and how severely it affects different human populations,” said Sarah Ruth, program director in the National Science Foundation’s Division of Atmospheric and Geospace Sciences. “By integrating information on weather, travel patterns, mosquito biology, and human behavior, the project team has improved our ability to forecast, deal with, and possibly even prevent future outbreaks of Zika and other serious diseases.”

About the article

Title: On the seasonal occurrence and abundance of the Zika virus vector mosquito Aedes aegypti in the contiguous United States
Authors: Andrew Monaghan, Cory Morin, Daniel Steinhoff, Olga Wilhelmi, Mary Hayden, Dale Quattrochi, Michael Reiskind, Alun Lloyd, Kirk Smith, Christopher Schmidt, Paige Scalf, and Kacey Ernst
Journal: PLOS Currents Outbreaks

COSMIC turns 10: Microsatellites reveal atmospheric properties in 3D

A constellation of six small satellites — rocketed into space a decade ago — has made outsized contributions to our ability to forecast severe weather events, track climate change, and understand space weather. April 15 at 01:40 UTC marks the 10-year anniversary of the launch of the Constellation Observing System for Meteorology, Ionosphere, and Climate (COSMIC). The project is a partnership between Taiwan, where it is called FORMOSAT-3, and the United States. Over its lifetime, COSMIC has proven to be an extremely cost-effective way to gather an abundance of atmospheric data, including three-dimensional profiles of temperature, humidity, and pressure, as well as electron density in the ionosphere.

The broadcast by Taiwan's National Space Organization showing the launch of COSMIC in April 2006.

"The results have been outstanding," said Rick Anthes, one of COSMIC's founding scientists and president emeritus of the University Corporation for Atmospheric Research (UCAR), which manages COSMIC in the U.S. "COSMIC has significantly improved weather prediction and expanded our knowledge of the atmosphere around the world. It has exceeded all expectations." In fact, COSMIC has been such a success that the two countries are collaborating on a follow-on mission, known as COSMIC-2 in the U.S. and FORMOSAT-7 in Taiwan. "I am pleased to say that FORMOSAT-3/COSMIC has much exceeded the original planned 2-year mission goal and continues to operate into the 10-year anniversary and beyond," said Guey-Shin Chang, director general of Taiwan's National Space Organization. "FORMOSAT-7/COSMIC-2 is moving along very well and is on target to launch the first six satellites within a year to replace the decaying COSMIC satellites." A second set of six satellites is scheduled to launch in 2019.

A fruitful partnership

The roots of COSMIC stretch back half a century and across tens of millions of miles to NASA's Mariner 4 mission to Mars. 
The mission, launched in 1965, was the first to use a technique known as radio occultation to probe a planet's atmosphere. Radio occultation measures the degree to which a radio signal bends and slows as it passes through the atmosphere — the greater the bend and delay, the denser the atmosphere.

Receivers on the COSMIC satellites measure the bend and delay of radio signals emitted by GPS satellites to infer information about the atmosphere. Image courtesy of the COSMIC Program.

NASA went on to use the technique to explore the atmospheres of several other planets, but radio occultation was not used to gather information about our own atmosphere until three decades later. A UCAR radio occultation experiment named GPS/MET changed that. Launched on a single satellite in 1995, GPS/MET demonstrated for the first time that the radio signals already being sent out by existing GPS satellites could be harnessed to record Earth's atmospheric density, a measurement that can be further broken down into information about temperature, humidity, and pressure. The success of that proof-of-concept experiment fueled interest in launching a constellation of small, relatively inexpensive satellites that could provide similar measurements, but many more of them and over the entire globe. To turn the idea into a reality, UCAR partnered with Taiwan's National Space Organization.

"They really set an example for how a satellite project should be executed," said Ying-Hwa "Bill" Kuo, the longtime director of UCAR's COSMIC Program. "The budget was really tight, but they were very disciplined and were able to complete the project on time and without cost overruns."

Each finished COSMIC satellite looks like an oversized pancake with two flipped-up solar panels and weighs about 140 pounds. The budget for COSMIC, including the first two years of operation, was $100 million. Taiwan covered 80 percent of the cost. 
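The link between signal delay and atmospheric properties runs through refractivity. A minimal sketch using the widely used Smith-Weintraub relation (pressure and water-vapor pressure in hPa, temperature in kelvin); the sample values below are illustrative, not COSMIC retrievals:

```python
def refractivity(pressure_hpa, temp_k, vapor_pressure_hpa):
    """Atmospheric refractivity N (in 'N-units') from the Smith-Weintraub
    relation. The first ('dry') term tracks total air density; the second
    ('wet') term is dominated by water vapor, which is why occultation
    profiles carry humidity as well as temperature information."""
    dry = 77.6 * pressure_hpa / temp_k
    wet = 3.73e5 * vapor_pressure_hpa / temp_k**2
    return dry + wet

# Near-surface tropical air: warm and moist
n_surface = refractivity(1013.0, 300.0, 30.0)
# Upper troposphere: cold and dry
n_upper = refractivity(250.0, 220.0, 0.1)
```

The denser, moister air near the surface produces a much larger N, and hence a larger bending of the GPS signal, than the thin air aloft; separating the dry and wet contributions is what requires the additional analysis mentioned in the text.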
In the United States, the leading sponsor of the project is the National Science Foundation (NSF). Other U.S. partners include the National Oceanic and Atmospheric Administration, the Air Force, the Office of Naval Research, and NASA, which developed the radio occultation receivers used on the satellites. "Unlike most proposed projects where less is achieved than is promised, COSMIC produced much more than proposed and even expected by its proponents," said Jay Fein, who managed COSMIC for NSF when the project was coming together. "Many people and institutions helped to make COSMIC an extraordinary success."

A COSMIC microsatellite during testing. Just over 6 inches deep, the satellite contains a GPS occultation receiver, a Tiny Ionospheric Photometer, and a tri-band beacon to relay data to ground stations. Photo still from video. Photo courtesy of Taiwan's National Space Organization.

A wealth of data

After a successful launch, the data came flooding back from COSMIC. The constellation took more than 2,500 soundings — vertical profiles of the atmosphere — per day globally. Instantly, these high-quality data were provided free of charge in real time to atmospheric researchers and forecasters around the world, who wasted little time taking advantage of them. “We have been integrating COSMIC-1 data into our numerical weather prediction models since 2007,” said Louis W. Uccellini, director of NOAA’s National Weather Service. “Its unique characteristics of high accuracy and vertical resolution complement other satellite observations used in the models. With the recent upgrade of our supercomputers, we're now able to process more data than ever, including the increased observations expected from COSMIC-2 next year." 
At its peak, COSMIC was among the top five observing systems in the world when measured by ability to boost the accuracy of numerical weather predictions, according to studies by the European Centre for Medium-Range Weather Forecasts and other international weather prediction centers. "COSMIC has conclusively demonstrated the scientific and practical value of radio occultation and established the technique as an essential part of the future global observing system," Anthes said.

Accurate and precise

Not only has COSMIC provided a huge volume of atmospheric observations, but those observations have also proven to be highly accurate — so much so that COSMIC data have been used over the past decade to uncover and correct instrument bias in other types of satellite observations. These biases arise as satellite sensors that detect microwave and infrared radiation degrade over time due to relentless exposure to the Sun's rays.

Scientists also have found that data collected by COSMIC are especially useful for forecasting tropical cyclones, including typhoons and hurricanes. COSMIC is able to provide critical observations of water vapor, the fuel that drives tropical cyclones, in high vertical resolution, which means scientists can determine how much water is present at what height in the atmosphere. Other observing systems over the oceans — as well as weather prediction models — often underestimate reservoirs of moisture stored in the lower troposphere, the layer of the atmosphere closest to Earth's surface where tropical cyclones form. COSMIC, on the other hand, provides valuable information to weather prediction models on how much water is available to fuel a forming cyclone. The upcoming launch of the first phase of COSMIC-2 will increase radio occultation measurements over the tropics, where typhoons and hurricanes are born. "COSMIC-2 has the potential to revolutionize tropical cyclone forecasts," said Kuo, who is now the director of UCAR Community Programs. 
"Not only will COSMIC-2 provide more data from the tropics, but the data will be higher quality as well, thanks to an improved antenna on the new satellites." More than 140 scientists, engineers, and other experts in radio occultation and its use from 20 countries are gathering in Taipei, Taiwan, for an international conference March 9-11 to celebrate the 10th anniversary of the launch. 

NCAR weather modeling system aids Antarctic rescue effort

Forecast products generated by AMPS for an Antarctic rescue mission. From top: flight-level winds, surface temperature, and cloud base heights. Images courtesy of Jordan Powers, NCAR.

As the U.S. Antarctic Program began mobilizing a specially equipped ski plane this week to rescue a stranded group of researchers, program officials asked for some key help from the National Center for Atmospheric Research. They needed a good forecast. "Antarctica has some really bad weather. Every day is different, but often not in a good way," said NCAR scientist Jordan Powers. "We provide model weather guidance to the U.S. Antarctic Program on an ongoing basis. And when these emergencies come up, they often ask us for specialized forecast products, too."

This time, the emergency is related to a group of more than 30 people who are stranded at an Australian research station. The group was originally scheduled to return home on the Aurora Australis icebreaker. But the ship broke free of its moorings during a blizzard on Tuesday night (U.S. time) and ran aground. The U.S. Antarctic Program plans to send a ski-equipped LC-130 aircraft to pick up the group and fly them to Australia's Casey research station, where they can catch another flight home. To ensure the safety of the rescue flight, the U.S. Antarctic Program asked NCAR to provide information on expected winds at 18,000 and 24,000 feet—the altitudes where the LC-130 is expected to fly.

The forecast model system created by NCAR and used in Antarctica is called the Antarctic Mesoscale Prediction System (AMPS). AMPS has been zoomed in on the section of the continent where the plane will be flying and is providing the information meteorologists need to make a good regional forecast. AMPS has been used in Antarctica since September 2000. About every year and a half, NCAR has been called on to help with a rescue, Powers said. 
"It's relatively quick to provide the specialized forecast products now because we've been doing these for so long," Powers said. AMPS is run on Erebus, a supercomputer funded by the National Science Foundation and maintained by NCAR's Computational and Information Systems Laboratory.

Writer/contact: Laura Snider, Senior Science Writer and Public Information Officer

Southwest dries as wet weather systems become more rare

BOULDER—The weather patterns that typically bring moisture to the southwestern United States are becoming more rare, an indication that the region is sliding into the drier climate state predicted by global models, according to a new study. "A normal year in the Southwest is now drier than it once was," said Andreas Prein, a postdoctoral researcher at the National Center for Atmospheric Research (NCAR) who led the study. "If you have a drought nowadays, it will be more severe because our base state is drier." Climate models generally agree that human-caused climate change will push the southwestern United States to become drier. And in recent years, the region has been stricken by drought. But linking model predictions to changes on the ground is challenging. In the new study—published online today in the journal Geophysical Research Letters, a publication of the American Geophysical Union—NCAR researchers grapple with the root cause of current drying in the Southwest to better understand how it might be connected to a warming climate.

Subtle shift yields dramatic impact

For the study, the researchers analyzed 35 years of data to identify common weather patterns—arrangements of high and low pressure systems that determine where it's likely to be sunny and clear or cloudy and wet, among other things. They identified a dozen patterns that are typical of the weather activity in the contiguous United States and then looked to see whether those patterns were becoming more or less frequent.

Weather systems that typically bring moisture to the southwestern United States are forming less often, resulting in a drier climate across the region. This map depicts the portion of overall changes in precipitation across the United States that can be attributed to these changes in weather system frequency. The gray dots represent areas where the results are statistically significant. (Map courtesy of Andreas Prein, NCAR. This image is freely available for media & nonprofit use.)

"The weather types that are becoming more rare are the ones that bring a lot of rain to the southwestern United States," Prein said. "Because only a few weather patterns bring precipitation to the Southwest, those changes have a dramatic impact." The Southwest is especially vulnerable to any additional drying. The region, already the most arid in the country, is home to a quickly growing population that is putting tremendous stress on its limited water resources. “Prolonged drought has many adverse effects,” said Anjuli Bamzai, program director in the National Science Foundation’s Division of Atmospheric and Geospace Sciences, which funded the research, “so understanding regional precipitation trends is vital for the well-being of society. These researchers demonstrate that subtle shifts in large-scale weather patterns over the past three decades or so have been the dominant factor in precipitation trends in the southwestern United States.”

The study also found an opposite, though smaller, effect in the Northeast, where some of the weather patterns that typically bring moisture to the region are increasing. “Understanding how changing weather pattern frequencies may impact total precipitation across the U.S. is particularly relevant to water resource managers as they contend with issues such as droughts and floods, and plan future infrastructure to store and disperse water,” said NCAR scientist Mari Tye, a co-author of the study.

The climate connection

The three patterns that tend to bring the most wet weather to the Southwest all involve low pressure centered in the North Pacific just off the coast of Washington, typically during the winter. Between 1979 and 2014, such low-pressure systems formed less and less often. The associated persistent high pressure in that area over recent years is a main driver of the devastating California drought. 
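The pattern-classification step can be sketched in miniature: cluster daily pressure-anomaly fields into a few recurring "weather types," then count how often each type occurs. The synthetic data and the simple k-means below are stand-ins for the reanalysis fields and the classification method actually used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 600 daily pressure-anomaly maps on a 40-point grid,
# generated around three underlying "weather types" plus noise.
n_days, n_gridpoints, n_types = 600, 40, 3
true_centers = rng.normal(size=(n_types, n_gridpoints))
labels_true = rng.integers(0, n_types, size=n_days)
fields = true_centers[labels_true] + 0.3 * rng.normal(size=(n_days, n_gridpoints))

def kmeans(data, k, n_iter=20, seed=1):
    """Minimal k-means: assign each day to its nearest pattern center,
    then recompute each center as the mean of its assigned days."""
    rng = np.random.default_rng(seed)
    centers = data[rng.choice(len(data), k, replace=False)]
    for _ in range(n_iter):
        dists = ((data[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean(axis=0)
    return labels

labels = kmeans(fields, n_types)
counts = np.bincount(labels, minlength=n_types)  # days per weather type
```

With real reanalysis data, comparing these per-type counts across early and late decades is what reveals whether moisture-bearing patterns are becoming more or less frequent.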
This shift toward higher pressure in the North Pacific is consistent with climate model runs, which predict that a belt of higher average pressure that now sits closer to the equator will move north. This high-pressure belt is created as air that rises over the equator moves poleward and then descends back toward the surface. The sinking air causes generally drier conditions over the region and inhibits the development of rain-producing systems. Many of the world's deserts, including the Sahara, are found in such regions of sinking air, which typically lie around 30 degrees latitude on either side of the equator. Climate models project that these zones will move farther poleward. The result is a generally drier Southwest.

While climate change is a plausible explanation for the change in frequency, the authors caution that the study does not prove a connection. To examine this potential connection further, they are studying climate model data for evidence of similar changes in future weather pattern frequencies. "As temperatures increase, the ground becomes drier and the transition into drought happens more rapidly,” said NCAR scientist Greg Holland, a co-author of the study. “In the Southwest the decreased frequency of rainfall events has further extended the period and intensity of these droughts.” The study was funded in part by the National Science Foundation, NCAR's sponsor, and the Research Partnership to Secure Energy for America. Other co-authors of the study include NCAR scientists Roy Rasmussen and Martyn Clark.

About the article

Title: Running dry: The U.S. Southwest’s drift into a drier climate state
Authors: Andreas F. Prein, Gregory J. Holland, Roy M. Rasmussen, Martyn P. Clark, and Mari R. Tye
Publication: Geophysical Research Letters

The quest to predict severe weather sooner

January 26, 2016 | Weather forecasts have become increasingly reliable thanks to improvements over the past several decades in computer modeling and observational equipment. However, when it comes to severe weather, that reliability typically begins to deteriorate beyond a two-day forecast. To provide an accurate severe weather outlook three or more days in advance, forecasters need to capture the fine-scale behavior of clouds, vertical wind shear, and other local processes, as well as the global atmospheric conditions surrounding the local region of interest. Regional models examine fine-scale conditions at high resolution, but they have a difficult time with accuracy between the area of interest and the surrounding region. Errors in these so-called boundary regions can distort the results for the target area. Simulating the entire globe in high resolution would help, but that takes an exorbitant amount of computing time.

MPAS's variable mesh enables smooth transitions from higher resolution (over North America in this example) to coarser resolution over the rest of the globe. (©UCAR. This image is freely available for media & nonprofit use.)

A global software platform called Model for Prediction Across Scales, or MPAS, aims to resolve those issues. It offers a new way of simulating the atmosphere while providing scientists with more flexibility when focusing on regional conditions. Its development comes at a time when the U.S. National Weather Service wants to increase the lead time and accuracy of forecasts of severe storms, including hurricanes, tornadoes, and flash floods, so communities can be better prepared. Unlike traditional three-dimensional models that calculate atmospheric conditions at multiple points within a block-shaped grid, MPAS uses a hexagonal mesh resembling a soccer ball or honeycomb that can be stretched wide or compressed for higher resolution as needed. 
"The mesh allows for a smooth transition between areas of coarse and fine resolution, with the goal of eliminating boundary distortions," said NCAR Senior Scientist William Skamarock, one of the developers of MPAS.

Look globally as well as locally

Vertical wind shear, or the change of winds with height, is a critical factor in determining thunderstorm severity. MPAS is able to simulate vertical wind shear at higher resolutions over local areas of interest, as well as cloud behavior and other processes vital to severe weather prediction. Ocean currents and many other global factors also can alter weather quickly. Global forecasts produced by the National Oceanic and Atmospheric Administration (NOAA) go out to 16 days; for tropical cyclones and hurricanes it's five days, but accuracy declines for the extended forecasts. "For some weather events, such as tropical cyclones, what's going on at the other side of the globe can influence the forecast for your region," Skamarock said. So forecasters need to portray the global environment surrounding a region that's under threat.

Jointly developed at NCAR and the Los Alamos National Laboratory in New Mexico, MPAS is being groomed especially to improve regional and global weather forecasts, climate modeling, and atmospheric chemistry research, such as regional air-quality forecasts. Last July, MPAS was selected by NOAA as one of the finalists to become the National Weather Service’s next-generation global weather model. The decision is expected later this year. "The fact that MPAS is a finalist is an expression of confidence in the model’s capabilities," Skamarock said. In tests, MPAS has performed well in predicting springtime thunderstorms and other severe weather over the Great Plains. It also has produced realistic simulations of certain tropical cyclones, including Hurricane Sandy of 2012. However, along with other U.S. models, it missed 2015's Hurricane Joaquin. 
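The vertical wind shear discussed above is often summarized as "bulk shear": the magnitude of the vector wind difference between two levels (for example, the surface and 6 km). A minimal sketch, with illustrative sample winds:

```python
import math

def bulk_shear(u_low, v_low, u_high, v_high):
    """Magnitude (m/s) of the vector wind difference between two levels,
    a standard summary of vertical wind shear in severe-storm forecasting.
    u is the west-east wind component, v the south-north component."""
    return math.hypot(u_high - u_low, v_high - v_low)

# Southwesterly ~10 m/s near the surface, westerly 30 m/s aloft
shear = bulk_shear(7.1, 7.1, 30.0, 0.0)
```

Note that the direction change matters as much as the speed change: two levels with the same wind speed but opposite directions still produce large bulk shear, which is part of why models need the wind vector, not just its speed, at many heights.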
Longer lead times ahead

NOAA has reported that MPAS provided realistic, fine-scale detail for Hurricane Sandy in 2012 and for 2013 springtime weather over the continental U.S., including the tornado that struck Moore, Okla. "MPAS also did reasonably well in providing five-day forecasts during a NOAA hazardous weather experiment last May," Skamarock said.

MPAS's 48-hour forecast for July 8, 2015, accurately predicted heavy rain for northern Texas and much of Oklahoma. Abilene wound up getting 8.25 inches, its wettest day since record keeping started in 1885. (©UCAR. This image is freely available for media & nonprofit use.)

In spring 2015, MPAS also won high marks for the accuracy of its three-day forecasts that helped guide research aircraft missions during a major field campaign to study nighttime thunderstorms on the Great Plains, called PECAN (Plains Elevated Convection at Night). NCAR Project Scientist Stan Trier, who worked as a forecaster on the PECAN campaign, said the MPAS forecasts were usually the first he would look at for planning purposes because MPAS was the only model that had the resolution to indicate possible storm structures beyond 48 hours. Then, as the time to make decisions on overnight field operations approached, he would update these earlier forecasts with new information produced by shorter-range, high-resolution models. "There were multiple situations where MPAS did quite well at these longer time ranges," Trier said. "Forecasts with two to three days of lead time are less accurate than one-day forecasts. This is expected. But overall, I would definitely say that MPAS was a useful part of the PECAN forecasting process." Most recently, MPAS has been tested in Antarctica and during the 2015 tropical cyclone season in the Atlantic and Pacific oceans. 
It also is being used as a component within the NCAR-based Community Earth System Model for long-term climate prediction, and has been tested at the Taiwan Typhoon and Flood Research Institute to predict severe weather events in that country. Even if MPAS emerges as the National Weather Service’s next-generation weather model, there will still be a role for the Weather Research and Forecasting platform hosted by NCAR. WRF, an open-source model used worldwide, is especially well suited to local and regional weather prediction in the mid-latitudes. And, while MPAS's variable-mesh design conserves computing requirements, as a global model it still uses more computing resources than WRF. "With MPAS, we want to predict severe thunderstorms with a mesh spacing of a few kilometers," Skamarock said. "That takes a lot of computer power."

Writer/contact: Jeff Smith, Science Writer and Public Information Officer

NCAR to develop wildland fire prediction system for Colorado

BOULDER – The state of Colorado is turning to the National Center for Atmospheric Research (NCAR) to establish the country’s most advanced system for predicting wildland fire behavior, including where and how quickly the blazes will spread. Developed in response to legislation that Gov. John Hickenlooper signed in May, the new agreement finalized this month creates an innovative research and development partnership to generate real-time, 18-hour forecasts of active wildfires in the state. NCAR will work with the Colorado Division of Fire Prevention and Control’s new Center of Excellence for Advanced Technology Aerial Firefighting in Rifle to design and develop the system and begin testing it as early as next year.

“This technology represents the next generation of wildland fire prediction,” said NCAR science manager William Mahoney, who worked with state officials on developing the new agreement. “It will capture some of the critical feedbacks between large fires and the local weather, which often result in extreme fire behaviors that threaten lives and property. Colorado is using homegrown technology to lead the nation in wildland fire prediction.” The experimental forecast products will draw on powerful NCAR computer simulations and newly available satellite measurements of fires, obtained with a technique developed at the University of Maryland. They will also incorporate observations from Colorado’s Multi-Mission Aircraft. The Division of Fire Prevention and Control’s Center of Excellence is “excited to be working with NCAR to develop this stakeholder-driven technology,” said Center of Excellence Director Melissa Lineberger. She added that the technology will be particularly valuable to Colorado because it is being developed with stakeholder input and firefighters’ needs in mind.

NCAR scientist Janice Coen used the CAWFE modeling system to create this simulation of the 2013 Yarnell Hill fire in Arizona, which killed 19 firefighters.
(©UCAR. This image is freely available for media & nonprofit use.)

The system will provide unprecedented detail about interactions between weather and fire, which can create dangers for firefighters on the ground as well as for firefighting aircraft. It will build on a specialized computer model that was developed at NCAR with support from the National Science Foundation, NASA, and the Federal Emergency Management Agency. Once the system is fully developed and operational, it will be run by the Colorado Division of Fire Prevention and Control.

Tackling a major threat

Wildland fires are highly damaging in Colorado, taking the lives of firefighters and local residents, devastating large areas, and causing hundreds of millions of dollars in damage. Insurance claims from a single blaze, the 2012 Waldo Canyon Fire, totaled more than $450 million. To better protect Colorado, state Rep. Tracy Kraft-Tharp (D-Arvada) and state Sen. Ellen Roberts (R-Durango) sponsored legislation earlier this year to fund development of the forecasting system. “This is a revolutionary early-warning system that will better safeguard all of us for years to come,” Kraft-Tharp said. The lessons learned from the Colorado system are expected to yield benefits for fighting wildfires across the western United States.

Capturing fire weather

Despite the lives and economic costs at stake, the techniques currently available for anticipating fire behavior remain similar to those of past decades. Typically, firefighters infer how fast the edge of a fire will expand based on terrain, fuels, and a measurement or estimate of the winds. But this approach cannot capture changes associated with the interaction of fire and weather. To accurately forecast a wildland fire in detail, a computer model has to simulate the highly localized winds that drive the flames. 
Adding to the complexity, a major blaze alters local weather, creating winds within the fire that may be more than 10 times stronger than those outside. These internal winds can contribute to potentially deadly accelerations, increases in intensity, unexpected shifts in direction, or splits in which the flames go in multiple directions. This interplay between fire and weather is particularly pronounced in Colorado and other western states, where clouds produce strong outflows and winds can rush down mountainsides and vary from one valley to the next.

Wildfire is a major concern across Colorado and many other states in the western U.S. (Fire over Camp Pendleton, California, October 23, 2007. U.S. Marine Corps photo by Lance Cpl. Albert F. Hunt, via Wikimedia Commons.)

To tackle this problem, the Colorado forecasting system will use a breakthrough computer model developed by NCAR scientist Janice Coen, who has studied wildland fires for more than 20 years. NCAR’s CAWFE® modeling system (derived from Coupled Atmosphere-Wildland Fire Environment) combines weather prediction with fire behavior simulations to capture the highly complex interplay of fire and weather. By restarting the model every few hours with the latest satellite and aircraft observations of an active fire—a process known as cycling—Coen and her research partner, University of Maryland professor Wilfrid Schroeder, have shown that it is possible to accurately predict the course of a blaze over the next one to two days. They can keep refreshing the model, making it possible to simulate the entire lifetime of even a very long-lived fire, from ignition to extinction. “Even though fires are complex and rapidly changing and often described as unpredictable, much of a fire event can be foreseen by this more sophisticated model,” Coen said.

Writer: David Hosansky, Manager of Media Relations
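The cycling process described in this article can be sketched as a simple loop: every few hours, ingest the latest fire observations and restart a forecast run from them. The function names below are hypothetical stand-ins, not the CAWFE interface, with trivial placeholder bodies so the sketch runs.

```python
def fetch_latest_fire_perimeter(t):
    """Hypothetical stand-in for ingesting satellite/aircraft fire
    detections valid at forecast-cycle time t (hours)."""
    return {"time": t, "perimeter": f"obs@{t}h"}

def run_fire_weather_model(state, hours):
    """Hypothetical stand-in for one coupled fire-weather forecast run
    of `hours` hours, initialized from the observed fire state."""
    return {"init": state["time"], "valid_to": state["time"] + hours}

# Cycle: restart every 6 hours over a 48-hour period, each run
# looking 24 hours ahead from the newest observations.
cycle_interval_h = 6
forecast_length_h = 24
forecasts = []
for t in range(0, 48, cycle_interval_h):
    obs = fetch_latest_fire_perimeter(t)
    forecasts.append(run_fire_weather_model(obs, forecast_length_h))
```

The point of the loop is that each new run discards accumulated model drift and starts from observed reality, which is why cycling can track even a long-lived fire from ignition to extinction.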

