Promoting diversity in high-performance computing

May 2, 2017 | Justin Moore was supporting his family of four with a job at an auto parts store while juggling classes at Salish Kootenai College, a Native American college in Montana, when he heard about a computing internship in 2014 at the National Center for Atmospheric Research (NCAR) in Boulder, Colo.

The internship, which used a small, low-cost computer called Raspberry Pi to teach key concepts of high-performance computing, quickly paid off. Today, Moore works full time as an IT network specialist at Energy Keepers Inc., which manages the hydroelectric plant on the Flathead Indian Reservation in Montana, while he continues to chip away at his degree.

"I believe the skills I obtained in the internship can be directly attributed to my success in my field," Moore said. "It also gave me the chance to network with some of the brightest minds in the country."

Justin Moore turned a summer internship at NCAR into a full-time computer networking job at a hydroelectric plant on the Flathead Indian Reservation in Montana. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

Since 2014, NCAR has been using Raspberry Pi as part of the SIParCS (Summer Internships in Parallel Computational Science) program to teach "hot" computing skills to small groups of university students, including one or two who are underrepresented in the sciences. In March, in an effort to reach more students, NCAR pivoted to an "externship" model, bringing the Raspberry Pi training to Miami Dade College faculty, who can teach the skills to dozens of students at a time.
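The key concept the training teaches, splitting one big computation across many small processors and combining the results, can be sketched with Python's standard multiprocessing module. This is a toy illustration of the divide-compute-combine pattern, not the actual SIParCS course material; a real Pi cluster would more likely distribute the work across boards with MPI.

```python
from multiprocessing import Pool

def partial_sum(bounds):
    """Sum the integers in [lo, hi) -- one node's share of the work."""
    lo, hi = bounds
    return sum(range(lo, hi))

def parallel_sum(n, workers=4):
    """Split 0..n-1 into one chunk per worker, compute the chunks in
    parallel, then combine the partial results -- the same pattern a
    Raspberry Pi cluster applies across physical boards."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Same answer as the serial sum(range(1_000_000)), computed in four pieces.
    print(parallel_sum(1_000_000))
```

On an actual cluster the chunks would be shipped to separate boards over the network (for example with mpi4py) rather than to local processes, but the decomposition logic is identical.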
“Raspberry Pi is a perfect platform for high-performance computing education because the credit-card-sized motherboards can be linked together to mimic the parallel processing capabilities of a supercomputer and perform simplified geoscience applications,” said Rich Loft, director of technology development in NCAR's Computational and Information Systems Laboratory.

A Raspberry Pi, which costs $35 or less, can run a full Linux operating system — the same system that runs nearly all supercomputers, more than 90 percent of smartphones, and many other electronic devices.

A Raspberry Pi kit used during the NCAR training at Miami Dade College. The Raspberry Pi circuit board is in the upper right-hand corner, connected to a blue cable. Components plug into a breadboard in the center of the picture. (Photo courtesy Rich Loft, NCAR.)

"It's inexpensive. It levels the playing field," said Loft, who led the training at Miami Dade College. "In my view it busts the digital divide."

Loft noted that the previous internship approach wasn't reaching as many students as NCAR had hoped, partly because many students found it too difficult to relocate to Boulder during the summer. Miami Dade proved an ideal testbed for an externship model, since it's one of the country's largest universities, with eight campuses and more than 90,000 students, 70 percent of whom are Hispanic and 17 percent of whom are African American.

"This approach has scalability," Loft said, shortly after returning from the intensive two-day faculty workshop.
"You can't scale up a program training one student at a time, even though it's very rewarding."

The NCAR directorate, which supported the Miami Dade training through a diversity grant, hopes that an expanded program will yield even greater outcomes.

A legacy of success

The Raspberry Pi internship approach has already yielded additional success stories, with students going on to graduate school and receiving prestigious scholarships.

Lauren Patterson, for example, was a student at Hampton University in Virginia when she learned Raspberry Pi as a SIParCS intern at NCAR, also in 2014. "I loved that I was able to work hands-on and assemble the Raspberry Pi cluster myself," Patterson said.

Lauren Patterson has received an Apple scholarship and will start a job at Google after completing her summer internship on Raspberry Pi at NCAR. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

She said her experience led to an Apple internship under its scholars program, a $25,000 scholarship, and a software engineering job at Google starting next fall in New York City. Apple scholars participate in a 12-week internship at Apple headquarters in California, receive ongoing coaching and guidance, and serve as Apple ambassadors on their campuses.

Gaston Seneza, a senior at Philander Smith College in Arkansas, said that before NCAR's SIParCS 2015 internship he had no practical knowledge of computers. He learned about Linux, sensors, programming, cloud storage, and scientific research, and now has a passion for computer science. "Raspberry Pi was a game-changer for me," he said.

Gaston Seneza, who is from Rwanda, also won an Apple scholarship after his summer internship at NCAR. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

The Rwandan native was also named an Apple scholar, and aspires to go into the field of artificial intelligence.
"My dream is to see a world where intelligent machines work for us."

Said Loft: "We're trying to get these kids on the high-tech career on-ramp. You put machine learning or experience with parallel computing on your resume and you can get hired by Apple, Google, or Amazon – or get into graduate school. These are hot skills."

Machine learning is a type of artificial intelligence in which a computer program can change, or "learn," as it encounters new data.

Moore, Patterson, and Seneza all praised the mentoring by Loft, an NCAR senior scientist, and Raghu Raj Prassana Kumar, an NCAR project scientist who has worked with the Raspberry Pi training project since its beginning.

"It's a lot of fun, and it's very rewarding to help these young people learn," Kumar said.

Kumar is also known at NCAR for creative uses of Raspberry Pi, including connecting 12 of them to calculate pi to a million digits on Pi Day in 2015. (It took longer than a day and one Raspberry Pi burned out from exertion, but the effort was successful.)

Connecting learning to everyday life

At the recent Miami Dade workshop, Kumar and Loft, along with University of Wyoming Professor Suresh Muknahallipatna and three of his students, taught 20 Miami Dade faculty members how to set up and program simple projects with a Raspberry Pi. One group used sensors to measure things like temperature, pressure, and humidity, while another created a word frequency histogram from the complete works of William Shakespeare using a Raspberry Pi Hadoop cluster.

Ana Guzman (far right), a Miami Dade College associate professor of electrical engineering, gets Raspberry Pi tips from Cena Miller, a University of Wyoming student. A group of Miami Dade faculty members was trained recently on using the low-cost computers for hands-on teaching by a team that included NCAR computer scientists and University of Wyoming students. (Photo courtesy Rich Loft, NCAR.)
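The Shakespeare exercise is the classic "word count" that frameworks like Hadoop parallelize across nodes. Stripped down to a single machine, the core computation fits in a few lines of Python. This is a sketch of the idea, not the workshop's actual materials.

```python
from collections import Counter
import re

def word_frequencies(text, top=5):
    """Tokenize into lowercase words, then tally occurrences -- the map
    (tokenize) and reduce (count) steps a Hadoop word count runs at scale."""
    words = re.findall(r"[a-z']+", text.lower())
    return Counter(words).most_common(top)

sample = "To be, or not to be, that is the question."
print(word_frequencies(sample, top=3))  # [('to', 2), ('be', 2), ('or', 1)]
```

On the Pi cluster, each node would count its own slice of the text and the per-node tallies would be merged, but the counting logic is the same.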
David Freer, a Miami Dade computer science professor, said he and his colleagues thought the workshop was terrific. "We worked with flame sensors that sent messages to users on their cell phones, along with other cool projects," he said.

Djuradj Babich, director of Miami Dade's School of Engineering and Technology, said he hopes to "ride the excitement wave" from the training and develop an ongoing relationship with NCAR. Loft said NCAR also hopes to reach out to additional universities.

Qiong Cheng, an assistant professor at Miami Dade, has since set up a Raspberry Pi in her office, complete with a motion detector. She said she will use the Raspberry Pi platform in her classes this fall, which are part of a new bachelor's program in data analytics. She likes the fact that Raspberry Pi, combined with sensors, is an inexpensive way to measure data in the real world, and thus connect learning to everyday life. "Students are more interested in that," she said, adding that Raspberry Pi supports "our mission to reach underrepresented students — to motivate them, to inspire them, and to provide them with a hands-on learning experience."

That's the kind of talk that excites Loft. "We want to continue to collaborate to drive this home. Which means that Miami Dade is using this in their curriculum as the workhorse in their computer lab for students," he said. "That's what's going to make me very happy."

Writer/Contact: Jeff Smith, Science Writer and Public Information Officer

NWSC benefits Wyoming with jobs and education

UCAR President Antonio J. Busalacchi co-wrote this perspective about the NCAR-Wyoming Supercomputing Center with Randy Bruns (CEO of the Cheyenne-Laramie County Corporation for Economic Development) and William A. Gern (vice president for research and economic development at the University of Wyoming). The Cheyenne Tribune Eagle on March 28 published it as a guest column (subscription required).

March 27, 2017 | With the recent news that the NCAR-Wyoming Supercomputing Center (NWSC) has acquired a new supercomputer that is three times faster than its predecessor, this is a good time to consider how much the center has contributed to Wyoming.

The NWSC opened its doors in Cheyenne in October 2012 as one of the premier facilities for science in the United States. Operated by the National Center for Atmospheric Research (NCAR), its key goals included accelerating scientific discovery nationwide, powering economic development in the Cheyenne area, expanding research and computing expertise at the University of Wyoming, and improving technology education across the state.

The new Cheyenne supercomputer at the NCAR-Wyoming Supercomputing Center. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

We're pleased to report major progress on all these fronts.

The incredible power of the center's supercomputers enables scientists to glean new insights about our planet in ways that support its mission of saving lives, increasing U.S. economic competitiveness, and strengthening national security. One of the more exciting developments, for example, is how researchers at NCAR and partners from research universities, federal labs, and the private sector are using it to improve the prediction of weather patterns months in advance.
That's the kind of intelligence that is vital to farmers, energy producers, shipping companies, and other planners in nearly every economic sector.

In addition to its role in national research, the NWSC is producing at least three significant benefits here in Wyoming.

First, it has emerged as a catalyst for economic development projects. After the NWSC opened, Microsoft committed to a significant data center complex next door, and EchoStar and Green House Data expanded their operations in Cheyenne. A small network of IT support companies has sprung up around them. The result: hundreds of new, high-paying jobs that have helped to insulate the region from the ups and downs of the energy industry.

Second, the NWSC has helped propel UW into the upper ranks of research universities. The university has added cutting-edge science and technology classes and attracted top-flight professors from across the country. UW professors and students have priority access to the supercomputer, and they are conducting landmark research into such important topics as traditional and renewable energy, earthquakes, and wildfires.

Third, it has improved technology education around the state. UW and NCAR have trained close to 100 Wyoming high school teachers to create lessons based on inexpensive miniature computers. Their students use them to write innovative software and run science experiments. Building on this success, UW is now embarking on a statewide initiative to help teachers in every high school get certified in computer science. Related efforts are also reaching adults, such as mid-career workers who need IT skills to remain competitive in the ever-changing job market.

Well before the state's recent economic downturn, Wyoming began using technology to make inroads toward diversifying its economy, aided by the presence of the NWSC.
Thanks to access to world-class computing resources and increasingly sophisticated training, Wyoming is expanding technological opportunities for our youth within the state, starting with thousands of schoolchildren as young as first graders who have gotten their initial taste of supercomputing by touring the NWSC.

The benefits have gone both ways. Designed for high-performance computing from the ground up, the NWSC provides NCAR with a reliable and energy-efficient home for systems that include high-speed data transfer, visualization, and storage. Scientists at NCAR and more than 100 university partners across the country have engaged in research that was never before possible, such as generating high-resolution simulations of the Sun that will help society better predict "space weather" — the powerful solar storms that periodically threaten orbiting satellites, global communications, and even the nation's electrical grid.

When the NWSC opened, its flagship Yellowstone supercomputer ranked among the fastest in the world, capable of performing about 1.5 quadrillion calculations per second. This incredible computing power helped scientists across the country answer key questions about energy production and improve predictions of tornadoes, droughts, floods, and other natural hazards.

Now the new NWSC supercomputer, named Cheyenne, more than triples that supercomputing capability. Ranked as the 20th fastest supercomputer in the world—and the fastest in the Mountain West—it will enable researchers to further expand the frontier of scientific knowledge.

The NWSC was born from an innovative public-private partnership that included the state of Wyoming; the University of Wyoming; Cheyenne LEADS; Black Hills Energy; NCAR; the National Science Foundation, which sponsors NCAR; and the University Corporation for Atmospheric Research, which manages NCAR on behalf of the National Science Foundation.
In less than five years of operation, it has already demonstrated the wisdom of that partnership.

We are proud that the NWSC has spurred economic and educational benefits throughout Wyoming while accelerating research across the United States. The NWSC has established itself as one of the leading supercomputing centers in the world, and we look forward to many more years of exciting research, education, and economic opportunity for the state of Wyoming and the nation.

Turbocharging science

CHEYENNE, Wyoming — The National Center for Atmospheric Research (NCAR) this month is launching operations of one of the world's most powerful and energy-efficient supercomputers, providing the nation with a major new tool to advance understanding of the atmospheric and related Earth system sciences.

Named "Cheyenne," the 5.34-petaflop system is capable of more than three times the amount of scientific computing performed by the previous NCAR supercomputer, Yellowstone. It is also three times more energy efficient.

Scientists across the country will use Cheyenne to study phenomena ranging from wildfires and seismic activity to the gusts that generate power at wind farms. Their findings will lay the groundwork for better protecting society from natural disasters, lead to more detailed projections of seasonal and longer-term weather and climate variability and change, and improve the weather and water forecasts needed by economic sectors from agriculture and energy to transportation and tourism.

"Cheyenne will help us advance the knowledge needed for saving lives, protecting property, and enabling U.S. businesses to better compete in the global marketplace," said Antonio J. Busalacchi, president of the University Corporation for Atmospheric Research. "This system is turbocharging our science."

UCAR manages NCAR on behalf of the National Science Foundation (NSF).

Cheyenne currently ranks as the 20th fastest supercomputer in the world and the fastest in the Mountain West, although such rankings change as new and more powerful machines begin operations. It is funded by NSF as well as by the state of Wyoming through an appropriation to the University of Wyoming.

Cheyenne is housed in the NCAR-Wyoming Supercomputing Center (NWSC), one of the nation's premier supercomputing facilities for research.
Since the NWSC opened in 2012, more than 2,200 scientists from more than 300 universities and federal labs have used its resources.

"Through our work at the NWSC, we have a better understanding of such important processes as surface and subsurface hydrology, physics of flow in reservoir rock, and weather modification and precipitation stimulation," said William Gern, vice president of research and economic development at the University of Wyoming. "Importantly, we are also introducing Wyoming's school-age students to the significance and power of computing."

The NWSC is located in Cheyenne, and the name of the new system was chosen to honor the support the center has received from the people of that city. The name also commemorates the upcoming 150th anniversary of the city, which was founded in 1867 and named for the American Indian Cheyenne Nation.

Contour lines and isosurfaces provide valuable information about turbulence and aerodynamic drag in this visualization of air flow through the blades of a wind turbine, the product of a simulation on the NCAR-Wyoming Supercomputing Center's Yellowstone system. (Image courtesy Dimitri Mavriplis, University of Wyoming.)

Increased power, greater efficiency

Cheyenne was built by Silicon Graphics International, or SGI (now part of Hewlett Packard Enterprise Co.), with DataDirect Networks (DDN) providing centralized file system and data storage components. Cheyenne is capable of 5.34 quadrillion calculations per second (5.34 petaflops, or floating-point operations per second).

The new system has a peak computation rate of more than 3 billion calculations per second for every watt of energy consumed. That is three times more energy efficient than the Yellowstone supercomputer, which is also highly efficient.

The data storage system for Cheyenne provides an initial capacity of 20 petabytes, expandable to 40 petabytes with the addition of extra drives.
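The quoted figures allow a quick back-of-envelope check of both the "more than triple" claim and the machine's implied power budget, using Yellowstone's roughly 1.5-petaflop peak cited earlier. This is illustrative arithmetic based only on the numbers reported here, not official facility specifications.

```python
cheyenne_flops = 5.34e15    # 5.34 petaflops peak
yellowstone_flops = 1.5e15  # Yellowstone: ~1.5 quadrillion calculations/second
flops_per_watt = 3e9        # "3 billion calculations per second for every watt"

# Speedup over the previous system: ~3.6x, consistent with "more than triple."
speedup = cheyenne_flops / yellowstone_flops

# Peak rate divided by efficiency gives the implied power draw at full tilt.
power_megawatts = cheyenne_flops / flops_per_watt / 1e6

print(f"Speedup over Yellowstone: {speedup:.1f}x")        # 3.6x
print(f"Implied peak power draw: {power_megawatts:.2f} MW")  # 1.78 MW
```

Because the 3-billion-per-watt figure is a lower bound ("more than"), the real draw at peak would be somewhat below 1.78 megawatts.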
The new DDN system also transfers data at the rate of 220 gigabytes per second, more than twice as fast as the previous file system's rate of 90 gigabytes per second.

Cheyenne is the latest in a long and successful history of supercomputers supported by NSF and NCAR to advance the atmospheric and related sciences.

"We're excited to provide the research community with more supercomputing power," said Anke Kamrath, interim director of NCAR's Computational and Information Systems Laboratory, which oversees operations at the NWSC. "Scientists have access to increasingly large amounts of data about our planet. The enhanced capabilities of the NWSC will enable them to tackle problems that used to be out of reach and obtain results at far greater speeds than ever."

More detailed predictions

High-performance computers such as Cheyenne allow researchers to run increasingly detailed models that simulate complex events and predict how they might unfold in the future. With more supercomputing power, scientists can capture additional processes, run their models at higher resolution, and conduct an ensemble of modeling runs that provide a fuller picture of the same time period.

"Providing next-generation supercomputing is vital to better understanding the Earth system that affects us all," said NCAR Director James W. Hurrell. "We're delighted that this powerful resource is now available to the nation's scientists, and we're looking forward to new discoveries in climate, weather, space weather, renewable energy, and other critical areas of research."

Some of the initial projects on Cheyenne include:

Long-range, seasonal to decadal forecasting: Several studies led by George Mason University, the University of Miami, and NCAR aim to improve prediction of weather patterns months to years in advance. Researchers will use Cheyenne's capabilities to generate more comprehensive simulations of finer-scale processes in the ocean, atmosphere, and sea ice.
This research will help scientists refine computer models for improved long-term predictions, including how year-to-year changes in Arctic sea ice extent may affect the likelihood of extreme weather events thousands of miles away.

Wind energy: Projecting electricity output at a wind farm is extraordinarily challenging, as it involves predicting variable gusts and complex wind eddies at the height of turbines, which are hundreds of feet above the sensors used for weather forecasting. University of Wyoming researchers will use Cheyenne to simulate wind conditions on different scales, from across the continent down to the tiny space near a wind turbine blade, as well as the vibrations within an individual turbine itself. In addition, an NCAR-led project will create high-resolution, 3-D simulations of vertical and horizontal drafts to provide more information about winds over complex terrain. This type of research is critical as utilities seek to make wind farms as efficient as possible.

Space weather: Scientists are working to better understand solar disturbances that buffet Earth's atmosphere and threaten the operation of satellites, communications, and power grids. New projects led by the University of Delaware and NCAR are using Cheyenne to gain more insight into how solar activity leads to damaging geomagnetic storms. The scientists plan to develop detailed simulations of the emergence of the magnetic field from the subsurface of the Sun into its atmosphere, as well as to gain a three-dimensional view of the plasma turbulence and magnetic reconnection in space that lead to plasma heating.

Extreme weather: One of the leading questions about climate change is how it could affect the frequency and severity of major storms and other types of severe weather. An NCAR-led project will explore how climate interacts with the land surface and hydrology over the United States, and how extreme weather events can be expected to change in the future.
It will use advanced modeling approaches at high resolution (down to just a few miles) in ways that can help scientists configure future climate models to better simulate extreme events.

Climate engineering: To counter the effects of heat-trapping greenhouse gases, some experts have proposed artificially cooling the planet by injecting sulfates into the stratosphere, which would mimic the effects of a major volcanic eruption. But if society ever tried to engage in such climate engineering, or geoengineering, the results could alter the world's climate in unintended ways. An NCAR-led project is using Cheyenne's computing power to run an ensemble of climate engineering simulations to show how hypothetical sulfate injections could affect regional temperatures and precipitation.

Smoke and global climate: A study led by the University of Wyoming will look into emissions from wildfires and how they affect stratocumulus clouds over the southeastern Atlantic Ocean. This research is needed for a better understanding of the global climate system, as stratocumulus clouds, which cover 23 percent of Earth's surface, play a key role in reflecting sunlight back into space. The work will help reveal the extent to which particles emitted during biomass burning influence cloud processes in ways that affect global temperatures.

Raising the visibility of women in IT

October 17, 2016 | To provide a boost to women working in information technology, the University Corporation for Atmospheric Research (UCAR) is helping to bring together a team of women who will help build and operate a high-capacity network at a major supercomputing conference.

The Women in IT Networking at SC program, or WINS, is a collaboration among UCAR, the U.S. Department of Energy's Energy Sciences Network (ESnet), and the Pennsylvania-based Keystone Initiative for Network Based Education and Research. Following a national competition, WINS selected seven women who work in IT departments at universities and national labs around the country to help build and operate SCinet, the very high capacity network at the SC16 international supercomputing conference in Salt Lake City next month.

For the second year in a row, UCAR will help bring together a team of women to provide technical support at SC, a leading supercomputing conference. UCAR's Marla Meehl (left) and ESnet's Jason Zuraski (second from left) are pictured at last year's conference, meeting with WINS team members. (Photo by Marijke Unger, NCAR.)

"This provides the women with great exposure to the latest in technology, working with some of the top engineers who are out there," said Marla Meehl, manager of the Network Engineering and Telecommunications Section for UCAR and NCAR, the National Center for Atmospheric Research. "It's an opportunity to learn and have exposure to things that they don't work with every day."

Women are increasingly underrepresented in technological fields. A report last year by the American Association of University Women found that the share of U.S. women working in the computing and mathematical professions dropped from 35% in 1990 to just 26% in 2013.

Meehl worked with several other IT experts to launch WINS last year and expand the number of women among the volunteers who design and deliver SCinet.
Planning begins more than a year in advance and culminates in a high-intensity, around-the-clock installation in the days leading up to the conference.

"I'm grateful to be one of the WINS grant awardees and participate in SCinet," said Angie Asmus, IT security analyst at Colorado State University. "Because of WINS, I will be able to be mentored by and work with some of the brightest minds in IT. This is an amazing opportunity for me to gain hands-on experience and build important relationships that will be valuable to me as I progress in my career."

Other participants are Denise Grayson, Sandia National Laboratories; Julie Locke, Los Alamos National Laboratory; Kali McLennan, University of Oklahoma; Amber Rasche, North Dakota State University; Jessica Shaffer, Georgia Institute of Technology; Julia Staats, CENIC; and, with separate funding, Indira Kassymkhanova of Lawrence Berkeley National Laboratory.

The WINS participants were chosen from 28 eligible applicants — a big jump from the 19 applications received the previous year. The selection team weighed a variety of factors, looking for applicants who had experience in networking; whose skill set matched their area of interest; whose participation was supported by their institution; and who added to the group's diversity, whether geographically, institutionally, or otherwise.

The WINS awardee selection team, led by Wendy Huntoon of the Keystone Initiative, included Susan Lucas from ESnet, Linda Winkler from Argonne National Laboratory, Dave Jent from Indiana University, and Florence Hudson from Internet2.

Meehl was able to secure funding from the National Science Foundation for participants from research and education organizations. The Department of Energy is supporting the women from its national laboratories.

"Although there are more jobs in IT, there's a massive shortage of workers, especially in the number of women in the field," Meehl said.
“It was really fulfilling this year to see a huge jump in the number of really qualified applicants. It was very hard to choose.”

Writer/editor: David Hosansky, Manager of Media Relations

Wrangling observations into models

April 4, 2016 | If scientists could directly measure the properties of all the water throughout the world's oceans, they wouldn't need help from NCAR scientist Alicia Karspeck. But since large expanses of the oceans are beyond the reach of observing instruments, Karspeck's work is critical for those who want estimates of temperature, salinity, and other properties of water around the globe. Scientists need these estimates to better understand the world's climate system and how it is changing.

"It's painstaking work, but my hope is it will lead to major advances in climate modeling and long-term prediction," Karspeck said.

She is one of a dozen or so researchers at NCAR who spend their days on data assimilation, a field that is becoming increasingly important for the geosciences and other areas of research. Broadly speaking, data assimilation is any method of enabling computer models to utilize relevant observations. Part science and part art, it involves figuring out how to get available measurements — which may be sparse, tightly clustered, or irregularly scattered — into models that tend to simplify the world by breaking it into gridded boxes. Commonly used in weather forecasting, the technique can improve simulations and help scientists predict future events with more confidence. It can also identify deficiencies in both models and observations.

As models have become more powerful and observations more numerous, the technique has become so critical that NCAR last year launched a Data Assimilation Program to better leverage expertise across its seven labs.

"Activities in data assimilation have grown well beyond traditional applications in numerical weather prediction for the atmosphere and now span across NCAR's laboratories," said NCAR Director Jim Hurrell. "The Data Assimilation Program is designed to enhance data assimilation research at NCAR, while at the same time serving the broader U.S.
research community."

Scientists are using data assimilation techniques to input a range of North American observations into experimental, high-resolution U.S. forecasts. These real-time ensemble forecasts are publicly available while they're being tested. (©UCAR. This image is freely available for media & nonprofit use.)

Improving prediction

Created by the NCAR Directorate, the Data Assimilation Program is designed to advance prediction of events ranging from severe weather and floods to air pollution outbreaks and peaks in the solar cycle. One of its goals is to encourage collaborations among data assimilation experts at NCAR and the larger research community. For example, scientists in several labs are joining forces to apply data assimilation methods to satellite measurements to create a database of global winds and other atmospheric properties. This database will then be used for a broad range of climate and weather studies.

The program also provides funding to hire postdocs at NCAR to focus on data assimilation projects, as well as for a software engineer to support such activities.

"By bringing money to the table, we're building up data assimilation capability across NCAR," said NCAR Senior Scientist Chris Snyder, who coordinates the Data Assimilation Program. "This is critical because data assimilation provides a framework to scientists throughout the atmospheric and related sciences who need to assess where the uncertainties are and how a given observation can help."

NCAR Senior Scientist Jeff Anderson, who oversees the Data Assimilation Research Testbed (DART), says that data assimilation has become central for the geosciences. DART is a software environment that helps researchers develop data assimilation methods and apply them to observations with various computer models.
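At its core, every assimilation scheme blends a model forecast with an observation, weighting each by its uncertainty. A minimal scalar Kalman-style update shows the idea; this is a textbook illustration, not DART code, and the temperature numbers are invented for the example.

```python
def assimilate(forecast, forecast_var, obs, obs_var):
    """One scalar Kalman update: blend a model forecast with an
    observation, weighted by their error variances."""
    gain = forecast_var / (forecast_var + obs_var)  # more weight to obs when the model is uncertain
    analysis = forecast + gain * (obs - forecast)
    analysis_var = (1.0 - gain) * forecast_var      # the analysis is more certain than either input
    return analysis, analysis_var

# Model forecast: 20.0 degC with variance 4; a sensor reads 22.0 degC with variance 1.
analysis, var = assimilate(20.0, 4.0, 22.0, 1.0)
print(round(analysis, 2), round(var, 2))  # 21.6 0.8 -- the analysis leans toward the more certain sensor
```

Systems like DART apply this weighting with ensembles in state spaces of millions of variables, cycling forecast and update repeatedly, but the underlying logic is the same.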
"I think the Data Assimilation Program is a huge win for NCAR and the entire atmospheric sciences community," Anderson said. "The scientific method is about taking observations of the world and making sense of them, and data assimilation is fundamental for applying the scientific method to the geosciences as well as to other research areas."

From oceans to Sun

Here are examples of how data assimilation is advancing our understanding of atmospheric and related processes, from ocean depths to the Sun's interior:

Oceans. Karspeck is using data assimilation to estimate water properties and currents throughout the world's oceans. This is a computationally demanding task that requires feeding observations into the NCAR-based Community Earth System Model, simulating several days of ocean conditions on the Yellowstone supercomputer, and using those results to update the conditions in the model and run another simulation. The good news: the resulting simulations match well with historical records, indicating that the data assimilation approach is working. "My goal is to turn this into a viable system for researchers," Karspeck said.

Air quality. Atmospheric chemists at NCAR are using data assimilation of satellite observations to improve air quality models that currently draw on limited surface observations of pollutants. For example, assimilating satellite observations would show the effect of emissions from a wildfire in Montana on downwind air quality, such as in Chicago. "We've done a lot of work to speed up the processing time, and the results are promising," said NCAR scientist Helen Worden. "The model simulations after assimilating satellite carbon monoxide data are much closer to actual air quality conditions."

Weather forecasting. Data assimilation is helping scientists diagnose problems with weather models. For example, why do models consistently overpredict or underpredict temperatures near the surface?
Using data assimilation, NCAR scientist Josh Hacker discovered that models incorrectly simulate the transfer of heat from the ground into the atmosphere. “With data assimilation, you’re repeatedly confronting the model with observations, so you can very quickly see how things go wrong,” he said.

Solar cycle. Scientists believe the 11-year solar cycle is driven by mysterious processes deep below the Sun’s surface, such as the movements of cells of plasma between the Sun’s lower latitudes and poles. To understand the causes of the cycle and ultimately predict it, they are turning to data assimilation to augment observations of magnetic fields and plasma flow at the Sun’s surface and feed the resulting information into a computer model of subsurface processes. “We are matching surface conditions to the model, such as the pattern and speed of the plasma flows and evolving magnetic fields,” said NCAR scientist Mausumi Dikpati.

Capturing data. In addition to helping scientists improve models, the new Data Assimilation Program is also fostering discussions about observations. NCAR Senior Scientist Wen-Chau Lee and colleagues who are experts in gathering observations are conferring with computer modelers over how to process the data so the models can readily ingest it. One challenge, for example, is that radars may take observations every 150 meters, whereas the models often have a resolution of 1-3 kilometers. Inputting the radar observations into the models requires advanced quality control techniques, including coordinate transformation (converting observation coordinates to model coordinates) and data thinning (reducing the density of observations while retaining the basic information). “We are modifying our quality control procedures to make sure that the flow of data is smooth,” Lee said. “With data assimilation, the first word is ‘data’,” he added. 
“Without data, without observations, there is no assimilation.”

Writer/contact: David Hosansky, Manager of Media Relations

Funders: NCAR Directorate, National Science Foundation, and additional funding agencies for specific projects
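The data thinning Lee describes can be sketched in a few lines. This is a hypothetical illustration, not DART's actual preprocessing: it simply averages 150-meter radar gates into "superobs" that match a 1.5-kilometer model grid, reducing observation density while retaining the basic information.

```python
import numpy as np

def thin_observations(obs, obs_spacing_m=150.0, model_res_m=1500.0):
    """Reduce observation density to roughly match model resolution
    by averaging consecutive observations into model-grid-sized bins."""
    factor = int(model_res_m // obs_spacing_m)  # observations per model cell
    n_bins = len(obs) // factor
    trimmed = np.asarray(obs[:n_bins * factor], dtype=float)
    # Average each group of `factor` observations into one "superob"
    return trimmed.reshape(n_bins, factor).mean(axis=1)

# 100 radar gates at 150 m spacing -> 10 thinned values at 1.5 km spacing
radar_obs = np.linspace(0.0, 20.0, 100)
thinned = thin_observations(radar_obs)
print(len(thinned))  # 10
```

Real quality control also involves the coordinate transformation mentioned above and checks for bad or redundant data, which this sketch omits.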

NCAR announces powerful new supercomputer for scientific discovery

BOULDER—The National Center for Atmospheric Research (NCAR) announced today that it has selected its next supercomputer for advancing atmospheric and Earth science, following a competitive open procurement process. The new machine will help scientists lay the groundwork for improved predictions of a range of phenomena, from hour-by-hour risks associated with thunderstorm outbreaks to the timing of the 11-year solar cycle and its potential impacts on GPS and other sensitive technologies.

The new system, named Cheyenne, will be installed this year at the NCAR-Wyoming Supercomputing Center (NWSC) and become operational at the beginning of 2017.

Cheyenne will be built by Silicon Graphics International Corp. (SGI) in conjunction with centralized file system and data storage components provided by DataDirect Networks (DDN). The SGI high-performance computer will be a 5.34-petaflop system, meaning it can carry out 5.34 quadrillion calculations per second. It will be capable of more than three times the amount of scientific computing performed by Yellowstone, the current NCAR supercomputer.

Funded by the National Science Foundation and the state of Wyoming through an appropriation to the University of Wyoming, Cheyenne will be a critical tool for researchers across the country studying climate change, severe weather, geomagnetic storms, seismic activity, air quality, wildfires, and other important geoscience topics. Since the supercomputing facility in Wyoming opened its doors in 2012, more than 2,200 scientists from more than 300 universities and federal labs have used its resources.

Six clips of scientific visualizations created with the help of the Yellowstone supercomputer. For more details on the individual clips, and to see the full-length visualizations, click here.

“We’re excited to bring more supercomputing power to the scientific community,” said Anke Kamrath, director of operations and services at NCAR’s Computational and Information Systems Laboratory. 
“Whether it’s the threat of solar storms or a heightened risk in certain severe weather events, this new system will help lead to improved predictions and strengthen society’s resilience to potential disasters.”

“Researchers at the University of Wyoming will make great use of the new system as they continue their work into better understanding such areas as the surface and subsurface flows of water and other liquids, cloud processes, and the design of wind energy plants,” said William Gern, vice president of research and economic development at the University of Wyoming. “UW’s relationship with NCAR through the NWSC has greatly strengthened our scientific computing and data-centric research. It’s helping us introduce the next generation of scientists and engineers to these endeavors.”

The NWSC is located in Cheyenne, and the name of the new system was chosen to honor the support that it has received from the people of that city. It also commemorates the upcoming 150th anniversary of the city, which was founded in 1867 and named for the American Indian Cheyenne nation.

Increased power, greater efficiency

The new data storage system for Cheyenne will be integrated with NCAR’s existing GLADE file system. The DDN storage will provide an initial capacity of 20 petabytes, expandable to 40 petabytes with the addition of extra drives. This, combined with the current 16 petabytes of GLADE, will total 36 petabytes of high-speed storage. The new DDN system also will transfer data at the rate of 200 gigabytes per second, more than twice as fast as the current file system’s rate of 90 gigabytes per second.

The system will include powerful Intel Xeon processors, whose performance will be augmented through optimization work that has been done by NCAR and the University of Colorado Boulder. 
NCAR and the university performed this work through their participation in the Intel Parallel Computing Centers program.

Even with its increased power, Cheyenne will be three times more energy efficient (in floating-point operations per second, or flops, per watt) than Yellowstone, its predecessor, which is itself highly efficient. “The new system will have a peak computation rate of over 3 billion calculations per second for every watt of power consumed," said NCAR’s Irfan Elahi, project manager of Cheyenne and section manager for high-end supercomputing services.

Scientists used the Yellowstone supercomputer to develop this 3-D rendering of a major thunderstorm in July 2011 that caused flooding in Fourmile Canyon west of Boulder. The colors show conditions in the clouds, including ice particles (light blue), graupel (orange), snow (pink), rain (blue), and water (grey). (Image by David Gochis, NCAR. This image is freely available for media & nonprofit use.)

More detailed predictions

High-performance computers such as Cheyenne allow researchers to run increasingly detailed models that simulate complex processes and how they might unfold in the future. These predictions give resource managers and policy experts valuable information for planning ahead and mitigating risk. Some of the areas in which Cheyenne is expected to accelerate research include the following:

Streamflow. Year-ahead predictions of streamflows and associated reservoir levels at a greater level of detail will provide water managers, farmers, and other decision makers with vital information about likely water availability and the potential for drought or flood impacts.

Severe weather. 
By conducting multiple simultaneous runs (or ensembles) of high-resolution forecast models, scientists will lay the groundwork for more specific predictions of severe weather events, such as the probability that a cluster of intense thunderstorms with the risk of hail or flooding will strike a county at a particular hour.

Solar energy. Specialized models of solar irradiance and cloud cover will be run more frequently and at higher resolution, producing research that will help utilities predict how much energy will be generated by major solar arrays hours to days in advance.

Regional climate change. Scientists will conduct multiple simulations with detailed climate models, predicting how particular regions around the world will experience changing patterns of precipitation and temperature, along with potential impacts from sea level rise, streamflow, and runoff.

Decadal prediction. Ensembles of detailed climate models will also help scientists predict the likelihood of certain climate patterns over a 10-year period, such as the risk of drought for a certain region or changes in Arctic sea ice extent.

Air quality. Scientists will be able to simulate the movement and evolution of air pollutants in far more detail, thereby better understanding the potential health effects of particular types of emissions and working toward improved forecasts of air quality.

Subsurface flows. More accurate and detailed models will enable researchers to better simulate the subsurface flows of water, oil, and gas, leading to a greater understanding of these resources.

Solar storms. 
Innovative, three-dimensional models of the Sun will lay the groundwork for predictions of the timing and strength of the Sun’s 11-year cycle as well as for days-ahead forecasts of solar disturbances that can generate geomagnetic storms in Earth’s upper atmosphere.

"Supercomputing is vital to NCAR’s scientific research and applications, giving us a virtual laboratory in which we run experiments that would otherwise be impractical or impossible to do,” said NCAR Director James Hurrell. “Cheyenne will be a key component of the research infrastructure of the United States through its provision of supercomputing specifically tailored for the atmospheric, geospace, and related sciences. The capabilities of this new system will be central to the continued improvement of our ability to understand and predict changes in weather, climate, air quality, and space weather, as well as their impacts on people, ecosystems, and society.”

This series of images, based on a research project run on the Yellowstone supercomputer, shows order and chaos in the Sun's interior dynamo. Turbulent plasma motions (image a) generate a tangled web of magnetic field lines, with opposing "wreaths" of magnetism pointing east (red) or west (blue). Images b and c provide a better look at the magnetic wreaths. (Images by Kyle Augustson, NCAR. 
This image is freely available for media & nonprofit use.)

Cheyenne Quick Facts

Key features of the new Cheyenne supercomputer system:
- 5.34-petaflop SGI ICE XA Cluster with Intel “Broadwell” processors
- More than 4,000 compute nodes
- 20% of the compute nodes have 128 GB memory; the remaining ~80% have 64 GB memory
- 313 terabytes (TB) of total memory
- Mellanox EDR InfiniBand high-speed interconnect
- Partial 9D Enhanced Hypercube interconnect topology
- SUSE Linux Enterprise Server operating system
- Altair PBS Professional Workload Manager
- Intel Parallel Studio XE compiler suite
- SGI Management Center & SGI Development Suite
- Mellanox Unified Fabric Manager

The new Cheyenne supercomputer and the existing file system are complemented by a new centralized parallel file system and data storage components. Key features of the new data storage system:
- Four DDN SFA14KX systems
- 20 petabytes of usable file system space (can be expanded to 40 petabytes by adding drives)
- 200 GB per second aggregate I/O bandwidth
- 3,360 × 8-TB NL SAS drives
- 48 × 800-GB mixed-use SSD drives for metadata
- 24 × NSD (Network Shared Disk) servers
- Red Hat Enterprise Linux operating system
- IBM GPFS (General Parallel File System)

Update • January 16, 2017 | The amount of computing power available in the new Cheyenne system has been updated to more than three times that of the Yellowstone system, an increase from the estimate of 2.5 times made prior to installation and testing.
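As a rough cross-check of the figures quoted above, the 5.34-petaflop peak rate and the "over 3 billion calculations per second for every watt" efficiency together imply a power draw on the order of 1.8 megawatts. This is a back-of-the-envelope estimate; the article does not state Cheyenne's actual power budget.

```python
# Back-of-the-envelope check of the quoted Cheyenne figures.
# The implied power draw is derived here, not stated in the article.
peak_flops = 5.34e15        # 5.34 petaflops = 5.34 quadrillion calculations/s
flops_per_watt = 3.0e9      # "over 3 billion calculations per second" per watt
implied_power_mw = peak_flops / flops_per_watt / 1e6
print(f"Implied power draw: about {implied_power_mw:.2f} MW")  # ~1.78 MW
```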

Storm-proven forecasting gets yearlong trial

June 3, 2015 | Storm-studying scientists have made their next-generation forecasting system available online so the wider weather community can put it to the test. After using the real-time system during short-lived field research campaigns, developers at the National Center for Atmospheric Research (NCAR) are now ready to see how it performs year-round, and they're eager for user feedback. In April, NCAR scientists began running daily forecasts using the sophisticated system, which has proven its mettle by skillfully predicting the path of early summer storms as they roll across the country's midsection. The new project, which is funded to run through at least mid-June 2016, will allow scientists to see if the forecasts are as adept at predicting weather phenomena that more frequently occur at other times of the year.

"This type of system has never been run year round," said NCAR scientist Craig Schwartz, who co-leads the project. "We want to examine a wide range of weather phenomena, like winter storms, that are not typically studied with high-resolution models and see how the system performs."

NCAR's high-resolution ensemble forecasts show 10 possible scenarios for how much precipitation was expected to fall across the central Great Plains over a two-day period in early May. (©UCAR. This animation is freely available for media & nonprofit use.)

The high-resolution forecasting system was first developed to aid field scientists trying to get up-close views of severe weather events. Detailed information about where an individual storm is likely headed helps ensure that ground- and air-based observing equipment gets deployed to the best locations. "We found that the forecasts did a really good job of showing us where the greatest hazards were going to be and allowed us to put field crews in the right places," said NCAR scientist Glen Romine, who has been involved in running the forecasts for past field campaigns. 
Now Romine and Schwartz, along with NCAR's Kate Fossell and Ryan Sobash, are extending the forecasts beyond the field, across the country, and onto the Internet. By making their forecasts for the continental United States easily available online, they hope to get feedback from the meteorological community and hear from scientists who may want to use the data in their own research. The NCAR team is also supplying the forecasts to the National Severe Storms Laboratory and the Storm Prediction Center, both part of the National Oceanic and Atmospheric Administration. Scientists there are interested in seeing if the data can help them better track individual storms and issue more precise severe weather forecasts.

To add probability, move beyond a single forecast

To predict the path of an individual storm—instead of the general area where conditions are ripe for storm formation—scientists need a weather model that can run at a higher resolution than is commonly used. Because these detail-oriented models burn through so much computing power, they're typically used to create just a single, deterministic forecast. Deterministic forecasts describe just one possible future weather scenario—a single path that a storm might follow, for example. Because of this, the forecast is either right or wrong; there's no gray area in between.

Several years ago, NCAR scientists and their colleagues, working on storm-focused field campaigns, began using a different technique, known as ensemble forecasting, that allowed them to move away from black-or-white deterministic forecasts and instead create forecasts that incorporate the probability that a certain weather scenario will actually come to pass. 
Armed with the vast power of the new Yellowstone supercomputer, they were able to start producing high-resolution ensemble forecasts by running the same model—the advanced research version of NCAR's Weather Research and Forecasting Model (WRF-ARW)—multiple times for the same forecast period, using different initial estimates of atmospheric conditions to kick off each run.

"When most groups run a high-resolution forecast, they grab a single estimate of the state of the atmosphere, plop it down on their high-resolution forecast grid, and just run it," said Romine, who is co-leading the project with Schwartz. "When you initialize an ensemble forecast, you want a range of estimates of the state of the atmosphere that are all equally likely. What you end up with is a variety of different forecast solutions, despite the fact that the forecasts were started with just small initial differences."

Using an array of initial estimates is important because scientists can't measure exactly what conditions exist in every part of the atmosphere at any given moment. In the expanses between weather stations and other observational equipment, scientists must use models to help them make best guesses of the true conditions. Small variations in those estimates can result in big differences in the forecast outcome. The multiple forecasts generated from a range of initial estimates, known as the ensemble members, show a breadth of possible outcomes. This range of results gives scientists and forecasters the ability to determine how probable any particular weather event in any particular location might be. Where the member forecasts tend to agree, the probability is higher. Where they don't agree, the probability drops.

The real-time daily ensemble forecasts, now available online, have 10 member forecasts, each of which is initialized using estimates of the atmosphere generated by NCAR's Data Assimilation Research Testbed (DART) toolkit. 
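The ensemble idea Romine describes can be illustrated with a toy chaotic model: run the same model many times from slightly perturbed initial states, then read the probability of an event off as the fraction of members that agree. This is a sketch only; the real system runs WRF-ARW on a supercomputer, not a one-line map.

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_model(state, steps=50):
    """A chaotic toy 'forecast model' (the logistic map), standing in
    for a real weather model: tiny initial differences grow rapidly."""
    for _ in range(steps):
        state = 3.9 * state * (1.0 - state)
    return state

# 10 ensemble members: same model, slightly perturbed initial conditions
base_state = 0.5
members = [toy_model(base_state + rng.normal(0, 1e-4)) for _ in range(10)]

# Probability of the "event" (state > 0.5) = fraction of members agreeing
event_probability = sum(s > 0.5 for s in members) / len(members)
print(f"{event_probability:.0%} of members forecast the event")
```

Where most members land on the same side of the threshold, the forecast probability is high; where they scatter, it drops, exactly as described above.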
"Because these forecast methods are so new, a lot of research and testing is needed to understand how to put surface, radar, and satellite data properly into the models to start them off correctly so we get better forecasts," said Louis Wicker, a research meteorologist at the National Severe Storms Laboratory, where a number of different ensemble systems are being tested. "NCAR’s ensemble forecast system is one of the best storm-scale ensemble systems currently being tested."

Dive deeper: Learn more about NCAR's ensemble forecast details. Anyone interested in more information, providing feedback, or collaborating on the project can email the team.

Writer/contact: Laura Snider

Collaborating institutions: National Center for Atmospheric Research, NCAR Mesoscale & Microscale Meteorology Laboratory, NCAR Computational & Information Systems Laboratory, NOAA National Severe Storms Laboratory, NOAA/NWS Storm Prediction Center

Funders: National Science Foundation

NCAR enhances big data services for climate and weather researchers

BOULDER — The National Center for Atmospheric Research (NCAR) has recently implemented an enhanced data sharing service that gives scientists increased access to data as well as improved capabilities for collaborative research. In addition to data sharing, NCAR has significantly upgraded its centralized file service, known as the Globally Accessible Data Environment (GLADE). Managed by NCAR’s Computational and Information Systems Laboratory (CISL), both GLADE and the data sharing service are important upgrades for the high-performance computing (HPC) user community, allowing faster and better access to data and a more flexible virtual workspace.

The Globally Accessible Data Environment (GLADE) is the centralized file service located at the NCAR-Wyoming Supercomputing Center in Cheyenne. (Photo courtesy David Read, NCAR.)

The data sharing service leverages the capabilities of Globus Plus to increase customization options for storage as well as data sharing. Globus, a project of the Computation Institute (a partnership of The University of Chicago and Argonne National Laboratory), is a software service that has been described as a Dropbox for big data and is broadly used in the scientific community. “Plus” refers to a new feature that allows researchers to share data with colleagues outside of their home institutions, greatly improving the ease of collaborative work.

“Scientific collaborations are global endeavors, and researchers need to share data with colleagues around the world. As data sets have grown in size and number, the process of moving and managing access to them has become a significant challenge,” said Pam Gillman, manager of NCAR’s Data Analysis Services Group. “Globus Plus is a robust and user-friendly service that eases the workflow, and it allows users to be more productive by spending less time on the minutiae of data transfers.”

NCAR users have been accessing the Globus transfer service for many years. 
In addition to making data available to external colleagues, the upgrade now allows users of CISL's HPC environment to control which users or groups of users can access the data. With the sharing service, outside users need only a free Globus account, not a UCAR username/token, to access shared data. The Globus Plus service has a 1.5-petabyte capacity, and most users can take advantage of the Globus web interface to transfer data. Advanced users or service developers can leverage the Globus Plus features via a command-line interface.

CISL recently added 5 petabytes of high-performance storage to the GLADE environment, bringing the total to 16.4 petabytes. GLADE is based on the GPFS file system and provides over 90 GB/s of sustained bandwidth across HPC, analysis, and visualization resources. GLADE file spaces are intended as work areas for day-to-day tasks and are well suited for managing software projects, scripts, code, and data sets.

“We strive to meet the growing needs of our user community, which expand as the data sets grow and require greater and more efficient resources,” said Gillman. “These major upgrades are part of CISL’s ongoing commitment to giving users the tools and services they need to carry out cutting-edge computational research.”
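To put the quoted 90 GB/s of sustained GLADE bandwidth in perspective, here is a simple estimate of idealized transfer times. It ignores protocol and filesystem overhead, so real transfers would be somewhat slower.

```python
def transfer_time_seconds(dataset_bytes, bandwidth_bytes_per_s):
    """Idealized transfer time, ignoring protocol/filesystem overhead."""
    return dataset_bytes / bandwidth_bytes_per_s

one_pb = 1e15                 # 1 petabyte, in bytes
glade_bw = 90e9               # 90 GB/s sustained bandwidth
hours = transfer_time_seconds(one_pb, glade_bw) / 3600
print(f"Moving 1 PB at 90 GB/s takes about {hours:.1f} hours")  # ~3.1 hours
```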

NCAR & CU Join Intel Parallel Computing Centers Program

BOULDER — The National Center for Atmospheric Research (NCAR) and the University of Colorado Boulder (CU-Boulder) announced today that they will join the Intel Parallel Computing Centers program. Participants in the program will develop methods to increase the performance of applications that use advanced microprocessor technologies and will help train the next generation of scientists and engineers who will apply these new technologies to challenges of societal importance.

The NCAR/CU team will focus on weather and climate applications, including the NCAR-based Community Earth System Model (CESM), Weather Research and Forecasting model (WRF), and Model for Prediction Across Scales (MPAS), three of the most widely used applications in the field. The Indian Institute of Science in Bangalore, India, will also collaborate with the NCAR/CU-Boulder team on the project. This international public-private partnership is one of several Intel Parallel Computing Centers being established with recognized high-performance computing institutions and research groups. It is part of the research and academic work that Intel supports in the quest for increasing efficiency and optimization of parallel microprocessor computer architectures.

NCAR's advanced research version of the Weather Research and Forecasting model (WRF) was used last year to simulate air flow in and around 2012’s Hurricane Sandy. In this 3-D simulation of potential temperature, relatively cool air wraps around Sandy's core near the surface (purple and blue colors), while air parcels gain heat from moisture condensing into clouds and precipitation as they ascend through the storm’s core. (©UCAR. Image courtesy Mel Shapiro, NCAR. This image is freely available for media & nonprofit use.) Video animations are available of this simulation and several others here. 
To solve today's most challenging scientific problems, computer scientists must coordinate many processing elements, ensuring they work together efficiently on the same problem. Called parallel processing, this approach is essential to exploiting the power of the latest generation of Intel® processors.

“This partnership will accelerate the important work we are doing to prepare both our applications and our workforce to tackle the next computational challenges in weather and climate simulation,” said Thomas Bogdan, president of the University Corporation for Atmospheric Research, which manages NCAR on behalf of the National Science Foundation. “It helps us conduct valuable, in-depth research on Intel architecture, and it underscores the value of working with industry in preparing for future technologies and training bright young minds in these fields.”

The models that simulate climate and weather have an insatiable need for increased computing power, and as the physical models evolve, the computational algorithms must also keep pace in order to run efficiently at ever-increasing resolutions that provide enhanced detail. Importantly, preparing applications for the next generation of parallel processors and supercomputers will ensure a smooth transition when these systems become widely available.

The collaboration with NCAR and Intel provides an exciting opportunity for CU-Boulder graduate students to participate in research that will prepare climate and weather code for the next generation of Intel processors. Students will have the opportunity to collaborate with scientists on large-scale computational problems that have a real impact. 
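The coordination of many processing elements described above can be sketched with Python's standard library: a toy domain decomposition in which each worker computes a partial result over its own chunk, and the partial results are combined at the end. This illustrates the concept only; production weather models use MPI and distributed-memory parallelism, not Python processes.

```python
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Each worker handles one chunk of the domain independently."""
    lo, hi = bounds
    return sum(i * i for i in range(lo, hi))

def parallel_sum_of_squares(n, workers=4):
    # Decompose the domain into equal chunks, one per processing element
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    # Run the chunks on separate processes and combine the partial results
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    assert parallel_sum_of_squares(10_000) == sum(i * i for i in range(10_000))
    print("parallel result matches serial result")
```

The hard part in real models, hinted at by "work together efficiently," is that chunks are not independent: neighboring grid cells must exchange boundary data every time step, which this sketch avoids entirely.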
Thomas Hauser, CU-Boulder’s Director of Research Computing, said, “This project will give me the opportunity to have students in the High-Performance Scientific Computing class work on projects that are connected to this effort, with access to Intel’s newest architectures.” Research Computing at CU-Boulder is setting up a small cluster containing the Intel® Xeon Phi™ coprocessor, provided to CU-Boulder by Intel, to give students and researchers access to these new Intel architectures for porting and tuning their scientific applications.

Intel and Xeon Phi are registered trademarks of Intel Corporation in the United States and other countries.

NWSC named "Green" Data Center of the Year

BOULDER— The NCAR-Wyoming Supercomputing Center (NWSC) has been named the 2013 “‘Green’ Data Center of the Year” at the inaugural Datacenter Dynamics North American Awards.

The Visitor Center at the NWSC. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

The NWSC received top honors in a category that recognizes the reality of designing and operating data centers in the context of environmental scrutiny. The award was bestowed last week in a ceremony in San Francisco. It comes on the heels of the NWSC’s top honors for facility design in the Green Enterprise IT Awards from the Uptime Institute, which showcase cutting-edge data center projects that demonstrate energy and resource efficiency in new, operational data centers.

“We are gratified that our efforts to build and operate the most efficient and sustainable data center possible have been successful and that the NWSC is being recognized on its merits,” says Gary New, who manages the center for NCAR’s Computational and Information Systems Laboratory (CISL). “Nearly 10 years of planning and hard work went into making this success a reality.”

Located in Cheyenne, Wyoming, the facility must adapt to outside weather conditions that can change rapidly in the West. “The award recognizes the state-of-the-art design features our team built into the NWSC that enable it to respond quickly and efficiently to changing conditions both inside and outside the facility,” says Al Kellie, director of NCAR’s CISL. “In this way, we can minimize the environmental impact of our scientific computing operations."

Construction of the NWSC was made possible by the sponsorship of the National Science Foundation (NSF) and a unique collaborative partnership between local, state, and federal government entities and private industry to provide project funding and governance. 
Partners include the State of Wyoming, the University of Wyoming, Cheyenne LEADS, the Wyoming Business Council, and Cheyenne Light, Fuel & Power. The NWSC is operated by the National Center for Atmospheric Research (NCAR) on behalf of NSF, which sponsors NCAR, and the University Corporation for Atmospheric Research (UCAR), which manages it.

The NWSC, which is home to one of the most powerful supercomputers dedicated to Earth system science, also achieved Leadership in Energy and Environmental Design (LEED) Gold certification from the U.S. Green Building Council last year. LEED certification depends on a number of sustainability criteria, such as energy efficiency, water conservation, and the use of recycled or locally sourced construction materials.

“The NWSC was designed and built to the highest standards in sustainability and efficiency,” says UCAR president Thomas Bogdan. “It is deeply gratifying that this approach is receiving national and international recognition.”

NWSC design highlights

Sustainable materials: During construction, more than 70 percent of construction waste was diverted from landfills for recycling. The building itself is made with more than 510 tons of recycled concrete, 60 tons of recycled wood, and 26 tons of recycled metal.

Water: The ultra-efficient cooling tower configuration, as well as the use of native species for landscaping, enables water savings of up to 6 million gallons per year.

Infrastructure and space use: The center's efficient design means that its mechanical and electrical systems, as well as its office space, consume less than 10 percent of the facility's total power.

Heating: Waste heat from the supercomputer is captured and reused to heat the building and melt snow and ice on exterior walkways and loading docks.

Cooling: During design, project planners estimated that Wyoming’s cool, dry climate would allow natural cooling of the facility for 96 percent of the year. 
Early experience with the facility indicates that 98-99 percent is achievable.

Power: Renewable wind energy provides direct power to the facility, starting at 10 percent of supply with the ability to raise that percentage as conditions permit.

Flexibility and longevity: The design of the NWSC includes “future proofing” to anticipate adaptation to evolving technologies and the deployment of future supercomputing systems yet to be developed. The design is also highly modular, allowing critical power and cooling components to be provisioned only when needed. This flexibility helps minimize capital expenditures by providing only what is needed, when it is needed. Combining all of these factors, the NWSC not only minimizes its environmental footprint but also directs operating funds toward productive scientific work while reducing overhead expenses.

About the Datacenter Dynamics Awards

The Datacenter Dynamics Awards are the leading awards for the data center industry, recognizing innovation, leadership, and “out of the box” thinking. With 15 established award categories and active award programs in Europe, China, Japan, Spanish-speaking Latin America, and Brazil, the awards celebrate successful data center projects of all sizes and across all sectors, drawing many hundreds of worthy entries.

