Supercomputers

Raising the visibility of women in IT

October 17, 2016 | To provide a boost to women working in information technology, the University Corporation for Atmospheric Research (UCAR) is helping to bring together a team of women who will help build and operate a high-capacity network at a major supercomputing conference.

The Women in IT Networking at SC program, or WINS, is a collaboration among UCAR, the U.S. Department of Energy's Energy Sciences Network (ESnet), and the Pennsylvania-based Keystone Initiative for Network Based Education and Research. Following a national competition, WINS selected seven women who work in IT departments at universities and national labs around the country to help build and operate SCinet, the very high capacity network at the SC16 international supercomputing conference in Salt Lake City next month.

For the second year in a row, UCAR will help bring together a team of women to provide technical support at SC, a leading supercomputing conference. UCAR's Marla Meehl (left) and ESnet's Jason Zuraski (second from left) are pictured at last year's conference, meeting with WINS team members. (Photo by Marijke Unger, NCAR.)

"This provides the women with great exposure to the latest in technology, working with some of the top engineers who are out there," said Marla Meehl, manager of the Network Engineering and Telecommunications Section for UCAR and NCAR, the National Center for Atmospheric Research. "It's an opportunity to learn and have exposure to things that they don't work with every day."

Women are increasingly underrepresented in technological fields. A report last year by the American Association of University Women found that the share of women in the U.S. computing and mathematical professions dropped from 35% in 1990 to just 26% in 2013.

Meehl worked with several other IT experts to launch WINS last year and expand the number of women among the volunteers who design and deliver SCinet. Planning begins more than a year in advance and culminates in a high-intensity, around-the-clock installation in the days leading up to the conference.

"I'm grateful to be one of the WINS grant awardees and participate in SCinet," said Angie Asmus, IT security analyst at Colorado State University. "Because of WINS, I will be able to be mentored by and work with some of the brightest minds in IT. This is an amazing opportunity for me to gain hands-on experience and build important relationships that will be valuable to me as I progress in my career."

Other participants are Denise Grayson, Sandia National Laboratories; Julie Locke, Los Alamos National Laboratory; Kali McLennan, University of Oklahoma; Amber Rasche, North Dakota State University; Jessica Shaffer, Georgia Institute of Technology; Julia Staats, CENIC; and, with separate funding, Indira Kassymkhanova of Lawrence Berkeley National Laboratory.

The WINS participants were chosen from 28 eligible applicants, a big jump from the 19 applications received the previous year.
The selection team weighed a variety of factors, looking for applicants who had experience in networking; whose skill set matched their area of interest; whose participation was supported by their institution; and who added to the group's diversity, whether geographically, institutionally, or otherwise.

The WINS awardee selection team, led by Wendy Huntoon of the Keystone Initiative, included Susan Lucas from ESnet, Linda Winkler from Argonne National Laboratory, Dave Jent from Indiana University, and Florence Hudson from Internet2.

Meehl was able to secure funding from the National Science Foundation for participants from research and education organizations. The Department of Energy is supporting the women from its national laboratories.

"Although there are more jobs in IT, there's a massive shortage of workers, especially in the number of women in the field," Meehl said. "It was really fulfilling this year to see a huge jump in the number of really qualified applicants. It was very hard to choose."

Writer/editor: David Hosansky, Manager of Media Relations

Wrangling observations into models

April 4, 2016 | If scientists could directly measure the properties of all the water throughout the world's oceans, they wouldn't need help from NCAR scientist Alicia Karspeck. But since large expanses of the oceans are beyond the reach of observing instruments, Karspeck's work is critical for those who want estimates of temperature, salinity, and other properties of water around the globe. Scientists need these estimates to better understand the world's climate system and how it is changing.

"It's painstaking work, but my hope is it will lead to major advances in climate modeling and long-term prediction," Karspeck said.

She is one of a dozen or so researchers at NCAR who spend their days on data assimilation, a field that is becoming increasingly important for the geosciences and other areas of research. Broadly speaking, data assimilation is any method of enabling computer models to utilize relevant observations. Part science and part art, it involves figuring out how to get available measurements (which may be sparse, tightly clustered, or irregularly scattered) into models that tend to simplify the world by breaking it into gridded boxes. Commonly used in weather forecasting, the technique can improve simulations and help scientists predict future events with more confidence. It can also identify deficiencies in both models and observations.
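To make that definition concrete, here is a minimal, purely illustrative sketch of the core operation: nudging a gridded forecast toward nearby point observations, weighting each by its assumed error. All of the numbers, and the simple scalar gain, are invented for illustration; operational systems such as NCAR's DART use far more sophisticated ensemble algorithms.

```python
import numpy as np

# Toy 1-D "model grid": a forecast of temperature (deg C) at 10 grid boxes.
forecast = np.array([21.0, 21.5, 22.0, 22.8, 23.5, 24.1, 24.0, 23.2, 22.5, 21.9])
forecast_var = 1.0 ** 2  # assumed forecast error variance

# Sparse, irregular observations: (fractional grid position, value, error variance).
observations = [(1.3, 22.4, 0.5 ** 2), (6.8, 22.6, 0.5 ** 2)]

analysis = forecast.copy()
for pos, obs_value, obs_var in observations:
    # Map the observation into grid space by linear interpolation
    # (a crude stand-in for a real observation operator).
    i = int(np.floor(pos))
    w = pos - i
    model_equiv = (1 - w) * analysis[i] + w * analysis[i + 1]

    # Scalar Kalman-style gain: trust the observation in proportion to
    # how accurate it is relative to the forecast.
    gain = forecast_var / (forecast_var + obs_var)
    increment = gain * (obs_value - model_equiv)

    # Spread the correction across the two bracketing grid boxes.
    analysis[i] += (1 - w) * increment
    analysis[i + 1] += w * increment

print("forecast:", forecast)
print("analysis:", analysis)
```

The essential pattern, blending model and observations according to their respective uncertainties, is the same one that operational systems apply to millions of values at a time.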
As models have become more powerful and observations more numerous, the technique has become so critical that NCAR last year launched a Data Assimilation Program to better leverage expertise across its seven labs.

"Activities in data assimilation have grown well beyond traditional applications in numerical weather prediction for the atmosphere and now span across NCAR's laboratories," said NCAR Director Jim Hurrell. "The Data Assimilation Program is designed to enhance data assimilation research at NCAR, while at the same time serving the broader U.S. research community."

Scientists are using data assimilation techniques to input a range of North American observations into experimental, high-resolution U.S. forecasts. These real-time ensemble forecasts are publicly available while they're being tested. (©UCAR. This image is freely available for media & nonprofit use.)

Improving prediction

Created by the NCAR Directorate, the Data Assimilation Program is designed to advance prediction of events ranging from severe weather and floods to air pollution outbreaks and peaks in the solar cycle. One of its goals is to encourage collaborations among data assimilation experts at NCAR and the larger research community. For example, scientists in several labs are joining forces to apply data assimilation methods to satellite measurements to create a database of global winds and other atmospheric properties. This database will then be used for a broad range of climate and weather studies. The program also provides funding to hire postdocs at NCAR to focus on data assimilation projects, as well as for a software engineer to support such activities.

"By bringing money to the table, we're building up data assimilation capability across NCAR," said NCAR Senior Scientist Chris Snyder, who coordinates the Data Assimilation Program. "This is critical because data assimilation provides a framework to scientists throughout the atmospheric and related sciences who need to assess where the uncertainties are and how a given observation can help."

NCAR Senior Scientist Jeff Anderson, who oversees the Data Assimilation Research Testbed (DART), says that data assimilation has become central to the geosciences. DART is a software environment that helps researchers combine data assimilation methods and observations with various computer models.

"I think the Data Assimilation Program is a huge win for NCAR and the entire atmospheric sciences community," Anderson said. "The scientific method is about taking observations of the world and making sense of them, and data assimilation is fundamental for applying the scientific method to the geosciences as well as to other research areas."

From oceans to Sun

Here are examples of how data assimilation is advancing our understanding of atmospheric and related processes from ocean depths to the Sun's interior:

Oceans. Karspeck is using data assimilation to estimate water properties and currents throughout the world's oceans. This is a computationally demanding task that requires feeding observations into the NCAR-based Community Earth System Model, simulating several days of ocean conditions on the Yellowstone supercomputer, and using those results to update the conditions in the model and run another simulation. The good news: the resulting simulations match well with historical records, indicating that the data assimilation approach is working. "My goal is to turn this into a viable system for researchers," Karspeck said.

Air quality. Atmospheric chemists at NCAR are using data assimilation of satellite observations to improve air quality models that currently draw on limited surface observations of pollutants. For example, assimilating satellite observations could show the effect of emissions from a wildfire in Montana on downwind air quality, such as in Chicago. "We've done a lot of work to speed up the processing time and the results are promising," said NCAR scientist Helen Worden. "The model simulations after assimilating satellite carbon monoxide data are much closer to actual air quality conditions."

Weather forecasting. Data assimilation is helping scientists diagnose problems with weather models. For example, why do models consistently overpredict or underpredict temperatures near the surface? Using data assimilation, NCAR scientist Josh Hacker discovered that models incorrectly simulate the transfer of heat from the ground into the atmosphere. "With data assimilation, you're repeatedly confronting the model with observations so you can very quickly see how things go wrong," he said.

Solar cycle. Scientists believe the 11-year solar cycle is driven by mysterious processes deep below the Sun's surface, such as the movements of cells of plasma between the Sun's lower latitudes and poles. To understand the causes of the cycle and ultimately predict it, they are turning to data assimilation to augment observations of magnetic fields and plasma flow at the Sun's surface and feed the resulting information into a computer model of subsurface processes. "We are matching surface conditions to the model, such as the pattern and speed of the plasma flows and evolving magnetic fields," said NCAR scientist Mausumi Dikpati.

Capturing data. In addition to helping scientists improve models, the new Data Assimilation Program is also fostering discussions about observations.
NCAR senior scientist Wen-Chau Lee and colleagues who are experts in gathering observations are conferring with computer modelers over how to process the data so the models can readily ingest it. One challenge, for example, is that radars may take observations every 150 meters, whereas the models often have a resolution of 1 to 3 kilometers. Inputting the radar observations into the models requires advanced quality control techniques, including coordinate transformation (converting observation coordinates to the model grid) and data thinning (reducing the density of observations while retaining the basic information).

"We are modifying our quality control procedures to make sure that the flow of data is smooth," Lee said. "With data assimilation, the first word is 'data'," he added. "Without data, without observations, there is no assimilation."
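To illustrate the data thinning step Lee describes, here is a hedged sketch of one common approach: averaging dense radar samples into coarser model-grid cells (sometimes called forming "superobservations"). The ray geometry, values, and 3-kilometer grid spacing are hypothetical, not NCAR's quality control code.

```python
import numpy as np

# Hypothetical radar ray: one reflectivity sample every 150 m along 30 km.
obs_spacing_m = 150.0
obs_range_m = np.arange(0.0, 30_000.0, obs_spacing_m)
rng = np.random.default_rng(0)
obs_values = (20.0 + 5.0 * np.sin(obs_range_m / 4000.0)
              + rng.normal(0.0, 1.0, obs_range_m.size))

# Thin onto a 3 km model grid: average every sample that falls in a cell,
# cutting 200 raw observations down to 10 while keeping the cell-scale signal.
grid_spacing_m = 3000.0
cell_index = (obs_range_m // grid_spacing_m).astype(int)
thinned = np.array([obs_values[cell_index == k].mean()
                    for k in range(cell_index.max() + 1)])

print(f"{obs_values.size} raw observations -> {thinned.size} thinned values")
```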
Writer/contact: David Hosansky, Manager of Media Relations
Funders: NCAR Directorate; National Science Foundation; additional funding agencies for specific projects

NCAR announces powerful new supercomputer for scientific discovery

BOULDER—The National Center for Atmospheric Research (NCAR) announced today that it has selected its next supercomputer for advancing atmospheric and Earth science, following a competitive open procurement process. The new machine will help scientists lay the groundwork for improved predictions of a range of phenomena, from hour-by-hour risks associated with thunderstorm outbreaks to the timing of the 11-year solar cycle and its potential impacts on GPS and other sensitive technologies.

The new system, named Cheyenne, will be installed this year at the NCAR-Wyoming Supercomputing Center (NWSC) and become operational at the beginning of 2017.

Cheyenne will be built by Silicon Graphics International Corp. (SGI) in conjunction with centralized file system and data storage components provided by DataDirect Networks (DDN). The SGI high-performance computer will be a 5.34-petaflop system, meaning it can carry out 5.34 quadrillion calculations per second. It will be capable of more than 2.5 times the amount of scientific computing performed by Yellowstone, the current NCAR supercomputer.

Funded by the National Science Foundation and the state of Wyoming through an appropriation to the University of Wyoming, Cheyenne will be a critical tool for researchers across the country studying climate change, severe weather, geomagnetic storms, seismic activity, air quality, wildfires, and other important geoscience topics. Since the supercomputing facility in Wyoming opened its doors in 2012, more than 2,200 scientists from more than 300 universities and federal labs have used its resources.

Six clips of scientific visualizations created with the help of the Yellowstone supercomputer.

"We're excited to bring more supercomputing power to the scientific community," said Anke Kamrath, director of operations and services at NCAR's Computational and Information Systems Laboratory. "Whether it's the threat of solar storms or a heightened risk in certain severe weather events, this new system will help lead to improved predictions and strengthen society's resilience to potential disasters."

"Researchers at the University of Wyoming will make great use of the new system as they continue their work into better understanding such areas as the surface and subsurface flows of water and other liquids, cloud processes, and the design of wind energy plants," said William Gern, vice president of research and economic development at the University of Wyoming. "UW's relationship with NCAR through the NWSC has greatly strengthened our scientific computing and data-centric research. It's helping us introduce the next generation of scientists and engineers to these endeavors."

The NWSC is located in Cheyenne, and the name of the new system was chosen to honor the support it has received from the people of that city. It also commemorates the upcoming 150th anniversary of the city, which was founded in 1867 and named for the American Indian Cheyenne nation.

Increased power, greater efficiency

The new data storage system for Cheyenne will be integrated with NCAR's existing GLADE file system. The DDN storage will provide an initial capacity of 20 petabytes, expandable to 40 petabytes with the addition of extra drives. This, combined with the current 16 petabytes of GLADE, will total 36 petabytes of high-speed storage.
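As a quick, back-of-envelope sanity check on the figures quoted above (Yellowstone's 1.5-petaflop peak rating comes from the NWSC opening release further down this page):

```python
# All figures come from the release text; this is an illustrative check only.
cheyenne_peak = 5.34e15    # calculations per second (5.34 petaflops)
yellowstone_peak = 1.5e15  # Yellowstone's peak rating

print(f"peak ratio: {cheyenne_peak / yellowstone_peak:.2f}x")
# -> about 3.6x at theoretical peak; the "more than 2.5 times" quoted above
#    refers to delivered scientific computing, a more conservative measure.

ddn_initial_pb, glade_pb = 20, 16
print(f"high-speed storage: {ddn_initial_pb + glade_pb} petabytes")  # -> 36
```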
The new DDN system also will transfer data at the rate of 200 gigabytes per second, more than twice as fast as the current file system's rate of 90 gigabytes per second.

The system will include powerful Intel Xeon processors, whose performance will be augmented through optimization work done by NCAR and the University of Colorado Boulder. NCAR and the university performed this work through their participation in the Intel Parallel Computing Centers program.

Even with its increased power, Cheyenne will be three times more energy efficient (in floating-point operations per second, or flops, per watt) than Yellowstone, its predecessor, which is itself highly efficient.

"The new system will have a peak computation rate of over 3 billion calculations per second for every watt of power consumed," said NCAR's Irfan Elahi, project manager of Cheyenne and section manager for high-end supercomputing services.

Scientists used the Yellowstone supercomputer to develop this 3-D rendering of a major thunderstorm in July 2011 that caused flooding in Fourmile Canyon west of Boulder. The colors show conditions in the clouds, including ice particles (light blue), graupel (orange), snow (pink), rain (blue), and water (grey). (Image by David Gochis, NCAR. This image is freely available for media & nonprofit use.)

More detailed predictions

High-performance computers such as Cheyenne allow researchers to run increasingly detailed models that simulate complex processes and how they might unfold in the future. These predictions give resource managers and policy experts valuable information for planning ahead and mitigating risk. Some of the areas in which Cheyenne is expected to accelerate research include the following:

Streamflow. Year-ahead predictions of streamflows and associated reservoir levels at a greater level of detail will provide water managers, farmers, and other decision makers with vital information about likely water availability and the potential for drought or flood impacts.

Severe weather. By conducting multiple simultaneous runs (or ensembles) of high-resolution forecast models, scientists will lay the groundwork for more specific predictions of severe weather events, such as the probability that a cluster of intense thunderstorms with the risk of hail or flooding will strike a county at a particular hour.

Solar energy. Specialized models of solar irradiance and cloud cover will be run more frequently and at higher resolution, producing research that will help utilities predict how much energy will be generated by major solar arrays hours to days in advance.

Regional climate change. Scientists will conduct multiple simulations with detailed climate models, predicting how particular regions around the world will experience changing patterns of precipitation and temperature, along with potential impacts from sea level rise, streamflow, and runoff.

Decadal prediction. Ensembles of detailed climate models will also help scientists predict the likelihood of certain climate patterns over a 10-year period, such as the risk of drought for a certain region or changes in Arctic sea ice extent.

Air quality. Scientists will be able to simulate the movement and evolution of air pollutants in far more detail, thereby better understanding the potential health effects of particular types of emissions and working toward improved forecasts of air quality.
Subsurface flows. More accurate and detailed models will enable researchers to better simulate the subsurface flows of water, oil, and gas, leading to a greater understanding of these resources.

Solar storms. Innovative, three-dimensional models of the Sun will lay the groundwork for predictions of the timing and strength of the Sun's 11-year cycle, as well as for days-ahead forecasts of solar disturbances that can generate geomagnetic storms in Earth's upper atmosphere.

"Supercomputing is vital to NCAR's scientific research and applications, giving us a virtual laboratory in which we run experiments that would otherwise be impractical or impossible to do," said NCAR Director James Hurrell. "Cheyenne will be a key component of the research infrastructure of the United States through its provision of supercomputing specifically tailored for the atmospheric, geospace, and related sciences. The capabilities of this new system will be central to the continued improvement of our ability to understand and predict changes in weather, climate, air quality, and space weather, as well as their impacts on people, ecosystems, and society."

This series of images, based on a research project run on the Yellowstone supercomputer, shows order and chaos in the Sun's interior dynamo. Turbulent plasma motions (image a) generate a tangled web of magnetic field lines, with opposing "wreaths" of magnetism pointing east (red) or west (blue). Images b and c provide a better look at the magnetic wreaths. (Images by Kyle Augustson, NCAR. This image is freely available for media & nonprofit use.)

Cheyenne Quick Facts

Key features of the new Cheyenne supercomputer system:

5.34-petaflop SGI ICE XA cluster with Intel "Broadwell" processors
More than 4,000 compute nodes
128 GB of memory on 20% of the compute nodes; 64 GB on the remaining ~80%
313 terabytes (TB) of total memory
Mellanox EDR InfiniBand high-speed interconnect
Partial 9D Enhanced Hypercube interconnect topology
SUSE Linux Enterprise Server operating system
Altair PBS Professional workload manager
Intel Parallel Studio XE compiler suite
SGI Management Center and SGI Development Suite
Mellanox Unified Fabric Manager

The new Cheyenne supercomputer and the existing GLADE file system are complemented by new centralized parallel file system and data storage components.

Key features of the new data storage system:

Four DDN SFA14KX systems
20 petabytes of usable file system space (expandable to 40 petabytes by adding drives)
200 GB per second aggregate I/O bandwidth
3,360 × 8-TB NL SAS drives
48 × 800-GB mixed-use SSD drives for metadata
24 × NSD (Network Shared Disk) servers
Red Hat Enterprise Linux operating system
IBM GPFS (General Parallel File System)

Storm-proven forecasting gets yearlong trial

June 3, 2015 | Storm-studying scientists have made their next-generation forecasting system available online so the wider weather community can put it to the test. After using the real-time system during short-lived field research campaigns, developers at the National Center for Atmospheric Research (NCAR) are now ready to see how it performs year-round, and they're eager for user feedback.

In April, NCAR scientists began running daily forecasts using the sophisticated system, which has proven its mettle by skillfully predicting the path of early summer storms as they roll across the country's midsection. The new project, which is funded to run through at least mid-June 2016, will allow scientists to see if the forecasts are as adept at predicting weather phenomena that more frequently occur at other times of the year.

"This type of system has never been run year-round," said NCAR scientist Craig Schwartz, who co-leads the project. "We want to examine a wide range of weather phenomena, like winter storms, that are not typically studied with high-resolution models and see how the system performs."

NCAR's high-resolution ensemble forecasts show 10 possible scenarios for how much precipitation was expected to fall across the central Great Plains over a two-day period in early May. Check out current forecasts at ensemble.ucar.edu. (©UCAR. This animation is freely available for media & nonprofit use.)

The high-resolution forecasting system was first developed to aid field scientists trying to get up-close views of severe weather events. Detailed information about where an individual storm is likely headed helps ensure that ground- and air-based observing equipment gets deployed to the best locations.

"We found that the forecasts did a really good job of showing us where the greatest hazards were going to be and allowed us to put field crews in the right places," said NCAR scientist Glen Romine, who has been involved in running the forecasts for past field campaigns.

Now Romine and Schwartz, along with NCAR's Kate Fossell and Ryan Sobash, are extending the forecasts beyond the field, across the country and onto the Internet. By making their forecasts for the continental United States easily available at ensemble.ucar.edu, they hope to get feedback from the meteorological community and hear from scientists who may want to use the data in their own research. The NCAR team is also supplying the forecasts to the National Severe Storms Laboratory and the Storm Prediction Center, both part of the National Oceanic and Atmospheric Administration. Scientists there are interested in seeing if the data can help them better track individual storms and issue more precise severe weather forecasts.

To add probability, move beyond a single forecast

To predict the path of an individual storm—instead of the general area where conditions are ripe for storm formation—scientists need a weather model that can run at a higher resolution than is commonly used. Because these detail-oriented models burn through so much computing power, they're typically used to create just a single, deterministic forecast. Deterministic forecasts describe just one possible future weather scenario—a single path that a storm might follow, for example. Because of this, the forecast is either right or wrong; there's no gray area in between.
Several years ago, NCAR scientists and their colleagues, working on storm-focused field campaigns, began using a different technique, known as ensemble forecasting, that allowed them to move away from black-or-white deterministic forecasts and instead create forecasts that incorporate the probability that a certain weather scenario will actually come to pass. Armed with the vast power of the new Yellowstone supercomputer, they were able to start producing high-resolution ensemble forecasts by running the same model—the advanced research version of NCAR's Weather Research and Forecasting Model (WRF-ARW)—multiple times for the same forecast period, using different initial estimates of atmospheric conditions to kick off each run.

"When most groups run a high-resolution forecast, they grab a single estimate of the state of the atmosphere, plop it down on their high-resolution forecast grid, and just run it," said Romine, who is co-leading the project with Schwartz. "When you initialize an ensemble forecast, you want a range of estimates of the state of the atmosphere that are all equally likely. What you end up with is a variety of different forecast solutions, despite the fact that the forecasts were started with just small initial differences."

Using an array of initial estimates is important because scientists can't measure exactly what conditions exist in every part of the atmosphere at any given moment. In the expanses between weather stations and other observational equipment, scientists must use models to help them make best guesses of the true conditions. Small variations in those estimates can result in big differences in the forecast outcome.

The multiple forecasts generated from this range of initial estimates, known as the ensemble members, show a breadth of possible outcomes. This range of results gives scientists and forecasters the ability to determine how probable any particular weather event occurring in any particular location might be. Where the member forecasts tend to agree, the probability is higher. Where they don't agree, the probability drops.

The real-time daily ensemble forecasts now available at ensemble.ucar.edu have 10 member forecasts, each of which is initialized using estimates of the atmosphere generated by NCAR's Data Assimilation Research Testbed (DART) toolkit.
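In code, turning ensemble members into a probability is essentially a counting exercise. The sketch below uses made-up precipitation values for a single grid point rather than real model output, but it shows how member agreement becomes a probability:

```python
import numpy as np

# Hypothetical 24-hour precipitation totals (mm) at one grid point from a
# 10-member ensemble (in the real system, 10 WRF-ARW runs started from
# slightly different DART-generated initial conditions).
members = np.array([31.0, 8.5, 22.3, 27.9, 2.1, 19.4, 33.6, 25.2, 12.7, 28.8])

threshold_mm = 25.0  # event of interest: more than 25 mm in 24 hours

# Ensemble probability = fraction of members forecasting the event.
probability = np.mean(members > threshold_mm)
print(f"P(precip > {threshold_mm} mm) = {probability:.0%}")  # -> 50%

# Large spread flags locations where members disagree and confidence is low.
print(f"mean = {members.mean():.1f} mm, spread = {members.std():.1f} mm")
```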
"Because these forecast methods are so new, a lot of research and testing is needed to understand how to put surface, radar, and satellite data properly into the models to start them off correctly so we get better forecasts," said Louis Wicker, a research meteorologist at the National Severe Storms Laboratory, where a number of different ensemble systems are being tested. "NCAR's ensemble forecast system is one of the best storm-scale ensemble systems currently being tested."

Dive deeper: Learn more about NCAR's ensemble forecast details. Anyone interested in more information, providing feedback, or collaborating on the project can email ensemble@ucar.edu.

Writer/contact: Laura Snider
Collaborating institutions: National Center for Atmospheric Research; NCAR Mesoscale & Microscale Meteorology Laboratory; NCAR Computational & Information Systems Laboratory; NOAA National Severe Storms Laboratory; NOAA/NWS Storm Prediction Center
Funders: National Science Foundation

NCAR enhances big data services for climate and weather researchers

BOULDER — The National Center for Atmospheric Research (NCAR) has recently implemented an enhanced data sharing service that gives scientists increased access to data as well as improved capabilities for collaborative research. In addition to data sharing, NCAR has significantly upgraded its centralized file service, known as the Globally Accessible Data Environment (GLADE). Managed by NCAR's Computational and Information Systems Laboratory (CISL), both GLADE and the data sharing service are important upgrades for the high-performance computing (HPC) user community, allowing faster and better access to data and a more flexible virtual workspace.

The Globally Accessible Data Environment (GLADE) is the centralized file service located at the NCAR-Wyoming Supercomputing Center in Cheyenne. (Photo courtesy David Read, NCAR.)

The data sharing service leverages the capabilities of Globus Plus to increase customization options for storage as well as data sharing. Globus, a project of the Computation Institute (a partnership of the University of Chicago and Argonne National Laboratory), is a software service that has been described as a Dropbox for big data. It is broadly used in the scientific community. "Plus" refers to a new feature that allows researchers to share data with colleagues outside their home institutions, greatly improving the ease of collaborative work.

"Scientific collaborations are global endeavors, and researchers need to share data with colleagues around the world. As data sets have grown in size and number, the process of moving and managing access to them has become a significant challenge," said Pam Gillman, manager of NCAR's Data Analysis Services Group. "Globus Plus is a robust and user-friendly service that eases the workflow, and it allows users to be more productive by spending less time on the minutiae of data transfers."

NCAR users have been accessing the Globus transfer service for many years. In addition to making data available to external colleagues, the upgrade now allows users of CISL's HPC environment to control which users or groups of users can access the data. With the sharing service, outside users need only a free Globus account, not a UCAR username/token, to access shared data. The Globus Plus service has a 1.5-petabyte capacity, and most users can take advantage of the Globus web interface to transfer data. Advanced users or service developers can leverage the Globus Plus features via a command-line interface.

CISL recently added 5 petabytes of high-performance storage to the GLADE environment, bringing the total to 16.4 petabytes. GLADE is based on the GPFS file system and provides over 90 GB/s of sustained bandwidth across HPC, analysis, and visualization resources. GLADE file spaces are intended as work areas for day-to-day tasks and are well suited for managing software projects, scripts, code, and data sets.

"We strive to meet the growing needs of our user community, which expand as the data sets grow and require greater and more efficient resources," said Gillman. "These major upgrades are part of CISL's ongoing commitment to giving users the tools and services they need to carry out cutting-edge computational research."
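For a rough sense of the scales involved, consider how long a large transfer takes at GLADE's quoted bandwidth. This back-of-envelope sketch assumes decimal units and a single transfer sustaining the full aggregate rate, which real workloads share, so actual times would be longer:

```python
# Illustrative only: time to move a 1-petabyte data set at GLADE's quoted
# 90 GB/s aggregate bandwidth (decimal units, no contention assumed).
dataset_bytes = 1.0e15
bandwidth_bytes_per_s = 90.0e9

hours = dataset_bytes / bandwidth_bytes_per_s / 3600.0
print(f"~{hours:.1f} hours to move 1 PB")  # -> about 3.1 hours
```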

NCAR & CU Join Intel Parallel Computing Centers Program

BOULDER — The National Center for Atmospheric Research (NCAR) and the University of Colorado Boulder (CU-Boulder) announced today that they will join the Intel Parallel Computing Centers program. Participants in the program will develop methods to increase the performance of applications that use advanced microprocessor technologies and will help train the next generation of scientists and engineers who will apply these new technologies to challenges of societal importance.

The NCAR/CU team will focus on weather and climate applications, including the NCAR-based Community Earth System Model (CESM), Weather Research and Forecasting model (WRF), and Model for Prediction Across Scales (MPAS), three of the most widely used applications in the field. The Indian Institute of Science in Bangalore, India, will also collaborate with the NCAR/CU-Boulder team on the project.

This international public-private partnership is one of several Intel Parallel Computing Centers being established with recognized high-performance computing institutions and research groups. It is part of the research and academic work that Intel supports in the quest for increasing efficiency and optimization of parallel microprocessor computer architectures.

NCAR's advanced research version of the Weather Research and Forecasting model (WRF) was used last year to simulate air flow in and around 2012's Hurricane Sandy. In this 3-D simulation of potential temperature, relatively cool air wraps around Sandy's core near the surface (purple and blue colors), while air parcels gain heat from moisture condensing into clouds and precipitation as they ascend through the storm's core. (©UCAR. Image courtesy Mel Shapiro, NCAR. This image is freely available for media & nonprofit use.) Video animations of this simulation and several others are available.

To solve today's most challenging scientific problems, computer scientists must coordinate many processing elements, assuring they work together efficiently on the same problem. Called parallel processing, this approach is essential to exploiting the power of the latest generation of Intel® processors.

"This partnership will accelerate the important work we are doing to prepare both our applications and our workforce to tackle the next computational challenges in weather and climate simulation," said Thomas Bogdan, president of the University Corporation for Atmospheric Research, which manages NCAR on behalf of the National Science Foundation. "It helps us conduct valuable, in-depth research on Intel architecture, and it underscores the value of working with industry in preparing for future technologies and training bright young minds in these fields."

The models that simulate climate and weather have an insatiable need for increased computing power, and as the physical models evolve, the computational algorithms must also keep pace in order to run efficiently at ever-increasing resolutions that provide enhanced detail. Importantly, preparing applications for the next generation of parallel processors and supercomputers will ensure a smooth transition when these systems become widely available.

The collaboration with NCAR and Intel provides an exciting opportunity for CU-Boulder graduate students to participate in research that will prepare climate and weather code for the next generation of Intel processors. Students will have the opportunity to collaborate with scientists on large-scale computational problems that have a real impact.
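For readers new to the idea, here is a deliberately simple sketch of parallel processing: a toy one-dimensional grid is split into strips, and separate worker processes advance each strip simultaneously. It is illustrative only; real weather and climate models decompose the grid in two or three dimensions and exchange "halo" data at subdomain edges, which this sketch omits.

```python
from multiprocessing import Pool

import numpy as np

def advance_strip(strip: np.ndarray) -> np.ndarray:
    """One toy 'time step': smooth each interior point with its neighbors."""
    out = strip.copy()
    out[1:-1] = 0.25 * strip[:-2] + 0.5 * strip[1:-1] + 0.25 * strip[2:]
    return out

if __name__ == "__main__":
    # A 1-D "atmosphere" of one million grid points, split into 8 strips
    # that 8 worker processes advance at the same time.
    grid = np.random.default_rng(0).random(1_000_000)
    strips = np.array_split(grid, 8)

    with Pool(processes=8) as pool:
        strips = pool.map(advance_strip, strips)

    grid = np.concatenate(strips)
    print(grid[:5])
```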
Thomas Hauser, CU-Boulder's Director of Research Computing, said, "This project will give me the opportunity to have students in the High-Performance Scientific Computing class work on projects that are connected to this effort, with access to Intel's newest architectures."

Research Computing at CU-Boulder is setting up a small cluster containing the Intel® Xeon Phi™ coprocessor, provided to CU-Boulder by Intel, to give students and researchers at CU-Boulder access to these new Intel architectures for porting and tuning their scientific applications.

Intel and Xeon Phi are registered trademarks of Intel Corporation in the United States and other countries.

NWSC named "Green" Data Center of the Year

BOULDER—The NCAR-Wyoming Supercomputing Center (NWSC) has been named the 2013 "'Green' Data Center of the Year" at the inaugural Datacenter Dynamics North American Awards.

The Visitor Center at the NWSC. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

The NWSC received top honors in a category that recognizes the reality of designing and operating data centers in the context of environmental scrutiny. The award was bestowed last week in a ceremony in San Francisco. It comes on the heels of the NWSC's top honors for facility design in the Uptime Institute's Green Enterprise IT Awards, which showcase cutting-edge data center projects that demonstrate energy and resource efficiency in new, operational data centers.

"We are gratified that our efforts to build and operate the most efficient and sustainable data center possible have been successful and that the NWSC is being recognized on its merits," says Gary New, who manages the center for NCAR's Computational and Information Systems Laboratory (CISL). "Nearly 10 years of planning and hard work went into making this success a reality."

Located in Cheyenne, Wyoming, the facility must adapt to outside weather conditions that can change rapidly in the West. "The award recognizes the state-of-the-art design features our team built into NWSC that enable it to respond quickly and efficiently to changing conditions both inside and outside the facility," says Al Kellie, director of NCAR's CISL. "In this way, we can minimize the environmental impact of our scientific computing operations."

Construction of the NWSC was made possible by the sponsorship of the National Science Foundation (NSF) and a unique collaborative partnership between local, state, and federal government entities and private industry to provide project funding and governance. Partners include the State of Wyoming, the University of Wyoming, Cheyenne LEADS, the Wyoming Business Council, and Cheyenne Light, Fuel & Power. The NWSC is operated by the National Center for Atmospheric Research (NCAR) on behalf of NSF, which sponsors NCAR, and the University Corporation for Atmospheric Research (UCAR), which manages it.

The NWSC, which is home to one of the most powerful supercomputers dedicated to Earth system science, also achieved Leadership in Energy and Environmental Design (LEED) Gold certification from the U.S. Green Building Council last year. LEED certification depends on a number of sustainability criteria, such as energy efficiency, water conservation, and the use of recycled or locally sourced construction materials.

"The NWSC was designed and built to the highest standards in sustainability and efficiency," says UCAR president Thomas Bogdan. "It is deeply gratifying that this approach is receiving national and international recognition."

NWSC design highlights

Sustainable materials: During construction, more than 70 percent of construction waste was diverted from landfills for recycling. The building itself is made with more than 510 tons of recycled concrete, 60 tons of recycled wood, and 26 tons of recycled metal.

Water: The ultra-efficient cooling tower configuration, as well as the use of native species for landscaping, enables water savings of up to 6 million gallons per year.

Infrastructure and space use: The center's efficient use of energy means that the mechanical and electrical systems, along with the office space, account for less than 10 percent of total power consumption.
Heating: Waste heat from the supercomputer is captured and reused to heat the building and melt snow and ice on exterior walkways and loading docks.

Cooling: During design, project planners estimated that Wyoming's cool, dry climate would allow natural cooling of the facility for 96 percent of the year. Early experience with the facility indicates that 98 to 99 percent is achievable.

Power: Renewable wind energy provides direct power to the facility, starting at 10 percent of supply with the ability to raise that percentage as conditions permit.

Flexibility and longevity: The design of the NWSC includes "future proofing" to anticipate adaptation to evolving technologies and deployment of future supercomputing systems yet to be developed. The design is also highly modular, allowing critical power and cooling components to be provisioned only when needed. This enhanced flexibility helps minimize capital expenditures by providing only what is needed, when it is needed.

Combining all of these factors, the NWSC not only minimizes its environmental footprint but also directs operating funds toward productive scientific work while reducing overhead expenses.

About the Datacenter Dynamics Awards

The Datacenter Dynamics Awards are the leading awards for the data center industry, recognizing innovation, leadership, and "out of the box" thinking. With 15 established award categories and active award programs in Europe, China, Japan, Spanish-speaking Latin America, and Brazil, the awards celebrate successful data center projects of all sizes and across all sectors, drawing many hundreds of worthy entries.

First Place: NCAR-Wyoming Supercomputing Center Recognized for Outstanding Design Implementation

BOULDER—The National Center for Atmospheric Research (NCAR) has taken top honors in the prestigious 2013 Green Enterprise IT (GEIT) Awards, winning first place in the "Facility Design Implementation" category for its sustainable approach to designing and building the new NCAR-Wyoming Supercomputing Center (NWSC).

The NCAR-Wyoming Supercomputing Center in Cheyenne, Wyoming, which opened last year, has received international recognition for its sustainable approach. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

The closely watched GEIT Awards, bestowed by the Uptime Institute, showcase organizations for pioneering projects and innovations that significantly improve energy productivity and resource use in information technology. The Facility Design Implementation award recognizes cutting-edge data center projects that demonstrate energy and resource efficiency in a new, operational data center.

"We are honored and pleased to receive this recognition for the NWSC," says Aaron Andersen, Deputy Director of Operations & Services at NCAR's Computational and Information Systems Laboratory. "Nearly 10 years of planning and hard work went into designing this facility to be as sustainable as possible, and it is gratifying to have the facility in production use and be able to share what we've done. We hope this facility advances the entire industry."

"Our goal is to meet the highest standards possible for sustainability in supercomputing while advancing scientific knowledge," says Thomas Bogdan, president of the University Corporation for Atmospheric Research (UCAR), which manages NCAR on behalf of the National Science Foundation (NSF). "The GEIT Award suggests we're on the right track, and we are deeply gratified by this international recognition."

H+L Architecture and the engineering firm RMH Group shared the award with NCAR for their role in the design of the facility and its systems. Construction of the NWSC was made possible by the sponsorship of NSF and a unique collaborative partnership between local, state, and federal government entities and private industry to provide project funding and governance. The NWSC is operated by NCAR on behalf of NSF and UCAR.

"The State of Wyoming is proud to be a partner in this supercomputing facility," says Wyoming Gov. Matt Mead. "The designers did an excellent job putting to work Wyoming's natural advantages for data centers, and I join in congratulating them on this award."

The NWSC's complex mechanical systems were designed with performance, flexibility, and energy efficiency in mind. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

"My congratulations to the NCAR-Wyoming Supercomputing Center for its international recognition as a world-class facility," says Wyoming Congresswoman Cynthia M. Lummis, a long-time supporter of the center. "NWSC's design capitalizes on Wyoming's climate and availability of renewable resources while also incorporating other best practices to achieve an energy-efficient and sustainable facility.
Considering the leading computational research conducted at the NWSC, it seems only fitting that the facility housing this research be recognized for its key role in enabling advancements in the field of supercomputing."

The NWSC, which is home to one of the most powerful supercomputers dedicated to Earth system science, also achieved Leadership in Energy and Environmental Design (LEED) Gold certification from the U.S. Green Building Council last year. LEED certification depends on a number of sustainability criteria, such as energy efficiency, water conservation, and the use of recycled or locally sourced construction materials.

"Our organization set out to build a world-class data center for scientific computing that would raise the bar in sustainability, longevity, and manageability," says NCAR interim director Maura Hagan. "Today we are excited to be advancing science through supercomputing from a facility that is living up to those goals."

The 2013 GEIT Awards are sponsored by Sabey Data Centers. Entries were thoroughly reviewed by an international committee of independent judges following a double-blind process. NCAR will receive the award at the eighth annual Uptime Institute Symposium in Santa Clara, California, in May. As part of the symposium's agenda, NCAR and a portion of the design team will present a case study about the NWSC to the event audience.

NWSC Design Highlights

Sustainable materials: During construction, more than 70% of construction waste was diverted from landfills for recycling. The building itself is made with over 510 tons of recycled concrete, 60 tons of recycled wood, and 26 tons of recycled metal.

Water: The ultra-efficient cooling tower configuration, as well as the use of native species for landscaping, enables water savings of up to 6 million gallons per year.

Infrastructure and Space Use: The center's super-efficient use of energy means that the mechanical systems, electrical systems, and office space account for less than 10% of total power consumption.

Heating: Waste heat from the supercomputer is captured and reused to heat the building and melt snow and ice on exterior walkways and loading docks.

Cooling: During design, project planners estimated that Wyoming's cool, dry climate would allow natural cooling of the facility for 96% of the year. Early experience with the facility indicates that 98% to 99% is achievable.

Power: Renewable wind energy provides direct power to the facility, starting at 10% of supply with the ability to raise that percentage as conditions permit.

Flexibility and Longevity: The design of the NWSC includes "future proofing" to anticipate adaptation to evolving technologies and deployment of future supercomputing systems yet to be developed. The design is also highly modular, allowing critical power and cooling components to be provisioned only when needed. This enhanced flexibility helps minimize capital expenditures by providing only what is needed, when it is needed.

Combining all of these factors, the NWSC not only minimizes its environmental footprint but also directs operating funds toward productive scientific work while reducing overhead expenses.

About Uptime Institute

Uptime Institute provides independent thought leadership, certification, education, and professional services for the global digital infrastructure industry. It serves all industry stakeholders, including enterprise and third-party data center owners and operators, manufacturers, service providers, and engineers.
Through Uptime Institute Professional Services, Uptime Institute delivers due diligence assessments and certifications of site infrastructure and site management in accordance with the Tier and Operational Sustainability Standards. Uptime Institute is a division of The 451 Group. Headquartered in New York, The 451 Group also owns 451 Research, a leading technology-industry syndicated research and data service focused on the business of enterprise IT innovation, and Yankee Group, the preeminent research and advisory firm equipping companies to profit in a mobile world.

New supercomputer in Wyoming aids Antarctic safety

December 5, 2012 | When the weather is good enough for U.S. Air Force pilots to land their C-17 cargo jets on one of the ice runways at Antarctica's McMurdo Station, they know they'll probably see volcanic Mt. Erebus rising 13,000 feet into crystalline blue sky. Part of the reason they'll get a clear view, and in turn a safe landing for passengers and cargo, will soon have to do with another Erebus: the new NCAR supercomputer dedicated to the real-time Antarctic Mesoscale Prediction System (AMPS) weather forecasts.

Based at the recently opened NCAR-Wyoming Supercomputing Center, Erebus is now in its testing phase. When fully operational, the system is expected to provide about 15 times greater computing power for AMPS than the current system, a portion of NCAR's soon-to-be-retired bluefire supercomputer. Erebus is funded by the National Science Foundation's Office of Polar Programs.

Since AMPS was created in 2000, its twice-daily numerical weather predictions have greatly increased the safety and efficiency of U.S. and international operations in Antarctica. AMPS forecasts have helped cut in half the number of flights aborted because of adverse weather conditions experienced at ground locations, a major time and cost savings. The U.S. Air Force and New York Air National Guard make about 100 flights every year between Christchurch, New Zealand, and McMurdo Station, the largest research base in Antarctica and the hub of the United States Antarctic Program. The pilots rely on Antarctic forecasts provided by meteorologists at the Space and Naval Warfare Systems Center, who in turn rely heavily on guidance from AMPS.

Each AMPS forecast uses ground-based and satellite observations of such atmospheric conditions as wind speed, humidity, and temperature. The resulting model output produces five-day forecasts over the continent as a whole, in addition to 36-hour forecasts at higher resolution for certain key regions of the continent. When operational on Erebus as early as the end of this year, the forecasts will run in less than 30 minutes, about 10 times faster than the 5 to 5.5 hours they take on bluefire. In addition, the resolution of the forecasts will increase by one third, providing more detail.

"We're very excited about the expanded computing capacity with Erebus and generating more detailed forecasts in less time," says NCAR scientist Jordan Powers, who helped launch AMPS in 2000.

AMPS forecasts had previously ranged in resolution, or level of detail, from a 15-kilometer (9.3-mile) grid over all of Antarctica to grids of just under 2 kilometers around McMurdo Station and other key areas. Erebus will provide the computing power to pinpoint local forecasts, such as critical ground wind conditions, down to 1.1 kilometers, which is less than the length of a McMurdo runway. Erebus will also enable the AMPS team to better test and integrate new polar physics approaches for describing the Antarctic atmosphere, which will in turn increase the predictive accuracy of the models.

NCAR-Wyoming Supercomputing Center opens

CHEYENNE—The NCAR-Wyoming Supercomputing Center (NWSC), which houses one of the world's most powerful supercomputers dedicated to the geosciences, officially opens today. Scientists at the National Center for Atmospheric Research (NCAR) and universities across the country are launching a series of initial scientific projects on the center's flagship, a 1.5-petaflop IBM supercomputer known as Yellowstone. These first projects focus on a wide range of Earth science topics, from atmospheric disturbances to subterranean faults, that will eventually help to improve predictions of tornadoes, hurricanes, earthquakes, droughts, and other natural hazards.

A handful of the Yellowstone supercomputer's 100 racks. An iconic scene from its namesake national park is featured mosaic-style on the ends of each rack. The image by Michael Medford, licensed to National Geographic, centers on Fountain Geyser. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

"This center will help transform our understanding of the natural world in ways that offer enormous benefits to society," says Thomas Bogdan, president of the University Corporation for Atmospheric Research (UCAR), which manages NCAR on behalf of the National Science Foundation (NSF). "Whether it's better understanding tornadoes and hurricanes, or deciphering the forces that lead to geomagnetic storms, the Yellowstone supercomputer and the NWSC will lead to improved forecasts and better protection for the public and our economy."

Bogdan took part this morning in a formal opening ceremony with Wyoming Gov. Matt Mead, NSF director Subra Suresh, University of Wyoming (UW) vice president of research relations William Gern, NCAR director Roger Wakimoto, and other political and scientific leaders.

"This is a great day for scientific research, for the University of Wyoming, and for Wyoming," says Mead. "Wyoming is proud to be part of the collaboration that has brought one of the world's fastest computers to the state. The center will have a positive impact on our future, through the research done here and by sending the message that Wyoming is honored and equipped to be the home of this amazing facility."

"The NCAR-Wyoming Supercomputing Center will offer researchers the opportunity to develop, access, and share complex models and data at incredibly powerful speeds," says Suresh. "This is the latest example of NSF's unique ability to identify challenges early and make sure that the best tools are in place to support the science and engineering research communities."

Public-private partnership

Located on the western fringe of Cheyenne, Wyoming, the NCAR-Wyoming Supercomputing Center officially opened on October 15, 2012. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

The NWSC is the result of a broad public-private partnership among NCAR, NSF, UW, the state of Wyoming, Cheyenne LEADS, the Wyoming Business Council, and Cheyenne Light, Fuel & Power. NCAR's Computational and Information Systems Laboratory (CISL) will operate the NWSC on behalf of NSF and UCAR.

"We are delighted that this successful public-private partnership has delivered a major supercomputing center on time and on budget," says NCAR director Roger Wakimoto.
Through the NWSC partnership, which will also seek to advance education and outreach, UW will have research use of 20 percent of NWSC's main computing resource. In turn, UW will provide $1 million each year for 20 years in support of the program. The state of Wyoming also contributed $20 million toward the construction of the center.

"Our access to Yellowstone will allow the university to reach new heights in our educational and research endeavors in engineering; atmospheric, hydrological, and computational sciences; Earth system sciences; and mathematics," says UW President Tom Buchanan. "The supercomputer is a huge draw for students and faculty. It opens the door to scientific innovation and discovery that will benefit our state, the nation, and the world."

Located in Cheyenne's North Range Business Park, near the intersection of I-80 and I-25, the 153,000-square-foot supercomputing center will provide advanced computing services to scientists across the United States. Most researchers will interact with the center remotely, via a laptop or desktop computer and the Internet. Relative to the most recent ranking of the world's fastest supercomputers, issued in June, the 1.5-petaflop peak system ranks in the top 20; the rankings constantly change as new and increasingly powerful supercomputers come online. The main components consist of a massive central file and data storage system, a high-performance computational cluster, and a system for visualizing the data.

Scientists will use the new center's advanced computing resources to understand complex processes in the atmosphere and throughout the Earth system, and to accelerate research into severe weather, geomagnetic storms, climate change, carbon sequestration, aviation safety, wildfires, and other critical geoscience topics.

Future-proof design

Some of the NWSC's complex mechanical systems, which were designed with performance, flexibility, and energy efficiency in mind. (©UCAR. Photo by Carlye Calvin. This image is freely available for media & nonprofit use.)

CISL has operated supercomputers at NCAR's Mesa Laboratory in Boulder since the 1960s, even though the building was not designed with supercomputing in mind. In recent years, new research questions have required more powerful computers to run increasingly complex simulations, and the Mesa Lab has now reached the limits of its ability to provide the energy and cooling capacity essential for the next generation of supercomputers.

The NWSC is expected to advance scientific discovery for the next several decades. Its design and construction have been "future proofed" to provide the scope to expand as supercomputing technology that does not exist today becomes available. Raised floors are key to the facility's flexible design, allowing the computing systems, electrical supply, and cooling to be positioned and controlled for optimal energy use and ease of maintenance. The raised floor is also vented, so air can be circulated as needed to computing systems and servers.

The NWSC was awarded LEED Gold certification for its sustainable design. The center takes full advantage of Cheyenne's elevation and cool, dry climate by using ambient air to cool the facility nearly year-round, significantly reducing the facility's energy use. A minimum of 10 percent of the power provided to the facility will be wind energy from the nearby Happy Jack Wind Farm.
NCAR and UCAR will continue to explore options to increase the percentage of renewable energy provided to the facility in future years.
