February 6, 2017 | Students of microbiology can grow bacteria in petri dishes to better understand their subject. Paleontology students have fossils, and chemistry students have beakers bubbling with reactions. But students of the atmospheric and related sciences are often left with something much less tangible: data, and lots of it.
Datasets in the atmospheric sciences cover everything from observations made by weather balloons to satellite measurements of cloud cover to output from climate model runs.
Now the National Center for Atmospheric Research (NCAR) is helping make those data less abstract and more concrete — a little closer to a rock sample and a little further from a computer file. The result is two apps: one using virtual-reality and one using augmented-reality techniques to create 3D visualizations of datasets on a globe that students can move around and view from different perspectives. Meteo VR (Virtual Reality) and Meteo AR (Augmented Reality) are available for use on iPhone, iPad, and Android devices. They were developed by NCAR's Computational and Information Systems Lab (CISL).
"The goal is to make our data more accessible to the public, especially to students," said Tim Scheitlin, a senior software engineer at CISL's Visualization Lab. "We think it's a fun way to start a dialogue about atmospheric science. If people can get excited about using the app, then maybe they'll start asking questions that will lead to a deeper understanding."
The Meteo AR app takes advantage of the camera on a personal device. When the camera's pointed at an image from a visualization — of sea surface temperature anomalies during an El Niño, or of the inner workings of a hurricane, for example — the visualization pops up onto a 3D globe that can be spun around with a finger.
The Meteo VR app requires a virtual reality headset, such as Google Cardboard, and allows the user to "fly around" the globe to look at the projected dataset from any angle.
Development of the two apps was led by Nihanth Cherukuru, a doctoral student at Arizona State University. He came to NCAR last summer as part of CISL's Summer Internships in Parallel Computational Science (SIParCS) program, which strives "to make a long-term, positive impact on the quality and diversity of the workforce needed to use and operate 21st century supercomputers."
Cherukuru said one of the challenges of the project was to wrestle the vast amounts of data into a format that wouldn’t crash a handheld device.
"Mobile phones are tiny devices and the atmospheric data can be really huge," Cherukuru said. "We needed to take that data and trim it down. We created a single image for each timestamp and then we made animations to reduce the computational burden on the phones."
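The approach Cherukuru describes — pre-rendering each timestamp of a large dataset as one small image, then animating the sequence — can be sketched roughly as below. This is an illustrative example only, not NCAR's actual code; the function name, sizes, and grayscale quantization are assumptions.

```python
import numpy as np

def field_to_texture(field, max_size=512):
    """Illustrative sketch: downsample a 2D data field (lat x lon) and
    quantize it to an 8-bit grayscale image small enough to texture a
    globe on a phone. One such frame is made per timestamp."""
    # Stride-downsample so neither dimension exceeds max_size pixels
    step = max(1, int(np.ceil(max(field.shape) / max_size)))
    small = field[::step, ::step]
    # Normalize values to 0-255 so the frame is a plain 8-bit image
    lo, hi = small.min(), small.max()
    norm = (small - lo) / (hi - lo) if hi > lo else np.zeros_like(small)
    return (norm * 255).astype(np.uint8)

# One small texture per timestamp; the sequence of frames is the animation
frames = [field_to_texture(np.random.rand(1440, 2880)) for _ in range(3)]
```

The heavy lifting happens once, offline; the phone only has to display a short stack of small images rather than process the full dataset.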
While Cherukuru has returned to Arizona State after his SIParCS internship, he is still working with the Visualization Lab. The goal is to expand the apps' capabilities, for example by letting users tap parts of the data to get more information.
"There's kind of a 'wow' factor you get when you first use the app," Scheitlin said. "Our goal is to get past that and make it as educational as we can."
Laura Snider, senior science writer
The University Corporation for Atmospheric Research manages the National Center for Atmospheric Research under sponsorship by the National Science Foundation. Any opinions, findings and conclusions, or recommendations expressed in this publication are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.