July 4 - evening closure - NCAR Mesa Lab and road
The Sun in a Sky full of Stars: Understanding Solar-Stellar Magnetism
The magnetic activity of stars shapes their space environment and exerts a forcing on planetary atmospheres. Most of our understanding of this intimate relationship between parent stars and the planets they host is based on studies of our own solar system. In the first part of this lecture, I will highlight some of the important ways in which magnetically active stars modulate their environment and how one may gain an understanding of coupled star-planet evolution through studies of other Sun-like stars. In the second part of this lecture, I will focus on the theoretical underpinnings of the sunspot cycle and describe our quest to understand the physics of solar cycle predictability.
The 'Kriging' algorithm, central to spatial statistics, is O(n^3) in computation time for n observations. The biggest computational burden lies in the required Cholesky decomposition, which is also O(n^3) and is associated with evaluating the data likelihood given the spatial model covariance parameters. In this talk, I introduce 'fields', a freely available R package for spatial statistics, and discuss the results of accelerating its Cholesky decomposition using the Matrix Algebra on GPU and Multicore Architectures (MAGMA) library. Timing results are given for both MAGMA's single- and double-precision Cholesky decompositions using Caldera computational nodes in the Yellowstone supercomputing environment and a mid-2014 MacBook Pro laptop with stock GPU. In addition, I show that reducing the precision of the Cholesky decomposition does not, in general, significantly affect the accuracy of likelihood maximization, yet can substantially improve performance on the mid-2014 MacBook Pro.
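The Cholesky-based likelihood evaluation described above can be sketched as follows. This is a minimal NumPy/SciPy illustration, not code from the 'fields' package; the exponential covariance model and the parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def gaussian_loglik(y, locs, sigma2, range_param):
    """Gaussian log-likelihood under an exponential covariance model.

    The Cholesky factorization of the n x n covariance matrix is the
    O(n^3) bottleneck referred to in the abstract.
    """
    n = len(y)
    # Pairwise distances and exponential covariance (illustrative choice)
    d = np.abs(locs[:, None] - locs[None, :])
    K = sigma2 * np.exp(-d / range_param)
    c, low = cho_factor(K)                      # the O(n^3) step
    log_det = 2.0 * np.sum(np.log(np.diag(c)))  # log|K| from the factor
    quad = y @ cho_solve((c, low), y)           # y' K^{-1} y
    return -0.5 * (log_det + quad + n * np.log(2.0 * np.pi))

# Usage: evaluate the likelihood for simulated 1-D data
rng = np.random.default_rng(0)
locs = np.linspace(0.0, 1.0, 50)
y = rng.standard_normal(50)
ll = gaussian_loglik(y, locs, sigma2=1.0, range_param=0.3)
```

Running the decomposition in single precision amounts to casting `K` to `np.float32` before `cho_factor`; as the abstract notes, the resulting likelihood surface is typically close enough for maximization.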
Thursday, July 9 12:00PM
ML Directors Conference Room
(Bring your lunch)
Hugh Morrison (1) with Jason Milbrandt (2)
1. National Center for Atmospheric Research, Boulder, Colorado
2. Environment Canada, Montreal, Canada
The representation of cloud microphysics continues to be a major source of uncertainty in atmospheric models. Traditionally, microphysics schemes partition ice-phase particles into pre-defined categories with prescribed bulk characteristics. This approach, which is used in nearly all existing schemes (bulk and bin), is intrinsically restrictive and imposes the need for conversion between categories, a process that is poorly constrained and often unphysical.
Over the past few years there has been a paradigm shift in the parameterization of ice microphysics towards emphasis on the prediction of bulk hydrometeor properties, rather than pre-defined categories. As part of this shift, a fundamentally new approach is proposed and a new microphysics scheme has been developed. In the new P3 scheme, ice particle properties are predicted and evolve by prognosing four independent mixing ratio quantities for each "free" ice-phase category. From these variables, important physical properties that describe the ice hydrometeors at a given point in time and space can be derived. This allows the full range of ice particle types to be represented in P3 even in the single-category configuration.
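For readers unfamiliar with P3's property-based approach, here is a hedged sketch of how bulk properties follow from the prognostic variables, assuming (per the published P3 description) that the four mixing ratios are total ice mass, ice number, rime mass, and rime volume; the variable names and the guard-free arithmetic are illustrative only.

```python
def derived_ice_properties(q_ice, n_ice, q_rim, b_rim):
    """Derive bulk ice properties from four prognostic mixing ratios.

    Illustrative only; the operational P3 code includes limiters and
    lookup tables not shown here.
    q_ice: total ice mass mixing ratio (kg/kg)
    n_ice: total ice number mixing ratio (1/kg)
    q_rim: rime mass mixing ratio (kg/kg)
    b_rim: rime volume mixing ratio (m^3/kg)
    """
    rime_fraction = q_rim / q_ice  # 0 = unrimed snow-like, 1 = graupel-like
    rime_density = q_rim / b_rim   # bulk density of the rimed portion (kg/m^3)
    mean_mass = q_ice / n_ice      # mean particle mass (kg)
    return rime_fraction, rime_density, mean_mass
```

Because these properties vary continuously, a single "free" category can span the full range from pristine ice to heavily rimed graupel without conversion terms.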
A detailed overview of the P3 scheme will be given and results from model simulations will be presented.
This seminar will be webcast live at:
Recorded seminar link can be viewed here:
Thursday, 9 July 2015, 3:30 PM
Refreshments 3:15 PM
3450 Mitchell Lane
Bldg 2 Main Auditorium, Room 1022
Koepping is the founder of the Arctic Arts Project, which captures Arctic change through art, spanning many parts of the Earth system: permafrost soils, rock formations, glaciers, sea ice, human cultures, and plants.
Challenges to water resource management in the western US include increasing and changing demands, over-allocated and depleted water supplies, natural climate variability, and the impacts of anthropogenic climate change. Tree-ring records of hydroclimatic variability over past centuries can be used to provide a baseline for the conditions that have occurred under natural climate variability. While the conditions of the past will not be an analogue for the future, these records do document the range of conditions (droughts in particular) that have occurred in the past and could occur in the future, under natural climate variability alone. Water managers have used tree-ring data in a variety of ways, from basic awareness-raising to input into water system models. I’ll talk about my work in the Colorado River basin, the Rio Grande, and in California, including the challenges of collaborating with water resource managers.
Multimillennial solar activity reconstructions: Knowns and unknowns
Solar activity is dominated by the 11-yr Schwabe cycle, but there is also essential centennial variability expressed as modulation of the 11-yr cycle. For many purposes it is important to know the long-term solar activity. The direct record of sunspot number exists for the last four centuries and depicts a great deal of variability, from the Maunder minimum up to the Modern grand maximum of activity. In order to go beyond the direct record, one has to use indirect proxies, such as cosmogenic radionuclides in natural archives. In this talk, the cosmogenic-proxy method is described, its advantages and weaknesses are discussed, and recent reconstructions of multi-millennial solar activity are shown. Special emphasis is given to the Grand minima and maxima of solar activity. Possible uncertainties are discussed.
You are invited to participate in an afternoon of statistics seminars at NCAR.
We have three speakers lined up for this year's edition. Each presentation will be followed by a short discussion and a coffee break.
Efficient Computation of Gaussian Likelihoods for Stationary Markov Random Field Models
Joe Guinness, Department of Statistics, North Carolina State University
We introduce new methods for efficiently computing the Gaussian likelihood for spatial models that consist of a Gaussian Markov random field (GMRF) with stationary covariances and an additive uncorrelated error term, when the data locations fall on a possibly incomplete regular grid. The calculations can be made exact up to machine precision and are efficient both in memory allocation and computation time and are particularly fast when the uncorrelated error term is not present. Our approach handles boundary effects and missing values in a natural fashion. Frequentist methods are highlighted, but the availability of the likelihood allows for Bayesian inference as well. We demonstrate our results in simulation and timing studies, as well as with an application to gridded satellite data, where we use the exact likelihood both for parameter estimation and model comparison.
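As a generic illustration of why working with the (sparse, banded) precision matrix Q pays off for GMRFs, here is a toy log-likelihood for a first-order model on a 1-D grid. This is a small-n, dense stand-in, not the efficient incomplete-grid methodology of the talk, and the particular form of Q is an assumption for the sketch.

```python
import numpy as np

def gmrf_loglik(x, kappa, tau):
    """Log-density of a zero-mean Gaussian with a first-order GMRF precision.

    Q = tau * (kappa^2 * I + D'D), where D is the first-difference operator;
    the log-determinant and quadratic form of Q are what make the
    likelihood cheap when Q is sparse (here computed densely for clarity).
    """
    n = len(x)
    D = np.diff(np.eye(n), axis=0)             # first-difference matrix
    Q = tau * (kappa**2 * np.eye(n) + D.T @ D)
    sign, log_det = np.linalg.slogdet(Q)
    quad = x @ Q @ x
    return 0.5 * (log_det - quad - n * np.log(2.0 * np.pi))

# Usage on a small simulated field
rng = np.random.default_rng(2)
x = rng.standard_normal(30)
ll = gmrf_loglik(x, kappa=1.0, tau=2.0)
```

Adding the uncorrelated error term of the abstract destroys the sparsity of the implied precision, which is exactly the complication the speaker's methods address.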
Statistical Analysis of Remote-Sensing Datasets Using Basis-Function Representations
Matthias Katzfuss, Department of Statistics, Texas A&M University
The spatial statistical analysis of remote-sensing datasets poses several challenges. The datasets are large or even massive, which leads to computational infeasibility. Often, it is advantageous to combine ("fuse") measurements on the same or related spatial processes from several instruments, but these instruments typically exhibit different spatial footprints and measurement-error characteristics. In addition, complementary, massive datasets might be stored in different locations and are costly to move to one location, which means that the analysis must be moved to the data, instead of the other way around. I will discuss how all of these problems can be tackled using statistical models that can be written as linear combinations of spatial basis functions at multiple resolutions. These basis functions can represent arbitrary processes, allow change-of-support, and enable scalable, parallel, and distributed computations.
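The basis-function idea can be made concrete with a toy one-dimensional low-rank fit. This generic sketch (Gaussian radial basis functions, a single resolution, an assumed ridge penalty) is far simpler than the multi-resolution models of the talk, but it shows why the approach scales: all linear algebra is r x r for r basis functions, with r much smaller than n.

```python
import numpy as np

def basis_matrix(s, centers, scale):
    """Gaussian radial basis functions (one possible basis choice)."""
    return np.exp(-((s[:, None] - centers[None, :]) ** 2) / (2 * scale**2))

# Low-rank spatial model: y = B c + e, with r = 15 basis functions, n = 200.
rng = np.random.default_rng(1)
s = np.sort(rng.uniform(0.0, 10.0, 200))
y = np.sin(s) + 0.1 * rng.standard_normal(200)
centers = np.linspace(0.0, 10.0, 15)
B = basis_matrix(s, centers, scale=0.8)
lam = 1e-3                                   # ridge penalty (assumed value)
# Penalized least squares: only a 15 x 15 system to solve
c = np.linalg.solve(B.T @ B + lam * np.eye(len(centers)), B.T @ y)
y_hat = B @ c
```

Because the model is linear in the coefficients, the normal-equation pieces `B.T @ B` and `B.T @ y` can be accumulated separately per data chunk, which is what enables the distributed "move the analysis to the data" computations mentioned above.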
Analyzing Spatial Data Locally
Tailen Hsing, Department of Statistics, University of Michigan
Stationarity is a common assumption in spatial statistics. The justification is often that stationarity is a reasonable approximation if data are collected "locally." In this talk we first review various known approaches for modeling nonstationary spatial data. We then examine the notion of local stationarity in more detail. In particular, we will consider a nonstationary model whose covariance behaves like the Matern covariance locally and an inference approach for that model based on gridded data.
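For reference, the Matern covariance mentioned above has a simple closed form at smoothness nu = 3/2; a nonstationary, "locally Matern" model would let the variance and range parameters vary slowly over space. The defaults below are placeholders.

```python
import numpy as np

def matern32(d, sigma2=1.0, rho=1.0):
    """Matern covariance with smoothness nu = 3/2 (closed form).

    C(d) = sigma2 * (1 + sqrt(3) d / rho) * exp(-sqrt(3) d / rho)
    sigma2 is the variance and rho the range; values here are placeholders.
    """
    a = np.sqrt(3.0) * np.asarray(d) / rho
    return sigma2 * (1.0 + a) * np.exp(-a)
```

The smoothness parameter controls the local behavior of the process, which is why a nonstationary covariance that is "Matern-like locally" is a natural target for inference from gridded data.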
After the seminar sessions you are welcome to join the group for Happy Hour at Under the Sun at 5:00pm
We look forward to your participation!
Dorit Hammerling and Doug Nychka, NCAR organizers
Join us for a summer, all-staff party TODAY at 3:00 on the patio at Center Green 1! Appetizers and drinks will be served. Bring your families and take some time to relax with your colleagues!
Please plan to attend!
Let's call it a day at 3:00 pm at CG1!
The New Sunspot Series: Methods, Results, Implications, Opposition
We have reconstructed the sunspot group count, not by comparisons with other reconstructions and correcting those where they were deemed to be deficient, but by a re-assessment of original sources. The resulting series is a pure solar index and does not rely on input from other proxies, e.g. radionuclides, auroral sightings, or geomagnetic records. 'Backboning' the data sets, our chosen method, provides substance and rigidity by using long-time observers as a stiffening backbone. Solar activity, as defined by the Group Number, appears to reach and sustain for extended intervals of time the same level in each of the last three centuries since 1700, and the past several decades do not seem to have been exceptionally active, contrary to what is often claimed.
Solar Extreme Ultraviolet (EUV) radiation creates the conducting E-layer of the ionosphere, mainly by photoionization of molecular oxygen. Solar heating of the ionosphere creates thermal winds which, by dynamo action, induce an electric field driving an electric current with a magnetic effect observable on the ground, as was discovered by G. Graham in 1722. The current rises and sets with the Sun and thus causes a readily observable diurnal variation of the geomagnetic field, allowing us to deduce the conductivity, and thus the EUV flux, as far back as reliable magnetic data reach. High-quality data go back to the 'Magnetic Crusade' of the 1830s, and less reliable, but still usable, data are available for portions of the hundred years before that. J.R. Wolf and, independently, J.-A. Gautier discovered the dependence of the diurnal variation on solar activity, and today we understand and can invert that relationship to construct a reliable record of the EUV flux from the geomagnetic record. We compare that to the F10.7 flux and the sunspot number, and find that the reconstructed EUV flux reproduces the F10.7 flux with great accuracy. On the other hand, it appears that the Relative Sunspot Number as currently defined is beginning to no longer be a faithful representation of solar magnetic activity, at least as measured by the EUV and related indices. The reconstruction suggests that the EUV flux reaches the same low (but non-zero) value at every sunspot minimum (possibly including Grand Minima), representing an invariant 'solar magnetic ground state'.
And last, but not least: Update on progress since this was written [June 16, 2015].
Speaker: Michael Scheuerer
Date: July 22, 2015
Place: FL 2 – Rm 1001
Title: Statistical Post-Processing of GEFS Ensemble Forecasts
for Precipitation Accumulations
Authors: Michael Scheuerer and Thomas M. Hamill
We present and compare two different methods for statistical post-processing of ensemble precipitation forecasts, which are developed and demonstrated with GEFS precipitation reforecasts over the conterminous United States and verified against 1/8-degree climatology-calibrated precipitation analyses.
The first approach is non-parametric and forms, for each location, a new ensemble from the analyzed precipitation amounts by identifying dates in the past that had reforecasts similar to today's forecast. A variant of this method is presented that augments the data at each location by finding supplemental locations with similar characteristics (climatology, terrain, etc.). The resulting increase in training data will be shown to be particularly beneficial for the probabilistic prediction of rare events.
As a second approach we consider a parametric method that generates full predictive probability distributions for precipitation accumulations by fitting shifted, left-censored gamma distributions to statistics of the raw ensemble forecasts. This distribution type is shown to be adequate for modeling the distribution of analyzed precipitation accumulations given the ensemble forecasts both in situations with good predictability (e.g. at short lead times) and decreased predictability (e.g. at longer lead times or during summer months).
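The shifted, left-censored gamma construction can be sketched concretely: precipitation is modeled as Y = max(0, G + shift), where G is gamma-distributed and the shift is negative, so a point mass at zero arises naturally. The parameter values below are illustrative, not fitted GEFS statistics.

```python
from scipy.stats import gamma

def csgd_probs(y, shape, scale, shift):
    """Shifted, left-censored gamma predictive distribution (sketch).

    Y = max(0, G + shift), G ~ Gamma(shape, scale), shift < 0.
    The probability of zero precipitation is P(G <= -shift), and
    P(Y > y) for y > 0 is the gamma survival function at y - shift.
    """
    p_zero = gamma.cdf(-shift, a=shape, scale=scale)      # P(Y = 0)
    p_exceed = gamma.sf(y - shift, a=shape, scale=scale)  # P(Y > y)
    return p_zero, p_exceed

# Usage with illustrative parameters: chance of exceeding 5 mm
p0, pex = csgd_probs(5.0, shape=1.2, scale=4.0, shift=-1.0)
```

In the method of the talk, the three distribution parameters would be regressed on statistics of the raw ensemble; here they are simply fixed to show the censoring mechanics.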
Probabilistic forecasts by both methods will be verified using common metrics (skill, reliability, and so forth). We study how the different ensemble statistics and non-linear components in the parametric approach contribute to its performance, and we discuss, for both methods, the effect of training sample size on the reliability and resolution of the post-processed predictions.
Michael Scheuerer is a Research Scientist at the Cooperative Institute for Research in Environmental Sciences (CIRES) at CU Boulder and the Earth System Research Laboratory of the National Oceanic and Atmospheric Administration (NOAA). He received his graduate degree in mathematics from Bayreuth University, Germany, and his Ph.D. in mathematical statistics from Göttingen University, Germany. His research focuses on probabilistic weather forecasting with emphasis on statistical calibration of ensemble weather predictions. In addition, he is developing methods for the evaluation of forecast performance.
This seminar will be available to view via webcast;
UCAR Connect Link & Recorded
University Corporation for Atmospheric Research
Boulder, Colorado, USA
Accurate prediction of tropical cyclogenesis by global models is a significant forecast challenge, largely because of the lack of observations in the lower troposphere over the tropical oceans and deficiencies in model physics. The atmospheric limb sounding technique, which makes use of radio signals transmitted by global navigation satellite systems, has evolved as a robust global observing system, providing very accurate measurements at a high vertical resolution. The water vapor information contained in such measurements within the tropical lower troposphere can be very valuable for the analysis and prediction of tropical cyclones.
In this talk we present a detailed analysis of the impact of GPS radio occultation (RO) data on the prediction of the genesis of Typhoon Nuri (2008), which was observed during the field phase of T-PARC (THORPEX Pacific Asia Regional Campaign) over the western North Pacific. Typhoon Nuri was declared a tropical storm at 1800 UTC 16 August 2008, and its incipient disturbance can be tracked more than ten days prior to tropical storm formation. Nuri was a challenging case for numerical model predictions. The NCEP GFS model failed to predict the tropical storm in operation. The WRF-ARW model, initialized with either the NCEP or ECMWF global analysis, also failed to predict Nuri’s genesis. However, with the assimilation of GPS RO soundings obtained from COSMIC and other missions, using a two-dimensional observation operator, the moisture analysis was substantially improved. The improved moisture analysis contributed to a more accurate prediction of the convection associated with the pre-Nuri disturbance, which developed into a robust mid-level vortex as it interacted with an upper-level potential vorticity streamer. This mid-level vortex was responsible for the subsequent formation of Nuri through its interaction with convective and boundary layer processes. Without the assimilation of GPS RO data, the convection was suppressed, the mid-level vortex did not develop, and the model failed to predict the genesis of Nuri. Further experiments on nine additional typhoons over the western North Pacific between 2008-2010 showed that the assimilation of GPS RO data substantially improved the model’s ability to forecast tropical cyclogenesis.
The joint Taiwan-U.S. FORMOSAT-7/COSMIC-2 mission will be launched in September 2016, and is expected to provide ~5000 RO soundings per day over the tropics, an order of magnitude more soundings than COSMIC. With an improved design in the RO receiver and antenna, the RO data from COSMIC-2 will also be of much higher quality than COSMIC, particularly in the tropical lower troposphere. The results from this study suggest that COSMIC-2 will have a significant impact on the prediction of tropical cyclogenesis.
This seminar will be webcast live at:
Recorded seminar link can be viewed here:
Thursday, 23 July 2015, 3:30 PM
Refreshments 3:15 PM
3450 Mitchell Lane
Bldg 2 Main Auditorium, Room 1022
The Weather Research and Forecasting (WRF) model Summer Tutorial will be offered during a two week period from 27 July - 7 August 2015 at the NCAR Foothills Laboratory, Boulder, Colorado.
The tutorial will be divided into 3 sessions. Participants can attend any combination of these sessions:
1. Basic WRF Tutorial (27-31 July 2015) The tutorial will consist of lectures on various components of the WRF modeling system along with hands-on practice sessions. The topics include:
a. WRF Pre-processing System
b. WRF Dynamics and Numerics
c. WRF Physics
d. Software Framework
e. Post-processing and Graphical Tools
2. WRF Chem Tutorial (3-4 August 2015) A WRF-ARW-only chemistry tutorial will be offered for 2 days during the second week of the tutorial. Topics include:
a. General overview of WRF Chemistry
b. Chemistry options
c. Using chemical sources in simulation
Users may attend this tutorial without attending the basic WRF tutorial; however, a basic understanding of the WRF model is required in order to make full use of the WRF-Chem tutorial.
3. WRFDA Tutorial (5-7 August 2015) A WRF-ARW-only data assimilation tutorial (including hands-on practice) will be offered for 2 and a half days during the second week of the tutorial. The topics will include:
a. Compiling WRFDA for 3DVAR/4DVAR
b. Converting and assimilating conventional data using OBSPROC
c. Assimilating radiance data using RTM
d. Background error estimation using GEN_BE
e. Hybrid variational/ensemble assimilation
f. Analysis/forecast verification
Users may attend this tutorial without attending the basic WRF tutorial; however, a basic understanding of the WRF model is required to make full use of the WRFDA tutorial.
The Graduate School of Mathematics at Nagoya University, Japan, hosted a meeting on turbulence in 2014 entitled "Fundamental Aspects of Geophysical Turbulence." The workshop touched on turbulence in the atmosphere, the ocean, and the Sun, and covered a broad range of scales, topics, and approaches. See the following link for more information: http://www.math.nagoya-u.ac.jp/en/research/conference/2013/geophys-turbu.html
The meeting environment was casual, interactive, and stimulating - attendees and organizers completely enjoyed the event. Thus, we wish to continue the dialogue at the National Center for Atmospheric Research (NCAR), in Boulder, CO in the summer of 2015.
August 11 - 14, 2015
3450 Mitchell Lane, Boulder, Colorado
GSI & EnKF - starts Tuesday morning, August 11, and runs through August 14, 2015
EnKF Only - starts Thursday afternoon, August 13, and runs through August 14, 2015
GSI knowledge is required for the EnKF Only Session
The combined Gridpoint Statistical Interpolation and Ensemble Kalman Filter (GSI/EnKF) Community Data Assimilation System Tutorial will be offered at the NCAR Foothills Laboratory, in Boulder, Colorado on August 11-14, 2015. This will be the sixth Community GSI tutorial, but the first time EnKF will be included.
GSI is the operational data assimilation (DA) system used by various national operational and research centers, including NOAA and the National Aeronautics and Space Administration (NASA). It is traditionally a three-dimensional variational DA system and has been extended with advanced features, including the hybrid ensemble-variational data assimilation technique and the four-dimensional EnVar technique.
The EnKF system is a Monte-Carlo algorithm for data assimilation that uses an ensemble of short-term forecasts to estimate the background-error covariance in the Kalman Filter. The EnKF uses the observation operators in the GSI system to transform model variables to observed variables in observation space. Therefore, the types of observations available for use in the EnKF match those for the GSI. Currently this EnKF is running operationally as part of the GSI based hybrid data assimilation system for the National Centers for Environmental Prediction (NCEP) global applications.
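The EnKF analysis step described above can be sketched for a single scalar observation. This is a generic textbook (perturbed-observation) formulation, not the operational GSI/EnKF code; the state size, ensemble size, and error variance below are arbitrary.

```python
import numpy as np

def enkf_update(X, y_obs, H, r, rng):
    """Stochastic (perturbed-observation) EnKF analysis step.

    X: (n_state, n_ens) forecast ensemble
    y_obs: scalar observation; H: (n_state,) linear observation operator
    r: observation-error variance
    """
    n_ens = X.shape[1]
    Hx = H @ X                              # ensemble in observation space
    Xp = X - X.mean(axis=1, keepdims=True)  # state perturbations
    Hxp = Hx - Hx.mean()                    # obs-space perturbations
    # Ensemble estimates of the covariances entering the Kalman gain
    pxh = (Xp @ Hxp) / (n_ens - 1)          # cov(x, Hx), shape (n_state,)
    phh = (Hxp @ Hxp) / (n_ens - 1)         # var(Hx), scalar
    K = pxh / (phh + r)                     # Kalman gain
    # Perturb the observation once per member, then update each member
    y_pert = y_obs + np.sqrt(r) * rng.standard_normal(n_ens)
    return X + np.outer(K, y_pert - Hx)

# Usage: a 3-variable state, 200 members, one observation of variable 0
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 200))           # forecast ensemble about zero
H = np.array([1.0, 0.0, 0.0])
Xa = enkf_update(X, y_obs=2.0, H=H, r=0.25, rng=rng)
```

Note how the observation operator `H` is the only place model variables are mapped to observation space, mirroring the EnKF's reliance on the GSI observation operators described above.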
The combined GSI/EnKF Community Tutorial will be held over the four days of August 11-14. The GSI/EnKF tutorial will consist of both lectures and hands-on practical exercises. The lecturers are invited from various GSI and EnKF development/support teams, including NCEP/EMC, NASA/GMAO, NOAA/GSD, NCAR/MMM, and DTC. The practical sessions will provide the necessary skills to run both the GSI and EnKF systems for both basic and advanced implementations. The tutorial will be tailored to the upcoming code release (GSI and EnKF) scheduled for June 2015.
The tutorial consists of a combination of classroom lectures and a hands-on practical session. We offer a choice of two registration options:
Full GSI/EnKF Tutorial (4 days): $400 (includes lunch and refreshments)
EnKF Only Tutorial (1.5 days): $130 (includes lunch and refreshments)
There will be NO refunds for cancellations made at or after 3 PM MT on Friday, July 31st, 2015. Prior to the July 31st deadline, we will refund the registration fee, less $25.00 to cover administrative costs.
To register, use the RegOnline link (https://www.regonline.com/2015gsicommunitytutorialcopycopy). More information can be found on the tutorial webpage (http://www.dtcenter.org/com-GSI/users/tutorials/2015.php).
If you have any questions or are unable to register please e-mail: MaryBeth Zarlingo (firstname.lastname@example.org)
- Due to seating limitations, registration is limited to 40 participants.
- 31 July 2015: Last day to register if you need a temporary account on NCAR's Yellowstone computer for the hands-on practical exercises.
- Before 3 PM MT on Friday 31 July 2015, we will refund your registration fee less $25.00 to cover administrative fees.
- After 31 July 2015, there will be no refunds.
We are looking forward to your attendance!
The Engineering for Climate Extremes Partnership (ECEP) is a collaboration among industry, commerce, society, academia, and government, facilitated by NCAR, with the goal of developing robust, well-communicated predictions and advice on the impacts of weather and climate extremes to support resilient decision-making. (www.ecep.ucar.edu)
Please join us at a Workshop for the Engineering for Climate Extremes Partnership, to be held at NCAR, Boulder, Colorado, on August 19-21, 2015.
ECEP arose out of a series of cross-disciplinary meetings and was formally launched in late 2014. The aim of this year’s meeting is to review the first year of the Partnership, and to identify future strategic directions.
The workshop will provide the opportunity for community members to share best practices and to provide feedback on their priorities for research and information needs, in order to facilitate decisions and to enhance community resilience to weather and climate extremes. Posters are invited, and several will be selected for oral presentation.
Participants include representatives of Indigenous American communities, the re/insurance sector, local government, industry and business, and national and international universities.