Seminars

Seminars / Informal seminars / Lectures by ECMWF Staff and Invited Lecturers

Seminars contribute to our ongoing educational programme and are tailored to the interests of the ECMWF scientific community.

Informal seminars are held throughout the year on a range of topics. Seminars vary in duration, depending on the area covered, and are given by subject specialists. As with the annual seminar, these may be ECMWF staff members or invited lecturers.

The following is a listing of seminars and lectures that have been given this year on topics of interest to the ECMWF scientific community. See also our past informal seminars.

2019

28 May
at 10:30

Room: LT

Improving atmospheric reanalyses for historical extreme events by rescuing lost weather observations

Speaker: Ed Hawkins (University of Reading)

Abstract

Our understanding of past changes in weather and climate relies on the availability of observations made over many decades. However, billions of historical weather observations are effectively lost to science because they are still only available in their original paper form in various archives around the world. The large-scale digitisation of these observations would substantially improve atmospheric reanalyses back to the 1850s. Recently, volunteer citizen scientists have been assisting with the rescue of millions of these lost observations, taken across western Europe over a hundred years ago. The value of these data for understanding many notable and extreme weather events will be demonstrated.

21 March
at 10:30

Room: LT

Constraining Stochastic Parametrization Schemes using High-Resolution Model Simulations

Speaker: Hannah Christensen (Oxford University)

Abstract

Stochastic parametrizations are used in weather and climate models to represent model error. Designing new stochastic schemes has been the target of much innovative research over the last decade, with a focus on developing physically motivated approaches. We present a technique for systematically deriving new stochastic parametrizations or for constraining existing stochastic parametrizations. We take a high-resolution model simulation and coarse-grain it to the desired forecast model resolution. This provides the initial conditions and forcing data needed to drive a Single Column Model (SCM). By comparing the SCM parametrized tendencies with the evolution of the high-resolution model, we can measure the ‘error’ in the SCM tendencies. As a case study, we use this approach to assess the physical basis of the widely used ‘Stochastically Perturbed Parametrization Tendencies’ (SPPT) scheme using the IFS SCM. We provide justification for the multiplicative nature of SPPT, and for the large temporal and spatial scales used in the stochastic perturbations. However, we also identify issues with the SPPT scheme and motivate improvements. In particular, our results indicate that independently perturbing the tendencies associated with different parametrization schemes is justifiable, but that an alternative approach is needed to represent uncertainty in the convection scheme. It is hoped this new coarse-graining technique will improve both holistic and process-based approaches to stochastic parametrization.
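
For readers less familiar with the scheme, SPPT multiplies the net parametrized tendency by a correlated random pattern; schematically (our notation, not necessarily that used in the talk):

    \frac{\partial X}{\partial t} = D(X) + (1 + r)\,\sum_j P_j(X),

where D is the resolved dynamics, the P_j are the tendencies from the individual parametrization schemes, and r is a zero-mean random field with prescribed spatial and temporal correlation scales. The coarse-graining comparison asks which ingredients of this form -- the multiplicative noise, the use of a single pattern r across all schemes, and the chosen correlation scales -- are supported by the high-resolution 'truth'.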

20 March
at 10:30

Room: LT

About novel time integration methods for weather and climate simulations

Speaker: Martin Schreiber (Technical University of Munich)

Abstract

Weather and climate simulations face new challenges due to changes in computer architectures caused by physical limitations. From a pure computing perspective, algorithms are required to cope with stagnating or even decreasing per-core speed and increasing on-chip parallelism. Although this leads to an increase in overall on-chip compute performance, data movement is increasingly becoming the most critical limiting factor. All in all, these trends will continue and have already led to research on partly disruptive mathematical and algorithmic reformulations of dynamical cores, e.g. using (additional) parallelism in the time dimension.

This presentation provides an overview and introduction to the variety of newly developed and evaluated time integration methods for dynamical cores, all aimed at improving the ratio of wall clock time to error:

First, I will begin with rational approximations of exponential integrator methods in their various forms: Terry Haut's rational approximation of exponential integrators (T-REXI), Cauchy contour integral methods (CI-REXI) in the complex plane and their relationship to Laplace transforms, and diagonalized Butcher tableaux (B-REXI).
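
As background (schematic notation of our own, not the speaker's), all of these REXI variants replace the exponential of the linear operator L by a sum of rational terms,

    e^{\tau L} U_0 \approx \sum_{n=1}^{N} \beta_n \, (\tau L + \alpha_n I)^{-1} U_0,

where the complex coefficients \alpha_n and \beta_n depend on the particular variant. Each term is an independent linear solve, so the sum can be evaluated in parallel, which is where much of the potential for reducing time-to-solution comes from.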

Second, semi-Lagrangian (SL) methods are often used to overcome limitations on stable time step sizes induced by nonlinear advection. These methods show superior properties in terms of dispersion accuracy, and we have exploited this property with the Parareal parallel-in-time algorithm. In addition, a combination of SL with REXI is discussed, including the challenges that arise from the Lagrangian formulation.

Third, the multi-level time integration of spectral deferred corrections (ML-SDC) will be discussed, focusing on the truncation of nonlinear interactions induced by the multi-level hierarchy and the importance of viscosity in this context. Building on this, the "Parallel Full Approximation Scheme in Space and Time" (PFASST) adds parallelism in time that allows even greater acceleration of the time-to-solution compared to ML-SDC and traditional time integration methods.

All studies were mainly conducted based on the shallow water equations (SWE) on the f-plane and the rotating sphere to investigate horizontal aspects of dynamical cores for weather and climate simulation. Overall, our results motivate further investigation and combination of these methods for operational weather/climate systems.

(With contributions from Jed Brown, Francois Hamon, Richard Loft, Michael Minion, Matthew Normile, Nathanaël Schaeffer, Andreas Schmitt and Pedro S. Peixoto, among others.)

12 March
at 11:15

Room: CC  

Running serverless HPC workloads on top of Kubernetes and Jupyter notebooks

Speaker: Christopher Woods (University of Bristol)

6 March
at 10:30

Room: LT

Trends in data technology: opportunities and challenges for Earth system simulation and analysis

Speaker: V Balaji (Princeton University and NOAA/GFDL)

Abstract

Earth system modeling, since its origin at the dawn of modern computing, has operated at the very limits of technological possibility. This has led to tremendous advances in weather forecasting, and to the use of models to project climate change, both for understanding the Earth system and in service of downstream science and policy. In this talk, we examine changes in the underlying technology, including the physical limits of miniaturization and the emergence of a deep memory and storage hierarchy, which make "business as usual" approaches to simulation and analysis appear somewhat risky. We look simultaneously at trends in Earth system modeling, in terms of the evolution of globally coordinated climate science experiments (CMIP-IPCC) and the emergence of "seamless prediction", blurring the boundaries between weather and climate. Together, these point to new directions of research and development in data software and data science. Innovative and nimble approaches to analysis will be needed. Yesterday's talk examines this in the context of computational science and software, but it seems apparent that computation and data are inseparable problems, and a unified approach is indicated.

6 March
at 14:00

Room: LT

Statistics for Natural Science in the Age of Supercomputers

Speaker: Ritabrata Dutta (University of Warwick)

Abstract

To explain the fascinating phenomena of nature, natural scientists develop complex models which can simulate these phenomena close to reality. The hard question, however, is how to calibrate these models given real-world observations. Traditional statistical methods are handicapped in this setting, as we cannot evaluate the likelihood functions of the parameters of these models. Over the last decade or so, statisticians' answer to this question has been approximate Bayesian computation (ABC), in which the parameters are calibrated by comparing simulated and observed data sets in a rigorous manner. However, it only became possible to apply ABC to realistic, and hence complex, models when it was efficiently combined with high-performance computing (HPC). In this work, we focus on this aspect of ABC, showing how it was able to calibrate expensive models of epidemics on networks, of molecular dynamics, of platelet deposition in blood vessels, of passenger queues in airports and of volcanic eruptions. This was achieved using standard MPI parallelization, nested MPI parallelization or nested CUDA parallelization inside MPI. Finally, we raise and discuss open questions regarding how to evolve and strengthen these inferential methods when each model simulation takes a full day, or resources equivalent to the best supercomputers of today.
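
As a rough illustration of the core idea, here is a generic rejection-ABC sketch in Python (simulate_model, sample_prior, summary and the tolerance eps are placeholder names of our own, not the speaker's software):

    import numpy as np

    def rejection_abc(observed, simulate_model, sample_prior, summary, eps, n_samples):
        # Keep parameter draws whose simulated data lie within eps of the observations.
        s_obs = summary(observed)
        accepted = []
        while len(accepted) < n_samples:
            theta = sample_prior()              # candidate parameters drawn from the prior
            simulated = simulate_model(theta)   # run the (possibly very expensive) simulator
            if np.linalg.norm(summary(simulated) - s_obs) < eps:
                accepted.append(theta)          # close enough to the observations: keep it
        return np.array(accepted)

Because every simulator call is independent, the loop parallelizes naturally, which is why the MPI and CUDA combinations mentioned above are so effective for expensive models.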

5 March
at 10:30

Room: LT

Machine learning and the post-Dennard era of climate simulation

Speaker: V Balaji (Princeton University and NOAA/GFDL)

Abstract

In this talk, we examine approaches to Earth system modeling in the post-Dennard era, inspired by the industry trend toward machine learning (ML). ML presents a number of promising pathways, but there remain challenges specific to introducing ML into multi-phase, multi-physics modeling. A particular aspect of such multi-phase, multi-physics models that is under-appreciated is that they are built using a combination of local process-level and global system-level observational constraints, for which the calibration process itself remains a substantial computational challenge. The challenges include, among others: the non-stationary and chaotic nature of climate time series; the presence of climate subsystems where the underlying physical laws are not completely known; and the imperfect calibration process alluded to above. The talk will present ideas and challenges for the future of Earth system models as we prepare for a post-Dennard future, where learning methods are poised to play an increasingly important role.

21 January
at 11:00

Room: LT

ESSPE: Ensemble-based Simultaneous State and Parameter Estimation for Earth System Data-Model Integration and Uncertainty Quantification

Speaker: Fuqing Zhang (Pennsylvania State University)

Abstract

Building on advanced data assimilation techniques, we advocate developing and applying a generalized data assimilation software framework for Ensemble-based Simultaneous State and Parameter Estimation (ESSPE) that will facilitate data-model integration and uncertainty quantification for the broad earth and environmental science communities. These include, but are not limited to, atmospheric composition and chemistry, land surface, hydrology and biogeochemistry, for which many of the physical and chemical processes in the respective dynamical system models rely heavily on parametrizations. By augmenting the state vector with uncertain model parameters, the ESSPE framework allows for simultaneous state and parameter estimation through assimilating in-situ measurements, such as those from critical-zone ground-based observational networks, and/or remotely sensed observations, such as those from radars and satellites. Beyond data-model integration and uncertainty quantification, examples will be given of applying the ESSPE framework, through systematically designed ensemble sensitivity analysis, to: (1) identify key physical processes and their significance/impacts, and better represent and parameterize these processes in dynamical models of various earth systems; (2) design better observation strategies by locating the optimum sensitive regions, periods and variables to be measured, and the minimum accuracies and frequencies of these measurements required to quantify the physical processes of interest, and explore the impacts of heterogeneity and equifinality; (3) understand the predictability and nonlinearity of these processes, and parameter identifiability; and (4) facilitate the upscale cascading of knowledge from smaller-scale process understanding to larger-scale simplified representation and parametrization. I will end the presentation with an introduction to preliminary findings from our ongoing collaborations with ECMWF on using the data assimilation methodology to identify potential deficiencies in the convective gravity wave drag parametrization that led to stratospheric temperature biases in the operational model, and on potential pathways for using ESSPE to improve model physics in the future.
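
Schematically (our notation, not the speaker's), parameter estimation by state augmentation appends the uncertain parameters \theta to the model state x and applies a standard ensemble update to the augmented vector:

    z = \begin{pmatrix} x \\ \theta \end{pmatrix}, \qquad
    z^a = z^f + K\,(y - H z^f), \qquad
    K = P^f H^{\mathsf T} (H P^f H^{\mathsf T} + R)^{-1},

where y are the observations, H the observation operator (acting only on the state part of z), R the observation-error covariance, and P^f the forecast covariance of z sampled from the ensemble. The parameters are corrected through the ensemble cross-covariances between \theta and the observed quantities, which is what allows observations of the state to constrain parametrization parameters.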

25 January
at 10:30

Room: LT

Windstorm and Cyclone Events: Atmospheric Drivers, Long-term Variability and Skill of Current Seasonal Forecasts

Speaker: Daniel J Befort (University of Oxford)

Abstract

In this study, observed long-term variability of windstorm events is analysed using two state-of-the-art 20th-century reanalyses, ERA-20C and NOAA-20CR. Long-term trends partly differ drastically between the two datasets. These differences are largest for the early 20th century, with better agreement for the past 30 years. However, short-term variability on sub-decadal time scales is in much better agreement, especially over parts of the northern hemisphere. This suggests that these datasets are useful for analysing drivers of interannual variability of windstorm events, as they extend the time period covered by more common reanalyses such as ERA-Interim.

ERA-20C is used to analyse the relationship between atmospheric and oceanic conditions and windstorm frequency over the European continent. It is found that large parts of the interannual variability can be explained by a few atmospheric patterns, including the North Atlantic Oscillation (NAO) and the Scandinavian pattern. This suggests that it is crucial to capture these atmospheric modes of variability, e.g. in seasonal forecast systems, in order to represent windstorm variability over Europe reasonably well.

The skill in windstorms and cyclones is analysed for three modern seasonal forecast systems: ECMWF-S3, ECMWF-S4 and GloSea5. Whereas skill for cyclones is generally small, significant positive skill of ECMWF-S4 and GloSea5 is found for windstorms over the eastern North Atlantic and western Europe. In addition to analysing skill in windstorms using a dedicated tracking algorithm, it is also tested to what extent the NAO can be used as a predictor of their variability. Results suggest that using the NAO adds some skill over northern Europe; however, using the full model information by tracking windstorm events is superior over large parts of the eastern Atlantic and western Europe.

7 January
at 10:30

Room: LT

When fossil fuel emissions are no longer perfect in atmospheric inversion systems

Speaker: Thomas Lauvaux (LSCE, Saclay, France)

Abstract

The biogenic component of greenhouse gas fluxes remains the primary source of uncertainty in global and regional inversion systems. However, recent results suggest that anthropogenic greenhouse gas emissions from fossil fuel use, so far assumed perfect at all scales, represent a larger fraction of the uncertainties in these systems and can no longer be ignored. Inversion systems capable of reducing fossil fuel uncertainties are discussed in parallel with planned observing systems deployed across the world and in space. The remaining challenges and recent advances are presented, with the aim not only of inferring fossil fuel emissions but also of providing support to climate policy makers at national and local scales.

 

Uncertainty quantification of pollutant source retrieval: comparison of Bayesian methods with application to the Chernobyl and Fukushima Daiichi accidental releases of radionuclides

Speaker: M Bocquet (CEREA, France)

Abstract

Inverse modeling of the emissions of atmospheric species and pollutants has progressed significantly over the past fifteen years. However, in spite of seemingly reliable estimates, the retrievals are rarely accompanied by an objective estimate of their uncertainty, except when Gaussian statistics are assumed for the errors, which is often unrealistic. I will describe rigorous techniques meant to compute this uncertainty in the context of the inverse modeling of the time-dependent emission rates -- the so-called source term -- of a point-wise atmospheric tracer. Lognormal statistics are used for the positive source term prior and possibly the observation errors, which precludes simple solutions based on Gaussian statistics.
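
For orientation, a schematic of our own (assuming a linear source-receptor relationship y = Hx + \epsilon and, for simplicity, Gaussian observation errors): a lognormal prior on the positive source term x leads to a non-quadratic cost function of the form

    J(x) = \tfrac{1}{2}\,(y - Hx)^{\mathsf T} R^{-1} (y - Hx)
         + \tfrac{1}{2}\,(\ln x - \ln x_b)^{\mathsf T} B^{-1} (\ln x - \ln x_b),

with the logarithm applied componentwise. The hyperparameters entering R and B are what the empirical Bayesian step estimates, and the resulting non-Gaussian posterior is what the sampling methods described in the next paragraph are designed to explore.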

Firstly, through the so-called empirical Bayesian approach, parameters of the error statistics -- the hyperparameters -- are estimated by maximizing the observation likelihood via an expectation-maximization algorithm. This enables the estimation of an objective source term.  Then, the uncertainties attached to the total mass estimate and the source rates are estimated using four Monte Carlo techniques: (i) an importance sampling based on a Laplace proposal, (ii) a naive randomize-then-optimize (RTO) sampling approach, (iii) an unbiased RTO sampling approach, (iv) a basic Markov chain Monte Carlo (MCMC) simulation. Secondly, these methods are compared to a full Bayesian hierarchical approach, using an MCMC based on a transdimensional representation of the source term to reduce the computational cost.

I will apply those methods, and improvements thereof, to the estimation of the atmospheric caesium-137 source terms from the Chernobyl nuclear power plant accident in April/May 1986 and the Fukushima Daiichi nuclear power plant accident in March 2011.

LT = Lecture Theatre, LCR = Large Committee Room, MZR = Mezzanine Committee Room,
CC = Council Chamber