Seminars


Seminars / Informal seminars / Lectures by ECMWF Staff and Invited Lecturers

Seminars contribute to our ongoing educational programme and are tailored to the interests of the ECMWF scientific community.

Informal seminars are held throughout the year on a range of topics. Seminars vary in their duration, depending on the area covered, and are given by subject specialists. As with the annual seminar, this may be an ECMWF staff member or an invited lecturer.

The following is a listing of seminars/lectures that have been given this year on topics of interest to the ECMWF scientific community. See also our past informal seminars.

2019

4 July
at 10:30

Room: LT

Forecast Evaluation of Set-Valued Properties

Speaker: Tobias Fissler (Imperial College, London)

Abstract

In forecast evaluation, one distinguishes two major tasks: forecast validation (or verification) and forecast comparison. While the former is commonly performed with identification functions (for point forecasts) or other diagnostic tools of calibration such as the Probability Integral Transform (for probabilistic forecasts), the latter task utilises scoring functions in the case of point forecasts and scoring rules for probabilistic forecasts. It is a widely accepted paradigm that these scoring functions (rules) should be consistent (proper) in that they honour correctly specified forecasts, thus incentivising truthful reporting under risk-neutrality.
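For background (standard definitions, not taken verbatim from the abstract): a scoring function S is consistent for a functional T, and a scoring rule R is proper, if correctly specified reports minimise the expected score under every distribution F in the class considered,

```latex
\mathbb{E}_{Y\sim F}\!\left[S\!\left(T(F),Y\right)\right] \;\le\; \mathbb{E}_{Y\sim F}\!\left[S(x,Y)\right] \quad \text{for all point reports } x,
\qquad
\mathbb{E}_{Y\sim F}\!\left[R(F,Y)\right] \;\le\; \mathbb{E}_{Y\sim F}\!\left[R(G,Y)\right] \quad \text{for all distributions } G.
```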

The statistical, climatological and financial literature has seen remarkable contributions to the evaluation of real- or vector-valued properties as well as predictive distributions for real- and vector-valued quantities. On the other hand, the case of set-valued properties has received considerably less attention.

Acknowledging the spatial structure of many climatological and meteorological phenomena of interest, we introduce a theoretical framework for sound forecast evaluation of set-valued quantities such as the expected area of precipitation or confidence regions for floods. To this end, we suggest the formal distinction between a selective notion, where forecasters are content with specifying a single point in the set of interest, and an exhaustive notion, where forecasters are far more ambitious and aim at correctly specifying the entire set of interest. We unveil a stark dichotomy between these two notions: a functional can be either selectively or exhaustively elicitable, but not both.

We discuss implications of this mutual exclusivity for best practice in forecast evaluation of set-valued quantities, putting an emphasis on applications in meteorology and climatology.

 

This talk is based on joint work with Jana Hlavinová and Birgit Rudloff.

28 June
at 11:00

Room: LT

Towards an unbiased stratospheric analysis

Speaker: Patrick Laloyaux (ECMWF)

Abstract

A set of newly developed diagnostics based on GPS-RO observations has revealed the presence of systematic large-scale errors in the stratospheric temperature of the IFS model. Interestingly, the amplitude of this bias has increased over the past few years in conjunction with the last horizontal resolution upgrade and the revision of the radiative scheme.

To take model biases into account, a weak-constraint 4D-Var formulation has been developed in which a model-error forcing term is explicitly estimated inside the 4D-Var minimisation. This approach reduces the bias in the analysis departures of all observations sensitive to stratospheric temperature by up to 50%. Anchoring data (accurate observations that do not require bias correction), such as GPS-RO, prove essential to the good performance of the method. Weak-constraint 4D-Var also provides a more consistent way to treat the sources of the different biases, ensuring that the Variational Bias Correction (VarBC) corrects only systematic errors from data and observation operators. In this talk, we will also start exploring the potential of the new weak-constraint 4D-Var to correct for model biases in medium-range weather forecasting and climate reanalyses.
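For reference (a generic textbook form of the forcing formulation, not the exact operational configuration), weak-constraint 4D-Var adds a model-error term to the usual cost function:

```latex
J(x_0,\eta) \;=\; \tfrac{1}{2}(x_0-x_b)^{\mathrm T}\mathbf{B}^{-1}(x_0-x_b)
\;+\; \tfrac{1}{2}\sum_{i}\big(y_i - H_i(x_i)\big)^{\mathrm T}\mathbf{R}_i^{-1}\big(y_i - H_i(x_i)\big)
\;+\; \tfrac{1}{2}\,\eta^{\mathrm T}\mathbf{Q}^{-1}\eta,
\qquad x_i = M_i(x_{i-1}) + \eta,
```

where the forcing η represents the systematic model error estimated within the minimisation and Q is its assumed error covariance; the anchoring observations mentioned above help separate this model-error estimate from the observation bias corrections handled by VarBC.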

19 June
at 15:30

Room: LT

Introduction to the New York State Mesonet

Speaker: Chris Thorncroft (University at Albany)

Abstract 

The New York State Mesonet (NYSM) is a comprehensive network of 126 environmental monitoring stations deployed statewide with an average spacing of 27 km. The primary goal of the NYSM is to provide high-quality weather data at high spatial and temporal scales to improve atmospheric monitoring and prediction, especially for extreme weather events. Completed in spring 2018, each station is equipped with a standard suite of atmospheric and soil sensors. Collectively, the network comprises 1,825 sensors with approximately 907,200 observations collected per day. Unique aspects of the NYSM include its measurement of snow depth, soil moisture and temperature, and its collection of camera images at every site. The NYSM also pioneered the building of three additional sub-networks to collect vertical profile, surface energy budget, and snow water equivalent measurements at a select number of sites across the state. The location of each station was carefully selected based upon WMO siting criteria and local requirements. Extensive metadata are made available online. All data are collected, quality-controlled, archived, and disseminated every 5 minutes. Real-time data are displayed on the web for public use, and archived data are available for download. Data are now utilized by a variety of sectors including emergency management, transportation, utilities, agriculture and education. Recent examples of the utility of the data will be shared.

18 June
at 10:30

Room: LT

NOAA-CIRES-DOE 20th Century reanalysis version “3” (1836-2015) and Prospects for 200 years of reanalysis

Speaker: Gil Compo

Abstract

The new historical reanalysis dataset generated by the Physical Sciences Division of NOAA’s Earth System Research Laboratory and the University of Colorado CIRES, the Twentieth Century Reanalysis version 3 (20CRv3), is a comprehensive global atmospheric circulation dataset spanning 1836 to present. It assimilates only surface pressure and uses monthly Hadley Centre sea ice distributions (HadISST2.3) and an ensemble of daily Simple Ocean Data Assimilation with Sparse Input (SODAsi.3) sea surface temperatures. SODAsi.3 was forced with a previous version of 20CR that itself was forced with a previous SODAsi, allowing these “iteratively-coupled” boundary conditions to be more consistent with the atmospheric reanalysis. 20CRv3 has been made possible by supercomputing resources of the U.S. Department of Energy and a collaboration with GCOS, WCRP, and the ACRE initiative. It is chiefly motivated by the need to provide an observational validation dataset, with quantified uncertainties, for assessments of climate model simulations of the 19th to 21st centuries, with emphasis on the statistics of daily weather. It uses an Ensemble Kalman Filter (EnKF) data assimilation method, with the NCEP global forecast system (GFS) numerical weather prediction (NWP) land/atmosphere model providing the background "first guess" fields. This yields, every 3 hours, a global analysis of the most likely state of the atmosphere together with the uncertainty of that analysis.
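For reference (the generic EnKF update, not a 20CRv3-specific implementation detail), the assimilated surface-pressure observations correct each background ensemble member through the ensemble-estimated Kalman gain:

```latex
x^{a} \;=\; x^{f} + \mathbf{K}\big(y - \mathcal{H}(x^{f})\big),
\qquad
\mathbf{K} \;=\; \mathbf{P}^{f}\mathbf{H}^{\mathrm T}\big(\mathbf{H}\mathbf{P}^{f}\mathbf{H}^{\mathrm T} + \mathbf{R}\big)^{-1},
```

where P^f is the sample covariance of the forecast ensemble; the spread of the updated ensemble about its mean supplies the analysis uncertainty quoted above.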

20CRv3 has several improvements compared to the previous version 2c. The analysis and the 80-member ensemble are generated with the NCEP GFS at T254 resolution (about 0.75 degrees latitude by longitude) with 64 levels in the vertical, compared to T62 (about 2 degrees latitude by longitude) and 28 vertical levels in the 56-member ensemble of 20CRv2c. This gives an improved representation of extreme events, such as hurricanes. Implementation of a “relaxation to prior” covariance inflation algorithm, combined with stochastic parameterizations in the GFS, provides quantitatively better uncertainty estimates than the previous additive inflation of 20CRv2c. An adaptive localization helps to keep the analysis from over-fitting the observations. A variational quality control system retains more observations. An incremental analysis update procedure produces a temporally smoother analysis without the spurious spin-up trends seen in 20CRv2c. Millions of additional pressure observations contained in the new International Surface Pressure Databank version 4.7, such as those from the citizen-science Oldweather.org project, also improve the analyses. These improvements result in 20CR version “3” having analyses comparable to or better than those of version 2c, as suggested by improved 6-hour forecast skill, more realistic uncertainty in near-surface air temperature, and a reduction in spurious centennial trends in the tropical and polar regions. Possibilities for 200 years of reanalysis are also discussed in light of results of test reanalyses of the 1816 “Year without a Summer”.

Gilbert P. Compo (1,2), Laura Slivinski (1,2), Jeffrey S. Whitaker (2), Prashant D. Sardeshmukh (1,2), Benjamin S. Giese (3), Philip Brohan (4), Rob Allan (4)

(1) CIRES, University of Colorado, USA (compo@colorado.edu); (2) Physical Sciences Division, Earth System Research Laboratory, NOAA, USA; (3) Department of Oceanography, Texas A&M University, USA; (4) Met Office Hadley Centre, Exeter, UK

14 June
at 10:30

Room: LT

Ocean Waves as a Missing Link Between Atmosphere and Ocean

Speaker: Alex Babanin (University of Melbourne, Australia)

Abstract

The role of ocean waves as a link between the ocean and atmosphere will be discussed. It is rapidly becoming clear that many large-scale geophysical processes are essentially coupled with the surface waves. These include weather, tropical cyclones, ice cover in both hemispheres, climate and other phenomena in the atmosphere, at the air/sea, sea/ice and sea/land interfaces, and many issues of upper-ocean mixing below the surface. Besides, the wind-wave climate itself experiences large-scale trends and fluctuations, and can serve as an indicator for changes in the weather climate. In the presentation, we will discuss wave influences at scales from turbulence to climate, on both the atmospheric and oceanic sides.

On the atmospheric side of the interface, the air-sea coupling is usually described by means of the drag coefficient Cd, which represents the momentum flux in terms of the wind speed, but the scatter of experimental data with respect to such dependences is very significant and has not improved noticeably over some 40 years. It is argued that the scatter is due to multiple mechanisms which contribute to the sea drag, many of which are due to surface waves and cannot be accounted for unless the waves are explicitly known. We also argue that separating the momentum flux into the components that go to the waves and to the current is not trivial and depends on a number of factors, such as sea state, but also on the measurement setup. In this presentation, field data, both at moderate winds and in Tropical Cyclones, and a WBL model are used to investigate such biases. It is shown that near the surface the turbulent fluxes are less than those obtained by extrapolation using the logarithmic-layer assumption, and the mean wind speeds very near the surface are larger.
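For context (the standard bulk-flux and logarithmic-layer relations referred to above, not results from the talk):

```latex
\tau \;=\; \rho_a\,C_d\,U_{10}^{2} \;=\; \rho_a\,u_*^{2},
\qquad
U(z) \;=\; \frac{u_*}{\kappa}\,\ln\!\frac{z}{z_0},
```

where τ is the momentum flux, u_* the friction velocity, κ the von Kármán constant and z_0 the roughness length; the second relation is the logarithmic-layer extrapolation that the abstract compares against near-surface measurements.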

Among wave-induced influences on the ocean side, the ocean mixing is most important. Until recently, turbulence produced by the orbital motion of surface waves was not accounted for, and this fact limits the performance of models of the upper-ocean circulation and ultimately of large-scale air-sea interactions. Theory and practical applications for the wave-induced turbulence will be reviewed in the presentation. These include viscous and instability theories of wave turbulence, direct numerical simulations and laboratory experiments, field and remote sensing observations and validations, and finally implementations in ocean, Tropical Cyclone and ice models.

14 June  
at 13:00

Room: LT

The Bureau of Meteorology Research Program

Speaker: Peter May (BOM, Australia)

Abstract

An overview of the Australian Bureau of Meteorology's research program will be presented. This will include our plans for high-resolution numerical prediction, multi-week and seasonal modelling, advanced post-processing and climate science, and our work to address some fundamental science and societal challenges. An example is our work on fire weather, ranging from the changing fire risk in Australia to our current and future guidance and our work on firestorms, including fully coupled fire/high-resolution atmosphere simulations. Finally, I will discuss our future plans in the context of a fundamental transformation of the Bureau and the way we will be providing weather and climate services.

30 May
at 10:30

Room: LT

The atmospheric response to increased ocean model resolution in the ECMWF Integrated Forecasting System: a seamless approach

Speaker: Chris Roberts

Abstract

This study uses initialized forecasts and climate integrations to evaluate the wintertime North Atlantic response to an increase of ocean model resolution from ∼100 km (LRO) to ∼25 km (HRO) in the European Centre for Medium-Range Weather Forecasts Integrated Forecasting System (ECMWF-IFS). The presented analysis considers the atmospheric response at lead times of weeks to decades and assesses mean biases, variability, and subseasonal predictability. Importantly, the simulated impacts are highly dependent on lead time such that impacts seen at climate timescales cannot be generalized to initialized forecasts. At subseasonal lead times (weeks 1-4), sea surface temperature (SST) biases in LRO and HRO configurations are similar and partly inherited from ocean initial conditions. At multidecadal timescales, mean differences are dominated by biases in the LRO configuration, which include a North Atlantic cold bias and a breakdown in the 

29 May
at 10:00

Room: LT

The axes of the Météo-France scientific strategy

Speaker: Marc Pontaud (Meteo-France)

Abstract

Make progress in the knowledge and anticipation of extreme phenomena and their impacts, in a context of climate change

Anticipating extreme or high-stakes phenomena and their impacts, in metropolitan France and overseas, is a strong societal expectation addressed to Météo-France. Progress in this area is largely based on improvements made to our numerical weather prediction systems, particularly regional ones, which remain a priority research objective at Météo-France. Progress will come from several areas.

A first source of progress will be the assimilation of an increasing number of observations from new meteorological satellites launched during the period, but also access to new data sources and better use of existing sources such as meteorological radars. It will also be a matter of optimizing the information extracted from these data, which will require the development of a new generation of data assimilation algorithms. Ensemble forecasting will be generalized in order to improve, by a few hours, the anticipation of the occurrence of a meteorological hazard, to better predict its intensity and consequences, and to offer new services based on a greater capacity to adapt the forecasts to the expectations and stakes of users.

Improving the prediction of extreme or high-stakes weather phenomena and their impacts also requires progress in understanding the processes at work and in modelling them at different scales. Research investigation resources (measurement campaigns, instrumented sites, metre-scale modelling, etc.) will be oriented and developed as a priority to meet these objectives. We will also evaluate the contribution of very-high-resolution modelling, i.e. grids of a few hundred metres, to the prediction of meteorological hazards at specific sites or during high-stakes events.

For the very short time frames (0-3h), the contribution of artificial intelligence (AI) and other signal processing techniques to the extrapolation of observations and their fusion with numerical forecasts will be explored.

Moreover, knowledge at the regional level of how the frequency and intensity of extreme events evolve with climate change is essential for adapting to climate change and strengthening territorial resilience measures. Météo-France will contribute to the scientific response to these issues, identified in the French Adaptation Programme, by interpreting recent climate trends through the use of observations and climate model results. The aim will be to carry out simulations of past and future climate, in particular with the fine-scale forecasting model, capable of representing the evolution of Mediterranean episodes, tropical cyclones and the urban heat island during heat waves.

 

Continue the transition to integrated environmental modelling systems shared between forecasting and climate

Operational weather prediction, seasonal climate prediction and climate study require modelling not only the behaviour of the atmosphere but also that of other interacting components of the Earth system (continental surfaces, ocean, waves, cryosphere, chemical composition of the atmosphere) and anthropogenic factors (urbanization, irrigation, dams, anthropogenic emissions, etc.). This evolution towards environmental modelling will result in the construction and experimentation of a regional "Earth system" composite model with kilometric resolution, with the assistance of partners mastering certain components, such as the ocean or sea ice. This regional modelling system will be modular to allow different configurations and coupling levels depending on the forecast or application objectives.

This work will continue to be part of a single modelling system logic, from the global to the local scale, with tools shared between weather forecasting and climate modelling activities.

For numerical weather prediction, including the chemical composition of the atmosphere, a new research axis will be opened, that of data assimilation coupled between the different components (atmosphere, continental surfaces, aerosols and atmospheric chemistry, ocean, sea states), with the prospect of taking better advantage of certain observations at the interfaces and the exploitation of a greater number of data.

 

Adapt modeling tools to operational requirements on tomorrow's computing architectures

The operational use of the tools developed by research is at the heart of the institution's strategy. On the one hand, it allows a rapid transfer of innovations from research to all the institution's activities and, on the other hand, a daily comparison of scientific work with reality, thus allowing researchers to benefit from regular feedback on the quality of their models. This sharing of tools between scientific and operational activities also imposes constraints that must be integrated by research teams (speed of code execution, compatibility of algorithms with computing infrastructures, etc.). In close collaboration with the meteorological services community, the institution will carry out the scientific work necessary to prepare for future technological developments in intensive computing, including the development of graphics processors and other new or emerging architectures that will have a profound impact on the structure of numerical codes.


Enhance weather and climate forecasts in response to the expectations of internal and external beneficiaries

Météo-France's research must also contribute to the promotion of its innovations, both to internal users (particularly forecasters) and to external users.

In particular, making the most of model output data, which are increasingly accurate and informative but also more voluminous, rich and complex, will be a major focus of research. This includes supporting the use of ensemble forecasts by defining methods for the statistical extraction of relevant signals and post-processing adapted to this new type of data. To cover the needs in this area, scientific expertise in the field of AI will be strengthened. With a view to supporting end-users, the emphasis will be placed on innovation, the transfer of research results to operations and the orientation of scientific activities towards the needs of Météo-France's operational departments.

 

Strengthen the dynamics of national and international cooperation

The orientation of Météo-France's research activities towards environmental modelling on a regional scale will be accompanied by a strengthening and broadening of cooperation with the French academic and scientific community, such as the CNRS, the academic world and major national research organisations, and with international actors engaged in this same path.

The evolution of numerical weather prediction systems towards coupled systems and the consideration of future supercomputer architectures call for increased cooperation with the meteorological services community with which the modelling tools are co-developed (the European consortia Aladin and Hirlam, ECMWF). To be fully effective, this collaboration between meteorological services will have to be based on better shared tools, and software convergence with ECMWF is a reaffirmed priority for the institution.

In the field of space observation of the Earth, Météo-France will consolidate its status as a privileged interlocutor with space agencies for meteorological and climate applications, and more broadly environmental applications. This approach will be based on the close relations established with CNES in France, Eumetsat and ESA in Europe as well as with certain international space agencies.

28 May
at 10:30

Room: LT

Improving atmospheric reanalyses for historical extreme events by rescuing lost weather observations

Speaker: Ed Hawkins (University of Reading)

Abstract

Our understanding of past changes in weather and climate relies on the availability of observations made over many decades. However, billions of historical weather observations are effectively lost to science as they are still only available in their original paper form in various archives around the world. The large-scale digitisation of these observations would substantially improve atmospheric reanalyses back to the 1850s. Recently, volunteer citizen scientists have been assisting with the rescue of millions of these lost observations taken across western Europe over a hundred years ago. The value of these data for understanding many notable and extreme weather events will be demonstrated.

16 May
at 11:15

Room: Council

Are seasonal forecasts useful to improve operational decisions for water supply in the UK?

Speakers: Francesca Pianosi and Andres Peñuela (Bristol University)

Abstract

The improved skill of seasonal predictions for the North Atlantic circulation and northern Europe is motivating an increasing effort towards developing seasonal hydrological forecasting systems, such as the Copernicus Climate Change Service (C3S). Among other purposes, such forecasting systems are expected to deliver better-informed water management decisions. Using a pumped-storage reservoir system in the UK as a pilot application, we investigate the potential for using seasonal weather forecasts to simultaneously increase supply reliability and reduce pumping costs. To this end, we develop a Real-Time Optimisation System (RTOS) that uses C3S seasonal weather forecasts to generate hydrological forecasts and combines them with a reservoir simulation model and an evolutionary optimisation algorithm to generate release and pumping decisions.
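A minimal sketch of the rolling-horizon logic described above (hypothetical function and variable names; the actual RTOS couples a C3S-driven hydrological forecast, a reservoir simulator and an evolutionary optimiser rather than the random search used here):

```python
import numpy as np

def simulate(storage, inflows, releases, pumping, capacity):
    """Toy reservoir mass balance returning a combined shortfall-plus-cost objective."""
    shortfall, cost = 0.0, 0.0
    demand, pump_price = 1.0, 0.2              # illustrative constants
    for q_in, r, p in zip(inflows, releases, pumping):
        storage = min(capacity, max(0.0, storage + q_in - r + p))
        shortfall += max(0.0, demand - r)      # unmet supply
        cost += pump_price * p                 # pumping cost
    return shortfall + cost

def optimise_decisions(storage, ensemble_inflows, horizon, capacity, n_candidates=500):
    """Pick release/pumping schedules minimising the expected objective over the forecast ensemble."""
    rng = np.random.default_rng(0)
    best, best_score = None, np.inf
    for _ in range(n_candidates):              # random search stands in for the evolutionary algorithm
        releases = rng.uniform(0.0, 2.0, horizon)
        pumping = rng.uniform(0.0, 1.0, horizon)
        score = np.mean([simulate(storage, member, releases, pumping, capacity)
                         for member in ensemble_inflows])
        if score < best_score:
            best, best_score = (releases, pumping), score
    return best

# Usage: a (hypothetical) ensemble of seasonal inflow forecasts, 12 weekly steps ahead.
ensemble = np.random.default_rng(1).gamma(2.0, 0.5, size=(25, 12))
releases, pumping = optimise_decisions(storage=10.0, ensemble_inflows=ensemble,
                                       horizon=12, capacity=20.0)
```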

We evaluate the performance of the RTOS over historical periods and compare it to several benchmarks, including a simplified operating scheme that mimics the current operational procedures, and an RTOS that uses Ensemble Streamflow Predictions (ESP) in place of C3S seasonal forecasts. We also attempt to link the improvement in system performance to the characteristics of the hydrological conditions and the properties of the forecasts. Ultimately, we aim to address key questions such as ‘To what extent does improving forecast skill translate into an increase in forecast value for water supply decisions?’ and ‘Does accounting for forecast uncertainty in optimization improve decisions?’.

16 May
at 10:15

Room: LT

Understanding the intraseasonal variability over the Indian region and the development of an operational extended range prediction system

Speaker: Dr Sahai (IITM, India)

Abstract

Extended-range forecasting of subseasonal variability beyond the weather scale is a critical component of climate forecast applications over the Indian region. The sub-seasonal to seasonal (S2S) prediction project, undertaken by WCRP, started in 2013 to improve forecasts beyond the weather scale, which is a challenging gap area in the research and operational forecast domains. The primary objective of the S2S project is to provide sub-seasonal to seasonal forecasts at various lead times.

The prediction of weather and climate in the extended range (i.e. 2-3 weeks in advance) is much in demand in sectors that depend on water resources, city planning, dam management, health management (e.g. protection against heat deaths), etc. This demand has grown manifold in the last five years with the experimental implementation of a dynamical extended range prediction system (ERPS) by the Indian Institute of Tropical Meteorology (IITM), Pune. At the heart of ERPS is a forecast system based on the NCEP-CFSv2 ocean-atmosphere coupled dynamical model (hereafter CFS), which is configured to run at two resolutions (T382 and T126), and an atmosphere-only version (hereafter GFS) configured to run with CFS sea surface temperature (SST) boundary conditions that are bias-corrected with observations. The initial conditions to run the model are generated through an in-house developed perturbation technique using the NCMRWF (atmospheric) and INCOIS (ocean) data assimilation systems. From every initial condition, the model is run for the next 28 days and a multi-member ensemble forecast is created. Forecast product variables are then separated for sector-specific applications with suitable post-processing and downscaling based on advanced statistical techniques. This forecast can be applied in several allied fields such as agro-meteorology, hydrometeorology and the health sector. My talk will provide a brief overview of ERPS, focusing on a few sector-specific applications.

15 May
at 10:30

Room: LT

Parallel in Time Integration Using PFASST

Speaker: Michael Minion (Lawrence Berkeley National Laboratory)

Abstract

The Parallel Full Approximation Scheme in Space and Time (PFASST) is an iterative approach to parallelization for the integration of ODEs and PDEs in the time direction. I will give an overview of the PFASST algorithm, discuss the advantages and disadvantages of PFASST compared to other popular parallel-in-time (PinT) approaches, and show some examples of PFASST in applications. I will also explain the construction of a new class of PinT integrators that combine properties of exponential integrators and PFASST, including some preliminary results on the accuracy and parallel performance of the algorithm.

13 May
at 11:00

Room: LT

Flood Forecasting and Inundation Mapping using the US National Water Model

Speaker: David R Maidment (University of Texas at Austin)

Abstract

The US National Water Model forecasts flows on 5.2 million km of streams and rivers in the continental United States, divided into 2.7 million forecast reaches. A Medium Range Forecast from this model for Hurricane Harvey, prepared 3 days before the hurricane made landfall, successfully predicted the spatial pattern of inland flooding in Texas. A continental-scale inundation map has been developed using the Height Above Nearest Drainage (HAND) method, and an associated cell phone app called Pin2Flood has been built that enables flood emergency responders to create their own flood inundation maps using the same HAND map base, thus connecting predictive and observational flood inundation mapping.
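A minimal illustration of the HAND idea (hypothetical toy arrays; the operational workflow uses continental-scale rasters and forecast stages per reach): the inundation depth at a cell is the forecast stage minus the cell's height above the nearest drainage, floored at zero.

```python
import numpy as np

# Height Above Nearest Drainage (metres) for a small grid, and a forecast flood stage (metres)
hand = np.array([[0.2, 1.5, 3.0],
                 [0.0, 0.8, 2.2],
                 [0.1, 0.4, 4.0]])
forecast_stage = 1.0

depth = np.clip(forecast_stage - hand, 0.0, None)   # inundation depth where the stage exceeds HAND
inundated = depth > 0.0                              # boolean flood-extent mask
print(depth)
print(inundated)
```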

About the Presenter: David R Maidment is the Hussein M Alharthy Centennial Chair in Civil Engineering at the University of Texas at Austin, where he has served on the faculty since 1981.  He received his BE degree from the University of Canterbury in Christchurch, New Zealand, and his MS and PhD degrees from the University of Illinois.  In 2016, he was elected to the US National Academy of Engineering for application of geographic information systems to hydrologic processes.

21 March
at 10:30

Room: LT

Constraining Stochastic Parametrization Schemes using High-Resolution Model Simulations

Speaker: Hannah Christensen (Oxford University)

Abstract

Stochastic parametrizations are used in weather and climate models to represent model error. Designing new stochastic schemes has been the target of much innovative research over the last decade, with a focus on developing physically motivated approaches. We present a technique for systematically deriving new stochastic parametrizations or for constraining existing stochastic parametrizations. We take a high-resolution model simulation and coarse-grain it to the desired forecast model resolution. This provides the initial conditions and forcing data needed to drive a Single Column Model (SCM). By comparing the SCM parametrized tendencies with the evolution of the high-resolution model, we can measure the ‘error’ in the SCM tendencies. As a case study, we use this approach to assess the physical basis of the widely used ‘Stochastically Perturbed Parametrization Tendencies’ (SPPT) scheme using the IFS SCM. We provide justification for the multiplicative nature of SPPT, and for the large temporal and spatial scales used in the stochastic perturbations. However, we also identify issues with the SPPT scheme and motivate improvements. In particular, our results indicate that independently perturbing the tendencies associated with different parametrization schemes is justifiable, but that an alternative approach is needed to represent uncertainty in the convection scheme. It is hoped this new coarse-graining technique will improve both holistic and process-based approaches to stochastic parametrization.
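For reference (the standard SPPT formulation as documented for the IFS), the total tendency with SPPT is

```latex
\mathbf{T} \;=\; \mathbf{D} \;+\; \big(1 + e\big)\,\mathbf{P},
```

where D is the dynamical tendency, P the sum of the parametrized physics tendencies and e a zero-mean random pattern correlated over the large spatial and temporal scales referred to above; the coarse-grained SCM "errors" can then be used to assess the statistics assumed for e.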

20 March
at 10:30

Room: LT

About novel time integration methods for weather and climate simulations

Speaker: Martin Schreiber (Tech University of Munich)

Abstract

Weather and climate simulations face new challenges due to changes in computer architectures caused by physical limitations. From a pure computing perspective, algorithms are required to cope with stagnating or even decreasing per-core speed and increasing on-chip parallelism. Although this leads to an increase in overall on-chip compute performance, data movement is increasingly becoming the most critical limiting factor. All in all, these trends will continue and have already led to research on partly disruptive mathematical and algorithmic reformulations of dynamical cores, e.g. using (additional) parallelism in the time dimension.

This presentation provides an overview and introduction to the variety of newly developed and evaluated time integration methods for dynamical cores, all aimed at improving the ratio of wall clock time to error:

First, I will begin with rational approximations of exponential integrator methods in their various forms: Terry Haut's rational approximation of exponential integrators (T-REXI), Cauchy contour integral methods (CI-REXI) in the complex plane and their relationship to Laplace transforms, and a diagonalized Butcher tableau (B-REXI).

Second, semi-Lagrangian (SL) methods are often used to overcome limitations on stable time step sizes induced by nonlinear advection. These methods show superior properties in terms of dispersion accuracy, and we have exploited this property with the Parareal parallel-in-time algorithm. In addition, a combination of SL with REXI is discussed, including the challenges that arise from the Lagrangian formulation.

Third, the multi-level time integration of spectral deferred correction (ML-SDC) will be discussed, focusing on the multi-level induced truncation of nonlinear interactions and the importance of viscosity in this context. Based on this, the "Parallel Full Approximation Scheme in Space and Time" (PFASST) adds a time parallelism that allows even higher accelerations on the time-to-solution compared to ML-SDC and traditional time integration methods.
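In compact form (standard formulations given here as background, not the talk's exact notation), the rational approximation of the exponential integrator and the Parareal iteration mentioned above read

```latex
u(t+\Delta t) \;=\; e^{\Delta t\,L}\,u(t) \;\approx\; \sum_{n} \beta_n\,\big(\Delta t\,L + \alpha_n I\big)^{-1} u(t),
\qquad
U_{n+1}^{k+1} \;=\; \mathcal{G}\big(U_{n}^{k+1}\big) + \mathcal{F}\big(U_{n}^{k}\big) - \mathcal{G}\big(U_{n}^{k}\big),
```

where the complex coefficients (α_n, β_n) turn the exponential into independent linear solves that can run in parallel, and where G and F are a cheap coarse propagator applied serially and an expensive fine propagator applied in parallel across time slices.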

All studies were mainly conducted based on the shallow water equations (SWE) on the f-plane and the rotating sphere to investigate horizontal aspects of dynamical cores for weather and climate simulation. Overall, our results motivate further investigation and combination of these methods for operational weather/climate systems.

(With contributions from Jed Brown, Francois Hamon, Richard Loft, Michael Minion, Matthew Normile, Nathanaël Schaeffer, Andreas Schmitt and Pedro S. Peixoto, among others.)

12 March
at 11:15

Room: CC  

Running serverless HPC workloads on top of Kubernetes and Jupyter notebooks

Speaker: Christopher Woods (University of Bristol)

6 March
at 10:30

Room: LT

Trends in data technology: opportunities and challenges for Earth system simulation and analysis

Speaker: V Balaji (Princeton Uni and NOAA/GFDL)

Abstract

Earth system modeling, since its origin at the dawn of modern computing, has operated at the very limits of technological possibility. This has led to tremendous advances in weather forecasting, and the use of models to project climate change both for understanding the Earth system and in service of downstream science and policy. In this talk, we examine changes in underlying technology, including the physical limits of miniaturization and the emergence of a deep memory-storage hierarchy, which make "business as usual" approaches to simulation and analysis appear somewhat risky. We look simultaneously at trends in Earth system modeling, in terms of the evolution of globally coordinated climate science experiments (CMIP-IPCC) and the emergence of "seamless prediction", blurring the boundaries between weather and climate. Together, these point to new directions of research and development in data software and data science. Innovative and nimble approaches to analysis will be needed. Yesterday's talk examines this in the context of computational science and software, but it seems apparent that computation and data are inseparable problems, and a unified approach is indicated.

6 March
at 14:00

Room: LT

Statistics for Natural science in the age of Supercomputers

Speaker: Dutta Ritabrata (Warwick University)

Abstract

To explain the fascinating phenomena of nature, natural scientists develop complex models which can simulate these phenomena almost to the point of realism. The hard question is how to calibrate these models given real-world observations. Traditional statistical methods are handicapped in this setting, as we cannot evaluate the likelihood functions of the parameters of these models. Over the last decade or so, the statisticians' answer to this question has been approximate Bayesian computation (ABC), in which the parameters are calibrated by comparing simulated and observed data sets in a rigorous manner. However, it only became possible to apply ABC to realistic, and hence complex, models when it was efficiently combined with High Performance Computing (HPC). In this work, we focus on this aspect of ABC, showing how it was able to calibrate expensive models of epidemics on networks, molecular dynamics, platelet deposition in blood vessels, passenger queues in airports and volcanic eruptions. This was achieved using standard MPI parallelization, nested MPI parallelization or nested CUDA parallelization inside MPI. Finally, we want to raise and discuss the open questions regarding how to evolve and strengthen these inferential methods when each model simulation takes a full day or resources equivalent to the best supercomputers of today.
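A minimal sketch of the ABC rejection idea described above (toy simulator and hypothetical names; production ABC uses more efficient sequential schemes and MPI/CUDA parallelism):

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=100):
    """Toy simulator standing in for an expensive scientific model."""
    return rng.normal(theta, 1.0, n)

def summary(data):
    """Summary statistics used to compare simulated and observed data."""
    return np.array([data.mean(), data.std()])

def abc_rejection(observed, prior_sampler, n_draws=20000, epsilon=0.2):
    """Keep parameter draws whose simulated summaries fall within epsilon of the observed ones."""
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):                       # embarrassingly parallel in practice
        theta = prior_sampler()
        s_sim = summary(simulate(theta))
        if np.linalg.norm(s_sim - s_obs) < epsilon:
            accepted.append(theta)
    return np.array(accepted)

observed = rng.normal(3.0, 1.0, 100)               # pretend these are real observations
posterior = abc_rejection(observed, prior_sampler=lambda: rng.uniform(-10, 10))
print(posterior.mean(), posterior.size)            # approximate posterior mean and sample count
```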

5 March
at 10:30

Room: LT

Machine learning and the post-Dennard era of climate simulation

Speaker: V Balaji (Princeton Uni and NOAA/GFDL)

Abstract

In this talk, we examine approaches to Earth system modeling in the post-Dennard era, inspired by the industry trend toward machine learning (ML). ML presents a number of promising pathways, but there remain challenges specific to introducing ML into multi-phase multi-physics modeling. A particular aspect of such 'multi-scale multi-physics' models that is under-appreciated is that they are built using a combination of local process-level and global system-level observational constraints, for which the calibration process itself remains a substantial computational challenge. These include, among others: the non-stationary and chaotic nature of climate time series; the presence of climate subsystems where the underlying physical laws are not completely known; and the imperfect calibration process alluded to above. The talk will present ideas and challenges and the future of Earth system models as we prepare for a post-Dennard future, where learning methods are poised to play an increasingly important role.

21 January
at 11:00

Room: LT

ESSPE: Ensemble-based Simultaneous State and Parameter Estimation for Earth System Data-Model Integration and Uncertainty Quantification

Speaker: Fuqing Zhang (Pennsylvania State University)

Abstract

Building on advanced data assimilation techniques, we advocate developing and applying a generalized data assimilation software framework for Ensemble-based Simultaneous State and Parameter Estimation (ESSPE) that will facilitate data-model integration and uncertainty quantification for the broad earth and environmental science communities. These include, but are not limited to, atmospheric composition and chemistry, land surface, hydrology, and biogeochemistry, for which many of the physical and chemical processes in the respective dynamic system models rely heavily on parametrizations. By augmenting the state vector with the uncertain model parameters, the ESSPE framework allows simultaneous state and parameter estimation through assimilating in-situ measurements, such as those from critical-zone ground-based observational networks, and/or remotely sensed observations, such as those from radars and satellites. Beyond data-model integration and uncertainty quantification, and through systematically designed ensemble sensitivity analysis, examples will be given of the application of the ESSPE framework to: (1) identify key physical processes and their significance/impacts and better represent and parameterize these processes in dynamical models of various earth systems; (2) design better observation strategies by locating the optimum sensitive regions, periods and variables to be measured, and the minimum accuracies and frequencies of these measurements required to quantify the physical processes of interest, and explore the impacts of heterogeneity and equifinality; (3) understand the predictability and nonlinearity of these processes, and parameter identifiability; and (4) facilitate the upscale cascading of knowledge from smaller-scale process understanding to larger-scale simplified representation and parametrization. I will end the presentation with an introduction to the preliminary findings from our ongoing collaborations with ECMWF on using the data assimilation methodology to identify potential deficiencies in the convective gravity wave drag parametrization that led to stratospheric temperature biases in the operational model, and the potential pathways for using ESSPE to improve model physics in the future.
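The core of the state-augmentation idea can be written compactly (generic ensemble notation, not the specific ESSPE software interface):

```latex
z \;=\; \begin{pmatrix} x \\ \theta \end{pmatrix},
\qquad
z^{a} \;=\; z^{f} + \mathbf{K}\big(y - \mathbf{H}z^{f}\big),
\qquad
\mathbf{K} \;=\; \mathbf{P}_z^{f}\mathbf{H}^{\mathrm T}\big(\mathbf{H}\mathbf{P}_z^{f}\mathbf{H}^{\mathrm T} + \mathbf{R}\big)^{-1},
```

where x is the model state, θ the uncertain parameters appended to it, and the cross-covariances between θ and the observed variables in the ensemble-estimated P_z^f are what allow the observations to update the parameters as well as the state.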

25 January
at 10:30

Room: LT

Windstorm and Cyclone Events: Atmospheric Drivers, Long-term Variability and Skill of current Seasonal Forecasts 

Speaker: Daniel J Befort (University of Oxford)

Abstract

In this study, observed long-term variability of windstorm events is analysed using two state-of-the-art 20th-century reanalyses, ERA-20C and NOAA-20CR. Long-term trends partly differ drastically between the two datasets. These differences are largest for the early 20th century, with higher agreement over the past 30 years. However, short-term variability on sub-decadal time-scales is in much better agreement, especially over parts of the northern hemisphere. This suggests that these datasets are useful for analysing the drivers of interannual variability of windstorm events, as they allow the time period covered by conventional reanalyses such as ERA-Interim to be extended.

ERA-20C is used to analyse the influence of atmospheric and oceanic conditions on windstorm frequency over the European continent. It is found that large parts of their interannual variability can be explained by a few atmospheric patterns, including the North Atlantic Oscillation (NAO) and the Scandinavian pattern. This suggests that it is crucial to capture these atmospheric modes of variability, e.g. in seasonal forecast systems, in order to represent windstorm variability over Europe reasonably well.

The skill in predicting windstorms and cyclones is analysed for three modern seasonal forecast systems: ECMWF-S3, ECMWF-S4 and GloSea5. Whereas skill for cyclones is generally small, significant positive skill of ECMWF-S4 and GloSea5 is found for windstorms over the eastern North Atlantic/western Europe. In addition to analysing skill in windstorms using a dedicated tracking algorithm, we also test to what extent the NAO can be used as a predictor of their variability. Results suggest that using the NAO adds some skill over northern Europe; however, using the full model information by tracking windstorm events is superior over large parts of the eastern Atlantic and western Europe.

7 January
at 10:30

Room: LT

When fossil fuel emissions are no longer perfect in atmospheric inversion systems

Speaker: Thomas Lauvaux (LSCE, Saclay, France)

Abstract

The biogenic component of greenhouse gas fluxes remains the primary source of uncertainty in global and regional inversion systems. But recent results suggest that anthropogenic greenhouse gas emissions from fossil fuel use, so far assumed perfect at all scales, represent a larger fraction of the uncertainties in these systems and can no longer be ignored. Inversion systems capable of reducing fossil fuel uncertainties are discussed in parallel with planned observing systems deployed across the world and in space. The remaining challenges and recent advances are presented, with the aim not only of inferring fossil fuel emissions but also of providing support to climate policy makers at national and local scales.

 

Uncertainty quantification of pollutant source retrieval: comparison of Bayesian methods with application to the Chernobyl and Fukushima Daiichi accidental releases of radionuclides

Speaker: M Bocquet (CEREA, France)

Abstract

Inverse modeling of the emissions of atmospheric species and pollutants has progressed significantly over the past fifteen years. However, in spite of seemingly reliable estimates, the retrievals are rarely accompanied by an objective estimate of their uncertainty, except when Gaussian statistics are assumed for the errors, which is often unrealistic. I will describe rigorous techniques meant to compute this uncertainty in the context of the inverse modeling of the time-dependent emission rates -- the so-called source term -- of a point-wise atmospheric tracer. Lognormal statistics are used for the positive source term prior and possibly the observation errors, which precludes simple Gaussian statistics-based solutions.

Firstly, through the so-called empirical Bayesian approach, parameters of the error statistics -- the hyperparameters -- are estimated by maximizing the observation likelihood via an expectation-maximization algorithm. This enables the estimation of an objective source term.  Then, the uncertainties attached to the total mass estimate and the source rates are estimated using four Monte Carlo techniques: (i) an importance sampling based on a Laplace proposal, (ii) a naive randomize-then-optimize (RTO) sampling approach, (iii) an unbiased RTO sampling approach, (iv) a basic Markov chain Monte Carlo (MCMC) simulation. Secondly, these methods are compared to a full Bayesian hierarchical approach, using an MCMC based on a transdimensional representation of the source term to reduce the computational cost.
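A minimal sketch of the basic MCMC option (iv) for a positive source term with a lognormal prior (hypothetical toy problem; the actual application uses an atmospheric dispersion model as the observation operator):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear source-receptor problem: y = G @ x + noise, with x >= 0 (emission rates)
G = rng.uniform(0.0, 1.0, size=(20, 5))             # source-receptor (dispersion) matrix
x_true = np.array([2.0, 0.5, 1.0, 3.0, 0.1])
y = G @ x_true + rng.normal(0.0, 0.1, 20)

def log_posterior(log_x, sigma_obs=0.1, mu0=0.0, sigma0=2.0):
    """Gaussian likelihood plus lognormal prior on x, parameterised by log-emissions."""
    x = np.exp(log_x)
    log_lik = -0.5 * np.sum(((y - G @ x) / sigma_obs) ** 2)
    log_prior = -0.5 * np.sum(((log_x - mu0) / sigma0) ** 2)
    return log_lik + log_prior

# Random-walk Metropolis in log-space keeps the source term positive by construction
log_x = np.zeros(5)
samples = []
for _ in range(20000):
    proposal = log_x + rng.normal(0.0, 0.05, 5)
    if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(log_x):
        log_x = proposal
    samples.append(np.exp(log_x))
samples = np.array(samples[5000:])                   # discard burn-in
print(samples.mean(axis=0))                          # posterior mean emission rates
```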

I will apply those methods, and improvements thereof, to the estimation of the atmospheric cesium-137 source terms from the Chernobyl nuclear power plant accident in April/May 1986 and Fukushima Daiichi nuclear power plant accident in March 2011.

LT = Lecture Theatre, LCR = Large Committee Room, MZR = Mezzanine Committee Room,
CC = Council Chamber