Seminars

Seminars / Informal seminars / Lectures by ECMWF Staff and Invited Lecturers

Seminars contribute to our ongoing educational programme and are tailored to the interests of the ECMWF scientific community.

Informal seminars are held throughout the year on a range of topics. Seminars vary in duration, depending on the area covered, and are given by subject specialists. As with the annual seminar, these may be ECMWF staff members or invited lecturers.

The following is a listing of seminars/lectures that have been given this year on topics of interest to the ECMWF scientific community. See also our past informal seminars.

2019

11 December at 10:30

Room: LT

Decadal variations in seasonal teleconnection patterns

Speaker: Tim Palmer (University of Oxford)

Abstract

The seminal Horel and Wallace (1981) paper set out the way in which El Nino variability is teleconnected to midlatitudes. Now that we have considerably more observational data than Horel and Wallace had, we can ask whether these teleconnections exhibit significant decadal variations. They do, and this variability can help us understand a range of issues in seasonal prediction, from the role of the stratosphere to the signal-to-noise paradox.

29 November
at 10:30

Data assimilation and post-processing for large-scale Nordic hydrology

Speaker: Marie-Amelie Boucher (Sherbrooke University)

Abstract

Hydrological forecasting is now increasingly conceived as a global issue that transcends catchment boundaries and should be studied from a larger perspective. In addition, while the number of gauging stations is decreasing, potential new sources of meteorological and hydrological data are appearing, including the contribution of citizen scientists. It is likely that in the coming years, many other regions of the world will face a situation of co-existing local and global hydrological forecasting systems. How can we make the most of both the local and the global information? One idea would be to use the global ensemble forecasts as an initial field, and then to use the local forecast as additional information to refine the forecasts locally. Another element of interest in the context of global hydrology is the increasing availability of data from citizen science programs. This is a potentially rich source of information that could complement more 'traditional' data. Can these data be assimilated into hydrological models? Is there any gain in assimilating data from citizen scientists in addition to traditional data? I would like to discuss these questions and many others, especially in the context of using machine learning tools to assimilate observations into hydrological models. Hopefully, this seminar will lead to fruitful discussions in preparation for my sabbatical (September 2020 - August 2021), which I intend to spend mostly at ECMWF.

12 November
at 10:30  

Room: LT

The Bureau of Meteorology's use of NWP for the prediction of lightning and severe convective hazards

Speaker: Harald Richter (BOM)

Abstract

Over the last three years the Bureau's operational modelling suites have begun to incorporate a global design ensemble and convection-allowing models (CAMs). These systems, largely based on the UKMO's Unified Model, are now capable of meaningful predictions that relate to a range of convective hazards such as large hail, damaging winds and lightning itself.

I will first introduce the current and planned modelling systems at the Bureau of Meteorology, before moving into aspects of the post-processing approaches taken and planned. While the UKMO's IMPROVER framework is intended to provide the bulk of the post-processing capability, the prediction of convective hazards is achieved through a range of different individual algorithms such as Calibrated Thunder for the prediction of cloud-to-ground lightning or storm attributes such as updraft helicity for the prediction of severe convective storms. The focus of my presentation will be on these convectively focused post-processing approaches.

Harald's scientific interests

Harald is a senior scientist in the Science to Services program of the Australian Bureau of Meteorology. His main focus is the diagnosis and prediction of deep extratropical convection and the associated hazards such as large hail, damaging winds and tornadoes. Over the past seven years his particular interest has been the use of large-scale and convection-allowing models to predict thunderstorms and their hazards.

In a previous life, Harald was responsible for training the Bureau's forecasters in the diagnosis, nowcasting and forecasting of severe thunderstorms based on observational systems and NWP, which established the strongly operational perspective of his subsequent research.

Harald also leads a cross-agency project on physical impact prediction which quantitatively combines exposure and vulnerability information with NWP-based wind predictions. Occasionally he has been called upon to act in more managerial roles, which allowed him to adopt a slightly wider view of the Bureau's research and development activities.

 

6 November
at 10:30

Clouds, Weather and Climate: From the Southern Ocean to Climate Strikes

Speaker: Dr Andrew Gettelman (NCAR and visiting scientist at the University of Oxford and ECMWF)

Abstract

Clouds are critical for forecasting weather and climate. This presentation will provide an overview of how critical cloud processes affect weather and climate prediction, and how we simulate them in models. Critical cloud processes that are often not considered include supercooled liquid clouds, shallow clouds over the ocean, and cloud interactions with aerosols in the atmosphere. Interestingly, supercooled liquid clouds are not only important for extreme weather; they are also important for climate. Recent observations and modeling work, taking us literally to the ends of the Earth, are leading to a better understanding of clouds, aerosols and climate, with possible improvements in weather prediction and some surprising and not fully understood implications for future climate change.

30 October
at 14:00-15:00

Room: LCR 

Connecting the dots in hydrological ensemble prediction

Speaker: Maria-Helena Ramos (Irstea, France)

Abstract

Connecting people, disciplines and efficient techniques is crucial to develop hydrological ensemble prediction systems and foster partnerships. Probabilistic and scenario-based forecasts are recognized as essential to quantify uncertainties and support decision making in flood forecasting, drought management and water resources planning. Many successful cases have already been implemented, but still more must be done to bridge knowledge gaps, facilitate operational implementations and raise awareness among decision makers of the value of reliable predictions. This presentation will show some examples of work carried out at Irstea in France to connect different aspects of the hydrometeorological forecasting chain.

29 October
at 15:30

Room: LT

Impact of land temperature analysis in NWP and other recent developments in LSDA at the Met Office

Speaker: Breogan Gomez (UKMO)

Abstract

The Met Office Global Land Surface Data Assimilation (LSDA) system consists of a suite of schemes that give the initialisation fields for various variables. Over the last year, the Met Office has developed a system that creates soil moisture and land temperature analyses using the Simplified Extended Kalman Filter (SEKF) with the aim of having consistent analysis increments across all land variables. Our tests at N320 showed a positive impact in all regions and at all lead times when verifying against screen temperature and relative humidity. Applying the same methodology to the regional NWP model, UKV, yields some positive results. In recent years, a substantial effort has been made to improve the land initialisation in the UK regional NWP system. We have developed a soil moisture analysis using the same methodology as the global model with SEKF and a snow amount analysis using an optimal interpolation algorithm. The former has shown a neutral impact in the screen verification, but it has significantly improved the river flow predictions. The latter provides a much better snow cover estimation when compared to independent satellite products.
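
For context, the Simplified Extended Kalman Filter referred to above applies the standard Kalman analysis update, with the Jacobian of the observation operator estimated by finite-difference perturbations of the land surface model; a schematic form (generic notation, not quoted from the talk) is

x_a = x_b + K\,[\,y - H(x_b)\,], \qquad K = B H^{T} (H B H^{T} + R)^{-1},

where x_b is the background soil state, y the screen-level (or satellite) observations, B and R the background and observation error covariances, and H the linearised observation operator.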

17 October
at 10:30

Room: Council

Post-processing at the Australian Bureau of Meteorology: precipitation results and IMPROVER collaboration

Speaker: Tom Gale (BMRC)

Abstract

Post-processing of rainfall is a priority for the Australian Bureau of Meteorology, due to the Australian climate and impacts on a range of industries, particularly agriculture. Significant advances have been made in the last few years, such as a skill improvement equivalent to 2-3 lead days since 2016 for the 1 mm/day rain amount. The result is better forecasts for the general public and specialised users, and a reduced need for meteorologist intervention in forecast production. This talk will cover the changes made to rainfall post-processing over the last few years, and verification results from our 3-hourly rainfall post-processing, which became operational in August 2019.

The Australian Bureau of Meteorology is collaborating with the UK Met Office on development of the new IMPROVER post-processing system. IMPROVER focuses on using the outputs from ensemble and convection-permitting NWP models, is fully probabilistic, allows for verification of each processing step, and is seamless from 15 minutes to 15 days. This talk will cover the differences between the expected IMPROVER configurations in the UK and Australia, and work to provide compatible software environments at the Met Office and BoM.

30 September  at 10:30

Room: LT

Historical images of Earth from space: IFS versus observations (an informal talk)

Speaker: Philippe Lopez

Abstract 

As a follow-up to my improvised (virtual) trip to the Moon back in July, I have extended my comparison of our model with various historical views of the Earth from space, and I thought it would be worth giving an informal presentation about the outcome. Do not expect any slides full of groundbreaking equations. On the other hand, if you just wish to escape from our crazy world for a moment by looking at some nice images of our planet from far (far) away in space and time, please feel free to come.

13 September
at 11:30

Room: LT

Exascale resilience strategies for time-dependent solvers

Speaker: Chris Cantwell (Imperial College)

Abstract

Time-dependent partial differential equations (PDEs) arise in a wide range of application areas, for example in fluid dynamics. The high-fidelity resolution of complex flows often requires large-scale computational resources and is one of the drivers towards exascale computing. Energy usage is a major concern for these systems. The power overhead necessary to implement traditional hardware resilience to combat failures at exascale is likely to be substantial. The usefulness and energy efficiency of our computational tools might therefore be more effectively maintained by making the software more tolerant of the frequent hardware failures anticipated to occur on future large-scale systems.
In this talk I will highlight the case for resilience in software and present our latest efforts to address this challenge in the context of time-dependent PDE solvers. We combine user-level failure mitigation (ULFM), a proposed extension to the MPI 4.0 standard, with remote in-memory check-pointing in a minimally intrusive way, in order to augment existing software tools with scalable fault tolerance capabilities. Our resilience approach improves forward-path performance over conventional techniques by avoiding the parallel file system completely, and allows one or more concurrently failed ranks to be rebuilt with spare ranks on-the-fly and independently of other non-failed processes. I will describe the algorithms, implementation and their performance characteristics, and illustrate their application through examples using the Nektar++ spectral/hp element framework.

20 August
at 10:30

Room: LT

Forecast Evaluation of Set-Valued Properties

Speaker: Tobias Fissler (Imperial College, London)

Abstract

In forecast evaluation, one distinguishes two major tasks: forecast validation (or verification) and forecast comparison. While the former is commonly performed with identification functions (for point forecasts) or other diagnostic tools of calibration such as the Probability Integral Transform (for probabilistic forecasts), the latter task utilises scoring functions in the case of point forecasts and scoring rules for probabilistic forecasts. It is a widely accepted paradigm that these scoring functions (rules) should be consistent (proper) in that they honour correctly specified forecasts, thus incentivising truthful reporting under risk-neutrality.
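
For readers less familiar with this terminology, the consistency requirement can be stated compactly (standard definitions, not taken from the talk): a scoring function S is consistent for a functional T if, for every distribution F and every alternative report x,

\mathbb{E}_F\,[\,S(T(F), Y)\,] \;\le\; \mathbb{E}_F\,[\,S(x, Y)\,],

and T is called elicitable if a strictly consistent S exists; the corresponding notion for predictive distributions is a proper scoring rule.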

The statistical, climatological and financial literature has seen remarkable contributions to the evaluation of real- or vector-valued properties as well as predictive distributions for real- and vector-valued quantities. On the other hand, the case of set-valued properties has received considerably less attention.

Acknowledging the spatial structure of many climatological and meteorological phenomena of interest, we introduce a theoretical framework for sound forecast evaluation of set-valued quantities such as the expected area of precipitation or confidence regions for floods. To this end, we suggest the formal distinction between a selective notion, where forecasters are content with specifying a single point in the set of interest, and an exhaustive notion, where forecasters are far more ambitious and aim at correctly specifying the entire set of interest. We unveil a stark dichotomy between these two notions: a functional can be either selectively or exhaustively elicitable, but not both.

We discuss implications of this mutual exclusivity for best practice in forecast evaluation of set-valued quantities, putting an emphasis on applications in meteorology and climatology.

This talk is based on joint work with Jana Hlavinová and Birgit Rudloff.

13 August
at 10:30

Room: LCR

Trends and Challenges in TC NWP for the 2020s

Speaker: M Fiorino (University of Colorado, USA)

Abstract

Perhaps the greatest success story in the history of Numerical Weather Prediction (NWP) is the 90% improvement in tropical cyclone (TC) track forecasts from the 1980s to 2018. This dramatic improvement was made possible by the high-resolution, high-quality global modeling of the leading operational centers ECMWF and the UKMO.

The talk first reviews the history of TC NWP – from the early days at Penn State, where we made the first TC NWP forecast with operational analyses in a 'full physics' model (MM0 – the predecessor to WRF) in the late 1970s – to the 2000s, when global models became the clear 'go-to' aid for human forecasters. During the 2000s I was the NWP officer at both the Joint Typhoon Warning Center (JTWC) and the National Hurricane Center (NHC), where my main task was to bring good modeling into the human forecast process. It was during my time at NHC that I demonstrated how the deterministic ECMWF HRES forecasts were often superior to the best forecast aid (consensus), and how this significant advance was not because of a horizontal resolution increase but because of (subtle) changes in the physics (see Fiorino 2009: https://www.ecmwf.int/en/elibrary/17493-record-setting-performance-ecmwf...).

More recently, the limited-area model HWRF, during its 12 years in NCEP operations (2007-2018), has failed to add value to the much lower resolution GFS host global model at the medium range (72-h mean position error). Further, the inability of HWRF to make superior short-range track forecasts (12- and 24-h) vis-à-vis ECMWF, despite near 'perfect' HWRF initial position error, challenges conventional thinking on the way forward in TC NWP.

This review of TC NWP and the current performance of HWRF/GFS suggest some next steps/challenges for the 2020s: 1) vortex analysis – can model forecasts be improved by a more accurate analysis of the TC vortex, e.g., by assimilation of the truly unique and special observations of the TC forecast centers, the 'TC vitals'? and 2) "completing the forecast" by predicting both TC genesis and dissipation. Metrics are a common problem for both vortex initialization and genesis. The talk will conclude with a TC 'forecast error' metric proposal that is more consistent with warnings and TC impacts, and some recent work on deterministic TC genesis verification.

31 July at 11:00

Room: CC

Recent trends and future projections of meteorological droughts

Speaker: Sergio M Vicente-Serrano (Spanish National Research Council)

Abstract

Drought is one of the most difficult hydroclimatic hazards to quantify and monitor, which has led to strong scientific debate on the temporal behaviour of drought trends over the last decades and on possible future projections. Here the different perspectives used to identify recent drought trends are shown and discussed, focusing on the different roles of precipitation and atmospheric evaporative demand. The statistical properties of drought (mostly its autoregressive character) and land/atmosphere feedbacks related to plant physiology and CO2 fertilization are also determining factors for understanding droughts under future climate scenarios.

31 July
at 10:30

Room: LT

Efforts on Scaling and Optimizing Climate and Weather Forecasting Programs on Sunway TaihuLight

Speakers: Haohuan Fu (Tsinghua University, China) and Wei Xue (High Performance Computing Institute, Tsinghua, China)

Abstract

The Sunway TaihuLight supercomputer is the world's first system with a peak performance greater than 100 PFlops, and a parallel scale of over 10 million cores. In contrast with other existing heterogeneous supercomputers, which include both CPU processors and PCIe-connected many-core accelerators (NVIDIA GPU or Intel MIC), the computing power of TaihuLight is provided by a homegrown many-core SW26010 CPU that includes both the management processing elements (MPEs) and computing processing elements (CPEs) in one chip. This talk reports our efforts on refactoring and optimizing the climate and weather forecasting programs on Sunway TaihuLight. To map the large code base of CAM and WRF to the millions of cores on the Sunway system, we take OpenACC-based refactoring as the major approach, and apply source-to-source translator tools to exploit the most suitable parallelism for the CPE cluster and to fit the intermediate variables into the limited on-chip fast buffer. For individual kernels, when comparing the original ported version using only MPEs and the refactored version using both the MPE and CPE clusters, we achieve up to 22x speedup for the compute-intensive kernels. For the 25 km resolution CAM global model, we manage to scale to 24,000 MPEs and 1,536,000 CPEs, and achieve a simulation speed of 2.81 model years per day.

Haohuan Fu is a professor in the Ministry of Education Key Laboratory for Earth System Modeling, and the Department of Earth System Science in Tsinghua University, where he leads the research group of High Performance Geo-Computing (HPGC). He is also the deputy director of the National Supercomputing Center in Wuxi, leading the research and development division. Fu has a PhD in computing from Imperial College London. His research work focuses on providing both the most efficient simulation platforms and the most intelligent data management and analysis platforms for geoscience applications, leading to two consecutive ACM Gordon Bell Prizes (nonhydrostatic atmospheric dynamics solver in 2016, and nonlinear earthquake simulation in 2017).

10 July
at 11:00

Room: CC

London as a laboratory to explore the health effects of air pollution

Speaker: Ian Mudway (Kings College, London)

Biography

Dr Ian Mudway is a senior lecturer at the School of Population Health and Environmental Sciences at King's College London and a member of the MRC-PHE Centre for Environment and Health, the MRC & Asthma UK Centre in Allergic Mechanisms of Asthma, and the NIHR-PHE Health Protection Research Unit in Health Impact of Environmental Hazards. He has over 20 years of experience researching the impacts of air pollution on human health and in the development of assays to quantify the toxicity of the chemical cocktails that pollute the air we breathe. Over this period Dr Mudway has published over 100 research papers, reports and book chapters on these topics, as well as providing advice to local, national and international governments and NGOs. Dr Mudway is passionate about the communication of science to lay audiences and has worked extensively with artists and educationalists to promote the public understanding of the risks associated with environmental pollutants. Currently his work is focused on understanding the early-life impacts of pollutants on the development of the lung and cognitive function in children living within urban populations.

28 June
at 11:00

Room: LT

Towards an unbiased stratospheric analysis

Speaker: Patrick Laloyaux (ECMWF)

Abstract

A set of newly developed diagnostics based on GPS-RO observations has revealed the presence of systematic large-scale errors in the stratospheric temperature of the IFS model. Interestingly, the amplitude of this bias has increased over the past few years in conjunction with the last horizontal resolution upgrade and the revision of the radiative scheme.

To take into account model biases, a weak-constraint 4D-Var formulation has been developed in which a model-error forcing term is explicitly estimated inside the 4D-Var minimisation. This approach is able to reduce by up to 50% the bias in the analysis departures of all observations sensitive to stratospheric temperature. The importance of anchoring data (accurate observations that do not require bias correction) such as GPS-RO is apparent in ensuring the good performance of the method. Weak-constraint 4D-Var also allows a more consistent treatment of the sources of the different biases, ensuring that the Variational Bias Correction (VarBC) corrects only systematic errors from data and observation operators. In this talk, we will also start exploring the potential of the new weak-constraint 4D-Var to correct for model biases in medium-range weather forecasting and climate reanalyses.
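
For context, in the forcing formulation of weak-constraint 4D-Var the model-error term \eta enters the control vector alongside the initial state and is penalised by its own error covariance Q; schematically (generic notation, not specific to the talk):

J(x_0, \eta) = \tfrac{1}{2}(x_0 - x_b)^{T} B^{-1} (x_0 - x_b) + \tfrac{1}{2}\sum_k [y_k - H_k(x_k)]^{T} R_k^{-1} [y_k - H_k(x_k)] + \tfrac{1}{2}\eta^{T} Q^{-1} \eta, \qquad x_k = M_k(x_{k-1}) + \eta,

so that systematic model error (here concentrated in the stratosphere) is absorbed by \eta rather than being aliased into the initial condition or the observation bias corrections.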

19 June
at 15:30

Room: LT

Introduction to the New York State Mesonet

Speaker: Chris Thorncroft (University at Albany)

Abstract 

The New York State Mesonet (NYSM) is a comprehensive network of 126 environmental monitoring stations deployed statewide with an average spacing of 27 km. The primary goal of the NYSM is to provide high quality weather data at high spatial and temporal scales to improve atmospheric monitoring and prediction, especially for extreme weather events. Completed in spring 2018, each station is equipped with a standard suite of atmospheric and soil sensors. Collectively, the network comprises 1,825 sensors with approximately 907,200 observations collected per day. Unique aspects of the NYSM include its measurement of snow depth, soil moisture and temperature, and its collection of camera images at every site. The NYSM also pioneered the building of three additional sub-networks to collect vertical profile, surface energy budget, and snow water equivalent measurements at a select number of sites across the state. The location of each station was carefully selected based upon WMO siting criteria and local requirements. Extensive metadata are made available online. All data are collected, quality-controlled, archived, and disseminated every 5 minutes. Real-time data are displayed on the web for public use, and archived data are available for download. Data are now utilized by a variety of sectors including emergency management, transportation, utilities, agriculture and education. Recent examples of the utility of the data will be shared.

 18 June
at 10:30
Room: LT

NOAA-CIRES-DOE 20th Century reanalysis version “3” (1836-2015) and Prospects for 200 years of reanalysis

Speaker: Gil Compo

Abstract

The new historical reanalysis dataset generated by the Physical Sciences Division of NOAA's Earth System Research Laboratory and the University of Colorado CIRES, the Twentieth Century Reanalysis version 3 (20CRv3), is a comprehensive global atmospheric circulation dataset spanning 1836 to present, assimilating only surface pressure and using monthly Hadley Centre sea ice distributions (HadISST2.3) and an ensemble of daily Simple Ocean Data Assimilation with Sparse Input (SODAsi.3) sea surface temperatures. SODAsi.3 was forced with a previous version of 20CR that was itself forced with a previous SODAsi, allowing these "iteratively-coupled" boundary conditions to be more consistent with the atmospheric reanalysis. 20CRv3 has been made possible with supercomputing resources of the U.S. Department of Energy and a collaboration with GCOS, WCRP, and the ACRE initiative. It is chiefly motivated by a need to provide an observational validation dataset, with quantified uncertainties, for assessments of climate model simulations of the 19th to 21st centuries, with emphasis on the statistics of daily weather. It uses an Ensemble Kalman Filter (EnKF) data assimilation method, with the NCEP global forecast system (GFS) numerical weather prediction (NWP) land/atmosphere model providing the background "first guess" fields. This yields a global analysis every 3 hours as the most likely state of the atmosphere, together with the uncertainty of that analysis.

20CRv3 has several improvements compared to the previous version 2c. The analysis and the 80-member ensemble are generated with the NCEP GFS at T254 resolution (about 0.75 degrees latitude by longitude) with 64 levels in the vertical, compared to T62 (about 2 degrees latitude by longitude) and 28 vertical levels in the 56-member ensemble of 20CRv2c. This gives an improved representation of extreme events, such as hurricanes. Implementation of a "relaxation to prior" covariance inflation algorithm, combined with stochastic parameterizations in the GFS, provides quantitatively better uncertainty estimates than the previous additive inflation of 20CRv2c. An adaptive localization helps to keep the analysis from over-fitting the observations. A variational quality control system retains more observations. An incremental analysis update procedure produces a temporally smoother analysis without the spurious spin-up trends seen in 20CRv2c. Millions of additional pressure observations contained in the new International Surface Pressure Databank version 4.7, such as from the citizen science Oldweather.org project, also improve the analyses. These improvements result in 20CR version "3" having analyses comparable to or better than version 2c, as suggested by improved 6-hour forecast skill, more realistic uncertainty in near-surface air temperature, and a reduction in spurious centennial trends in the tropical and polar regions. Possibilities for 200 years of reanalysis are also discussed in light of results of test reanalyses of the 1816 "Year without a Summer".
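
For reference, "relaxation to prior" inflation is commonly implemented by blending analysis and background ensemble perturbations (relaxation to prior perturbations; schematic form assumed here, not quoted from the abstract):

x'_a \leftarrow (1 - \alpha)\, x'_a + \alpha\, x'_b, \qquad 0 < \alpha < 1,

where x'_a and x'_b are the analysis and background perturbations about their respective ensemble means, and \alpha sets how much of the prior spread is retained to counteract the filter's tendency to become over-confident.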

Gilbert P. Compo (1,2), Laura Slivinski (1,2), Jeffrey S. Whitaker (2), Prashant D. Sardeshmukh (1,2), Benjamin S. Giese (3), Philip Brohan (4), Rob Allan (4)

(1) CIRES, University of Colorado, USA, compo@colorado.edu; (2) Physical Sciences Division, Earth System Research Laboratory, NOAA, USA; (3) Department of Oceanography, Texas A&M University, USA; (4) Met Office Hadley Centre, Exeter, UK

14 June
at 10:30

Room: LT

Ocean Waves as a Missing Link Between Atmosphere and Ocean

Speaker: Alex Babanin (University of Melbourne, Australia)

 Abstract

The role of waves as a link between the ocean and atmosphere will be discussed. It is rapidly becoming clear that many large-scale geophysical processes are essentially coupled with the surface waves; these include weather, tropical cyclones, ice cover in both hemispheres, climate and other phenomena in the atmosphere, at the air/sea, sea/ice and sea/land interfaces, and many issues of upper-ocean mixing below the surface. Besides, the wind-wave climate itself experiences large-scale trends and fluctuations, and can serve as an indicator for changes in the weather climate. In the presentation, we will discuss wave influences at scales from turbulence to climate, on both the atmospheric and oceanic sides.

On the atmospheric side of the interface, the air-sea coupling is usually described by means of the drag coefficient Cd, which represents the momentum flux in terms of the wind speed, but the scatter of experimental data around such dependences is very significant and has not improved noticeably over some 40 years. It is argued that the scatter is due to multiple mechanisms which contribute to the sea drag, many of which are due to surface waves and cannot be accounted for unless the waves are explicitly known. We also argue that separation of the momentum flux into the components that go to the waves and to the current is not trivial and depends on a number of factors such as sea state, but also on the measurement setup. In this presentation, field data, both at moderate winds and in Tropical Cyclones, and a WBL model are used to investigate such biases. It is shown that near the surface the turbulent fluxes are less than those obtained by extrapolation using the logarithmic-layer assumption, and the mean wind speeds very near the surface are larger.
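
For readers unfamiliar with the notation, the drag coefficient parameterises the air-sea momentum flux (wind stress) in terms of the 10-m wind speed:

\tau = \rho_a\, C_d\, U_{10}^{2},

with \rho_a the air density, so the large scatter in observed C_d translates directly into uncertainty in the stress felt by both the wave field and the upper ocean.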

Among wave-induced influences on the ocean side, the ocean mixing is most important. Until recently, turbulence produced by the orbital motion of surface waves was not accounted for, and this limits the performance of models of the upper-ocean circulation and ultimately of large-scale air-sea interactions. Theory and practical applications of wave-induced turbulence will be reviewed in the presentation. These include viscous and instability theories of wave turbulence, direct numerical simulations and laboratory experiments, field and remote sensing observations and validations, and finally implementations in ocean, tropical cyclone and ice models.

14 June  
at 13:00

Room: LT

The Bureau of Meteorology Research Program

Speaker: Peter May (BOM, Australia)

Abstract

An overview of the Australian Bureau of Meteorology's research program will be presented. This will include our plans focusing on high-resolution numerical prediction, multi-week and seasonal modelling, advanced post-processing, climate science and our work to address some fundamental science and societal challenges: for example, our work on fire weather, ranging from the changing fire risk in Australia to our current and future guidance and our work on firestorms, including fully coupled fire-atmosphere simulations at high resolution. Finally, I will discuss our future plans in the context of a fundamental transformation of the Bureau and of the way we will be providing weather and climate services.

 30 May
at 10:30

Room: LT

The atmospheric response to increased ocean model resolution in the ECMWF Integrated Forecasting System: a seamless approach

Speaker: Chris Roberts

Abstract

This study uses initialized forecasts and climate integrations to evaluate the wintertime North Atlantic response to an increase of ocean model resolution from ∼100 km (LRO) to ∼25 km (HRO) in the European Centre for Medium-Range Weather Forecasts Integrated Forecasting System (ECMWF-IFS). The presented analysis considers the atmospheric response at lead times of weeks to decades and assesses mean biases, variability, and subseasonal predictability. Importantly, the simulated impacts are highly dependent on lead time such that impacts seen at climate timescales cannot be generalized to initialized forecasts. At subseasonal lead times (weeks 1-4), sea surface temperature (SST) biases in LRO and HRO configurations are similar and partly inherited from ocean initial conditions. At multidecadal timescales, mean differences are dominated by biases in the LRO configuration, which include a North Atlantic cold bias and a breakdown in the 

 29 May
at 10:00

Room: LT

The axes of the Météo-France scientific strategy

Speaker: Marc Pontaud (Meteo-France)

  Abstract

Make progress in the knowledge and anticipation of extreme phenomena and their impacts, in a context of climate change

Anticipating extreme or high-stakes phenomena and their impacts, in metropolitan France and overseas, is a strong societal expectation addressed to Météo-France. Progress in this area is largely based on improvements made to our numerical weather prediction systems, particularly regional ones, which remain a priority research objective at Météo-France. Progress will come from several areas.

A first source of progress will be the assimilation of an increasing number of observations from new meteorological satellites launched during the period, but also access to new data sources and better use of existing sources such as meteorological radars. It will also be a question of optimizing the information provided by these data, which will require the development of a new generation of data assimilation algorithms. The production of forecasts in the form of ensembles will be generalized in order to improve by a few hours the anticipation of the occurrence of a meteorological hazard, to better predict its intensity and consequences, and to be able to offer new services based on a greater capacity to adapt forecasts to the expectations of users, taking into account their stakes.

Improving the prediction of extreme or high-stakes weather phenomena and their impacts also requires progress in understanding the processes at work and in modelling them at different scales. Research investigation resources (measurement campaigns, instrumented sites, metre-scale modelling, etc.) will be oriented and developed as a priority to meet these objectives. The contribution of very high resolution modelling, i.e. grids of a few hundred metres, to the prediction of meteorological hazards at specific sites or during high-stakes events will also be evaluated.

For the very short time frames (0-3h), the contribution of artificial intelligence (AI) and other signal processing techniques to the extrapolation of observations and their fusion with numerical forecasts will be explored.

Moreover, knowledge at the regional level of the evolution of the frequency and intensity of extreme events with climate change is essential for adapting to climate change and strengthening territorial resilience measures. Météo-France will contribute to the scientific response to these issues, identified in the French Adaptation Program, by interpreting recent climate trends through the use of observations and climate model results. The aim will be to carry out simulations of past and future climate, in particular with the fine-scale forecasting model, capable of representing the evolution of Mediterranean episodes, tropical cyclones and the urban heat island during heat waves.

 

Continue the transition to integrated environmental modelling systems shared between forecasting and climate

Operational weather prediction, seasonal climate prediction and climate study require modelling not only the behaviour of the atmosphere but also that of other interacting components of the Earth system (continental surfaces, ocean, waves, cryosphere, chemical composition of the atmosphere) and anthropogenic factors (urbanization, irrigation, dams, anthropogenic emissions, etc.). This evolution towards environmental modelling will result in the construction and testing of a regional "Earth system" composite model with kilometric resolution, with the assistance of partners with expertise in certain components, such as the ocean or sea ice. This regional modelling system will be modular to allow different configurations and coupling levels depending on the forecast or application objectives.

This work will continue to be part of a single modelling system logic, from the global to the local scale, with tools shared between weather forecasting and climate modelling activities.

For numerical weather prediction, including the chemical composition of the atmosphere, a new research axis will be opened: data assimilation coupled between the different components (atmosphere, continental surfaces, aerosols and atmospheric chemistry, ocean, sea states), with the prospect of taking better advantage of certain observations at the interfaces and of exploiting a greater number of data.

 

Adapt modeling tools to operational requirements on tomorrow's computing architectures

The operational use of the tools developed by research is at the heart of the institution's strategy. On the one hand, it allows a rapid transfer of innovations from research to all the institution's activities and, on the other hand, a daily comparison of scientific work with reality, thus allowing researchers to benefit from regular feedback on the quality of their models. This community of tools between scientific and operational activities also imposes constraints that must be integrated by research teams (speed of code execution, compatibility of algorithms with computing infrastructures, etc.). In close collaboration with the meteorological services community, the establishment will carry out the scientific work necessary to prepare for future technological developments in intensive computing, including the development of graphics processors or other new or emerging architectures that will have a profound impact on the structure of numerical codes.


Enhance weather and climate forecasts in response to the expectations of internal and external beneficiaries

Météo-France's research must also contribute to the promotion of its innovations, both to internal users (particularly forecasters) and to external users.

In particular, the exploitation of model output data, which are increasingly accurate and informative but also more voluminous, rich and complex, will be a major focus of research. This includes supporting the use of ensemble forecasts by defining methods for the statistical extraction of relevant signals and post-processing adapted to this new type of data. To cover the needs in this area, scientific expertise in the field of AI will be strengthened. With a view to supporting end-users, the emphasis will be placed on innovation, on the transfer of research results to operations and on the orientation of scientific activities towards the needs of Météo-France's operational departments.

 

Strengthen the dynamics of national and international cooperation

The orientation of Météo-France's research activities towards environmental modelling on a regional scale will be accompanied by a strengthening and broadening of cooperation with the French academic and scientific community, such as the CNRS, the academic world and major national research organisations, and with international actors engaged in this same path.

The evolution of numerical weather prediction systems towards coupled systems and the consideration of future supercomputer architectures lead to increased cooperation with the meteorological services community with which the modelling tools are co-developed (the European consortia Aladin and Hirlam, and ECMWF). To be fully effective, this collaboration between meteorological services will have to be based on better shared tools, and software convergence with ECMWF is a reaffirmed priority for the Establishment.

In the field of space observation of the Earth, Météo-France will consolidate its status as a privileged interlocutor with space agencies for meteorological and climate applications, and more broadly environmental applications. This approach will be based on the close relations established with CNES in France, Eumetsat and ESA in Europe as well as with certain international space agencies.

28 May
at 10:30

Room: LT

Improving atmospheric reanalyses for historical extreme events by rescuing lost weather observations

Speaker: Ed Hawkins (University of Reading)

Abstract

Our understanding of past changes in weather and climate relies on the availability of observations made over many decades. However, billions of historical weather observations are effectively lost to science as they are still only available in their original paper form in various archives around the world. The large-scale digitisation of these observations would substantially improve atmospheric reanalyses back to the 1850s. Recently, volunteer citizen scientists have been assisting with the rescue of millions of these lost observations, taken across western Europe over a hundred years ago. The value of these data for understanding many notable and extreme weather events will be demonstrated.

16 May
at 11:15

Room: Council

Are seasonal forecasts useful to improve operational decisions for water supply in the UK?

Speakers: Francesca Pianosi and Andres Peñuela (Bristol University)

Abstract

The improved skill of seasonal predictions for the North Atlantic circulation and northern Europe is motivating an increasing effort towards developing seasonal hydrological forecasting systems, such as the Copernicus Climate Change Service (C3S). Among other purposes, such forecasting systems are expected to deliver better-informed water management decisions. Using a pumped-storage reservoir system in the UK as a pilot application, we investigate the potential for using seasonal weather forecasts to simultaneously increase supply reliability and reduce pumping costs. To this end, we develop a Real-Time Optimisation System (RTOS) that uses C3S seasonal weather forecasts to generate hydrological forecasts, and combines them with a reservoir simulation model and an evolutionary optimisation algorithm to generate release and pumping decisions.

We evaluate the performance of the RTOS over historical periods and compare it to several benchmarks, including a simplified operation scheme that mimics the current operational procedures, and an RTOS that uses Ensemble Streamflow Predictions (ESP) in place of C3S seasonal forecasts. We also attempt to link the improvement in system performance to the characteristics of the hydrological conditions and the properties of the forecasts. Ultimately, we aim to address key questions such as 'To what extent does improving forecast skill translate into an increase in forecast value for water supply decisions?' and 'Does accounting for forecast uncertainty in optimization improve decisions?'.

16 May
at 10:15

Room: LT

Understanding intraseasonal variability over the Indian region and the development of an operational extended-range prediction system

Speaker: Dr Sahai (IITM, India)

 Abstract

Extended-range forecasting of subseasonal variability beyond the weather scale is a critical component of climate forecast applications over the Indian region. The subseasonal-to-seasonal (S2S) project, undertaken by WCRP, started in 2013 to improve forecasts beyond the weather scale, which is a challenging gap area in the research and operational forecasting domains. The primary objective of the S2S project is to provide subseasonal-to-seasonal forecasts at various lead times.

The prediction of weather and climate in the extended range (i.e. 2-3 weeks in advance) is much in demand in sectors depending on water resources, city planning, dam management, health management (e.g. protection against heat deaths), etc. This demand has grown manifold in the last five years with the experimental implementation of a dynamical extended range prediction system (ERPS) by the Indian Institute of Tropical Meteorology (IITM), Pune. At the heart of ERPS is a forecast system based on the NCEP-CFSv2 ocean-atmosphere coupled dynamical model (hereafter CFS), which is configured to run at two resolutions (T382 and T126), and an atmosphere-only version (hereafter GFS) configured to run with CFS sea surface temperature (SST) boundary conditions that are bias corrected with observations. The initial conditions to run the model are generated through an in-house developed perturbation technique using the NCMRWF (atmosphere) and INCOIS (ocean) data assimilation systems. From every initial condition the model is run for the next 28 days and a multi-member ensemble forecast is created. Forecast product variables are then separated for sector-specific application with suitable post-processing and downscaling based on advanced statistical techniques. This forecast can be applied in several allied fields such as agro-meteorology, hydrometeorology, the health sector, etc. My talk will provide a brief overview of ERPS, keeping the focus on a few sector-specific applications.

15 May
at 10:30

Room: LT

Parallel in Time Integration Using PFASST

Speaker: Michael Minion (Lawrence Berkeley National Laboratory)

Abstract

The Parallel Full Approximation Scheme in Space and Time (PFASST) is an iterative approach to parallelization for the integration of ODEs and PDEs in the time direction. I will give an overview of the PFASST algorithm, discuss the advantages and disadvantages of PFASST compared to other popular parallel-in-time (PinT) approaches, and show some examples of PFASST in applications. I will also explain the construction of a new class of PinT integrators that combine properties of exponential integrators and PFASST, including some preliminary results on the accuracy and parallel performance of the algorithm.

13 May
at 11:00

Room: LT

Flood Forecasting and Inundation Mapping using the US National Water Model

Speaker: David R Maidment (University of Texas at Austin)

Abstract

The US National Water Model forecasts flows on 5.2 million km of streams and rivers in the continental United States, divided into 2.7 million forecast reaches. A medium-range forecast from this model for Hurricane Harvey, prepared 3 days before the hurricane made landfall, successfully predicted the spatial pattern of inland flooding in Texas. A continental-scale inundation map has been developed using the Height Above Nearest Drainage (HAND) method, and an associated cell phone app called Pin2Flood has been built that enables flood emergency responders to create their own flood inundation maps using the same HAND map base, thus connecting predictive and observational flood inundation mapping.
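
As an illustration of the HAND idea (a minimal sketch with hypothetical array inputs, not code from the National Water Model), inundation is mapped by comparing a forecast stage height against the height-above-nearest-drainage raster of a catchment; in the published workflow the stage is typically derived from forecast discharge through a synthetic rating curve for each reach:

```python
import numpy as np

def hand_inundation(hand, stage):
    """Map inundation depth from a HAND raster and a forecast stage.

    hand  : 2-D array of height above nearest drainage (m) for one catchment
    stage : forecast water stage (m) above the draining channel
    Returns an array of water depths (m), zero where a cell stays dry.
    """
    depth = stage - hand                    # positive where water level exceeds terrain
    return np.where(depth > 0.0, depth, 0.0)

# Toy 3x3 catchment with a 1.5 m forecast stage
hand = np.array([[0.2, 0.8, 2.5],
                 [0.5, 1.2, 3.0],
                 [1.4, 2.2, 4.1]])
print(hand_inundation(hand, stage=1.5))
```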

About the Presenter: David R Maidment is the Hussein M Alharthy Centennial Chair in Civil Engineering at the University of Texas at Austin, where he has served on the faculty since 1981.  He received his BE degree from the University of Canterbury in Christchurch, New Zealand, and his MS and PhD degrees from the University of Illinois.  In 2016, he was elected to the US National Academy of Engineering for application of geographic information systems to hydrologic processes.

21 March
at 10:30

Room: LT

Constraining Stochastic Parametrization Schemes using High-Resolution Model Simulations

Speaker: Hannah Christensen (Oxford University)

Abstract

Stochastic parametrizations are used in weather and climate models to represent model error. Designing new stochastic schemes has been the target of much innovative research over the last decade, with a focus on developing physically motivated approaches. We present a technique for systematically deriving new stochastic parametrizations or for constraining existing stochastic parametrizations. We take a high-resolution model simulation and coarse-grain it to the desired forecast model resolution. This provides the initial conditions and forcing data needed to drive a Single Column Model (SCM). By comparing the SCM parametrized tendencies with the evolution of the high-resolution model, we can measure the ‘error’ in the SCM tendencies. As a case study, we use this approach to assess the physical basis of the widely used ‘Stochastically Perturbed Parametrization Tendencies’ (SPPT) scheme using the IFS SCM. We provide justification for the multiplicative nature of SPPT, and for the large temporal and spatial scales used in the stochastic perturbations. However, we also identify issues with the SPPT scheme and motivate improvements. In particular, our results indicate that independently perturbing the tendencies associated with different parametrization schemes is justifiable, but that an alternative approach is needed to represent uncertainty in the convection scheme. It is hoped this new coarse-graining technique will improve both holistic and process-based approaches to stochastic parametrization.
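
For readers unfamiliar with SPPT, the scheme perturbs the total parametrised tendency multiplicatively with a smooth random pattern; in schematic form (standard notation, not taken from the abstract):

P'_{\mathrm{total}} = (1 + e)\, \sum_i P_i,

where the P_i are the tendencies produced by the individual physics schemes and e is a random field with prescribed spatial and temporal correlation scales. The coarse-graining budget described above provides a direct test of both the multiplicative form and the choice of those scales.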

20 March
at 10:30

Room: LT

On novel time integration methods for weather and climate simulations

Speaker: Martin Schreiber (Tech University of Munich)

Abstract

Weather and climate simulations face new challenges due to changes in computer architectures caused by physical limitations. From a pure computing perspective, algorithms are required to cope with stagnating or even decreasing per-core speed and increasing on-chip parallelism. Although this leads to an increase in the overall on-chip compute performance, data movement is increasingly becoming the most critical limiting factor. All in all, these trends will continue and have already led to research on partly disruptive mathematical and algorithmic reformulations of dynamical cores, e.g. using (additional) parallelism in the time dimension.

This presentation provides an overview and introduction to the variety of newly developed and evaluated time integration methods for dynamical cores, all aimed at improving the ratio of wall clock time to error:

First, I will begin with rational approximations of exponential integrator methods in their various forms: Terry Haut's rational approximation of exponential integrators (T-REXI), Cauchy contour integral methods (CI-REXI) on the complex plane and their relationship to Laplace transforms, and a diagonalized Butcher tableau (B-REXI).
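
The rational-approximation idea shared by these REXI variants can be summarised in one line (generic form; \alpha_n and \beta_n are the scheme-specific complex coefficients):

e^{\Delta t L}\, U_0 \approx \sum_n \beta_n\, (\Delta t L + \alpha_n)^{-1} U_0,

so each term in the sum is an independent linear solve, which is what makes the approach attractive for massively parallel hardware.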

Second, semi-Lagrangian (SL) methods are often used to overcome limitations on stable time step sizes induced by nonlinear advection. These methods show superior properties in terms of dispersion accuracy, and we have exploited this property with the Parareal parallel-in-time algorithm. In addition, a combination of SL with REXI is discussed, including the challenges such a formulation poses due to the Lagrangian treatment of advection.

Third, the multi-level time integration of spectral deferred correction (ML-SDC) will be discussed, focusing on the multi-level induced truncation of nonlinear interactions and the importance of viscosity in this context. Based on this, the "Parallel Full Approximation Scheme in Space and Time" (PFASST) adds a time parallelism that allows even higher accelerations on the time-to-solution compared to ML-SDC and traditional time integration methods.

All studies were mainly conducted based on the shallow water equations (SWE) on the f-plane and the rotating sphere to investigate horizontal aspects of dynamical cores for weather and climate simulation. Overall, our results motivate further investigation and combination of these methods for operational weather/climate systems.

(With contributions from Jed Brown, Francois Hamon, Richard Loft, Michael Minion, Matthew Normile, Nathanaël Schaeffer, Andreas Schmitt and Pedro S Peixoto.)

12 March
at 11:15

Room: CC  

Running serverless HPC workloads on top of Kubernetes and Jupyter notebooks

Speaker: Christopher Woods (University of Bristol)

6 March
at 10:30

Room: LT

Trends in data technology: opportunities and challenges for Earth system simulation and analysis

Speaker: V Balaji (Princeton University and NOAA/GFDL)

Abstract

Earth system modeling, since its origin at the dawn of modern computing, has operated at the very limits of technological possibility. This has led to tremendous advances in weather forecasting, and the use of models to project climate change both for understanding the Earth system, and in service of downstream science and policy. In this talk, we examine changes in underlying technology, including the physical limits of miniaturization and the emergence of a deep memory-storage hierarchy, which make "business as usual" approaches to simulation and analysis appear somewhat risky. We look simultaneously at trends in Earth system modeling, in terms of the evolution of globally coordinated climate science experiments (CMIP-IPCC) and the emergence of "seamless prediction", blurring the boundaries between weather and climate. Together, these point to new directions of research and development in data software and data science. Innovative and nimble approaches to analysis will be needed. Yesterday's talk examines this in the context of computational science and software, but it seems apparent that computation and data are inseparable problems, and a unified approach is indicated.

6 March
at 14:00

Room: LT

Statistics for Natural science in the age of Supercomputers

Speaker: Dutta Ritabrata (Warwick University)

Abstract

To explain the fascinating phenomena of nature, natural scientists develop complex models which can simulate these phenomena close to reality. But the hard question is how to calibrate these models given real-world observations. Traditional statistical methods are handicapped in this setting as we cannot evaluate the likelihood functions of the parameters of these models. In the last decade or so, statisticians' answer to this question has been approximate Bayesian computation (ABC), where the parameters are calibrated by comparing simulated and observed data sets in a rigorous manner. However, it only became possible to apply ABC to realistic, and hence complex, models when it was efficiently combined with High Performance Computing (HPC). In this work, we will focus on this aspect of ABC, by showing how it was able to calibrate expensive models of epidemics on networks, of molecular dynamics, of platelet deposition in blood vessels, of passenger queues in airports, and of volcanic eruptions. This was achieved using standard MPI parallelization, nested MPI parallelization or nested CUDA parallelization inside MPI. Finally, we want to raise and discuss the open questions regarding how to evolve and strengthen these inferential methods when each model simulation takes a full day or resources equivalent to the best supercomputers of today.
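
As a minimal illustration of the ABC idea described above (a generic rejection-ABC sketch, not the parallel implementation discussed in the talk; the simulator and summary statistic are stand-ins), parameters are kept whenever the distance between summary statistics of simulated and observed data falls below a tolerance:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=200):
    """Stand-in for an expensive simulator: here just a Gaussian model."""
    return rng.normal(loc=theta, scale=1.0, size=n)

def summary(x):
    """Summary statistic used to compare data sets (here the sample mean)."""
    return np.mean(x)

def abc_rejection(observed, prior_sampler, eps, n_draws=10_000):
    """Rejection ABC: keep prior draws whose simulations resemble the data."""
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sampler()
        s_sim = summary(simulate(theta))
        if abs(s_sim - s_obs) < eps:      # distance between summaries
            accepted.append(theta)
    return np.array(accepted)

observed = rng.normal(loc=1.7, scale=1.0, size=200)
posterior = abc_rejection(observed, lambda: rng.uniform(-5, 5), eps=0.1)
print(posterior.mean(), posterior.std())
```

In the applications mentioned above each call to the simulator is itself a large computation, which is why distributing this sampling loop over MPI ranks or GPUs becomes essential.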

5 March
at 10:30

Room: LT

Machine learning and the post-Dennard era of climate simulation

Speaker: V Balaji (Princeton University and NOAA/GFDL)

Abstract

In this talk, we examine approaches to Earth system modeling in the post-Dennard era, inspired by the industry trend toward machine learning (ML). ML presents a number of promising pathways, but there remain challenges specific to introducing ML into multi-phase multi-physics modeling. A particular aspect of such 'multi-scale multi-physics' models that is under-appreciated is that they are built using a combination of local process-level and global system-level observational constraints, for which the calibration process itself remains a substantial computational challenge. These include, among others: the non-stationary and chaotic nature of climate time series; the presence of climate subsystems where the underlying physical laws are not completely known; and the imperfect calibration process alluded to above. The talk will present ideas and challenges and the future of Earth system models as we prepare for a post-Dennard future, where learning methods are poised to play an increasingly important role.

21 January
at 11:00

Room: LT

ESSPE: Ensemble-based Simultaneous State and Parameter Estimation for Earth System Data-Model Integration and Uncertainty Quantification

Speaker: Fuqing Zhang (Pennsylvania State University)

Abstract

Building on advanced data assimilation techniques, we advocate developing and applying a generalized data assimilation software framework for Ensemble-based Simultaneous State and Parameter Estimation (ESSPE) that will facilitate data-model integration and uncertainty quantification for the broad earth and environmental science communities. These include, but are not limited to, atmospheric composition and chemistry, land surface, hydrology and biogeochemistry, for which many of the physical and chemical processes in the respective dynamical system models rely heavily on parametrizations. Through augmenting uncertain model parameters as part of the state vector, the ESSPE framework will allow for simultaneous state and parameter estimation through assimilating in-situ measurements, such as those from critical-zone ground-based observational networks, and/or remotely sensed observations, such as those from radars and satellites. Beyond data-model integration and uncertainty quantification, through systematically designed ensemble sensitivity analysis, examples will be given of the application of the ESSPE framework to: (1) identify key physical processes and their significance/impacts, and better represent and parameterize these processes in dynamical models of various earth systems; (2) design better observation strategies by locating the optimum sensitive regions, periods and variables to be measured, and the minimum accuracies and frequencies of these measurements required to quantify the physical processes of interest, and explore the impacts of heterogeneity and equifinality; (3) understand the predictability and nonlinearity of these processes, and parameter identifiability; and (4) facilitate upscale cascading of knowledge from smaller-scale process understanding to larger-scale simplified representation and parametrization. I will end the presentation with an introduction to the preliminary findings from our ongoing collaborations with ECMWF on using the data assimilation methodology to identify potential deficiencies in the convective gravity wave drag parametrization that led to stratospheric temperature biases in the operational model, and the potential pathways for using ESSPE to improve model physics in the future.
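
The state-augmentation device underlying ESSPE can be written compactly (generic ensemble-filter notation, assumed here rather than quoted from the talk): the uncertain parameters \theta are appended to the model state x to form an augmented vector z = (x, \theta)^{T}, which is then updated by the usual ensemble Kalman analysis,

z_a = z_b + K\,[\,y - H(z_b)\,], \qquad K = P_b H^{T} (H P_b H^{T} + R)^{-1},

where P_b is the ensemble-estimated background covariance; because P_b now contains cross-covariances between \theta and the observed variables, the observations y constrain the parameters directly.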

25 January
at 10:30

Room: LT

Windstorm and Cyclone Events: Atmospheric Drivers, Long-term Variability and Skill of current Seasonal Forecasts 

Speaker: Daniel J Befort (University of Oxford)

Abstract

In this study, observed long-term variability of windstorm events is analysed using two state-of-the-art 20th century reanalyses, ERA-20C and NOAA-20CR. Long-term trends partly differ drastically between the two datasets. These differences are largest for the early 20th century, with higher agreement for the past 30 years. However, short-term variability on sub-decadal time-scales is in much better agreement, especially over parts of the northern hemisphere. This suggests that these datasets are useful for analysing drivers of interannual variability of windstorm events, as these simulations extend the time period covered by commonly used reanalyses such as ERA-Interim.

ERA-20C is used to analyse the relationship between atmospheric and oceanic conditions and windstorm frequency over the European continent. It is found that a large part of the interannual variability can be explained by a few atmospheric patterns, including the North Atlantic Oscillation (NAO) and the Scandinavian pattern. This suggests that it is crucial to capture these atmospheric modes of variability, e.g. in seasonal forecast systems, in order to reasonably represent windstorm variability over Europe.

The skill in predicting windstorms and cyclones is analysed for three modern seasonal forecast systems: ECMWF-S3, ECMWF-S4 and GloSea5. Whereas skill for cyclones is generally small, significant positive skill of ECMWF-S4 and GloSea5 is found for windstorms over the eastern North Atlantic/western Europe. Further to analysing skill in windstorms using a dedicated tracking algorithm, it is also tested to what extent the NAO can be used as a predictor of their variability. Results suggest that using the NAO adds some skill over northern Europe; however, using the full model information by tracking windstorm events is superior over large parts of the eastern Atlantic and western Europe.

7 January
at 10:30

Room: LT

When fossil fuel emissions are no longer perfect in atmospheric inversion systems

Speaker: Thomas Lauvaux (LSCE, Saclay, France)

Abstract

The biogenic component of greenhouse gas fluxes remains the primary source of uncertainty in global and regional inversion systems. But recent results suggest that anthropogenic greenhouse gas emissions from fossil fuel use, so far assumed perfect at all scales, represent a larger fraction of the uncertainties in these systems and can no longer be ignored. Inversion systems capable of reducing fossil fuel uncertainties are discussed alongside observing systems planned for deployment across the world and in space. The remaining challenges and recent advances are presented, with the aim not only of inferring fossil fuel emissions but also of providing support to climate policy makers at national and local scales.

 

Uncertainty quantification of pollutant source retrieval: comparison of Bayesian methods with application to the Chernobyl and Fukushima Daiichi accidental releases of radionuclides

Speaker: M Bocquet (CEREA, France)

Abstract

Inverse modeling of the emissions of atmospheric species and pollutants has progressed significantly over the past fifteen years. However, in spite of seemingly reliable estimates, the retrievals are rarely accompanied by an objective estimate of their uncertainty, except when Gaussian statistics are assumed for the errors, which is often unrealistic. I will describe rigorous techniques meant to compute this uncertainty in the context of the inverse modeling of the time emission rates -- the so-called source term -- of a point-wise atmospheric tracer. Lognormal statistics are used for the positive source-term prior and possibly the observation errors, which precludes simple solutions based on Gaussian statistics.

Firstly, through the so-called empirical Bayesian approach, parameters of the error statistics -- the hyperparameters -- are estimated by maximizing the observation likelihood via an expectation-maximization algorithm. This enables the estimation of an objective source term.  Then, the uncertainties attached to the total mass estimate and the source rates are estimated using four Monte Carlo techniques: (i) an importance sampling based on a Laplace proposal, (ii) a naive randomize-then-optimize (RTO) sampling approach, (iii) an unbiased RTO sampling approach, (iv) a basic Markov chain Monte Carlo (MCMC) simulation. Secondly, these methods are compared to a full Bayesian hierarchical approach, using an MCMC based on a transdimensional representation of the source term to reduce the computational cost.

I will apply those methods, and improvements thereof, to the estimation of the atmospheric cesium-137 source terms from the Chernobyl nuclear power plant accident in April/May 1986 and Fukushima Daiichi nuclear power plant accident in March 2011.

LT = Lecture Theatre, LCR = Large Committee Room, MZR = Mezzanine Committee Room,
CC = Council Chamber