The Earth System Model Evaluation Tool (ESMValTool)
Axel Lauer (1), Veronika Eyring (1), Mattia Righi (1), Alexander Löw (2), Martin Evaldsson (3)
(1) Deutsches Zentrum für Luft- und Raumfahrt e.V. (DLR), Oberpfaffenhofen, Germany
(2) Department of Geography, University of Munich (LMU), Germany
(3) Swedish Meteorological and Hydrological Institute (SMHI), Norrköping, Sweden
25 November 2015

Motivation for the development of the Earth System Model Evaluation Tool (ESMValTool)
- Facilitate the evaluation of complex Earth System Models, e.g.:
  - a quick look at standard diagnostic plots and output diagnostic variables
  - easy comparison of new simulations (e.g. sensitivity runs or runs with new model versions) with existing runs and with observations
- Raise the standard for model evaluation:
  - include additional diagnostics from ongoing evaluation activities so that we do not have to start from scratch each time
  - implement more observations and account for observational uncertainties
- Quick assessment of where we stand with a new set of model simulations, via standard namelists that reproduce specific papers, reports, etc.
- Traceability and reproducibility
- Facilitate participation in and analysis of Model Intercomparison Projects: easy comparison of models participating in CMIP and the CMIP6-Endorsed MIPs
- Easy expandability: synergies with ongoing projects to expand the tool (e.g. the NCAR CVDP)
- Useful for model groups and for those analyzing models; useful for model development

CRESCENDO
Overarching goal: Improve the process realism of (European) climate models and the reliability of their future climate projections.
Objective: Develop and apply an evaluation tool for routine model benchmarking and for more advanced analysis of feedbacks and future projections.
Strategy:
- Develop a community benchmarking system (the ESMValTool):
  - comparing with observations and with earlier model versions
  - benchmarking key aspects of simulated variability and trends
  - including additional biogeochemical and aerosol/trace-gas metrics
  - adding process-level diagnostics
- Implement emergent constraints developed in CRESCENDO into the ESMValTool (a generic sketch of such a calculation is given below)
- Make the ESMValTool available to the wider research community
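The following is a minimal, generic sketch of how an emergent constraint is typically evaluated: a quantity projected by the models is regressed against an observable quantity across the model ensemble, and the observed value of that observable is then used to constrain the projection. It is illustrative only and not CRESCENDO or ESMValTool code; the function name and inputs are hypothetical.

```python
# Illustrative emergent-constraint sketch (hypothetical helper, not ESMValTool code).
import numpy as np

def emergent_constraint(x_models, y_models, x_obs):
    """x_models, y_models: one value per model; x_obs: observed value of x."""
    x = np.asarray(x_models, dtype=float)
    y = np.asarray(y_models, dtype=float)
    slope, intercept = np.polyfit(x, y, 1)          # across-model regression
    y_constrained = slope * x_obs + intercept       # constrained central estimate
    scatter = np.std(y - (slope * x + intercept))   # scatter about the regression line
    return y_constrained, scatter
```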

Overview of the ESMValTool
- Routine benchmarking and evaluation of single or multiple ESMs, either against predecessor versions, a wider set of climate models, or observations
- Current implementations include sea-ice assessments and other Essential Climate Variables (ECVs), tropical variability, atmospheric CO2 and NO2 budgets, ...; the tool can easily be extended with additional analyses
- Community development in a Subversion-controlled repository, which allows multiple developers from different institutions to contribute and join
Goals:
- Enhance and improve routine benchmarking and evaluation of ESMs
- Routinely run the tool on CMIP6 model output alongside the Earth System Grid Federation (ESGF)
- Support and facilitate the analysis of the ongoing CMIP Diagnostic, Evaluation and Characterization of Klima (DECK) and CMIP6 simulations with an evaluation tool developed internationally by multiple institutions
The climate community is encouraged to contribute to this effort over the coming years.

Current Status: Contributing Institutions (currently ~60 developers from 22 institutions)
1. Deutsches Zentrum für Luft- und Raumfahrt (DLR), Institut für Physik der Atmosphäre, Germany
2. Swedish Meteorological and Hydrological Institute (SMHI), Norrköping, Sweden
3. Agenzia nazionale per le nuove tecnologie, l'energia e lo sviluppo economico sostenibile (ENEA), Italy
4. British Atmospheric Data Centre (BADC), UK
5. Centre for Australian Weather and Climate Research (CAWCR), Bureau of Meteorology, Australia
6. Deutsches Klimarechenzentrum (DKRZ), Germany
7. ETH Zurich, Switzerland
8. Finnish Meteorological Institute (FMI), Finland
9. Geophysical Fluid Dynamics Laboratory (GFDL), NOAA, USA
10. Institut Pierre Simon Laplace (IPSL), France
11. Ludwig Maximilian University of Munich (LMU), Germany
12. Max Planck Institute for Meteorology, Germany
13. Met Office Hadley Centre, UK
14. Météo-France, Toulouse, France
15. Nansen Environmental and Remote Sensing Center, Norway
16. National Center for Atmospheric Research (NCAR), USA
17. New Mexico Tech, USA
18. Royal Netherlands Meteorological Institute (KNMI), The Netherlands
19. University of East Anglia (UEA), UK
20. University of Exeter, UK
21. University of Reading, UK
22. Wageningen University, The Netherlands

Development of an Earth System Model Evaluation Tool
- Within EMBRACE: DLR, SMHI and EMBRACE partners, in collaboration with NCAR, PCMDI and GFDL
- Open source: a Python script that calls NCL (NCAR Command Language) and other languages (R, Python); a sketch of such a dispatch is given below
- Input: CF-compliant netCDF model output (CMIP standards)
- Observations: can be easily added
- Extensible: easy to (a) read in models, (b) process the output with a diagnostic against observations, and (c) use a standard plot type (e.g. a lat-lon map)
Current developments include:
- Essential Climate Variables, e.g. sea ice, temperature and water vapor, radiation, CO2, ozone
- Tropical variability (incl. monsoon, ENSO, MJO)
- Southern Ocean
- Continental dry biases and soil-hydrology-climate interactions (e.g. the Standardized Precipitation Index)
- Atmospheric CO2 and NO2 budgets
- More observations (e.g. obs4MIPs, ESA CCI)
- Statistical measures of agreement
Goal: standard namelists to reproduce certain reports or papers (e.g. IPCC AR5 Chapter 9; Massonnet et al., 2012; Anav et al., 2012; Cox et al., 2013; Eyring et al., 2013)
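As a rough illustration of the multi-language design described above, the sketch below shows how a Python driver can dispatch diagnostic scripts written in NCL, Python or R. It is a simplified assumption of how such a launcher might look (the function and environment-variable names are hypothetical), not the actual launchers.py code.

```python
# Hypothetical launcher sketch: dispatch a diagnostic script to its interpreter.
import os
import subprocess

# Interpreter per script suffix; the tool supports NCL, Python and R diagnostics.
INTERPRETERS = {".ncl": ["ncl"], ".py": ["python"], ".r": ["Rscript"]}

def run_diagnostic(script_path, settings):
    """Run one diagnostic script, passing settings via environment variables."""
    suffix = os.path.splitext(script_path)[1].lower()
    cmd = INTERPRETERS[suffix] + [script_path]
    env = dict(os.environ, **settings)   # e.g. paths to preprocessed data and plots
    subprocess.check_call(cmd, env=env)  # raises if the diagnostic script fails

# Example call (paths and variable names are placeholders):
# run_diagnostic("diag_scripts/sea_ice.ncl",
#                {"PLOT_DIR": "work/plots", "CLIMO_DIR": "work/climo"})
```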

Schematic Overview of the ESMValTool Structure
- namelist_XYZ.xml (workflow manager): defines the model data, observations and diagnostics to be run
- main.py: ESMValTool main driver
- reformat.py (reformat_default, reformat_EMAC, reformat_obs): checks/reformats the input data according to CF/CMOR
- variable_defs/, derive_var.ncl: calculate derived variables
- diag_scripts/*.typ, cfg_XYZ/, diag_scripts/lib: diagnostic scripts, their configuration files and common libraries
- plot_scripts/typ/*: plotting routines
- launchers.py: calls the diagnostic scripts; different languages (typ) are supported: NCL, Python, R
- Output: plots (ps, eps, png, pdf), processed data (netCDF), log file (references)
(A minimal sketch of this namelist-driven workflow is given below.)
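The sketch below outlines, under simplifying assumptions, how a driver like main.py could walk through an XML namelist: collect the requested models, then run each diagnostic on the reformatted data. The element names (model, diag, variable, diag_script) are assumptions for illustration, and the processing steps are only indicated as comments.

```python
# Simplified, assumed namelist-driven workflow (not the actual main.py).
import xml.etree.ElementTree as ET

def run_namelist(namelist_path):
    tree = ET.parse(namelist_path)
    # One entry per model, e.g. "CMIP5 MPI-ESM-LR Amon historical r1i1p1 2000 2004 /path/to/data"
    models = [m.text.strip() for m in tree.iter("model")]
    for diag in tree.iter("diag"):
        variable = diag.findtext("variable")
        script = diag.findtext("diag_script")
        # 1) check/reformat the model data to CF/CMOR (reformat step in the schematic)
        # 2) derive the variable if it is not directly available (derive_var step)
        # 3) launch the diagnostic script on the preprocessed files (launchers step)
        print("Would run %s on variable %s for %d models" % (script, variable, len(models)))
```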

Software Requirements
- Python 2.x
- NCL 6.2 or higher
- CMIP5-style datasets
- ESMValTool (not yet officially released → contact the PIs): tarball or from the svn repository
- Installation, e.g.: esgf-data.dkrz.de/esgf-web-fe

Examples of Diagnostics Implemented
[Figure slide: panels similar to Figures 9.5, 9.7, 9.24, 9.32 and 9.45 of IPCC AR5, including the CMIP5 multi-model mean (MMM), CMIP5 MMM minus observations, and monsoon precipitation intensity and domain]

Examples of Diagnostics Implemented
- Cloud Regime Error Metric (CREM; Williams and Webb, 2009)
- Tropospheric ozone (Righi et al., 2015)
- Interannual variability in surface pCO2 (Rödenbeck et al., 2014)
- Atlantic Meridional Overturning Circulation (AMOC) streamfunction

Eyring et al., Geosci. Model Dev. Discuss., 8, 2015

Summary
- The ESM Evaluation Tool will facilitate the complex evaluation of ESMs and their simulations (e.g. those submitted to international Model Intercomparison Projects such as CMIP, C4MIP, CCMI)
- The tool is developed in a Subversion-controlled repository
  - allows multiple developers from different institutions to join the development team
  - broad community-wide and international participation in the development is envisaged
  - collaboration with the WGNE/WGCM metrics panel
- Current extensions
  - atmospheric dynamics, biogeochemical processes, cryosphere and ocean
  - need for a wide range of observational data to be used with the tool
  - observations should ideally be provided with uncertainty estimates and technical documentation (e.g. similar to obs4MIPs) and in CF-compliant netCDF
  - improving statistical comparison and visualization
- Regular releases to the wider international community
  - further develop and share the diagnostic tool and routinely run it on CMIP DECK output and the corresponding observations (obs4MIPs) alongside the Earth System Grid Federation (ESGF)
  - release of the ESMValTool to the public → will contribute to the metrics panel code repository
  - work towards a full community tool developed by multiple institutions and countries

CRESCENDO WP3.1 – Objectives
- Further develop an ESM benchmarking and evaluation tool (ESMValTool): new Essential Climate Variables (ECVs) and standard diagnostics
- Make use of observations from projects such as ESA CCI and obs4MIPs: new climate diagnostics, e.g. surface radiation, turbulent energy and water fluxes, precipitation variability
- Include new key diagnostics: biogeochemical and aerosol processes
- Extend the ESMValTool with new diagnostics for operational analysis of multi-model ESM projections, e.g. IPCC AR5 Chapter 9 diagnostics
- Provide user guidelines for other partners to include new diagnostics and performance metrics for process-based model evaluation (RT2) and emergent constraints (WP3.2) in the ESMValTool
- Develop interfaces to existing diagnostic packages such as the UK Met Office Auto-Assess package

Thank you!

Considering Observational Uncertainty in the ESMValTool
- Observational uncertainty needs to be considered in the assessment of model performance and should be an integral part of evaluation tools.
- The ESMValTool is quite flexible and can consider observational uncertainty in several ways, e.g.:
  - using more than one observational dataset to evaluate the models
  - showing the differences between the observational datasets in addition to the difference between the model and the reference dataset
  - using an ensemble of observational realizations that spans the observed uncertainty space in a statistically consistent way (a sketch is given below)
- Example: HadCRUT4 surface temperatures, 5 (out of 100) realizations
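The sketch below illustrates, in a generic way, how a model field can be compared against an ensemble of observational realizations (such as HadCRUT4 members) so that the model-observation difference is seen alongside the observational spread. It is an assumed, simplified example, not ESMValTool code; the function name and the factor of two are illustrative choices.

```python
# Illustrative comparison of a model field against an observational ensemble.
import numpy as np

def bias_with_obs_uncertainty(model_field, obs_realizations):
    """model_field: 2-D (lat, lon); obs_realizations: 3-D (member, lat, lon)."""
    obs_mean = obs_realizations.mean(axis=0)
    bias = model_field - obs_mean                   # model minus ensemble-mean observations
    obs_spread = obs_realizations.std(axis=0)       # spread of the observational ensemble
    outside = np.abs(bias) > 2.0 * obs_spread       # bias larger than twice the obs spread
    return bias, obs_spread, outside
```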

Options to Contribute
Join the core development team with full access to:
- Subversion repository
- Mantis bug tracker
- Team site & wiki
International cooperation for community development

Automated testing
Any change to program code carries the risk of introducing bugs or unwanted side effects in other parts of the code. Routine, automated testing is therefore essential to maximize code quality and to ensure the integrity of all diagnostics implemented within the ESMValTool.
Automated testing within the ESMValTool is implemented on two complementary levels (a small example is given below):
- Unit tests are used to verify that small code units (e.g. functions/subroutines) provide the expected results.
- Integration testing is used to verify that a diagnostic integrates well into the ESMValTool framework and provides the expected results. This is checked by comparing the results against a set of reference data generated during the implementation of the diagnostic.
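To make the unit-test level concrete, here is a small, self-contained example in the unittest style that nose can discover. The tested helper (area_mean) is a hypothetical stand-in, not an actual ESMValTool function.

```python
# Minimal unit-test example (hypothetical helper, unittest/nose style).
import unittest
import numpy as np

def area_mean(field, weights):
    """Weighted spatial mean of a 2-D field."""
    return float(np.sum(field * weights) / np.sum(weights))

class TestAreaMean(unittest.TestCase):
    def test_uniform_field(self):
        # A spatially uniform field must have an area mean equal to its value,
        # regardless of the (positive) weights.
        field = np.full((3, 4), 2.5)
        weights = np.arange(1.0, 13.0).reshape(3, 4)
        self.assertAlmostEqual(area_mean(field, weights), 2.5)

if __name__ == "__main__":
    unittest.main()
```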

Automated testing: Nosetests (Python)
To test the results of a diagnostic, a special namelist file is executed by the ESMValTool which runs the diagnostic on a limited set of test data. A small test dataset is chosen to minimize the execution time of the tests while still ensuring that the diagnostic produces the correct results.
The following general tests are implemented at the moment (a sketch of the first two checks is given below):
- File availability check: verifies that all required output data have been successfully generated by the diagnostic.
- File checksum: currently the MD5 checksum is used to verify that the contents of two files are identical.
- Graphics check (planned extension): for graphics files, an additional test will be implemented which verifies that two graphical outputs are identical. This is particularly useful to verify that the outputs of a diagnostic remain the same after code changes.
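The following sketch shows one plausible way to implement the file-availability and MD5-checksum checks described above; the helper names are illustrative and not taken from the ESMValTool test suite.

```python
# Illustrative file-availability and MD5-checksum checks (helper names assumed).
import hashlib
import os

def missing_files(expected_files):
    """File availability: return the expected outputs that were not produced."""
    return [path for path in expected_files if not os.path.isfile(path)]

def md5sum(path, chunk_size=1 << 20):
    """File checksum: MD5 of a file, read in chunks to keep memory use small."""
    digest = hashlib.md5()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_reference(output_file, reference_file):
    """True if a diagnostic output is bit-identical to its reference file."""
    return md5sum(output_file) == md5sum(reference_file)
```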

Log files
Aim: transparency and reproducibility.
Log files contain (a sketch of writing such a log is given below):
- Creation date
- Host and user
- Version number of the ESMValTool
- List of namelists / diagnostics run
- Variables and models processed
- List of all model files used, including full path names and tracking IDs (read from the metadata, if available)
- Patches applied to the model data (if any)
- List of all observations used, including references
- Contributing authors and acknowledgement of projects
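Below is an assumed, minimal sketch of writing such a provenance log; the exact format and field names used by the ESMValTool may differ.

```python
# Illustrative provenance-log writer (format and field names assumed).
import datetime
import getpass
import socket

def write_log(path, version, namelists, model_files, observations):
    with open(path, "w") as log:
        log.write("Created:      %s\n" % datetime.datetime.utcnow().isoformat())
        log.write("Host / user:  %s / %s\n" % (socket.gethostname(), getpass.getuser()))
        log.write("Tool version: %s\n" % version)
        log.write("Namelists:    %s\n" % ", ".join(namelists))
        log.write("Model files:\n")
        for model_file in model_files:
            log.write("  %s\n" % model_file)  # full path; a tracking ID could be appended
        log.write("Observations: %s\n" % ", ".join(observations))
```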

Earth System Grid Federation (ESGF) Coupling: Overview
Aim: integration of the ESMValTool with the ESGF. In practice this means the ESMValTool:
- is aware of the ESGF directory structure
- is able to use the "Solr" index search to locate remote datasets
- helps the user/administrator download remote datasets
(A sketch of such an index search is given below.)
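The sketch below shows how a dataset search against an ESGF index node could look, using the public ESGF search API, which returns Solr-style JSON. The endpoint, parameters and response handling should be treated as an assumption rather than as the exact calls made by the ESMValTool; the example uses Python 2, matching the tool's stated requirements.

```python
# Illustrative ESGF/Solr dataset search (Python 2; endpoint and parameters assumed).
import json
import urllib
import urllib2

ESGF_SEARCH = "https://esgf-data.dkrz.de/esg-search/search"

def find_datasets(project, model, experiment, variable):
    params = {
        "project": project, "model": model, "experiment": experiment,
        "variable": variable, "format": "application/solr+json", "limit": 10,
    }
    url = ESGF_SEARCH + "?" + urllib.urlencode(params)
    response = json.load(urllib2.urlopen(url))
    return [doc["id"] for doc in response["response"]["docs"]]

# Example: find_datasets("CMIP5", "MPI-ESM-LR", "historical", "tas")
```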

ESGF Coupling: what has been achieved while the ESGF was offline
The ESMValTool can read files from an ESGF node cache:
- New ESGF-enabled project classes locate datasets in the node cache, so there is no need to specify a data directory.
- The path structure of each ESGF node cache is described in a new, XML-based ESGF configuration file.
- The structure of the node cache differs slightly between ESGF nodes; the ESMValTool takes this into account so that namelists are easily ported between ESGF sites (see the path-template sketch below).
- The new ESGF project classes and the configuration file are fully documented.
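To illustrate why a per-node description of the cache layout helps, the sketch below fills a node-specific, DRS-like path template with dataset facets. The template, facet names and resulting path are invented for illustration; the actual XML configuration and directory layouts differ between nodes.

```python
# Illustrative path-template resolution (template and facet names invented).
def resolve_path(template, **facets):
    """Fill a DRS-like path template with dataset facets."""
    return template.format(**facets)

example_template = ("/data/cmip5/{experiment}/{realm}/{frequency}/"
                    "{variable}/{model}/{ensemble}")
path = resolve_path(example_template, experiment="historical", realm="atmos",
                    frequency="mon", variable="tas", model="MPI-ESM-LR",
                    ensemble="r1i1p1")
# -> /data/cmip5/historical/atmos/mon/tas/MPI-ESM-LR/r1i1p1  (structure illustrative only)
```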

ESGF Coupling: what is envisaged once the ESGF is back online
The ESMValTool will find remote files using the "Solr" index search:
- If a dataset is not in the ESGF node cache, or there is no access to the node cache, the ESMValTool can search a custom local cache.
- If a dataset is not available locally, the ESMValTool will perform a Solr index search on the ESGF using the information provided in the model line of the namelist.
The ESMValTool will assist with downloading remote files:
- If a remote dataset is found, the ESMValTool will provide the URLs of the required files so that the user/administrator can download them.
- It may be possible to provide the user with a text file containing configuration information for Synda, making downloading files even easier.

Eyring et al., Geosci. Model Dev. Discuss., 8, 2015 (coming soon!)