WGClimate – The Joint CEOS/CGMS Working Group on Climate
Perspective for Cycle#3
Jörg Schulz
6th Meeting of the Joint CEOS/CGMS Working Group on Climate, March, Paris (France)

Content
- Enabling the ECV Inventory to answer more questions than the current gap analysis;
- Potential further development of the ECV Inventory:
  - Usage of the System Maturity Matrix and the Application Performance Metric;
  - Adding an FCDR Inventory?
  - Better coverage of ICDRs?
  - Adding requirements beyond those from GCOS, e.g., from research programmes if readily available?
- How do we proceed?

Questions not addressed by the Cycle#2 gap analysis
- What is the status of uncertainty characterisation of the records?
- What are the arrangements for the interoperability of archives? [This could address a gap in the coordination of archives, including adherence to internationally agreed metadata standards.]
- What applications are generally covered by current ECVs, and which known, needed applications are not?
- ...
- These are just exemplary questions; I guess there are more that could be asked.

CORE-CLIMAX: System Maturity Matrix
(Maturity levels 1–6, scored per category: Software Readiness; Metadata; User Documentation; Uncertainty Characterisation; Public Access, Feedback, Update; Usage.)

Maturity 1
- Software Readiness: Conceptual development.
- Metadata: None.
- User Documentation: Limited scientific description of the methodology available from PI.
- Uncertainty Characterisation: None.
- Public Access, Feedback, Update: Restricted availability from PI.
- Usage: None.

Maturity 2
- Software Readiness: Research grade code.
- Metadata: Research grade.
- User Documentation: Comprehensive scientific description of the methodology, report on limited validation, and limited product user guide available from PI; paper on methodology is submitted for peer review.
- Uncertainty Characterisation: Standard uncertainty nomenclature is identified or defined; limited validation done; limited information on uncertainty available.
- Public Access, Feedback, Update: Data available from PI, feedback through scientific exchange, irregular updates by PI.
- Usage: Research: benefits for applications identified. DSS: potential benefits identified.

Maturity 3
- Software Readiness: Research code with partially applied standards; code contains header and comments, and a README file; PI affirms portability, numerical reproducibility and no security problems.
- Metadata: Standards defined or identified; sufficient to use and understand the data and extract discovery metadata.
- User Documentation: Score 2 + paper on methodology published; comprehensive validation report available from PI and a paper on validation is submitted; comprehensive user guide is available from PI; limited description of operations concept available from PI.
- Uncertainty Characterisation: Score 2 + standard nomenclature applied; validation extended to full product data coverage; comprehensive information on uncertainty available; methods for automated monitoring defined.
- Public Access, Feedback, Update: Data and documentation publicly available from PI, feedback through scientific exchange, irregular updates by PI.
- Usage: Research: benefits for applications demonstrated. DSS: use occurring and benefits emerging.

Maturity 4
- Software Readiness: Score 3 + draft software installation/user manual available; 3rd party affirms portability and numerical reproducibility; passes data provider's security review.
- Metadata: Score 3 + standards systematically applied; meets international standards for the data set; enhanced discovery metadata; limited location-level metadata.
- User Documentation: Score 3 + comprehensive scientific description available from data provider; report on intercomparison available from PI; paper on validation published; user guide available from data provider; comprehensive description of operations concept available from PI.
- Uncertainty Characterisation: Score 3 + procedures to establish SI traceability are defined; (inter)comparison against corresponding CDRs (other methods, models, etc.); quantitative estimates of uncertainty provided within the product characterising more or less uncertain data points; automated monitoring partially implemented.
- Public Access, Feedback, Update: Data record and documentation available from data provider and under data provider's version control; data provider establishes feedback mechanism; regular updates by PI.
- Usage: Score 3 + Research: citations on product usage occurring. DSS: societal and economic benefits discussed.

Maturity 5
- Software Readiness: Score 4 + operational code following standards, actions to achieve full compliance are defined; software installation/user manual complete; 3rd party installs the code operationally.
- Metadata: Score 4 + fully compliant with standards; complete discovery metadata; complete location-level metadata.
- User Documentation: Score 4 + comprehensive scientific description maintained by data provider; report on data assessment results exists; user guide is regularly updated with updates on product and validation; description of practical implementation is available from data provider.
- Uncertainty Characterisation: Score 4 + SI traceability partly established; data provider participated in one international data assessment; comprehensive validation of the quantitative uncertainty estimates; automated quality monitoring fully implemented (all production levels).
- Public Access, Feedback, Update: Score 4 + source code archived by data provider; feedback mechanism and international data quality assessment are considered in periodic data record updates by data provider.
- Usage: Score 4 + Research: product becomes a reference for certain applications. DSS: societal and economic benefits are demonstrated.

Maturity 6
- Software Readiness: Score 5 + fully compliant with standards; turnkey system.
- Metadata: Score 5 + regularly updated.
- User Documentation: Score 5 + journal papers on product updates and on more comprehensive validation, including validation of the quantitative uncertainty estimates, are published; operations concept regularly updated.
- Uncertainty Characterisation: Score 5 + SI traceability established; data provider participated in multiple international data assessments and incorporates feedback into the product development cycle; temporal and spatial error covariance quantified; automated monitoring in place with results fed back to other accessible information, e.g. metadata or documentation.
- Public Access, Feedback, Update: Score 5 + source code available to the public and capability for continuous data provision established (ICDR).
- Usage: Score 5 + Research: product and its applications become references in multiple research fields. DSS: influence on decision and policy making demonstrated.
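To make the six-category structure above easier to work with, here is a minimal sketch of how an SMM assessment could be recorded and sanity-checked in code. It is purely illustrative: the category keys mirror the matrix above, while the dataset, the example scores, and the helper function are invented for this sketch.

# Illustrative sketch: recording a CORE-CLIMAX System Maturity Matrix (SMM)
# assessment for one climate data record. Scores run from 1 (conceptual) to 6.
# The example scores below are invented for illustration.

SMM_CATEGORIES = [
    "software_readiness",
    "metadata",
    "user_documentation",
    "uncertainty_characterisation",
    "public_access_feedback_update",
    "usage",
]

def validate_assessment(assessment: dict) -> None:
    """Check that every SMM category is present and scored between 1 and 6."""
    for category in SMM_CATEGORIES:
        score = assessment.get(category)
        if score is None:
            raise ValueError(f"missing SMM category: {category}")
        if not 1 <= score <= 6:
            raise ValueError(f"{category}: score {score} outside 1..6")

# Hypothetical assessment of a humidity CDR.
example = {
    "software_readiness": 4,
    "metadata": 5,
    "user_documentation": 4,
    "uncertainty_characterisation": 3,
    "public_access_feedback_update": 5,
    "usage": 4,
}

validate_assessment(example)
print("lowest-scoring category:", min(example, key=example.get))

A record like this could feed a Cycle#3 gap analysis by flagging the weakest category per data record.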

Sub-Matrix: Uncertainty Characterisation
(Sub-aspects scored per maturity level: Standards; Validation; Uncertainty Quantification; Automated Quality Monitoring.)

Maturity 1
- None (all sub-aspects).

Maturity 2
- Standards: Standard uncertainty nomenclature is identified or defined.
- Validation: Validation using external reference data done for limited locations and times.
- Uncertainty Quantification: Limited information on uncertainty arising from systematic and random effects in the measurement.
- Automated Quality Monitoring: None.

Maturity 3
- Standards: Score 2 + standard uncertainty nomenclature is applied.
- Validation: Validation using external reference data done for globally and temporally representative locations and times.
- Uncertainty Quantification: Comprehensive information on uncertainty arising from systematic and random effects in the measurement.
- Automated Quality Monitoring: Methods for automated quality monitoring defined.

Maturity 4
- Standards: Score 3 + procedures to establish SI traceability are defined.
- Validation: Score 3 + (inter)comparison against corresponding CDRs (other methods, models, etc.).
- Uncertainty Quantification: Score 3 + quantitative estimates of uncertainty provided within the product characterising more or less uncertain data points.
- Automated Quality Monitoring: Score 3 + automated monitoring partially implemented.

Maturity 5
- Standards: Score 4 + SI traceability partly established.
- Validation: Score 4 + data provider participated in one international data assessment.
- Uncertainty Quantification: Score 4 + temporal and spatial error covariance quantified.
- Automated Quality Monitoring: Score 4 + monitoring fully implemented (all production levels).

Maturity 6
- Standards: Score 5 + SI traceability established.
- Validation: Score 5 + data provider participated in multiple international data assessments and incorporates feedback into the product development cycle.
- Uncertainty Quantification: Score 5 + comprehensive validation of the quantitative uncertainty estimates and error covariance.
- Automated Quality Monitoring: Score 5 + automated monitoring in place with results fed back to other accessible information, e.g. metadata or documentation.
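A similar sketch for the sub-matrix above, rolling the four sub-aspect scores up into the single Uncertainty Characterisation score of the full matrix. The roll-up rule used here (taking the lowest sub-aspect score) is an assumption made for illustration only, not a CORE-CLIMAX definition, and the example scores are invented.

# Illustrative sketch: the Uncertainty Characterisation sub-matrix as four
# separately scored sub-aspects. Rolling them up by taking the minimum is an
# assumed rule for illustration only; the scores themselves are invented.

UNCERTAINTY_SUBASPECTS = [
    "standards",
    "validation",
    "uncertainty_quantification",
    "automated_quality_monitoring",
]

def uncertainty_category_score(sub_scores: dict) -> int:
    """Roll the four sub-aspect scores up into one category score.

    Assumption for this sketch: the category is no more mature than its
    weakest sub-aspect, so the minimum score is taken.
    """
    return min(sub_scores[aspect] for aspect in UNCERTAINTY_SUBASPECTS)

example_sub_scores = {
    "standards": 4,                      # SI traceability procedures defined
    "validation": 3,                     # validation over full product coverage
    "uncertainty_quantification": 3,     # comprehensive uncertainty information
    "automated_quality_monitoring": 2,   # monitoring methods defined only
}

print("uncertainty characterisation score:", uncertainty_category_score(example_sub_scores))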

Fitness for Purpose? Motivation for the Application Performance Metric (APM)
- The SMM assesses whether a data set can be sustained in terms of engineering, science, archiving, and usage aspects;
- There is no guarantee that a data set with high system maturity is suitable for all applications!
- For example, a data set X with an overall system maturity of FIVE/SIX that provides DAILY mean humidity values is NOT suitable for assessing the DIURNAL cycle of humidity in climate models;
- How do we assess the performance of a data set for a particular application?

Support Users to Select Data
- User requirements collection exercises show a large variability in the stated requirements of users with nominally similar applications;
- But a core set of typical questions can usually be isolated: Does the coverage of the record suffice? What original observations were used in the product? What methods were used to create the product? How does the quality vary in time? Is there a sufficient level of detail? Are the observations of adequate quality?
- Four core criteria (see the sketch below):
  - Coverage: Are the record length and spatial coverage meeting the application's requirements?
  - Sampling: Do the spatial and temporal sampling meet the application's requirements?
  - Uncertainty: Do the random and systematic uncertainties meet the requirements?
  - Stability: Does the temporal and spatial stability meet the requirements?
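These four criteria can be read as a small checklist comparing a record's properties with an application's stated needs. The sketch below is hypothetical throughout: the field names (record_length_years, temporal_sampling_hours, ...) and all threshold values are invented to illustrate the idea, reusing the daily-mean humidity versus diurnal-cycle example from the previous slide.

# Illustrative sketch: checking one CDR against an application's requirements
# for the four core criteria (coverage, sampling, uncertainty, stability).
# All field names and numbers are invented for the example.

def meets_requirements(record: dict, requirements: dict) -> dict:
    """Return a pass/fail flag per criterion."""
    return {
        "coverage": record["record_length_years"] >= requirements["min_record_length_years"]
                    and record["spatial_coverage"] == requirements["spatial_coverage"],
        "sampling": record["temporal_sampling_hours"] <= requirements["max_temporal_sampling_hours"],
        "uncertainty": record["random_uncertainty"] <= requirements["max_random_uncertainty"],
        "stability": record["stability_per_decade"] <= requirements["max_stability_per_decade"],
    }

humidity_cdr = {          # hypothetical daily-mean humidity product
    "record_length_years": 30,
    "spatial_coverage": "global",
    "temporal_sampling_hours": 24,
    "random_uncertainty": 0.8,
    "stability_per_decade": 0.1,
}

diurnal_cycle_study = {   # hypothetical requirements of a diurnal-cycle study
    "min_record_length_years": 10,
    "spatial_coverage": "global",
    "max_temporal_sampling_hours": 3,   # sub-daily sampling needed
    "max_random_uncertainty": 1.0,
    "max_stability_per_decade": 0.3,
}

print(meets_requirements(humidity_cdr, diurnal_cycle_study))
# The daily-mean product fails the sampling criterion, echoing the
# daily-mean versus diurnal-cycle example above.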

General Concept of the APM
[Diagram: application-specific user requirements are compared against a database of CDR Product Specification Tables drawn from the ECV Inventory, yielding the CDRs matching the user requirements (URs) best.]
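In code terms, the concept sketched in the diagram amounts to filtering and ranking the product specification tables held in an inventory against an application-specific set of user requirements. A minimal sketch, with an invented three-entry inventory and invented requirement values; a real implementation would work against the actual ECV Inventory product specification tables.

# Illustrative sketch of the APM concept: rank CDR product specification
# entries (a small invented inventory) by how many application-specific
# user requirements they satisfy. Names and values are hypothetical.

inventory = [
    {"name": "CDR-A", "record_length_years": 35, "temporal_sampling_hours": 24, "random_uncertainty": 0.5},
    {"name": "CDR-B", "record_length_years": 15, "temporal_sampling_hours": 3,  "random_uncertainty": 0.9},
    {"name": "CDR-C", "record_length_years": 8,  "temporal_sampling_hours": 1,  "random_uncertainty": 1.5},
]

user_requirements = {
    "min_record_length_years": 10,
    "max_temporal_sampling_hours": 6,
    "max_random_uncertainty": 1.0,
}

def satisfied(entry: dict, req: dict) -> int:
    """Count how many of the user requirements the entry meets."""
    checks = [
        entry["record_length_years"] >= req["min_record_length_years"],
        entry["temporal_sampling_hours"] <= req["max_temporal_sampling_hours"],
        entry["random_uncertainty"] <= req["max_random_uncertainty"],
    ]
    return sum(checks)

ranked = sorted(inventory, key=lambda e: satisfied(e, user_requirements), reverse=True)
for entry in ranked:
    print(entry["name"], satisfied(entry, user_requirements), "of 3 requirements met")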

Maturity Matrix and APM
- Using the System Maturity Matrix would allow a more targeted view of where data providers stand with respect to the process of generating ECV CDRs;
- For instance, it also covers the validation process more explicitly, which is currently only addressed in the documentation category;
- Would need to assess how the questionnaire can be combined with the maturity matrix – many questions are very similar;
- The APM would allow more specific questions to be answered in the gap analysis in the downstream direction of the architecture, e.g., case studies indicate specific data usages and needs which could form requirements used in an APM approach.

The Architecture for Climate Monitoring from Space
[Architecture diagram showing FCDR and ICDR components; implementation coordinated by the Joint CEOS–CGMS Working Group on Climate (JWG Climate).]

How do we proceed?
Timeline: 6th WGClimate → 7th WGClimate → 8th WGClimate
- I am happy to hear your opinions/ideas on this;
- Could start drafting an ECV Inventory change document in Q4/2016, taking into account first lessons learnt from Cycle#2;
- Need support from WGClimate members to complete/consolidate;
- 7th WGClimate – discuss and decide on developments of the ECV Inventory to implement for Cycle#3;
- 8th WGClimate – endorse developments and the updated baseline of the ECV Inventory for Cycle#3.