Larry Flynn and Manik Bali 2016 GSICS Executive Panel Meeting

Presentation transcript:

Defining GSICS Products and Deliverables and their Acceptance Criteria (Document #10)
Larry Flynn and Manik Bali
2016 GSICS Executive Panel Meeting, Biot, France, 3 June 2016 (transcript dated 20 November 2018)

Disclaimer: "The scientific results and conclusions, as well as any views or opinions expressed herein, are those of the author(s) and do not necessarily reflect the views of NOAA or the Department of Commerce."

GPPA Introduction: Actions Taken to Meet Challenges
Past Action: GCC to take a lead in discussing the GPPA with GSICS members to reduce the amount of time needed to move through the phases.
Actions from EP-16:
- a) GCC to finalize the high-level categorization of GSICS holdings and deliverables, ensuring that actual or planned products can be mapped to these categories, indicating a target audience and a delivery or access mode.
- b) GCC to propose appropriate review/acceptance mechanisms for each category of GSICS products (with a distinction between acceptance of the algorithms, and acceptance of the production of calibration coefficients or calibrated radiances).

Contents
- Proposed categorization
- Core products: examples, concerns and discussion
- Models and data sets: concerns and discussion
- Tools
- GSICS documents
- GPRC and OSCAR resources
- Proposed acceptance and maturity for each category

Email from Jérôme
Thinking further, I believe that the following are important criteria to categorize GSICS holdings and deliverables:
- Is it specific to a monitored instrument, is it applicable to a range of GSICS things, is it applicable further to GSICS-like things, or even broader to science in general?
- Is it the final deliverable, or only a by-product?
- Is it meant to be produced in real time (and then the routine delivery is critical), or is it for old records (then routine delivery makes no sense and should not be part of any acceptance procedure)?
With these questions in mind I produced the attached slide, where we have four broad categories:
- Core products (related to one monitored instrument, current or archived)
- Shared "science resources" (models, datasets), which are generally collaborative contributions to/from the broader science community
- Standards promoted by GSICS for calibration-related activities
- Tools used by GSICS developers (which can incidentally be shared beyond the developers if it helps, or even be developed for users, like the messaging/registration tool)

Schematic representation of GSICS holdings and deliverables (provided by Jérôme Lafeuille)
Core GSICS products related to a monitored instrument:
- Instrument behaviour assessment (monitoring results, statistics, records of anomalies)
- Instrument correction algorithm (bias, gain, non-linearity, λ shift, new SRF)
- Delivered services: near-real-time corrections, time-centered corrections, bias adjustment sequences
The schematic distinguishes intermediate products from the main certified products.

The second build of the schematic adds resources shared with the science community: solar spectra, lunar spectra, target reflectivities, calibration datasets, spectroscopy database, radiative transfer models (RTM).

The third build of the schematic adds two further categories:
- GSICS standards: data formats, metadata formats, thesaurus, procedures
- GSICS tools: SNO matchup, ray matching, bias monitoring, plotting
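
The near-real-time and re-analysis corrections in the schematic are typically distributed as regression coefficients rather than as corrected data. A minimal sketch of applying such a linear correction (the functional form and the coefficient values here are illustrative; each product's ATBD defines the actual form):

```python
# Sketch: applying a GSICS-style linear bias correction to brightness
# temperatures. The slope/offset values below are hypothetical; real
# corrections are distributed as netCDF files of coefficients per channel.

def apply_gsics_correction(tb, slope, offset):
    """Return a corrected brightness temperature.

    tb     : observed brightness temperature (K)
    slope  : regression slope from the correction coefficients
    offset : regression offset (K)
    """
    return slope * tb + offset

# Hypothetical coefficients for one IR channel
slope, offset = 1.0012, -0.35
observed = [250.0, 270.0, 290.0]
corrected = [apply_gsics_correction(t, slope, offset) for t in observed]
print(corrected)
```

A time-centered ("re-analysis") correction would use the same application step, but with coefficients derived from a time window centered on the observation rather than from the latest data only.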

Action a): Categorization of existing and potential GSICS holdings/commodities
- Core products: subject to the GPPA (stamp of approval)
- Models and data sets: these should be reviewed by GSICS prior to recommendation for use
- Tools: these are made available because GSICS members find them useful
- GSICS documents: these are related to products, protocols and procedures
- GPRC and OSCAR documents and resources: GSICS disclaimer for external website content

Core Products: Subject to GPPA
- Products currently in the Catalog
- Estimates of Level 1 changes other than simple biases:
  - biases with diurnal, seasonal or other variations
  - wavelength scale drift estimates
  - SRF estimates
- Estimates from vicarious calibration, especially for stability and transfer (DCC statistics, target reflectivities)
- Reference instrument records over some period of time
- A stand-alone ECVR, e.g., "this record is found to have xx accuracy and yy precision and zz stability over 20aa to 20bb for nighttime ocean scenes"
- Products connected with ECVRs or FCDRs:
  - a sequence of bias adjustments
  - a record segment that is adjusted from beginning to end and has good error estimates included

Discussion notes:
LF: Reference instruments: agencies will compare to what they know. Independent records are valuable. Every user will have their own scorecard for what is "best". Stability must be continually monitored; three reference instruments are good for this. IR comparisons to the sonde collection, GPSRO and AVIRIS. SBUV/2 CDR adjustment to NOAA-16 SBUV/2? To SAGE or MLS?
BI: Records from GEO instruments are not usually reprocessed.
LF: Do vicarious calibration methods require RTMs?
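
The "sequence of bias adjustments" item above, used to merge record segments into an ECV-ready time series, can be sketched as a simple additive adjustment per segment. All sample values and offsets below are invented, and real adjustment sequences also carry the error estimates the slide calls for:

```python
# Sketch: merging overlapping instrument record segments into one time
# series by applying a sequence of additive bias adjustments, as might
# be done when assembling an ECV record. Data and offsets are invented.

def merge_with_bias_sequence(segments, offsets):
    """Concatenate record segments after removing each segment's bias.

    segments : list of lists of (time, value) samples, one per instrument
    offsets  : list of additive bias adjustments, one per segment,
               relative to the chosen reference instrument
    """
    merged = []
    for seg, off in zip(segments, offsets):
        merged.extend((t, v - off) for t, v in seg)
    merged.sort(key=lambda tv: tv[0])
    return merged

seg_a = [(0, 100.2), (1, 100.4)]   # reference instrument
seg_b = [(2, 100.9), (3, 101.1)]   # successor, biased +0.3 vs. reference
merged = merge_with_bias_sequence([seg_a, seg_b], [0.0, 0.3])
print(merged)
```

In practice each offset would itself come from an overlap analysis (e.g., SNO statistics) and be time dependent; a constant per-segment offset is the simplest case.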

Concerns and Discussion
- The GPPA needs some modifications for some of these.
- Traceability is a continuum: which products are SI-traceable and which are not?
- Who are the users of each product and what are their requirements?
- The GPPA demands input for the reviews from users: how do products with the developer as primary user move forward? How do products that have served their purpose (e.g., were used to make an ECV) identify users?
- Do the adjusted ECV record holdings fall more naturally under a climate data program's domain?
- What is required to extend product maturity to comparisons among in-family sensors? Reference and target may be very different cases. (Existing action.)

Discussion notes:
BI: While SI is preferred, the reality is that most uncertainties are relative to non-SI references.
BI: For Essential Climate Variables, GSICS can link to lists of holdings, not recreate them.

Examples for Discussion
- Spectral Response Function (SRF) shift estimates
- Sequences of bias adjustments used to merge time-series components for ECVs
- Reference instruments (anchors, benchmarks, absolute)
- DCC-based bias adjustments
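
The first example, SRF shift estimates, can be illustrated numerically: shifting the response function changes the band-averaged signal whenever the underlying spectrum is not flat, which is what a shift-estimation product quantifies. A toy sketch (wavelength grid, SRF samples and the spectrum are all invented):

```python
# Sketch: effect of a spectral response function (SRF) shift on a
# band-averaged signal. Toy data only; a real analysis would use the
# measured SRF and a reference spectrum at much finer resolution.

def band_average(wavelengths, spectrum, srf_wl, srf, shift=0.0):
    """Band-averaged signal with the SRF shifted by `shift`
    (same units as wavelength), using linear interpolation of the SRF."""
    def srf_at(w):
        w = w - shift  # a shifted SRF samples the original at w - shift
        if w <= srf_wl[0] or w >= srf_wl[-1]:
            return 0.0
        for k in range(len(srf_wl) - 1):
            if srf_wl[k] <= w <= srf_wl[k + 1]:
                f = (w - srf_wl[k]) / (srf_wl[k + 1] - srf_wl[k])
                return srf[k] + f * (srf[k + 1] - srf[k])
        return 0.0
    weights = [srf_at(w) for w in wavelengths]
    total = sum(weights)
    return sum(wt * s for wt, s in zip(weights, spectrum)) / total if total else 0.0

wl = [10.0 + 0.1 * i for i in range(11)]          # toy wavelength grid
ramp = [100.0 + 10.0 * (w - 10.0) for w in wl]    # sloped toy spectrum
srf_wl, srf = [10.2, 10.5, 10.8], [0.0, 1.0, 0.0]  # triangular toy SRF
print(band_average(wl, ramp, srf_wl, srf),
      band_average(wl, ramp, srf_wl, srf, shift=0.1))
```

For a flat spectrum the two numbers coincide; for the sloped one the shifted band average moves, which is the signature such shift-estimation products look for.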

Models and Data Sets: Community Standards?
- Lunar spectra / GIRO + GLOD
- Reference solar spectra (or do we leave this up to CEOS IVOS or someone else?)
- Key calibration data sets (or should these live at GPRCs or OSCAR, with links at GSICS?)
- Target reflectivity/emissivity (or are these possible new products?)
- Emission lines and absorption constants (e.g., NIST): how do we agree on the best sources?
- Radiative transfer models (e.g., JCSDA CRTM): or should these be in the tools category? Are these needed for vicarious calibration?

Discussion notes:
LF: Some of these are related to vicarious calibration products.
LF: Survey the GRWG subgroups as to what they already use.
LF: Does GPSRO provide traceable calibration of IR sensors? Does it need an RTM?
LF: What about GRUAN and other ground-based assets?
LF: IGACO has a site on ozone absorption.

Concerns and Discussion
- These should be reviewed/scrutinized by the GSICS RWGs and recommended for use.
- We should require some amount of documentation on the theory and use.
- Each one should be considered case by case as to where it lives, who maintains it, what the GSICS associations are, and who will manage licensing and any restrictions on model and data use.
- Many key calibration data sets are simply accepted as the agency's best effort.
- They may also be discussed for consideration as products, that is, go through elements of the GPPA and other GSICS reviews to establish their uncertainties and error bounds.
- Are the primary users of these resources the GRWG members?
- Is external versus internal users a helpful classification criterion for what should become GSICS holdings?

Discussion notes:
LF: GSICS has a related action to identify key calibration parameters and characterization.
LF: GDWG has a related action to develop an approach for restricted access.
BI: The GPPA for these could simply say that they received the scrutiny of the community and have documents and other materials appropriate for their use.

Tools: GSICS Recommended?
- SNO matchup software
- FOV alignment and ray-matching tools
- Bias monitoring tool
- Readers for Level 1 data sets (or should these live at GPRCs?)
- SBAF plotting and other plotting tools, THREDDS servers, product generation environment, wiki
- Data processing tools developed within GSICS (e.g., netCDF, HDF and other kinds of database manipulation)
- Observed-versus-forecast models
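
The SNO matchup software listed above pairs near-coincident observations from two sensors. A minimal sketch of the matching step follows; the thresholds, sample records and flat-earth distance are illustrative simplifications, and operational SNO tools also predict orbits and screen viewing geometry:

```python
# Sketch: a minimal simultaneous-nadir-overpass (SNO) style matchup,
# pairing observations from two sensors that view nearly the same place
# at nearly the same time. Ignores longitude wrap-around and Earth
# curvature; records are (time_seconds, lat_deg, lon_deg).
from math import hypot

def sno_matchups(obs_a, obs_b, max_dt=300.0, max_dist_deg=0.5):
    """Return index pairs (i, j) where obs_a[i] and obs_b[j] fall within
    max_dt seconds and max_dist_deg degrees of each other."""
    pairs = []
    for i, (ta, lat_a, lon_a) in enumerate(obs_a):
        for j, (tb, lat_b, lon_b) in enumerate(obs_b):
            close_in_time = abs(ta - tb) <= max_dt
            close_in_space = hypot(lat_a - lat_b, lon_a - lon_b) <= max_dist_deg
            if close_in_time and close_in_space:
                pairs.append((i, j))
    return pairs

sat_a = [(0.0, 80.1, 10.0), (5000.0, -80.0, 170.0)]
sat_b = [(120.0, 80.3, 10.2), (9000.0, 0.0, 0.0)]
print(sno_matchups(sat_a, sat_b))  # only the first pair matches: [(0, 0)]
```

Matched pairs then feed the bias monitoring step, where the two sensors' collocated measurements are regressed against each other.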

Concerns and Discussion
- Some of these may or may not be available directly from GSICS websites.
- If the developers of tools want them to be recommended/approved for use by GSICS, we will need to review them and may require users' guides, ATBDs, etc.
- One could argue for moving the SNO matchup from this tools category to the previous one for models.
- Can we prescribe a minimum set of tools and documents that should be available for any instrument with measurements used in GSICS? If so, then we could provide a standard set of links to these resources as provided by each agency. Is this a new category tracking instrument maturity?

Discussion notes:
LF: There could be pages for each instrument giving an overview of the maturity of its record. This would evaluate the level of documentation, characterization, verification and validation. There could be different expectations for reference/anchor sensors versus target sensors.
LF: GSICS has a related action to develop instrument landing pages.
LF: The concepts of reference instruments (Tim) and instrument/record maturity (Mitch) intersect with the instrument resources.
IB: There could be a review page for each tool for feedback from users.

GSICS Documents
Examples of these are in the current GSICS holdings:
- Documents associated with products (e.g., ATBDs, reviews, users' guides, metadata rules)
- Documents associated with the GSICS mission, implementation, processes and procedures
- Spreadsheet of action items and their outcomes
- Quarterly newsletters
- Documents associated with meetings (presentations, minutes)
- GSICS technical notes (approval process? e.g., best practices)
- Websites and wiki
Some of these documents have procedures for reviews, while others will need approval from the EP as they enter the system or as their contents are updated.

Examples of GPRC and OSCAR Documents and Resources
- Instrument landing pages
- Instrument validation and laboratory testing reports
- Instrument and platform events and anomaly/status timeline compilations
- Instrument specifications and performance (or are these calibration data sets?)
- Product monitoring by the instrument operating agency
- Agency comparison and monitoring results not submitted as products
These may live at the GPRCs or other locations, e.g., OSCAR or NOAA ICVS. They could be listed (with links) as resources at the GCC. We should interact with the instrument operator to determine what links should be provided for each instrument at GSICS and where the content should live. We should interact with the agencies and the WMO if we think there is a need for additional content.

Discussion notes:
LF: GSICS has an action on providing event timelines. We should establish a minimum set of information for each instrument. There are actions looking at prelaunch data. The requirements may be difficult given sensor development schedules and vendor proprietary concerns.
LF: Many of these would become part of the instrument maturity evaluation materials.

Overarching Concerns
- What should each agency commit to provide along with an instrument's measurements? (E.g., SRFs, readers, FOVs, pointing, two-line elements from NORAD, ...)
- How do users and researchers find what they want, or even know it exists?
- Where should different resources live, and who maintains them?
- What happens to products after an instrument's end of life?

The GSICS Procedure for Product Acceptance (GPPA)
QA4EO guidelines: "The Quality Assurance Framework for Earth Observation consists of ten distinct guidelines linked in the Guidelines Framework." The roles involved are the Data Producer (GPAF), the Data Reviewer (GPERF), the Data User (GPERF) and the GSICS Executive Panel.
Acceptance is through meeting GSICS goals; maturity is acceptance plus satisfying user requirements.
The guidelines address quality indicators: traceability, reference (measurement) standard, and uncertainty.
Ref: QA4EO, The Guide.

Action b): Acceptance and maturity for the proposed classification of entities (see Talk 2.s)
- Core products (bias products, new additions, products connected with ECVs/FCDRs that directly follow the GSICS conventions): acceptance and maturity of the GPPA can be applied.
- GSICS resources and documents (GPRCs, ICVS, corrections not following GSICS conventions but meeting GSICS goals): acceptance and maturity of the GPPA can be applied partly.
- Models and data sets (GIRO, SBAF, solar and lunar references, intermediate datasets used to create core products, pre-launch key data sets): accepted in the subgroup, with one or all of the following conditions fulfilled: the model/data set is published, peer reviewed and internationally accepted, and has users within the group.
- Tools (SNO matchup, bias monitoring, display tools, readers, SBAF): accepted in the subgroup, under the same conditions as models and data sets.

Discussion notes:
LF: We will have to make it clear which products have passed which levels of scrutiny.
BI: Need to make clear to users and submitters which acceptance and maturity requirements apply to each submission.
LF: This may be a table with the components required for acceptance on one dimension, the product classes on the other, and A or NA entries.
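
The table LF suggests in the notes, acceptance components on one axis and deliverable classes on the other with A/NA entries, could be prototyped as below. Every entry and component name here is a placeholder for discussion, not an agreed assignment:

```python
# Sketch of the suggested acceptance matrix: rows are acceptance
# components, columns are deliverable classes, entries "A" (applicable)
# or "NA". All entries and component names are placeholders.
classes = ["Core Products", "Resources/Docs", "Models/Data Sets", "Tools"]
components = ["ATBD review", "User feedback",
              "Uncertainty estimates", "Peer-reviewed publication"]
matrix = {
    "ATBD review":               ["A",  "A",  "A",  "NA"],
    "User feedback":             ["A",  "NA", "A",  "A"],
    "Uncertainty estimates":     ["A",  "NA", "A",  "NA"],
    "Peer-reviewed publication": ["NA", "NA", "A",  "NA"],
}

# Print the matrix with a fixed-width layout
print("{:<26}".format("") + "".join("{:>18}".format(c) for c in classes))
for comp in components:
    print("{:<26}".format(comp)
          + "".join("{:>18}".format(e) for e in matrix[comp]))
```

A table like this would let submitters see at a glance which GPPA components apply to their class of deliverable, addressing BI's point above.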

THANK YOU
Background slides follow.

Content for Instrument Landing Pages
- Instrument specification (documents on the instrument's baseline physical basis of measurements and calibration)
- Instrument events (databases or documents with calibration-related events)
- Instrument monitoring (databases or documents following the state of the instrument and any degradation or changes)
- Data outages (databases or documents with instrument outage periods)

Jérôme's matrix mapping the GCC presentation to the FOR document

Products:
- GCC presentation: various GSICS products, but only some of them are subject to a product acceptance procedure.
- GSICS-FOR: various GSICS deliverables, but only some of them are called "Products"; all Products are subject to the product acceptance procedure.
- Possible revision: various GSICS deliverables, including "GSICS Products", which are routine corrections generated in accordance with GSICS-certified methods.

Corrections:
- GCC presentation: corrections and well-determined methods.
- GSICS-FOR: the corrections (e.g., regression coefficients) delivered to satellite data users are "Products", either NRT or (wrongly named) "re-analysis".
- Possible revision: Products are routine corrections applicable to calibration coefficients (regression coefficients and ancillary information TBD), either in NRT or delayed (centered time-averaging window), using certified algorithms. Products could also include corrected calibration coefficients (correction already applied).

Methods:
- GSICS-FOR: the methods to generate these NRT corrections (ATBDs) are mainly internal deliverables.
- Possible revision: certified algorithms, applicable (a) by climate users for archive reprocessing to generate FCDRs, and (b) by satellite operators to generate the GSICS NRT or delayed products.

FCDRs and ECVs:
- GSICS-FOR: algorithms to generate FCDRs (EP-15 clarified that FCDRs are not GSICS products; generating an FCDR from historical data requires another level of effort, close to archaeology!).

Databases, references and tools:
- GCC presentation: databases; references; tools.
- Possible revision: for the purpose of high-level communication, merge these three deliverables into one broad category.

Documentation:
- GCC presentation: documentation.

RTM:
- GCC presentation: RTM.
- Possible revision: RTMs are beyond the role of GSICS; the agreed Vision recommended partnership with the RTM community.

Open questions: Does GPSRO provide traceable calibration of IR sensors? Does it need an RTM? Does the MW community use spectral characterization in the CRTM?