Product Quality and Documentation – Recent Developments H. K. Ramapriyan Assistant Project Manager ESDIS Project, Code 423, NASA GSFC

Presentation transcript:

Product Quality and Documentation – Recent Developments
H. K. Ramapriyan, Assistant Project Manager, ESDIS Project, Code 423, NASA GSFC
Summer ESIP Meeting, July 8, 2014

Motivation
- Scientists (providers/data set producers) are motivated to provide high-quality products and have a stake in ensuring that their data are not misused.
- Users need to know the quality of the data they use.
- Quality can be expressed in many ways, which makes things difficult for both providers and users.
- Data centers are intermediaries: they need to simplify the providers' job of supplying quality information, and to express that information conveniently for users to access and understand.

Background
- QA4EO Guidelines (2010)
- NASA "Making Earth System Data Records for Use in Research Environments (MEaSUREs)" Product Quality Checklist (2012)
- NOAA Climate Data Records (CDR) Maturity Matrix (Bates and Privette, 2012) – see the sketch after this list
- Improving Data Quality Information for NASA Earth Observation Data (Lynnes et al., 2012)
- Obs4MIPs – Coupled Model Intercomparison Project (CMIP5) (2012)
- Committee on Earth Observation Satellites (CEOS) Essential Climate Variables (ECV) inventory questions (2012)
- National Center for Atmospheric Research (NCAR) Community Contributions Pages (2013)
- CEOS Working Group on Information Systems and Services (WGISS) Metadata Quality Exploration Questionnaire (2013)
- Global Earth Observation System of Systems (GEOSS) Data Quality Guidelines (2013)
- EU FP7 project CORE-CLIMAX assessment of European capacity to produce ECV climate data records from satellite, in situ, and reanalysis data – NOAA maturity matrix extended/revised (2013)
- ISO metadata standard for geographic information data quality (2013)
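For context, the NOAA CDR maturity matrix referenced above scores a data record on several fixed categories, each on a 1–6 scale. Below is a minimal illustrative sketch in Python; the category names follow Bates and Privette (2012), but the scoring values and the "weakest category" reading are assumptions for illustration, not the official assessment procedure.

```python
# Illustrative maturity-matrix record. Category names follow Bates and
# Privette (2012); scores and the summary logic are hypothetical.
from dataclasses import dataclass
from typing import Dict

CATEGORIES = (
    "software_readiness",
    "metadata",
    "documentation",
    "product_validation",
    "public_access",
    "utility",
)

@dataclass
class MaturityAssessment:
    dataset_id: str
    scores: Dict[str, int]  # category -> maturity level, 1 (lowest) to 6

    def weakest_category(self) -> str:
        """A record is often judged by its least mature aspect."""
        return min(self.scores, key=self.scores.get)

scores = dict.fromkeys(CATEGORIES, 4)  # placeholder level for most categories
scores["product_validation"] = 2       # e.g., validation still in progress
assessment = MaturityAssessment("example_cdr_v1", scores)
print(assessment.weakest_category())   # -> product_validation
```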

Product Quality Checklist
- Result of about two years of discussion in the Metrics Planning and Reporting Working Group, with MEaSUREs PIs and DAACs represented
- Distinguishes between "scientific data quality" and "product quality"
- Two separate checklists created: one for PIs and another for DAACs to fill out
- Recommendation made to HQ and approved
- Adopted and used for MEaSUREs 2006 projects
- Included in Cooperative Agreements for MEaSUREs 2012 projects

Product Quality Checklist – PIs (Project ESDR / ESDR Group Checklist)

Science Quality Level
1. Have the data been evaluated by external users? (Summarize results.)
2. Is the data set complete as proposed? (Explain 'partial'.)
3. Is the data set consistently processed as proposed? (Explain 'partial'.)
4. Are uncertainties estimated and documented, including in their spatial or temporal dimension?
5. Have the data been validated, i.e., 'assessed for uncertainties, to the extent possible by comparison with alternative measurements'?
6. Have differences between new products and any comparable existing products been documented? (Explain how and in what ways.)
7. Have promised improvements in the new data compared to existing products been achieved?
8. Have the ESDR's algorithm or analysis method, product description, and product evaluation results been published in peer-reviewed literature?

Documentation Quality Level
1. Is the data format well and completely described, and/or is a commonly accepted appropriate standard format used?
2. Has the data format description been provided to the DAAC?
3. Are the algorithm and processing steps described?
4. Has the description of the algorithm and processing steps been provided to the DAAC?
5. Is the metadata complete?
6. Is the documentation of the metadata complete?
7. Has documentation of the metadata been provided to the DAAC?

Usage and Satisfaction
1. If the project is distributing products, is the targeted community using the data? (Indicate trend.)
2. If the project is distributing products, is the broader community using the data? (Indicate trend.)
3. If the project is distributing products, are users satisfied with the data product? (Indicate trend.)

Product Quality Checklist – DAACs (DAAC ESDR / ESDR Group Checklist)

Science Quality Level
1. Have differences between new products and any comparable existing products been documented? (Summarize results.)

Documentation Quality Level
1. Is the data format well and completely described, and/or is a commonly accepted appropriate standard format used?
2. Are the algorithm and processing steps described?
3. Is the metadata complete?
4. Is the documentation of the metadata complete?

Accessibility / Support Services Quality
1. Is it easy for users to discover the data?
2. Is it easy for users to access the data?
3. Are tools and services that enable reading and use of the data readily available?
4. Are there existing tools for analysis of this data set?
5. Can users get help with discovery, access, and use of the data?

Usage and Satisfaction
1. For products distributed by the DAAC, is the targeted community using the data?
2. For products distributed by the DAAC, is the broader community using the data? (Indicate trend.)
3. For products distributed by the DAAC, are users satisfied with the data product? (Indicate trend.)
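The two checklists above are structured questionnaires, so they lend themselves to a machine-readable encoding that a project or DAAC could fill out and track over time. Here is a minimal sketch; the question texts are abridged, and the yes/partial/no answer scheme is an assumption suggested by the checklist's own instruction to "explain 'partial'".

```python
# Hypothetical machine-readable encoding of the product quality
# checklists; question wording abridged, answer scheme assumed.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ChecklistItem:
    section: str                       # e.g., "Science Quality Level"
    question: str
    answer: str                        # "yes" | "partial" | "no"
    explanation: Optional[str] = None  # expected when answer is "partial"

pi_checklist: List[ChecklistItem] = [
    ChecklistItem("Science Quality Level",
                  "Is the data set complete as proposed?",
                  "partial", "Final year of the record is still in production."),
    ChecklistItem("Documentation Quality Level",
                  "Has the data format description been provided to the DAAC?",
                  "yes"),
]

def open_items(items: List[ChecklistItem]) -> List[ChecklistItem]:
    """Items still needing attention before the product is considered done."""
    return [i for i in items if i.answer != "yes"]

for item in open_items(pi_checklist):
    print(f"[{item.section}] {item.question} -> {item.answer}")
```

One encoding like this could serve both the PI and DAAC checklists, since they differ only in their sections and questions.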

NCAR Climate Data Guide – Community Contributions Pages
- What are the key strengths of this data set?
- What are the key limitations of this data set?
- What are the typical research applications of these data? What are examples from your work?
- What are some common mistakes that users encounter when processing or interpreting these data?
- What are the likely spurious (non-climatic) features, if any, of time series derived from these data?
- What corrections were applied to account for changes in observing systems, sampling methods or density, and satellite drift or degradation?
- Describe any conversion steps that are necessary, or general strategies to compare these data with model output.
- What are some comparable data sets, if any? Why use this data set instead of another?
- How is uncertainty characterized in these data?
- Provide a summary statement about these data and their utility for climate research and model evaluation.

CEOS WGISS Metadata Quality Exploration Questionnaire
- Why did you choose this dataset for the survey?
- How does your organization define "fitness for purpose" for this dataset?
- What quality measures do you use to assess scientific quality?
- How are quality measures created initially?
- How do you store quality measures in your metadata? (One common approach is sketched after this list.)
- Are uncertainties estimated and documented, including in their spatial or temporal dimension?
- Have the data been validated, i.e., 'assessed for uncertainties, to the extent possible by comparison with alternative measurements'?
- Have the algorithm or analysis method, product description, and product evaluation results been published in peer-reviewed literature?
- Is the data set evaluated by external users? If so, how are their comments captured?
- Any other relevant comments regarding quality metadata.
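The "how do you store quality measures in your metadata?" question is often answered in practice by attaching quality information directly to the product file. Below is a minimal sketch using the netCDF4 Python library with CF-style attributes (standard_name, units, ancillary_variables, flag_values, flag_meanings); the variable names, data values, and file name are invented for illustration.

```python
# Sketch: embedding quality measures in product metadata with netCDF4.
# Attribute names follow CF conventions; values are illustrative only.
import numpy as np
from netCDF4 import Dataset

ds = Dataset("example_product.nc", "w")
ds.createDimension("time", 3)

sst = ds.createVariable("sst", "f4", ("time",))
sst.standard_name = "sea_surface_temperature"
sst.units = "K"
sst.ancillary_variables = "sst_uncertainty sst_quality_flag"
sst[:] = [288.1, 288.4, 287.9]

# Per-value uncertainty estimate, addressing the "are uncertainties
# estimated and documented?" style of question
unc = ds.createVariable("sst_uncertainty", "f4", ("time",))
unc.units = "K"
unc[:] = [0.3, 0.3, 0.5]

# Discrete quality flag with its meanings documented in the metadata
flag = ds.createVariable("sst_quality_flag", "i1", ("time",))
flag.flag_values = np.array([0, 1, 2], dtype="i1")
flag.flag_meanings = "good suspect bad"
flag[:] = [0, 0, 1]

ds.close()
```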