MPARWG: Deborah K. Smith, DISCOVER MEaSUREs Project, Remote Sensing Systems (Oct ESDSWG Mtg, New Orleans)


- Dr. Frouin's Criteria Table
- Strawman tables in Greg Hunolt's background notes
- Telecon in August, with the suggestion of answering a series of questions as a means to determine quality. Do the answers to the questions lead to a metric of overall quality?
- Peter's presentation this morning on quality feedback from the survey, and his opinions

A "straw-man proposal" is a brainstormed, simple proposal intended to generate discussion of its disadvantages and to provoke the generation of new and better proposals. Often, a straw-man document will be prepared by one or two people prior to kicking off a larger project. In this way, the team can jump-start their discussions with a document that is likely to contain many, but not all, of the key aspects to be discussed. (Wikipedia)

- A project with heritage: we have been producing microwave ocean data products and making them available since 1996 under NASA Pathfinder funding.
- We have produced, and continue to produce and distribute, many ocean products. Each product is assessed for quality before release to the public.
- We are currently releasing F16 and F17 SSMIS ocean products; I will use these as an example in this talk.

1. Is the data set complete?
2. Are any gaps confirmed and documented?
3. Are the data acceptably intercalibrated to previous data?
4. Do the data products look as expected? (Has a human eye checked the data set?)
5. Are overall statistics within the expected range?
6. Are statistics for sub-regions or sub-time frames consistent with expectations and previous data?

7. Are comparison statistics with "truth" or other data (such as buoy, ship, or model winds) within the expected range?
8. Is the data format consistent with previous data and what users expect?
9. Are files read correctly by read routines, and if not, have changes been made?
10. Have we completed or updated product documentation?

11. Have we informed the users of file format, processing steps, and algorithm changes/specifics?
12. Is a data validation file produced?
13. Has all web and FTP text been updated?
14. Have images been made, and do the web tools to display them work correctly?
15. Have we described to users the differences to expect?
16. Is the data product like any other available, and if so, how does it compare?

17. Who have we created this data set for, and will it meet those user needs?
18. Do our tools work on the new data products?
19. Are any new tools needed?
20. What advances have occurred since we last asked these questions, and should we change?

- Have we checked the data within the extended time series?
- Are there any spurious trends in the data?
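As a concrete illustration, a checklist like the one above could be tracked as simple structured data so that unanswered or failed items are easy to tally before a release. The sketch below is hypothetical Python; the Question and ReleaseChecklist classes and their fields are illustrative, not an existing RSS or MPARWG tool.

```python
# Hypothetical sketch: recording the release checklist as data so answers can
# be tallied. Question texts are taken from the slides; class and field names
# are invented for illustration.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Question:
    number: int
    text: str
    answer: Optional[bool] = None   # True = yes, False = no, None = not yet assessed
    notes: str = ""

@dataclass
class ReleaseChecklist:
    product: str
    questions: list = field(default_factory=list)

    def unanswered(self):
        return [q for q in self.questions if q.answer is None]

    def failed(self):
        return [q for q in self.questions if q.answer is False]

    def summary(self) -> str:
        total = len(self.questions)
        answered = total - len(self.unanswered())
        return (f"{self.product}: {answered}/{total} questions answered, "
                f"{len(self.failed())} flagged for follow-up")

# Example usage with two of the twenty questions
checklist = ReleaseChecklist(
    product="F17 SSMIS ocean products",
    questions=[
        Question(1, "Is the data set complete?", answer=True),
        Question(2, "Are any gaps confirmed and documented?"),
    ],
)
print(checklist.summary())
```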

- But what do the answers to these questions mean?
- What does the program want to know?
- Is an external body needed to determine the quality? If so, who? (Program? DAAC? Other scientists? General public?)

Science Quality Level

Criteria: (a) To what degree have the data been validated? (b) To what degree do the data fit within already existing products? (c) To what degree is the data set complete and consistently processed? (d) To what degree is the data set used in the community? (e) To what degree are the data accurate and precise?

Level 4: (a) Comparisons by both the project and other scientists find similar results. (b) Time series analyses demonstrate the quality of the intercalibration. (c) Data set is complete, consistently processed, documents known gaps, and extends previous values. (d) Some users are redistributing the data (power users). (e) Measurements are both accurate and precise, and derived values show expected and accepted ranges.

Level 3: (a) Comparisons to known references. (b) Extensive intercalibration performed. (c) Data set complete and consistently processed. (d) Many routine users. (e) Accuracy and precision have been demonstrated by many.

Level 2: (a) Comparisons to other data products. (b) Preliminary intercalibration performed. (c) Data set consistently processed but has gaps. (d) Many users. (e) Preliminary point comparisons show data are accurate and precise.

Level 1: (a) No comparisons made by the project. (b) No intercalibration, but biases are known. (c) Data set has gaps and is not consistently processed. (d) Few users. (e) No assessment has been made.
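One way to keep a rubric like this usable consistently across products is to encode it as data rather than prose. The sketch below is a hypothetical Python encoding of part of the Science Quality Level table; the criterion keys and the describe helper are invented labels for illustration, and only two of the five criteria are shown.

```python
# Hypothetical sketch: part of the Science Quality Level rubric as a nested
# dict (criterion -> level -> description) so level definitions can be looked
# up the same way for every product. Criterion keys are shortened labels.
SCIENCE_QUALITY_RUBRIC = {
    "validation": {
        4: "Comparisons by both the project and other scientists find similar results",
        3: "Comparisons to known references",
        2: "Comparisons to other data products",
        1: "No comparisons made by the project",
    },
    "completeness": {
        4: "Complete, consistently processed, known gaps documented, extends previous values",
        3: "Complete and consistently processed",
        2: "Consistently processed but has gaps",
        1: "Has gaps and is not consistently processed",
    },
    # The remaining criteria (intercalibration, community use,
    # accuracy/precision) would follow the same pattern.
}

def describe(criterion: str, level: int) -> str:
    """Return the rubric text for a criterion at a given level (1-4)."""
    return SCIENCE_QUALITY_RUBRIC[criterion][level]

print(describe("validation", 3))  # -> "Comparisons to known references"
```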

Document Quality Level

Criteria: (a) To what degree is the data format described and usable? (b) To what degree are the algorithm and processing steps described? (c) To what degree can a new user begin using the data?

Level 4: (a) Data format is a commonly used standard accessible by 3rd-party software. (b) Peer-reviewed publication describes the algorithm and data production. (c) Can be using the data within 15 minutes and can confirm the data are accurate.

Level 3: (a) Data format is described and read routines are provided to users. (b) ATBD provided by the data producer. (c) Can use the data accurately within a reasonable amount of time.

Level 2: (a) Data format is described. (b) Informal description provided with the data. (c) Time is required to understand how to use the data.

Level 1: (a) No data format information is provided. (b) No description provided. (c) Many hours and support needed to make sense of the data.

Accessibility / Support Services Quality

Criteria: (a) How easy is it for users to read the data? (b) How easy is it for users to obtain the data? (c) To what degree can the user get help?

Level 4: (a) Widely used standard file format. (b) User can access the data with clients such as OPeNDAP. (c) Full support services available.

Level 3: (a) Limited-use file format; some common tools can be used. (b) User can view and obtain the data using common tools. (c) User can contact the data producer.

Level 2: (a) Simple binary format with read routines. (b) User can view the data on the web. (c) Online FAQ.

Level 1: (a) Format described but no read routines available. (b) User must download the data and write their own program to access them. (c) No help.

Overall Quality Level

Criteria: (a) To what degree are users satisfied with the data product? (b) What is the score from the sub-tables? (c) To what degree is the targeted community using the data?** (d) To what degree is the broader community using the data?

Level 4: (a) Few questions to support services. (b) Score 10-12. (c) Significant use. (d) Routinely used by journalists, students, and applications.

Level 3: (a) Some questions about the data. (b) Score 7-10. (c) Routine use. (d) Used by journalists.

Level 2: (a) Questions about the data and format. (b) Score 4-7. (c) Some use. (d) Some use by universities, researchers, etc.

Level 1: (a) Many questions. (b) Score 0-3. (c) No use. (d) No use is evident.

** This can be obtained from user metrics already being collected and from citation metrics.
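The "score from sub-tables" column suggests a simple roll-up: sum the three sub-table levels (science, documentation, accessibility/support, each 1-4) and bin the result. The slide's score ranges overlap at their boundaries (4-7, 7-10, 10-12), so the hypothetical Python sketch below resolves ties upward; this is one interpretation, not a rule stated in the slides.

```python
# Hypothetical sketch of the overall-quality roll-up implied by the table
# above. The binning at the overlapping boundaries is an assumption.

def overall_quality_level(science: int, documentation: int, accessibility: int) -> int:
    """Map the summed sub-table levels (each 1-4) to an overall level 1-4."""
    for level in (science, documentation, accessibility):
        if not 1 <= level <= 4:
            raise ValueError("each sub-table level must be between 1 and 4")
    score = science + documentation + accessibility   # ranges from 3 to 12
    if score >= 10:
        return 4
    if score >= 7:
        return 3
    if score >= 4:
        return 2
    return 1

# Example: a well-validated product with good documentation but basic access tools
print(overall_quality_level(science=4, documentation=3, accessibility=2))  # -> 3
```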

- Agreed-upon definitions of all terms
- Understanding of how questions carry meaning across projects (if any)
- An understanding of what Martha wants
- An understanding of the value of this information and how to communicate it
- Agreement on what the questions will be