Report to Plenary
H. K. (Rama) Ramapriyan, NASA/GSFC
Clyde Brown, SSAI - NASA/LaRC
Metrics Planning and Reporting (MPAR) WG
9th Earth Science Data Systems Working Group
New Orleans, LA, October 2010
MPARWG Highlights
Martha Maiden, NASA HQ, approved the MPARWG recommendations for Citation Metrics and the Quad-Chart format for Impact Metrics.
Two major topics were discussed:
- MEaSUREs – DAACs Best Practices
- Measuring Product Quality
MPARWG Regular Business:
- Disposition of 2009 Action Items
- Summary of FY2010 Metrics Reporting – by Projects and to NASA HQ
- ACCESS Project Metrics (Steve Berrick, NASA HQ)
- Service Metrics Case
- EMS Update (Kevin Murphy)
- Metrics example from a MEaSUREs Project (Indrani Kommareddy)
- Implementation of Citation Metrics
- Action Items from 2010 Meeting
MEaSUREs – DAACs Best Practices
Eight DAACs reported on their progress working with the MEaSUREs projects from which they will receive products for archive and distribution.
- All reported good progress in coordinating the transition of products to DAACs. The earlier a project and DAAC began coordinating, the better the process worked, e.g., with regard to product formats and metadata standards.
- Some difficulties were reported stemming from late designation of project-DAAC pairings.
Three subgroups discussed recommendations for ways to improve the transition process, both now and for future programs:
- A key recommendation was to establish the project-DAAC partnership as early as possible; in some cases this could mean DAACs having Co-Is or collaborators on projects.
- Information about DAACs should be included in calls for proposals.
- Responsibilities of the project and the DAAC should be made clear from the outset, and appropriate resources provided.
- Adoption of widely used standard formats for products and standards for metadata will promote both interoperability and broader use of the MEaSUREs products.
Next Steps: Information presented by the DAACs, report-outs from the subgroups, and notes from the discussions will be incorporated into a draft white paper that will conclude with draft recommendations for NASA HQ. The draft will be circulated to projects and DAACs for review and comment, then finalized and sent to NASA HQ.
Measuring Product Quality
The MEaSUREs projects' goal is production of high-quality Earth System Data Records (ESDRs). "Quality" includes usability as well as literal science quality; usability includes documentation, formatting, support, etc.
The goal of program-level product quality metrics is to measure, for NASA HQ, a project's progress toward meeting its goals and the overall progress of the MEaSUREs program.
This meeting followed a telecon held on August 17, 2010, which concluded by outlining an approach to developing product quality metrics:
- Decide on criteria for assessing quality (starting with the list suggested by Robert Frouin).
- Develop questions whose answers would indicate progress on each criterion.
- Develop high-level metrics based on responses to those questions.
Kamel Didan and Deborah Smith presented a theoretical elaboration of the approach (Kamel) and a strawman example based on an actual project's experience (Deborah). The example posed questions and a range of possible answers for three categories of quality: intrinsic science quality, documentation quality, and quality of accessibility and supporting services.
Measuring Product Quality, Continued
A wide-ranging discussion ensued, raising many questions and concerns:
- Whether product quality metrics would measure the status of a project's work rather than product quality per se (actually both).
- Some strawman metrics seemed to be measures of DAACs (accessibility and support) rather than of projects (actually both, since projects have to deliver products and documentation that the DAACs can make readily accessible and support effectively).
- How product quality metrics would be used by NASA HQ (not to compare projects, but to measure progress toward goals for projects and the program).
The group decided to review the questions in Deborah Smith's example, modifying some and deleting some as inappropriate. The group will consider converting the questions to a yes/no checklist format, with the progression of "yes" answers over time measuring progress (see the sketch below).
Next Steps: A summary of the discussion will be prepared and circulated to projects and DAACs, and a follow-up telecon will be scheduled to continue the work. The final goal remains development of program-level product quality metrics.
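As a purely illustrative aid, not part of the MPARWG material, the sketch below shows one way such a yes/no checklist could be tallied over successive reporting periods. The checklist items, reporting periods, and scoring rule are hypothetical assumptions chosen for illustration, not an agreed metric.

# Hypothetical sketch: score a yes/no product-quality checklist per reporting period.
# Checklist items below are invented for illustration only.
from typing import Dict

CHECKLIST = [
    "Algorithm theoretical basis document available",
    "Product format follows a widely used standard",
    "Metadata meets the agreed standard",
    "Product accessible through the assigned DAAC",
    "User support contact documented",
]

def checklist_score(answers: Dict[str, bool]) -> float:
    """Fraction of checklist items answered 'yes' (0.0 to 1.0)."""
    return sum(answers.get(item, False) for item in CHECKLIST) / len(CHECKLIST)

# The progression of 'yes' answers over time indicates progress toward project goals.
reports = {
    "FY2010 Q4": {CHECKLIST[0]: True, CHECKLIST[1]: True},
    "FY2011 Q2": {item: True for item in CHECKLIST[:4]},
}
for period, answers in reports.items():
    print(f"{period}: {checklist_score(answers):.0%} of checklist items complete")

Under these assumed inputs, the printed fractions rise from 40% to 80%, which is the kind of simple progression-over-time view the checklist idea is meant to provide.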
Action Items from 2010 MPARWG
Action 2010-1: Greg Hunolt to draft a white paper on MEaSUREs – DAACs Best Practices and provide the draft to the MPARWG co-chairs.
Action 2010-2: Co-chairs to refine the draft and circulate it to projects and DAACs for review and comment.
Action 2010-3: Rama to send final recommendations to NASA HQ.
Action 2010-4: Greg Hunolt to prepare a summary of the Product Quality Metrics discussion and provide the draft to the MPARWG co-chairs.
Action 2010-5: Co-chairs to refine and circulate the summary to projects and DAACs and schedule a telecon to continue the work.
Action 2010-6: Conduct a telecon with ACCESS projects to define suitable metrics.
Action 2010-7: Work out a Quad-Chart format for ACCESS projects.
Action 2010-8: All to provide citation metrics by April 2011.
Action 2010-9: MCT/E-Books team to implement a method to accept citation metrics.