1 INDEPENDENT TECHNOLOGY ASSESSMENT WITHIN THE ESIP TESTBED Annie Burgess (ESIP), Erin Robinson (ESIP), and John Graybeal (Stanford University). ESIP is supported by 170+ member organizations. www.esipfed.org | @ESIPfed | #ESIPfed

2 EVALUATION NEED Earth science technologies benefit from diverse perspectives and end-user input.

3 BACKGROUND NASA’s Advanced Information Systems Technology (AIST) Program supports information technology development. A Technology Readiness Level (TRL) is used to assess project maturity. TRL assessment has been internal only, and research-to-operations transitions have suffered from surprises in technology adoption.
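
The deck does not spell out the TRL scale itself. For readers unfamiliar with it, below is a minimal sketch of NASA's standard 1-9 TRL scale as a simple lookup table; the level wording is paraphrased and the helper function is illustrative only, not part of the AIST assessment process.

```python
# Minimal sketch: NASA's standard 1-9 Technology Readiness Level (TRL) scale
# as a lookup table. Level wording is paraphrased; the helper function is
# illustrative only and not part of the AIST assessment process.
TRL_SCALE = {
    1: "Basic principles observed and reported",
    2: "Technology concept and/or application formulated",
    3: "Analytical and experimental proof of concept",
    4: "Component validation in a laboratory environment",
    5: "Component validation in a relevant environment",
    6: "System/subsystem prototype demonstration in a relevant environment",
    7: "System prototype demonstration in an operational environment",
    8: "Actual system completed and qualified through test and demonstration",
    9: "Actual system proven through successful mission operations",
}

def describe_trl(level: int) -> str:
    """Return a short description for a TRL value, e.g. describe_trl(6)."""
    if level not in TRL_SCALE:
        raise ValueError(f"TRL must be between 1 and 9, got {level}")
    return f"TRL {level}: {TRL_SCALE[level]}"
```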

5 ESIP/AIST COLLABORATION ESIP was tasked with: Providing an independent assessment of AIST project TRL. Identifying opportunities/roadblocks for project infusion within the broader Earth science community.

6 EVALUATION GOALS Achieve consistency, traceability and defensibility of evaluation results. Be recognized as comprehensive and fair. Provide a valuable experience for PIs and project evaluators.

7 EVALUATION COMPONENTS

8

9 TIMELINE (Gantt chart spanning lead time and Aug 1 - Dec 1 in two-week increments; activities carried out by AIST, ESIP, and the evaluators.) Phases, in order: Project Selection, PI Communications, Evaluator Selection, Intro. Telecons + Planning, Testing Period, Evaluators Final Report Prep., ESIP Final Report Prep., Reports Submitted to AIST.

10 WORKFLOW AIST Selects Projects → ESIP Selects Evaluators → Evaluators/AIST PIs Create Test Plan → Evaluators Carry Out Testing → Evaluators Fill Out Evaluation Structure → Evaluators Submit Final Report Content → ESIP Edits and Submits Reports to AIST. Project selection: 3 projects. Evaluator selection: 8 evaluators (1 private sector, 1 non-profit, 2 federal government, 4 academic), with ESIP soliciting suggestions from the PIs and reaching out to the community.

11 WORKFLOW (continued) Test plan creation: evaluators and AIST PIs hold 1-2 telecons to plan the evaluation, covering access restrictions and software readiness.

12 WORKFLOW (continued) Evaluation structure: evaluators fill out a checklist for software development best practices.

13 CHECKLIST FOR SOFTWARE DEVELOPMENT BEST PRACTICES 200+ questions about software development, grouped by: Supportability, Portability, Testability, Accessibility, Community, Governance, Licensing, Copyright, Installability, Buildability, Learnability, Documentation, Understandability, Friendliness.
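
The deck does not reproduce the 200+ questions themselves. The sketch below only illustrates one way such a category-grouped checklist could be represented for evaluators to fill out; the category names come from the slide, while the sample questions, field names, and scoring helper are hypothetical and not the actual ESIP/AIST checklist.

```python
# Illustrative sketch only: one possible representation of a category-grouped
# best-practices checklist for evaluators. Category names come from the slide;
# the sample questions and field names are hypothetical.
from dataclasses import dataclass, field

@dataclass
class ChecklistItem:
    category: str              # e.g. "Documentation", "Installability"
    question: str              # one of the 200+ yes/no questions
    answer: bool | None = None # None until the evaluator answers it
    notes: str = ""

@dataclass
class Checklist:
    items: list[ChecklistItem] = field(default_factory=list)

    def completion(self) -> float:
        """Fraction of questions the evaluator has answered so far."""
        if not self.items:
            return 0.0
        answered = sum(1 for item in self.items if item.answer is not None)
        return answered / len(self.items)

# Hypothetical example entries
checklist = Checklist(items=[
    ChecklistItem("Documentation", "Is there a quick-start guide for new users?"),
    ChecklistItem("Installability", "Can the software be installed from documented steps alone?"),
    ChecklistItem("Licensing", "Does the repository include an explicit open-source license?"),
])
checklist.items[0].answer = True
print(f"{checklist.completion():.0%} complete")
```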

15 WORKFLOW (continued) Final reports: evaluators submit the report content (content was KEY!), ESIP edits and submits the reports to AIST, AIST reviews the reports, and PIs receive the final reports.

16 EXAMPLE FINAL REPORT

17 FEEDBACK FROM EVALUATORS Positives: multiple communication choices; compensation; selecting evaluators from potential infusion sites. Negatives: short evaluation-period lead time; software best-practices checklist overly complex.

18 LESSONS LEARNED Start the evaluation with a technical exchange meeting covering motivation, challenges (e.g., technology access), and infusion potential. Diversify evaluation teams with both domain and technical expertise. Give the PI the option to review the report BEFORE it goes to AIST. Manage privacy of PI/evaluator communications. Each evaluation will be different; coordination and communication are KEY!

19 OUTLOOK Provide the Earth sciences community with a novel, needed evaluation framework to improve technology development and infusion potential.

20 PARTICIPANTS Evaluators PIs

21 COMMUNITY FEEDBACK

22 THANK YOU! Contact: annieburgess@esipfed.org. ESIP is supported by 170+ member organizations. www.esipfed.org | @ESIPfed | #ESIPfed

