Metrics and Evaluation

Annual Inventory:
– Began in FY09
– Basic program data across the portfolio (number of students, teachers, volunteers)
– Analysis done in the spring of each year (FY11 underway)

Four-Part Evaluation Strategy:
1. Baseline Metrics / Output Measures: All programs currently collect a set of baseline "measures of performance" or "output measures."
2. Impact Metrics / Measures of Effectiveness: Beginning in FY12, all programs in which the Navy invests more than $200,000 annually will collect a set of Impact Measures.
3. Return on Investment: In FY12, Return on Investment calculations will be developed for the programs with the greatest strategic value and highest investment levels.
4. Formal Program Evaluation: Beginning in FY13, one or two programs will be selected for formal evaluation annually, according to generally accepted evaluation methods.

Metrics Tools: participant data, pre- and post-surveys, retention rates, longitudinal studies, collaboration rates, hiring surveys
Element 1: Baseline Metrics: Measures of [Program] Performance / Inputs and Outputs

All programs within the Navy's STEM portfolio collect a baseline set of "measures of performance" or "output" measures. These metrics do not measure student/participant "change" (learning, awareness, academic focus/graduation, Navy interest, etc.).

Inputs:
– Program Purpose and Goals
– Target Audiences
– Annual Budget
– Participant Numbers and Demographics
– Average contact hours per participant
– [Volunteer Participation]
– [Period of Performance]
– [Program Duration]
– [Topic Area]
– [Desired Program Outcomes]
– [Mode of Participation (face to face or online)]

Outputs:
– Program Purpose and Goals
– Curriculum/Content Resources: # of updates and/or # of new curriculum resources developed
– Annual Budget: Is the program self-sustaining from outside sponsors? Program cost per student / decrease in cost; ratio of admin cost to total cost of program (see the sketch below)
– Participant Numbers & Demographic Information: increase/decrease (n/%); repeating students; repeating students in the next program level
– Volunteer Numbers: increase/decrease (n/%), Naval and non-Naval
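A minimal sketch of how a few of the derived output measures above (cost per student, admin-cost ratio, year-over-year change) might be computed from annual inventory records. The field names and figures are hypothetical illustrations, not the actual S2S database schema or real program data.

```python
# Sketch: derived output measures from hypothetical annual-inventory fields.
# All names and numbers below are illustrative assumptions.

def cost_per_student(total_cost: float, participants: int) -> float:
    """Program cost per student for one fiscal year."""
    if participants <= 0:
        raise ValueError("participants must be > 0")
    return total_cost / participants

def admin_cost_ratio(admin_cost: float, total_cost: float) -> float:
    """Ratio of administrative cost to total program cost."""
    return admin_cost / total_cost

def year_over_year_change(current: int, prior: int) -> tuple[int, float]:
    """Increase/decrease in a count, reported both as n and as a percentage."""
    delta = current - prior
    pct = 100.0 * delta / prior if prior else float("nan")
    return delta, pct

# Example with made-up FY figures:
print(cost_per_student(250_000, 400))     # 625.0 dollars per student
print(admin_cost_ratio(40_000, 250_000))  # 0.16
print(year_over_year_change(400, 320))    # (80, 25.0) -> +80 students, +25%
```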
Element 2: Impact Metrics: Measures of Effectiveness for Participants

All Naval programs supported by at least $200,000 of direct Naval investment will be required (if feasible) to collect Impact Measures, or Measures of Effectiveness (MOEs). These measures help determine whether the program is having an impact in terms of achieving its stated goals.

Outputs (increase/decrease):
– Program Purpose and Goals
– Curriculum/Content Resources: # of updates and/or # of new curriculum resources developed
– Annual Budget: Is the program self-sustaining from outside sponsors? Program cost per student / decrease in cost; ratio of admin cost to total cost of program
– Participant Numbers & Demographic Information: increase/decrease (n/%); repeating students; repeating students in the next program level
– Volunteer Numbers: increase/decrease (n/%), Naval and non-Naval

Impacts (change):
– Measurement / evaluation standards employed
– Tracking data evaluated and stored in a database
– # of new approaches / interventions developed, found to be effective, and adopted at scale
– Survey results used to improve learning
– Quantitative evidence of achieving stated goals

Participating Students
Pre-Survey:
– # of students with knowledge of S/E
– # of students with awareness of what S&Es do
– # of students with an aspiration to become a scientist or engineer
– # of students with an aspiration to work at a Navy lab
Post-Survey (see the sketch below):
– % increase in knowledge of S/E
– % increase in awareness of what S&Es do
– % of students who express interest in STEM
– # of students who can perform the task(s) taught
– % increase in students aspiring to become scientists or engineers
– % increase in students aspiring to work at a Navy lab
– [# of students with improved test scores]
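A small sketch of the pre/post comparison implied by the survey measures above: the percentage change in students answering "yes" to each item between the pre-survey and post-survey. The item labels and counts are hypothetical, not actual survey instruments or results.

```python
# Sketch: pre/post survey comparison for impact measures.
# Item labels and counts are illustrative assumptions.

def percent_increase(pre_count: int, post_count: int) -> float:
    """Percentage change in the number of students answering 'yes' to an item."""
    if pre_count == 0:
        return float("inf") if post_count > 0 else 0.0
    return 100.0 * (post_count - pre_count) / pre_count

pre = {"knows_S_E": 120, "aware_of_SE_work": 90, "aspires_to_SE": 60, "aspires_navy_lab": 25}
post = {"knows_S_E": 170, "aware_of_SE_work": 150, "aspires_to_SE": 85, "aspires_navy_lab": 40}

for item in pre:
    print(f"{item}: {percent_increase(pre[item], post[item]):+.1f}%")
```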
Element 3: Return on Investment

ROI basic calculation:
– Earnings (output) divided by investment (input)
– Rarely sufficient to determine "value"
– Depends on many variables, including program goals, what constitutes success, timing, and the comparative value of alternative investments

ROI for Naval Education and Outreach programs:
– One output measure is the number of participants completing the program divided by program investment (direct/indirect costs); see the sketch below. Many Navy STEM programs have data for this simple ROI calculation.
– Many programs have varying goals and/or definitions of success, so other output measures (tied to program goals) will need to be identified.
– A formula for determining the Navy's Return on Investment will be developed for a subset of the Navy's STEM portfolio. Only a handful of existing programs currently collect data sufficient to support a simple ROI calculation based on Output Measures. These programs will serve as test cases for developing effective ROI formulas for selected K-12 programs and for Higher Education programs.
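A minimal sketch of the simple ROI output measure described above: participants completing the program divided by total program investment (direct plus indirect costs). The function name and input figures are hypothetical; a fuller formula would weight other goal-specific output measures.

```python
# Sketch: simple participants-per-dollar ROI measure.
# Inputs are illustrative; real values would come from the annual data call.

def simple_roi(completers: int, direct_costs: float, indirect_costs: float) -> float:
    """Participants completing the program per dollar of total investment."""
    investment = direct_costs + indirect_costs
    if investment <= 0:
        raise ValueError("investment must be positive")
    return completers / investment

# Example: 350 completers against $200,000 direct + $50,000 indirect investment.
roi = simple_roi(350, 200_000, 50_000)
print(f"{roi * 1000:.2f} completers per $1,000 invested")  # 1.40
```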
Element 4: Formal Program Evaluation

– Detailed evaluation of a program relative to its own metrics and to "benchmark" metrics of other, similar programs
– These evaluations tend to be expensive and time consuming
– Select one or two major programs annually
– The model is the National Research Council's 2008 evaluation of NASA's Precollege Education Program:

Evaluation of NASA's K-12 education program and its related projects is challenging and requires significant resources and expertise in evaluation. The program goals are broad, and the projects are diverse in their scope and design. The goal of engaging students in STEM activities is particularly challenging for evaluation because "engagement" is difficult to measure, and it requires tracking over time. In addition, NASA's K-12 education projects, in an attempt to address local or regional issues, often vary from location to location, and evaluation design must take that variation into account. [p. 90]
2012 Timeline to Guide Evaluation Activities

– Jan/Feb: Revamp of the S2S Database and FY11 data call; Round 1 testing of Impact Measure survey tools
– Mar/Apr: Initial report of FY11 Output Measures; Round 2 testing of Impact Measure survey tools
– May/June: Application of Impact Measure survey tools to the initial 29 projects
– July/Aug: Initial report of Impact Measures for FY11
– Sept/Oct: Application of Impact Measure survey tools to the remaining portfolio of programs >$200,000
– Nov/Dec: Full report of Impact Measures

Five-year outlook: sufficient longitudinal data to:
– Evaluate fully: faculty programs, Higher Education programs, and high school internship and work-study programs (SEAP, SCEP, STEP)
– Evaluate partially: programs such as Youth Exploring Science (YES)

Ten-year outlook: sufficient longitudinal data to:
– Fully evaluate the Navy's signature K-12 programs, such as Sea Perch, Iridescent, and YES.