
1 CNCS Evaluation Highlights
Carla Ganiel, Senior Program and Project Specialist, AmeriCorps State and National

2 Federal Policy Context

Presidential Administrations and Federal Guidance:
- President Clinton (1993–2001): Government Performance and Results Act of 1993 (GPRA)
- President Bush (2001–2009): Program Assessment Rating Tool
- President Obama (2009–2017): GPRA Modernization Act of 2010; Office of Management and Budget Memoranda:
  - M-10-01 Increased Emphasis on Program Evaluation
  - M-12-14 Use of Evidence and Evaluation in the 2014 Budget
  - M-13-17 Next Steps in the Evidence and Innovation Agenda
  - M-14-07 Fiscal Year 2016 Budget Guidance, Evidence and Evaluation

3 Federal Evidence Initiatives

Tiered Evidence Initiatives
- Direct more resources to initiatives with strong evidence
- Study and scale the most promising program models
- Examples: CNCS Social Innovation Fund, Department of Education Investing in Innovation Fund (i3)

Pay for Success
- Federal funds are invested only after programs demonstrate results

Evidence Clearinghouses
- Repositories of evidence on existing program models
- Examples: CNCS Evidence Exchange, Department of Education What Works Clearinghouse

4 CNCS Evaluation Requirements

Code of Federal Regulations (45 C.F.R. §§ 2522.500–.540 and .700–.740), finalized July 8, 2005

5 CNCS Long-Term Evaluation Agenda

Implement research and evaluation to:
- Advance the agency's mission
- Accommodate agency-wide evidence priorities
- Illuminate the agency's most effective policies, programs, and practices

6 Building Evidence of Effectiveness

A continuum from evidence informed to evidence based:

Identify a strong program design
- Gather evidence supporting the intervention
- Design/adopt a strong program
- Develop a logic model
- Create implementation materials
- Pilot implementation

Ensure effective implementation [Performance Measures - Outputs]
- Document program process(es)
- Ensure fidelity in implementation
- Evaluate the program's quality and efficiency
- Establish continuous process improvement protocols

Assess the program's outcomes [Performance Measures - Outcomes]
- Develop indicators for measuring outcomes
- Conduct pre-/post-intervention evaluation to measure outcomes
- Conduct process evaluation

Obtain evidence of positive program outcomes
- Examine the linkage between program activities and outcomes
- Perform multiple pre- and post-evaluations (time series design)
- Conduct independent (unbiased) outcome evaluation(s)
- Conduct meta-analysis of various studies

Attain strong evidence of positive program outcomes
- Establish a causal linkage between program activities and intended outcomes/impact (e.g., quasi-experimental evaluation using a comparison group, evaluation with random assignment (RCT), regression analysis, or other appropriate study design); see the sketch after this slide
- Conduct multiple independent evaluations using strong study designs
- Measure cost effectiveness compared to other interventions addressing the same need
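The pre/post and comparison-group designs listed on this slide can be illustrated with a minimal sketch. The Python example below uses entirely hypothetical scores and group sizes (none of these numbers come from the presentation) to show the basic logic of comparing a program group's pre/post gains against a comparison group's gains, which is the reasoning behind the quasi-experimental designs named above.

```python
# Minimal sketch of a pre/post outcome comparison with a comparison group.
# All data below are hypothetical placeholders, not CNCS data.
from scipy import stats

# Hypothetical pre- and post-intervention scores for program participants
treat_pre = [52, 48, 61, 55, 47, 58, 50, 53]
treat_post = [63, 59, 70, 62, 55, 69, 61, 64]

# Hypothetical scores for a comparison group that did not receive the program
comp_pre = [51, 49, 60, 54, 48, 57, 52, 50]
comp_post = [54, 50, 63, 56, 49, 60, 53, 52]

# Pre/post gain for each individual in each group
treat_gain = [post - pre for pre, post in zip(treat_pre, treat_post)]
comp_gain = [post - pre for pre, post in zip(comp_pre, comp_post)]

# Compare average gains across groups (comparison-group logic)
t_stat, p_value = stats.ttest_ind(treat_gain, comp_gain)
print(f"Mean gain (program group):    {sum(treat_gain) / len(treat_gain):.1f}")
print(f"Mean gain (comparison group): {sum(comp_gain) / len(comp_gain):.1f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A real evaluation would, of course, involve a defensible sampling plan, validated instruments, and an appropriate study design; this sketch only shows where the comparison-group reasoning enters the analysis.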

7 AmeriCorps Grant Making

NOFO
- Levels of Evidence: more points for stronger evidence
- Theory of Change: effectiveness of the intervention

Evaluation requirements and plans are not scored. CNCS provides feedback on evaluation reports and evaluation plans after funding decisions have been made.

8 Common Challenges

Staff Capacity
- Understanding evaluation requirements and how to meet them
- Hiring and communicating with evaluators
- Fear

Cost
- Average amount budgeted was $10,000
- Median amount budgeted was $3,000
- Realistic evaluation costs are 15–25% of the total program budget (see the sketch after this slide)

Timing
- Year 1: Planning; engage an external evaluator if applicable
- Year 2: Conduct the evaluation
- Year 3: Analyze and complete the report; submit to CNCS
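To make the budget gap concrete, here is a small arithmetic sketch. The $250,000 total program budget is a hypothetical figure chosen purely for illustration; only the 15–25% range and the $10,000/$3,000 amounts come from the slide.

```python
# Hypothetical illustration of the evaluation budget gap described above.
# The total program budget is an assumed example figure, not CNCS data.
program_budget = 250_000          # hypothetical total program budget ($)
low, high = 0.15, 0.25            # realistic evaluation cost range from the slide

realistic_low = program_budget * low    # $37,500
realistic_high = program_budget * high  # $62,500

average_budgeted = 10_000   # average amount grantees budgeted (from the slide)
median_budgeted = 3_000     # median amount grantees budgeted (from the slide)

print(f"Realistic evaluation range: ${realistic_low:,.0f} - ${realistic_high:,.0f}")
print(f"Average budgeted: ${average_budgeted:,} "
      f"({average_budgeted / realistic_low:.0%} of the low end of the range)")
print(f"Median budgeted:  ${median_budgeted:,}")
```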

9 Challenges, continued

Data collection systems are not adequate
- If a program is struggling to collect performance measurement data, it will struggle with evaluation
- Focus on setting up solid data collection systems in the first three years of the grant

Program model is not clearly defined
- Multiple, loosely defined interventions
- Lack of standardization in program delivery across sites
- Multi-site, multi-focus intermediaries

10 Successful Evaluation Approaches

- Focus the evaluation on one (or a few) key research questions
- Budget adequately
- Hire an external evaluator you are comfortable with and stay engaged in the process
- Plan your evaluation with the goal of using the results to improve your program

11 Successful Approaches, continued

- Use existing data collection instruments, processes, or administrative data to lower data collection costs
- Build a long-term research agenda designed to increase the evidence base over time

12 CNCS Evaluation Resources

Evaluation Core Curriculum
- Ongoing webinar series
- Courses and other resources online: http://www.nationalservice.gov/resources/evaluation

Technical Assistance

13 Other Resources

- Results for America: http://results4america.org/
- The Arnold Foundation: http://www.arnoldfoundation.org/initiatives/
- Grantmakers for Effective Organizations: http://www.geofunders.org/smarter-grantmaking/learn-for-improvement

