CNCS Evaluation Highlights
Carla Ganiel, Senior Program and Project Specialist, AmeriCorps State and National

Federal Policy Context
President Clinton (1993-2001): Government Performance and Results Act of 1993 (GPRA)
President Bush (2001-2009): Program Assessment Rating Tool (PART)
President Obama (2009-2017): GPRA Modernization Act of 2010, plus Office of Management and Budget memoranda:
–Increased Emphasis on Program Evaluation
–Use of Evidence and Evaluation in the 2014 Budget
–Next Steps in the Evidence and Innovation Agenda
–Fiscal Year 2016 Budget Guidance, Evidence and Evaluation

Federal Evidence Initiatives
Tiered Evidence Initiatives
–Direct more resources to initiatives with strong evidence
–Study and scale the most promising program models
–Examples: CNCS Social Innovation Fund, Department of Education Investing in Innovation Fund (i3)
Pay for Success
–Federal funds are invested only after programs demonstrate results
Evidence Clearinghouses
–Repositories of evidence on existing program models
–Examples: CNCS Evidence Exchange, Department of Education What Works Clearinghouse

CNCS Evaluation Requirements
Set out in the Code of Federal Regulations (45 C.F.R.), finalized July 8, 2005

CNCS Long-Term Evaluation Agenda
Implement research and evaluation to:
–Advance the agency's mission
–Accommodate agency-wide evidence priorities
–Illuminate the agency's most effective policies, programs, and practices

Building Evidence of Effectiveness
A continuum from evidence-informed to evidence-based:

1. Identify a strong program design
–Gather evidence supporting the intervention
–Design/adopt a strong program
–Develop a logic model
–Create implementation materials
–Pilot the implementation

2. Ensure effective implementation [performance measures: outputs]
–Document program process(es)
–Ensure fidelity in implementation
–Evaluate the program's quality and efficiency
–Establish continuous process improvement protocols

3. Assess the program's outcomes [performance measures: outcomes]
–Develop indicators for measuring outcomes
–Conduct pre-/post-intervention evaluation to measure outcomes
–Conduct a process evaluation

4. Obtain evidence of positive program outcomes
–Examine the linkage between program activities and outcomes
–Perform multiple pre- and post-evaluations (time series design)
–Conduct independent (unbiased) outcome evaluation(s)
–Conduct a meta-analysis of the various studies

5. Attain strong evidence of positive program outcomes
–Establish a causal linkage between program activities and intended outcomes/impact, e.g. a quasi-experimental evaluation using a comparison group (sketched below), an evaluation with random assignment (RCT), regression analysis, or another appropriate study design
–Conduct multiple independent evaluations using strong study designs
–Measure cost effectiveness compared to other interventions addressing the same need
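To make the comparison-group design in stage 5 concrete, here is a minimal sketch, not a CNCS tool: the data are invented and it assumes the pandas and statsmodels Python packages are available. In a difference-in-differences regression, the interaction coefficient estimates the program effect net of the change observed in the comparison group.

```python
# Minimal sketch of a comparison-group (quasi-experimental) estimate.
# All data below are hypothetical; assumes pandas and statsmodels.
import pandas as pd
import statsmodels.formula.api as smf

# Pre/post outcome scores for participants (treated=1) and a
# matched comparison group (treated=0).
df = pd.DataFrame({
    "score":   [62, 65, 61, 64, 70, 74, 71, 73,
                63, 64, 62, 65, 66, 67, 65, 68],
    "treated": [1, 1, 1, 1, 1, 1, 1, 1,
                0, 0, 0, 0, 0, 0, 0, 0],
    "post":    [0, 0, 0, 0, 1, 1, 1, 1,
                0, 0, 0, 0, 1, 1, 1, 1],
})

# Difference-in-differences: the treated:post coefficient is the
# estimated program effect after netting out the comparison group's trend.
model = smf.ols("score ~ treated * post", data=df).fit()
print(round(model.params["treated:post"], 1))  # 6.0 with these invented data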

AmeriCorps Grant Making
NOFO (Notice of Funding Opportunity)
–Levels of evidence: more points for stronger evidence
–Theory of change: effectiveness of the intervention
Evaluation requirements and plans are not scored
CNCS provides feedback on evaluation reports and evaluation plans after funding decisions have been made

Common Challenges
Staff capacity
–Understanding evaluation requirements and how to meet them
–Hiring and communicating with evaluators
–Fear
Cost (see the worked sketch below)
–Average amount budgeted was $10,000
–Median amount budgeted was $3,000
–Realistic evaluation costs are 15-25% of the total program budget
Timing
–Year 1: plan; engage an external evaluator if applicable
–Year 2: conduct the evaluation
–Year 3: analyze, complete the report, and submit it to CNCS
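A worked version of the cost arithmetic, with hypothetical figures. It also shows how a few large budgets can pull an average far above the median, which is what the $10,000 vs. $3,000 gap on this slide suggests.

```python
# Hypothetical figures illustrating the slide's budgeting guidance.
from statistics import mean, median

# Rule of thumb: evaluation costs run 15-25% of the total program budget.
program_budget = 200_000  # invented total program budget, in dollars
low, high = 0.15 * program_budget, 0.25 * program_budget
print(f"Evaluation budget range: ${low:,.0f} to ${high:,.0f}")  # $30,000 to $50,000

# Invented budgets showing how one large budget lifts the mean to $10,000
# while the typical (median) grantee budgets only $3,000.
budgets = [2_000, 2_500, 3_000, 3_500, 39_000]
print(mean(budgets), median(budgets))  # 10000 3000
```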

Challenges, continued
Data collection systems are not adequate
–A program that struggles to collect performance measurement data will also struggle with evaluation
–Focus on setting up solid data collection systems in the first three years of the grant (a minimal sketch follows below)
Program model is not clearly defined
–Multiple, loosely defined interventions
–Lack of standardization in program delivery across sites
–Multi-site, multi-focus intermediaries
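One way to act on the data-systems advice is to standardize participant-level records across sites from year one, so the same records serve performance measurement now and evaluation later. A minimal sketch with an invented schema, using Python's built-in sqlite3; a real program would adapt the fields to its own performance measures.

```python
# Hypothetical participant-level data collection schema (illustration only).
import sqlite3

conn = sqlite3.connect("program_data.db")  # invented file name
conn.execute("""
    CREATE TABLE IF NOT EXISTS participant_outcomes (
        participant_id TEXT NOT NULL,
        site_id        TEXT NOT NULL,  -- same coding across all sites
        intervention   TEXT NOT NULL,  -- one clearly defined service model
        measure        TEXT NOT NULL,  -- e.g. a named reading assessment
        timepoint      TEXT NOT NULL CHECK (timepoint IN ('pre', 'post')),
        value          REAL,
        collected_on   TEXT            -- ISO date, e.g. '2016-02-25'
    )
""")
conn.commit()
conn.close()
```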

Successful Evaluation Approaches
Focus the evaluation on one (or a few) key research questions
Budget adequately
Hire an external evaluator you are comfortable with, and stay engaged in the process
Plan your evaluation with the goal of using the results to improve your program

Successful Approaches, continued
Use existing data collection instruments, processes, or administrative data to lower data collection costs
Build a long-term research agenda designed to increase the evidence base over time

CNCS Evaluation Resources
Evaluation Core Curriculum
–Ongoing webinar series
–Courses and other resources online
Technical Assistance

Other Resources
Results for America
The Arnold Foundation
Grantmakers for Effective Organizations: grantmaking/learn-for-improvement