AmeriCorps Grantee Training: Evaluation and Research, September 11, 2014

Presentation transcript:

AmeriCorps Grantee Training Evaluation and Research September 11, 2014

Session Objectives
- Review the policy context for evaluation
- Review evaluation requirements
- Share examples of grantee evaluation practices
- Share resources

Federal/State Policy Context
- Federal: Evaluating Programs: Requirements and Procedures (45 CFR §§ ). CNCS finalized these regulations on July 8, 2005.
- State: Program Design Policy E.3, Evaluation: "Competitive programs need to follow evaluation requirements as described in the AmeriCorps Regulations…Formula programs are not required to conduct an evaluation." Policy adopted in 2009.

Definition of Performance Measurement
Performance measurement is the process of systematically and regularly collecting and monitoring data related to the direction of observed changes in communities, participants (members), or end beneficiaries receiving your program’s services. It is intended to provide an indication of your program’s operations and performance. §
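
A minimal sketch of what this tracking can look like in practice, using hypothetical measures and targets (not a CNCS tool or a required format):

    # Performance measurement: compare regularly collected counts
    # against pre-set targets. Measures and numbers are hypothetical.
    targets = {"students_tutored": 500, "students_with_improved_scores": 300}
    actuals = {"students_tutored": 420, "students_with_improved_scores": 275}

    for measure, target in targets.items():
        pct = 100 * actuals[measure] / target
        print(f"{measure}: {actuals[measure]} of {target} ({pct:.0f}% of target)")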

Definition of Evaluation
…evaluation uses scientifically-based research methods to assess the effectiveness of programs by comparing the observed program outcomes with what would have happened in the absence of the program. §
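
In potential-outcomes notation (an illustration; the slides do not use this notation), the comparison in this definition can be written as:

    % Program impact: observed outcomes minus the counterfactual
    % (what would have happened in the absence of the program).
    \text{Impact} \;=\; \underbrace{E[\,Y \mid \text{program}\,]}_{\text{observed outcomes}}
    \;-\; \underbrace{E[\,Y \mid \text{no program}\,]}_{\text{the counterfactual}}

The second term is what evaluation designs try to approximate, since it can never be observed directly for the people the program actually served.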

Comparing Performance Measurement and Evaluation

What is it?
- Performance Measurement: A system of tracking progress in accomplishing specific pre-set targets (activities, outputs, and/or outcomes).
- Evaluation: A formal scientific process for collecting, analyzing, and interpreting data about how well a program was implemented (process evaluation) or how effectively the program accomplished desired outcomes/impacts (outcome/impact evaluation).

Why is it typically used?
- Performance Measurement: To gauge program delivery, quality, participant satisfaction, and engagement; to improve products, services, and efficiency; to inform and enhance decision making; and to support planning and program development.
- Evaluation: To assess program effectiveness and determine whether the program is responsible for changes found.

How does it work?
- Performance Measurement: Monitors a few vital signs related to program objectives, outputs, and/or outcomes.
- Evaluation: Comprehensively examines programs using systematic, objective, and unbiased procedures in accordance with social science research methods and research designs.

Who typically does it?
- Performance Measurement: Program staff.
- Evaluation: An experienced researcher (often external to the program).

When is it done?
- Performance Measurement: On an ongoing basis.
- Evaluation: Periodically.

Building Evidence of Effectiveness

Evaluation Study Designs and Causal Impact
Designs differ in the comparison they use and, with it, in their ability to support statements about causal attribution:
- Experimental Design: randomly assigned groups
- Quasi-Experimental Design Studies: statistically matched groups
- Non-Experimental Design Studies: groups (or a single group) that are not statistically matched

Evaluation Study Designs & Causal Impact Experimental Design Studies Quasi-Experimental Design Studies Non-Experimental Design Studies Random assignment to treatment and control groups Controls for differences b/w the two groups so differences in outcomes can be attributed to whether or not individuals participated in the program Uses two groups, but no random assignment, often due to practical considerations Carefully match the two groups at beginning of evaluation to be confident they are basically the same Subsequent observed differences b/w groups will be due to whether or not individuals participated in program services Do not meet the requirements for experimental or quasi- experimental designs Can also include process and implementation evaluations that make sure plans are followed

Evaluation Study Designs and CNCS Requirements
* Fulfills the CNCS evaluation requirement for large grantees if a reasonable comparison group is identified and appropriate matching/propensity scoring is used in the analysis.
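
The matching/propensity scoring the footnote refers to can be sketched as follows. The data are simulated, and the use of numpy/scikit-learn is an assumption for illustration, not something the slides prescribe:

    # Quasi-experimental sketch: build a comparison group by matching each
    # participant to the non-participant with the closest propensity score.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 400
    traits = rng.normal(size=(n, 3))  # observed characteristics
    joined = rng.random(n) < 1 / (1 + np.exp(-traits[:, 0]))  # self-selection
    outcome = traits @ np.array([2.0, 1.0, 0.5]) + 4.0 * joined + rng.normal(size=n)

    # 1. Estimate each person's probability of participating (the propensity).
    ps = LogisticRegression().fit(traits, joined).predict_proba(traits)[:, 1]

    # 2. Match each participant to the non-participant with the nearest score.
    t_idx, c_idx = np.where(joined)[0], np.where(~joined)[0]
    nearest = c_idx[np.abs(ps[t_idx][:, None] - ps[c_idx][None, :]).argmin(axis=1)]

    # 3. Estimate impact as the mean outcome gap across matched pairs.
    print(f"Matched impact estimate: {np.mean(outcome[t_idx] - outcome[nearest]):.2f}")

Nearest-neighbor matching with replacement is only one of several matching strategies; the key idea is that the matched groups should look alike on observed characteristics before their outcomes are compared.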

Evaluation Requirements
- Competitive Programs
  - Over $500,000
  - Under $500,000
- Formula Programs

Evaluation Requirements: Competitive ($500K+)
- Independent evaluation
- Quasi-experimental or experimental design study
- Cover at least one year of operation
- Submit evaluation with any application for competitive funds §

Evaluation Requirements: Competitive (less than $500K)
- Internal evaluation
- Quasi-experimental or experimental design (optional)
- Cover at least one year of operation
- Submit evaluation with any application for competitive funds §

Competitive Evaluation Requirements: What Is Due When?

Evaluation Requirements: Formula Programs
- State commissions establish evaluation requirements for formula grantees
- CV does not require formula grantees to conduct evaluations (per the 2014 Program Design Policies)

Strengthening the Evidence Base: Formula and Competitive (less than $500K)
- Select the study design most appropriate for the developmental phase of your AmeriCorps program (process, implementation, outcome, or impact)
- Experimental or quasi-experimental designs are encouraged but not required
- Ensure the program design is based on or adapted from a similar program that has evidence from an evaluation

Evaluation & Grant Review  2014 applications scored and placed into one of four tiered evidence levels. Applications with a stronger evidence base received more points.  Evaluation reports submitted are assessed in terms of the quality of the evaluation designs and the studies’ findings. These assessments may be used to inform CVs/CNCS’s consideration of the selection criteria and for the purpose of clarifying or verifying information in the proposals.  Applicants that failed to submit evaluation reports as required had points removed.

Research and Evaluation Grantee Panel
- Stephanie Biegler, Chief Program Officer, Birth & Beyond/Child Abuse Prevention Council
- Atalaya Sergi, Deputy Director, Jumpstart CA/Jumpstart
- Julie McClure and Sara Sitch, Director and Assistant Director, CalSERVES Volunteer Infrastructure Program/Napa County Office of Education
- Matt Aguiar, Chief of Staff, Reading Partners CA/Reading Partners

Questions?

Resources Available
- Evaluation FAQs:
- Electronic CFR:
  1. Select Agency List from the left navigation bar
  2. Scroll down to CNCS; select 45 CFR Chapters XII, XXV
  3. Scroll down to XXV; select 2522
  4. Select Subpart E, Evaluation Requirements
- National Knowledge Network:
- CV Program Officer

THANK YOU!