AmeriCorps Grantee Training: Evaluation and Research
September 11, 2014
Session Objectives
- Review the policy context for evaluation
- Review evaluation requirements
- Share examples of grantee evaluation practices
- Share resources
Federal/State Policy Context
- Federal: Evaluating Programs, Requirements and Procedures (45 CFR part 2522, subpart E). CNCS finalized these regulations July 8, 2005.
- State: Program Design Policy E.3, Evaluation: "Competitive programs need to follow evaluation requirements as described in the AmeriCorps Regulations… Formula programs are not required to conduct an evaluation." Policy adopted 2009.
Definition of Performance Measurement
Performance measurement is the process of systematically and regularly collecting and monitoring data related to the direction of observed changes in communities, participants (members), or end beneficiaries receiving your program's services. It is intended to provide an indication of your program's operations and performance.
Definition of Evaluation
Evaluation uses scientifically based research methods to assess the effectiveness of programs by comparing the observed program outcomes with what would have happened in the absence of the program.
Comparing Performance Measurement and Evaluation

What is it?
- Performance measurement: a system of tracking progress in accomplishing specific pre-set targets (activities, outputs, and/or outcomes).
- Evaluation: a formal scientific process for collecting, analyzing, and interpreting data about how well a program was implemented (process evaluation) or how effectively the program accomplished desired outcomes/impacts (outcome/impact evaluation).

Why is it typically used?
- Performance measurement: to gauge program delivery, quality, and participant satisfaction and engagement; to improve products, services, and efficiency; to inform and enhance decision making; and to support planning and program development.
- Evaluation: to assess program effectiveness and determine whether the program is responsible for the changes found.

How does it work?
- Performance measurement: monitors a few vital signs related to program objectives, outputs, and/or outcomes.
- Evaluation: comprehensively examines programs using systematic, objective, and unbiased procedures in accordance with social science research methods and research designs.

Who typically does it?
- Performance measurement: program staff.
- Evaluation: an experienced researcher (often external to the program).

When is it done?
- Performance measurement: on an ongoing basis.
- Evaluation: periodically.
Building Evidence of Effectiveness
Evaluation Study Designs and Causal Impact
Design and type of comparison:
- Experimental design: randomly assigned groups
- Quasi-experimental design studies: statistically matched groups
- Non-experimental design studies: groups (or a single group) that are not statistically matched
The ability to make statements about causal attribution is strongest for experimental designs and weakest for non-experimental designs.
Evaluation Study Designs & Causal Impact

Experimental design studies
- Random assignment to treatment and control groups
- Controls for differences between the two groups, so differences in outcomes can be attributed to whether or not individuals participated in the program (see the sketch after this slide)

Quasi-experimental design studies
- Use two groups, but no random assignment, often due to practical considerations
- The two groups are carefully matched at the beginning of the evaluation so you can be confident they are basically the same
- Subsequent observed differences between the groups can then be attributed to whether or not individuals participated in program services

Non-experimental design studies
- Do not meet the requirements for experimental or quasi-experimental designs
- Can also include process and implementation evaluations that check whether the program is being carried out as planned
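To make the experimental case concrete, below is a minimal, hypothetical simulation (not part of the CNCS materials): because random assignment balances the treatment and control groups on average, a simple difference in mean outcomes estimates the program's impact. All variable names and effect sizes are illustrative assumptions.

```python
import numpy as np

# Hypothetical simulation of an experimental (randomized) design.
rng = np.random.default_rng(seed=0)
n = 1_000

baseline = rng.normal(50, 10, size=n)       # e.g., a pre-program assessment score (illustrative)
treated = rng.integers(0, 2, size=n) == 1   # random assignment to treatment/control
true_effect = 5.0                           # assumed program impact, for illustration only
outcome = baseline + true_effect * treated + rng.normal(0, 5, size=n)

# Random assignment balances baseline characteristics across the two groups...
print("Baseline mean, treatment:", round(baseline[treated].mean(), 1))
print("Baseline mean, control:  ", round(baseline[~treated].mean(), 1))

# ...so the treatment-control difference in mean outcomes recovers the impact (about 5).
print("Estimated impact:", round(outcome[treated].mean() - outcome[~treated].mean(), 1))
```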
Evaluation Study Designs and CNCS Requirements
* Fulfills the CNCS evaluation requirement for large grantees if a reasonable comparison group is identified and appropriate matching/propensity scoring is used in the analysis (a matching sketch follows this slide).
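For the quasi-experimental case, the "matching/propensity scoring" mentioned in the footnote can be illustrated with a minimal sketch. This is an assumption-laden illustration, not CNCS guidance: it presumes a pandas DataFrame with hypothetical columns `participated`, `outcome`, and baseline covariates, and it omits the balance diagnostics and standard errors a real evaluation would require.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def matched_impact_estimate(df: pd.DataFrame, covariates: list) -> float:
    """Quasi-experimental sketch: 1-to-1 nearest-neighbor matching on
    estimated propensity scores, then a difference in mean outcomes.
    Column names ('participated', 'outcome') are hypothetical."""
    # Estimate each individual's propensity to participate from baseline covariates.
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df["participated"])
    scored = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    participants = scored[scored["participated"] == 1]
    pool = scored[scored["participated"] == 0]

    # Match each participant to the non-participant with the closest propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(pool[["pscore"]])
    _, idx = nn.kneighbors(participants[["pscore"]])
    matches = pool.iloc[idx.ravel()]

    # Impact estimate: participants' mean outcome minus their matched comparisons' mean outcome.
    return participants["outcome"].mean() - matches["outcome"].mean()
```

For example, `matched_impact_estimate(df, ["age", "baseline_score"])` would return the matched outcome difference for those hypothetical covariates; the quality of the comparison depends on how well the chosen covariates capture pre-existing differences between the groups.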
Evaluation Requirements
- Competitive programs: over $500,000 and under $500,000
- Formula programs
Evaluation Requirements: Competitive ($500K+)
- Independent evaluation
- Quasi-experimental or experimental design study
- Must cover at least one year of program operation
- Submit the evaluation with any application for competitive funds
Evaluation Requirements: Competitive (less than $500K)
- Internal evaluation
- Quasi-experimental or experimental design (optional)
- Must cover at least one year of program operation
- Submit the evaluation with any application for competitive funds
Competitive Evaluation Requirements: What Is Due When?
Evaluation Requirements: Formula Programs
- State commissions establish evaluation requirements for formula grantees
- CV does not require formula grantees to conduct evaluations (per the 2014 Program Design Policies)
Strengthening the Evidence Base: Formula and Competitive (less than $500K)
- Select the study design most appropriate for the developmental phase of the AmeriCorps program (process, implementation, outcome, or impact)
- Use of experimental or quasi-experimental designs is encouraged but not required
- Ensure the program design is based on or adapted from a similar program that has evidence from an evaluation
Evaluation & Grant Review
- 2014 applications were scored and placed into one of four tiered evidence levels; applications with a stronger evidence base received more points.
- Submitted evaluation reports are assessed on the quality of their evaluation designs and the studies' findings. These assessments may be used to inform CV's/CNCS's consideration of the selection criteria and to clarify or verify information in the proposals.
- Applicants that failed to submit evaluation reports as required had points removed.
Research and Evaluation Grantee Panel
- Stephanie Biegler, Chief Program Officer, Birth & Beyond/Child Abuse Prevention Council
- Atalaya Sergi, Deputy Director, Jumpstart CA/Jumpstart
- Julie McClure and Sara Sitch, Director and Assistant Director, CaSERVES Volunteer Infrastructure Program/Napa County Office of Education
- Matt Aguiar, Chief of Staff, Reading Partners CA/Reading Partners
Questions?
Resources Available
- Evaluation FAQs
- Electronic CFR: select "Agency List" from the left navigation bar, scroll down to CNCS and select 45 CFR Chapters XII, XXV, scroll down to XXV and select 2522, then select Subpart E, Evaluation Requirements
- National Knowledge Network
- CV Program Officer
THANK YOU!