Serve DC Training 9/28/11: Theories of Change and Logic Models, Evidence, Performance Measurement 101, Reviewing Performance Measures, and eGrants Tips

Presentation transcript:

Serve DC Training 9/28/11

Theories of Change and Logic Models; Evidence; Performance Measurement 101; Reviewing Performance Measures; eGrants Tips

Learning Objectives: Know the definition of a theory of change. Understand the relationship between a theory of change and program design. Understand how logic models articulate a theory of change.

A theory of change: looks at cause-and-effect relationships; identifies specific interventions to achieve the desired result; uses evidence to articulate assumptions.

PROBLEM: the identified community need. INTERVENTION: the activities carried out by AmeriCorps members and the community volunteers they support. OUTCOME: the change that occurs because of the intervention. EVIDENCE: why you believe a certain set of actions (the intervention) will lead to the intended outcome.

If the intervention (X) is delivered at a certain dosage, then the expected outcome (Y) will happen. X → Y

I have strep throat (PROBLEM). If I take antibiotics (INTERVENTION), then I will get better (OUTCOME). Antibiotics → I get better (X → Y).

If I take penicillin, I will get better. If I take a different antibiotic, will I get better? Some interventions (antibiotics) work better than others. Some don't work at all.

How do I know which antibiotic is best? I look at the evidence. There is research that shows which antibiotic is likely to get the best result. I consider constraints that may preclude the ideal intervention. (Penicillin may be too expensive.) If I can't have the most promising intervention, I need to understand the tradeoffs.

Two types of evidence are required: data that documents the community need, and data that documents why you think your intervention (using AmeriCorps members and community volunteers) will achieve the intended outcome.

Data that demonstrates the proposed intervention is likely to solve the identified problem. For example: evidence says that X hours of tutoring leads to academic outcomes, so the intervention features X hours of tutoring by AmeriCorps members.

The evidence basis for an intervention may include: past performance measurement data; results from a program evaluation; research studies that document the outcomes of similar programs; evaluations that document the outcomes of similar programs.

The evidence continuum: Preliminary → Moderate → Strong.

Variance in executing the ideal program intervention; little evidence to support your intervention.

PROBLEM: Children at risk of failing the third grade reading exam. INTERVENTION: Individualized tutoring on five building-block literacy skills. OUTCOMES: Students master the skills and pass the state reading exam.

What is your theory of change?

Logic models are a visual way of expressing the cause-and-effect reasoning behind a theory of change. They move from left to right in a linear fashion: X → Y

Problem: I have strep throat. Intervention: I take penicillin. Outcome: I get better.

PROBLEM: Children at risk of failing the third grade reading exam. Evidence: statistics on the number of students at risk of failing in the program's service area; research on why reading proficiency by 3rd grade is important.
INTERVENTION: Individualized tutoring on five building-block literacy skills. Evidence: research on how children learn to read, supporting the theory that mastering building-block skills will lead to proficiency; research on the design, frequency, and duration of tutoring sessions. Constraints?
OUTCOMES: Students master the five building-block skills. Students pass the state reading exam.
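The logic model above can also be written down as a simple record, which makes the left-to-right X → Y chain explicit. The sketch below is illustrative only, using Python as the notation; the field names mirror the slide, and the values restate the tutoring scenario.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A minimal left-to-right logic model: problem -> intervention -> outcomes."""
    problem: str
    intervention: str
    outcomes: list[str]
    # Evidence backs both the documented need and the choice of intervention.
    need_evidence: list[str] = field(default_factory=list)
    intervention_evidence: list[str] = field(default_factory=list)

# The tutoring example from the slide above, encoded as data.
tutoring = LogicModel(
    problem="Children at risk of failing the third grade reading exam",
    intervention="Individualized tutoring on five building-block literacy skills",
    outcomes=["Students master the five building-block skills",
              "Students pass the state reading exam"],
    need_evidence=["Statistics on students at risk in the program's service area"],
    intervention_evidence=["Research on how children learn to read",
                           "Research on design, frequency, duration of tutoring"],
)

print(tutoring.problem, "->", tutoring.intervention, "->", tutoring.outcomes)
```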

Logic Model Practice

Learning Objectives: Review the two types of evidence that support a theory of change. Practice evaluating evidence.

Data that documents the community need; data that documents why the intervention is likely to lead to the outcome(s).

Relevant; compelling; up-to-date; reliable source; evidence continuum (Preliminary, Moderate, Strong).

Where do we find evidence to document the community need?

Past performance measurement data; results from an impact evaluation of your program; research studies that document the outcomes of similar programs; evaluations that document the outcomes of similar programs.

University or research organizations; names of known professionals/thought leaders; similar-sounding programs/descriptions; meta-articles that review multiple studies.

Recommended design and dosage (frequency, intensity, duration): Do we need to alter the design of our intervention? Do we need to choose a new intervention?

Learning Objectives: Know the definition of performance measurement. Understand the differences between performance measurement and evaluation. Know the performance measurement requirements for AmeriCorps grants.

Performance measurement is the process of regularly measuring the amount of work done by your program and the outcomes of that work for your program's beneficiaries.

Performance Measurement: captures near-term changes. Evaluation: captures lasting changes; attempts to demonstrate cause and effect between intervention and outcome.

Performance Measurement vs. Evaluation. Performance measurement is systematic data collection and information about: what took place; what outputs were generated; what near-term outcomes were generated.

Performance Measurement: tracks outputs and outcomes on a regular, ongoing basis; does not show causality. Evaluation: seeks to show causality; has a longer-term focus; uses the most rigorous methodology that is right for the program (often a quasi-experimental design).

The most important difference: evaluation seeks to prove the theory of change (X → Y). Performance measurement does not.

Performance measurement can show that the outcome (change) occurred, but not causality (that the change occurred because of the intervention). Performance measurement does not seek to prove a theory of change, but it can provide evidence that informs your theory. Performance measurement data can inform evaluation efforts.

Performance Measurement: individual benchmark assessments on the Dynamic Indicators of Basic Early Literacy Skills (DIBELS) three times per year; State Reading Exam: number of students who graduate from the Minnesota Reading Corps who pass the state reading exam.

Evaluation: a matched-sample research project in the Minneapolis School District found that Reading Corps pre-school participants scored significantly higher in phonemic awareness, alphabetic principle, and total literacy than children in a matched comparison group entering kindergarten.

If performance measurement doesn't prove that my intervention worked, then why do it?

If the evidence for an intervention is strong, performance measurement helps show the program is on track. If the evidence basis is weak or not well-defined, performance measurement can provide evidence that a change occurred.

Improve performance; inform decision-making; demonstrate accountability (internally and externally); justify continued funding; enhance customer service; improve the quality of services; set targets for future performance.

Challenges: measuring prevention or long-term outcomes; time; cost; establishing reasonable targets; brief service interventions; attributing impact to the intervention.

Outputs are counts of the amount of service that members or volunteers have completed. They do not provide information on benefits to, or other changes in the lives of, members and/or beneficiaries.

Number of students who complete participation in an AmeriCorps education program; number of veterans engaged in service opportunities; number of individuals receiving support, services, education and/or referrals to alleviate long-term hunger.

Outcomes specify changes that have occurred in the lives of members and/or beneficiaries. They should be: realistic; measurable during the grant period; relevant to the theory of change.

Outcomes measure changes in attitude, behavior, or condition. Most programs should aim to measure a quantifiable change in behavior or condition.

Applicants are required to create at least one aligned performance measure to capture the output and outcome of their primary service activity. Note: Applicants may create additional performance measures provided that they capture significant program outcomes.

An aligned performance measure has two components: an output and an outcome. Alignment refers to whether the outcome is logical and reasonable given your intervention and output(s), and whether the output and outcome measure the same beneficiaries.
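To make the two alignment checks concrete, here is a hypothetical sketch; this is not CNCS tooling, and the Measure type and its field names are invented for illustration. It encodes the rule that the output and outcome must count the same beneficiaries and that the outcome must follow logically from the intervention.

```python
from dataclasses import dataclass

@dataclass
class Measure:
    output_population: str           # who the output counts
    outcome_population: str          # who the outcome measures
    outcome_follows_logically: bool  # a reviewer's judgment, recorded as a flag

def is_aligned(m: Measure) -> bool:
    """Both alignment checks from the slide: same beneficiaries, logical outcome."""
    return m.outcome_follows_logically and m.output_population == m.outcome_population

# Echoes the H4 example later in this deck: the output counts clients in
# health education programs, but the outcome measures community members.
h4 = Measure("clients in health education programs", "community members", True)
print(is_aligned(h4))  # False: the populations differ
```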

Learning Objectives: Learn how CNCS assesses performance measures. Practice using the assessment checklist.

Applicants must describe the following theory of change elements: the problem(s) identified (Need); the actions that will be carried out by AmeriCorps members and community volunteers (Evidence-Based Intervention); the ways in which AmeriCorps members are particularly well-suited to deliver the solution (Value Added by AmeriCorps); the anticipated outcomes (Outcomes).

Measures align with the need, activities, and outcomes (theory of change) described in the narrative. Outputs and outcomes are correctly aligned. Measures utilize rigorous methodologies to demonstrate significant outcomes.

Choose an intervention that will lead to the specific desired outcomes. Choose outcomes that can measure the effect of the intervention. Example: improving academic performance.

Intervention: after-school enrichment program. Outcome: improved academic performance in reading.

Intervention: tutoring program focused on helping kindergarten students master the most critical emergent literacy skills. Outcome: improved academic performance in reading.

Intervention: homework help program focusing on multiple subjects. Outcome: improved academic performance in reading.

You need a clear link between the intervention (design, frequency, duration) and the specific change (outcome) that is likely to occur as a result of the intervention.

Intervention: AmeriCorps members lead classes to educate smokers about the health risks associated with smoking. Outcomes: Individuals increase their knowledge of the health risks of smoking. Individuals stop smoking. Alignment Issue: Simply telling people that smoking is bad for them may not help them to quit.

Intervention: Members provide financial literacy trainings to economically disadvantaged adults. Outcome: Economically disadvantaged adults will open savings accounts after receiving financial literacy training. Alignment Issue: If beneficiaries do not have enough money to meet their basic needs, a savings account may not be realistic.

National measures must be aligned as directed in CNCS guidance: an aligned measure includes an output and an outcome for the primary service activity; outcomes are likely to result from outputs; outputs and outcomes measure the same population.

Output H4: clients participating in health education programs. Outcome: community members will decrease costly emergency room visits.

Output H5: youth engaged in activities to reduce childhood obesity. Outcome: children experience at least an 8% increase in aerobic fitness.

Output EN4: acres of parks improved. Outcome: acres of park certified as satisfactorily restored by land manager partners.

Output EN4: acres of parks improved. Outcome: public parks will be cleaner as the result of removing 140,000 pounds of trash and debris.

Data collection methods are rigorous. Outcomes capture a significant change. It is helpful to consider: So what? Is this change worth measuring?

Do outcomes capture the change you want to accomplish? Will proposed methods/instruments capture these outcomes? Are methods rigorous but realistic? Is there a clear plan/timeline for developing instruments and collecting data?

Proposed methods may not be realistic because they are too ambitious, the data can't be obtained, or a representative sample can't be drawn.

A grantee plans to use a standardized pre/post test but has difficulty administering the test and aggregating the data within the grant period; the grantee would like to measure improvement in grades instead. A grantee is unable to create a sampling frame that defines the population from which they will sample.
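For a feel of what aggregating pre/post data involves, here is a minimal sketch with made-up scores. The 20-point gain threshold and the student IDs are hypothetical, not a CNCS standard; the point is that only students with both a pre and a post score can be counted, which is why incomplete test administration undermines reporting.

```python
# Hypothetical pre/post test scores, keyed by student ID.
pre = {"s1": 40, "s2": 55, "s3": 62, "s4": 48}
post = {"s1": 70, "s2": 60, "s3": 90}          # s4 missed the post-test

GAIN_THRESHOLD = 20  # made-up definition of "improved" for this sketch

# Only students with both scores can be aggregated, which is exactly why
# incomplete test administration breaks reporting within the grant period.
matched = sorted(pre.keys() & post.keys())
improved = [s for s in matched if post[s] - pre[s] >= GAIN_THRESHOLD]

print(f"{len(improved)} of {len(matched)} matched students improved")
```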

Common instrument problems: objective vs. subjective; not tested ahead of time; don't measure what they are supposed to measure (validity); biased.

What is a valid way to measure my driving proficiency? A survey that asks how I feel about driving? A survey that asks if I think I'm a good driver? A written test? A driving test?

Examples of bias: a survey scale that only measures improvement; a survey that is only returned by individuals who feel strongly.

The AmeriCorps State and National Performance Measurement Assessment Checklist covers: alignment with the theory of change; alignment of outputs and outcomes; quality (rigorous, worth measuring).

Practice using the Performance Measurement Checklist

Learning Objectives: Learn tips for entering performance measures in eGrants. Understand how the eGrants language is sometimes different from other CNCS language for performance measures.

Strategy = Intervention.
Result = Output, Intermediate Outcome, or End Outcome.
Indicator = a description of the measurable change that will occur ("Number of beneficiaries who...").
Target Statement = the indicator plus the expected number ("100 beneficiaries will…").

Target = the number in the target statement (100).
Instrument = the specific tool that will be used to collect data (AIMSweb Letter Sounds and Letter Names pre/post test).
Data Collection Methodology = how data will be collected (survey, pre/post test, etc.).

The strategy (intervention) will be the same for all components of the measure (output, intermediate outcome, end outcome), because all of these should result from the same intervention.

Within each output or outcome, the result statement, indicator, target statement, and target number will seem repetitive:
Result Statement: Students will demonstrate improved academic performance.
Indicator: Number of students with improved academic performance.
Target Statement: 100 students will demonstrate improved academic performance.
Target: 100

The data collection methodology is how you will collect the data; for example, administering a standardized test is a method of collecting data. The instrument is the actual tool that will be used; for example, the AIMSweb Letter Sounds and Letter Names pre/post test is one standardized test that might be an acceptable instrument.
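Putting the eGrants vocabulary together, the sketch below shows why the entries feel repetitive: the target statement is essentially the indicator restated with the target number. The field names come from the slides above; the EgrantsResult type and the derivation logic are assumptions made for illustration only.

```python
from dataclasses import dataclass

@dataclass
class EgrantsResult:
    result_statement: str  # "Students will demonstrate improved academic performance"
    indicator: str         # "Number of students with improved academic performance"
    target: int            # 100
    instrument: str        # the specific tool, e.g. a named pre/post test
    methodology: str       # how data is collected, e.g. "pre/post test"

    def target_statement(self) -> str:
        # The target statement is just the indicator with the target number
        # substituted in, which is why the fields read so repetitively.
        return self.indicator.replace("Number of", str(self.target), 1)

r = EgrantsResult(
    result_statement="Students will demonstrate improved academic performance",
    indicator="Number of students with improved academic performance",
    target=100,
    instrument="AIMSweb Letter Sounds and Letter Names pre/post test",
    methodology="pre/post test",
)
print(r.target_statement())  # "100 students with improved academic performance"
```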

Resources: the Resource Center; the 2012 AmeriCorps NOFO and Performance Measures Instructions; Performance Measurement by Harry Hatry; the Stanford Social Innovation Review.