1
Serve DC Training 9/28/11
2
Theories of Change and Logic Models
Evidence
Performance Measurement 101
Reviewing Performance Measures
eGrants Tips
3
Learning Objectives:
Know the definition of a theory of change
Understand the relationship between a theory of change and program design
Understand how logic models articulate a theory of change
4
A theory of change:
Looks at cause-and-effect relationships
Identifies specific interventions to achieve the desired result
Uses evidence to articulate assumptions
5
PROBLEM: The identified community need
INTERVENTION: The activities of AmeriCorps members and the community volunteers they support
OUTCOME: The change that occurs because of the intervention
EVIDENCE: Why you believe a certain set of actions (the intervention) will lead to the intended outcome
6
If the intervention (X) is delivered, at a certain dosage, then the expected outcome (Y) will happen.
X → Y
7
I have strep throat (PROBLEM). If I take antibiotics (INTERVENTION), then I will get better (OUTCOME).
Antibiotics → I get better (X → Y)
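This X → Y structure can also be written down as data. Below is a minimal Python sketch (the class and its fields are our own illustration, not a CNCS artifact) capturing the pieces named on these slides: problem, intervention, dosage, outcome, and evidence.

```python
from dataclasses import dataclass, field

@dataclass
class TheoryOfChange:
    """One X -> Y chain: if the intervention is delivered at the
    stated dosage, the expected outcome should follow."""
    problem: str       # the identified community need
    intervention: str  # X: the activities delivered
    dosage: str        # frequency / intensity / duration
    outcome: str       # Y: the change expected to occur
    evidence: list = field(default_factory=list)  # why X should lead to Y

# The strep-throat example from the slides, expressed as data.
strep = TheoryOfChange(
    problem="I have strep throat",
    intervention="take antibiotics (penicillin)",
    dosage="full prescribed course",
    outcome="I get better",
    evidence=["research on which antibiotic is likely to get the best result"],
)
print(f"If I {strep.intervention}, then {strep.outcome}.")
```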
8
If I take penicillin, I will get better. If I take a different antibiotic, will I get better?
Some interventions (antibiotics) work better than others. Some don't work at all.
9
How do I know which antibiotic is best? I look at the evidence.
There is research that shows which antibiotic is likely to get the best result.
I consider constraints that may preclude the ideal intervention. (Penicillin may be too expensive.)
If I can't have the most promising intervention, I need to understand the tradeoffs.
10
Two types of evidence are required:
Data that documents the community need
Data that documents why you think your intervention (using AmeriCorps members and community volunteers) will achieve the intended outcome
11
Data that demonstrates that the proposed intervention is likely to solve the identified problem.
For example: evidence says that X hours of tutoring leads to academic outcomes, so the intervention features X hours of tutoring by AmeriCorps members.
12
The evidence basis for an intervention may include:
Past performance measurement data
Results from a program evaluation
Research studies that document the outcomes of similar programs
Evaluations that document the outcomes of similar programs
13
The evidence continuum: Preliminary → Moderate → Strong
14
Variance in executing the ideal program intervention
Little evidence to support your intervention
15
PROBLEM: Children at risk of failing third grade reading exam
INTERVENTION: Individualized tutoring on five building block literacy skills
OUTCOMES: Students master skills, pass state reading exam
16
What is your theory of change?
17
Logic models are a visual way of expressing the cause-and-effect reasoning behind a theory of change. They move from left to right in a linear fashion.
X → Y
18
Problem: I have strep throat.
Intervention: I take penicillin.
Outcome: I get better.
19
PROBLEM: Children at risk of failing third grade reading exam
Evidence: Statistics on the number of students at risk of failing in the program's service area; research on why reading proficiency by 3rd grade is important.
INTERVENTION: Individualized tutoring on five building block literacy skills
Evidence: Research on how children learn to read, supporting the theory that mastering building block skills will lead to proficiency; research on the design, frequency, and duration of tutoring sessions. Constraints?
OUTCOME: Students master five building block skills. Students pass the state reading exam.
20
Logic Model Practice
21
Learning Objectives:
Review the two types of evidence that support a theory of change
Practice evaluating evidence
22
Data that documents the community need
Data that documents why the intervention is likely to lead to the outcome(s)
23
Relevant
Compelling
Up-to-date
Reliable source
Evidence continuum (Preliminary, Moderate, Strong)
24
Where do we find evidence to document the community need?
25
Past performance measurement data
Results from an impact evaluation of your program
Research studies that document the outcomes of similar programs
Evaluations that document the outcomes of similar programs
26
University or research organizations
Names of known professionals/thought leaders
Similar-sounding programs/descriptions
Meta-articles that review multiple studies
27
Recommended design and dosage (frequency, intensity, duration)
Do we need to alter the design of our intervention?
Do we need to choose a new intervention?
28
Learning Objectives:
Know the definition of performance measurement
Understand the differences between performance measurement and evaluation
Know the performance measurement requirements for AmeriCorps grants
29
Performance measurement is the process of regularly measuring the amount of work done by your program and the outcomes of this work for your program's beneficiaries.
30
Performance Measurement: Captures near-term changes
Evaluation: Captures lasting changes; attempts to demonstrate cause and effect between intervention and outcome
31
Performance Measurement: Systematic data collection and information about:
What took place
What outputs were generated
What near-term outcomes were generated
32
Performance Measurement: Tracks outputs and outcomes on a regular, ongoing basis; does not show causality
Evaluation: Seeks to show causality; longer-term focus; uses the most rigorous methodology that is right for the program (often quasi-experimental design)
33
The most important difference: Evaluation seeks to prove the theory of change (X → Y). Performance measurement does not.
34
Performance measurement can show the outcome (change) occurred, but not causality (that the change occurred because of the intervention).
Performance measurement does not seek to prove a theory of change, but it can provide evidence that informs your theory.
Performance measurement data can inform evaluation efforts.
35
Performance Measurement (Minnesota Reading Corps):
Individual benchmark assessments on the Dynamic Indicators of Basic Early Literacy Skills (DIBELS) three times/year
State reading exam: number of students who graduate from the Minnesota Reading Corps who pass the state reading exam
36
Evaluation (Minnesota Reading Corps):
Matched-sample research project in the Minneapolis School District: Reading Corps pre-school participants scored significantly higher in phonemic awareness, alphabetic principle, and total literacy than children in a matched comparison group entering kindergarten.
37
If performance measurement doesn't prove that my intervention worked, then why do it?
38
If the evidence for an intervention is strong, performance measurement helps show the program is on track.
If the evidence basis is weak or not well-defined, performance measurement can provide evidence that a change occurred.
39
Improve performance
Inform decision making
Demonstrate accountability (internally and externally)
Justify continued funding
Enhance customer service
Improve quality of services
Set targets for future performance
40
Measuring prevention or long-term outcomes
Time
Cost
Establishing reasonable targets
Brief service interventions
Attributing impact to the intervention
41
Outputs are counts of the amount of service that members or volunteers have completed. They do not provide information on benefits to, or other changes in, the lives of members and/or beneficiaries.
42
Number of students who complete participation in an AmeriCorps education program
Number of veterans engaged in service opportunities
Number of individuals receiving support, services, education and/or referrals to alleviate long-term hunger
43
Outcomes specify changes that have occurred in the lives of members and/or beneficiaries. They should be:
Realistic
Measurable during the grant period
Relevant to the theory of change
44
Outcomes measure changes in:
Attitude
Behavior
Condition
Most programs should aim to measure a quantifiable change in behavior or condition.
45
Applicants are required to create at least one aligned performance measure to capture the output and outcome of their primary service activity.
Note: Applicants may create additional performance measures provided that they capture significant program outcomes.
46
An aligned performance measure has two components:
Output
Outcome
Alignment refers to whether:
The outcome is logical and reasonable given your intervention and output(s)
The output and outcome measure the same beneficiary population
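One way to picture the alignment check is as a record pairing the output with the outcome. This is our own hypothetical sketch, not a CNCS tool: the population comparison is the mechanical half, while judging whether the outcome logically follows from the intervention still takes a reviewer.

```python
from dataclasses import dataclass

@dataclass
class AlignedMeasure:
    output: str
    outcome: str
    output_population: str   # who the output counts
    outcome_population: str  # who the outcome counts

    def same_population(self) -> bool:
        # One alignment test: the output and the outcome must measure
        # the same beneficiary population.
        return self.output_population == self.outcome_population

measure = AlignedMeasure(
    output="Number of students completing the tutoring program",
    outcome="Number of students who pass the state reading exam",
    output_population="tutored students",
    outcome_population="tutored students",
)
print(measure.same_population())  # True: both count tutored students
```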
47
Learning Objectives:
Learn how CNCS assesses performance measures
Practice using the assessment checklist
48
Applicants must describe the following theory of change elements:
The problem(s) identified (Need)
The actions that will be carried out by AmeriCorps members and community volunteers (Evidence-Based Intervention)
The ways in which AmeriCorps members are particularly well-suited to deliver the solution (Value Added by AmeriCorps)
The anticipated outcomes (Outcomes)
49
Measures align with the need, activities, and outcomes (theory of change) described in the narrative
Outputs and outcomes are correctly aligned
Measures utilize rigorous methodologies to demonstrate significant outcomes
50
Choose an intervention that will lead to the specific desired outcomes. Choose outcomes that can measure the intervention.
Example: Improving academic performance
51
Intervention: After-school enrichment program
Outcome: Improved academic performance in reading
52
Intervention: Tutoring program focused on helping kindergarten students master the most critical emergent literacy skills
Outcome: Improved academic performance in reading
53
Intervention: Homework help program focusing on multiple subjects
Outcome: Improved academic performance in reading
54
Need a clear link between:
The intervention (design, frequency, duration)
The specific change (outcome) that is likely to occur as a result of the intervention
55
Intervention: AmeriCorps members lead classes to educate smokers about the health risks associated with smoking.
Outcomes: Individuals increase their knowledge of the health risks of smoking. Individuals stop smoking.
Alignment Issue: Simply telling people that smoking is bad for them may not help them to quit.
56
Intervention: Members provide financial literacy trainings to economically disadvantaged adults.
Outcome: Economically disadvantaged adults will open savings accounts after receiving financial literacy training.
Alignment Issue: If beneficiaries do not have enough money to meet their basic needs, a savings account may not be realistic.
57
National measures must be aligned as directed in CNCS guidance
An aligned measure includes an output and an outcome for the primary service activity
Outcomes are likely to result from outputs
Outputs and outcomes measure the same population
58
Output H4: Clients participating in health education programs
Outcome: Community members will decrease costly emergency room visits
59
Output H5: Youth engaged in activities to reduce childhood obesity
Outcome: Children experience at least an 8% increase in aerobic fitness
60
Output EN4: Acres of parks improved
Outcome: Acres of park certified as satisfactorily restored by land manager partners
61
Output EN4: Acres of parks improved
Outcome: Public parks will be cleaner as the result of removing 140,000 pounds of trash and debris
62
Data collection methods are rigorous
Outcomes capture a significant change. It is helpful to consider: So what? Is this change worth measuring?
63
Do outcomes capture the change you want to accomplish?
Will proposed methods/instruments capture these outcomes?
Are methods rigorous but realistic?
Is there a clear plan/timeline for developing instruments and collecting data?
64
Proposed methods are not realistic because:
Too ambitious
Can't get data
Unable to obtain a representative sample
65
A grantee plans to use a standardized pre/post test but has difficulty administering the test and aggregating the data within the grant period. They would like to measure improvement in grades instead.
A grantee is unable to create a sampling frame that defines the population from which they will sample.
66
Objective vs. subjective
Not tested ahead of time
Don't measure what they are supposed to measure (validity)
Biased
67
What is a valid way to measure my driving proficiency?
A survey that asks how I feel about driving?
A survey that asks if I think I'm a good driver?
A written test?
A driving test?
68
A survey scale that only measures improvement
A survey that is only returned by individuals who feel strongly
69
The AmeriCorps State and National Performance Measurement Assessment Checklist:
Alignment with the theory of change
Alignment of outputs and outcomes
Quality (rigorous, worth measuring)
70
Practice using the Performance Measurement Checklist
71
Learning Objectives:
Learn tips for entering performance measures in eGrants
Understand how the eGrants language is sometimes different from other CNCS language for performance measures
72
Strategy = Intervention
Result = Output, Intermediate Outcome, or End Outcome
Indicator = A description of the measurable change that will occur (Number of beneficiaries who...)
Target Statement = The indicator plus the expected number (100 beneficiaries will…)
73
Target = The number in the target statement (100)
Instrument = The specific tool that will be used to collect data (AIMSweb Letter Sounds and Letter Names pre/post test)
Data Collection Methodology = How data will be collected (survey, pre/post test, etc.)
74
The strategy (intervention) will be the same for all components of the measure (output, intermediate outcome, end outcome) because all of these should result from the same intervention.
75
Within each output or outcome, the result statement, indicator, target statement, and target number will seem repetitive:
Result Statement: Students will demonstrate improved academic performance
Indicator: Number of students with improved academic performance
Target Statement: 100 students will demonstrate improved academic performance
Target: 100
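The repetition is by design: the target statement is just the indicator's change restated with the target number in front. A minimal Python sketch of that relationship (the class and field names are illustrative only, not actual eGrants fields):

```python
from dataclasses import dataclass

@dataclass
class EGrantsMeasure:
    result_statement: str  # "Students will demonstrate improved academic performance"
    indicator: str         # the measurable change, minus the leading "Number of"
    target: int            # the expected number

    @property
    def target_statement(self) -> str:
        # Target statement = the expected number plus the indicator's change.
        return f"{self.target} {self.indicator}"

measure = EGrantsMeasure(
    result_statement="Students will demonstrate improved academic performance",
    indicator="students with improved academic performance",
    target=100,
)
print(measure.target_statement)  # 100 students with improved academic performance
```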
76
The data collection methodology is how you will collect the data. For example, administering a standardized test is a method of collecting data. The instrument is the actual tool that will be used. For example, AIMSweb Letter Sounds and Letter Names Pre/Post test is one standardized test that might be an acceptable instrument.
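To make the distinction concrete: the instrument produces raw scores, and the methodology (a pre/post comparison here) is what you do with them. A minimal sketch with invented scores, not real AIMSweb data:

```python
# Invented pre/post test scores keyed by student ID (illustrative only).
pre  = {"s01": 12, "s02": 20, "s03": 15, "s04": 18}
post = {"s01": 19, "s02": 21, "s03": 14, "s04": 25}

# The pre/post methodology: count students whose post-test score
# exceeds their pre-test score.
improved = [sid for sid in pre if post[sid] > pre[sid]]
print(f"{len(improved)} of {len(pre)} students improved")  # 3 of 4 students improved
```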
77
Resource Center
2012 AmeriCorps NOFO and Performance Measures Instructions
Performance Measurement by Harry Hatry
Stanford Social Innovation Review