Building a Strong Outcome Portfolio
Section 2: Evidence and Evidence-Based
Jeffrey A. Butts, Ph.D.
Research and Evaluation Center, John Jay College of Criminal Justice, City University of New York
September 2018
What are Evidence-Based Programs?
Not any of these:
✗ Good Programs
✗ Programs That Always Work
✗ Programs That Usually Work
✗ Proven Programs
✗ Most Effective Programs
✗ Best Programs
Replicable interventions based on sound theory, with reliable effects on relevant outcomes, as demonstrated by multiple evaluations using credible research designs that account for all reasonable threats to validity, both internal and external.
[Chart: the outcome rate falls from 80% before the program to 50% after, a 30-point difference.]
What if the difference was less than 30 points, say 55% before and 45% after? What if the difference was only 3% or 4%?
Even a difference of 3% or 4% could qualify a program as “evidence-based” IF it was a reliable difference detected by valid evaluation designs.
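The point can be sketched in code. A minimal example, with hypothetical numbers: a 4-point drop in recidivism (46% control vs. 42% program) is small, but with large enough samples a standard two-proportion z-test still detects it as statistically reliable.

```python
# Hypothetical numbers for illustration only; a 4-point difference
# can be statistically reliable when the evaluation samples are large.
from math import sqrt, erf

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal approximation to the two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 2,000 participants per group; 46% vs. 42% reoffend
z, p = two_proportion_z(920, 2000, 840, 2000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05: small but reliable
```

With only 200 people per group, the same 4-point difference would not reach significance, which is why the slide stresses "reliable difference detected by valid evaluation designs."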
Two Types of Threats to Validity
External: Something about the way the study was conducted makes it inappropriate to generalize the findings beyond the particular study, sample, or population. Can the findings of effectiveness be transferred to other settings, other circumstances, and other populations?

Internal: The study failed to establish credible evidence that the intervention (e.g., services, policy changes) affected the outcomes in a causal way and that the association was not likely due to other factors. Can we really say that A caused B?
Interpreting Effects
Two other important concepts:
Statistical Significance: How confident can we be that differences in outcome are really there and not just due to dumb luck?
Effect Size: How meaningful are the differences in outcome? Differences can be statistically significant, but trivial in terms of their application and benefit in the real world.
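The distinction can be illustrated with made-up numbers: given very large samples, even a 1-point difference in recidivism is "statistically significant," yet its effect size (here Cohen's h, a standard effect-size measure for proportions) remains trivial.

```python
# Illustrative only: significance and effect size answer different questions.
from math import sqrt, erf, asin

def z_and_p(p1, p2, n):
    """Two-proportion z-test with equal group sizes n; returns (z, p-value)."""
    pooled = (p1 + p2) / 2
    se = sqrt(pooled * (1 - pooled) * (2 / n))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

def cohens_h(p1, p2):
    """Effect size for a difference between two proportions."""
    return abs(2 * asin(sqrt(p1)) - 2 * asin(sqrt(p2)))

# 40.0% vs. 39.0% recidivism, 100,000 people per group
z, p = z_and_p(0.40, 0.39, 100_000)
h = cohens_h(0.40, 0.39)
print(f"p = {p:.6f} (significant), Cohen's h = {h:.3f}")
```

By convention h = 0.2 counts as a "small" effect; here h is about 0.02, an order of magnitude below that, despite the tiny p-value.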
Statistical Confidence Comes From our Knowledge of Distributions
Percent Change in Recidivism
[Chart: one study outcome or one person's behavior plotted on an axis of percent change in recidivism, from −20% to +20%.]
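The idea behind these distribution charts can be simulated. A sketch, with assumed numbers: any single study's observed "percent change in recidivism" is one draw from a distribution, and only across many replications do results cluster around the true effect.

```python
# Sketch of a sampling distribution: each simulated study observes a
# noisy estimate of an assumed true 5-point reduction in recidivism.
import random
random.seed(42)

TRUE_CHANGE = -0.05   # assumption: program truly cuts recidivism by 5 points
N_PER_GROUP = 400     # participants per group in each simulated study

def one_study():
    """Observed change = program-group rate minus control-group rate."""
    control = sum(random.random() < 0.50 for _ in range(N_PER_GROUP)) / N_PER_GROUP
    program = sum(random.random() < 0.50 + TRUE_CHANGE for _ in range(N_PER_GROUP)) / N_PER_GROUP
    return program - control

results = [one_study() for _ in range(1000)]
mean = sum(results) / len(results)
print(f"mean observed change across 1,000 studies: {mean:+.3f}")
print(f"range of single-study results: {min(results):+.3f} to {max(results):+.3f}")
```

Individual studies scatter widely, some even showing recidivism going up, which is why confidence comes from knowledge of the whole distribution rather than any one result.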
Not “evidence-based” …
[Chart: distribution of outcomes on the percent-change-in-recidivism axis, −20% to +20%.]
Maybe “evidence-based” …
[Chart: distribution of outcomes on the percent-change-in-recidivism axis, −20% to +20%.]
Cause and Effect
Evaluators assess not only outcomes, but whether changing outcomes are attributable to a program or policy:
Outcome Level: the status of an outcome at some point in time (e.g., drug use among teenagers)
Outcome Change: the difference between outcome levels at different points in time or between groups
Program Effect: the portion of a change in outcome that can be attributed uniquely to a program, as opposed to the influence of other factors
Rossi, P., M. Lipsey and H. Freeman (2004). Evaluation: A Systematic Approach (7th Edition). Sage Publications.
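The three concepts can be made concrete with a toy calculation (all numbers invented for illustration). Using a comparison city that did not get the program, a simple difference-in-differences separates the program effect from change that would have happened anyway.

```python
# Toy illustration (made-up numbers) of outcome level, outcome change,
# and program effect, using teen drug-use rates before and after.
program_before, program_after = 0.30, 0.22
comparison_before, comparison_after = 0.30, 0.27

outcome_level = program_after                    # status at one point in time
outcome_change = program_after - program_before  # change over time

# Difference-in-differences: subtract the change seen in the comparison
# city to isolate the portion attributable uniquely to the program.
program_effect = outcome_change - (comparison_after - comparison_before)

print(f"outcome level:  {outcome_level:.2f}")
print(f"outcome change: {outcome_change:+.2f}")
print(f"program effect: {program_effect:+.2f}")
```

Here drug use fell 8 points in the program city, but 3 of those points also occurred without the program, so only a 5-point effect is attributable to it.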
Model Development
[Diagram: stages of model development: theory, innovation, evaluation, randomized trials, replication, and fidelity assurance.]
Many Types of Evidence

Stage of Development | Question to Be Answered | Evaluation Function
1. Assessment of social problems and needs | To what extent are community needs and standards met? | Needs assessment; problem description
2. Determination of goals | What must be done to meet those needs and standards? | Needs assessment; service needs
3. Design of program alternatives | What services could be used to produce the desired changes? | Assessment of program logic or theory
4. Selection of alternative | Which of the possible program approaches is best? | Feasibility study; formative evaluation
5. Program implementation | How should the program be put into operation? | Implementation assessment
6. Program operation | Is the program operating as planned? | Process evaluation; program monitoring
7. Program outcomes | Is the program having the desired effects? | Outcome evaluation
8. Program efficiency | Are program effects attained at a reasonable cost? | Cost-benefit analysis; cost-effectiveness analysis

Rossi, P., M. Lipsey and H. Freeman (2004). Evaluation: A Systematic Approach (7th Edition), p. 40. Sage Publications; adapted from Pancer & Westhues (1989).
Focus of Evidence-Based Practices and Policy
[The same table as above, highlighting the stages addressed by evidence-based practices and policy.]
Section 3: Evidence Starts with Theory