Building a Strong Outcome Portfolio


Building a Strong Outcome Portfolio Section 2: Evidence and Evidence-Based Jeffrey A. Butts, Ph.D. Research and Evaluation Center John Jay College of Criminal Justice City University of New York September 2018

What are Evidence-Based Programs? Not any of the familiar labels, each crossed out on the slide: ✗ Good Programs ✗ Programs That Always Work ✗ Programs That Usually Work ✗ Proven Programs ✗ Most Effective Programs ✗ Best Programs

What are Evidence-Based Programs? Replicable interventions based on sound theory, with reliable effects on relevant outcomes, as demonstrated by multiple evaluations using credible research designs that account for all reasonable threats to validity, both internal and external. (The crossed-out labels reappear: Good Programs, Programs That Work, Programs That Always Work, Proven Programs, Most Effective Programs, Best Programs.)

What are Evidence-Based Programs? [Chart: an outcome measured on a 0–100% scale; Before = 80%, After = 50%, a 30-point difference]

What are Evidence-Based Programs? What if the difference was less than 30%? [Chart: Before = 55%, After = 45% on the same 0–100% scale] What if the difference was 3% or 4%?

What are Evidence-Based Programs? Even a difference of 3% or 4% could qualify a program as “evidence-based” IF it was a reliable difference detected by valid evaluation designs.

Two Types of Threats to Validity

External: Something about the way the study was conducted makes it inappropriate to generalize the findings beyond the particular study, sample, or population. Can the findings of effectiveness be transferred to other settings, other circumstances, and other populations?

Internal: The study failed to establish credible evidence that the intervention (e.g., services, policy changes) affected the outcomes in a causal way, and that the association was not likely due to other factors. Can we really say that A caused B?

Interpreting Effects: Two Other Important Concepts

Statistical Significance — How confident can we be that differences in outcome are really there and not just due to dumb luck?

Effect Size — How meaningful are the differences in outcome? Differences can be statistically significant, but trivial in terms of their application and benefit in the real world.
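To make the distinction concrete, here is a minimal sketch, using made-up recidivism figures (not data from the presentation), of how the same 4-point difference can be statistically unreliable in a small study yet highly significant in a large one. The `two_prop_z` helper is a standard two-proportion z-test.

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z-test: returns (z, two-sided p-value)."""
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# The same 4-point difference in recidivism (49% vs. 45%), hypothetical data:
# Small study, 200 per group: the difference is not statistically reliable.
z_small, p_small = two_prop_z(0.49, 200, 0.45, 200)

# Large study, 20,000 per group: the same difference is highly significant.
z_large, p_large = two_prop_z(0.49, 20000, 0.45, 20000)

print(f"n=200:   z={z_small:.2f}, p={p_small:.3f}")   # p well above 0.05
print(f"n=20000: z={z_large:.2f}, p={p_large:.2e}")   # p far below 0.05
```

Significance only tells us the difference is real; whether a 4-point reduction matters in practice is the separate effect-size question.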

Statistical Confidence Comes From Our Knowledge of Distributions

Percent Change in Recidivism [Chart: a single point on an axis running from -20% to +20% percent change in recidivism; one study outcome or one person's behavior]

Percent Change in Recidivism [Chart: additional outcomes plotted along the same -20% to +20% axis]

Percent Change in Recidivism [Chart: the accumulated outcomes form a distribution]

Not “evidence-based” … [Chart: a distribution of outcomes centered near 0% change in recidivism]

Maybe “evidence-based” … [Chart: a distribution of outcomes shifted clearly below 0%, a reliable reduction in recidivism]
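The idea behind these distribution slides can be sketched numerically. The two portfolios of study outcomes below are invented for illustration; a rough 95% confidence interval shows when a set of results is reliably below zero (a dependable reduction in recidivism) rather than scattered around no effect.

```python
import statistics

def summarize(outcomes):
    """Mean percent change and a rough 95% CI for a set of study outcomes."""
    mean = statistics.mean(outcomes)
    se = statistics.stdev(outcomes) / len(outcomes) ** 0.5
    return mean, (mean - 1.96 * se, mean + 1.96 * se)

# Hypothetical evaluation results (percent change in recidivism; negative =
# reduction), spread along the -20%..+20% axis from the slides.
no_effect   = [-12, -8, -5, -3, -1, 0, 1, 2, 4, 6, 7, 9]      # centered on zero
real_effect = [-18, -15, -13, -11, -9, -8, -7, -6, -4, -3, -2, 0]  # shifted below

for label, outcomes in [("Not evidence-based", no_effect),
                        ("Maybe evidence-based", real_effect)]:
    mean, (lo, hi) = summarize(outcomes)
    reliable = hi < 0  # CI entirely below zero -> a reliable reduction
    print(f"{label}: mean={mean:+.1f}%, 95% CI=({lo:+.1f}%, {hi:+.1f}%), "
          f"reliable={reliable}")
```

In the first portfolio the interval straddles zero, so the apparent effects could be chance; in the second the whole interval sits below zero, which is the pattern that starts to justify the label “evidence-based.”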

Cause and Effect

Evaluators assess not only outcomes, but whether changing outcomes are attributable to a program or policy:

Outcome Level is the status of an outcome at some point in time (e.g., drug use among teenagers)

Outcome Change is the difference between outcome levels at different points in time or between groups

Program Effect is the portion of a change in outcome that can be attributed uniquely to a program, as opposed to the influence of other factors

Rossi, P., M. Lipsey and H. Freeman (2004). Evaluation: A Systematic Approach (7th Edition), p. 208. Sage Publications.
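The three concepts above can be illustrated with a toy calculation. The numbers are hypothetical, and the estimator shown (a simple difference-in-differences) is just one common way to separate a program effect from change that would have happened anyway:

```python
# Outcome levels: share of teenagers reporting drug use, measured before and
# after a prevention program (hypothetical numbers for illustration).
program_before, program_after = 0.30, 0.22        # program group
comparison_before, comparison_after = 0.31, 0.27  # similar group, no program

# Outcome change: the difference between outcome levels over time.
program_change = program_after - program_before          # -0.08
comparison_change = comparison_after - comparison_before # -0.04

# Program effect: the portion of the change attributable to the program,
# estimated here as the program group's change minus the comparison group's.
program_effect = program_change - comparison_change      # -0.04

print(f"Outcome change (program group): {program_change:+.2f}")
print(f"Outcome change (comparison):    {comparison_change:+.2f}")
print(f"Estimated program effect:       {program_effect:+.2f}")
```

Drug use fell 8 points in the program group, but 4 of those points also appeared in the comparison group, so only a 4-point reduction is credibly attributable to the program.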

Model Development [Diagram: stages of model development (theory, innovation, evaluation, randomized trials, replication, fidelity assurance)]

Many Types of Evidence

| Stage of Development | Question to be Answered | Evaluation Function |
| --- | --- | --- |
| 1. Assessment of social problems and needs | To what extent are community needs and standards met? | Needs assessment; problem description |
| 2. Determination of goals | What must be done to meet those needs and standards? | Needs assessment; service needs |
| 3. Design of program alternatives | What services could be used to produce the desired changes? | Assessment of program logic or theory |
| 4. Selection of alternative | Which of the possible program approaches is best? | Feasibility study; formative evaluation |
| 5. Program implementation | How should the program be put into operation? | Implementation assessment |
| 6. Program operation | Is the program operating as planned? | Process evaluation; program monitoring |
| 7. Program outcomes | Is the program having the desired effects? | Outcome evaluation |
| 8. Program efficiency | Are program effects attained at a reasonable cost? | Cost-benefit analysis; cost-effectiveness analysis |

Rossi, P., M. Lipsey and H. Freeman (2004). Evaluation: A Systematic Approach (7th Edition), p. 40. Sage Publications; adapted from Pancer & Westhues (1989).

Focus of Evidence-Based Practices and Policy [The slide repeats the table above, with a bracket marking where evidence-based practices and policy concentrate: the later stages, chiefly program outcomes.]

Next: Section 3, Evidence Starts with Theory