The Cause…or the “What” of What Works? David S. Cordray Vanderbilt University IES Research Conference Washington, DC June 16, 2006.

[Figure: Treatment-strength diagram. Expected relative strength of the T vs. C contrast = .25; achieved relative strength = .15, i.e., (.85) - (.70). "Infidelity" degrades T to t, and drift raises C to c, narrowing the achieved contrast.]
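The diagram's arithmetic can be sketched as follows (a minimal illustration using the hypothetical strength values from the figure; the variable names are my own):

```python
# Treatment strength as delivered (t) and control strength as experienced (c),
# using the hypothetical values from the slide's diagram.
t_strength = 0.85
c_strength = 0.70

expected_rs = 0.25                      # planned T - C contrast
achieved_rs = t_strength - c_strength   # realized t - c contrast (.15)
contrast_lost = expected_rs - achieved_rs

print(round(achieved_rs, 2), round(contrast_lost, 2))
```

The point of the sketch: infidelity does not just weaken T, it shrinks the *contrast* that the experiment actually tests.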

Intervention Fidelity
Intervention, as conceptualized (T and C)
Intervention, as realized or implemented (t and c)
Does the intervention, as implemented, faithfully represent the intervention "theory" or model?

Dimensions of Intervention Fidelity
Adherence to the intervention model:
–Essential components (activities, processes)
–Necessary, but not unique, activities, processes, and structures (supporting the essential components of T)
–Ordinary features of the setting (shared with C);
Quality of delivery (adherence); and
Receipt of essential components.

Fidelity and IES Goals
Goal 2 (Development)
–Need an explicit model of the final intervention
–Products: evidence of potential effects, an intervention model, all relevant materials, and detailed indices for assessing fidelity
Goals 3 (Efficacy) and 4 (Scale-up)
–Need detailed indices of fidelity for training and program research (e.g., effects of core components)
–Need efficient indices for scaling implementation in the unit of delivery
–Need efficient indices of fidelity in all conditions (T and C)
"Goal 6" – Managing the tension between fidelity and inevitable adaptation

Substantial Challenges
Developing program models explicit enough that fidelity can be meaningfully assessed;
Measuring fidelity entails:
–Multiple indicators
–Mixed methods;
Constructing indices:
–Differential weighting of components
–Use of a broader range of scale construction procedures (cumulative and rubric-based);
Validation studies;
Because fidelity plays many roles, we'll need multiple versions of fidelity scales (e.g., "scale-down")
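One way differential weighting of components might look in practice (a sketch only: the component names, ratings, and weights below are hypothetical, not a prescribed index):

```python
# Hypothetical fidelity ratings (0-1) for an intervention's components,
# weighted so essential components count more than supporting features.
ratings = {"essential_instruction": 0.9,
           "essential_feedback":    0.6,
           "supporting_materials":  0.8,
           "setting_features":      1.0}
weights = {"essential_instruction": 0.40,
           "essential_feedback":    0.40,
           "supporting_materials":  0.15,
           "setting_features":      0.05}

# Weighted composite fidelity index (weights sum to 1.0).
fidelity_index = sum(ratings[k] * weights[k] for k in ratings)
print(round(fidelity_index, 2))  # 0.77
```

Note the design choice this makes visible: with equal weights the unit above would score 0.825, but weighting the essential components heavily pulls the index down toward the weakest core component.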

The Cause…and "What Works?"
In experimental research, the causal agent is a relative difference between the ingredients in the T and C conditions;
The achieved relative strength of the contrast is affected by infidelity; and
With adequate indices of fidelity, we can move to questions of what works, under what circumstances, and why.
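The relative-difference idea can be made concrete by expressing the achieved contrast as a standardized difference in measured fidelity between conditions (an illustration with made-up scores, not the authors' specific index):

```python
import statistics

# Hypothetical fidelity scores (0-1) for units assigned to each condition.
treatment = [0.80, 0.90, 0.70, 0.85, 0.75]
control   = [0.40, 0.50, 0.45, 0.55, 0.50]

# Achieved relative strength as a standardized mean difference (pooled SD).
mt, mc = statistics.mean(treatment), statistics.mean(control)
st, sc = statistics.stdev(treatment), statistics.stdev(control)
nt, nc = len(treatment), len(control)
pooled_sd = ((st**2 * (nt - 1) + sc**2 * (nc - 1)) / (nt + nc - 2)) ** 0.5
achieved_relative_strength = (mt - mc) / pooled_sd

print(round(achieved_relative_strength, 2))
```

Because control units often receive some of the same "ordinary" ingredients, the fidelity contrast, not treatment fidelity alone, is what licenses claims about what works.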