Conceptualizing Intervention Fidelity: Implications for Measurement, Design, and Analysis. Implementation: What to Consider at Different Stages in the Research Process.

Presentation transcript:

Conceptualizing Intervention Fidelity: Implications for Measurement, Design, and Analysis
Implementation: What to Consider at Different Stages in the Research Process
Panel presentation for the Institute of Education Sciences Annual Grantee Meeting, September 7, 2011
Chris S. Hulleman, Ph.D.

Implementation vs. Implementation Fidelity

Implementation assessment continuum:
- Descriptive: What happened as the intervention was implemented?
- A priori model: How much, and with what quality, were the core intervention components implemented?

Fidelity: How faithful was the implemented intervention (t_Tx) to the intended intervention (T_Tx)? Infidelity is the gap: T_Tx − t_Tx. Most assessments include both.

Linking Fidelity to Causal Models

Rubin's Causal Model:
- The true causal effect of X for unit i is (Y_i^Tx − Y_i^C)
- An RCT is the best approximation
- Tx − C = the average causal effect

Fidelity assessment:
- Examines the difference between the implemented causal components in Tx and C
- This difference is the achieved relative strength (ARS) of the intervention, an index of fidelity
- Theoretical relative strength = T_Tx − T_C
- Achieved relative strength = t_Tx − t_C
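To make ARS concrete, here is a minimal Python sketch, assuming fidelity on the core components has been scored on a common 0-100 scale in both conditions. The pooled-SD standardization, the intended-strength benchmark, and all scores are illustrative assumptions rather than values from the presentation.

```python
import numpy as np

def achieved_relative_strength(fidelity_tx, fidelity_c):
    """Achieved relative strength (ARS): the standardized difference between
    implemented strength in the treatment group (t_Tx) and in the control
    group (t_C). Pooled-SD standardization is one common choice."""
    fidelity_tx = np.asarray(fidelity_tx, dtype=float)
    fidelity_c = np.asarray(fidelity_c, dtype=float)
    mean_diff = fidelity_tx.mean() - fidelity_c.mean()  # t_Tx - t_C
    pooled_sd = np.sqrt(
        ((fidelity_tx.size - 1) * fidelity_tx.var(ddof=1)
         + (fidelity_c.size - 1) * fidelity_c.var(ddof=1))
        / (fidelity_tx.size + fidelity_c.size - 2)
    )
    return mean_diff / pooled_sd

# Hypothetical fidelity scores (0-100) observed at five sites per condition.
tx_scores = [85, 80, 90, 88, 82]  # implemented strength in Tx sites
c_scores = [70, 65, 72, 68, 75]   # the same components measured in C sites
ars = achieved_relative_strength(tx_scores, c_scores)

# Infidelity in the Tx arm: intended strength (T_Tx) minus implemented (t_Tx).
T_TX_INTENDED = 100.0  # benchmark defined by the intervention model
infidelity = T_TX_INTENDED - np.mean(tx_scores)
print(f"ARS = {ars:.2f}, Tx infidelity = {infidelity:.1f}")
```

Measuring the same components in the control group is what distinguishes ARS from a treatment-only fidelity score: contamination in C shrinks the achieved contrast even when Tx implementation is strong.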

Implementation assessment typically captures:
(1) Essential or core components (activities, processes, structures)
(2) Necessary, but not unique, activities, processes, and structures (supporting the essential components of Tx)
(3) Best practices
(4) Ordinary features of the setting (shared with the control group)
Intervention fidelity assessment focuses on the first two categories.

Why Is This Important?

Construct validity:
- Which is the cause: (T_Tx − T_C) or (t_Tx − t_C)?
- The contrast degrades with poor implementation, contamination, or similarity between Tx and C

External validity:
- Generalization is about t_Tx − t_C
- Implications for future specification of Tx
- Program failure vs. implementation failure

Statistical conclusion validity:
- Variability in implementation increases error, reducing effect size and statistical power

Why Is This Important? Reading First Implementation Results

Components and sub-components were rated against performance levels in RF and non-RF schools and converted to ARS values:
- Reading instruction: daily (min.)
- Reading instruction: daily in the 5 components (min.)
- Reading instruction: daily with high-quality practice
Overall average ARS = 0.35

Adapted from Gamse et al. (2008) and Moss et al. (2008). For comparison, the effect-size impact of Reading First on reading outcomes = .05.
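The overall average in this table is an equal-weight aggregate of component-level indices. A minimal sketch of that aggregation step, using invented component ARS values purely to illustrate the arithmetic:

```python
import numpy as np

# Hypothetical component-level ARS values (RF vs. non-RF); the numbers are
# invented stand-ins, chosen only so the equal-weight average is visible.
component_ars = {
    "daily_minutes": 0.40,
    "daily_in_5_components": 0.35,
    "high_quality_practice": 0.30,
}

# The slide reports a simple overall average; equal weighting is a common
# default when theory does not rank the components' importance.
overall_ars = np.mean(list(component_ars.values()))
print(f"Overall average ARS = {overall_ars:.2f}")
```

Whether equal weights are defensible is exactly the "weighting of components" challenge raised later in the talk.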

5-Step Process (Cordray, 2007)
1. Specify the intervention model (conceptual)
2. Develop fidelity indices (measurement)
3. Determine reliability and validity (measurement)
4. Combine indices (measurement)
5. Link fidelity to outcomes (analytical)
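Steps 4 and 5 might look like the following: z-score several fidelity indicators, average them into a composite, and regress the outcome on that composite. The data are simulated, and the single-level regression is a deliberate simplification; a real analysis would also model treatment assignment and nesting (e.g., classrooms within schools).

```python
import numpy as np

rng = np.random.default_rng(0)

# Step 4 (combine indices): z-score three hypothetical fidelity indicators
# per classroom (e.g., dosage, adherence, quality) and average them.
n = 40
indicators = rng.normal(size=(n, 3))
z = (indicators - indicators.mean(axis=0)) / indicators.std(axis=0, ddof=1)
fidelity_composite = z.mean(axis=1)

# Step 5 (link fidelity to outcomes): regress a simulated outcome on the
# composite via ordinary least squares.
outcome = 0.3 * fidelity_composite + rng.normal(scale=1.0, size=n)
X = np.column_stack([np.ones(n), fidelity_composite])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
print(f"Estimated fidelity-outcome slope = {beta[1]:.2f}")
```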

Some Challenges

Intervention models:
- Unclear interventions
- Scripted vs. unscripted interventions
- Intervention components vs. best practices

Measurement:
- Novel constructs: standardize methods and reporting (i.e., ARS), but not measures, which are Tx-specific
- Measure in both Tx and C
- Aggregation (or not) within and across levels

Analyses:
- Weighting of components
- Psychometric properties? Functional form?
- Analytic frameworks: descriptive vs. causal (e.g., ITT) vs. explanatory (e.g., LATE); see Howard's talk next!

Future implementation:
- Zone of tolerable adaptation
- Systematically test the impact of fidelity to core components
- Tx strength (e.g., ARS): how big is big enough?
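The descriptive vs. causal vs. explanatory distinction can be sketched with a small simulation: intention-to-treat (ITT) compares groups as assigned, while a Wald-style local average treatment effect (LATE) rescales ITT by the between-arm difference in implemented treatment, i.e., by the achieved contrast on the core component. All data and parameter values below are simulated assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated RCT: random assignment z, imperfectly implemented core
# component d (fidelity), and outcome y that works only through d.
n = 500
z = rng.integers(0, 2, size=n)                    # 1 = assigned to Tx
d = np.where(z == 1,
             rng.random(n) < 0.8,                 # 80% implement when assigned
             rng.random(n) < 0.1).astype(float)   # 10% contamination in C
y = 0.4 * d + rng.normal(size=n)

# ITT: the causal effect of assignment, ignoring implementation.
itt = y[z == 1].mean() - y[z == 0].mean()

# LATE (Wald estimator): ITT divided by the difference in implemented
# treatment between arms.
first_stage = d[z == 1].mean() - d[z == 0].mean()
late = itt / first_stage
print(f"ITT = {itt:.2f}, LATE = {late:.2f}")
```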

Treatment Strength (ARS): How Big Is Big Enough?

Effect sizes by study, comparing fidelity (ARS) against the outcome:
- Motivation – Lab
- Motivation – Field
- Reading First*

*Averaged over 1st, 2nd, and 3rd grades (Gamse et al., 2008).

Thank You! And special thanks to my collaborators: Catherine Darrow, Ph.D., Amy Cassata-Widera, Ph.D., David S. Cordray, Michael Nelson, Evan Sommer, Anne Garrison, and Charles Munter.

Chris Hulleman is an assistant professor at James Madison University with joint appointments in Graduate Psychology and the Center for Assessment and Research Studies. Chris also co-directs the Motivation Research Institute at James Madison. He received his Ph.D. in social/personality psychology from the University of Wisconsin-Madison in 2007, and then spent two years as an Institute of Education Sciences Research Fellow in Vanderbilt University's Peabody College of Education. In 2009, he won the Pintrich Outstanding Dissertation Award from Division 15 (Educational Psychology) of the American Psychological Association. He teaches courses in graduate statistics and research methods, and serves as the assessment liaison for the Division of Student Affairs. His research focuses on motivation in academic, sport, work, and family settings. His methodological interests include developing guidelines for translating laboratory research into the field and developing indices of intervention fidelity. As a Research Affiliate for the National Center on Performance Incentives, Chris is involved in several randomized field experiments of teacher pay-for-performance programs in K-12 settings. His scholarship has been published in journals such as Science, Psychological Bulletin, Journal of Research on Educational Effectiveness, Journal of Educational Psychology, and Phi Delta Kappan.

Department of Graduate Psychology, James Madison University

[Figure: Treatment strength continuum plotted against the outcome. Intended strengths (T_Tx, T_C) define the expected relative strength, T_Tx − T_C; implemented strengths (t_tx, t_C) define the achieved relative strength, shown as 0.15 (in raw units, (85) − (70) = 15). The gaps between intended and implemented strength are labeled "infidelity."]