Evidence-based Education: Can We Get There From Here?

Presentation transcript:

Evidence-based Education: Can We Get There From Here? Ronnie Detrich Wing Institute Association for Behavior Analysis International Evidence-based Education Conference September 6, 2008

Why Do We Need Evidence-based Education? From a university in the U.S.

Acknowledgements: Randy Keyworth, Jack States, Tom Critchfield, Tim Slocum, Mark Shriver, Teri Lewis-Palmer, Karen Hager, Janet Twyman, Hill Walker, Susan Wilczynski

Why Evidence-based Education? Federal policy emphasizes scientifically based instruction. No Child Left Behind: over 100 references to scientifically based instruction. Individuals with Disabilities Education Improvement Act: pre-service and professional development should prepare educators to implement scientifically based instructional practices.

Why Evidence-based Education? Professional organizations began validating interventions as evidence-based. Mid-1990s: Society for the Study of School Psychology, American Psychological Association. More recently: What Works Clearinghouse (Institute of Education Sciences), Campbell Collaboration, Coalition for Evidence-based Policy, National Autism Center.

Why Evidence-based Education? Most professional organizations have ethical guidelines emphasizing that services are based on scientific knowledge. American Psychological Association: psychologists' work is based on the established scientific and professional knowledge of the discipline. National Association of School Psychologists: "… direct and indirect service methods that the profession considers to be responsible, research-based practice." Behavior Analyst Certification Board: the behavior analyst always has the responsibility to recommend scientifically supported, most effective treatment procedures.

What is Evidence-based Practice? At its core the EBP movement is a consumer protection movement. It is not about science per se. It is a policy to use science for the benefit of consumers. “The ultimate goal of the ‘evidence-based movement’ is to make better use of research findings in typical service settings, to benefit consumers and society….” (Fixsen, 2008)

What is Evidence-based Practice? Evidence-based practice has its roots in medicine. The movement has spread across major disciplines in human services: psychology, school psychology, social work, speech pathology, and occupational therapy.

What Is Evidence-based Practice? Sackett et al. (2000) frame EBP as the intersection of the best available evidence, professional judgment, and client values. EBP is a decision-making approach that places emphasis on evidence to: guide decisions about which interventions to use; evaluate the effects of an intervention.

What is Evidence-based Education? The term "evidence-based" has become ubiquitous in the last decade, yet there is no consensus about what it means. At issue is what counts as evidence. The federal definition emphasizes experimental methods, with a preference for randomized trials; this definition has been criticized as positivistic.

What Counts as Evidence? Ultimately, this depends on the question being asked. Even behavior analysis allows for qualitative evidence (social validity measures). In EBP the goal is to identify causal relations between interventions and outcomes. Experimental methods do this best.

What Counts as Evidence? Even if we accept causal demonstrations as evidence, we have no consensus. Randomized clinical trials (RCTs) have become the "gold standard." There is controversy about the status of single-subject designs, which are most frequently criticized on the basis of external validity.

How Are Evidence-based Interventions Identified? Identification is more than finding a study to support an intervention. Identification involves distilling a body of knowledge to determine the strength of evidence.

How Are Evidence-based Interventions Identified? Distillation requires standards of evidence for reviewing the literature. Standards specify the quantity of evidence and the quality of evidence.

Continua of Evidence (Janet Twyman, 2007)
Quantity of the evidence ranges from personal observation through a single study, various investigations, repeated systematic measures, and convergent evidence to single-case replication (direct and parametric) and meta-analysis (systematic review).
Quality of the evidence ranges from expert opinion, general consensus, and uncontrolled studies through well-conducted clinical studies, single-case designs, and semi-randomized trials to high-quality randomized controlled trials (the current "gold standard").
Threshold of evidence: the literature bearing on a certain clinical procedure is inventoried for the relevance of findings, the quality of findings, the number of findings, and the consistency of findings for establishing a clear and singular linkage between a certain clinical outcome and a certain clinical procedure applied to members of a certain clinical population.
The terms "levels of evidence" or "strength of evidence" refer to systems for classifying the evidence in a body of literature through a hierarchy of scientific rigor and quality. Several dozen of these hierarchies exist (Agency for Healthcare Research and Quality [AHRQ], 2002b); some systems comprise three levels and others eight or more. The gradations in some hierarchies are based on randomization and experimental controls; the organizing focus of others may center on the magnitude of effect sizes, confidence intervals, number of results, consistency of results, sample size, or Type I and Type II error rates (AHRQ, 2002a).

How Are Evidence-based Interventions Identified? There are two approaches to validating interventions. Threshold approach: evidence must be of a specific quantity and quality before an intervention is considered evidence-based. Hierarchy-of-evidence approach: the strength of evidence falls along a continuum, with each level having different standards.
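
To make the contrast concrete, here is a minimal sketch of how the two approaches might classify the same body of evidence. The level names, cutoffs, and study counts are hypothetical, invented purely for illustration; they are not the actual criteria of any review organization.

# Illustrative sketch only: hypothetical standards, not any organization's actual criteria.

def threshold_rating(n_high_quality_rcts, n_quasi_experiments):
    """Threshold approach: an intervention either meets the bar or it does not."""
    if n_high_quality_rcts >= 2 or (n_high_quality_rcts >= 1 and n_quasi_experiments >= 3):
        return "evidence-based"
    return "not evidence-based"

def hierarchy_rating(n_high_quality_rcts, n_quasi_experiments, n_single_case):
    """Hierarchy approach: evidence strength is graded along a continuum."""
    if n_high_quality_rcts >= 2:
        return "strong evidence"
    if n_high_quality_rcts == 1 or n_quasi_experiments >= 2:
        return "moderate evidence"
    if n_single_case >= 3:
        return "emerging evidence"
    return "insufficient evidence"

# The same body of evidence can fail one standard yet earn a rating under the other.
print(threshold_rating(1, 1))       # -> not evidence-based
print(hierarchy_rating(1, 1, 4))    # -> moderate evidence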

How Are Evidence-based Interventions Identified? There are no agreed-upon standards. It is possible for an intervention to be evidence-based using one set of standards and to fail to meet evidence standards using an alternative set. This makes it difficult for consumers and decision makers to sort out competing claims about what is evidence-based.

Evidence-based Intervention

Assessed effectiveness vs. actual effectiveness:
Assessed effective and actually effective: true positive.
Assessed effective but actually ineffective: false positive (most likely with the hierarchy approach).
Assessed ineffective but actually effective: false negative (most likely with the threshold approach).
Assessed ineffective and actually ineffective: true negative.

Choosing Between False Positives and False Negatives At this stage, it is better to have more false positives than false negatives. False negatives: effective interventions will not be selected for implementation, so we are less likely to ever determine that they are actually effective. False positives: progress monitoring will identify interventions that are not effective.
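
The asymmetry between the two error types can be illustrated with a small simulation sketch. All of the probabilities below are invented for the sake of the example; the point is structural: false positives get implemented and can be caught by progress monitoring, while false negatives are never implemented and so have no built-in correction mechanism.

import random

# Hypothetical illustration of the asymmetry between the two error types.
# All probabilities below are invented for the sake of the example.
random.seed(1)

N = 1000                    # candidate interventions under review
P_TRULY_EFFECTIVE = 0.3     # assumed share that actually work
P_FALSE_POSITIVE = 0.2      # chance an ineffective intervention is labeled "evidence-based"
P_FALSE_NEGATIVE = 0.2      # chance an effective intervention is labeled "not evidence-based"

correctable = 0             # false positives later detectable through progress monitoring
lost = 0                    # false negatives that are never implemented, so never re-evaluated

for _ in range(N):
    effective = random.random() < P_TRULY_EFFECTIVE
    if effective:
        if random.random() < P_FALSE_NEGATIVE:
            lost += 1         # rejected: never adopted, no practice data ever collected
    else:
        if random.random() < P_FALSE_POSITIVE:
            correctable += 1  # adopted: progress monitoring can flag the lack of effect

print(f"False positives (correctable through progress monitoring): {correctable}")
print(f"False negatives (no built-in correction mechanism):        {lost}")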

Why Do We Need Evidence-based Education? Kazdin (2000) identified 550 named interventions for children and adolescents. A very small number of these interventions have been empirically evaluated, and of those that have been, the large majority are behavioral or cognitive-behavioral. Evidence-based interventions are less likely to be used than interventions for which there is no evidence or for which there is evidence of a lack of impact.

Evidence-based Education Roadmap: Research → Replicability → Practice

Efficacy Research (What works?) The primary concern is demonstrations of causal relations. Rigorous experimental control is used so that threats to internal validity are minimized. Findings are not always easy to translate immediately into practice.

Behavior Analysis and Efficacy Behavior analysis's emphasis on rigorous experimental control has resulted in many important contributions to education: systematic, explicit teaching methods and the widespread use of reinforcement systems.

Evidence-based Education Roadmap: Research → Replicability → Practice

Effectiveness Research (When does it work?) Evaluates the robustness of an intervention when "taken to scale" and implemented in more typical practice settings. Answers questions related to external validity, or the generalizability of effects. Effect sizes are typically smaller than in efficacy research. Efficacy and effectiveness fall on a continuum.
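
As a rough illustration of what "smaller effect size" means, the sketch below computes a standardized mean difference (Cohen's d) for two hypothetical trials: a small, tightly controlled efficacy trial and a larger effectiveness trial in typical settings. All of the means, standard deviations, and sample sizes are invented for illustration.

from math import sqrt

def cohens_d(mean_tx, sd_tx, n_tx, mean_ctrl, sd_ctrl, n_ctrl):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = sqrt(((n_tx - 1) * sd_tx**2 + (n_ctrl - 1) * sd_ctrl**2)
                     / (n_tx + n_ctrl - 2))
    return (mean_tx - mean_ctrl) / pooled_sd

# Hypothetical efficacy trial: small, well controlled, high implementation fidelity.
print(round(cohens_d(78, 10, 30, 68, 10, 30), 2))    # d = 1.0
# Hypothetical effectiveness trial: larger, more varied settings, more variable fidelity.
print(round(cohens_d(73, 14, 300, 68, 14, 300), 2))  # d ≈ 0.36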

Behavior Analysis and Effectiveness Research Behavior analysis has not generally concerned itself with external validity questions. It emphasizes the generality of behavioral principles, which has not resulted in the type of research that answers the "actuarial" questions asked by effectiveness research: What percentage of a population of students will benefit from a specific program? Which students will benefit?

Research to Practice Issues The lag time from efficacy research to effectiveness research to dissemination is 10-20 years (Hoagwood, Burns, & Weisz, 2002). Only 4 of 10 Blueprints for Violence Prevention programs had the capacity to disseminate to 10+ sites in a year (Elliott & Mihalic, 2004).

Good Behavior Game: Efficacy The first efficacy study was conducted in a fourth-grade classroom (Barrish, Saunders, & Wolf, 1969). Subsequent replications spanned settings (Sudan, a library, a sheltered workshop), students (general education, special education, 2nd grade, 5th grade, 6th grade, adults with developmental disabilities), and behaviors (on-task, off-task, disruptive behavior, work productivity). All efficacy studies were single-subject designs.

Good Behavior Game: Effectiveness A series of effectiveness studies by Kellam et al. examined the game as a prevention program (special issue of Drug and Alcohol Dependence, 2008). Students exposed to the GBG in 1st and 2nd grade showed reduced risk as young adults of drug/alcohol abuse, smoking, antisocial personality disorder, subsequent use of school-based services, and suicidal ideation and attempts. All of the studies were RCTs.

Good Behavior Game: Validation The Coalition for Evidence-based Policy reviewed the literature for the Good Behavior Game and determined that it was evidence-based. The review included only studies that were RCTs; all single-subject research was ignored.

A Consumer Perspective: One Year Follow-up “…you should give them more good behavior game. Keep on doing what’s good.”

Evidence-based Education Roadmap: Research → Replicability → Sustainability → Practice

Implementation (How do we make it work?) "Identifying evidence-based interventions is one thing, implementing them is an entirely different thing." (Dean Fixsen, 2008) The primary challenge is how to place an intervention within a specific context. Until implementation questions are answered, the ultimate promise of evidence-based education will go unfulfilled.

Implementation is Fundamental 80% of initiatives ended within 2 years; 90% ended within 4 years. (Data from the Center for Comprehensive School Reform.)

Behavior Analysis and Implementation Service delivery in behavior analysis is a mediated model, which requires behavior analysts to address many implementation issues for each project. We have not systematically attended to many of these issues, especially at large scale: What organizational features are necessary to support evidence-based interventions? How do we modify an intervention so that it fits local contingencies without diminishing its effectiveness?

Evidence-based Education Roadmap: Research → Replicability → Sustainability → Practice

Progress Monitoring (Is it working?) Research guides us to interventions that are most likely to work, but generalizing from a research base to a specific instance requires a leap of faith; our confidence is less than 1.0. Progress monitoring assures that an intervention is actually effective in a given setting (practice-based evidence).
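
As a simplified example of what answering "is it working?" can look like, the sketch below compares a student's weekly progress-monitoring scores against the growth rate needed to reach a goal. The scores, goal, and decision rule are hypothetical; they stand in for whatever measures and criteria a team actually adopts.

# Simplified progress-monitoring sketch with hypothetical data.
# Weekly scores (e.g., words read correctly per minute) for one student.
weeks  = [1, 2, 3, 4, 5, 6]
scores = [42, 44, 43, 47, 48, 51]

baseline = scores[0]
goal = 70            # hypothetical end-of-period goal
total_weeks = 20     # length of the monitoring period

# Observed growth per week: simple least-squares slope.
n = len(weeks)
mean_w = sum(weeks) / n
mean_s = sum(scores) / n
observed_slope = (sum((w - mean_w) * (s - mean_s) for w, s in zip(weeks, scores))
                  / sum((w - mean_w) ** 2 for w in weeks))

# Growth per week needed to reach the goal (the slope of the "aim line").
needed_slope = (goal - baseline) / total_weeks

print(f"Observed growth: {observed_slope:.2f} per week; needed: {needed_slope:.2f} per week")
if observed_slope >= needed_slope:
    print("On track: continue the intervention and keep monitoring.")
else:
    print("Below aim: consider modifying or intensifying the intervention.")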

Behavior Analysis and Progress Monitoring Progress monitoring is the sine qua non of applied behavior analysis: it is not applied behavior analysis if data are not collected and reviewed. Behavior analysis has made enormous contributions to the direct measurement of behavior and represents the best example of practice-based evidence about evidence-based practices.

Evidence-based Education Roadmap: Research → Replicability → Sustainability → Practice

Similarities and Differences Between Behavior Analysis and Evidence-based Practice
Shared: data-based decision making; the assumption that science produces the best outcomes for consumers.
Evidence-based practice: the unit of analysis is populations; evidence is derived from systematic reviews; the practitioner must know how to implement effectively.
Behavior analysis: the unit of analysis is the individual; evidence is derived from experiments; the practitioner must know the laws of behavior and how to apply them.

A Prevention Model for Evidence-based Education (academic and behavioral systems)
Universal interventions (80-90% of students): all settings, all students; preventive, proactive.
Targeted group interventions (5-10% of students): some students (at risk); high efficiency; rapid response.
Intensive, individual interventions (1-5% of students): individual students; assessment-based; high intensity; intense, durable procedures.
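
To translate the tier percentages into head counts, here is a trivial arithmetic sketch for a hypothetical school of 500 students; the enrollment figure is invented, and the percentage bands come from the model above.

# Hypothetical school of 500 students; percentage bands from the three-tier model above.
enrollment = 500
tiers = {
    "Universal interventions (all students)": (0.80, 0.90),
    "Targeted group interventions (at-risk)": (0.05, 0.10),
    "Intensive, individual interventions":    (0.01, 0.05),
}
for tier, (low, high) in tiers.items():
    print(f"{tier}: roughly {int(low * enrollment)}-{int(high * enrollment)} students")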

Can We Get There From Here? Behavior analysis has a great deal to contribute to the discussion about the most effective educational interventions, but the current emphasis on RCTs puts behavior analysis in a difficult position. If we are to have maximum impact on the field of education, then we must change our behavior. "If you are not at the table, then you are on the menu." (Cathy Watkins, 2008)

Can We Get There From Here? We should begin to conduct RCTs: if we have robust interventions, they will fare well in RCTs. RCTs are well suited to answering the actuarial questions that concern decision makers: "How big a bang will I get for my buck?"

Can We Get There From Here? Sidman (The Behavior Analyst, 2006): "To make the general contributions of which our science is capable, behavior analysts will have to use methods of wider generality, in the sense they affect many people at the same time, or within a short time, without our being concerned about any particular members of the relevant population."

Can We Get There? We should not abandon rigorous single-subject research, but we should expand our repertoire to include other methods that answer different types of questions, and engage in a social influence process to ensure that single-subject designs are included in evidence standards. This is especially critical in the special education context.