
Ten Years of Evidence-based Education: A Status Report Ronnie Detrich Wing Institute

Goals for Today Review the mandates of NCLB and IDEIA that emphasize scientifically-based instruction. Place those mandates in the broader context of evidence-based practice. Assess the status of evidence-based education after 10 years.

2001 No Child Left Behind By 2014 every student will be at grade level. Instructional methods will be scientifically based. Educators will be held accountable for outcomes.

A Closer Look at Scientifically-based Instruction NCLB requires that interventions to improve educational performance be based on scientific research; the law contains over 100 references to scientific research. IDEIA (2004) requires interventions that are scientifically based instructional practices.

A Closer Look at Scientifically-based Instruction Specific requirements of IDEIA include:  Pre-service and professional development for all who work with students with disabilities to ensure such personnel have the skills and knowledge necessary to improve the academic achievement and functional performance of children with disabilities, including the use of scientifically based instructional practices, to the maximum extent possible.

A Closer Look at Scientifically-based Instruction Scientifically based early reading programs, positive behavioral interventions and supports, and early intervention services to reduce the need to label children as disabled in order to address the learning and behavioral needs of such children.

A Closer Look at Scientifically-based Instruction The Individualized Education Program (IEP) shall include a statement of the special education and related services and supplementary aids and services, based on peer-reviewed research to the extent practicable, to be provided to the child, or on behalf of the child, and a statement of the program modifications or supports that will be provided for the child.

A Closer Look at Scientifically-based Instruction In determining if a child has a specific learning disability, a local education agency may use a process that determines if a child responds to a scientific, research-based intervention as part of the evaluation procedures.

After 10 Years: How Are We Doing? Student performance has not changed in the last decade, as measured by NAEP.

[Figure: NAEP long-term trend reading scores for ages 9, 13, and 17, and age-17 proficiency, essentially flat over the period. SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, 2000, 2002, 2003, 2005, and 2007 Reading Assessments.]

Are We Getting Our Money’s Worth? We were doing better in 1970 than in 2009: we were getting the same effect for half the cost. SOURCE: U.S. Department of Education, National Center for Education Statistics. (2009). Digest of Education Statistics, 2008 (NCES ), Chapter 2 and Table 179.

After 10 Years: How Are We Doing? NCLB and IDEIA increased interest in evidence-based education and placed it in the broader context of evidence-based practice and data-based decision making. “Evidence-based” has developed two meanings: practices that meet evidence standards, and a process for practitioner decision-making.

What is Evidence-based Practice? (Validated Practices) Often perceived as a list of interventions practitioners must use. The National Reading Panel identified 5 elements of scientifically-based reading instruction, and under Reading First many states established lists of approved reading programs. Insurance companies will not fund autism services unless the intervention is on such a list.

What is Evidence-based Practice? (The Process) At its core the EBP movement is a consumer protection movement.  It is not about science per se.  It is a policy to use science for the benefit of consumers.  “The ultimate goal of the ‘evidence-based movement’ is to make better use of research findings in typical service settings, to benefit consumers and society….” (Fixsen, 2008)

What Is Evidence-based Practice? EBP is a decision-making approach that places emphasis on evidence to: guide decisions about which interventions to use; evaluate the effects of an intervention. [Diagram: EBP at the intersection of best available evidence, professional judgment, and client values (Sackett et al., 2000).]

Phases of Evidence-based Education [Diagram: an evidence-based intervention moves through three phases: Identify, Implement, Evaluate.]

How Are We Doing? The term “evidence-based” has become ubiquitous in the last decade. No consensus about what it means; at issue is what counts as evidence. The federal definition emphasizes experimental methods, with a preference for randomized trials, and has been criticized as being positivistic. Identify

What Counts as Evidence? Ultimately, this depends on the question being asked. In EBP the goal is to identify causal relations between interventions and outcomes.  Experimental methods do this best. Identify

What Counts as Evidence? Even if we accept causal demonstrations as evidence, there is no consensus. Randomized clinical trials (RCTs) have become the “gold standard.” There is controversy about the status of single-case designs: the WWC has recently established standards for single-case designs, but there is no well-established method for calculating effect sizes. Identify

What is an Evidence-based Intervention? Identification is more than finding a study to support an intervention. Identification involves distilling a body of knowledge to determine the strength of evidence.  Systematic review Identify

How Are Evidence-based Interventions Identified? Distillation requires standards of evidence for reviewing the literature. Standards specify the quantity of evidence and the quality of evidence. Identify

Relationship Between Quality and Quantity of Evidence (Identify)

                    Quantity: High                 Quantity: Low
Quality: High       Evidence-based                 May meet evidence standards
Quality: Low        May meet evidence standards    Not evidence-based
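
The 2x2 logic above can be stated as a small decision rule. A minimal sketch (the function name and boolean encoding are ours, illustrating the reconstructed matrix, not code from the talk):

```python
def classify_evidence(high_quality: bool, high_quantity: bool) -> str:
    """Map the quality-by-quantity matrix to an evidence category."""
    if high_quality and high_quantity:
        return "Evidence-based"
    if not high_quality and not high_quantity:
        return "Not evidence-based"
    # Mixed cells: one dimension is strong, the other weak.
    return "May meet evidence standards"

# Print the full matrix.
for quality in (True, False):
    for quantity in (True, False):
        print(f"high quality={quality}, high quantity={quantity}: "
              f"{classify_evidence(quality, quantity)}")
```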

How Are Evidence-based Interventions Identified? A number of organizations publish evidence-based reviews: What Works Clearinghouse, Best Evidence Encyclopedia, Promising Practices Network, Coalition for Evidence-based Policy. Their standards share commonalities but also differ; the correlations among their conclusions are modest (Briggs, 2008). Identify

Implications of EBP Reviews How are consumers to decide? The same intervention can be judged differently under different standards:

                Validated         Not Validated
Standard 1      Intervention X
Standard 2                        Intervention X

Identify

Assessed effectiveness versus actual effectiveness:

                         Actually Effective                                       Actually Ineffective
Assessed Effective       True Positive                                            False Positive (most likely with a threshold approach)
Assessed Ineffective     False Negative (most likely with a hierarchy approach)   True Negative

Identify

Choosing Between False Positives and False Negatives At this stage, it is better to have more false positives than false negatives. False negatives: effective interventions will not be selected for implementation, so we are less likely to ever learn that they are actually effective. False positives: progress monitoring will identify adopted interventions that turn out not to be effective. Identify

Identify The same question can be asked at successively broader levels of scope: Is Reading Mastery 1 an effective program for beginning readers? (little research available specific to Reading Mastery 1) Is Reading Mastery an effective reading program for beginning readers? (more research available) Are Direct Instruction reading programs effective for beginning readers? (even more research available) Are direct instruction reading programs (e.g., Reading Mastery, Soar to Success, Rewards) effective for beginning readers? Each step widens the scope (Reading Mastery 1; Reading Mastery 1-3; all DI reading curricula; all direct instruction programs), but the expansion has changed the initial question.

Implications of EBP Reviews Emphasis on “best” evidence, which in many instances is non-existent. In its absence, what is the basis for decision making? “Best available” evidence is the standard in evidence-based practice (the process). Identify

What Works Practice Guides: Best Available Evidence Across 14 practice guides and 78 total recommendations (Tim Slocum, 2011):

Level of Support    Percent
Strong              22%
Moderate            33%
Minimal             45%

Identify

Implementing Evidence-based Interventions Where Good Ideas go to Die Implement

Research to Practice Gap The research-to-practice gap is a concern in many disciplines, and education is no exception. The scientist/practitioner model aimed to close the gap. Gap or chasm? Implement

Scope of the Problem There are 550 named interventions for children and adolescents (e.g., behavioral, cognitive-behavioral). How many have been empirically evaluated? Evidence-based interventions are less likely to be used than interventions for which there is no evidence or for which there is evidence of a lack of impact (Kazdin, 2000). Implement

Scurvy in the British Royal Navy: An Example of the Research to Practice Gap James Lancaster conducted the first experiment demonstrating how to prevent scurvy (1601). James Lind again experimentally demonstrated the effectiveness of citrus in preventing scurvy (1747). The British Navy finally adopted a policy of carrying citrus on all ships of the Royal Navy (1795). Implement

Research to Practice Issues The lag time from efficacy research to effectiveness research to dissemination is many years (Hoagwood, Burns & Weisz, 2002). Practitioners often view research as irrelevant and impractical (Hoagwood, Burns & Weisz, 2002). Only 4 of 10 Blueprint Violence Prevention programs had the capacity to disseminate to 10+ sites in a year (Elliott & Mihalic, 2004). Implement

Challenges of Implementation The average life span of an educational innovation is a matter of months (Latham, 1988). Why? The innovation is more difficult than expected; it causes too much change; it takes too much time; supporters leave; personnel lack training; external funds run out; supervision is inadequate. Implement

Implementation is Fundamental 80% of initiatives ended within 2 years; 90% of initiatives ended within 4 years. Data from the Center for Comprehensive School Reform. Implement

Implementation is Multiplicative Effective Intervention (1) × Effective Implementation (0) = Outcomes that benefit individuals and society (0). However effective the intervention, without effective implementation the product is zero benefit. Implement

Diffusion of Innovation Diffusion of innovation is a social process, even more than a technical matter. The adoption rate of an innovation is a function of its compatibility with the values, beliefs, and past experiences of the individuals in the social system (Rogers, Diffusion of Innovations, 2003). Implement

Principles for Effective Diffusion: Improving the Odds (Rogers, 2003) Innovation has to solve a problem that is important for the “client.” Innovation must have relative advantage over current practice. It is necessary to gain support of the opinion leaders if adoption is to reach critical mass and become self-sustaining. Innovation must be compatible with existing values, experiences and needs of the community. Implement

Principles of Effective Diffusion: Improving the Odds Innovation is perceived as being simple to understand and implement. Innovation can be implemented on a limited basis prior to broad scale adoption. Results of the innovation are observable to others. Implement

Effective Programs Are Not Effectively Implemented Elliott & Mihalic (2004) reviewed the replication of Blueprint Model Programs (violence prevention and drug prevention programs) in community settings. Programs were reviewed across 5 dimensions: site selection, training, technical assistance, fidelity, sustainability. Implement

Keys to Effective Implementation Critical elements in site readiness:  Well-connected local champion  Strong administrative support  Formal organizational commitments  Organizational staffing stability  Up-front commitment of necessary resources  Program credibility within the community  Program sustained by the existing operational budget. Implement

Keys to Effective Implementation Critical elements of training  Adhere to requirements for training, skills, and education.  Hire all staff before scheduling training.  Encourage administrators to attend training.  Plan and budget for staff turnover.  Implement program immediately after training. Implement

Keys to Effective Implementation Critical elements of Technical Assistance  Proactive plan for technical assistance. Critical elements of Fidelity  Monitor fidelity. Critical elements of Sustainability  Function of how well other dimensions are implemented. Implement

Phases of Implementation Adoption of practice; Implementation (initial to full scale); Sustainability. It takes 2-4 years to achieve full implementation (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005). Implement

Barriers to Effective Implementation (Training) Teachers are the primary means of exposure to interventions, and students will not benefit from effective practices if they are not exposed to them. Data suggest that preparation programs are not preparing trainees to use evidence-based practices. Implement

Barriers to Effective Dissemination [Chart: survey of School Psychology directors of training on evidence-based interventions, showing low percentages of programs providing knowledge of evidence-based interventions (29%) and training in them (41%) (Shernoff, Kratochwill, & Stoiber, 2003).] Implement

Barriers to Effective Implementation Adoption vs. adaptation: programs almost always require some adaptation to fit local circumstances. What can be changed without doing “violence” to the evidence-based program? The usual advice is to implement the core components and allow the rest to be adapted; rarely, however, are the core components known. Implement

Evaluating Evidence-based Interventions Practice-based evidence about evidence-based practices Evaluate

Evaluating Evidence-based Interventions Progress Monitoring Implementation of an evidence-based intervention does not assure success. It is necessary to evaluate impact in the local context: no intervention will be effective for all students, and we cannot predict who will benefit. Evaluate

Evaluating Evidence-based Interventions Progress Monitoring Two methods of evaluation:  Formative  Summative Formative evaluation facilitates real-time decision-making.

General Outcome Measures (GOMs) The larger community is concerned with measures such as academic achievement, bullying, and substance abuse. Curriculum-based measurement is well established for assessing academic performance, especially in the early grades. There are no comparable measures for social behavior; SWPBS relies on Office Discipline Referrals. Evaluate

Evaluating Evidence-based Interventions Curriculum-based measurement (CBM) is a powerful means for evaluating the impact of academic interventions. Scores on CBM are correlated with scores on high-stakes tests and can be used to predict how students will perform on those tests. Evaluate
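
To illustrate the prediction idea, here is a minimal sketch (all data, the cut score, and variable names are hypothetical, not from the talk) of fitting a line from CBM oral-reading-fluency scores to later state-test scores:

```python
import numpy as np

# Hypothetical winter CBM scores (words correct per minute) and the same
# students' later state-test scale scores.
cbm = np.array([45, 62, 80, 95, 110, 130], dtype=float)
state_test = np.array([310, 340, 365, 390, 410, 440], dtype=float)

# Fit a simple linear model: state_test ~ slope * cbm + intercept.
slope, intercept = np.polyfit(cbm, state_test, 1)

def predicted_score(cbm_score: float) -> float:
    """Predict a state-test score from a CBM score using the fitted line."""
    return slope * cbm_score + intercept

# Flag students predicted to fall below a (hypothetical) proficiency cut
# score, so intervention can begin before the high-stakes test is given.
PROFICIENCY_CUT = 380.0
for score in (50, 90, 125):
    at_risk = predicted_score(score) < PROFICIENCY_CUT
    print(f"CBM {score} wcpm -> predicted {predicted_score(score):.0f}, at risk: {at_risk}")
```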

Benefits of Formative Assessment Progress monitoring 2-5 times per week in math and reading is:  4 times as effective as a 10% increase in per-pupil spending;  6 times as effective as voucher programs;  64 times as effective as charter schools;  6 times as effective as increased accountability (Yeh, 2007). Evaluate

Benefits of Formative Assessment [Chart: effect sizes for formative evaluation and feedback (Hattie, Visible Learning, 2009; Fuchs & Fuchs, 1986).] Evaluate

Evidence-based Education, Progress Monitoring and Treatment Integrity Student data provide feedback about progress. If we also know about the adequacy of treatment integrity, we can make decisions about:  the adequacy of the intervention;  the adequacy of implementation. If implementation is inadequate, the focus should be on improving educator behavior. If implementation is adequate, the focus should be on changing the intervention so the student can succeed. Decisions can also be made about increasing or decreasing the intensity of the intervention, as sketched below. Evaluate
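
The slide’s decision logic amounts to a small branching rule. A minimal sketch (the function and variable names are ours; what counts as "adequate" integrity or acceptable progress would come from local data, not from the talk):

```python
def next_step(student_progress_ok: bool, integrity_adequate: bool) -> str:
    """Recommend a focus given progress-monitoring and treatment-integrity data."""
    if student_progress_ok:
        # The intervention is working as delivered; intensity might be reduced.
        return "Continue; consider decreasing intervention intensity."
    if not integrity_adequate:
        # Poor outcomes may reflect poor delivery, not a poor intervention:
        # focus on educator behavior (coaching, feedback, support).
        return "Improve implementation before changing the intervention."
    # Implemented adequately but the student is not succeeding:
    # change or intensify the intervention itself.
    return "Modify or increase the intensity of the intervention."

print(next_step(student_progress_ok=False, integrity_adequate=False))
print(next_step(student_progress_ok=False, integrity_adequate=True))
```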

Summing Up Evidence-based interventions provide the best chance of improving student outcomes. Federal policy encourages their use, and processes exist for identifying effective practices. There is, however, no apparent systematic plan for implementing the policy, and without a plan the policy is likely to fail.

Reasons for Hope An emerging science of implementation:  National Implementation Research Network  Global Implementation Conference (August)  A federally funded project: State Implementation and Scaling-up of Evidence-based Practices (SISEP)

Thank you Copies can be downloaded at