Ten Years of Evidence-based Education: A Status Report Ronnie Detrich Wing Institute
Goals for Today Review the mandates of NCLB and IDEIA that emphasize scientifically based instruction. Place these mandates in the broader context of evidence-based practice. Assess the status of evidence-based education after 10 years.
No Child Left Behind (2001) By 2014 every student will be at grade level. Instructional methods will be scientifically based. Educators will be held accountable for outcomes.
A Closer Look at Scientifically-based Instruction NCLB: interventions to improve educational performance are based on scientific research. In NCLB there are over 100 references to scientific research. IDEIA (2004): interventions are scientifically based instructional practices.
A Closer Look at Scientifically-based Instruction Specific requirements of IDEIA include: Pre-service and professional development for all who work with students with disabilities to ensure such personnel have the skills and knowledge necessary to improve the academic achievement and functional performance of children with disabilities, including the use of scientifically based instructional practices, to the maximum extent possible.
A Closer Look at Scientifically-based Instruction Scientifically based early reading programs, positive behavioral interventions and supports, and early intervention services to reduce the need to label children as disabled in order to address the learning and behavioral needs of such children.
A Closer Look at Scientifically-based Instruction The Individualized Education Program (IEP) shall include a statement of the special education and related services and supplementary aids and services, based on peer-reviewed research to the extent practicable, to be provided to the child, or on behalf of the child, and a statement of the program modifications or supports that will be provided for the child.
A Closer Look at Scientifically-based Instruction In determining if a child has a specific learning disability, a local education agency may use a process that determines if a child responds to a scientific, research-based intervention as part of the evaluation procedures.
After 10 Years: How Are We Doing? Student performance, as measured by NAEP, has not changed in the last decade.
SOURCE: U.S. Department of Education, Institute of Education Sciences, National Center for Education Statistics, National Assessment of Educational Progress (NAEP), 1992, 1994, 1998, 2000, 2002, 2003, 2005, and 2007 Reading Assessments. [Chart: NAEP reading scores at ages 9, 13, and 17, and proficiency at age 17.]
Are We Getting Our Money’s Worth? SOURCE: U.S. Department of Education, National Center for Education Statistics. (2009). Digest of Education Statistics, 2008 (NCES 2009-020), Chapter 2 and Table 179. We were doing better in 1970 than in 2009 because we were getting the same effect for half the cost.
After 10 Years: How Are We Doing? NCLB and IDEIA increased interest in evidence- based education. Placed in broader context of evidence-based practice and data-based decision making. “Evidence-based” has developed two meanings. Practices that meet evidence standards. Process for practitioner decision-making.
What Is Evidence-based Practice? (Validated Practices) Often perceived as a list of interventions practitioners must use. National Reading Panel: 5 elements of scientifically based reading. Reading First: many states established lists of approved reading programs. Insurance companies will not fund autism services unless the intervention is on the list.
What is Evidence-based Practice? (The Process) At its core the EBP movement is a consumer protection movement. It is not about science per se. It is a policy to use science for the benefit of consumers. “The ultimate goal of the ‘evidence-based movement’ is to make better use of research findings in typical service settings, to benefit consumers and society….” (Fixsen, 2008)
What Is Evidence-based Practice? EBP is a decision-making approach that places emphasis on evidence to: guide decisions about which interventions to use; evaluate the effects of an intervention. [Diagram (Sackett et al., 2000): evidence-based practice at the intersection of best available evidence, professional judgment, and client values.]
Phases of Evidence-based Education: Identify, Implement, Evaluate.
How Are We Doing? The term “evidence-based” has become ubiquitous in the last decade, but there is no consensus about what it means. At issue is what counts as evidence. The federal definition emphasizes experimental methods, with a preference for randomized trials; it has been criticized as positivistic. Identify
What Counts as Evidence? Ultimately, this depends on the question being asked. In EBP the goal is to identify causal relations between interventions and outcomes. Experimental methods do this best. Identify
What Counts as Evidence? Even if we accept causal demonstrations to be evidence, there is no consensus. Randomized Clinical Trials (RCT) have become the “gold standard.” There is controversy about the status of single case designs. WWC has recently established standards for single case designs. No well established method for calculating effect sizes. Identify
What is an Evidence-based Intervention? Identification is more than finding a study to support an intervention. Identification involves distilling a body of knowledge to determine the strength of evidence. Systematic review Identify
How Are Evidence-based Interventions Identified? Distillation requires standards of evidence for reviewing the literature. Standards specify: the quantity of evidence the quality of evidence Identify
Relationship Between Quality and Quantity of Evidence [2 × 2 matrix: high quality and high quantity = evidence-based; high quality but low quantity = may meet evidence standards; low quality but high quantity = may meet evidence standards; low quality and low quantity = not evidence-based.] Identify
How Are Evidence-based Interventions Identified? A number of organizations publish evidence-based reviews: What Works Clearinghouse, Best Evidence Encyclopedia, Promising Practices Network, Coalition for Evidence-based Policy. They share commonalities but also differ; correlations among their ratings are modest (Briggs, 2008). Identify
Implications of EBP Reviews How are consumers to decide? [Table: the same Intervention X is rated Validated under Standard 1 but Not Validated under Standard 2.] Identify
[2 × 2 matrix of assessed vs. actual effectiveness: assessed effective and actually effective = true positive; assessed effective but actually ineffective = false positive (most likely with a threshold approach); assessed ineffective but actually effective = false negative (most likely with a hierarchy approach); assessed ineffective and actually ineffective = true negative.] Identify
Choosing Between False Positives and False Negatives At this stage, it is better to have more false positives than false negatives. False Negatives: Effective interventions will not be selected for implementation. As a consequence, less likely to determine that they are actually effective. False Positives: Progress monitoring will identify interventions that are not effective. Identify
Is Reading Mastery 1 an effective program for beginning readers? (little research specific to Reading Mastery 1) Is Reading Mastery (levels 1–3) an effective reading program for beginning readers? (more research available) Are Direct Instruction reading programs effective for beginning readers? (even more research available) Are all direct instruction reading programs (e.g., Reading Mastery, Soar to Success, Rewards) effective for beginning readers? Each expansion broadens the evidence base, but also changes the initial question. Identify
Implications of EBP Reviews Emphasis on “best” evidence. In many instances non-existent. In absence, what is basis for decision making? “Best available” evidence is standard in evidence- based practice (the process). Identify
What Works Practice Guides: Best Available Evidence (14 practice guides, 78 total recommendations) Level of support: Minimal, 45%; Moderate, 33%; Strong, 22%. (Tim Slocum, 2011) Identify
Implementing Evidence-based Interventions Where Good Ideas go to Die Implement
Research to Practice Gap The research-to-practice gap is a concern in many disciplines; education is no exception. The scientist/practitioner model aimed to close the gap. Gap or chasm? Implement
Scope of the Problem There are 550 named interventions for children and adolescents (behavioral, cognitive-behavioral); how many have been empirically evaluated? Evidence-based interventions are less likely to be used than interventions for which there is no evidence, or for which there is evidence of a lack of impact. Kazdin (2000) Implement
Scurvy in the British Royal Navy: An Example of the Research to Practice Gap 1601: James Lancaster conducts the first experiment demonstrating how to prevent scurvy. 1747: James Lind again experimentally demonstrates the effectiveness of citrus in preventing scurvy. 1795: The British Navy adopts a policy requiring citrus on all ships of the Royal Navy. Implement
Research to Practice Issues The lag time from efficacy research to effectiveness research to dissemination is 10–20 years. (Hoagwood, Burns & Weisz, 2002) Practitioners often view research as irrelevant and impractical. (Hoagwood, Burns & Weisz, 2002) Only 4 of 10 Blueprint Violence Prevention programs had the capacity to disseminate to 10+ sites in a year. (Elliott & Mihalic, 2004) Implement
Challenges of Implementation Average life span of an educational innovation is 18-48 months (Latham, 1988). Why? Innovation more difficult than expected. Causes too much change. Takes too much time. Supporters leave. Personnel lack training. External funds run out. Inadequate supervision. Implement
Implementation Is Fundamental 80% of initiatives ended within 2 years; 90% of initiatives ended within 4 years. Data from the Center for Comprehensive School Reform. Implement
Implementation Is Multiplicative Effective intervention (1) × effective implementation (0) = outcomes that benefit individuals and society (0). Implement
Diffusion of Innovation Rogers, Diffusion of Innovation, 2003 Diffusion of innovation is a social process, even more than a technical matter. The adoption rate of innovation is a function of its compatibility with the values, beliefs, and past experiences of the individuals in the social system. Implement
Principles for Effective Diffusion: Improving the Odds (Rogers, 2003) Innovation has to solve a problem that is important for the “client.” Innovation must have relative advantage over current practice. It is necessary to gain support of the opinion leaders if adoption is to reach critical mass and become self-sustaining. Innovation must be compatible with existing values, experiences and needs of the community. Implement
Principles of Effective Diffusion: Improving the Odds Innovation is perceived as being simple to understand and implement. Innovation can be implemented on a limited basis prior to broad scale adoption. Results of the innovation are observable to others. Implement
Effective Programs Are Not Effectively Implemented Elliott & Mihalic (2004) reviewed the replication of Blueprint Model Programs (violence prevention and drug prevention programs) in community settings. Programs were reviewed across 5 dimensions: site selection, training, technical assistance, fidelity, and sustainability. Implement
Keys to Effective Implementation Critical elements in site readiness: a well-connected local champion; strong administrative support; formal organizational commitments; organizational staffing stability; up-front commitment of necessary resources; program credibility within the community; program sustained by the existing operational budget. Implement
Keys to Effective Implementation Critical elements of training Adhere to requirements for training, skills, and education. Hire all staff before scheduling training. Encourage administrators to attend training. Plan and budget for staff turnover. Implement program immediately after training. Implement
Keys to Effective Implementation Critical elements of Technical Assistance Proactive plan for technical assistance. Critical elements of Fidelity Monitor fidelity. Critical elements of Sustainability Function of how well other dimensions are implemented. Implement
Phases of Implementation Adoption of practice. Implementation (initial to full scale). Sustainability. It takes 2–4 years to achieve full implementation. (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005) Implement
Barriers to Effective Implementation (Training) Teachers are the primary means of exposing students to interventions. Students will not benefit from effective practices if they are not exposed to them. Data suggest that preparation programs are not preparing trainees to use evidence-based practices. Implement
Barriers to Effective Dissemination [Chart (Shernoff, Kratochwill, & Stoiber, 2003): survey of school psychology directors of training — 29% and 41% figures on programs' knowledge of and training in evidence-based interventions.] Implement
Barriers to Effective Implementation Adoption vs. Adaptation Programs almost always require some adaptation to fit local circumstances. What can be changed without doing “violence” to an evidence-based program? The usual advice is to implement the core components faithfully; other components can be changed. Rarely are the core components known. Implement
Evaluating Evidence-based Interventions Practice-based evidence about evidence-based practices Evaluate
Evaluating Evidence-based Interventions Progress Monitoring Implementation of evidence-based intervention does not assure success. Necessary to evaluate impact in local context. No intervention will be effective for all students. Cannot predict who will benefit. Evaluate
Evaluating Evidence-based Interventions Progress Monitoring Two methods of evaluation: formative and summative. Formative evaluation facilitates real-time decision-making.
General Outcome Measures (GOMs) The larger community is concerned with measures such as academic achievement, bullying, substance abuse. Curriculum-based measurement well established for assessing academic performance, especially early grades. There are no comparable measures for social behavior. SWPBS relies on Office Discipline Referrals. Evaluate
Evaluating Evidence-based Interventions Curriculum-based measurement is a powerful means of evaluating the impact of academic interventions. Scores on CBM correlate with scores on high-stakes tests, and can be used to predict how students will perform on those tests. Evaluate
Benefits of Formative Assessment Progress monitoring 2–5 times per week in math and reading is: 4 times as effective as a 10% increase in per-pupil spending; 6 times as effective as voucher programs; 64 times as effective as charter schools; 6 times as effective as increased accountability. Yeh (2007) Evaluate
Benefits of Formative Assessment [Chart: effect sizes for formative assessment, from Hattie, Visible Learning, 2009, and Fuchs & Fuchs, 1986.] Evaluate
Evidence-based Education, Progress Monitoring, and Treatment Integrity Student data provide feedback about progress. If we also know the adequacy of treatment integrity, we can make decisions about: the adequacy of the intervention; the adequacy of its implementation. If implementation is inadequate, the focus should be on improving educator behavior. If implementation is adequate, the focus should be on changing the intervention so the student can succeed. Decisions can also be made about increasing or decreasing the intensity of the intervention. Evaluate
Summing Up Evidence-based interventions provide the best chance of improving student outcomes. Federal policy encourages their use. Processes exist for identifying effective practices. There is no apparent systematic plan for implementing the policy, and without a plan the policy is likely to fail.
Reasons for Hope An emerging science of implementation: National Implementation Research Network. Global Implementation Conference, Aug. 2011. Federally funded project: State Implementation and Scaling-up of Evidence-based Practices (SISEP), http://scalingup.org/
Thank you Copies can be downloaded at www.winginstitute.org