1
The Ethical and Legal Basis for Evidence-based Education: Implications for the Profession. Ronnie Detrich, Wing Institute. TED Conference, November 2008, Dallas, Texas
2
Overview: Discuss the recent legal and regulatory requirements to base educational interventions on scientific research. Describe ethical requirements to rely on scientific knowledge. Describe what it means to be evidence-based: definitions, issues, and controversies. Appraise current status and future directions.
3
The Legal Basis No Child Left Behind (NCLB): Interventions used to improve educational performance are based on scientific research. Over 100 references to scientific research in NCLB. Individuals with Disabilities Education Improvement Act [IDEIA] (2004): Interventions are scientifically based instructional practices.
4
The Legal Basis Specific requirements of IDEIA include: Pre-service and professional development to improve the academic achievement and functional performance of children with disabilities, including the use of scientifically based instructional practices, to the maximum extent possible.
5
The Legal Basis Scientifically based early reading programs, positive behavioral interventions and supports, and early intervention services to reduce the need to label children as disabled in order to address the learning and behavioral needs of such children.
6
The Legal Basis The Individualized Education Program (IEP) must provide services based on peer-reviewed research to the extent practicable.
7
The Legal Basis In determining if a child has a specific learning disability, a local education agency may use a process that determines if a child responds to a scientific, research-based intervention as part of the evaluation procedures.
8
The Ethical Basis Most national psychological and educational organizations have ethical standards requiring science-based practices to address problems. American Psychological Association Ethical Standard 2.04: Psychologists’ work is based on the established scientific and professional knowledge of the discipline.
9
The Ethical Basis National Association of School Psychologists Standard III F 4. School psychology faculty members and clinical or field supervisors uphold recognized standards of the profession by providing training related to high quality, responsible, and research-based school psychology services.
10
The Ethical Basis National Association of School Psychologists Standard IV 4. School psychologists use assessment techniques, counseling and therapy procedures, consultation techniques, and other direct and indirect service methods that the profession considers to be responsible, research-based practice.
11
The Ethical Basis Behavior Analyst Certification Board Standard 2.09a The behavior analyst always has the responsibility to recommend scientifically supported, most effective treatment procedures. Effective treatment procedures have been validated as having both long-term and short-term benefits to clients and society. Standard 2.09b Clients have a right to effective treatment (i.e., based on the research literature and adapted to the individual client).
12
Another Ethical Responsibility Education services are largely funded through public dollars (taxpayer dollars). There is an implicit assumption that the money will be spent for the public good. A fiduciary responsibility exists when one person or organization is charged with managing another person's money.
13
Another Ethical Responsibility A fiduciary must act solely for the benefit of the other party. The fiduciary role carries the weight of ethical conduct: assure that the taxpayers are receiving the greatest possible return on their investment.
14
How Do We Meet Our Fiduciary Responsibility? Interventions that have an evidence base are more likely to produce positive effects for students. This does not assure positive outcomes but increases the probability. The impact of a non-evidence-based intervention is unknown.
15
How Do We Meet Our Fiduciary Responsibility? Implications: Using a non-evidence-based intervention when there are evidence-based interventions available constitutes unethical practice. Use of a non-evidence-based intervention should be considered research. All of the safeguards afforded research participants and their families should be in place. Conducting research with tax dollars provided for education services may constitute a violation of our fiduciary responsibility.
16
Becoming Evidence-based Clearly, the intent of Congress, the U.S. Department of Education, and the Office of Special Education Programs is to rely on interventions that have a scientific basis. Professional organizations place great value on scientific knowledge. What does it mean to be evidence-based?
17
What is Evidence-based Practice? At its core the EBP movement is a consumer protection movement. It is not about science per se. It is a policy to use science for the benefit of consumers. “The ultimate goal of the ‘evidence-based movement’ is to make better use of research findings in typical service settings, to benefit consumers and society….” (Fixsen, 2008)
18
What is Evidence-based Practice? Evidence-based practice has its roots in medicine. Movement has spread across major disciplines in human services: Psychology School Psychology Social Work Speech Pathology Occupational Therapy
19
What Is Evidence-based Practice? EBP is a decision-making approach that places emphasis on evidence to: guide decisions about which interventions to use; evaluate the effects of an intervention. Its three elements are Professional Judgment, Best Available Evidence, and Client Values (Sackett et al., 2000).
20
Phases of Evidence-based Intervention: Identify, Implement, Evaluate.
21
What is Evidence-based Education? The term "evidence-based" has become ubiquitous in the last decade. There is no consensus about what it means. At issue is what counts as evidence. The federal definition emphasizes experimental methods, with a preference for randomized trials. The definition has been criticized as being positivistic. Identify
22
What Counts as Evidence? Ultimately, this depends on the question being asked. Qualitative methods are best for some questions. In EBP the goal is to identify causal relations between interventions and outcomes. Experimental methods do this best.
23
What Counts as Evidence? Even if we accept causal demonstrations to be evidence, we have no consensus. Randomized Clinical Trials (RCT) have become the "gold standard." There is controversy about the status of single subject designs. Most frequently criticized on the basis of external validity.
24
How Are Evidence-based Interventions Identified? Identification is more than finding a study to support an intervention. Identification involves distilling a body of knowledge to determine the strength of evidence.
25
How Are Evidence-based Interventions Identified? Distillation requires standards of evidence for reviewing the literature. Standards specify: the quantity of evidence; the quality of evidence.
26
Continua of Evidence (Janet Twyman, 2007). Quality of the evidence ranges from personal observation, expert opinion, and general consensus, through uncontrolled studies, well-conducted clinical studies, single case designs, and semi-randomized trials, up to the current "gold standard": the high quality randomized controlled trial. Quantity of the evidence ranges from a single study, through repeated systematic measures, various investigations, and single case replication (direct and parametric), up to convergent evidence and meta-analysis (systematic review). A threshold of evidence falls along these continua.
27
How Are Evidence-based Interventions Identified? Two approaches to validating interventions Threshold approach: Evidence must be of a specific quantity and quality before an intervention is considered evidence-based. Hierarchy of evidence approach: Strength of evidence falls along a continuum with each level having differential standards.
28
How Are Evidence-based Interventions Identified? There are no agreed-upon standards. It is possible for an intervention to be evidence-based using one set of standards and to fail to meet evidence standards using an alternative set. This makes it difficult for consumers and decision makers to sort out the competing claims about what is evidence-based.
30
Evidence-based Intervention
31
Assessed Effectiveness vs. Actual Effectiveness yields four outcomes: True Positive (assessed effective, actually effective); False Positive (assessed effective, actually ineffective); False Negative (assessed ineffective, actually effective); True Negative (assessed ineffective, actually ineffective). False negatives are most likely with the threshold approach; false positives are most likely with the hierarchy approach.
32
Choosing Between False Positives and False Negatives At this stage, it is better to have more false positives than false negatives. False Negatives: Effective interventions will not be selected for implementation; as a consequence, it is less likely we will determine that they are actually effective. False Positives: Progress monitoring will identify interventions that are not effective.
33
Why Do We Need Evidence-based Education? There are 550 named interventions for children and adolescents; only a subset, primarily behavioral and cognitive-behavioral, have been empirically evaluated (Kazdin, 2000). Evidence-based interventions are less likely to be used than interventions for which there is no evidence or there is evidence about lack of impact.
34
Are We Training Educators to be Evidence-based? In a survey of school psychology directors of training (Shernoff, Kratochwill, & Stoiber, 2003), 29% of programs reported knowledge of evidence-based interventions and 41% reported providing training in them.
36
Implementing Evidence-based Interventions: Where Good Interventions Go to Die The intent of both legal and ethical guidelines is to have positive impact. Evidence-based interventions are assumed to give us that chance. Identification is necessary but not sufficient to assure that an intervention will be effective. We must address complex issues associated with implementation. Implement
37
Implementing Evidence-based Interventions Implementation is where the research to practice gap is most evident. Many innovations in education have very short life spans: 18 months (Latham). This is most often a result of poor implementation.
38
Implementation is Fundamental 80% of initiatives ended within 2 years; 90% of initiatives ended within 4 years. Data from the Center for Comprehensive School Reform.
39
Well Tested Programs Often Fail Large Scale Implementation Elliott & Mihalic (2004) reviewed replications of Blueprint Model Programs (violence prevention and drug prevention programs) in community settings. Programs were reviewed across 5 dimensions: site selection, training, technical assistance, fidelity, and sustainability.
40
Keys to Implementation Critical elements in site readiness: a well connected local champion; strong administrative support; formal organizational commitments; organizational staffing stability; up-front commitment of necessary resources; program credibility within the community; a program sustained by the existing operational budget.
41
Keys to Implementation Critical elements of training: Adhere to requirements for training, skills, and education. Hire all staff before scheduling training. Encourage administrators to attend training. Plan and budget for staff turnover. Implement the program immediately after training.
42
Keys to Implementation Critical element of technical assistance: a proactive plan for technical assistance. Critical element of fidelity: monitor fidelity. Sustainability is a function of how well the other dimensions are implemented.
43
Implementing Evidence-based Interventions Dimensions of Implementation: contextual fit; complexity of intervention.
44
Implementing Evidence-based Interventions Contextual Fit Contextual Fit: the degree to which an intervention matches the culture, training, and resources of a particular setting. These characteristics of a setting can be measured. Degree of contextual fit may moderate the impact of an intervention.
45
Implementing Evidence-based Interventions Contextual Fit Adoption or Accommodation Adoption: implementing the intervention as it was evaluated to be effective. Assures the intervention is evidence-based; does not assure implementation. Accommodation: adjusting the intervention to meet local circumstances. May result in the intervention no longer being evidence-based; may increase implementation with integrity.
46
Implementing Evidence-based Interventions Contextual Fit Logically, it would seem to make sense always to implement the intervention that produces the greatest impact. There may be exceptions: if a high impact intervention requires great resources and specialized training, and is very different from current practices, it may not be implemented with integrity. It may be better to implement an effective but lower impact intervention that is a better contextual fit and will be implemented with greater integrity.
47
Implementing Evidence-based Interventions Complexity Level of precision may increase complexity. Be as precise as necessary but no more. Complexity and precision increase together: from "catch 'em being good," to the Good Behavior Game, to an individualized intervention plan.
48
Evaluating Evidence-based Interventions Progress Monitoring Implementation of an evidence-based intervention does not assure success. It is necessary to evaluate impact in the local context. No intervention will be effective for all students, and we cannot predict who will benefit. Progress monitoring is practice-based evidence about evidence-based practices. It is consistent with legal requirements and ethical standards. Evaluate
49
Ethical Standards and Progress Monitoring National Association of School Psychologists Standard IV C 1b. Decision-making related to assessment and subsequent interventions is primarily data-based. Standard IV 6. School psychologists develop interventions that are appropriate to the presenting problems and are consistent with the data collected. They modify or terminate the treatment plan when the data indicate the plan is not achieving the desired goals.
50
Ethical Standards and Progress Monitoring Behavior Analyst Certification Board Standard 4.04 The behavior analyst collects data or asks the client, client-surrogate, or designated other to collect data needed to assess progress within the program. Standard 4.05 The behavior analyst modifies the program on the basis of data.
51
Legal Requirements for Progress Monitoring Progress monitoring is fundamental to the IEP process. Response to Intervention (RTI) is accepted as an alternative means for determining eligibility for Learning Disability classification. Progress monitoring is the heart of RTI: all students are routinely and systematically monitored to assure adequate progress is occurring.
52
Evaluating Evidence-based Interventions Progress monitoring is a systems level intervention. Systems must be in place to assure: data are collected; data are reviewed; decisions are based on the data. If systems are not in place, the response effort associated with data collection will compromise data-based decision making.
53
Progress Monitoring as an Intervention Progress monitoring 2-5 times per week in math and reading is: 4 times as effective as a 10% increase in per pupil spending; 6 times as effective as voucher programs; 64 times as effective as charter schools; 6 times as effective as increased accountability (Yeh, 2007).
54
Evidence-based Education and Treatment Integrity Progress monitoring allows data-based decision making about the effects of an intervention. It is impossible to make informed decisions without knowing how well the intervention was implemented.
55
Outcome by Integrity decision matrix: Positive outcome, high integrity: the intervention is effective; continue the intervention. Positive outcome, low integrity: unknown whether the intervention or other life changes produced the outcome. Negative outcome, high integrity: an intervention problem; change the intervention. Negative outcome, low integrity: unknown reason; an intervention problem or an implementation problem?
56
Where Are We? (image: from a university in the U.S.)
57
Where are We? Being evidence-based is more than a good idea, it is the law and it is ethical conduct; however, it is not as easy as it might seem. Lack of consensus about evidence may do harm to consumers. The research to practice gap limits the impact of evidence-based education. The science of implementation and sustainability is in its infancy.
58
Where are We? Pre-service training should change to reflect current policy. Changes in both method of training and content.
59
Effects of Training (Joyce and Showers, 2002). Outcomes: % of participants who demonstrate knowledge, demonstrate new skills in a training setting, and use new skills in the classroom. Theory and Discussion: 10% knowledge, 5% skill demonstration, 0% classroom use. + Demonstration in Training: 30%, 20%, 0%. + Practice & Feedback in Training: 60%, 60%, 5%. + Coaching in Classroom: 95%, 95%, 95%.
60
Basis for Choosing Treatment (Szatmari, 2004): each treatment option is weighed on evidence and values.
Do Nothing: evidence none; unethical (clinical paralysis).
Toss a Coin: evidence none; unethical in light of the evidence.
Training: evidence none or outdated; perhaps some current value.
Etiology: evidence limited; difficult.
ABA: evidence robust (effective); viewed as not very humane.
Developmental sociocognitive: evidence none yet; highly preferred.
To be ethical: inform parents of the options.
73
Thank you. A copy of this presentation may be downloaded at www.winginstitute.org