Some Considerations for Developing, Implementing, & Sustaining Clinical Experiences
Larry Maheady, PhD
Exceptional Education Department, SUNY Buffalo State
June 28, 2017
Some Important Considerations
- Function of clinical experiences
- Increased emphasis on improving “student outcomes” as the basis for clinical experiences & partnerships
- Use of EBP as a decision-making process for deciding which practices to use and how to teach candidates to use them fluently
- Use of implementation science principles to sustain change
Shared Vision for Clinical Experiences
- Professional and societal needs
- EPP & candidate needs
- P-12 teacher & school needs
- Needs of students with and without disabilities
Using Evidence-Based Practice as a Decision-Making Process in Selection & Monitoring of Effects
Two Ways to Think about Evidence-Based Practice
- A practice or intervention that has met some evidentiary standard, AND
- A broader framework for decision-making (Detrich, 2015)
Evidence-Based Practice as a Decision-Making Process
EBP is a decision-making approach that places emphasis on evidence to (a) guide decisions about which interventions to use and (b) evaluate the effects of any intervention (Sackett et al., 2000; Detrich, 2008).
[Diagram: three overlapping circles representing best available evidence, student needs, and professional wisdom]
Identify the Most Important Practices
What criteria should we use to ensure that candidates learn to use the practices that are best for their students? Candidates should learn practices that:
- Have rigorous empirical support
- Are relevant to “high priority” instructional needs
- Are used frequently in classrooms (high incidence)
- Are broadly applicable across content areas (high prevalence)
Challenge for Practitioners
- Practitioners must solve a specific problem for specific students in a specific context.
- Research can vary in strength (weak to strong) and can be more or less relevant.
- Even with insufficient evidence, decisions must be made; the standard is not “no evidence” but “best available evidence.”
- Even when an EBP is implemented well, its impact must still be evaluated: we can’t predict which students will benefit, and no intervention works for all students.
- Progress monitoring is practice-based evidence about the effects of an evidence-based practice (Detrich, 2013).
Rigor × Relevance Decision Matrix
- Strong rigor, high relevance: preferred outcome; implementation priority
- Weak rigor, high relevance: try out; monitor effects
- Strong rigor, low relevance: adapt & monitor effects
- Weak rigor, low relevance: don’t implement
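The matrix amounts to a simple lookup rule. A minimal sketch in Python (the function name and recommendation strings are illustrative, not part of the presentation):

# Illustrative sketch: encodes the rigor-by-relevance decision matrix above.
def select_practice(rigor: str, relevance: str) -> str:
    """Map a practice's research rigor and relevance to a recommendation."""
    rules = {
        ("strong", "high"): "Preferred outcome; implementation priority",
        ("weak", "high"): "Try out; monitor effects",
        ("strong", "low"): "Adapt & monitor effects",
        ("weak", "low"): "Don't implement",
    }
    return rules[(rigor.lower(), relevance.lower())]

print(select_practice("weak", "high"))  # Try out; monitor effects

The same lookup pattern fits the impact-by-effort matrix that appears later in the deck.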
Essential or Core Practices (Effect Sizes)
- Providing formative evaluation: 0.90
- Feedback: 0.73
- Spaced versus massed practice: 0.71
- Meta-cognitive strategies: 0.69
- Self-verbalization/self-questioning: 0.64
- Concept maps: 0.57
- Worked examples
- Peer tutoring: 0.55
How should candidates be prepared to use practices?
Training Components and Attainment of Outcomes in Terms of Percent of Participants
[Table: outcomes and process by training component, from Joyce & Showers (2002)]
Continuum of Options for Developing Practice
Any of these options may be useful for improving practice, but some are much more useful than others. From low to high impact and effort:
- Case-study instruction; application papers
- In-class simulations; inter- & micro-teaching
- Early clinical experiences; tutoring programs
- Student teaching
- Clinical year & coaching
Impact × Effort Decision Matrix
- High impact, low effort: preferred outcome; implementation priority
- Low impact, low effort: try out; monitor effects
- High impact, high effort: positive outcome; conduct cost-benefit analysis
- Low impact, high effort: don’t implement
How can we sustain changes?
A Systems Perspective
- Students do well when teachers (candidates & cooperating teachers) do well.
- Teachers do well when they are effectively supported at the building and university levels.
- Principals/EPPs are effective when they are supported by the district & university.
- Districts/universities perform well when they are supported by the State Education Agency.
- This requires an aligned system, with data about both student performance and implementation.
Takeaways
- Improving P-12 student outcomes should drive decision-making at all levels (classroom, building, district, & university).
- Evidence-based practice (EBP) is systematic decision-making that can improve professional practice at all levels.
- EPPs & partners should be guided by principles of implementation science.
- Avoid the politics of distraction (e.g., class size, vouchers, charter schools, grade retention, summer school).
Empirical Support × Relevance (Summary Matrix)
- Strong support, high relevance: preferred outcome; implementation priority
- Weak support, high relevance: try out and monitor impact
- Strong support, low relevance: adapt practice & monitor impact
- Weak support, low relevance: don’t implement
Formative evaluation = .90
Feedback = .73
Spaced practice = .71
Effectiveness Cost Ratio = Effect Size / Cost Per Student
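For example, using hypothetical per-student costs (the costs are illustrative, not from the presentation): a practice with an effect size of 0.90 that costs $45 per student yields 0.90 / 45 = 0.02, while a practice with an effect size of 0.55 that costs $5 per student yields 0.55 / 5 = 0.11. On this metric, the cheaper practice is roughly five times more cost-effective despite its smaller effect size.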
Impact on Practice × Effort (Summary Matrix)
- High impact, low effort: preferred option; implementation priority
- Low impact, low effort: try out; monitor effects
- High impact, high effort: positive outcome; cost-benefit analysis
- Low impact, high effort: don’t implement
Consider Interteaching
Interteaching Process
1. Provide students with a preparation guide in advance.
2. Students work in pairs or small groups for 1/2 to 2/3 of class time.
3. Students complete interteaching record forms.
4. The instructor reviews the records & develops clarifying lectures.
Nature of the Problem
In education, innovations come and go in months (Latham, 1988). As Adelman & Taylor (2003) observe, sustainability should optimally be a focus from the day a project is implemented, but with most projects the pressure of just becoming operational often postpones such a focus until well into the second year.
Excellent Evidence for What Doesn’t Work
- Disseminating information alone (research findings, mailings, & practice guidelines)
- Training alone, no matter how well done, does not lead to successful implementation
- Implementation by edict/accountability
- Implementation by “following the money”
- Implementation without changing support roles & responsibilities (Fixsen et al., 2008)
Why Such a Short Life Span?
- High effort: the innovation is more difficult than expected, causes too much change, or takes too much time.
- Poor system design: supporters leave, personnel lack training, external funds run out, and supervision is inadequate.
- No accountability: there are no consequences for early termination.
Even Well-Tested Programs Fail to Sustain
Elliott & Mihalic (2004) reviewed replications of Blueprints Model Programs (violence prevention and drug prevention programs) in community settings. Critical elements included:
- Site readiness: a well-connected local champion and strong administrative support
- Solid training: adhere to requirements for training, skills, and education; hire all staff before scheduling training
- Good technical assistance
- Fidelity monitoring
- Some measure of sustainability
Cultural Analysis and Sustainability
From Diffusion of Innovations (Rogers, 2003):
- Diffusion is a kind of social change, defined as the process by which alteration occurs in the structure and function of a social system. When new ideas are invented, diffused, and adopted or rejected, leading to certain consequences, social change occurs.
- Diffusion of innovation is a social process, even more than a technical matter.
- The rate of adoption of an innovation is a function of its compatibility with the values, beliefs, and past experiences of the individuals in the social system.
Cultural Analysis and Sustainability
Harris (1979): practices are adopted and maintained to the extent that they have favorable, fundamental outcomes at a lower cost than alternative practices. Fundamental outcomes are subsistence and survival.
Important Funding Outcomes for Cultural Institutions
- Schools: average daily attendance
- Schools: unit cost for a classroom
- Special education: number of students identified
- Special education services: often specified as minutes per session or sessions per week
- Mental health services: number of clients seen per unit of time
All of these are process measures rather than outcome measures.
Implications of Current Measures
- If the key outcome is survival of a cultural practice, then innovations in service must accomplish these outcomes at a much lower cost than current practice.
- Nothing in the current unit of analysis specifies effectiveness as a critical dimension of the practice.
How Can We Increase Sustainability of Practices?
- When developing innovative practices, demonstrate how they address basic funding outcomes for schools.
- Monitor performance outcomes. Even though these are not directly tied to fundamental outcomes, the larger culture expects schools to educate students in a safe environment.
- Find champions who are part of the system. A champion should control important reinforcers for others within the system and needs to plan on “sticking around.”
How Can We Increase Sustainability of Practices? (continued)
- Provide proactive technical assistance: help solve the real problems of implementation.
- Monitor the integrity of implementation. Without monitoring, the system is likely to drift back to previous practices.
- Anticipate 3-5 years before the system is fully operational; this underscores the need to plan for multigenerational support.
- Use external funding and support with extreme caution.
Questions?
What considerations should participants think about to create coordinated & comprehensive fieldwork in their own contexts?