Some Considerations for Developing & Implementing Clinical Experiences Larry Maheady, PhD Exceptional Education Department SUNY Buffalo State maheadlj@buffalostate.edu June 28 & 29, 2017
What considerations should participants think about to create coordinated & comprehensive fieldwork in their own contexts?
Four Important Considerations: (1) the function of clinical experiences; (2) which practices & processes candidates should use & master; (3) how these practices can be taught effectively, efficiently, & acceptably; (4) how these changes can be sustained
Shared Vision for Clinical Experiences: professional and societal needs; EPP & candidate needs; P-12 teacher & school needs; the needs of students with and without disabilities
Make the Vision Operational by Identifying Mutually Beneficial Needs: What are P-12 schools' most pressing instructional needs, and how can WE help? Link school needs to EPPs through structured, developmentally sequenced clinical experiences. Implement the program (i.e., the clinical experiences) sequentially, require it for graduation, and collect data on P-12 students, candidates, & the program
Increase Candidate Use of Evidence-Based Practice(s)
What is Evidence-Based Practice? Two ways to think about it: (a) a practice or intervention that has met some evidentiary standard, OR (b) a broader framework for decision-making (Detrich, 2015)
Evidence-Based Practice as a Decision-Making Process: EBP sits at the intersection of the best available evidence, professional judgment, and student needs (Sackett et al., 2000). It is a decision-making approach that places emphasis on evidence to (a) guide decisions about which interventions to use and (b) evaluate the effects of an intervention (Detrich, 2008)
Which practices should candidates use?
Identifying the Most Important Practices: candidates should learn practices that have rigorous empirical support, are relevant to "prioritized" instructional needs, are used frequently in classrooms (high incidence), and are broadly applicable across content areas (high prevalence). What criteria should we use to make sure that candidates learn to use the practices that are best for their students?
Rigor-by-Relevance Decision Matrix (empirical support × relevance to prioritized needs):
Strong rigor, high relevance: preferred outcome; implement & monitor
Weak rigor, high relevance: try out; monitor effects
Strong rigor, low relevance: adapt & monitor effects
Weak rigor, low relevance: don't implement
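To make the matrix concrete, here is a minimal sketch (not part of the original slides; the function name `decide` and the string labels are illustrative assumptions) showing how the rigor-by-relevance rules could be expressed as a simple lookup:

```python
# Minimal sketch of the rigor-by-relevance decision matrix above.
# The function name and labels are illustrative, not from the presentation.

def decide(rigor: str, relevance: str) -> str:
    """Map a practice's empirical rigor and classroom relevance to a recommendation."""
    matrix = {
        ("strong", "high"): "Preferred outcome: implement & monitor",
        ("weak", "high"): "Try out; monitor effects",
        ("strong", "low"): "Adapt & monitor effects",
        ("weak", "low"): "Don't implement",
    }
    # Raises KeyError for values other than strong/weak and high/low.
    return matrix[(rigor.lower(), relevance.lower())]

print(decide("strong", "high"))  # Preferred outcome: implement & monitor
print(decide("weak", "low"))     # Don't implement
```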
Formative evaluation = .90; Feedback = .73; Spaced practice = .71
Essential or Core Practices (effect sizes):
Providing formative evaluation: 0.90
Feedback: 0.73
Spaced versus massed practice: 0.71
Meta-cognitive strategies: 0.69
Self-verbalization/self-questioning: 0.64
Concept maps: 0.57
Worked examples
Peer tutoring: 0.55
Effectiveness-Cost Ratio = Effect Size / Cost Per Student
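As a worked illustration of this ratio (a sketch only: the effect sizes come from the list above, while the per-student dollar costs are hypothetical placeholders, not figures from the presentation), practices can be ranked by how much effect they buy per dollar spent per student:

```python
# Illustrative sketch of the effectiveness-cost ratio = effect size / cost per student.
# Effect sizes are taken from the core-practices list above; the per-student costs
# are hypothetical placeholders, not data from the presentation.

practices = {
    "formative evaluation": {"effect_size": 0.90, "cost_per_student": 15.00},
    "feedback":             {"effect_size": 0.73, "cost_per_student": 5.00},
    "peer tutoring":        {"effect_size": 0.55, "cost_per_student": 2.50},
}

def effectiveness_cost_ratio(effect_size: float, cost_per_student: float) -> float:
    """Return effect-size units gained per dollar spent per student."""
    return effect_size / cost_per_student

# Rank practices from the best to the worst return per dollar.
ranked = sorted(
    practices.items(),
    key=lambda item: effectiveness_cost_ratio(
        item[1]["effect_size"], item[1]["cost_per_student"]
    ),
    reverse=True,
)

for name, data in ranked:
    ratio = effectiveness_cost_ratio(data["effect_size"], data["cost_per_student"])
    print(f"{name}: {ratio:.3f} effect-size units per dollar per student")
```

On these assumed costs, a cheap practice with a moderate effect size (peer tutoring) can outrank a more expensive practice with a larger effect size, which is the point of weighing cost alongside rigor and relevance.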
How should candidates be prepared to use practices?
Training Components and Attainment of Outcomes in Terms of Percent of Participants (outcomes by training process; Joyce & Showers, 2002)
Continuum of Options for Developing Practice: any of these options may be useful for improving practice, but some are much more useful than others. Arranged roughly from lower to higher impact and effort: case-study instruction & application papers; in-class simulations, inter-teaching, & micro-teaching; early clinical experiences & tutoring programs; student teaching; clinical year & coaching
Impact-by-Effort Decision Matrix (impact on practice × effort required):
High impact, low effort: preferred option; implement & monitor
Low impact, low effort: try out; useful for knowledge development, monitor effects
High impact, high effort: positive outcome; conduct a cost-benefit analysis
Low impact, high effort: don't implement
Interteaching Process: (1) provide students with a preparation guide in advance; (2) students work in pairs or small groups for 1/2 to 2/3 of class time; (3) students complete interteaching record forms; (4) the instructor reviews the records & develops clarifying lectures
How can we sustain changes?
A Systems Perspective: Students do well when teachers do well. Teachers do well when they are effectively supported at the building level. Principals are effective when they are supported by the district. Districts perform well when they are supported by the State Education Agency. This requires an aligned system, with data about both student performance and implementation (Detrich, 2014)
What Do We Need to Know About Effective Implementation? Implementation occurs in stages: Exploration, Installation, Initial Implementation, Full Implementation, Innovation, and Sustainability; moving through these stages typically takes 2-4 years (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005)
Take-Aways: (1) improving P-12 student outcomes should drive decision-making at all levels (classroom, building, district, & university); (2) evidence-based practice (EBP) is systematic decision-making that can improve professional practice at all levels; (3) EPPs & partners should be guided by principles of implementation science