WELCOME Oregon Scaling-up EBISS Utilizing Implementation Science Frameworks to Ensure Effective Behavioral & Instructional Support Systems (EBISS) January/February 2016 Pendleton, Vale, and Redmond, Oregon
Introductions Presenters and Support Staff Martha Buenrostro, Ph.D. Oregon Department of Education Marick Tedesco, Ph.D. State Transformation Specialist Scott Perry Systematic Attendance Improvement Sara Falcon Oregon EBISS Initiative Support
Goals for Today 1. Provide a review of the critical features of Effective Behavioral and Instructional Support Systems (EBISS) 2. Introduce key concepts of Implementation Science (IS) 3. Develop understanding of the frameworks associated with IS 4. Identify next steps for your target initiative
Introductions Districts District Number of team members Team member roles
Webinar Content Stages of Implementation Staff Competency (SISEP SELF ASSESSMENT) Organizational Supports Performance Assessment Putting it all together
Agenda
8:00 – 8:30 Registration and Pre-Test
8:30 – 9:15 Review EBISS Structures – Defining the “What” Activity
9:15 – 10:00 Implementation Science: Frameworks Overview
10:00 – 10:15 Break
10:15 – 12:00 IS Stages and Drivers: Activity 2 Stages and Drivers Sort
12:00 – 1:00 Lunch
1:00 – 2:30 Competency Drivers Overview: Activity 3 Self-Assessment and Action Planning
2:30 – 2:45 Break and Post-Test
2:45 – 3:30 Team Time: Action Planning Continued
3:30 – 4:00 Next Steps/Evaluations
Active Implementation Frameworks WHEN Stages WHO Teams HOW Drivers WHAT Intervention HOW Cycles
Implementation Frameworks WHEN Stages WHO Teams HOW Drivers WHAT Intervention HOW Cycles
Implementation Teams Support the full, effective, and sustained use of effective instruction and behavior methods. Linked Implementation Teams define an infrastructure to help assure dramatically and consistently improved student outcomes.
Task: Implementation Teams Self Assessment
Task 1 Instructions
Usable Interventions WHAT Intervention
Usable Interventions The “What,” a.k.a. the “lasagna” we discussed in Meeting 1.
Usable Interventions Are effective and well-operationalized. Well-operationalized interventions can be taught and coached so educators can use them as intended (with fidelity). An intervention needs to be teachable, learnable, doable, and readily assessed in practice if it is to be used effectively to reach all students who could benefit.
The following criteria need to be in place to ensure that your intervention is usable: Clear description of the program Clear essential functions that define the program Operational definitions of essential functions Practical performance assessment
The “What” – Usable Interventions To be usable, it’s necessary to have sufficient detail about an intervention. With detail, you can train educators to implement it with fidelity, replicate it across multiple settings and measure the use of the intervention. So, an intervention needs to be teachable, learnable, doable, and be readily assessed in practice.
Clear Essential Functions The speed and effectiveness of implementation may depend upon knowing exactly what has to be in place to achieve the desired results for students. Knowing the core intervention components may allow for more efficient and cost effective implementation, and lead to confident decisions about what can be adapted to suit your school or district. Clear essential functions that define the program, or core components, include a clear description of the features that must be present to say that a program exists in a given location.
Practical Performance Assessment Consider these features when identifying current or potential performance assessment: The performance assessment relates to the program philosophy, values, principles and essential functions specified in the Practice Profiles The performance assessment is practical and can be done repeatedly in the context of typical educational systems There is evidence that the program is effective when used as intended The performance assessment is highly correlated with intended outcomes for students
Activity 1: Usable Interventions Develop a description of the intervention that reflects values, principles, and expected outcomes: Identify the essential functional features of the program. Discuss how your team will know that each of these functional features is in place. What would it look like in a school or district? How might fidelity and performance assessment data be captured?
Activity 1: Example Sample: Lighthouse School District values a high-quality education to ensure every student experiences success. In order for students to develop mastery, they must acquire component skills, practice integrating them, and know when to apply what they have learned. To support students, we endorse effective teaching that aligns three components of instruction: learning objectives, assessments, and instructional activities. To assist with advancing student and adult outcomes in our district, we will invest in training and coaching to support the following functional features of the program: data-based problem solving, development of executable plans, articulation of the support provided to ensure plans are in place, and measurable outcomes at every level.
Activity 1 Share Out Re-Group into teams represented by your playing card #. Each person shares: 1. Description 2. Essential functions 3. One way fidelity or performance data might be captured.
Implementation Drivers The key components of capacity that enable the success of innovations in practice. Implementation Drivers assure development of relevant competencies, necessary organization supports, and engaged leadership.
[Implementation Drivers diagram, © Fixsen & Blase, 2008: Competency Drivers (Performance Assessment/Fidelity, Coaching, Training, Selection) and Organization Drivers (Systems Intervention, Facilitative Administration, Decision Support Data System) rest on a base of Leadership (Technical and Adaptive), supporting consistent use of educational innovations and improved educational outcomes.]
There are three categories of Implementation Drivers: Competency Drivers – mechanisms to develop, improve, and sustain one’s ability to implement an intervention as intended in order to benefit students. Organization Drivers – mechanisms to create and sustain hospitable organizational and system environments for effective educational services. Leadership Drivers – the right leadership strategies for different types of leadership challenges; these challenges often emerge as part of the change management process needed to make decisions, provide guidance, and support organization functioning.
Implementation Stages Outline the integrated, non-linear process of deciding to use an effective intervention and finally having it fully in place to realize the promised outcomes. Active implementation stages are Exploration, Installation, Initial Implementation, and Full Implementation.
Stages Implementation Takes Time: 2 – 4 Years EXPLORATION INSTALLATION INITIAL IMPLEMENTATION FULL IMPLEMENTATION
Improvement Cycles Support systematic and intentional change. Improvement Cycles are based on the Plan, Do, Study, Act (PDSA) process for rapidly changing methods, usability testing for changing interventions and organization supports, and practice-policy communication cycles for changing systems to enable continual improvement in impact and efficiency.
Active Implementation What about Improvement Cycles? HOW Cycles
PDSA Cycles: Trial & Learning Plan – Decide what to do. Do – Do it (be sure you did it as intended). Study – Look at the results. Act – Make adjustments. Cycle – Do it over and over again until the intended benefits are realized. Shewhart (1924); Deming & Juran (1948); Six-Sigma (1990)
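For teams who find it helpful to see the cycle written out as a loop, here is a minimal sketch, assuming hypothetical plan/do/study/act steps and a numeric target supplied by the team; it is an illustration only, not part of the EBISS or NIRN materials.

```python
# Illustrative sketch of a PDSA (Plan-Do-Study-Act) improvement loop.
# The plan/do/study/act callables and the numeric target are hypothetical
# placeholders provided by the caller; they are not defined in the source deck.

def run_pdsa(plan, do, study, act, target, max_cycles=10):
    """Repeat Plan-Do-Study-Act until the studied result meets the target."""
    result = None
    for cycle in range(1, max_cycles + 1):
        change = plan()        # Plan: decide what to do
        data = do(change)      # Do: do it (and confirm it was done as intended)
        result = study(data)   # Study: look at the results
        if result >= target:   # Intended benefit realized; stop cycling
            return cycle, result
        act(result)            # Act: make adjustments before the next cycle
    return max_cycles, result
```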
BREAK
Developing the Capacity to Implement Effectively “A serious deficiency is the lack of expertise to implement best practices and innovations effectively and efficiently to improve student outcomes.” Rhim, Kowal, Hassel, & Hassel (2007)
The Challenge: Recognizing Gaps Science to Service Gap What is known to be effective is not what is selected to help students Implementation Gap What is selected is not used with fidelity and good outcomes What is used with fidelity is not sustained for a useful period of time What is used with fidelity is not used on a scale sufficient to broadly impact student outcomes
Implementation Science “In theory there is no difference between theory and practice …in practice there is.” Variously attributed to Jan L. A. van de Snepscheut, Albert Einstein, or Yogi Berra
Implementation Math Effective Interventions (the “WHAT”) × Effective Implementation (the “HOW”) = Positive Outcomes for Students (the “WHY”)
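One way to read this “implementation math” is multiplicatively: if either factor is weak, the expected benefit collapses. The toy sketch below illustrates that reading with hypothetical 0–1 ratings; the function and scores are illustrative assumptions, not a validated metric from the presenters.

```python
# Toy illustration of the multiplicative reading of "implementation math".
# The 0-1 ratings below are hypothetical, not a validated scoring model.

def expected_benefit(intervention_effectiveness: float, implementation_quality: float) -> float:
    """If either factor is near zero, the expected benefit is near zero."""
    return intervention_effectiveness * implementation_quality

print(expected_benefit(0.9, 0.2))  # strong intervention, weak implementation -> 0.18
print(expected_benefit(0.9, 0.9))  # strong intervention, strong implementation -> 0.81
```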
Plan for Change It is not a “school problem” District is the point of entry for sustainable school improvement Use short-term infusion of resources Establish long-term, district-based capacity for quality
Plan for Change: “Making It Happen” To successfully implement and sustain evidence-based and evidence-informed interventions, we need to know: WHAT to do What is the intervention (e.g. effective instruction, effective assessment)? HOW to do it Active and effective implementation and sustainability frameworks (e.g. strategies to change and maintain behavior of adults) WHO will do it Organized, purposeful, & active implementation support from linked implementation teams
Stages of Implementation Purposeful matching of critical implementation activities to the appropriate stage of the process Stages
Implementation Drivers Help to develop, improve, and sustain educators’ competence and confidence to implement effective educational practices and supports. Help ensure sustainability and improvement at the organization and systems level Help guide leaders to use the right leadership strategies for the situation Drivers
Stages AND Drivers Implementation Takes Time: 2 – 4 Years EXPLORATION INSTALLATION INITIAL IMPLEMENTATION FULL IMPLEMENTATION “DRIVERS”
Stages of Implementation EXPLORATION Competency Drivers Organization Drivers Leadership Drivers Integrated & Compensatory —Fixsen, Naoom, Blase, Friedman, & Wallace, 2005 “Pay now or Pay later”
Exploration/Adoption (Decision to commit to adopting the program/practices and supporting successful implementation) Questions to Consider: What is it? How will it impact us? Do we agree with the philosophy? Should we do it?
Stages of Implementation Competency Drivers Organization Drivers Leadership Drivers Integrated & Compensatory INSTALLATION —Fixsen, Naoom, Blase, Friedman, & Wallace, 2005 “If you build it, they will come”... but you actually have to build it!
Installation (Set up the infrastructure so that successful implementation can take place and be supported.) Establish a team to lead the process, and data systems to guide implementation efforts. Questions to Consider: How do we do it? Do we have the materials, training, time, and support to do it right? How can we master the new skills and fit it all in?
Stages of Implementation Fixsen, Naoom, Blase, Friedman, & Wallace, 2005 EXPLORATION INSTALLATION INITIAL IMPLEMENTATION “Get Started, then Get Better.” Competency Drivers Organization Drivers Leadership Drivers Integrated & Compensatory
Initial Implementation (Try out the practices, work through problems, work out the details, learn, and improve before expanding.) Example: Try it out with a grade level or specific location. Questions to Consider: Is it working? Are we doing it right?
Stages of Implementation 2 - 4 Years Fixsen, Naoom, Blase, Friedman, & Wallace, 2005 FULL IMPLEMENTATION “The only thing worse than failing and not knowing why you failed, is succeeding and not knowing why you succeeded.”
Full Implementation (Expand the practice/program to other locations, times, and individuals.) Example: whole-school roll-out of the program. Questions to Consider: It’s working well; do we have supports in place to involve more people? Are we ready to focus on other tiers of supports? Full Implementation – Sustainability (Make it easier and more efficient; embed within current practices.) Questions to Consider: It’s working fine, but how do others do it? Is there a way to make it better? How do we ensure this sustains over time and through staff changes?
Stages of Implementation Activity Purpose: Help a team or individual plan for and/or assess the use of stage-based activities to improve the success of implementation efforts for EBPs or evidence-informed innovations (action planning/anticipatory guidance). Can be used to self-assess current stage-related activities (e.g., “We are in the midst of Exploration”) or past efforts related to a stage (e.g., “We just completed most of Installation. How did we do? What did we miss?”) (manage expectations).
Activity Each participant (individual or team) will receive: A blank “Stages of Implementation” chart An envelope containing color-coded cards representing a set of activities aligned with a Driver on the chart (each color is associated with a stage) Teams will work with one stage (color) at a time and will identify the driver with which each set of activities aligns Practice: Exploration Column Chart Completion: Installation, Initial Implementation, and Full Implementation
Reflection Implementation Stages Approximately what stage of implementation are you in for the selected EBISS initiative? Within that stage, which Driver-based activities have you initiated? Of those activities that you initiated, evaluate the next steps needed to move this driver forward. Identify a driver where work has not been initiated. How important is this Driver in promoting fidelity and/or positive outcomes? What are the facilitators and barriers to engaging in the associated installation stage-based work? Supporting New Ways of Work
Continued Practice Implementation Stages What are two things your team could do tomorrow to assess your current infrastructure? Based on your current stage, back up one stage to see if all of the activities were conducted. If so, move forward with planning activities for the next stage. If not, go back and identify whether the activities that were passed over are critical for the success of your implementation. Planning for New Ways of Work
Implementation Drivers are Dynamic! A key feature of Implementation Drivers is their integrated and compensatory nature: Integration – means that the philosophy, goals, knowledge and skills related to the program or practice are consistently and thoughtfully expressed in each of the Implementation Drivers. Compensatory – means that the skills and abilities not acquired or supported through one driver can be compensated for by the use of another driver.
[Implementation Drivers diagram, © Fixsen & Blase, 2008, labeled “Integrated & Compensatory”: Competency Drivers (Performance Assessment/Fidelity, Coaching, Training, Selection) and Organization Drivers (Systems Intervention, Facilitative Administration, Decision Support Data System) on a base of Leadership (Technical and Adaptive), supporting consistent use of educational innovations and improved educational outcomes.]
[Implementation Drivers diagram detail, © Fixsen & Blase, 2008: the Competency Drivers – Performance Assessment (Fidelity), Coaching, Training, and Selection.]
[Implementation Drivers diagram repeated, © Fixsen & Blase, 2008: Competency Drivers, Organization Drivers, and Leadership supporting consistent use of educational innovations and improved educational outcomes.]
Performance Assessment Purposes “Are we doing what we said we would do?” Measure fidelity; motivate implementation; reinforce staff and build on strengths; support interpretation of outcome data; and provide feedback on the functioning of recruitment and selection practices, training programs (pre- and in-service), and supervision and coaching systems. Implementation Drivers Competency Drivers
Performance Assessment Challenges Drivers are ‘in service’ to a defined “it”: no definition of the “it,” no definition of ‘fidelity’ to the “it.” What’s the way forward? Performance assessment processes are weak (content, context, no competency indicators). What’s the way forward? Blaming the teacher. What’s the way forward? Competency Drivers
Performance Assessment: Fidelity Fidelity assessment refers to measuring the degree to which teachers or staff are able to use the intervention or instructional practices as intended. Fidelity assessment measures the extent to which an innovation is implemented as intended. Did we do what we said we would do? Assessing fidelity at the teacher/practitioner level is imperative to interpreting outcomes. If we don’t assess fidelity, then we cannot: be sure an intervention was actually used, attribute outcomes to the use of the intervention, or know what to focus on to improve.
Performance Assessment: Fidelity If outcomes are not what we’d hoped for, but we have no fidelity data, it’s difficult to develop an improvement plan. Are results poor because we chose the wrong intervention, or because the intervention is not yet being used as intended? We need to know the answers to these questions in order to create a functional improvement plan. Assessing fidelity also provides direct feedback regarding how well the other Implementation Drivers are functioning. Fidelity data and information, as well as intervention outcomes, are a direct reflection of how well the Competency, Organization, and Leadership Drivers are functioning.
Three critical features of fidelity assessments are: Context – pre-requisite conditions that need to be in place regarding setting, qualifications, and preparation. Content – the extent to which the required core content is used, referenced, monitored, or accessed by the teacher in his or her work and/or activity. Competence – the extent to which the core content and competencies are skillfully modeled, relevant feedback is provided, and performance is sensitively reviewed. To produce useful information for improving practices and outcomes, fidelity assessments should be frequent, relevant, and actionable.
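As a concrete, purely hypothetical illustration of turning the three features above into a frequent and actionable measure, the sketch below scores a small fidelity checklist by feature. The checklist items, the equal weighting, and the percent-met scoring are illustrative assumptions, not an instrument from the EBISS materials.

```python
# Hypothetical sketch: scoring a simple fidelity checklist across the three
# features named above (context, content, competence). Item names and the
# percent-met scoring are illustrative assumptions, not a validated instrument.

checklist = {
    "context":    {"trained_staff_in_place": True,  "protected_meeting_time": False},
    "content":    {"core_components_used": True,    "required_materials_accessed": True},
    "competence": {"skills_modeled": True,          "relevant_feedback_provided": False},
}

def fidelity_scores(items_by_feature):
    """Return overall percent of items met and the percent met per feature."""
    by_feature = {
        feature: 100 * sum(items.values()) / len(items)
        for feature, items in items_by_feature.items()
    }
    all_items = [met for items in items_by_feature.values() for met in items.values()]
    overall = 100 * sum(all_items) / len(all_items)
    return overall, by_feature

overall, per_feature = fidelity_scores(checklist)
print(f"Overall fidelity: {overall:.0f}%")  # 67% in this example
print(per_feature)                          # shows which feature needs attention
```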
Performance Assessment: Task and Activity Utilize the template to self-evaluate the degree to which the Performance assessment driver best practices are currently in place in your district/program Identify two items to prioritize for Action Planning List potential tasks to develop this component Assign individuals Develop a timeline
[Competency Drivers diagram, © Fixsen & Blase, 2008, focusing on the Training driver: Performance Assessment (Fidelity), Coaching, Training, Selection.]
Training Purposes Continue “Buy-in” process Knowledge acquisition Skill Development Form a ‘community’ Get started…get better Implementation Drivers Competency Drivers
Training Driver Challenges “But our staff deserve professional development opportunities – we have to trust them to make use of the information.” What’s the way forward? “Training events aren’t meeting our expectations for supporting implementation!” (e.g. no pre/post test, no practicing skills) “Now what?” What’s the way forward? Implementation Drivers Competency Drivers
Training: Task and Activity Utilize the template to self-evaluate the degree to which the Training driver best practices are currently in place in your district/program Identify two items to prioritize for Action Planning List potential tasks to develop this component Assign individuals Develop a timeline
[Competency Drivers diagram, © Fixsen & Blase, 2008, focusing on the Coaching driver: Performance Assessment (Fidelity), Coaching, Training, Selection.]
Coaching Purposes Coaching is a necessary component for promoting teacher confidence and ensuring competence. Coaching is defined as regular, embedded professional development designed to help teachers and staff use the program or innovation as intended. Implementation Drivers Competency Drivers
Coaching Common Challenges Discomfort with ‘active skill development’ – “Let’s reflect.” Multiple coaches for multiple initiatives. Lack of a PDSA process – acts of random advice. Process skills weighted more heavily than innovation knowledge, and vice versa. Resources – who will do this? How will we fund it? Pair up, pick one (or create your own), and discuss ideas for addressing the challenge. Implementation Drivers Competency Drivers
Training and Coaching Implementation Drivers (Joyce and Showers, 2002)
OUTCOMES: % of participants who demonstrate knowledge, demonstrate new skills in a training setting, and use new skills in the classroom
Training components | Knowledge | Skill demonstration | Use in the classroom
Theory and discussion | 10% | 5% | 0%
…+ Demonstration in training | 30% | 20% | 0%
…+ Practice & feedback in training | 60% | 60% | 5%
…+ Coaching in classroom | 95% | 95% | 95%
Coaching: Task and Activity Utilize the template to self-evaluate the degree to which the Coaching driver best practices are currently in place in your district/program Identify two items to prioritize for Action Planning List potential tasks to develop this component Assign individuals Develop a timeline
[Competency Drivers diagram, © Fixsen & Blase, 2008, focusing on the Selection driver: Performance Assessment (Fidelity), Coaching, Training, Selection.]
Selection Purposes Select for the “tough to teach traits” Screen for pre-requisites Set expectations for new hires – use of data, coaching Allow for mutual selection Improve likelihood of retention after “investment” Improve likelihood that training, coaching, and supervision will result in implementation Implementation Drivers Competency Drivers
Selection Driver Challenges “We have who we have …this doesn’t apply to us!” What’s the way forward? Implementation Drivers Competency Drivers
Selection: Activity Utilize the checklist to determine the Staff Selection best practices that are currently in place in your district/program We will look further into the Staff Selection development in our next meeting!
Implementation Drivers A key feature of Implementation Drivers is their integrated and compensatory nature: Integration – means that the philosophy, goals, knowledge and skills related to the program or practice are consistently and thoughtfully expressed in each of the Implementation Drivers. Compensatory – means that the skills and abilities not acquired or supported through one driver can be compensated for by the use of another driver.
Reflection and Share Out Are there any ways that your current implementation activities compensate for deficiencies in any of the drivers?
Next Steps and Timelines
Questions and Answers Presentation Materials & Archived Webinars: http://blogs.uoregon.edu/oregonscalingupebissblog/
Team Time Activity: Action Planning Continued