1
Systemic Improvement:
Building Implementation Capacity to Move the Indicators
Kathleen Ryan Jackson, SISEP
Karen Blase, SISEP, National Implementation Research Network
Veronica Sullivan, SSIP Co-Lead
Amanda Waldroup, SSIP Co-Lead
Division of Learning Services, Kentucky Department of Education
SSIP, January 12, 2015
© 2012 Karen A. Blase and Dean L. Fixsen
2
Implementation PDSA
PLAN – The SSIP Team will use the Independent Learning Plan and complete 3 modules by November 17th.
DO – Goal: complete the modules by November 17th.
STUDY – Percent of members completing each module by November 17th:
20% Getting Started
20% Module 1
20% Module 6
ACT – Given these results, what is the new PLAN? How can we be more supportive of the SSIP Team? What were the barriers? Gather advice from those who completed one or more of the modules. Turn to your partner and discuss next steps.
3
Implementation PDSA
PLAN – KDE will send calendar reminders for completion of 3 modules by January 9th.
DO – Goal: 80% complete the modules – do a closed-eye vote!
STUDY – Percent of members completing modules by January 9th:
___ % Getting Started (Dec 17)
___ % Module 1 (Jan 5)
___ % Module 6 (Jan 9)
ACT – Given these results, is there a new PLAN?
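As an illustration of the STUDY step, here is a minimal sketch in Python (not part of the original materials; the team roster and completion records below are hypothetical) of how the percentage of members completing each module might be tallied:

```python
# Illustrative sketch only: computes STUDY-step percentages for the PDSA above.
# The roster and completion records are made up for this example.
team = ["member_a", "member_b", "member_c", "member_d", "member_e"]

completions = {
    "Getting Started": {"member_a", "member_b", "member_c", "member_d"},
    "Module 1":        {"member_a", "member_b", "member_c"},
    "Module 6":        {"member_a"},
}

for module, done in completions.items():
    # Count only completions by people actually on the roster.
    pct = 100 * len(done & set(team)) / len(team)
    print(f"{pct:.0f}% {module}")
# Prints 80%, 60%, and 20% for the three modules in this made-up example.
```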
4
Why: Capacity to Apply the Active Implementation Frameworks
WHO – Teams
WHEN – Stages
WHAT – Effective & Usable Interventions
WHY & HOW – Cycles
HOW – Drivers
SISEP's goal is to build capacity from the SEA to the classroom to close the GAP. This is not a "recipe," but a set of important and necessary ingredients!
We use the Active Implementation Frameworks to guide the implementation process. There are five of these frameworks: Usable Interventions help ensure the selection of a program that is actually usable in practice; Implementation Drivers and Improvement Cycles guide how to implement these programs; Implementation Teams are the individuals who guide the implementation process and ensure clear communication throughout the organization; and Implementation Stages tell us which tasks should occur when in the process. Today we'll give you a quick overview of these frameworks, but we will focus our time and planning on Implementation Teams and Organization Drivers.
"We tend to focus on snapshots of isolated parts of the system and wonder why our deepest problems never seem to get solved." —Senge, 1990
5
WHO & HOW: Effective Implementation
Formula for Success
WHAT – Effective & Usable Innovations
WHO & HOW – Effective Implementation
WHERE – Enabling Contexts
WHY – Educationally Significant Outcomes
© 2012 Dean Fixsen and Karen Blase, National Implementation Research Network
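The NIRN materials behind this slide commonly present the Formula for Success as a multiplicative relationship: if any one factor is weak or missing, educationally significant outcomes do not follow. A rough rendering (the notation is ours, not the deck's; assumes the amsmath package for \text):

```latex
% Rough rendering of the Formula for Success; notation is ours, not the deck's.
\[
  \underbrace{\text{Effective \& Usable Innovations}}_{\text{WHAT}}
  \times
  \underbrace{\text{Effective Implementation}}_{\text{WHO \& HOW}}
  \times
  \underbrace{\text{Enabling Contexts}}_{\text{WHERE}}
  =
  \underbrace{\text{Educationally Significant Outcomes}}_{\text{WHY}}
\]
```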
6
Review and Discussion: Module 6
Usable Interventions: An intervention needs to be teachable, learnable, doable, and readily assessed in practice.
7
Performance Assessment
Usable Interventions:
Essential Functions
Clear Description
Operational Definitions
Performance Assessment
As we talk through each of these, think about Activity 1.1 – Getting Started with Usable Innovations – that you completed.
8
Usable Interventions – Essential Functions
A clear description of the features (core components) that must be present to say that an education innovation exists in a given location. Essential functions are sometimes called core education innovation components, active ingredients, or practice elements.
The speed and effectiveness of implementation may depend on knowing exactly what has to be in place to achieve the desired results for students, families, and communities; no more, and no less. Not knowing the essential innovation components leads to time and resources wasted on attempting to implement a variety of (if only we knew) nonfunctional elements.
9
Clear Description: Philosophy, Values and Principles
Usable Interventions – Clear Description
Philosophy, Values and Principles; Inclusion and Exclusion Criteria.
"Innovations are rarely neutral. They tend to advance or enhance the position of certain groups and disadvantage or damage the position of others." – Ball, 1987, p. 32
The philosophy, values, and principles that underlie an education innovation provide guidance for all education decisions and are used to promote consistency, integrity, and sustainable effort across all districts and schools.
Clear inclusion and exclusion criteria define the population for which the education innovation is intended (e.g., middle school algebra students who have passed advanced math). The criteria define who is most likely to benefit when the education innovation is used as intended.
Not every education innovation is a good fit with the values and philosophy of a district or school. In addition, many innovations were developed with particular populations of students, and applications of the innovation with different populations may not be equally effective. Thus, a good description of an education innovation and its foundations is required so that leaders and others can make informed choices about what to use.
10
Operational Definitions
Usable Interventions – Operational Definitions
Describe each core component in terms that can be taught, learned, done in practice, and assessed in practice.
Practice Profiles: practice profiles describe the core activities that allow an education innovation to be teachable, learnable, and doable in practice, and they promote consistency across teachers and staff at the level of actual interactions with students.
Knowing the essential functions (criterion #2) is a good start. The next step is to express each essential component in terms that can be taught, learned, done in practice, and assessed in practice. The methods for developing operational descriptions (practice profiles) in education were established by Gene Hall and Shirley Hord as part of the Concerns-Based Adoption Model (called innovation configurations in CBAM).
11
Performance Assessment
Usable Interventions – Performance Assessment
Provides evidence that the program is being used as intended (fidelity) and is resulting in the desired outcomes; practical enough to repeat time and time again.
The performance assessment relates to the education innovation's philosophy, values, and principles; essential functions; and core activities specified in the practice profiles. The performance assessment needs to be a feasible method (e.g., 10-minute classroom walkthrough observation ratings) that can be done repeatedly in the context of typical education settings.
Evidence that the education innovation is effective when used as intended: there are data to show the innovation is effective; a performance (fidelity) assessment is available to indicate the presence and strength of the innovation in practice; and the performance assessment results are highly correlated with intended outcomes for students, families, and society.
How well are teachers and staff saying and doing those things that are in keeping with the essential functions and with the intentions behind the education innovation? If performance assessments do not exist, this becomes a developmental task for a skilled Implementation Team. Note that the criterion for performance assessment includes the specification that a performance assessment should be highly predictive of intended outcomes: if educators use an innovation as intended, then students will benefit as intended.
12
Implications for Sustainability and Scalability
Usable Interventions – Implications for Sustainability and Scalability
We tend to over-estimate how well defined "it" is; we find out when we start to install "it."
Help districts and schools choose wisely based on:
Needs of students
Best evidence
Fit and resources required
Readiness and resources for replication
13
Implications for Sustainability and Scalability
Usable Interventions – Implications for Sustainability and Scalability
Help districts and schools "operationalize" the WHAT: Practice Profiles.
Help districts and schools "make space" for the new work: supportive policies and practices.
14
Usable Interventions: IPAC Team
Now – Initiative Inventory (Hexagon Tool)
March – Practice Profiles
April – Usable Interventions
Amanda, you pick up here.
15
*Check-in: Ask specific questions to solicit qualitative data and identify adaptive challenges. Respond with support.
16
The Hexagon: An EBP Exploration Tool
The "Hexagon" can be used as a planning tool to evaluate evidence-based programs and practices during the Exploration Stage of implementation. Download available at:
NEED – Need in school, district, state: academic and socially significant issues; parent and community perceptions of need; data indicating need.
FIT – Fit with current initiatives: school, district, and state priorities; organizational structures; community values.
RESOURCES – Resource availability and supports for: curricula and classroom; technology supports (IT dept.); staffing; training; data systems; coaching and supervision; administration and system.
EVIDENCE – Outcomes: is it worth it? Fidelity data; cost-effectiveness data; number of studies; population similarities; diverse cultural groups; efficacy or effectiveness.
READINESS – Readiness for replication: qualified purveyor; expert or TA available; mature sites to observe; several replications; how well is it operationalized? Are the Implementation Drivers operationalized?
CAPACITY – Capacity to implement: staff meet minimum qualifications; able to sustain the Implementation Drivers financially and structurally; buy-in process operationalized (practitioners, families).
For each EBP, rate the six factors – Need, Fit, Resource Availability, Evidence, Readiness for Replication, Capacity to Implement – as High, Medium, or Low on a 5-point scale (High = 5; Medium = 3; Low = 1; midpoints can be used and scored as a 2 or 4) and record a Total Score.
© Karen Blase, Laurel Kiser, & Melissa Van Dyke. Adapted from work by Laurel J. Kiser, Michelle Zabel, Albert A. Zachik, and Joan Smith at the University of Maryland. © 2012 Karen A. Blase and Dean L. Fixsen
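To make the scoring concrete, here is a minimal sketch in Python (not part of the original tool; the data structure and example ratings are ours) of how a team might tally Hexagon ratings into a Total Score for one EBP:

```python
# Illustrative sketch only: tallies Hexagon Tool ratings for one EBP.
# The six factors come from the tool; the example ratings are made up.
FACTORS = [
    "Need", "Fit", "Resource Availability",
    "Evidence", "Readiness for Replication", "Capacity to Implement",
]

# Tool's 5-point scale: High = 5, Medium = 3, Low = 1 (midpoints 2 and 4 allowed).
example_ratings = {
    "Need": 5,
    "Fit": 4,
    "Resource Availability": 3,
    "Evidence": 3,
    "Readiness for Replication": 2,
    "Capacity to Implement": 3,
}

def total_score(ratings: dict) -> int:
    """Sum the six factor ratings into the Total Score row of the score sheet."""
    missing = [f for f in FACTORS if f not in ratings]
    if missing:
        raise ValueError(f"Missing ratings for: {missing}")
    return sum(ratings[f] for f in FACTORS)

print(total_score(example_ratings))  # 20 for the example ratings above
```

In practice the rating itself is the team discussion; the tally is only the last step, so a paper score sheet works just as well as code like this.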
17
IPAC: Vetting Kentucky’s Practices
Brainstorm: What are the exclusionary criteria for our math interventions?
18
IPAC: Vetting Kentucky’s Practices
Activity:
1. With a partner or partners, explore an intervention that you are familiar with from your initiative inventory. Gather information for each of the six factors.
2. Discuss the information as a group: Where are the strengths? Where are the gaps?
3. After your discussion, debrief on the process: What more do we need as a team before using this tool to vet math practices for the Initiative Inventory?
AMANDA:
19
Hexagon and Usable Interventions
Evidence: number of studies; population similarities.
Fit with current initiatives: needs of students.
Clear Description: Philosophy, Values and Principles; Inclusion and Exclusion Criteria.
AMANDA: Talk about what you know about EBP in math from your readings.
20
Review and Discussion: Module 3
TEAMS: The Cascading Logic Model – teams must be linked so the inputs at one level are the outputs at the next level.
21
Linked Implementation Teams
"We tend to focus on snapshots of isolated parts of the system and wonder why our deepest problems never seem to get solved." —Senge, 1990
School-based Implementation Teams
District Implementation Teams
Regional Implementation Teams
State-based Implementation Teams
22
From "Letting it Happen" to "Making it Happen"
Improvement in intervention outcomes:
No Implementation Team – 14% of sites in 17 years
Implementation Team – 80% of sites in 3 years
Why are implementation teams so important? Implementation Teams have been called a new lever for organization change in education (Higgins, Weiner, & Young, 2012). Research has shown that, without implementation teams that keep a focus on implementation infrastructures, it takes an average of 17 years to achieve full implementation in only 14% of sites. With the support of implementation teams, full implementation can be reached in 80% of sites in only 3 years. That difference of 14 years is the full career of a generation of students.
Sources: Fixsen, Blase, Timbers, & Wolf, 2001; Balas & Boren, 2000; Green & Seifert, 2005; Saldana & Chamberlain, 2012.
23
Shifting Accountability
Active Implementation – Shifting Accountability: Teachers → Principal and District → System
24
Discussion: Active Implementation and the Formula for Success
Good Intentions → Actual Supports → Years 1–3 Outcomes:
Every teacher trained → Fewer than 50% of the teachers received some training → Fewer than 10% of the schools used the practice as designed.
Every teacher continually supported → Fewer than 25% of the teachers received support → The vast majority of students did not benefit.
Look at slides 24 and 25. Discussion: How can Active Implementation change the comprehensive school reform data and make the Formula for Success true for GAP students and all students in KY, and for meeting your SiMR? Think about teams, usable interventions, and how they are interrelated. If:
WHO and HOW = Teams
WHAT = Usable math interventions
WHERE = Contexts supported by teams at all levels of the system (Cascading Logic Model: inputs at one level are outputs at the next level)
Comprehensive School Reform: Aladjem & Borman, 2006; Vernez, Karam, Mariano, & DeMartini, 2006
© 2012 Karen A. Blase and Dean L. Fixsen
25
WHO & HOW: Effective Implementation
Formula for Success
WHAT – Effective & Usable Innovations
WHO & HOW – Effective Implementation
WHERE – Enabling Contexts
WHY – Educationally Significant Outcomes
Discussion: WHO and HOW = Teams; WHERE = contexts supported by teams at all levels of the system.
Let's take a break!
© 2012 Dean Fixsen and Karen Blase, National Implementation Research Network
26
Independent Learning Plan
Developing Implementation Capacity: SISEP is not another initiative; it is an implementation framework that can be applied to any way of work.
Next steps: Modules 2, 4, and 5.
27
Module 2: Implementation Drivers
Implementation Drivers – best practices for the consistent use of educational innovations:
Competency Drivers – Selection, Training, Coaching, Performance Assessment (Fidelity)
Organization Drivers – Decision Support Data System, Facilitative Administration, Systems Intervention
Leadership – Technical and Adaptive
© Fixsen & Blase, 2008
28
Module 4: Stages of Implementation
Exploration (Winter 2015) – TZ Team assesses: needs, fit, resources, buy-in, readiness for replication, capacity to sustain; mutual selection.
Installation (Spring & Summer 2015) – Teams lead: acquire resources; prepare organization and staff (train, coach, assess fidelity); build structures for sustainability.
Initial Implementation (Fall 2015) – Teams lead: manage change; ensure all can use data systems; initiate Improvement Cycles; improve structures for sustainability.
Full Implementation (Winter 2016) – Leads lead: monitor and manage change; achieve fidelity and outcome benchmarks; improve; sustain and improve structures for sustainability; replicate.
Overall timeline: 2–4 years.
Stages: Are we there yet? You see the key activities of each stage here. We have a tendency to ignore the Exploration and Installation stages and jump right into Initial Implementation. Then, when the process does not produce the outcomes we expect, we blame the program rather than our lack of preparation.
29
Transformation Zone Example: KDE Exploration
Develop the implementation capacity of the Transformation Zone cooperatives (coops) and a few of their districts and schools:
KDE identifies 3 cooperatives.
Each coop identifies 2–3 districts.
Assess willingness, needs, fit, etc.
Engage in a mutual selection process.
Coops attend SISEP training once a month.
Regions attend the January 26 training at KDE (possibly with a few school representatives).
30
What: Transformation Zone
A transformation zone focuses on innovations and implementation infrastructure development. A transformation zone can be thought of as a "vertical slice" of the education system, from the classroom to the district (capital). The "slice" is small enough to be manageable but large enough to include all aspects of the system, so the "slice" can be replicated.
(c) Dean Fixsen and Karen Blase, 2012
31
Why: Transformation Zone
Cannot change everything at once (too big; too complex; too many of them and too few of us).
Cannot stop and re-tool (have to create the new in the midst of continuing the existing).
Cannot know what to do at every step (we will know it when we get there).
Fixsen, D., Blase, K., & Van Dyke, M. (2012). From ghost systems to host systems via transformation zones (pp. 3-7). Washington, DC: U.S. Department of Education, Office of Vocational and Adult Education.
(c) Dean Fixsen and Karen Blase, 2012
32
Module 5: Practice-Policy Communication Cycle
SEA Sustainability PDSA: policy enables practice; practice informs policy.
Create a hospitable system with linked communication and problem-solving protocols to support the effective WHAT.
Ensure that practice informs policy, and that policy enables better practice.
Operationalize success.
Prevent the institutionalization of errors.
Connecting policy to practice is a key aspect of reducing barriers to high-fidelity implementation. Frequent, regularly scheduled communication from the classroom to the district office and back is the major foundation of successful implementation, and the Implementation Team is the structure that can ensure this practice. Effective policy should be in place to enable good practice in the classroom, but those policies can only be effective if they are informed by the teachers themselves, and that communication needs to occur on an ongoing basis. This ongoing communication is the key to having an enabling context, or hospitable environment, in support of the work of your teachers.
As you develop these communication processes, consider the frequency carefully: if communication occurs only quarterly, you have only 3 or 4 opportunities in a school year to make adaptations, or even a course change, in the implementation of a program. How quickly would you want to know if there is a barrier to implementation fidelity? How soon would you want to know about successful strategies that can be operationalized across the district? The frequency of these communication protocols determines how responsive you can be to teacher and program needs.
Linked Implementation Teams: Regional, District, Building, Teachers, Students.
Fixsen, D., Blase, K., Metz, A., & Van Dyke, M. (2013). Statewide implementation of evidence-based programs. Exceptional Children (Special Issue), 79(2).
(c) Dean Fixsen and Karen Blase, 2004; (c) Dean Fixsen, Karen Blase, Robert Horner, George Sugai, 2008
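To make the frequency point concrete, here is a minimal sketch in Python (the school-year length and cadences are illustrative assumptions, not from the deck) of how many within-year adjustment opportunities different communication cadences allow:

```python
# Illustrative sketch only: adjustment opportunities per school year for a
# given communication cadence. A roughly 9-month school year is an assumption.
SCHOOL_YEAR_MONTHS = 9

cadences_in_months = {"quarterly": 3, "monthly": 1, "biweekly": 0.5}

for name, interval in cadences_in_months.items():
    opportunities = int(SCHOOL_YEAR_MONTHS / interval)
    print(f"{name}: about {opportunities} chances to adapt per school year")
```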
33
Common definitions build a community of practice.
–Karen Blase in conference, 2015
34
©Copyright Dean Fixsen and Karen Blase
This content is licensed under Creative Commons license CC BY-NC-ND, Attribution-NonCommercial-NoDerivs. You are free to share, copy, distribute and transmit the work under the following conditions: Attribution — You must attribute the work in the manner specified by the author or licensor (but not in any way that suggests that they endorse you or your use of the work); Noncommercial — You may not use this work for commercial purposes; No Derivative Works — You may not alter or transform this work. Any of the above conditions can be waived if you get permission from the copyright holder.