
Building Implementation Capacity
Western RRC Part B Leadership Forum, Eugene, Oregon, October 31 - November 1, 2013
Karen A. Blase, Barbara Sims, Michelle A. Duda, Dean L. Fixsen, Jonathan Green, Ron Dughman (RRCP), Jeanna Mullins (RRCP)
Presented by Jeanna Mullins, Mid-South RRC

Developing the Capacity to Implement Effectively
"A serious deficiency is the lack of expertise to implement best practices and innovations effectively and efficiently to improve student outcomes." — Rhim, Kowal, Hassel, & Hassel (2007)

GOALS OF THE SESSION
- Increase knowledge of the Active Implementation Frameworks
- Increase familiarity with tools and processes for using the frameworks
- Identify connections to SSIP development and implementation

The Challenge
- Complex environments
- Unpredictable people
- Competing demands
- Shifting priorities
- Various points of view

The Challenge
After 5 years of turnaround work:
- Fewer than 10% of schools moved out of improvement status
- More than 90% were still "in improvement"
— Stuit (2011), Are Bad Schools Immortal?

The Challenge: Recognizing Gaps
- Science-to-Service Gap: what is known to be effective is not what is selected to help students
- Implementation Gap:
  - What is selected is not used with fidelity and good outcomes
  - What is used with fidelity is not sustained for a useful period of time
  - What is used with fidelity is not used on a scale sufficient to broadly impact student outcomes

“Implementation science is the systematic study of variables and conditions that lead to full and effective use of evidence-based programs and other effective innovations in typical human service settings.” —Blase and Fixsen, 2010 National Implementation Research Network Implementation Science

Implementation Science
Business as Usual: Impact
The best data show that these methods, when used alone, do not result in implementation as intended:
- Diffusion/dissemination of information
- Training
- Passing laws/mandates/regulations
- Providing funding/incentives
- Organization change/reorganization
They yield roughly a 5 to 10% return on investment: NECESSARY BUT NOT SUFFICIENT.

Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M. & Wallace, F. (2005). Implementation Research: A Synthesis of the Literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, The National Implementation Research Network (FMHI Publication #231). Implementation Research: A Synthesis of the Literature Implementation Science

Plan for Change
- It is not a "school problem"
- The district is the point of entry for sustainable school improvement
- Use a short-term infusion of resources
- Establish long-term, district-based capacity for quality

Plan for Change: "Making It Happen"
To successfully implement and sustain evidence-based and evidence-informed interventions, we need to know:
- WHAT to do: the intervention (e.g., effective instruction, effective assessment)
- HOW to do it: active and effective implementation and sustainability frameworks (e.g., strategies to change and maintain the behavior of adults)
- WHO will do it: organized, purposeful, and active implementation support from linked implementation teams

Q: HOW? A: Effective Implementation
- Changing the behavior of educators and administrators
- Creating the setting conditions to facilitate these changes
- Creating the processes to maintain and improve these changes in both setting conditions and the behavior of well-intentioned adults
So that students benefit.

Active Implementation Frameworks
- Usable Interventions
- Implementation Stages
- Implementation Drivers
- Improvement Cycles
- Implementation Teams

An intervention needs to be teachable, learnable, doable, and be readily assessed in practice. Usable Interventions

USABLE INTERVENTIONS
An intervention must be teachable, learnable, doable, and readily assessed in practice.

Usable Interventions: Clear Description
- Philosophy, values, and principles
- Inclusion and exclusion criteria


Usable Interventions have four components: Clear Description, Essential Functions, Operational Definitions, and Performance Assessment.


Usable Interventions: Essential Functions
- Clear description of the features that must be present to say that a program exists in a given location
- Core components


Usable Interventions: Operational Definitions
- Describe each core component in terms that can be taught, learned, done in practice, and assessed in practice
- Practice Profiles


Usable Interventions: Performance Assessment
- Provides evidence that the program is being used as intended and is resulting in the desired outcomes (fidelity)
- Practical enough to repeat time and time again


Usable Interventions: Tools You Can Use
- Hexagon Tool
- Practice Profiles

IMPLEMENTATION STAGES Purposeful matching of critical implementation activities to the stage of the process

Stages AND Drivers
Implementation takes time: 2 to 4 years across four stages (Exploration, Installation, Initial Implementation, Full Implementation), each supported by the Implementation Drivers.

Stages of Implementation: EXPLORATION
The Drivers (Competency, Organization, Leadership) are integrated and compensatory.
"Pay now or pay later." — Fixsen, Naoom, Blase, Friedman, & Wallace, 2005

Implementation Stages: Exploration Goals
- Create readiness for change: changing hearts and minds
- Examine the degree to which the proposed strategies and practices meet the needs of our state and our students
- Determine whether the strategies, practices, and implementation are desirable and feasible

Implementation Stages: Exploration Stage
What happens during the Exploration Stage?
- Determine need and identify options
- Assess "fit" and feasibility
- Identify structural and functional changes
- Promote "buy-in" for the innovation and for implementation supports
- Make recommendations (go/no go)
- Identify the Implementation "Team"

The Hexagon: An EBP Exploration Tool
The Hexagon can be used as a planning tool to evaluate evidence-based programs and practices during the Exploration Stage of implementation. Each of six areas is rated on a 5-point scale (High = 5, Medium = 3, Low = 1; midpoints can be used and scored as 2 or 4) and the ratings are summed to a total score.
- NEED in school, district, state: academically and socially significant issues; parent and community perceptions of need; data indicating need
- FIT with current initiatives: school, district, and state priorities; organizational structures; community values
- RESOURCES and supports for: curricula and classroom; technology supports (IT dept.); staffing; training; data systems; coaching and supervision; administration and system
- EVIDENCE: outcomes (is it worth it?); fidelity data; cost-effectiveness data; number of studies; population similarities; diverse cultural groups; efficacy or effectiveness
- READINESS for replication: qualified purveyor; expert or TA available; mature sites to observe; several replications; how well is it operationalized? are the Implementation Drivers operationalized?
- CAPACITY to implement: staff meet minimum qualifications; able to sustain the Implementation Drivers financially and structurally; buy-in process operationalized (practitioners, families)
Rating sheet (High/Med/Low for each area, then Total Score): Need, Fit, Resource Availability, Evidence, Readiness for Replication, Capacity to Implement.
Download available at:
© National Implementation Research Network. Adapted from work by Laurel J. Kiser, Michelle Zabel, Albert A. Zachik, and Joan Smith at the University of Maryland.
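The rating arithmetic above is simple enough to sketch in code. The six indicator names and the High = 5 / Medium = 3 / Low = 1 scale come from the slide; the helper function and the example ratings below are hypothetical, for illustration only, and are not part of the NIRN tool.

```python
# Illustrative sketch of the Hexagon Tool's 5-point scoring rubric.
# The six indicators and the scale come from the slide; this helper
# and its example ratings are invented for illustration.

INDICATORS = [
    "Need", "Fit", "Resource Availability",
    "Evidence", "Readiness for Replication", "Capacity to Implement",
]

def hexagon_total(ratings):
    """Sum six 1-5 ratings (High=5, Medium=3, Low=1; 2 and 4 are midpoints)."""
    missing = [name for name in INDICATORS if name not in ratings]
    if missing:
        raise ValueError(f"Missing ratings for: {missing}")
    for name in INDICATORS:
        if ratings[name] not in (1, 2, 3, 4, 5):
            raise ValueError(f"{name} must be rated 1-5")
    return sum(ratings[name] for name in INDICATORS)

# A hypothetical team's ratings for one candidate EBP.
example = {
    "Need": 5, "Fit": 4, "Resource Availability": 3,
    "Evidence": 3, "Readiness for Replication": 2, "Capacity to Implement": 4,
}
print(hexagon_total(example))  # prints 21 (maximum possible is 30)
```

Comparing totals across several candidate programs rated the same way is one way a team might structure a go/no-go discussion during Exploration.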

Reflection: Creating Readiness
- What role can you play in developing readiness for the EBP you have in mind?
- What are 2 things your team could do tomorrow to assess your current infrastructure?
Tool: The Hexagon Tool (Analysis of Evidence-Based Programs or Practices)
Supporting New Ways of Work

Stages of Implementation: INSTALLATION
The Drivers (Competency, Organization, Leadership) are integrated and compensatory.
"If you build it, they will come"... but you actually have to build it!
— Fixsen, Naoom, Blase, Friedman, & Wallace, 2005

Implementation Stages: Installation Goals
- Structural and functional changes are made to support implementation
- Staff selection protocols are developed
- The first "practitioners" are selected
- Training of the first cohort of practitioners is defined and initiated
- A coaching system and plans are developed
- Readiness and sustainability of data systems (e.g., fidelity, outcomes) are evaluated

Implementation Stages: Installation: What's Needed
- High-level protection, problem solving, and support
- Reduced expectations and higher costs during start-up
- Help in evolving organizational supports at every level
- Help in establishing new implementation supports for the EBP

Stages of Implementation: INITIAL IMPLEMENTATION
(Follows Exploration and Installation; the Competency, Organization, and Leadership Drivers are integrated and compensatory.)
"Get started, then get better." — Fixsen, Naoom, Blase, Friedman, & Wallace, 2005

Implementation Stages: Initial Implementation: Work Through the Awkwardness
- New skills are fragile and uncomfortable
- Implementation supports require new thinking and doing
- Organization/system change is scary
- Provide training and coaching on the evidence-based practice and on the reorganization of school roles, functions, and structures
- Make use of improvement cycles to resolve systems issues

Implementation Stages: Initial Implementation: "Get started, then get better!"
- Learn from mistakes (detect and correct)
- Celebrate participation and progress
- Continue "buy-in" efforts
- Make organization and systems changes
- Manage expectations; "buy time in order to get better"
All the components of the program or innovation are at least partially in place, and the implementation supports begin to function.

Stages of Implementation: FULL IMPLEMENTATION
"The only thing worse than failing and not knowing why you failed is succeeding and not knowing why you succeeded."
— Fixsen, Naoom, Blase, Friedman, & Wallace, 2005

Implementation Stages: Full Implementation
- Maintaining and improving skills and activities throughout the system
- Components are integrated and fully functioning
- Skillful practices by front-line staff, supervisors, and administrators (50% meet performance criteria)
- Changes in policy are reflected in practice at all levels
- Ready to be evaluated for expected outcomes

Implementation Stages: Full Implementation: "What change? This is our way of work!"
- Skillful teaching and school practices
- Skillful use of the Drivers
- The Drivers experience their own improvement cycles
- Data systems are in use, reliable and efficient, and used for decision-making at multiple levels to regenerate and improve
- Practice-to-policy feedback cycles are working; policies are reviewed regularly and changed to support improved practices and outcomes

Reflection: Implementation Stages
- What are you already doing that is "stage-based" relative to the SSIP?
- What are the facilitators and barriers to doing stage-based work for the SSIP?
Tool: Stages of Implementation Analysis
Supporting New Ways of Work

Stages of Implementation Analysis: Purpose
- Helps the team plan for and/or assess the use of stage-based activities to improve the success of implementation efforts for EBPs or evidence-informed innovations (action planning/anticipatory guidance)
- The tool can be used to self-assess current stage-related activities (e.g., "We are in the midst of Exploration") or past efforts related to a stage (e.g., "We just completed most of Installation. How did we do? What did we miss?") (manage expectations)

Stages: Tools You Can Use
- Stages of Implementation Analysis Template
- Exploration Stage Guiding Questions
- Hexagon Tool and Instructions

IMPLEMENTATION DRIVERS Common features of successful supports to help make full and effective use of a wide variety of innovations

Core Implementation Components (© Fixsen & Blase, 2008)
Diagram: effective educational practices (the "what"), supported through professional development/professional learning by the Implementation Drivers (the "how"), produce positive outcomes for students (the "why").
- Competency Drivers: staff capacity to support children/families with the selected practices
- Organization Drivers: institutional capacity to support teachers and staff in implementing practices with fidelity
- Leadership: capacity to provide direction and vision

Implementation Drivers (© Fixsen & Blase, 2008)
Diagram: the Competency Drivers: Selection, Training, Coaching, and Performance Assessment (Fidelity).

Implementation Drivers (© Fixsen & Blase, 2008)
Diagram: the Competency Drivers (Selection, Training, Coaching, Performance Assessment/Fidelity) and the Organization Drivers (Decision Support Data System, Facilitative Administration, Systems Intervention), resting on Leadership, support consistent use of educational innovations and improved educational outcomes.

Competency Drivers: Performance Assessment: Purposes
"Are we doing what we said we would do?"
- Measure fidelity
- Motivate implementation
- Reinforce staff and build on strengths
- Interpret outcome data
- Provide feedback on the functioning of:
  - Recruitment and selection practices
  - Training programs (pre- and in-service)
  - Supervision and coaching systems

Competency Drivers: Performance Assessment: Challenges
The Drivers are "in service" to a defined "it."
- No definition of the "it," and no definition of fidelity to the "it." What's the way forward?
- Performance assessment processes are weak (content, context, no competency indicators). What's the way forward?
- Blaming the teacher. What's the way forward?


Competency Drivers: Selection: Purposes
- Select for the "tough to teach" traits
- Screen for prerequisites
- Set expectations for new hires (use of data, coaching)
- Allow for mutual selection
- Improve the likelihood of retention after "investment"
- Improve the likelihood that training, coaching, and supervision will result in implementation

Competency Drivers: Selection Driver: Challenges
"We have who we have... this doesn't apply to us!" What's the way forward?


Competency Drivers: Training Driver: Challenges
- "But our staff deserve professional development opportunities; we have to trust them to make use of the information." What's the way forward?
- "Training events aren't meeting our expectations for supporting implementation!" (e.g., no pre/post test, no practicing of skills) "Now what?" What's the way forward?

Competency Drivers: Training: Purposes
- Continue the "buy-in" process
- Knowledge acquisition
- Skill development
- Form a "community"
- Get started... get better


Training and Coaching (Joyce and Showers, 2002)
Outcomes: percentage of participants who demonstrate knowledge, demonstrate new skills in a training setting, and use new skills in the classroom.

Training components                    | Knowledge | Skill demonstration | Use in the classroom
Theory and discussion                  |    10%    |         5%          |          0%
...+ demonstration in training         |    30%    |        20%          |          0%
...+ practice & feedback in training   |    60%    |        60%          |          5%
...+ coaching in classroom             |    95%    |        95%          |         95%

Competency Drivers: Coaching: Purposes
- Ensures fidelity
- Ensures implementation
- Develops application judgment in the practitioner's setting
- Provides feedback to selection and training processes
- Grounded in "best practices"
- Must include direct observation and feedback

Competency Drivers: Coaching: Common Challenges
- Discomfort with active skill development ("Let's reflect.")
- Multiple coaches for multiple initiatives
- Lack of a PDSA process: acts of random advice
- Process skills weighted more heavily than innovation knowledge, and vice versa
- Resources: who will do this? How will we fund it?
Activity: pair up, pick one challenge (or create your own), and discuss ideas for addressing it.


Organization Drivers: Change Organizations and Systems
- Create and sustain hospitable organizational and system environments for effective services
- Develop functional data systems that can be used to inform decision-making

Organizational Change
"All organizations [and systems] are designed, intentionally or unwittingly, to achieve precisely the results they get."
— R. Spencer Darling, business expert
© Dean Fixsen, Karen Blase, Robert Horner, George Sugai, 2008

System Change: Creating Capacity for Competent Change
- New innovations do not fare well in current organizational structures and systems
- Develop new position descriptions and job functions in state departments of education and in regional and district systems
"Systems trump programs." — Patrick McCarthy, Annie E. Casey Foundation


Organization Drivers: Decision Support Data Systems: Purposes
- Monitor and improve student outcomes through data-based decisions
- Provide information to assess the effectiveness of intervention and prevention practices
- Analyze the relationship of fidelity to outcomes
- Guide further program development; detect discrete as well as systemic issues
- Engage in continuous quality improvement of the intervention and the Drivers
- Celebrate success
- Be accountable to parents, the Board of Education, taxpayers, and other funders


Organization Drivers: Facilitative Administration: Purposes
- Creates an internally hospitable environment for the new way of work at the level of the "agency" (e.g., school, district)
- Facilitates the installation, implementation, and improvement of the Drivers for each innovation
- Takes the lead on systems interventions
- Looks for ways to make the direct work of practitioners (e.g., teachers, school staff) and administrators more effective and less "burdensome"


Organization Drivers: Systems Intervention: Purposes
- Identify systemic barriers and facilitators and "lift them up" to the next level to improve support for the new way of work
- Create an externally "hospitable" environment for the new way of work
- Embed facilitators and strengths
- Contribute to cumulative learning in multi-site projects

Implementation Drivers (© Fixsen & Blase, 2008)
Diagram: the Drivers, with the Leadership Driver shown spanning Technical and Adaptive leadership.

Leadership Drivers: Purpose
Identify "wicked" problems and apply effective strategies to address them.

Wicked Problems
Each attempted solution permanently alters the nature of the problem.
- "The problem" is a moving target
- Attempted "solutions" often make the problem worse, not better
- Legitimate but competing alternatives: "solutions" as defined by one group are seen as "calamitous failures" by other groups

Leadership Drivers: Different Challenges Call for Different Strategies
- Technical strategies
- Adaptive strategies
According to Ron Heifetz and his colleagues at Harvard's Kennedy School of Government, one of the biggest mistakes "leaders" make is to incorrectly identify the type of challenge they are facing, using technical approaches for adaptive issues (and vice versa).

Leadership Drivers: Technical Challenges
- Perspectives are aligned (views, values)
- The definition of the problem is clear
- The solution and its implementation are relatively clear
- There is reasonable confidence that if the solution is implemented, there will be resolution
- There can be a "primary" locus of responsibility for organizing the work

Leadership Drivers: Adaptive Challenges
- Legitimate, yet competing, perspectives emerge
- The definition of the problem is unclear
- There are different perspectives on the "issue" at hand
- The solution and its implementation are unclear and require learning
- The primary locus of responsibility is not a single entity or person

Leadership Drivers: Strategies for Adaptive Work
Terms of Reference; Nominal Group Process; Criteria-Referenced Problem-Solving
- What are the legitimate but competing agendas in play? How do we define the challenge?
- What are the criteria for a good solution?
- What are the options?
- How do the options address each of the criteria?
- Agree on the "next right steps" and the next assessment points

Implementation Drivers (© Fixsen & Blase, 2008)
Diagram: the full set of Drivers: Competency (Selection, Training, Coaching, Performance Assessment/Fidelity), Organization (Decision Support Data System, Facilitative Administration, Systems Intervention), and Leadership (Technical and Adaptive). The Drivers are integrated and compensatory, supporting consistent use of educational innovations and improved educational outcomes.

Implementation Drivers: Benefits of Driver-Based Action Planning
- The infrastructure needed becomes visible to all
- Strengths and progress get celebrated
- Next right steps are planned and results measured
- Resources can be aligned and repurposed to improve implementation

Drivers: Tools You Can Use
- Drivers Best Practices
- Locus of Responsibility for Drivers
- Full Suite of Drivers' Assessments
- Drivers' Best Practices: Overview Assessment
- Drivers' Strategic Analysis: Starter Discussion

Reflection: Implementation Drivers
- Do the Competency Drivers have applicability to SSIP work? How? Why not? Under what conditions?
- Do the Organization Drivers have applicability to SSIP work? Why? Why not? Under what conditions?
- Does adaptive and technical leadership have application to SSIP work? Why? Why not? Under what conditions?
Supporting New Ways of Work

Improvement Cycles: Changing on Purpose
New practices do not fare well in existing organizational structures and systems: effective innovations are changed to fit the system, as opposed to existing systems changing to support effective innovations.
People, organizations, and systems...
- Cannot change everything at once (too big, too complex, too many of them, and too few of us)
- Cannot stop and re-tool (have to create the new in the midst of the existing)
- Cannot know what to do at every step (we will know it when we get there)
- Many outcomes are not predictable (who knew!?)

Types of Improvement Cycles
- Plan-Do-Study-Act cycles: rapid-cycle problem solving (Shewhart; Deming)
- Transformation Zone
- Usability testing (Nielsen; Rubin)
- Practice-policy communication loops

Improvement Cycles: Rapid-Cycle Problem Solving
Diagram: a single Plan → Do → Study → Act cycle.

Improvement Cycles: Usability Testing
Diagram: repeated Plan → Do → Study → Act cycles, one after another.

Improvement Cycles: Usability vs. Pilot Testing

Usability testing:
- Clear description of the program
- Trial-and-learning approach
- Small number of participants (N = 3-5)
- Multiple iterations to detect and correct problems as they arise
- Learn HOW to do the work effectively

Pilot testing:
- Clear description of the program
- Trial-and-assessment approach
- Sufficient number of participants for statistical power (N = 20-50)
- Sufficient time to realize potential results
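The iterative detect-and-correct logic of usability testing can be sketched as a short loop. The parameters (cohorts of 3 to 5 participants, multiple iterations) come from the slide; the function, the toy list of "problems," and the idea that each cohort surfaces a few problems at a time are all invented purely for illustration.

```python
# Hypothetical sketch of usability testing as repeated Plan-Do-Study-Act
# cycles: each small cohort surfaces a few problems, which are corrected
# before the next cohort starts. Names and data are illustrative only.

def usability_testing(known_problems, iterations=4, cohort_size=4):
    """Run several small cohorts; fix what each cohort surfaces (detect and correct)."""
    remaining = list(known_problems)
    fixed = []
    for _ in range(iterations):
        surfaced = remaining[:cohort_size]   # Do + Study: a cohort of 3-5 surfaces a few issues
        fixed.extend(surfaced)               # Act: correct what was detected
        remaining = remaining[cohort_size:]  # the next cycle starts from an improved program
    return fixed, remaining

problems = [f"problem-{i}" for i in range(10)]
fixed, remaining = usability_testing(problems)
print(len(fixed), len(remaining))  # prints: 10 0
```

A pilot test, by contrast, would run one large cohort (N = 20-50) and assess outcomes once at the end, after all problems, detected or not, have already affected the result.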

Improvement Cycles: Transformation Zone
A "vertical slice" of the service system (from the classroom to the Capitol):
- The "slice" is small enough to be manageable
- The "slice" is large enough to include all aspects of the system
- The "slice" is large enough to "disturb the system"; a "ghost" system won't work

Practice-Policy Communication Cycle
Diagram: policy enables practice (Plan, Do); practice informs policy through feedback (Study, Act), with external implementation support linking policy structures and practice procedures. Form supports function.

System Alignment
Diagram: linked structures, from the State Management Team through Implementation Teams to teachers delivering innovations to students, with SISEP providing system-change support. The practice-policy communication cycle ensures that policy supports effective practice.

Improvement Cycles: Tools You Can Use
- Improvement Cycles Analysis Worksheet
- Transformation Zone Functions and Structure

Reflection: Improvement Cycles and Communication Loops
How can we make use of improvement cycles in developing and implementing our improvement activities?
Tool: Linking Communication Protocols
Supporting New Ways of Work

IMPLEMENTATION TEAMS Organized, expert assistance to develop and sustain an accountable and effective structure

Implementation Teams
- Provide an accountable and effective structure to move the intervention through the stages of implementation
- The scope of the initiative determines the number of teams and the linked communication protocols needed
- Focus is on:
  - Ongoing "buy-in" and readiness
  - Installing and sustaining the Implementation Drivers
  - Fidelity and outcomes
  - Systems alignment and stage-based work
  - Problem-solving and sustainability

Implementation Teams: Linked Team Structures
"We tend to focus on snapshots of isolated parts of the system and wonder why our deepest problems never seem to get solved." — Senge, 1990
Diagram: school-based, district-based, regionally-based, and state-based implementation teams, linked to one another.

Reflection: Implementation Teams
- In your experience, who supports the change process?
- How is the transition made from external expertise to building internal capacity?
Supporting New Ways of Work

Implementation Teams: Tools You Can Use & Learn More
- Terms of Reference and Linking Communication Protocols
- SISEP Teams, Roles, Functions

Your Implementation Resource Active Implementation Hub

Summary: "Making It Happen" for Students and Families
- Purposeful selection of an effective and feasible "what"
- Conceptualize a change process so that effective interventions for children and families can become embedded and sustained in socially complex settings:
  - "Stage-matched activities" to guide the process
  - "Implementation drivers" to build the infrastructure
- Improvement processes are critical; the work is never done, because the environment is in motion
- Invest in the development of organized, "expert" implementation support

Reflection: Implementation Action Planning
What are the best next steps for your team relative to applying implementation frameworks to the work around the SSIP?
Supporting New Ways of Work