1
An Introduction to Implementation Tools to Help Build Implementation Capacity
SPDG Evaluators, May 2012
Michelle A. Duda, Dean L. Fixsen, Karen A. Blase, and Barbara Sims
University of North Carolina at Chapel Hill
© 2012 Karen A. Blase and Dean L. Fixsen
2
Introductions
- Name
- States you support/work with
- Level of the Education "system" you support (State, Region, District, Building)
- What you hope to learn
3
SISEP Intentions
Develop implementation capacity in 6 States
- Establish a protocol for selecting States that are ready
- Select evidence-based programs ready for scaling
- Evaluate the state-level application of the approach
- Assess State capacity; improve outcomes for students
- Extend findings to additional States
Work with RRC and TA Centers
- Develop Communities of Practice
Disseminate findings
- Make presentations and write papers
- Develop web site and share tools
4
Objectives
Introduce tools to help support implementation capacity development:
- State Capacity Assessment
- Stages of Implementation Analysis
- Stage-Based Implementation Driver Analysis
A deeper review of a selected tool:
a) Explore how it leads to developing Action Plans
b) Discuss Administration and Implications
c) Next Steps
5
Formula for Success
Effective intervention practices × Effective implementation practices = Good Outcomes
The relationship is multiplicative: a weakness in either factor undermines the result, so even a strong intervention produces poor outcomes when implementation is weak.
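A worked illustration of that multiplicative point (the effectiveness values are hypothetical, chosen only to make the arithmetic visible): if each factor is rated on a 0-to-1 scale,

$$0.9 \times 0.2 = 0.18 \qquad\text{vs.}\qquad 0.9 \times 0.8 = 0.72$$

The same strong intervention (0.9) yields weak outcomes under weak implementation (0.2) and strong outcomes under strong implementation (0.8).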
6
Shared Goals
- Improve student outcomes
- Improve teacher instruction
- Improve school supports for teachers
- Improve district supports for schools
- Improve regional supports for districts
- Improve State supports for outcomes
- Re-define relationships among system components
- Focus fully on student outcomes
7
Capacity Building
[Figure: a graph of funding amounts over years, tracing investments in Implementation Teams, Organization Change, and System Reinvention Capacity]
8
Developing the Capacity to Implement Well
“A serious deficiency is the lack of expertise to implement best practices and innovations effectively and efficiently to improve student outcomes.” Rhim, Kowal, Hassel, & Hassel (2007) Rhim, L. M., Kowal, J. M., Hassel, B. C., & Hassel, E. A. (2007). School turnarounds: A review of the cross-sector evidence on dramatic organizational improvement. Lincoln, IL: Public Impact, Academic Development Institute.
9
Implementation Team
Given an effective intervention and effective use of implementation science and practice:
- With an Implementation Team: 80% of sites implementing successfully within 3 years
- With no Implementation Team ("letting it happen" / "helping it happen"): 14% within 17 years
It takes an estimated average of 17 years for only 14% of new scientific discoveries to enter day-to-day clinical practice (Balas & Boren, 2000; Green & Seifert, 2005). With the use of competent Implementation Teams, over 80% of implementation sites were sustained for 6 years or more (up from 30%), and the time to achieve certification was reduced to 3.6 years (Fixsen, Blase, Timbers, & Wolf, 2001).
Fixsen, D. L., Blase, K. A., Timbers, G. D., & Wolf, M. M. (2001). In search of program implementation: 792 replications of the Teaching-Family Model. In G. A. Bernfeld, D. P. Farrington, & A. W. Leschied (Eds.), Offender rehabilitation in practice: Implementing and evaluating effective programs. London: Wiley.
Balas, E. A., & Boren, S. A. (2000). Managing clinical knowledge for health care improvement. In Yearbook of Medical Informatics. Stuttgart, Germany: Schattauer Verlagsgesellschaft.
Green, L. A., & Seifert, C. M. (2005). Translation of research into practice: Why we can't "just do it". Journal of the American Board of Family Practice, 18(6).
10
IMPLEMENTATION CAPACITY FOR SCALING UP EBPs
[Figure: cascade of support: SISEP Support & 2 STSs → State Management Team → STSs and a State Transformation Workgroup → Regional Implementation Team → "District" Implementation Teams → N = 50–100 Schools]
© Dean Fixsen and Karen Blase, 2004; Dean Fixsen, Karen Blase, Robert Horner, George Sugai, 2008
11
Regional Implementation Teams
[Figure: cascade of support: State Department Leadership (an implementation-skilled workforce, with staff with special implementation skills and re-purposed roles) → Regional Implementation Teams → "District" Implementation Teams → Building Implementation Teams → School Teachers and Staff → All Students & Families; adult interactions produce student benefits]
Fixsen, D., Blase, K., Metz, A., & Van Dyke, M. (in press). Statewide implementation of evidence-based programs. Exceptional Children (Special Issue).
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network (FMHI Publication No. 231).
Greenhalgh, T., Robert, G., MacFarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. The Milbank Quarterly, 82(4).
Glennan, T. K., Jr., Bodilly, S. J., Galegher, J. R., & Kerr, K. A. (2004). Expanding the reach of education reforms. Santa Monica, CA: RAND Corporation.
Schofield, J. (2004). A model of learned implementation. Public Administration, 82(2).
Brown, B. S., & Flynn, P. M. (2002). The federal role in drug abuse technology transfer: A history and perspective. Journal of Substance Abuse Treatment, 22(4).
12
State Capacity Assessment
Fixsen, D. L., Blase, K. A., Duda, M. A., & Horner, R. (2012)
Purpose
Provide a State Management Team (SMT) with a regular measure of the state's capacity for implementation of evidence-based practices
Pre-Requisites
- State Management Team interested in and dedicated to creating aligned teaming structures
- State Management Team ready to engage in action planning
- Agreed-upon and defined evidence-based practice
13
Administration of the SCA
The SCA is completed in the Fall and Spring by the SMT (typically with support from implementation coordinators and others) and is used to guide action planning for improving or sustaining state capacity to effectively implement evidence-based programs and other innovations in education.
Protocol: a facilitator reviews each item on the tool and documents consensus and dialogue about critical components.
14
Scoring key
All Scaling Capacity Items are scored using the following format:
Fully In Place (2 points)
- All dimensions of the element adhered to, and evidence available to support this
Partially In Place (1 point)
- Some dimensions of the element adhered to and/or some dimensions attended to
- Action planning occurs with these elements
Not In Place (0 points)
- Element not adhered to
- Action planning occurs with these elements, or teams may not be developmentally ready to build in this component
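For evaluators tallying SCA results, here is a minimal sketch of how the 0/1/2 ratings might be rolled up into an overall capacity percentage and a list of items flagged for action planning. The item names and ratings are hypothetical, and the roll-up is an illustration, not part of the tool's protocol:

```python
# Hypothetical SCA item ratings: 2 = Fully In Place,
# 1 = Partially In Place, 0 = Not In Place
ratings = {
    "SMT meets on a regular schedule": 2,
    "EBP is agreed upon and operationally defined": 1,
    "Action plan is reviewed at each meeting": 0,
}

max_points = 2 * len(ratings)  # every item can earn at most 2 points
capacity_pct = 100 * sum(ratings.values()) / max_points

# Items rated below "Fully In Place" become candidates for action planning
action_items = [item for item, score in ratings.items() if score < 2]

print(f"Capacity: {capacity_pct:.0f}% of possible points")
for item in action_items:
    print("Action planning needed:", item)
```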
15
Active Implementation Frameworks
- Implementation Teams
- Implementation Stages
- Implementation Drivers
- Improvement Cycles
Briefly summarize the Four Implementation Frameworks
Set the context for tools and resources
State and RRC examples
Table discussion
16
Active Implementation Frameworks
Successful implementation on a useful scale requires...
- Organized, expert assistance – "IMPLEMENTATION TEAMS"
- Purposeful matching of critical implementation activities to the stage of the process – "STAGES OF IMPLEMENTATION"
- Active use of implementation core components ("best practices") – "IMPLEMENTATION DRIVERS"
- A focus on continuous, purposeful improvement – "IMPROVEMENT PROCESSES"
17
Implementation Teams: State, Regional, District, and Building Teams
- Know innovations very well (formal and craft knowledge)
- Know implementation very well (formal and craft knowledge)
- Know improvement cycles, to make interventions and implementation methods more effective and efficient over time
18
Effective Implementation
Implementation Team members make effective use of:
- Implementation Stages
- Implementation Drivers
- Improvement Cycles
19
Stages of Implementation
School-level implementation takes time: 2–4 years
EXPLORATION → INSTALLATION → INITIAL IMPLEMENTATION → FULL IMPLEMENTATION
20
“Implementation Drivers”
Common features of successful supports that help make full and effective use of a wide variety of innovations
21
Reliable Student Benefits
[Figure: the Implementation Drivers triangle. Consistent use of educational innovations (interventions meet implementation) produces reliable student benefits. Competency Drivers: Selection, Training, Coaching, Performance Assessment (fidelity). Organization Drivers: Decision Support Data System, Facilitative Administration, Systems Intervention. Leadership Drivers: Technical and Adaptive. The drivers are integrated and compensatory.]
22
Improvement Cycles
23
Stage-Based Assessments / Stage-Based TA
- Exploration Stage
- Installation Stage
- Initial Implementation
- Full Implementation
24
Stages of Implementation Analysis
Purpose
Help the Team plan for and/or assess the use of stage-based activities to improve the success of implementation efforts for EBPs or evidence-informed innovations (action planning/anticipatory guidance).
The tool can be used to self-assess current stage-related activities (e.g., "We are in the midst of Exploration") or past efforts related to a stage (e.g., "We just completed most of Installation. How did we do? What did we miss?") (managing expectations).
25
Administration of the Stages of Implementation Analysis
- Define the desired function of the tool in advance (assess current status? action planning?)
- For self-assessment, the implementation lead or Implementation Team completes the entire assessment to obtain a "strength of stage" score for each stage of implementation (a tallying sketch follows this list)
- For items rated "Initiated or Partially In Place" or "Not Yet Initiated," action plans can be developed to determine next steps or what needs to be revisited
- Usually used during the Exploration stage, but can be used throughout all stages to check back, or when implementation dips occur (change in leadership, staff turnover, etc.)
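A minimal sketch of how a "strength of stage" score might be tallied, assuming the same three-level rating used by the other tools' scoring keys (2/1/0); the stages and item ratings shown are hypothetical:

```python
# Hypothetical item ratings per stage: 2 = Fully In Place,
# 1 = Initiated or Partially In Place, 0 = Not Yet Initiated
stages = {
    "Exploration":  [2, 2, 1, 2],
    "Installation": [1, 0, 1, 2],
}

for stage, items in stages.items():
    strength = 100 * sum(items) / (2 * len(items))    # percent of possible points
    to_plan = sum(1 for score in items if score < 2)  # items feeding the action plan
    print(f"{stage}: strength of stage {strength:.0f}%, "
          f"{to_plan} item(s) for action planning")
```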
26
Using the Implementation Drivers
Interviews and self-assessments to analyze the Implementation Drivers for action planning:
- What infrastructure exists?
- How implementation-informed is it?
- What additional infrastructure is needed?
- What are the "next right steps" to improve the infrastructure supports?
The Competency, Organization, and Leadership Drivers are integrated and compensatory.
27
Implementation Drivers Tools
A high-level tool to get States, Districts, or Buildings thinking about:
- The functions
- Who will be accountable for them?
- How can they be improved to better support implementation?
28
Tools to Help Build Capacity
Stage of Implementation Assessments
Exploration
- Assessment of Implementation Stages
- ImpleMap
Installation
- Installation Stage Assessment
- Installation Stage Action Planning Guide
Initial Implementation
- Initial Implementation Component Assessment
- Initial Implementation Action Planning Guide
Full Implementation
- Full Implementation Component Assessment
- Implementation Tracker
29
Exploration: ImpleMap
Purpose
To assess and understand "the implementation landscape": a data collection tool covering the Implementation Drivers and implementation best practices
Pre-Requisites
- Interviewer has expertise in Implementation Science
- Site has a history of successful (or not) use of evidence-based or evidence-informed practices
30
Administration of the ImpleMap
- A semi-structured interview led by individuals fluent in Implementation Science or with deep knowledge of the Implementation Drivers
- Participants include individuals who are immersed in helping building staff use innovations, as well as leadership involved in selecting and sustaining these innovations
- Select more than one EBP to discuss so that themes can be identified
31
ImpleMap Administration
INTERVENTION: Enter the name of each intervention provided by the respondent. You may know the intervention by another name, but record the name used in this provider organization.
WHAT: Ask questions to get information about the vetting process. Ask about the "core intervention components" as they are described by the respondent. Core intervention components are the critical functions that define an intervention.
HOW: Ask about the Implementation Drivers. Implementation Drivers are components related to: developing staff competency (selection, training, coaching, performance assessments); organization supports (decision support data systems, facilitative administration, systems interventions); and leadership supports (technical and adaptive).
WHO: Ask about the person accountable for providing each Implementation Driver. Record the name, position, and physical location of each person.
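One way an evaluator might capture these interview notes in structured form is sketched below. The record fields simply mirror the INTERVENTION/WHAT/HOW/WHO prompts; all class names, field names, and sample values are hypothetical, not part of the ImpleMap itself:

```python
from dataclasses import dataclass, field

@dataclass
class DriverNote:
    driver: str       # e.g., "Coaching" or "Decision Support Data System"
    practices: str    # HOW: how this driver is carried out at the site
    accountable: str  # WHO: name, position, and location of accountable person

@dataclass
class ImpleMapRecord:
    intervention: str           # name used in this provider organization
    vetting_process: str        # WHAT: how the intervention was vetted
    core_components: list[str]  # WHAT: critical functions defining the intervention
    drivers: list[DriverNote] = field(default_factory=list)

record = ImpleMapRecord(
    intervention="Reading Intervention X",  # hypothetical example
    vetting_process="Reviewed by the district curriculum committee",
    core_components=["Explicit instruction", "Weekly progress monitoring"],
    drivers=[DriverNote("Training", "Two-day initial workshop",
                        "J. Doe, PD Coordinator, District Office")],
)
print(record.intervention, "-", len(record.drivers), "driver(s) documented")
```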
32
Installation & Initial Implementation Stage Driver Component Assessments
Purpose
- Serve as a self-assessment of the implementation infrastructure for the evidence-based practices identified
- Help Teams create an Implementation Action Plan for installing Driver best-practice components
Pre-Requisites
- An Implementation Team or "Accountability/Leadership" Team that can carry out the action items that will be identified
- Clearly defined evidence-based practice components
33
Administration of the Installation Stage Assessment
- Facilitated by an Implementation Specialist or a member of the Implementation Team
- May choose to review 1–2 Drivers per meeting
- For components identified as "Fully In Place," provide a clear description or "evidence" that this component is transparent
- Score components to help prioritize action items (a prioritization sketch follows the scoring key below)
- Outcome: Implementation Action Plan
34
Scoring key
All implementation best-practice components are scored using the following format:
Fully In Place (2 points)
- All dimensions of the element adhered to, and evidence available to support this
Partially In Place (1 point)
- Some dimensions of the element adhered to and/or some dimensions attended to
- Action planning occurs with these elements
Not In Place (0 points)
- Element not adhered to
- Action planning occurs with these elements, or teams may not be developmentally ready to build in this component
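As referenced above, here is a minimal sketch of how component scores might be used to prioritize action items. The driver and component names are hypothetical, and sorting lowest-scored components first is one reasonable convention rather than something the tool prescribes:

```python
# Hypothetical component scores from an Installation Stage Assessment:
# 2 = Fully In Place, 1 = Partially In Place, 0 = Not In Place
scores = [
    ("Training",  "Skill-based training scheduled before start-up", 1),
    ("Coaching",  "Written coaching service delivery plan",         0),
    ("Selection", "Interview protocol includes role plays",         2),
]

# Components not yet Fully In Place become action items,
# listed with the lowest-scored (furthest from in place) first.
action_plan = sorted((row for row in scores if row[2] < 2),
                     key=lambda row: row[2])

for driver, component, score in action_plan:
    status = "Not In Place" if score == 0 else "Partially In Place"
    print(f"[{driver}] {component}: {status}")
```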
35
Full Implementation Stage Driver Component Assessments
Purpose
- Help correlate high-fidelity implementation of evidence-based practices with implementation infrastructure
- Help correlate (proximal) high-fidelity implementation of evidence-based practices with consumer outcomes
- Identify supports for, and threats to, sustainability
Pre-Requisites
- The evidence-based practice or evidence-informed innovation has been implemented for at least one year, and practitioners have demonstrated high-fidelity use of these practices
36
Administration of the Full Stage Assessment
- Key informants are identified and interviewed: practitioners/clinicians, supervisors/coaches, and decision makers
- All interviews are completed within a small time frame (4 weeks)
- Can be repeated over time to capture implementation fluctuations that may occur
37
Scoring key
All responses are scored using the following format:
1 = Completely Disagree
2 = Disagree
3 = Somewhat Disagree
4 = Neither Agree nor Disagree
5 = Somewhat Agree
6 = Agree
7 = Completely Agree
8 = Does Not Exist in our organization
9 = Don't Know
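Because 8 ("Does Not Exist") and 9 ("Don't Know") are category codes rather than points on the agreement scale, a summary would typically set them aside before averaging. A minimal sketch with hypothetical responses; how to handle codes 8 and 9 is an analysis choice the evaluator makes, not something specified by the tool:

```python
# Hypothetical responses to one item across informants (1-7 = Likert scale,
# 8 = Does Not Exist in our organization, 9 = Don't Know)
responses = [6, 7, 5, 8, 9, 6]

scale_values = [r for r in responses if 1 <= r <= 7]  # true scale points only
mean_agreement = sum(scale_values) / len(scale_values)

print(f"Mean agreement: {mean_agreement:.2f} (n={len(scale_values)})")
print(f"'Does Not Exist': {responses.count(8)}, 'Don't Know': {responses.count(9)}")
```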
38
Benefits of Driver-Based Planning
- Infrastructure needed becomes visible to all
- Strengths and progress get celebrated
- Next right steps are planned and results measured
- Resources can be aligned and re-purposed to improve implementation
39
General Discussion & Questions
40
For more on Implementation Science
Stay Connected! @SISEPcenter
41
Implementation Science
Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Florida Mental Health Institute, National Implementation Research Network (FMHI Publication No. 231).