
2011 OSEP Leadership Mega Conference: Collaboration to Achieve Success from Cradle to Career 2.0
Six Years of SPPs: Lessons Learned for Designing, Implementing and Evaluating Effective Improvement Activities
A Presentation by the Systems and Improvement Planning Priority Team
Maureen Hawes, Christina Kasprzak, and Jeanna Mullins

Setting the Stage
Following the 2004 reauthorization of IDEA, states were required to develop and submit State Performance Plans (SPPs) for a period of six years. The SPP:
- Evaluates the state's implementation of IDEA
- Describes how the state will improve such implementation

Setting the Stage
The first SPP was submitted in December 2005. SPPs were updated each year with the submission of the Annual Performance Report (APR). The current SPP has been extended to FFY 2012 pending the upcoming reauthorization. The state's performance on each indicator is reported annually in the APR.

Setting the Stage
Improvement activities are critical components of the SPP/APR:
- Required for each of the SPP/APR indicators
- Describe how the state will improve performance for each indicator, including activities, timelines, and resources
Source: Choosing a Common Language: Terms and Definitions Used for the SPP/APR (reviewed/revised by the RRCP Data Priority Team and the State Data Managers Feedback Group after the June 2008 Data Meeting; revised August 2009)

Evolution of Improvement Activities
Initial observations of improvement activities:
- Over- and under-identification of activities
  - Linkages to SPP indicators
  - Connections to state data
  - Alignments with other state initiatives

Evolution of Improvement Activities
Initial observations of improvement activities:
- Description of implementation (action plan)
- Limited resources (personnel, fiscal) for implementation
- Limited evaluation plan for determining effectiveness

Evolution of Improvement Activities
Recent observations:
- States have begun to organize and structure their work to align with the SPP indicators.

Evolution of Improvement Activities
Recent observations:
- States have begun to use the SPP as a foundation for improvement processes and activities.

Evolution of Improvement Activities
Recent observations:
- States have continued to refine their improvement activities:
  - Types
  - Design
  - Implementation
  - Evaluation

Evolution of Improvement Activities
Many of these refinements have been associated with:
- Systems thinking and improvement planning
- Interrelationship of SPP/APR components
- Theory of change model
- Implementation science

Discussion and Questions Do any of these observations ring true for your state? If so, how?

Discussion Are there other observations that we did not include?

Lessons Learned Types of Improvement Activities

Types of Improvement Activities
- Improve data collection and reporting
- Improve systems administration and monitoring
- Build systems and infrastructures of technical assistance and support
- Provide technical assistance/training/professional development

Types of Improvement Activities
- Clarify/examine/develop policies and procedures
- Program development
- Collaboration/coordination
- Evaluation
- Increase/adjust FTE

Lessons Learned Designing Improvement Activities

Designing Improvement Activities
- Systemic approach needed
- Data-based decision making drives identification of areas of need
- Relationship between improvement activity and needs/target group
- Target group selection (statewide or targeted districts)

Designing Improvement Activities
- Alignment with capacity and resources
- Activities that impact multiple indicators
- Alignment with priorities, initiatives, and goals

Designing Improvement Activities
- Root cause analysis?
- Links between root cause, data, and proposed outcomes?
- Evidence-based practices?
- Addresses more than one indicator?
- Identification of collaborative partners and their roles?

Designing Improvement Activities
- Detailed action plan (tasks, persons responsible, resources needed, timelines) for each activity?
- Short- and long-term outcomes identified?
- Methods, data sources, data collection timelines, and reporting?

Insufficient Methods
Implementation by…
- Laws/compliance
- “Following the money”
- Implementation without changing supporting roles and functions
- Diffusion/dissemination of information
- Training alone
…does not lead to successful implementation. (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005)

Implementation
- Not an event
- A mission-oriented process
- Takes 2-4 years
- Requires multiple decisions, actions, and corrections

Designing Improvement Activities
Sustainable systems change requires developing “a comprehensive, long-term plan for implementing change and strengthening the infrastructure needed to sustain change at all levels of the system….” (Hurth & Goode, 2009, p. 2)

Implementation Stages (2-4 years)
- Exploration: assess needs; examine innovations; examine implementation; assess fit
- Installation: acquire resources; prepare organization; prepare implementation; prepare staff
- Initial Implementation: implementation drivers; manage change; data systems; improvement cycles
- Full Implementation: implementation drivers; implementation outcomes; innovation outcomes; standard practice

Designing Improvement Activities
Application of implementation science to activity design: Exploration
- Assess needs
- Examine innovations
- Examine implementation
- Assess the fit

Formalize Structures
Develop/formalize team structures:
- Who will be accountable for the work?
- How will SEA leadership ensure successes are operationalized and barriers to implementation are removed?

Determine Need and Options
Assess performance and needs:
- What do your current data suggest is the most critical need?
- What is the supporting research/evidence for the strategies you are considering?

Assess Fit and Feasibility
Analysis of needs and resources:
- What structural or organizational changes are needed at the state or local level?
- What resources will be needed?

Assess Fit and Feasibility
Analysis of implementation requirements:
- What are the priorities of the state?
- What is your theory of change?
- How will you measure progress toward that goal at state and local levels?
- Who will do what differently at state and local levels?

Promote “Buy-In”
Develop collaboration and co-ownership of the work among stakeholders in the state and local programs:
- How will readiness be created at the state and local levels?
- What continuous communication processes between implementers and the state will promote continued buy-in?

Re-Assess and Decide
- Gather the data collected to re-assess and decide on adoption and implementation of a practice, program, or model
- Consider all information that has emerged during Exploration that impacts your decision

Assessing Evidence-Based Programs and Practices
Each factor is rated on a 5-point scale: High = 5; Medium = 3; Low = 1. Midpoints can be used and scored as a 2 or 4. The factor ratings are summed for a total score.
- Need (in agency, setting): socially significant issues; parent and community perceptions of need; data indicating need
- Fit: fit with current initiatives; state and local priorities; organizational structures; community values
- Resource Availability: IT; staffing; training; data systems; coaching and supervision; administrative and system supports needed
- Evidence: outcomes (is it worth it?); fidelity data; cost-effectiveness data; number of studies; population similarities; diverse cultural groups; efficacy or effectiveness
- Intervention Readiness for Replication: qualified purveyor; expert or TA available; mature sites to observe; number of replications; how well is it operationalized?; are implementation drivers operationalized?
- Capacity to Implement: staff meet minimum qualifications; able to sustain implementation drivers (financially, structurally); buy-in process operationalized (practitioners, families, agency)
© National Implementation Research Network 2009. Adapted from work by Laurel J. Kiser, Michelle Zabel, Albert A. Zachik, and Joan Smith at the University of Maryland.
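To make the arithmetic of the rubric concrete, here is a minimal sketch in Python of totaling one candidate's ratings across the six factors. The candidate practice and its ratings are hypothetical illustrations, not part of the NIRN tool itself.

```python
# Total a 5-point rubric: High = 5, Medium = 3, Low = 1; 2 and 4 are midpoints.
FACTORS = [
    "Need", "Fit", "Resource Availability", "Evidence",
    "Readiness for Replication", "Capacity to Implement",
]
VALID_SCORES = {1, 2, 3, 4, 5}

def total_score(ratings: dict[str, int]) -> int:
    """Sum one 1-5 rating per factor; reject missing or out-of-range ratings."""
    for factor in FACTORS:
        if ratings.get(factor) not in VALID_SCORES:
            raise ValueError(f"Need a 1-5 rating for: {factor}")
    return sum(ratings[f] for f in FACTORS)

# Hypothetical example: rating a candidate literacy intervention.
candidate = {
    "Need": 5, "Fit": 4, "Resource Availability": 3,
    "Evidence": 4, "Readiness for Replication": 2, "Capacity to Implement": 3,
}
print(total_score(candidate))  # 21 out of a possible 30
```

A low score on any single factor (here, Readiness for Replication) can prompt a team to gather more information before adopting the practice, even when the total looks acceptable.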

Lessons Learned Implementing Improvement Activities

Implementing Improvement Activities
- Design of high-quality improvement activities does not guarantee effective implementation and desired outcomes
- States must provide adequate resources, including personnel and fiscal support, to facilitate implementation

Implementing Improvement Activities
- Management of implementation is essential
- Monitoring of implementation is essential to guide adjustments

Implications for Implementation
- Consider the “Why”
- Define and understand the “What”
- Invest in the “How”
- Think through “Who” will do this work

Discussion Questions What is one of the positive changes your State has made in your process or system for designing or implementing improvement activities? What is one of the biggest challenges that your State is experiencing in designing or implementing your SPP/APR improvement activities?

Evaluating Improvement Activities

Lessons Learned
The need for evaluating improvement activities was not high on our radar at the beginning but has increasingly been seen as important:
- For assessing how well an activity is being implemented
- For determining if an activity is making an impact on the indicator data
- AND, if not, making adjustments!

Lessons Learned
TA resources and supports related to evaluating improvement activities have become increasingly available:
- Products
- TA services
- Conference presentations

Resources for Evaluating Activities
[Three slides of resource examples; the images are not captured in the transcript.]

Paper Highlights
- Evaluation, improvement planning, and systems thinking
- Types of SPP/APR improvement activities
- Selection and review of activities
- Steps for evaluation of an activity
- Evaluation scenarios
- Resources and tools

Systems Change and the SPP/APR
“A system is a group of interacting, interrelated, and interdependent components that form a complex and unified whole.” (Coffman, 2007, p. 3)
The Part C and Part B programs are complex systems with interrelated and interdependent components. A theory of change shows how the inputs (resources) and processes (activities) connect to outputs (products or units produced, such as number of staff trained) and outcomes (intended results):
Inputs → Process → Outputs → Outcomes
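To make the chain concrete, here is a minimal sketch in Python of recording a theory of change as a simple logic-model record. The example activity and its entries are hypothetical, offered only to show how inputs, process, outputs, and outcomes stay linked to a single improvement activity.

```python
# A logic-model record tying one improvement activity to its
# inputs -> process -> outputs -> outcomes chain.
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    activity: str
    inputs: list[str] = field(default_factory=list)    # resources
    process: list[str] = field(default_factory=list)   # activities carried out
    outputs: list[str] = field(default_factory=list)   # products/units produced
    outcomes: list[str] = field(default_factory=list)  # intended results

model = LogicModel(
    activity="Statewide training on child outcomes measurement",
    inputs=["State TA staff", "Part C training funds"],
    process=["Deliver regional trainings", "Provide follow-up coaching"],
    outputs=["Number of staff trained", "Number of coaching sessions held"],
    outcomes=["Local programs collect child outcomes data with fidelity"],
)
```

Writing the model down this way makes it easy to check that every planned activity has at least one measurable output and one intended outcome before it goes into the SPP/APR.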

Sustainable systems change requires developing “a comprehensive, long-term plan for implementing change and strengthening the infrastructure needed to sustain change at all levels of the system….” (Hurth & Goode, 2009, p. 2)

Reviewing a Set of Improvement Activities Do we have a good set of improvement activities for this indicator?

Reviewing a Set of Improvement Activities
- Root cause analysis?
- Links between root cause, data, and proposed outcomes?
- Evidence-based practices?
- Identification of collaborative partners?
- Action plan (tasks, persons responsible, resources needed, timelines)?
- Short- and long-term outcomes?
- Methods, data sources, data collection timelines, and reporting?

Evaluating An Improvement Activity Do we have a good plan for evaluating an improvement activity?

Evaluating an Improvement Activity
- Goal/purpose
- Process/impact
- Data collection methods
- Timelines
- Data analysis methods
- Use and reporting of results
- Person(s) responsible
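One way to keep these components in view is to treat the evaluation plan as a structured record and check it for completeness. The following is a minimal sketch in Python; the field names and the sample entries are hypothetical illustrations, not a prescribed format.

```python
# Check an evaluation plan against the components listed above.
PLAN_COMPONENTS = [
    "goal_purpose", "process_or_impact", "data_collection_methods",
    "timelines", "data_analysis_methods", "use_and_reporting",
    "persons_responsible",
]

plan = {
    "goal_purpose": "Assess fidelity of child outcomes data collection",
    "process_or_impact": "process",
    "data_collection_methods": "Statewide implementation survey",
    "timelines": "Survey each spring; analysis complete by July",
    "data_analysis_methods": "Descriptive summaries by region",
    "use_and_reporting": "Target TA; summarize results in the APR",
    "persons_responsible": "Part C data manager",
}

missing = [c for c in PLAN_COMPONENTS if not plan.get(c)]
print("Plan complete" if not missing else f"Missing components: {missing}")
```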

Lessons Learned
State resources and supports related to evaluating improvement activities are tight… …and states must be strategic about which activities they can evaluate and which methods they can use to evaluate them.

Lessons Learned
Sometimes states evaluate improvement activities but do not report the results in their APR.

Lessons Learned
- States have improved their data collection systems to monitor implementation of improvement activities
- States are more often using these data to modify activities as needed

Example: Child Outcomes Data
Improvement activities: The state conducted a number of activities (policies, procedures, professional development) on collecting child outcomes data.
Evaluation question: Are local programs implementing the child outcomes data collection process with fidelity?

Example: Child Outcomes Data
Data collection: Statewide implementation survey on data collection and reporting practices
Data analysis: Identification of good practices as well as challenges
Data use: Targeted TA and supports
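As an illustration of the data-use step, here is a minimal sketch in Python that summarizes survey responses and flags local programs for targeted TA. The program names, survey items, and the 80% threshold are hypothetical assumptions, not part of any state's actual survey.

```python
# Flag local programs whose survey responses show fewer practices
# in place than a chosen threshold.
surveys = [
    {"program": "Program A", "uses_approved_tool": True,  "team_based_rating": True},
    {"program": "Program B", "uses_approved_tool": True,  "team_based_rating": False},
    {"program": "Program C", "uses_approved_tool": False, "team_based_rating": False},
]
ITEMS = ["uses_approved_tool", "team_based_rating"]
THRESHOLD = 0.8  # flag programs implementing fewer than 80% of practices

for response in surveys:
    rate = sum(response[item] for item in ITEMS) / len(ITEMS)
    if rate < THRESHOLD:
        print(f"{response['program']}: targeted TA "
              f"({rate:.0%} of surveyed practices in place)")
```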

C3/B7 Implementation Surveys See examples of states’ implementation surveys online:

Example: EC Transitions
Improvement activities: The state conducted a number of activities (revised policies, procedures, professional development) on improving EC transitions.
Evaluation question: Do local programs understand and implement quality transition practices?

Example: EC Transitions
Data collection: Statewide implementation survey on transition practices
Data analysis: Identification of practice improvements as well as continuing challenges
Data use: Documented improved practices; targeted TA and supports

Discussion Questions What have been your state’s accomplishments with regard to evaluating improvement activities? What have been your greatest challenges with regard to evaluating improvement activities?

Resources for Improvement Activities

Resources
- Using the SPP as a Management Tool
- Evaluating Improvement Activities

Resources
- Implementation Science

Resources
What additional materials could the SIP team make available to support you in your improvement planning work?

THANK YOU!
Systems and Improvement Planning Team
Maureen Hawes, RRCP Co-Coordinator, North Central Regional Resource Center
Christina Kasprzak, Associate Director, National Early Childhood TA Center
Jeanna Mullins, State TA Provider, Mid-South Regional Resource Center
Discussion and Questions