
1 2011 OSEP Leadership Mega Conference Collaboration to Achieve Success from Cradle to Career 2.0 Six Years of SPPs: Lessons Learned for Designing, Implementing and Evaluating Effective Improvement Activities A Presentation by the Systems and Improvement Planning Priority Team Maureen Hawes, Christina Kasprzak, and Jeanna Mullins

2 Setting the Stage Following the 2004 re-authorization of IDEA, States were required to develop and submit State Performance Plans (SPPs) for a period of six years. The SPP:
– Evaluates the state's implementation of IDEA
– Describes how the state will improve such implementation

3 Setting the Stage The first SPP was submitted in December 2005. SPPs were updated each year with the submission of the Annual Performance Report (APR). The current SPP has been extended to FFY 2012 pending the upcoming re-authorization. The State's performance on each indicator is reported annually in the APR.

4 Setting the Stage Improvement activities are critical components of the SPP/APR
– Required for each of the SPP/APR indicators
– Describe how the state will improve performance for each indicator, including activities, timelines, and resources.
Source: Choosing a Common Language: Terms and Definitions Used for the SPP/APR (Reviewed/revised by the RRCP Data Priority Team and the State Data Managers Feedback Group after the June 2008 Data Meeting), Revised August 2009

5 Evolution of Improvement Activities Initial observations of improvement activities: over- and under-identification of activities
– Linkages to SPP Indicators
– Connections to State data
– Alignments with other State initiatives

6 Evolution of Improvement Activities Initial observations of improvement activities
– Description of implementation (action plan)
– Limited resources (personnel, fiscal) for implementation
– Limited evaluation plan for determining effectiveness

7 Evolution of Improvement Activities Recent observations – States have begun to organize and structure their work to align with the SPP indicators.

8 Evolution of Improvement Activities Recent observations – States have begun to use the SPP as a foundation for improvement processes and activities.

9 Evolution of Improvement Activities Recent observations – States have continued to refine their improvement activities: types, design, implementation, and evaluation

10 Evolution of Improvement Activities Many of these refinements have been associated with:
– Systems Thinking and Improvement Planning
– Interrelationship of SPP/APR Components
– Theory of Change Model
– Implementation Science

11 Discussion and Questions Do any of these observations ring true for your state? If so, how?

12 Discussion Are there other observations that we did not include?

13 Lessons Learned Types of Improvement Activities

14 Types of Improvement Activities
– Improve data collection and reporting
– Improve systems administration and monitoring
– Build systems and infrastructures of technical assistance and support
– Provide technical assistance/training/professional development

15 Types of Improvement Activities
– Clarify/examine/develop policies and procedures
– Program development
– Collaboration/coordination
– Evaluation
– Increase/adjust FTE

16 Lessons Learned Designing Improvement Activities

17 Designing Improvement Activities
– Systemic approach needed
– Data-based decision making drives identification of areas of need
– Relationship between improvement activity and needs/target group
– Target group selection (statewide or targeted districts)

18 Designing Improvement Activities
– Alignment with capacity and resources
– Activities that impact multiple indicators
– Alignment with priorities, initiatives, and goals

19 Designing Improvement Activities Root cause analysis? Links between root cause, data, and proposed outcomes? Evidence-based practices? Addresses more than one indicator? Identification of collaborative partners and their roles?

20 Designing Improvement Activities Detailed action plan (tasks, persons responsible, resources needed, timelines) for each activity? Short- and long-term outcomes identified? Methods, data sources, data collection timelines, and reporting?

21 Insufficient Methods Implementation by…
– Laws/compliance
– “Following the money”
– Implementation without changing supporting roles and functions
– Diffusion/dissemination of information
– Training alone
…does not lead to successful implementation (Fixsen, Naoom, Blase, Friedman, & Wallace, 2005)

22 Implementation
– Not an event
– Mission-oriented process
– Takes 2–4 years
– Requires multiple decisions, actions, and corrections

23 Designing Improvement Activities Sustainable systems change requires developing “a comprehensive, long-term plan for implementing change and strengthening the infrastructure needed to sustain change at all levels of the system….” (Hurth & Goode, 2009, p. 2)

24 Implementation Stages (2–4 years)
– Exploration: assess needs; examine innovations; examine implementation; assess fit
– Installation: acquire resources; prepare organization; prepare implementation; prepare staff
– Initial Implementation: implementation drivers; manage change; data systems; improvement cycles
– Full Implementation: implementation drivers; implementation outcomes; innovation outcomes; standard practice

25 Designing Improvement Activities Application of implementation science work to activity design: Exploration – Assess needs – Examine innovations – Examine implementation – Assess the fit

26 Formalize Structures Develop/formalize team structures
– Who will be accountable for the work?
– How will SEA leadership ensure successes are operationalized and barriers to implementation are removed?

27 Determine Need and Options Assess performance and needs
– What do your current data suggest is the most critical need?
– What is the supporting research/evidence for the strategies you are considering?

28 Assess Fit and Feasibility Analysis of Needs and Resources
– What structural or organizational changes are needed at the State or local level?
– What resources will be needed?

29 Assess Fit and Feasibility Analysis of Implementation Requirements
– What are the priorities of the State?
– What is your theory of change?
– How will you measure progress toward that goal at state and local levels?
– Who will do what differently at state and local levels?

30 Promote “Buy-In” Develop collaboration and co-ownership of the work among stakeholders in the state and local programs
– How will readiness be created at the state and local levels?
– What continuous communication processes between implementers and the state will promote continued buy-in?

31 Re-Assess and Decide Use the data gathered to re-assess and decide on adoption and implementation of a practice, program, or model. Consider all information that has emerged during Exploration that impacts your decision.

32 Assessing Evidence-Based Programs and Practices: 5-Point Rating Scale (High = 5; Medium = 3; Low = 1; midpoints can be used and scored as a 2 or 4)

Rate each dimension High, Medium, or Low, then total the scores:
– Need (in agency, setting): socially significant issues; parent and community perceptions of need; data indicating need
– Fit: fit with current initiatives; State and local priorities; organizational structures; community values
– Resource Availability: IT; staffing; training; data systems; coaching and supervision; administrative and system supports needed
– Evidence (outcomes – is it worth it?): fidelity data; cost-effectiveness data; number of studies; population similarities; diverse cultural groups; efficacy or effectiveness
– Intervention Readiness for Replication: qualified purveyor; expert or TA available; mature sites to observe; number of replications; how well is it operationalized?; are Implementation Drivers operationalized?
– Capacity to Implement: staff meet minimum qualifications; able to sustain Implementation Drivers financially and structurally; buy-in process operationalized (practitioners, families, agency)

© National Implementation Research Network 2009. Adapted from work by Laurel J. Kiser, Michelle Zabel, Albert A. Zachik, and Joan Smith at the University of Maryland.
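To make the arithmetic of the scale concrete, here is a minimal sketch in Python. The dimension names mirror the table above, but the unweighted sum and the validation checks are assumptions of this illustration, not part of the published tool.

```python
# Minimal sketch of the 5-point rating arithmetic described above.
# The simple unweighted sum is an assumption of this example; the
# published scale only defines the per-dimension scores.
from typing import Dict

DIMENSIONS = (
    "Need",
    "Fit",
    "Resource Availability",
    "Evidence",
    "Readiness for Replication",
    "Capacity to Implement",
)

VALID_SCORES = {1, 2, 3, 4, 5}  # High = 5, Medium = 3, Low = 1; 2 and 4 are midpoints


def total_score(ratings: Dict[str, int]) -> int:
    """Sum the six dimension ratings into a total score (6-30)."""
    missing = [d for d in DIMENSIONS if d not in ratings]
    if missing:
        raise ValueError(f"Missing ratings for: {missing}")
    for dim in DIMENSIONS:
        if ratings[dim] not in VALID_SCORES:
            raise ValueError(f"{dim}: score {ratings[dim]} is not on the 1-5 scale")
    return sum(ratings[d] for d in DIMENSIONS)


if __name__ == "__main__":
    # Hypothetical ratings for one candidate practice.
    example = {
        "Need": 5,                       # strong data indicating need
        "Fit": 4,                        # aligns with most current initiatives
        "Resource Availability": 3,      # some staffing/training gaps
        "Evidence": 3,                   # few studies, limited fidelity data
        "Readiness for Replication": 2,  # few mature sites to observe
        "Capacity to Implement": 3,      # buy-in partly operationalized
    }
    print(f"Total score: {total_score(example)} out of {5 * len(DIMENSIONS)}")
```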

33 Lessons Learned Implementing Improvement Activities

34 Implementing Improvement Activities Design of high-quality improvement activities does not guarantee effective implementation and desired outcomes. States must provide adequate resources, including personnel and fiscal support, to facilitate implementation.

35 Implementing Improvement Activities Management of implementation is essential. Monitoring of implementation is essential to guide adjustments.

36 Implications for Implementation Consider the “Why” Define and understand the “What” Invest in the “How” Think through “Who” will do this work

37 Discussion Questions What is one of the positive changes your State has made in your process or system for designing or implementing improvement activities? What is one of the biggest challenges that your State is experiencing in designing or implementing your SPP/APR improvement activities?

38 Evaluating Improvement Activities

39 Lessons Learned The need for evaluating improvement activities was not high on our radar at the beginning but is increasingly seen as important
– For assessing how well an activity is being implemented
– For determining if an activity is making an impact on the indicator data
– AND, if not, making adjustments!

40 Lessons Learned TA resources and supports related to evaluating improvement activities have become increasingly available
– Products
– TA services
– Conference presentations

41 Resources for Evaluating Activities

42 Resources for Evaluating Activities

43 Resources for Evaluating Activities

44 Paper Highlights
– Evaluation, improvement planning, and systems thinking
– Types of SPP/APR improvement activities
– Selection and review of activities
– Steps for evaluation of an activity
– Evaluation scenarios
– Resources and tools

45 Systems Change and the SPP/APR “A system is a group of interacting, interrelated, and interdependent components that form a complex and unified whole.” (Coffman, 2007, p. 3) The Part C and Part B Programs are complex systems with interrelated and interdependent components. ‘Theory of change’ shows how the inputs (resources) and processes (activities) connect to outputs (products or units produced, such as number of staff trained) and outcomes (intended results): Inputs → Process → Outputs → Outcomes

46 Sustainable systems change requires developing “a comprehensive, long-term plan for implementing change and strengthening the infrastructure needed to sustain change at all levels of the system….” (Hurth & Goode, 2009, p. 2)

47 Reviewing a Set of Improvement Activities Do we have a good set of improvement activities for this indicator?

48 Reviewing a Set of Improvement Activities Root cause analysis? Links between root cause, data, and proposed outcomes? Evidence-based practices? Identification of collaborative partners? Action plan (tasks, persons responsible, resources needed, timelines)? Short- and long-term outcomes? Methods, data sources, data collection timelines, and reporting?

49 Evaluating An Improvement Activity Do we have a good plan for evaluating an improvement activity?

50 Evaluating An Improvement Activity
– Goal/Purpose
– Process/Impact
– Data Collection Methods
– Timelines
– Data Analysis Methods
– Use and Reporting of Results
– Person(s) Responsible
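As a purely illustrative sketch of how these components might be tracked as a structured checklist, the Python below uses hypothetical field names derived from the slide; it is not a format prescribed by the presenters.

```python
# Minimal sketch: one hypothetical way to record the evaluation-plan
# components listed above as a structured record. Field names mirror
# the slide; nothing here is a prescribed or official format.
from dataclasses import dataclass, field
from typing import List


@dataclass
class EvaluationPlan:
    goal_purpose: str                   # Goal/Purpose
    process_or_impact: str              # Process/Impact focus
    data_collection_methods: List[str]  # Data Collection Methods
    timelines: str                      # Timelines
    data_analysis_methods: List[str]    # Data Analysis Methods
    use_and_reporting: str              # Use and Reporting of Results
    persons_responsible: List[str] = field(default_factory=list)

    def missing_components(self) -> List[str]:
        """Return the names of any components still left blank."""
        checks = {
            "Goal/Purpose": self.goal_purpose,
            "Process/Impact": self.process_or_impact,
            "Data Collection Methods": self.data_collection_methods,
            "Timelines": self.timelines,
            "Data Analysis Methods": self.data_analysis_methods,
            "Use and Reporting of Results": self.use_and_reporting,
            "Person(s) Responsible": self.persons_responsible,
        }
        return [name for name, value in checks.items() if not value]
```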

51 Lessons Learned State resources and supports related to evaluating improvement activities are tight… …and states must be strategic about which activities they can evaluate and what methods they can use to evaluate them

52 Lessons Learned Sometimes states evaluate improvement activities but do not report them in their APR

53 Lessons Learned States have improved their data collection systems to monitor implementation of improvement activities. States are more often using these data to modify activities as needed.

54 Example: Child Outcomes Data Improvement Activities: State conducted a number of activities (policies, procedures, professional development) on collecting child outcomes data. Evaluation: Are local programs implementing the child outcomes data collection process with fidelity?

55 Example: Child Outcomes Data Data collection: Statewide implementation survey on data collection and reporting practices Data Analysis: Identification of good practices as well as challenges Data Use: Targeted TA and supports

56 C3/B7 Implementation Surveys See examples of states’ implementation surveys online: http://www.fpg.unc.edu/~eco/pages/states_quality.cfm

57 Example: EC Transitions Improvement Activities: State conducted a number of activities (revised policies, procedures, professional development) on improving EC transitions. Evaluation: Do local programs understand and implement quality transition practices?

58 Example: EC Transitions Data collection: Statewide implementation survey on transition practices Data analysis: Identification of practice improvements as well as continuing challenges Data use: Documented improved practices; targeted TA and supports

59 Discussion Questions What have been your state’s accomplishments with regard to evaluating improvement activities? What have been your greatest challenges with regard to evaluating improvement activities?

60 Resources for Improvement Activities

61 Resources Using the SPP as a Management Tool Evaluating Improvement Activities http://spp-apr-calendar.rrfcnetwork.org/

62 Resources Implementation Science www.scalingup.org

63 Resources What additional materials could the SIP team make available to support you in improvement planning work?

64 THANK YOU! Systems and Improvement Planning Team www.rrfcnetwork.org
Maureen Hawes, RRCP Co-Coordinator, North Central Regional Resource Center, hawes001@umn.edu
Christina Kasprzak, Associate Director, National Early Childhood TA Center, christina.kasprzak@unc.edu
Jeanna Mullins, State TA Provider, Mid-South Regional Resource Center, baile045@umn.edu
Discussion and Questions

