State Systemic Improvement Plans
Phase III: Implementation and Evaluation or Where the Rubber Meets the Road
Introduction

Speaker Introductions:
Leslie Fox & Christine Pilgrim, OSEP Performance Accountability Implementation Team

Presentation Objectives:
- Explanation of OSEP's Phase III review process
- Overview of OSEP's Phase III analysis
- Implications for Phase III, Year 2
- Overview of changes to Information Collection Language for B-17/C-11

Leslie: Review process and what states can expect in the upcoming weeks; initial impressions from the Phase III analysis, a process modified in response to the DMS focus group reports and state feedback from last year; implications for engagement and support in Year 2; and upcoming TA guidance that will be available this fall.
SSIP Phase III Submission Included
- FFY 2015 data for the State-identified Measurable Result (SiMR)
- Progress implementing the SSIP
- Progress toward the SiMR
- Any modifications to the SSIP and a rationale or justification

Leslie: Data are reported for FFY 2015, but the narrative covers activities that took place last year, once the Phase II plan was submitted in April 2016.
OSEP Phase III Analysis Process
- Performance Accountability Implementation Team: 1st tier
- Research to Practice staff: secondary, 1st tier
- Leslie Fox and Judy Gregorian: 2nd tier
- Use of a structured review tool with 28 items covering:
  - Progress implementing the SSIP
  - Progress toward the SiMR
  - Data collection and analysis
  - Data quality

Leslie: PAIT members each reviewed SSIPs; RTP staff reviewed states that received intensive designations for the SSIP last fall. Similar to last year, a structured review tool was used to support consistency in the scope of content for the reviews and to identify big-picture trends and issues that may require focused technical assistance and support. Development of the review tool began in 2016 and was coordinated with CIPP staff, thanks to Pat Gonzalez and Tom Fiore. The subgroup that consulted on the review tool included MSIP and RTP staff, and the review elements were shared with states during a national TA call in February.
What is Progress Implementing the SSIP?
- Did the State meet expected milestones?
- Did activities result in intended outputs?
- Did the implemented strategies and activities align with the State's theory of action and, if included, logic model?
- Did the State meet established short-term outcomes?
- What infrastructure changes occurred that support the SSIP?
- How were stakeholders informed of, and included in, the implementation of SSIP activities?
- What data did the State collect and analyze to evaluate progress?

Leslie: Our review serves three purposes: (1) meet our expectation to review the indicator for the APR; (2) have a process that yields data points to make designations or identify levels of engagement for states as part of OSEP's differentiated monitoring and support process; and (3) recognize issues or needs reported by states as they implement and evaluate their state plans, and mobilize TA resources through OSEP-funded centers or provided directly by MSIP staff. Gregg Corr presented some of our preliminary data yesterday on milestones, activities, and outputs. We were very generous in our coding.
What is Progress toward the SiMR?
- Did the State implement evidence-based practices with fidelity?
- Are implemented evidence-based practices intended to impact:
  - System-level procedures or processes?
  - Teacher- or provider-level practices?
  - Family behaviors or practices?
  - Child or student outcomes or skills?
- What data did the State collect and analyze to evaluate progress and fidelity?

Leslie: States were less prepared in Year 1 to demonstrate that selected EBPs were being implemented and, if so, how fidelity was evaluated. Remember Gregg's presentation yesterday: 25 of 60 Part B programs and 20 of 56 Part C programs reported that the evidence-based practices included in the SSIP are being implemented with fidelity. I'll now turn it over to Christine to discuss some of OSEP's impressions as we move from reviews to conversations with states.
Year 1 Infrastructure Improvement Strategies
- Part B accomplishments
- Part C accomplishments
- Data sources and evaluation measures
- Challenges or barriers that States reported

Christine: Gregg highlighted yesterday some of the positive trends we recorded during our reviews. It's important to note that states that used the SSIP report organizational outline, or followed the prompts in GRADS on the indicator page, were more likely to provide the information expected for the first year. Although 44 of 60 Part B programs and 40 of 56 Part C programs reported meeting the planned implementation milestones, our reviewers often erred on the side of the state when deciding whether an accomplishment was a "milestone." States that referenced their theory of action or a logic model were more likely to be credited for this review element than states that filled in spaces on a work plan or provided a list of activities or accomplishments without referencing the TOA. NCSI and ECTA are completing an analysis of the SSIPs and will prepare a report detailing the accomplishments reported for Year 1. Anecdotally, Part B accomplishments included selection of implementation schools and districts, alignment of the SSIP with SPDG or ESSA state plans, and formation of implementation teams. Part C accomplishments included revising vendor agreements, aligning professional development activities with quality standards and licensure/credentialing, and securing funds for data systems. Strengths of the SSIPs included using performance measures or indicators as criteria to evaluate progress toward infrastructure improvements and explaining how each improvement was necessary for achieving the SiMR, scaling up across the state, or sustaining the work over time.

Barriers: Some barriers reported by states applied to both Part B and Part C programs, including changes in leadership, staff turnover, and data systems that limited the collection and use of data to inform decision-making. Part B programs specifically emphasized initiative overload, multiple or competing priorities, and changes in statewide assessments (and missing assessment data) as challenges and barriers to SSIP implementation and evaluation. Part C programs overwhelmingly identified staff time, the limited number of staff at the state level, and prolonged staff vacancies, in addition to general challenges with staff and provider turnover, as challenges for effective implementation and evaluation of SSIP activities. A close second for Part C was fiscal and budget challenges.
Year 1 Evidence-Based Practices
- Part B accomplishments
- Part C accomplishments
- Data sources and evaluation measures
- Challenges or barriers that States reported

Christine: Both Part B and Part C SSIPs included professional development activities and training on selected evidence-based practices, including some data on post-training knowledge assessment. Many Part B states that had not identified EBPs in the Phase II SSIP reported which practices would be supported throughout Phase III. Both Part B and Part C SSIPs that include coaching as a strategy counted hiring and training coaches as an accomplishment for Year 1. Data sources and evaluation measures for Part B specific to evidence-based practices included teacher checklists, observations by coaches, coaching logs, and some post-training knowledge assessments. Part C data sources included participant information for training activities (e.g., number of providers attending, discipline) and program administrator review of IFSPs for routines-based language.

Challenges and barriers: Few Part B and Part C programs were able to communicate how the state would collect and analyze EBP-use and fidelity data when "local control" was a concern or when programs/districts were given a menu of EBPs to choose from and support at the local level. Financing professional development activities was a common challenge, as was some resistance by teachers and providers to receiving feedback from coaches.
Phase III - Year 2: The Honeymoon Is Over
- Changes that States made to their SSIPs in Phase III, Year 1
- Expected challenges and technical assistance needs for Year 2
- Updated Information Collection Language for B-17/C-11

Christine: As Gregg highlighted yesterday, stakeholder engagement was a key component of the Phase III work as states made revisions and changes to the SSIP plan. Some changes states made to their SSIPs included revisions to the baseline and targets for the SiMR, but most changes reflected modified timelines for implementation and the combining of some Phase III coherent improvement strategies. TA needs identified by Part B and Part C programs for Phase III, Year 2 included support with professional development activities and evaluating knowledge acquisition and use; supporting teacher and provider use of data with children and families; provision of effective coaching and how to evaluate coaching; and assistance developing fidelity measures for selected EBPs and feasibly collecting and analyzing fidelity data. Part B SSIPs frequently included requests for TA to develop progress monitoring tools and identify progress monitoring data sources, plan for scale-up, and develop and implement communication tools and plans for stakeholder engagement. As we move forward with Year 2, OSEP will partner with TA centers and providers to coordinate resources and support states around these and other concerns. We'll move now to the proposed changes to the measurement language for the indicator for the remaining three years of Phase III.
Data Analysis
- FFY 2016 data for the SiMR and progress toward target
- Any additional data (e.g., progress monitoring) the State collected to suggest progress toward the SiMR
- Description of the data sources, collection, and analysis procedures

In April 2018, states submit the Phase III, Year 2 report. It should include FFY 2016 data for the SiMR and a statement on progress toward the SiMR. We are encouraging states to report any additional data that would suggest progress toward the SiMR. Consider that the FFY begins October 1st, and the data the State collects during the upcoming year is what will be reported in the final SSIP submission. The SiMR data is collected in FFY 2018, but there will be two more years of SSIP implementation and evaluation activities. On the description: we are hoping that, as the SSIP reports are submitted each year, the work conducted by the State is robust but the report is succinct. Fewer changes to the plan, less descriptive or background information, and more "this is what we did, this is why it was important, and this is the data collected specific to that activity." Part B "progress monitoring" may look very different from Part C, where it could be a focus on provider practices or family outcomes as an intermediate measure.
Infrastructure Improvement Strategies
- Narrative or graphic representation that describes the critical activities implemented and their relationship to the theory of action
- Summarize infrastructure improvement activities
- Relate short-term outcomes to one or more areas of a system framework
- Explain how strategies and outcomes support system change and are necessary for:
  - Achievement of the SiMR
  - Scale-up
  - Sustainability

What are the linchpin activities? Some of the recent Phase III submissions included a lot of rich but unnecessary information. Even when the organizational outline was used, there were redundancies, and it was sometimes unclear what was an "accomplishment" or important activity versus what was interesting or necessary for the state but not critical for the implementation and evaluation of the SSIP. Robust but succinct!
Evidence-Based Practices
- Summarize the specific evidence-based practices that were implemented
- Describe how the State ensured fidelity of use
- Describe how the evidence-based practices are intended to impact the SiMR

General supervision, review of implementation plans, data manager, and review of data entered into the system; continuum of impact: program, teacher/provider, family, and child/student.
Stakeholder Engagement
- Specific strategies implemented to engage stakeholders in key improvement efforts
- How the State addressed concerns, if any, raised by stakeholders through engagement activities

Stakeholder engagement remains critical, both to continue implementation as planned and to justify any modifications.
OSEP Resources
- Organizational Outline
- Phase III Q&A Document
- Phase III Guidance Tool
Save the Date
April 2, 2018: Phase III, Year 2 SSIP due to OSEP
State Systemic Improvement Plans
Questions?
The Performance Accountability Team
Leslie Clithero
Marion Crayton
Leslie Fox, Facilitator
Judy Gregorian, ADD
Kathy Heck
Curtis Kinnard
Reha Mallory
Genee Norbert
Christine Pilgrim
Janine Rudder
Alecia Walters

With special guest stars: Jennifer Coffey and Julia Martin Eile