Linking SSIP Implementation and Evaluation: Learning from States
Anne Lucas, ECTA/DaSy
Abby Schachner, ECTA/DaSy
Christy Cronheim, Part C Idaho
Stacy Kong, Part C Hawaii
Christy Scott, Part C Colorado
Amanda Sutton, Part C Colorado
Intended Outcomes
Participants will gain an understanding of:
- Strategies several states used to align their implementation activities with their evaluation, so that the evaluation data they collect reflect implementation progress and improvement toward achieving intended outcomes
- How several states developed their plans, or adjusted their implementation activities and evaluation plans, to ensure their SSIP was feasible and manageable yet meaningful
Session Outline
- Importance of linking implementation with evaluation and having a feasible plan
- State stories: Idaho, Hawaii, Colorado
- Questions and discussion
Importance of Linking Implementation with Evaluation/Having a Feasible, Meaningful Plan
“There is no substitute for knowledge.” – W. Edwards Deming
How will you know that a change is an improvement without evaluation?
- Meaningful evaluation is essential to improving the quality and success of implementation.
- Ongoing evaluation is critical to carrying out improvement cycles in Active Implementation and Implementation Science; evaluation and improvement cycles support the purposeful process of change.
- To inform implementation and mid-course corrections, the evaluation data (outputs, outcomes, performance indicators) need to be closely aligned with what is being done and the intended impacts.
- Feasibility is key: an overly ambitious evaluation plan that cannot be carried out cannot inform implementation.
Strategies for Linking Implementation and Evaluation/Having a Feasible, Meaningful Plan
- Align the evaluation plan with the theory of action and logic model.
- Review intended outcomes and ensure that they remain related to, and logically follow from, the improvement strategies and related improvement activities.
- Focus the evaluation on the most important outcomes: those that tell you whether you are achieving the intended impacts and whether progress toward the SiMR is being made.
- Review performance criteria and indicators and make adjustments as needed based on implementation.
Strategies for Having a Feasible, Meaningful Plan
- Before taking on a new data collection effort (which may involve creating a tool and overseeing its use to ensure data are of high quality), consider whether data or information you already collect could be used or repurposed to answer your evaluation questions. Examples (see the sketch below):
  - Combining or linking two different data sources/data sets
  - Recoding data into categories or creating cut points
  - Entering paper forms/records (such as training attendance) into electronic format to link with other data or summarize
- Make adjustments in data collection strategies/sources as needed to better measure intended outcomes.
- Use evaluation data on an ongoing basis to support and guide improvement strategies and implementation processes.
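A minimal sketch of the repurposing ideas above: link two hypothetical existing data sets, then recode a count into categories with cut points. The provider IDs, columns, and rating scale are invented for illustration only, not taken from any state's system.

```python
import pandas as pd

# Existing data set 1 (hypothetical): training attendance entered from
# paper sign-in sheets.
attendance = pd.DataFrame({
    "provider_id": [101, 102, 103],
    "sessions_attended": [0, 2, 5],
})

# Existing data set 2 (hypothetical): ECO ratings already collected in the
# state data system, on an assumed 1-7 scale.
outcomes = pd.DataFrame({
    "provider_id": [101, 102, 103, 104],
    "eco_rating": [4, 5, 6, 5],
})

# Link the two sources on a shared key instead of launching a new collection.
linked = outcomes.merge(attendance, on="provider_id", how="left")

# Recode existing data into categories by creating cut points.
linked["attendance_level"] = pd.cut(
    linked["sessions_attended"].fillna(0),
    bins=[-1, 0, 2, 100],
    labels=["none", "some", "full"],
)

# Summarize: mean ECO rating by attendance category.
print(linked.groupby("attendance_level", observed=True)["eco_rating"].mean())
```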
Idaho
SSIP Process
Phase III Capacity Challenges
- Too many short-term and intermediate outcomes (32 total)
- Activities were developed in a linear fashion, creating duplication
- Resources are tight for completing new improvement activities
- Operational changes can create significant disruptions to already large caseloads
- Working on SSIP strands simultaneously was not effective in making sustained progress toward the overall plan
Actions to Address Challenges
- Focused on activities with the highest impact on the SiMR
- Reduced duplication across strategies where possible
- Revised implementation activities
- Revised the evaluation plan to create efficiencies where possible, going from 32 to 17 short-term and intermediate outcomes
Revised Plan
- Stayed focused on the same three strands of action
- Manageable activities that build upon one another and are still very meaningful
- To ensure our revised plan was feasible, we decided to embed SSIP activities into our current system rather than creating new systems
- More focused implementation activities that clearly align with identified outputs
- Ensured the theory of action, logic models, and evaluation plan were all aligned and linked with the anticipated outcomes, outputs, and the state-identified measurable result (SiMR)
Feasibility of Embedding ECOs into the IFSP
- Based on data from the Exploration Team Self-Assessment and the Exploration Team Resource Survey, Idaho decided not to modify the IFSP and other relevant forms to incorporate the ECO process at this time.
- Idaho will continue its work to improve the ECO process through training and the development of new staff and family resources.
- Embedding the ECOs into the IFSP will occur at a later point, once the ECO foundational work is fully implemented.
ECO Process Strand
Phase II improvement strategy: Modify the early childhood outcomes process for ITP staff, contractors, and families.
Phase III improvement strategy: Strengthen the early childhood outcomes process for ITP staff, contractors, and families through training for staff and contractors and the development of additional resources for staff and families.
ECO Process Activities
Phase II activities:
- Develop training to address typical child development, family engagement, the purpose of ECOs, and use of appropriate ECO assessment tools
- Modify the IFSP to embed ECOs
Phase III activities:
- Deliver or make available ECO training for staff and contractors so they may better understand how to complete ECO ratings
- Explore embedding ECOs into the IFSP
- Develop or adopt a list of standard social emotional tools
- Deliver or make available training to enhance staff and contractors' understanding and use of social emotional information to determine the social emotional ECO rating
ECO Process Outcomes
Phase II outcomes:
- Staff/contractors have increased understanding of the ECO process
- Staff/contractors have increased understanding of typical child development
- Families have increased involvement in the ECO process and measurement, and in the IFSP development process
- IFSPs include ECOs and strategies related to SE development
Phase III outcomes:
- Staff and contractors are proficient in the ECO process, including determining the ECO ratings
- Staff and contractors are proficient in the use of appropriate SE tools
- Staff and contractors are confident in their knowledge of typical/atypical social emotional development
- Families have an awareness and understanding of the ECOs
- Families are involved in the ECO process, including determining ECO ratings
Building on Implementation Activities
- ECO training, understanding SE development and assessment tools
- ECO fidelity checks are developed and implemented
- Social emotional proficiency check developed and implemented
- SE competencies developed and training to embed into EI EBP
- Embed SE competencies into EI EBP
Hawaii
Developing a Feasible, Achievable, and Meaningful Plan
Initially, Hawaii's action plan was very comprehensive and included multiple activities and outcomes; it was:
- overwhelming
- unmanageable
- unrealistic
Hawaii developed a logic model to streamline activities and outcomes, making the plan:
- focused
- achievable
Revising the Evaluation Plan
Phase II evaluation plan was confusing:
- Multiple evaluation questions per outcome
- Multiple performance indicators
- Extensive measurement/data collection
Revised evaluation plan in Phase III:
- Narrowed the scope of performance measures
- Incorporated data that were already being collected
- Aligned with the activities
Connections Between Activities
Hawaii Social Emotional (SE) Competencies → SE Self-Assessment → Training → Individual Training Plans → Evaluation
Evaluation of Intended Outcomes Linked to SE Competencies and Related Activities
SE Self-Assessment for evaluation purposes:
- Short-term intended outcome: EI providers will understand how to support SE development for children ages 0-3.
  - Performance indicator: 75% of providers who participated in the training will demonstrate overall step movement toward Level III (Triadic Relationships) on the SE Competency Self-Assessment (see the sketch below).
- Intermediate intended outcome: EI providers will implement EBPs related to SE development, using the PSP Approach to Teaming and Coaching Model in Natural Learning Environments, with fidelity.
  - Performance indicator: 75% of providers who participated in the trainings demonstrate at least a one-step movement for each competency toward “actively supports caregivers” on the Hawaii SE Competencies Coaching Log Review.
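A minimal sketch of how the first performance indicator might be computed, assuming pre/post self-assessment levels coded 1-3 (3 = Level III: Triadic Relationships). The provider data, and the choice to count providers already at Level III as meeting the indicator, are assumptions for illustration, not Hawaii's actual rules.

```python
import pandas as pd

# Hypothetical pre/post SE Competency Self-Assessment levels per provider,
# coded 1-3 (3 = Level III: Triadic Relationships).
df = pd.DataFrame({
    "provider": ["A", "B", "C", "D"],
    "pre_level": [1, 2, 1, 3],
    "post_level": [2, 3, 1, 3],
})

# "Step movement toward Level III" = the post level exceeds the pre level;
# providers already at Level III count as meeting the indicator (an
# assumption made for this sketch).
df["moved"] = (df["post_level"] > df["pre_level"]) | (df["pre_level"] == 3)

pct_moved = df["moved"].mean() * 100
print(f"{pct_moved:.0f}% showed step movement; indicator met: {pct_moved >= 75}")
```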
Social Emotional Competency Self-Assessment
Phase III Outputs Achieved Related to These Activities
- SE Competencies
- SE Competencies incorporated into trainings
- Gather and review EI EOS online training modules and identify areas that need to be enhanced
Key Accomplishments in Phase III
Implementation year:
- Added a few steps in improvement activities to address identified gaps:
  - SE Verification Worksheet to ensure SE Competencies are incorporated into trainings
  - Develop or revise training modules on SE Competencies and EBPs
  - Use SE Competency and EBP training modules as a resource in the training plan for new providers and to sustain current providers' implementation of EBPs related to SE competencies
  - Develop and annually update a training plan to train new staff and sustain current providers' implementation of EBPs related to SE competencies; training content to be based on needs identified from the annual administration of the SE Competencies Self-Assessment tool and from coaching logs
- Revised the evaluation plan:
  - Streamlined to ensure data would best reflect the evaluation questions and intended outcomes
- Collected baseline data:
  - SE Self-Assessment
  - Staffing
  - Timely services
Colorado
Colorado: Implementation Activities on Track
- First two cohorts of local EI programs (CCBs) are fully implemented
- Third cohort is beginning training for planned implementation by June 30, 2018
- Lessons learned from initial cohort implementation:
  - Adjustments to training
  - Incorporation of the evaluation tool into training
  - Begin engagement of the next two cohorts (particularly Cohort 3) earlier
- Being on target with the timeline allows for concentration on evaluation activities
Colorado Plan -> On Track
Tool Development: Quality IFSP Outcomes and Integration of Child Outcomes into the IFSP
- Review of pre-existing quality tools: Ohio's Outcome Assessment Tool (OAT) and Kansas's Quality Indicator Rubric (QIR)
- Customization of the Kansas QIR and development of a web-based tool
- Stakeholder engagement and feedback: Review -> Feedback -> Edits -> Testing -> Feedback -> Edits -> Feedback -> Training -> Baseline
- Sampling procedure and methodology (see the sketch below):
  - Data analyst will extract a random sample of IFSPs from all CCBs in Cohort 1 (5 programs)
  - Self-assessment: representative sample of initial IFSPs stratified by CCB (subsequently representative of service coordinators at each CCB) over 1 year, at the 95% confidence level with a 5% margin of error
  - Verification: representative sample of CCB self-assessments stratified by CCB over 1 year, at the 95% confidence level with a 5% margin of error
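A hedged sketch of the arithmetic behind "95% confidence level, 5% margin of error": the standard sample-size formula for a proportion, with a finite population correction, applied within each CCB stratum. The per-CCB IFSP counts are invented, and sizing the sample per stratum (rather than overall) is an assumption of this sketch.

```python
import math

def sample_size(population: int, z: float = 1.96,
                margin: float = 0.05, p: float = 0.5) -> int:
    """Sample size for estimating a proportion, with finite population correction."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size (~385)
    n = n0 / (1 + (n0 - 1) / population)        # correct for a finite pool
    return math.ceil(n)

# Hypothetical annual IFSP counts for the five Cohort 1 CCBs.
ifsps_per_ccb = {"CCB1": 220, "CCB2": 340, "CCB3": 150, "CCB4": 500, "CCB5": 90}

# Stratified design: required sample within each CCB stratum.
for ccb, n_ifsps in ifsps_per_ccb.items():
    print(ccb, sample_size(n_ifsps))
```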
Quality IFSP and Outcomes (QIO) Assessment Tool
Quality measures: family assessment, child outcomes, IFSP outcomes
Self-assessment:
- Local program (CCB) conducts self-assessments of randomly chosen IFSPs monthly
- Builds local capacity to monitor quality for individualized outcomes
- Reasonable workload for CCBs
Verification:
- State EI staff verify a sample of self-assessments monthly
- Monitor inter-rater reliability (see the sketch below)
- Determine effectiveness of implementation and improvement strategies
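A minimal sketch of the inter-rater reliability check implied by the verification column: compare the CCB self-assessment rating with the state verification rating for the same sampled IFSP items. The ratings are invented, and Cohen's kappa is one common choice of statistic, not necessarily Colorado's.

```python
from sklearn.metrics import cohen_kappa_score

# Paired quality ratings for the same sampled IFSP items (hypothetical data).
ccb_self  = ["met", "met", "not met", "met", "not met", "met"]
state_ver = ["met", "not met", "not met", "met", "not met", "met"]

# Raw percent agreement, plus kappa (agreement corrected for chance).
agreement = sum(a == b for a, b in zip(ccb_self, state_ver)) / len(ccb_self)
kappa = cohen_kappa_score(ccb_self, state_ver)

print(f"percent agreement = {agreement:.0%}, Cohen's kappa = {kappa:.2f}")
```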
Quality IFSP and Outcomes (QIO) Assessment Tool: Data Collection Procedures
- Who? A randomized sample of IFSPs is compiled by the data analyst and distributed to identified leads at each CCB. Each lead structures how the IFSPs are distributed among qualified assessors (anyone overseeing service coordinators).
- How? Assessors use the web-based QIO tool to complete assessments of the IFSPs listed on the document.
- How often? Self-assessment and verification occur on a monthly basis.
- How many? The number of IFSPs reviewed varies with CCB size, and is equitable because supervisory capacity varies in parallel at each program; approximately 80 self-assessments and 100 verifications are completed per month. (A distribution sketch follows.)
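A hedged sketch of the monthly draw just described: the analyst compiles a randomized sample and groups it by CCB for each program's lead. The IFSP IDs, CCB names, and per-CCB quotas here are all hypothetical.

```python
import random
from collections import defaultdict

random.seed(42)  # reproducible draw, e.g., for audit purposes

# Hypothetical pool of (ifsp_id, ccb) pairs pulled from the state data system.
pool = [(f"IFSP-{i:04d}", random.choice(["Denver", "Larimer", "Pueblo"]))
        for i in range(500)]

# Per-CCB monthly quota, sized to supervisory capacity at each program.
quota = {"Denver": 8, "Larimer": 5, "Pueblo": 3}

# Group the pool by CCB, then draw each CCB's random sample for the month.
by_ccb = defaultdict(list)
for ifsp_id, ccb in pool:
    by_ccb[ccb].append(ifsp_id)

monthly_sample = {ccb: random.sample(ids, quota[ccb]) for ccb, ids in by_ccb.items()}
for ccb, ids in sorted(monthly_sample.items()):
    print(f"{ccb} lead receives {len(ids)} IFSPs, e.g., {ids[:2]}")
```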
Quality IFSP Outcomes (QIO) Assessment Tool
Data Collection and Analysis: Preliminary Findings and Mid-Course Corrections
- Data suggest we are in the early stages of making a difference in QUALITY
- Data suggest the QIO assessment tool demonstrates evidence of implementation
Mid-course corrections:
- Baseline data prompted an adjustment to implementation training
- Introduction of the Quality Outcome Tool earlier in the implementation process
Data Collection and Analysis: Next Steps
SiMR: All infants and toddlers who receive early intervention services in Colorado will demonstrate increased growth in the use of appropriate behaviors to meet their needs.
Comparative analysis and hypothesis testing (see the sketch below):
- Short term: high-quality IFSP outcomes = improvement in the number of children who reach their IFSP outcomes (initial IFSP quality vs. annual IFSP quality)
- Long term: high-quality IFSP outcomes = improvement in the percentage of children who show improvement in their use of behaviors to get their needs met
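A sketch of the short-term comparison above, assuming the QIO tool yields a numeric quality score per IFSP and that initial and annual IFSPs can be paired by child. The scores are invented, and a paired t-test is one reasonable choice of test, not necessarily Colorado's planned analysis.

```python
from scipy import stats

# Hypothetical paired QIO quality scores: the same children's initial vs.
# annual IFSPs (higher = higher-quality outcomes).
initial_quality = [2.1, 2.4, 1.8, 2.0, 2.6, 2.2, 1.9, 2.3]
annual_quality  = [2.5, 2.6, 2.1, 2.4, 2.7, 2.5, 2.2, 2.6]

# Paired t-test: did quality improve from the initial to the annual IFSP?
t_stat, p_value = stats.ttest_rel(annual_quality, initial_quality)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # small p -> quality likely improved
```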
Questions and Discussion
- What strategies has your state used to ensure your SSIP implementation and evaluation activities are feasible and well aligned?
- If you are challenged with having a manageable and feasible plan, and/or have encountered evaluation measures that do not clearly evaluate implementation progress or the impact of implementation, what strategies might you use to address these issues?
Thank You
The contents of this tool and guidance were developed under grants from the U.S. Department of Education, #H326P and #H373Z. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. Project Officers: Meredith Miceli, Richelle Davis, and Julia Martin Eile.