Linking SSIP Implementation and Evaluation: Learning from States


Linking SSIP Implementation and Evaluation: Learning from States
Anne Lucas, ECTA/DaSy; Abby Schachner, ECTA/DaSy; Christy Cronheim, Part C Idaho; Stacy Kong, Part C Hawaii; Christy Scott, Part C Colorado; Amanda Sutton, Part C Colorado

Intended Outcomes
Participants will gain an understanding of:
Strategies several states used to align their implementation activities with their evaluation, ensuring they can collect the evaluation data needed to reflect implementation progress and improvement toward achieving intended outcomes.
How several states developed their plans or adjusted their implementation activities and evaluation plans to keep their SSIP feasible and manageable, yet meaningful.

Session Outline
Importance of linking implementation with evaluation / having a feasible plan
State stories: Idaho, Hawaii, Colorado
Questions and discussion

Importance of Linking Implementation with Evaluation / Having a Feasible, Meaningful Plan
“There is no substitute for knowledge.” – W. Edwards Deming
How will you know that a change is an improvement without evaluation? Meaningful evaluation is essential to improving the quality and success of implementation. Ongoing evaluation is critical to carrying out improvement cycles in Active Implementation and Implementation Science; evaluation and improvement cycles support the purposeful process of change. To inform implementation and mid-course corrections, the evaluation data (outputs, outcomes, performance indicators) need to be closely aligned with what is being done and the intended impacts. Feasibility is key: an overly ambitious evaluation plan that cannot be carried out cannot inform implementation.

Strategies for Linking Implementation and Evaluation / Having a Feasible, Meaningful Plan
Align the evaluation plan with the theory of action and logic model.
Review intended outcomes and ensure that they remain related to, and logically follow from, the improvement strategies and related improvement activities.
Focus the evaluation on the most important outcomes – those that tell you whether you are achieving the intended impacts and making progress toward the SiMR.
Review performance criteria and indicators and make adjustments as needed based on implementation.

Strategies for Having a Feasible, Meaningful Plan
Before taking on a new data collection effort – which may involve creating a tool and overseeing its implementation to ensure data are of high quality – consider whether data or information you already collect could be used or repurposed to answer your evaluation questions. Examples:
Combining or linking two different data sources/data sets.
Recoding data into categories or creating cut points.
Entering paper forms/records (such as training attendance) into electronic format to link with other data or summarize.
Make adjustments in data collection strategies/sources as needed to better measure intended outcomes.
Use evaluation data on an ongoing basis to support and guide improvement strategies and implementation processes.
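As a minimal sketch of the repurposing strategies above (all field names, cut points, and values here are hypothetical, not from any state's actual data system), recoding existing attendance counts into categories and linking them with another existing data set might look like:

```python
from bisect import bisect_right

# Hypothetical cut points: recode raw training-attendance counts into
# participation categories instead of fielding a new survey.
CUT_POINTS = [1, 3, 6]                        # category boundaries
LABELS = ["none", "low", "moderate", "high"]  # one label per interval

def recode(attendance_count):
    """Map a raw count onto a category using the cut points."""
    return LABELS[bisect_right(CUT_POINTS, attendance_count)]

# Link two existing sources on a shared (hypothetical) provider ID
# rather than creating a new collection instrument.
attendance = {"P01": 0, "P02": 2, "P03": 7}     # e.g., from paper sign-in sheets
ratings = {"P01": 4.5, "P02": 6.0, "P03": 6.5}  # e.g., from an outcomes data system

merged = {
    pid: {"participation": recode(n), "eco_rating": ratings.get(pid)}
    for pid, n in attendance.items()
}
print(merged["P03"])  # {'participation': 'high', 'eco_rating': 6.5}
```

The same idea scales to real systems: the linkage key, the cut points, and the category labels are the evaluation-design decisions; the mechanics of recoding and merging are routine.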

Idaho

SSIP Process

Phase III Capacity Challenges
Too many short-term and intermediate outcomes (32 total)
Activities were developed in a linear fashion, creating duplication
Resources are too tight to complete new improvement activities
Operational changes can create significant disruptions to already large caseloads
Working on SSIP strands simultaneously was not effective in making sustained progress toward the overall plan

Actions to Address Challenges
Focused on activities with the highest impact on the SiMR
Reduced duplication across strategies where possible
Revised implementation activities
Revised the evaluation plan to create efficiencies where possible, reducing from 32 to 17 short-term and intermediate outcomes

Revised Plan
Stayed focused on the same three strands of action
Manageable activities that build upon one another and remain very meaningful
To ensure our revised plan was feasible, we decided to embed SSIP activities into our current system rather than creating new systems
More focused implementation activities that clearly align with identified outputs
Ensured the Theory of Action, Logic Models, and Evaluation Plan were all aligned and linked with the anticipated outcomes, outputs, and the state-identified measurable result (SiMR)

Feasibility of Embedding ECOs into the IFSP
Based on data from the Exploration Team Self-Assessment and the Exploration Team Resource survey, Idaho decided not to modify the IFSP and other relevant forms to incorporate the ECO process at this time.
Idaho will continue its work to improve the ECO process through training and the development of new staff and family resources.
Embedding the ECOs into the IFSP will occur at a later point, once the ECO foundational work is fully implemented.

ECO Process Strand
Phase II Improvement Strategy: Modify the early childhood outcomes process for ITP staff, contractors, and families
Phase III Improvement Strategy: Strengthen the early childhood outcomes process for ITP staff, contractors, and families through training for staff and contractors and the development of additional resources for staff and families

ECO Process Activities
Phase II Activities:
Develop training to address typical child development, family engagement, the purpose of ECOs, and use of appropriate ECO assessment tools
Modify the IFSP to embed ECOs
Phase III Activities:
Deliver or make available ECO training for staff and contractors so they may better understand how to complete ECO ratings
Explore embedding ECOs into the IFSP
Develop or adopt a list of standard social emotional tools
Deliver or make available training to enhance staff and contractors' understanding and use of social emotional information to determine the social emotional ECO rating

ECO Process Outcomes
Phase II Outcomes:
Staff/contractors have increased understanding of the ECO process
Staff/contractors have increased understanding of typical child development
Families have increased involvement in the ECO process and measurement, and in the IFSP development process
IFSPs include ECOs and strategies related to SE development
Phase III Outcomes:
Staff and contractors are proficient in the ECO process, including determining ECO ratings
Staff and contractors are proficient in the use of appropriate SE tools
Staff and contractors are confident in their knowledge of typical/atypical social emotional development
Families have an awareness and understanding of the ECOs
Families are involved in the ECO process, including determining ECO ratings

Building on Implementation Activities ECO training, understanding SE development and assessment tools ECO fidelity checks are developed and implemented Social emotional proficiency check developed and implemented SE competencies developed and training to embed into EI EBP Embed SE competencies into EI EBP

Hawaii

Developing a Feasible, Achievable and Meaningful Plan
Initially, Hawaii's Action Plan was very comprehensive and included multiple activities and outcomes: overwhelming, unmanageable, unrealistic.
Hawaii developed a Logic Model to streamline activities and outcomes: focused, achievable.

Revising the Evaluation Plan
Phase II Evaluation Plan: confusing – multiple evaluation questions per outcome, multiple performance indicators, extensive measurement/data collection
Revised Evaluation Plan in Phase III: narrowed scope of performance measures, incorporated data that were already being collected, aligned with the activities

Connections Between Activities
Hawaii Social Emotional (SE) Competencies
SE Self-Assessment
Training
Individual Training Plans
Evaluation

Evaluation of Intended Outcomes Linked to SE Competencies and Related Activities
SE Self-Assessment for Evaluation Purposes:
Short-term Intended Outcome: EI providers will understand how to support SE development for children ages 0-3.
Performance Indicator: 75% of providers who participated in the training will demonstrate overall step movement toward Level III: Triadic Relationships on the SE Competency Self-Assessment.
Intermediate Intended Outcome: EI providers will implement EBPs related to SE development using the PSP Approach to Teaming and Coaching Model in Natural Learning Environments with fidelity.
Performance Indicator: 75% of providers who participated in the trainings will demonstrate at least a one-step movement for each competency toward "actively supports caregivers" on the Hawaii SE Competencies Coaching Log Review.
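A performance indicator like Hawaii's 75% criterion reduces to simple arithmetic once pre- and post-training self-assessment levels are in hand. The sketch below uses entirely hypothetical provider IDs and levels to show the calculation, not Hawaii's actual data:

```python
# Hypothetical SE Competency Self-Assessment levels per provider,
# recorded before and after training.
pre_levels = {"A": 1, "B": 2, "C": 2, "D": 1}
post_levels = {"A": 2, "B": 3, "C": 2, "D": 3}

# Count providers showing at least a one-step movement.
moved = sum(1 for p in pre_levels if post_levels[p] - pre_levels[p] >= 1)
share = moved / len(pre_levels)

print(f"{share:.0%} of providers moved at least one step")  # 75% here
indicator_met = share >= 0.75  # the performance criterion from the slide
```

The design choice worth noting is that the criterion measures *movement* (change between two administrations of the same tool), not absolute level, so the same computation works regardless of where providers start.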

Social Emotional Competency Self-Assessment

Phase III Outputs Achieved Related to These Activities
SE Competencies
SE Competencies incorporated into trainings
Gather and review EI EOS online training modules and identify areas that need to be enhanced

Key Accomplishments in Phase III Implementation Year
Added a few steps to improvement activities to address identified gaps:
SE Verification Worksheet to ensure SE Competencies are incorporated into trainings
Develop or revise training modules on SE Competencies and EBPs
Use SE Competency and EBP training modules as a resource in the training plan for new providers and to sustain current providers' implementation of EBPs related to SE competencies
Develop and annually update a training plan to train new staff and sustain current providers' implementation of EBPs related to SE competencies; training content to be based on needs identified from the annual administration of the SE Competencies Self-Assessment tool and from coaching logs
Revised evaluation plan: streamlined to ensure data would best reflect the evaluation questions and intended outcomes
Collected baseline data: SE Self-Assessment, staffing, timely services

Colorado

Colorado
Implementation activities on track:
First two cohorts of local EI programs (CCBs) are fully implemented
Third cohort beginning training for planned implementation by June 30, 2018
Lessons learned from initial cohort implementation:
Adjustments to training
Incorporation of the evaluation tool into training
Begin engagement of the next two cohorts (particularly Cohort 3) earlier
Being on target with the timeline allows for concentration on evaluation activities

Colorado Plan -> On Track

Tool Development: Quality IFSP Outcomes and Integration of Child Outcomes into the IFSP
Review of pre-existing quality tools: Ohio's Outcome Assessment Tool (OAT) and Kansas's Quality Indicator Rubric (QIR)
Customization of the Kansas QIR and development of a web-based tool
Stakeholder Engagement and Feedback: Review -> Feedback -> Edits -> Testing -> Feedback -> Edits -> Feedback -> Training -> Baseline
Sampling Procedure and Methodology:
The data analyst will extract a random sample of IFSPs from all CCBs in Cohort 1 (5 programs)
Self-Assessment: representative sample of initial IFSPs stratified by CCB, and subsequently representative of Service Coordinators at each CCB, over 1 year at the 95% confidence level with a 5% margin of error
Verification: representative sample of CCB self-assessments stratified by CCB over 1 year at the 95% confidence level with a 5% margin of error
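The sampling parameters above (95% confidence, 5% margin of error, stratified by CCB) imply a per-stratum sample size. A hedged sketch of that calculation follows, using Cochran's formula for a proportion with a finite population correction; the IFSP counts per CCB are hypothetical placeholders, not Colorado's actual figures:

```python
import math

def sample_size(population, confidence_z=1.96, margin=0.05, p=0.5):
    """Sample size for estimating a proportion at the given confidence
    level and margin of error, with finite population correction."""
    # Cochran's formula for an infinite population (p=0.5 is the
    # conservative, maximum-variance assumption).
    n0 = (confidence_z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction for small strata.
    return math.ceil(n0 / (1 + (n0 - 1) / population))

# Hypothetical annual IFSP counts for the five Cohort 1 CCBs; the
# stratified design draws a representative sample from each.
ifsp_counts = {"CCB1": 300, "CCB2": 150, "CCB3": 500, "CCB4": 90, "CCB5": 220}
per_ccb = {ccb: sample_size(n) for ccb, n in ifsp_counts.items()}
print(per_ccb)
```

The finite population correction matters here: without it, every stratum would need roughly 385 records, more IFSPs than a small CCB produces in a year; with it, the required sample shrinks substantially for small programs.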

Quality IFSP and Outcomes (QIO) Assessment Tool
Quality Measures: Family Assessment, Child Outcomes, IFSP Outcomes
Self-Assessment: local programs (CCBs) conduct self-assessments of randomly chosen IFSPs monthly; builds local capacity to monitor quality for individualized outcomes; reasonable workload for CCBs
Verification: state EI staff verify a sample of self-assessments monthly; monitors inter-rater reliability; determines effectiveness of implementation and improvement strategies

Quality IFSP and Outcomes (QIO) Assessment Tool: Data Collection Procedures
Who? The sample of IFSPs to be self-assessed is sent to a lead at each CCB. Individual leads structure how the IFSPs are distributed among qualified assessors – anyone overseeing Service Coordinators.
How? A randomized sample of IFSPs is compiled by the data analyst and then distributed to identified leads at each CCB. Assessors use the web-based QIO to complete assessments of the IFSPs listed on the document.
How often? Self-assessment and verification occur on a monthly basis.
How many? The number of IFSPs reviewed varies depending on CCB size and is equitable due to parallel variance in supervisory capacity at each program. Approximately 80 self-assessments and 100 verifications are completed per month.

Quality IFSP Outcomes (QIO) Assessment Tool

Data Collection and Analysis: Preliminary Findings and Mid-Course Corrections
Data suggest we are in the early stages of making a difference in QUALITY
Data suggest the QIO assessment tool demonstrates evidence of implementation
Mid-Course Corrections:
Baseline data prompted adjustments to implementation training
Introduction of the Quality Outcome Tool earlier in the implementation process

Data Collection and Analysis: Next Steps
SiMR: All infants and toddlers who receive early intervention services in Colorado will demonstrate increased growth in the use of appropriate behaviors to meet their needs
Comparative analysis and hypothesis testing:
Short term: high-quality IFSP outcomes = improvement in the number of children who reach their IFSP outcomes (initial IFSP quality vs. annual IFSP quality)
Long term: high-quality IFSP outcomes = improvement in the percentage of children who show improvement in their use of behaviors to get their needs met

Questions and Discussion What strategies has your state used to ensure your SSIP implementation and evaluation activities are feasible and well aligned? If you are challenged with having a manageable and feasible plan and/or have encountered evaluation measures that do not clearly evaluate implementation progress or impact of implementation, what strategies might you use to address these issues?

Thank you The contents of this tool and guidance were developed under grants from the U.S. Department of Education, #H326P120002 and #H373Z120002. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. Project Officers: Meredith Miceli, Richelle Davis, and Julia Martin Eile.