Leveraging Evaluation Data: Leading Data-Informed Discussions to Guide SSIP Decisionmaking


Leveraging Evaluation Data: Leading Data-Informed Discussions to Guide SSIP Decisionmaking
Welcome. Mission of IDC: we provide technical assistance to build capacity within states for collecting, reporting, analyzing, and using high-quality IDEA data. You have joined us today for a webinar on leveraging evaluation data: leading data-informed discussions to guide SSIP decisionmaking. Introduction of presenters. Housekeeping: phone lines have been muted. To ask questions or share comments, please use the Chat Box, To: All Participants.
January 17, 2018. Tamara Nimkoff, Kim Schroeder, Debbie Shaver

Intended outcomes
Participants will increase their understanding of
- The value of data discussions for assessing progress toward achieving intended outcomes and informing decisionmaking
- A structured process that groups can use to guide next steps in State Systemic Improvement Plan (SSIP) implementation
- IDC's Data Meeting Protocol and related resources available to support data analysis and use
In providing this webinar content today, it is our intention that we increase your understanding of… [read 3 bullets].

Agenda
- Assessing progress in SSIP Phase III, Year 2
- Leveraging your data for decisionmaking
- Overview of the IDC Data Meeting Protocol
- Examples of protocol use
- Resources for data-informed decisionmaking
We will do this as follows: First, I will set the stage by reviewing the need for states to assess progress during this Year 2 of Phase III of the SSIP and the value of leveraging your SSIP data for decisionmaking. Kim will then introduce you to a new tool that can be used to support data discussions, the IDC Data Meeting Protocol, and provide an overview of each step of the protocol. Debbie will then lead us through two example situations in which the protocol has been used within states, and she will provide information on resources to support this process.

Assessing progress in SSIP Phase III, Year 2
- Use evaluation results to assess progress implementing the SSIP
- Assess both short-term and intermediate outcomes to gauge progress toward the State-identified Measurable Result (SiMR)
- Make data-informed decisions about SSIP strategies and activities (see the State Performance Plan/Annual Performance Report [SPP/APR] Measurement Table)
So, let's get started. The Office of Special Education Programs has provided clear guidance on the expectation that states should assess the progress of their SSIP during this Year 2 of Phase III. The FY 2016 SPP/APR Measurement Table, for example, outlines that a state's SSIP report should indicate how the state has used evaluation results to assess progress implementing the SSIP. This includes evaluating progress toward the State-identified Measurable Result by assessing both short-term and intermediate outcomes that lead to the SiMR and, perhaps most importantly, using those evaluation data to make data-informed decisions about the ongoing implementation of SSIP strategies and activities.

Leveraging your data
The power of data-informed decisionmaking:
- Make decisions about resource allocation and target areas for program and service improvement
- Build awareness, interest, and skills for the routine use of data
- Ensure that data have value to the agency
- Support improved data quality
Using your data to guide your SSIP decisions is about leveraging the data, or taking advantage of the data you have collected in relation to the SSIP, and using them as a tool. The power of using your data to inform your decisionmaking is that it allows you to make decisions about resource allocation and target areas for program and service improvement; build awareness, interest, and skills (among your staff and stakeholders) for the routine use of data; ensure that the data that have been collected have value to your agency and its staff; and subsequently support improved data quality by actively examining and using those data. Kim is now going to take over to introduce a new tool that can support you in actively using your data. -End by 3:08-

IDC Data Meeting Protocol One strategy for supporting data-informed decisions is through focused discussions about your SSIP data KIM begins

Why a data meeting protocol?
- Provides a simple structure to guide conversation around evaluation data during meetings
- Helps groups examine evaluation results and make meaning of the results together
- Supports the analysis and use of evaluation data to inform continuous improvement

Who is this protocol for?
Anyone engaged in making decisions for improvement efforts, such as the SSIP:
- State staff involved in SSIP implementation or evaluation
- Local staff involved in SSIP
- Partners such as professional development providers
- Stakeholders such as state advisory groups, organizations representing constituents, and families

What key roles are involved?
- Protocol lead: Has key responsibilities both before and after the meeting; can be one or more individuals
- Facilitator: Guides participants through the group discussion process during the meeting; can be internal staff or outside support
- Other roles: notetaker, timekeeper
In addition to the general participants in the meeting, there are four key roles involved. An outside facilitator may be an IDC TA provider.

How might the protocol be used?
- Can be used during a single meeting or a series of meetings as part of a recurring decisionmaking process
- Can be used to facilitate discussions about data related to a program's processes and implementation, or to the extent to which a program achieves its expected outcomes
Dedicate focused meeting time to the protocol to allow the group to dig beneath the surface to discuss both observations and implications for improvement.

What is the protocol process?
- Before the meeting: Protocol lead plans and prepares for the meeting
- During the meeting: Facilitator guides participant discussion based on the data
- After the meeting: Protocol lead provides a recap of the meeting and next steps

Protocol steps: Before the meeting
1. Determine the Objective
2. Identify the Data
3. Identify Participants and Key Responsibilities
4. Organize the Data to Present
5. Prepare and Distribute the Agenda

Protocol steps: During the meeting
1. Introductions and Key Messages
2. Present the Data
3-5. Discuss the Data
6. Determine Next Steps for the Group
7. Reflect on the Meeting's Effectiveness

Steps 3-5: Discussing the data
3. Discuss Observations of the Data: What do you see? What are your initial thoughts or reactions? What do these data not provide?
4. Discuss Interpretations of the Data: What do the data tell you? What answers are you getting for our original evaluation questions? What do these data confirm?
5. Discuss Implications of the Data: What are the implications? So what? Why does this matter? What does this mean for the work?
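For groups that run these discussions repeatedly, the three steps and their guiding questions lend themselves to a reusable structure. The sketch below is one illustrative way to encode them, for example to generate a printed agenda handout; the `format_agenda` helper and its numbering are our own convenience, not part of the IDC protocol itself.

```python
# Illustrative encoding of the protocol's discussion steps (Steps 3-5)
# and their guiding questions, as listed on the slide.
DISCUSSION_STEPS = [
    ("Discuss Observations of the Data",
     ["What do you see?",
      "What are your initial thoughts or reactions?",
      "What do these data not provide?"]),
    ("Discuss Interpretations of the Data",
     ["What do the data tell you?",
      "What answers are you getting for our original evaluation questions?",
      "What do these data confirm?"]),
    ("Discuss Implications of the Data",
     ["What are the implications?",
      "So what? Why does this matter?",
      "What does this mean for the work?"]),
]

def format_agenda(steps, first_step_number=3):
    """Render the steps as numbered agenda lines for a meeting handout."""
    lines = []
    for i, (title, questions) in enumerate(steps, start=first_step_number):
        lines.append(f"{i}. {title}")
        lines.extend(f"   - {q}" for q in questions)
    return "\n".join(lines)

print(format_agenda(DISCUSSION_STEPS))
```

Keeping the questions in one place like this makes it easy to hand every participant the same prompts, so the conversation moves from observation to interpretation to implication in order.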

Example: Has family engagement increased over time?

Example: Discussing the data
Observations of the Data: District 1 reported 70 percent engagement in 2016 and 85 percent in 2017. 2 of the other 3 districts also saw an increase in family engagement from 2016 to 2017.
Interpretation of the Data: Overall, family engagement increased from 2016 to 2017.
Implication of the Data: We need more data to determine why Districts 1, 2, and 3 have seen an increase and why District 4 has not seen an increase in family engagement.
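Observations like "2 of the other 3 districts also saw an increase" can be derived mechanically before the meeting so the group spends its time on interpretation. A minimal sketch: only District 1's values (70 to 85 percent) appear in the slides; the figures for Districts 2-4 are hypothetical placeholders consistent with the slide's summary.

```python
# Family engagement rates (percent) by district and year.
# District 1's values come from the slide; Districts 2-4 are
# HYPOTHETICAL illustrations consistent with "2 of the other 3
# districts also saw an increase."
engagement = {
    "District 1": {2016: 70, 2017: 85},
    "District 2": {2016: 60, 2017: 68},
    "District 3": {2016: 75, 2017: 80},
    "District 4": {2016: 65, 2017: 62},
}

def change(rates):
    """Percentage-point change from 2016 to 2017."""
    return rates[2017] - rates[2016]

# Which districts increased? This is the "observation" a protocol lead
# might compute when organizing the data to present (Before step 4).
increased = [d for d, r in engagement.items() if change(r) > 0]
print(increased)
```

A table of per-district changes like this is the kind of display a protocol lead could bring to the "Present the Data" step, leaving the why (the interpretation and implication) to the group.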

Protocol steps: After the meeting 1. Distribute Notes From the Protocol Process 2. Confirm Next Steps and Timeline for Additional Actions

Example 1 of protocol in use: New Mexico
Part B SSIP focus: Improved reading achievement for students with disabilities in Results Driven Accountability (RDA) schools
Discussion objective (evaluation question): To what extent are RDA (SSIP) schools implementing the evidence-based practices that are expected to result in improved outcomes for students with disabilities?
DEBBIE begins. Introduce New Mexico guests. Debbie presents slide.

New Mexico’s protocol use
Data: Site visit implementation rubric (collected through interviews and observations across multiple practice domains)
Protocol roles: State staff and an IDC technical assistance (TA) provider collaborated as protocol leads, and the IDC TA provider facilitated the data meetings
Meeting participants:
- Meeting #1: State staff involved in SSIP implementation and evaluation
- Meeting #2: Various state staff and SSIP stakeholders (principals, professional development provider)
Debbie presents slide. Ask New Mexico to comment on process (3-5 minutes).

Example 2 of protocol in use: New Hampshire
Part B SSIP focus: Improved social-emotional outcomes for preschool children with disabilities through complementary infrastructure development and leadership
Discussion objective (evaluation questions):
- What is the status of practitioner fidelity of implementation? What are the implications for coaching infrastructure moving forward?
- What were the most valuable components of process coaching during the 2016-17 implementation year? What value did process coaching contribute to implementation teams? What additional supports are needed for the coaches and the implementation teams?
Here’s another scenario, illustrating how another state used the protocol in a slightly different way.

New Hampshire’s protocol use
Data:
- Fidelity of Implementation Observation tool
- Process Coach Feedback Survey
Protocol roles: State staff served as protocol leads with IDC TA provider support, and state and local staff facilitated the data meetings
Meeting participants:
- Meetings #1 and #2: State leadership team
- Meeting #3: Local leadership and implementation teams

Summary
IDC’s Data Meeting Protocol provides a structured process that groups can use to
- Conduct data discussions
- Assess progress toward achieving intended outcomes
- Inform next steps in SSIP implementation
https://ideadata.org/resources/resource/1758/data-meeting-protocol

Related resources
- Individualized TA support with the protocol: contact your IDC state liaison
- Forthcoming facilitators’ guide for using data with stakeholders
Upcoming Event: Interactive Institutes 2018: Building a Culture of High-Quality Part B Data. Registration Deadline: January 19th.
Presenter Note: We’re going to have time for questions after this slide. Please type your questions into the Chat Box.

For more information
Visit the IDC website: http://ideadata.org/
Follow us on Twitter: https://twitter.com/ideadatacenter
Follow us on LinkedIn: http://www.linkedin.com/company/idea-data-center

The contents of this presentation were developed under a grant from the U.S. Department of Education, #H373Y130002. However, the contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the federal government. Project Officers: Richelle Davis and Meredith Miceli