Data-Informed Decision Makers: How to Use Data for Decision Making


Data-Informed Decision Makers: How to Use Data for Decision Making Abby Schachner, Megan Vinh, Megan Cox Division for Early Childhood Annual Conference, October 2017

Welcome! Who is here? What are your expectations for this session? MEGAN C Review desired outcomes This interactive session will engage participants in addressing the challenges of being data-informed decision makers with limited time and resources. Participants will bring laptops to apply concepts and strategies for using data to make decisions related to reducing suspensions and expulsions and promoting positive child outcomes, with sample data and charts.

Agenda National context Creating a culture of data-informed decision-making Including practitioner-driven data questions How to efficiently and intentionally use data MEGAN C

National Context Results Driven Accountability For over 30 years, there has been a strong focus on regulatory compliance based on the IDEA and federal regulations for early intervention and special education: OSEP, states, districts/programs. As a result, compliance has improved! ABBY? For the Office of Special Education Programs (OSEP), states are required to report on five categories of progress for each of the three child outcomes. The OSEP categories describe the types of progress children can make between entry and exit. Two COS ratings (entry and exit), plus the yes/no progress question, are needed to calculate which OSEP category describes a child's progress.
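To make that calculation concrete, here is a minimal sketch of decision rules mapping the two COS ratings (1–7) and the yes/no progress question onto the five OSEP progress categories. These rules are an approximation for illustration only, not the official specification; actual reporting should follow the decision rules published by ECTA/the ECO Center.

```python
def osep_progress_category(entry: int, exit_: int, made_progress: bool) -> str:
    """Map two COS ratings (1-7) and the yes/no progress question to an
    OSEP progress category (a-e). Approximate rules for illustration only;
    verify against the officially published decision rules before reporting.
    """
    AGE_EXPECTED = 6  # ratings of 6 or 7 indicate age-expected functioning
    if entry >= AGE_EXPECTED and exit_ >= AGE_EXPECTED:
        return "e: maintained age-expected functioning"
    if exit_ >= AGE_EXPECTED:
        return "d: reached age-expected functioning"
    if exit_ > entry:
        return "c: moved nearer to age-expected functioning"
    if made_progress:
        return "b: improved, but not nearer to age-expected functioning"
    return "a: did not improve functioning"

print(osep_progress_category(entry=3, exit_=6, made_progress=True))  # d: ...
```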

Compliance improves, but not results? Despite this focus on compliance, states are not seeing improved results for children and youth with disabilities: Young children are not coming to kindergarten prepared to learn. In many locations, a significant achievement gap exists between students with disabilities and their general education peers. Students are dropping out of school. Many students who do graduate with a regular education diploma are not college and career ready. ABBY? FROM NTI SLIDES
Michael Yudin, former Assistant Secretary for Special Education and Rehabilitative Services, summarized this when he said: “Despite this focus on compliance, states are not seeing improved results for children and youth with disabilities.” He went on to say that “young children are not coming to Kindergarten prepared to learn. In many areas, a significant achievement gap exists between students with disabilities and their general education peers. We are also seeing students drop out of schools. And, many students who do graduate with a regular education diploma are not college and career ready.” The OSEP requirement for states to develop State Systemic Improvement Plans (SSIPs) is an example.
RDA's three components:
State Performance Plan/Annual Performance Reports (SPP/APR), which measure results and compliance. States are currently developing State Systemic Improvement Plans (SSIPs), designed to improve outcomes in targeted areas.
Determinations, which reflect state performance on results as well as compliance.
Differentiated monitoring and support for all states, but especially low-performing states.
The following core principles underlie and will guide OSEP's RDA work:
Principle 1: Partnership with stakeholders
Principle 2: Transparent and understandable to educators and families
Principle 3: Drives improved results
Principle 4: Protects children and families
Principle 5: Differentiated incentives and supports to states
Principle 6: Encourages states to target resources and reduces burden
Principle 7: Responsive to needs
The SSIP is a multi-year, achievable plan that increases the capacity of EIS programs/LEAs to implement, scale up, and sustain evidence-based practices, and improves outcomes for children with disabilities (and their families).

National Context Three joint policy statements have been released: Policy Statement on Inclusion of Children with Disabilities in Early Childhood Programs; Policy Statement on Expulsion and Suspension Policies in Early Childhood Settings; Policy Statement on Family Engagement. http://www.acf.hhs.gov/programs/ecd/child-health-development/reducing-suspension-and-expulsion-practices http://www2.ed.gov/policy/speced/guid/earlylearning/joint-statement-full-text.pdf ABBY?

What do these all have in common? They all require the use of data to inform planning and to systemically improve results for children with disabilities and their families. ABBY?

Creating a Culture of Data-Informed Decision Making ECTACenter.org

Practices DEC Recommended Practices Leadership promotes data-informed decision-making by creating a culture of evidence-centered policies and professional development opportunities that promote the implementation of the recommended practices. Reflection on leadership to set the stage for data use – reference RPs (ABBY) From our proposal: Leadership L12 – using data for program management and continuous program improvement. Assessment A9 – Practitioners implement systematic ongoing assessment to identify learning targets, plan activities, and monitor the child's progress to revise instruction as needed.

Data – It’s a Leadership Team Responsibility Monthly review of data: who, how often, what, where, when. Monthly review of program incidents: what’s up, what’s down, why, and what should we do about it. Review of all teacher fidelity measures to determine next steps, training, coaching, and support. Review of child progress data to ensure supports are effective. ABBY ECTACenter.org

Cultural Challenges to Data-Informed Decision Making Many providers/teachers have developed their own personal metrics for judging the effectiveness of their intervention/teaching, and these metrics often differ from those of external parties (e.g., state accountability systems and school boards). Many providers/teachers and administrators base their decisions on experience, intuition, and anecdotal information (professional judgment) rather than on information that is collected systematically. There is little agreement among stakeholders about what kinds of data are meaningful and what to prioritize. Some providers/teachers disassociate their own performance from that of children, which leads them to overlook useful data. ABBY Ingram, D. S. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.

Technical Challenges to Data-Informed Decision Making The data that providers/teachers want – about outcomes, services, and quality – are rarely available and are usually difficult to measure. Programs and schools rarely provide the time needed to collect and analyze data. Providers/teachers and/or administrators lack the access or capacity to analyze data for program improvement. Abby Ingram, D. S. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.

Political Challenges to Data-Informed Decision Making Data have often been used politically, leading to mistrust of data and data avoidance. Providers/teachers and administrators may worry about the way data will be used to penalize them. Abby

Discussion Questions – Small Group Activity What are your barriers to creating a culture of data-informed decision making? What are your potential solutions? What do you struggle with in being a data-informed decision-maker? Abby 10 minutes

Efficiently and Intentionally Using Data MEGAN V

Key Concepts for Data-Informed Decision-Making What are your questions? What is your process for looking at data and making interpretations? What data sources might you have? Are there other data you need to collect or gather? MEGAN V Reference the ways-to-use-data quadrants – multiple questions for differing purposes

Starting with a question (or two…) All analyses are driven by questions Questions come from different sources Different versions of the same question are necessary and appropriate for different audiences. What are your critical questions? What questions might practitioners have? MEGAN V We have a handout of example questions for you.

Defining Data Analysis Questions What are your crucial policy and programmatic questions? Example: Does our program remove some children more often than others? Are children from different racial/ethnic backgrounds removed at similar rates? MEGAN V
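As a rough sketch of how a program might begin answering these questions from its own records, the example below assumes a hypothetical file named removals.csv with one row per child and made-up columns race_ethnicity and removals (a count of suspensions/expulsions for that child); all names are invented for illustration.

```python
import pandas as pd

# Hypothetical file and column names -- substitute your program's own records.
children = pd.read_csv("removals.csv")  # one row per child

# Overall: how often does our program remove children at all?
overall = (children["removals"] > 0).mean()
print(f"Share of children removed at least once: {overall:.1%}")

# By subgroup: are children from different racial/ethnic backgrounds
# removed at similar rates?
by_group = (
    children.assign(removed=children["removals"] > 0)
            .groupby("race_ethnicity")["removed"]
            .agg(rate="mean", n="size")
)
print(by_group.sort_values("rate", ascending=False))
```

Reporting the group sizes (n) alongside the rates matters: a high rate in a very small group is weaker evidence than the same rate in a large one.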

What is Your Process for Looking at Data? Evidence Inference Action MEGAN C Reference other frames to explore data and inference – EIA/PDSA/LTA These are the same concepts and underlying frame

Evidence Evidence refers to the numbers, such as “35% of boys have been removed at least once.” The numbers are not debatable. MEGAN C This is on page 1 of their chart handout
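A statistic like the one above comes straight from a simple count. A minimal sketch, reusing the hypothetical removals.csv table from the earlier example plus an assumed sex column:

```python
import pandas as pd

# Same hypothetical table as in the earlier sketch, with an assumed "sex" column.
children = pd.read_csv("removals.csv")

boys = children[children["sex"] == "M"]
share = (boys["removals"] > 0).mean()
print(f"{share:.0%} of boys have been removed at least once")
```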

Inference How do you interpret the evidence? What can you conclude from the numbers? Does the evidence mean good news? Bad news? News you can’t interpret? To reach an inference, sometimes you need to analyze the data in other ways (ask for more evidence) MEGAN C

Inference Inference is debatable – even reasonable people can reach different conclusions. Stakeholders with a variety of perspectives can help put meaning on the numbers. Early on, the inference may be more a question of the quality of the data. MEGAN C Importance of multiple perspectives Importance of stakeholders in program and practice data and results

Action Given the inference from the numbers, what should be done? Recommendations or action steps. Action can be debatable – and often is. Another role for stakeholders and teams. May involve looking at additional data and information. Again, early on the action might have to do with improving the quality of the data. MEGAN C

Data-Informed Decision Making Key Components
Preparation – Plan to Succeed: Define the purpose and the issue. Identify who needs to be involved. Set timelines. Identify relevant questions. Identify relevant data. Generate hypotheses.
Evidence – Dig Into the Data: Analyze the data. Develop methods and materials for displaying the data.
Inference – Interpret & Share With Others: Share data materials. Check support for hypotheses. Connect inferences with root causes.
Action – Contribute to Success: Celebrate success. Develop and implement improvement plans. Evaluate progress.
MEGAN C Consider adding parallel framework language into the figure – Key Components in a Process for Data-Informed Decision Making, using the Evidence-Inference-Action framework (which we’ve used in ECO data workshops and trainings). Preparation – plan to succeed. Identify who needs to be involved = identify team members. Need someone who… Evidence – digging into the data. Inference – interpreting and sharing data with others. Action – next steps to contribute to success. Use the group to make inferences and determine the real actions – using stakeholders to finish that inference and decide what appropriate action should be taken. EIA gets repeated in a cycle, but preparation occurs at the beginning and gets revised occasionally (as needed, not as part of the regular cycle). Depending on state infrastructure and systems, locals may be the ones completing the process and developing materials, or the state may be the one developing materials – completing the preparation and evidence phases and then involving/sharing with locals for inference and action.

What Are Your Data Sources? Data: facts or information used usually to calculate, analyze, or plan something ABBY For suspension and expulsion, we don’t currently have good data – your inferences are only as good as your data. If you have a question and don’t have good data sources, what do you do? Define what to think of as “data.” Are we missing data sources? Are there other data sources? Broaden the view of data. From the Merriam-Webster dictionary: facts or information used usually to calculate, analyze, or plan something. Lots of information can be considered “data” – not just numbers. What information do you have that you can look at systematically? Qualitative information can often be turned into quantitative information.
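As one hedged sketch of what “turning qualitative information into quantitative information” can look like in practice, the snippet below applies a simple hand-built coding scheme to free-text notes so they can be counted; the notes and code words are invented for illustration.

```python
from collections import Counter

# Invented free-text notes -- in practice these might be classroom logs,
# incident reports, or family communications.
notes = [
    "Sent home after biting incident",
    "Parent asked about transition supports",
    "Sent home early, no reason documented",
]

# A hand-built coding scheme maps phrases to categories.
codes = {"sent home": "removal", "parent": "family contact"}

tally = Counter(
    label
    for note in notes
    for phrase, label in codes.items()
    if phrase in note.lower()
)
print(tally)  # Counter({'removal': 2, 'family contact': 1})
```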

Small Group Activity Articulate your question Evidence: Critically examine the data provided Inference: Discuss what inferences you can make Action: Brainstorm potential actions and next steps ABBY Dilemma – or can we rephrase it as a question to be consistent with the terms used above? Interactive activity to work through the process – 30 min. In small groups – provide dilemma handout, activity sheet, and chart handout. Illustrative example – dilemma

Wrap-Up What were your inferences and actions based on the data? Did you have any ah-has? Reflections on the process and experience? MEGAN V

Intentionality Leads to Success Think about how you can maximize data you already collect and collect what you need Think about how to organize your staff and your agency around ongoing data use It’s all about continuous improvement Use data to determine priorities for focus It is important to “drill down” into performance to identify meaningful solutions MEGAN V Include the audience and who you are sharing with Inform and engage families

Contact Abby Schachner, abby.schachner@sri.com Megan Vinh, mvinh@email.unc.edu Megan Cox, megan.cox@sri.com DaSy Center website: http://dasycenter.org/ ECTA Center website: http://ectacenter.org/

Resources
DaSy Critical Questions for Early Intervention and Early Childhood Special Education: http://dasycenter.org/critical-questions-about-early-intervention-and-early-childhood-special-education/
Planning, Conducting, and Documenting Data Analysis for Program Improvement: http://dasycenter.org/planning-conducting-and-documenting-data-analysis-for-program-improvement/
Head Start Modules on Creating a Culture that Embraces Data: http://eclkc.ohs.acf.hhs.gov/hslc/tta-system/operations/data/guide/guide.html
Data Visualization Toolkit: http://dasycenter.org/data-visualization-toolkit/
Prevent Expulsion: http://preventexpulsion.org/
RP2 materials: http://ectacenter.org/implement_ebp/implement_ebp.asp
Inclusion self-assessment: http://ectacenter.org/~pdfs/topics/inclusion/ecta-dasy_inclusion-online-self-assessment_05-17-17.pdf

Thank you The contents of this presentation were developed under a grant from the U.S. Department of Education, #H373Z120002, and a cooperative agreement, #H326P120002, from the Office of Special Education Programs, U.S. Department of Education. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. DaSy Center Project Officers: Meredith Miceli and Richelle Davis. ECTA Center Project Officer: Julia Martin Eile.