Data-Informed Decision Makers: How to Use Data for Decision Making Abby Schachner Megan Vinh Megan Cox Division for Early Childhood Annual Conference October 2017
Welcome! Who is here? What are your expectations for this session? MEGAN C Review desired outcomes This interactive session will engage participants in addressing the challenges of being data-informed decision makers with limited time and resources. Participants bring laptops to apply concepts and strategies for using data to make decisions related to reducing suspensions and expulsions and promoting positive child outcomes with sample data and charts.
Agenda National context Creating a culture of data-informed decision-making Including practitioner-driven data questions How to efficiently and intentionally use data MEGAN C
National Context Results Driven Accountability For over 30 years, there has been a strong focus on regulatory compliance based on the IDEA and Federal regulations for early intervention and special education OSEP States Districts/Programs As a result, compliance has improved! ABBY? For the Office of Special Education Programs (OSEP), states are required to report on five categories of progress for each of the three child outcomes: The OSEP categories describe types of progress children can make between entry and exit Two COS ratings (entry and exit), and the yes/no progress question, are needed to calculate which OSEP category describes a child’s progress
Compliance improves, but not results? Despite this focus on compliance, states are not seeing improved results for children and youth with disabilities: Young children are not coming to Kindergarten prepared to learn In many locations, a significant achievement gap exists between students with disabilities and their general education peers Students are dropping out of school Many students who do graduate with a regular education diploma are not college and career ready ABBY? FROM NTI SLIDES Michael Yudin, former Assistant Secretary for Special Education and Rehabilitative Services, summarized this when he said: “Despite this focus on compliance, states are not seeing improved results for children and youth with disabilities.” He continued to say that “young children are not coming to Kindergarten prepared to learn. In many areas, a significant achievement gap exists between students with disabilities and their general education peers. We are also seeing students drop out of schools. And, many students who do graduate with a regular education diploma are not college and career ready.” OSEP’s requirement for states to develop State Systemic Improvement Plans (SSIPs) is an example RDA's Three Components: State Performance Plan/Annual Performance Reports (SPP/APR), which measures results and compliance. States are currently developing State Systemic Improvement Plans (SSIPs), designed to improve outcomes in targeted areas. Determinations, which reflect state performance on results, as well as compliance. Differentiated monitoring and support for all states, but especially low-performing states. The following Core Principles (Full Version) underlie and will guide OSEP’s RDA work: Principle 1: Partnership with stakeholders Principle 2: Transparent and understandable to educators and families Principle 3: Drives improved results Principle 4: Protects children and families Principle 5: Differentiated incentives and supports to states Principle 6: Encourages states to target resources and reduces burden Principle 7: Responsive to needs Multi-year, achievable plan that: Increases capacity of EIS programs/LEAs to implement, scale up, and sustain evidence-based practices Improves outcomes for children with disabilities (and their families)
National Context Three joint policy statements have been released: Policy Statement on Inclusion of Children with Disabilities in Early Childhood Programs Policy Statement on Expulsion and Suspension Policies in Early Childhood Settings Policy Statement on Family Engagement Joint Policy Statements http://www.acf.hhs.gov/programs/ecd/child-health-development/reducing-suspension-and-expulsion-practices http://www2.ed.gov/policy/speced/guid/earlylearning/joint-statement-full-text.pdf ABBY?
What do these all have in common? They all require the use of data to inform planning and to systemically improve results for children with disabilities and their families. ABBY?
Creating a Culture of Data-Informed Decision Making
Practices DEC Recommended Practices Leadership promotes data-informed decision-making by creating a culture of evidence-centered policies and professional development opportunities that promote the implementation of the recommended practices. Reflection of leadership to set the stage for data use – reference RP (ABBY) From our proposal: Leadership - L12 – using data for program management and continuous program improvement. Assessment - A9 – Practitioners implement systematic ongoing assessment to identify learning targets, plan activities, and monitor the child’s progress to revise instruction as needed.
Data – It’s a Leadership Team Responsibility Monthly review of data Who, How often, What, Where, When Monthly review of program incidents What’s up, what’s down, why, what should we do about it Review of all teacher fidelity measures to determine next steps, training, coaching, support Review of child progress data to ensure supports are effective ABBY
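A minimal sketch of the kind of month-to-month incident review a leadership team might run; the counts and month labels below are hypothetical and only illustrate the "what's up, what's down" comparison against the prior month.

```python
# Hypothetical monthly incident counts for a leadership-team review;
# the goal is simply to see what is up or down relative to the prior month.
import pandas as pd

monthly = pd.Series({"Aug": 4, "Sep": 7, "Oct": 5}, name="incidents")
change = monthly.diff()  # positive = more incidents than the prior month

print(pd.DataFrame({"incidents": monthly, "change_vs_prior_month": change}))
```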
Cultural Challenges to Data-Informed Decision Making Many providers/teachers have developed their own personal metric for judging the effectiveness of their intervention/teaching and often this metric differs from the metrics of external parties (e.g., state accountability systems and school boards). Many providers/teachers and administrators base their decisions on experience, intuition, and anecdotal information (professional judgment) rather than on information that is collected systematically. There is little agreement among stakeholders about what kinds of data are meaningful and what to prioritize. Some providers/teachers disassociate their own performance from that of children, which leads them to overlook useful data. ABBY Ingram, D. S. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.
Technical Challenges to Data-Informed Decision Making Data that providers/teachers want – about outcomes, services, and quality – are rarely available and are usually difficult to measure. Programs and schools rarely provide the time needed to collect and analyze data. Providers/teachers and/or administrators lack the access or capacity to analyze data for program improvement. Abby Ingram, D. S. (2004). Accountability policies and teacher decision making: Barriers to the use of data to improve practice. Teachers College Record, 106(6), 1258–1287.
Political Challenges to Data-Informed Decision Making Data have often been used politically, leading to mistrust of data and data avoidance. Providers/teachers and administrators may worry about the way data will be used to penalize them. Abby
Discussion Questions – Small Group Activity What are your barriers to creating a culture of data-informed decision making? What are your potential solutions? What do you struggle with in being a data-informed decision-maker? Abby 10 minutes
Efficiently and Intentionally Using Data MEGAN V
Key Concepts for Data-Informed Decision-Making What are your questions? What is your process for looking at data and making interpretations? What are the data sources you might have? Is there other data you need to collect or gather? MEGAN V Reference the ways to use data quadrants – multiple questions for differing purposes
Starting with a question (or two…) All analyses are driven by questions Questions come from different sources Different versions of the same question are necessary and appropriate for different audiences. What are your critical questions? What questions might practitioners have? MEGAN V We have a handout for you of example questions.
Defining Data Analysis Questions What are your crucial policy and programmatic questions? Examples: Does our program remove some children more often than others? Are children with different racial/ethnic backgrounds removed at similar rates? MEGAN V
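A minimal sketch of how a program team might compute the evidence for questions like these, assuming a simple child-level table; the column names (child_id, gender, race_ethnicity, removed) and the values are hypothetical, not from the session's sample data.

```python
# Hypothetical child-level data: one row per enrolled child; "removed" is
# True if the child was suspended or expelled at least once this year.
import pandas as pd

children = pd.DataFrame({
    "child_id":       [1, 2, 3, 4, 5, 6],
    "gender":         ["M", "M", "F", "M", "F", "F"],
    "race_ethnicity": ["Black", "White", "Hispanic", "Black", "White", "Hispanic"],
    "removed":        [True, False, True, True, False, False],
})

# Evidence: share of children removed at least once, overall and by group.
overall_rate = children["removed"].mean()
by_gender = children.groupby("gender")["removed"].mean()
by_race = children.groupby("race_ethnicity")["removed"].mean()

print(f"Overall removal rate: {overall_rate:.0%}")
print(by_gender.to_string())
print(by_race.to_string())
```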
What is Your Process for Looking at Data? Evidence Inference Action MEGAN C Reference other frames to explore data and inference – EIA/PDSA/LTA These are the same concepts and underlying frame
Evidence Evidence refers to the numbers, such as “35% of boys have been removed at least once” The numbers are not debatable MEGAN C This is on page 1 of their chart handout
Inference How do you interpret the evidence? What can you conclude from the numbers? Does evidence mean good news? Bad news? News you can’t interpret? To reach an inference, sometimes you need to analyze data in other ways (ask for more evidence) MEGAN C
Inference Inference is debatable – even reasonable people can reach different conclusions. Stakeholders with a variety of perspectives can help put meaning on the numbers. Early on, the inference may be more a question of the quality of the data MEGAN C Importance of multiple perspectives Importance of stakeholders in program and practice data and results
Action Given the inference from the numbers, what should be done? Recommendations or action steps Action can be debatable – and often is Another role for stakeholders and teams May involve looking at additional data and information Again, early on the action might have to do with improving the quality of the data MEGAN C
Data-informed Decision Making Key Components Preparation – Plan to Succeed Define purpose and the issue Identify who needs to be involved Timelines Identify relevant questions Identify relevant data Generate hypotheses Evidence – Dig Into Data Analyze the data Develop methods and materials for displaying the data Inference – Interpret & Share With Others Share data materials Check support for hypotheses Connect inferences with root causes Action – Contribute to Success Celebrate success Develop & implement improvement plans Evaluate progress MEGAN C Consider adding parallel framework language into figure – Key Components in a Process for Data-informed Decision Making – Using the Evidence Inference Action Framework (which we’ve used in ECO data workshops and trainings) Preparation – Plan to succeed Identify who needs to be involved = identify team members Need someone who… Evidence – Digging into the data Inference – Interpreting and Sharing Data with Others Action – Next Steps to Contribute to Success Use the group to make inferences and determine the real actions – using stakeholders to finish that inference and decide what appropriate action should be taken EIA gets repeated in the cycle, but preparation occurs at the beginning and gets revised occasionally (as needed, not part of the regular cycle) Depending on state infrastructure and systems, local programs may be the ones completing the process and developing materials, or the state may develop the materials – completing the preparation and evidence phases and then involving/sharing with locals for inference and action
What Are Your Data Sources? Data: facts or information used usually to calculate, analyze, or plan something ABBY For suspension and expulsion, we don’t currently have good data on it – your inferences are only as good as your data. If you have a question and don’t have good data sources, what do you do? Define what to think of as “data” Are we missing data sources, are there other data sources – broaden the view of data From the Merriam-Webster dictionary: facts or information used usually to calculate, analyze, or plan something Lots of information can be considered “data” – not just numbers. What information do you have that you can look at systematically, and can you turn qualitative information into quantitative information?
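A minimal sketch of one way to turn qualitative information into quantitative information, assuming narrative incident notes are coded into agreed-upon categories by a reviewer; the notes and category labels here are hypothetical.

```python
# Hypothetical narrative incident notes coded into categories by a reviewer,
# so qualitative records can be counted like any other data source.
import pandas as pd

incidents = pd.DataFrame({
    "note": [
        "Sent home early after hitting a peer",
        "Parent asked to pick child up at lunch",
        "Moved out of the classroom for the afternoon",
    ],
    "category": ["out-of-school removal", "soft removal", "in-school removal"],
})

# Quantitative summary of the coded qualitative records.
print(incidents["category"].value_counts())
```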
Small Group Activity Articulate your question Evidence: Critically examine the data provided Inference: Discuss what inferences you can make Action: Brainstorm potential actions and next steps ABBY Dilemma or can we rephrase as question to be consistent with terms used above? Interactive activity to work through process – 30 min. In small groups – provide dilemma handout, activity sheet, and chart handout Illustrative example – dilemma
Wrap-Up What were your inferences and actions based on the data? Did you have any ah-has? Reflections on the process and experience? MEGAN V
Intentionality Leads to Success Think about how you can maximize data you already collect and collect what you need Think about how to organize your staff and your agency around ongoing data use It’s all about continuous improvement Use data to determine priorities for focus It is important to “drill down” to understand performance and identify meaningful solutions MEGAN V Include the audience and who you are sharing with Inform and engage families
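A minimal "drill down" sketch, assuming a hypothetical incident log with classroom and month columns; the point is moving from a program-level total to where and when removals are concentrated so support can be targeted.

```python
# Hypothetical incident log; drill down from a program-level total
# to counts by classroom and by month to target coaching and support.
import pandas as pd

incidents = pd.DataFrame({
    "classroom": ["A", "A", "B", "B", "B", "C"],
    "month":     ["Sep", "Oct", "Sep", "Oct", "Oct", "Oct"],
})

print("Total incidents:", len(incidents))
print(incidents.groupby("classroom").size().sort_values(ascending=False))
print(incidents.groupby("month").size())
```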
Contact Abby Schachner, abby.schachner@sri.com Megan Vinh, mvinh@email.unc.edu Megan Cox, megan.cox@sri.com DaSy Center website: http://dasycenter.org/ ECTA Center website: http://ectacenter.org/
Resources DaSy Critical Questions for Early Intervention and Early Childhood Special Education http://dasycenter.org/critical-questions-about-early-intervention-and-early-childhood-special-education/ Planning, Conducting, and Documenting Data Analysis for Program Improvement http://dasycenter.org/planning-conducting-and-documenting-data-analysis-for-program-improvement/ Head Start Modules on Creating a Culture that Embraces Data http://eclkc.ohs.acf.hhs.gov/hslc/tta-system/operations/data/guide/guide.html Data Visualization Toolkit http://dasycenter.org/data-visualization-toolkit/ Prevent Expulsion http://preventexpulsion.org/ RP2 materials http://ectacenter.org/implement_ebp/implement_ebp.asp Inclusion self-assessment http://ectacenter.org/~pdfs/topics/inclusion/ecta-dasy_inclusion-online-self-assessment_05-17-17.pdf
Thank you The contents of this presentation were developed under a grant from the U.S. Department of Education, #H373Z120002, and a cooperative agreement, #H326P120002, from the Office of Special Education Programs, U.S. Department of Education. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. DaSy Center Project Officers, Meredith Miceli and Richelle Davis, and ECTA Center Project Officer, Julia Martin Eile.