Triangulating Data to Surface Equities and Inequities in Student Outcomes
Office of Student and School Success Presenters: Jennifer Jones, Center for Educational Effectiveness Erica Ferrelli, Data Analyst, OSPI Sue Cohn, Policy and Research Coach, Office of Student and School Success February 2017 Welcome to this presentation hosted by the Office of Student and School Success and the Center for Educational Effectiveness. I’m Sue Cohn, Policy and Research Coach with the Office. With me today are Jennifer Jones, Center for Educational Effectiveness, and Erica Ferrelli, Data Analyst at OSPI. This is the second of three sets of modules designed to support coaches, leaders, and staffs in using various sources of data to inform decision-making. The first set of modules is a prerequisite for this module; the prerequisites focus on accessing and using the Assessment Analytics tool, Online Reporting System, and Perceptual Data. The final module, Deep Data Dive with Leadership Teams, will be recorded and posted to our website in March.
Before We Begin We recommend participants:
View prerequisite modules on Assessment Analytics, Perceptual Data, and ORS prior to viewing this module. The link is: Have CEE Data Reports, Achievement Analytic Data, and other data you and your team are using with you during this presentation. We will use a case study format to guide coaches through the process of triangulating data to surface equities and inequities in student outcomes and the practices impacting those outcomes. We ask you to view prerequisite modules prior to viewing this module. We also suggest you have your CEE Data Reports, Achievement Analytic Data, and other data you and your team are using with you during this presentation. Additionally, we recommend you have a data analysis tool to assist you as you review your data. Throughout this presentation, we will pose questions for you to consider. Please feel free to pause the presentation while you and your team discuss your response, examine your data, and identify additional questions to consider. OFFICE OF SUPERINTENDENT OF PUBLIC INSTRUCTION 11/21/2018
Data Analysis Protocols
Student and School Success Action-Planning Handbook (Appendix B) Data Reflection Protocol (B-4) Data Carousels and Writing Narrative Statements (B-2) “What to Collect” Worksheets (B-1) Data Driven Dialogue (Wellman & Lipton, 2004) In addition to your data, we recommend you have a data analysis tool to assist you as you review your data. Sample protocols and sources are listed on this slide. Data Reflection Protocol from Student and School Success Action–Planning Handbook Data Carousels and Writing Narrative Statements from Student and School Success Action–Planning Handbook “What to Collect” Worksheets from Student and School Success Action–Planning Handbook Collaborative Learning Cycle from the book Data Driven Dialogue by Wellman and Lipton Question to consider: What other tools did you and your team use when analyzing data?
Protocols: Noticings/Wonderings/Insights
Data Reflection Protocol: Tool may be used for multiple sources of data, from Progress Reports and Assessments to Attendance and Behavior Key Questions What important points “pop-out”? What is surprising or unexpected? What patterns or trends – gaps/equities/inequities – seem to be emerging? Questions: What key questions do you and your team use for Assessment Data? Contextual data? Other data? What questions about equities and inequities in outcomes surfaced? Both our Leadership Team and I used the Data Reflection Protocol when looking at our school’s state assessment data, progress reports, and attendance/discipline data. This tool may be used for other sources of data as well. Key questions our team and I considered are listed on the slide. What are some of the key questions you and your team used when looking at your state assessment data? Contextual data? Other data? How did the question of equities and inequities in student outcomes surface? You may pause the presentation as you and your team consider your responses to the questions on the slide.
Protocols: Noticings/Wonderings/Insights
Writing Narrative Statements: When making observations, we suggest… Keep it simple—Communicate a single idea about student performance. “The gap in 7th grade ELA proficiency between All students and Low Income students increased by 6% from 2015 to 2016” Make the narrative statement short and easy to read. “The number of ELs at our school increased from 25 to 45 between 2014 and ” Avoid evaluative statements—just describe the data, not why or what to do about it. “38% of parents state they don’t receive information about ways to help their children learn at home.” When we looked at data, I asked our team to begin with writing narrative statements that describe the data, rather than evaluative statements about the data or comparing data from multiple sources. These are some samples we talked about. I also use this same format when going through data. First come noticings and wonderings, particularly around questions of equity and inequities. Before continuing the presentation, please take a moment to gather data and analysis tools you and your team can use as you build your skills around triangulating data to surface equities and inequities in student outcomes.
Context: Demographics
OSPI Report card indicates stable demographics over last 5 years Size: Approximately 525 students, low mobility rate Poverty: Ranged from 84% to 91% ELL: Approximately 40% Language spoken by ELLs: Spanish Since this is a case study, we begin with contextual data about my elementary school. While we don’t set goals to change the context of our school, we know that understanding the context and needs of our school is essential as we move forward with goal-setting around our achievement and other data. What context data did you and your team review this fall? How did a knowledge of those data impact goal-setting?
Data from School Report Card: All Students
This slide describes our achievement data for the last two years. You’re no doubt familiar with this data display, since this is probably what you looked at last fall before we had the Assessment Analytics tool. We think that both the State Report Card data and the Assessment Analytic displays can inform our thinking around student achievement. Our team developed SMART goals for this spring’s SB Assessments in ELA and Math for each grade level. We also developed SMART goals to close gaps in achievement between our Low Income students and non-Low Income students.
EES Perceptual Surveys
These screen shots show covers of two of the Perceptual Data reports our school recently received. We recommend beginning with these reports. The 3-way Staff Report includes the Indistar Supplement that we’ll talk about in a bit. I’m familiar with our SMART Goals around state assessments, the context of our school, and progress-monitoring with our ORS data. The questions that arose for me included: How can I use these data to inform thinking around our goals for spring SB assessments? For the next round of goal-setting? What are some of the essential questions that come to mind about closing gaps – about equities and inequities? Armed with three sets of data – context, achievement, and perceptual – the challenge for me is to understand how to bring these data together – to triangulate the data to gain insights into the work and progress of our school.
Learning Targets Describe purpose for and process of triangulating data Triangulate data to surface equities and inequities in student outcomes Prioritize findings based on an understanding of school context/needs Identify initial coaching moves and next steps with Principal and Leadership Teams This brings us to the problem of practice we would like to address and the learning targets for this module. Our school and district leaders and staffs have an abundance of data available to them, from achievement and demographic data to perceptual and other data based on their unique context (such as ELL). One of the problems of practice for educators and our coaches is knowing how to use these data to surface equities and inequities in student outcomes as well as to understand the educator and systems practices impacting those outcomes. Today’s presentation is designed to support educators and teams in building their capacity to bring various sources of data together for several purposes: Identify equities in outcomes in your school and practices impacting those outcomes. Those are the practices we want to move to full implementation and sustain over time. Identify gaps in outcomes and the practices impacting those outcomes. Those are the practices we want to disrupt and dismantle and replace with practices research and evidence indicate are essential for closing achievement and opportunity gaps. And, based on these observations, identify next steps for you and your team to consider.
Triangulating Data Purpose: The process of “triangulation” leads to two paths Congruence (agreement): Is there agreement between what the achievement data indicate, staff and student perceptions, and other data? Discrepancy (disagreement): Is there disagreement between what the achievement data indicate, staff and student perceptions, and other data? End Product focuses on equities and inequities in the data: May include: Compilation of Noticings/Wonderings/Insights from the data Identification of additional data to collect and analyze “Lesson plan” outlining next coaching moves and essential questions around data Other steps based on your school’s context and data Triangulation is the process of reviewing multiple sources of data to gain greater insights about information we gleaned from previous data sources. For instance, your team may have reviewed disaggregated results of an Interim Benchmark Assessment administered in November, first quarter or first semester grades, or attendance and discipline data. During triangulation, teams will view other sources of data to gain additional information about suspicions that surfaced during the earlier review of their data. Today, we will focus on several sources of data and reports: Assessment Analytic data from state assessments over the last few years Perceptual Data Other data sources the school or district is using. Of all the steps in the data analysis process, triangulation is the messiest. There is really no way around it. This is the time when team members bring to the table, literally, a vast array of student and teacher artifacts and other data sources to help them address questions related to inequities and equities in learning outcomes and educator/system practices related to those outcomes.
Triangulation: The Search for Congruence
Congruence (agreement among data): What does this look like? Student achievement on State Assessments is improving (performance, growth, closing gaps) EES-Staff perceptual data show improvement in practices EES-Student perceptual data show student experiences mirror what staff members are trying to do Semester Progress Reports show decrease in number of freshmen D’s and F’s from last year to this year Discrepancy (disagreement among data): What does this look like? Student achievement on State Assessments is declining (performance, growth, or widening gaps) EES-Staff perceptual data show improvement; staff members believe they are doing the right things EES-Student perceptual data show student experiences are different from what staff feel is happening Semester Progress Reports show increase in number of freshmen D’s and F’s from last year to this year This slide describes a bit more about the search for congruence. Please take a moment to review the descriptions for both congruence and discrepancy.
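The congruence/discrepancy distinction above can be expressed as a simple check. This is only an illustrative sketch: the function names, the three input series, and the "all sources move in the same direction" rule are assumptions made for illustration, not part of the EES reports or any OSPI tool, and the sample percentages are invented.

```python
def trend(series):
    """Classify a year-over-year series as 'improving', 'declining', or 'flat'."""
    delta = series[-1] - series[0]
    if delta > 0:
        return "improving"
    if delta < 0:
        return "declining"
    return "flat"

def triangulate(achievement, staff_perception, student_perception):
    """Return 'congruence' when all three data sources trend the same
    direction, otherwise 'discrepancy', along with the per-source trends."""
    trends = {
        "achievement": trend(achievement),
        "staff": trend(staff_perception),
        "student": trend(student_perception),
    }
    if len(set(trends.values())) == 1:
        return "congruence", trends
    return "discrepancy", trends

# Hypothetical example: achievement declines while staff perceptions
# improve -- the discrepancy case described on this slide.
verdict, detail = triangulate(
    achievement=[48.0, 45.5, 42.1],        # % meeting standard, by year
    staff_perception=[61.0, 68.0, 74.0],   # % positive, EES-Staff
    student_perception=[55.0, 54.0, 52.0], # % positive, EES-Student
)
print(verdict)  # discrepancy
```

In practice a team would weigh magnitude and context, not just direction of change; the point of the sketch is only that triangulation compares trends across independent sources.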
Triangulation: The Search for Congruence
Triangulation diagram: Context; Achievement; EES-Staff & EES-Student; Attendance & Discipline; Local Assessment, PLC, and Observation Data (see Appendix B for data to collect). Recall that triangulation is the process of reviewing multiple sources of data to gain greater insights about information we gleaned from previous data sources. As stated earlier, among the data our team analyzed earlier this year were contextual data and achievement data. We talked about contextual data. Let’s spend more time looking at our Assessment Data.
Data from School Report Card: All Students
Questions to consider: What did you notice in your data? How did these data inform your questions about closing gaps – about equity and inequities in student outcomes? Data from School Report Card: All Students This slide describes our achievement data for the last two years. Among the noticings and wonderings for our team: 3rd grade and 4th grade both increased in ELA from 2015 to 2016 5th grade ELA decreased over the same time period. Math for both 3rd and 4th grade was over 45% in 2016, while Math for 5th grade was 16%. Our population is fairly stable, so the 4th graders in 2015 were our 5th graders in 2016. Based on that, we saw an increase in scores for that cohort of students from 2015 to 2016. We wondered about the dip from 2015 to 2016 in 5th grade; we also wondered about our Level 1 and Level 2 students, and if they were making progress from one level to the next. In fact, it’s not unusual for an analysis of data to beget more questions than answers. It’s the surfacing of questions and seeking additional data and information to respond to those questions that will inform instructional and leadership decision making.
Data from School Report Card: Low Income Students
Questions to consider: What congruence and discrepancies did you notice in your data? How did these observations inform your goal-setting? Data from School Report Card: Low Income Students So we began with our 5th grade data, since that is where we had the lowest scores in 2016. We turned to the Report Card to look more deeply at the numbers of students at each level for our low income students. Our team used these data and the data from the previous slide to surface both congruence and discrepancies. The discrepancies in our 5th grade data informed the SMART goals set by our Leadership Team and grade level teams around all students’ and low income students’ performance on state assessments. For example, the 5th grade team set specific goals for moving Level 1 students to Level 2, Level 2 students to Level 3, and so on. What congruence and discrepancies do you observe in your data? How did these observations inform your goal-setting?
Assessment Analytic Opportunity Gap: Low Income and Non-Low Income
Sue: Our team also wanted to know more about the performance of certain subgroups in this school. Fortunately, we were able to access these data on the Assessment Analytics tool. Our team viewed the Assessment Analytics tool presentation on the OSSS website prior to looking at these data, so these charts make sense to us. We recommend you and your team also view the presentation if you have not yet done so. I asked Erica to assist us in understanding our data. Erica: OSPI recommends comparing distinct groups (Low Income vs Non-Low Income or Special Education vs Non-Special Education) to ensure distinct student counts. If you compare groups that are not distinct (Low Income vs All Students), the same students could be counted in both groups. However, like many schools, certain data are suppressed and not shown publicly. If your school’s data are suppressed and you would like to know why, you may look at and download school-level data Excel files on the School Report Card. For this chart, we only have school-level data from a single year for low-income and non-low income subgroups. However, we can make observations regarding the opportunity gap between these subgroups, and we can compare the percent of students meeting standard at the school level to the district and state averages. As seen in the two boxes on the bottom half of the diagram, 29.3% of third graders in the low-income (blue dot) subgroup and 63.2% of third graders in the non-low income (yellow dot) subgroup met standard in ELA, for a gap of 33.9%. The corresponding percentages at the district level are 23.8% and 49.2%, for a gap of 25.4%. Scores at the state level are 37.7% and 69.7%, for a gap of 32% between low-income and non-low income students. So, the gap for our 3rd graders was greater than both the district and the state gaps. We wondered about our 4th graders.
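Erica's gap arithmetic can be checked in a few lines. This is only a sketch; the `opportunity_gap` helper is hypothetical, but the percentages are the ones stated on the slide (3rd grade ELA, percent meeting standard).

```python
def opportunity_gap(non_low_income_pct, low_income_pct):
    """Gap, in percentage points, between the two subgroups."""
    return round(non_low_income_pct - low_income_pct, 1)

# Percentages from the slide: non-low-income (yellow dot) vs low-income (blue dot)
school_gap   = opportunity_gap(63.2, 29.3)  # 33.9
district_gap = opportunity_gap(49.2, 23.8)  # 25.4
state_gap    = opportunity_gap(69.7, 37.7)  # 32.0

# The school-level gap exceeds both the district and state gaps.
print(school_gap > district_gap and school_gap > state_gap)  # True
```

Framing the gap as a single subtraction also makes plain why OSPI recommends comparing distinct groups: subtracting an overlapping group (e.g. Low Income from All Students) would count the same students on both sides.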
Assessment Analytic Opportunity Gap: Low Income and Non-Low Income
Erica: For fourth grade, we had enough data available to compare low income (blue dot) and non-low income (yellow dot) subgroups over time. We looked at both the yellow/blue lines and the two right boxes to understand our data. We observed that our percent of students meeting standard increased for both subgroups. However, the gap between the two lines increased, meaning the gap between the two subgroups increased from the previous year for the school. The gaps at the district level and state level decreased slightly over the same period.
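The pattern Erica describes, both subgroups improving while the gap between them widens, can be made concrete with numbers. The figures below are invented for illustration; they are not taken from the school's report.

```python
# Hypothetical % meeting standard, year 1 -> year 2, for each subgroup.
low_income     = [30.0, 35.0]
non_low_income = [55.0, 65.0]

# Both subgroups improved year over year...
both_improved = (low_income[1] > low_income[0]
                 and non_low_income[1] > non_low_income[0])

# ...yet the gap between them widened, because the non-low-income
# subgroup improved faster (10 points vs 5 points).
gap_widened = ((non_low_income[1] - low_income[1])
               > (non_low_income[0] - low_income[0]))

print(both_improved, gap_widened)  # True True
```

This is why looking only at whether each subgroup's line goes up can mask a widening opportunity gap: the gap depends on the difference between the lines, not their individual direction.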
Assessment Analytic Opportunity Gap: Low Income and Non-Low Income
Questions to consider: What are the gaps between your low-income and non-low income subgroups at school level? District level? State level? What about other subgroups? What data can be used if you don’t have sufficient N? Assessment Analytic Opportunity Gap: Low Income and Non-Low Income Erica: A similar examination of 5th grade data indicated our scores for both subgroups are below both the district and state. That said, we also noticed we have a school-level gap of 11.6%, compared to a 20.3% gap at the district level and a 30.8% gap at the state level. Sue: Thank you Erica for those explanations of our data. Based on data about both proficiency and gaps, we revised our SMART goals for this year’s SB Assessment to include a gap-closing goal at each grade level. Again, we’re using interim assessments and other measures to inform our decisions about instruction and interventions. We weren’t sure about educator and systems practices that might be impacting that gap, so we knew we needed more information and we knew we needed to develop goals and tasks to address this gap. The questions on the slide align with the questions our team considered when analyzing the data. You may pause the presentation at this point if you would like to examine your school’s data more closely.
EES-Staff Readiness for Change 3-Way Comparison, p. 4
Questions to consider: How have perceptions changed over the 3 years for your school? What do your data say about staff readiness for change? EES-Staff Readiness for Change 3-Way Comparison, p. 4 Sue: I recalled learning that triangulation is the process of reviewing multiple sources of data to gain greater insights about information we gleaned from previous data sources. Based on this, I wondered how we might use our EES data to inform our work around the SMART goals we’ve already set. So we turned to the resources provided by our partners at CEE to assist us in understanding these data. Fortunately, we have Jennifer Jones with us today, and she’s going to walk us through several perceptual data reports. Jen: The 3-Way Comparison Report is a great place to begin. Why? Our perceptions are behind the actions we take and the decisions we make. Understanding what drives an individual’s actions and beliefs helps us to work well together and make quality decisions intentionally. In this case study, we are looking at a team that has a high readiness for change. These data show consistent growth among all staff over the 3 academic years. Question: What do your data say about staff readiness for change?
EES-Staff Indistar Supplement 3-Way Comparison, p. 19
Jen: This chart from the 3-Way Comparison Report focuses on Turnaround Principle 2: Ensure Effective Teachers and Instruction. The group of EES-Staff questions highlighted here all show consistent growth over the 3 academic years. We have opportunities to learn effective teaching strategies for the diversity represented in our school We reflect upon instructional practice to inform our conversations about improvement Appropriate data are used to guide building-directed professional development We are provided training to meet the needs of a diverse student population in our school The staff in this case study are increasingly positive about the efficacy of their instruction. Questions: What do you notice in your data? What wonderings do you have?
EES-Staff Indistar Supplement 3-Way Comparison, p. 21
Jen: This chart from the 3-Way Comparison Report focuses on Turnaround Principle 4: Strengthen the School’s Instructional Program. The group of EES-Staff questions highlighted here all show consistent growth over the 3 academic years. Lessons are designed to support instructional outcomes Instruction is personalized to meet the needs of each student Struggling students receive early intervention and remediation to acquire skills Common benchmark assessments are used to inform instruction The staff in this case study are increasingly positive about the instructional program based on student needs. Questions: What do you notice in your data? What wonderings do you have?
EES-Student Report EES-Student p. 17
Jen: Now that we have reviewed the EES-Staff survey data, let’s look at the EES-Student survey data to see if there is congruence (agreement) or discrepancy (disagreement) between the two groups of stakeholders. You will find this chart in your EES-Student Report - Turnaround Principle 4: Strengthen the School’s Instructional Program. If I want to talk with my teacher(s), he/she is available to me 70% combined positive In this school, students get extra help when they need it STUDENT 77% combined positive, STAFF 83% combined positive My teacher(s) often tell me how I am doing in their class 67% combined positive Overall, students are less positive about the instructional program. This shows a discrepancy between the two groups of stakeholders. Sue: Thank you, Jen, for adding to our understanding of perceptual data. As you might imagine, questions arose for us when comparing our staff perceptions and student perceptions. What questions arise for you and your team as you look at the various reports and compare student and staff perceptions? Questions: What do you notice in your data? What wonderings do you have about staff perceptions and student perceptions?
Other Sources of Data Attendance through 2nd reporting period
Discipline through 2nd reporting period Report Card through 2nd reporting period Online Reporting System PLC/Collaborative time agendas & reports Classroom-based assessments in core content areas See Student and School Success Handbook (Appendix B) for additional options. This slide lists a variety of additional data we’ve collected and reviewed. Recall that triangulation is the process of reviewing multiple sources of data to gain greater insights about information we may have gleaned from previous data sources. Question: What other sources of data can inform your action planning to address equities and inequities and to monitor progress?
Prioritizing Findings Based on School Context and Needs
What do we see? Context: Stable demographics with approximately 85% poverty Achievement: Mixed results at each grade level; lowest performance in 5th grade; gaps between Low-Income and Non-Low Income subgroups at all levels, with smallest gap at 5th grade. EES-Staff: Strong improvement in staff perceptions of the quality of their work (aligned, personalized, interventions for struggling students) EES-Student: Perceptions not as strong; students are less positive that they get extra help Essential Questions: Why the disagreement between achievement outcomes, staff perception of practices, and student perception of practices? Why the gaps in outcomes, and how do we surface practices impacting those outcomes? With the assistance of both Erica and Jen, I have a greater understanding of both our Achievement data and our Perceptual data, and how both can inform my thinking and coaching moves. This slide lists my priorities from analyzing our data and the essential questions that will inform my coaching moves. Your priorities will be different based on your data and your school’s context and needs. What are the findings you would prioritize for working with your principal and leadership team as you look at your Assessment Data, Perceptual Data, and other sources of data? Question: What findings and essential questions will you decide are most important to share with your leadership team?
Initial Coaching Moves Based on School Context and Needs
Build capacity around using Perceptual Data (Consultative Coaching) min View Perceptual Data Module Review Staff EES Reports: Focus on Readiness to Change and Indistar Supplement Review Student EES Reports: Focus on Indistar Supplement Plan Leadership Team Meeting (Collaborative Coaching) – min View Triangulation of Data Module Surface equities and inequities in the data and questions to consider Determine team’s skill level for using data and next steps Our recent work with Gary Bloom’s Blended Coaching and knowledge of my principal’s skill with data influenced my selection of coaching moves based on these priorities. We’ve spent considerable time setting SMART goals around our SB Assessment Data. We looked at recent progress reports and attendance/discipline data through January. However, our principal isn’t as familiar with perceptual data. So one of my first coaching moves, with her permission, will be to build her skills and knowledge around using perceptual data reports. Armed with that knowledge, together we will plan our next Leadership Team meeting. What coaching strategies would you use with your Principal? With the Leadership Team? Questions: What are your initial coaching moves and what coaching strategies would you use with your principal? Leadership Team?
Next Steps Module 1 (Prerequisites): View with Principal and Teams
Module 2: Triangulating Data to Surface Equities/Inequities Analyze multiple sources of data Prioritize findings Identify initial moves with Principal, Leadership Team, Staff Module 3: Deep Data Dive with Teams Triangulate school-specific data with focus on equitable/inequitable student outcomes and educator/systems practices impacting outcomes Craft and monitor S.M.A.R.T. Goals and tasks in Action Plan We recommend coaches and their principals/teams view the modules in the order listed here. As you do so, we encourage you to continue to ask those questions that cause you to dig deeper into the data, so that you gain an understanding of those practices leading to equitable outcomes – the practices we want to sustain over time – and those leading to gaps and inequities – so we can disrupt and dismantle those practices.
Thank you! Please contact us with your questions: Jennifer Jones: Greg Lobdell: Erica Ferrelli: Sue Cohn: Office: I would also like to thank several colleagues for their support in developing this module: Success Coaches Jim Paxinos, Jeanine Butler, Chriss Burgess, Lee Smith, and Monica Hulubei Piergallini. And of course, thank you to Craig Shurick, Director of Operations, for his leadership and championing of this effort.