The CCSR 5 Essentials Survey: Helping Schools Organize for Improvement
Elaine Allensworth, The University of Chicago Consortium on Chicago School Research
Chicago 5 Essentials Surveys

What they do:
- Demonstrate that student and teacher reports have an empirical relationship to school improvement
- Provide evidence about how policies and initiatives affect students' and teachers' experiences in schools
- Give actionable information that schools can use to prioritize initiatives and drive improvement

We've been conducting surveys of students and teachers in Chicago for many years, and have found students' and teachers' responses to be highly predictive of student outcomes and school improvement. The surveys are used in research to understand how policies and practices affect students' and teachers' experiences in schools. The survey results are also provided back to schools so that they can diagnose their strengths and weaknesses and make plans for improvement. The surveys are organized into five areas, which we call the 5 Essential Supports for school improvement: leadership; teacher organization, including collaboration and professional development; family involvement; a safe and supportive environment for students; and instruction.
Chicago 5 Essentials Surveys

- Administered to teachers in all grades and students in grades 6-12
- Response rates of about 80% in 2013
- Yearly administration since 2011; biennial from 1997
- Framework developed in 1994
- Developed jointly by researchers, teachers, principals, and members of reform organizations
- Grounded in research and school practice

The surveys have been given since 1991; in 1997 we started giving them on a two-year cycle, and in 2011 we started giving them annually. We have high response rates because we've had many years to build buy-in and familiarity. The framework itself was initially developed jointly by researchers, who brought in the literature at the time, along with teachers, principals, and others in the education reform community. It included dimensions of the 5 Essentials that research had shown or suggested were related to student outcomes, asked in a way that was meaningful to school practitioners.
Each Essential Represents Multiple Measures

Each of the 5 Essentials is built from questions, which roll up into measures, which roll up into the essential itself. For example, the Ambitious Instruction essential includes the measures Course Clarity, Math Pedagogy, English Pedagogy, Student Discussion, Classroom Behavior, and Challenge. Some of the questions used to measure Course Clarity:
- I learn a lot from feedback on my work.
- It's clear to me what I need to do to get a good grade.
- The work we do in class is good preparation for the test.
- The homework assignments help me to learn the course material.
- I know what my teacher wants me to learn in this class.

Each of the 5 Essential support areas is measured along a number of different dimensions, and each dimension is measured with a set of about five questions that try to tap into different aspects of that dimension. This particular measure is related to students' course grades: controlling for students' initial test scores, students get better grades in classes with higher course clarity. That's why we include it. There are many different issues you need to consider when designing surveys. One is length. The longer the survey, the more likely people will get tired and not answer carefully, or not complete the survey, so you want to make sure that each question provides good, unique information. We are under constant pressure to measure more dimensions, but doing so comes at the cost of worse measurement or of taking something else out of the survey. We test each measure to make sure that: 1) it measures what it is designed to measure; 2) it is related to an outcome we care about (student achievement or school improvement); and 3) it is not redundant with other measures in the survey (a sketch of two of these checks follows below).
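To make the first and third checks concrete, here is a minimal sketch in Python of how internal consistency and redundancy could be screened. The simulated data, item names, and rule-of-thumb thresholds are illustrative assumptions, not CCSR's actual psychometric procedure. (The second check, relating a measure to an outcome, would typically be a regression of course grades on the measure, with prior test scores as controls.)

```python
# Sketch of two measure checks on hypothetical survey data:
#   1) internal consistency (does the item set measure one thing?)
#   3) redundancy with other measures in the survey
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal consistency of a set of items intended to form one measure."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

rng = np.random.default_rng(0)
n_students = 500

# Simulate 1-4 Likert responses driven by one underlying trait, standing in
# for the five Course Clarity questions shown on the slide (hypothetical data).
trait = rng.normal(size=n_students)
clarity_items = pd.DataFrame({
    f"clarity_q{i}": np.clip(
        np.round(2.5 + trait + rng.normal(scale=0.8, size=n_students)), 1, 4)
    for i in range(1, 6)
})

print(f"Cronbach's alpha: {cronbach_alpha(clarity_items):.2f}")  # ~0.7+ is a common rule of thumb

# Redundancy check: a candidate measure that correlates very highly with an
# existing one adds survey length without adding unique information.
candidate = clarity_items.mean(axis=1) + rng.normal(scale=0.5, size=n_students)
r = np.corrcoef(clarity_items.mean(axis=1), candidate)[0, 1]
print(f"Correlation with candidate measure: {r:.2f}")  # near 1.0 would argue for dropping one
```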
Surveys used for both research and school practice

Diagnostic tool for school improvement:
- Individual school reports provided confidentially to school leaders since 1997
- In 2008, CPS adopted the "Five Fundamentals" for school improvement planning, based on the survey tool
- In 2009, reports became publicly available
- In 2013, survey results were included in school accountability

Research tool:
- Allows researchers to understand how policies worked: effects on instruction, learning climate, and support for student learning
- Provides insight into key processes in schools: what matters most

From the very early days (actually, 1991 was our first survey) we have provided reports back to schools. Schools can use the results for school improvement planning, and many have done so since the first surveys, though many others did not; it was left up to schools. Because the reports were confidential, only the principal had access, and any external partners or community members who wanted to use the surveys had to get permission from the principal. Some shared the results widely, others not at all. In 2008, the district encouraged the use of the surveys… In 2009, the district insisted the reports become publicly available. We had been concerned that doing so would result in pressure for teachers and students to make their schools look good. At the same time, we recognized the problem that principals often kept the results to themselves: even the teachers who took the survey would not necessarily see the results, and neither would parents or external partners. So, when the reports went public, we did an analysis to look for upward bias (a simple version of such a check is sketched after these notes). Luckily, we did not find any indication that making the surveys public resulted in suddenly more positive results. This past year, the district included the results in the school accountability formula, as 5% of the total; since the results are public, we can't stop the district from using them. We're concerned about what effect this might have, so again we'll look for bias. Research on policies also provides a lot of insight beyond looking at student achievement. For example, under a policy that required students to pass minimum test scores to move from 8th to 9th grade, we found that students with low test scores reported more support from teachers and parents, while students with high test scores reported being less challenged. We could also see that the time teachers spent on non-tested subjects went down, while they spent more time on the subjects being tested.
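Returning to the bias question above: here is a minimal sketch of what a simple before/after check could look like. The data are simulated and the pooled mean comparison is an illustrative assumption; a real analysis would follow each school's own trend across survey waves rather than running a single pooled test.

```python
# Sketch of a before/after check for upward reporting bias once results
# became public. Hypothetical data; not CCSR's actual analysis.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# School-level measure scores for survey waves before and after reports
# became public in 2009 (simulated here with essentially no true shift).
before = rng.normal(loc=50.0, scale=10.0, size=400)  # e.g., 2007 wave
after = rng.normal(loc=50.5, scale=10.0, size=400)   # e.g., 2009 wave

t, p = stats.ttest_ind(after, before, equal_var=False)
print(f"mean before: {before.mean():.1f}, mean after: {after.mean():.1f}")
print(f"Welch t = {t:.2f}, p = {p:.3f}")

# A large, abrupt jump concentrated right at the policy change, rather than
# a continuation of each school's prior trend, would be the red flag.
```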
Surveys validated and refined through two decades of research

- Initial study, 1990-1996
- Replicated from 1997-2005
- Measurement of each component refined over time

The general refinement process:
1. Identify a key construct from theory or practice
2. Develop measures; refine them with each survey administration
3. Use them in research; discover how they matter for schools or student outcomes
4. Refine the theory
5. Track progress in schools

The most comprehensive study of the surveys is described in this book, where all of the 5 Essentials are examined simultaneously, over time, predicting improvement in schools' test scores and attendance. Most of the research examines a few components at a time, not the entire framework.
Reports to schools: 5 Essentials
https://cps.5-essentials.org/2013/

Here is an example of a screenshot from a school report; this is the lead page. We started color-coding the survey reports in 2009, based on feedback from practitioners who wanted to see easily where their school was strong and where it needed to improve. The colors definitely help. The issue is that the report then becomes evaluative: red looks bad, green looks good. This leads to questions about what should be considered "strong" versus "weak." Should schools be compared to each other? If so, someone has to be at the bottom. Or should they be compared to an objective standard (criterion referenced), where you need to be at a certain level to have X% likelihood of improvement? You see both of these methods with test scores (e.g., percentile ranks, ACT's college benchmark scores). With the ranking method, should schools be compared to schools serving similar populations? That can be criticized as holding more disadvantaged schools to a lower standard. With the criterion method, should we consider that some schools show higher student achievement with lower organizational strength? That can be criticized as holding more advantaged schools to a lower standard. In the end, we compare to all schools in the district, and for the state survey, which I'll talk about later, to all schools in the state; this is easiest for people to understand (a simple illustration of the ranking method is sketched below). However, there is the issue that even if all the schools are pretty good, someone still has to be at the bottom. People can then drill down into each of the 5 Essentials areas, for example "effective leaders" (next slide).
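To illustrate the ranking (norm-referenced) method, here is a minimal sketch of percentile-based color coding. The quintile cutoffs, color labels, and simulated district scores are assumptions for illustration, not the actual 5 Essentials scoring rules.

```python
# Sketch of norm-referenced color coding for a school report.
# Cutoffs and labels are illustrative, not the real 5 Essentials rules.
import numpy as np

COLORS = ["red", "orange", "yellow", "light green", "green"]  # weak -> strong

def color_code(school_score: float, all_school_scores: np.ndarray) -> str:
    """Place a school in a district-wide quintile and return its color."""
    pct = (all_school_scores < school_score).mean() * 100  # percentile rank
    quintile = min(int(pct // 20), 4)
    return COLORS[quintile]

rng = np.random.default_rng(2)
district = rng.normal(loc=50, scale=10, size=600)  # hypothetical district scores

print(color_code(62.0, district))  # strong relative to the district
print(color_code(41.0, district))  # weak relative to the district

# The norm-referenced property discussed above is visible here: colors come
# from ranks, so some schools land in "red" even when every school is doing
# reasonably well. A criterion-referenced scheme would use fixed thresholds
# instead (e.g., the score level associated with X% likelihood of improvement).
```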
Reports to schools: Effective leaders example

The report compares each essential and each measure to similar schools, as well as to the district. So this school could see it was weak even compared to schools serving similar students. Why? It was particularly low on program coherence. When you drill down on that one (next slide)…
Reports to schools: Program coherence example

You can get a breakdown of teachers' or students' responses to each question; these are two questions in program coherence. The leaders in this school saw that teachers generally disagreed that new programs were followed up on, and that everyone felt there were too many programs in the school to keep track of. They did an inventory of everything going on and found there were 90 programs, many of them uncoordinated or conflicting. So they made some hard decisions, and now the school is much better organized, with programs supporting instruction and learning.
Expansion of 5 Essentials Surveys

- Now given in multiple cities by UChicago Impact
- Adopted by the State of Illinois in the 2012-13 school year: mandated by law; Five Essentials Day; high participation in the first year
- Stakeholder meetings in the first year were not enough at the local level; pushback once results were given to schools
- Need for validation across settings, adapting to different contexts

CCSR is a research organization focused on Chicago. The university created another organization, UChicago Impact, to make the surveys available to other cities and states. The surveys are given in Detroit and Minneapolis/St. Paul, as well as other places. In 2012-13 they were adopted by the State of Illinois. Participation was high: 93% of districts, 85% of schools, 75% of all teachers, and 71% of all students. The pushback concerned both content and comparisons: without research showing the surveys matter in their context, why should schools take them? (But how can there be research without data?) It would have been nice to have a year without public reports to get people comfortable and address concerns; instead it turned into a political issue for the state board.
Issues in design, implementation

- Trade-offs from survey length, report length
- Trade-offs from making results public vs. confidential
- Framework: useful or confusing?
- Comparisons to other schools: context with challenges
- Content stability vs. adaptability over time and across places

Length: a longer survey is more comprehensive, but too long means bad measurement from survey fatigue, so you need to make decisions about which specific measures to include; and the more broadly the survey is given, the more conflict there will be about content. Public: making results public raises concerns about reporting bias, but public results are more useful and create more buy-in from multiple parties. Framework: the framework is useful as a guide and consistent with district guidance, but other places may have other frameworks in place. Comparisons: comparisons turn the reports from a formative assessment tool into an evaluation of the school, and there is dispute over what the comparison should be (what types of schools? what years?). Content: there are decisions about how finely tailored the survey should be to specific contexts (rural schools may have different issues than urban ones, selective schools than general ones), and you may want changes over time as different issues or practices become prominent in schools. But the more the survey changes from place to place or across time, the less comparable the results are.
Chicago 5 Essentials Surveys

Student and teacher voice can provide really useful insights into what is happening in schools and how policies and initiatives are working on the ground. They can explain why things are or are not working.