1
Value Added in CPS
2
What is value added?
A measure of the contribution of schooling to student performance
Uses statistical techniques to isolate the impact of schooling from other factors
Focuses on how much students improve from one year to the next
3
Demographic adjustments
Value added makes adjustments for the demographics of schools and classrooms
Adjustments are determined by the relationships between growth and student characteristics
Adjustments measure partial differences in growth across groups district-wide
4
Some schools with a low percentage of students meeting or exceeding standards are high value-added schools in which students grow
5
Value added in many domains
Annual state assessments: focus on year-to-year student improvement
Short-term assessments: focus on short-term student improvement, with potentially faster turnaround
High school assessments (Explore/PLAN/ACT, for example): focus on improvement in high school
6
Value added in CPS is based on the ISAT for grades 3 through 8
Analyzes students’ ISAT scores, demographics, and schools attended
Schools and classrooms where students improve more (relative to similar students) are identified as high value added
Value added is the extra ISAT points gained by students at a school or classroom, on average, relative to observably similar students across the district
7
Alternative understanding
Average student gain on the ISAT relative to the district average, with adjustments for:
Shape of the test scale (prior ISAT score)
Grade level
Gender, mobility, free/reduced-price lunch, race/ethnicity, disability, language proficiency, homelessness, other-subject pretest
Enrollment in multiple schools or classrooms
8
Regression model (in English)
Posttest = (Post-on-Pre Link × Pretest) + Student Characteristics + School and Classroom Effects (Value Added) + Unobserved Factors
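To make the model concrete, here is a minimal sketch of this kind of regression in Python, assuming a hypothetical student-level file with invented column names (posttest, pretest, demographic indicators, and a school identifier). It is illustrative only, not VARC's production model, which also handles dosage, pretest measurement error, and other refinements discussed later.

```python
# A minimal sketch of the regression above, using an assumed file "students.csv"
# with invented column names. Illustrative only, not the actual VARC model.
import pandas as pd
import statsmodels.formula.api as smf

students = pd.read_csv("students.csv")

# Posttest = (post-on-pre link x pretest) + student characteristics
#          + school effects (value added) + unobserved factors
model = smf.ols(
    "posttest ~ pretest + other_subject_pretest + C(grade) + gender + frl"
    " + C(race_ethnicity) + C(disability_type) + lep + mobility + homeless"
    " + C(school_id)",
    data=students,
).fit()

# The coefficients on the school dummies are the school effects (value added)
# relative to the omitted reference school; the remaining coefficients are the
# demographic adjustments.
school_effects = model.params.filter(like="C(school_id)")
print(school_effects.sort_values(ascending=False).head())
```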
9
Student characteristics
Gender
Race/ethnicity
Free or reduced-price lunch
Language proficiency (by ACCESS score)
Disability (by disability type)
Mobility
Homelessness
Other-subject pretest
10
Why include student characteristics?
One goal of value-added analysis is to be as fair as possible. We want to remove the effect of factors that were not caused by the school during the specific period we are evaluating.
11
What do we want to evaluate?
Related to the school (value added reflects the impact of these factors):
Curriculum
Classroom teacher
School culture
Math pull-out program at school
Structure of lessons in school
Safety at the school

Not related to the school (these factors need to be measured and isolated):
Student motivation
English Language Learner status
At-home support
Household financial resources
Learning disability
Prior knowledge
12
Controlling for other factors
Students bring different resources to the classroom. These factors can affect growth, so we want to remove the effects of these non-school factors.

Not related to the school (these factors need to be measured and isolated):
Student motivation
English Language Learner status
At-home support
Household financial resources
Learning disability
Prior knowledge
13
Controlling for other factors
In order to include a characteristic in the model, we must have data on that characteristic for all students. Some characteristics are harder to measure and collect than others. The data we do have available can serve as a stand-in for the data we would like to have.
14
Controlling for other factors
What we want: household financial resources
What we have: free or reduced-price lunch (related data)
For example, we can use free or reduced-price lunch as a substitute for our ideal data about household finances in our calculations.
Some possible talking points on the question:
Less likely to have external help (tutoring services, etc.)
Reduced access to resources at home (books, computer, etc.)
Reduced parental availability for homework/study help (multiple jobs?)
Doug Harris’s book points (summer learning loss, etc.)

From Doug Harris’s book: “Should value-added models take into account student race, income, and other student factors in estimating value-added? This is one of the more controversial questions in using value-added for accountability. In one respect, the issue boils down to whether taking into account prior achievement is enough. Race, ethnicity, and income are after all closely related to student attainment on test scores (see Figure 1.1). But it turns out that these factors are less closely related to achievement growth.

The reason for this should be intuitive: if student demographics are associated with achievement in every grade, then the association should largely “cancel out” when subtracting the two. Here is a simple, concrete example to highlight this: Suppose that a student scored 60 points in one year and 80 points in the next year and grade, for growth of 20 points. Now, suppose that part of the reason explaining these scores is the fact that the student’s parents do not make sure the child does her homework each night (in all years). This might reduce the student’s score by 5 points in each year from what it would have been. So, the student would have scored 65 and 85 (instead of 60 and 80) if the student had done her homework. Notice that the growth is exactly the same in both cases: 20 points. So long as the influence on student scores is constant over time, the influence on scores should cancel out in this way.

But minority and low-income students do still learn at slower rates. One study finds that while most of the gap by family income exists in 1st grade, it grows by about 30 percent between 1st and 5th grade.1 Almost all of this, in turn, is due not to what happens in school, but to the “summer learning loss,” the period between school years when students are not in school. This is most likely because the same factors creating the starting gate inequalities also affect learning growth. Because standardized tests are not administered at the beginning of the school year, the summer learning loss is embedded within the student growth measures. Further, the summer learning loss is substantially outside the control of schools and therefore problematic. An accurate measure of value-added must take into account all of the factors outside their control.

The concern with accounting for race and income, however, is that some see it as reducing expectations for these students. There are legitimate differences of opinion on this point, but let me clarify the two different meanings of “lower expectations.” On the one hand, this could mean that accounting for race and income means that schools can get by giving less effort to raise achievement for disadvantaged students. Value-added measures that account for race and income do not lower expectations in this sense. In fact, the whole point of value-added is to create an even playing field and one that provides incentives for schools to help all students.

Alternatively, some interpret “lower expectations” to mean that schools serving disadvantaged students will have the same measured performance as schools serving advantaged students while generating less student learning. If this is what is meant by lower expectations, then it is a legitimate point. Value-added models that adjust predictions based on student race and income do require less learning of disadvantaged students to reach the same level of school performance. Again, this just reflects the fact that schools should not be held accountable for factors outside their control, and students’ home environments and other factors affecting students clearly fall into that category.

There is some potential middle ground on this issue. If the concern is about how much emphasis schools place on achievement of different groups, value-added measures can be designed to place as much or as little weight on disadvantaged students as we wish. For example, we could design the accountability system so that 10 points in growth for a low-income student counts twice as much as for a high-income student. Also, when statisticians estimate value-added, they also end up estimating the statistical relationship (correlation) between student race and income and student achievement growth (see Oakville example above). Districts could use the measured role of student disadvantage as an indicator of how well the district is doing to overcome racial and income achievement gaps.2 If socio-economic disadvantage is associated with lower growth, then this might motivate districts to try even harder to address the issue. That is, rather than lowering expectations, it can be a driver of higher expectations and even greater effort for those students.

It is also worth recalling that value-added measures can help reduce the problem of driving out teachers of low-attainment schools. The same goes for disadvantaged students. To the degree that student race and income are associated with their achievement growth, failing to account for those factors will place the teachers of those students at a disadvantage.

In my view, it is best to account for student race and income when predicting each school’s achievement. The reason comes back to the Cardinal Rule: Hold people accountable for what they can control. While this means that schools serving disadvantaged students will receive higher performance ratings with less growth, it does not mean that schools can get by with giving less effort to these students. If, in addition, we give disproportionate weight to the growth of racial minorities and low-income students, and use the statistical relationship between student disadvantage and student achievement growth as a motivation to address the gap, then accounting for student disadvantage would seem to raise expectations and outcomes for these students rather than lower them. I cannot prove that, unfortunately, but there is a strong rationale behind it.”
15
Adjustments are based on real data
Why is it important that VARC uses student test scores to calculate adjustment factors?
We do not have a preconceived notion of which student subgroups will grow faster than others
We want to be as fair as possible when evaluating school performance
Student subgroups perform differently in each subject area from year to year
We want our adjustments to apply specifically to the situation we are evaluating
16
Multiple regression
Measures the effect of each variable on the posttest, controlling for all other variables
The effect of the pretest on the posttest controls for student characteristics and schools
The effects of student characteristics on the posttest control for the pretest and schools
The effects of schools (value added) on the posttest control for the pretest and student characteristics
All effects are measured simultaneously
17
Dosage
Accounts for students changing schools and classrooms
Students enrolled in a school or classroom for a fraction of a year get a fractional “dose” of the school’s or classroom’s effect
Dosage apportions student growth among the schools and classrooms a student is enrolled in during the same year
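As a concrete illustration, the sketch below computes dosage weights from an invented enrollment table; the column names and numbers are hypothetical.

```python
# A small sketch of dosage weighting, using an invented enrollment table with
# days enrolled per (student, school) pair in a single year.
import pandas as pd

enrollment = pd.DataFrame({
    "student": ["A", "A", "B"],
    "school": ["School 1", "School 2", "School 1"],
    "days_enrolled": [60, 120, 180],
})

# Dosage = days at a school / total days the student was enrolled anywhere that year
total_days = enrollment.groupby("student")["days_enrolled"].transform("sum")
enrollment["dosage"] = enrollment["days_enrolled"] / total_days
print(enrollment)
# Student A contributes one third of a "dose" to School 1 and two thirds to
# School 2, so each school is credited with that share of A's growth.
```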
18
Pretest measurement error
The pretest measures student attainment in the previous year with measurement error
Models that ignore this are biased in favor of high-attainment schools and classrooms
The value-added model accounts for measurement error in the pretest, using approaches in Fuller (1987)
This guards against that bias
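The simulation below is a simplified illustration of the problem and of a textbook reliability (errors-in-variables) correction; it is not the Fuller (1987) estimator itself, and all values are invented.

```python
# A simplified illustration of pretest measurement error: a noisy pretest
# attenuates the post-on-pre slope, and a known test reliability can correct it.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
true_pretest = rng.normal(0.0, 1.0, n)                      # true prior achievement
observed_pretest = true_pretest + rng.normal(0.0, 0.5, n)   # test adds noise
posttest = 0.8 * true_pretest + rng.normal(0.0, 0.3, n)

# Naive slope of posttest on the observed pretest is pulled toward zero.
naive_slope = np.cov(observed_pretest, posttest)[0, 1] / np.var(observed_pretest)

# Reliability = var(true score) / var(observed score). In practice it comes from
# the test's published measurement properties; here we can compute it because
# the truth is simulated.
reliability = np.var(true_pretest) / np.var(observed_pretest)
corrected_slope = naive_slope / reliability

print(f"naive: {naive_slope:.2f}, corrected: {corrected_slope:.2f}")  # ~0.64 vs ~0.80
```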
19
Models that correct for measurement error avoid bias in favor of schools and classrooms with high initial scores
20
Value added model
All of these features ensure that value added reflects the results of schooling on student achievement
Value added uses the available data to measure the impact of schools and classrooms as accurately, fairly, and realistically as possible
21
Work in progress
Classroom-level value added: measures student growth within classrooms
Differential effects value added: measures growth among students of a particular group (ELL, disability, etc.) in a school or classroom
Value added from other assessments: Scantron (short-term), Explore/PLAN/ACT (high school)