Value Added in CPS.

What is value added?
- A measure of the contribution of schooling to student performance
- Uses statistical techniques to isolate the impact of schooling from other factors
- Focuses on how much students improve from one year to the next

Demographic adjustments
- Value added makes adjustments for the demographics of schools and classrooms
- Adjustments are determined by the relationships between growth and student characteristics
- Adjustments measure partial differences in growth across groups district-wide

Some schools with a low percentage of students meeting or exceeding standards are nevertheless high value-added schools in which students grow.
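A tiny illustration with hypothetical scores makes the distinction concrete: attainment (the percentage of students above a cut score) and growth (the average gain) can point in opposite directions.

```python
# Hypothetical (pretest, posttest) pairs; "meets" means posttest >= 230.
school_a = [(180, 205), (190, 215), (200, 228)]   # low attainment, big gains
school_b = [(235, 238), (240, 241), (245, 247)]   # high attainment, small gains

def pct_meets(scores, cut=230):
    """Percent of students whose posttest meets the cut score."""
    return 100 * sum(post >= cut for _, post in scores) / len(scores)

def avg_gain(scores):
    """Average pre-to-post gain, a simple stand-in for growth."""
    return sum(post - pre for pre, post in scores) / len(scores)

print(pct_meets(school_a), avg_gain(school_a))   # 0.0 26.0
print(pct_meets(school_b), avg_gain(school_b))   # 100.0 2.0
```

School A has no students meeting the standard yet an average gain of 26 points; School B has everyone meeting the standard but only 2 points of growth.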

Value added in many domains
- Annual state assessments: focus on year-to-year student improvement
- Short-term assessments: focus on short-term student improvement, with potentially faster turnaround
- High school assessments (Explore/PLAN/ACT, for example): focus on improvement during high school

Value added in CPS
- Based on the ISAT for grades 3 through 8
- Analyzes students' ISAT scores, demographics, and schools attended
- Schools and classrooms where students improve more, relative to similar students, are identified as high value added
- Value added is the extra ISAT points gained by students at a school or classroom, on average, relative to observably similar students across the district
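A toy sketch of that last definition, using hypothetical numbers rather than actual ISAT data: value added as the average difference between students' actual scores and the scores predicted for observably similar students district-wide.

```python
# Hypothetical posttest scores for one school's students, and the
# district-wide predictions for observably similar students.
actual    = [232, 228, 241]
predicted = [229, 227, 236]

# Value added = average extra points relative to similar students.
value_added = sum(a - p for a, p in zip(actual, predicted)) / len(actual)
print(value_added)  # 3.0: students beat expectations by 3 points on average
```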

Alternative understanding
Average student gain on the ISAT relative to the district average, with adjustments for:
- Shape of the test scale (prior ISAT score)
- Grade level
- Gender, mobility, free/reduced-price lunch, race/ethnicity, disability, language proficiency, homelessness, and the other-subject pretest
- Enrollment in multiple schools or classrooms

Regression model (in English)

Posttest = (Post-on-Pre Link × Pretest) + Student Characteristics + School and Classroom Effects (Value Added) + Unobserved Factors
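The regression relationship on this slide can be sketched as a small function. The coefficient values below are hypothetical, not CPS estimates, and the unobserved error term is omitted.

```python
def predicted_posttest(pretest, link, student_adjustment, value_added):
    """Expected posttest under the slide's model (error term omitted):
    posttest = link * pretest + student characteristics + value added."""
    return link * pretest + student_adjustment + value_added

# A student with a pretest of 200, a +4 adjustment for characteristics,
# in a classroom whose value added is +3 (all numbers made up):
print(predicted_posttest(200, link=0.95, student_adjustment=4.0, value_added=3.0))
```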

Student characteristics
- Gender
- Race/ethnicity
- Free or reduced-price lunch
- Language proficiency (by Access score)
- Disability (by disability type)
- Mobility
- Homelessness
- Other-subject pretest

Why include student characteristics?
One goal of value-added analysis is to be as fair as possible. We want to remove the effects of factors that were not caused by the school during the specific period we are evaluating.

What do we want to evaluate?

Related to the school (value added reflects the impact of these factors):
- Curriculum
- Classroom teacher
- School culture
- Math pull-out program at school
- Structure of lessons in school
- Safety at the school

Not related to the school (these factors need to be measured and isolated):
- Student motivation
- English Language Learner status
- At-home support
- Household financial resources
- Learning disability
- Prior knowledge

Controlling for other factors
Students bring different resources to the classroom. These factors can affect growth, so we want to remove the effects of these non-school factors.

Not related to the school (these factors need to be measured and isolated):
- Student motivation
- English Language Learner status
- At-home support
- Household financial resources
- Learning disability
- Prior knowledge

Controlling for other factors
In order to include a characteristic in the model, we must have data on that characteristic for all students. Some characteristics are harder to measure and collect than others. The data we do have available can often serve as a stand-in for the data we would like to have.

Controlling for other factors
- What we want: household financial resources
- What we have: free or reduced-price lunch
These are related data: we can use free or reduced-price lunch as a substitute for our ideal data about household finances in our calculations.

Some possible talking points on the question:
- Less likely to have external help (tutoring services, etc.)
- Reduced access to resources at home (books, computer, etc.)
- Reduced parental availability for homework/study help (multiple jobs?)
- Points from Doug Harris's book (summer learning loss, etc.)

From Doug Harris's book:

"Should value-added models take into account student race, income, and other student factors in estimating value-added? This is one of the more controversial questions in using value-added for accountability. In one respect, the issue boils down to whether taking into account prior achievement is enough. Race, ethnicity, and income are after all closely related to student attainment on test scores (see Figure 1.1). But it turns out that these factors are less closely related to achievement growth.

The reason for this should be intuitive: if student demographics are associated with achievement in every grade, then the association should largely 'cancel out' when subtracting the two. Here is a simple, concrete example to highlight this: Suppose that a student scored 60 points in one year and 80 points in the next year and grade, for growth of 20 points. Now, suppose that part of the reason explaining these scores is the fact that the student's parents do not make sure the child does her homework each night (in all years). This might reduce the student's score by 5 points in each year from what it would have been. So, the student would have scored 65 and 85 (instead of 60 and 80) if the student had done her homework. Notice that the growth is exactly the same in both cases—20 points. So long as the influence on student scores is constant over time, the influence on scores should cancel out in this way.

But minority and low-income students do still learn at slower rates. One study finds that while most of the gap by family income exists in 1st grade, it grows by about 30 percent between 1st and 5th grade.[1] Almost all of this, in turn, is due not to what happens in school, but to the 'summer learning loss,' the period between school years when students are not in school. This is most likely because the same factors creating the starting gate inequalities also affect learning growth. Because standardized tests are not administered at the beginning of the school year, the summer learning loss is embedded within the student growth measures. Further, the summer learning loss is substantially outside the control of schools and therefore problematic. An accurate measure of value-added must take into account all of the factors outside their control.

The concern with accounting for race and income, however, is that some see it as reducing expectations for these students. There are legitimate differences of opinion on this point, but let me clarify two different meanings of 'lower expectations.' On the one hand, this could mean that accounting for race and income means that schools can get by giving less effort to raise achievement for disadvantaged students. Value-added measures that account for race and income do not lower expectations in this sense. In fact, the whole point of value-added is to create an even playing field and one that provides incentives for schools to help all students.

Alternatively, some interpret 'lower expectations' to mean that schools serving disadvantaged students will have the same measured performance as schools serving advantaged students while generating less student learning. If this is what is meant by lower expectations, then it is a legitimate point. Value-added models that adjust predictions based on student race and income do require less learning of disadvantaged students to reach the same level of school performance. Again, this just reflects the fact that schools should not be held accountable for factors outside their control, and students' home environments and other factors affecting students clearly fall into that category.

There is some potential middle ground on this issue. If the concern is about how much emphasis schools place on achievement of different groups, value-added measures can be designed to place as much or as little weight on disadvantaged students as we wish. For example, we could design the accountability system so that 10 points in growth for a low-income student counts twice as much as for a high-income student. Also, when statisticians estimate value-added, they also end up estimating the statistical relationship (correlation) between student race and income and student achievement growth (see Oakville example above). Districts could use the measured role of student disadvantage as an indicator of how well the district is doing to overcome racial and income achievement gaps.[2] If socio-economic disadvantage is associated with lower growth, then this might motivate districts to try even harder to address the issue. That is, rather than lowering expectations, it can be a driver of higher expectations and even greater effort for those students.

It is also worth recalling that value-added measures can help reduce the problem of driving out teachers of low-attainment schools. The same goes for disadvantaged students. To the degree that student race and income are associated with their achievement growth, failing to account for those factors will place the teachers of those students at a disadvantage.

In my view, it is best to account for student race and income when predicting each school's achievement. The reason comes back to the Cardinal Rule: Hold people accountable for what they can control. While this means that schools serving disadvantaged students will receive higher performance ratings with less growth, it does not mean that schools can get by with giving less effort to these students. If, in addition, we give disproportionate weight to the growth of racial minorities and low-income students, and use the statistical relationship between student disadvantage and student achievement growth as a motivation to address the gap, then accounting for student disadvantage would seem to raise expectations—and outcomes—for these students rather than lower them. I cannot prove that, unfortunately, but there is a strong rationale behind it."

Adjustments are based on real data
Why is it important that VARC uses student test scores to calculate adjustment factors?
- We do not have a preconceived notion of which student subgroups will grow faster than others
- We want to be as fair as possible when evaluating school performance
- Student subgroups perform differently on each subject area from year to year
- We want our adjustments to apply specifically to the situation we are evaluating

Multiple regression
- Measures the effect of each variable on the posttest, controlling for all other variables
- The effect of the pretest on the posttest controls for student characteristics and schools
- The effects of student characteristics on the posttest control for the pretest and schools
- The effects of schools (value added) on the posttest control for the pretest and student characteristics
- All effects are measured simultaneously
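A minimal sketch of this "controlling for" idea, using made-up data and a plain ordinary-least-squares solver (this is an illustration, not VARC's actual model). In the toy data the true model is posttest = pretest + school effect, where School A adds 5 points and School B adds 0; a naive comparison of average posttests makes School A look worse, while the regression recovers its true effect.

```python
pre  = [100, 110, 120, 130]
post = [105, 115, 120, 130]          # first two students attend School A
in_a = [1, 1, 0, 0]

# Naive comparison of average posttests (no control for pretest):
naive_gap = sum(p for p, a in zip(post, in_a) if a) / 2 - \
            sum(p for p, a in zip(post, in_a) if not a) / 2

def ols(X, y):
    """Ordinary least squares via the normal equations (X'X b = X'y),
    solved with Gaussian elimination and partial pivoting."""
    n, k = len(X), len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(n)) for j in range(k)]
         for i in range(k)]
    b = [float(sum(X[r][i] * y[r] for r in range(n))) for i in range(k)]
    for i in range(k):                      # forward elimination
        piv = max(range(i, k), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        b[i], b[piv] = b[piv], b[i]
        for r in range(i + 1, k):
            f = A[r][i] / A[i][i]
            for c in range(i, k):
                A[r][c] -= f * A[i][c]
            b[r] -= f * b[i]
    coef = [0.0] * k                        # back substitution
    for i in reversed(range(k)):
        coef[i] = (b[i] - sum(A[i][c] * coef[c]
                              for c in range(i + 1, k))) / A[i][i]
    return coef

# Design matrix: intercept, pretest, School A dummy.
X = [[1.0, float(pre[r]), float(in_a[r])] for r in range(4)]
intercept, pre_coef, school_a_effect = ols(X, post)
print(naive_gap)                      # -15.0: School A looks 15 points worse
print(round(school_a_effect, 6))      # 5.0: controlling for pretest recovers +5
```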

Dosage
- Accounts for students changing schools and classrooms
- Students enrolled in a school or classroom for a fraction of a year get a fractional "dose" of that school's or classroom's effect
- Apportions student growth among the schools and classrooms a student attends in the same year
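A minimal sketch of a dosage calculation, under the assumption that a student's dose is proportional to enrolled days (the actual VARC weighting may differ).

```python
def dosage_weights(days_enrolled):
    """Fraction of the year's effect attributed to each school,
    proportional to days enrolled there."""
    total = sum(days_enrolled.values())
    return {school: days / total for school, days in days_enrolled.items()}

# A student spends 108 days at School X and 72 at School Y (hypothetical):
w = dosage_weights({"School X": 108, "School Y": 72})
print(w)  # School X gets a 0.6 dose, School Y a 0.4 dose
```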

Pretest measurement error
- The pretest measures student attainment in the previous year with measurement error
- Models that ignore this error are biased in favor of high-attainment schools and classrooms
- The VA model accounts for measurement error in the pretest, using approaches in Fuller (1987), which ensures against this bias
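A hedged sketch of the attenuation problem this slide describes. The one-line correction below is the classic disattenuation formula (dividing the naive slope by the pretest's reliability), not the Fuller (1987) estimator itself, and the numbers are hypothetical.

```python
def corrected_slope(naive_slope, reliability):
    """Disattenuate a regression slope, where
    reliability = var(true pretest) / var(observed pretest)."""
    return naive_slope / reliability

# If the true post-on-pre link is 1.0 and the pretest's reliability is 0.8,
# a naive regression recovers only 1.0 * 0.8 = 0.8; the correction undoes
# this shrinkage toward zero.
naive = 1.0 * 0.8
print(corrected_slope(naive, 0.8))  # 1.0
```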

Models that correct for measurement error avoid biasing in favor of schools and classrooms with high initial scores

Value added model
- All of these features help ensure that value added reflects the results of schooling on student achievement
- Value added uses the data available to measure the impact of schools and classrooms as accurately, fairly, and realistically as possible

Work in progress
- Classroom-level value added: measures student growth within classrooms
- Differential-effects value added: measures growth among students of a particular group (ELL, disability, etc.) in a school or classroom
- Value added from other assessments: Scantron (short-term); Explore/PLAN/ACT (high school)