
1 TECHNICAL AND CONSEQUENTIAL VALIDITY IN THE DESIGN AND USE OF VALUE-ADDED SYSTEMS LAFOLLETTE SCHOOL OF PUBLIC AFFAIRS & VALUE-ADDED RESEARCH CENTER, UNIVERSITY OF WISCONSIN-MADISON Robert Meyer, Research Professor and Director

2 VARC Partner Districts and States
• Design of Wisconsin State Value-Added System (1989)
• Minneapolis (1992)
• Milwaukee (1996)
• Chicago (2006)
• Department of Education: Teacher Incentive Fund (TIF) (2006 and 2010)
• Madison (2008)
• Wisconsin Value-Added System (2009)
• Milwaukee Area Public and Private Schools (2009)
• Racine (2009)
• New York City (2009)
• Minnesota, North Dakota & South Dakota: Teacher Education Institutions and Districts (2009)
• Illinois (2010)
• Hillsborough (2010)
• Atlanta (2010)
• Los Angeles (2010)
• Tulsa (2010)
• Collier County (2012)
• New York (2012)
• California Charter Schools Association (2012)
• Oklahoma Gear Up (2012)

3 Districts and States Working with VARC
[Map showing partner districts and states: Minneapolis, Milwaukee, Chicago, Madison, Tulsa, Atlanta, New York City, Los Angeles, Hillsborough County, Collier County; Minnesota, Wisconsin, Illinois, North Dakota, South Dakota, New York]

4 Context and Research Questions

5 Components of Educator Effectiveness Systems
• Value-Added System
• Data Requirements and Data Quality
• Professional Development (Understanding and Application)
• Evaluating Instructional Practices, Programs, and Policies
• Alignment with School, District, and State Policies and Practices
• Embedded within a Framework of Data-Informed Decision-Making

6 Uses of a Value-Added System
• Evidence that all students can learn
• Set school performance standards
• Triage: identify low-performing schools
• Contribute to district knowledge about "what works"
• Data-informed decision-making / performance management

7 Development of a Value-Added System
• Clarity: what is the objective?
• Dimensions of validity and reliability
• Why? To achieve accuracy, fairness, and improved teaching and learning
• How complex should a value-added model be? Possible rule: "Simpler is better, unless it is wrong."

8 Dimensions of Validity and Reliability
• Accuracy
• Criterion validity
• Technical (causal) validity
• Reliability (precision)
• Consequential validity
• Transparency

9 Technical Validity
• Technical validity measures the degree to which the statistical model and data used in the model (for example, student outcomes, student characteristics, and student-classroom-teacher linkages) provide consistent (unbiased) estimates of performance using the available student outcomes/assessments
• Requires development of a quasi-experimental model that captures (to the extent possible) the structural factors that determine student achievement and growth in student achievement
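As an illustrative gloss on "consistent (unbiased)" (the notation below is assumed, not taken from the slides): if η_jk denotes the true classroom/school performance of teacher j in school k and η̂_jk its estimate, then

```latex
% Illustrative only: \eta_{jk} is the true classroom/school performance parameter
% and \hat{\eta}_{jk} its estimate from the value-added model.
E\big[\hat{\eta}_{jk}\big] = \eta_{jk} \quad \text{(unbiased)}, \qquad
\hat{\eta}_{jk} \;\xrightarrow{\,p\,}\; \eta_{jk} \quad \text{(consistent as the sample grows)}
```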

10 Consequential Validity
• Consequential validity addresses the incentives and decisions that are triggered by the design and use of performance measures and performance systems

11 Transparency
• Transparency addresses the consequences of simplicity versus complexity in the design (and clarity of explanation) of value-added models and reports

12 Criterion Validity
• Criterion validity captures the degree to which effect estimates based on available student outcome data fully align with estimates based on the complete spectrum of student outcomes valued by stakeholders

13 Reliability
• Reliability (or precision) captures statistical error due to the fact that effectiveness estimates are based on finite samples of students, which in the context of estimating classroom and teacher performance are generally small
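A hedged sketch of how this is commonly quantified (the formula is not on the slide; the symbols are assumed): the reliability of a classroom-level estimate based on n students is the ratio of true performance variance to total variance.

```latex
% Assumed notation: \sigma^2_{\eta} is the variance of true classroom/school performance
% across classrooms, \sigma^2_{e} the student-level error variance, n the number of students.
\text{reliability} \;=\; \frac{\sigma^2_{\eta}}{\sigma^2_{\eta} + \sigma^2_{e}/n}
```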

14 Application of Framework
• Develop a value-added model that incorporates important structural factors that determine growth in student achievement, and specify performance parameters that represent educational units (classrooms) and agents (teachers)
• Identify and address threats to validity that could cause bias in the estimation of desired performance parameters
• Specify data uses, including the design of reports intended to inform decision making

15 Technical vs. Consequential Validity I
• Consider the consequences of controlling for prior achievement and other predictors – switching from measurement of attainment (as in NCLB) to growth
• Positive from the standpoint of technical validity because the estimates are more accurate
• Possibly negative from the perspective of consequential validity if controlling for prior achievement and other predictors inevitably leads to reduced expectations for poor and minority students

16 Technical vs. Consequential Validity II
• Consequences of inclusion of demographic variables?
• Possibly positive from the standpoint of technical validity because the estimates are more accurate
• Possibly negative from the perspective of consequential validity because the inclusion of these variables inevitably leads to reduced expectations for poor and minority students
• Or, the reverse is true

17 Value-Added Model

18 Generally Recommended Value-Added Model Features
• Longitudinal student outcome/assessment data
• Flexible (data-driven) posttest-on-pretest link, including possible nonlinearities in this relationship
• Contextual covariates
• Adjust for test measurement error
• Address changes in assessments over time
• Allow for end-of-grade & end-of-course exams
• Dosage/student mobility
• Allow differential effects by student characteristics
• Statistical shrinkage: address noise due to small samples (see the sketch after this list)
• Measures of precision and confidence ranges
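A minimal Python sketch of the statistical shrinkage feature, assuming raw classroom effect estimates and their standard errors are already available (the function and variable names are illustrative, not VARC's implementation):

```python
import numpy as np

def shrink_classroom_effects(raw_effects, std_errors):
    """Empirical-Bayes shrinkage: pull noisy classroom estimates toward the
    overall mean in proportion to how unreliable each estimate is."""
    raw_effects = np.asarray(raw_effects, dtype=float)
    std_errors = np.asarray(std_errors, dtype=float)
    grand_mean = raw_effects.mean()
    # Method-of-moments estimate of the "signal" variance across classrooms:
    # total observed variance minus the average sampling (noise) variance.
    signal_var = max(raw_effects.var(ddof=1) - np.mean(std_errors**2), 0.0)
    # Reliability weight for each classroom: signal / (signal + noise).
    reliability = signal_var / (signal_var + std_errors**2)
    return grand_mean + reliability * (raw_effects - grand_mean)

# Example: the middle classroom has the smallest standard error, so it is shrunk the least.
print(shrink_classroom_effects([0.8, -0.5, 0.1], [0.4, 0.1, 0.3]))
```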

19 Model Simplifications
• Longitudinal data for two time periods (appropriate for early grades)
• The model is defined in terms of true test scores; the estimation method controls for test measurement error
• The posttest-on-pretest relationship is assumed to be linear; this can be generalized
• Student mobility within the school year is ignored in order to simplify notation
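To make the true-score simplification concrete (symbols are assumed, not from the slide): the observed score is the true score plus measurement error, and the error variance implied by the test's published reliability is what the estimation method corrects for.

```latex
% Assumed notation: Y^{obs}_{1i} observed pretest, Y^{*}_{1i} true pretest, m_{1i} measurement error.
Y^{obs}_{1i} = Y^{*}_{1i} + m_{1i}, \qquad
\operatorname{Var}(m_{1i}) = \big(1 - \text{test reliability}\big)\,\operatorname{Var}\!\big(Y^{obs}_{1i}\big)
```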

20 Structural Determinants of Achievement and Achievement Growth
• Student level
  - Prior achievement
  - Student and family contribution
  - Within-classroom allocation of resources (including student performance expectations)
  - School contributions external to the classroom (supplemental in-school instruction, after-school instruction, summer school)

21 Structural Determinants of Achievement and Achievement Growth
• Classroom level
  - Peer effects
  - Contributions external to the teacher (school resources, policies, and climate; class size)
  - Contributions internal to the teacher (teacher resources, policies, and instructional practices; alignment with standards implied by assessments) (factors that may be covered by observational rubrics)

22 Preview of Alternative Performance Parameters
• Teacher performance:
• Classroom performance:
  - Includes contributions in the classroom from student peers and resources external to the teacher (such as other staff and class size)
• Factors external to the classroom (supplemental in-school instruction, after-school instruction, summer school):
• Classroom/school performance:
  - Includes contributions from the classroom and resources external to the classroom

23 Model Specification Strategy
• Include in the model all structural determinants of achievement and achievement growth
• Be explicit about how demographic variables and prior achievement contribute directly or indirectly (via other determinants) to achievement and growth
• Two types of student and demographic variables:
  - Level I (student level):
  - Level II (classroom level):
• Subscripts: student i, teacher j, and school k

24 I: Student-Level Equation
• Posttest:
• Pretest, with durability/decay parameter:
• Student and family contribution:
• Within-classroom contribution:
• Supplemental contribution:
  - Measures of supplemental factors not observed
• Subscripts: student i, teacher j, and school k
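A hedged reconstruction of what this student-level equation might look like (the equation images did not survive transcription, so all symbols below are assumed): the posttest depends on the pretest through a durability/decay parameter plus the student/family, within-classroom, and supplemental contributions.

```latex
% Assumed notation: Y_{2ijk} posttest, Y_{1ijk} pretest, \lambda durability/decay parameter,
% S_{ijk} student and family contribution, C_{ijk} within-classroom contribution,
% P_{ijk} supplemental contribution, \varepsilon_{ijk} error; student i, teacher j, school k.
Y_{2ijk} = \lambda\, Y_{1ijk} + S_{ijk} + C_{ijk} + P_{ijk} + \varepsilon_{ijk}
```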

25 Alternative Student-Level Equation
• Include explicit measures of supplemental resources in the model, producing a multiple-input (crossed effects) model
• This model is tractable if the crossed effects are not highly collinear. If the crossed effects are highly (or completely) collinear, then it may be possible to address provision of supplemental resources in the second level of the model as a factor external to the teacher.
• Our focus is on the conventional one-input model

26 Condition Factors on Student-Level Demographic Variables
• Student and family factor
• Within-classroom factor
• Supplemental factor
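One way to write this conditioning step, in the same assumed notation as the sketch after slide 24: each factor splits into a part explained by the student-level demographic variables and a residual, and the classroom-level residuals are what later roll up into classroom/school performance.

```latex
% Assumed notation: X_{ijk} are Level I (student-level) demographics; \gamma_S, \gamma_C,
% \gamma_P their contributions through each factor; u_{ijk}, c_{jk}, p_{jk} the remainders.
S_{ijk} = X_{ijk}\gamma_S + u_{ijk}, \qquad
C_{ijk} = X_{ijk}\gamma_C + c_{jk}, \qquad
P_{ijk} = X_{ijk}\gamma_P + p_{jk}
```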

27 Defines a VAM of Student Growth and Classroom/School Performance
• Combine student-level structural factors
• Pretest coefficient
• Effect of student-level characteristics
• Classroom/school performance
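Combining the pieces (still the assumed notation from the previous sketches) gives a conventional value-added equation with a pretest coefficient, an effect of student-level characteristics, and a classroom/school performance term:

```latex
Y_{2ijk} = \lambda\, Y_{1ijk} + X_{ijk}\beta + \eta_{jk} + e_{ijk},
\qquad \beta = \gamma_S + \gamma_C + \gamma_P,
\qquad \eta_{jk} = c_{jk} + p_{jk}
```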

28 Decomposition of Average Achievement
• Predicted achievement = Prior achievement + Student growth
• Average post achievement = Predicted achievement + Classroom/school performance
• Teacher subscripts jk dropped
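In the same assumed notation, averaging within a classroom (and dropping the jk subscripts as the slide does), the decomposition can be written as:

```latex
\bar{Y}_{2} \;=\;
\underbrace{\lambda\,\bar{Y}_{1} + \bar{X}\beta}_{\text{predicted achievement (prior achievement + student growth)}}
\;+\; \underbrace{\eta}_{\text{classroom/school performance}}
```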

29 Technical Validity
• Classroom/school performance from the value-added model that includes demographic variables is the structural parameter of interest:
• The performance parameter obtained from a model that excludes demographic variables is (approximately):
• This parameter is biased
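A sketch of the omitted-variable argument behind "this parameter is biased", in the assumed notation used above: excluding the demographic variables loads their average classroom-level contribution onto the performance estimate.

```latex
% \eta^{*}_{jk} denotes the performance parameter from the model without demographics.
\eta^{*}_{jk} \;\approx\; \eta_{jk} + \bar{X}_{jk}\beta
\qquad\Longrightarrow\qquad \text{bias} \;\approx\; \bar{X}_{jk}\beta
```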

30 II. Classroom/School Level Equation
• Classroom/school performance:
• Peer effects:
• Contributions external to the teacher:
• Contributions internal to the teacher:

31 Condition Factors on Average Classroom-Level Demographic Variables
• Peer effects:
• Contributions external to the teacher:
• Contributions internal to the teacher:

32 Defines a Model of Classroom/School Performance
• Preferred model (but not identified)
• Teacher parameter (not identified):
• Bias: productivity external to the teacher =
• Feasible model (biased)
• Bias is caused by "over-controlling"
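A hedged sketch of the identification problem described on slides 30-32, in the same assumed notation: classroom/school performance is the sum of peer effects, contributions external to the teacher, and contributions internal to the teacher, and conditioning only on classroom-level demographics cannot cleanly separate the teacher term from the external terms.

```latex
% Assumed notation: \rho_{jk} peer effects, \xi_{jk} contributions external to the teacher,
% \tau_{jk} contributions internal to the teacher (the parameter of interest).
\eta_{jk} = \rho_{jk} + \xi_{jk} + \tau_{jk} \quad \text{(preferred decomposition, not identified)}

% Feasible model: condition on classroom-level demographics \bar{X}_{jk}. External
% productivity not absorbed by \bar{X}_{jk}\delta stays in the teacher estimate (omission),
% while teacher productivity absorbed by \bar{X}_{jk}\delta is removed from it ("over-controlling").
\eta_{jk} = \bar{X}_{jk}\delta + \tau^{*}_{jk}
```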

33 Dilemma in the Choice of Models from the Perspective of Technical Validity
• Option A: Use classroom/school performance as a proxy measure of teacher performance; commit an error of "omission"
• Option B: Use the feasible, but biased, estimate of teacher performance; commit an error of "commission"
• Option C: Use a more complicated model to control for the factors external to the teacher

34 Consequential Validity: Uses and Decisions
• Parental choice of schools
• Teachers' willingness to teach in given schools
• Identification of master teachers
• Identification of teachers for professional development
• Performance-based compensation
• Provision of supplemental services
• Avoid bubble effects: incentives to deploy resources to students as an artifact of statistical measures (statistics based on means rather than medians can be affected by all students)

35 Key Point: the Power of Two
• Decisions need to be informed by:
  - A measure of school/classroom or teacher performance
  - Measures of student achievement
    - Actual average student achievement
    - Student achievement target (e.g., proficiency status)
• Options
  - Use only information on student attainment (NCLB)
  - Use only information on value-added performance
  - Use both pieces of data to inform decisions

36 Achievement Target, Performance, and Achievement Shortfall – Retrospective View
• Example with two teachers
• Focus on use of the classroom/school indicator
• Scale of parameters:
  - Value-added ratings are centered around zero with a standard deviation of one, and thus range from approximately -3 to 3
  - All other parameters (average achievement and the average contribution of demographic characteristics) are centered around zero and have been transformed to the value-added scale, although the standard deviations of these parameters are not constrained to equal one
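A small Python sketch of this scaling convention (function names are illustrative): value-added ratings are standardized to mean zero and standard deviation one, while other parameters are centered at zero and expressed in value-added standard-deviation units without forcing their own standard deviations to one.

```python
import numpy as np

def to_value_added_scale(ratings):
    """Center value-added ratings at zero and scale them to standard deviation one."""
    ratings = np.asarray(ratings, dtype=float)
    return (ratings - ratings.mean()) / ratings.std(ddof=1)

def center_on_value_added_scale(values, raw_ratings):
    """Center another parameter at zero and express it in units of the raw value-added
    ratings' standard deviation; its own standard deviation is left unconstrained."""
    values = np.asarray(values, dtype=float)
    raw_ratings = np.asarray(raw_ratings, dtype=float)
    return (values - values.mean()) / raw_ratings.std(ddof=1)
```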

37 How to Read the Scatter Plots
[Scatter plot: Value-Added (2009-2010) plotted against Percent Prof/Adv (2009); each point is a school in your district]
• A: Students know a lot and are growing faster than predicted
• B: Students are behind, but are growing faster than predicted
• C: Students know a lot, but are growing slower than predicted
• D: Students are behind, and are growing slower than predicted
• E: Students are about average in how much they know and how fast they are growing

38 Achievement Target, Performance, and Achievement Shortfall – Retrospective View
[Table comparing two teachers on: Achievement Target, Average Prior Achievement, Student Factor, Classroom/School Performance, Average Posttest, and Achievement Shortfall]

39 Achievement Target, Performance, and Achievement Shortfall – Prospective View
[Table: Achievement Target, Average Prior Achievement, Student Factor, Classroom/School Performance, Average Posttest, and Achievement Shortfall]

40 Achievement Target, Performance, and Achievement Shortfall – Prospective View
[Table: Achievement Target, Average Prior Achievement, Student Factor, Classroom/School Performance, Average Posttest, and Achievement Shortfall, shown for a range of classroom/school performance levels]
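A minimal Python sketch of the arithmetic behind these retrospective and prospective tables, following the decomposition on slide 28 (all quantities on the common value-added scale; the pretest coefficient is set to one purely for illustration):

```python
def achievement_shortfall(target, avg_prior, student_factor, performance):
    """Average posttest = prior achievement + student factor (growth) + classroom/school
    performance; the shortfall is how far that posttest falls below the target."""
    avg_posttest = avg_prior + student_factor + performance
    return avg_posttest - target

def performance_needed(target, avg_prior, student_factor):
    """Prospective use: the classroom/school performance level at which the
    predicted achievement shortfall is exactly zero."""
    return target - (avg_prior + student_factor)
```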

41 The Pros and Cons of Using Attainment Only
• It is straightforward to connect actual attainment with achievement targets and maintain a universal target
• Average achievement and related attainment indicators such as percent proficient are severely biased as measures of classroom/school performance
• Given a universal achievement target, the achievement shortfalls vary enormously across teachers and schools

42 The Pros and Cons of Using Value-Added Only
• The value-added model provides an unbiased/consistent estimate of classroom/school performance
• High value-added targets do not eliminate achievement shortfalls if prior achievement (or, more correctly, predicted achievement, which includes student growth) is extremely low

43 The Power of Using Both Indicators
• The value-added model provides an unbiased/consistent estimate of classroom/school performance
• Achievement shortfalls can be identified prospectively and thus can trigger supplemental resource allocations designed to eliminate them

44 Include Student-Level Demographics?
• Yes, to provide more accurate measures of classroom/school performance
• Does this reduce expectations?
  - No, achievement targets are set independently
  - Predicted achievement shortfalls are not reduced in a model that includes student demographics; in fact, they are identical
  - Supplemental resource allocations can be triggered to eliminate achievement shortfalls

45 Does Including Demographic Variables Matter?

Value-Added Difference   -0.7      -0.6      -0.5      -0.4      -0.3      -0.2      -0.1      0         0.1       0.2       0.3
Percent of Schools       0.000894  0.004468  0.001787  0.028597  0.033959  0.064343  0.161752  0.341376  0.308311  0.053619  0.000894
Percent of Students      0.000637  0.004155  0.001839  0.026487  0.029387  0.060667  0.141844  0.323455  0.341526  0.068783  0.00122
Female                   0.38889   0.5234    0.60577   0.54272   0.49097   0.50277   0.50274   0.49702   0.47864   0.46581   0.44928
African American         0.97222   0.83404   0.86538   0.75501   0.52828   0.29875   0.12653   0.04729   0.02879   0.01851   0.01449
Hispanic                 0         0.08085   0.01923   0.14219   0.30987   0.29991   0.16181   0.07987   0.05307   0.04704   0.01449
Asian                    0         0.025532  0         0.039386  0.054152  0.045468  0.067564  0.03843   0.029459  0.021851  0.072464
Indian                   0         0.012766  0.009615  0.004005  0.004813  0.009035  0.024433  0.021101  0.015636  0.004627  0
White                    0.02778   0.04681   0.10577   0.05941   0.10289   0.34684   0.61967   0.81321   0.87305   0.90797   0.89855
Free Reduced Lunch       1         0.93617   0.82692   0.90053   0.90433   0.73535   0.56008   0.40638   0.29889   0.23805   0.15942

