Dr. Robert H. Meyer, Research Professor and Director


Value-Added Systems
Presentation to the ISBE Performance Evaluation Advisory Council
Dr. Robert H. Meyer, Research Professor and Director
Value-Added Research Center, University of Wisconsin-Madison
February 25, 2011

Attainment and Gain
- Attainment: a "point in time" measure of student proficiency; compares the measured proficiency rate with a predefined proficiency goal.
- Gain: measures the average gain in student scores from one year to the next.

Attainment versus Gain
[Chart: attainment versus gain, Grades 3-8]

Growth: Starting Point Matters
Reading results of a cohort of students at two schools (example assumes beginning-of-year testing):

School   2006 Grade 4 Avg. Scale Score   2007 Grade 5 Avg. Scale Score   Gain
A        455                             465                             10
B        425*                            455*                            30

Grade 4 proficient cutoff: 438. Grade 5 proficient cutoff: 463.
*Scale score average is below proficient.
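The slide's point can be worked through in a few lines, using the averages and cutoffs it gives (438 for grade 4, 463 for grade 5):

```python
# Gain vs. attainment, using the averages and cutoffs from the slide.
cutoffs = {"grade4": 438, "grade5": 463}
schools = {
    "A": {"grade4_avg": 455, "grade5_avg": 465},
    "B": {"grade4_avg": 425, "grade5_avg": 455},
}

for name, s in schools.items():
    gain = s["grade5_avg"] - s["grade4_avg"]           # growth measure
    proficient = s["grade5_avg"] >= cutoffs["grade5"]  # attainment measure
    print(f"School {name}: gain={gain}, grade-5 proficient={proficient}")
```

School B never reaches the proficiency bar, yet its students gained three times as much as School A's: attainment and growth answer different questions.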

Value-Added
- A kind of growth model that measures the contribution of schooling to student performance on standardized tests.
- Uses statistical techniques to separate the impact of schooling from other factors that may influence growth.
- Focuses on how much students improve on the tests from one year to the next, measured in scale score points.

Value-Added Model Definition A value-added model (VAM) is a quasi-experimental statistical model that yields estimates of the contribution of schools, classrooms, teachers, or other educational units to student achievement, controlling for non-school sources of student achievement growth, including prior student achievement and student and family characteristics. A VAM produces estimates of productivity under the counterfactual assumption that all schools serve the same group of students. This facilitates apples-to-apples school comparisons rather than apples-to-oranges comparisons. The objective is to facilitate valid and fair comparisons of productivity with respect to student outcomes, given that schools may serve very different student populations.

A More Transparent (and Useful) Definition of VA
Value-added productivity is the difference between actual student achievement and predicted student achievement. Equivalently, it is the difference between actual student achievement and the average achievement of a comparable group of students (where comparability is defined by a set of characteristics such as prior achievement, poverty, and ELL status).

In English
Posttest = (Post-on-Pre Link × Pretest) + Student Characteristics + School Effects (Value-Added) + Unobserved Factors
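That decomposition, with a school's value-added read off as how far its students land above or below prediction, can be sketched on simulated data. Everything below (school names, coefficients, sample sizes) is invented for illustration; this is not VARC's production model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate students in 3 hypothetical schools with different true effects.
n_per_school = 200
true_effects = {"s1": 3.0, "s2": 0.0, "s3": -3.0}

pretest, posttest, school = [], [], []
for name, effect in true_effects.items():
    pre = rng.normal(220, 15, n_per_school)  # pretest scale scores
    # posttest = link * pretest + intercept + school effect + noise
    post = 0.9 * pre + 30 + effect + rng.normal(0, 5, n_per_school)
    pretest.append(pre)
    posttest.append(post)
    school += [name] * n_per_school

pre = np.concatenate(pretest)
post = np.concatenate(posttest)
school = np.array(school)

# Step 1: regress posttest on pretest (the post-on-pre link).
X = np.column_stack([np.ones_like(pre), pre])
beta, *_ = np.linalg.lstsq(X, post, rcond=None)
residual = post - X @ beta

# Step 2: a school's value-added estimate is its mean residual.
va = {name: residual[school == name].mean() for name in true_effects}
```

With enough students per school, the mean residuals recover the simulated effects (roughly +3, 0, and -3 relative to the overall mean); a fuller model would also control for student characteristics, as the slide's equation indicates.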

VARC Philosophy
- Development and implementation of a value-added system should be structured as a continuous improvement process that allows for full participation of stakeholders.
- Model: co-build; complete customization
- Analysis
- Reporting
- Value-added is one tool in a toolbox with multiple indicators.

VARC Value-Added Partners
- Design of Wisconsin State Value-Added System (1989)
- Minneapolis (1992)
- Milwaukee (1996)
- Madison (2008)
- Wisconsin Value-Added System (2009)
- Milwaukee Area Public and Private Schools (2009)
- Racine (2009)
- Chicago (2006)
- Department of Education: Teacher Incentive Fund (TIF) (2006 and 2010)
- New York City (2009)
- Minnesota, North Dakota & South Dakota: Teacher Education Institutions and Districts (2009)
- Illinois (2010)
- Hillsborough County, FL (2010)
- Broward County, FL (2010)
- Atlanta (2010)
- Los Angeles (2010)
- Tulsa (2010)

Districts and States Working with VARC
Minneapolis, Milwaukee, Madison, Racine, Chicago, New York City, Los Angeles, Tulsa, Atlanta, Hillsborough County, Broward County

Measuring knowledge
Many factors influence what a student learns and how that knowledge is measured. A variety of measures, including (but not limited to) assessments, tell us what a student knows at a point in time. What are some ways we measure knowledge?

Measuring knowledge
- Large-scale assessments
- Local assessments used by the district
- Daily teacher assessments
- Observations
Examples: MAP, WKCE, diagnostic tests, end-of-course exams, daily journals, unit projects, after-school activities, hands-on projects

The Simple Logic of Value-Added Analysis
- School Value-Added Report: school-specific data; grade-level value-added
- Comparison Value-Added Reports: compare a school to other schools in the district, CESA, or state; also allows for grade-level comparisons
- Tabular data available for school reports and comparison reports

- How to read a post-on-pre graph
- Hallmark of growth: 7th grade math strongly predicts 8th grade math (as evident in the strong positive association)
- Noise in test scores

Attainment and Value-Added

How complex should a value-added model be? Rule: "Simpler is better, unless it is wrong." This implies the need for "quality of indicator / quality of model" diagnostics.

Model Features
- Demographics
- Posttest-on-pretest link
- Measurement error
- Student mobility: dose model
- Classroom vs. teacher: unit vs. agent
- Differential effects
- Selection bias mitigation: longitudinal data
- Test property analysis
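Of those features, measurement error has a particularly concrete effect: noise in the pretest attenuates the estimated post-on-pre link, and a known test reliability can be used to correct for it. A textbook errors-in-variables sketch with simulated numbers, not VARC's implementation:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

true_pre = rng.normal(220, 15, n)               # true achievement
post = 0.9 * true_pre + 30 + rng.normal(0, 5, n)

noise_sd = 6.0                                  # measurement error on the pretest
observed_pre = true_pre + rng.normal(0, noise_sd, n)

# Reliability = true-score variance / observed-score variance.
reliability = 15**2 / (15**2 + noise_sd**2)     # about 0.862

# The naive slope is attenuated by roughly the reliability factor...
naive_slope = np.cov(observed_pre, post)[0, 1] / np.var(observed_pre)
# ...so dividing by the reliability disattenuates it back toward 0.9.
corrected_slope = naive_slope / reliability
```

Ignoring this correction would understate the post-on-pre link and misattribute some of the resulting prediction error to schools.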

MAP vs. ISAT
- MAP test dates: September, January, May
- MAP uses Rasch equating; ISAT uses 3PL
- MAP has slightly higher reliability: ~0.96 in math, ~0.94 in reading (ISAT: ~0.93 in math, ~0.90 in reading)
- Cut scores on MAP are determined by equipercentile equating to ISAT
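The last bullet can be illustrated: equipercentile equating maps a cut score on one test to the score at the same percentile on the other. The distributions and the cut score below are invented; real linking works from actual score distributions, typically smoothed:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical score distributions for the same population on two tests.
isat = rng.normal(230, 30, 10_000)   # ISAT-like scale
map_ = rng.normal(210, 12, 10_000)   # MAP-like (RIT) scale

isat_cut = 245.0                     # hypothetical ISAT proficiency cut

# Percentile of the cut score on the ISAT distribution...
pct = (isat < isat_cut).mean()
# ...mapped to the MAP score at that same percentile.
map_cut = np.quantile(map_, pct)
```

A student at the equated MAP cut then sits at the same rank in the MAP distribution as an ISAT-proficient student does in the ISAT distribution.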

Minimal correlation between initial status and value-added

Grade-Level Statewide Results: Math, MN

Grade      N   Mean Score   SD of Score   Reliability of Value-Added
3      59460   200.0        13.9          0.901
4      58346   210.8        14.6          0.916
5      57053   219.9        16.2          0.907
6      52400   226.7        16.5          0.873
7      47985   232.1        17.4          0.883
8      44227   236.4        17.9          0.823
9      26512   238.8        18.2          0.826

Grade-Level Statewide Results: Math, WI

Grade      N   Mean Score   SD of Score   Reliability of Value-Added
3      43289   199.9        13.2          0.820
4      44140   209.3        13.7          0.842
5      43822   217.3        14.8          0.849
6      47004   222.7        15.2          0.836
7      44549   228.4        16.0          0.837
8      43246   233.1        16.8          0.865
9      26427   234.0        17.7          0.862

Grade-Level Statewide Results: Reading, WI

Grade      N   Mean Score   SD of Score   Reliability of Value-Added
3      43139   194.8        15.1          0.736
4      43671   202.9        14.4          0.780
5      43668   209.7        13.8          0.737
6      46233   214.0        14.2          0.719
7      44616   218.3        14.0          0.792
8      43251   221.7        14.1          0.826
9      28066   223.3                      0.843
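The reliability column is not just descriptive: in shrinkage (empirical-Bayes) estimation, a noisy value-added estimate is pulled toward the group mean in proportion to its unreliability. A generic sketch, using two reliabilities from the reading table but an invented raw estimate; this is not the formula behind these particular tables:

```python
def shrink(raw_va, reliability, group_mean=0.0):
    """Shrink a raw value-added estimate toward the group mean.

    reliability = signal variance / (signal variance + error variance),
    so reliability 1.0 keeps the raw estimate and 0.0 discards it.
    """
    return group_mean + reliability * (raw_va - group_mean)

# A raw estimate of +5 scale-score points is trusted more where VA is
# measured reliably (grade 8 reading, 0.826) than where it is less
# reliable (grade 6 reading, 0.719).
print(shrink(5.0, 0.826))   # roughly 4.13
print(shrink(5.0, 0.719))   # roughly 3.60
```

Shrinkage guards against over-interpreting extreme estimates from small or noisy grade-level samples.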

MPS and MMSD Value-Added Compared to Wisconsin
[Chart: school/district VA productivity parameters in WKCE scale score units, relative to state; state VA model, MPS, and MMSD school effects; mathematics, 6th to 7th grade (Nov 2006 – Nov 2007)]

Visit the VARC website (http://varc.wceruw.org/) for more information about VARC and value-added.