Presentation transcript:

Districts and States Working with VARC

Districts: Minneapolis, Milwaukee, Racine, Chicago, Madison, Tulsa, Atlanta, New York City, Los Angeles, Hillsborough County, Denver, Collier County
States: North Dakota, South Dakota, Minnesota, Wisconsin, Illinois

The Power of Two: A More Complete Picture of Student Learning

Achievement:
- Compares students' performance to a standard
- Does not factor in students' background characteristics
- Measures students' performance at a single point in time
- Critical to students' post-secondary opportunities

Value-Added:
- Measures students' individual academic growth longitudinally
- Factors in students' background characteristics outside of the school's control
- Measures the impact of teachers and schools on academic growth
- Critical to ensuring students' future academic success

Adapted from materials created by Battelle for Kids

Value-Added Model Description

Objective: Valid and fair comparisons of school productivity, given that schools may serve very different student populations.
Design: Quasi-experimental statistical model that controls for non-school factors (prior achievement, student and family characteristics).
Output: Productivity estimates for the contribution of educational units (schools, classrooms, teachers) to student achievement growth.

The Oak Tree Analogy

Explaining Value-Added by Evaluating Gardener Performance

For the past year, two gardeners (Gardener A and Gardener B) have been tending to their oak trees, trying to maximize the height of the trees.

Method 1: Measure the Height of the Trees Today (One Year After the Gardeners Began)

Gardener A's oak: 61 in. Gardener B's oak: 72 in.
This method is analogous to using an Achievement Model. Using this method, Gardener B is the better gardener.

Pause and Reflect

How is this similar to how schools have been judged in New York? What information is missing?

This Achievement Result Is Not the Whole Story

We need to find the starting height of each tree in order to more fairly evaluate each gardener's performance during the past year.
Oak A: 47 in. at age 3 (1 year ago), 61 in. at age 4 (today).
Oak B: 52 in. at age 3 (1 year ago), 72 in. at age 4 (today).

Method 2: Compare Starting Height to Ending Height

Oak A grew from 47 in. to 61 in. (+14 in.); Oak B grew from 52 in. to 72 in. (+20 in.).
This is analogous to a Simple Growth Model, also called Gain. Oak B had more growth this year, so Gardener B is the better gardener.
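
The two comparisons above can be sketched in a few lines of code. This is an illustrative snippet, not part of the original presentation; the heights are the ones given on the slides.

```python
# Heights in inches from the slides: (one year ago, today)
oaks = {"Gardener A": (47, 61), "Gardener B": (52, 72)}

# Method 1 (Achievement Model): compare today's heights only.
achievement = {name: today for name, (start, today) in oaks.items()}

# Method 2 (Simple Growth / Gain): compare growth over the year.
gain = {name: today - start for name, (start, today) in oaks.items()}

print(achievement)  # {'Gardener A': 61, 'Gardener B': 72}
print(gain)         # {'Gardener A': 14, 'Gardener B': 20}
```

Under both methods Gardener B looks better; the slides that follow show how adjusting for growing conditions changes that conclusion.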

What About Factors Outside the Gardeners' Influence?

Comparing the gardeners without accounting for growing conditions is an "apples to oranges" comparison. For our oak tree example, three environmental factors we will examine are rainfall, soil richness, and temperature.

External conditions for each tree:
- Rainfall amount: Oak Tree A high, Oak Tree B low
- Soil richness: Oak Tree A low, Oak Tree B high
- Temperature: Oak Tree A high, Oak Tree B low

How Much Did These External Factors Affect Growth?

We need to analyze real data from the region to predict growth for these trees. We compare the actual height of the trees to their predicted heights to determine whether each gardener's effect was above or below average.

In order to find the impact of rainfall, soil richness, and temperature, we will plot the growth of each individual oak in the region compared to its environmental conditions.

Calculating Our Prediction Adjustments Based on Real Data

Growth in inches relative to the average, by condition:
- Rainfall: low -5, high +3
- Soil richness: low -3, high +2
- Temperature: low +5, high -8

Make Initial Predictions for the Trees Based on Starting Height

Starting from last year's heights (Oak A: 47 in., Oak B: 52 in.) and adding the average growth of +20 in., the initial predictions are 67 in. for Oak A and 72 in. for Oak B.
Next, we will refine our predictions based on the growing conditions for each tree. When we are done, we will have an "apples to apples" comparison of the gardeners' effect.

Based on Real Data, Customize Predictions for Rainfall

For having high rainfall, Oak A's prediction is adjusted by +3, to 70 in. Similarly, for having low rainfall, Oak B's prediction is adjusted by -5, to 67 in.

Adjusting for Soil Richness

For having poor soil, Oak A's prediction is adjusted by -3, to 67 in. For having rich soil, Oak B's prediction is adjusted by +2, to 69 in.

Adjusting for Temperature

For having high temperature, Oak A's prediction is adjusted by -8, to 59 in. For having low temperature, Oak B's prediction is adjusted by +5, to 74 in.

Our Gardeners Are Now on a Level Playing Field

Oak A: +20 average + 3 for rainfall - 3 for soil - 8 for temperature = +12 inches during the year, so the predicted height for trees in Oak A's conditions is 47 + 12 = 59 inches.
Oak B: +20 average - 5 for rainfall + 2 for soil + 5 for temperature = +22 inches during the year, so the predicted height for trees in Oak B's conditions is 52 + 22 = 74 inches.
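
The prediction arithmetic above can be sketched as code. This is an illustrative snippet, not part of the original presentation; the +20 average growth and the per-condition adjustments are the values given on the preceding slides.

```python
AVERAGE_GROWTH = 20  # average one-year oak growth in the region, in inches

# Growth adjustments (inches relative to the average) for each condition level
ADJUSTMENTS = {
    "rainfall": {"low": -5, "high": +3},
    "soil": {"low": -3, "high": +2},
    "temperature": {"low": +5, "high": -8},
}

def predicted_height(start_height, conditions):
    """Starting height plus average growth plus condition adjustments."""
    growth = AVERAGE_GROWTH + sum(
        ADJUSTMENTS[factor][level] for factor, level in conditions.items()
    )
    return start_height + growth

oak_a = predicted_height(47, {"rainfall": "high", "soil": "low", "temperature": "high"})
oak_b = predicted_height(52, {"rainfall": "low", "soil": "high", "temperature": "low"})
print(oak_a, oak_b)  # 59 74
```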

Compare the Predicted Height to the Actual Height

Oak A's actual height (61 in.) is 2 inches more than predicted (59 in.). We attribute this above-average result to the effect of Gardener A.
Oak B's actual height (72 in.) is 2 inches less than predicted (74 in.). We attribute this below-average result to the effect of Gardener B.

Method 3: Compare the Predicted Height to the Actual Height

This is analogous to a Value-Added measure. By accounting for last year's height and the environmental conditions of the trees during this year, we found the "value" each gardener "added" to the growth of the tree: Gardener A's result (+2) is above-average value-added, and Gardener B's (-2) is below-average value-added.
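
As a minimal sketch (illustrative code, using the predicted and actual heights from the slides), the value-added step is just actual minus predicted:

```python
# (predicted, actual) heights in inches from the slides
trees = {"Gardener A": (59, 61), "Gardener B": (74, 72)}

# Value-added: actual height minus predicted height.
# Positive = above-average gardener effect, negative = below average.
value_added = {name: actual - predicted for name, (predicted, actual) in trees.items()}
print(value_added)  # {'Gardener A': 2, 'Gardener B': -2}
```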

How Does This Analogy Relate to Value-Added in the Education Context?

What are we evaluating?
- Oak Tree Analogy: Gardeners
- Value-Added in Education: Districts, schools, grades, classrooms, programs and interventions

What are we using to measure success?
- Oak Tree Analogy: Relative height improvement in inches
- Value-Added in Education: Relative improvement on standardized test scores

Sample
- Oak Tree Analogy: Single oak tree
- Value-Added in Education: Groups of students

Control factors
- Oak Tree Analogy: Tree's prior height; other factors beyond the gardener's control (rainfall, soil richness, temperature)
- Value-Added in Education: Students' prior test performance (usually the most significant predictor); other demographic characteristics such as grade level, gender, race/ethnicity, low-income status, ELL status, and disability status

A Visual Representation of Value-Added

From the starting student achievement scale score in Year 1 (prior-test) to Year 2 (post-test), Value-Added is the difference between the actual student achievement scale score and the predicted student achievement (based on observationally similar students).
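
A toy version of this picture can be computed directly. This sketch is not VARC's actual model (which also controls for demographic characteristics); it fits a simple least-squares line predicting post-test from prior-test scores on made-up data, then takes value-added as actual minus predicted.

```python
# Made-up prior-year and current-year scale scores for six students
prior = [400, 420, 450, 470, 500, 520]
post = [430, 445, 480, 505, 530, 555]

# Fit a least-squares line: predicted_post = intercept + slope * prior
n = len(prior)
mean_x = sum(prior) / n
mean_y = sum(post) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(prior, post)) / sum(
    (x - mean_x) ** 2 for x in prior
)
intercept = mean_y - slope * mean_x

# Value-added per student: actual post-test minus predicted post-test
predicted = [intercept + slope * x for x in prior]
value_added = [y - p for y, p in zip(post, predicted)]
print([round(v, 1) for v in value_added])
```

In a real model, residuals like these would be aggregated to the classroom or school level rather than reported per student.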

Producing a Value-Added Model for New York State

District Model vs. Statewide Model considerations: unified output and messaging, the best reference point for relative growth measurement, the largest pool of data to refine predictions, and a customized model design.

What Do Value-Added Results Look Like?

The Value-Added model typically generates a set of results measured in scale scores.
- Teacher A: +10. This teacher's students gained 10 more points on the RIT scale than observationally similar students across the state (10 points more than predicted).
- Teacher B: -10. These students gained 10 points fewer than predicted.
- Teacher C: 0. These students gained exactly as many points as predicted.

Value-Added in "Tier" Units

In some cases, Value-Added is displayed by grade on a "Tier" scale based on standard deviations (z-scores) for reporting purposes. About 95% of estimates will fall between -2 and +2 on this scale.
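
The standardization can be sketched as follows (illustrative code with made-up estimates): each scale-score estimate is converted to a z-score by subtracting the mean and dividing by the standard deviation, which is why roughly 95% of estimates land between -2 and +2.

```python
import statistics

# Made-up value-added estimates in scale-score points
estimates = [-12.0, -6.0, -2.0, 0.0, 1.0, 4.0, 7.0, 11.0]

mean = statistics.mean(estimates)
sd = statistics.stdev(estimates)  # sample standard deviation

# "Tier" units: each estimate expressed in standard deviations from the mean
tiers = [(e - mean) / sd for e in estimates]
print([round(t, 2) for t in tiers])
```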

Transformation Example

Value-Added estimates are mapped onto the rating categories: Ineffective, Developing, Effective, Highly Effective.

How Do We Translate Value-Added into the 0-20 Scale?

Advisor group: Value-Added Research Center, New York BOCES and districts, other New York stakeholders, Northwest Evaluation Association.

Output Formats

VARC Value-Added output is reported as scale scores, as standard deviations, and on a 0-20 scale based on the advisory group's recommended methodology.