TEACHER EFFECTIVENESS INITIATIVE VALUE-ADDED TRAINING Value-Added Research Center (VARC) October 2012.



Districts and States Working with VARC
Districts: Minneapolis, Milwaukee, Racine, Chicago, Madison, Tulsa, Atlanta, New York City, Los Angeles, Hillsborough County, Collier County
States: North Dakota, South Dakota, Minnesota, Wisconsin, Illinois, New York

The Power of Two: A more complete picture of student learning (adapted from materials created by Battelle for Kids)

Achievement:
- Compares students' performance to a standard
- Does not factor in students' background characteristics
- Measures students' performance at a single point in time
- Critical to students' post-secondary opportunities

Value-Added:
- Measures students' individual academic growth longitudinally
- Factors in students' background characteristics outside of the school's control
- Measures the impact of teachers and schools on academic growth
- Critical to ensuring students' future academic success

Overview: How does Value-Added fit into the Teacher Effectiveness Initiative?

Where does VARC get data?
- State Departments of Education: student test scores, demographics, enrollment; teacher licensure and assignments
- Institutions of Higher Education: graduate information
- Pilot Districts: student-teacher linkages, non-NCLB test scores

Where does VARC usually send School-Level and Grade-Level output?
VARC → District → Schools
IHE? (if the district shares)
Teachers?

Where will VARC send Teacher-Level output?
VARC → Teacher
IHE (graduates only)
Principals? Districts?

How will this inform your IHE's Program Improvement?
Teacher Education Program Improvement draws on:
- Graduate Value-Added
- IHE-Level Aggregate Value-Added
- Surveys
- Observational Data

The Oak Tree Analogy: varc.wceruw.org/welcome

The Oak Tree Analogy Review

Method 1: Measure the Height of the Trees Today (One Year After the Gardeners Began)
Gardener A's oak: 61 in. Gardener B's oak: 72 in.
- Using this method, Gardener B is the more effective gardener.
This method is analogous to using an Achievement Model.

Method 2: Compare Starting Height to Ending Height
Oak A: 47 in. at age 3 (1 year ago) → 61 in. at age 4 (today), a gain of +14 in.
Oak B: 52 in. at age 3 (1 year ago) → 72 in. at age 4 (today), a gain of +20 in.
- Oak B had more growth this year, so Gardener B is the more effective gardener.
This is analogous to a Simple Growth Model, also called Gain.

Controlling for Non-Gardener Factors
Predicted Oak A: 47 in. + 20 (average growth) + 3 (rainfall) - 3 (soil) - 8 (temperature) = +12 inches during the year, for a predicted height of 59 in.
Predicted Oak B: 52 in. + 20 (average growth) - 5 (rainfall) + 2 (soil) + 5 (temperature) = +22 inches during the year, for a predicted height of 74 in.
- The predicted height for trees in Oak A's conditions is 59 inches.
- The predicted height for trees in Oak B's conditions is 74 inches.

Method 3: Compare the Predicted Height to the Actual Height
Oak A (Gardener A): predicted 59 in., actual 61 in. → 2 in. above prediction (Above Average Value-Added)
Oak B (Gardener B): predicted 74 in., actual 72 in. → 2 in. below prediction (Below Average Value-Added)
- By accounting for last year's height and the environmental conditions of the trees during this year, we found the "value" each gardener "added" to the growth of the trees.
This is analogous to a Value-Added measure.
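The analogy's arithmetic can be written as a tiny computation. The growth numbers are the ones from the oak tree slides; the helper functions are only an illustrative sketch of the idea (VARC's actual value-added models are statistical, not lookup tables).

```python
# Sketch of the oak tree analogy's value-added arithmetic.
# Numbers come from the slides; the functions are illustrative only.

AVERAGE_GROWTH = 20  # average one-year growth for all oak trees, in inches

def predicted_height(starting_height, adjustments):
    """Predicted end-of-year height: starting height, plus average growth,
    plus adjustments for conditions outside the gardener's control."""
    return starting_height + AVERAGE_GROWTH + sum(adjustments.values())

def value_added(actual_height, predicted):
    """The 'value added' by the gardener: actual minus predicted height."""
    return actual_height - predicted

# Gardener A's oak: started at 47 in.; rainfall +3, soil -3, temperature -8
pred_a = predicted_height(47, {"rainfall": 3, "soil": -3, "temperature": -8})
# Gardener B's oak: started at 52 in.; rainfall -5, soil +2, temperature +5
pred_b = predicted_height(52, {"rainfall": -5, "soil": 2, "temperature": 5})

print(pred_a, value_added(61, pred_a))  # 59 2   (Gardener A: above average)
print(pred_b, value_added(72, pred_b))  # 74 -2  (Gardener B: below average)
```

Note that Gardener B's tree ends taller (72 in. vs. 61 in.) yet B has negative value-added: the prediction, not the raw height, is the benchmark.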

Sample Report Review

Page 1: Color Coding Explanation; Table of Contents; Reporting Period and Context

Page 2: School-Level Value-Added Estimates; Grade-Level Value-Added Estimates

Page 2 Top: School-Level Value-Added
For READING and MATH, the table reports Value-Added estimates for the Past Academic Year and the Up-To-3-Year Average, plus the number of students (weighted) included in the analysis. Each row shows:
- Subject and level of analysis
- Value-Added estimate: the point estimate (number in a color-coded bubble) and a 95% confidence interval (black line)
- Number of students included in the analysis
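The black line around each color-coded point estimate is a 95% confidence interval. Assuming the usual normal-approximation interval (the report does not state its exact formula, and the estimate and standard error below are made up), it can be computed as:

```python
def confidence_interval_95(point_estimate, standard_error):
    """Normal-approximation 95% confidence interval:
    point estimate +/- 1.96 standard errors."""
    half_width = 1.96 * standard_error
    return (point_estimate - half_width, point_estimate + half_width)

# Hypothetical school-level estimate on the report's 1-5 scale,
# with a made-up standard error
low, high = confidence_interval_95(3.4, 0.3)
print(round(low, 3), round(high, 3))  # 2.812 3.988
```

A wider black line (larger standard error, often from fewer students) means the point estimate is less certain.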

Page 2 Bottom: Grade-Level Value-Added (Grades 3, 4, and 5). FAQ 1: Which school year is this?

Value-Added on the NDSA
Timeline: Grade 3 → Summer → Grade 4 → Summer → Grade 5 → Summer → Grade 6, with the NDSA administered each November; 3rd, 4th, and 5th Grade Value-Added each cover one November-to-November span.
- 4th grade example:
- "Starting knowledge" is the November 4th grade test.
- "Ending knowledge" is the November 5th grade test.
- This aligns to growth in the 4th grade school year.
- Why don't we have 8th grade Value-Added in North Dakota?

Page 2 Bottom: Grade-Level Value-Added (Grades 3, 4, and 5). FAQ 2: How do I interpret the "Up-To-3-Year Average"?

What Does "Up-To-3-Year Average" Mean for the 3rd Grade?
It does not follow individual students for 3 years:
- NOT Jimmy as he goes through three consecutive school years (3rd grade to 4th grade, 4th grade to 5th grade, 5th grade to 6th grade)
It represents the 3rd grade teaching team over three cohorts of students:
- three successive cohorts, each measured from 3rd grade to 4th grade
- Keep teacher mobility in mind

What Does "Up-To-3-Year Average" Mean?
The Grade-Level Value-Added table (READING, Grades 3-5) reports a "Past Academic Year" estimate and an "Up-To-3-Year Average" for each grade, along with the number of students (weighted).
- The "Past Academic Year" estimate represents longitudinal growth over a single school year: one cohort of 3rd graders, one of 4th graders, and one of 5th graders.
- The "Up-To-3-Year Average" represents the average longitudinal growth of three different groups of students at each grade level.
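Conceptually, the multi-year figure behaves like an average of the single-year estimates for one grade, weighted by each cohort's (weighted) student count. VARC's actual estimator is more sophisticated, and the numbers below are invented, but the sketch captures the "three cohorts, one grade level" idea:

```python
def up_to_3_year_average(estimates, student_counts):
    """Average of up to three single-year value-added estimates for one
    grade level, weighted by the (weighted) student count in each cohort.
    Conceptual sketch only; not VARC's actual estimator."""
    total = sum(student_counts)
    return sum(e * n for e, n in zip(estimates, student_counts)) / total

# Three successive cohorts of 3rd graders (hypothetical estimates and counts)
print(up_to_3_year_average([3.2, 2.8, 3.6], [24, 26, 25]))
```

Because the inputs are three different cohorts of 3rd graders, a stable multi-year average says more about the 3rd grade teaching team than about any individual student's trajectory.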

What Does "Up-To-3-Year Average" Mean? (Grade-Level Value-Added, READING, Grades 3-5)
- Which grade-level teaching team…
- Was most effective in the most recent school year?
- Was most effective over the past three school years?
- Was more effective in the most recent year than in the past?

Page 2 Bottom: Grade-Level Value-Added (Grades 3, 4, and 5). FAQ 3: Does this show student growth going from red to yellow to green over time?

Value-Added, Not Achievement
READING: Grade 3: 61, Grade 4: 63, Grade 5
MATH: Grade 3: 61, Grade 4: 63, Grade 5
- In your groups:
- Describe this school's math performance
- Describe this school's reading performance

Page 2 Bottom: Grade-Level Value-Added (Grades 3, 4, and 5). FAQ 4: Why are there non-integer numbers of students?

Mobile Students
- If a student is enrolled in more than one school between the November NDSA administration and the end of the school year, each school gets credit for a portion of the student's growth.
Example: a 3rd grader attends School A and then School B between the November NDSA and the end of the school year; 55% of the student's growth is attributed to School A and 45% to School B.
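The split can be expressed as a simple proportional weighting, which is also why student counts come out non-integer. The attribution rule here (share of enrolled days between the November NDSA and year end) and the day counts are assumptions chosen to reproduce the 55%/45% example; the slide only states that each school gets credit for a portion of the student's growth.

```python
def attribute_growth(enrolled_days_by_school):
    """Split one student's weight across schools in proportion to the days
    the student was enrolled at each school between the November NDSA
    administration and the end of the school year (assumed rule)."""
    total_days = sum(enrolled_days_by_school.values())
    return {school: days / total_days
            for school, days in enrolled_days_by_school.items()}

# Hypothetical 3rd grader: 99 days at School A, then 81 days at School B
weights = attribute_growth({"School A": 99, "School B": 81})
print(weights)  # School A gets 0.55 of the student, School B gets 0.45
```

Summing these fractional weights over all of a school's students yields the "Number of Students (Weighted)" column in the report.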

Student Group Interpretation

Pages 3 & 4: School-Level Value-Added Estimates by Student Groups: Special Ed, Low Income, Gender, LEP

Example Student Group Interpretation
The report breaks out READING Value-Added "By Special Ed" (Special Ed vs. Not Special Ed), with Past Academic Year and Up-To-3-Year Average estimates and weighted student counts.
- At this school, which student group is growing faster than their similar peers from across the state?
- Does that mean the "Special Ed" group grew more scale score points on the test than the "Not Special Ed" group?

Student Group Interpretation: What Would You Do?
The report breaks out MATH Value-Added "By LEP" (LEP vs. Not LEP), with Past Academic Year and Up-To-3-Year Average estimates and weighted student counts.
- What do these results mean?
- If this were your school, how could you use these results to monitor instructional improvement?

Page 5: School-Level Value-Added and Achievement (Scatter Plot Interpretation)

Page 5 Scatter Plots

How to Read the Scatter Plots
These scatter plots are a way to represent Achievement and Value-Added together: Value-Added on one axis and achievement (Percent Proficient/Advanced, 2010) on the other.

How to Read the Scatter Plots
Each point is a school in your district, plotted by Value-Added and Percent Proficient/Advanced (2010). Five regions of the plot:
- A: Students know a lot and are growing faster than predicted
- B: Students are behind, but are growing faster than predicted
- C: Students know a lot, but are growing slower than predicted
- D: Students are behind, and are growing slower than predicted
- E: Students are about average in how much they know and how fast they are growing
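The five regions can be sketched as a classification rule. All cutoffs below (a midpoint of 3 on the report's 1-5 value-added scale, 50% proficient/advanced, and the "about average" bands) are hypothetical choices for illustration; the actual scatter plot shows continuous values with no hard boundaries.

```python
def classify_school(value_added, pct_prof_adv,
                    va_mid=3.0, ach_mid=50.0,
                    va_band=0.5, ach_band=5.0):
    """Assign a school to one of the five scatter-plot regions (A-E).
    Cutoffs are hypothetical; the report plots continuous values."""
    # Region E: about average on both axes
    if (abs(value_added - va_mid) <= va_band
            and abs(pct_prof_adv - ach_mid) <= ach_band):
        return "E"
    if pct_prof_adv > ach_mid and value_added > va_mid:
        return "A"  # know a lot, growing faster than predicted
    if pct_prof_adv <= ach_mid and value_added > va_mid:
        return "B"  # behind, but growing faster than predicted
    if pct_prof_adv > ach_mid and value_added <= va_mid:
        return "C"  # know a lot, but growing slower than predicted
    return "D"      # behind, and growing slower than predicted

print(classify_school(4.2, 80))  # A
print(classify_school(4.2, 20))  # B
print(classify_school(2.0, 80))  # C
print(classify_school(2.0, 20))  # D
print(classify_school(3.1, 52))  # E
```

The point of the two-axis view is that B schools (low achievement, high growth) and C schools (high achievement, low growth) look identical on achievement-only or growth-only reports but call for very different responses.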

Page 6 (or 6 & 7 for large grade span schools): Grade-Level Value-Added and Achievement

Page 6 Example (Grade 4)

Last Pages (Part 1): 1-5 Value-Added Scale; Note Regarding Comparison of Student Groups; Number of Students (Weighted)

Last Pages (Part 2): List of Control Variables; Note Regarding the Bush Foundation; List of Reasons for Result Suppression