
Value Added for Teacher Evaluation in the District of Columbia Robin Chait, Office of the State Superintendent of Education Anna Gregory, District of Columbia Public Schools Eric Isenberg, Mathematica Policy Research Association for Education Finance and Policy 37 th Annual Conference March 16, 2012

Value Added

• Statistical model predicts student achievement
• Accounts for pretests and student characteristics
• Ranks teachers relative to an average teacher

Teacher value added = students' actual end-of-year test scores − students' predicted end-of-year test scores
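The formula above can be sketched in a few lines of code. This is a minimal illustration only, with made-up data and a single pretest control; a production value-added model would include more student characteristics and shrinkage adjustments.

```python
# Illustrative sketch of a value-added calculation (hypothetical data).
import numpy as np

rng = np.random.default_rng(0)
n = 200
pretest = rng.normal(50, 10, n)          # prior-year test scores
teacher = rng.integers(0, 5, n)          # 5 hypothetical teachers
true_effect = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
posttest = 10 + 0.9 * pretest + true_effect[teacher] + rng.normal(0, 3, n)

# Predict end-of-year scores from the pretest alone (simple OLS fit).
slope, intercept = np.polyfit(pretest, posttest, 1)
predicted = intercept + slope * pretest

# Value added = mean(actual - predicted) over each teacher's students,
# i.e., performance relative to an average teacher.
residual = posttest - predicted
value_added = {t: residual[teacher == t].mean() for t in range(5)}
```

Because the prediction is fit with an intercept, the residuals average to zero across all students, so each teacher's value-added score is naturally centered on the average teacher.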

Value Added in DCPS Evaluation System

Key Points

• Implementation requires sufficient capacity
• Communication strategy is vital
• Value added is worth the investment

Where We Were in 2007

12%: …th grade reading proficiency (2007 NAEP)
vs.
95%: teachers meeting or exceeding expectations

Why Value Added for DCPS?

• Fairest way to evaluate teachers
• Objective, data-based measure
• Focused on student achievement

Value Added in DCPS Evaluation System  Individual value-added measures: 50 percent of eligible teachers’ IMPACT scores IVA:Individual value added TLF: Teaching and learning framework (classroom observations) CSC:Commitment to school community SVA: School value added 7

IMPACT Is High Stakes

• Highly effective: performance pay
• Ineffective (one year): subject to separation
• Minimally effective (consecutive years): subject to separation

Overall Performance Distribution: PPEP vs. IMPACT
District of Columbia Public Schools | Summer 2011 (n = 3,469)

Value Added in DC

2009: DCPS (trial run)
First year of IMPACT in DCPS
Second year of IMPACT in DCPS
October to present: Third year of IMPACT in DCPS; first year of Race to the Top for DCPS and DC charter schools

Help for DC Public Schools

• Mathematica Policy Research
• Technical Advisory Board [2012]
– Steve Cantrell, Gates Foundation
– Laura Hamilton, RAND Corporation
– Rick Hanushek, Stanford University
– Kati Haycock, Education Trust
– David Heistad, Minneapolis Public Schools
– Jonah Rockoff, Columbia Business School
– Tim Sass, Georgia State University
– Jim Wyckoff, University of Virginia

Mathematica’s Work with DC Schools

Challenges

• Consider face validity and incentive effects
• Teacher-student link data can be challenging
• All data decisions shared with the district
• Timeline must allow DCPS to transition out poor performers and hire new teachers

No One-Size-Fits-All Value-Added Model

• Choosing student characteristics: communications challenge for race/ethnicity
• Multiple years of data: bias/precision trade-off
• Joint responsibility for co-teaching
– Cannot estimate a model of separate teacher effects
– Can estimate a "teams" model, but should team estimates count?
• Comparing teachers of different grades

Roster Confirmation

• Teacher-student links are critical for value added
• Administrative data can be challenging
– Specialized elementary school teachers
– Co-teaching
– Pull-out and push-in programs
– Midyear student transfers
• Teachers surveyed to confirm administrative roster data (Battelle for Kids)
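Co-teaching and midyear transfers mean one student can be legitimately linked to several teachers. One common way to handle this, sketched below with hypothetical field names and a hypothetical weighting rule (not DCPS's actual business rules), is to weight each confirmed teacher-student link by the share of the student's instruction it covers.

```python
# Hedged sketch: turning confirmed roster links into dosage weights.
from collections import defaultdict

# (student, teacher, months_taught) links after teacher roster confirmation
links = [
    ("s1", "t1", 9),                   # full year with one teacher
    ("s2", "t1", 5), ("s2", "t2", 4),  # midyear transfer
    ("s3", "t1", 9), ("s3", "t3", 9),  # co-taught all year
]

# Total instructional months observed for each student.
totals = defaultdict(int)
for student, _, months in links:
    totals[student] += months

# Weight each link by its share of the student's total instruction.
weights = {(s, t): m / totals[s] for s, t, m in links}
```

Under this rule a full-year student counts fully toward one teacher, a midyear transfer is split proportionally, and a co-taught student counts half toward each teacher.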

Business Rules: Documenting Data Decisions

• Every data decision defined, discussed, and documented beforehand
• OSSE and DCPS review all decisions
• Document the entire process
• Make quick progress when final data arrive

Production: Meeting Timelines, Ensuring Accuracy

• October data: formulate business rules
• February data
– Establish data-cleaning programs
– Begin trial runs from analysis file to final output
• April data: final student data in trial runs
• June (test score) data: produce final results

Perspective of State Education Agency

Race to the Top

• Federal competition among states
• Required student achievement to contribute 50% of the teacher evaluation score
• Decision to use the DCPS value-added model for all eligible DC teachers
• Brought DCPS and charter schools together
• Each charter school LEA has its own evaluation system, used to inform personnel decisions

Common Decision-Making

• Need to make decisions on value added
– Quickly, to meet the production schedule
– Informed by the best available data
– With buy-in from charter schools and DCPS
• Technical Support Committee (TSC)
– Six members: five charter, one DCPS
– Meets periodically
– Seeks consensus decisions

Data Infrastructure

• Most data elements for value added exist... but are not necessarily collected on the right schedule
• Student background characteristics
– Collected twice a year for AYP purposes
– Value added needs a three-times-a-year collection on an earlier schedule

Need Capacity Within the District

• Do not just hire a contractor
• Need dedicated staff to answer questions
– Data team
– Technical Support Committee

Communicating Results to DC Teachers

Communication Strategy

• Value added is hard to understand
– Requires a strong statistical background
– Final information is hard to connect to familiar test scores
– Different from other student achievement measures teachers commonly use
• Communication tools
– Guidebooks
– Information sessions

What Factors Affect a Student's Achievement?

Teacher factors: level of expectations, pedagogical expertise, ability to motivate, content knowledge
Student factors: prior learning, disability (if any), English proficiency, resources at home

All of these feed into student achievement as measured by the DC CAS. Value added isolates the teacher's impact on student achievement.

Initiatives Under Development

• Student-level output for DC teachers
– Would show pretest, predicted posttest, and actual posttest score for each student
– May be in graphical format
• Intermediate value-added scores
– Individual value-added scores based on intermediate tests
– Could be given to teachers midyear
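The proposed student-level output could be as simple as one row per student. The sketch below uses invented scores and a stand-in prediction rule (the real predicted posttest would come from the value-added model itself); it only illustrates the report's shape.

```python
# Illustrative student-level report: pretest, predicted, and actual posttest.
students = [
    ("Student A", 42, 55),   # (name, pretest, actual posttest) - made-up data
    ("Student B", 61, 60),
    ("Student C", 50, 58),
]

def predict(pretest, intercept=10.0, slope=0.95):
    # Hypothetical stand-in for the value-added model's prediction.
    return intercept + slope * pretest

rows = [(name, pre, round(predict(pre), 1), post) for name, pre, post in students]
header = f"{'Student':<12}{'Pretest':>8}{'Predicted':>11}{'Actual':>8}"
table = "\n".join([header] + [f"{n:<12}{p:>8}{pr:>11}{a:>8}" for n, p, pr, a in rows])
print(table)
```

A teacher reading such a row can see at a glance whether a student ended the year above or below the model's prediction, which is exactly the quantity that feeds the teacher's value-added score.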

Conclusions

• Implementing value added requires...
– Availability and accessibility of current data
– Confirmation of teacher-student links
– Careful planning of the production process
– Sufficient capacity within the local and/or state education agency to interact with the value-added contractor
• Teacher buy-in is not a given; a communication strategy is vital
• Properly implemented, value added is worth the investment
– Fairest measure of teacher effectiveness
– Provides data for answering research questions