
District of Columbia Public Schools | 1200 First Street, NE | Washington, DC 20002 | T 202.442.5885 | F 202.442.5026 | www.dcps.dc.gov

Student-Teacher Data Links: Lessons Learned in the DC Public Schools
July 26, 2011

Goals, Relevance and Outline

 Goals
 Roster confirmation (RC): to verify student-teacher data links, i.e., to connect the right students to the right teachers and to ensure we know who teaches whom, what, and when.
 Presentation: to assist other SEAs/LEAs with the successful and efficient implementation of roster confirmation by capturing the aspects of the project that were critical to its success and the issues that require particular attention.

Goals, Relevance and Outline

 Relevance:
 Mathematica Policy Research conducted analyses based on a preliminary roster validation process: “We measure the effect of classification error on value-added estimates of teacher effectiveness by comparing results generated with the original administrative roster data to results using the validated roster data. (…) About one in seven teachers in our data are mismatched with entire classrooms of students because they did not teach math, reading, or both to their students.” (Hock and Isenberg, 2010, emphasis added)
 These results assume that teachers with at least ten students receive an estimate. If that minimum were raised, an even higher teacher mismatch rate might be expected.

Goals, Relevance and Outline

 Outline
 Overview of IMPACT
 Overview of the RC process
 Defining the success of RC
 Key technical elements of RC
 Keys to the success of RC
 Threats to the success of RC

Outline

 Overview of IMPACT
 Overview of the RC process
 Defining the success of RC
 Key technical elements of RC
 Keys to the success of RC
 Threats to the success of RC

IMPACT Overview: Where We Were in 2007

…% vs. 95%: 8th Grade Reading Proficiency (2007 NAEP) vs. Teachers Meeting or Exceeding Expectations

IMPACT Reflection: Simple Goal for Our Teacher Effectiveness Work

Create a system in which every parent would be satisfied randomly assigning their children to any classroom in DCPS.

IMPACT Overview

 IMPACT is DCPS’ performance management system for teachers and other school-based employees
 Goals:
 Reward high-performing teachers
 Provide clear feedback and guidance for growth
 Transition out underperforming teachers

IMPACT Overview

 Important because “…having three years of good teachers…in a row would overcome the average achievement deficit between low-income kids…and others. In other words, high-quality teachers can make up for the typical deficits that we see in the preparation of kids from disadvantaged backgrounds.” Eric Hanushek, Stanford University (in Izumi and Evers, Teacher Quality, 2002)
 Developed through a collaborative process with substantial input from the DCPS community, prominent researchers, and assessment systems from major school districts around the country
 Measures of effectiveness tailored to 20 different staff categories and based on multiple components, each with a detailed scoring rubric

IMPACT Overview (slide graphic)

IMPACT Overview

 Five observations per year:
 Three by an administrator (Principal or Assistant Principal)
 Two by a Master Educator

IMPACT Overview

 Overall IMPACT scale: 100–400 points
 Four ratings: Highly Effective, Effective, Minimally Effective, and Ineffective
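To make the scale concrete, here is a minimal Python sketch of mapping an overall score to one of the four ratings. The cut points below are hypothetical placeholders; the slides do not state the actual band boundaries.

```python
# Hypothetical rating bands for the 100-400 point IMPACT scale.
# The actual cut points are not given in this presentation.
HYPOTHETICAL_BANDS = [
    (350, "Highly Effective"),
    (250, "Effective"),
    (175, "Minimally Effective"),
]

def impact_rating(score: float) -> str:
    """Map an overall IMPACT score (100-400) to a rating band."""
    if not 100 <= score <= 400:
        raise ValueError("IMPACT scores range from 100 to 400 points")
    for cutoff, rating in HYPOTHETICAL_BANDS:
        if score >= cutoff:
            return rating
    return "Ineffective"

print(impact_rating(312))  # -> "Effective" under these assumed cut points
```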

IMPACT Overview

 The ratings and their implications:
 Highly Effective: additional compensation, pending union agreement
 Effective: normal salary advancement
 Minimally Effective:
   Additional professional development
   Salary “hold”
   Separation after two consecutive years
 Ineffective: separation

Outline

 Overview of IMPACT
 Overview of the RC process
 Defining the success of RC
 Key technical elements of RC
 Keys to the success of RC
 Threats to the success of RC

Overview of the RC Process

 Pilot
 Pilot at scale
 Contract
 Business process
 Tool is built
 Tool goes live
 Initial data
 Follow-ups
 Final data

Outline

 Overview of IMPACT
 Overview of the RC process
 Defining the success of RC
 Key technical elements of RC
 Keys to the success of RC
 Threats to the success of RC

Defining the Success of RC

 High educator participation
   What percentage of teachers are expected to complete roster confirmation?
 Critical data (100%)
   Subject, roster, dosage
 Minimal time commitment
 Minimal direct support necessary
   Hotline, email, in-person sessions, school sessions
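As a rough illustration of how these success criteria might be tracked, here is a small Python sketch (not from the presentation; field names and record format are assumptions) that computes a participation rate and the share of submissions with all three critical data fields present.

```python
# Illustrative only: field names and record format are assumptions.
CRITICAL_FIELDS = ("subject", "roster", "dosage")

def rc_success_metrics(submissions, expected_teachers):
    """submissions: dict of teacher_id -> submitted record (a dict).
    expected_teachers: set of teacher_ids expected to complete RC."""
    completed = expected_teachers & submissions.keys()
    complete_data = [
        t for t in completed
        if all(submissions[t].get(f) not in (None, "", []) for f in CRITICAL_FIELDS)
    ]
    return {
        "participation_rate": len(completed) / max(len(expected_teachers), 1),
        "critical_data_rate": len(complete_data) / max(len(completed), 1),
    }
```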

Outline  Overview of IMPACT  Overview of the RC process  Defining the success of RC  Key technical elements of RC  Keys to the success of RC  Threats to the success of RC Lessons Learned District of Columbia Public Schools, July

Key Technical Elements of RC

RC was designed with the following user profiles in mind:

 Administrator / central office role
   What: design, development, testing, and operationalization.
   Who: developers and business owners (central office personnel who verified the data entered by teachers and conducted follow-ups).
   Access: full.
 Teacher role
   What: target audience; confirmed their rosters and assigned dosages to students.
   Who: teachers.
   Access: data entry; access to guidelines, instructions, and FAQs; and the ability to view the data they entered by logging in after submitting their roster.
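The two profiles amount to a simple role-based access model. The sketch below is an assumption about how such a split could be expressed, not the actual DCPS implementation.

```python
from enum import Enum, auto

class Permission(Enum):
    FULL_ACCESS = auto()          # design, testing, verification, follow-ups
    DATA_ENTRY = auto()           # confirm roster, assign dosages
    VIEW_OWN_SUBMISSION = auto()
    VIEW_GUIDELINES = auto()

# Role names and the permission split mirror the slide; the structure is assumed.
ROLE_PERMISSIONS = {
    "central_office": {Permission.FULL_ACCESS},
    "teacher": {
        Permission.DATA_ENTRY,
        Permission.VIEW_OWN_SUBMISSION,
        Permission.VIEW_GUIDELINES,
    },
}

def can(role: str, permission: Permission) -> bool:
    """Check whether a role holds a permission; FULL_ACCESS implies all."""
    perms = ROLE_PERMISSIONS.get(role, set())
    return Permission.FULL_ACCESS in perms or permission in perms
```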

Key Technical Elements of RC

A custom front end to the Quickbase Roster Confirmation online application was built to provide the following features:

 The ability to perform RC in a user-friendly interface that lets teachers add students to their roster from a dropdown containing all students at their school.
 The ability to read the guidelines before doing RC and to refer to the guidelines, FAQs, and instructions at any point while using the application.
 Automated quality-control checks: the system recognizes common mistakes and stops the teacher from submitting results until the mistakes are fixed.
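For example, a submission-time check might look like the following sketch. The specific rules (no duplicate students, dosage within bounds) are assumptions; the presentation does not enumerate the actual checks.

```python
def validate_submission(roster):
    """Return a list of error messages; an empty list means the roster may be submitted."""
    errors = []
    seen = set()
    for entry in roster:
        sid = entry["student_id"]
        if sid in seen:
            errors.append(f"Student {sid} appears more than once on the roster.")
        seen.add(sid)
        dosage = entry.get("dosage")
        if dosage is None or not 0 < dosage <= 100:
            errors.append(f"Student {sid}: dosage must be between 1 and 100 percent.")
    return errors

# The front end would block submission until validate_submission(roster) == [].
```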

Key Technical Elements of RC

 The underlying code that drives the application’s data manipulation retrieves the student roster for each teacher, loads it into the user interface, and captures the dosage data entered by the teacher.
 The application was designed and tested on different browsers.
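Here is a hedged sketch of that retrieve, display, capture cycle, using SQLite as a stand-in for the Quickbase back end. Table and column names match the database sketch later in this deck and are assumptions.

```python
import sqlite3

def load_roster(conn, teacher_id):
    """Retrieve the pre-populated roster for one teacher (to display in the UI)."""
    rows = conn.execute(
        "SELECT student_id, dosage FROM student_teacher_assignment"
        " WHERE teacher_id = ?",
        (teacher_id,),
    ).fetchall()
    return [{"student_id": s, "dosage": d} for s, d in rows]

def save_dosages(conn, teacher_id, entries):
    """Capture the dosage values the teacher entered in the interface."""
    conn.executemany(
        "UPDATE student_teacher_assignment SET dosage = ?"
        " WHERE teacher_id = ? AND student_id = ?",
        [(e["dosage"], teacher_id, e["student_id"]) for e in entries],
    )
    conn.commit()
```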

Key Technical Elements of RC

Identification and manipulation of seed data:

 Obtain a list of schools with school/principal contact information.
 Classify schools as elementary, middle, bilingual, or departmentalized.
 Obtain a list of students from the student record system.
 Identify teachers in the tested grades and subjects.
 Design rules to flag potential errors, and a follow-up process to correct the information.
 Identify the appropriate teacher ID for the best match across databases, and email addresses.
 Identify homerooms and math and reading courses.
 Classify teachers as math, reading, or both (homeroom); this may require calls to schools.
 Assign appropriate dosages.
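One of the trickier steps above is classifying teachers by tested subject. The sketch below shows assumed logic only; the real rules depended on course catalogs and, as the slide notes, sometimes calls to schools.

```python
def classify_teacher(course_names):
    """Classify a teacher as 'math', 'reading', or 'homeroom' (both) from course names."""
    names = [c.lower() for c in course_names]
    if any("homeroom" in c for c in names):
        return "homeroom"  # self-contained class: teaches both math and reading
    teaches_math = any("math" in c for c in names)
    teaches_reading = any("reading" in c or "english" in c for c in names)
    if teaches_math and teaches_reading:
        return "homeroom"
    if teaches_math:
        return "math"
    if teaches_reading:
        return "reading"
    return None  # not in a tested subject; flag for follow-up with the school
```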

Key Technical Elements of RC

 A relational Quickbase database was used, consisting of four major tables:
   A school table with a school code.
   A teacher table with a teacher ID as well as a “related” school ID.
   A student table with a student ID as well as an “attended” school ID.
   A student-to-teacher assignment (STA) table with all three IDs.
 After follow-ups, a teacher table and an STA table with the relevant elements were provided to the value-added contractor, including input data and initial and final output data.
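The four-table design translates naturally into relational DDL. The sketch below uses SQLite for concreteness; the actual system was Quickbase, and the column names and types are assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE school (
    school_code TEXT PRIMARY KEY
);
CREATE TABLE teacher (
    teacher_id  TEXT PRIMARY KEY,
    related_school_code TEXT REFERENCES school(school_code)
);
CREATE TABLE student (
    student_id  TEXT PRIMARY KEY,
    attended_school_code TEXT REFERENCES school(school_code)
);
-- STA: one row per student-teacher link, carrying all three IDs plus dosage.
CREATE TABLE student_teacher_assignment (
    teacher_id  TEXT REFERENCES teacher(teacher_id),
    student_id  TEXT REFERENCES student(student_id),
    school_code TEXT REFERENCES school(school_code),
    dosage      INTEGER,
    PRIMARY KEY (teacher_id, student_id)
);
""")
```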

Outline

 Overview of IMPACT
 Overview of the RC process
 Defining the success of RC
 Key technical elements of RC
 Keys to the success of RC
 Threats to the success of RC

Keys to the Success of RC

 What are the priorities?
 What are teachers asked to verify?
   The subject(s) they teach? AND/OR
   Their roster? AND/OR
   The dosage (the estimate of their contribution to each student’s instruction)?
 What system/process is appropriate? E.g., what is the role of schools/principals?

Keys to the Success of RC

 When does the tool go live? When should application development begin? When must the exact requirements be ready?
 Which teachers will complete roster confirmation? How soon do we need to know?
 What is a reading course? What is a math course? What is a homeroom? Which schools are departmentalized/bilingual, and how does that work? How much of this does the administrative data capture? Can scheduling data be leveraged?
 What are the must-haves for the first year? What are the nice-to-haves to add once the first year has rolled out successfully?

Keys to the Success of RC

 Who should be involved?
   E.g., teacher human capital, accountability, legal, data, technical, teaching and learning
 How many people from each team? Who? In what role? Who provides input? Who decides?
 How do we ensure ongoing alignment? Meet? How often?
 Is the language in the application both accurate and clear? Is it consistent with other communications materials? Are there legal/union issues?

Keys to the Success of RC

 How many rounds?
   What’s best? What’s feasible?
 How many rounds of follow-ups?
   To what extent can issues be identified in advance?
   What is the capacity for last-minute issues?
 What is the role of the value-added contractor?
 Are there legal/union issues here?

Outline

 Overview of IMPACT
 Overview of the RC process
 Defining the success of RC
 Key technical elements of RC
 Keys to the success of RC
 Threats to the success of RC

Threats to the Success of RC

 Unclear understanding of the importance of data accuracy
 Lack of clarity in messaging and instructions
 Highly complex concepts: one explanation does not fit all audiences, and it is impractical to present all explanations to all audiences
 Difficulty in getting everyone’s attention to a sometimes high level of detail
 What are the options?
   In-person/in-school sessions

Threats to the Success of RC

 Process to verify teacher entries
 Role of co-teachers and administrators
 Legal/union rules and teacher privacy
 Teachers not even entering the application, and central office not knowing why
 Balancing thorough follow-ups with capacity
 Very few changes in some follow-up categories

Contact Information

Anna Gregory
Director, IMPACT Operations
District of Columbia Public Schools, Office of Human Capital
1200 First Street NE, 10th Floor
Washington, DC 20002

Hella Bel Hadj Amor, Ph.D.
Senior Researcher
American Institutes for Research, Education, Human Development and the Workforce
1000 Thomas Jefferson Street, NW
Washington, DC