KVEC Presents PGES Observation Calibration Are You On Target?



Does Not Replace Calibration

Targets
- Understand the calibration purpose and process
- Get the most out of administrator and principal reports
- Support observers who need practice and coaching

What Is Calibration, and What Is Its Purpose?

New Observer Training and Calibration Videos in Teachscape Focus

Two 15-minute video segments
Feedback includes:
- Accuracy of scores
- Rationales for scores assigned
- Recommendations for improving accuracy

How Are They Different?

Proficiency Assessment:
- Measures all sub-skills, including:
  1. Distinguishing between evidence, interpretation, and bias
  2. Aligning evidence to components
  3. Recognizing evidence that has been misaligned to components
  4. Assigning an accurate score for each of the 8 FfT components
- Summative: pass/fail
- A comprehensive measure of proficiency
- Averages 2-3 hours for each of two parts
- Observers must pass the principal proficiency assessment to evaluate teachers

Calibration:
- A much shorter "check" on one aspect of proficiency: assigning an accurate score for each of the 8 FfT components
- Formative: feedback only
- Covers only the current level of accuracy and steps for maintaining or improving accuracy
- Takes approximately one hour to complete
- Next steps are a district decision

Periodic check on accuracy intended to help mitigate rater drift:
- Customizable calibration windows
- Observers score two 15-minute videos
- Feedback to observers on the accuracy of ratings, rationales for scores, and recommendations for next steps
- Aggregated reports comparing observer performance with expert scores, with recommendations for support and/or remediation

Focus for Observers: Develop and Maintain Observation Skills
- New observers develop skills through observer training and scoring practice in Teachscape Focus, then demonstrate proficiency on the Proficiency Assessment.
- Certified observers maintain skills through ongoing training/practice, calibration, and recertification.

Why Is Initial Training Not Enough?
- Research shows that "rater drift" occurs over time.
- Drift undermines the validity of observation scores and feedback.
- It can have a progressively adverse impact on the quality of inferences made about classroom teaching practice.
- It disrupts inter-rater reliability.
- In a study of 458 teachers and 1,800 lessons, variability in teachers' scores over time was driven by increasing severity in observers' scoring, after accounting for changes in teaching quality.

Rater drift is real. It can be caused by:
- Fatigue/stress
- Colleague influence/bias
- Relative performance

KDE 4-Year Plan for PGES
Year 1: Initial proficiency certification
Years 2-3: Calibration
Year 4: Recertification

Teachscape Calibration (limited-time Kentucky pricing)
- $199 per year per principal/observer: observer training, 8 scoring practice videos, and 5 calibration windows opened throughout the year. This is exactly like the license used in year 1, but without the assessment.
- $149 per year per principal/observer: 8 scoring practice videos and 5 calibration windows opened throughout the year.
- $99 per year per observer: 6 calibration windows to check scoring accuracy throughout the year.

How to Order a Calibration License
1. Decide which license you will be purchasing.
2. Contact Colleen McHugh and let her know how many of each type of license you need, along with the names of the users and who the district administrator will be. Phone:
3. Colleen will send you a quote, which you sign and return to Teachscape with a PO for processing.
4. When the licenses are ready for activation, you will receive an email from Teachscape with directions for activating the accounts and opening calibration windows.

Calibration Reports and Supports

Administrator Reports
- Scoring Practice: Rater Agreement
- Calibration: Calibration Performance
- Proficiency Tests: Score Range and Recommendations, Test Performance, Component Proficiency

Gather Evidence, Align It to the Components, Score
1. Take evidence on a calibration video
2. Align evidence to components
3. Score all components based on the evidence gathered

Adjacent Scores Are Not Good Enough

Demonstrating Scoring Accuracy

Needs Practice and Support

Needs Remediation and Monitoring

Recommendations for Administrators: Decision Rules at the End of a Calibration Window

Demonstrating Scoring Accuracy (≤ 2 discrepant (13%) / ≥ 12 exact (75%)):
The observers in this category scored accurately on the two calibration videos. Assuming they arrived at their scores the same way the expert scorers did, they have provided evidence of being able to apply the Framework for Teaching accurately. They are ready to observe classrooms.

Needs Practice and Support (all score combinations that do not match the green or red category):
These observers need additional practice and support as they prepare to do classroom observations. They should review the rubrics, benchmarks, and rangefinders for the components on which they did not match the expert scores exactly. It is also suggested that they complete a video from the Scoring Practice. You may want to monitor these observers more closely than those in the "Demonstrating Scoring Accuracy" category.

Needs Remediation and Monitoring (≥ 8 discrepant (50%) / ≤ 4 exact (25%)):
These observers are inconsistent in their application of the Framework for Teaching to videos of classroom practice. One source of scoring inconsistency may be professional preferences influencing their judgment and application of the rubric language. They should go through training again with a colleague or coach, and it is strongly recommended that they be monitored when doing classroom observations. For example, they might be sent into a classroom with another trained observer so that their evidence and scores can be compared, or you may decide to have them videotape the lesson they are observing so that a second observer can review it.
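The decision rules above can be sketched as a small classifier. This is an illustrative sketch, not Teachscape code: the thresholds come from the table, but the slide does not state whether the paired green thresholds are both required, so the sketch assumes green requires both conditions and red triggers on either. The function name and signature are hypothetical.

```python
def classify_observer(exact: int, discrepant: int, total: int = 16) -> str:
    """Classify an observer's calibration result using the slide's
    decision rules. Counts are out of `total` scored components
    (two videos x 8 FfT components = 16)."""
    if exact + discrepant > total:
        raise ValueError("exact + discrepant cannot exceed total")
    # Green: at most 2 discrepant (13%) AND at least 12 exact (75%)
    if discrepant <= 2 and exact >= 12:
        return "Demonstrating Scoring Accuracy"
    # Red: at least 8 discrepant (50%) OR at most 4 exact (25%)
    if discrepant >= 8 or exact <= 4:
        return "Needs Remediation and Monitoring"
    # Yellow: every combination that matches neither green nor red
    return "Needs Practice and Support"
```

Under these assumptions, an observer with 12 exact and 2 discrepant scores lands in the green category, while one with 8 or more discrepant scores lands in the red category regardless of exact matches.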

For Inquiring Minds…
- What happens if my performance reveals that I am not scoring accurately?
- Where can I find help?
- If I fall into the red category, can I re-calibrate immediately?

How to Support Your Principals Who Need Practice and Coaching

District Role
- The district receives recommendations on the district-level calibration reports.
- It identifies observers who fall into the yellow or red categories.
- Each district has identified, or will identify, structures and processes to support individuals who need monitoring or more significant interventions.

"Norming exercises should be a standard component of evaluator training, and evaluators whose observation scores do not accurately calibrate to the master rating should receive more training before they may observe teachers for evaluative purposes." (Reform Support Network, June 2013)

Discuss This Scenario
A principal with 3 years of experience successfully obtained observer certification. In 2015, the principal completed the calibration requirement and had 9 of 16 scores that were discrepant or adjacent.
- How does your district know that this principal needs support?
- How does your district currently provide support to principals for calibration and accuracy?

Processes and Protocols to Support Principals: A Checklist
✓ One or more district administrators are assigned the task of documenting and tracking evaluator certification and calibration.
✓ A district protocol has been established to support principals who are not successful with certification.
✓ A district protocol has been established to support principals who demonstrate rater drift in calibration.
✓ District staff who support principals have successfully passed Teachscape certification and calibration.
✓ District professional learning encourages ongoing reflection on the KyFfT and appropriate alignment of evidence (principal cadres, administrative meetings, etc.).

Job-Embedded Research suggests basing this additional training on practice at the rater’s school, rather than on video analysis of teachers that the evaluator will never observe. This would address a concern raised by experts that evaluators tend to rate teachers they know higher than those with whom they are not familiar.

Support for Capacity Building Another trained administrator might review calibration exercises with the evaluator, identifying any potential misunderstandings regarding KyFfT, evidence, bias, etc.

Support for Capacity Building Another support is to send impartial observers to conduct co-observations with trained/certified school or district leaders.

Or evaluators could submit a video of a lesson they observe along with their rating, at which point a trained impartial observer rates the teacher as well.

In any scenario, the goal is to identify those school leaders most in need of support and help them improve their accuracy in the context of the classrooms of the teachers they will evaluate.

How do principals engage in practice and ongoing support for calibration?
- Review of the KyFfT?
- Practice identifying evidence?
- Practice aligning evidence to the KyFfT?
- DLTs? PLCs?

Happy Calibrating