Agenda
- Overview of evaluation
- Timeline
- Next steps

Supporting Effective Principals and Teachers on Every Campus
- Standards
- Preparation
- Recruitment
- Hiring and Induction
- Evaluation and Professional Development
- Rewards and Retention

The visual shows an overarching attempt to align all parts of an educator's career, so that preparation works from the same foundation as evaluation, which works from the same foundation as professional development, and so on. The standards, which are aspirational, are that foundation. They are a ceiling, something that all educators, no matter how proficient they are in their practice, can look to for goals.

Three Keys of Evaluation
- Formative
- Ongoing
- Relationships

Feedback collected from the field and the priorities of the steering committees landed on these three things as essential components of the new systems:
- Evaluations are formative: the goal is timely feedback that allows teachers and principals to grow professionally.
- Conversations between appraiser and appraisee must be ongoing: not a one-time visit during the appraisal year, but a continuous dialogue around effective practices.
- The ongoing nature of evaluation, the delivery of feedback, and the buy-in of teachers and principals all require a relationship between appraiser and appraisee, so that each is comfortable with the perspective of the other.

New Evaluation Systems
Spring 2014
- 70 districts volunteer
- Steering committees develop evaluation instruments
- Finalize guidelines for pilot
Summer 2014
- Train pilot districts

Highlight the work of the steering committees: the teachers and principals of Texas built the rubrics that will be piloted. The idea is that the pilot districts will take the baton from the steering committees and will provide feedback to TEA and the ESCs on the tools and on what additional support and resources they would need to implement the systems. Training is focused on best practices for evaluation, not on rules and procedures; those can be covered in documents. It is about how to truly implement evaluation systems that support professional growth for teachers and principals.

New Evaluation Systems
Fall 2014
- Pilot Texas Evaluation and Support System
Winter 2014-2015
- Train-the-trainer sessions for statewide rollout
Late Spring/Early Summer 2015
- Revise evaluation systems based on pilot feedback

For the train-the-trainer sessions, ESCs identify internal staff and contracted personnel to become experts on the new systems so they can train districts for the statewide rollout.

New Evaluation Systems
Summer 2015 (tentative)
- Training on the new evaluation system for refinement-year participants
Fall 2015-Spring 2016 (tentative)
- Implement the system for the refinement year

With the ask for the refinement year, this is the tentative new timeline. We anticipate expanding the number of districts in the refinement year; 200 would be a good guess, but we will have to confirm the resources are there for that. That leaves statewide rollout in 2016-2017.

Texas Teacher Evaluation and Support System
- Rubric
- Teacher Self-Assessment
- Student Growth: value-add scores, portfolios, Student Learning Objectives, district pre- and post-tests

The weights are 70%, 10%, and 20% for those three components. TEA is still exploring what level of district flexibility there will be in choosing student growth measures. Could districts use SLOs in lieu of a value-add measure? Could districts count a portion of that score from a cumulative campus score?
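As a rough illustration of how the three component weights above could combine into a single score, here is a minimal sketch. The dictionary keys, the 0-100 score scale, and the simple weighted sum are assumptions for illustration only, not TEA's actual calculation.

```python
# Illustrative sketch only: combining the three T-TESS components using the
# 70%/10%/20% weights from the slide. Component names and the 0-100 score
# scale are assumptions, not part of the actual system.

WEIGHTS = {"rubric": 0.70, "self_assessment": 0.10, "student_growth": 0.20}

def composite_score(scores: dict) -> float:
    """Weighted sum of component scores (each assumed to be on a 0-100 scale)."""
    return sum(WEIGHTS[name] * scores[name] for name in WEIGHTS)

example = {"rubric": 80, "self_assessment": 90, "student_growth": 70}
print(round(composite_score(example), 2))  # 0.7*80 + 0.1*90 + 0.2*70 = 79.0
```

Note that the 70% rubric and 10% self-assessment pieces together make up the 80% observation-and-self-assessment share described later in the matrix approach.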

Four total domains: Planning, Instruction, Learning Environment, and Professional Practices and Responsibilities, with 16 total indicators. Five performance levels, from Improvement Needed through Distinguished. The idea is that, unlike PDAS, the difference between performance levels is more than someone doing the same practice more frequently: an Accomplished teacher actually does different things than an Improvement Needed teacher. Feedback is also built into the rubric; a teacher can see where they landed on an indicator and look to the level above for practices to target for improvement.
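The "look to the level above" feedback idea can be sketched as a simple lookup over the rubric's performance ladder. The five level names come from the slides; the helper function itself is illustrative, not part of the actual system.

```python
# Minimal sketch of the feedback idea built into the rubric: a teacher rated
# at one performance level looks one level up for practices to target.
# Level names are from the slides; the lookup helper is illustrative only.
from typing import Optional

LEVELS = ["Improvement Needed", "Developing", "Proficient",
          "Accomplished", "Distinguished"]

def next_level(current: str) -> Optional[str]:
    """Return the performance level directly above `current`, or None at the top."""
    i = LEVELS.index(current)
    return LEVELS[i + 1] if i + 1 < len(LEVELS) else None

print(next_level("Developing"))     # Proficient
print(next_level("Distinguished"))  # None
```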

The process timeline is similar to PDAS, but the "summative conference" is renamed the "end-of-year conference." The post-conference and the end-of-year conference are no longer waivable; they cannot be if we are to preserve the idea that the process is about feedback and that all teachers have things to focus on for improvement. Student growth data will generally be analyzed at the beginning of the next school year and could lead to a revision of the goals and professional development plan. The teacher keeps the goals and PD plan document "alive" during the school year to track progress toward achieving those goals.

Student Growth
- Student Learning Objectives
- Portfolios
- District pre- and post-tests
- Value-add measure

Student growth data, like observation feedback, is for the purpose of making more informed professional development decisions. These are the options for districts; we won't get into the specifics at this point, since each one could be another presentation. The key point is that the value and purpose of student growth data lie in what it tells a teacher about how well she has reached her students, which students she is not reaching, and, most importantly, what she needs to work on pedagogically to better reach her students next year.

Matrix Approach: Observation and Self-Assessment Results (80%) x Student Growth Results (20%)

The matrix crosses the five observation and self-assessment levels (Distinguished, Accomplished, Proficient, Developing, Improvement Needed) with the five student growth levels (Well Above Expectations, Above Expectations, At Expectations, Below Expectations, Well Below Expectations) to produce a final rating.

This 80%/20% matrix shows that the observation still drives the score; it is rare that the student growth results move the final score off the observation results, especially for the middle categories. Asterisks in the matrix indicate an unsettling discrepancy between the observation results and the student growth results that will require a follow-up; we will work out what that follow-up needs to be next year when we rework the rules. It is important to note that scores should be a bit of an afterthought. The value is in the process, which needs to yield valuable feedback for a teacher so he or she can improve instruction; the scores themselves do not provide that.
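The matrix idea described above can be sketched as a table lookup in which the observation rating drives the outcome. Since the slide does not reproduce every cell, the shift rule and the discrepancy threshold below are illustrative assumptions, chosen only to match the narrative: growth rarely moves the score, and large disagreements are flagged (the asterisks) for follow-up.

```python
# Illustrative sketch of the 80%/20% matrix approach. The actual cell values
# are not shown here; this models the general rule the notes describe:
# the observation rating drives the final score, only extreme growth results
# shift it (by at most one level), and large observation/growth disagreements
# are flagged for follow-up. The shift and flag thresholds are assumptions.

OBS_LEVELS = ["Improvement Needed", "Developing", "Proficient",
              "Accomplished", "Distinguished"]
GROWTH_LEVELS = ["Well Below Expectations", "Below Expectations",
                 "At Expectations", "Above Expectations",
                 "Well Above Expectations"]

def final_rating(observation: str, growth: str):
    """Return (final performance level, flagged-for-follow-up?)."""
    o = OBS_LEVELS.index(observation)
    g = GROWTH_LEVELS.index(growth)        # 0..4, where 2 == "At Expectations"
    shift = {0: -1, 4: +1}.get(g, 0)       # only the extremes move the score
    final = OBS_LEVELS[max(0, min(4, o + shift))]
    flagged = abs(o - g) >= 3              # large discrepancy -> follow-up
    return final, flagged

print(final_rating("Proficient", "At Expectations"))         # ('Proficient', False)
print(final_rating("Developing", "Well Above Expectations"))  # ('Proficient', True)
```

The middle growth categories leave the observation result untouched, which captures the point that the observation still drives the score.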

Principal Evaluation and Support System

The intended purpose of TPESS is to assess the principal's performance in relation to the Texas Principal Standards.
- Rubric
- Goal-Setting and Progress
- Student Growth: campus-level value-add scores; other measures to be determined

Goal setting and progress in this system captures individual, campus, and district-level goals and initiatives that may not be specifically captured in the rubric. Student growth has yet to be determined in this system; campus value-add scores will be available and, as of now, will count for half of the student growth score. Pilot campuses and districts will assist in building out other possible measures that could be used to capture student growth or progress. Scoring proportions vary with the principal's experience on the particular campus: 0 years: 70% / 30% / 0%; 1 year: 20% / 10%; 2 or more years: 60%.

Texas Principal Evaluation and Support System (TPESS)

A standardized principal evaluation system will:
- Serve as a measurement of leadership performance
- Guide leaders as they reflect upon and improve their effectiveness
- Focus the goals and objectives of schools and districts as they support, monitor, and evaluate their principals
- Guide professional development for principals
- Serve as a tool in developing coaching and mentoring programs for principals
- Inform higher education programs in developing the content and requirements of degree programs that prepare future principals

Steps of the TPESS Process
1. Orientation
2. Self-Assessment and Goal Setting
3. Pre-Evaluation Conference
4. Data Collection
5. Mid-Year Evaluation Discussion
6. Consolidated Performance Assessment
7. End-of-Year Performance Discussion
8. Final Evaluation and Goal Setting

TPESS Rubric Sample

Objectives of the New Systems
- Continual improvement of practice
- Provide clear, useful, and timely feedback that informs professional development
- Meaningfully differentiate performance
- Use multiple valid measures
- Evaluate teachers and principals on a regular basis
- Place personnel in the best position to succeed

Just an overview of the goals.