TEAM Evaluator Training


TEAM Evaluator Training January 2015

TEAM Teacher Evaluation Process Day 1 Instruction Planning Environment 9:01 Welcome and introductions. Explain parking lot.

Agenda
Day One: TEAM Overview, Diving into the Rubric, Collecting Evidence, Pre/Post Conferences
Day Two: Professionalism, Alternate Rubrics, Quantitative Measures, Closing out the Year
9:02 *Creating a poster to reference and for participants to track will model teaching best practices. Where in the rubric?

Expectations To prevent distracting yourself or others, please put away all cellphones, iPads, and other electronic devices. There will be time during breaks and lunch to use these devices as needed. 9:03 *many use these as learning tools

Overarching Training Objectives
Participants will:
Be able to implement and monitor the TEAM process
Successfully collect and apply evidence to the rubric
Gather evidence balancing educator and student actions related to teaching and learning, and use that evidence to evaluate and accurately score lessons
Use a preponderance of evidence to evaluate teaching
Be prepared to use the rubric to structure meaningful feedback to teachers
9:05 Emphasize the fact that all participants will start at a procedural level and will be supported to gain a more conceptual understanding. Collecting evidence is a key step in effectively observing and evaluating teachers.

Norms Keep your focus and decision-making centered on students and educators. Be present and engaged. Limit distractions and sidebar conversations. If urgent matters come up, please step outside. Challenge with respect, and respect all. Disagreement can be a healthy part of learning! Be solutions-oriented. For the good of the group, look for the possible. Risk productive struggle. This is a safe space to get out of your comfort zone.

Chapter 1: TEAM Overview 9:11

Evaluation closely links with State Standards Getting students ready for postsecondary education and the workforce is WHY we teach State Standards provide a vision of excellence for WHAT we teach TEAM provides a vision of excellence for HOW we teach 9:13

We aim to be the fastest improving state in the nation by 2015. We measure our success by our progress on NAEP, ACT, and PARCC. In 2013, it was announced that Tennessee had the largest recorded growth in NAEP history!

EXPLORE and PLAN results show Tennessee making substantial growth over the last three years. [Charts: EXPLORE (8th grade) and PLAN (10th grade) results, Tennessee vs. the national norm]

We have made substantial progress since 2010… [Charts: TCAP percent proficient/advanced, grades 3-8 and grades 9-12] Kathleen

We have made substantial progress since 2010… [Charts: percent proficient and advanced, grades 3-8 and grades 9-12; 2013 NAEP 4th and 8th grade results] Kathleen

These gains mean thousands of additional students are performing on grade level. Nearly 91,000 additional students are at or above grade level in all math subjects now, as compared to 2010. Nearly 52,000 additional students are at or above grade level in all science subjects, as compared to 2010. Reading is a difference of 26,912 students. *2011 was the baseline year for the Algebra II EOC.

Components of Evaluation: Tested Grades and Subjects
Qualitative includes: observations in planning, environment, and instruction; the professionalism rubric.
Quantitative includes: a growth measure (TVAAS or a comparable measure); an achievement measure (goal set by teacher and evaluator).
9:21 State law currently requires value-added (or a comparable growth measure) to count as 35 percent of the total evaluation score. For teachers in state-tested grades/subjects, the 35 percent growth component is their individual TVAAS score. For teachers in districts that have opted into the portfolio growth models, their portfolio score will count as 35 percent. For teachers without an individual growth measure, a school-, district-, or state-wide TVAAS score comprises 25 percent. Additional measures for non-tested grades/subjects are in development. Also note that some districts use student perception surveys as a measure on the qualitative side, generally at 5 percent.
A growth measure gauges progress from a baseline (e.g., "John grew faster than we would have expected this year based on his testing history"). An achievement measure gauges proficiency (e.g., "John scored 98 percent on his test"). The 15 percent achievement measure is based on a yearly goal set by the educator and his/her evaluator that is measured by current-year data. To make the 15 percent meaningful, the evaluator and educator work together to identify a measure; if they disagree, the educator's decision stands. The selection and goal-setting process involves determining which measure most closely aligns to the educator's job responsibilities and the school's goals.
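The notes above describe a weighted composite: a 35 percent growth measure, a 15 percent achievement measure, and qualitative observation making up the rest. As a hypothetical illustration only (the 50 percent qualitative share is inferred as the remainder, and the 1-5 scale follows the rubric's performance levels; none of this is an official formula), the arithmetic could be sketched as:

```python
# Hypothetical sketch of combining evaluation components into one score.
# The 50/35/15 split and the 1-5 scale are assumptions for illustration.
def weighted_team_score(qualitative, growth, achievement,
                        weights=(0.50, 0.35, 0.15)):
    """Combine three components (each on a 1-5 scale) into a weighted score."""
    w_q, w_g, w_a = weights
    return round(qualitative * w_q + growth * w_g + achievement * w_a, 2)

# A teacher at 4 on observations, 3 on growth, and 5 on achievement:
print(weighted_team_score(4.0, 3.0, 5.0))  # -> 3.8
```

The point of the sketch is simply that the growth and achievement measures move the composite score less than the qualitative component, matching the percentages in the notes.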

Components of Evaluation: Non-tested Grades and Subjects
Qualitative includes: observations in planning, environment, and instruction; the professionalism rubric.
Quantitative includes: a growth measure (TVAAS or a comparable measure); an achievement measure (goal set by teacher and evaluator).
9:23 *This reflects law that was enacted in spring 2013 and was in effect for the 2012-13 and 2013-14 school years. Also note, the educator's choice trumps the evaluator's decision.

Origin of the TEAM rubric
TDOE partnered with NIET to adapt their rubric for use in Tennessee. The NIET rubric is based on research and best practices from multiple sources. In addition to the research from Charlotte Danielson and others, NIET reviewed instructional guidelines and standards developed by numerous national and state teacher standards organizations. From this information they developed a comprehensive set of standards for teacher evaluation and development. Work that informed the NIET rubric included:
The Interstate New Teacher Assessment and Support Consortium (INTASC)
The National Board for Professional Teaching Standards
Massachusetts' Principles for Effective Teaching
California's Standards for the Teaching Profession
Connecticut's Beginning Educator Support Program
The New Teacher Center's Developmental Continuum of Teacher Abilities
9:20 The rubric has been used for fifteen years, and it has been tested and improved over time.

Rubrics
General Educator
Library Media Specialist
School Services Personnel: School Audiologist (PreK-12), School Counselor (PreK-12), School Social Worker (PreK-12), School Psychologist (PreK-12), Speech/Language Therapist
May be used at the discretion of the LEA for other educators who do not have direct instructional contact with students, such as instructional coaches who work with teachers.
9:24

Domains: Planning, Environment, Instruction, and Professionalism. 9:25 It's all about instruction.

Evaluation Process
Initial Coaching Conversation: required for teachers who received an overall effectiveness rating or individual growth score of 1 in the previous year
Pre-Conference
Classroom Visit
Post-Conference
(Repeat the pre-conference, classroom visit, and post-conference cycle as needed, depending on the number of required observations)
Professionalism Scoring
Summative Conference
9:31 *The initial coaching conversation is plain ole good practice

Coaching Conversations (Video)

Suggested Pacing 9:45. The minimum required number of observations for each teacher is based on licensure status and evaluation scores from the previous year. Found on p. 22 in the manual.

Observation Guidance
Coaching Conversation: a targeted conversation with any teacher who scored a 1 on overall evaluation or individual growth, covering the number of required observations and the supports they will receive throughout the year to improve student achievement.
Observing Multiple Domains During One Classroom Visit: districts may choose to observe the instruction domain during the same classroom visit as either the planning domain or the environment domain.
Announced vs. Unannounced Visits: at least half of the domains observed must be unannounced, but it is at the district's discretion to have more than half of the domains observed unannounced.
9:50 (5 min here) Initial coaching conversations should take place before the first official observation of the year. *Multiple domains: strongly encourage evaluators to observe and score multiple domains together, when possible. *Announced vs. unannounced: any time-frame windows ("next week") or hints ("I may be on this hall tomorrow") violate the spirit of unannounced visits. Announced visits show what teachers know of best practices; unannounced visits show what happens on a regular basis. *If you choose to do more than the required number of observations, this policy has to be applied consistently (i.e., you cannot single out one teacher to receive more observations than the rest of your teachers).

Framing Questions (Activity) Why do we believe that teacher evaluations are important? What should be accomplished by teacher evaluations? What beliefs provide a foundation for an effective evaluation? 10:10 (14 min: 8 min discuss & 6 min share out/record) Have participants discuss at their tables. Trainer should have groups share. Record answers on a chart. Post chart.

Core Beliefs We all have room to improve. Our work has a direct impact on the opportunities and future of our students. We must take seriously the importance of honestly assessing our effectiveness and challenging each other to get better. The rubric is designed to present a rigorous vision of excellent instruction so every teacher can see areas where he/she can improve. The focus of observation should be on student and teacher actions because that interaction is where learning occurs. 10:13 Make connections from participants’ list to this list.

Core Beliefs (Continued) We score lessons, not people. As you use the rubric during an observation, remember it is not a checklist. Observers should look for the preponderance of evidence based on the interaction between students and teacher. Every lesson has strengths and areas that can be improved. Each scored lesson is one factor in a multi-faceted evaluation model designed to provide a holistic view of teacher effectiveness. As evaluators, we also have room to improve. Observing teachers provides specific evidence that should inform decisions about professional development. Connecting teachers for coaching in specific areas of instruction is often the most accessible and meaningful professional development we can offer. 10:16 Make connections from participants’ list to this list.

Materials Walk 10:19 Go over handbook and supplemental materials

Chapter 2: Diving into the Rubric 10:20

Evaluator Expectations Initially, evaluators aren’t expected to be perfectly fluent in the TEAM rubric. The rubric is not a checklist of teacher behaviors. It is used holistically. Just being exposed to the rubric is not sufficient for full fluency. Fully fluent use of the rubric means using student actions and discussions to analyze the qualitative effects of teacher practice on student learning. We’ll learn how to use it together through practice. 10:21 Begin the training with the evaluator expectations. Emphasize here that just being exposed to the rubric is not sufficient. All users must go through in-depth practice and training to be fluent in all of its applications and uses.

The Value of Practice To use the rubric effectively, each person has to develop his/her skills in analyzing and assessing each indicator in practical application. Understanding and expertise will be increased through exposure and engagement in simulated or practice episodes. This practice will refine the evaluator's understanding and strengthen his/her skills as an evaluator. 10:22 Emphasize that this is the first of 2 days of exposure and engagement. Ask participants to reflect on this slide throughout the training as they gain more understanding and expertise with the rubric. "Some of you may already have experience with this process, but in the spirit of continuous improvement we'll continue practicing." Throughout the school year, you should continue to practice with your school team. There may also be structured opportunities to revisit your practices offered through your local CORE office.

Placemat Consensus
1. Draw a large circle with a smaller circle inside.
2. Divide the outer circle into sections for the number of people in your group.
3. Each person will write responses to the topic in their space on the placemat.
4. The group will write their common responses to the topic in the center circle.

Placemat Consensus (Activity)
2 minutes to write individually
3 minutes to talk and reach consensus
5 minutes to debrief
[Placemat diagram: Participants A-D around a central "Consensus Elements" circle]
10:37 (15 min) Have groups share one element each, without repeating, until all groups believe their list is "covered." Trainer records on chart as groups share, using quality questioning and academic feedback. QUESTION: What do you look for when observing and evaluating a lesson?

Effective Lesson Summary Defined daily objective that is clearly communicated to students Student engagement and interaction Alignment of activities and materials throughout lesson Rigorous student work, citing evidence and using complex texts Student relevancy Numerous checks for mastery Differentiation 10:39 (Trainer Note) Make connections to: When a lesson is effective, we know it when we see it. But, when “it” is missing…How do we communicate what is missing to someone else? How do we build the missing skills in others? How do we measure “it?” TEAM provides us with what “it” is (i.e. what an effective lesson, effective teaching is), the process for building the skills in others, and the tools by which we measure it (TEAM Instructional Rubrics/Domains).

TEAM Rubric
TDOE has worked with NIET to define a set of professional indicators, known as the Instructional Rubrics, to measure the teaching skills, knowledge, and responsibilities of the teachers in a school.

Instruction domain, Standards and Objectives indicator:

Significantly Above Expectations (5): All learning objectives are clearly and explicitly communicated, connected to state standards and referenced throughout lesson. Sub-objectives are aligned and logically sequenced to the lesson's major objective. Learning objectives are: (a) consistently connected to what students have previously learned, (b) know from life experiences, and (c) integrated with other disciplines. Expectations for student performance are clear, demanding, and high. There is evidence that most students demonstrate mastery of the daily objective that supports significant progress towards mastery of a standard.

At Expectations (3): Most learning objectives are communicated, connected to state standards and referenced throughout lesson. Sub-objectives are mostly aligned to the lesson's major objective. Learning objectives are connected to what students have previously learned. Expectations for student performance are clear.

Significantly Below Expectations (1): Few learning objectives are communicated, connected to state standards and referenced throughout lesson. Sub-objectives are inconsistently aligned to the lesson's major objective. Learning objectives are rarely connected to what students have previously learned. Expectations for student performance are vague. There is evidence that few students demonstrate mastery of the daily objective that supports significant progress towards mastery of a standard.

10:40 Open your Evaluation System Handbook to the Instruction Domain of the General Educator Rubric. These slides will highlight each element of the rubric (Domain, Indicator, Descriptors, and Performance Levels).

The Parts of the Rubric: Domains [The Standards and Objectives rubric shown above, with the Instruction domain header highlighted] 10:41 These slides highlight each element of the rubric (Domain, Indicator, Descriptors, and Performance Levels) in slide show mode. There are 4 domains included in the qualitative portion of teacher evaluation: planning, environment, instruction, and professionalism.

The Parts of the Rubric: Indicators [The same rubric, with the Standards and Objectives indicator highlighted] 10:42

The Parts of the Rubric: Descriptors [The same rubric, with individual descriptors highlighted] 10:43

The Parts of the Rubric: Performance Levels [The same rubric, with the performance-level headers (5, 3, 1) highlighted] 10:44
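The parts named in this sequence of slides (domain, indicator, descriptors, performance levels) form a simple hierarchy. As an illustrative sketch only, not an official data format, the structure could be modeled like this (the names are taken from the rubric excerpt above):

```python
# Illustrative sketch of the rubric hierarchy: domain -> indicator ->
# performance levels. The names come from the rubric excerpt; the data
# structure itself is an assumption for illustration only.
rubric = {
    "Instruction": {                            # domain
        "Standards and Objectives": {           # indicator
            5: "Significantly Above Expectations",
            3: "At Expectations",
            1: "Significantly Below Expectations",
        },
    },
}

# An observer matches the preponderance of evidence (the descriptors)
# to the closest performance level for each indicator.
level = rubric["Instruction"]["Standards and Objectives"][3]
print(level)  # -> At Expectations
```

The nesting mirrors how the training walks through the rubric: first the domain, then the indicator, then the descriptors that sit under each performance level.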

What is the Process of Modeling Your Thinking (Think-Aloud)?
I do (Think Aloud): Teacher models thinking, revealing his/her metacognition
We do (Scaffold & Cue): Students work in partners or groups applying thinking, with teacher monitoring and supporting
You do (Students Explain Thinking): Students demonstrate mastery and explain their thinking
11:01 Trainer note: Tell the participants that this gradual release of responsibility is one effective teaching strategy. Please remember that there are other effective ways to model your thinking and structure your lessons. For example, the order in which you present these steps can vary. Some Common Core lessons, such as a Math Task Analysis, might not present in this order.

Standards and Objectives [The Standards and Objectives rubric shown above, with no descriptors highlighted] 11:05 Sample Script for Think Aloud, for the Standards and Objectives Rubric Activity. The portions marked with a "T" (for trainer) are the think-aloud portions.
The portions in italics under "Step Out" are the portions at which the trainer should stop and step out of the think aloud, then continue again under the portions called "Think Aloud."

Bullet One/Descriptor One (Think Aloud) *Write the Questions
T: "One way I begin to understand the rubric is by asking myself one of three questions: Why is this important to learning? How does this help me learn? What happens to learning if the practice is ineffective or non-existent? Answering one of these questions will help me connect the rubric of teacher practices to student learning. In looking at Standards and Objectives, I want to start under the 'At Expectations' column because 'At Expectations' indicates a solid lesson."
T: "The first descriptor reads: Most learning objectives are communicated, connected to state standards and referenced throughout the lesson."
T: "I think I will highlight the words communicated, connected, and referenced. I've known objectives are to be posted, but communicated tells me something more. Communicated means that they are not only written or said by the teacher but understood by the students."
Step Out: "If I ask myself, 'How does this help me learn?', I know that if I know what I'm supposed to be learning, it creates a purpose for the instruction and helps me track my own learning toward the goal, connecting the sub-objectives along the way. If it helps me in this way, it's likely to help most others in the same way."
Think Aloud
T: "I've also highlighted connected, as in connected to the state standards. The creation and adoption of standards is not new to me, and this helps my teachers and me ensure a standards-based curriculum. If I ask myself, 'Why is this important to learning?', I know that the CCSS are more rigorous, expecting higher levels of thinking and writing. While this is great for our students, it will be a huge challenge for our students and teachers. It will be crucial for our students' success that teachers focus on the CCSS as they plan units and lessons."
T: "I also highlighted the word referenced, as in referenced throughout the lesson. So there should be an objective that aligns to a standard, and it helps students for this to be communicated and then referenced throughout the lesson. This seems simple enough to do."
Step Out: "If I ask myself, 'How could this help me learn?', I know that knowing the objective during the lesson is the what, and that helps; but if I also have the framework for why I'm learning what I'm learning, it will help me connect prior learning from the unit, or knowledge from what I already know, to my new learning. The more connections I can make, the easier it is for me to conceptualize the bigger picture, internalize the learning, and generate extending questions. The standard is the WHY behind the objective of a lesson."
Bullet Two/Descriptor Two (Think Aloud)
T: "The second descriptor reads: Sub-objectives are mostly aligned to the lesson's major objective. I know the word sub-objectives must be highlighted. I also know that there are 3 basic reasons for including sub-objectives: to review prior learning, to teach a new sub-skill, and to teach a process that supports the main objective. If these are the three major reasons for including sub-objectives, that tells me something important about the major objective, so I'll have to highlight major objective as well. Sub-objectives must support the major objective. If I am going to create sub-objectives that support the major objective, there must be some kind of alignment, so I'll highlight alignment also."
Step Out: "If I ask myself, 'What happens to learning if the sub-objectives aren't aligned to the lesson objective?', I begin to think about the purpose of sub-objectives. They are like the check-in points of a route on a map. If there are no points along the way, I'll have no plan to get to the destination. If the points go all over the map, my route will not be smart or efficient. These same check-in points could also be good spots to assess where we are during a lesson. Without them, it will be difficult to know if we got there, or why we didn't if we struggled."

Standards and Objectives rubric (as on the first slide): Descriptor 1 highlighted

Standards and Objectives rubric (as on the first slide): Descriptor 2 highlighted

Standards and Objectives rubric (as on the first slide): Descriptor 3 highlighted

Standards and Objectives rubric (as on the first slide): Descriptor 4 highlighted

Standards and Objectives rubric (as on the first slide): Descriptor 5 highlighted

Instructional Domain (Activity) Directions: Highlight key words from the descriptors under the “At Expectations” column for the remaining indicators with your shoulder partner. You will have 15 minutes to complete this. 11:35 (15 min for this) After the trainer has modeled the think aloud for Standards and Objectives, the participants will go on to highlight the key words of the rubric with a shoulder partner. They have 15 minutes to do this. The trainer should be circulating throughout the room and providing a model of effective questioning and academic feedback. As often as possible, point out to participants how the rubric is interconnected; this will be their first exposure to this idea, and the trainer will come back to it with the whole group soon. *Reference the written questions (How does this help me learn? etc.)

Reflection Questions (Activity) How is the rubric interconnected? (What threads do you see throughout the indicators?) Where do you see overlap? If the teacher is doing this at a proficient level, what are the “look-fors” at the student level? 11:40 (5 min) Have participants discuss these questions with their shoulder partner, then debrief whole group. This is a very critical point for understanding the holistic aspect of the rubric.

For early finishers… If you finish early, begin making explicit connections between the key words that you have highlighted and actual classroom practices. What would some of these descriptors and key words look like in a classroom observation? Write down the applications that you have made for each of the key words that you highlighted and be prepared to share those when the trainer asks for them.

Look Back at Your Consensus Maps… Find the parts of the rubric that correspond to your consensus maps. For example: if you put “there needs to be an objective” in your consensus map, where in the rubric would that be found? Please do this slide if time permits. The answer to the question above would be the Standards and Objectives indicator.

Before we share out… The TEAM rubric is a holistic tool. What does this mean? Holistic: relating to or concerned with wholes or with complete systems. What does this mean about the use of this evaluation and observation tool? In order to use the rubric effectively, both the observer and those being observed have to see that each part of each domain can only be understood in the context of the whole.

Before we share out continued… The rubric is not a checklist. Teaching, and observations of that teaching, cannot be reduced to “yes/no” answers. Only through an understanding of the holistic nature of the rubric can we see that many of these parts have to be put in the context of each classroom, with reference to all the other “parts” that go into teaching.

Connections between the Indicators As some have already noticed, the indicators within the instructional domain are very interconnected. As a group, create a chart, diagram, or picture that illustrates the connections between your assigned indicator(s). Questions to ask yourself: How does one indicator affect another? How does being effective or ineffective in one indicator impact others?

Share Out Each group will share out their indicator(s) One person should share out what is on the poster, and the other should share where in the manual the information was found. Other groups should listen for: What the indicator means Words and phrases that were highlighted and why Classroom examples

Questioning and Academic Feedback (Activity) The Questioning and Academic Feedback indicators are closely connected with each other. With a partner, look closely at these two indicators and discuss how you think they are linked. (Teacher AND Student links) What does this mean for your observation of these two indicators? 11:48

Thinking and Problem-Solving (Activity) The Thinking and Problem-Solving indicators are closely connected with each other. With a partner, look closely at these two indicators and discuss how you think they are linked. (Teacher AND student links) What does this new learning mean for your observation of these indicators? 11:51 Thinking/Problem-Solving slides and activity. The trainer will emphasize thinking as the process that leads to the product of different problem-solving types. For example: teaching analytical thinking to students should result in their ability to identify relevant/irrelevant information. Another example: the processes of analytical thinking and creative thinking should result in the product of being able to generate ideas. Ask participants if they see other links between thinking and problem solving and give them a few minutes to do this with a partner. Also have them discuss the third bullet point. Have a rep from each table share links and aha moments.

The Thinking and Problem-Solving Link (slide diagram labels: Process, Product) 11:52 The trainer will emphasize thinking as the process that leads to the product of different problem-solving types. For example: teaching analytical thinking to students should result in their ability to identify relevant/irrelevant information; the process of analytical thinking should result in the product of being able to problem solve and to identify relevant/irrelevant information. Another example: the processes of analytical thinking and creative thinking should result in the product of being able to generate ideas. Ask participants if they see other links between thinking and problem solving and give them three minutes to do this with a partner. (2 hours 47 minutes elapsed.)

Thinking and Problem Solving Link continued Thinking and Problem Solving, as described in the rubric, are what we expect from students. All other indicators should culminate in high-quality thinking and problem solving by students. How? When teachers ask high-level questions, it promotes student thinking, and when students interact with one another in a group, they are exercising research-based thinking.

RTI2 Tier I instruction is synonymous with effective, differentiated instruction. Effective observation of RTI2 Tier II and Tier III contexts requires a strong understanding of holistic scoring. Ex. Look at the Grouping indicator. Which descriptors would you expect to see in the RTI context? Which descriptor may not be relevant? Be intentional about using professional judgment to determine when it is appropriate to observe an educator in an intervention setting. Ex. Is a regular classroom teacher facilitating computer-based intervention rather than delivering instruction today? It may be appropriate to treat this much as you would walking in on an assessment. Be intentional about using professional judgment to determine which rubric is most appropriate for an educator. Ex. An interventionist whose sole responsibility is to facilitate computer-based intervention may be evaluated using the SSP rubric if they are consistently delivering services rather than instruction. *RTI is a good example for understanding holistic vs. checklist scoring. Look at the Grouping indicator and read the descriptors to yourself. How would each descriptor apply to an RTI setting? What should an evaluator see in an RTI setting, and what may not apply to that context?

Planning Domain (Activity) Directions: Highlight key words from the descriptors under the “At Expectations” column with your shoulder partner. You will have 15 minutes to complete this.

Planning—Instructional Plans

Significantly Above Expectations (5): Instructional plans include: measurable and explicit goals aligned to state content standards; activities, materials, and assessments that are aligned to state standards, are sequenced from basic to complex, build on prior student knowledge, are relevant to students’ lives, integrate other disciplines, and provide appropriate time for student work, student reflection, and lesson and unit closure; evidence that the plan is appropriate for the age, knowledge, and interests of all learners; and evidence that the plan provides regular opportunities to accommodate individual student needs.

At Expectations (3): Instructional plans include: goals aligned to state content standards; activities, materials, and assessments that build on prior student knowledge and provide appropriate time for student work and lesson and unit closure; evidence that the plan is appropriate for the age, knowledge, and interests of most learners; and evidence that the plan provides some opportunities to accommodate individual student needs.

Significantly Below Expectations (1): Instructional plans include: few goals aligned to state content standards; activities, materials, and assessments that are rarely aligned to state standards, are rarely logically sequenced, rarely build on prior student knowledge, and inconsistently provide time for student work and lesson and unit closure; and little evidence that the plan provides opportunities to accommodate individual student needs.

2:43

(The Planning—Instructional Plans rubric above is repeated verbatim on four consecutive slides.)

Planning—Student Work

Significantly Above Expectations (5): Assignments require students to: organize, interpret, analyze, synthesize, and evaluate information rather than reproduce it; draw conclusions, make generalizations, and produce arguments that are supported through extended writing; and connect what they are learning to experiences, observations, feelings, or situations significant in their daily lives both inside and outside of school.

At Expectations (3): Assignments require students to: interpret information rather than reproduce it; draw conclusions and support them through writing; and connect what they are learning to prior learning and some life experiences.

Significantly Below Expectations (1): Assignments require students to: mostly reproduce information; rarely draw conclusions and support them through writing; and rarely connect what they are learning to prior learning or life experiences.

Planning—Assessment

Significantly Above Expectations (5): Assessment plans: are aligned with state content standards; have clear measurement criteria; measure student performance in more than three ways (e.g., in the form of a project, experiment, presentation, essay, short answer, or multiple choice test); require extended written tasks; are portfolio-based with clear illustrations of student progress toward state content standards; and include descriptions of how assessment results will be used to inform future instruction.

At Expectations (3): Assessment plans: have measurement criteria; measure student performance in more than two ways (e.g., in the form of a project, experiment, presentation, essay, short answer, or multiple choice test); require written tasks; and include performance checks throughout the school year.

Significantly Below Expectations (1): Assessment plans: are rarely aligned with state content standards; have ambiguous measurement criteria; measure student performance in less than two ways (e.g., in the form of a project, experiment, presentation, essay, short answer, or multiple choice test); and include performance checks, although the purpose of these checks is not clear.

Guidance on Planning Observations The spirit of the Planning domain is to assess how a teacher plans a lesson that results in effective classroom instruction for students. Specific requirements for the lesson plan itself are entirely a district and/or school decision. Unannounced planning observations: simply collect the lesson plan after the lesson. REMEMBER: You are not scoring the piece of paper; rather, you are evaluating how well the teacher’s plans contributed to student learning. Evaluators should not accept lesson plans that are excessive in length and/or that serve only an evaluative rather than an instructional purpose. 3:10

Making Connections: Instruction and Planning (Activity) Review indicators and descriptors from the Planning domain to identify connecting or overlapping descriptors from the Instruction domain. With a partner, discuss the connections between the Instruction domain and the Planning domain. With your table group, discuss how these connections will inform the scoring of the Planning domain and why. Be ready to share out. 3:25 (15 min) Circulate, support, and act as a model for good questioning and academic feedback. Debrief whole group. Continue to make connections between teacher AND student actions.

Chapter 3: Pre-Conferences 3:26

Planning for a Pre-Conference (Activity) Evaluators often rely too heavily on physical lesson plans to assess the Planning domain; even so, evaluators should still review physical lesson plans. Use the following guiding questions: What do you want students to know and be able to do? What will the students and teacher be doing to show progress toward the objective? How will you know if they got there? What are some additional questions you would need to ask to understand how a teacher planned to execute a lesson? How would these questions impact the planning of a pre-conference with the teacher? 3:31 (5 min) Have participants do this activity individually. Tell them that they are going to compare the questions they came up with to the questions that are actually asked in the pre-conference. This is a good time to tell them that they will watch a pre-conference and lesson of the same teacher. Trainer Note: To substantively answer all of these questions in writing, a teacher would need a very long document that would serve only an evaluative purpose. The pre-conference is an opportunity for the evaluator to coach the teacher toward improved practices that increase student learning.

Viewing a Pre-Conference When viewing the pre-conference: What questions does the conference leader ask? Which questions relate to teacher actions and which to student actions? How do our questions compare to the ones asked? Remind participants of their job during the pre-conference: have them write down all of the questions that are asked during the pre-conference and, if they can, the answers as well. It is all about the questions that are asked! Do they focus on teacher actions only, or do they also ask student-focused questions?

Pre-Conference Video 3:37 Video: 6 min

Pre-Conference Reflection (Activity) What questions did the conference leader ask? How did these compare to the ones you would have asked? What questions do you still have? 3:42 (5 min) Lead a discussion of the reflection questions on the slide. Emphasize the importance of questioning during the pre-conference. How prepared was the evaluator? What parts of the Planning domain were included? We will now focus on the Environment domain.

Chapter 4: Collecting Evidence 11:53

When do you collect evidence? Prior to the Lesson Being Observed: pre-conference (announced only); review of lesson plan as applicable. During the Lesson: what the teacher says and does; what the students say and do. After the Lesson: ask clarifying questions if needed (before the post-conference). Ex. What thought process did you use to group your students? 11:55 *No scoring evidence is taken during the post-conference. Point out to participants that there are three different points during the evaluation process to collect evidence. We will first concentrate on how to collect evidence during a lesson. Point out that while this is perhaps the MOST important time to collect evidence, it is not the only time. Before an announced evaluation there should be a pre-conference and a review of materials, and after each observation there should be a post-conference. The post-conference follows a very specific format; we will dig into pre- and post-conferencing later in the training. You should be comfortably confident by the time you sit down for the post-conference: *clarification of instructional decisions *review of student artifacts *request of teacher materials used. After the lesson, you should ask clarifying questions concerning the lesson, e.g., “What evidence do you have that all students achieved the objective of the day?”

Collecting Evidence is Essential Detailed collection of evidence: unbiased notes about what occurs during a classroom lesson. Capture: what the students say; what the students do; what the teacher says; what the teacher does. Copy wording from visuals used during the lesson. Record time segments of the lesson. Remember that using the rubric as a checklist will not capture the quality of student learning. 11:57 Capturing what teachers say and do and what students say and do is essential to the evaluation process. Use the words “capturing evidence.” Stress that they should capture enough written evidence to be able to accurately score each indicator. Recording time segments helps the evaluator give feedback on lesson structure and pacing. Scientist-with-a-microscope analogy: collecting evidence means making sure the focus level is neither too specific nor too broad. The collection of detailed evidence is ESSENTIAL for the evaluation process to be implemented accurately, fairly, and for the intended purpose of the process.

Evidence Collecting Tips During the lesson: Monitor and record time Use short-hand as appropriate for you Pay special attention to questions and feedback Record key evidence verbatim Circulate without disrupting Focus on what students are saying and doing, not just the teacher 11:58 These are just suggestions. A restatement of the descriptors is not evidence. Capturing enough verbatim dialogue to accurately score each indicator is key. Talk about your personal experiences. Tell them that as they practice they will find what works best for them; these are all suggestions. Tell them they may want to put quotation marks around verbatim notes. And really home in on what students are doing and saying, not just the teacher.

Sample Evidence Collection Notes Academic Feedback Motivating Students Time 11:59 This is a sample of what scripting notes look like for one evaluator. The use of "T" for teacher and "S" for student is strongly suggested, as it reminds us that we should be capturing verbatim dialogue. It is important to emphasize that labeling indicators next to pieces of evidence should happen after the lesson observation, not during it. When participants are categorizing evidence, they can go through their notes and label the abbreviated indicators as in the examples above. Until you have had a lot of experience collecting evidence, begin categorizing your notes only after all of the evidence has been collected.

Sample Evidence Collection Notes Students Teacher Noon; lunch until 1:00. This is a sample of what scripting notes look like for one evaluator. The use of "T" for teacher and "S" for student is strongly suggested, as it reminds us that we should be capturing verbatim dialogue. It is important to emphasize that labeling indicators next to pieces of evidence should happen after the lesson observation, not during it. When participants are categorizing evidence, they can go through their notes and label the abbreviated indicators as in the examples above. Standards & Objectives Presenting Instructional Content

Observing Classroom Instruction We will view a lesson and gather evidence. After viewing the lesson, we will categorize evidence and assign scores in the Instruction domain. In order to categorize evidence and assign scores, what will you need to do as you watch the lesson? Capture what the students and teacher say and do. Remember that the rubric is NOT a checklist! 1:01 Remind participants that their main job should be to capture as much verbatim dialogue as possible. Tell them to get all of their materials ready. Remind them that this is the first practice of many. As a trainer you should be prepared to monitor the level at which participants are capturing evidence during the lesson video. Model the expectation of collecting evidence during the lesson while observing the participants. You may want to stop the video to remind participants that this practice helps them improve, for the benefit of teachers and their students. Mandatory joke about the rubric not being a checklist.

Questions to ask yourself to determine whether or not a lesson is effective: What did the teacher teach? What did the students and teacher do to work toward mastery? What did the students learn, and how do we know? 1:02 Review these questions, emphasizing again that student learning is the ultimate goal. Make a chart paper poster of these to refer to throughout the training.

Watch a Lesson We will now watch a lesson and apply some of the learning we have had so far about the rubric. Each group will categorize evidence for only 1-2 indicators on the rubric. In order to do this, it is imperative that you capture as much evidence as you can during the lesson. You will be assigned your indicator(s) after the lesson.

Categorizing Evidence and Scoring Step 1: Zoom in and collect as much teacher and student evidence as possible for each descriptor. Step 2: Zoom out and look holistically at the evidence gathered and ask... where does the preponderance of evidence fall? Step 3: Consider how the teacher’s use of this indicator impacted students moving towards mastery of the objective. Step 4: Assign score based on preponderance of evidence. Carefully monitor and provide feedback as participants gather evidence and rate. Make sure that the evidence and ratings are within one point of national raters. Scores are included in Trainer Materials.

Video #1 1:47 (45 min) 10th grade World History Remind participants that they will capture evidence for the Instruction domain

Evaluation of Classroom Instruction Reflect on the lesson you just viewed and the evidence you collected. Based on the evidence, do you view this teacher’s instruction as Above Expectations, At Expectations, or Below Expectations? Thumbs up: Above Expectations Thumbs down: Below Expectations In the middle: At Expectations 1:48 Trainer note: Ask the participants to reflect on the lesson holistically and determine whether their initial perception of the instruction is Above Expectations, At Expectations, or Below Expectations. It is suggested that you do the thumbs up/down process to get a read of where the participants are.

Categorize and Score your Indicator(s) Each group will be assigned 1-2 indicators. You will have 20 minutes to complete your indicator(s). First, with a partner in your group, agree upon the evidence that you captured for your indicator. Do not score yet! Once all partners have agreed upon their evidence, the group should come together and agree upon evidence. Only then should you score the indicator(s). Give the participants the “Thought Process” document. Have them read through it, and then ask each group to decide how they would score the indicator given the evidence that is presented. Next, give 10 minutes for them to work with a partner and then 20 minutes for the groups to chart their indicator(s). It is important here to split up the room so at least 2 groups are doing every indicator. This way you can choose the group that is CLOSEST to the score and they can share out with the whole group. For example: Group 1: Standards and Objectives & Motivating Students Group 2: Motivating Students & Presenting Instructional Content Group 3: Presenting Instructional Content & Lesson Structure and Pacing, etc.

Group Roles Once you get to the group work, there are a few roles that need to be assigned: “Holder of the Manual” This person will make sure that we are interpreting each indicator correctly, and answer any questions group members have about it. “Evidence Gatherer” This person will make sure that evidence collected is not just a restatement of the rubric. “Value Judgment Police” This person will make sure people do not use value judgment statements. (Ex. “I would have…”, “She should have…”) “Timekeeper” This person will keep the group on time and on task.

Did you remember to ask yourself these questions? What did the teacher teach? What did the students and teacher do to work toward mastery? What did the students learn, and how do we know? 2:38 Review these questions again, emphasizing that student learning is the ultimate goal. Trainer may want to have participants share out/debrief the answers.

Debrief Evidence and Scores Whole group will debrief the evidence that was captured and the scores that were given.

Chapter 4: Post-Conferences 10:21

Post-Conference Round Table (Activity) What is the purpose of a post-conference? As a classroom teacher, what do you want from a post-conference? As a classroom teacher, what don’t you want from a post-conference? As an evaluator, what do you want from a post-conference? As an evaluator, what don’t you want from a post-conference? 10:29 (8 min) Insert transitions. Make the connection between the purpose of the post-conference and the purpose of evaluations. Participants will discuss with their tablemates what they do/don’t want from a post-conference; they will experience this from the teachers’ perspective. Quick share, whole group.

Characteristics of an Ideal Post-Conference Teacher did a lot of the talking Teacher reflected on strengths and areas for improvement Teacher actively sought help to improve A professional dialogue about student-centered instruction Collaboration centered on improvement Discussion about student learning More asking, less telling 10:30

Parts of the Post-Conference Introduction Greeting, purpose, time, and general impression question Reinforcement (area of relative strength) Ask self-analysis question Provide evidence from notes Identify potential opportunities for sharing this strength Ex. Peer partnership, sharing at a faculty meeting or PLC, etc. Refinement (area of relative improvement) Give a recommendation for actionable next steps Give a definite follow up timeline Share Scores 10:32 This is the agenda for the post-conference. Stress to participants that they should share this agenda with their teachers.

Developing Coaching Questions Questions should be open-ended. Questions should ask teachers to reflect on practice and student learning. Questions should align to the rubric and be grounded in evidence. Questions should model the type of questioning you would expect to see between teachers and students. i.e. open-ended, higher-order, reflective 10:33

Examples of Coaching Questions Questions that clarify goals: What kind of background information did students need to have? What did you want students to learn or be able to do? How did you decide what you wanted to teach? Questions that gauge success of the lesson: How were you assessing the students during the lesson? What were you looking for or listening for to determine if students were able to master the objective? 10:34 Each indicator has its own set of coaching questions (in handbook). You should be prepared with more questions.

Examples of Coaching Questions Questions that anticipate approaches: What problems did you anticipate students would have mastering this objective? Tell me about activities you planned and how they supported the objective. Questions that reflect on the students: Who was successful with this lesson and how did you know? What were you able to do to help them be successful? Who struggled with this lesson? Why do you think they struggled? 10:35

Examples of Coaching Questions Questions that summarize and recall details: What do you think went well during the lesson? How do you know that? What evidence did you see that…? Why is that important? Questions that analyze causal factors: What do you think caused…? What impact do you think that had on…? What was different between what you envisioned and what happened? Why do you think those differences occurred? 10:36

Examples of Coaching Questions Questions that construct new learning/ application: What do you want to be mindful of from now on? How might this affect student learning? How else might this look in your class? 10:37

Examples of Coaching Questions Questions that commit to application: How do you plan to apply what we have talked about? What can you do to maintain this new focus? Questions that reflect on the process: As you reflect on this conversation, how has it supported your learning? How might what we talked about impact your thinking on (a specific indicator)? 10:38

Selecting Areas of Reinforcement and Refinement Remember: Choose the areas that will give you the “biggest bang for your buck”. Do not choose an area of refinement that would overlap your area of reinforcement, or vice-versa. Choose areas for which you have specific and sufficient evidence. 10:39 Pg 9 supplemental materials

Identify Examples: Reinforcement Identify specific examples from your evidence notes of the area being reinforced. Examples should contain exact quotes from the lesson or vivid descriptions of actions taken. For example, if your area of reinforcement is academic feedback, you might highlight the following: In your opening, you adjusted instruction by giving specific academic feedback. “You counted the sides to decide if this was a triangle. I think you missed a side when you were counting. Let’s try again,” instead of just saying “Try again”. 10:40

Identify Examples: Refinement Identify specific examples from your evidence notes of the area being refined. Examples should contain exact quotes from the lesson or vivid descriptions of actions taken. For example, if your area of refinement is questioning, you might highlight the following: Throughout your lesson you asked numerous questions, but they all remained at the ‘remember level’. Ex. “Is this a triangle?” instead of “How do you know this is a triangle?” Additionally, you only provided wait time for three of the six questions you asked. 10:41 then break until 10:56

Post-Conference Video 11:12 PC is 16 min

Post-Conference Debrief (Activity) Discuss with your table group parts of the post-conference that were effective and the reasons why. Discuss with your table group at least one way the evaluator could improve and why. Be ready to share with the group. 11:20 (8 min) This is one way an evaluator chose to address the teacher’s questions about scores; there is no right or wrong way. The teacher did not feel comfortable having the last part of this post-conference included as video for training purposes.

Procedural Understanding vs. Conceptual Understanding Procedural Knowledge Conceptual Knowledge Beginning of understanding Thorough understanding/independence Trainer should remind participants why we are here and what the objectives are. This is just the first practice of many. We will be moving along the continuum of procedural to conceptual knowledge, and only through repeated exposure will they gain a more conceptual understanding of the rubric and all its uses.

Environment Domain (Activity) Just like we did for the other domains, highlight the important words from the descriptors of the Environment domain. Trainer Note: You may want to include a discussion about the professionalism domain to make sure that the teachers have a chance to see it and understand that it is about self-reflection and growth.

Environment Domain 3:46 *last model* The trainer can display the screenshot of the Environment domain on this slide to remind participants to be looking at the Environment domain in the General Educator Rubric. After the participants have completed their work, spend 5 minutes debriefing the connections between the two rubrics. Teacher sets high and demanding academic expectations for every student. I am going to start by looking at the “At Expectations” column, because I’m told that this references a teacher presenting an “effective” lesson. Trainer: I’m going to look at the first indicator, “Expectations.” Let me look at the first descriptor. It says, “Teacher sets high and demanding academic expectations for every student.” When I see the word expectations, it makes me think of other areas of the rubric that also reference expectations. If I look at Standards and Objectives, the fourth descriptor or bullet in the At Expectations column reads, “Expectations for student performance are clear.” This is a connection between the Learning Environment rubric and the Instructional rubric with regard to student expectations. I know from looking at the Instructional rubric that there is more than one reference to expectations. Let me look at the Instructional rubric again and see if there is another reference to expectations. If I look at the indicator of Presenting Instructional Content, I see that the third descriptor or bullet references expectations as well. This descriptor reads, Presentation of content most of the time includes: “modeling by the teacher to demonstrate performance expectations.” If I look at the indicator of Grouping Students, the second descriptor reads, “Most students in groups know their roles, responsibilities, and group work expectations.” This causes me to think about the connection between the academic expectations and performance expectations, and now the group work expectations.
Expectations seems to be a key word that connects the Environment to other areas of the Instructional rubric.

Environment and Instruction Connections Expectations Teacher sets high and demanding expectations for every student. S/O: Expectations for student performance are clear. There is evidence that most students demonstrate mastery of the objective. PIC: Presentation of content includes modeling by the teacher to demonstrate performance expectations. AM: Activities and materials are challenging. Q: Questions sometimes require active responses. AF: Feedback from students is used to monitor and adjust instruction TKS: Teacher sometimes provides differentiated instructional methods and content to ensure children have the opportunity to master what is being taught Teacher encourages students to learn from mistakes. Teacher creates learning opportunities where most students can experience success. Students complete work according to teacher expectations. 3:48 5 min.: Model; reinforce created tool to use now and at school AF: Feedback from students is used to monitor and adjust instruction Teacher encourages students to learn from mistakes; TKS: T sometimes provides differentiated instructional methods and content to ensure children have the opportunity to master what is being taught Teacher creates learning opportunities where most students experience success; S/O There is evidence that most students demonstrate mastery of the objective Students complete work according to teacher expectations

Environment and Instruction Connections (Activity) With a partner (5 min.) Make connections between the Instruction domain and the Managing Student Behavior indicator in the Environment domain. Individually (10 min.) Make connections between the Instruction domain and the Environment and Respectful Culture indicators in the Environment domain. 4:03

Video #2 44 minute lesson

Evaluation of Classroom Instruction Reflect on the lesson you just viewed and the evidence you collected. Based on the evidence, do you view this teacher’s instruction as Above Expectations, At Expectations, or Below Expectations? Thumbs up: Above Expectations Thumbs down: Below Expectations In the middle: At Expectations 4:48 Should take 2-3 minutes to do a thumbs up/down. Trainer note: Ask the participants to reflect on the lesson holistically and determine whether their initial perception of the instruction is Above Expectations, At Expectations, or Below Expectations. It is suggested that you do the thumbs up/down process to get a read of where the participants are.

Next Steps Hold on to your evidence and make sure you bring it with you tomorrow. Optional Homework: Try labeling your evidence with the short-hand we discussed today. List any follow-up questions you would need to ask the teacher. We will score this lesson tomorrow based on the evidence you collected today. 4:50 Note that this delay is intentional, to mimic a real-life situation in which you may not be able to score immediately after viewing a lesson. It is actually harder to score during training because you are not able to ask the teacher. Remember, though, in your building you can always ask those clarifying questions.

Wrap-up for Today As we reflect on our work today, please use two post-it notes to record the following: One “Ah-ha!” moment One “Oh no!” moment Please post to the chart paper Expectations for tomorrow: We will continue to collect and categorize evidence and have a post-conference conversation 5:00

This Concludes Day 1 Thank you for your participation! Instruction Planning Environment 9:01

Welcome to Day 2! Instruction Planning Environment 9:04 Answer parking lot questions and review selected responses to yesterday’s ticket out the door.

Expectations To prevent distracting yourself or others, please put away all cellphones, iPads, and other electronic devices. There will be time during breaks and lunch to use these devices as needed. 9:05

Day 2 Objectives Participants will: Continue to build understanding of the importance of collecting evidence to accurately assess classroom instruction. Understand the quantitative portion of the evaluation. Identify the critical elements of summative conferences. Become familiar with data system and websites. 9:06 Re-emphasize the fact that all participants will start at a procedural level, and will be supported to gain a more conceptual understanding. Collecting evidence is a key step in effectively observing and evaluating teachers.

Norms Keep your focus and decision-making centered on students and educators. Be present and engaged. Limit distractions and sidebar conversations. If urgent matters come up, please step outside. Challenge with respect, and respect all. Disagreement can be a healthy part of learning! Be solutions-oriented. For the good of the group, look for the possible. Risk productive struggle. This is a safe space to get out of your comfort zone.

Agenda: Day 2 Day Components Day Two Post-Conferences Professionalism Rubric Alternate Rubrics Quantitative Measures Closing out the Year 9:07

Evidence and Scores Remember: In order to accurately score any of the indicators, you need to have sufficient and appropriate evidence captured and categorized. Evidence is not simply restating the rubric. Evidence is: What the students say What the students do What the teacher says What the teacher does 9:08The trainer should emphasize the difference between presenting evidence collected during the lesson and merely restating the rubric. The next slide will give an example of restating the rubric.

Evidence and Scores Which of these is an example of evidence? Activities and Materials Students used the computer program Kidspiration to develop a Venn Diagram using the two read-alouds as the basis for their comparisons. OR The activities and materials incorporated multimedia and technology. 9:09 The trainer will read the two examples, and it will become immediately obvious to participants that the first is the example of evidence.

Categorizing Evidence and Assigning Scores Using the template provided (pgs. 3-5), you will categorize evidence and assign scores for the Instruction domain. Using the template provided, you will also categorize evidence collected and assign scores on the Environment domain. 10:00 (51 min) Trainer should circulate during this independent activity. It is important for folks who are easily distracted to work in a quiet environment, so if you (the trainer) speak to participants, please use a soft voice. Note: Please work independently first.

Consensus Scoring (Activity) Work with your shoulder partner to come to consensus regarding all indicator scores. Work with your table group to come to consensus regarding all indicator scores. 10:20 (20 min) Then debrief national rater scores whole group. (Share national raters’ scores, emphasizing that a difference of “1” below or above is statistically acceptable.) Support participants in understanding that student engagement and learning is not evident in this lesson. Stress here the difference between trying to “do the rubric as a checklist” and teaching with student learning as part of evidence. Notice how this lesson actually scores, even though the teacher does many things that make it look like an evaluator can say “she did that, so I have to give her credit.” Stress that environment means learning environment, not quiet environment. Evidence from different instruction indicators is used to score the environment rubric. If learning isn’t happening and the instruction scores lower, then logically the environment can’t score extremely high. Likewise, a dynamic, exciting lesson where students are engaged and learning can score high on instruction, but logically wouldn’t score extremely low on environment. Participants will struggle and even disagree with this. Focus on the fact that environment means a learning environment that is conducive to and supports student learning.

Last Practice… This is the third and final practice video during our training. You will watch the lesson, collect evidence, categorize the evidence, and score the instructional indicators. Requirements for certification (compared to the national raters’ scores): No indicator scored +/- 3 away No more than two indicators scored +/- 2 away Average of the twelve indicators must be within +/- .90 11:23

Video #3 12:11 then lunch until 1:11 Lesson is 48 minutes 12th grade AP English

Categorizing Evidence and Assigning Scores (Activity) Work independently to categorize evidence for all 12 Instruction indicators. After you have categorized evidence, assign scores for each indicator. Are there clarifying questions you would ask the teacher prior to your post-conference? When you have finished, you may check with a trainer to compare your scores with those of the national raters. 2:11 (1 hr includes debrief of national rater scores) Trainer should circulate during this activity. At this point, the participants should work independently and without talking. When they have finished, they should start developing a plan for the post-conference.

Writing Your Post-Conference Plan (Activity) On the sheet provided (pg. 16), write your: Area of reinforcement (relative strength) Self-reflection question Evidence from lesson 2:16 (5 min) Remind participants to use their evidence from the lesson.

Writing Your Post-Conference Plan (Activity) On the sheet provided (pg. 17), write your: Area of refinement Self-reflection question Evidence from lesson Recommendation to improve 2:22 (6 min) then break until 2:37 A prompt for trainer: is your recommendation an actionable next step and does it include a follow up timeline?

Role Play (Activity) With a shoulder partner, take turns leading the post-conference you’ve developed. Participants should be cooperative the first time, to practice developing and asking open-ended, reflective questions in the moment. Participants can be more creative in the role they assume the second time, to practice different feedback strategies. 3:02 (15 min) *Trainers may choose to quickly model what the role play may look and sound like.

Reflect and Debrief (Activity) Discuss with your table What were some strengths of these conferences? What are some struggles with these conferences? Why is questioning more effective than simply “telling” in a post-conference? Tables share something discussed with whole group. 3:10 (8 min)

Whole Group Debrief (Activity) Share some examples of what can be said and done and what should be avoided in the post-conference. How did this experience help you as a learner? How and why is this powerful for student learning? Scores are shared at the end of the conference. Why is it appropriate to wait until the end of the conference to do this? 3:20 (10 min) then break until Remind participants that the scores you determine and record before the conference are not to be negotiated or changed.

Chapter 6: Professionalism 3:21

Professionalism Form Form applies to all teachers Completed within last six weeks of school year Based on activities from the full year Discussed with the teacher in a conference 3:23 The Professionalism rubric should be evaluated within the last six weeks of the school year as a summative measure of the teacher’s roles and responsibilities and should consider evidence from the full year. The evaluator should discuss the ratings with the teacher; this may be done during an end-of-year conference. The same report applies to all teachers no matter what other rubric may have been used for observations (Library Media Specialist, Alternative Education, or School Services Personnel). The rubric contains all the same standards (generally at level 3), with more specific examples at levels 1 and 5. Similar to all other observations, you must select an area of reinforcement and refinement. Professionalism is not meant to be an arduous evidence collection process. The focus should be on the quality of actions and not necessarily on a pre-determined quantity of activity.

Professionalism Rubric 3:25 This rubric was built to support consistent evaluation of the Professionalism Rating Report. As with the other rubrics, evaluators may still give a 2 (below expectations) or a 4 (above expectations). Ask: How does this rubric differ from the other rubrics? How is it the same?

Professionalism Rubric (Continued) 3:26

Rubric Activity With a partner (15 min.) Identify the main differences between the performance levels for each indicator. What would that look like in reality? List examples of evidence that could be used to score each indicator. 3:41 Debrief after participants have had time to discuss and record. Discuss school-level expectations in advance. Focus on quality, not quantity. Ex. Leading a quality PLC would be of higher quality than attending 17 football games.

Chapter 7: Alternate Rubrics 3:42

Reflection on this Year It is important to maintain high standards of excellence for all educator groups. School Services Personnel: Overall Average of 4.29 Library Media Specialists: Overall Average of 4.06 General Educators: Overall Average of 3.78 As you can see, scoring among these educator groups is somewhat higher than what we have seen overall. As evaluators, why is that the case? 3:44 Emphasize the importance of honest conversations. If we don’t hold these educators to a high bar, we will continue to get instructional/service delivery that is below standard. Evaluating educators using the SSP and LMS rubrics tends to be more challenging because evaluators have less experience with these roles. That is why we developed the observation guidance documents: to ensure everyone has the information they need to hold all educators to a high standard.

When to Use an Alternate Rubric If there is a compelling reason not to use the general educator rubric, you should use one of the alternate rubrics. Ex. If the bulk of an educator’s time is spent on delivery of services rather than delivery of instruction, you should use an alternate rubric. If it is unclear which rubric to use, consult with the teacher. When evaluating interventionists, pay special attention to whether they are delivering services or instruction. 3:45 For example, a highly scripted computer program falls more in the service area than instruction.

Pre-Conferences for Alternate Rubrics For the Evaluator For the Educator Discuss targeted domain(s) Evidence the educator is expected to provide and/or a description of the setting to be observed Roles and responsibilities of the educator Discuss job responsibilities Provide the evaluator with additional context and information Understand evaluator expectations and next steps 3:47 How is the preconference for alternate rubrics similar and different than the general educator pre-conference? What is the role of the educator? Similar to general educator but more context may be necessary. The pre-conference is an incredibly valuable opportunity for the evaluator to understand the educator’s particular role.

Library Media Specialist Rubric Look at the Library Media Specialist rubric and notice similarities to the General Educator Rubric: Professionalism: same at the descriptor level Environment: same at the descriptor level Instruction: similar indicators, some different descriptors Planning: specific to duties (most different) 3:48 Because most of you have much more experience using the general educator rubric, we are going to put that knowledge to work! Since the environment and professionalism rubrics are the same, we will not be focusing on those domains here. We are first going to take a look at the instruction domain and do a comparison of that to the general educator instruction rubric. This will help you see how you can use what you already know to help you in evaluating LMS. Let’s take a look at Standards and Objectives in the At Expectations column.

Educator groups using the SSP rubric Audiologists Counselors Social Workers School Psychologists Speech/Language Pathologists Additional educator groups, at district discretion, without primary responsibility of instruction Ex. instructional and graduation coaches, case managers 3:49

SSP Observation Overview All announced Conversation and/or observation of delivery Suggested observation 10-15 minute delivery of services (when possible) 20-30 minute meeting Professional License: Minimum 2 classroom visits Minimum 60 total contact minutes Apprentice License: Minimum 4 classroom visits Minimum 90 total contact minutes 3:51 The minimum contact minutes are required by law, and the pre- and post-conference do not count towards that minimum. It is really important to use professional judgment when conducting SSP observations because these educators are involved in highly confidential conversations.

SSP Planning Planning indicators should be evaluated based on yearly plans Scope of work Analysis of work products Evaluation of services/program – Assessment When observing planning at two separate times: the first time is to review the plan; the second time is to make sure the plan was implemented. 3:54 (3 min) Ask: What are some examples of data besides student test scores that may be used when evaluating an educator using the SSP rubric?

SSP Delivery of Services Keep in mind that the evidence collected may differ from the evidence collected under the General Educator Rubric. Some examples might be: Surveys of stakeholders Evaluations by stakeholders Interest inventories Discipline/attendance reports or rates Progress toward IEP goals 3:55

SSP Environment Indicators are the same Descriptors are very similar to general educator rubric Environment for SSP May be applied to work space (as opposed to classroom) and interactions with students as well as parents, community and other stakeholders. 3:56

Observation Guidance Documents Educator groups convened by TDOE to provide additional information for evaluators to inform evaluation using the SSP rubric. Observation guidance documents were created for the following educator groups:
General Educator Rubric: Early Childhood; Special Education; Career and Technical Education (CTE); Online Teaching; Alternative Educators
School Services Personnel Rubric: School Counselors; School Audiologists; Speech/Language Pathologists (SLP); School Social Workers (SSW); Vision Specialists; School Psychologists
3:58 Reference Observation Guidance Documents in Evaluation Handbook (pgs. 23-56). We don’t expect any evaluator to memorize these documents, but they are a great resource to reference prior to observing an educator in one of these sub-groups.

Key Takeaways Evaluating educators using the alternate rubrics: Planning should be based on an annual plan, not a lesson plan. Data used may be different than classroom teacher data. The job description and role of the educator should be the basis for evaluation. Educators who spend the bulk of their time delivering services rather than instruction should be evaluated using an alternate rubric. It is important to maintain high standards for all educator groups. 4:00

Chapter 8: Quantitative Measures 4:01

Components of Evaluation: Tested Grades and Subjects Qualitative includes: Observations in planning, environment, and instruction Professionalism rubric Quantitative includes: Growth measure TVAAS or comparable measure Achievement measure Goal set by teacher and evaluator 4:02 Don’t forget, some districts use student surveys for evaluative purposes; those count toward the qualitative side.

Components of Evaluation: Non-tested Grades and Subjects Qualitative includes: Observations in planning, environment, and instruction Professionalism rubric Quantitative includes: Growth measure TVAAS or comparable measure Achievement measure Goal set by teacher and evaluator 4:03

Growth Overview State law currently requires value-added (or a comparable growth measure) to count as 35 percent of the total evaluation score for teachers in tested grades and subjects. State law currently requires value-added to count as 25 percent of the total evaluation score for teachers in non-tested grades and subjects. Any change to the requirement that 35 percent count as value-added would require legislative action. Additional growth measures are available for non-tested grades/subjects. 4:05 If you have an individual growth score, you are required to use it.

Growth vs. Achievement Growth measures progress from a baseline Ex. John grew faster than we would have predicted based on his testing history. Achievement measures proficiency Ex. John scored a 98 percent on his test. A link to a video series about TVAAS as well as some additional guidance documents can be found here: http://team-tn.org/evaluation/tvaas/ 4:06

Tested Grades/Areas Individual Value-Added Score Includes subjects currently taught 3-year trend scores, where available Any educator with an individual score must use it Data System All individual value-added scores will be directly imported into the data system by the state. All educators, including those who anticipate earning an individual growth score, must select a school-wide option Timeline Scores are returned by June 15th 4:09 For teachers who have enough students to generate an individual growth score, those scores will be automatically mapped in and will override their school-wide choice. While scores will be available on the accountability app on June 15, it may take a couple of weeks for them to appear in CODE and on the TVAAS website.

Non-tested Grades/Areas School-Wide Value-Added Score 4 composite options: overall, literacy, numeracy, and literacy + numeracy 1-year score TCAP-specific, SAT 10-specific, and CTE Concentrator options Data System Evaluators must select which composite to use All educators, including those who anticipate earning an individual growth score, must select a school-wide option Scores will be imported into the data system by the state Timeline Scores will be returned in mid- to late June 4:11

Recommended Composite Districts will determine which composite a non-tested educator will use.
Subject – Recommended Composite:
Academic Interventionists – Overall, Literacy, Math, or Math/Literacy
Computer Technology – Overall
CTE – CTE Concentrator/Student (where available)
ELL – Overall, Literacy
Fine Arts – Fine Arts Portfolio (in participating districts), Overall, Literacy
Health-Wellness and PE; HS Core Non-Tested; Library Media Specialists; SPED; School Services Providers; World Languages – Overall or Literacy
Early Grades – Overall or Math/Literacy (from feeder schools)
4:14

15 Percent Achievement Measure Overview The 15 percent achievement measure is a yearly goal set by the educator and his/her evaluator that is based on current year data. 4:15 Educator has final say if there is disagreement.

Spirit and Process of the 15 Percent Measure Relationship to core beliefs If our focus is on improving the lives of students, then we have to approach the selection of the measure with that in mind. To make the 15 percent selection meaningful, the evaluator and educator work together to identify a measure. If there is a disagreement between the educator and the evaluator, the educator’s decision stands. The process should involve determining which measure most closely aligns to the educator’s job responsibilities and the school’s goals. 4:17 Please remember that all selection, scaling, and scoring happens at the local level. Please remember that most measure selections will need to be manually entered, so keep that in mind when considering deadlines.

Process of Selecting the 15 Percent Measure Things to watch for: Measures that are unrelated to a teacher’s instructional responsibilities. Ex. A 5th grade math teacher choosing 4th grade social studies. Scales that are not rigorous or reflective of expectations. Ex. If last year’s P/A% for Social Studies was 90 percent, a scale of 1 = 0-20%, 2 = 20-40%, 3 = 40-60%, 4 = 60-80%, 5 = 80-100% would not be rigorous or reflective of expectations. 4:20 Explain that there should be a direct link between what the teacher teaches and the 15% measure that she chooses, but that link may not be subject specific. Example: In high school, the graduation rate or ACT score would be appropriate for most high school teachers.

Spirit of Scaling the 15% Measure Scales should be determined with the following spirit in mind (Score – Equivalent Scale):
1 – 0 to ½ year of growth
2 – ½ to 1 year of growth
3 – 1 to 1½ years of growth
4 – 1½ to 2 years of growth
5 – 2+ years of growth
4:22 This table outlines generally what should be kept in mind when creating a scale. Scales are not standardized across a school for all teachers: all teachers start at a different baseline, and the set of students and the context should inform the goal.

Beginning of the Year Conference Evaluator notifies teacher which 35 percent measure will apply. This occurs even for teachers who anticipate receiving an individual growth score. If the teacher has enough students to generate an individual score, that score will be automatically mapped in and will override the selected school-wide measure. Evaluator and teacher choose a 15 percent measure. Evaluator and teacher scale the 15 percent measure. 4:25 It is a best practice to incorporate this conversation into an initial coaching conversation.

Chapter 9: Closing out the Year 4:35

End of Year Conference Time: 15-20 minutes Required Components: Discussion of Professionalism scores Share final qualitative (observation) data scores Share final 15 percent quantitative data (if measure is available) Let the teacher know when the overall score will be calculated Other Components: Commend places of progress Focus on the places of continued need for improvement 4:37

End of Year Conference Saving Time Have teachers review their data in the data system prior to the meeting. Incorporate this meeting into end-of-year wrap-up meetings that already take place at the district/school. 4:38

Grievance Process Areas that can be challenged: Fidelity to the TEAM process, as required by law Accuracy of the TVAAS or achievement data Observation ratings cannot be challenged. 4:39 Trainer should emphasize that each Board of Education must adopt a TEAM Grievance Policy that aligns with the Tennessee State Board of Education Teacher and Principal Evaluation Policy. Teachers can request a Grievance Form through their building level administrator. One of the reasons the claiming process is so important is because improper claiming can result in inaccurate TVAAS data—which is grievable.

Relationship Between Individual Growth and Observation We expect to see a logical relationship between individual growth scores and observation scores. This is measured by the percentage of teachers who have individual growth scores three or more levels away from their observation scores. Sometimes there will be a gap between individual growth and observation for an individual teacher, and that’s okay! It is only concerning when it happens for many educators in the same building. When we see a relationship that is not logical for many teachers within the same building, we try to find out why and provide any needed support. School-wide growth is not a factor in this relationship. Why? We want to avoid situations where teachers are given conflicting information. For example, if a teacher is consistently being told that he/she is not meeting expectations (i.e., observed at a Level 1 or 2), but individual growth data shows something different (i.e., a TVAAS score of 4 or 5), that can be a very confusing and frustrating experience for the teacher. The reverse of this situation can also be frustrating (e.g., when a teacher is consistently observed at a Level 4 or 5, but their individual growth shows them at a Level 1 or 2).

Data System Reports The CODE data system reports allow district- and school-level administrators to access a lot of valuable data, including, but not limited to, the following: Comparisons of observation scores by evaluator Summaries of average scores School-wide reinforcement and refinement indicators You can access more information about the CODE data system here: http://team-tn.org/evaluation/data-system/ 4:42 Best practice: Observation data should be entered into CODE or your observation data system within one day, so that teachers and principals can see the most up-to-date data. This also gives the teacher an opportunity to catch any clerical errors.

CODE Reports (Potential Purpose – CODE Report):
Which teachers are really strong in areas where we struggle as a district? – Observation Summary by Teacher (export); Educator Averages by Rubric Domain
How should we target school or district PD? – Overall Averages by Rubric Indicator; Refinement Goals (pie chart)
Are there patterns in the way our administrators are scoring? – Observer Averages by Rubric Domain; Overall Averages by Observer
How are the teachers in key AMO subject areas performing on observations? – Overall Averages by Subject; Subject Averages by Rubric Domain
Which teachers should be recommended for leadership opportunities and recognition? – Overall Averages by Educator

TEAM Webpage www.team-tn.org

Important Reminders We must pay more attention than ever before to evidence of student learning, i.e. “How does the lesson affect the student?” You are the instructional leader, and you are responsible for using your expertise, knowledge of the research base, guidance, and sound judgment in the evaluation process. As the instructional leader, it is your responsibility to continue learning about the most current and effective instructional practices. When appropriate, we must have difficult conversations for the sake of our students! 4:47 (4 min)

Resources E-mail: Questions: TEAM.Questions@tn.gov Training: TNED.Registration@tn.gov Websites: NIET Best Practices Portal: Portal with hours of video and professional development resources. www.nietbestpractices.org TEAM website: www.team-tn.org Weekly TEAM Updates Email TEAM.Questions@tn.gov to be added to this listserv. Archived versions can also be found on our website here: http://team-tn.org/resources/team-update/ 4:50 *give them time to jot these and tell them the ppt is on the website www.team-tn.org

Expectations for the Year Please continue to communicate the expectations of the rubrics with your teachers. If you have questions about the rubrics, please ask your district personnel or send your questions to TEAM.Questions@tn.gov. You must pass the certification test before you begin any teacher observations. Conducting observations without passing the certification test is a grievable offense and will invalidate observations. Violation of this policy will negatively impact administrator evaluation scores. 4:52

Immediate Next Steps MAKE SURE YOU HAVE PUT AN ‘X’ BY YOUR NAME ON THE ELECTRONIC ROSTER! Please also make sure all information is correct. If you don’t sign in, you will not be able to take the certification test and will have to attend another training. There are NO exceptions! Within the next week, you will be receiving an email to invite you to the NIET Best Practices portal. Email support@niet.org with any problems or questions. You will need to pass the certification test before you begin your observations. Once you pass the certification test, print the certificate and submit it to your district HR representative. 4:55

Thanks for your participation! Have a great year! Instruction Planning Environment 5:00