1
TEAM Teacher Evaluation Process Overview
Christiana Elementary School, August 3, 2015
Instruction, Planning, Environment
8:31 Welcome and introductions. Explain the parking lot.
2
Expectations To prevent distracting yourself or others, please put away all cellphones, iPads, and other electronic devices. There will be time during breaks and lunch to use these devices as needed. 8:33 *many use these as learning tools
3
Norms Keep your focus and decision-making centered on students.
Be present and engaged. Limit distractions and sidebar conversations. Challenge with respect, and respect all. Disagreement can be a healthy part of learning! Be solutions-oriented. For the good of the team, look for the possible. Risk productive struggle. This is a safe space to get out of your comfort zone.
4
TEAM: Overview 8:41 am
5
Evaluation closely links with state standards
Getting students ready for postsecondary education and the workforce is WHY we teach. State standards provide a vision of excellence for WHAT we teach. TEAM provides a vision of excellence for HOW we teach. 8:42
6
Tennessee has made major strides in improving educational outcomes.
Elementary and Middle Schools:
- Fastest improving state in the nation on 4th and 8th grade NAEP
- Consistent gains on TCAP every year since new assessments in 2010
High Schools:
- Fastest growing graduation rate of any state
- ACT statewide average has increased to 19.3
7
By 2025, 55% of all new jobs will require postsecondary education
At the same time, the world has changed, and today's students need much more to succeed. Postsecondary graduates are more likely to be employed and have higher earnings than high school graduates, and the gaps in employment and earnings between these groups have grown substantially over time.
8
Tennessee students are struggling in the early years after high school.
2007 Cohort of High School Freshmen
- 10,545 students did not graduate from high school.
- 22,234 students graduated from high school, entered the workforce, and earn an average salary of $9,030 annually.
- 40,235 students enrolled in postsecondary. 58 percent were still enrolled one year later (20,418 of the 35,055 who enrolled immediately after graduation), and 3,514 had completed a certificate or degree within three years.
Notes on the figures:
- 10,545 students did not graduate from high school: the number of students who did not graduate on time with a regular diploma (the parameters we use for all of our graduation rate calculations).
- 22,234 students graduated from high school and entered the workforce: based on students who graduated from high school with a regular, on-time diploma and have no postsecondary experience to date.
- Earn an average salary of $9,030 annually: we have labor/income data on 14,745 of the 22,234 students listed above; this group makes an average salary of $9,030 and has a 16 percent chance of earning above minimum wage for the year. The other approximately 7,500 individuals for whom we don't have data could be unemployed, working out of state, in the federal government, in the military, self-employed, or in a job that otherwise does not pay into unemployment insurance. This is data for the first year after graduation, calculated by looking at actual earnings over four quarters (one year); the business rules give students two quarters after high school graduation in which to look for a job and then look at their actual earnings over the next four quarters.
- 40,235 students enrolled in postsecondary: this figure includes public and private institutions, in-state and out-of-state, four-year, two-year, and technical colleges (TCATs and some out-of-state ones as well). Only a small number of institutions are not picked up in our data, primarily small, private, proprietary schools and out-of-state community colleges. Of these students, 60 percent are enrolled in four-year programs, 36 percent in two-year programs, and 4 percent in technical colleges.
- 58 percent were still enrolled in one year: of the 35,055 students who enrolled immediately after graduation, 58 percent were still enrolled one year later. This pulls students who already earned a degree or certificate out of both the numerator and denominator.
- 3,514 had completed a certificate or degree within three years: because these students graduated high school in 2011, we only have data from three years out at this moment in time (this summer, we would have data from four years out). Many students could still be working toward a four-year degree, but note that 40 percent of students attending postsecondary enrolled in either two-year programs or technical colleges.
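As a quick arithmetic check using only the counts quoted on this slide (no additional data), the persistence and completion figures can be reproduced directly; the slide does not name a denominator for the three-year completion figure, so dividing by all postsecondary enrollers below is an assumption for illustration.

```python
# Arithmetic check using only the counts quoted on the slide above.
immediate_enrollers = 35_055   # enrolled in postsecondary immediately after graduation
still_enrolled = 20_418        # still enrolled one year later
completers = 3_514             # completed a certificate or degree within three years
all_postsecondary = 40_235     # all graduates who enrolled in postsecondary

# One-year persistence rate, quoted as 58 percent on the slide.
print(f"One-year persistence: {still_enrolled / immediate_enrollers:.0%}")   # ~58%

# Three-year completion share; the slide does not state a denominator,
# so dividing by all postsecondary enrollers is an assumption here.
print(f"Three-year completion: {completers / all_postsecondary:.1%}")        # ~8.7%
```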
9
Tennessee Promise gives students an incredible new opportunity.
Free, Public K-14 System: Grades K-12 plus Grades 13-14, leading to additional postsecondary education and career opportunities. Tennessee Promise provides free public education in Tennessee from grades K-14.
10
It’s now our responsibility to set students up for success.
Given our progress, the changing world, and the opportunity of Tennessee Promise, we must reorganize around a new vision: grades K-12 and grades 13-14 together leading to SUCCESS AFTER GRADUATION.
11
SUCCESS AFTER GRADUATION
To ensure our students are ready for postsecondary success, we must meet the following goals.
Goal #1: Tennessee will continue its rapid improvement and rank in the top half of states by 2019. Measurement: We will rank in the top half of states on 4th and 8th grade NAEP in 2019.
Goal #2: The average ACT score in Tennessee will be a 21, allowing more students to earn HOPE scholarships. Measurement: Tennessee will have an average public ACT composite score of 21 by 2020.
Goal #3: A majority of high school graduates will go on to earn a certificate, diploma, or degree. Measurement: The class of 2020 will be on track to achieve 55% postsecondary completion within six years.
Goal #4: Tennessee's high school seniors will improve faster than any other state's. Measurement: We will be the fastest improving state on 12th grade NAEP in 2017.
Notes: Goal #1: Currently we are 37th in 4th grade math, 31st in 4th grade reading, 43rd in 8th grade math, 34th in 8th grade reading. Goal #2: Currently the average ACT score is
NOTE: Tennessee is one of only 12 states that requires ALL juniors to take the ACT.
12
State Growth Highlights
- Year of transition for implementing the state's new standards in math and English; scores increased on the majority of assessments.
- Nearly 50 percent of Algebra II students are on grade level, up from 31 percent in 2011.
- High school English scores grew considerably over last year's results in English I and English II.
8:50
13
State Growth Highlights cont.
- Achievement gaps for minority students narrowed in math and reading at both the 3-8 and high school levels.
- Approximately 100,000 additional students are on grade level in math compared to 2010.
- More than 57,000 additional Tennessee students are on grade level in science compared to 2010.
14
Origin of the TEAM rubric
TDOE partnered with NIET to adapt its rubric for use in Tennessee. The NIET rubric is based on research and best practices from multiple sources. In addition to the research from Charlotte Danielson and others, NIET reviewed instructional guidelines and standards developed by numerous national and state teacher standards organizations, and from this information developed a comprehensive set of standards for teacher evaluation and development. Work that informed the NIET rubric included:
- The Interstate New Teacher Assessment and Support Consortium (INTASC)
- The National Board for Professional Teaching Standards
- Massachusetts' Principles for Effective Teaching
- California's Standards for the Teaching Profession
- Connecticut's Beginning Educator Support Program
- The New Teacher Center's Developmental Continuum of Teacher Abilities
8:55 The rubric has been used for fifteen years and has been tested and improved over time.
15
Rubrics
- General Educator
- Library Media Specialist
- School Services Personnel: School Audiologist, PreK-12 School Counselor, PreK-12 School Social Worker, PreK-12 School Psychologist, and PreK-12 Speech/Language Therapist. May be used at the discretion of the LEA for other educators who do not have direct instructional contact with students, such as instructional coaches who work with teachers.
8:56
16
Domains: Instruction, Planning, Environment, Professionalism
9:00 It’s all about instruction.
17
Evaluation Process
1. Initial Coaching Conversation (required for teachers who received an overall effectiveness rating or individual growth score of 1 in the previous year)
2. Pre-Conference
3. Classroom Visit
4. Post-Conference (repeat as needed depending on the number of required observations)
5. Professionalism Scoring
6. Summative Conference
9:01 *An initial coaching conversation is plain old good practice.
18
Suggested Pacing 9:15 The minimum required number of observations for each teacher is based on licensure status and evaluation scores from the previous year; this is found on pg. 22 in the manual.
19
Observation Guidance
Coaching Conversation: A targeted conversation with any teacher who scored a 1 on overall evaluation or individual growth about the number of required observations and the supports they will receive throughout the year to improve student achievement.
Observing Multiple Domains During One Classroom Visit: Districts may choose to observe the instruction domain during the same classroom visit as either the planning domain or the environment domain.
Announced vs. Unannounced Visits: At least half of the domains observed must be unannounced, but it is at the district's discretion to have more than half of the observed domains unannounced.
9:20 (5 min here) Initial coaching conversations should take place before the first official observation of the year.
*Multiple domains: We strongly encourage evaluators to observe and score multiple domains together, when possible.
*Announced vs. unannounced: Any time-frame windows ("next week") or hints ("I may be on this hall tomorrow") violate the spirit of unannounced visits. Announced visits show what teachers know of best practices; unannounced visits show what happens on a regular basis.
*If you choose to do more than the required number of observations, this policy has to be applied consistently (i.e., you cannot single out one teacher to receive more observations than the rest of your teachers).
20
Core Beliefs We all have room to improve.
Our work has a direct impact on the opportunities and future of our students. We must take seriously the importance of honestly assessing our effectiveness and challenging each other to get better. The rubric is designed to present a rigorous vision of excellent instruction so that every teacher can see areas where he/she can improve. The focus of observation should be on student and teacher actions because that interaction is where learning occurs. 9:45 Make connections from participants’ share out to this list.
21
Core Beliefs cont. We score lessons, not people.
As you use the rubric during an observation, remember it is not a checklist. Observers should look for the preponderance of evidence based on the interaction between the students and the teacher. Every lesson has strengths and areas that can be improved. Each scored lesson is one factor in a multi-faceted evaluation model designed to provide a holistic view of teacher effectiveness. 9:50 Make connections from participants’ share out to this list.
22
TEAM: Diving into the Rubric
10:00
23
Effective Lesson Summary
- Defined daily objective that is clearly communicated to students
- Student engagement and interaction
- Alignment of activities and materials throughout the lesson
- Rigorous student work, citing evidence and using complex texts
- Student relevancy
- Numerous checks for mastery
- Differentiation
10:30 (Trainer Note) Make connections to: When a lesson is effective, we know it when we see it. But when "it" is missing, how do we communicate what is missing to someone else? How do we build the missing skills in others? How do we measure "it"? TEAM provides us with what "it" is (i.e., what an effective lesson and effective teaching are), the process for building the skills in others, and the tools by which we measure it (the TEAM Instructional Rubrics/Domains).
24
TEAM Rubric
TDOE has worked with NIET to define a set of professional indicators, known as the Instructional Rubrics, to measure teaching skills, knowledge, and responsibilities of the teachers in a school.
Instruction: Standards and Objectives
Significantly Above Expectations (5): All learning objectives are clearly and explicitly communicated, connected to state standards and referenced throughout lesson. Sub-objectives are aligned and logically sequenced to the lesson's major objective. Learning objectives are: (a) consistently connected to what students have previously learned, (b) know from life experiences, and (c) integrated with other disciplines. Expectations for student performance are clear, demanding, and high. There is evidence that most students demonstrate mastery of the daily objective that supports significant progress towards mastery of a standard.
At Expectations (3): Most learning objectives are communicated, connected to state standards and referenced throughout lesson. Sub-objectives are mostly aligned to the lesson's major objective. Learning objectives are connected to what students have previously learned. Expectations for student performance are clear.
Significantly Below Expectations (1): Few learning objectives are communicated, connected to state standards and referenced throughout lesson. Sub-objectives are inconsistently aligned to the lesson's major objective. Learning objectives are rarely connected to what students have previously learned. Expectations for student performance are vague. There is evidence that few students demonstrate mastery of the daily objective that supports significant progress towards mastery of a standard.
10:40 Open your Evaluation System Handbook to the Instruction domain of the General Educator Rubric. These slides will highlight each element of the rubric (domain, indicator, descriptors, and performance levels).
25
The Parts of the Rubric: Domains
[Standards and Objectives rubric excerpt repeated from the previous slide, highlighting the domain: Instruction.]
10:41 There are four domains included in the qualitative portion of teacher evaluation: planning, environment, instruction, and professionalism. These slides highlight each element of the rubric in slide show mode.
26
The Parts of the Rubric: Indicators
[Standards and Objectives rubric excerpt repeated, highlighting the indicator: Standards and Objectives.]
10:42
27
The Parts of the Rubric: Descriptors
[Standards and Objectives rubric excerpt repeated, highlighting the descriptors.]
10:43
28
The Parts of the Rubric: Performance Levels
[Standards and Objectives rubric excerpt repeated, highlighting the performance levels: Significantly Above Expectations (5), At Expectations (3), and Significantly Below Expectations (1).]
10:44
29
Connections between the Indicators
As some have already noticed, the indicators within the instruction domain are closely interconnected. The TEAM rubric is a holistic tool. What does this mean? Holistic: relating to or concerned with wholes or with complete systems. What does this mean for the use of this evaluation and observation tool? To use the rubric effectively, both the observer and those being observed have to see that the parts of each domain can only be understood in the context of the whole. 12:31 Trainer will read the slide.
30
Holistic Nature The rubric is not a checklist.
Teaching, and observations of that teaching, cannot be reduced to "yes/no" answers. Only through an understanding of the holistic nature of the rubric can we see that many of these parts have to be put in the context of each classroom, and with reference to all the other parts that go into teaching. 12:32
31
Questioning and Academic Feedback
The Questioning and Academic Feedback indicators are closely connected with each other. Look closely at these two indicators and discuss how you think they are linked. (teacher AND student links) The trainer can emphasize this if it was not emphasized in the group share out. If this has already been emphasized, then go to the next slide.
32
Thinking and Problem-Solving
The Thinking and Problem-Solving indicators are closely connected with each other. Look closely at these two indicators and discuss how you think they are linked. (teacher AND student links) 12:58 THK/PS slides & activity. The trainer will emphasize thinking as the process that leads to the product of different problem solving types. For example: teaching analytical thinking to students should result in their ability to identify relevant/irrelevant information. Another example: the process of analytical thinking and creative thinking should result in the product of being able to generate ideas. Ask participants if they see other links between thinking/problem solving and give them a few minutes to do this with a partner. Also have them discuss 3rd bullet point. Have a rep from each table share links and aha’s. This is another slide that can be emphasized if the group did not make enough connections. If good connections have already been made, then move on to the next slide.
33
The Thinking and Problem-Solving Link
1:00 pm The trainer will emphasize thinking as the process that leads to the product of different problem-solving types. For example: teaching analytical thinking to students should result in their ability to identify relevant/irrelevant information. The process of analytical thinking should result in the product of being able to problem solve and being able to identify relevant/irrelevant information. Another example: the process of analytical thinking and creative thinking should result in the product of being able to generate ideas. Ask participants if they see other links between thinking/problem solving and give them three minutes to do this with a partner. Slide graphic: Process → Product.
34
Thinking and Problem Solving Link cont.
Thinking and Problem-Solving as described in the rubric are what we expect from students. All other indicators should culminate in high-quality thinking and problem solving by students. How? When teachers ask high-level questions, they promote student thinking; when students interact with one another in a group, they are exercising research-based thinking.
35
Planning—Instructional Plans
Significantly Above Expectations (5): Instructional plans include: measurable and explicit goals aligned to state content standards; activities, materials, and assessments that: are aligned to state standards, are sequenced from basic to complex, build on prior student knowledge, are relevant to students' lives, and integrate other disciplines; provide appropriate time for student work, student reflection, and lesson and unit closure; evidence that plan is appropriate for the age, knowledge, and interests of all learners; and evidence that the plan provides regular opportunities to accommodate individual student needs.
At Expectations (3): Instructional plans include: goals aligned to state content standards; activities, materials, and assessments that build on prior student knowledge; provide appropriate time for student work, and lesson and unit closure; evidence that plan is appropriate for the age, knowledge, and interests of most learners; and evidence that the plan provides some opportunities to accommodate individual student needs.
Significantly Below Expectations (1): Instructional plans include: few goals aligned to state content standards; activities, materials, and assessments that: are rarely aligned to state standards, are rarely logically sequenced, and rarely build on prior student knowledge; inconsistently provide time for student work, and lesson and unit closure; and little evidence that the plan provides some opportunities to accommodate individual student needs.
1:07
36
Planning—Student Work
Significantly Above Expectations (5): Assignments require students to: organize, interpret, analyze, synthesize, and evaluate information rather than reproduce it; draw conclusions, make generalizations, and produce arguments that are supported through extended writing; and connect what they are learning to experiences, observations, feelings, or situations significant in their daily lives both inside and outside of school.
At Expectations (3): Assignments require students to: interpret information rather than reproduce it; draw conclusions and support them through writing; and connect what they are learning to prior learning and some life experiences.
Significantly Below Expectations (1): Assignments require students to: mostly reproduce information; rarely draw conclusions and support them through writing; and rarely connect what they are learning to prior learning or life experiences.
1:10 Have the participants highlight key words in the remaining indicators.
37
Planning—Assessment
Significantly Above Expectations (5): Assessment plans: are aligned with state content standards; have clear measurement criteria; measure student performance in more than three ways (e.g., in the form of a project, experiment, presentation, essay, short answer, or multiple choice test); require extended written tasks; are portfolio-based with clear illustrations of student progress toward state content standards; and include descriptions of how assessment results will be used to inform future instruction.
At Expectations (3): Assessment plans: have measurement criteria; measure student performance in more than two ways (e.g., in the form of a project, experiment, presentation, essay, short answer, or multiple choice test); require written tasks; and include performance checks throughout the school year.
Significantly Below Expectations (1): Assessment plans: are rarely aligned with state content standards; have ambiguous measurement criteria; measure student performance in less than two ways (e.g., in the form of a project, experiment, presentation, essay, short answer, or multiple choice test); and include performance checks, although the purpose of these checks is not clear.
1:20 Discuss how these indicators relate to the Instructional rubric. Pick out key phrases and continue to reference the holistic view of the rubric.
38
Guidance on Planning Observations
The spirit of the Planning domain is to assess how a teacher plans a lesson that results in effective classroom instruction for students. Specific requirements for the lesson plan itself are entirely a district and/or school decision. What are observers trained to do for planning observations? Simply collect the lesson plan after the lesson. REMEMBER: You are not scoring the piece of paper; you are evaluating how well the teacher's plans contributed to student learning. 1:22 Trainer should emphasize that part of evaluating the planning domain is the student assessment piece. Without the instructional observation, it is difficult to assess student achievement.
39
Guidance on Planning Observations cont.
Evaluators should not accept lesson plans that are excessive in length and/or that serve only an evaluative rather than an instructional purpose. If the planning domain is being scored independently, a full-length lesson should accompany that evaluation. To collect the full scope of evidence of student growth, the observer needs to see the lesson in action, not just the paper used for planning purposes. 1:24 In the past we have not fully emphasized the importance of seeing the plan in action. It is not only important to be able to craft a well-written lesson plan; the observer needs to see it in action, especially the student assessment piece.
40
Questions observers ask to determine whether or not a lesson is effective:
What did the teacher teach? What did the students and teacher do to work toward mastery? What did the students learn, and how do we know? 2:07 Review these questions, emphasizing again that student learning is the ultimate goal. Make a chart paper poster of these to refer to throughout the training.
41
Environment Domain 9:55 *last model* The trainer can display the screenshot of the Environment domain on this slide to remind participants to be looking at the Environment domain in the General Educator Rubric. After the participants have completed their work, debrief for 5 minutes on the connections between the two rubrics. "Teacher sets high and demanding academic expectations for every student." I am going to start by looking at the "At Expectations" column under this indicator because I'm told that this references a teacher presenting an "effective" lesson. Trainer: I'm going to look at the first indicator, "Expectations." Let me look at the first descriptor. It says, "Teacher sets high and demanding academic expectations for every student." When I see the word expectations, it makes me think of other areas of the rubric that also reference expectations. If I look at Standards and Objectives, the fourth descriptor or bullet in the At Expectations column reads, "Expectations for student performance are clear." This is a connection between the Learning Environment rubric and the Instructional Rubric with regard to student expectations. I know from looking at the Instructional rubric that there is more than one reference to expectations. Let me look at the Instructional rubric again and see if there is another reference to expectations. If I look at the indicator of Presenting Instructional Content, I see that the third descriptor or bullet references expectations as well. This descriptor reads, "Presentation of content most of the time includes: modeling by the teacher to demonstrate performance expectations." If I look at the indicator of Grouping Students, the second descriptor reads, "Most students in groups know their roles, responsibilities, and group work expectations." This causes me to think about the connection between the academic expectations, the performance expectations, and now the group work expectations. Expectations seems to be a key word that connects the Environment domain to other areas of the Instructional rubric.
42
Environment and Instruction Connections
Environment indicator Expectations: connections to the Instruction rubric
- Teacher sets high and demanding expectations for every student. → S/O (Standards and Objectives): Expectations for student performance are clear. PIC (Presenting Instructional Content): Presentation of content includes modeling by the teacher to demonstrate performance expectations. AM (Activities and Materials): Activities and materials are challenging. Q (Questioning): Questions sometimes require active responses.
- Teacher encourages students to learn from mistakes. → AF (Academic Feedback): Feedback from students is used to monitor and adjust instruction.
- Teacher creates learning opportunities where most students can experience success. → TKS (Teacher Knowledge of Students): Teacher sometimes provides differentiated instructional methods and content to ensure children have the opportunity to master what is being taught.
- Students complete work according to teacher expectations. → S/O: There is evidence that most students demonstrate mastery of the objective.
10:00 5 min.: Model; reinforce the created tool to use now and at school.
43
Evidence and Scores Remember:
In order to accurately score any of the indicators, an observer has to have sufficient and appropriate evidence captured and categorized. Evidence is:
- What the students say
- What the students do
- What the teacher says
- What the teacher does
11:00 The trainer should emphasize the difference between presenting evidence collected during the lesson and merely restating the rubric. The next slide will give an example of restating the rubric.
44
TEAM: Professionalism
2:45
45
Professionalism Form
- Applies to all teachers
- Completed within the last six weeks of the school year
- Based on activities from the full year
- Discussed with the teacher in a conference
2:46 The Professionalism rubric should be evaluated within the last six weeks of the school year as a summative measure of the teacher's roles and responsibilities and should consider evidence from the full year. The evaluator should discuss the ratings with the teacher, and this may be done during an end-of-year conference. The same report applies to all teachers no matter what other rubric may have been used for observations (Library Media Specialist, Alternative Education, or School Services Personnel). The rubric contains all the same standards (generally at level 3) with more specific examples at levels 1 and 5. As with all other observations, you must select an area of reinforcement and an area of refinement. Professionalism is not meant to be an arduous evidence collection process. The focus should be on the quality of actions and not necessarily on a pre-determined quantity of activity.
46
Professionalism Rubric
2:47 Debrief this rubric with your participants. Begin at “At Expectations” and then move to “Significantly Above”. This rubric was built to support consistent evaluation of the Professionalism Rating Report. As with the other rubrics, evaluators may still give a 2 (below expectations) and 4 (above expectations). Ask: How does this rubric differ from the other rubrics? How is it the same?
47
Professionalism Rubric (Continued)
2:48
48
TEAM: Alternate Rubrics
3:00
49
When to Use an Alternate Rubric
If there is a compelling reason not to use the general educator rubric, you should use one of the alternate rubrics. For example, if the bulk of an educator's time is spent on delivery of services rather than delivery of instruction, you should use an alternate rubric. If it is unclear which rubric to use, consult with the teacher. When evaluating interventionists, pay special attention to whether they are delivering services or instruction. 3:03 For example, a highly scripted computer program falls more in the service area than instruction.
50
Pre-Conferences for Alternate Rubrics
For the Evaluator:
- Discuss targeted domain(s)
- Evidence the educator is expected to provide and/or a description of the setting to be observed
- Roles and responsibilities of the educator
For the Educator:
- Discuss job responsibilities
- Provide the evaluator with additional context and information
- Understand evaluator expectations and next steps
3:04 How is the pre-conference for alternate rubrics similar to and different from the general educator pre-conference? What is the role of the educator? It is similar to the general educator pre-conference, but more context may be necessary. The pre-conference is an incredibly valuable opportunity for the evaluator to understand the educator's particular role.
51
Library Media Specialist Rubric
Look at the Library Media Specialist rubric and notice similarities to the General Educator Rubric:
- Professionalism: same at the descriptor level
- Environment: same at the descriptor level
- Instruction: similar indicators, some different descriptors
- Planning: specific to duties (most different)
3:05 Because most of you have much more experience using the general educator rubric, we are going to put that knowledge to work! Since the environment and professionalism rubrics are the same, we will not be focusing on those domains here. We are first going to take a look at the instruction domain and do a comparison of that to the general educator instruction rubric. This will help you see how you can use what you already know to help you in evaluating LMS. Let's take a look at Standards and Objectives in the At Expectations column.
52
Educator groups using the SSP rubric
- Audiologists
- Counselors
- Social Workers
- School Psychologists
- Speech/Language Pathologists
- Additional educator groups, at district discretion, without primary responsibility for instruction (e.g., instructional and graduation coaches, case managers)
3:06 NOTE: Interventionists would only fall under this group if they are just facilitating students' use of a computer program. If they are delivering instruction, they should still be evaluated using the General Educator rubric.
53
SSP Observation Overview
- All announced
- Conversation and/or observation of delivery
- Suggested observation: a 10-15 minute delivery of services (when possible) and a 20-30 minute meeting
- Professional License: minimum of 2 classroom visits and 60 total contact minutes
- Apprentice License: minimum of 4 classroom visits and 90 total contact minutes
3:07 The minimum contact minutes are required by law, and the pre- and post-conference do not count towards that minimum. It is really important to use professional judgment when conducting SSP observations because these educators are involved in highly confidential conversations.
54
SSP Planning
- Planning indicators should be evaluated based on yearly plans
- Scope of work
- Analysis of work products
- Evaluation of services/program (Assessment)
- When observing planning two separate times: the first time is to review the plan; the second time is to make sure the plan was implemented
3:08 (3 min) Ask: What are some examples of data besides student test scores that may be used when evaluating an educator using the SSP rubric?
55
SSP Delivery of Services
Keep in mind that the evidence collected may be different from the evidence collected under the General Educator Rubric. Some examples might be:
- Surveys of stakeholders
- Evaluations by stakeholders
- Interest inventories
- Discipline/attendance reports or rates
- Progress toward IEP goals
3:09
56
SSP Environment
- Indicators are the same
- Descriptors are very similar to the general educator rubric
- For SSPs, the Environment domain may be applied to the work space (as opposed to a classroom) and to interactions with students as well as parents, community, and other stakeholders.
3:09
57
TEAM: Quantitative Measures
3:21
58
Components of Evaluation: Tested Teachers with Prior Data
Qualitative includes:
- Observations in planning, environment, and instruction
- Professionalism rubric
Quantitative includes:
- Growth measure: TVAAS or comparable measure
- Achievement measure: goal set by teacher and evaluator
3:22 For teachers in state-tested grades/subjects, the 35 percent growth component is their individual TVAAS score.
59
Components of Evaluation: Tested Teachers without Prior Data
[Same qualitative and quantitative components as the previous slide.]
3:23 For teachers without an individual growth measure, this will be a school-, district-, or state-wide TVAAS score that comprises 25 percent.
60
Components of Evaluation: Non-tested Teachers
[Same qualitative and quantitative components as the previous slide.]
3:24 For teachers without an individual growth measure, this will be a school-, district-, or state-wide TVAAS score that comprises the 10 percent growth measure.
61
Components of Evaluation: Non-tested Teachers using Portfolio Models
[Same qualitative and quantitative components as the previous slide.]
For teachers in districts that have opted into the portfolio growth models, their portfolio score serves as their individual growth score.
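Taken together, the notes on these four slides give the growth weights (35 percent for tested teachers with individual TVAAS data, 25 percent for tested teachers without prior data, 10 percent for non-tested teachers) alongside the 15 percent achievement measure discussed later in this deck. Below is a minimal sketch of how such weighted components could combine into a single 1-5 composite; the assumption that the qualitative portion simply fills the remaining weight is ours, and the official combination and rounding rules live in the TEAM evaluation manual.

```python
# Illustrative sketch only: combine 1-5 component scores using the growth
# percentages quoted in the slide notes plus the 15 percent achievement
# measure. Treating the qualitative portion as the remainder is an
# assumption for illustration, not an official TEAM weighting table.

WEIGHTS = {
    # category: (growth, achievement, qualitative) weights
    "tested_with_individual_tvaas": (0.35, 0.15, 0.50),
    "tested_without_prior_data":    (0.25, 0.15, 0.60),
    "non_tested":                   (0.10, 0.15, 0.75),
}

def composite_score(category: str, qualitative: float,
                    growth: float, achievement: float) -> float:
    """Return a weighted composite on the 1-5 scale for the given category."""
    g_w, a_w, q_w = WEIGHTS[category]
    return round(g_w * growth + a_w * achievement + q_w * qualitative, 2)

# Example: a tested teacher with an individual TVAAS score
print(composite_score("tested_with_individual_tvaas",
                      qualitative=3.8, growth=4.0, achievement=3.0))  # 3.75
```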
62
Summary The previous slides reflect a state law that was enacted in spring 2015 and is in effect for the school year. For more information about the specific components of this law, please go to the TEAM website. As we develop more communications around this new law and its implications, they will be shared on our website, through TEAM Update, and through Director Update. If you have specific questions, please reach out to 3:30
63
Individual Value-Added Score
Tested Grades/Areas (Individual Value-Added Score):
- Includes subjects currently taught
- 3-year trend scores, where available
- Any educator with an individual score has to use it
- All individual value-added scores will be directly imported into the data system by the state.
- All educators, including those who anticipate earning an individual growth score, must select a school-wide option.
3:36 For teachers who have enough students to generate an individual growth score, those scores will be automatically mapped in and will override their school-wide choice. While scores should be available on June 15, it may take a couple of weeks for them to appear in the state database and on the TVAAS website.
64
Non-tested Grades/Areas
School-Wide Value-Added Score:
- 4 composite options: overall, literacy, numeracy, and literacy + numeracy
- 1-year score
- TCAP-specific, SAT 10-specific, and CTE Concentrator options
- Evaluators must select which composite to use
- All educators, including those who anticipate earning an individual growth score, must select a school-wide option
- Scores will be imported into the data system by the state
3:37
65
Recommended Composite
Districts will determine which composite a non-tested educator will use.
Subject and recommended composite:
- Academic Interventionists: Overall, Literacy, Math, or Math/Literacy
- Computer Technology: Overall
- CTE: CTE Concentrator/Student (where available)
- ELL: Overall, Literacy
- Fine Arts: Fine Arts Portfolio (in participating districts), Overall, Literacy
- Health-Wellness and PE; HS Core Non-Tested; Library Media Specialists; SPED; School Services Providers; World Languages: Overall or Literacy
- Early Grades: Overall or Math/Literacy (from feeder schools)
3:38
66
Spirit and Process of the 15 Percent Measure
Relationship to core beliefs If our focus is on improving the lives of students, then we have to approach the selection of the measure with that in mind. To make the 15 percent selection meaningful, the evaluator and educator work together to identify a measure. If there is a disagreement between the educator and the evaluator, the educator’s decision stands. The process should involve determining which measure most closely aligns to the educator’s job responsibilities and the school’s goals. 3:39 Please remember that all selection, scaling, and scoring happens at the local level. Please remember that most measure selections will need to be manually entered, so keep that in mind when considering deadlines.
67
Spirit of Scaling the 15 Percent Measure
Scales should be determined with the following spirit in mind:
- Score 1: 0-½ years of growth
- Score 2: ½-1 years of growth
- Score 3: 1-1½ years of growth
- Score 4: 1½-2 years of growth
- Score 5: 2+ years of growth
4:00 This table outlines generally what should be kept in mind when creating a scale. It is not standardized at a school for all teachers: all teachers start at a different baseline, and the set of students and context should inform the goal.
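A minimal sketch of the growth-to-score mapping in the scale above, assuming the boundaries shown; how a value that lands exactly on a cut point is scored is a local scaling decision (here it rounds up), since all selection, scaling, and scoring happens at the local level.

```python
def scale_growth(years_of_growth: float) -> int:
    """Map years of student growth to a 1-5 score using the scale above.

    A value exactly on a boundary receives the higher score here; that
    choice, like the scale itself, is a local scaling decision.
    """
    if years_of_growth >= 2.0:
        return 5
    if years_of_growth >= 1.5:
        return 4
    if years_of_growth >= 1.0:
        return 3
    if years_of_growth >= 0.5:
        return 2
    return 1

# Example: roughly 1.2 years of growth maps to a score of 3.
print(scale_growth(1.2))  # 3
```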
68
Beginning of the Year Conference
1. Evaluator notifies the teacher which 35 percent measure will apply. This occurs even for teachers who anticipate receiving an individual growth score; if the teacher has enough students to generate an individual score, that score will be automatically mapped in and will override the selected school-wide measure.
2. Evaluator and teacher choose a 15 percent measure.
3. Evaluator and teacher scale the 15 percent measure.
4:01 It is a best practice to incorporate this conversation into an initial coaching conversation. NOTE: If there is disagreement between the evaluator and the teacher on the achievement measure, the teacher's choice stands. However, if the district chooses, it may appeal the teacher's choice, at which point the state would make the final decision.
69
TEAM: Closing out the Past Year
4:02
70
Grievance Process
Areas that can be challenged:
- Fidelity of the TEAM process, which is established in law
- Accuracy of the TVAAS or achievement data
Observation ratings cannot be challenged.
4:05 Trainer should emphasize that each Board of Education must adopt a TEAM Grievance Policy that aligns with the Tennessee State Board of Education Teacher and Principal Evaluation Policy. Teachers can request a Grievance Form through their building-level administrator. One of the reasons the claiming process is so important is that improper claiming can result in inaccurate TVAAS data, which is grievable.
71
TEAM Webpage 4:07
72
The New Evaluation and Licensure Database
The new database will link evaluation and licensure: a one-stop shop for educators.
73
Important Reminders
- We must pay more attention than ever before to evidence of student learning, i.e., "How does the lesson affect the student?"
- You are the instructional leader, and you are responsible for using your expertise, knowledge of the research base, guidance, and sound judgment in the evaluation process.
- As a teacher, it is your responsibility to continue learning about the most current and effective instructional practices.
- When appropriate, we must have difficult conversations for the sake of our students!
4:09 (1 min)
74
Resources
- Questions: TEAM.Questions@tn.gov
- NIET Best Practices Portal: a portal with hours of video and professional development resources
- TEAM website: weekly TEAM Updates (ask to be added to this listserv); archived versions can also be found on our website
4:10 *Give them time to jot these down and tell them the PowerPoint is on the website.