1
WHY TEACHER BASED TEAMS in SUPPORT OF ALL STUDENTS?
Create shared responsibility for each student as part of “all of our kids”
Eliminate teachers working alone
Provide effective ways for differentiated instruction
Establish ongoing and embedded professional development within the TBT
9/14/12
2
Build Capacity to Train TBTs in Ohio 5-Step Process
DLT
Provide TBT training in the Ohio 5-Step Process
Collect data on quality of TBT implementation
Set benchmark standards
Use BLT student performance and adult implementation data to provide guidance and support to BLTs
Determine district-wide and/or building-to-building support needed from internal and external sources
BLT
Monitor TBT implementation and instructional practices
Use the data to make decisions around professional development and other supports needed by TBTs
Identify strengths and weaknesses of TBT student data
Provide timely flow of BLT data to the DLT level (as defined by the DLT)
Articulate roles and responsibilities of the BLT to building staff
TBT
Give common assessments to students
Analyze results
Use assessment data to group students by needs or deficit skills
Provide intervention/enrichment by differentiating instruction
Re-assess students and evaluate effectiveness of practices
Summarize student performance and instructional-practice data and report to the BLT
Roles/responsibilities of the building administrator include both BLT and TBT! 9/14/12
3
This is the process used for school improvement in the Ohio Improvement Process.
Many resources are available through the SST and, if you're interested, we can show you where they are. 9/14/12
4
What We Know About TBTs
With a balance of administrative support and pressure, teacher groups are more likely to persist in addressing problems long enough to make a causal connection between instructional decisions and achievement gains (Gallimore et al., 2009). 9/14/12
5
Promoting and Participating
in Teacher Learning and Development, E.S. = 0.84 (Hattie's research)
An effect size of d = 1.0 indicates an increase of one standard deviation. A one-standard-deviation increase is typically associated with advancing children's achievement by two to three years, improving the rate of learning by 50%, or a correlation between some variable (e.g., amount of homework) and achievement of approximately r = 0.50. When implementing a new program, an effect size of 1.0 would mean that, on average, students receiving that treatment would exceed 84% of students not receiving that treatment. Cohen (1988) argued that an effect size of d = 1.0 should be regarded as a large, blatantly obvious, and grossly perceptible difference, such as the difference between a person at 5'3" (160 cm) and one at 6'0" (183 cm), which would be a difference visible to the naked eye (Hattie, 2009: 7-8).
So an effect size of 1 is very good indeed, and correspondingly rare; the chart reports only about 75 individual studies that reached that level. By my counting, Hattie's more recent table of the later, expanded set of meta-analyses shows only 21 meta-analyses (out of 800+) with mean effect sizes over 1 (see Hattie, 2009, fig. 2.2, p. 16).
That is what Hattie calls the "hinge point." He uses a "barometer" chart, or gauge, on which he can impose a needle in the appropriate position (diagram based on Hattie, 2009, fig. 2.4, p. 19 et passim). Reverse effects are self-explanatory and fall below 0.0. Developmental effects are 0.0 to 0.15: the improvement a child may be expected to show in a year simply through growing up, without any schooling (these levels are determined with reference to countries with little or no schooling). Teacher effects: "Teachers typically can attain d = 0.20 to d = 0.40 growth per year, and this can be considered average" (p. 17), but subject to a lot of variation. Desired effects are those above d = 0.40 which are attributable to the specific interventions or methods being researched. Anything much less deserves less effort and is marginal. On the other hand, sometimes simple interventions, such as advance organisers, pay off: although not terrifically effective, the return on a very small investment is substantial.
"Problem-solving teaching" has an effect size of 0.61 (2009 figures) and comes fairly naturally in most disciplines. But "problem-based learning" overall had d = 0.15, and developing it requires a very substantial investment in time and resources. So that's a non-starter, isn't it? Not necessarily; like a potent drug, it needs to be correctly prescribed. For acquisition of basic knowledge it actually had a negative effect, but for consolidation, application, and work at the level of principles and skills it could go up to d = 0.75. Not much use in primary schools, but a different matter on professional courses at university, which is where it is generally found (Hattie, 2009: ).
Read more: "What works and what doesn't" (under Creative Commons License: Attribution Non-Commercial No Derivatives). 9/14/12
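To ground the numbers above, here is a minimal, hypothetical sketch (not part of the original presentation) of how an effect size is computed and why d = 1.0 corresponds to "exceeding 84% of the comparison group" under a normality assumption. The scores and function names are invented for illustration.

```python
# Minimal, hypothetical sketch: computing Cohen's d and relating d = 1.0 to the
# "exceeds 84% of the comparison group" claim. All numbers below are invented.
import numpy as np
from scipy import stats

def cohens_d(treatment, control):
    """Standardized mean difference using a pooled standard deviation."""
    t = np.asarray(treatment, dtype=float)
    c = np.asarray(control, dtype=float)
    pooled_var = (((len(t) - 1) * t.var(ddof=1) + (len(c) - 1) * c.var(ddof=1))
                  / (len(t) + len(c) - 2))
    return (t.mean() - c.mean()) / np.sqrt(pooled_var)

treated = [73, 78, 83, 76, 80]      # hypothetical post-test scores, treatment group
comparison = [70, 74, 79, 72, 75]   # hypothetical post-test scores, comparison group
print(f"d = {cohens_d(treated, comparison):.2f}")

# Under a normality assumption, the average treated student sits d standard
# deviations above the comparison mean, i.e. above norm.cdf(d) of that group.
print(f"d = 1.0 -> exceeds {stats.norm.cdf(1.0):.0%} of the comparison group")
print(f"d = 0.4 (Hattie's hinge point) -> exceeds {stats.norm.cdf(0.4):.0%}")
```

Running this confirms the 84% figure quoted above and shows that Hattie's d = 0.40 hinge point corresponds to roughly the 66th percentile of the comparison group.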
6
2/27/12 Teachers change their practices when they have an opportunity to develop a collective understanding of high-quality instruction and are provided ongoing opportunities to collectively reflect, discuss, deliberately practice, receive coaching, and then adjust their teaching. McNulty, pg. (Darling-Hammond and Richardson, 2009)
These components should be what happens during TBTs. The guidance on changing teacher practice used to be that teachers needed to be more reflective about their own practice. The research now finds that there needs to be COLLECTIVE REFLECTION on teaching practices; teachers do not improve when working in isolation. In addition, there was the understanding that teachers needed the opportunity to practice new teaching strategies. Now, however, there is also the understanding that the feedback they get from coaching helps people refine their own practice. The research has evolved and gotten deeper over time, so that we now know "deliberate practice" is the essential component of continuous improvement. McNulty, 2011 9/14/12
7
% Teachers Implementing with Fidelity
Teacher Implementation Related to Student Achievement: The Myth of Linearity. The graph plots student scores against the percentage of teachers implementing with fidelity. It shows that student achievement only improves substantially once more than 90% of all staff are implementing a strategy with fidelity. Below that 90% threshold, your scores will bounce up and down but WILL NOT show substantial, sustained growth. 9/14/12
8
NON-NEGOTIABLES FOR AN EFFECTIVE LEARNING ORGANIZATION
Intensive training and support in the (5-step) process
Multiple opportunities for practice
Coaching in the process and opportunities for observation
McNulty (2010) 9/14/12
9
Facilitation Approaches and Standards
READ: Approaches – Instructional and Facilitative; Standards for Successful Team Meetings. The first page addresses processes and the second procedures and norms.
10
Facilitation Approaches and Standards:
Think about some of the key learning that your team might be engaged in next year (e.g., strategies, modules, Common Core). Think about examples of when you might use each one of these ideas in your team meetings. Be ready to share. The research literature related to facilitation often refers to a number of approaches, key skills, and ways of working. This activity provides an opportunity to take a brief look at these aspects of facilitation. Participants locate their Facilitation Knowledge and Skills handout. Each table-group member will read a segment, and then all will share. 9/14/12
11
Collaborative Inquiry
…a way of ensuring that collaboration goes beyond casual story swapping and becomes true, intentional joint work that results in new understandings that will move practice forward. To ensure this, agendas, protocols, and norms are all important. Using a model can also assist your team in establishing procedures when looking at student work. Katz, Earl & Jaafar (2009), Building and Connecting Learning Communities, p. 74. 9/14/12
12
Implement the Plan Systemically and Systematically
Systemically:
Breadth – broad implementation across the district, including every building and classroom; the spread of activity structures, materials, and classroom organization, as well as the spread of underlying beliefs, norms, and principles
Depth – deep and consequential change in classroom practice
Sustainability – maintaining practices in the face of competing priorities, changing demands, and teacher and administrator turnover
Shared ownership – shared ownership of the plan goals and strategies so that they are held by administrators, schools, and teachers who have the capacity to sustain, spread, and deepen strategies themselves
Systematically:
Procedural – a well-articulated, step-by-step process
Coherent – a logically ordered, integrated system of implementation that is consistent
Thorough – fully detailed
Regularity – consistency
If the systematic components are set up and adhered to, the monitoring will be more sound/legitimate. Take out the description of these words on __ paper. Last week, when you were talking about identifying your critical needs, you considered the high level of implementation for curriculum, assessment, instruction, and PD. Let's just focus on assessment right now. Divide your table in half: half of you focus on systemic and half on systematic. 9/14/12
13
Use data to identify critical needs
Ohio Improvement Process
Stage 0: Planning and Preparation – establish collaborative structures and processes
Stage 1: Use data to identify critical needs
Develop goals, research-based strategies, indicators, and action steps focused on the critical needs identified in Stage 1
Review data; gather summative evidence of implementation and impact
9/14/12
14
The Ohio 5-Step Process: A Cycle of Inquiry
Step 1: Collect and chart data
Step 2: Analyze student work specific to the data
Step 3: Establish shared expectations for implementing specific effective changes in the classroom
Step 4: Implement changes consistently across all classrooms
Step 5: Collect, chart, and analyze post data
9/14/12
15
Baseline Survey Results from Across Ohio
Ohio BASELINE survey results show that we aren't good at following through with Steps 3-5. From a statewide perspective, we aren't at a 90% implementation level for any of the steps. The survey was done in November 2010 with DLTs and February 2011 with building administrators, with over 700 responses. 9/14/12
16
The Checklist Manifesto: How to Get Things Right
by Atul Gawande
Why is using a checklist so important if we already have the 5 Steps? Refer to The Checklist Manifesto: How to Get Things Right by Atul Gawande. Gawande, a surgeon and associate professor at Harvard Medical School, sought the best method to minimize physician care error. He consulted Daniel Boorman, a Boeing aviation checklist expert who develops lists to avert human error during flight. Boorman explains the idea behind good checklists: "Good checklists are precise. They are efficient, to the point, and easy to use in the most difficult situations. They do not try to spell out everything – a checklist cannot fly a plane. Instead they provide reminders of only the most critical and important steps – the ones that even highly skilled professionals could miss."
Gawande's book, The Checklist Manifesto, grew out of research that the surgeon and author did for the World Health Organization on ways to reduce surgical deaths worldwide. Gawande's solution was to expand the use of a remarkably low-tech idea first proposed by Dr. Peter Pronovost, a critical care specialist at Johns Hopkins University: require all doctors, nurses, and other operating-room staff to run through a two-minute surgical safety checklist before, during, and after each surgery. The three-part checklist covers everything from staff introductions to making sure that blood is available for a transfusion. Gawande and the WHO tested the checklist in eight hospitals around the world, including ones in London, Seattle, New Delhi, and rural Tanzania. They found that the simple list made a big difference: it reduced surgical complications by more than a third, on average, in both poor hospitals and wealthier ones. 9/14/12
17
Handout: screenshot of an example checklist. This was revised as the team evolved in their TBT work. The checklist ensures that they are asking the right questions and supports fidelity of implementation. 9/14/12
18
Region 6: TBT Steps 1-2 Going Deeper
3/26/12 From the assessment arrow moving to the right are the higher levels that TBTs need to be working in for there to be evidence of student growth. We need to be moving in that direction. 9/14/12 Region 6: TBT Steps 1-2 Going Deeper
19
Collect and Chart Student Data from a
Step 1: Collect and Chart Student Data from a Common Assessment (e.g., teacher-created end-of-unit assessments, purchased questions). You may want to take time for DLTs to discuss what assessment data their TBTs have that can be used for the 5-Step Process. If your districts were part of OISM or currently utilize RtI, you may need to talk about CBMs versus standards-based assessments. CBMs can show if there are specific skill deficits; they don't typically have the ability to support an item analysis that will hone in on content knowledge that needs to be re-taught or expanded. Remember: common assessments do not need to be commonly created; you need to start where the district is in the process of developing common assessments. 9/14/12
20
Summative district and
Who Needs the Data? DLT/BLT/TBT (The Data Coach's Guide: Love, Stiles, Mundry & DiRanna, 2008)
Summative district and state assessments (aggregated, disaggregated; strand, item, and student work) – Annual
Data about people, practices, perceptions (e.g., demographic, enrollment, survey, interview, observation data, curriculum maps) – 2-4 times a year
Benchmark common assessments (e.g., end-of-unit, common grade-level tests reported at item level; aggregated, disaggregated; strand, item, and student work) – Quarterly or end of unit
Formative common assessments (e.g., math problem of the week, writing samples, science journals, other student work) – 1-4 times a month
Formative classroom assessments for learning (e.g., student self-assessments, descriptive feedback, selected response, written response, personal communications, performance assessments) – Daily to weekly
HAND-OUT: This has helped many buildings organize and define what assessments they use and for what purpose. 9/14/12
21
9/14/12
22
This district is writing its own common assessments as part of its plan. Assessments from math and reading at each grade level were brought to a DLT meeting. The team used a protocol to examine them for rigor and relevance. They will continue to tune their quarterly assessments throughout the next school year and look at results to see if there are correlations between OAA passage rates and their common-assessment scores. 9/14/12
23
You don’t fatten the pig by weighing it.
CAUTION: We need frequent formative assessments, BUT we have to implement meaningful instruction in between those formative assessments. 9/14/12
24
Use TBT process to be PROACTIVE in strengthening the Core instruction.
[Diagram: 5-Step TBT Process cycle] A lot of our problems could be circumvented if we concentrated on our core instruction. In this version you are working to strengthen your core instruction so that there are fewer students in the yellow and red after your post-assessment. Use the Measure Up tool to display where their building's core is; it can be found at www.schoolleaders.org. Measure Up for 2010 will be released after the Ohio Department of Education releases the test data to the public later this month (anticipated approximately Aug 26). MU is published as soon as the data can be prepared. This year (2011), Measure Up will be a web-based tool; no more downloads. Stay tuned... Use the TBT process to be PROACTIVE in strengthening core instruction. 9/14/12
25
[Diagram: three parallel 5-Step TBT Process cycles]
This is an example of using TBTs as part of an intervention/enrichment push-out program, like OISM/RtI. How many of your buildings are involved in RtI? It is critical that they know that TBT work IS RtI work – same 5-Step protocol, same purpose – BUT TBT time should be founded on changing the instructional CORE to avoid having to find set-aside intervention time, personnel, and resources. 9/14/12
26
Step 1: Gave and Scored Assessment
Here is an example of a computer-generated score sheet for a test given to 4th-grade readers.
Determine the highest-response items.
Determine the lowest-scoring questions.
Look at test-item examples and discuss possible causes for strong/weak response ratings. 9/14/12
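As a concrete illustration of this step, here is a minimal, hypothetical sketch in Python of the item analysis described above: tally percent correct per question from a score sheet and flag the highest- and lowest-scoring items. The student names, question count, and 0/1 scoring are invented; a real TBT would pull these values from its own common-assessment score sheet.

```python
# Hypothetical score sheet: student -> responses for questions Q1..Q5
# (1 = correct, 0 = incorrect). Data invented for illustration.
scores = {
    "Student A": [1, 1, 0, 1, 0],
    "Student B": [1, 0, 0, 1, 1],
    "Student C": [1, 1, 1, 0, 0],
    "Student D": [1, 1, 0, 1, 0],
}

num_students = len(scores)
num_items = len(next(iter(scores.values())))

# Percent of students answering each item correctly
item_pct = {
    f"Q{i + 1}": 100 * sum(resp[i] for resp in scores.values()) / num_students
    for i in range(num_items)
}

# Chart the items from strongest to weakest response
for item, pct in sorted(item_pct.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{item}: {pct:.0f}% correct")

highest = max(item_pct, key=item_pct.get)
lowest = min(item_pct, key=item_pct.get)
print(f"Highest-scoring item: {highest}; lowest-scoring item: {lowest}")
```

The output is the kind of chart a TBT can bring to Step 2: the highest item prompts the "was this rigorous enough?" discussion, and the lowest items prompt the discussion of possible causes and instructional responses.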
27
Summative district and
3/26/12 The Data Coach's Guide: Love, Stiles, Mundry & DiRanna, 2008
Summative district and state assessments (aggregated, disaggregated; strand, item, and student work) – Annual
Data about people, practices, perceptions (e.g., demographic, enrollment, survey, interview, observation data, curriculum maps) – 2-4 times a year
Benchmark common assessments (e.g., end-of-unit, common grade-level tests reported at item level; aggregated, disaggregated; strand, item, and student work) – Quarterly or end of unit
Formative common assessments (e.g., math problem of the week, writing samples, science journals, other student work) – 1-4 times a month
Formative classroom assessments for learning (e.g., student self-assessments, descriptive feedback, selected response, written response, personal communications, performance assessments) – Daily to weekly
HOW MUCH IS TOO MUCH?! We hear this question a lot. 9/14/12 Region 6: TBT Steps 1-2 Going Deeper
28
Frequent Testing/Effects of Testing
3/26/12 Frequent Testing/Effects of Testing: effect size d = 0.34. An important point is that repeated testing on its own is not effective. So when teachers, principals, etc. state that there is too much "testing," they may be correct IF they only test without using the resulting data to plan their instruction. 9/14/12 Hattie 2009 Region 6: TBT Steps 1-2 Going Deeper
29
3/26/12 Feedback: effect size d = 0.73. BUT as soon as we "flip" our thinking and recognize that assessments are a form of student feedback to teachers about their learning, the effect is much larger. It's the learning from testing, on both the student's and the teacher's part, that makes the difference. 9/14/12 Hattie 2009 Region 6: TBT Steps 1-2 Going Deeper
30
What Does Feedback Mean?
3/26/12 What Does Feedback Mean? "The mistake I was making was seeing feedback as something teachers provided to students – they typically did not, although they made claims that they did at the time, and most of the feedback they did provide was social and behavioral. It was only then, when I discovered that feedback was most powerful when it is from the student to the teacher, that I understood it better. When teachers seek, or are at least open to, feedback from students as to what students know, what they understand, where they make errors, when they have misconceptions, when they are not engaged – then teaching and learning can be synchronized and powerful. Feedback to teachers helps make learning visible." (Hattie, 2009, p. 173)
Quick write or draw – reflection time. Give them time to reflect on the quote. Does it bring an aha? 9/14/12 Region 6: TBT Steps 1-2 Going Deeper
31
Formative Evaluation! Effect size d = 0.90 (Hattie, 2009). 3/26/12
Getting systematic feedback from students to teachers through the use of formative evaluations, seeking evidence on where students are not doing well in order to improve teaching, looking at ALL students in this way, and being open to new experiences is what makes the difference. 9/14/12 Hattie 2009 Region 6: TBT Steps 1-2 Going Deeper
32
Formative Evaluation! Pay attention to the formative effects of your teaching, as it is these attributes of seeking formative evaluation of the effects (intended and unintended) of the programs that make for excellence in teaching. d = 0.90. 9/14/12 Hattie, pg.
33
Analyze student work for strengths and weaknesses.
Step 2 Analyze student work for strengths and weaknesses. Transition slide to Step 2 of the 5 step process 9/14/12
34
Step 2: Analyze Your Data Highest Scoring Question
Now by looking at the scored assessments, we find which questions our students did the best on. Why do we think this one was easy? Does this question have enough rigor? Should we try assessing this again on our next test but write the question at a higher cognitive level? Pg 3 9/14/12
35
Lowest Questions: Here are three of the lowest-scoring questions. Depending on what the TBT decides, it may be that only one item and its related standard/outcome is addressed in the TBT instructional strategy. This may be different from what another TBT may choose, but that is okay, as all needs will eventually be addressed. A mixed-curriculum TBT could use these types of items/discussions to determine cross-curricular areas of need.
#7 shows an instructional gap in my classroom. I need to revisit this and test again to check.
#8 shows me that they aren't looking back in the text to find evidence of how they would make an inference. I'm going to use the Prove-It strategy and test again for inferencing.
#12 is an example of how all content areas can support each other through summary writing.
Pg 5 9/14/12
36
5-Step Process Card Coaching Prompts
Share the tools they can use as they watch the first chunk of the video. Divide into two-person teams and count off as Person #1 and Person #2. #1 will watch the 5-Step process during the video sequences and #2 will watch for potential coaching points. 9/14/12
37
TEAM DIALOGUE – 2 minutes: Share what is alike/different from your current practice related to Step 2… 9/14/12
38
Step 3 Establish shared expectations for implementing specific differentiated strategies in the classroom. The next segment will show Step 3. Move to next slide. 9/14/12
39
A TBT I worked with decided in their early stages to "just each focus a little more in their daily instruction" on their low area. After doing a post-assessment, they realized that they needed a lot more structure in their instructional plan, and they followed the checklist examples on this screen. 9/14/12
40
TBTs and Your Instructional Framework
What’s the difference between a strategy and an Instructional Framework? Do we know how this fits with TBTs? 9/14/12
41
Charlotte Danielson’s Instructional Framework
Domain 1: Planning and Preparation
1a Demonstrating Knowledge of Content and Pedagogy
1b Demonstrating Knowledge of Students
1c Setting Instructional Outcomes
1d Demonstrating Knowledge of Resources
1e Designing Coherent Instruction
1f Designing Student Assessments
Domain 2: Classroom Environment
2a Creating an Environment of Respect and Rapport
2b Establishing a Culture for Learning
2c Managing Classroom Procedures
2d Managing Student Behavior
2e Organizing Physical Space
Domain 3: Instruction
3a Communicating With Students
3b Using Questioning and Discussion Techniques
3c Engaging Students in Learning
3d Using Assessment in Instruction
3e Demonstrating Flexibility and Responsiveness
Domain 4: Professional Responsibilities
4a Reflecting on Teaching
4b Maintaining Accurate Records
4c Communicating with Families
4d Participating in a Professional Community
4e Growing and Developing Professionally
4f Showing Professionalism
9/14/12
42
FRAMEWORK STRATEGIES
BEFORE: Communicate Learning Target; Word Study/Vocabulary; Activate Prior Knowledge (Frayer Method, Vocabulary Cluster, K-W-L Chart, Anticipation Guide)
DURING: Present and model the content; Practice and deepen content knowledge (Shared Reading, Paired Reading, Echo Reading using Think-Alouds)
AFTER: Checking for Understanding (Retell, Writing as an Extension of Reading, Write a Review, Create a Timeline of the Story, Application of New Vocabulary)
9/14/12
43
Implement changes consistently across all classrooms
Step 4 Implement changes consistently across all classrooms Step 4 is the actual implementation 9/14/12
44
Not only should administrators do walk-throughs based on the instructional strategies being implemented by each of their TBTs, BUT the teachers should also observe one another to see how the chosen strategy is being implemented and to learn from each other. 9/14/12
45
Collect, Chart and Analyze Post-Assessment Data
Step 5 Collect, Chart and Analyze Post-Assessment Data Once you have begun the TBT cycle, Step 5 can actually end up being Step 1 of your next round. 9/14/12
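For illustration only, here is a minimal, hypothetical sketch of the Step 5 work: pair each student's pre- and post-assessment scores, chart the gains, and flag students who still need intervention or enrichment in the next cycle. The names, scores, and the 80-point proficiency cut are invented, not taken from the presentation.

```python
# Hypothetical pre/post data for one TBT cycle (invented for illustration).
pre_scores = {"Student A": 55, "Student B": 70, "Student C": 85, "Student D": 60}
post_scores = {"Student A": 72, "Student B": 78, "Student C": 90, "Student D": 65}
PROFICIENT = 80  # hypothetical proficiency cut score

# Gain per student and the group's average gain
gains = {name: post_scores[name] - pre_scores[name] for name in pre_scores}
avg_gain = sum(gains.values()) / len(gains)

# Students still below the cut score go back into intervention grouping (Step 1 of the next cycle)
still_need_intervention = [name for name, score in post_scores.items() if score < PROFICIENT]

print(f"Average gain: {avg_gain:.1f} points")
for name, gain in gains.items():
    print(f"{name}: {pre_scores[name]} -> {post_scores[name]} ({gain:+d})")
print("Regroup for intervention/enrichment:", still_need_intervention)
```

This mirrors the point above: the post-data analysis is not an endpoint but the starting data set for the next round of the cycle.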
46
9/14/12
47
Don’t forget to evaluate each cycle of your TBT
9/14/12
48
Region 6: TBT Steps 1-2 Going Deeper
3/26/12 From the assessment arrow moving to the right are the higher levels that TBTs need to be working in for there to be evidence of student growth. We need to be moving in that direction. 9/14/12 Region 6: TBT Steps 1-2 Going Deeper
49
Sharing Our Learning 9/14/12
50
TEAM DIALOGUE – 5 Minutes: Share what is alike/different from your current practice related to Steps 3-5… 9/14/12
51
Communication Loop? What does the BLT need?
Monthly updates from each team. What is the most important information that needs to be shared for us to be a LEARNING community?
What does the DLT/Transformation Team need? At least quarterly updates from the BLT on the progress of their TBTs: implementation data (adults) and pre/post data (students). 9/14/12
52
Different TBT Configurations
Vertical Teams
Cross Content
Same Grade Level/Same Content
Multiple Grades/Same Content Area
Within Class: Intervention/Enrichment! Centers, Differentiated Work (Flexible Grouping)
9/14/12
53
Kindergarten TBT Timeline
9/14/12
54
Review the schedule this building uses for its vertical team meetings
9/14/12
55
Vertical Teams Review the vertical team background with them 9/14/12
56
Cross-Content Version of TBTs
9/14/12