1
Assessment & Reflective Practice
Our Cornerstone for Change
2
The Layout of Professional Development for EIP
Day 1 - Collaborative Strategic Decision-Making: developing a process and framework. Day 2 - Assessment and Reflective Practice: examining the use of assessment; identifying how reflective practice works. Day 3 - Instructional Repertoire: building new ways to develop strategies focused on improved student outcomes. Remind the participants of the scope and sequence of the core skills training. At this point they have developed a decision-making process and have begun to examine their collaborative culture within their school or district. Today they will be looking at how assessment and reflective practice support the decision-making process.
3
Central Themes Building a Collaborative Learning Community
Using Strategic Decision-Making Building Capacity to Develop, Implement and Sustain an Effective Process Remind the participants of the central themes. These will be part of the day as well.
4
Components of EIP Leadership Collegial & Family Partnerships
Strategic Decision-Making Assessment & Reflective Practice Instructional Repertoire Accountability & Documentation Remind participants of the six components. The major focus will be assessment and reflective practice.
5
Objectives for Today Analyze and examine student work, learning and behavior using protocols in order to define a focus area for improvement; Develop effective monitoring systems that chart student progress from baseline to a specified target; and Define reflective practice and identify how it will improve implementation integrity, as well as enhance instructional practice. Review the objectives for the day.
6
Lessons Learned Using assessment and reflection should result in a change in instructional practice. Assessments focus on environment, curriculum, and instruction, not just the student. Reflection is a process that focuses on how teachers can enhance their practice. This slide sets the tone as to why assessment and reflective practice is one of the EIP components. These are lessons that have been learned from schools that have implemented EIP and from national research on effective early intervention.
7
Indicators of a Quality Decision-Making Process
Identify the focus area or concern Determine the desired outcome Generate alternative strategies Examine strategies for feasibility Develop a plan of action, including a monitoring system Implement & monitor student progress & the plan Evaluate student progress & the plan Remind participants of the indicators of quality decision-making. These were used at the last session to help them develop their decision-making process.
8
Which Indicators Relate to Assessment & Reflective Practice?
Identify the focus area of improvement Determine the desired outcome Generate alternative strategies Examine strategies for feasibility Develop a plan of action, including a monitoring system Implement & monitor student progress & the plan Evaluate student progress & the plan Have participants take a few minutes to select which indicators are connected to assessment and reflective practice. Have them share out their ideas.
9
Indicators That Will Be Covered Today
Identify the focus area of improvement Determine the desired outcome Generate alternative strategies Examine strategies for feasibility Develop a plan of action, including a monitoring system Implement & monitor student progress & the plan Evaluate student progress & the plan These are the ones that will be dealt with most directly today.
10
What is Assessment? This begins the overall background information on assessment.
11
The Purpose of Assessment
“Assessment is a process of collecting data for the purpose of making decisions about individuals or groups and this decision-making role is the reason that assessment touches so many people’s lives.” Salvia & Ysseldyke (2001) This quote helps define the purpose of assessment. Assessment drives decisions.
12
What is the Purpose for Assessment?
To make instructional decisions: Student, Classroom, Grade Level, School, District, Community. Decisions influence all of the spheres we come in contact with, whether the decisions are made for a single student, a whole classroom, a grade level, a whole school, a district, or even a community.
13
What Makes Decision-Making Strategic?
[Diagram: "Perception of an Issue -> Action" contrasted with "Perception of an Issue -> Data to Verify -> Action." Based on SWIS.] The part that makes a decision strategic is the data and assessment process we use. (This slide was previously discussed in the first session.) We often first have an impression of a situation. It is the data that will verify the perception of the situation as accurate or redefine the perception in more objective terms. To simply move from our perception to action without the verification of data is to make decisions with a lack of information.
14
Reflective Practice
[Diagram: a cycle of Planning -> Instruction -> Student Outcomes -> Evaluation, with Assessment at each step and the Student at the center.] Assessment and reflection are processes. Reflective practice is the overarching, big-picture examination a teacher does to improve and change his/her practice. Assessment is on-going throughout the instructional process. Pre-assessment helps us plan. Monitoring during instruction helps us adjust. Post-assessment helps us evaluate the effectiveness of our instruction and what the student actually learned.
15
Characteristics of Assessment
Functional (Effective, Useful) Relevant Direct Multidimensional Formative Frequent, Repeated Individually Focused Technically Adequate Multidimensional—multiple sources, multiple settings, multiple methods. Helps formulate high probability intervention activities. Repeatable measures to allow for frequent progress monitoring. Individually focused—no standard batteries. Drawn from actual curriculum and directly related to an identified focus area of improvement Utilizes the context as a means to determine function of behavior Serves as evidence of our perceptions and our instructional practice
16
When You Think “Assessment”
What is the question that needs to be answered? What information do you intend to obtain from your assessment? What will you do to get the information? How will you use the information you got? We need to be selective and strategic about the assessment we use. We can better focus ourselves if we first develop a question that we need to answer using our assessment. The education culture tends to collect a great deal of data and the key is to determine which data will be the most useful.
17
Phases of Collaborative Inquiry
Framing the Question; Collecting Data; Organizing Data; Analyzing Data through Data-Driven Dialogue; Drawing Conclusions, Taking Action; Monitoring Results. Assessment is more than collecting data. It is becoming a collaborative inquiry process, or a reflective process. This is why we have a component which combines assessment and reflective practice as one. Assessment is only as good as the way we use it to enhance our practice. Love, N., 2002
18
What Data Do We Use? Looking at Numbers Quantitative data (Numbers)
Defining the gap between expectations and current performance Monitoring the progress and growth Move Beyond Numbers Qualitative data (Descriptions) Developing a focus area or the cause of a concern Defining the context Examining the implications of decisions There are two forms of data that help us to assess what we are doing. Quantitative data are the actual numbers. The numbers can tell us how well we are doing or help us define the exact level of growth and change needed. Qualitative data are the contextual observations and descriptions that cannot be measured or counted. These are the words. They can help us develop a focus of concern or define the cause of a concern. They help us analyze what we do in context and help us determine the implications of our decisions. We need both types of data to support what we are doing.
19
Assessment Testing vs. Assessment Tests
One area that merits clarification is the distinction between tests and assessments. Tests are a piece of assessment, but by themselves they do not tell the whole story. A test is like a photograph: a single snapshot taken at a point in time. It tells only a single moment of a story. An assessment, on the other hand, is like a movie. It tells an on-going story over time.
20
How Do Grades Support or Hinder Assessment?
Grading Practices One common form of communication about student progress is grading. Grades often come from test scores. Grading needs to be examined closely to determine which parts of grading can connect with assessment.
21
What Grade Would You Give?
Let’s take a scenario of three students. There are 100 items these students need to know from a unit that was just taught. On the end-of-unit test, this is how each student scored. Have participants spend a few minutes deciding what grade each student would get. Have them briefly share out what grade they give and why.
22
What Grade Would You Give Now?
Now let’s put a twist on the story. What if we had pre-tested each student and learned that student x knew 20 items before the unit was taught, student y knew 60 items, and student z knew 90. After the unit, student x had learned 30 new items, student y 20, and student z 10. Have the participants now discuss how this information may or may not change the grades they would give. Have them share out any changes in grades.
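For facilitators who want the arithmetic spelled out, here is a minimal sketch in Python. The letter-grade cutoffs, and the very idea of grading on raw post-test score, are illustrative assumptions rather than part of the EIP materials; the pre-test and growth numbers come from the scenario above.

```python
# Illustrative sketch: grading on the raw post-test score vs. looking at growth.
# Pre-test knowledge and items learned are from the scenario above; the
# letter-grade cutoffs are a hypothetical assumption for illustration only.

students = {
    "x": {"knew_before": 20, "learned": 30},
    "y": {"knew_before": 60, "learned": 20},
    "z": {"knew_before": 90, "learned": 10},
}

def letter_grade(percent: int) -> str:
    """Map a percentage to a letter grade using hypothetical 90/80/70/60 cutoffs."""
    for cutoff, grade in [(90, "A"), (80, "B"), (70, "C"), (60, "D")]:
        if percent >= cutoff:
            return grade
    return "F"

for name, s in students.items():
    post_test = s["knew_before"] + s["learned"]  # items known on the 100-item test
    print(f"Student {name}: post-test {post_test}/100 -> {letter_grade(post_test)}, "
          f"growth = {s['learned']} items")
```

Under these assumed cutoffs, student x fails on the raw score (50/100) yet learned the most (30 items), while student z earns an A while learning the least (10): exactly the tension the twist is meant to surface.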
23
Let’s Reflect What does this exercise tell us about grading?
How reliable are grades in terms of assessing student progress? Have participants discuss what this exercise tells us about grades. How reliable are grades in telling us about student learning or student progress? Can grades be used as an effective means to assess actual student progress? Have participants share out their thinking.
24
Examining Student Work
Observation Review Test Interview Examining Student Work Decision-Making This slide demonstrates visually how we need to use multiple forms of data to build a whole puzzle. The assessment process makes it possible to converge the data. It is this convergence that makes our decisions strategic.
25
Examining Student Work
Test Review Observation Interview Examining Student Work Decision-Making This demonstrates the convergence.
26
This shows how the picture becomes clearer for us and how we can better see what decisions to make.
27
Domains of Assessment Curriculum Environment Instruction Student(s)
Curriculum: what we teach. Environment: context of learning. Instruction: how we teach. Student(s): outcomes of learning. Remind participants of the domains that we need to examine in assessment. Adapted from Heartland Area Education Agency
28
Domains x RIOT/E Matrix (Adapted from Heartland AEA 11). Columns: R (Review), I (Interview), O (Observe), T (Test), E (Examine Student Work).
C - Curriculum: R: permanent products, district standards, lesson plans. I: teachers, curriculum specialists, administrators. O: implementation of standards, decisions on selection of content. T: readability of texts. E: Standards in Practice, SLICE, Tuning Protocol.
E - Environment: R: school rules, handbooks, policies. I: parents, students. O: interaction patterns. T: environmental analysis, observation-based assessments. E: Initial Line of Inquiry.
I - Instruction: implementation of CCT, teacher expectations, antecedents, conditions, consequences; protocols: Descriptive Review, Lesson Study.
S - Student: student records, target area, dimensions & nature of the problem, student performance, discrepancy between setting demands & performance.
This matrix helps to understand the various forms of data and how they relate to the domains of assessment.
29
Figure 1.The Richness and Complexity of Student Assessment Data
Nation/International Assessments: Are students performing optimally? Large-Scale Assessments: Are students meeting the state standards? Diagnostic Assessments: What are students’ cognitive strengths and needs? Student Report Cards: How are students performing in general? Performance Assessment: Can students apply and generalize what they’ve learned? Classroom Curriculum Unit Tests, Quizzes: Did students learn it? Formative Assessments: Are students learning it? (The figure’s axes are specificity of information and rate of feedback; frequency ranges from annually to students in selected grades, through as needed/usually once a year and once per curriculum unit, to weekly and daily.) This is one example of a model of a comprehensive approach to using assessment information. Please note the frequency and level of connection to a single student, classroom, whole school, and district. Have participants discuss the usefulness and purpose of these types of assessments. How do they connect to assessing curriculum, environment, instruction, and students? Source: Allison Cromley, "Using Student Assessment Data: What Can We Learn from Schools?", North Central Regional Educational Laboratory, Policy Issues, Issue 6, Nov. 2000.
30
Assessment & Reflective Practice
[Diagram: the EIP continuum of assessment & reflective practice. At the base, universal assessment of all students in school: formal & informal monitoring of student progress, curriculum-based and observation-based assessment. In the middle, focused assessments: observation-feedback on instruction, lesson study, reflective practice. At the top, focused assessment for intensive support: problem validation, increased objectivity, in-depth analysis, examining student work.] This slide demonstrates how assessment looks on the continuum. Remind participants that this is the continuum EIP uses. Connect the bottom as universal practices, the middle as interventions, and the top as intensive support. (Adapted from Ortiz, 1987; Horner, 1998; Sugai, 2001)
31
A Key Factor for Assessment
In 2000, a Harvard study was conducted examining the issue of disproportionality in special education. Connecticut was cited as one of the states identified as in need of improvement in this area. One major area of improvement for the state is the use of non-biased assessments. This research has indicated that CT does not always practice non-biased assessment. Students of color are at risk of being misidentified, particularly African-American and Hispanic males, who are more likely to be labeled as ED or ID than their white counterparts.
32
Insert Slide(s) on Non-Biased Assessment
33
Using Assessment to Identify the Focus Area for Improvement
This begins the actual use of assessments and the connection to decision-making. The first indicator of a quality decision-making process is defining a focus area for improvement. This indicator relies heavily on our use of data.
34
Using Your Homework Select a “case” to use for the next session
Single student e.g., a gifted student A specific group of students e.g., ELL A classroom or grade level e.g., improving math instruction A whole school e.g., lunchroom behavior A whole district e.g., increasing time with non-disabled peers or a new science curriculum Remind participants that their homework from last session was to bring a case. Ask them to pull out all of the information they brought today on that case. Get a feel for what kinds of cases are in the room. (See list above.) Tell participants that this case will be used today and then again at the next session. The intent is to use the case to practice new skills. (Note that some parts may seem more time-consuming to do, but that is due to the learning curve. All of the material covered today should eventually become automatic practice and therefore take less time as they become more skilled in this way of thinking.)
35
Identify the Focus Area for Improvement
What is happening? Frame a question in terms of the impact on student learning Examine the context by collecting and analyzing data Develop a hypothesis to define a central area of focus Remind participants of the considerations for this indicator. Refer to the sheet where the indicators are listed. Today we will walk through each of these.
36
Remember… We Need to Develop a Question
Frame a question in terms of the impact on student learning Frames our thinking in terms of inquiry vs. judging Aligns our thinking to student learning Remind participants, as covered last time, that the intent of creating a question is to emphasize inquiry as the driver behind our decision-making. It will also provide us with a focus directly connected to student learning. Connect this back to the "When You Think Assessment" slide and how we need to frame a question to help us focus the types of assessments we need to use for our case. Example: Instead of "Ann Marie never stays in her seat," ask "Why is Ann Marie having a hard time staying in her seat?"
37
Use Your Case Examine the information you have about your case.
What is the question you want to answer? Write your question on your worksheet. Have participants take out the worksheet for the day. Have them briefly examine their case and the information known about the case. Have them frame a question they would like answered about the case. Have them share out examples of questions.
38
Examine the Context Examine the context by collecting and analyzing data Determine when, where, how long, with whom, and under what conditions Develop a rationale for the occurrence using data Use evidence to explain what we see as reason for performance gaps Remind participants that examining our context is taking into account the when, where, and with whom this issue is occurring. We need to recognize that we ourselves influence the issue.
39
Domains of Assessment Curriculum Environment Instruction Student(s)
Curriculum: what we teach. Environment: context of learning. Instruction: how we teach. Student(s): outcomes of learning. Remind participants of those domains which define the context. Adapted from Heartland Area Education Agency
40
Essential Questions to Analyze Curriculum
What content standards does this address? What are the performance standards? What is the essential content? What is the level of expectation? How are the curricular standards and materials adapted to meet instructional level? Share with participants that understanding the context means framing our thinking around some essential questions. Refer to the sheet of essential questions. These are some examples of questions for examining curriculum influences.
41
Essential Questions to Analyze Environment
How are expectations clearly communicated? What are the task directions? What are the opportunities for student choice? What are the physical influences on the learning? What are the social/interpersonal influences on the learning? How do the student and teacher collaborate in the learning process? These are some examples of questions for examining environmental influences.
42
Essential Questions to Analyze Instruction
What is the amount of student engagement and relevant practice? Is there appropriate pacing? What teaching strategies are used? How are tasks organized for students? Is there an instructional match? How does the feedback support student learning? These are some examples of questions for examining instructional influences.
43
Essential Questions to Analyze Student Performance
What does the student know? What can the student do? What are the student’s strengths? What are the student’s interests? What is the instructional level? What learning strategies does the student use? How does the student organize information and approach new learning? How does the student self-monitor? What are the patterns in errors? These are some examples of questions for examining student academic performance.
44
Essential Questions to Ask About Behavior
When is the behavior most/least likely to occur? Where is the behavior most/least likely to occur? With whom is the behavior most/least likely to occur? What happens immediately before/after the behavior? What do others do when the behavior occurs? What other environmental conditions may contribute to the behavior? These are some examples of questions for examining student behavior. Pennsylvania Department of Education, Initial Line of Inquiry Gary LaVigna (2000) Behavioral Assessment and Advanced Support Strategies
45
For Example… Dog Cat Apple Ball Chad 3 4 2 1 Gickling
Since assessment is only as good as the ability to analyze the information, the essential questions become a useful means of collectively examining our evidence. For example: We have a first grade classroom and one of the curriculum objectives is for students to alphabetize words by the first letter. As the classroom teacher, you have written these words on the board in this order. You have asked students to please put the words in alphabetical order on their paper. (Click mouse for Chad’s paper.) Chad is one of your students and this was the paper he handed in. Have participants use the essential questions and analyze Chad’s paper. Have them share out what they can say about curriculum, environment, instruction, and Chad’s learning. Gickling
46
For Example… How Does Chad Approach Alphabetizing?
3 4 2 1 Dog Cat Apple Ball Share that, like them, you were confused as to what Chad had done, so you asked Chad to explain what he wrote. Chad says, "That’s easy. The third word goes first, the fourth word goes next, the second word goes next, and the first word goes last." (Click the mouse to show each step.) Gickling
47
For Example… How Does Chad Approach Alphabetizing?
3 4 2 1 Dog Cat Apple Ball Most participants need a repeat of Chad’s thinking, so repeat it as on the first slide. Gickling
48
What Does This Tell Us About…
Curriculum How effective is the curriculum for Chad? Environment What are the environmental influences on Chad’s learning? Instruction What instructional methodology strengthens Chad’s learning? Have participants now reflect again on the essential questions and respond as to what they see happening with Chad’s work. Have them share out their insights on Chad. Point out the importance of not taking a student product at face value. Highlight how sophisticated Chad’s thinking is and that he is beyond the curriculum.
49
Use Your Case Examine the assessments you currently have about your case. What could the assessment data tell you about…? Curriculum Environment Instruction Student(s) Have participants use the information they brought today. Have them categorize the information as to assessments that help us understand curriculum, environment, instruction, and students. Have them record their ideas on the worksheet. Have participants share out which category has the most assessments and which has the least. Have participants share out one example of an assessment in each category and why they put it in that category.
50
Using Protocols to Define the Focus Area of Improvement
A Means to Collaboratively Analyze Assessments This section defines the usefulness and purpose of protocols.
51
What are Protocols? Tools for analysis Structured dialogue
Collaborative inquiry More than one perspective Reflective practice A protocol consists of agreed-upon guidelines for a conversation, and it is the existence of this structure, which everyone agrees to, that permits a certain kind of conversation to occur, often a kind of conversation people are not in the habit of having. Protocols are vehicles for building the skills and culture necessary for collaborative work. Thus, using protocols often allows groups to build trust by doing substantive work together.
52
Centers Using the premise of your "case," select the most appropriate center: Descriptive Review; Initial Line of Inquiry (Behavior); Initial Line of Inquiry (Academic); SIOP. Each group of participants (school- and district-based) will select a protocol based on the type of information they need for their case. Briefly review each of the protocols before participants self-select. This review will help them decide the best protocol for their case. Each center should have a designated facilitator who knows the protocol and can explain it in further detail. The center time should be broken into a more detailed overview done by a facilitator, followed by actual practice of the protocol. In the practice, have each group use their real case. Have them select a facilitator and other specified roles for the protocol. TAs should serve as gatekeepers to ensure the protocol is being followed.
53
A Sample Protocol for Examining Student Work
Descriptive Review Descriptive Review is an examining student work protocol.
54
Descriptive Review What does it look like? When would we use it?
Examination of a student product (e.g., writing sample, math assignment, etc.) Round Robin responses to selected questions (e.g., "Describe what you see.") When would we use it? C: Determining the next curriculum area E: Connecting the context & student work I: Determining next steps for instruction S: Having a deeper analysis of student learning Review this slide to describe it to participants. Note that this protocol is used with a single selection of student work. The work sample should be representative of the work related to the focus area. The strength of the protocol is the examination and analysis of student learning (what the student can do) and determining the next steps for instruction.
55
Descriptive Review What do you need? How does it work?
Facilitator to run the process Presenting teacher to provide the context of the student work & a focus for reflection A hard copy of the student work sample How does it work? Follow articulated steps Select key questions to ask for each round (one question per round) Each member of the group provides one response to the question (Round Robin fashion) (Can go around more than once for more responses) Review this slide to describe it to participants. The protocol is based on key questions that are selected either by the facilitator or the presenting teacher ahead of time. Each participant then responds to the questions in a pure Round Robin fashion. The presenting teacher does not participate in the rounds and takes notes on what is said. There is no Q&A dialogue during the rounds.
56
Descriptive Review Sample Timetable
Steps and times: Review of Process (5 minutes); Setting the Tone (15 minutes); Work is Presented with Context & Descriptive Rounds (30 minutes); Hearing from the Teacher (10 minutes); Reflecting. Review this slide to describe it to participants. The facilitator reviews how the process works and sets the tone for the protocol (e.g., that today we will reflect on the effectiveness of the instruction on paragraph development). The teacher then provides a brief context of the assignment, content, and the student work. The teacher avoids providing information on the student’s background, ability, etc. The facilitator directs two to five rounds of questions. The teacher then responds to what s/he heard from the group. Each member of the group discusses what they learned and what they got out of the process.
57
Descriptive Review Review the Process Setting the Tone
The facilitator provides the directions and timelines for the process. Setting the Tone The group reviews the intention of the process. The group agrees to the reflective process. Hide this slide and use it only for the descriptive review center.
58
Descriptive Review Work is Presented/Context Descriptive Rounds
Teacher puts the work out for the team to see and provides a brief introduction to the work. Descriptive Rounds Selection of rounds is based on type of work and focus of reflection. Each round builds on the previous one, seeking to deepen an appreciation for the instruction, task, and student learning. Hide this slide and use it only for the descriptive review center.
59
Descriptive Review Hearing from the Teacher Reflecting
Presenter has time to say what was heard. Reflecting The group reflects on the process. Each member highlights what was learned. Hide this slide and use it only for the descriptive review center.
60
Descriptions vs. Judgments
See, Hear, Touch Evidence-based Specific language Judgments Inferences Feelings Assumptions Hide this slide and use it only for the descriptive review center. The other concern we have with language is the use of judgment words vs. description words. To describe is to note exactly what you see and hear. A test for a description is to ask, "Did I see it? Is it an exact quote? To what degree will others agree they saw or heard the same thing?" Judgments are inferences. We use judgments based on our observations to create perceptions. Judgments are not always connected to "good" or "bad" feelings; instead they are connected to assumptions about what was seen or heard. Judgments are less accurate than descriptions and therefore can often mislead decisions and actions.
61
A Sample Protocol for Examining Behavior
Initial Line of Inquiry This is an inquiry protocol.
62
Initial Line of Inquiry
What does it look like? Facilitated dialogue focused on behavior and the context around behavior Structured responses to key questions using anecdotal and assessment data Develops a hypothesis for the focus area of improvement When would we use it? C Determining curriculum effects on behavior E Connecting environmental conditions to behavior I Determining instructional effects on behavior S Having a deeper analysis of student behavior Review this slide to describe it to participants. The intent of Initial Line of Inquiry is to organize perceptions and observations into a usable format in order to develop a hypothesis of why a certain behavior is occurring.
63
Initial Line of Inquiry
What do you need? Facilitator to run the process Team of people who Know the student Know functional analysis General observations Observational Based Assessments Overhead or chart paper How does it work? Follow articulated steps and key questions Record information on the format provided by protocol Facilitate a collaborative dialogue about the meaning of the observations Develop a hypothesis Review this slide to describe it to participants. The facilitator needs to be very structured and directive. Whenever possible hard data needs to provide evidence to support or refute perceptual statements that are made. People need to be careful of judgments.
64
Behaviors Exist in Context
Behaviors are context related Challenging behaviors result from unmet needs Effective supports come from an understanding of why a behavior occurs Hide this slide and use it only for the Initial Line of Inquiry center. People need to understand that behavior is contextual (cause and effect relationship). Refer to Curriculum, Environment, and Instruction.
65
The Anatomy of a Behavior
Event Automatic Thought Emotion/Feeling Behavior Response/Consequence Hide this slide and use it only for the Initial Line of Inquiry center. This diagrams how behavior actually occurs. Provide an adult example, such as: a parent told me that I was not being fair to their child. My thought process was that this parent is accusing me of an unethical act without all of the facts about their child's behavior. My feeling is anger and frustration. I tell the parent that this is my classroom and I have set rules and consequences that are fair and are the same for all children. The parent starts yelling at me that I don't know their child, and the parent complains to the principal.
66
Antecedents Behavior Consequences The ABCs of Behavior
Review this slide to describe it to participants. Through observing and collecting data on the antecedents of the behavior, the behavior itself, and the consequences (what happens after the behavior), we can develop a hypothesized reason or function for the behavior. By looking at the antecedents of behavior we can determine when a behavior is most likely and least likely to occur and under what circumstances. This also allows us to manipulate the environment in order to reduce the likelihood of the behavior occurring. Consequences either strengthen the response of the behavior or weaken it.
67
ABC Chart
Time | Antecedent | Behavior | Consequence
9:05 | Teacher gives class an independent writing assignment | X looks out window | Teacher prompts X to begin writing
9:10 | | X picks up pen and scribbles on page | Teacher walks away
9:17 | Teacher prompts X to stop scribbling and begin writing | X rips paper up and throws it on the floor | Teacher tells X to go to office
9:18 | | X stands up and goes to office | X stays in office until next period
Hide this slide and use it only for the Initial Line of Inquiry center. One source of helpful data is an observation scripted in an ABC fashion. It needs to be clarified that antecedents are the things that happen immediately before the behavior and consequences are things that happen immediately after the behavior (not necessarily just what is imposed as a reinforcement or punishment by the teacher).
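For teams that want to keep scripted observations in a consistent, analyzable form, a simple record type is one option. A minimal sketch assuming a Python workflow; the class and field names are hypothetical, not part of the protocol:

```python
from dataclasses import dataclass

@dataclass
class ABCEntry:
    """One scripted observation row: what happened immediately before and after."""
    time: str          # clock time of the observation
    antecedent: str    # what happened immediately BEFORE the behavior
    behavior: str      # the observable behavior itself
    consequence: str   # what happened immediately AFTER the behavior

# The 9:17 row of the chart above, captured as a record:
row = ABCEntry(
    time="9:17",
    antecedent="Teacher prompts X to stop scribbling and begin writing",
    behavior="X rips paper up and throws it on the floor",
    consequence="Teacher tells X to go to office",
)
```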
68
The Format for Initial Line of Inquiry
Strengths of Student: Slow Triggers (Setting Events) Fast Triggers (Antecedents) Problem Behavior Perceived Function Actual Consequence Review this slide to describe it to participants. This is the actual format of the Initial Line of Inquiry. The facilitator or a designated recorder will record on this format. The recorder should not be someone who has a lot to contribute about the student and the context of the behavior. Pennsylvania Department of Education, Initial Line of Inquiry
69
Consequences Consequence is the immediate natural response to a behavior Undesirable outcome (not likely to occur again) Desirable outcome (likely to occur again) Imposed consequences do not always yield the results we want Hide this slide and use it only for the Initial Line of Inquiry center. Clarify the true definition of consequences.
70
What is the Function of Behavior?
Avoidance What is avoided with the behavior? Gains What is gained or achieved with the behavior? Hide this slide and use it only for the Initial Line of Inquiry center. All behavior is intended to avoid or gain something. For example, throwing my textbook may be an attempt to avoid the work, because I know that I will be put into timeout when I throw things. Or cracking a joke during math may be my attempt to gain peer acceptance as classmates laugh at my jokes. Some behaviors actually serve more than one purpose, and the same type of behavior may be avoidance for one student and a gain for another.
71
Make a Statement About the Behavior
Three parts include: When {antecedent/trigger} occurs, The {student(s)} do/does {behavior of concern}, In order to {perceived function}. Review this slide to describe it to participants. The analysis of the observations and data will help us arrive at a hypothesis. These are the parts needed for the hypothesis statement. The strength of this protocol is the development of a hypothesis. Pennsylvania Department of Education, Initial Line of Inquiry
72
Hypothesis Statement:
When Jeff is given an independent writing assignment, he rips his paper up and throws it on the floor, in order to escape the writing task. Hide this slide and use it only for the Initial Line of Inquiry center. An example of a hypothesis.
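Since the statement always has the same three slots, it can be treated as a fill-in template. A minimal sketch; the function and parameter names are hypothetical:

```python
def hypothesis_statement(trigger: str, student: str, behavior: str, function: str) -> str:
    """Assemble the three-part hypothesis: trigger, behavior of concern, perceived function."""
    return f"When {trigger}, {student} {behavior}, in order to {function}."

# Reproduces the Jeff example above:
print(hypothesis_statement(
    trigger="Jeff is given an independent writing assignment",
    student="he",
    behavior="rips his paper up and throws it on the floor",
    function="escape the writing task",
))
```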
73
A Sample Protocol for Examining Academic Performance
Initial Line of Inquiry This is an inquiry protocol.
74
Initial Line of Inquiry
What does it look like? Facilitated dialogue focused on the context around academic achievement Structured responses to key questions using assessment data Develops a hypothesis for the focus area of improvement When would we use it? C Determining curriculum effects on achievement E Connecting environmental conditions to achievement I Determining instructional effects on achievement S Having a deeper analysis of student learning Review this slide to describe it to participants. The intent of Initial Line of Inquiry is to organize perceptions, data, and observations into a usable format in order to develop a hypothesis of why a certain academic concern is occurring.
75
Initial Line of Inquiry
What do you need? Facilitator to run the process Team of people who Know the student Know the curriculum & instruction General observations Curriculum Based Assessments Overhead or chart paper How does it work? Follow articulated steps and key questions Record information on the format provided by protocol Facilitate a collaborative dialogue about the meaning of the observations & assessments Develop a hypothesis Review this slide to describe it to participants. The facilitator needs to be very structured and directive. Whenever possible hard data needs to provide evidence to support or refute perceptual statements that are made. People need to be careful of judgments.
76
Learning Variables This protocol focuses on four learning variables:
Curricular Instructional Student Performance Environmental Hide this slide and use it only for the Initial Line of Inquiry center. The protocol uses an analysis of these learning variables. These are the same variables we have been discussing as context.
77
The Format for Initial Line of Inquiry
Curriculum Instruction Student Performance Environment Review this slide to describe it to participants. This is the actual format of the Initial Line of Inquiry. The facilitator or a designated recorder will record on this format. The recorder should not be someone who has a lot to contribute about the student and the context of the behavior. Pennsylvania Department of Education, Initial Line of Inquiry
78
Three Part Hypothesis What variables (factors) block learning?
How does the student learn? What strategies would support how the student learns? Review this slide to describe it to participants. The analysis of the observations and data will help us arrive at a hypothesis. These are the parts needed for the hypothesis statement. The strength of this protocol is the development of a hypothesis. Pennsylvania Department of Education, Initial Line of Inquiry
79
Hypothesis Statement When Jeff is given an independent writing assignment that requires at least five paragraphs in response to a prompt, he writes simple detail sentences that lack a main idea or a central theme, and therefore Jeff needs to organize his writing of main idea and detail sentences under a central theme by using a structured graphic organizer, such as TOWER. Hide this slide and use it only for the Initial Line of Inquiry center. An example of a hypothesis.
80
A Sample Protocol for Examining English Language Learners
SIOP
81
SIOP What does it look like? When would we use it? C E I S
Review this slide to describe it to participants.
82
SIOP What do you need? How does it work?
Review this slide to describe it to participants.
83
Insert SIOP Slides
84
Use Your Case Reflect on what you learned using this protocol. What can you say about…? Curriculum Environment Instruction Student(s) Have participants reflect on what was discussed and learned from the protocol about their case. Have them record the big ideas in the four areas on the worksheet.
85
Other Protocols to Consider
Action Reflection Protocol (Education Development Center, Newton, MA.) Case Story (Coalition for Essential Schools) Collaborative Analysis of Student Learning (CAStle) ASCD Consultancy (CES/Annenberg Institute National School Reform Faculty) Final Word Protocol (Coalition for Essential Schools) Lesson Study (Japan) Primary Language Record (Centre for Language in Primary Education, London) Slice (Joseph McDonald) Tuning Protocol These are some additional protocols that could be used.
86
Using Assessment to Develop a Hypothesis
This section discusses how to turn assessment analysis into a hypothesis about the focus area.
87
Develop a Hypothesis Develop a hypothesis to define a central focus
Examines the relationship among the context variables Determines why it is occurring Remind participants that the purpose of a hypothesis is to understand why something is occurring, i.e., to answer the question they developed. It helps define the relationship of the context variables.
88
Symptoms vs. Causes Symptoms Observable Details
A list of separate concerns Causes Inferred from behaviors Underlying reason/function Determined by grouping and analyzing objective, observable evidence To develop a hypothesis, we need to have a good understanding of symptoms vs. causes. Symptoms are outcomes of a concern. They are observable in nature. They are similar to things like fever, rash, "feeling tired," etc. Causes are the actual reasons for the symptoms. They are not observable, but are inferred from the symptoms. The cause needs to be identified in order to align the right strategy and support. However, this is not an exact science, so one caution is that causes must be developed from carefully analyzing objective, observable evidence, not from perceptions. For example, the above symptoms could be the result of any number of causes. Further probing and investigating is needed to narrow to the most likely cause. It becomes an educated guess in lieu of a provable "blood test."
89
Symptoms vs. Causes
Symptoms: lack of fluency; frequent word recognition errors; errors tend to be visual; mispronounces words; frequent spelling errors. Cause: ? An example of an academic concern. Have participants hypothesize a cause. Have them share out.
90
Symptoms vs. Causes
Symptoms: does not complete work; frequently moves around the room during academic tasks; acts out during teacher-directed lessons. Cause: ? An example of a behavior concern. Have participants hypothesize a cause. Have them share out.
91
Making a Statement About the Focus Area of Improvement
When {condition or trigger} occurs, {the student, class, school, etc.} does {focus area}, in order to {perceived function}. When there is an indoor recess, the students in grade 4 talk loudly and get out of their seats during lunch, in order to release energy. The format of a hypothesis has three parts: a condition, a behavior/action, and a function. This is an example. Note to participants that the behavior/action is/are the symptoms and the function is the cause.
92
So What Do We Want to Happen?
The desired outcome is developed by changing the current reality to a new one. Take a look at your hypothesis. What is it that you want to happen instead? Have participants briefly define what they see as the potential desired outcome.
93
Use Your Case Use your analysis to develop a hypothesis
When {condition or trigger} occurs, {the student, class, school, etc.} does {focus area}, in order to {perceived function}. If the participants used Initial Line of Inquiry, they would have already developed a hypothesis. Have them reflect on the hypothesis and record it on the worksheet. The other protocols did not develop a hypothesis. Have those participants use what they learned in the protocol to develop one and record it on the worksheet.
94
Establishing Baseline and Developing Monitoring Systems
Measuring Progress This section is about the recording of student progress.
95
What Do These Words Mean?
Have participants complete this activity using the "What Do These Words Mean?" handout. They need to first write a percentage next to each word independently (this should be a silent activity). Independently mark a percentage next to each word: Always, Occasionally, Rarely, Often, Sometimes, Frequently, Usually, A lot, Never, Once in a while.
96
What Do These Words Mean?
Now have participants compare their figures as a table group and record the range of percentages. Compare what you wrote with your table group. Record the range of percentages.
97
What Do These Words Mean?
Have participants share out a range of percentages for some of the listed words. You can use the pen feature in PowerPoint to write the percentages on the screen: Always, Occasionally, Rarely, Often, Sometimes, Frequently, Usually, A lot, Never, Once in a while.
98
What Do These Words Mean?
Have participants share ideas about what this means in terms of the language we use and the determination of student progress. What do these ranges tell us about the way we generally describe what we see?
99
Types of Vague Language
Nouns/Pronouns and Verbs “My students don’t listen.” Comparators “I want my students to do better on their quizzes.” Rule Words “I have to give C’s to students who have modified work.” Universal Qualifiers “All of the parents are upset about the report card.” Vague language is often used to describe people and observations. We need to avoid the use of vague language and ask focused questions to help others define their use of vague language. Vague language does not help us in supporting student progress with evidence. L. Lipton & B. Wellman, 2003
100
Baseline Baseline helps us determine the starting point.
101
Establish Baseline Establish baseline of current level of performance
Determine a starting point before anything is implemented Determine what the student(s) currently know(s) and is/are able to do Baseline is the exact starting point of a student’s progress. This is the point before we begin any strategies or interventions.
102
Baseline Data Baseline data needs to align with the focus area.
Clearly define the focus Observable (can be seen or heard) Measurable (can be counted) Specific (clear terms, no room for a judgment call) It is always numbers. Baseline must be defined in terms that are observable, measurable, and specific. It is always numbers.
103
Which Ones Are Observable, Measurable, & Specific?
Paying attention Aggressive behavior Out of seat Off task Throwing objects Homework completion Comprehension Spelling errors Phonemic awareness Math facts known Writing narrative Correct words per minute Have participants process these various examples of common focus areas and discuss which ones are observable, measurable, and specific. Have them process any that could be changed to make them observable, measurable, and specific. Have participants share out their thinking. How would you change vague and non-measurable terms to be observable, measurable, and specific?
104
Baseline Data A general rule of thumb is 3. Be sensitive to small changes over time. We generally take at least three "dip-sticks," or assess three times with the same exact assessment, in order to get an average for the starting point. (Everyone has off days and unusually "on" days, so a one-time assessment is not accurate.) We also need to use assessments that are sensitive to small changes. For example, CMTs are less sensitive due to the fact that they are given once a year. Although it can be helpful when assessing overall writing content, holistic scoring is not sensitive to changes in the use of the mechanics of writing.
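A minimal sketch of the rule of 3: average at least three repeated probes taken with the identical assessment. The probe values below are made-up illustrations, not data from the training:

```python
# Baseline = average of at least three "dip-sticks" with the same exact
# assessment, so one off day (or unusually "on" day) doesn't skew the start.
probes = [42, 38, 46]  # e.g., correct words per minute on three days (illustrative)

assert len(probes) >= 3, "rule of thumb: take at least three baseline probes"
baseline = sum(probes) / len(probes)
print(f"baseline = {baseline:.1f} correct words per minute")  # 42.0
```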
105
Which Assessments Provide Quality Baseline?
Holistic writing score Duration of a behavior or task Rubrics Grades Communication journal Frequency count of behavior or task Running record or DRA Anecdotal record Error analysis of student work ABC Chart Have participants process these various forms of assessment and discuss which ones are helpful for quality baseline. Have them process any that could be changed to make them better quality. Have participants share out their thinking.
106
Use Your Case Using the question and hypothesis you developed, develop a plan to establish baseline. What will be assessed? How? By whom? When? Have participants develop a plan to establish baseline on their worksheet. If they already have baseline, have them reflect on its quality, i.e., did it meet the standards just discussed? If it did, have them record their baseline data on the worksheet.
107
Setting Targets After baseline is established, we then need to develop a target.
108
Determine the Gap Determine the specific gap between current and desired performance Determine what needs to specifically change Establish what the student needs to learn Establish what conditions are needed to accelerate the learning To set a target we need to first have an understanding of what is expected for all students. What are the performance standards for all students?
109
The Achievement Gaps
[Graph: Demands/Expected Performance and Skills plotted against Years in School; the gap between expected performance and the student's baseline widens over time. KU-CRL.] We then compare the difference between what is expected and the current level of performance.
110
Set a Target Set a target for expected outcome and a timeframe for accomplishment Determine the grade-level performance standard Determine the rate of learning for most students in this area Use the gap analysis to determine a reasonable target and a specific timeframe for this target to be achieved Not only does the target need to consider where all students need to be, but also the rate of learning that occurs. These two factors will help us determine an effective target. In light of the demands of NCLB, we can no longer accept small progress; we need to examine larger gains, faster. How to do this will be further explored on Day 3.
111
Using Benchmarks Break down the time to meet a given goal into shorter increments Set a performance mark for each benchmark Build each benchmark on the previous one (interval monitoring) Use to articulate the rate of progress Benchmarks are an effective means of achieving a large target. Benchmarks incrementally build upon one another. They simply break the target into smaller timeframes with a specific performance mark. They help articulate the rate of progress that is expected.
112
The Goal Line
[Graph: Demands/Skills vs. Time. A goal is set at 16 weeks against the expectations for all students; benchmarks at 4, 6, and 8 weeks sit along the student's projected line of growth, rising from the baseline/current level of performance.] First Step: Expectations for all students need to be the first determination. It needs to be remembered that these expectations have their own rate of increase over time. This should be accounted for when setting the goal. Second Step: The student's current level of performance needs to be determined (whether a school-wide average or an individual student's). Third Step: The benchmarks need to be set by examining the rate of acquisition of learning that can be expected. This defines the line of projected growth.
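One way to make the projected line of growth concrete is to interpolate linearly from baseline to goal and read off the performance mark expected at each benchmark week. A minimal sketch; the baseline, goal, and benchmark weeks are illustrative assumptions, not values from the slide:

```python
def benchmark_marks(baseline: float, goal: float, total_weeks: int,
                    benchmark_weeks: list[int]) -> dict[int, float]:
    """Linear projected growth: the performance mark expected at each benchmark week."""
    weekly_gain = (goal - baseline) / total_weeks
    return {week: baseline + weekly_gain * week for week in benchmark_weeks}

# Illustrative: baseline of 40, goal of 80 at 16 weeks, benchmarks at weeks 4, 6, 8.
for week, mark in benchmark_marks(40, 80, 16, [4, 6, 8]).items():
    print(f"week {week}: expected {mark:.1f}")  # 50.0, 55.0, 60.0
```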
113
Use Your Case How will you get all this information?
Using your current information, discuss what is needed for you to develop a target goal and a set of benchmarks. Do you have baseline? Can you define the expected performance for all students? Can you assess the gap? How will you get all this information? Have participants examine the current information they have on their case and discuss what other information they still need to set a target. Have them make a plan to get this information.
114
Writing a Desired Outcome
Clearly define the outcome Observable (can be seen) Measurable (can be counted) Specific (clear terms, no room for a judgment call) May sometimes require smaller benchmarks When {condition} occurs, {the student} will {desired outcome} from {baseline} to {target} by {timeline}. The desired outcome needs to be written in a clear, specific, and measurable format (similar to baseline). We can take our hypothesis and rewrite it in these terms in order to establish exactly what will be achieved. We need to use our breakdown of baseline, target goals, and timeframes to expand what will be achieved into a clear way to determine progress.
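Like the hypothesis, the desired outcome is a fixed template. A minimal sketch that fills its five slots; the function name and the sample values are hypothetical continuations of the earlier Jeff example, not part of the EIP materials:

```python
def desired_outcome(condition: str, student: str, outcome: str,
                    baseline: str, target: str, timeline: str) -> str:
    """Fill the slide's template: condition, student, outcome, baseline, target, timeline."""
    return (f"When {condition} occurs, {student} will {outcome} "
            f"from {baseline} to {target} by {timeline}.")

print(desired_outcome(
    condition="an independent writing assignment",
    student="Jeff",
    outcome="increase main-idea sentences written per prompt",
    baseline="0",
    target="3",
    timeline="week 16",
))
```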
115
Use Your Case Using your current information, develop a desired outcome. When {condition} occurs, {the student} will {desired outcome} from {baseline} to {target} by {timeline}. What are you missing to complete this sentence? When will you obtain this? Have participants discuss what other information they still need to write a desired outcome. Have them make a plan to get this information.
116
Monitoring Systems Now that we have examined how to establish a baseline and a target goal, we need to examine how to develop a monitoring system as part of our action plan.
117
Develop a Monitoring System
Develop a monitoring system that aligns with the baseline data and a criterion for measuring the progress The first thing we need to remember is that monitoring assessments are exactly the same as the baseline (comparing apples to apples). We can use our target and benchmarks as criteria for measuring progress.
118
Monitoring vs. Evaluating
On-going and frequent Part of the implementation process Provides information for adjustments to the plan Evaluating A specific point in time A review of the implementation process Provides information for decisions on next steps There is a distinction between monitoring and evaluating. Monitoring is an on-going piece of the implementation process. It is done at least weekly and provides information to adjust the plan. Evaluating is more of a summative assessment of what actually occurred. It focuses on how the plan was implemented and the student outcomes of that plan. The information is used to make decisions on what needs to occur next.
119
How Will We Monitor? Determine who will monitor the progress
Determine the assessment process to use and connect it to the baseline Predetermine intervals for monitoring Determine a timeline for evaluation Daily Weekly We need to develop a monitoring system up front, not after a plan of action is already being implemented. Monitoring is something that occurs either weekly or daily.
120
Monitor the Progress Monitor the level and rate of progress of student learning Monitor on a frequent basis (daily or weekly) Student progress Implementation Integrity Check the rate of progress as it relates to the target goal line We use the monitoring information to determine the rate of progress as well as the progress itself. Is progress as expected, or slow? We also use monitoring to assess how well the plan is being implemented. Did we do what we said we were going to do, as often as we said we were going to do it?
121
Charting Progress
[Graph: Demands/Skills vs. Time, showing expectations for all students, the goal line, the student's current progress, and the baseline/current level of performance.] We use the monitoring assessment to calculate the likelihood of a target goal being reached. A potential decision-making rule: if three points fall below the goal line, a change should be considered (e.g., more time for the intervention, a more intense intervention, a different instructional strategy, etc.). The reason for this is that research shows 3 data points are a good indication of a trend. Therefore, if the trend is below your goal line, the student will likely not reach the goal on the current trend, even if he or she is improving. For example: What is the likelihood of this student achieving the goal?
122
Charting Progress
[Graph: same axes, showing expectations for all students, the goal line, a different student's progress, and the baseline/current level of performance.] How about this student?
123
Documenting Student Progress
Quantitative Information Graphing progress (e.g., attendance, homework completion, correct words per minute, etc.) Noting scores/levels and assessments used Stating student growth in terms of numbers Qualitative Information Narratives written in objective, observable language Noting the analysis of scores and the context (curriculum, instruction, and environment) Recording student progress can take many forms. The most important information for determining progress comes from quantitative information, the numbers. This information is the most factual in reporting progress. Graphing is the easiest way to demonstrate progress. Qualitative information can enhance our understanding of student progress, since there are many aspects of growth that cannot be captured by numbers. This information can elaborate for us the analysis of scores and the relationship the context has to the student's progress. However, this information tends to be more subjective in nature and cannot easily be used to defend growth. Qualitative information should not be used by itself when articulating student progress.
124
Tips for Documenting Student Progress
Use the same assessment process and tools for baseline and monitoring, sensitive to small changes over time. Report the information in the same format (e.g., graphing). Align the assessment with the intervention (e.g., DRA, OBA). Monitor student progress on a frequent and regular basis in order to make quality judgments about the progress. Documenting progress needs to be viewed as a science. The more objective and observable the assessment, the more reliable the tracking of the student progress. We need to be consistent in our assessment process and tools, as well as our reporting format. Graphing is the ideal way to demonstrate student progress because it is visual. Monitoring must happen frequently in order to have an accurate understanding of the progress. We cannot make good decisions about instruction based on outdated or infrequent information. Imagine if weather reports only gave temperature readings once a month. It would be similar to trying to decide what to wear today based on a temperature reading a month old.
125
Use Your Case Using your potential desired outcome, discuss a possible monitoring plan. What will be assessed? How? By whom? When? How frequently? How does it relate to the baseline? Have participants refer back to their plan for baseline and use that information to develop a monitoring system on their worksheet.
126
Our Cornerstone for Change
Reflective Practice Our Cornerstone for Change Reflective Practice is an extension of assessment.
127
Why Reflect? “If teachers are to become skilled at independently identifying and addressing idiosyncratic learning problems of their students, they must learn to reflect critically on student work as well as on their own teaching practices.” This quote sets the rationale for why we need to reflect. “Lifelines to the classroom: Designing support for beginning teachers”, by Kendyll Stansbury and Joy Zimmerman. Knowledge Brief, WestEd, 2000.
128
Evaluate the Student Progress and Plan
What changes occurred? Evaluate and analyze the overall progress by comparing the baseline data to the outcome data Examine the degree of implementation integrity of the plan Determine what changes occurred Use a decision guide to make adjustments and/or revisions to the plan The indicators for quality decision-making include the need to use reflective practice in order to enhance our practice.
129
What Reflective Educators Do
Commit to continuous improvement Assume responsibility for learning Demonstrate thinking skills for inquiry Take action that aligns with new understanding Educators who reflect commit to improving their practice, focus on learning, process through inquiry, and change practices based on their reflection. Reflective Practice to Improve Schools, J. York-Barr et al.
130
Reflection Cycle
[Diagram: a cycle of Collect Data From a Variety of Sources -> Analyze Data -> Evaluate Student Learning -> Draw Conclusions About the Impact of Teaching on Student Learning -> Modify Practice, and back to collecting data.] SDE BEST has developed a reflective cycle that is familiar to beginning teachers and mentors. This is similar to the Love cycle presented earlier today. BEST Training 2001
131
What Did We Change? Curriculum Environment Instruction
Curriculum: what we teach. Environment: context of learning. Instruction: how we teach. Student(s): outcomes of learning. Remember, it is how we change these that will influence change in student learning. We need to reflect on what we changed in order to improve. Adapted from Heartland Area Education Agency
132
Integrity Did we do what we said we would do?
Reasons why we tend not to follow through: Lack of a defined or appropriate focus The plan was not clearly defined or comprehensive enough to include appropriate strategies The skill levels needed to implement the plan were not adequate The right resources (time, money, personnel) were not supplied One of the aspects of reflective practice that we need to stress is implementation integrity. The degree to which we actually implement what we said we were going to implement is part of the accountability that we owe to our students. The research on percentages of implementation is very discouraging.
133
Measuring the Effectiveness of Implementation
Did we achieve our goal for student outcomes? Did we do what we said we were going to do to promote student success? How do we know this? Did we set a predetermined goal line? Did we monitor student progress towards this goal line? Did we examine why the goal was met or not met? When we reflect, these are some essential questions to consider to help us examine the level of implementation integrity we actually have.
134
Self-Reflection Dialogue how the protocol you used today will serve as reflective practice and as a means to ensure implementation integrity. Descriptive Review Initial Line of Inquiry SIOP
135
With Your Technical Assistant
Reflect how today’s information influences the process you have developed thus far. Review the previous dialogue about your school’s /district’s use of collegial support and family partnerships. Examine the various ways of teaming and determine how collegial support and family partnerships could potentially look for your school/district.
136
On Your Own… Select a protocol and try it with a small group.
Review today’s case and add any additional assessment data needed. Collect your baseline. Revise your hypothesis and desired outcome as needed. (i.e., complete the worksheet)
137
Bring with You Next Time
Bring your case back. (everything you brought today) Bring your baseline. Add any specific programs or strategies you are currently using to address this case. Be sure to bring the curriculum and sample lesson plans that relate to the focus.