Assessment & Reflective Practice: Our Cornerstone for Change
Connecticut State Department of Education, Division of Educational Programs and Services
25 Industrial Park Road, Middletown, CT 06457-1520 · (860) 632-1485
The Layout of Professional Development for EIP
Day 1 - Collaborative Strategic Decision-Making: Developing a process and framework
Day 2 - Assessment and Reflective Practice: Examining the use of assessment; identifying how reflective practice works
Day 3 - Instructional Repertoire: Building new ways to develop strategies focused on improved student outcomes
Central Themes
- Building a Collaborative Learning Community
- Using Strategic Decision-Making
- Building Capacity to Develop, Implement and Sustain an Effective Process
Objectives for Today
- To examine the use of protocols for analyzing student work in order to define a focus area for improvement;
- To develop effective monitoring systems that chart student progress from baseline to a specified target; and
- To define reflective practice and identify how it will improve implementation integrity, as well as enhance instructional practice.
Components of EIP
- Leadership
- Collegial & Family Partnerships
- Strategic Decision-Making
- Assessment & Reflective Practice
- Instructional Repertoire
- Accountability & Documentation
Lessons Learned
- Using assessment and reflection should result in a change in instructional practice.
- Assessments focus on environment, curriculum, and instruction, not just the student.
- Reflection is a process that focuses on how teachers can enhance their practice.
Indicators of a Quality Decision-Making Process
- Identify the focus area or concern
- Determine the desired outcome
- Generate alternative strategies
- Examine strategies for feasibility
- Develop a plan of action, including a monitoring system
- Implement & monitor student progress & the plan
- Evaluate student progress & the plan
What is Assessment?
The Purpose of Assessment
“Assessment is a process of collecting data for the purpose of making decisions about individuals or groups, and this decision-making role is the reason that assessment touches so many people’s lives.” (Salvia & Ysseldyke, 2001)
What is the Purpose for Assessment?
To make instructional decisions
What Makes Decision-Making Strategic?
Data to verify the perception of an issue:
- From: action based on a perception of an issue
- To: data-driven action (e.g., action based on SWIS data)
Characteristics of Assessment
- Functional (Effective, Useful)
- Relevant
- Direct
- Multidimensional
- Formative
- Frequent, Repeated
- Individually Focused
- Technically Adequate
When You Think “Assessment”
- What is the question that needs to be answered?
- What information do you intend to obtain from your assessment?
- What will you do to get the information?
- How will you use the information you get?
Phases of Collaborative Inquiry (Love, 2002)
- Collecting Data
- Analyzing Data
- Organizing Data-Driven Dialogue
- Framing the Question
- Drawing Conclusions, Taking Action
- Monitoring Results
What Data Do We Use?
Looking at numbers — quantitative data (numbers):
- Defining the gap between expectations and current performance
- Monitoring progress and growth
Moving beyond numbers — qualitative data (descriptions):
- Developing a focus area or the cause of a concern
- Defining the context
- Examining the implications of decisions
Testing vs. Assessment
Domains of Assessment (SIEC)
- Student(s)
- Instruction
- Environment
- Curriculum
Dimensions: context of learning, what we teach, how we teach, outcomes of learning.
(Adapted from Heartland Area Education Agency)
Domains of Assessment by Method: R (Review), I (Interview), O (Observe), T (Test), E (Examine Student Work)

C - Curriculum: Review - permanent products, district standards, lesson plans; Interview - teachers, curriculum specialists, administrators; Observe - implementation of standards, decisions on selection of content; Test - readability of texts; Examine - Standards in Practice, SLICE, Tuning Protocol

E - Environment: Review - school rules, handbooks, policies; Interview - teachers, administrators, parents, students; Observe - interaction patterns, environmental analysis; Test - observation-based assessments, classroom environment scales, checklists, etc.; Examine - Initial Line of Inquiry, Standards in Practice

I - Instruction: Review - permanent products; Interview - teachers, administrators, parents, students; Observe - implementation of CCT, teacher expectations, antecedents, conditions, consequences; Test - classroom environment scales, checklists, etc.; Examine - Initial Line of Inquiry, Descriptive Review, Lesson Study, Tuning Protocol

S - Student: Review - student records; Interview - teachers, administrators, parents, students; Observe - target area, dimensions & nature of the problem; Test - student performance, discrepancy between setting demands & performance; Examine - Initial Line of Inquiry, Descriptive Review
The Richness and Complexity of Student Assessment Data
(Assessment types arranged by specificity of information and rate of feedback)
- National/International Assessments - Are students performing optimally?
- Large-Scale Assessments - Are students meeting the state standards?
- Diagnostic Assessments - What are students’ cognitive strengths and needs?
- Student Report Cards - How are students performing in general?
- Performance Assessments - Can students apply and generalize what they’ve learned?
- Classroom Curriculum Unit Tests, Quizzes - Did students learn it?
- Formative Assessments - Are students learning it?
Frequency of administration ranges from annually (to students in selected grades), to as needed (usually once per year), once per curriculum unit, weekly, and daily.
Source: Cromley, A. (2000). Using Student Assessment Data: What Can We Learn from Schools? Policy Issues No. 6, North Central Regional Educational Laboratory.
Assessment & Reflective Practice (adapted from Ortiz, 1987; Horner, 1998; Sugai, 2001)
- All students in school: universal assessment
- Focused assessment: observational-based and curriculum-based assessment; lesson study; observation-feedback on instruction
- In-depth analysis, increased objectivity: focused assessments; reflective practice; examining student work; problem validation; formal & informal monitoring of student progress
A Key Factor for Assessment
In 2000, a Harvard study examined the issue of disproportionality in special education. Connecticut was cited as one of the states identified as in need of improvement in this area.
Using Assessment to Identify the Focus Area for Improvement
Identify the Focus Area for Improvement
What is happening?
- Frame a question in terms of the impact on student learning
- Examine the context by collecting and analyzing data
- Develop a hypothesis to define a central area of focus
Remember… We Need to Develop a Question
Frame a question in terms of the impact on student learning:
- Frames our thinking in terms of inquiry vs. judging
- Aligns our thinking to student learning
Examine the Context
Examine the context by collecting and analyzing data:
- Determine when, where, how long, with whom, and under what conditions
- Develop a rationale for the occurrence using data
- Use evidence to explain what we see as the reason for performance gaps
Domains of Assessment (SIEC)
- Student(s)
- Instruction
- Environment
- Curriculum
Dimensions: context of learning, what we teach, how we teach, outcomes of learning.
(Adapted from Heartland Area Education Agency)
Essential Questions to Analyze Curriculum
- What content standards does this address?
- What are the performance standards?
- What is the essential content?
- What is the level of expectation?
- How are the curricular standards and materials adapted to meet instructional level?
Essential Questions to Analyze Environment
- How are expectations clearly communicated?
- What are the task directions?
- What are the opportunities for student choice?
- What are the physical influences on the learning?
- What are the social/interpersonal influences on the learning?
- How do the student and teacher collaborate in the learning process?
Essential Questions to Analyze Instruction
- What is the amount of student engagement and relevant practice?
- Is there appropriate pacing?
- What teaching strategies are used?
- How are tasks organized for students?
- Is there an instructional match?
- How does the feedback support student learning?
Essential Questions to Analyze Student Performance
- What does the student know?
- What can the student do?
- What are the student’s strengths?
- What are the student’s interests?
- What is the instructional level?
- What learning strategies does the student use?
- How does the student organize information and approach new learning?
- How does the student self-monitor?
- What are the patterns in errors?
Essential Questions to Ask About Behavior
- When is the behavior most/least likely to occur?
- Where is the behavior most/least likely to occur?
- With whom is the behavior most/least likely to occur?
- What happens immediately before/after the behavior?
- What do others do when the behavior occurs?
- What other environmental conditions may contribute to the behavior?
Sources: Pennsylvania Department of Education, Initial Line of Inquiry; LaVigna, G. (2000). Behavioral Assessment and Advanced Support Strategies.
Using Protocols to Define the Focus Area of Improvement: A Means to Collaboratively Analyze Assessments
What are Protocols?
Tools for analysis that are characterized by:
- Structured dialogue
- Collaborative inquiry
- More than one perspective
- Reflective practice
The Purpose of Protocols
- Provide a safe environment to share and reflect with colleagues
- Give and receive feedback on our practices and their relationship to student learning
- Focus on student work/performance
- Make the most efficient use of our time
A Sample Protocol for Examining Student Work: Descriptive Review
Descriptive Review
What does it look like?
- Examination of a student product (e.g., writing sample, math assignment)
- Round-robin responses to selected questions (e.g., “Describe what you see.”)
When would we use it?
- C: Determining the next curriculum area
- E: Connecting the context & student work
- I: Determining next steps for instruction
- S: Having a deeper analysis of student learning
Descriptive Review
What do you need?
- Facilitator to run the process
- Presenting teacher to provide the context of the student work & a focus for reflection
- A hard copy of the student work sample
How does it work?
- Follow articulated steps
- Select key questions to ask for each round (one question per round)
- Each member of the group provides one response to the question, round-robin fashion (the group can go around more than once for more responses)
A Sample Protocol for Examining Behavior: Initial Line of Inquiry
Initial Line of Inquiry
What does it look like?
- Facilitated dialogue focused on behavior and the context around behavior
- Structured responses to key questions using anecdotal and assessment data
- Develops a hypothesis for the focus area of improvement
When would we use it?
- C: Determining curriculum effects on behavior
- E: Connecting environmental conditions to behavior
- I: Determining instructional effects on behavior
- S: Having a deeper analysis of student behavior
Initial Line of Inquiry
What do you need?
- Facilitator to run the process
- Team of people who know the student and know functional analysis
- General observations
- Observational-based assessments
- Overhead or chart paper
How does it work?
- Follow articulated steps and key questions
- Record information in the format provided by the protocol
- Facilitate a collaborative dialogue about the meaning of the observations
- Develop a hypothesis
A Sample Protocol for Examining Academic Performance: Initial Line of Inquiry
Initial Line of Inquiry
What does it look like?
- Facilitated dialogue focused on the context around academic achievement
- Structured responses to key questions using assessment data
- Develops a hypothesis for the focus area of improvement
When would we use it?
- C: Determining curriculum effects on achievement
- E: Connecting environmental conditions to achievement
- I: Determining instructional effects on achievement
- S: Having a deeper analysis of student learning
Initial Line of Inquiry
What do you need?
- Facilitator to run the process
- Team of people who know the student and know the curriculum & instruction
- General observations
- Curriculum-based assessments
- Overhead or chart paper
How does it work?
- Follow articulated steps and key questions
- Record information in the format provided by the protocol
- Facilitate a collaborative dialogue about the meaning of the observations & assessments
- Develop a hypothesis
Other Protocols to Consider
- Action Reflection Protocol (Education Development Center, Newton, MA)
- Case Story (Coalition of Essential Schools)
- Collaborative Analysis of Student Learning (CASL) (ASCD)
- Consultancy (CES/Annenberg Institute, National School Reform Faculty)
- Final Word Protocol (Coalition of Essential Schools)
- Lesson Study (Japan)
- Primary Language Record (Centre for Language in Primary Education, London)
- Slice (Joseph McDonald)
- Tuning Protocol
Using Assessment to Develop a Hypothesis
Develop a Hypothesis
Develop a hypothesis to define a central focus:
- Examines the relationships among the context variables
- Determines why this is happening
Symptoms vs. Causes
Symptoms:
- Observable details
- A list of separate concerns
Causes:
- Inferred from behaviors
- Underlying reason/function
- Determined by grouping and analyzing objective, observable evidence
Making a Statement About the Focus Area of Improvement
Template: When {condition or trigger} occurs, {the student, class, school, etc.} does {focus area}, in order to {perceived function}.
Example: When there is an indoor recess, the students in grade 4 talk loudly and get out of their seats during lunch, in order to release energy.
Establishing Baseline and Developing Monitoring Systems: Measuring Progress
Types of Vague Language (Lipton & Wellman, 2003)
- Nouns/pronouns and verbs: “My students don’t listen.”
- Comparators: “I want my students to do better on their quizzes.”
- Rule words: “I have to give C’s to students who have modified work.”
- Universal qualifiers: “All of the parents are upset about the report card.”
Baseline
Establish Baseline
Establish a baseline of the current level of performance:
- Determine a starting point before anything is implemented
- Determine what the student(s) currently know(s) and is/are able to do
Baseline Data
Baseline data need to align with the focus area. Clearly define the focus:
- Observable (can be seen or heard)
- Measurable (can be counted)
- Specific (clear terms, no room for a judgment call)
Baseline data are always numbers.
Baseline Data
- A general rule of thumb is at least 3 data points.
- Choose measures that are sensitive to small changes over time.
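As a minimal numeric sketch of the rule of thumb above: with at least three baseline probes, summarizing them with the median keeps one unusually good or bad day from skewing the starting point. The median convention and the function name here are our assumptions, not part of the EIP materials.

```python
from statistics import median

def establish_baseline(probes):
    """Summarize repeated baseline probes (e.g., correct words per minute
    measured on three separate days) with the median, which resists one
    unusually high or low probe better than the mean."""
    if len(probes) < 3:
        raise ValueError("rule of thumb: collect at least 3 baseline probes")
    return median(probes)
```

For example, probes of 38, 45, and 42 correct words per minute would give a baseline of 42.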
Setting Targets
Determine the Gap
Determine the specific gap between current and desired performance:
- Determine what specifically needs to change
- Establish what the student needs to learn
- Establish what conditions are needed to accelerate the learning
[Figure: The Achievement Gap (KU-CRL) — demands/skills plotted against years in school, showing the gap between baseline and expected performance.]
Set a Target
Set a target for the expected outcome and a timeframe for accomplishment:
- Determine the grade-level performance standard
- Determine the rate of learning for most students in this area
- Use the gap analysis to determine a reasonable target and a specific timeframe for this target to be achieved
Using Benchmarks
- Break down the time to meet a given goal into shorter increments
- Set a performance mark for each benchmark
- Build each benchmark on the previous one (interval monitoring)
- Use benchmarks to articulate the rate of progress
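The target-and-benchmark arithmetic described above can be sketched as follows. All names and numbers are illustrative assumptions for the sketch, not from the EIP materials.

```python
def weekly_growth_rate(baseline, target, total_weeks):
    """Rate of learning needed to close the gap between the baseline
    and the target within the given timeframe."""
    return (target - baseline) / total_weeks

def benchmark_marks(baseline, target, total_weeks, interval_weeks):
    """Performance mark expected at each monitoring interval, with
    each benchmark building on the previous one."""
    rate = weekly_growth_rate(baseline, target, total_weeks)
    return [baseline + rate * week
            for week in range(interval_weeks, total_weeks + 1, interval_weeks)]
```

With a baseline of 40 correct words per minute and a target of 60 over 16 weeks, the needed rate is 1.25 words per week, and 4-week benchmarks fall at 45, 50, 55, and 60.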
[Figure: The Goal Line — demands/skills plotted against time; from the baseline/current level of performance, the student's projected line of growth rises to the goal (expectations for all students), with benchmarks marked at 4-, 6-, and 8-week points along a 16-week timeline.]
Writing a Desired Outcome
Clearly define the outcome:
- Observable (can be seen)
- Measurable (can be counted)
- Specific (clear terms, no room for a judgment call)
- May sometimes require smaller benchmarks
Template: When {condition} occurs, {the student} will {desired outcome} from {baseline} to {target} by {timeline}.
Monitoring Systems
Develop a Monitoring System
Develop a monitoring system that aligns with the baseline data and a criterion for measuring the progress.
Monitoring vs. Evaluating
Monitoring:
- Ongoing and frequent
- Part of the implementation process
- Provides information for adjustments to the plan
Evaluating:
- A specific point in time
- A review of the implementation process
- Provides information for decisions on next steps
How Will We Monitor?
- Determine who will monitor the progress
- Determine the assessment process to use and connect it to the baseline
- Predetermine intervals for monitoring (e.g., daily or weekly)
- Determine a timeline for evaluation
Monitor the Progress
Monitor the level and rate of progress of student learning:
- Monitor on a frequent basis (daily or weekly) both student progress and implementation integrity
- Check the rate of progress as it relates to the target goal line
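Checking monitored performance against the goal line, as described above, can be sketched like this — a hypothetical helper that assumes a straight goal line from baseline to target; the function names and thresholds are ours, not from the EIP materials.

```python
def goal_line_value(baseline, target, total_weeks, week):
    """Performance expected at a given week if the student grows
    steadily from the baseline to the target."""
    return baseline + (target - baseline) * week / total_weeks

def on_track(observed, baseline, target, total_weeks, week):
    """True if monitored performance meets or exceeds the goal line;
    False signals the plan may need adjustment or revision."""
    return observed >= goal_line_value(baseline, target, total_weeks, week)
```

With a baseline of 40, a target of 60, and a 16-week timeframe, the goal line sits at 50 at week 8, so a monitored score of 46 would flag the plan for review.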
[Figure: Charting Progress — demands/skills plotted against time; the student's current progress charted against the goal line from the baseline/current level of performance toward the goal (expectations for all students).]
[Figure: Charting Progress — a second chart of the same type showing the student's progress over time.]
Documenting Student Progress
Quantitative information:
- Graphing progress (e.g., attendance, homework completion, correct words per minute)
- Noting scores/levels and assessments used
- Stating student growth in terms of numbers
Qualitative information:
- Narratives written in objective, observable language
- Noting the analysis of scores and the context (curriculum, instruction, and environment)
Tips for Documenting Student Progress
- Use the same assessment process and tools for baseline and monitoring.
- Choose measures that are sensitive to small changes over time.
- Report the information in the same format (e.g., graphing).
- Align the assessment with the intervention (e.g., DRA, OBA).
- Monitor student progress on a frequent and regular basis in order to make quality judgments about the progress.
Reflective Practice: Our Cornerstone for Change
Why Reflect?
“If teachers are to become skilled at independently identifying and addressing idiosyncratic learning problems of their students, they must learn to reflect critically on student work as well as on their own teaching practices.”
Stansbury, K., & Zimmerman, J. (2000). Lifelines to the Classroom: Designing Support for Beginning Teachers. Knowledge Brief, WestEd.
Evaluate the Student Progress and Plan
What changes occurred?
- Evaluate and analyze the overall progress by comparing the baseline data to the outcome data
- Examine the degree of implementation integrity of the plan
- Determine what changes occurred
- Use a decision guide to make adjustments and/or revisions to the plan
What Do Reflective Educators Do?
- Commit to continuous improvement
- Assume responsibility for learning
- Demonstrate thinking skills for inquiry
- Take action that aligns with new understanding
York-Barr, J., et al., Reflective Practice to Improve Schools
Reflection Cycle (BEST Training, 2001)
Collect data from a variety of sources → Analyze data → Draw conclusions about the impact of teaching on student learning → Evaluate student learning → Modify practice → (repeat)
What Do We Change?
- Student(s)
- Instruction
- Environment
- Curriculum
Dimensions: context of learning, what we teach, how we teach, outcomes of learning.
(Adapted from Heartland Area Education Agency)
Integrity
Did we do what we said we would do? Reasons why we tend not to follow through:
- Lack of a defined or appropriate focus
- The plan was not clearly defined, or not comprehensive enough to include appropriate strategies
- The skill levels needed to implement the plan were not adequate
- The right resources (time, money, personnel) were not supplied
Measuring the Effectiveness of Implementation
- Did we achieve our goal for student outcomes?
- Did we do what we said we were going to do to promote student success? How do we know this?
- Did we set a predetermined goal line?
- Did we monitor student progress towards this goal line?
- Did we examine why the goal was or was not met?
With Your Technical Assistant
- Reflect on how today’s information influences the process you have developed thus far.
- Review the previous dialogue about your school’s/district’s use of collegial support and family partnerships.
- Examine the various ways of teaming and determine how collegial support and family partnerships could potentially look for your school/district.
On Your Own…
1. Select a protocol and try it with a small group.
2. Review today’s content and add any additional assessment information needed to the case study. Collect baseline. Revise the hypothesis and desired outcome, if necessary. Utilize technical assistance support to complete the assessment worksheet for the case study.
Bring with You Next Time
- Bring the same case study and supplemental materials with you, including updated assessment information.
- Bring curriculum and sample lesson plans for the case that relate to the identified focus area in need of improvement.