Presentation transcript:

Using Assessments to Monitor and Evaluate Student Progress 25 Industrial Park Road, Middletown, CT 06457-1520 · (860) 632-1485 · ctserc.org

 Use high-quality assessment procedures to monitor the student’s progress on IEP goals and objectives in relation to the general education curriculum and setting demands.  Use a wide variety of qualitative and quantitative data  Develop monitoring systems embedded in the implementation of the IEP  Determine how monitoring will be used to evaluate student progress p. 77

 How are monitoring and evaluating distinct?  What are the essential characteristics of monitoring systems? p. 77

 Facilitator – someone to keep the group focused  Recorder 1 – someone to document the work of the group on the wall chart  Recorder 2 – someone to document the work of the group on paper

[Slide graphic] p. 78

“Assessment is a process of collecting data for the purpose of making decisions about individuals or groups and this decision-making role is the reason that assessment touches so many people’s lives.” Salvia & Ysseldyke (2001)

 Systematic process  Evaluation of effectiveness of instruction and implementation  Assessment of student progress  Means to track the rate of improvement (Albers, 2007) p. 79

 Assessment for Developing an IEP  Identification  Determination of specific gaps  Selection of specific instruction, accommodations, or modifications  Assessment of IEP Effectiveness  Determination of whether the IEP is having the desired impact  Examination of the IEP implementation fidelity  Adjustments in the instruction (Albers, 2007) p. 79

Monitoring  On-going and frequent  Part of the implementation process  Provide information for adjustments in plan Evaluating  A specific point in time  A review of the implementation process  Provide information for decisions on next steps p. 79

 Quantitative data (Numbers)  Defining the gap between expectations and current performance  Monitoring the progress and growth  Qualitative data (Descriptions)  Developing a focus area or the cause of a concern  Defining the context  Examining the implications of decisions p. 80

 Norm-referenced  Standardized or scripted  Comparison to a representative group  Bell curve ▪ WISC ▪ Woodcock-Johnson  Pros  Determines how we compare to our peers  Cons  Labels us  Does not relate to local curriculum  One-shot deal p. 80

 Criterion-referenced  Based on a specific skill area  Can be scripted, but not necessarily ▪ Brigance ▪ CMT/CAPT ▪ DRA  Pros  Determines specific skill area strengths and weaknesses  Connects to curriculum  Cons  Does not reflect daily lessons  One-shot deal p. 80

 Curriculum-based assessment  Based on specific curriculum  Closely connected to instruction ▪ Running record ▪ Writing samples ▪ Student products  Pros  Directly connects to curriculum and daily lessons  On-going  Cons  Consistency of the assessment procedure can vary p. 80

 Curriculum-based measurement  Based on local norms  Closely connected to specific interventions and accommodations ▪ Reading Fluency (correct words per minute)  Pros  Directly connects to specific interventions and accommodations  On-going  Standardized  Cons  Developing local norms takes time p. 80
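As a concrete illustration of the fluency measure above, here is a minimal Python sketch of scoring a timed oral reading probe as correct words per minute; the helper name and the 60-second probe length are illustrative assumptions, not something the slides specify.

```python
def correct_words_per_minute(words_read: int, errors: int, seconds: float = 60.0) -> float:
    """Score a timed oral reading probe as correct words per minute (CWPM).

    CWPM is conventionally (words read - errors), scaled to a one-minute
    rate. The function name and default probe length are assumptions.
    """
    return (words_read - errors) / (seconds / 60.0)

# Example: 142 words attempted with 6 errors in a 60-second probe.
print(correct_words_per_minute(142, 6))  # 136.0
```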

 Observation-based assessment  Based on observations of behavior/actions  Observable, measurable, specific ▪ Scripting ▪ Probing questions ▪ Specific counting ▪ tallying ▪ duration  Pros  Assesses actions beyond paper-pencil  Assesses context  Cons  Observer bias p. 80

 Record Review ("Heartland Area Education Agency 11", 2003)  Based on file reviews and permanent products  Examines patterns over time ▪ E.g. Cumulative Record, Student portfolio, Health Record  Pros  Provides information on patterns over time  Assists in getting information from past teachers  Cons  Can be subjective/highly interpretative  Can provide a biased perspective p. 80

 Interviews ("Heartland Area Education Agency 11", 2003)  Based on conversations, surveys, or observation checklists  Examines patterns in perceptions ▪ E.g. Student Interview, Family Interviews, Teacher behavior checklist  Pros  Provides patterns in observations  Assists in understanding the whole child  Cons  Can be subjective/highly interpretative  Can provide a biased perspective p. 80

 Measures outcomes  Establishes targets  Considering benchmarks set in general education and current student performance  Focuses on decision making to inform instruction  Uses multiple assessment measures  Uses frequent probes (at least monthly)  Graphs and analyzes data  Level of progress  Rate of progress p. 81

 Type of measurement  Accuracy  Frequency  Duration  Assessment tools that will be used p. 81

 When in small group activities, the student will write his idea and his peer’s idea on paper and underline the parts of his peer’s idea that he likes, 100% of the time, based on observations  Accuracy?  Frequency?  Duration?

 Given an a-b-c pattern, the student will use manipulatives to determine whether it is repeating or growing, scoring a 5/6 on a rubric measuring the use of the graphic organizer.  Accuracy?  Frequency?  Duration?

 When in lecture and provided a note-taking format, the student will record notes, as measured by check sheets and observations.  Accuracy?  Frequency?  Duration?

 Assessment process that will be used  Who will monitor the progress  Intervals for monitoring ▪ Daily ▪ Weekly ▪ Monthly p. 81

 Documentation of the level and rate of progress  E.g. graphing  Timeline for evaluation p. 81

 Establish a baseline of the current level of performance  Determine a starting point before anything is implemented  Determine what the student currently knows and is able to do p. 82

 Baseline data need to align with the focus area.  Clearly define the focus ▪ Observable (can be seen or heard) ▪ Measurable (can be counted) ▪ Specific (clear terms, no room for a judgment call)  Baseline data are always numbers. p. 82

 A general rule of thumb is to collect at least 3 data points.  The data should be sensitive to small changes over time. p. 82
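A minimal sketch of how this rule of thumb might be applied in Python, assuming the median is used to summarize the probes (a common convention, though the slides do not specify one):

```python
from statistics import median

def summarize_baseline(probes: list[float], minimum: int = 3) -> float:
    """Summarize repeated baseline probes into one starting point.

    Enforces the rule of thumb of at least 3 data points, and uses the
    median so a single unusual probe does not skew the baseline
    (choosing the median over the mean is an assumption).
    """
    if len(probes) < minimum:
        raise ValueError(f"need at least {minimum} probes, got {len(probes)}")
    return median(probes)

# Example: three 5-minute probes of multi-digit addition problems.
print(summarize_baseline([2, 4, 3]))  # 3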

Given multi-digit addition problems with regrouping, the student will accurately solve them…  What is an effective means to collect data on this objective?

# of multi-digit problems completed accurately in 5 minutes  Graph the results for each student (independently)  Set a performance criterion (as a table group) [Data table: student probe scores]
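Because the slide’s data table did not survive transcription, here is a matplotlib sketch of the graphing step with hypothetical probe scores standing in for the original data:

```python
import matplotlib.pyplot as plt

# Hypothetical probe scores; the original slide's data table was lost.
students = {
    "Student 1": [2, 3, 2, 4],  # problems correct in 5 minutes, per probe
    "Student 2": [0, 1, 1, 2],
}

for label, scores in students.items():
    plt.plot(range(1, len(scores) + 1), scores, marker="o", label=label)

plt.xlabel("Probe")
plt.ylabel("Multi-digit problems correct in 5 minutes")
plt.title("Baseline data")
plt.legend()
plt.show()
```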

 Review the goal/objectives you wrote  What is the assessment process for collecting baseline?  If you have the baseline already, what is it?

 Establish the expected performance level of all students  Establish the baseline for this student  Connect the line from the baseline to the expected performance for all students in one year  Determine the benchmark that could be achieved for this student in one year’s time p. 83
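The four steps above amount to drawing an aim line from the baseline to the expected performance level. A minimal Python sketch, assuming a 36-week school year (the slides do not state a length):

```python
def aim_line(baseline: float, expected: float, weeks: int = 36):
    """Connect the baseline to the expected performance over one year.

    Returns the implied weekly growth rate and the aim-line value for
    each week. The 36-week year is an illustrative assumption.
    """
    weekly_growth = (expected - baseline) / weeks
    return weekly_growth, [baseline + weekly_growth * w for w in range(weeks + 1)]

# Example: baseline of 3 correct problems, benchmark of 8 in one year.
rate, line = aim_line(baseline=3, expected=8)
print(f"required growth: {rate:.2f} problems per week")  # 0.14
```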

[Graph: Demands/Skills vs. Days, showing the Gap between Baseline and Expected Performance] p. 83

[Graph: Demands/Skills vs. Days, showing the Student’s Projected Line of Growth from Baseline to the Goal] p. 83

# of multi-digit problems completed accurately in 5 minutes  Benchmark: 8 correct problems in 5 minutes [Data table: student probe scores]

Draw a thick line on the benchmark

Draw a line that covers at least 3 points and intersects with the benchmark

Set a target based on this line

Where would you set the target?

 Look back at the performance criterion you set before the exercise  Compare it to the target you just set  What did you notice?  What new insights do you have?

 Quantitative Information  Graphing progress (e.g., attendance, homework completion, correct words per minute, etc.)  Noting scores/levels and assessments used  Stating student growth in terms of numbers  Qualitative Information  Narratives written in objective, observable language  Noting the analysis of scores and the context (curriculum, instruction, and environment) p. 84

 Monitor the level and rate of progress of student learning  Monitor on a frequent basis (daily or weekly) ▪ Student progress ▪ Implementation Integrity  Check for rate of progress as it relates to the target goal line
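One way to operationalize the last step, checking the rate of progress against the target goal line, is to count consecutive probes that fall below the aim line. The several-consecutive-points decision rule sketched here is a common convention, not something the slides prescribe:

```python
def consecutive_points_below(scores: list[float], aim: list[float]) -> int:
    """Count how many of the most recent probes fall below the aim line.

    A widely used (but here assumed) decision rule: several consecutive
    points below the aim line suggest adjusting the plan; several above
    suggest raising the goal.
    """
    count = 0
    for score, target in zip(reversed(scores), reversed(aim)):
        if score < target:
            count += 1
        else:
            break
    return count

# Example: six weekly probes against the first six aim-line values.
scores = [3, 3, 3, 4, 3, 3]
aim = [3.00, 3.14, 3.28, 3.42, 3.56, 3.70]
print(consecutive_points_below(scores, aim))  # 2
```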

[Graph: Demands/Skills vs. Days, comparing the Student’s Current Progress to the Baseline and Goal]

 Complete the graph for each student [Data table: student probe scores]

 Is this student making progress?  Is the rate of growth acceptable?  Is the implementation of the IEP working?  What are the potential factors that are creating this growth pattern?

 How does this rate of growth compare to what was expected?  Has this student met mastery (the benchmark)?  What are the potential factors that are creating this growth pattern?

 Trendlines can help monitor rate as well as level of progress  “Eyeball” – draw a line that covers at least three points  Excel – use the chart’s trendline (analysis) feature
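As a programmatic alternative to the Excel option, a trendline is an ordinary least-squares fit; a minimal numpy sketch with hypothetical weekly data:

```python
import numpy as np

# Hypothetical weekly probe scores for illustration.
weeks = np.arange(6)
scores = np.array([3, 3, 4, 4, 5, 6])

# Least-squares trendline: the programmatic analogue of Excel's
# trendline feature or the "eyeball" line described above.
slope, intercept = np.polyfit(weeks, scores, 1)
print(f"rate of progress: {slope:.2f} problems per week")  # 0.60
print(f"trendline: y = {slope:.2f}x + {intercept:.2f}")
```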

Try drawing a trendline

 For the goal/objectives you wrote, determine the monitoring process that will be used  Who will monitor the progress?  What assessment process will be used?  How often will data be collected? ▪ Daily ▪ Weekly ▪ Monthly  When will the data be evaluated? p. 85