
1 25 Industrial Park Road, Middletown, CT 06457-1520 · (860) 632-1485 ctserc.org

2
• Use high-quality assessment procedures to monitor the student’s progress on IEP goals and objectives in relation to general education curriculum and setting demands.
• Use a wide variety of qualitative and quantitative data.
• Develop monitoring systems embedded in implementation of the IEP.
• Determine how monitoring will be used to evaluate student progress.
p. 77

3
• How are monitoring and evaluating distinct?
• What are the essential characteristics of monitoring systems?
p. 77

4
• Facilitator – someone to keep the group focused
• Recorder 1 – someone to document the work of the group on the wall chart
• Recorder 2 – someone to document the work of the group on paper

5 p. 78

6

7 “Assessment is a process of collecting data for the purpose of making decisions about individuals or groups and this decision-making role is the reason that assessment touches so many people’s lives.” Salvia & Ysseldyke (2001)

8
• Systematic process
• Evaluation of effectiveness of instruction and implementation
• Assessment of student progress
• Means to track the rate of improvement (Albers, 2007)
p. 79

9
• Assessment for Developing an IEP (Albers, 2007)
▪ Identification
▪ Determination of specific gaps
▪ Selection of specific instruction, accommodations, or modifications
• Assessment of IEP Effectiveness
▪ Determination if the IEP is having the desired impact
▪ Examination of the IEP implementation fidelity
▪ Adjustments in the instruction (Albers, 2007)
p. 79

10
Monitoring
• On-going and frequent
• Part of the implementation process
• Provides information for adjustments in the plan
Evaluating
• A specific point in time
• A review of the implementation process
• Provides information for decisions on next steps
p. 79

11
• Quantitative data (numbers)
▪ Defining the gap between expectations and current performance
▪ Monitoring the progress and growth
• Qualitative data (descriptions)
▪ Developing a focus area or the cause of a concern
▪ Defining the context
▪ Examining the implications of decisions
p. 80

12

13
• Norm-referenced
▪ Standardized or scripted
▪ Comparison to a representative group
▪ Bell curve (e.g., WISC, Woodcock-Johnson)
• Pros
▪ Determines how we compare to our peers
• Cons
▪ Labels us
▪ Does not relate to local curriculum
▪ One-shot deal
p. 80

14
• Criterion-referenced
▪ Based on a specific skill area
▪ Can be scripted, but not necessarily (e.g., Brigance, CMT/CAPT, DRA)
• Pros
▪ Determines specific skill area strengths and weaknesses
▪ Connects to curriculum
• Cons
▪ Does not reflect daily lessons
▪ One-shot deal
p. 80

15
• Curriculum-based assessment
▪ Based on specific curriculum
▪ Closely connected to instruction (e.g., running records, writing samples, student products)
• Pros
▪ Directly connects to curriculum and daily lessons
▪ On-going
• Cons
▪ Consistency of assessment procedure
p. 80

16
• Curriculum-based measurement
▪ Based on local norms
▪ Closely connected to specific interventions and accommodations (e.g., reading fluency in correct words per minute)
• Pros
▪ Directly connects to specific interventions and accommodations
▪ On-going
▪ Standardized
• Cons
▪ Developing local norms takes time
p. 80
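
The presentation does not spell out how a fluency probe is scored, but a correct-words-per-minute measure is commonly computed as words read minus errors, divided by the probe length in minutes. A minimal sketch in Python, using hypothetical numbers:

```python
def correct_words_per_minute(words_read: int, errors: int, minutes: float) -> float:
    """Score a timed oral reading probe as correct words per minute (CWPM)."""
    return (words_read - errors) / minutes

# Hypothetical one-minute probe: 112 words attempted, 7 errors
print(correct_words_per_minute(112, 7, 1.0))  # 105.0 CWPM
```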

17
• Observation-based assessment
▪ Based on observations of behavior/actions
▪ Observable, measurable, specific
▪ Scripting
▪ Probing questions
▪ Specific counting (tallying, duration)
• Pros
▪ Assesses actions beyond paper-pencil
▪ Assesses context
• Cons
▪ Observer bias
p. 80

18
• Record Review ("Heartland Area Education Agency 11", 2003)
▪ Based on file reviews and permanent products
▪ Examines patterns over time (e.g., cumulative record, student portfolio, health record)
• Pros
▪ Provides information on patterns over time
▪ Assists in getting information from past teachers
• Cons
▪ Can be subjective/highly interpretative
▪ Can provide a biased perspective
p. 80

19
• Interviews ("Heartland Area Education Agency 11", 2003)
▪ Based on conversations, surveys, or observation checklists
▪ Examines patterns in perceptions (e.g., student interview, family interviews, teacher behavior checklist)
• Pros
▪ Provides patterns in observations
▪ Assists in understanding the whole child
• Cons
▪ Can be subjective/highly interpretative
▪ Can provide a biased perspective
p. 80

20
• Measures outcomes
• Establishes targets
▪ Considering benchmarks set in general education and current student performance
• Focuses on decision making to inform instruction
• Uses multiple assessment measures
• Uses frequent probes (at least monthly)
• Graphs and analyzes data
▪ Level of progress
▪ Rate of progress
p. 81

21
• Type of measurement
▪ Accuracy
▪ Frequency
▪ Duration
• Assessment tools that will be used
p. 81

22
When in small group activities, the student will write his idea and his peer’s idea on paper and underline the parts of his peer’s idea that he likes, 100% of the time, based on observations.
• Accuracy?
• Frequency?
• Duration?

23
Given an a-b-c pattern, the student will use manipulatives to determine if it is repeating or growing, scoring a 5/6 on a rubric measuring the use of the graphic organizer.
• Accuracy?
• Frequency?
• Duration?

24
When in lecture and provided a note-taking format, the student will record notes for at check sheets and observations.
• Accuracy?
• Frequency?
• Duration?

25
• Assessment process that will be used
• Who will monitor the progress
• Intervals for monitoring
▪ Daily
▪ Weekly
▪ Monthly
p. 81

26
• Documentation of the level and rate of progress (e.g., graphing)
• Timeline for evaluation
p. 81

27

28
• Establish baseline of current level of performance
• Determine a starting point before anything is implemented
• Determine what the student(s) currently know(s) and is able to do
p. 82

29
• Baseline data needs to align with the focus area.
• Clearly define the focus:
▪ Observable (can be seen or heard)
▪ Measurable (can be counted)
▪ Specific (clear terms, no room for a judgment call)
• Baseline data is always numeric.
p. 82

30
• A general rule of thumb is at least 3 data points.
• The measure should be sensitive to small changes over time.
p. 82

31
Given multi-digit addition problems with regrouping, the student will accurately solve them…
• What is an effective means to collect data on this objective?

32
# of multi-digit problems completed accurately in 5 minutes
• Graph the results for each student (independently)
• Set a performance criterion (as a table group)
Student 1: 3, 5, 4, 6
Student 2: 0, 0, 1, 1
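
The slide asks participants to graph these baseline probes by hand. A minimal sketch of the same plot in Python (matplotlib is an assumption; the presentation does not name a tool):

```python
import matplotlib.pyplot as plt

# Baseline probes: number of multi-digit problems completed accurately in 5 minutes
student_1 = [3, 5, 4, 6]
student_2 = [0, 0, 1, 1]
probes = list(range(1, len(student_1) + 1))

plt.plot(probes, student_1, marker="o", label="Student 1")
plt.plot(probes, student_2, marker="s", label="Student 2")
plt.xlabel("Baseline probe")
plt.ylabel("Problems correct in 5 minutes")
plt.xticks(probes)
plt.legend()
plt.title("Baseline data")
plt.show()
```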

33
• Review the goal/objectives you wrote
• What is the assessment process for collecting baseline?
• If you have the baseline already, what is it?

34
• Establish the expected performance level of all students
• Establish the baseline for this student
• Connect the line from the baseline to the expected performance for all students in one year
• Determine the benchmark that could be achieved for this student in one year’s time
p. 83
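
A minimal sketch of the arithmetic behind connecting the baseline to the expected performance level. The baseline, expected performance, and 36-week school year below are hypothetical values for illustration, not figures from the presentation:

```python
# Hypothetical values -- illustrative only, not taken from the presentation
baseline = 4                 # student's current level (problems correct in 5 minutes)
expected_performance = 12    # expected level for all students in one year
weeks = 36                   # assumed length of the school year in instructional weeks

# Slope of the goal (aim) line: expected growth per week
weekly_growth = (expected_performance - baseline) / weeks

def goal_line(week: int) -> float:
    """Value of the goal line `week` weeks after the baseline was taken."""
    return baseline + weekly_growth * week

print(f"Expected weekly growth: {weekly_growth:.2f} problems per week")
print(f"Benchmark at mid-year (week 18): {goal_line(18):.1f} problems")
```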

35 [Graph: Demands/Skills over Days, showing the gap between the student’s baseline and the expected performance] p. 83

36 [Graph: Demands/Skills over Days, showing the student’s projected line of growth from the baseline toward the goal] p. 83

37
# of multi-digit problems completed accurately in 5 minutes
• Benchmark: 8 correct problems in 5 minutes
Student 1: 3, 5, 4, 6
Student 2: 0, 0, 1, 1

38 Draw a thick line at the benchmark

39

40 Draw a line that covers at least 3 points and intersects with the benchmark

41

42 Set a target based on this line

43 Where would you set the target?

44

45
• Look back at the performance criterion you set before the exercise
• Compare it to the target you just set
• What did you notice?
• What new insights do you have?

46
• Quantitative Information
▪ Graphing progress (e.g., attendance, homework completion, correct words per minute, etc.)
▪ Noting scores/levels and assessments used
▪ Stating student growth in terms of numbers
• Qualitative Information
▪ Narratives written in objective, observable language
▪ Noting the analysis of scores and the context (curriculum, instruction, and environment)
p. 84

47
• Monitor the level and rate of progress of student learning
• Monitor on a frequent basis (daily or weekly)
▪ Student progress
▪ Implementation integrity
• Check the rate of progress as it relates to the target goal line

48 [Graph: Demands/Skills over Days, comparing the student’s current progress with the baseline and the goal line]

49
Student 1: 3, 5, 4, 6, 8, 9, 8, 7, 8, 6, 7, 9, 7
Student 2: 0, 0, 1, 1, 2, 3, 4, 6, 6, 7, 8, 7, 8
• Complete the graph for each student
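
A minimal sketch of the progress graph in Python (matplotlib is an assumption; the benchmark of 8 correct problems is taken from slide 37):

```python
import matplotlib.pyplot as plt

# Progress-monitoring probes (problems correct in 5 minutes), from slide 49
student_1 = [3, 5, 4, 6, 8, 9, 8, 7, 8, 6, 7, 9, 7]
student_2 = [0, 0, 1, 1, 2, 3, 4, 6, 6, 7, 8, 7, 8]
probes = list(range(1, len(student_1) + 1))
benchmark = 8  # correct problems in 5 minutes, from slide 37

plt.plot(probes, student_1, marker="o", label="Student 1")
plt.plot(probes, student_2, marker="s", label="Student 2")
plt.axhline(benchmark, color="black", linewidth=3, label="Benchmark")
plt.xlabel("Probe")
plt.ylabel("Problems correct in 5 minutes")
plt.legend()
plt.title("Progress monitoring")
plt.show()
```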

50

51
• Is this student making progress?
• Is the rate of growth acceptable?
• Is the implementation of the IEP working?
• What are the potential factors creating this growth pattern?

52

53
• How does this rate of growth compare to what was expected?
• Has this student met mastery (the benchmark)?
• What are the potential factors creating this growth pattern?

54
• Trendlines can help monitor rate as well as level of progress
• “Eyeball” method – draw a line that covers at least three points
• Excel – Analysis → Trendlines
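
Beyond eyeballing or Excel, the same trendline can be computed as a least-squares fit. A minimal sketch in Python (numpy is an assumption, not named in the source), using Student 2’s data from slide 49:

```python
import numpy as np

# Progress-monitoring scores for Student 2 (from slide 49)
scores = np.array([0, 0, 1, 1, 2, 3, 4, 6, 6, 7, 8, 7, 8])
probes = np.arange(1, len(scores) + 1)

# Least-squares linear trendline: the slope is the rate of progress per probe
slope, intercept = np.polyfit(probes, scores, 1)
print(f"Rate of progress: {slope:.2f} problems per probe")
print(f"Trendline value at the last probe: {slope * probes[-1] + intercept:.1f}")
```

Comparing this slope with the slope of the goal line (slide 34) shows whether the student’s rate of growth is on track to close the gap.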

55

56 Try drawing a trendline

57
For the goal/objectives you wrote, determine the monitoring process that will be used:
• Who will monitor the progress?
• What assessment process will be used?
• How often will data be collected?
▪ Daily
▪ Weekly
▪ Monthly
• When will the data be evaluated?
p. 85

58

