ASSESSMENT
Special Education
The term "special education" means specially designed instruction, at no cost to parents, to meet the unique needs of a child with a disability (Sec. 1400). Services are provided in response to a child's needs, not categorically.
Special Education Is Problem Solving
"Special education exists because all general education programs fail to educate effectively some portion of students assigned to those classrooms" (Deno, 1989). Special education seeks to solve the problem of students who fail to succeed in the mainstream.
Special Education: Underlying Assumptions
Special education programs are a problem-solving component of the school system whose function is to identify and serve individuals whose performance is significantly discrepant from their peers (Stan Deno).
Disabilities as Performance Discrepancies
One way to define disabilities is to specify the difference between the performance required of the individual in a given situation and the performance actually achieved.
Disabilities as Performance Discrepancies (cont'd.)
Performance discrepancies are the disabilities that must be overcome if an individual is to be perceived as successful.
Defining Assessment
Within the context of the problem-solving model, assessment becomes "a tool for improving educational outcomes for children" because it provides us with the information to modify instruction and set appropriate goals.
Importance of Valid, Reliable Assessment and Results
Historical overrepresentation of minority groups in SpEd:

Race/Ethnicity                  % Student Population   % of Group Qualifying for SpEd (ages 6-21)   Most Common Disability (after SLD)
American Indian/Alaska Native   1.2                    14.1                                         SL
Asian/Pacific Islander          4.5                    4.6                                          SL
Hispanic                        19.2                   8.4                                          SL
Black                           17.3                   12.6                                         SL, MR, ED
White                           57.9                   8.8                                          SL

Note: All statistics from the National Center for Education Statistics, 2004.
Importance of Valid, Reliable Assessment and Results (cont'd.)
The biggest factor in qualification for services is referral for assessment.
- What factors might make some students qualify at higher rates?
- Which students who need extra support might not be referred?
New approaches to qualification (i.e., RTI or a multi-tiered approach) attempt to reduce overrepresentation by providing support to students early, prior to assessment for SpEd.
Need for Purposeful Assessment
In the context of the school, assessment should...
- Be linked to a purpose
- Address specific questions about student knowledge and skills
- Provide data that support instructional design and modification
- Support increased student outcomes
- Utilize a balanced approach (i.e., multiple methods)
Assessment Review
Formative (today's focus)
- Occurs throughout instruction (e.g., screening, diagnostic test, progress monitoring)
- Provides information about student performance relative to instructional goals
- Allows the teacher to determine whether instruction is effective and to make changes that improve outcomes
Summative
- Measures the result of instruction
- Occurs at the end of a unit or year (e.g., unit/chapter test, state assessment)
- Provides a picture of whether students met instructional goals
Formative Assessment Activities in Math
- Initial math assessment: determining placement and appropriate instruction
- Progress monitoring: determining growth toward goals
- Mastery: determining mastery of skills as students move through the scope and sequence
- Instructional error analysis: determining error patterns during instruction and remediating them
Curriculum-Based Assessment (CBA)
CBA is a type of formative assessment:
- Integrally linked to the curriculum (as an alternative to standardized testing)
- Based on a student's ongoing performance
- Supports teachers in making data-based instructional decisions
CBA & Curriculum-Based Measurement (CBM):
- CBM is a more specific category of CBA
- CBM generally refers to tools with established technical adequacy and standardized administration, and usually uses norms
Types of CBA
- Survey CBA
- Focused CBA: untimed or timed
Survey CBA
- A test that measures a wide span of concepts, knowledge, and skills
- Focuses on several mathematics standards
- Tests students for placement in the math curriculum and in an instructional group
Untimed Focused CBA
- Measures a narrower span of skills than a survey CBA; assesses a narrow skill in greater depth
- No time limit
- Uses: test for placement, evaluate mastery of lesson objectives, check for maintenance of a skill
Timed Focused CBA (a.k.a. Probe)
- Measures a narrower span of skills than a survey CBA
- Set time limit
- Uses: test for placement, evaluate mastery of lesson objectives, check for maintenance of a skill
- Used to examine fluency with a focused skill
- Score is sensitive to small changes in performance
CBA Application
- Which type of CBA would you use to check whether a student met an IEP goal?
- Why is it useful to know the different types of assessment?
- How does knowing the different types of CBA impact your teaching?
Initial Math Assessment
Referenced to a typical or specific curriculum.
- Survey CBA tests: determine the approximate developmental level of skills; conduct an initial error analysis
- Diagnostic or specific-level tests: focus on determining placement into the scope and sequence
- Fact pretests: focus on determining specific fact weaknesses and placement into a fact program
Survey-Level Tests
Use the placement test from the program, design your own based on grade level, or use the placement tests from the course website. Administer the test to the group.
Survey CBA (K-1): Beginning Math Assessment
- Counting (rote to 100; skip counting by 10s, 5s, and 2s; counting from a number other than 1)
- Numeral identification (1-99)
- Numeral writing (1-99)
- Number sense (quantity comparison, rational counting)
- Operations: addition/subtraction with no renaming
- Addition/subtraction story problems
Survey CBA (Intermediate): Grades 2 and Up
- Counting by 1s and several skip-counting series
- Numeral identification (to millions)
- Numeral writing (to millions)
- Operations: addition/subtraction/multiplication/division
- Problem solving: addition/subtraction/multiplication/division
Survey CBA (Upper Level): Fractions, Decimals, and Percents
For students working with fractions, decimals, and percents (roughly, grades 6 and up):
- Reading and writing numbers
- Rounding decimals
- Identifying and manipulating fractions
- Identifying and finding percent values
- Decimal, fraction, and percent conversions
- Operations with decimals and fractions
- Single- and multi-step story problems using fractions and decimals
Not Sure What Level?
Administer the last three addition/subtraction problems of the lower-level assessment:

   24     63     57
 + 32    + 5   - 35

Did the student use the correct strategy and procedure (fact errors are OK)? If so, administer the higher-level assessment.
Note: Have both assessments ready just in case.
Administration: Things to Consider
Rapport
- Take a few moments to introduce yourself.
- Briefly explain why you're working with the student.
Materials
- Be prepared (extra pencils, calculator, manipulatives, number line, etc.).
Reward
- Ask the teacher what type of reward might be OK.
- Suggested rewards: schoolwide behavior cards, stickers, high-fives, etc.
Administration
Counting
- Read the directions.
- Test each item.
Numeral Identification / Writing
- Use the student worksheet.
- Use the stopping criterion of 5 consecutive errors (see the sketch below).
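A minimal sketch of how the 5-consecutive-errors stopping criterion could be checked during scoring; the function name and the list-of-booleans representation are illustrative conveniences, not part of any published protocol:

```python
def should_stop(responses, max_consecutive_errors=5):
    """Return True once the student has made `max_consecutive_errors`
    errors in a row. `responses` holds one boolean per item, in the
    order administered; True means the item was answered correctly."""
    streak = 0
    for correct in responses:
        streak = 0 if correct else streak + 1
        if streak >= max_consecutive_errors:
            return True
    return False

# A run of five straight errors triggers the stopping criterion.
print(should_stop([True, False, False, False, False, False]))  # True
```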
Administration: Operations
Beginning level
- Administer all items, using the stopping criteria.
Intermediate and Upper Level
- Administer each operation, using the stopping criteria for each operation.
Modifications for operations
- Counters (concrete or representational)
- Prompt renaming if needed; note any prompts used during the assessment.
- Calculator (only after the student first works the problems)
Note: In your report, indicate when the student used a calculator to work each problem.
Administration: Problem Solving
Beginning level
- Read each problem to the student.
Intermediate and Upper Level
- Check that the student can read each problem, or read the problem to the student.
Calculators or fact tables are OK, since you're testing problem solving, not computation.
Scoring
Counting
- Record the highest number correctly counted.
Numeral Identification and Writing
- Mark correct items with a plus (+).
- For incorrect responses, record the error (what the student said, for error analysis) or NR for no response.
- Use the stopping criteria for numeral identification and writing; leave items not tested blank.
Highlighting (for all parts)
- Highlight missed items on the record sheet.
- Use the highlighted items after the test for easy visual analysis.
Scoring Operations and Story Problems
On the student's work
- Mark each correct problem (+ or C).
- Circle each incorrect problem (or incorrect part of a problem) and write in the correct answer.
- Code the types of errors based on your error analysis.
On the summary sheet
- Note the strategies used.
- For the problem-solving section, complete the score summary on the administration guide.
Survey-Level Tests (cont'd.)
- Summarize group performance using the data sheet.
- Evaluate errors and identify skill areas where students are having trouble.
- If necessary, administer another survey-level test.
Diagnostic Tests
- Using data from the survey-level test, determine the student's current functioning across several skills.
- Use the following decision rules to decide which items to put on the diagnostic assessment.
Decision Rules
Did the student do her/his best work on the survey-level test? Were there distractions in the testing environment, or was the student unwilling to try hard for you (i.e., are the errors on the test "can't" errors or "won't" errors)? If you believe the results represent the student's best effort, then identify the error type (i.e., fact, component, or strategy).
Decision Rules (cont'd.)
- If the student made component or strategy errors on a problem type, plan on including that problem type on your diagnostic test.
- For each problem type you decide to put on the diagnostic test, go to the scope and sequence chart in the DI Math text and select at least two earlier skills the student should have mastered and two later skills you believe the student has not mastered (for goal setting).
Decision Rules (cont'd.)
- Identify any unique preskills that you believe the student may not have mastered and include these on the diagnostic assessment.
- Design three questions for each of the skills you have decided to test. You may select questions directly from the DI Math text.
- Write the questions on the math summary chart.
Diagnostic Assessment
- Design your diagnostic assessment using the math summary chart.
- If students are young, most of your questions will be oral; if questions are oral, you will need to design a data recording sheet.
- Administer the assessment, record the data, and conduct an error analysis.
Fact Pretests
Students may start at various sets: students who know few facts would start at Set A, while students who know more facts would begin at later points. To determine the set at which a student should begin, administer a written pretest that includes the 100 basic facts.
Available online at http://depts.washington.edu/facts/
Fact Pretests (cont'd.)
Allow students 2 minutes, instructing them to work as many problems as they can. Use the following guidelines to place students into the sequence (see the sketch after this list):
- 20 or more facts answered correctly: start at Set G.
- 30 or more facts answered correctly: start at Set M.
- 45 or more facts answered correctly: start at Set R.
- 60 or more facts answered correctly: the student probably need not be placed in a fact program for that type of fact.
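Expressed as code, the placement guidelines collapse to a threshold lookup. A minimal sketch; the function name and string labels are illustrative, while the cutoffs come directly from the list above:

```python
def fact_pretest_placement(facts_correct: int) -> str:
    """Map the number of facts answered correctly on the 2-minute
    pretest to a starting point in the fact sequence."""
    if facts_correct >= 60:
        return "no fact program needed"
    if facts_correct >= 45:
        return "Set R"
    if facts_correct >= 30:
        return "Set M"
    if facts_correct >= 20:
        return "Set G"
    return "Set A"  # students who know few facts start at the beginning

print(fact_pretest_placement(34))  # Set M
```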
Suggestions for Administration
Day One
- Administer the survey-level tests.
Day Two
- Analyze the survey-level test and develop the diagnostic test.
- Administer the fact pretest.
Day Three
- Administer the diagnostic test.
Guidelines for a Structured Assessment Situation
- Have materials organized and ready to use.
- Ask the child to sit next to you: on your right if right-handed, on your left if left-handed.
- Put the student(s) at ease before testing.
- Provide motivation for working hard (free time, stickers, stars, etc.).
Guidelines for a Structured Assessment Situation (cont'd.)
- Describe the purpose of testing (to determine what the student knows and what they need to learn).
- Give clear directions, then give the child the test.
- Record student responses so that the student doesn't see them.
- Follow the testing procedures accurately.
- Reinforce good effort, even when the student is performing poorly.
Guidelines for a Structured Assessment Situation (cont'd.)
- Do not allow facial gestures or verbal comments that tell the student he/she gave a wrong answer.
- Do not tell answers or give hints; you are testing, not teaching.
- If the student is unable to read the story problems, you may read the words to her/him.
Guidelines for a Structured Assessment Situation (cont'd.)
- You may give prompts after recording the student's initial response to learn more about the conditions under which the student can perform the task.
- Record as much information as possible, and record it accurately.
Guidelines for a Structured Assessment Situation (cont'd.)
- Stop when the student becomes obviously frustrated.
- Thank the student for working with you, and give the student a sticker, verbal praise, or whatever reward you set up earlier.
Questions about the assignment?
CBA, Progress Monitoring, & Benchmarking
Benchmarking: using a timed focused CBA three times a year (fall, winter, and spring) to gauge student progress.
- Can examine student performance relative to benchmark goals or peer performance.
- Usually consists of giving three probes and taking the median score.
- The median acts as a more reliable measure of student performance than any single probe (see the sketch below).
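A minimal sketch of the median-of-three benchmark score, assuming each probe yields a numeric score (the function and variable names are illustrative):

```python
from statistics import median

def benchmark_score(probe_scores):
    """Return the median of the three probe scores from a benchmarking
    session; the median damps the effect of one unusually good or bad probe."""
    assert len(probe_scores) == 3, "benchmarking uses three probes"
    return median(probe_scores)

# Fall benchmarking: three probes, median taken as the score of record.
print(benchmark_score([31, 24, 28]))  # 28
```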
CBA, Progress Monitoring, & Benchmarking cont'd.
Progress monitoring: using a timed focused CBA on a predetermined schedule (e.g., weekly, biweekly) to monitor increases or decreases in student performance.
- Consists of giving one probe at each point in the schedule; the score acts as an indicator of student performance.
- Consistent administration allows the teacher to monitor changes in student performance.
- Scores can be used to identify the need for an instructional change or intervention (see the sketch below).
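One common way to turn a run of progress-monitoring scores into a decision signal is to fit a trend line. A minimal sketch, assuming weekly digits-correct scores and an ordinary least-squares slope; the function name, sample data, and choice of slope as the signal are illustrative, not from the slides:

```python
def weekly_trend(scores):
    """Least-squares slope of scores over equally spaced weeks:
    positive means performance is improving week to week."""
    n = len(scores)
    x_mean = (n - 1) / 2
    y_mean = sum(scores) / n
    num = sum((x - x_mean) * (y - y_mean) for x, y in enumerate(scores))
    den = sum((x - x_mean) ** 2 for x in range(n))
    return num / den

scores = [18, 19, 18, 20, 22, 23]      # digits correct on each weekly probe
print(round(weekly_trend(scores), 2))  # ~1.03 digits correct gained per week
```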
Progress Monitoring Example
Web-Based Data Management Systems for Mathematics
AIMSweb
- Provides access to probes for grades K-8
- Lists norms for grades K-8 (most valid for K-6)
- Records score history
- Differentiates between benchmarking and progress monitoring
- Graphs scores and integrates intervention lines
- Creates classroom and individual reports
- Option to monitor RTI cases
Others? EasyCBM and DIBELS math measures are in development.
Sample probe
Scoring Probes
There are multiple methods of scoring probes:
- Traditional method: number of problems answered correctly.
- Digits correct: score digits correct in just the answer, or in both the answer and the work shown by the student (i.e., process and answer).
Scoring with digits correct provides a more sensitive picture of student performance: small week-to-week changes can be seen as students increase the number of digits correct on each probe (see the sketch below).
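A minimal sketch of answer-only digits-correct scoring, assuming digits are aligned by place value from the right (the function name and example problem are illustrative):

```python
def digits_correct(student_answer: str, correct_answer: str) -> int:
    """Count digits in the student's answer that match the correct
    answer, aligning digit positions from the right (ones place first)."""
    reversed_pairs = zip(student_answer[::-1], correct_answer[::-1])
    return sum(1 for student_digit, true_digit in reversed_pairs
               if student_digit == true_digit)

# 345 + 289 = 634. A student who writes 624 earns 2 of 3 possible digits.
print(digits_correct("624", "634"))  # 2
```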
Scored probe
Measuring Student Performance
From student performance data, we can look at measurement in three ways (see the sketch below):
- Percentage: the most common method of measuring student performance; usually reported relative to some criterion.
- Rate: provides information on both accuracy and fluency with a skill; usually displayed on a line graph to show changes in rate over time.
- Error analysis: examining the types of errors a student makes to support instructional decisions.
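A minimal sketch of the percentage and rate calculations, assuming a timed probe scored in digits correct (the function names and sample numbers are illustrative):

```python
def percentage_correct(correct: int, attempted: int) -> float:
    """Percent of attempted items answered correctly."""
    return 100.0 * correct / attempted

def rate_per_minute(digits_correct: int, minutes: float) -> float:
    """Fluency expressed as digits correct per minute."""
    return digits_correct / minutes

# A 2-minute probe: 28 digits correct out of 35 digits attempted.
print(percentage_correct(28, 35))  # 80.0
print(rate_per_minute(28, 2.0))    # 14.0
```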