1
ASSESSING STUDENT ACHIEVEMENT: Using Multiple Measures
Prepared by Dean Gilbert, Science Consultant, Los Angeles County Office of Education
2
GOOD ASSESSMENT DATA
- Tell what students know and are able to do
- Provide information for making instructional decisions
- Come in many forms
3
DIFFERENT KINDS OF ASSESSMENT
- Give you different information
- Lead to different learning
- Meet the needs of different learners
4
NO ONE KIND OF ASSESSMENT…
- Is adequate and sufficient, in and of itself, to tell you what students know and are able to do OR to provide information for making instructional decisions
- Satisfies a sound accountability system
5
MULTIPLE MEASURES NEEDED IN THE CLASSROOM
- Multiple Choice: Regular, Enhanced, Justified
- Open-Ended Prompts: Essay, Short Answer, Constructed Response Investigations (CRI)
- Performance Tasks
7
SAMPLE MULTIPLE CHOICE
ON AN INDIVIDUAL BASIS, TAKE THE MULTIPLE CHOICE SAMPLE TEST FOR:
- Traditional Multiple Choice
- Enhanced Multiple Choice
- Justified Multiple Choice
AFTER COMPLETING THE SAMPLE TESTS, DISCUSS THE PROS AND CONS WITH YOUR COLLEAGUES.
8
CHARACTERISTICS OF GOOD MULTIPLE CHOICE ITEMS
- Present a self-contained question or problem in the stem
- Contain as much of the item’s content as possible in the stem
- Do not begin each response option with the same repeated words
- Avoid negatively stated stems
9
CHARACTERISTICS OF GOOD MULTIPLE CHOICE ITEMS
- Provide only one correct or best answer
- Make response options as brief as possible
- Make response options grammatically parallel
- Make each choice option grammatically consistent with the stem
- Avoid “all of the above” and “none of the above”
10
CHARACTERISTICS OF GOOD MULTIPLE CHOICE ITEMS
- Avoid unintentional clues (e.g., one lengthy alternative)
- Make all alternatives plausible
- Alternate the position of the correct answer randomly (a sketch follows below)
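A minimal sketch of the last guideline, randomizing where the correct answer lands. The item format and the `shuffle_options` helper are hypothetical illustrations, not part of the presentation:

```python
import random

def shuffle_options(question, options, key_index, rng=random):
    """Return the question, its options in random order, and the new index of the key."""
    order = list(range(len(options)))
    rng.shuffle(order)                      # random permutation of option positions
    shuffled = [options[i] for i in order]
    new_key = order.index(key_index)        # where the correct answer ended up
    return question, shuffled, new_key

# Example use with a hypothetical item:
q, opts, key = shuffle_options(
    "Which gas do the plants in the sealed jar release for the snails to use?",
    ["Oxygen", "Carbon dioxide", "Nitrogen", "Hydrogen"],
    key_index=0,
)
print(q)
for i, (letter, option) in enumerate(zip("ABCD", opts)):
    marker = " (key)" if i == key else ""
    print(f"  {letter}. {option}{marker}")
```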
11
IS THE MULTIPLE CHOICE QUESTION…
- Assessing factual knowledge?
- Assessing conceptual understanding?
- Language dependent?
- Providing evidence to inform instruction?
12
OPEN-ENDED ASSESSMENTS: Some Considerations
- Identify what you want kids to know (i.e., is this a concept or more rote memory?)
- Does the assessment provide for inclusion of all students?
- Does the assessment include appropriate elements of Bloom’s Taxonomy?
13
OPEN-ENDED ASSESSMENTS: Bloom’s Taxonomy
14
TYPES OF OPEN-ENDED ASSESSMENTS
- Short answer
- Essay
- CRI (Constructed Response Investigation)
15
CRITERIA FOR OPEN-ENDED ASSESSMENTS: SHORT ANSWER ITEMS
- Prefer direct questions to incomplete statements
- Solicit concise responses
- Place the blank in the margin for a direct question and near the end of an incomplete sentence
16
CRITERIA FOR OPEN-ENDED ASSESSMENTS: SHORT ANSWER ITEMS
- Restrict the number of blanks for incomplete statements to one or, at the most, two
- Provide sufficient answer space
- Use equal-length blanks
17
CRITERIA FOR OPEN-ENDED ASSESSMENTS: ESSAY ITEMS
- Define the student’s task explicitly
- Specify the point value and an approximate time limit for each question
- Employ more questions that require relatively short answers rather than fewer questions with long answers
18
CRITERIA FOR OPEN-ENDED ASSESSMENTS: ESSAY ITEMS
- Do not allow students to choose between questions
- Write a trial response to each question to verify the question’s quality
19
CRITERIA FOR OPEN-ENDED ASSESSMENTS: Constructed Response Investigations
- Require students to answer questions with their own words and ideas
- Require students to apply knowledge and skills in “real world” situations
- Require students to self-generate extended responses that support conjectures, justify conclusions, and substantiate experimental results
20
OPEN-ENDED ASSESSMENTS: Analysis of Student Work
- What concept is the prompt assessing?
- What evidence do you have about student understanding of the concept?
- What evidence do you have about student misconceptions?
- Is the task accessible to ALL students?
21
OPEN-ENDED ASSESSMENTS: Analysis of Student Work
- Does the student understand the concept and have the ability to communicate it?
- Does the student understand the concept but lack the ability to communicate it?
- Does the student neither understand the concept nor have the ability to communicate it?
22
AN OPEN-ENDED PROMPT: Grade 5
Directions: This is an open-ended question. Your answer will be judged on how well you show your understanding of science and how well you can explain it to others. Please write your response in the space below the question and on the next page, if necessary. You may include a picture to help explain your answer.
Neesha put snails and plants together in a jar of pond water. She sealed the jar and placed it under a bright light. After several days she checked the jar and found that the snails and plants were alive and healthy. Explain why they stayed alive.
23
OPEN-ENDED PROMPTS: A Look at Student Work
IN DISCUSSION GROUPS OF 2 OR 3:
- Examine the four samples of student work from the “Neesha and the Snails” prompt.
- Use the provided rubric to determine the quality of each sample paper.
- Be ready to share with the entire group.
24
OPEN-ENDED PROMPTS: A Look at Student Work
ACTUAL SCORES FOR STUDENT WORK SAMPLES
- Student Sample A = 2
- Student Sample B = 3
- Student Sample C = 4
- Student Sample D = 1
25
CRITERIA FOR PERFORMANCE TASK
- Get Data
- Manipulate Data
- Interpret Data
- Apply Data
26
SAMPLE PERFORMANCE TASK: Building Materials
For the task “Building Materials,” students were given rocks and testing tools and were asked to investigate the properties of the rocks to see which rock would be best for building tables and benches for a museum picnic area.
27
A SAMPLE PERFORMANCE TASK
IN DISCUSSION GROUPS OF 2 OR 3:
- Examine the sample Performance Task, “Building Materials”
- Identify the following essential criteria:
  - What are students doing to “GET DATA”?
  - What are students doing to “MANIPULATE DATA”?
  - What are students doing to “INTERPRET DATA”?
  - What are students doing to “APPLY DATA”?
28
PERFORMANCE TASK: Analysis of Student Work
- What did the student do?
- What evidence do you have about student understanding of the concept?
- What types of questions were asked?
- What evidence do you have about student understanding of the Science Process Skills?
- Is the task accessible to ALL students?
29
RUBRICS AND SCORING GUIDES
A RUBRIC is an established set of generic criteria for scoring or rating students’ tests, portfolios, or performances. A SCORING GUIDE is a specific set of criteria for a particular assignment/assessment.
- Holistic scoring: All items on a task are scored as a whole
- Analytic scoring: Each item on a test is scored individually
- Component scoring: Similar items on a task are grouped and scored together
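The three scoring approaches can be contrasted with a small sketch. None of this code comes from the presentation: the item names, the 1–4 scale, and the grouping are hypothetical, and the holistic score is only approximated by rounding a mean (a real holistic score is a single judgment of the whole task):

```python
from statistics import mean

# Hypothetical rubric scores (1-4) for the items on one task
item_scores = {
    "short_answer_1": 3,
    "short_answer_2": 2,
    "essay_1": 4,
    "performance_task_1": 3,
}

# Analytic: each item is scored and reported individually
analytic = item_scores

# Holistic: all items on the task are judged as a whole
# (approximated here by rounding the mean item score)
holistic = round(mean(item_scores.values()))

# Component: similar items are grouped and scored together
components = {
    "short_answer": ["short_answer_1", "short_answer_2"],
    "extended_response": ["essay_1", "performance_task_1"],
}
component_scores = {
    name: round(mean(item_scores[item] for item in items))
    for name, items in components.items()
}

print("Analytic:", analytic)
print("Holistic:", holistic)
print("Component:", component_scores)
```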
30
RUBRICS AND SCORING GUIDES
A Rubric and Scoring Guide:
- Describes the levels of performance a student is expected to attain relative to a desired standard of achievement.
- Includes descriptors, or performance descriptions, which tell the evaluator what characteristics or signs to look for in a student’s work and how to place that work on a predetermined scale.
Rubrics and Scoring Guides are often supplemented by BENCHMARKS, or performance samples (i.e., anchor papers), that serve as a concrete standard against which other samples may be judged.
31
RUBRICS AND SCORING GUIDES
A Four-Point Rubric or Scoring Guide: Levels of Mastery
- 4 = Exemplary response (Exceeds the standard)
- 3 = Good response (Meets the standard)
- 2 = Inadequate response (Below standard)
- 1 = Poor response (Does not meet standard)
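A minimal sketch of how the four-point scale above could be encoded for classroom record-keeping. The `describe` helper is a hypothetical illustration; the sample scores are the ones reported earlier for the “Neesha and the Snails” work samples:

```python
# Four-point scale from the slide above
LEVELS = {
    4: ("Exemplary response", "Exceeds the standard"),
    3: ("Good response", "Meets the standard"),
    2: ("Inadequate response", "Below standard"),
    1: ("Poor response", "Does not meet standard"),
}

def describe(score: int) -> str:
    """Turn a numeric rubric score into its level-of-mastery description."""
    label, standard = LEVELS[score]
    return f"{score} = {label} ({standard})"

# Scores reported earlier for the "Neesha and the Snails" work samples
for sample, score in {"A": 2, "B": 3, "C": 4, "D": 1}.items():
    print(f"Student Sample {sample}: {describe(score)}")
```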
32
ASSESSING STUDENT ACHIEVEMENT: Using Multiple Measures
Presented by Dean Gilbert, Science Consultant, Los Angeles County Office of Education
Gilbert_Dean@lacoe.edu