ASSESSING STUDENT ACHIEVEMENT Using Multiple Measures Prepared by Dean Gilbert, Science Consultant Los Angeles County Office of Education.

Similar presentations
Writing constructed response items

Inquiry-Based Instruction
Assessment. Adapted from the texts Effective Teaching Methods: Research-Based Practices by Gary D. Borich and How to Differentiate Instruction in Mixed-Ability Classrooms.
Alternate Choice Test Items
Measuring Complex Achievement: Essay Questions
Gary D. Borich Effective Teaching Methods 6th Edition
Constructing Exam Questions Dan Thompson & Brandy Close OSU-CHS Educational Development-Clinical Education.
Constructed Response Assessment. Constructed Response Definition: a student-created response to a test item, such as an essay response.
Lamar State College-Port Arthur February 16 & 17, 2011.
DEVELOPING ASSESSMENT RUBRICS Ai Vu, Science Coordinator Integrated Middle School Science Partnership Alameda County Office of Education Materials from.
Building an Accurate SBPR RECORD KEEPING ASSESSMENT AND INSTRUCTION GRADE LEVEL STANDARDS SBPR 1.
Authentic Assessment Abdelmoneim A. Hassan. Welcome Authentic Assessment Qatar University Workshop.
Classroom Assessment FOUN 3100 Fall Assessment is an integral part of teaching.
Principles of High Quality Assessment
Dr. Robert Mayes University of Wyoming Science and Mathematics Teaching Center
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler Chapter 9 Subjective Test Items.
ASSESSMENT OF ESSAY TYPE QUESTIONS. CONSTRUCTING QUESTIONS: Construct questions that test HIGHER LEVEL PROCESSES.
Assessment Basics and Active Student Involvement.
Essay Assessment Tasks
Assessment Cadre #3: “Assess How? Designing Assessments to Do What You Want”
TEXAS TECH UNIVERSITY HEALTH SCIENCES CENTER SCHOOL OF PHARMACY KRYSTAL K. HAASE, PHARM.D., FCCP, BCPS ASSOCIATE PROFESSOR BEYOND MULTIPLE CHOICE QUESTIONS.
SCORING. INTRODUCTION & PURPOSE Define what SCORING means for the purpose of these modules Explain how and why you should use well-designed tools, such.
Oscar Vergara Chihlee Institute of Technology July 28, 2014.
Designing and evaluating good multiple choice items Jack B. Monpas-Huber, Ph.D. Director of Assessment & Student Information.
Completion, Short-Answer, and True-False Items
What is Open Response? A Situation: Reading Open Response will have a story, a poem, or an article to read. A Task: a set of questions/prompt to answer.
CONSTRUCTING OBJECTIVE TEST ITEMS: MULTIPLE-CHOICE FORMS CHAPTER 8 AMY L. BLACKWELL JUNE 19, 2007.
Protocols for Mathematics Performance Tasks PD Protocol: Preparing for the Performance Task Classroom Protocol: Scaffolding Performance Tasks PD Protocol:
Choose Your Own Adventure. Introduction Use this as a guide when working on a Smarter Balanced question.
Integrating Differentiated Instruction & Understanding by Design: Connecting Content and Kids by Carol Ann Tomlinson and Jay McTighe.
ASSESSMENT IN EDUCATION. Copyright Keith Morrison, 2004. ITEM TYPES IN A TEST: Missing words and incomplete sentences. Multiple choice.
Session 2 Traditional Assessments.
Alternative Assessment
Teaching Today: An Introduction to Education 8th edition
ASSESSMENT TECHNIQUES THE FOUR PART MODEL Presented by Daya Chetty 20 APRIL 2013.
Performance and Portfolio Assessment. Performance Assessment An assessment in which the teacher observes and makes a judgement about a student’s demonstration.
Classroom Evaluation & Grading Chapter 15. Intelligence and Achievement: Intelligence and achievement are not the same.
Lecture by: Chris Ross Chapter 7: Teacher-Designed Strategies.
Assessment and Testing
Session 4 Performance-Based Assessment
Assessment Information from multiple sources that describes a student’s level of achievement Used to make educational decisions about students Gives feedback.
Stage 2 Understanding by Design Assessment Evidence.
Georgia will lead the nation in improving student achievement. 1 Georgia Performance Standards Day 3: Assessment FOR Learning.
INSTRUCTIONAL OBJECTIVES
The Achievement Chart Mathematics Grades Note to Presenter:
Assessment Basics and Active Student Involvement Block II.
Performance Based Assessment. What is Performance Based Assessment? PBA is a form of assessment that requires students to perform a task rather than an.
Test Question Writing Instructor Development ANSF Nurse Training Program.
Do not on any account attempt to write on both sides of the paper at once. W.C.Sellar English Author, 20th Century.
Using Multiple Measures ASSESSING STUDENT ACHIEVEMENT.
EVALUATION SUFFICIENCY Types of Test Items (Part I)
Rubrics: Using Performance Criteria to Evaluate Student Learning. PERFORMANCE CRITERIA / PERFORMANCE RATING: Beginning 1, Developing 2, Accomplished 3. Content.
Assessment Issues Presented by Jeffrey Oescher Southeastern Louisiana University 4 January 2008.
Designing a Culminating Task Presented by Anne Maben UCLA Science & Literacy Coach Based on the model by Jay McTighe, Maryland Assessment Consortium.
Chapter 6 Assessing Science Learning Updated Spring 2012 – D. Fulton.
Colorado Academic Standards Colorado English Language Proficiency (CELP) Standards There are now five English language development standards: Standard.
Goals To understand assessment of student science learning. To learn about RIPTS Standard 9.
Objective Examination: Multiple Choice Questions Dr. Madhulika Mistry.
Learning Goals Development & Assessment The Basics of Goals-Based Course Design.
Assessment in Education ~ What teachers need to know.
Copyright © Springer Publishing Company, LLC. All Rights Reserved. DEVELOPING AND USING TESTS – Chapter 11 –
Designing Quality Assessment and Rubrics
Classroom Assessment A Practical Guide for Educators by Craig A
EDU 385 Session 8 Writing Selection items
Preparation for the American Literature EOC
Classification of Tests Chapter # 2
Multiple Choice Item (MCI) Quick Reference Guide
Presentation transcript:

ASSESSING STUDENT ACHIEVEMENT Using Multiple Measures Prepared by Dean Gilbert, Science Consultant Los Angeles County Office of Education

GOOD ASSESSMENT DATA
- Tell what students know and are able to do
- Provide information for making instructional decisions
- Come in many forms

DIFFERENT KINDS OF ASSESSMENT
- Give you different information
- Lead to different learning
- Meet the needs of different learners

NO ONE KIND OF ASSESSMENT…
- Is adequate and sufficient, in and of itself, to tell you what students know and are able to do OR to provide information for making instructional decisions
- Satisfies a sound accountability system

MULTIPLE MEASURES NEEDED IN THE CLASSROOM
- Multiple Choice: Regular, Enhanced, Justified
- Open-Ended Prompts: Essay, Short Answer, Constructed Response Investigations (CRI)
- Performance Tasks

SAMPLE MULTIPLE CHOICE
ON AN INDIVIDUAL BASIS, TAKE THE MULTIPLE CHOICE SAMPLE TEST FOR:
- Traditional Multiple Choice
- Enhanced Multiple Choice
- Justified Multiple Choice
AFTER COMPLETING THE SAMPLE TESTS, DISCUSS THE PROS AND CONS WITH YOUR COLLEAGUES.

CHARACTERISTICS OF GOOD MULTIPLE CHOICE ITEMS
- Present a self-contained question or problem in the stem
- Contain as much of the item’s content as possible in the stem
- Do not begin each response option by repeating the same words
- Avoid negatively stated stems

CHARACTERISTICS OF GOOD MULTIPLE CHOICE ITEMS
- Provide only one correct or best answer
- Make response options as brief as possible
- Make response options grammatically parallel
- Make each choice option grammatically consistent with the stem
- Avoid “all of the above” and “none of the above”

CHARACTERISTICS OF GOOD MULTIPLE CHOICE ITEMS
- Avoid unintentional clues (e.g., one lengthy alternative)
- Make all alternatives plausible
- Alternate the correct answer position randomly
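The last point, randomizing where the correct answer appears, is easy to automate when items are stored digitally. Here is a minimal Python sketch (the MultipleChoiceItem class and its field names are hypothetical illustrations, not part of this presentation) showing one way to shuffle options while keeping track of the key:

```python
import random
from dataclasses import dataclass

@dataclass
class MultipleChoiceItem:
    stem: str               # self-contained question or problem
    correct: str            # the single correct or best answer
    distractors: list[str]  # plausible but incorrect alternatives

    def shuffled_options(self, rng: random.Random) -> tuple[list[str], int]:
        """Return the options in random order plus the index of the correct answer."""
        options = [self.correct] + self.distractors
        rng.shuffle(options)
        return options, options.index(self.correct)

# Example: the correct answer lands in a different slot on each shuffle.
item = MultipleChoiceItem(
    stem="Which process did the plants in the sealed jar use to produce oxygen?",
    correct="Photosynthesis",
    distractors=["Respiration", "Condensation", "Evaporation"],
)
options, key = item.shuffled_options(random.Random())
for letter, option in zip("ABCD", options):
    print(f"{letter}. {option}")
print("Key:", "ABCD"[key])
```

Because the item stores the correct answer by value rather than by position, the key survives any reordering.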

IS THE MULTIPLE CHOICE QUESTION…
- Assessing factual knowledge?
- Assessing conceptual understanding?
- Language dependent?
- Providing evidence to inform instruction?

OPEN-ENDED ASSESSMENTS: Some Considerations
- Identify what you want kids to know (i.e., is this a concept, or rote memory?)
- Does the assessment provide for inclusion of all students?
- Does the assessment include appropriate elements of Bloom’s Taxonomy?

OPEN-ENDED ASSESSMENTS: Bloom’s Taxonomy

TYPES OF OPEN-ENDED ASSESSMENTS
- Short answer
- Essay
- CRI (Constructed Response Investigation)

CRITERIA FOR OPEN-ENDED ASSESSMENTS: SHORT ANSWER ITEMS
- Prefer direct questions to incomplete statements
- Solicit concise responses
- Place the blank in the margin for a direct question and near the end of an incomplete sentence

CRITERIA FOR OPEN-ENDED ASSESSMENTS: SHORT ANSWER ITEMS
- Restrict the number of blanks for incomplete statements to one or, at the most, two
- Provide sufficient answer space
- Use equal-length blanks

CRITERIA FOR OPEN-ENDED ASSESSMENTS: ESSAY ITEMS
- Define the student’s task explicitly
- Specify the point value and an approximate time limit for each question
- Employ more questions that require relatively short answers rather than fewer questions with long answers

CRITERIA FOR OPEN-ENDED ASSESSMENTS: ESSAY ITEMS
- Do not allow students to choose between questions
- Write a trial response to each question to verify the question’s quality

CRITERIA FOR OPEN-ENDED ASSESSMENTS: CONSTRUCTED RESPONSE INVESTIGATIONS
- Require students to answer questions with their own words and ideas
- Require students to apply knowledge and skills in “real world” situations
- Require students to self-generate extended responses that support conjectures, justify conclusions, and substantiate experimental results

OPEN-ENDED ASSESSMENTS: Analysis of Student Work
- What concept is the prompt assessing?
- What evidence do you have about student understanding of the concept?
- What evidence do you have about student misconceptions?
- Is the task accessible to ALL students?

OPEN-ENDED ASSESSMENTS: Analysis of Student Work
- Does the student understand the concept and have the ability to communicate it?
- Does the student understand the concept but lack the ability to communicate it?
- Does the student neither understand the concept nor communicate his or her understanding of it?

AN OPEN-ENDED PROMPT (Grade 5)
Directions: This is an open-ended question. Your answer will be judged on how well you show your understanding of science and how well you can explain it to others. Please write your response in the space below the question and on the next page, if necessary. You may include a picture to help explain your answer.

Neesha put snails and plants together in a jar of pond water. She sealed the jar and placed it under a bright light. After several days she checked the jar and found that the snails and plants were alive and healthy. Explain why they stayed alive.

OPEN-ENDED PROMPTS: A Look at Student Work
IN DISCUSSION GROUPS OF 2 OR 3:
- Examine the four samples of student work from the “Neesha and the Snails” prompt.
- Use the provided rubric to determine the quality of each sample paper.
- Be ready to share with the entire group.

OPEN-ENDED PROMPTS: A Look at Student Work
ACTUAL SCORES FOR STUDENT WORK SAMPLES
- Student Sample A = 2
- Student Sample B = 3
- Student Sample C = 4
- Student Sample D = 1

CRITERIA FOR PERFORMANCE TASKS
- Get Data
- Manipulate Data
- Interpret Data
- Apply Data

SAMPLE PERFORMANCE TASK: Building Materials
For the task “Building Materials,” students were given rocks and testing tools and were asked to investigate the properties of rocks to see which rock would be best for building tables and benches for a museum picnic area.

A SAMPLE PERFORMANCE TASK
IN DISCUSSION GROUPS OF 2 OR 3:
Examine the sample Performance Task, “Building Materials,” and identify the following essential criteria:
- What are students doing to “GET DATA”?
- What are students doing to “MANIPULATE DATA”?
- What are students doing to “INTERPRET DATA”?
- What are students doing to “APPLY DATA”?

PERFORMANCE TASK: Analysis of Student Work
- What did the student do?
- What evidence do you have about student understanding of the concept?
- What types of questions were asked?
- What evidence do you have about student understanding of the Science Process Skills?
- Is the task accessible to ALL students?

RUBRICS AND SCORING GUIDES
- A RUBRIC is an established set of generic criteria for scoring or rating students’ tests, portfolios, or performances.
- A SCORING GUIDE is a specific set of criteria for a particular assignment/assessment.
Three scoring approaches:
- Holistic scoring: all items on a task are scored as a whole
- Analytic scoring: each item on a test is scored individually
- Component scoring: similar items on a task are grouped and scored together
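To make the three approaches concrete, here is a minimal Python sketch showing how the same set of item scores can be reported analytically, holistically, or by component. The item names, scores, and groupings are invented for illustration and are not from the presentation:

```python
# Hypothetical item scores for one student's task, keyed by item name.
item_scores = {"hypothesis": 3, "data_table": 4, "graph": 2, "conclusion": 3}

# Groupings for component scoring: similar items are scored together.
components = {
    "planning": ["hypothesis"],
    "data_handling": ["data_table", "graph"],
    "reasoning": ["conclusion"],
}

# Analytic: each item keeps its own score.
analytic = item_scores

# Holistic: the whole task receives one score (a simple average here;
# in practice a rater assigns a single overall level from the rubric).
holistic = round(sum(item_scores.values()) / len(item_scores))

# Component: grouped items share a score.
component = {
    name: round(sum(item_scores[i] for i in items) / len(items))
    for name, items in components.items()
}

print(analytic)   # {'hypothesis': 3, 'data_table': 4, 'graph': 2, 'conclusion': 3}
print(holistic)   # 3
print(component)  # {'planning': 3, 'data_handling': 3, 'reasoning': 3}
```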

RUBRICS AND SCORING GUIDES
A rubric or scoring guide:
- Describes the levels of performance a student is expected to attain relative to a desired standard of achievement.
- Includes descriptors, or performance descriptions, which tell the evaluator what characteristics or signs to look for in a student’s work and how to place that work on a predetermined scale.
Rubrics and scoring guides are often supplemented by BENCHMARKS, or performance samples (i.e., anchor papers), that serve as a concrete standard against which other samples may be judged.

RUBRICS AND SCORING GUIDES
A Four-Point Rubric or Scoring Guide: Levels of Mastery
- 4 = Exemplary response (Exceeds the standard)
- 3 = Good response (Meets the standard)
- 2 = Inadequate response (Below standard)
- 1 = Poor response (Does not meet standard)
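A four-point scale like this can be encoded as a small lookup table so numeric scores map back to their levels of mastery. The Python below is a minimal sketch (the dictionary and function names are illustrative only); it labels the actual scores reported earlier for the “Neesha and the Snails” work samples:

```python
MASTERY_LEVELS = {
    4: "Exemplary response (Exceeds the standard)",
    3: "Good response (Meets the standard)",
    2: "Inadequate response (Below standard)",
    1: "Poor response (Does not meet standard)",
}

def describe(score: int) -> str:
    """Translate a rubric score into its level-of-mastery description."""
    return MASTERY_LEVELS.get(score, "Score out of range for this rubric")

# Example: labeling the scores from the "Neesha and the Snails" samples.
for sample, score in {"A": 2, "B": 3, "C": 4, "D": 1}.items():
    print(f"Sample {sample}: {score} - {describe(score)}")
```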

ASSESSING STUDENT ACHIEVEMENT Using Multiple Measures Presented by Dean Gilbert, Science Consultant Los Angeles County Office of Education