Using Test Item Analysis to Improve Students’ Assessment

Similar presentations
An Introduction to Test Construction

Assessing Student Performance
The Assessment Toolbox Linda Suskie Middle States Commission on Higher Education AB Tech February 2005.
Item Analysis.
Test Development.
How to Make a Test & Judge its Quality. Aim of the Talk Acquaint teachers with the characteristics of a good and objective test See Item Analysis techniques.
FACULTY DEVELOPMENT PROFESSIONAL SERIES OFFICE OF MEDICAL EDUCATION TULANE UNIVERSITY SCHOOL OF MEDICINE Using Statistics to Evaluate Multiple Choice.
© McGraw-Hill Higher Education. All rights reserved. Chapter 3 Reliability and Objectivity.
Course assessment: Setting and Grading Tests and Examinations By Dr C. Bangira Chinhoyi University of Technology Organized by the Academy of Teaching and.
Using Multiple Choice Tests for Assessment Purposes: Designing Multiple Choice Tests to Reflect and Foster Learning Outcomes Terri Flateby, Ph.D.
Item Analysis: A Crash Course Lou Ann Cooper, PhD Master Educator Fellowship Program January 10, 2008.
Constructing Exam Questions Dan Thompson & Brandy Close OSU-CHS Educational Development-Clinical Education.
Test Construction Processes 1- Determining the function and the form 2- Planning( Content: table of specification) 3- Preparing( Knowledge and experience)
Item Analysis What makes a question good??? Answer options?
New York State Physical Setting: Physics A Review of the June 2006 Exam NYS AAPT Fall Meeting Binghamton, NY J. Zawicki SUNY Buffalo State College T. Johnson.
Lesson Seven Item Analysis. Contents: Item Analysis; Item difficulty (item facility).
© 2008 McGraw-Hill Higher Education. All rights reserved. CHAPTER 16 Classroom Assessment.
Lesson Nine Item Analysis.
Copyright 2001 by Allyn and Bacon Standardized Testing Chapter 14.
Multiple Choice Test Item Analysis Facilitator: Sophia Scott.
Challenge Question: Why is being organized on the first day of school important? Self-Test Questions: 1.How do I get my classroom ready? 2.How do I prepare.
ANALYZING AND USING TEST ITEM DATA
Standardized Test Scores Common Representations for Parents and Students.
Assessment in Language Teaching: part 2 Today’s # 24.
Classroom Assessment A Practical Guide for Educators by Craig A
Oscar Vergara Chihlee Institute of Technology July 28, 2014.
Classroom Assessment and Grading
Understanding and Using Standardized Tests
Chapter 8 Measuring Cognitive Knowledge. Cognitive Domain Intellectual abilities ranging from rote memory tasks to the synthesis and evaluation of complex.
Part #3 © 2014 Rollant Concepts, Inc. Assembling a Test
Induction to assessing student learning Mr. Howard Sou Session 2 August 2014 Federation for Self-financing Tertiary Education 1.
Standardization and Test Development Nisrin Alqatarneh MSc. Occupational therapy.
Presentation by : Kesang Tshering
Chapter 7 Item Analysis In constructing a new test (or shortening or lengthening an existing one), the final set of items is usually identified through.
Techniques to improve test items and instruction
Group 2: 1. Miss. Duong Sochivy 2. Miss. Im Samphy 3. Miss. Lay Sreyleap 4. Miss. Seng Puthy 1 ROYAL UNIVERSITY OF PHNOM PENH INSTITUTE OF FOREIGN LANGUAGES.
Teaching Today: An Introduction to Education 8th edition
NRTs and CRTs Group members: Camila, Ariel, Annie, William.
Lab 5: Item Analyses. Quick Notes Load the files for Lab 5 from course website –
Data for Student Success August, 2009 Mission Pointe “It is about focusing on building a culture of quality data through professional development and web.
Classroom Evaluation & Grading Chapter 15. Intelligence and Achievement Intelligence and achievement are not the same Intelligence and achievement are.
ASSESSING STUDENT ACHIEVEMENT Using Multiple Measures Prepared by Dean Gilbert, Science Consultant Los Angeles County Office of Education.
Assessment Item Writing Workshop Ken Robbins FDN-5560 Classroom Assessment.
1 Item Analysis - Outline 1. Types of test items A. Selected response items B. Constructed response items 2. Parts of test items 3. Guidelines for writing.
Lecture by: Chris Ross Chapter 7: Teacher-Designed Strategies.
Writing Multiple Choice Questions. Types Norm-referenced –Students are ranked according to the ability being measured by the test with the average passing.
Assessment and Testing
Standards-Based Science Assessment. Ohio’s Science Cognitive Demands Science is more than a body of knowledge. It must not be misperceived as lists of.
Introduction to Item Analysis Objectives: To begin to understand how to identify items that should be improved or eliminated.
This material is based upon work supported by the National Science Foundation under Grant No and Any opinions, findings, and conclusions.
March 11, 2013 Chicago, IL American Board of Preventive Medicine American Board of Preventive Medicine Clinical Informatics Examination Committee Measurement.
Maximizing Instructional Time Student Engagement through Questioning.
Test Question Writing Instructor Development ANSF Nurse Training Program.
Language Testing How to make multiple choice test.
Dan Thompson Oklahoma State University Center for Health Science Evaluating Assessments: Utilizing ExamSoft’s item-analysis to better understand student.
Principles of Instructional Design Assessment Recap Checkpoint Questions Prerequisite Skill Analysis.
©2013, The McGraw-Hill Companies, Inc. All Rights Reserved Chapter 7 Assessing and Grading the Students.
Dept. of Community Medicine, PDU Government Medical College,
Workshop 2014 Cam Xuyen, October 14, 2014 Testing/ assessment/ evaluation BLOOM’S TAXONOMY.
Copyright © Springer Publishing Company, LLC. All Rights Reserved. DEVELOPING AND USING TESTS – Chapter 11 –
Testing & Measurement  Criterion-Referenced Tests.  Pre-tests  Post-tests  Norm-Referenced Tests.
Writing Selection Items
ARDHIAN SUSENO CHOIRUL RISA PRADANA P.
Classroom Analytics.
Using Formative Assessment to Improve Student Achievement
Greg Miller Iowa State University
Classroom Assessment Ways to improve tests.
Dept. of Community Medicine, PDU Government Medical College,
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler Chapter 8 Objective Test Items.
EDUC 2130 Quiz #10 W. Huitt.
Presentation transcript:

Using Test Item Analysis to Improve Students' Assessment
Institutional Assessment starts with Classroom Assessment

Learning Objectives of This Session
1. Explain the difficulty index and the discrimination index
2. Calculate the difficulty index and the discrimination index
3. Identify ineffective distracters
4. Evaluate multiple-choice test items based on analysis results
5. Apply a table of specifications to improve content validity

Purpose of Item Analysis
1. Ensure accurate measurement of knowledge or skill
2. Enhance student learning
3. Increase student engagement
4. Avoid demoralizing students
5. Increase confidence in drawing conclusions about:
   - Outcome achievement
   - Level of knowledge or skill mastery
   - Teaching effectiveness

Components of a Multiple-Choice Item
An example item:

Stem: Test items used to measure the lowest level of the cognitive taxonomy are ____. (stem)
A. Analysis (distracter)
B. Application (distracter)
C. Knowledge (key)
D. Comprehension (distracter)

In scoring, the correct answer is usually coded as 1 and each wrong answer as 0.

Two Important Indexes for Item Analysis
Item difficulty index: tells how hard the item is.
Item discrimination index: tells how well the item distinguishes between high-ability and low-ability students.

Item Difficulty Index
The item difficulty index is defined as the percentage or proportion of test takers who answer the item correctly. It ranges from 0 to 1. For example, in a class of 30 students, if 20 students answer the item correctly and 10 answer it incorrectly, the item difficulty index is 20/30 = 0.67.

Item difficulty = number correct / total

Students   Item 1   Item 2   Item 3   Item 4   Item 5
Robert        1        1        1        1        1
Millie        1        0        1        1        1
Dean          1        0        0        1        1
Shenan        1        1        0        1        1
Cuny          1        1        1        1        1
Corky         1        0        1        1        1
Randy         1        1        0        1        1
Jeanne        1        1        0        0        1
Iliana        1        1        1        0        1
Lindsey       0        0        0        0        1
Item p      0.9      0.6      0.5      0.7      1.0
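For larger classes the same computation is easy to script. A minimal sketch in Python, using the 0/1 responses from the table above (the helper name difficulty_indexes is illustrative):

```python
# Item difficulty: p = number of correct responses / total number of students.
# 0/1 response matrix from the table above (rows = students, columns = items).
responses = {
    "Robert":  [1, 1, 1, 1, 1],
    "Millie":  [1, 0, 1, 1, 1],
    "Dean":    [1, 0, 0, 1, 1],
    "Shenan":  [1, 1, 0, 1, 1],
    "Cuny":    [1, 1, 1, 1, 1],
    "Corky":   [1, 0, 1, 1, 1],
    "Randy":   [1, 1, 0, 1, 1],
    "Jeanne":  [1, 1, 0, 0, 1],
    "Iliana":  [1, 1, 1, 0, 1],
    "Lindsey": [0, 0, 0, 0, 1],
}

def difficulty_indexes(responses):
    """Return the difficulty index p for each item (column)."""
    scores = list(responses.values())
    n_students = len(scores)
    n_items = len(scores[0])
    return [sum(row[i] for row in scores) / n_students for i in range(n_items)]

print(difficulty_indexes(responses))  # [0.9, 0.6, 0.5, 0.7, 1.0]
```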

Optimal p Values for Items with Varying Numbers of Options

Number of options   Optimal mean p value
2                   0.85
3                   0.77
4                   0.74
5                   0.69
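Encoded as a lookup, these optimal values can be compared against observed item difficulties. A minimal sketch; the helper compare_to_optimal is illustrative:

```python
# Optimal mean p value by number of answer options (from the table above).
optimal_mean_p = {2: 0.85, 3: 0.77, 4: 0.74, 5: 0.69}

def compare_to_optimal(observed_p, n_options):
    """Return how far an item's observed p is from the optimal mean for its format."""
    return observed_p - optimal_mean_p[n_options]

# A four-option item with p = 0.9 is noticeably easier than the optimal 0.74.
print(round(compare_to_optimal(0.9, 4), 2))  # 0.16
```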

Special Assessment Situations and Item Difficulty
The guidance on item difficulty discussed so far is most applicable to norm-referenced tests. For criterion-referenced tests or classroom tests, average p values as high as 0.9 are normal, because we expect most students to be successful. Conversely, if a test were developed to select only the upper 25% of examinees, items with p values averaging 0.25 would be desirable. In summary, although a mean p of 0.5 is optimal in the abstract, the appropriate item difficulty level varies with the purpose of the test.

Item Discrimination Index
The item discrimination index is defined as the difference in item difficulty between those who did well on the test (the upper, or high-achievement, group) and those who did poorly (the lower, or low-achievement, group):

D = PU - PL

where D is the discrimination index (ranging from -1 to 1), PU is the difficulty index in the upper group, and PL is the difficulty index in the lower group. For example, if PU = 0.8 and PL = 0.3, then D = 0.8 - 0.3 = 0.5.
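A minimal sketch of the D calculation, assuming students have already been split into upper and lower groups by total test score; the 0/1 responses below are hypothetical:

```python
def difficulty(item_responses):
    # Difficulty index: proportion of the group answering the item correctly.
    return sum(item_responses) / len(item_responses)

def discrimination(upper_responses, lower_responses):
    # D = P_upper - P_lower; ranges from -1 to 1.
    return difficulty(upper_responses) - difficulty(lower_responses)

# Hypothetical 0/1 responses to one item for 10 upper- and 10 lower-group students.
upper = [1, 1, 1, 1, 1, 1, 1, 1, 0, 0]   # P_upper = 0.8
lower = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]   # P_lower = 0.3

print(round(discrimination(upper, lower), 2))  # 0.5
```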

Guidelines for Evaluating D Values

Discrimination index   Interpretation
0.40 and larger        Excellent
0.30 - 0.39            Good
0.11 - 0.29            Fair
0.00 - 0.10            Poor (may still be acceptable for classroom tests)
Negative               Miskeyed or major flaw
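These guidelines translate directly into a small classification helper; a sketch with thresholds taken from the table above:

```python
def evaluate_discrimination(d):
    """Map a discrimination index D to the quality label from the guidelines above."""
    if d < 0:
        return "miskeyed or major flaw"
    if d >= 0.40:
        return "excellent"
    if d >= 0.30:
        return "good"
    if d >= 0.11:
        return "fair"
    return "poor (may still be acceptable for classroom tests)"

print(evaluate_discrimination(0.5))   # excellent
print(evaluate_discrimination(-0.2))  # miskeyed or major flaw
```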

[Worked example: a response table for twenty students (R, Q, G, I, B, F, E, T, S, C, K, M, O, A, D, N, H, L, J, P) across ten items, with total scores and a discrimination index computed for each item; the individual cell values are not recoverable from the transcript.]

Distracter Analysis
Distracter analysis lets you examine how many students in the upper group and in the lower group selected each option of a multiple-choice item. We expect each distracter to be selected by more students in the lower group than in the upper group; an effective distracter must be selected by at least some students.

Distracter Analysis: Item 1

Options                      A*    B    C    D
Number in the upper group    22    3    2    –
Number in the lower group     9    7    8    6

Distracter Analysis: Item 2

Options                      A*    B    C    D
Number in the upper group    17    9    4    –
Number in the lower group    13    6   11    –
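Tallies like the two example items above can be produced with a simple count per group. A minimal sketch, with hypothetical option choices:

```python
from collections import Counter

# Hypothetical selected options for one item, by group.
upper_choices = ["A", "A", "A", "B", "A", "C", "A", "A"]
lower_choices = ["B", "A", "D", "C", "B", "D", "A", "C"]

def distracter_counts(choices, options="ABCD"):
    """Count how many students in a group selected each option."""
    tally = Counter(choices)
    return {opt: tally.get(opt, 0) for opt in options}

print("Upper:", distracter_counts(upper_choices))
print("Lower:", distracter_counts(lower_choices))
```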

Building a Table of Specifications
1. Selecting the content areas
2. Selecting the learning outcomes to be tested
3. Determining the levels of the objectives
4. Determining the question types
5. Determining the points for each question
6. Building the table

A Sample Table of Specifications

Content Area                 Learning Objective                                Level of Objective   Item Type         Number   Points
Item Analysis                Explaining item difficulty and the
                             discrimination index                              Comprehension        Multiple-choice   2        –
                             Calculating P and D                               Application          Constructed       1        4
                             Identifying ineffective distracters               –                    –                 –        –
Preparing a classroom test   Applying a table of specifications                –                    Project           –        5
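Kept as a simple data structure, a table of specifications can also drive quick consistency checks, such as totaling points per content area. A minimal sketch; the rows loosely follow the sample above, and the exact counts and point values are illustrative assumptions:

```python
# A table of specifications as a list of rows; the numbers here are
# illustrative assumptions, not taken verbatim from the sample table.
table_of_specifications = [
    {"content_area": "Item Analysis",
     "objective": "Explain item difficulty and the discrimination index",
     "level": "Comprehension", "item_type": "Multiple-choice",
     "n_items": 2, "points": 2},
    {"content_area": "Item Analysis",
     "objective": "Calculate P and D",
     "level": "Application", "item_type": "Constructed response",
     "n_items": 1, "points": 4},
    {"content_area": "Preparing a classroom test",
     "objective": "Apply a table of specifications",
     "level": "Application", "item_type": "Project",
     "n_items": 1, "points": 5},
]

# Check the point distribution across content areas.
totals = {}
for row in table_of_specifications:
    totals[row["content_area"]] = totals.get(row["content_area"], 0) + row["points"]
print(totals)  # {'Item Analysis': 6, 'Preparing a classroom test': 5}
```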