ANALYZING AND USING TEST ITEM DATA

Similar presentations
Assessing Student Performance
Alternate Choice Test Items
Item Analysis.
Test Development.
FACULTY DEVELOPMENT PROFESSIONAL SERIES OFFICE OF MEDICAL EDUCATION TULANE UNIVERSITY SCHOOL OF MEDICINE Using Statistics to Evaluate Multiple Choice.
Instructional Uses of Test Results Interpreting Item-Level Reports.
Using Test Item Analysis to Improve Students’ Assessment
Using Multiple Choice Tests for Assessment Purposes: Designing Multiple Choice Tests to Reflect and Foster Learning Outcomes Terri Flateby, Ph.D.
Item Analysis: A Crash Course Lou Ann Cooper, PhD Master Educator Fellowship Program January 10, 2008.
Some Practical Steps to Test Construction
Test Construction Processes 1- Determining the function and the form 2- Planning( Content: table of specification) 3- Preparing( Knowledge and experience)
Item Analysis What makes a question good??? Answer options?
Chapter 8 Developing Written Tests and Surveys Physical Fitness Knowledge.
ASSESSMENT IN EDUCATION ASSESSMENT IN EDUCATION. Copyright Keith Morrison, 2004 TESTS Purposes of the test Type of test Objectives of the test Content.
Lesson Seven Item Analysis. Contents Item Analysis Item Analysis Item difficulty (item facility) Item difficulty (item facility) Item difficulty Item.
Objective Exam Score Distribution. Item Difficulty Power Item
© 2008 McGraw-Hill Higher Education. All rights reserved. CHAPTER 16 Classroom Assessment.
Item Analysis Prof. Trevor Gibbs. Item Analysis After you have set your assessment: How can you be sure that the test items are appropriate?—Not too easy.
Lesson Nine Item Analysis.
Multiple Choice Test Item Analysis Facilitator: Sophia Scott.
Social Science Research Design and Statistics, 2/e Alfred P. Rovai, Jason D. Baker, and Michael K. Ponton Internal Consistency Reliability Analysis PowerPoint.
Classroom Assessment: Concepts and Applications Chapter 5: Summative Assessments.
Chapter 8 Measuring Cognitive Knowledge. Cognitive Domain Intellectual abilities ranging from rote memory tasks to the synthesis and evaluation of complex.
Part #3 © 2014 Rollant Concepts, Inc.2 Assembling a Test #
Induction to assessing student learning Mr. Howard Sou Session 2 August 2014 Federation for Self-financing Tertiary Education 1.
Chapter 7 Item Analysis In constructing a new test (or shortening or lengthening an existing one), the final set of items is usually identified through.
Techniques to improve test items and instruction
Session 2 Traditional Assessments Session 2 Traditional Assessments.
Group 2: 1. Miss. Duong Sochivy 2. Miss. Im Samphy 3. Miss. Lay Sreyleap 4. Miss. Seng Puthy 1 ROYAL UNIVERSITY OF PHNOM PENH INSTITUTE OF FOREIGN LANGUAGES.
NRTs and CRTs Group members: Camila, Ariel, Annie, William.
Lab 5: Item Analyses. Quick Notes Load the files for Lab 5 from course website –
ASSESMENT IN OPEN AND DISTANCE LEARNING Objectives: 1.To explain the characteristic of assessment in ODL 2.To identify the problem and solution of assessment.
Administering, Analyzing, and Improving the Written Test
6/4/2016Slide 1 The one sample t-test compares two values for the population mean of a single variable. The two-sample t-test of population means (aka.
Grading and Analysis Report For Clinical Portfolio 1.
1 Item Analysis - Outline 1. Types of test items A. Selected response items B. Constructed response items 2. Parts of test items 3. Guidelines for writing.
RELIABILITY AND VALIDITY OF ASSESSMENT
Writing Multiple Choice Questions. Types Norm-referenced –Students are ranked according to the ability being measured by the test with the average passing.
Assessment and Testing
Introduction to Item Analysis Objectives: To begin to understand how to identify items that should be improved or eliminated.
Review: Alternative Assessments Alternative/Authentic assessment Real-life setting Performance based Techniques: Observation Individual or Group Projects.
Language Testing How to make multiple choice test.
Dan Thompson Oklahoma State University Center for Health Science Evaluating Assessments: Utilizing ExamSoft’s item-analysis to better understand student.
In-Service Teacher Training Assessment in IGCSE Biology 0610 Session 2: Question papers and mark schemes.
Psychometrics: Exam Analysis David Hope
©2013, The McGraw-Hill Companies, Inc. All Rights Reserved Chapter 6 Construction of Knowledge Tests.
Dept. of Community Medicine, PDU Government Medical College,
Objective Examination: Multiple Choice Questions Dr. Madhulika Mistry.
Norm Referenced Your score can be compared with others 75 th Percentile Normed.
 Good for:  Knowledge level content  Evaluating student understanding of popular misconceptions  Concepts with two logical responses.
Copyright © Springer Publishing Company, LLC. All Rights Reserved. DEVELOPING AND USING TESTS – Chapter 11 –
Items analysis Introduction Items can adopt different formats and assess cognitive variables (skills, performance, etc.) where there are right and.
Understanding Your PSAT/NMSQT Results
ARDHIAN SUSENO CHOIRUL RISA PRADANA P.
Update on Data Collection and Reporting
Chapter 14 Assembling, Administering, and Appraising classroom tests and assessments.
Classroom Analytics.
Test Development Test conceptualization Test construction Test tryout
TOPIC 4 STAGES OF TEST CONSTRUCTION
Using statistics to evaluate your test Gerard Seinhorst
Summative Assessment Grade 6 April 2018 Develop Revise Pilot Analyze
Analyzing test data using Excel Gerard Seinhorst
Day 52 – Box-and-Whisker.
Tests are given for 4 primary reasons.
Presentation transcript:

Purposes and Elements of Item Analysis
- To select the best available items for the final form of the test.
- To identify structural or content defects in the items.
- To detect learning difficulties of the class as a whole.
- To identify the areas of weakness of students in need of remediation.

Three Elements in an Item Analysis
- Examination of the difficulty level of the items,
- Determination of the discriminating power of each item, and
- Examination of the effectiveness of the distractors in multiple-choice or matching items.

The difficulty level of an item is known as the index of difficulty: the percentage of students answering each item correctly. The index of discrimination compares the proportion of high-scoring students who answered an item correctly with the proportion of low-scoring students who did so. This numeric index indicates how effectively an item differentiates between the students who did well and those who did poorly on the test.

Preparing Data for Item Analysis
1. Arrange the test scores from highest to lowest.
2. Take one-third of the papers from the highest scores and another third from the lowest scores.
3. Record separately the number of times each alternative was chosen by the students in each group.
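The preparation steps can be sketched in code. This is a minimal illustration; the student IDs and score values below are made up.

```python
# Sketch of the data-preparation step: rank papers by score and split off
# the top and bottom thirds. The scores are hypothetical.
scores = {"s1": 35, "s2": 28, "s3": 22, "s4": 31, "s5": 18,
          "s6": 25, "s7": 39, "s8": 12, "s9": 27}

ranked = sorted(scores, key=scores.get, reverse=True)  # highest score first
third = len(ranked) // 3

upper_group = ranked[:third]    # highest-scoring third of the papers
lower_group = ranked[-third:]   # lowest-scoring third of the papers

print(upper_group)  # → ['s7', 's1', 's4']
print(lower_group)  # → ['s3', 's5', 's8']
```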

Compute the index of difficulty for each item:
1. Add the number of correct answers to the item made by the combined upper and lower groups.
2. Apply the formula:

IDF = (NRC / TS) x 100

where
IDF = index of difficulty
NRC = number of students responding correctly to the item
TS = total number of students in the upper and lower groups
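As a sketch, the IDF formula translates directly into a small helper; the function name is mine, and the sample counts are item 1 of the worked example later in this deck (14 correct in the upper group, 7 in the lower, 20 students per group).

```python
def difficulty_index(correct_upper, correct_lower, group_size):
    """IDF = (NRC / TS) * 100, with TS the combined size of both groups."""
    nrc = correct_upper + correct_lower   # NRC: correct answers in both groups
    ts = 2 * group_size                   # TS: students in upper + lower groups
    return (nrc / ts) * 100

print(difficulty_index(14, 7, 20))  # → 52.5
```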

Compute the index of discrimination, based on the formula:

IDN = (CU - CL) / NSG

where
IDN = index of discrimination
CU = number of correct responses in the upper group
CL = number of correct responses in the lower group
NSG = number of students per group
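The same can be done for IDN; again the function name is mine and the counts come from item 1 of the worked example (CU = 14, CL = 7, 20 students per group).

```python
def discrimination_index(correct_upper, correct_lower, group_size):
    """IDN = (CU - CL) / NSG."""
    return (correct_upper - correct_lower) / group_size

print(discrimination_index(14, 7, 20))  # → 0.35
```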

Using Information about the Index of Difficulty The difficulty index of a test item tells a teacher how well the class comprehended, or performed, the material or task contained in the item.

Worked example (20 students per group; * marks the correct option):

Item  Group   A    B    C    D   Total No. Correct  Difficulty Index  H - L  Discrimination Index
1     H       3   14*   2    1
      L      10    7*   3    0          21               52.5            7          0.35
2     H       0    0   18*   2
      L       0    3    9*   8          27               67.5            9          0.45
3     H       3    8*   4    4
      L      10    2*   4    4          10               25.0            6          0.30
4     H       3    3    4   10*
      L       2    4   10    4*         14               35.0           --           --
5     H      15*   2    2    1
      L       1*  10    4    5          16               40.0           14          0.70

For an item to be considered good, its difficulty index should be about 50%. An item with a 50% difficulty index is neither easy nor difficult. An item with a difficulty index of 67.5% was answered correctly by 67.5% of the students, so it leans toward the easy side. Information on the index of difficulty can help a teacher decide whether an item should be retained, revised, or rejected.

Interpretation of the Difficulty Index

Range        Difficulty Level
20 & below   Very Difficult
21 - 40      Difficult
41 - 60      Average
61 - 80      Easy
81 & above   Very Easy
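The interpretation table maps onto a small helper; the function name is mine, and the thresholds follow the ranges above.

```python
def difficulty_level(idf):
    """Label a difficulty index (a percentage) per the interpretation table."""
    if idf <= 20:
        return "Very Difficult"
    if idf <= 40:
        return "Difficult"
    if idf <= 60:
        return "Average"
    if idf <= 80:
        return "Easy"
    return "Very Easy"

print(difficulty_level(52.5))  # → Average
print(difficulty_level(67.5))  # → Easy
```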

Using Information about the Index of Discrimination The index of discrimination tells a teacher the degree to which a test item differentiates the high achievers from the low achievers in the class. A test item may have positive or negative discriminating power. An item has positive discriminating power when more students from the upper group than from the lower group got the right answer. When more students from the lower group than from the upper group got the correct answer, the item has negative discriminating power.

There are also instances when an item has zero discriminating power: when an equal number of students from the upper and lower groups got the right answer. In the worked example, item 5 has the highest discriminating power (0.70), which means it differentiates best between high and low achievers.

Interpretation of the Index of Discrimination

Range        Verbal Description
.40 & above  Very Good Item
.30 - .39    Good Item
.20 - .29    Fair Item
.09 - .19    Poor Item
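This table, too, can be sketched as a helper. The function name is mine, and one assumption is labeled in the docstring: values below the table's lowest range (under .09, including zero and negative indices) are mapped to "Poor Item", since the table does not cover them.

```python
def discrimination_quality(idn):
    """Label a discrimination index per the interpretation table.

    Assumption: indices below .09 (including zero and negatives) fall
    outside the table's ranges and are treated here as poor items.
    """
    if idn >= 0.40:
        return "Very Good Item"
    if idn >= 0.30:
        return "Good Item"
    if idn >= 0.20:
        return "Fair Item"
    return "Poor Item"

print(discrimination_quality(0.70))  # → Very Good Item
print(discrimination_quality(0.35))  # → Good Item
```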

When should a test item be rejected? Retained? Modified or revised? A test item can be retained when its difficulty level is average and its discriminating power is positive. It has to be rejected when it is either easy/very easy or difficult/very difficult and its discriminating power is negative or zero. An item can be modified when its difficulty level is average but its discrimination index is negative.
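The retain/reject/modify rules can be sketched as follows. The slide leaves some combinations undefined (for example, an extreme difficulty with positive discrimination); the fallback "review" label for those cases is my addition, not part of the slide's rules.

```python
def item_decision(difficulty, discrimination):
    """Apply the slide's retain/reject/modify rules.

    difficulty: index of difficulty in percent; 41-60 counts as average.
    discrimination: index of discrimination.
    """
    average = 41 <= difficulty <= 60
    if average and discrimination > 0:
        return "retain"
    if not average and discrimination <= 0:
        return "reject"
    if average and discrimination < 0:
        return "modify"
    return "review"  # combinations the slide leaves undefined

print(item_decision(52.5, 0.35))   # → retain
print(item_decision(85.0, -0.10))  # → reject
```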

Examining Distractor Effectiveness An ideal item is one that all students in the upper group answer correctly and all students in the lower group answer incorrectly, with the lower group's responses evenly distributed among the incorrect alternatives.
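A distractor tally like the per-option counts in the worked example can be produced with a counter. The response letters and the answer key below are hypothetical illustration data.

```python
from collections import Counter

# Hypothetical responses of the lower group on one item; the key is "B".
key = "B"
lower_responses = ["A", "C", "B", "D", "A", "C", "D", "B", "A", "C"]

tally = Counter(lower_responses)
distractors = {opt: n for opt, n in tally.items() if opt != key}

# A well-functioning item spreads the lower group's wrong answers
# fairly evenly across the distractors.
print(distractors)  # → {'A': 3, 'C': 3, 'D': 2}
```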

Developing an Item Data File Maintaining a file of analyzed items serves several purposes:
- Encouraging teachers to undertake an item analysis as often as practical
- Allowing accumulated data to be used to make item analysis more reliable
- Providing for a wider choice of item formats and objectives
- Facilitating the revision of items
- Facilitating the physical construction and reproduction of the test
- Accumulating a pool of items large enough that some items can be shared with students for study purposes

Limitations of Item Analysis It cannot be used for essay items. Teachers must also be cautious about damage to the table of specifications when items that fail the criteria are deleted from the test; such items should be rewritten or replaced rather than simply dropped.