Dept. of Community Medicine, PDU Government Medical College,

Similar presentations
An Introduction to Test Construction
Performance Assessment
Item Analysis.
How to Make a Test & Judge its Quality. Aim of the Talk Acquaint teachers with the characteristics of a good and objective test See Item Analysis techniques.
Issues of Reliability, Validity and Item Analysis in Classroom Assessment by Professor Stafford A. Griffith Jamaica Teachers Association Education Conference.
Using Test Item Analysis to Improve Students’ Assessment
Item Analysis: A Crash Course Lou Ann Cooper, PhD Master Educator Fellowship Program January 10, 2008.
Advanced Topics in Standard Setting. Methodology Implementation Validity of standard setting.
Stephen C. Court Educational Research and Evaluation, LLC A Presentation at the First International Conference on Instructional Sensitivity Achievement.
1 New England Common Assessment Program (NECAP) Setting Performance Standards.
Antonio Anne Margarette Madalang Kyle Marron Ritual Krizza
Some Practical Steps to Test Construction
Test Construction Processes 1- Determining the function and the form 2- Planning( Content: table of specification) 3- Preparing( Knowledge and experience)
Jack Holbrook Inquiry-based Teaching/Learning (IBSE)
Item Analysis What makes a question good??? Answer options?
Item Analysis Ursula Waln, Director of Student Learning Assessment
Lesson Seven Item Analysis. Contents Item Analysis Item Analysis Item difficulty (item facility) Item difficulty (item facility) Item difficulty Item.
PLT Review Session Dr. Brian E. Harper.
Item Analysis Prof. Trevor Gibbs. Item Analysis After you have set your assessment: How can you be sure that the test items are appropriate?—Not too easy.
Lesson Nine Item Analysis.
Multiple Choice Test Item Analysis Facilitator: Sophia Scott.
Objective Examination Dr. Niraj Pandit MD Department of Community Medicine SBKS MIRC.
Principles of High Quality Assessment
ANALYZING AND USING TEST ITEM DATA
Understanding Validity for Teachers
Essay Assessment Tasks
GUIDELINES FOR SETTING A GOOD QUESTION PAPER
Business and Management Research
Chap. 3 Designing Classroom Language Tests
Office of Institutional Research, Planning and Assessment January 24, 2011 UNDERSTANDING THE DIAGNOSTIC GUIDE.
Induction to assessing student learning Mr. Howard Sou Session 2 August 2014 Federation for Self-financing Tertiary Education 1.
River Valley Primary School – Strive for the Best WELCOME Presentation slides can be downloaded from:
The Genetics Concept Assessment: a new concept inventory for genetics Michelle K. Smith, William B. Wood, and Jennifer K. Knight Science Education Initiative.
1 New England Common Assessment Program (NECAP) Setting Performance Standards.
Chapter 7 Item Analysis In constructing a new test (or shortening or lengthening an existing one), the final set of items is usually identified through.
 Closing the loop: Providing test developers with performance level descriptors so standard setters can do their job Amanda A. Wolkowitz Alpine Testing.
Techniques to improve test items and instruction
The Analysis of the quality of learning achievement of the students enrolled in Introduction to Programming with Visual Basic 2010 Present By Thitima Chuangchai.
Session 2 Traditional Assessments Session 2 Traditional Assessments.
Group 2: 1. Miss. Duong Sochivy 2. Miss. Im Samphy 3. Miss. Lay Sreyleap 4. Miss. Seng Puthy 1 ROYAL UNIVERSITY OF PHNOM PENH INSTITUTE OF FOREIGN LANGUAGES.
Lab 5: Item Analyses. Quick Notes Load the files for Lab 5 from course website –
Classroom Evaluation & Grading Chapter 15. Intelligence and Achievement Intelligence and achievement are not the same Intelligence and achievement are.
Using Assessments and Data to Improve Student Learning Day 3 1.
Grading and Analysis Report For Clinical Portfolio 1.
1 Math 413 Mathematics Tasks for Cognitive Instruction October 2008.
Assessment and Testing
PLC Team Leader Meeting
INSTRUCTIONAL OBEJECTIVES PURPOSE OF IO IO DOMAINS HOW TO WRITE SMART OBJECTIVE 1.
Chapter 6 - Standardized Measurement and Assessment
Review: Stages in Research Process Formulate Problem Determine Research Design Determine Data Collection Method Design Data Collection Forms Design Sample.
Dan Thompson Oklahoma State University Center for Health Science Evaluating Assessments: Utilizing ExamSoft’s item-analysis to better understand student.
Psychometrics: Exam Analysis David Hope
Chapter 6 Assessing Science Learning Updated Spring 2012 – D. Fulton.
Assessment and the Institutional Environment Context Institutiona l Mission vision and values Intended learning and Educational Experiences Impact Educational.
Objective Examination: Multiple Choice Questions Dr. Madhulika Mistry.
Copyright © Springer Publishing Company, LLC. All Rights Reserved. DEVELOPING AND USING TESTS – Chapter 11 –
ARDHIAN SUSENO CHOIRUL RISA PRADANA P.
Concept of Test Validity
Data Analysis and Standard Setting
Chapter 14 Assembling, Administering, and Appraising classroom tests and assessments.
Copyright © ODL Jan 2005 Open University Malaysia
Prepared by: Toni Joy Thurs Atayoc, RMT
Greg Miller Iowa State University
Teaching Listening Based on Active Learning.
Understanding Your PSAT/NMSQT Results
TOPIC 4 STAGES OF TEST CONSTRUCTION
Dept. of Community Medicine, PDU Government Medical College,
Summative Assessment Grade 6 April 2018 Develop Revise Pilot Analyze
Providing feedback to learners
Tests are given for 4 primary reasons.
Presentation transcript:

ITEM ANALYSIS
Dr. Umed V Patel, Associate Professor, Dept. of Community Medicine, PDU Government Medical College, Rajkot

OBJECTIVES
At the end of this session the learner should be able to:
Define item analysis and understand its purpose.
Explain the procedure for calculating the Difficulty Index, Discrimination Index and Distracter Effectiveness.
Review MCQs, make judgments on the basis of item analysis, and suggest improvements for defective items.
List the uses of item analysis.

What do you like? Movies / Cricket / Statistics / All of the above

What is the common theme in all four photographs? What is your analysis?

MCQs (Items)
Objective evaluation is becoming more important and popular in education. With the greater use of MCQs in various kinds of examinations, a large number of MCQs (a question bank) with different levels of difficulty is required. In addition, MCQs are used to assess class performance as part of formative evaluation.
Formative evaluation: determines how much and how well students have learned, and serves as feedback to students and teachers.

VALIDATION
Validation is an important step in the formulation of MCQs. It is required at TWO stages.
Pre-validation: the process to which a constructed item is subjected before it appears in an examination.
Post-validation: the process of analyzing the item after it has appeared in a test (item analysis).

DEFINITION
Item analysis is the process of analyzing the performance of an MCQ after it has appeared in a question paper. A statistical procedure is required to analyze a question (item).

Purpose of Item Analysis
To check whether the item is of an appropriate level of difficulty.
To check whether the item is capable of discriminating between the more knowledgeable and less knowledgeable students.
To gauge the "functionality" of the alternatives (options) relative to the correct response.
To create a question bank with known difficulty and discrimination capacity.

Item Analysis Provides:
Difficulty Index
Discrimination Index
Distracter Effectiveness

Item Analysis Provides:
1. Difficulty Index (P): the percentage of students who selected the correct response.
2. Discrimination Index (item effectiveness, d): indicates how well the question separates the students who know the material well from those who don't.
3. Distracter Effectiveness: the effectiveness of the options (alternatives) given.

PROCEDURE
The essential steps of item analysis are:
1. Score the whole test for all the students.
2. Rank the students in order of merit based on their test scores.
3. Take the top third as high achievers and the bottom third as low achievers.
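A rough Python sketch of steps 2 and 3, not part of the original slides; the scores below are hypothetical and only illustrate the ranking and the split into thirds.

# Hypothetical scores: roll number -> total marks on the test
scores = {1: 16, 2: 13, 3: 21, 4: 12, 5: 11, 6: 15, 7: 24, 8: 19, 9: 10}

# Step 2: rank the students in order of merit (highest score first)
ranked = sorted(scores, key=scores.get, reverse=True)

# Step 3: top third = high achievers, bottom third = low achievers
third = len(ranked) // 3
high_achievers = ranked[:third]
low_achievers = ranked[-third:]
print("High achievers:", high_achievers)   # [7, 3, 8]
print("Low achievers:", low_achievers)     # [4, 5, 9]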

4. Prepare a table for each item, as in the following example. Item No. 1, key (correct response): "C".

Options         High achievers (H)   Low achievers (L)   Total response N (%)
A               5                    10                  15 (15%)
B
C (key)         30                   10                  40 (40%)
D                                                        20 (20%)
E               Nil                  2                   2 (2%)
Not responded   8                    --
Total (N)       50                   50                  100

5. Calculate the various indices.
1. Difficulty Index
Indicated by the symbol 'P' and calculated by the formula:
P = (H + L) / N x 100
where
H = number of students answering the item correctly in the high-achieving group,
L = number of students answering the item correctly in the low-achieving group,
N = total number of students in the two groups, including non-responders.
Thus, for the given example, P = (30 + 10) / 100 x 100 = 40%.
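The formula translates directly into a few lines of Python; this is an illustrative sketch (the function name difficulty_index is mine, not from the slides).

def difficulty_index(h, l, n):
    """Difficulty index P (%): percentage of students answering the item correctly.
    h, l = correct responses in the high and low groups; n = total number of
    students in the two groups, including non-responders."""
    return (h + l) / n * 100

# Worked example from the slide: H = 30, L = 10, N = 100
print(difficulty_index(30, 10, 100))   # 40.0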

Interpretation of Difficulty Index (P)
P value           Interpretation
50-60%            Good
30-70%            Acceptable
<30% or >70%      Requires modification (too difficult or too easy)

2. Discrimination Index
Indicated by the symbol 'd' and calculated by the formula:
d = (H - L) x 2 / N
where
H = number of students answering the item correctly in the high-achieving group,
L = number of students answering the item correctly in the low-achieving group,
N = total number of students in the two groups, including non-responders.
Thus, for the given example, d = (30 - 10) x 2 / 100 = 0.4.
Put simply, d = 30/50 - 10/50 = 60/100 - 20/100 = 0.6 - 0.2 = 0.4.
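A matching sketch for the discrimination index (again, the function name is mine, not from the slides).

def discrimination_index(h, l, n):
    """Discrimination index d = (H - L) x 2 / N, i.e. the difference between the
    proportions of the high and low groups answering the item correctly."""
    return (h - l) * 2 / n

# Worked example from the slide: H = 30, L = 10, N = 100
print(discrimination_index(30, 10, 100))   # 0.4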

Interpretation of Discrimination Index (d)
d value           Interpretation
≥0.35             Excellent discrimination
0.25 to 0.34      Good
0.20 to 0.24      Acceptable
<0.20             Requires modification
A negative discrimination index almost always indicates a wrong item/key.

3. Distracter Effectiveness
Any distracter in the item which has not attracted even 5% of the total responses is said to be a "non-functional" distracter. In the given example, option 'E' is non-functional, having attracted only 2% of students, so alternative E should be changed.
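One possible way to flag non-functional distracters in code; the 5% rule is from the slide, but the counts below are only illustrative (the count for option B is hypothetical, since the example table leaves it blank).

def nonfunctional_distracters(option_counts, key, threshold_pct=5.0):
    """Return the distracters chosen by fewer than threshold_pct % of all responses.
    option_counts maps each option to its total responses; key is the correct option."""
    total = sum(option_counts.values())
    return [opt for opt, count in option_counts.items()
            if opt != key and (count / total) * 100 < threshold_pct]

# Illustrative counts loosely based on the example table (B = 23 is hypothetical)
counts = {"A": 15, "B": 23, "C": 40, "D": 20, "E": 2}
print(nonfunctional_distracters(counts, key="C"))   # ['E']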

Item Analysis – An Example

1. Award of a score to each student

Roll  Marks   Roll  Marks   Roll  Marks   Roll  Marks
1     16      13    15      25    14      37    16
2     13      14    8       26    22      38    7
3     21      15    9       27    8       39    18
4     12      16    20      28    15      40    13
5     11      17    9       29    8       41    19
6     15      18    17      30    23      42    11
7     24      19    9       31    12      43    9
8     19      20    18      32    10      44    12
9     10      21    16      33    12      45    17
10    14      22    11      34    21      46    13
11    12      23    20      35    14      47    16
12    18      24    10      36    15      48    20

2. Ranking in order of merit

Marks   Roll No.
25      -
24      7
23      30
22      26
21      3, 34
20      16, 23, 48
19      8, 41
18      12, 20, 39
17      18, 45
16      1, 21, 37, 47
15      6, 13, 28, 36
14      10, 25, 35
13      2, 40, 46
12      4, 11, 31, 33, 44
11      5, 22, 42
10      9, 24, 32
9       15, 17, 19, 43
8       14, 27, 29
7       38

3. Divide the students into 3 groups on the basis of performance
Question 1, 20th June 2010. Total students: 48
Higher group: 24, 23, 22, 21, 21, 20, 20, 20, 19, 19, 18, 18, 18, 17, 17, 16
Middle group: 16, 16, 16, 15, 15, 15, 15, 14, 14, 14, 13, 13, 13, 12, 12, 12
Lower group: 12, 12, 11, 11, 11, 10, 10, 10, 9, 9, 9, 9, 8, 8, 8, 7
DISCARD THE MIDDLE GROUP.

Q.1 Frequency Table
Alternatives     Responses of Higher Group   Responses of Lower Group
a                2                           3
b                5                           8
C (key)          9 = H                       3 = L
d                -                           1
Not responded
Total            16
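A frequency table like the one above can be tallied programmatically; the sketch below is illustrative only, and the roll numbers and answers in it are hypothetical rather than the actual Q.1 responses.

from collections import Counter

# Hypothetical raw data: each student's chosen option for Q.1 ("NR" = not responded)
answers = {7: "c", 30: "b", 26: "c", 3: "a", 34: "c",   # a few high-group students
           38: "d", 14: "b", 27: "b", 29: "NR"}         # a few low-group students
high_group = [7, 30, 26, 3, 34]
low_group = [38, 14, 27, 29]

high_counts = Counter(answers[r] for r in high_group)
low_counts = Counter(answers[r] for r in low_group)
print("Higher group:", dict(high_counts))   # {'c': 3, 'b': 1, 'a': 1}
print("Lower group:", dict(low_counts))     # {'d': 1, 'b': 2, 'NR': 1}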

Calculation of Indices, Q.1: H = 9, L = 3, N = 32

Difficulty Index (P), expressed as a percentage:
P = (H + L) / N x 100, so P = (9 + 3) / 32 x 100 = 37.5%

Discrimination Index (d), expressed as a decimal fraction:
d = (H - L) x 2 / N, so d = (9 - 3) x 2 / 32 = 0.375
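A quick arithmetic check of these Q.1 values; a sketch, not part of the original slides.

# Q.1 of the worked example: H = 9, L = 3, N = 32 (16 students per group, non-responders included)
h, l, n = 9, 3, 32
p = (h + l) / n * 100   # difficulty index, %
d = (h - l) * 2 / n     # discrimination index
print(f"P = {p:.1f}%, d = {d:.3f}")   # P = 37.5%, d = 0.375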

Distracter Effectiveness (functionality of the options)
Any distracter (alternative) in the item which has not attracted even 5% of the total responses is said to be a non-functional distracter.

Alternatives     Responses of Higher Group   Responses of Lower Group   Total responses
a                2                           3                          5 (16.1%)
b                5                           8                          13 (41.9%)
C (key)          9 = H                       3 = L                      12 (38.7%)
d                -                           1                          1 (3.2%)
Not responded                                                           ---
Total            16                                                     31 (100%)

Alternative d has attracted less than 5% of students, so it is non-functional and requires modification.

Question Bank Card
Subject:
Objective tested:
Question:
Alternatives: (A) (B) (C) (D)
Correct answer:
Stem | Year | P value | d value | Item type & action

Question Bank Card
Subject: Autonomic Nervous System - glaucoma
Objective tested: Able to name drug(s) contraindicated in narrow angle glaucoma
Question No. 1: Which of the following drugs is contraindicated in narrow angle glaucoma?
Alternatives: (A) Phenylephrine (B) Timolol (C) Homatropine (D) Latanoprost
Correct answer: (C)
Stem: IV | Year: 2010 | P value: 38.7% (Acceptable) | d value: 0.39 (Excellent) | Item type & action: Retain

Uses of Item Analysis
To determine whether items function as intended.
To improve item quality.
To provide data/information that helps students improve their learning in classroom teaching.
To create an item bank with different levels of difficulty.

Thank You. But now, it's your turn to bat.

ITEM ANALYSIS – GROUP TASK
Three hundred students were administered an MCQ test containing 200 items. For the following items, frequency distribution tables showing the number of students in the upper and lower groups (100 each) who selected each alternative have been prepared; the underlined number is the count for the correct answer. Calculate the Difficulty Index, Discrimination Index and Distracter Effectiveness of the item allotted to your group.

Group     Q No    Top group: A, B, C, D, NR / Bottom group: A, B, C, D, NR
Group 1   1       3 19 5 68 11 12 73
Group 2   2       51 23 25 14 42 8
Group 3           93 4 33 17 13 32
Group 4           60 10 24 35 29 6

ITEM ANALYSIS – GROUP TASK
Calculate the Difficulty Index, Discrimination Index and Distracter Effectiveness of the item allotted to you.
P = (H + L) / N x 100 (30-70% acceptable; <30% or >70% requires modification)
d = (H - L) x 2 / N (>0.35 excellent; <0.20 requires modification)

Group     Q No    Top group: A, B, C, D, NR / Bottom group: A, B, C, D, NR
Group 1   1       3 19 5 68 11 12 73
Group 2   2       51 21 23 25 14 42 8
Group 3           93 4 33 17 13 32
Group 4           60 10 24 35 6 16
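Each group only needs its H, L and N values to apply these two formulas; a small helper along these lines could do the arithmetic. The example numbers below are placeholders, not values taken from the table above.

def item_indices(h, l, n):
    """Return (difficulty index in %, discrimination index) for one item.
    h, l = correct responses in the top and bottom groups; n = total students."""
    p = (h + l) / n * 100
    d = (h - l) * 2 / n
    return p, d

# Placeholder counts -- substitute the values for your allotted item.
p, d = item_indices(h=70, l=40, n=200)
print(f"P = {p:.1f}% (30-70% acceptable), d = {d:.2f} (>=0.35 excellent, <0.20 modify)")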

Thank You Again