Chris Orem Jerusha Gerstner Christine DeMars  Presentation and Discussion  Item writing guidelines  Examples  Develop, Evaluate, and Revise Items.

Similar presentations
SYSTEMIC ASSESSMENT Workshop Faculty of Medicine Zagazig University PART-I Prof.A.F.M.FAHMY January 2010.

Alternate Choice Test Items
Item Analysis.
Test Taking Strategies
FACULTY DEVELOPMENT PROFESSIONAL SERIES OFFICE OF MEDICAL EDUCATION TULANE UNIVERSITY SCHOOL OF MEDICINE Using Statistics to Evaluate Multiple Choice.
Using Test Item Analysis to Improve Students’ Assessment
Dr. Shama Mashhood, Medical Educationist & Coordinator, Question Review Committee.
Designing the Test and Test Questions Jason Peake.
Developing Tests and Test Questions/Items
Using Multiple Choice Tests for Assessment Purposes: Designing Multiple Choice Tests to Reflect and Foster Learning Outcomes Terri Flateby, Ph.D.
Selected Response Tests
Constructing Exam Questions Dan Thompson & Brandy Close OSU-CHS Educational Development-Clinical Education.
Matching Items. Presenter: Pema Khandu, B.Ed.II(S) Sci ‘B’
Social Science Faculty Meeting January 2010 Mastering the Art of Test Writing Roundtable Discussion.
Item Analysis What makes a question good??? Answer options?
Classroom Assessment FOUN 3100 Fall Assessment is an integral part of teaching.
Objective Examination Dr. Niraj Pandit MD Department of Community Medicine SBKS MIRC.
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler Chapter 9 Subjective Test Items.
Classroom Assessment A Practical Guide for Educators by Craig A
Education 325: Assessment for Classroom Teaching G. Galy, PhD Week 5.
TEXAS TECH UNIVERSITY HEALTH SCIENCES CENTER SCHOOL OF PHARMACY KRYSTAL K. HAASE, PHARM.D., FCCP, BCPS ASSOCIATE PROFESSOR BEYOND MULTIPLE CHOICE QUESTIONS.
Module 6 Test Construction &Evaluation. Lesson’s focus Stages in Test Construction Tasks in Test Test Evaluation.
Test and Types of Tests.
Psychological Tests Ch 15 notes.
Ginny Price CETL TEST DEVELOPMENT. WRITING MULTIPLE CHOICE ITEMS Write only a few items at a time Immediately after preparing class lesson or after class.
Oscar Vergara Chihlee Institute of Technology July 28, 2014.
©2003 Pearson Education, Inc., publishing as Longman Publishers. Study Skills Topic 13 Preparing & Taking Exams PowerPoint by JoAnn Yaworski.
Designing and evaluating good multiple choice items Jack B. Monpas-Huber, Ph.D. Director of Assessment & Student Information.
Tips for Top Tests FOSL fall September 10, 2007 Adapted from “Tools for Teaching” by Barbara Gross Davis.
CONSTRUCTING OBJECTIVE TEST ITEMS: MULTIPLE-CHOICE FORMS. CHAPTER 8. AMY L. BLACKWELL. JUNE 19, 2007.
Objectives To know basic concepts and rationale of MCQ To know different types of MCQ To illustrate anatomy of each type To discuss guidelines construction.
Dr. Majed Wadi MBChB, MSc Med Edu. Objectives To discuss the concept of vetting process To describe the findings of literature review regarding this process.
T 7.0 Chapter 7: Questioning for Inquiry Chapter 7: Questioning for Inquiry Central concepts:  Questioning stimulates and guides inquiry  Teachers use.
Multiple Choice Question Design Karen Brooks & Barbara Tischler Hastie.
Prepare and Use Knowledge Assessments. IntroductionIntroduction Why do we give knowledge tests? What problems did you have with tests as a student? As.
Session 2: Traditional Assessments.
1 Writing Test Blueprints and Test Items “Software for Creating and Delivering Assessments With Powerful Reports”
Classroom Evaluation & Grading Chapter 15. Intelligence and Achievement Intelligence and achievement are not the same Intelligence and achievement are.
ASSESSING STUDENT ACHIEVEMENT Using Multiple Measures Prepared by Dean Gilbert, Science Consultant Los Angeles County Office of Education.
Assessment Item Writing Workshop Ken Robbins FDN-5560 Classroom Assessment Click HERE to return to the Documentation HERE.
Lectures ASSESSING LANGUAGE SKILLS Receptive Skills Productive Skills Criteria for selecting language sub skills Different Test Types & Test Requirements.
Classroom Assessment (1) EDU 330: Educational Psychology Daniel Moos.
Writing Multiple Choice Questions. Types Norm-referenced –Students are ranked according to the ability being measured by the test with the average passing.
Assessment and Testing
Writing Objective Test Questions Xiufeng Liu, PhD Director, Office of Educational Innovation & Assessment Professor, Department of Learning & Instruction.
Assessment Information from multiple sources that describes a student’s level of achievement Used to make educational decisions about students Gives feedback.
March 11, 2013 Chicago, IL American Board of Preventive Medicine American Board of Preventive Medicine Clinical Informatics Examination Committee Measurement.
Module 7 1. What do we know about selected- response items? Well constructed selected- response items can target: factual knowledge comprehension analysis.
Materials produced under Phare 2006 financial support Phare TVET RO 2006/ EUROPEAN UNION Project financed under Phare MoERI/ NCDTVET-PIU.
Test Question Writing Instructor Development ANSF Nurse Training Program.
Language Testing How to make multiple choice test.
Dan Thompson Oklahoma State University Center for Health Science Evaluating Assessments: Utilizing ExamSoft’s item-analysis to better understand student.
Using Multiple Measures ASSESSING STUDENT ACHIEVEMENT.
EVALUATION SUFFICIENCY: Types of Test Items (Part I)
Multiple-Choice Item Design February, 2014 Dr. Mike Atkinson, Teaching Support Centre, Assessment Series.
©2013, The McGraw-Hill Companies, Inc. All Rights Reserved Chapter 6 Construction of Knowledge Tests.
NOTE: To change the image on this slide, select the picture and delete it. Then click the Pictures icon in the placeholder to insert your own image. SELECTED.
Assessment and the Institutional Environment Context Institutiona l Mission vision and values Intended learning and Educational Experiences Impact Educational.
Objective Examination: Multiple Choice Questions Dr. Madhulika Mistry.
 Good for:  Knowledge level content  Evaluating student understanding of popular misconceptions  Concepts with two logical responses.
Reviewing, Revising and Writing Mathematics Multiple- Choice Items 1 Writing and scoring test questions based on Ohio’s Academic Content Standards Reviewing,
Assessment in Education ~ What teachers need to know.
Copyright © Springer Publishing Company, LLC. All Rights Reserved. DEVELOPING AND USING TESTS – Chapter 11 –
Writing Selection Items
COMMON TEST TECHNIQUES FROM TESTING FOR LANGUAGE TEACHERs.
EDU 385 Session 8 Writing Selection items
Constructing Exam Questions
Multiple Choice Item (MCI) Quick Reference Guide
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler Chapter 8 Objective Test Items.
Presentation transcript:

Chris Orem, Jerusha Gerstner, Christine DeMars

 Presentation and Discussion  Item writing guidelines  Examples  Develop, Evaluate, and Revise Items

Test blueprint

 A test blueprint is a table of specifications that weights each objective according to how important it is or how much time is spent covering it, links objectives to test items, and summarizes this information
 Essential for competency testing

Blueprint tips:
- List objectives in a table
- Identify the length of the test
- Designate the number of items per objective
- Evaluate the importance of each objective and assign items accordingly
- Often, assign one point per item

Objective     Weighting   Items
Objective 1   25%         10
Objective 2   25%         10
Objective 3   25%         10
Objective 4   25%         10
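The blueprint arithmetic can be sketched in a few lines of Python. This is an illustrative sketch only: the objective names, weights, the 40-item test length, and the `allocate_items` helper are assumptions for the example, not part of the slides.

```python
# Hypothetical sketch: allocate items to objectives in proportion to their
# blueprint weights. Names, weights, and test length are illustrative.
def allocate_items(weights, test_length):
    """Return the number of items per objective: weight * test length, rounded."""
    return {obj: round(w * test_length) for obj, w in weights.items()}

blueprint = {"Objective 1": 0.25, "Objective 2": 0.25,
             "Objective 3": 0.25, "Objective 4": 0.25}

print(allocate_items(blueprint, 40))
# With four equally weighted objectives on a 40-item test, each objective gets 10 items.
```

With unequal weights the same call distributes items proportionally, which is the point of weighting objectives in the blueprint.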

Writing Items

 Some items are more appropriate when testing different kinds of knowledge or when tapping into different kinds of cognitive processes.
 We’ll focus on multiple-choice items because recruiting faculty to score open-ended items would be difficult.

Type of Item      Construction   Scoring
True/False        Difficult      Easy
Matching          Easy           Easy
Completion        Easy           Difficult
Multiple Choice   Difficult      Easy
Essay             Easy           Difficult

 Content  Style/format  Writing the stem  Writing the distracters

Content

 Focus on a single problem when writing an item
 Use new situations to assess application
 Avoids memorization exercises
 Allows for synthesis and evaluation
 Keep the content of items independent
 Students shouldn’t be able to use one item to answer another, although a set of items may tap into a shared scenario
 Avoid opinion-based items
 Address a mix of higher-order and lower-order thinking skills

[Figure: Bloom’s taxonomy levels (Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation) shown alongside Haladyna’s categories (Defining, Recalling, Predicting, Evaluating, Problem Solving), arranged from lower-order to higher-order thinking skills.]

 Try item stems such as "If..., then what happens?", "What is the consequence of...?", or "What would happen if...?" (predicting)  Ask students to make a decision based on predetermined criteria, or to choose criteria to use in making a decision, or both (evaluating).  Require the student to use combinations of recalling, summarizing, predicting, and evaluating to solve problems.

Style/format

 Avoid excess words: be succinct
 Use specific, appropriate vocabulary
 Avoid bias (age, ethnicity, gender, disability)
 Write stems and options in the third person
 Underline or bold negative or other important words
 Have others review your items

Writing the stem

 The stem should clearly state the problem
 Place the main idea of the question in the stem, not in the options
 Keep the stem as short as possible
 Don’t provide clues to the correct answer in the stem (e.g., grammatical clues)
 If the stem is a complete sentence, end it with a period and begin all response options with upper-case letters
 If the stem is an incomplete sentence, begin all response options with lower-case letters
 Use negative stems rarely

Writing the distracters

 Make sure there is only one correct answer for each item
 Develop as many effective, plausible options as possible, but three are sufficient (Rodriguez, 2005)
 It is better to have fewer options than to write bad options to meet some quota!
 Vary the location of the correct answer when feasible (flip a coin), or put options in a logical order (e.g., chronological, numerical)
 Avoid excessive use of negatives or double negatives
 Keep options independent
 Keep options similar in format, length, and wording

Writing the distracters (continued)

 DO use as distracters:
 common student misconceptions (perhaps drawn from open-ended responses on previous work)
 words that “ring a bell” or “sound official”
 responses that fool the student who has not mastered the objective
 DO NOT use as distracters:
 responses that are just as correct as the keyed answer
 implausible or silly distracters
 Use “all of the above” and “none of the above” sparingly
 Don’t use “always” or “never”
 Don’t give clues to the right answer

What’s wrong with this item?

The best way to increase the reliability of a test is to: A. increase the test length B. removing poor quality items C. Tests should be readable for all test takers.

Options are not grammatically parallel with the stem

What’s wrong with this item? Stem should state the problem

California: A). Contains the tallest mountain in the United States. B). Has an eagle on its state flag. C). Is the second largest state in terms of area. *D). Was the location of the Gold Rush of 1849.

Better: What is the main reason so many people moved to California in 1849? A). California land was fertile, plentiful, and inexpensive. *B). Gold was discovered in central California. C). The east was preparing for a civil war. D). They wanted to establish religious settlements.

Bleeding of the gums is associated with gingivitis, which can be cured by the sufferer himself by brushing his teeth daily. A. true B. false

What’s wrong with this item? More than one possible answer

The United States should adopt a foreign policy based on: A). A strong army and control of the North American continent. B). Achieving the best interest of all nations. C). Isolation from international affairs. *D). Naval supremacy and undisputed control of the world’s sea lanes.

Better: According to Alfred T. Mahan, the United States should adopt a foreign policy based on: A). A strong army and control of the North American continent. B). Achieving the best interest of all nations. C). Isolation from international affairs. *D). Naval supremacy and undisputed control of the world’s sea lanes.

Following these rules does not guarantee that items will perform well empirically. Testing companies, using paid item-writers and detailed writing guidelines, ultimately use only about one-third (or fewer) of their items operationally. It is normal and expected to have to revise items after pilot testing.

ITEM ANALYSIS STEPS:

Item difficulty
- proportion of examinees who answered the item correctly
- an item should not be too easy or too difficult
- problems arise from poor wording, trick questions, or speededness

Item discrimination
- correlation between the item score and the total test score
- the higher the better
- can be (but shouldn’t be) negative

Distractor analysis
- frequency with which each response option is selected
- an item is problematic if examinees with high overall scores frequently select incorrect responses
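As a rough illustration of the difficulty and discrimination steps, here is a stdlib-only Python sketch on a made-up 0/1 response matrix. The data, function names, and the point-biserial formulation are assumptions for the example; real item analysis would come from your testing software.

```python
from statistics import mean, pstdev

# Made-up scored responses: rows = examinees, columns = items (1 = correct).
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
]

def difficulty(item):
    """Item difficulty: proportion of examinees answering the item correctly."""
    return mean(r[item] for r in responses)

def discrimination(item):
    """Point-biserial correlation between the item score and the total score."""
    scores = [r[item] for r in responses]
    totals = [sum(r) for r in responses]
    ms, mt = mean(scores), mean(totals)
    cov = mean((s - ms) * (t - mt) for s, t in zip(scores, totals))
    return cov / (pstdev(scores) * pstdev(totals))

print(difficulty(0))          # 0.8 -- a fairly easy item
print(discrimination(0) > 0)  # True -- item is positively related to total score
```

Note that the total score here includes the item itself, which slightly inflates the correlation; polished analyses often correlate each item with the total excluding that item.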

 Ultimately, you want to know whether students are achieving your objectives
 Tests are used to measure this knowledge, these skills, attitudes, etc. indirectly
 The items you write are a sample of all possible items that could measure the objective
 The more items you write (and the better your items are!), the more reliably you can measure the objective
 You want to be sure that the items you create measure achievement of the objective, and NOT test-wiseness, reading ability, or other factors

 Haladyna, T. M. (1999). Developing and validating multiple-choice test items. Mahwah, NJ: Lawrence Erlbaum Associates.
 Rodriguez, M. C. (2005). Three options are optimal for multiple-choice items: A meta-analysis of 80 years of research. Educational Measurement: Issues and Practice, 24.
 Downing, S. M., & Haladyna, T. M. (2006). Handbook of test development. Mahwah, NJ: Lawrence Erlbaum Associates.
 Burton, S. J., et al. (1991). How to prepare multiple-choice test items: Guidelines for university faculty. Brigham Young University Testing Services.