College of Nursing January 2011 Best Practices for Writing Objective Test Items.

Similar presentations
Writing constructed response items

AP Exam- Tips and Tricks
An Introduction to Computer-assisted Assessment Joanna Bull and Ian Hesketh CAA Centre Teaching and Learning Directorate.
Alternate Choice Test Items
Item Analysis.
Test Taking Strategies
ENGLISH B HIGHER LEVEL The Mackay School – May 2014 Examinations.
Selected Response Tests
Standards for Question Writing Mary Grantner, MA Senior Manager, Physician Assessment.
Constructing Exam Questions Dan Thompson & Brandy Close OSU-CHS Educational Development-Clinical Education.
Matching Items Presenter Pema Khandu B.Ed.II(S)Sci ‘B’
Social Science Faculty Meeting January 2010 Mastering the Art of Test Writing Roundtable Discussion.
Classroom Assessment FOUN 3100 Fall Assessment is an integral part of teaching.
Objective Examination Dr. Niraj Pandit MD Department of Community Medicine SBKS MIRC.
Test Writing: Moving Away from Publisher Material
Improving Test Taking Strategies. Test Taking Skills  Most students have NEVER been taught test taking strategies.  Studies show that as many as 20.
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler Chapter 9 Subjective Test Items.
Classroom Assessment A Practical Guide for Educators by Craig A
Module 6 Test Construction &Evaluation. Lesson’s focus Stages in Test Construction Tasks in Test Test Evaluation.
Narrowing the Gulf Annual Conference 2010 March 2010 Mastering the Art of Writing Objective Test Items.
Oscar Vergara Chihlee Institute of Technology July 28, 2014.
Designing and evaluating good multiple choice items Jack B. Monpas-Huber, Ph.D. Director of Assessment & Student Information.
Tips for Top Tests FOSL fall September 10, 2007 Adapted from “Tools for Teaching” by Barbara Gross Davis.
Completion, Short-Answer, and True-False Items
“Taking Tests” Session 5 STUDY SKILLS
Classroom Assessments Checklists, Rating Scales, and Rubrics
August 2007FFP Testing and Evaluation Techniques Chapter 7 Florida State Fire College Ocala, Florida.
CONSTRUCTING OBJECTIVE TEST ITEMS: MULTIPLE-CHOICE FORMS CHAPTER 8 AMY L. BLACKWELL JUNE 19, 2007.
Objectives To know basic concepts and rationale of MCQ To know different types of MCQ To illustrate anatomy of each type To discuss guidelines construction.
Multiple Choice Question Design Karen Brooks & Barbara Tischler Hastie.
Exam Taking Kinds of Tests and Test Taking Strategies.
Prepare and Use Knowledge Assessments. Introduction Why do we give knowledge tests? What problems did you have with tests as a student? As.
Assessment in Education Patricia O’Sullivan Office of Educational Development UAMS.
ASSESSMENT IN EDUCATION. Copyright Keith Morrison, 2004 ITEM TYPES IN A TEST Missing words and incomplete sentences Multiple choice.
Session 2 Traditional Assessments.
1 Writing Test Blueprints and Test Items “Software for Creating and Delivering Assessments With Powerful Reports”
Test Taking Strategies. Prepare to avoid errors: Analyze your past results and errors Arrive early and prepared for tests Be familiar with exam question.
ASSESSING STUDENT ACHIEVEMENT Using Multiple Measures Prepared by Dean Gilbert, Science Consultant Los Angeles County Office of Education.
Assessment Item Writing Workshop Ken Robbins FDN-5560 Classroom Assessment.
2009 Professional Development Day October 2009 Mastering the Art of Test Writing.
8 Strategies for the Multiple Choice Portion of the AP Literature and Composition Exam.
Writing Multiple Choice Questions. Types Norm-referenced – Students are ranked according to the ability being measured by the test, with the average passing.
Assessment and Testing
Building Exams Dennis Duncan University of Georgia.
March 11, 2013 Chicago, IL American Board of Preventive Medicine American Board of Preventive Medicine Clinical Informatics Examination Committee Measurement.
Module 7 1. What do we know about selected- response items? Well constructed selected- response items can target: factual knowledge comprehension analysis.
Materials produced under Phare 2006 financial support Phare TVET RO 2006/ EUROPEAN UNION Project financed under Phare MoERI/ NCDTVET-PIU.
Test Question Writing Instructor Development ANSF Nurse Training Program.
Language Testing How to make multiple choice test.
Do not on any account attempt to write on both sides of the paper at once. W.C.Sellar English Author, 20th Century.
Using Multiple Measures ASSESSING STUDENT ACHIEVEMENT.
EVALUATION SUFFICIENCY Types of Test Items (Part I)
University of Baltimore Test Development Solutions (TDS) Thomas Fiske, M.S. - Test Development Team Lead Charles Glover, M.S. - Test Developer; Diann M.
Multiple-Choice Item Design February, 2014 Dr. Mike Atkinson, Teaching Support Centre, Assessment Series.
Test Taking Skills Make sure you prove what you know!
Assessment and the Institutional Environment Context Institutional Mission vision and values Intended learning and Educational Experiences Impact Educational.
Objective Examination: Multiple Choice Questions Dr. Madhulika Mistry.
 Good for:  Knowledge level content  Evaluating student understanding of popular misconceptions  Concepts with two logical responses.
Assessment in Education ~ What teachers need to know.
Copyright © Springer Publishing Company, LLC. All Rights Reserved. DEVELOPING AND USING TESTS – Chapter 11 –
Writing Selection Items
Document Development Cycle
EDU 385 Session 8 Writing Selection items
Classification of Tests Chapter # 2
Multiple Choice Item (MCI) Quick Reference Guide
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler Chapter 8 Objective Test Items.
Multiple Choice Item (MCI) Quick Reference Guide
Narrowing the Gulf Annual Conference 2010
Presentation transcript:

College of Nursing January 2011 Best Practices for Writing Objective Test Items

Writing Objective Test Items
Presenter
 Dr. James Coraggio, Director, Academic Effectiveness and Assessment
Contributor
 Alisha Vitale, Collegewide Testing Coordinator

Writing Objective Test Items
Former Life…
 Director of Test Development, SMT
 Director of Measurement and Test Development, Pearson
 Taught EDF 4430 Measurement for Teachers, USF

Purpose
 This presentation will address the importance of establishing a test purpose and developing test specifications.
 It will explain how to create effective multiple-choice test questions.
 It will provide item-writing guidelines as well as best practices to keep students from simply guessing the correct answers.

Agenda
 Purpose of a Test
 Prior to Item Writing
 Advantages of Objective Tests
 Types of Objective Tests
 Writing Multiple Choice Items
 The Test-wise Student
 Test Instructions
 Test Validity

Purpose of a Test
 To clearly delineate between those who know the content and those who do not.
 To determine whether the student knows the content, not whether the student is a good test-taker.
 Likewise, confusing and tricky questions should be avoided to prevent incorrect responses from students who know (and understand) the material.

Prior to Writing Items
 Establish the test purpose
 Conduct the role delineation study/job analysis
 Create the test specifications

Establish the Test Purpose
Initial Questions
 How will the test scores be used?
 Will the test be designed for minimum competency or content mastery?
 Will the test be low-stakes, moderate-stakes, or high-stakes (consequences for examinees)?
 Will the test address multiple levels of thinking (higher order, lower order, or both)?
 Will there be time constraints?

Establish the Test Purpose
 Responses to those initial questions have implications for
   the overall length of the test,
   the average difficulty of the items,
   the conditions under which the test will be administered, and
   the type of score information to be provided.
 Take the time to establish a singular purpose that is clear and focused so that goals and priorities will be effectively met.

Conduct the Job Analysis
 The primary purpose of a role delineation study or job analysis is to provide a strong linkage between the competencies necessary for successful performance on the job and the content on the test.
 This work has already been conducted for the National Council Licensure Examination for Registered Nurses [see Report of Findings from the 2008 RN Practice Analysis: Linking the NCLEX-RN® Examination to Practice, NCSBN, 2009].

Create Test Specifications
 Test specifications are essentially the ‘blueprint’ used to create the test.
 Test specifications operationalize the competencies that are being assessed.
 The NCLEX-RN® Examination has established test specifications. [See 2010 NCLEX-RN® Detailed Test Plan, April 2010, Item Writer/Item Reviewer/Nurse Educator Version]


Create Test Specifications
Test specifications:
 Support the validity of the examination
 Provide standardized content across administrations
 Allow for subscores that can provide diagnostic feedback to students and administrators
 Inform the student (and the item writers) of the required content
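
A loose illustration, not part of the original slides: a blueprint can be kept as a small data structure that turns content-area weights into item counts for a form. The area names and weights below are invented placeholders, not the actual NCLEX-RN® test plan.

    # Hypothetical sketch: a blueprint maps content areas to weights.
    blueprint = {
        "Content Area A": 0.30,
        "Content Area B": 0.25,
        "Content Area C": 0.20,
        "Content Area D": 0.25,
    }

    def items_per_area(blueprint, total_items):
        """Translate percentage weights into item counts for one test form."""
        counts = {area: round(weight * total_items)
                  for area, weight in blueprint.items()}
        # Absorb any rounding drift so the form has exactly total_items items.
        drift = total_items - sum(counts.values())
        if drift:
            counts[max(counts, key=counts.get)] += drift
        return counts

    print(items_per_area(blueprint, 40))
    # {'Content Area A': 12, 'Content Area B': 10, 'Content Area C': 8, 'Content Area D': 10}

Writing the counts down this way also makes the 'standardized content across administrations' point concrete: every form drawn from the same blueprint carries the same content mix.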

Item Development
 After developing the test specifications, item development can begin.
 The focus of the remainder of this presentation will be on creating ‘appropriate’ objective items.

Objective Tests
 Measure several types (and levels) of learning
 Cover wide content in a short period of time
 Offer variations for flexibility
 Are easy to administer, score, and analyze
 Are scored more reliably and quickly
 What type of learning cannot be measured?

Types of Objective Tests
 Written-response
   Completion (fill-in-the-blank)
   Short answer
 Selected-response
   Alternative response (two options)
   Matching
   Keyed (like matching)
   Multiple choice

Written-response
 Single questions/statements or clusters (stimuli)
 Advantages
   Measure several types of learning
   Minimize guessing
   Point out student misconceptions
 Disadvantages
   Time to score
   Misspelling and writing clarity
   Incomplete answers
   More than one possible correct response (novel answers)
   Subjectivity in grading

Completion
A word that describes a person, place, or thing is a ________.
1. Remove only ‘key’ words
2. Put blanks at the end of the statement
3. Avoid multiple correct answers
4. Eliminate clues
5. Paraphrase statements
6. Use answer sheets to simplify scoring

Short Answer
Briefly describe the term proper noun. ____________________________
 Terminology: stimulus and response
1. Provide an appropriate blank (word(s) or sentence).
2. Specify the units (inches, dollars).
3. Ensure directions for clusters of items are appropriate for all items.

Selected-response
Select from provided responses
 Advantages
   Measure several types of learning
   Measure the ability to make fine distinctions
   Administered quickly
   Cover a wide range of material
   Reliably scored
   Multiple scoring options (hand, computer, scanner)
 Disadvantages
   Allow guessing
   Distractors can be difficult to create
   Student misconceptions are not revealed
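
Because scoring selected-response items reduces to comparing each response against the key, machine scoring is straightforward. A minimal sketch; the data layout is an assumption, not from the slides:

    # Minimal machine-scoring sketch for selected-response items.
    def score_form(answer_key, student_answers):
        """One point per response matching the key; blanks and
        mismatches earn nothing (no penalty for guessing)."""
        return sum(1 for key, answer in zip(answer_key, student_answers)
                   if answer == key)

    key = ["B", "A", "D", "C"]
    print(score_form(key, ["B", "C", "D", ""]))  # 2: items 1 and 3 match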

Alternative Response
T F 1. A noun is a person, place, or thing.
T F 2. An adverb describes a noun.
1. Explain the judgments to be made
2. Ensure answer choices match
3. Explain how to answer
4. Present only one idea to be judged
5. Use positive wording
6. Avoid trickiness, clues, and qualifiers

Matching Item
Column A                                   Column B
__ Person, place, or thing.                a. Adjective
__ Describes a person, place, or thing.    b. Noun
Terminology: premises and responses
1. Clear instructions
2. Homogeneous premises
3. Homogeneous responses (brief and ordered)
4. Avoid one-to-one matching

Keyed Response
Responses
a. A noun
b. A pronoun
c. An adjective
d. An adverb
___ Person, place, or thing.
___ Describes a person, place, or thing.
 Like matching items, but with more response options

MC Item Format
What is the part of speech that is used to name a person, place, or thing?
A) A noun*
B) A pronoun
C) An adjective
D) An adverb

MC Item Terminology
 Stem: sets the stage for the item; a question or incomplete thought; should contain all the information needed to select the correct response.
 Options: the possible responses, which include one and only one correct answer.
 Key: the correct response.
 Distractor: a wrong response; plausible but not correct, and attractive to an under-prepared student.
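
To make the terminology concrete, here is a rough sketch, not from the presentation, of a multiple-choice item as a data structure; the class and field names are invented:

    # Hypothetical representation of the MC item terminology above.
    from dataclasses import dataclass

    @dataclass
    class MultipleChoiceItem:
        stem: str             # question or incomplete thought
        options: list[str]    # the key plus the distractors
        key_index: int        # position of the one correct response

        def distractors(self):
            """Every option except the key."""
            return [opt for i, opt in enumerate(self.options)
                    if i != self.key_index]

    item = MultipleChoiceItem(
        stem="What is the part of speech that is used to name a person, place, or thing?",
        options=["A noun", "A pronoun", "An adjective", "An adverb"],
        key_index=0,  # 'A noun' is the key; the other three are distractors
    )
    assert len(item.options) == len(item.distractors()) + 1  # exactly one key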

Competency
 Items should test for the appropriate or adequate level of knowledge, skill, or ability (KSA) for the students.
 Assessing lower-division students on graduate-level material is an ‘unfair’ expectation.
 The competent student should do well on an assessment; items should not be written for only the top students in the class.

Clarity
 Clear, precise items and instructions
 Correct grammar, punctuation, and spelling
 Address one single issue
 Avoid extraneous material (teaching)
 One correct or clearly best answer
 Legible copies of the exam

Bias
 Tests should be free from bias…
   No stereotyping
   No gender bias
   No racial bias
   No cultural bias
   No religious bias
   No political bias

Level of Difficulty
 Ideally, test difficulty should be aimed at a middle level. This cannot always be achieved when the subject matter is based on specific expectations (e.g., a workforce area).

Level of Difficulty
 To make an M/C item more difficult, make the stem more specific or narrow and the options more similar.
 To make an M/C item less difficult, make the stem more general and the options more varied.
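
The slides do not define difficulty numerically, but after an administration it is commonly checked with the difficulty index (the proportion of examinees answering correctly) alongside a simple discrimination index. A sketch, assuming responses are stored as a 0/1 matrix:

    # responses[s][i] is 1 if student s answered item i correctly, else 0.
    def item_difficulty(responses, item):
        """Difficulty index p: the proportion answering the item correctly;
        mid-range values suit a test aimed at a middle level of difficulty."""
        return sum(row[item] for row in responses) / len(responses)

    def item_discrimination(responses, item):
        """Crude discrimination: difficulty among the top-scoring half minus
        difficulty among the bottom-scoring half; positive values mean
        stronger students get the item right more often."""
        ranked = sorted(responses, key=sum, reverse=True)
        half = len(ranked) // 2
        return (item_difficulty(ranked[:half], item)
                - item_difficulty(ranked[-half:], item))

    scores = [[1, 1, 0], [1, 0, 0], [1, 1, 1], [0, 0, 1]]  # toy 4-student data
    print(item_difficulty(scores, 0), item_discrimination(scores, 0))  # 0.75 0.5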

Trivial and Trick Questions
 Avoid trivia and tricks.
 Avoid humorous or ludicrous responses.
 Items should be straightforward; they should cleanly delineate those who know the material from those who do not.
 Make sure every item has value and contributes to the final score.

Test Taking Guidelines
When you don’t know the answer
 As with all exams, attempt the questions that are easiest for you first, and come back to the hard ones later. Unless you will lose marks for an incorrect response, never leave a question blank. Make a calculated guess if you are sure you don’t know the answer. Here are some tips to help you guess ‘intelligently’.
Use a process of elimination
 Try to narrow your choice as much as possible: which of the options is most likely to be incorrect? Ask: are the options in the right range? Is the measurement unit correct? Does it sound reasonable?

Test Taking Guidelines
Look for grammatical inconsistencies
 In extension-type questions, a choice is nearly always wrong if the question and the answer do not combine to make a grammatically correct sentence. Also look for repetition of key words from the question in the responses; if words are repeated, the option is worth considering. e.g.:
 The apparent distance hypothesis explains…
 b) The distance between the two parallel lines appears…

Test Taking Guidelines
Be wary of options containing definitive words and generalizations
 Because they can’t tolerate exceptions, options containing words like ‘always’, ‘only’, ‘never’, and ‘must’ tend to be incorrect more often. Similarly, options containing strong generalizations tend to be incorrect more often.
Favor look-alike options
 If two of the alternatives are similar, give them your consideration. e.g.:
A. tourism consultants
B. tourists
C. tourism promoters
D. fairy penguins

Test Taking Guidelines
Favor numbers in the mid-range
 If you have no idea what the real answer is, avoid extremes.
Favor more inclusive options
 If in doubt, select the option that encompasses the others. e.g.:
A. an adaptive system
B. a closed system
C. an open system
D. a controlled and responsive system
E. an open and adaptive system
Please note: none of these strategies is foolproof, and they do not apply equally to the different types of multiple-choice questions, but they are worth considering when you would otherwise leave a blank.
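
A short arithmetic aside, not from the slides, on why guessing beats a blank when there is no penalty: with four options a blind guess is worth 0.25 points on average, and every option eliminated raises that expectation.

    # Toy arithmetic: expected score from guessing a one-point item.
    def expected_guess_score(options_left, penalty=0.0):
        """penalty is the deduction for a wrong answer
        (0 on most classroom tests)."""
        p = 1 / options_left
        return p * 1.0 - (1 - p) * penalty

    for remaining in (4, 3, 2):
        print(remaining, round(expected_guess_score(remaining), 3))
    # With no penalty: 4 -> 0.25, 3 -> 0.333, 2 -> 0.5, so elimination
    # always raises the expected payoff of a guess.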

Test-wise Students
 Are familiar with item formats
 Use informed and educated guessing
 Avoid common mistakes
 Have testing experience
 Use time effectively
 Apply various strategies to solve different problem types

Test-wise Students
 Vary your keys: ‘always pick option C’ should not pay off.
 Avoid ‘all of the above’ and ‘none of the above.’
 Avoid extraneous information: it may assist in answering another item.
 Avoid item ‘bad pairs’ or ‘enemies.’
 Avoid clueing with the same word in the stem and the key.
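
The ‘vary your keys’ advice can be checked mechanically before a form goes out. A hedged sketch; the helper and its threshold are invented for illustration:

    # Hypothetical check that no key position is overused on a form.
    from collections import Counter

    def key_balance(keys, tolerance=0.15):
        """keys is a sequence like ['A', 'C', 'B', ...]; returns the
        positions whose share strays more than tolerance from an even
        split across the observed options."""
        counts = Counter(keys)
        even_share = 1 / len(counts)
        return {k: n / len(keys) for k, n in counts.items()
                if abs(n / len(keys) - even_share) > tolerance}

    print(key_balance(list("CCACCBCDCC")))  # {'C': 0.7}: option C is overused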

Test-wise Students
 Make options similar in terms of length, grammar, and sentence structure; options that differ stand out.
 Avoid ‘clues.’

Item Format Considerations
 Put the needed information in the stem
 Avoid negatively stated stems and qualifiers
 Highlight qualifiers if they are used
 Avoid irrelevant symbols (“&”) and jargon
 Use a standard, set number of options (preferably four)
 Ideally, tie each item to a reference (and rationale)
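
Several of these guidelines, together with the earlier advice against ‘all of the above’ and ‘none of the above’, lend themselves to a mechanical check of draft items. A rough sketch; the patterns and function are invented, not an established tool:

    # Hypothetical lint pass over a draft item.
    import re

    CHECKS = [
        # Upper-case NOT/EXCEPT is how negated stems are usually flagged,
        # so this first pattern is deliberately case-sensitive.
        (r"\b(NOT|EXCEPT)\b", "negatively stated stem"),
        (r"(?i)\ball of the above\b", "'all of the above' option"),
        (r"(?i)\bnone of the above\b", "'none of the above' option"),
    ]

    def lint_item(stem, options):
        """Flag a draft item against a few of the guidelines above."""
        problems = [msg for pattern, msg in CHECKS if re.search(pattern, stem)]
        for option in options:
            problems += [msg for pattern, msg in CHECKS
                         if re.search(pattern, option)]
        return problems

    print(lint_item("Which of the following is NOT a noun?",
                    ["dog", "run", "none of the above"]))
    # ['negatively stated stem', "'none of the above' option"]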

Test Directions
Highlight directions:
1. State the skill measured.
2. Describe any resource materials required.
3. Describe how students are to respond.
4. Describe any special conditions.
5. State time limits, if any.

Ensure Test Validity
 Congruence between items and course objectives
 Congruence between items and student characteristics
 Clarity of items
 Accuracy of the measures
 Item formatting criteria
 Feasibility: time, resources

Questions
