Constructing Exam Questions
Dan Thompson & Brandy Close
OSU-CHS Educational Development, Clinical Education

Objectives
- Use Bloom's Taxonomy to identify the knowledge level of assessment items
- Compose an effective and meaningful stem for Multiple Choice items
- Construct effective and plausible distractors for Multiple Choice items
- Construct effective assessment questions in the form of Multiple Choice, True/False, and Essay/Short Answer

Planning the Assessment
- Use Bloom's taxonomy to categorize content/concepts by knowledge level.
- Create a table to break down and organize assessment items by outcome level (see the sample blueprint below).
Example: http://www.iub.edu/~best/pdf_docs/better_tests.pdf
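For illustration, a minimal blueprint of this kind; the topics and item counts are hypothetical, not taken from the original slides:

Topic                    Recall  Application  Analysis  Total
Renal physiology            3         4          2        9
Acid-base disorders         2         3          3        8
Diuretic pharmacology       2         2          2        6
Total                       7         9          7       23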

Validity: does the assessment measure the learning outcomes you intend it to measure? Coordinating assessment content with instructional content increases the validity of the assessment.
http://www.iub.edu/~best/pdf_docs/better_tests.pdf
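A hypothetical illustration of that alignment (not from the original slides): if the stated outcome is "interpret arterial blood gas results," a valid item presents a set of values for the student to interpret; an item that merely asks for the normal reference ranges measures recall, a different and lower-level outcome.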

Constructing the Stem
- The stem should be meaningful and present a definite problem.
- Focus on a learning outcome.
- A stem that lacks a definite problem often assesses students' ability to infer what is being asked rather than directly measuring their performance.
http://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/

Avoid including irrelevant material in the stem; it can decrease the reliability and validity of the assessment item.
http://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/

Consider making the stem a question or a partial sentence.
- A question stem lets students focus on answering the assessment item, increasing validity.
- Avoid interior blanks; filling in the middle of a sentence increases cognitive load and can decrease the validity of the assessment item (see the example below).
http://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/
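A hypothetical illustration of the difference (not part of the original presentation):

Interior blank (weak): The ________ is the primary site of reabsorption in the nephron.
Direct question (better): Which segment of the nephron is the primary site of reabsorption?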

Constructing the Distractors
- All distractors should be plausible; they serve to indicate what students actually know.
- Incorrect distractors should attract students who did not achieve the learning outcome and be passed over by students who did.
http://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/

Distractors should be clear and concise; excessive wordiness assesses reading comprehension rather than the desired outcome.
http://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/

Avoid using "all of the above" or "none of the above."
- Students who can identify more than one correct option can choose "all of the above" without achieving the learning outcome.
- Conversely, students who can identify one incorrect option can eliminate "all of the above" and improve their odds of guessing correctly without achieving the learning outcome.

Distractors can contain inadvertent clues in language, length, format, and grammar (see the sample item below):
- Keep grammar consistent with the stem.
- Keep length similar.
- Keep format similar.
- Keep language similar.
http://cft.vanderbilt.edu/guides-sub-pages/writing-good-multiple-choice-test-questions/
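A hypothetical item (not from the original presentation) whose options are parallel in grammar, length, and format, so no single option stands out as a clue:

A deficiency of which vitamin causes scurvy?
A. Vitamin A
B. Vitamin B12
C. Vitamin C  (correct)
D. Vitamin D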

Constructing True/False and Essay/Short Answer Items

True/False Items
- Typically used at the knowledge level of assessment to measure recall of factual knowledge
- Have the potential to measure higher-level knowledge

Strengths of True/False Items
- Easy to write and quickly answered by students
- Provide a large sampling of content relative to the time allotted

Weaknesses of True/False Items
- Guessing: a 50/50 chance of guessing correctly without achieving the learning outcome
- Ambiguity: it is difficult to write statements that are indisputably true or false
http://www.iub.edu/~best/pdf_docs/better_tests.pdf

Guidelines for Constructing True/False Questions
- Assess significant material
- Write statements that are indisputably true or false
- Avoid taking statements directly from textbooks
- Assess only one point in each item
- Avoid trick questions
- Avoid absolute terms such as always, all, and never; instead use usually, often, and many times (see the example below)
- Avoid negatively worded statements
- Randomize the order of true and false items to avoid giving clues
- Keep true and false statements similarly formatted
http://www.iub.edu/~best/pdf_docs/better_tests.pdf
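A hypothetical revision (not from the original slides) showing why absolute terms are a problem:

Weak: "Beta blockers always lower heart rate." (The word "always" cues test-wise students to answer False regardless of content knowledge.)
Better: "Beta blockers typically lower heart rate." (True)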

Essay/Short Answer Items
- Consider class size
- Consider the time available to prepare the assessment

Strengths of Essay/Short Answer Items
- An effective way to measure higher-level thinking
- Require less time to construct
- Discourage students from memorizing and regurgitating facts
- Can provide students with realistic tasks/scenarios
- Require students to organize, construct, and communicate their thoughts

Weaknesses of Essay/Short Answer Items
- Provide a smaller sampling of content due to the time needed to answer each question
- Require more time for grading: reading and scoring
- Scoring is often more subjective than objective, which can decrease reliability
http://www.iub.edu/~best/pdf_docs/better_tests.pdf

Guidelines for Constructing Essay/Short Answer Questions
- Create clearly defined questions that lead the student toward the approach you desire (see the sample item below)
- To cover more content, consider more questions requiring shorter answers
- Avoid offering optional essay questions; students choosing different essay questions are essentially taking different exams
- Indicate how much each essay/short answer question is worth
- Avoid questions that require only factual information
http://www.iub.edu/~best/pdf_docs/better_tests.pdf
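A hypothetical example (not part of the original presentation) of moving from a vague prompt to a clearly defined, weighted one:

Vague: "Discuss hypertension."
Better: "A 55-year-old patient presents with newly diagnosed hypertension. Describe two first-line treatment options and justify which you would choose for this patient. (10 points)"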

Consider that higher-level processes build upon lower-level processes; if the lower-level processes are not achieved, the higher-level ones will be deficient.
Examples of essay/short answer items by level of Bloom's taxonomy:
http://www.iub.edu/~best/pdf_docs/better_tests.pdf
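For illustration, a hypothetical pair of prompts (not drawn from the cited source) targeting the same content at two Bloom levels:

Knowledge: "List the four chambers of the heart."
Analysis: "A patient's echocardiogram shows an enlarged left ventricle. Explain how this finding could account for the patient's exertional dyspnea."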

Questions?