
Similar presentations
Writing constructed response items

An Introduction to Computer-assisted Assessment. Joanna Bull and Ian Hesketh, CAA Centre Teaching and Learning Directorate.
Using Common Testing Techniques Effectively
Alternate Choice Test Items
Test Taking Strategies
Test-Taking Strategies
Measuring Complex Achievement: Essay Questions
Dr. Shama Mashhood, Medical Educationist & Coordinator, Question Review Committee.
Designing the Test and Test Questions Jason Peake.
Selected Response Tests
Constructing Exam Questions Dan Thompson & Brandy Close OSU-CHS Educational Development-Clinical Education.
Matching Items. Presenter: Pema Khandu, B.Ed.II(S) Sci 'B'
Classroom Assessment FOUN 3100 Fall Assessment is an integral part of teaching.
Objective Examination Dr. Niraj Pandit MD Department of Community Medicine SBKS MIRC.
Writing Instructional Objectives
Stages of testing + Common test techniques
Improving Test Taking Strategies. Test Taking Skills: Most students have NEVER been taught test taking strategies. Studies show that as many as 20.
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler Chapter 9 Subjective Test Items.
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler
Essay Assessment Tasks
Get the most information out of the time you have available.
Module 6 Test Construction & Evaluation. Lesson's focus: Stages in Test Construction, Tasks in Test, Test Evaluation.
Test and Types of Tests.
Oscar Vergara Chihlee Institute of Technology July 28, 2014.
Tips for Top Tests FOSL fall September 10, 2007 Adapted from “Tools for Teaching” by Barbara Gross Davis.
Question 1 A.) Answer A B.) Answer B C.) Answer C D.) Answer D Welcome to “Who Wants to Be a Millionaire” To play: Each time a question page is presented,
Completion, Short-Answer, and True-False Items
Constructing Objective Test Items: Simple Forms
Classroom Assessments Checklists, Rating Scales, and Rubrics
Evaluation of Student Learning: Test Construction & Other Practical Strategies Faculty Professional Development Fall 2005 Dr. Kristi Roberson-Scott.
CONSTRUCTING OBJECTIVE TEST ITEMS: MULTIPLE-CHOICE FORMS. CHAPTER 8. AMY L. BLACKWELL, JUNE 19, 2007.
Multiple Choice Question Design Karen Brooks & Barbara Tischler Hastie.
Measuring Complex Achievement
Session 2: Traditional Assessments.
1 Writing Test Blueprints and Test Items “Software for Creating and Delivering Assessments With Powerful Reports”
Writing Supply Items Gronlund, Chapter 8. Supply Type Items Require students to supply the answer Length of response varies –Short-answer items –Restricted-response.
Guidelines to Test Preparation.
Biology Partnership Assessment Pedagogy Session Saturday, September 29, 2012 Dr. Susan Butler.
Writing Multiple Choice Questions. Types Norm-referenced –Students are ranked according to the ability being measured by the test with the average passing.
Assessment and Testing
What are the stages of test construction? Take a minute and try to think of these stages.
March 11, 2013, Chicago, IL. American Board of Preventive Medicine, Clinical Informatics Examination Committee. Measurement.
Module 7 1. What do we know about selected-response items? Well-constructed selected-response items can target: factual knowledge, comprehension, analysis.
Assessment Item Types: SA/C, TF, Matching. Assessment Item Types: Objective Assessments, Performance Assessments.
Test Question Writing Instructor Development ANSF Nurse Training Program.
Language Testing How to make multiple choice test.
Do not on any account attempt to write on both sides of the paper at once. W.C.Sellar English Author, 20th Century.
EVALUATION SUFFICIENCY: Types of Test Items (Part I)
©2013, The McGraw-Hill Companies, Inc. All Rights Reserved Chapter 6 Construction of Knowledge Tests.
Assessment and the Institutional Environment: Context, Institutional Mission, Vision and Values, Intended Learning and Educational Experiences, Impact.
Objective Examination: Multiple Choice Questions Dr. Madhulika Mistry.
Assessment in Education ~ What teachers need to know.
Copyright © Springer Publishing Company, LLC. All Rights Reserved. DEVELOPING AND USING TESTS – Chapter 11 –
Writing Selection Items
COMMON TEST TECHNIQUES FROM TESTING FOR LANGUAGE TEACHERS.
Muhammad Riaz Anjum, Nasir Mahmood, Irum Bajwa.
Writing Essay Type Questions. Presentation by: Muhammad Riaz Anjum, Nasir Mahmood.
Test Based on Response There are two kinds of tests based on response. They are subjective test and objective test. 1. Subjective Test Subjective test.
EDU 385 Session 8 Writing Selection items
Chapter 14 Assembling, Administering, and Appraising classroom tests and assessments.
Classroom test and Assessment
Constructing Exam Questions
Classification of Tests Chapter # 2
Multiple Choice Item (MCI) Quick Reference Guide
Testing Receptive Skills
Classroom Assessment A Practical Guide for Educators by Craig A. Mertler Chapter 8 Objective Test Items.
Multiple-Choice and Matching Exercises
Presentation transcript:

True/False Items

Good for:
- Knowledge level content
- Evaluating student understanding of popular misconceptions
- Concepts with two logical responses

Advantages:
- Can test large amounts of content
- Students can answer 3-4 questions per minute

Disadvantages:
- They are easy
- It is difficult to discriminate between students who know the material and students who don't
- Students have a chance of getting the right answer by guessing
- A large number of items is needed for high reliability

Tips for writing good true/false items:
- Avoid double negatives.
- Avoid long or complex sentences.
- Use specific determinants with caution: never, only, all, none, always, could, might, can, may, sometimes, generally, some, few.
- Use only one central idea in each item.
- Don't emphasize the trivial.
- Use exact quantitative language.
- Don't lift items straight from the book.
- Make more items false than true (about 60/40), since students are more likely to answer true.

Matching Items

Good for:
- Knowledge level
- Some comprehension level, if appropriately constructed

Types:
- Terms with definitions
- Phrases with other phrases
- Causes with effects
- Parts with larger units
- Problems with solutions

Advantages:
- Maximum coverage at the knowledge level in a minimum amount of space and preparation time
- Valuable in content areas that have a lot of facts

Disadvantages:
- Time consuming for students
- Not good for higher levels of learning

Tips for writing good matching items:
- Use 15 items or fewer.
- Give clear directions on the basis for matching.
- Use items in the response column more than once (this reduces the effects of guessing).
- Use homogeneous material in each exercise.
- Make all responses plausible.
- Put all items on a single page.
- Put responses in some logical order (chronological, alphabetical, etc.).
- Keep responses short.

Multiple Choice Items

Good for:
- Application, synthesis, analysis, and evaluation levels

Types:
- Question/right answer
- Incomplete statement
- Best answer

Advantages:
- Very effective
- Versatile at all levels
- Minimum of writing for the student
- Guessing is reduced
- Can cover a broad range of content

Disadvantages:
- Difficult to construct good test items
- Difficult to come up with plausible distractors/alternative responses

Tips for writing good multiple choice items:
- The stem should present a single, clearly formulated problem.
- The stem should use simple, understandable language; delete extraneous words.
- Avoid "all of the above": students can answer it from partial knowledge (knowing one alternative is incorrect rules it out, and knowing two are correct suggests it even if unsure of the third).
- Avoid "none of the above."

- Make all distractors plausible and homogeneous.
- Don't overlap response alternatives (overlap decreases discrimination between students who know the material and those who don't).
- Don't use double negatives.
- Present alternatives in logical or numerical order.

- Place the correct answer at random (otherwise position A tends to be used most often).
- Make each item independent of the others on the test.
- One way to judge a good stem: students who know the content should be able to answer before reading the alternatives.
- List alternatives on separate lines, indented and separated by blank space, and label them with letters rather than numbers.
- Use more than 3 alternatives; 4 is best.

Short Answer / Completion Items

Good for:
- Application, synthesis, analysis, and evaluation levels

Advantages:
- Easy to construct
- Good for "who," "what," "where," and "when" content
- Minimizes guessing
- Encourages more intensive study: the student must know the answer rather than merely recognize it

Disadvantages:
- May overemphasize memorization of facts
- Take care: questions may have more than one correct answer
- Scoring is laborious

Tips for writing good short answer items:
- When testing definitions, supply the term and have students provide the definition; this gives a better gauge of student knowledge.
- For numerical answers, indicate the degree of precision and the units expected.
- Use direct questions rather than incomplete statements.

- If you do use incomplete statements, don't use more than two blanks within an item.
- Arrange blanks to make scoring easy.
- Try to phrase the question so that only one answer is possible.

Essay Items

Good for:
- Application, synthesis, and evaluation levels

Types:
- Extended response: synthesis and evaluation levels; allows a lot of freedom in answers
- Restricted response: more consistent scoring; outlines the parameters of responses

Advantages:
- Students are less likely to guess
- Easy to construct
- Stimulates more study
- Allows students to demonstrate the ability to organize knowledge, express opinions, and show originality

Disadvantages:
- Can limit the amount of material tested, and therefore has decreased validity
- Subjective, potentially unreliable scoring
- Time consuming to score

Tips for writing good essay items:
- Provide reasonable time limits for thinking and writing.
- Avoid giving students a choice among questions; you won't get a good picture of the breadth of student achievement when each student answers only a subset of the questions.

- Give the student a definite task: compare, analyze, evaluate, etc.
- Score with a checklist point system and a model answer: write an outline and determine how many points to assign to each part.
- Score one question at a time across all papers.

Oral Exams

Good for:
- Knowledge, synthesis, and evaluation levels

Advantages:
- Useful as an instructional tool: allows students to learn while being tested
- Allows the teacher to give clues to facilitate learning
- Useful for testing speech and foreign language competencies

Disadvantages:
- Time consuming to give and take
- Students may perform poorly because they have had little practice with this format
- Provides no written record unless checklists are used

Student Portfolios

Good for:
- Knowledge, application, synthesis, and evaluation levels

Advantages:
- Can assess compatible skills: writing, documentation, critical thinking, problem solving
- Can allow students to present the totality of their learning
- Students become active participants in the evaluation process

Disadvantages:
- Can be difficult and time consuming to grade

Performance Tests

Good for:
- Application of knowledge, skills, and abilities

Advantages:
- Measures some skills and abilities that are not possible to measure in other ways

Disadvantages:
- Cannot be used in some fields of study
- Difficult to construct
- Difficult to grade
- Time consuming to give and take