University of Baltimore Test Development Solutions (TDS)
Thomas Fiske, M.S. - Test Development Team Lead
Charles Glover, M.S. - Test Developer
Diann M. Brady, M.S. - Test Developer
Steve Zarate - Talent Acquisition Specialist

Two Main Considerations in Test Development
Validity: "Validity refers to the degree to which evidence and theory support the interpretations of test scores for proposed uses of tests. Validity is, therefore, the most fundamental consideration in developing tests and evaluating tests." (APA Standards, 2014, p. 11)
Reliability: Reliability denotes "consistency of the scores across instances of the testing procedure." (APA Standards, 2014, p. 33)
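
The definitions above are verbal; as a rough numerical illustration (not part of the original deck), the sketch below estimates score consistency with Cronbach's alpha, one standard internal-consistency reliability coefficient. The function name and the small score matrix are assumptions made up for the example.

```python
# Illustrative sketch (not from the slides): Cronbach's alpha as one common
# index of score consistency. Rows = candidates, columns = items (1/0 = right/wrong).
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array of item scores, shape (n_candidates, n_items)."""
    k = scores.shape[1]                         # number of items
    item_vars = scores.var(axis=0, ddof=1)      # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of candidates' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

scores = np.array([[1, 1, 0, 1],
                   [1, 0, 0, 1],
                   [0, 0, 0, 0],
                   [1, 1, 1, 1],
                   [1, 1, 0, 0]])
print(round(cronbach_alpha(scores), 2))  # higher values = more consistent scores
```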

Validity
+ Writing items:
+ Does the item measure relevant and important subject matter?
+ Do the item stem and key address what the minimally competent candidate is required to know to meet the stated objective?
+ Are the distracters (options) plausible?

Reliability
[Diagram: Consistent + Repeatable = Reliable]

Influences on Reliability
+ Test Length
+ Difficulty
+ Content
+ Clarity of Questions
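
One of the influences listed above, test length, has a standard quantitative treatment: the Spearman-Brown prophecy formula predicts how reliability changes when a test is lengthened or shortened with comparable items. The sketch below is an illustration added here, not material from the slides.

```python
# Illustrative sketch (not from the slides): Spearman-Brown prophecy formula.
def spearman_brown(reliability: float, length_factor: float) -> float:
    """Predicted reliability after multiplying test length by length_factor."""
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

# A test with reliability 0.70, doubled in length with comparable items:
print(round(spearman_brown(0.70, 2.0), 2))  # ~0.82
```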

Fairness
+ Clarity of Questions
+ Represents Relevant Knowledge
+ Avoids Sensitive Issues
+ References to a Recognized Source

Testing Concept: Cognition
+ Achieving a reliable, valid, and fair exam that identifies minimally competent candidates involves including items that require different levels of mental processing, or COGNITION.
+ Multiple Choice Questions (MCQs) assess "convergent thinking": they require a unique correct (predictable or calculable) answer.
+ The higher the cognitive level of an item, the more likely the question is to determine whether the candidate has mastered the material.
+ "Divergent thinking" tasks do not require a single predetermined correct answer and therefore cannot be directly assessed by MCQs.

Examples of Item Formats
+ Closed Stem: Who is the current President of the United States?
+ Open Stem: The current President of the United States is ____.

Writing Items: Grammar and Formatting
+ Maintain grammatical consistency between the stem and the options
+ List the options in logical order
+ Write in the active voice rather than the passive voice
+ Use the third person; exclude personal pronouns such as "you"
+ Avoid qualifiers such as "never," "all," and "always"
+ Include labels with numbers

Writing Items
+ EXCEPT or NOT: Do not write negatively worded questions unless the information being tested is referenced in the negative.
+ Negative wording is often overlooked and may be confusing.
+ These items are easy to write, but typically do NOT perform well!

The Stem
+ Present a complete problem
+ Be stated in the positive
+ Be short and concise
+ Be free of extraneous information
+ Test relevant content
+ Measure a single concept
+ Not be tricky or unnecessarily difficult
+ Be answerable without reading the options

The Key(s)
+ The key(s) must:
+ Be the only correct answer(s) among the options
+ Not depend on opinion or regional differences
+ Be similar in content, language, length, and style to the distracters
+ Not repeat words from the stem
+ Be referenced (this is important for legal defensibility)
+ Have a rationale, if appropriate

The Distracters (Options)
+ The distracters must:
+ Be plausible but incorrect
+ Be real
+ Be similar in content, language, length, and style to the key
+ Not repeat words from the stem
+ Be mutually exclusive, with no overlap
+ Not include "all of the above" or "none of the above"
+ Avoid determiners such as "always" or "never"
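
Several of the stem, key, and distracter rules on the last few slides are mechanical enough to screen for automatically. The sketch below is a hypothetical illustration of such a screen; the function, field names, stop-word list, and sample item are assumptions for illustration, not an actual Prometric or University of Baltimore tool.

```python
# Hypothetical sketch (not from the slides): flagging a few mechanical
# item-writing violations -- banned options, absolute qualifiers, and
# wording repeated from the stem.
ABSOLUTE_QUALIFIERS = {"always", "never", "all", "none"}
BANNED_OPTIONS = {"all of the above", "none of the above"}
STOP_WORDS = {"the", "a", "an", "of", "to", "is"}

def check_item(stem: str, options: dict[str, str], key: str) -> list[str]:
    """Return a list of possible rule violations for one multiple-choice item."""
    problems = []
    stem_words = set(stem.lower().replace("?", "").split())
    for label, text in options.items():
        words = set(text.lower().rstrip(".").split())
        if text.lower().rstrip(".") in BANNED_OPTIONS:
            problems.append(f"Option {label}: avoid 'all/none of the above'.")
        if words & ABSOLUTE_QUALIFIERS:
            problems.append(f"Option {label}: contains an absolute qualifier.")
        overlap = (stem_words & words) - STOP_WORDS
        if overlap:
            problems.append(f"Option {label}: repeats stem wording ({', '.join(sorted(overlap))}).")
    if key not in options:
        problems.append("Key does not match any option label.")
    return problems

# A deliberately flawed made-up item, loosely echoing the bandage example below:
stem = "Which bandage is applied after surgery to prevent haematoma?"
options = {"A": "A compression bandage",
           "B": "A bandage that should never be used",
           "C": "None of the above"}
print(check_item(stem, options, key="A"))
```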

Flaw: Non-Specific or Non-Directed Stem
1. Greater Manchester is:
A) One of the 39 counties in England
B) A large county in the North West
C) Formerly part of Lancashire
D) The home of no professional sports team
Remember: Unfocused stems do not provide candidates with enough information to formulate a response before reading the choices.

Flaw: Heterogeneous Options
+ Occurs when one or more options is radically different in content or form from the other options.
After surgery, compression bandages are used:
A) Very frequently
B) Very infrequently
C) To prevent haematoma*
D) To prevent gangrene

Flaw: Conspicuous Key
+ The key to an item should not be considerably longer, shorter, more detailed, or stated in more technical language than any of the other options.
In tennis, a topspin grip is used to make the ball:
A) Faster
B) Slower
C) Rotate more frequently as it moves through the air*
D) Bounce
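
Continuing the hypothetical screening idea, a conspicuous key can be flagged by comparing the keyed option's length against the distracters. The function name and the 1.5x threshold below are arbitrary assumptions for illustration; the sample options come from the tennis item above.

```python
# Hypothetical sketch (not from the slides): flag a key that is much longer
# than the average distracter.
def key_length_outlier(options: dict[str, str], key: str, ratio: float = 1.5) -> bool:
    """True if the keyed option is much longer than the average distracter."""
    distracter_lengths = [len(text) for label, text in options.items() if label != key]
    average = sum(distracter_lengths) / len(distracter_lengths)
    return len(options[key]) > ratio * average

options = {"A": "Faster",
           "B": "Slower",
           "C": "Rotate more frequently as it moves through the air",
           "D": "Bounce"}
print(key_length_outlier(options, key="C"))  # True: the key stands out by length
```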

Questions