TEXAS TECH UNIVERSITY HEALTH SCIENCES CENTER SCHOOL OF PHARMACY KRYSTAL K. HAASE, PHARM.D., FCCP, BCPS ASSOCIATE PROFESSOR BEYOND MULTIPLE CHOICE QUESTIONS

OBJECTIVES
Describe the pros and cons of question formats other than multiple choice.
Identify and resolve common problems when constructing open-ended questions.
Develop standardized grading procedures for open-ended questions.
Discuss exam length and other challenges of open-ended question formats.

ASKING THE RIGHT QUESTION

REVIEW: Bloom’s taxonomy, from low to high: Knowledge, Comprehension, Application, Analysis, Synthesis, Evaluation.

QUESTION TYPES
Selection response items assess recognition; construction response items assess recall.

QUESTION TYPES
Selection response: True/False, Matching, Multiple Choice
Construction response: Fill in the Blank, Short Answer, Essay

TRUE / FALSE
Pros: easy to write; easy to score
Cons: limited ability to assess mastery; high probability of guessing
Best use: dichotomous, factual information

MATCHING
Pros: can assess a lot of information in a confined space; fairly low probability of guessing
Cons: assesses recognition rather than recall
Best use: knowledge recall; add extra “distractor” items to increase rigor

MULTIPLE CHOICE
Pros: very versatile; easy to score
Cons: more challenging to write; assesses recognition over recall
Best use: factual, conceptual, or procedural information

SHORT-ANSWER
Pros: assesses unassisted recall; relatively easy to write
Cons: only useful when the answer is genuinely short; must be worded carefully to avoid scoring problems
Best use: assessing information you expect students to have memorized

ESSAY
Pros: can test complex, higher-level objectives; can test process and reasoning; allows realistic tasks
Cons: takes longer to answer; hard to grade fairly and consistently; takes longer to grade
Best use: assessing the highest-level objectives

SUMMARY
Each question type has pros and cons.
Question type should be guided by the learning objective to be assessed.
Options for assessing the higher taxonomy levels are limited.

CONSTRUCTING OPEN-ENDED QUESTIONS

WELL-DEVELOPED ESSAY QUESTIONS
Mirror well-defined learning objectives
Assess the most appropriate content types
Require content recall, evaluation, and reasoning
Are clearly written
Provide boundaries
Have well-defined grading criteria

APPROPRIATE CONTENT
Content that justifies high-level mastery
Construction, higher-order taxonomy:
  Analysis (analyze, compare, contrast, interpret)
  Evaluation (evaluate, explain, justify)
  Synthesis (develop, construct, modify)
Complex, multi-step thought processes
Simulation of real-world processes (“given a patient-case scenario”)
If content or processes can be assessed by methods other than essay questions, they probably should be.

KEY ELEMENTS
An ideal essay question requires students to:
  Recall facts
  Make an evaluative judgment or develop a novel solution
  Explain the reasoning behind their response
The question itself should include:
  The task
  A problem situation

WRITING FOR CLARITY
Ensure the question requires higher-order thinking
Make sure the task is defined and focused
Make sure the problem situation includes adequate detail

SETTING BOUNDARIES
Increasing structure prevents grading problems and bluffing
Avoid indeterminate questions; students can redefine them and answer with information they know well
Give time and space limits
Establish rules for answers

ESTABLISHING GRADING CRITERIA
Identify specific grading criteria a priori:
  Create a model answer
  Assign point values
  Peer review the question and key
  Identify the essential elements of the response
Determine whether partial credit is allowable and how it will be awarded
Grade blinded

DIFFERENT GRADING APPROACHES
Comprehension / understanding: screen responses for key elements and assign points for each element present; scores reflect presence of content only (potential for bluffing)
Reasoning / complex processes: must assess the complete response (time consuming); consider a rubric approach to limit subjectivity
Problem–solution: require students to show their work; multiple steps in the process can be graded
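Where points are assigned per key element, the tally can be made mechanical so different graders reach the same total. The sketch below is a minimal illustration of that idea, not part of the original presentation; the rubric element names, point values, and partial-credit handling are hypothetical assumptions.

```python
# Minimal rubric-tally sketch. Element names and point values are hypothetical,
# not taken from the presentation.
RUBRIC = {
    "identifies the drug interaction": 2.0,
    "recommends a dose adjustment": 2.0,
    "justifies the recommendation": 3.0,
    "states a monitoring plan": 1.0,
}

def score_response(elements_found, allow_partial=True):
    """Tally points for rubric elements a grader marked as present.

    elements_found maps element name -> credited fraction (0.0 to 1.0).
    With allow_partial=False, anything below full credit earns 0 points.
    """
    total = 0.0
    for element, points in RUBRIC.items():
        fraction = elements_found.get(element, 0.0)
        if not allow_partial and fraction < 1.0:
            fraction = 0.0
        total += points * fraction
    return total

# Example: full credit on two elements, half credit on one, one element missing.
print(score_response({
    "identifies the drug interaction": 1.0,
    "recommends a dose adjustment": 0.5,
    "justifies the recommendation": 1.0,
}))  # 2.0 + 1.0 + 3.0 = 6.0 of 8 possible points
```

Keeping the per-element point values in one place also makes the "determine partial credit in advance" step concrete: the policy is a single flag rather than a per-response judgment call.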

GRADING TIPS
The grading scheme should be easily interpreted by other graders: self-explanatory, with point values that can be tallied easily
With multiple graders, encourage frequent communication and comparison of results
Before grading, screen a sample of responses for consistency
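When several graders share a question, comparing their scores on the same sample of responses makes drift visible early. The following is a small sketch of one way to do that, assuming each grader's scores are stored as a list aligned by student; the tolerance value is an arbitrary illustration, not a recommendation from the presentation.

```python
# Hypothetical consistency check: flag responses where two graders' scores
# (aligned by student) differ by more than a chosen tolerance.
def flag_disagreements(scores_a, scores_b, tolerance=1.0):
    """Return indices of responses whose scores differ by more than `tolerance` points."""
    return [i for i, (a, b) in enumerate(zip(scores_a, scores_b))
            if abs(a - b) > tolerance]

grader_1 = [6.0, 7.5, 4.0, 8.0]
grader_2 = [6.5, 5.0, 4.0, 7.5]
print(flag_disagreements(grader_1, grader_2))  # [1] -> discuss response 1 before continuing
```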

OTHER ISSUES
Exam length
Addressing unidentified “correct” answers
Partial credit versus all-or-none scoring
Practice examples: provide multiple examples, but not the same questions as on the exam