Developing Assessment Instruments


Developing Assessment Instruments (Dick & Carey, Chap. 7)

Criterion-Referenced Tests
- Designed to measure explicit behavioral objectives
- Used to evaluate:
  - learner performance
  - effectiveness of the instruction

Criterion-Referenced
- Also called objective-referenced
- Refers directly to an explicit "criterion," a specified performance
- A criterion-referenced test must:
  - match each test item to its performance objective
  - stipulate the degree of mastery of the skill

Types of Criterion Tests: Pretest
1. Consists of items that:
   - measure entry-behavior skills
   - test skills to be taught
   - draw from skills below the entry-behavior line
2. Helps determine the appropriateness of required entry skills.
3. Used during the formative evaluation process; may be discarded in the final version of the instruction.

Types of Criterion Tests: Posttest
1. Assesses all the objectives, focusing on terminal objectives
2. Helps identify ineffective instructional segments
3. Used during the design process; may eventually be modified to measure only terminal objectives

Designing Tests for Learning Domains
- Intellectual skills & verbal information: paper-and-pencil tests
- Attitudes: state a preference or choose an option
- Psychomotor skills: performance quantified on a checklist; subordinate skills tested in paper-and-pencil format

Determining Mastery Levels
- Approach #1: mastery defined as the level of performance normally expected from the best learners; arbitrary (norm-referenced)
- Approach #2: mastery defined in statistical terms, beyond mere chance; the required level varies with the critical nature of the task (example: nuclear work vs. painting a house)
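Approach #2's "beyond mere chance" idea can be made concrete with a short calculation: for a multiple-choice test, find the lowest cut score that a pure guesser would rarely reach. This is a hedged sketch, not a procedure from the chapter; the four-option format (guess probability 0.25) and the 5% chance threshold are assumptions chosen for illustration.

```python
from math import comb

def chance_probability(n_items: int, cutoff: int, p_guess: float) -> float:
    """P(score >= cutoff) for a learner who guesses on every item (binomial)."""
    return sum(comb(n_items, k) * p_guess**k * (1 - p_guess)**(n_items - k)
               for k in range(cutoff, n_items + 1))

def mastery_cutoff(n_items: int, p_guess: float = 0.25, alpha: float = 0.05) -> int:
    """Smallest score a pure guesser reaches with probability below alpha."""
    for cutoff in range(n_items + 1):
        if chance_probability(n_items, cutoff, p_guess) < alpha:
            return cutoff
    return n_items

print(mastery_cutoff(20))  # cut score for a 20-item, 4-option test
```

For a truly critical task (the "nuclear work" case), the designer would raise the cut score well above this statistical floor rather than stop at it.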

Writing Test Items
What should test items do?
- Match the behavior of the objective
- Use the correct verb to specify the behavior
- Match the conditions of the objective
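The behavior-matching check above can be automated in a rough way if objectives and items are stored with their verbs tagged. The data layout below is hypothetical (the `objectives` and `items` structures and the field names are assumptions, not anything from Dick & Carey); the point is only that a verb mismatch is mechanically detectable.

```python
# Hypothetical records: each objective and item carries its behavior verb.
objectives = {
    "obj1": {"verb": "classify", "conditions": "given six unlabeled samples"},
}
items = [
    {"objective": "obj1", "verb": "classify", "stem": "Classify each sample..."},
    {"objective": "obj1", "verb": "define",   "stem": "Define classification."},
]

def mismatched_items(objectives, items):
    """Return items whose behavior verb differs from their objective's verb."""
    return [it for it in items
            if it["verb"] != objectives[it["objective"]]["verb"]]
```

Here the "define" item would be flagged: defining a term is a different behavior from classifying samples, so it does not test the stated objective.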

Writing Test Items
How many test items do you need?
- Determined by the learning domain
- Intellectual skills require three or more items
- For a wide range of possible items, use a random sample
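The "random sample" advice can be sketched as drawing a fixed number of items per objective from a larger pool. The pool below and the three-per-objective figure are illustrative assumptions (three matches the slide's minimum for intellectual skills).

```python
import random

# Hypothetical item pool: objective id -> candidate item ids.
item_pool = {
    "obj1": ["obj1_a", "obj1_b", "obj1_c", "obj1_d", "obj1_e"],
    "obj2": ["obj2_a", "obj2_b", "obj2_c", "obj2_d"],
}

def draw_items(pool, per_objective=3, seed=None):
    """Randomly sample a fixed number of items for each objective."""
    rng = random.Random(seed)  # seed for a reproducible test form
    return {obj: rng.sample(items, min(per_objective, len(items)))
            for obj, items in pool.items()}

test_form = draw_items(item_pool, per_objective=3, seed=42)
```

Seeding the generator lets the designer regenerate the same form later, while different seeds yield parallel forms from the same pool.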

Writing Items (continued)
What types (true/false, multiple choice, etc.) should you use?
- Clues are provided by the behavior listed in the objective
- Review "Types of Test Items" in this chapter (p. 148)

Writing Items (continued)
Item types are tempered by:
- amount of testing time
- ease of scoring
- amount of time to grade
- probability of guessing
- ease of cheating, etc.
- availability of simulations

Writing Items (continued)
What types are inappropriate?
- True/false for a definition: it tests discrimination, not the definition itself
- Acceptable alternatives: choosing from "best possible" options for simulations, or listing the steps

Constructing Test Items
Consider:
- vocabulary
- setting of the test item (familiar vs. unfamiliar)
- clarity: include all necessary information
- trick questions: avoid double negatives, misleading information, etc.

Other Factors
- Sequencing items: consider clustering by objective
- Test directions: clear and concise; a general section plus section-specific directions
- Evaluating tests / test items
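Clustering by objective is a one-line sort once each item is tagged with its objective. The draft item list below is a made-up example; the technique is just ordering by objective id and then grouping.

```python
from itertools import groupby

# Hypothetical draft: (objective id, item id) pairs in arbitrary writing order.
draft_items = [("obj2", "i5"), ("obj1", "i2"), ("obj1", "i1"), ("obj2", "i4")]

# Cluster items by objective before sequencing the final form.
clustered = sorted(draft_items, key=lambda pair: pair[0])  # stable sort
sections = {obj: [item for _, item in group]
            for obj, group in groupby(clustered, key=lambda pair: pair[0])}
```

Each key of `sections` then corresponds to one test section, which is also where section-specific directions would attach.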

Measuring Performance, Products, & Attitudes
- Write directions to guide learner activities and

Evaluating Performance, Products, & Attitudes
- Construct an instrument to evaluate these activities: a product, performance, or attitude
- Sometimes includes both a process and a product (for example, TRDEV 518)

Test Directions for Performance, Products, & Attitudes
Determine:
- the amount of guidance
- special conditions (time limits, special steps, etc.)
- the nature of the task (i.e., complexity)
- the sophistication level of the audience

Assessment Instruments for Performance, Products, & Attitudes
- Identify the elements to be evaluated (cleanliness, finish, tolerance of the item, etc.)
- Paraphrase each element
- Sequence the items on the instrument
- Select the type of judgment for the rater
- Determine instrument scoring

Formats for Assessments of Performance, Products, & Attitudes
- Checklist
- Rating scale
- Frequency counts
- Etc.
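The three named formats differ mainly in the kind of judgment the rater records. The sketch below contrasts them on an invented woodworking-style product assessment; the element names and scale are assumptions, not from the chapter.

```python
from collections import Counter

# Checklist: yes/no judgment per element.
checklist = {"work area clean": True, "tolerances met": False, "finish smooth": True}

# Rating scale: graded judgment per element (here an assumed 1-3 scale).
rating_scale = {"work area clean": 3, "tolerances met": 1, "finish smooth": 2}

# Frequency count: tally of observed behaviors during a performance.
observed = ["asks question", "asks question", "interrupts"]

checklist_score = sum(checklist.values())   # number of elements checked "yes"
rating_total = sum(rating_scale.values())   # summed ratings across elements
frequency_counts = Counter(observed)        # behavior -> number of occurrences
```

A checklist only says whether each element was present; the rating scale adds degree; frequency counts suit behaviors that recur, such as attitude indicators.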

Performance, Products, & Attitudes: Scoring Guidelines?

Evaluating Congruency
- Skills, objectives, and assessments should all refer to the same behaviors
- To check for congruency, construct a Congruency Evaluation Chart that includes: subskills, behavioral objectives, and test items
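A Congruency Evaluation Chart is essentially a table whose rows tie each subskill to its behavioral objective and to the test item(s) that measure it; gaps in the item column expose objectives that nothing assesses. The row contents below are invented for illustration.

```python
# Hypothetical chart rows: subskill -> objective -> test item ids.
chart = [
    {"subskill": "5.1", "objective": "Classify examples of X", "items": ["12", "13"]},
    {"subskill": "5.2", "objective": "State the rule for X",   "items": []},
]

def uncovered_subskills(chart):
    """Subskills that have a behavioral objective but no test item."""
    return [row["subskill"] for row in chart if not row["items"]]
```

Here subskill 5.2 would be flagged: its objective exists, but no item on the instrument measures it, so the test and the objectives are not yet congruent.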