
Measuring Human Performance

Introduction
Kirkpatrick (1994) provides a very usable model for measurement across four levels: Reaction, Learning, Behavior, and Results. These categories are discrete and can be measured. The goal of this presentation is to bring to light many of the topics, concerns, and issues that must be understood before carrying out the business of testing, measuring, or evaluating the success of training in today's workforce.

What is a test? What is testing?
- A test is the instrument used to collect data
- Testing is a process of collecting quantifiable information about the degree to which a competence or ability is present in the test taker (Anderson, BC)

Reasons for Testing
- Prerequisite tests
- Entry tests
- Diagnostic tests
- Post-tests
- Equivalency tests

Norm-Referenced vs. Criterion-Referenced Testing

Norm-Referenced Testing
- Test items separate test takers from one another
- Scores fall along a normal distribution curve
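A norm-referenced score only has meaning relative to the norm group. A minimal sketch of that interpretation, using Python's standard library (the group mean, standard deviation, and raw score below are assumed numbers, not from the source):

```python
from statistics import NormalDist

# Norm-referenced interpretation: a raw score is located within the
# norm group's distribution.  Mean 75 and SD 8 are assumed values.
norm = NormalDist(mu=75, sigma=8)
raw = 82
z = (raw - norm.mean) / norm.stdev   # z-score: 0.875
percentile = norm.cdf(raw)           # share of the norm group scoring below this
print(z, round(percentile, 3))
```

The same raw score of 82 would earn a different percentile against a different norm group, which is exactly what distinguishes this approach from criterion referencing.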

Criterion-Referenced Testing
- Test items based on specific objectives
- Mastery curve: skewed away from the normal distribution
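By contrast, a criterion-referenced score is judged against a fixed cut score tied to the objectives, not against other test takers. A small sketch (the scores and the 80% cut are invented for illustration):

```python
# Criterion-referenced interpretation: each examinee either reaches the
# mastery criterion or does not; other examinees' scores are irrelevant.
scores = [92, 68, 85, 74, 88, 95, 80]   # assumed raw scores
cut = 80                                # assumed mastery cut score
mastery = [s >= cut for s in scores]
print(f"{sum(mastery)} of {len(scores)} reached mastery")
```

Note that in principle every examinee could reach mastery, which is why criterion-referenced score distributions are often skewed rather than normal.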

SKA
- Skill
- Knowledge
- Attitude

Domains of Learning
- Cognitive
- Affective
- Psychomotor

Bloom’s Taxonomy for Cognitive Levels
- Knowledge
- Comprehension
- Application
- Analysis
- Synthesis
- Evaluation

Krathwohl’s Taxonomy for Affective Levels
- Receiving
- Responding
- Valuing
- Organization
- Characterization by a value or value complex

Simpson’s Taxonomy for Psychomotor Levels
- Perception
- Set
- Guided Response
- Mechanism
- Complex Overt Response
- Adaptation
- Origination

Test Items Related to Bloom’s Taxonomy
Multiple Choice: the most flexible format across the taxonomy, especially the first three levels.
Advantages:
- Low guessing probability
- Diagnostic capabilities
- Easy to grade
- Supports statistical analysis

Multiple Choice (cont.)
Disadvantages:
- Difficult to write
- Provides cues for recall
- Does not evaluate higher-level cognition well
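The "statistical analysis" advantage of multiple choice usually means classical item analysis. A minimal sketch of the two standard statistics, difficulty and discrimination (the response vector is invented: 1 = correct, examinees sorted from highest to lowest total score):

```python
# Classical item analysis for one multiple-choice item.
# difficulty p = proportion answering correctly
# discrimination D = p(upper group) - p(lower group)
responses = [1, 1, 1, 1, 0, 1, 0, 0, 1, 0]   # assumed data, sorted by total score

p = sum(responses) / len(responses)                  # difficulty: 0.6
top, bottom = responses[:5], responses[-5:]          # upper vs. lower half
D = sum(top) / len(top) - sum(bottom) / len(bottom)  # discrimination: ~0.4
print(p, round(D, 2))
```

A positive D means stronger examinees get the item right more often than weaker ones, which is what a diagnostic, discriminating item should do.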

True and False
Could be used at all levels, but…
Advantages:
- Easy to write
- Easy to score
- Supports item analysis

True/False (cont.)
Disadvantages:
- 50/50 guess factor
- Often used when multiple choice seems too hard to write
Reliability is so poor that these items have very little evaluation value. So why do teachers often include true/false?
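The 50/50 guess factor can be quantified with a binomial calculation. A sketch, assuming a 10-item true/false test and a 70% passing cut (both numbers invented for illustration):

```python
from math import comb

# Probability that a pure guesser scores at least 7 of 10 on a
# true/false test: sum the binomial tail at p = 1/2.
n, cut = 10, 7
p_pass = sum(comb(n, k) for k in range(cut, n + 1)) / 2**n
print(round(p_pass, 4))  # 0.1719
```

Roughly a one-in-six chance of passing by luck alone, which is why true/false scores carry so little evaluation value.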

Matching
Best suited for the Application level; not recommended for any level by this author.
Advantages:
- Easy to write
- Easy to grade
- Supports statistical analysis

Matching (cont.)
Disadvantages:
- Exercises only the two lower learning levels
- Process of elimination raises the odds of guessing correctly
- Low reliability
Why would a teacher use matching?
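The guessing problem with matching can be made concrete. A sketch that brute-forces every random pairing of n premises with n responses (n = 4 chosen arbitrarily; each response used exactly once):

```python
from itertools import permutations

# Random matching: assign responses to premises in a random one-to-one
# pairing, then count how many pairs land correctly on average.
n = 4
perms = list(permutations(range(n)))
expected = sum(sum(p[i] == i for i in range(n)) for p in perms) / len(perms)
print(expected)  # 1.0
```

The expected number of correct matches is 1 regardless of n, and elimination makes the final pairing free once the others are known, so observed scores overstate knowledge.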

Fill in the Blank
Best suited for the lower levels.
Advantage:
- Recall is essential; few clues are given
Disadvantages:
- Limited to a single word or phrase
- Grading anything beyond a single word or phrase enters the realm of subjective grading, with poor reliability

Short Answer
Can reach higher-order thinking.
Advantages:
- Easy to write
- Produces original responses
Disadvantages:
- Essentially the same as fill in the blank: reliability suffers

Essay
The best format for higher-order thinking.
Advantages:
- Measures higher-order thinking
- Measures creative ability
- Measures writing ability

Essay (cont.)
Disadvantages:
- Tough to grade; forget statistical analysis
You will see this format often in Master’s and Ph.D. classes.

Validity
- Does the test measure what it is supposed to measure?
- How close to the bull’s-eye did it hit?

Reliability
- How consistent is the test?
- Is there a tight pattern of hits?

Types of Validity
- Concurrent validity
- Content validity
- Criterion-related validity
- Predictive validity
- Construct validity

Types of Reliability
- Test-retest reliability
- Inter-rater reliability

What is the real score of a test?
- An error factor must be considered
- Observed score = true score + error
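In classical test theory, the size of that error factor is the standard error of measurement, SEM = SD × √(1 − reliability). A sketch of the error band around an observed score (the SD, reliability, and score below are assumed values):

```python
from math import sqrt

# Classical test theory: observed = true score + error.
# SEM sizes the error band around any observed score.
sd, reliability = 8.0, 0.91          # assumed test SD and reliability
sem = sd * sqrt(1 - reliability)     # 8 * 0.3 = 2.4
observed = 82
low, high = observed - 1.96 * sem, observed + 1.96 * sem  # ~95% band
print(round(sem, 2), (round(low, 1), round(high, 1)))
```

So a reported score of 82 is best read as "the true score most likely lies somewhere between about 77 and 87," not as an exact value.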

Ten Evaluation Instruments for Technical Training
- Interviews
- Questionnaires
- Group discussion
- Critical incident
- Work diaries

Instruments (cont.)
- Performance records
- Simulation/role-play
- Observation
- Written tests
- Performance tests

Designing Tests
Questions you must ask yourself:
- Who is the test designed for?
- What do you want to know?
- How many questions will be required?
- How will it be administered?
- How will it be scored?

Three Methods of Test Construction
- Topic based
- Statistically based
- Objective based

Topic-Based Tests
- Selection done by chapter
- Selection done by topic
- Selection done by the importance of the topic

Limitations of the Topic System
- The procedure lacks precision
- Does not identify the intended test takers
- Not designed at the learner’s level
- Does not specify competencies

Statistical Selection
- Items statistically selected
- Standardized
- Norm referenced

Limitations of Statistical Selection
- What is measured is not specific
- Lacks the precision of a criterion-referenced test
- Items are difficult to select

Objectives-Based Tests
- Based on defined competencies
- Applies to criterion-referenced tests and scores

Testing and Kirkpatrick’s Four Levels
- Moving up the levels, from the reactions of individuals to the results of the company, the information becomes more difficult to obtain
- Moving up the levels, the information also becomes more usable

Four Levels
- REACTION
- LEARNING
- BEHAVIOR
- RESULTS

Reaction
Checking individuals’ reactions often means measuring “customer satisfaction”:
- “Happy sheets” (rating sheets)
- Observations
- Other
How can you quantify the responses?

Learning
- Measurable behavior changes in the three “SKA” dimensions

Behavior
Behavior change due to the training program, measured through:
- Surveys
- Interviews
- Other

Results
Measurable by looking at changes in:
- Production
- Quality
- Safety
- Sales
- Other