Review: Cognitive Assessments II: ambiguity (extrinsic/intrinsic); item difficulty/discrimination relationship; questionnaires assess opinions/attitudes; open-/closed-ended item construction (+/-); factors affecting response rate
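
The item difficulty/discrimination relationship reviewed above can be made concrete with the standard indices: difficulty (p) is the proportion of students answering an item correctly, and discrimination (D) compares how the top- and bottom-scoring groups did on that item; items of moderate difficulty tend to allow the largest D. Below is a minimal Python sketch using hypothetical class data (not from the course materials); the 27% grouping convention is one common choice, not the only one.

```python
# Minimal sketch: item difficulty (p) and discrimination (D) for one test item.
# Assumes 0/1-scored responses and a total test score for each student (hypothetical data).

def item_difficulty(item_scores):
    """Proportion of students answering the item correctly (0.0-1.0)."""
    return sum(item_scores) / len(item_scores)

def item_discrimination(item_scores, total_scores, group_fraction=0.27):
    """Upper-group difficulty minus lower-group difficulty (D index).

    Groups are the top and bottom fractions of students by total test score
    (27% is a common convention).
    """
    n_group = max(1, round(len(total_scores) * group_fraction))
    order = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    lower = [item_scores[i] for i in order[:n_group]]
    upper = [item_scores[i] for i in order[-n_group:]]
    return item_difficulty(upper) - item_difficulty(lower)

# Hypothetical class of 10 students: 1 = correct, 0 = incorrect on this item.
item = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
totals = [48, 45, 20, 40, 25, 44, 38, 18, 42, 47]

print(f"difficulty p = {item_difficulty(item):.2f}")        # 0.70 here (moderate)
print(f"discrimination D = {item_discrimination(item, totals):.2f}")
```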

Psychomotor Assessments HPER 3150 Dr. Ayers

DISCLAIMER: Product vs. Process Testing (ch. 13)

Guidelines for Sport Skills Testing and Motor Performance Tests: acceptable reliability and validity; simple to administer and take; easy-to-understand instructions; inexpensive and do not require extensive equipment; reasonable time for preparation and administration

(Guidelines, continued) Involve only one participant (teacher must assess EACH student individually); encourage correct form; suitable difficulty; interesting and meaningful; exclude extraneous variables; provide for accurate scoring

(Guidelines, continued) Provide target scoring guidelines (if accuracy matters); sufficient trials; yield diagnostic scores

Effective Testing Procedures: Pretest duties (time, forms, procedures, instructions); testing duties (location, materials, cheating, safety, absences, make-ups); posttest duties (transcription; item analysis [practical value; construct validation]; reporting; confidentiality)

Issues in Skills Testing: In addition to reliability and validity (the most important issues), consider feasibility and testing method: objective skills tests or alternative/authentic assessment? (See chapter 8.)

Skills Test Classification: Objective: accuracy-based (targets); repetitive performance (wall-volley; process matters); total body movement (focus: game-like or product-based?); distance or power (focus: game-like or product-based?)

IMPORTANT TEACHING POINT Teach force/speed BEFORE accuracy

Subjective Rating Scales: Relative (rank-order; value? +/-) vs. absolute (evaluation against a fixed standard; using critical elements helps). Common errors: halo effect (bias for/against performer); "standard" error (judges employ different perspectives); central tendency (hesitancy to assign extreme values). Which of these is like norm-referenced assessment?

Rating Scale Suggestions: Develop well-constructed scales; train raters well; explain common rating errors to raters; permit ample time to observe performance; if possible, use multiple raters (see the sketch below).
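
One reason for the multiple-raters suggestion is that disagreement between raters is easy to quantify. As a rough illustration (hypothetical rubric scores, not course data), the sketch below correlates two raters' scores and compares their means; a low correlation or a large mean difference points to the rating errors listed above (e.g., a "standard" leniency/severity error).

```python
# Minimal sketch (hypothetical data): checking agreement between two raters'
# rubric scores with a Pearson correlation and a mean difference.
# The ratings below are made-up 1-5 rubric scores for the same ten performers.

from statistics import mean, pstdev

rater_a = [4, 3, 5, 2, 4, 3, 5, 1, 4, 3]
rater_b = [4, 3, 4, 2, 5, 3, 5, 2, 4, 3]

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = mean((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (pstdev(x) * pstdev(y))

print(f"inter-rater r   = {pearson_r(rater_a, rater_b):.2f}")    # closer to 1.0 = more consistent ranking
print(f"mean difference = {mean(rater_a) - mean(rater_b):+.2f}")  # far from 0 suggests a leniency/severity ('standard') error
```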

Developing Well-Constructed Scales: State objectives in terms of observable behavior; select traits that determine success; define selected traits in terms of observable behavior; select and develop the rating instrument; define degrees of success; test and revise the rating scale; use the scale in an actual testing situation.

Ability or Skill? Ability: a general, innate psychomotor trait (e.g., overarm throw). Skill: a specific, learned psychomotor capacity (e.g., volleyball spike). Specificity is determined with concurrent validity (sport-specific).
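
Concurrent validity, mentioned above, is usually reported as the correlation between the new test and a criterion measured at about the same time. The sketch below is a hypothetical example (a volleyball spike test against expert game-play ratings); the data and the choice of criterion are assumptions for illustration, and it relies on statistics.correlation from Python 3.10+.

```python
# Minimal sketch (hypothetical data): estimating concurrent validity as the
# correlation between a volleyball spike skill test and a criterion measure
# (expert game-play ratings) collected at about the same time.
# Uses statistics.correlation, available in Python 3.10+.

from statistics import correlation

spike_test_scores = [12, 18, 9, 15, 20, 7, 14, 17, 11, 16]               # skill test (points)
expert_ratings    = [3.0, 4.5, 2.5, 3.5, 5.0, 2.0, 3.5, 4.0, 3.0, 4.0]   # criterion (1-5 rating)

r = correlation(spike_test_scores, expert_ratings)
print(f"concurrent validity coefficient r = {r:.2f}")
# A high positive r supports using the quicker skill test in place of the
# time-consuming criterion; a low r suggests the test is not sport-specific enough.
```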

Measurement Aspects of the Domain of Human Performance (abilities): muscular strength, speed, agility, anaerobic power, flexibility, balance, kinesthetic perception. Tell a friend one specific example of two different ability measures.

Effective testing consists of: including sport-relevant variables; selecting reliable and valid tests; developing sport-specific protocols; controlling test administration; maintaining individuals' right to respect; repeating the tests periodically; interpreting the results for the performer and interested parties (e.g., coaches, parents).

Purposes of Human Performance Testing and Analysis: selection, classification, diagnosis. Tell another friend one specific example of how to use testing to do one of these.
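
As one concrete example of the classification purpose, raw scores can be converted to z-scores and students grouped against cutoffs. The sketch below uses hypothetical agility-run times and purely illustrative cutoffs (none of the names, times, or cutoffs come from the course); the negation handles the fact that lower times are better.

```python
# Minimal sketch (hypothetical data): using z-scores to classify students into
# ability groups from an agility-run test, one example of the "classification"
# purpose. Lower times are better, so the z-score is negated before grouping.

from statistics import mean, pstdev

times = {"Avery": 10.2, "Blake": 11.5, "Casey": 9.8, "Drew": 12.4, "Emery": 10.9}

m, s = mean(times.values()), pstdev(times.values())

def group(z):
    """Assign an ability group from a z-score (cutoffs are illustrative only)."""
    if z >= 0.5:
        return "advanced"
    if z <= -0.5:
        return "developing"
    return "intermediate"

for name, t in times.items():
    z = -(t - m) / s          # negate: faster time -> higher (better) z-score
    print(f"{name:6s} time={t:4.1f}s  z={z:+.2f}  group={group(z)}")
```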

Psychomotor Assignment: With a partner, find 1 skill test and 1 ability (psychomotor) test that you think will be useful to you professionally and bring a copy to our next class (description, equipment, procedures, recording form). We will select a few of these to conduct in lab on Friday, so choose carefully!

Psychomotor Assessments: Explain why one thing you learned today will matter to you as a professional.