Review: Performance-Based Assessments

Presentation transcript:

Review: Performance-Based Assessments. Performance-based assessment; real-life setting; H.O.T.S. (higher-order thinking skills). Techniques: observation, individual or group projects, portfolios, performances, student logs or journals.

Developing Performance-Based Assessments: determining the purpose of assessment; deciding what constitutes student learning; selecting the appropriate assessment task; setting performance criteria.

Review: Grading. The grading process, making grading fair, reliable, and valid: determine defensible objectives; ability-group students; construct tests which reflect objectivity; remember that no test is perfectly reliable; grades should reflect status, not improvement; do not use grades to reward good effort; consider grades as measurements, not evaluations. The sequence: objectives of instruction; test selection and administration; results compared to standards; final grades.
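To make the last step, results compared to standards, concrete: a minimal criterion-referenced grading sketch in Python. The letter-grade cutoffs are hypothetical, not taken from the course materials.

```python
# Criterion-referenced grading sketch: a score is compared to fixed
# standards (status), not to improvement or effort.
# The cutoffs below are hypothetical.
def assign_grade(percent_score, standards=((90, "A"), (80, "B"), (70, "C"), (60, "D"))):
    """Return the letter grade for the highest cutoff the score meets."""
    for cutoff, letter in standards:
        if percent_score >= cutoff:
            return letter
    return "F"

print(assign_grade(84))  # B
```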

Cognitive Assessments: Physical Fitness Knowledge. HPER 3150, Dr. Ayers.

Test Planning. Types: mastery (e.g., a driver's license test), which checks that minimum requirements are met; achievement (e.g., a mid-term), which discriminates among levels of accomplishment.

Table of Specifications (content-related validity). Content objectives: history, values, equipment, etiquette, safety, rules, strategy, techniques of play. Educational objectives (Bloom's taxonomy, 1956): knowledge, comprehension, application, analysis, synthesis, evaluation.

Table of Specifications for a 33-item Exercise Physiology Concepts Test (Ask-PE; Ayers, 2003): T of SPECS-E.doc
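As an illustration of how a table of specifications spreads items across content and cognitive levels, here is a Python sketch for a 33-item test. The content areas and Bloom levels come from the slide above; the weights are hypothetical, not those of the actual Ask-PE table.

```python
# Hypothetical weights for illustration; the real Ask-PE table
# (Ayers, 2003) has its own distribution.
content_weights = {
    "history": 0.05, "values": 0.05, "equipment": 0.10, "etiquette": 0.10,
    "safety": 0.15, "rules": 0.20, "strategy": 0.15, "techniques of play": 0.20,
}
bloom_weights = {
    "knowledge": 0.30, "comprehension": 0.25, "application": 0.20,
    "analysis": 0.10, "synthesis": 0.10, "evaluation": 0.05,
}
TOTAL_ITEMS = 33

# Each cell = total items x content weight x cognitive-level weight;
# fractional counts get rounded to whole items when the test is assembled.
for area, cw in content_weights.items():
    cells = "  ".join(f"{level[:4]}={TOTAL_ITEMS * cw * bw:4.1f}"
                      for level, bw in bloom_weights.items())
    print(f"{area:18s} {cells}")
```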

Test Characteristics. When to test: often enough for reliability, but not so often that testing becomes useless. How many questions (p. guidelines): more items yield greater reliability, a claim quantified in the sketch below. Format to use (p. 147 guidelines): oral (no), group (no), written (yes). Open-book/open-note and take-home tests: advantages are lower anxiety and the chance to ask more application questions; disadvantages are less incentive to prepare and uncertainty about who does the work.
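The "more items yield greater reliability" claim can be quantified with the Spearman-Brown prophecy formula, a standard psychometric result (the slide does not name it, so treat the pairing as an assumption):

```python
def spearman_brown(r_current, length_factor):
    """Predicted reliability when test length changes by length_factor
    (Spearman-Brown prophecy formula)."""
    return (length_factor * r_current) / (1 + (length_factor - 1) * r_current)

# Doubling a test whose reliability is .70 raises the estimate to about .82:
print(round(spearman_brown(0.70, 2.0), 2))  # 0.82
```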

Test Characteristics. Question types: semi-objective (short-answer, completion, mathematical); objective (true/false, matching, multiple-choice, classification); essay.

Semi-Objective Questions: short-answer, completion, mathematical. When to use: factual and recall material. Weaknesses. Construction recommendations (p. 151); scoring recommendations (p. 152).

Objective Questions: true/false, matching, multiple-choice. When to use (M-C: most ideal). FORM7 (B,E).doc. Pg. : M-C guidelines. Construction recommendations (p. ); scoring recommendations (p. ).
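The construction and scoring recommendations themselves live in the text at the pages cited. One classical scoring device for objective items, offered here only as an illustration (the book may or may not recommend it), is the correction for guessing, R - W/(k - 1):

```python
def corrected_score(num_right, num_wrong, num_choices):
    """Classical correction for guessing: R - W/(k - 1).
    k = 2 for true/false items, k = 4 for four-option multiple-choice."""
    return num_right - num_wrong / (num_choices - 1)

print(round(corrected_score(30, 8, 4), 2))  # 27.33 on a 4-option M-C test
```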

Figure 8.1: The difference between extrinsic and intrinsic ambiguity (A is correct). Panels: too easy; extrinsic ambiguity (weak students miss the item); intrinsic ambiguity (all foils equally appealing).

Cognitive Assessments I: Explain one thing that you learned today to a classmate.

Review: Cognitive Assessments I. Test types: mastery, achievement. Table of Specifications: identify content, assign cognitive demands, weight areas; provides support for what type of validity? Question types: semi-objective (short-answer, completion, mathematical); objective (true/false, matching, multiple-choice). Which is desirable: intrinsic or extrinsic ambiguity?

Essay Questions. When to use: definitions, interpretations, comparisons. Weaknesses: scoring objectivity. Construction and scoring recommendations (p. ).

Administering the Written Test: before the test; during the test; after the test.

Characteristics of “Good” Tests: reliable; valid; of average difficulty; discriminating (answered correctly by more knowledgeable students and missed by less knowledgeable students); time-consuming to write.

Quality of the Test: Reliability. The role of error in an observed score. Error sources in written tests: inadequate sampling; the examinee's mental or physical condition; environmental conditions; guessing; changes in the field (a dynamic variable being measured).
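One standard way to estimate internal-consistency reliability for a test scored right/wrong is KR-20. The slide names no formula, so this is an illustrative sketch on made-up 0/1 response data:

```python
def kr20(responses):
    """KR-20 reliability for dichotomous items.
    responses: one list of 0/1 item scores per examinee."""
    k = len(responses[0])                      # number of items
    n = len(responses)                         # number of examinees
    totals = [sum(person) for person in responses]
    mean_t = sum(totals) / n
    var_t = sum((t - mean_t) ** 2 for t in totals) / n
    pq = 0.0
    for item in range(k):
        p = sum(person[item] for person in responses) / n
        pq += p * (1 - p)                      # item variance p(1 - p)
    return (k / (k - 1)) * (1 - pq / var_t)

data = [[1, 1, 1, 0, 1], [1, 0, 1, 0, 0], [1, 1, 0, 1, 1],
        [0, 0, 1, 0, 0], [1, 1, 1, 1, 1]]
print(round(kr20(data), 2))  # ~0.65 for this toy data set
```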

Quality of the Test: Validity. CONTENT validity is the key for written tests: is the critical information assessed by the test? A table of specifications helps support validity. Overall test quality is based on individual item quality (steps 1-8, pg. ).

Item Analysis: used to determine the quality of individual test items. Item difficulty: the percent answering correctly. Item discrimination: how well the item "functions", and how "valid" the item is against the total-test-score criterion.

Item Difficulty: ranges from 0 (nobody answered correctly) to 100 (everybody answered correctly). Goal: 50%.

Item Discrimination: below 20% or negative (poor); 20-40% (acceptable); goal: above 40% (a computation sketch follows).
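A sketch of how both indices are computed, using made-up data; texts vary on the exact tail size for the discrimination groups (27-33% is common), so thirds are an assumption here:

```python
def item_stats(item_scores, total_scores):
    """Return (difficulty %, upper-lower discrimination %) for one item."""
    n = len(item_scores)
    difficulty = 100 * sum(item_scores) / n            # percent correct
    ranked = sorted(range(n), key=lambda i: total_scores[i])
    tail = max(1, n // 3)                              # lower/upper thirds
    lower, upper = ranked[:tail], ranked[-tail:]
    p_low = sum(item_scores[i] for i in lower) / tail
    p_high = sum(item_scores[i] for i in upper) / tail
    return difficulty, 100 * (p_high - p_low)

item =   [1, 0, 1, 1, 0, 1, 0, 1, 1, 0, 1, 1]   # 0/1 scores on one item
totals = [38, 22, 41, 35, 18, 44, 25, 39, 30, 20, 42, 33]  # total test scores
d, disc = item_stats(item, totals)
print(f"difficulty={d:.0f}%  discrimination={disc:.0f}%")
```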

Figure 8.4: The relationship between item discrimination and difficulty; moderate difficulty maximizes discrimination.
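The figure's point can be checked directly. With equal-sized upper and lower groups, the largest possible discrimination at overall difficulty p is min(1, 2p) - max(0, 2p - 1) = 1 - |2p - 1|, which peaks at p = 0.5:

```python
# Maximum achievable discrimination as a function of item difficulty p:
for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"difficulty {p:.0%}: max discrimination {1 - abs(2 * p - 1):.2f}")
```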

Sources of Written Tests: professionally constructed tests (FitSmart, Ask-PE); textbooks (McGee & Farrow, 1987); periodicals, theses, and dissertations.

Questionnaires, step by step: determine the objectives; delimit the sample; construct the questionnaire; conduct a pilot study; write a cover letter; send the questionnaire; follow up with non-respondents; analyze the results and prepare the report.
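The follow-up step is easy to mechanize. A toy sketch that flags non-respondents for a second mailing (the IDs are hypothetical):

```python
# Everyone in the delimited sample, and those who have returned the form.
sample = {"A01", "A02", "A03", "A04", "A05"}
returned = {"A01", "A04"}

non_respondents = sorted(sample - returned)   # targets for the follow-up
print("Send follow-up to:", non_respondents)
print(f"Response rate so far: {len(returned) / len(sample):.0%}")
```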

Constructing Open-Ended Questions. Advantages: they allow for creative answers; they let the respondent give detailed answers; they can be used when the set of possible categories is large; they are probably better for complex questions. Disadvantages: analysis is difficult because responses are non-standard; they require more respondent time to complete; they can be ambiguous; they can produce irrelevant data.

Constructing Closed-Ended Questions. Advantages: easy to code; they produce standard responses; they are usually less ambiguous; ease of response relates to an increased response rate. Disadvantages: frustration if the right category is not offered; a respondent may choose an inappropriate category; many categories may be needed to cover ALL responses; they are subject to possible recording errors.

Factors Affecting the Questionnaire Response. Cover letter: be brief and informative. Ease of return: you DO want it back! Neatness and length: be professional and brief. Inducements: money and flattery. Timing and deadlines: mind the time of year, and allow sufficient time to complete. Follow-up: at least once (two follow-ups are about the best you can do for response rate).

The BIG Issues in Questionnaire Development. Reliability: consistency of measurement; stability reliability uses 2-4 weeks between administrations (sketched below). Validity: truthfulness of response, supported by good items, expert review, pilot testing, and confidentiality/anonymity. Representativeness of the sample: to whom can you generalize?
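Stability reliability is simply the correlation between scores from two administrations 2-4 weeks apart; a sketch with made-up scores:

```python
def pearson_r(x, y):
    """Pearson correlation between two score lists of equal length."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

time1 = [42, 35, 28, 44, 31, 38]   # first administration
time2 = [40, 37, 26, 45, 30, 36]   # same examinees, ~3 weeks later
print(round(pearson_r(time1, time2), 2))
```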

Cognitive Assessments II: Ask for clarity on something that challenged you today.