Review: Alternative Assessments
- Alternative/authentic assessment: real-life setting, performance based
- Techniques: observation, individual or group projects, portfolios, exhibitions, student logs or journals

Developing Alternative Assessments
1. Determine the purpose
2. Define the target
3. Select the appropriate assessment task
4. Set performance criteria
5. Determine assessment quality

Review: Grading
Making grading fair, reliable, and valid:
- Determine defensible objectives
- Ability-group students
- Construct tests which reflect objectivity
- Remember that no test is perfectly reliable
- Grades should reflect status, not improvement
- Do not use grades to reward good effort
- Consider grades as measurements, not evaluations
Grading process: objectives of instruction → test selection and administration → results compared to standards → final grades

Cognitive Assessments: Physical Fitness Knowledge
HPER 3150, Dr. Ayers

Test Planning
Types:
- Mastery (e.g., driver's license exam)
- Achievement (e.g., mid-term exam)

Table of Specifications (content-related validity)
- Content objectives: history, values, equipment, etiquette, safety, rules, strategy, techniques of play
- Educational objectives (Bloom's taxonomy, 1956): knowledge, comprehension, application, analysis, synthesis, evaluation
(An illustrative blueprint layout is sketched below.)
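A table of specifications is simply a two-way grid crossing content objectives with educational objectives, with the number of planned items in each cell. As a minimal sketch (the content areas, Bloom levels, and item counts below are hypothetical, not the Ask-PE blueprint), such a grid can be laid out and its totals checked like this:

```python
# Hypothetical table of specifications: maps (content area, Bloom level) to the
# number of planned test items. All areas, levels, and counts are illustrative.
content_areas = ["rules", "strategy", "techniques of play", "safety"]
bloom_levels = ["knowledge", "comprehension", "application", "analysis"]

blueprint = {
    ("rules", "knowledge"): 3,
    ("rules", "comprehension"): 2,
    ("strategy", "application"): 4,
    ("strategy", "analysis"): 2,
    ("techniques of play", "knowledge"): 3,
    ("techniques of play", "application"): 3,
    ("safety", "knowledge"): 2,
    ("safety", "comprehension"): 1,
}

print(f"Planned test length: {sum(blueprint.values())} items")

# Row totals show how heavily each content area is weighted; column totals show
# how heavily each cognitive level is weighted. That weighting is the evidence
# offered for content-related validity.
for area in content_areas:
    print(f"{area:>20}: {sum(n for (a, _), n in blueprint.items() if a == area)} items")
for level in bloom_levels:
    print(f"{level:>20}: {sum(n for (_, b), n in blueprint.items() if b == level)} items")
```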

Table of Specifications for a 33-Item Exercise Physiology Concepts Test (Ask-PE; Ayers, 2003): T of SPECS-E.doc

Test Characteristics
- When to test: often enough for reliability, but not so often that testing becomes useless
- How many questions (p. guidelines): more items yield greater reliability (see the formula sketched below)
- Format to use (p. 186 guidelines): oral (no), group (no), written (yes); open book/note and take-home formats each have advantages and disadvantages
- Question types: semi-objective (short-answer, completion, mathematical), objective (true/false, matching, multiple-choice, classification), essay
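The rule of thumb that more items yield greater reliability can be made concrete with the Spearman-Brown prophecy formula, a standard classical test theory result (not given on the slide) for the predicted reliability when a test of reliability r is lengthened by a factor k:

```latex
% Spearman-Brown prophecy formula: r is the reliability of the current test,
% k is the factor by which the test is lengthened (new length / old length).
\[
  r_{kk} = \frac{k\,r}{1 + (k - 1)\,r}
\]
% Worked example: doubling a 20-item test with r = 0.60 (k = 2) predicts
% r_kk = (2)(0.60) / (1 + (1)(0.60)) = 1.20 / 1.60 = 0.75
```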

Semi-objective Questions
- Types: short-answer, completion, mathematical
- When to use: factual and recall material
- Weaknesses
- Construction recommendations (p. 190)
- Scoring recommendations

Objective Questions
- Types: true/false, matching, multiple-choice
- When to use (multiple-choice is the most ideal): FORM7 (B,E).doc
- Multiple-choice guidelines (pg. )
- Construction recommendations (p. )
- Scoring recommendations

Cognitive Assessments I
Explain one thing that you learned today to a classmate.

Review: Cognitive Assessments I
- Test types: mastery, achievement
- Table of Specifications (value, use, purpose)
- Question types:
  - Semi-objective: short-answer, completion, mathematical
  - Objective: true/false, matching, multiple-choice
  - Essay (we did not get this far)

Figure 10.1 The difference between extrinsic and intrinsic ambiguity (A is the correct answer): an item can be too easy, extrinsically ambiguous (weaker students miss it), or intrinsically ambiguous (all foils look equally appealing).

Essay Questions
- When to use: definitions, interpretations, comparisons
- Weaknesses
- Scoring objectivity
- Construction and scoring recommendations (p. )

Administering the Written Test
- Before the test
- During the test
- After the test

Characteristics of Good Test Items
- Leave little to "chance" (reliable)
- Relevant (valid)
- Average difficulty
- Discriminate: answered correctly by more knowledgeable students and missed by less knowledgeable students
- Time-consuming to write

Quality of the Test
- Reliability and validity
- Overall test quality
- Individual item quality

Item Analysis
Used to determine the quality of individual test items.
- Item difficulty: percent answering correctly
- Item discrimination: how well the item "functions"; also how "valid" the item is, using the total test score as the criterion

Item Difficulty
- Ranges from 0 (nobody answered correctly) to 100 (everybody answered correctly)
- Goal: about 50%
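In formula form (the standard percent-correct definition, consistent with the description above):

```latex
% Item difficulty: n_c = number of examinees answering the item correctly,
% N = number of examinees attempting the item.
\[
  P = \frac{n_c}{N} \times 100
\]
% Worked example: 30 of 60 students answer correctly, so P = 30/60 x 100 = 50,
% which matches the stated goal of roughly 50%.
```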

Item Discrimination
- Below 20%, or negative: poor
- 20-40%: acceptable
- Goal: above 40%
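A minimal sketch of how both indices are usually computed from a scored response matrix; the upper/lower 27% grouping rule and the toy data are illustrative assumptions, not drawn from the slides:

```python
# Illustrative item analysis: percent-correct difficulty plus an upper-lower
# discrimination index for each item on a test scored 0/1.

def item_analysis(scores, group_fraction=0.27):
    """scores: one list of 0/1 item scores per student."""
    n_students = len(scores)
    n_items = len(scores[0])

    # Rank students by total test score, then form upper and lower groups.
    ranked = sorted(scores, key=sum, reverse=True)
    n_group = max(1, round(n_students * group_fraction))
    upper, lower = ranked[:n_group], ranked[-n_group:]

    results = []
    for item in range(n_items):
        correct = sum(student[item] for student in scores)
        difficulty = 100 * correct / n_students            # percent answering correctly
        p_upper = sum(s[item] for s in upper) / n_group    # proportion correct, upper group
        p_lower = sum(s[item] for s in lower) / n_group    # proportion correct, lower group
        discrimination = p_upper - p_lower                 # positive values mean the item "functions"
        results.append((difficulty, discrimination))
    return results

if __name__ == "__main__":
    # Hypothetical 0/1 scores for six students on three items.
    scores = [
        [1, 1, 0],
        [1, 1, 1],
        [1, 0, 0],
        [0, 1, 0],
        [1, 0, 1],
        [0, 0, 0],
    ]
    for i, (diff, disc) in enumerate(item_analysis(scores), start=1):
        print(f"Item {i}: difficulty = {diff:.0f}%, discrimination = {disc:+.2f}")
```

An item that upper-group (more knowledgeable) students get right and lower-group (less knowledgeable) students miss yields a positive discrimination value, which is exactly the behavior listed above under characteristics of good test items.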

Figure 10.4 The relationship between item discrimination and difficulty (discrimination can be highest for items of moderate difficulty; items that everyone answers correctly, or that nobody does, cannot discriminate)

Sources of Written Tests
- Professionally constructed tests (FitSmart, Ask-PE)
- Textbooks (McGee & Farrow, 1987)
- Periodicals, theses, and dissertations

Questionnaires
1. Determine the objectives
2. Delimit the sample
3. Construct the questionnaire
4. Conduct a pilot study
5. Write a cover letter
6. Send the questionnaire
7. Follow up with non-respondents
8. Analyze the results and prepare the report

Constructing Open-Ended Questions
Advantages:
- Allow for creative answers
- Allow the respondent to give detailed answers
- Can be used when the number of possible categories is large
- Probably better when complex questions are involved
Disadvantages:
- Analysis is difficult because of non-standard responses
- Require more respondent time to complete
- Can be ambiguous
- Can result in irrelevant data

Constructing Closed-Ended Questions
Advantages:
- Easy to code
- Result in standard responses
- Usually less ambiguous
- Easy for respondents to answer
Disadvantages:
- Frustration if the correct category is not present
- Respondent may choose an inappropriate category
- May require many categories to capture all responses
- Subject to possible recording errors

Factors Affecting Questionnaire Response
- Cover letter: be brief and informative
- Ease of return: you DO want it back!
- Neatness and length: be professional and brief
- Inducements: money and flattery
- Timing and deadlines: consider the time of year and allow sufficient time to complete
- Follow-up: at least once (after about two follow-ups you have reached the best response rate you will get)

The BIG Issues in Questionnaire Development
- Reliability: consistency of measurement
- Validity: truthfulness of response
- Representativeness of the sample: to whom can you generalize?

Cognitive Assessments II
Ask for clarity on something that challenged you today.