Assessing Student Performance



Performance Objective Given a unit of instruction, develop a valid, reliable, criterion-referenced student assessment instrument that scores at least 70 points on the evaluation checksheet.

Enabling Objectives Distinguish among evaluation, measurement, and testing. Differentiate between formative and summative assessment. Differentiate between criterion-referenced and norm-referenced assessment. Explain validity in student assessment. Explain reliability in student assessment. Plan for criterion-referenced assessment of student performance.

Why assess student performance? Assign grades. Gauge student progress and award credit for task completion. Improve instruction. Motivate students to work. Provide feedback to students.

Basics Evaluation - the general process of estimating student progress toward achieving performance objectives. Measurement - the use of a specific tool to estimate an outcome. Testing - one specific form of evaluation that uses a measurement tool to formally evaluate student performance.

Methods of Assessment Testing (“objective” or “subjective” items) and performance demonstration other than a test (psychomotor task, project, lab skill demonstration, higher-level cognitive task, paper, portfolio, etc.).

Testing is Either Formative or Summative

Formative Testing The process of using measurement tools to conduct evaluation for the purpose of IMPROVING student PERFORMANCE. Student receives feedback of results. Teacher considers results in planning subsequent instruction. Grades are not recorded!

Summative Testing The process of using measurement tools to conduct evaluation for the purpose of ASSIGNING student GRADES. Student receives feedback of results. Teacher considers results in planning subsequent instruction. Grades are recorded.

A Test Can Be Norm-Referenced or Criterion-Referenced

Norm-Referenced Test Measures student performance against other students. Student scores better or worse than other students. Competition is between the student and peers. Grade is based on location on “the curve.” Best students get “A,” poorest students fail.

Normal Curve On most measures of human behavior, graphing individual results will result in a “bell-shaped,” or normal, curve. Most individual scores will fall toward the middle (mean). Fewer scores will fall toward the upper and lower ends. (Figure: bell curve labeled with lowest, average, and highest scores.)

Making a Test Norm-Referenced Make the test intentionally difficult. The average score should be about 50%. Strong students should tend to score high and weak students should tend to score low. Award As for the highest scores, Fs for the lowest scores, and Cs for average scores.
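
A minimal sketch of what grading “on the curve” can look like in practice. The z-score cut-points and the five-letter scale below are illustrative assumptions, not something the slides prescribe; the point is only that each grade depends on a student's standing relative to classmates.

```python
import statistics

def grade_on_curve(scores):
    """Assign letter grades by relative standing (norm-referenced).

    Illustrative z-score cut-points: roughly the top scorers get A,
    the middle of the distribution gets C, the lowest scorers get F.
    """
    mean = statistics.mean(scores)
    stdev = statistics.pstdev(scores) or 1.0  # avoid division by zero

    def letter(score):
        z = (score - mean) / stdev
        if z >= 1.5:
            return "A"
        if z >= 0.5:
            return "B"
        if z >= -0.5:
            return "C"
        if z >= -1.5:
            return "D"
        return "F"

    return [letter(s) for s in scores]

# Example: the same raw score can earn different grades in different classes,
# because the grade depends on the peer group, not on a fixed standard.
print(grade_on_curve([38, 45, 50, 52, 55, 61, 72]))
```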

Criterion-Referenced Test Measures student performance against predetermined standards. Student meets or does not meet the standard. Competition is between the student and the skill, knowledge, or ability. Grade is based on accomplishment. Everybody can earn a passing grade if they meet the standard.
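
For contrast with the curve-based sketch above, a criterion-referenced version might look like the following. The fixed 70-point cut score is an illustrative assumption (echoing the 70-point checksheet standard in the performance objective); the peer group plays no role.

```python
def grade_to_standard(scores, passing_score=70):
    """Assign pass/fail against a predetermined standard (criterion-referenced).

    Every student who meets the standard passes; classmates' scores are irrelevant.
    """
    return ["Pass" if s >= passing_score else "Not yet" for s in scores]

# Four students meet the 70-point standard, three do not;
# in another class, everyone could pass.
print(grade_to_standard([68, 75, 50, 92, 71, 61, 88]))
```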

Making Tests Criterion-Referenced Remember that a performance objective has a condition, a task, and a standard. Criterion = standard. Write test items using the performance objective standard statements and your test will be criterion-referenced. Every objective maps to one or more test items; every test item maps to an objective. Validity is assured.
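
The two mapping rules on this slide (every objective gets at least one item, every item traces back to an objective) can be checked mechanically. The sketch below is a hypothetical illustration; the dictionary-based representation and the sample objectives are assumptions, not part of the deck.

```python
def check_alignment(objectives, item_to_objective):
    """Check the two mapping rules: every objective is covered by at least
    one item, and every item traces back to a stated objective."""
    covered = set(item_to_objective.values())
    uncovered_objectives = [o for o in objectives if o not in covered]
    orphan_items = [i for i, o in item_to_objective.items() if o not in objectives]
    return uncovered_objectives, orphan_items

objectives = ["distinguish terms", "explain validity", "explain reliability"]
item_to_objective = {1: "distinguish terms", 2: "explain validity",
                     3: "explain validity", 4: "identify tools"}  # item 4 tests untaught content

missing, orphans = check_alignment(objectives, item_to_objective)
print("Objectives with no items:", missing)   # ['explain reliability']
print("Items with no objective:", orphans)    # [4]
```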

Characteristics of a Test Validity, Reliability, Objectivity, Discrimination (applies to norm-referenced tests only), Comprehensiveness, “Score-Ability.”

Validity A valid test measures: what it is intended to measure, what the teacher intended for the students to learn, and what the teacher actually taught. A valid test is FAIR.

Questions about Validity Does the test actually measure what you intend it to measure? Did you teach the content and skills that are being tested? Does the test require the student to know or do something other than what you intended and/or taught? Does some aspect of the test prevent the student who may know the material from responding correctly?

Example of a Validity Problem You taught the names and uses of hand tools using lecture with overheads and handouts. But: On the test, you ask the students to describe how to maintain the tools in good condition. The problem is that you taught one thing (names and uses) but tested knowledge of another (maintenance).

Another Example You taught the students to write resumes in the classroom and had them handwrite their own resumes, but provided no computer instruction. But: You have them prepare their resumes on a computer and grade heavily on appearance. The problem is that you are evaluating their word-processing skills at least as much as their resume-writing skills.

A Third Validity Problem You intended to teach the students how to repair a small engine. You taught the lesson in the classroom using overheads, chalkboard, and a teacher demonstration. The students never touched an engine. But: On test day, you give them a disassembled engine to reassemble. The problem is that you thought you taught a psychomotor skill, actually taught only cognitive content, but are testing the psychomotor skill you never taught.

Reliability A reliable test provides accurate and consistent results. Test reliability can be viewed from two perspectives: student reliability and scorer reliability.

Student Reliability Test items are readable and clear. Instructions are simple and unambiguous. Responses test only knowledge of the subject matter, not test-wiseness, reading ability, agility, or other unrelated traits.

Scorer Reliability Items can be scored consistently. The same scorer would produce similar results on repeated evaluations. Different scorers would produce similar results if working independently.
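
One simple, commonly used way to check the last point is percent agreement between two independent scorers. The slides do not name a particular statistic, so the sketch below is an illustrative assumption (a chance-corrected index such as Cohen's kappa is often preferred, but percent agreement shows the idea).

```python
def percent_agreement(scorer_a, scorer_b):
    """Fraction of items on which two independent scorers gave the same mark."""
    if len(scorer_a) != len(scorer_b):
        raise ValueError("Both scorers must rate the same set of items")
    matches = sum(1 for a, b in zip(scorer_a, scorer_b) if a == b)
    return matches / len(scorer_a)

# Two teachers score the same ten essay responses independently (illustrative marks).
teacher_1 = [4, 3, 5, 2, 4, 4, 3, 5, 2, 4]
teacher_2 = [4, 3, 4, 2, 4, 4, 3, 5, 3, 4]
print(f"Agreement: {percent_agreement(teacher_1, teacher_2):.0%}")  # 80%
```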

Objectivity Items that are objectively written, objectively administered, and objectively scored are reliable and valid.

Discrimination Important ONLY for norm-referenced testing. The test separates more knowledgeable students from less knowledgeable students. A discriminating test is intended to reward the best students and punish the weakest students. Ideal for using the normal curve to interpret scores.
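
One standard way to quantify how well a single item separates stronger from weaker students is the upper-lower discrimination index. The slides do not specify a statistic, so the sketch below (including the 27% group size and the sample data) is an illustrative assumption.

```python
def discrimination_index(item_correct, total_scores, group_fraction=0.27):
    """Upper-lower discrimination index for one item.

    item_correct: list of 0/1 flags, whether each student answered this item correctly.
    total_scores: each student's total test score, used to form upper and lower groups.
    Returns a value in [-1, 1]; higher means the item better separates
    stronger students from weaker ones.
    """
    ranked = sorted(range(len(total_scores)), key=lambda i: total_scores[i])
    n = max(1, int(len(total_scores) * group_fraction))
    lower, upper = ranked[:n], ranked[-n:]
    p_upper = sum(item_correct[i] for i in upper) / n
    p_lower = sum(item_correct[i] for i in lower) / n
    return p_upper - p_lower

# Illustrative data: one multiple-choice item and ten students' total scores.
item = [1, 1, 0, 1, 0, 1, 0, 0, 1, 0]
totals = [88, 92, 45, 79, 52, 85, 40, 61, 90, 55]
print(round(discrimination_index(item, totals), 2))  # 1.0: the item discriminates sharply
```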

Comprehensiveness Assessment covers or samples all of the content. Every performance objective is represented. Multiple items address each objective.

Score-Ability Test has scorer reliability. Scoring is easily completed. “Objective” items are easiest to score. “Subjective” items can be scored “objectively.”

Review Evaluation vs. measurement vs. testing. Criterion-referenced or norm-referenced. Formative or summative. Characteristics of a test: validity, reliability, objectivity, discrimination (applies to norm-referenced tests only), comprehensiveness, and “score-ability.”

The Answer It is rare to find an educator who claims to have the right answer, but… in Career and Technical Education: testing should be BOTH formative and summative, AND testing should be criterion-referenced.

So What? Assessment can be positive or threatening. Do not use assessment as punishment or as a threat. Use assessment to improve student performance and instruction. Assign grades fairly: validly, reliably, objectively, and comprehensively.