Classroom Assessment: Bias


Classroom Assessment: Bias
Bias refers to characteristics of a test that offend or unfairly penalize examinees because of their membership in particular groups or populations, such as:
- Religion
- Socioeconomic status
- Race, ethnicity, or gender
- Geography
Bias can result from offensiveness when the content of the test (or its items) offends certain classes of individuals, e.g., when items or tasks reflect stereotypes, such as depicting women performing menial tasks while men perform more important ones. Such bias compromises the validity of inferences based on the assessment.
Bias can result from unfair penalization when particular classes of students perform poorly on the assessment because its content (or the tasks it requires) is unfamiliar or foreign to them. E.g., higher-socioeconomic-status students may hold an advantage when an assessment relies too heavily on experiences generally reserved for families who can afford them:
- Vacations to historic sites
- Access to cable TV
Unfair penalization occurs only when something other than the students' poor achievement leads to their poorer performance.

Further Considerations of Bias and Validity
Two kinds of bias (from Howe, 1995): predictive and criterion.
Predictive bias:
- External: concerned with the question, "How well does the test predict the performance it is designed to measure?"
- Internal: concerned with how well items in the test predict overall performance on the test (a simple item-total check is sketched below).
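
As a concrete, hedged illustration of the internal sense of predictive bias, the sketch below computes an item-total (point-biserial) correlation, i.e., how strongly a single 0/1 item predicts overall test score, which could then be compared across groups. This is not a procedure from the slides; the function name and data layout are illustrative assumptions.

```python
import statistics

def item_total_correlation(item_scores, total_scores):
    """Pearson (point-biserial) correlation between one item scored 0/1 and
    the total test score -- a rough index of how well the item predicts
    overall performance on the test. Illustrative sketch, not from the slides."""
    n = len(item_scores)
    mean_i = sum(item_scores) / n
    mean_t = sum(total_scores) / n
    # Sample covariance and sample standard deviations (n - 1 denominator).
    cov = sum((i - mean_i) * (t - mean_t)
              for i, t in zip(item_scores, total_scores)) / (n - 1)
    return cov / (statistics.stdev(item_scores) * statistics.stdev(total_scores))

# Hypothetical use: compute the correlation separately for two groups of
# examinees and look for large discrepancies between them.
# r_group_a = item_total_correlation(items_a, totals_a)
# r_group_b = item_total_correlation(items_b, totals_b)
```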

More on Bias and Validity
Criterion bias:
- Across-criteria bias: when the performance measured by the test is overly emphasized relative to other (often desirable) criteria.
- Within-criterion bias: when the performance being measured by the test is (partially) differentially defined by the test itself.

Fairness Is Required in All Assessments
Assessments need to be fair to students from all ethnic and socioeconomic backgrounds.
- Is the language in word problems interpreted the same way by students of all backgrounds?
- Does the assessment accommodate students with disabilities?
- Is the assessment free of artifacts that perpetuate racial, ethnic, or gender stereotypes?

Poor Performance ≠ Bias
Just because different classes of individuals perform differently on an assessment does not mean that the assessment is biased against one or more of the groups.
All assessments are biased against ignorance: poor performance can simply indicate poor achievement.

Strategies for Eliminating Bias in Assessments
- Judgmental approaches: review panels making judgments about specific items and overall test content.
- Differential Item Functioning (DIF) procedures.
- Being sensitive to the possibility of bias in classroom assessments.
Judgmental Approaches
- Bias review panels are routinely employed by developers of high-stakes assessments (such as the EOGs and EOCs) and by commercial test developers.
- Item and test content review: see page 73 in Popham for typical instructions to judges reviewing individual items, and page 74 for typical instructions to judges reviewing the overall content of an assessment.
Differential Item Functioning
- A set of statistical procedures that examines the relationship between performance on items (or on the overall assessment) and group membership (a simple sketch follows below).
- DIF does not necessarily imply bias. Research has shown that on math assessments, women perform better than men on easy items, while men perform better than women on difficult items.
Being Sensitive to Bias
- Recognize that other groups of individuals may not see the world the way you do; they may not learn, study, or behave the way you do.
- This is particularly important in performance and portfolio assessments.
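
To make the DIF idea concrete, here is a minimal sketch of the Mantel-Haenszel procedure, one commonly used DIF statistic: examinees are stratified by a matching score (typically total test score), and the odds of answering the item correctly are compared between a reference group and a focal group within each stratum. The function name and input format below are assumptions for illustration, not part of the slides.

```python
from collections import defaultdict
from math import log

def mantel_haenszel_dif(responses):
    """Mantel-Haenszel common odds ratio and ETS delta (MH D-DIF) for one item.

    `responses` is a list of (group, matching_score, item_correct) tuples,
    where group is "reference" or "focal", matching_score is the examinee's
    total test score, and item_correct is 0 or 1.
    """
    # Build a 2x2 table (group x correct/incorrect) for each score stratum.
    strata = defaultdict(lambda: {"A": 0, "B": 0, "C": 0, "D": 0})
    for group, score, correct in responses:
        cell = strata[score]
        if group == "reference":
            cell["A" if correct else "B"] += 1   # A = ref correct, B = ref incorrect
        else:
            cell["C" if correct else "D"] += 1   # C = focal correct, D = focal incorrect

    num = den = 0.0
    for cell in strata.values():
        n = cell["A"] + cell["B"] + cell["C"] + cell["D"]
        if n == 0:
            continue
        num += cell["A"] * cell["D"] / n   # reference-correct x focal-incorrect
        den += cell["B"] * cell["C"] / n   # reference-incorrect x focal-correct

    alpha = num / den                      # common odds ratio across strata
    delta = -2.35 * log(alpha)             # ETS delta scale (MH D-DIF)
    return alpha, delta
```

An odds ratio near 1 (delta near 0) suggests little DIF; by commonly cited ETS conventions, items with |MH D-DIF| of roughly 1.5 or more (and statistically significant) are flagged for review. As the slide notes, a flag prompts judgmental review rather than automatically indicating bias.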

End