Fairness, Accuracy, & Consistency in Assessment

Fairness, Accuracy, & Consistency in Assessment
NCATE's recommendations to reduce bias and ensure fairness to students

Fairness
"Assess what's been taught"
Candidates should be aware of the knowledge, skills, and dispositions that are measured in the assessments.
- Was it taught?
  - Curriculum map: shows where students learn and practice what is assessed
- Do students understand the expectations?
  - Syllabi containing instructions and timing of assessments
  - Rubrics/scoring guides shared with students

Fairness
According to these guidelines, are your program's assessments FAIR?
- Is your curriculum map up to date?
- Does the curriculum map link to all professional standards and to GSE standards and dispositions?
- Does the curriculum map indicate any gaps? (A minimal automated gap check is sketched below.)
- Do syllabi indicate the timing of assessments?
- Are rubrics/scoring guides shared with students when the work is assigned?
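As a lightweight illustration (not part of the NCATE guidelines), a program that keeps its curriculum map in machine-readable form can check for coverage gaps automatically. The sketch below assumes a simple standard-to-courses mapping; all standard and course names are hypothetical placeholders.

```python
# Minimal sketch of an automated curriculum-map gap check, assuming the
# map is kept as a standard -> courses mapping. All standard and course
# names below are hypothetical, not GSE or NCATE identifiers.
curriculum_map = {
    "Standard 1: Planning":    ["EDU 501", "EDU 610"],
    "Standard 2: Instruction": ["EDU 502"],
    "Standard 3: Assessment":  [],  # a gap: taught and assessed nowhere
}

gaps = [standard for standard, courses in curriculum_map.items() if not courses]
if gaps:
    print("Standards with no course coverage:", ", ".join(gaps))
else:
    print("Every standard is covered by at least one course.")
```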

Accuracy
"Assessments measure what they say they measure"
Assessments should be aligned with the standards and proficiencies they are designed to measure.
- Are assessments aligned with standards?
  - Content match
  - Complexity match
  - Appropriate degree of difficulty
- Is there corroborating evidence?
- Is there field input on the assessment?
For example, a paper-and-pencil test may be adequate for content knowledge but not for classroom management skills; likewise, a classroom observation may not be the best way to judge content knowledge.

Accuracy
According to these guidelines, are your program's assessments ACCURATE?
- Do your program's assessment types (test, observation) match what is being assessed?
  - Dispositions: observations
  - Skills: performance assessments
- Which of your assessments "validate" (relate to) other assessments? For example, a work sample to a lesson plan assignment.
- Have you had input from working professionals?

Consistency
"Assessments produce dependable, trustworthy results"
Assessment results should be consistent regardless of when the assessment is given and who scores it.
- Are scoring tools sufficiently descriptive?
  - Language that differentiates between components and between performance levels
  - Clear descriptors that promote accurate scoring
  - Students can tell from the scoring tool why they were rated at a certain level
- Are raters trained?
  - Agreement on what "a 3 in lesson planning" looks like
  - Raters understand the consequences of final scores
  - The program has a plan to support students with insufficient performance, which alleviates rater pressure

Consistency
According to these guidelines, are your program's assessments CONSISTENT?
- Does your program consistently use descriptive rubrics for major assignments and performance observations?
- Does your program have regular rater training?
- Has your program engaged in rubric calibration and moderation activities?
- Are raters aware of how their scores affect the student?
- Are raters aware of how their scores contribute to program review?
- What is the plan to support struggling students?

Avoiding Bias
"Remove contextual and cultural bias from assessments"
Both the assessment itself and the assessment context should be analyzed for factors that could affect performance.
- Are assessments administered in the proper environment?
  - Location/equipment
  - Clear instructions/questions
- Have assessments been reviewed for bias?
  - Racial/ethnic/cultural stereotypes
  - Disability resource center review
  - Assignments that favor one group over another

Avoiding Bias
According to these guidelines, are your program's assessments BIAS-FREE?
- Have all key assessments been reviewed for clarity of expectations?
- Have all your assignments been reviewed for accessibility?
- Have all your assignments been scrutinized for cultural bias and stereotypes?
- Has your program analyzed student outcomes by sub-group to determine whether consistent scoring bias exists? (A minimal sketch of such a check follows.)
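One way to make the sub-group question concrete is to compare mean rubric scores across groups. The sketch below is a minimal illustration, assuming scores can be exported as (group, score) pairs; the group names, scores, and the 0.5 flag threshold are hypothetical, and a real review would also weigh sample sizes and context before concluding bias exists.

```python
# Hedged sketch of a sub-group scoring comparison. Groups, scores, and
# the 0.5 flag threshold are hypothetical illustrations only.
from collections import defaultdict
from statistics import mean

scores = [
    ("group_a", 3), ("group_a", 4), ("group_a", 3),
    ("group_b", 2), ("group_b", 3), ("group_b", 2),
]

by_group = defaultdict(list)
for group, score in scores:
    by_group[group].append(score)

overall = mean(score for _, score in scores)
for group, values in sorted(by_group.items()):
    gap = mean(values) - overall
    flag = "  <- review for possible scoring bias" if abs(gap) >= 0.5 else ""
    print(f"{group}: n={len(values)}, mean={mean(values):.2f}, gap={gap:+.2f}{flag}")
```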

GSE Rubric Guidelines
Developed by the Assessment Committee (2010)
- 4 levels of competency: Unsatisfactory, Emerging, Proficient, Exemplary
- Levels appear in ascending order from left to right
- If numbers are used: 1-4 from left to right

Example rubric row: "States values, beliefs and assumptions about priorities for resource allocation."
- Needs Improvement (1): Statement of values, beliefs, and assumptions is absent.
- Emerging (2): Statement of values, beliefs, and assumptions is vague, too general, or contrived.
- Proficient (3): Statement of values, beliefs, and assumptions is clearly defined and specific.
- Exemplary (4): Statement of values, beliefs, and assumptions is clear, specific, convincing, and includes personal experience that promotes clarity.

Rubric Moderation
A process for strengthening consistency: it develops inter-rater reliability through shared examination and discussion of student work, and involves all or many raters.
Process:
1. Recruit raters for a two-hour session.
2. Provide four samples of student work.
3. Provide the rubric and have raters score each sample.
4. Discuss as a group why the raters chose the scores they did.
5. Debrief about what was learned and what remains unanswered.
(A sketch for summarizing rater agreement from such a session follows.)
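To make a moderation session's outcome visible, a facilitator might tally how often raters assigned identical scores to the same samples. The sketch below is a minimal illustration assuming a 1-4 rubric and four samples; rater names and scores are hypothetical, and exact-match agreement is only one simple consistency signal.

```python
# Sketch of summarizing a moderation session, assuming each rater scored
# the same four work samples on a 1-4 rubric. Rater names and scores are
# hypothetical.
from itertools import combinations

ratings = {
    "rater_1": [3, 2, 4, 3],
    "rater_2": [3, 3, 4, 2],
    "rater_3": [2, 2, 4, 3],
}

# Compare every pair of raters and report their exact-match agreement.
for a, b in combinations(ratings, 2):
    matches = sum(x == y for x, y in zip(ratings[a], ratings[b]))
    total = len(ratings[a])
    print(f"{a} vs {b}: {matches}/{total} samples scored identically")
```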

Rubric Calibration
A process for strengthening consistency: it develops inter-rater reliability by setting expectations of what the scores mean regarding student work, and involves all or many raters.
Process:
1. Recruit raters for a two-hour session.
2. Provide four samples of work that have been pre-scored (anchor papers), with at least one each at a low, medium, and high level of performance.
3. Discuss rubric areas and expectations for each level and component before scoring begins.
4. Provide the rubric and have raters score each sample with the discussion in mind.
5. Have raters compare the scores they assigned to the "anchor" scores.
6. Debrief about what was learned and what remains unanswered.
(A sketch of an anchor-drift check follows.)
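A simple follow-up to step 5 is to compare each rater's scores to the anchor scores and flag large drift for discussion. The sketch below assumes four anchor papers on a 1-4 rubric; all scores and the one-level drift threshold are hypothetical choices, not a prescribed standard.

```python
# Sketch of a calibration check against pre-scored anchor papers: each
# rater's scores are compared to the agreed anchor scores, and raters who
# drift by more than one level on any sample are flagged for follow-up
# discussion. Anchor and rater scores here are hypothetical.
anchor_scores = [1, 2, 3, 4]  # agreed scores for the four anchor samples

rater_scores = {
    "rater_1": [1, 2, 3, 3],
    "rater_2": [2, 3, 4, 4],
}

for rater, scores in rater_scores.items():
    drift = [score - anchor for score, anchor in zip(scores, anchor_scores)]
    needs_followup = any(abs(d) > 1 for d in drift)
    print(f"{rater}: drift per sample {drift}, follow up: {needs_followup}")
```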

Next Steps
- How is your program doing in providing fair, accurate, consistent, bias-free assessments to students?
- What work needs to be done in your program to ensure quality assessments are used?
- What do you need in order to accomplish this?