Presentation transcript:

Susan Malone, Mercer University

 “The unit has taken effective steps to eliminate bias in assessments and is working to establish the fairness, accuracy, and consistency of its assessment procedures and unit operations.”

 Address contextual distractions (inappropriate noise, poor lighting, discomfort, lack of proper equipment)
 Address problems with assessment instruments (missing or vague instructions, poorly worded questions, poorly produced copies that make reading difficult)

 Review candidate performance to determine whether candidates perform differentially with respect to specific demographic characteristics (a sketch of such a review follows below)
 See ETS, Guidelines for Fairness Review of Assessments, and Pearson, Fairness and Diversity in Tests
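
A minimal sketch of such a disaggregated review, assuming candidate scores on a key assessment have been exported to a CSV; the file name, column names, and the choice of a one-way ANOVA are illustrative assumptions, not part of the original presentation.

```python
# Hypothetical sketch: flag differential performance across demographic
# groups on a key assessment. File and column names are assumed.
import pandas as pd
from scipy import stats

df = pd.read_csv("key_assessment_scores.csv")  # columns: score, gender, race_ethnicity, ...

for factor in ["gender", "race_ethnicity"]:
    # Descriptive statistics per group.
    print(df.groupby(factor)["score"].agg(["count", "mean", "std"]))
    # First-pass one-way ANOVA; a significant result warrants closer
    # review, not an automatic conclusion of bias.
    groups = [g["score"].dropna() for _, g in df.groupby(factor)]
    f_stat, p_val = stats.f_oneway(*groups)
    print(f"{factor}: F = {f_stat:.2f}, p = {p_val:.3f}")
```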

 Ensure candidates are exposed to the knowledge, skills, and dispositions (K, S, & D) being evaluated
 Ensure candidates know what is expected of them
 State instructions and timing of assessments clearly and share them with candidates
 Give candidates information on how the assessments are scored and how they count toward completion of programs

 Assessments are of appropriate type and content to measure what they purport to measure
 Aligned with the standards and/or learning proficiencies they are designed to measure [content validity]

 Produce dependable results, or results that would remain constant on repeated trials
◦ Provide training for raters that promotes similar scoring patterns
◦ Use multiple raters
◦ Conduct simple studies of inter-rater reliability (a sketch follows below)
◦ Compare results to other internal or external assessments that measure comparable K, S, & D [concurrent validity]
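
For the simple inter-rater reliability studies mentioned above, a weighted Cohen's kappa is one common choice. The sketch below assumes two raters have scored the same set of candidate artifacts on a 1-4 rubric; the scores are placeholders, not data from the presentation.

```python
# Hypothetical sketch: inter-rater agreement between two raters scoring
# the same artifacts on an ordinal 1-4 rubric. Scores are placeholders.
from sklearn.metrics import cohen_kappa_score

rater_a = [3, 4, 2, 4, 3, 1, 2, 4, 3, 3]
rater_b = [3, 4, 3, 4, 3, 2, 2, 4, 2, 3]

# Quadratic weighting gives partial credit for near-misses on an ordinal scale.
kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(f"Weighted kappa: {kappa:.2f}")
```

Values around .6 and above are conventionally read as substantial agreement; lower values would prompt the rater training and norming sessions described later in the presentation.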

 Dispositions Assessment
 Portfolios
 Analysis of Student Learning
 Summative Evaluation

 Alignment with INTASC standards, program standards, and the Conceptual Framework (accuracy; content validity)
◦ Matrices
◦ LiveText standards mapping
◦ PRS Relationship to Standards section
◦ PRS alignment with program standards requirement in Evidence for Meeting Standards section
 Alignment with other assessments (accuracy; concurrent validity)
◦ Matrices
◦ Potential documentation within LiveText Standards Correlation Builder

 Rubrics/assessment expectations shared with candidates in courses, field experience orientations, and LiveText (fairness)
 Rubrics/assessment expectations shared with cooperating teachers by university supervisors (consistency)
 Statistical study (in process) examining correlations among candidate performances on multiple assessments, where those assessments address comparable K, S, & D (consistency; concurrent validity); a sketch of such a check follows
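
One way the in-process correlation study might be set up, assuming scores from two assessments targeting comparable K, S, & D sit in one table; the file and column names are hypothetical.

```python
# Hypothetical sketch: concurrent validity via correlation of two
# assessments that target comparable K, S, & D.
import pandas as pd
from scipy import stats

df = pd.read_csv("candidate_assessments.csv")  # columns: portfolio_score, summative_score

# Spearman is a reasonable default when rubric totals are ordinal
# rather than interval-scaled.
rho, p = stats.spearmanr(df["portfolio_score"], df["summative_score"])
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
```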

 Multiple assessors (consistency)
 Exploration of faculty's assumptions regarding the purpose of the assessment, expectations of behaviors, and the meaning of the rating scale (consistency; reliability)
 Revision of the rating scale, addition of more specific indicators, development of two versions (courses/field experiences) (accuracy; content validity)

 Norming session with supervisors (consistency; reliability)
 Revision of instructions to align more closely with rubric expectations and the expected process (fairness; avoidance of bias)
 Review of coursework and fieldwork to ensure candidates are prepared for the assignment (fairness)

 Seeking feedback from experts (P-12 partners) on whether assignment requirements and assessment criteria are authentic (accuracy; content validity)
 Annual review of data disaggregated by demographic factors (gender, race/ethnicity, site, degree program) (fairness; avoidance of bias)

 Recent revision of portfolios and rubrics to align with the new INTASC standards (accuracy; content validity)
 Revision of rubrics to include more specific indicators related to the standards, a change from generic rating descriptors (accuracy; content validity)
 Cross-college workshop on artifact selections (accuracy; content validity)

 Annual review of artifact selections (accuracy; content validity)
 Inter-rater reliability study (consistency; reliability)

 Review of rubric expectations and all other PR and ST assignments to ensure opportunities to demonstrate all standards and indicators during the experience (fairness)
 Workshops for supervisors on the rubric expectations (consistency)
 Feedback from cooperating teachers on the relevance of the assessment (accuracy; content validity)
 Annual review of data disaggregated by demographic variables (fairness; avoidance of bias)

 Statistical study to identify correlations among entry requirements and successful program completion
 Statistical study to determine whether key assessments and entry criteria are predictive of program success, as defined by success in student teaching; a sketch of one such model follows
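
A sketch of how the predictive question might be modeled, treating success in student teaching as a binary outcome; the predictors, file name, and success flag are illustrative assumptions.

```python
# Hypothetical sketch: do entry criteria and key assessments predict
# success in student teaching? Logistic regression on a 0/1 outcome.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("program_outcomes.csv")

X = sm.add_constant(df[["entry_gpa", "dispositions_score", "portfolio_score"]])
y = df["student_teaching_success"]  # 1 = successful student teaching

result = sm.Logit(y, X).fit()
print(result.summary())  # positive, significant coefficients suggest predictive value
```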