Why So Much Attention on Rubric Quality?
CAEP Standard 5, Component 5.2: The provider’s quality assurance system relies on relevant, verifiable, representative, cumulative and actionable measures, and produces empirical evidence that interpretations of data are valid and consistent [emphasis added]. And…

Optional CAEP Review of Assessment Instruments
CAEP has announced that its accreditation process will now allow the early submission of all key assessment instruments (rubrics, surveys, etc.) that an Educator Preparation Provider (EPP) uses to generate the data submitted as evidence for CAEP accreditation. Once CAEP accreditation timelines are fully implemented, this review will occur three years prior to the on-site visit, giving EPPs time to react to formative feedback from CAEP.

A Reality Check Regarding Current Rubrics: Commonly Encountered Weaknesses
– Using overly broad criteria
– Using double- or multiple-barreled criteria
– Using overlapping performance descriptors
– Failing to include all possible performance outcomes
– Using double-barreled descriptors that derail actionability
– Using subjective terms, performance level labels (or surrogates), or inconsequential terms to differentiate performance levels
– Failing to maintain the integrity of target learning outcomes: a common result of having multiple levels of “mastery”

Overly Broad Criterion

Criterion: Assessment
– Unsatisfactory: No evidence of review of assessment data. Inadequate modification of instruction. Instruction does not provide evidence of assessment strategies.
– Developing: Instruction provides evidence of alternative assessment strategies. Some instructional goals are assessed. Some evidence of review of assessment data.
– Proficient: Alternative assessment strategies are indicated (in plans). Lessons provide evidence of instructional modification based on learners' needs. Candidate reviews assessment data to inform instruction.
– Distinguished: Candidate selects and uses assessment data from a variety of sources. Consistently uses alternative and traditional assessment strategies. Candidate communicates with learners about their progress.

Double-barreled Criterion

Criterion: Alignment to Applicable State P-12 Standards and Identification of Appropriate Instructional Materials
– Unsatisfactory: Lesson plan does not reference P-12 standards or instructional materials.
– Developing: Lesson plan references applicable P-12 standards OR appropriate instructional materials, but not both.
– Proficient: Lesson plan references applicable P-12 standards AND identifies appropriate instructional materials.

Overlapping Performance Levels

Criterion: Communicating Learning Activity Instructions to Students
– Unsatisfactory: Makes two or more errors when describing learning activity instructions to students
– Developing: Makes no more than two errors when describing learning activity instructions to students
– Proficient: Makes no more than one error when describing learning activity instructions to students
– Distinguished: Provides complete, accurate learning activity instructions to students
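To see the overlap concretely, translate each descriptor into a range of error counts. A minimal Python sketch; the band encodings and the `levels_for` helper are illustrative, not part of the original rubric:

```python
# Each level's descriptor, translated into a predicate on the number of
# errors a candidate makes. Cutoffs come from the slide; treating
# "complete, accurate" as zero errors is an assumption.
BANDS = {
    "Unsatisfactory": lambda errors: errors >= 2,   # "two or more errors"
    "Developing":     lambda errors: errors <= 2,   # "no more than two errors"
    "Proficient":     lambda errors: errors <= 1,   # "no more than one error"
    "Distinguished":  lambda errors: errors == 0,   # "complete, accurate"
}

def levels_for(errors: int) -> list[str]:
    """Return every performance level whose descriptor fits this error count."""
    return [level for level, fits in BANDS.items() if fits(errors)]

# A candidate who makes exactly 2 errors satisfies BOTH Unsatisfactory and
# Developing: two raters can apply the rubric faithfully and still disagree.
print(levels_for(2))  # ['Unsatisfactory', 'Developing']
print(levels_for(1))  # ['Developing', 'Proficient'] -- also ambiguous
```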

Possible Gap in Performance Levels

Criterion: Instructional Materials
– Unsatisfactory: Lesson plan does not reference any instructional materials
– Developing: Instructional materials are missing for one or two parts of the lesson
– Proficient: Instructional materials for all parts of the lesson are listed and directly relate to the learning objectives.
– Distinguished: Instructional materials for all parts of the lesson are listed, directly relate to the learning objectives, and are developmentally appropriate.
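The same translation exposes the gap: materials missing for three or more parts of the lesson (but not all of them) match no descriptor. A minimal sketch, assuming a hypothetical five-part lesson:

```python
def level_for(missing: int, total: int = 5) -> str | None:
    """Map 'lesson parts missing materials' to the slide's descriptors.
    total=5 lesson parts is an illustrative assumption."""
    if missing == total:
        return "Unsatisfactory"   # "does not reference any instructional materials"
    if missing in (1, 2):
        return "Developing"       # "missing for one or two parts"
    if missing == 0:
        return "Proficient/Distinguished"  # these differ on quality, not count
    return None                   # 3 <= missing < total: the rubric is silent

for m in range(6):
    print(m, level_for(m))
# 3 and 4 print None: a real performance outcome with no performance level.
```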

Double-barreled Descriptor

Criterion: Alignment to Applicable State P-12 Standards and Identification of Appropriate Instructional Materials
– Unsatisfactory: Lesson plan does not reference P-12 standards or instructional materials.
– Developing: Lesson plan references applicable P-12 standards OR appropriate instructional materials, but not both.
– Proficient: Lesson plan references applicable P-12 standards AND identifies appropriate instructional materials.
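Why does this derail actionability? Inverting the rubric shows that a "Developing" score is consistent with two different failure modes, so the datum cannot direct remediation. A minimal sketch; the boolean encoding is an illustrative assumption:

```python
from itertools import product

def level(standards_ok: bool, materials_ok: bool) -> str:
    """The double-barreled descriptor, encoded as the slide states it."""
    if standards_ok and materials_ok:
        return "Proficient"
    if standards_ok or materials_ok:
        return "Developing"
    return "Unsatisfactory"

# Invert the rubric: which (standards, materials) states produce each level?
inverse: dict[str, list[tuple[bool, bool]]] = {}
for s, m in product([True, False], repeat=2):
    inverse.setdefault(level(s, m), []).append((s, m))

print(inverse["Developing"])
# [(True, False), (False, True)] -- a 'Developing' score cannot tell faculty
# which construct needs remediation; the datum is not actionable.
```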

Use of Subjective Terms

Criterion: Knowledge of Laboratory Safety Policies
– Unsatisfactory: Candidate shows a weak degree of understanding of laboratory safety policies
– Developing: Candidate shows a relatively weak degree of understanding of laboratory safety policies
– Proficient: Candidate shows a moderate degree of understanding of laboratory safety policies
– Distinguished: Candidate shows a high degree of understanding of laboratory safety policies

Use of Performance Level Labels

Criterion: Analyze Assessment Data
– Unacceptable: Fails to analyze and apply data from multiple assessments and measures to diagnose students’ learning needs, inform instruction based on those needs, and drive the learning process in a manner that documents acceptable performance.
– Acceptable: Analyzes and applies data from multiple assessments and measures to diagnose students’ learning needs, informs instruction based on those needs, and drives the learning process in a manner that documents acceptable performance.
– Target: Analyzes and applies data from multiple assessments and measures to diagnose students’ learning needs, informs instruction based on those needs, and drives the learning process in a manner that documents targeted performance.

Use of Surrogates for Performance Levels

Criterion: Quality of Writing
– Unsatisfactory: Poorly written
– Developing: Satisfactorily written
– Proficient: Well written
– Distinguished: Very well written

Use of Inconsequential Terms

Criterion: Alignment of Assessment to Learning Outcome(s)
– Unacceptable: The content of the test is not appropriate for this learning activity and is not described in an accurate manner.
– Acceptable: The content of the test is appropriate for this learning activity and is described in an accurate manner.
– Target: The content of the test is appropriate for this learning activity and is clearly described in an accurate manner.
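These last three weaknesses are lexical, so a first-pass screen can be automated. A minimal sketch that flags subjective qualifiers, performance-level-label leakage, and low-value intensifiers in descriptor text; the watch lists are illustrative starting points drawn from the examples above, not a validated inventory:

```python
import re

# Illustrative watch lists -- extend for local use.
SUBJECTIVE = {"weak", "moderate", "high", "poorly", "satisfactorily", "well"}
LEVEL_LABELS = {"unacceptable", "acceptable", "target", "unsatisfactory",
                "developing", "proficient", "distinguished"}
INCONSEQUENTIAL = {"clearly", "relatively", "very"}

def screen_descriptor(text: str) -> dict[str, list[str]]:
    """Flag words that differentiate levels without describing observable behavior."""
    words = set(re.findall(r"[a-z]+", text.lower()))
    return {
        "subjective": sorted(words & SUBJECTIVE),
        "level_labels": sorted(words & LEVEL_LABELS),
        "inconsequential": sorted(words & INCONSEQUENTIAL),
    }

print(screen_descriptor(
    "The content of the test is appropriate and is clearly described "
    "in an accurate manner."))
# {'subjective': [], 'level_labels': [], 'inconsequential': ['clearly']}
```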

Failure to Maintain Integrity of Target Learning Outcomes

Criterion: Alignment to Applicable State P-12 Standards
– Unsatisfactory: No reference to applicable state P-12 standards
– Developing: Referenced state P-12 standards are not aligned with the lesson objectives and are not age-appropriate
– Proficient: Referenced state P-12 standards are age-appropriate but are not aligned to the learning objectives.
– Distinguished: Referenced state P-12 standards are age-appropriate and are aligned to the learning objectives.

Criterion: Instructional Materials
– Unsatisfactory: Lesson plan does not reference any instructional materials
– Developing: Instructional materials are missing for one or two parts of the lesson
– Proficient: Instructional materials for all parts of the lesson are listed and relate directly to the learning objectives.
– Distinguished: Instructional materials for all parts of the lesson are listed, relate directly to the learning objectives, and are developmentally appropriate.

Value-Added of High Quality Rubrics
For Candidates, Well-designed Rubrics Can:
– Serve as an effective learning scaffold by clarifying formative and summative learning objectives (i.e., clearly describe expected performance at key formative transition points and at program completion)
– Identify the critical indicators aligned to applicable standards/competencies (= construct & content validity) for each target learning outcome
– Facilitate self- and peer-evaluations

Value-Added of High Quality Rubrics
For Faculty, Well-designed Rubrics Can:
– Improve assessment of candidates’ performance by:
  – Providing a consistent framework for key assessments
  – Ensuring the consistent use of a set of critical indicators (i.e., the rubric criteria) for each competency
  – Establishing clear/concrete performance descriptors for each assessed criterion at each performance level
– Help ensure strong articulation of formative and summative assessments
– Improve validity and inter- and intra-rater reliability of assessment data
– Produce actionable candidate- and program-level data (see the sketch below)
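As one illustration of candidate- versus program-level actionability, the same rubric scores can be sliced both ways: per candidate for remediation, per criterion for program improvement. A minimal sketch with made-up candidate names, criteria, and scores:

```python
from collections import Counter

# Hypothetical rubric scores: {candidate: {criterion: performance level}}.
scores = {
    "cand_01": {"Alignment": "Proficient",    "Materials": "Developing"},
    "cand_02": {"Alignment": "Distinguished", "Materials": "Developing"},
    "cand_03": {"Alignment": "Proficient",    "Materials": "Unsatisfactory"},
}

# Candidate-level view: flag individuals for remediation on weak criteria.
for cand, by_criterion in scores.items():
    weak = [c for c, lvl in by_criterion.items()
            if lvl in ("Unsatisfactory", "Developing")]
    if weak:
        print(f"{cand}: remediate {weak}")

# Program-level view: a criterion weak across most candidates signals a
# curriculum problem, not a candidate problem.
for criterion in ("Alignment", "Materials"):
    dist = Counter(by_c[criterion] for by_c in scores.values())
    print(criterion, dict(dist))
# Materials skews low for every candidate -> a program improvement target.
```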

Attributes of an Effective Rubric
1. Rubric and the assessed activity or artifact are well-articulated (i.e., the activity elicits the performance the rubric is designed to measure).

Attributes of an Effective Rubric (cont.)
2. Rubric has construct validity (i.e., it measures the right competency/ies) and content validity (rubric criteria represent all critical indicators for the competency/ies to be assessed). How do you ensure this?
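One way to make the content-validity check concrete is a coverage map from rubric criteria to critical indicators; any indicator no criterion covers is a gap. A minimal sketch in which the indicator names and mappings are hypothetical:

```python
# Hypothetical critical indicators for one competency, and the rubric
# criteria an EPP maps onto them.
indicators = {
    "plans_aligned_to_standards",
    "uses_formative_assessment",
    "differentiates_instruction",
}

criterion_to_indicators = {
    "Alignment to P-12 Standards": {"plans_aligned_to_standards"},
    "Assessment Practices":        {"uses_formative_assessment"},
}

covered = set().union(*criterion_to_indicators.values())
print("Uncovered indicators:", indicators - covered)
# {'differentiates_instruction'} -> the rubric under-represents the
# construct, a content-validity gap to fix before data collection.
```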

Attributes of an Effective Rubric (cont.)
3. Each criterion assesses an individual construct:
– No overly broad criteria
– No double- or multiple-barreled criteria

Attributes of an Effective Rubric (cont.)
4. To enhance reliability, performance descriptors should:
– Provide clear/concrete distinctions between performance levels (no overlap between performance levels)
– Collectively address all possible performance outcomes (no gap between performance levels)
– Eliminate (or carefully construct the inclusion of) double- or multiple-barreled narratives

Attributes of an Effective Rubric (cont.)
5. Contains no unnecessary performance levels. Common problems encountered when multiple levels of mastery are present include:
– Use of subjective terms to differentiate performance levels
– Use of performance level labels or surrogates
– Use of inconsequential terms to differentiate performance levels
– Worst case scenario: failure to maintain the integrity of target learning outcomes

Attributes of an Effective Rubric (cont.)
6. Resulting data are actionable:
– To remediate individual candidates
– To help identify opportunities for program quality improvement

Based on the first four attributes, the following meta-rubric has been developed for use in evaluating the efficacy of other rubrics…

Some Additional Helpful Hints
– In designing rubrics for key formative and summative assessments, think about both effectiveness and efficiency
– Identify critical indicators for target learning outcomes and incorporate them into your rubric
– Limit the number of performance levels to the minimum number needed to meet your assessment requirements. HIGHER RESOLUTION IS BEST ACHIEVED WITH MULTIPLE FORMATIVE LEVELS, NOT MULTIPLE LEVELS OF MASTERY!
– Populate the target learning outcome column first (Proficient, Mastery, etc.)
– Make clear (objective/concrete) distinctions between performance levels; avoid subjective terms in performance descriptors
– Be sure to include all possible outcomes
– Don’t leave validity and reliability to chance:
  – Your most knowledgeable faculty should lead program-level assessment work; engage stakeholders; align key assessments to applicable standards/competencies; focus on critical indicators
  – Train faculty on the use of rubrics
  – Conduct and document inter-rater reliability and fairness studies (see the sketch below)
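For the inter-rater reliability studies mentioned above, percent agreement and Cohen's kappa are common starting points. A minimal sketch with made-up ratings; for production use, a vetted implementation (e.g., sklearn.metrics.cohen_kappa_score) is preferable:

```python
from collections import Counter

def cohen_kappa(rater_a: list[str], rater_b: list[str]) -> float:
    """Cohen's kappa: agreement between two raters, corrected for chance."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    levels = set(rater_a) | set(rater_b)
    expected = sum((freq_a[l] / n) * (freq_b[l] / n) for l in levels)
    return (observed - expected) / (1 - expected)

# Two faculty scoring the same 8 artifacts on one criterion (made-up data).
a = ["Proficient", "Developing", "Proficient", "Distinguished",
     "Proficient", "Developing", "Proficient", "Proficient"]
b = ["Proficient", "Developing", "Developing", "Distinguished",
     "Proficient", "Developing", "Proficient", "Distinguished"]

print(f"percent agreement: {sum(x == y for x, y in zip(a, b)) / len(a):.2f}")  # 0.75
print(f"Cohen's kappa:     {cohen_kappa(a, b):.2f}")                            # 0.61
```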