What is High-Quality Assessment? Linking Research with Practice Santa Clara County Office of Education June 23, 2014 Karin K. Hess, Ed.D.

Presentation Overview Clarify understandings of cognitive rigor/DOK – using sample assessments & rubrics Use the Hess Validation Tools & Protocols (Module 3) to examine technical criteria for high quality assessments: Formative & Performance Review tools & strategies to discuss & plan future assessment activities and support to teachers Karin’s coaching tips…

Rubric Design & Formative Tools Revisit Handout from this morning: “What I need to do” rubric (citing evidence of proficiency) Handout 2a: Find a half Handout 2b: Hess Cognitive Rigor Matrix – Math-Science Handout 2c: What will this formative assessment uncover? Work in small groups to analyze the assessment

What do we mean by high-quality performance assessment? At your tables, brainstorm examples of performance assessments – any content area (e.g., arts, writing, science) or real-world assessments (driver’s test, marriage planning, etc.). Have a recorder write them down. You have only 3 minutes.

Turn & talk: Select one PA from your list and answer these questions: 1. What is it actually assessing (skills & concepts)? 2. What makes it a PA? 3. What evidence is captured in the assessment that distinguishes poor from best performances? 4. What makes it a “good” performance assessment? You have 5 minutes.

Let’s generalize… With regard to skills & concepts assessed ______ What makes something a PA? ______ The kind of evidence that will distinguish poor from exemplary performances _______ What makes it a “good” performance assessment? _________

What we know (from research) about High Quality Assessment: Defined by agreed-upon standards/ expectations Measures the individual’s learning & can take different forms/formats Measures the effectiveness of instruction and appropriateness of curriculum Is transparent: – Students know what is expected of them and how they will be assessed – Assessment criteria are clear and training is provided to educators and reviewers/raters. Communicates information effectively to students, teachers, parents, administration and the public at large

Simply put, HQ assessments have… Clarity of expectations Alignment to the intended expectations (skills, concepts) Reliability of scoring and interpretation of results Attention to the intended rigor (tasks & scoring guides) Opportunities for student engagement & decision making Opportunities to make the assessment “fair” & unbiased for all Links to instruction (opportunity to learn)

2. The DOK Matrix & Instructional Paths Each standard has an assigned Depth of Knowledge. The DOK determines the cognitive level of instruction.
DOK 1 Recall and Reproduction (Remember, Understand): Recall, locate basic facts, definitions, details, events. Select appropriate words for use when intended meaning is clearly evident.
DOK 2 Skills and Concepts (Apply): Explain relationships. Summarize. State central idea. Use context for word meanings. Obtain and interpret information using text features.
DOK 3 Reasoning and Thinking (Analyze, Evaluate): Analyze or interpret author’s craft (e.g., literary devices, viewpoint, or potential bias) to critique a text. Explain, generalize, or connect ideas using supporting evidence (quote, text evidence). Cite evidence and develop a logical argument for conjectures based on one text or problem. Use concepts to solve non-routine problems and justify.
DOK 4 Extended Thinking (Synthesize, Create): Synthesize across multiple sources/texts. Articulate a new voice, theme, or perspective. Evaluate relevancy, accuracy, and completeness of information across texts or sources. Analyze multiple sources or multiple texts. Analyze complex abstract themes. Devise an approach among many alternatives to research a novel problem. Explain how concepts or ideas specifically relate to other content domains. Develop a complex model or approach for a given situation. Develop an alternative solution.
Instruction & Assessment Decisions… Selected Response, Constructed Response, Performance Tasks

GOAL: Each “validated” assessment will demonstrate: Clarity of expectations for the student and teacher(s) Alignment (task & scoring) to the intended standards: content & performance/DOK Opportunities for student engagement Opportunities to make the assessment “fair” & unbiased for ALL students

First we consider alignment… It’s really about validity – making decisions about the degree to which there is a “strong match” between grade-level content standards + performance and the assessment/test questions/tasks – and making valid inferences about learning from an assessment score.

“Validity is a matter of degree, rather than all or none.” Robert Linn, 2008

Alignment (validity) Questions: Is there a strong content match between assessment/test questions/tasks and grade level standards? Are the test questions/tasks (and the assessment as a whole) more rigorous, less rigorous, or of comparable rigor (DOK) to grade level performance standards?

Task Validation Protocol Handout #3 (K. Hess, 2013) Table groups review the technical criteria and descriptions on pages 3-4 in the protocol. What’s one aspect you feel you (or teachers you work with) now do well in most local assessments? What’s one aspect you feel you (or your teachers) need to understand more deeply as you work with them?

Uses of the assessment task validation tools & protocols Develop new assessments Analyze existing assessments Validate a revised assessment or new assessment prior to broader administration (or purchase) Provide OBJECTIVE feedback to assessment developers Promote collaboration and a shared understanding of high quality assessment

Local Validation Teams represent the diversity of the school Administrator/Leader/Coach All* content areas represented All/most grade levels (grade spans)* represented PLUS Representation from special education, fine arts, HPE, CTE, foreign language, ELL, etc. *decisions may differ depending on school configurations and staffing, but diversity in teams is critical, especially including special educators

Frequency of Validations? Initially, learning & debriefing the process together serves as calibration – so everyone is on the same page, “developing a shared understanding” of what high-quality assessment looks like. School teams set up their schedules – once each month, every other month, as needed, highest priority first, etc. Team members may rotate on and off so more (all) staff are involved over time.

Getting ready for validation Grade level or department teams develop the assessments using the Basic Validation Protocol (e.g., a gr 2 team might develop a common math assessment for all gr 2 classes/schools) Developers put the assessment on the local (school/district) validation calendar Validation teams prioritize order of validations – common assessments, major assessments first, second round review after getting feedback, etc.

Validation Materials Each team member needs (electronic) validation protocols (Handout: Module 3, pages 3-4) Each person needs a copy of the cover page with the assessment and scoring rubric/answer key (Handout: Module 3, pages 5-6) There may be additional materials – e.g., anchor papers, examples – that do not need to be copied for everyone but may be helpful to see during the review Each person needs a content-specific DOK reference sheet (Handout: Module 1, tools #1, #2, or #3)

Validation Protocols [1] Each time, preview norms for working together – I am… – I am NOT… Choose a recorder – to keep an electronic record & provide a copy of feedback for the assessment developers Date and list validation panel names on the “official copy” (this can be set up ahead of time) Individually, take 5-10 minutes to read through & make notes before any discussion

Sample norms (Source: adapted from Powell, WY) I AM Keeping electronic devices on vibrate/off Listening to understand other points of view Respecting everyone as a professional Focusing on the issues Avoiding side conversations Encouraging everyone having a turn to speak Refraining from judgmental statements Representing the best interests of all students Asking clarifying questions Demonstrating a commitment to the process (attending meetings, on time, etc.) Others? I AM NOT Using killer phrases Preparing my next remark instead of listening Sounding apologetic Engaging in unrelated activities Using negative gestures/body language Others?

Optional – Validation Protocols [2] Should the authors present the task at the start? (especially if a 2nd round) – there are pros & cons to this – Go over what is on the cover page, what is included, and what the purpose of the assessment is – 2-5 minutes to explain the materials in the packet – no interruptions from the validation panel – Panel then asks clarifying questions only – This is NOT for depth of understanding, just to know/clarify what is there BEFORE silently reading & discussing

Validation Protocols [3] Make notes individually before discussion Choose a task manager/timekeeper to keep things moving – reads each indicator on the Validation Protocols Have a process to reach consensus (fist to five, thumbs up, etc.) – be sure to involve each person! Choose 2 people to give feedback to the authors/developers & “rehearse” comments DEBRIEF! Did we honor norms? What went well/needs to be refined next time?

Giving Feedback Use descriptive language, NOT judgmental language While you may wonder about instructional pieces, comments/suggestions about instruction are probably not appropriate Your job is NOT to redo the assessment! Keep feedback crisp & to the point (e.g., pose a question) – it is the developer’s job to decide what to do next to strengthen the assessment tasks.

Giving Feedback (continued) Well-written, clear feedback guides assessment developers to make a stronger assessment in the end. Place your positive (and descriptive) comments under the feedback section (Module 3, page 7): What makes this a HQ (high quality) assessment?

Examples of Feedback (noted on page 7) 1. We were unable to locate… 2. We think this might be DOK 2, not DOK 3, because… What do you think? 3. We were not clear what the student is expected to do or to produce. Did you mean…? 4. This might be better aligned to this standard… 5. As hard as it will be, avoid saying “we liked…” This implies you did not like other things, and your job is NOT to like the assessment. 6. Include the “HQ” positives! The directions are clear; students have authentic choices; etc.

Debrief each time! Did the validation team honor the norms at all times? Do we need to modify/revise norms? What went well? What could have gone better? What will we do differently next time? Who/when will we meet with authors to give feedback?