Dr. Robert Mayes, University of Wyoming Science and Mathematics Teaching Center


Assessor – 3 Basic Questions
1. What kind of evidence do we need to support the attainment of goals? Tasks that reveal understanding, such as comparing and contrasting or summarizing key concepts.
2. What specific characteristics in student responses, products, or performances should we examine to determine the extent to which the desired results were achieved? Criteria, rubrics, and exemplars are needed.
3. Does the proposed evidence enable us to infer a student's knowledge, skill, or understanding? These are validity and reliability concerns.

Stage 2: Evidence – Think like an assessor, not an activity designer
The assessor asks:
- What should be sufficient and revealing evidence of understanding?
- What performance tasks must anchor the unit and focus the instructional work?
- Against what criteria will I distinguish work?
The activity designer asks:
- What would be interesting and engaging activities on this topic?
- What resources and materials are available on this topic?
- How will I give students a grade and justify it to parents?

Stage 2: Evidence – Think like an assessor, not an activity designer (continued)
The assessor asks:
- How will I be able to distinguish between those who really understand and those who don't (though they seem to)?
- What misunderstandings are likely? How will I check for those?
The activity designer asks:
- What will students be doing in and out of class? What assignments will be given?
- Did the activities work? Why or why not?

Continuum of Assessment Methods
Assessment methods vary along several characteristics:
- Scope: from simple to complex
- Time frame: from short-term to long-term
- Setting: from decontextualized to authentic
- Structure: from highly structured to ill-structured
Move from a snapshot to a scrapbook of evidence. Activity: self-assessment of sources of evidence (HO).
Continuum: Informal checks → Observation/Dialogue → Quiz/Test → Academic Prompt → Performance Task

Collecting a Range of Evidence
Activity (HO): determine a range of assessment evidence you may use related to the three categories of content priorities:
- Enduring understanding
- Important to know and do
- Worth being familiar with
Which assessment methods best fit the three categories?

Academic Prompt Assessments
- Open-ended questions or problems that require students to think critically and prepare a specific academic response
- Require a constructed response under exam conditions
- Divergent – no single best answer
- Scored by subjective judgment using criteria or a rubric
- May or may not be secure
- Often ill-structured – require development of a strategy
- Involve analysis, synthesis, and evaluation

Performance Task Assessments
- Complex challenges that mirror the issues and problems faced by adults
- Set in real or simulated settings; authentic
- Require students to address an audience under non-exam conditions
- Divergent – no single best answer
- Scored by subjective judgment using criteria or a rubric
- Greater opportunity to personalize the task
- Not secure – students are given the criteria in advance

Performance Task – 6 Facets
Activity: use the 6 Facets of Understanding to generate a performance task related to your enduring understanding.
- Questioning for Understanding (HO)
- Performance Verbs (HO)
- Performance Task creation (HO)
- Performance Task brainstorming (HO)

Performance Task – GRASPS
Creating a performance task with context and roles; a sketch of how the six elements fit together follows below.
- Goal
- Role
- Audience
- Situation
- Product, Performance, and Purpose
- Standards and Criteria for Success
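To make the template concrete, here is a minimal sketch, assuming Python purely for illustration, of a GRASPS performance task captured as a plain data structure. The class name, field names, and the sample water-quality task are hypothetical and are not part of the workshop handouts.

from dataclasses import dataclass

@dataclass
class GRASPSTask:
    goal: str       # G - the challenge or problem to be solved
    role: str       # R - who the student is in the scenario
    audience: str   # A - who the work is addressed to
    situation: str  # S - the context and constraints
    product: str    # P - product, performance, and purpose
    standards: str  # S - standards and criteria for success

# Hypothetical example for a unit on water quality
example_task = GRASPSTask(
    goal="Recommend whether the town should treat stormwater runoff entering the lake",
    role="Environmental consultant",
    audience="Town council",
    situation="The council votes next month on a limited cleanup budget",
    product="A written report and a five-minute briefing",
    standards="Accurate data analysis, a feasible recommendation, and clear communication",
)

Writing the six fields out this way makes it easy to check that the Standards entry matches the criteria that will later appear in the scoring rubric.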

Performance Task – GRASPS
Activity: create a performance task using GRASPS.
- GRASPS Performance Task Scenario (HO)
- Student roles and audiences (HO)
- Possible Products and Performances (HO)

Assessor Question 2: Determining Achievement
What specific characteristics in student responses, products, or performances should we examine to determine the extent to which the desired results were achieved? Criteria, rubrics, and exemplars are needed.

Designing Scoring Rubrics
Rubric: a criterion-based scoring guide for evaluating a product or performance along a continuum. It consists of:
- Evaluative Criteria – qualities that must be met for work to measure up to a standard
- Fixed Measurement Scale – often 4 or 5 levels
- Indicators – descriptive terms for differentiating among degrees of understanding, proficiency, or quality
A minimal illustration of these three components follows below.
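As an illustration only (a sketch in Python, which is an assumption; the criterion names and descriptors are hypothetical and do not come from the workshop handouts), the three components look like this side by side: evaluative criteria, a fixed four-level scale, and an indicator for each criterion at each scale point.

# Fixed measurement scale: 4 = strongest performance, 1 = novice
SCALE = (4, 3, 2, 1)

# Evaluative criteria, each with an indicator (descriptor) per scale point
RUBRIC = {
    "Explanation": {
        4: "Thorough, accurate account that connects the key concepts",
        3: "Accurate account with minor gaps in the connections",
        2: "Partially accurate account; key connections are missing",
        1: "Inaccurate or fragmentary account",
    },
    "Use of evidence": {
        4: "Claims consistently supported with relevant, cited evidence",
        3: "Most claims supported with relevant evidence",
        2: "Some claims supported; evidence is sometimes irrelevant",
        1: "Claims largely unsupported",
    },
}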

Rubric Types
- Holistic – provides an overall impression of the elements of quality and performance levels in a student's work
- Analytic – divides a student's performance into two or more distinct dimensions (criteria) and judges each separately
Recommend use of analytic rubrics, with at a minimum:
- Criteria for understanding (HO)
- Criteria for performance
Using Facet-Related Criteria (Figure 8.3, p. 178). A brief sketch contrasting the two types follows below.
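Continuing the hypothetical Python sketch above, the difference between the two types shows up in what a scorer records: an analytic rubric yields one judged level per criterion, while a holistic rubric yields a single overall level for the same piece of work. All values here are invented.

# Analytic: the scorer judges each criterion separately
analytic_scores = {"Explanation": 3, "Use of evidence": 4}

# Holistic: the scorer forms one overall impression
holistic_score = 3

def analytic_feedback(scores):
    """Report each criterion on its own so feedback stays specific."""
    return "; ".join(f"{criterion}: level {level}" for criterion, level in scores.items())

print(analytic_feedback(analytic_scores))   # Explanation: level 3; Use of evidence: level 4
print(f"Overall (holistic): level {holistic_score}")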

Rubric Types
- Generic – general criteria in a given performance area; can be developed before the specific task is defined. Examples: General Problem Solving Rubric, Generic Rubric for Understanding (HO)
- Task-Specific – designed for use with a particular assessment activity; task-dependent, so it cannot be used to evaluate related performance tasks

Rubric Types
Longitudinal Rubric – describes a progression from naïve to sophisticated understanding:
- Increased understanding of complex functions and the interrelatedness of concepts
- Greater awareness of how the discipline operates
- Greater personal control over, and flexibility with, knowledge

Effective Rubrics
- Relate specific task requirements to more general performance goals
- Discriminate among different degrees of understanding or proficiency according to significant features
- Do not combine independent criteria in one column of the rubric
- Use student anchors (Anchor design, p. 181) to: set standards based on student artifacts, ensure consistency in judgment of student work, and equip students to do more accurate and productive self-assessment

Effective Rubrics
- All potential performances should fit somewhere in the rubric
- Rely on descriptive language (what quality looks like), not comparative or value language, to make distinctions
- Avoid making the lowest score point sound bad; it should describe novice or ineffective performance
- Highlight judging the performance's impact rather than over-rewarding process or effort alone

Assessor Question 3: Valid and Reliable
Does the proposed evidence enable us to infer a student's knowledge, skill, or understanding?
Validity: did we measure what we meant to measure?
- Does the evidence indicate understanding of the expressed outcomes?
- Are the performances appropriate to the understanding sought?
- Do not pay so much attention to correctness that the degree of understanding is lost.

Validity
Two key validity questions for assessment tasks:
- Could a student do well on this performance task but really not demonstrate the understanding you are after?
- Could a student perform poorly on this task but still have significant understanding of the ideas and show it in other ways?
Activity: determining validity (Figure 8.5)

Validity
Two key validity questions for a rubric:
- Could the proposed criteria be met but the performer still not demonstrate deep understanding?
- Could the proposed criteria not be met but the performer nonetheless still show understanding?

Reliability
- Reliable assessments reveal a credible pattern, a clear trend.
- We need multiple pieces of evidence (a scrapbook) rather than a single snapshot of student performance.
- Have parallel assessments on the same concept using multiple assessment formats, as in the sketch below.
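As a simplified illustration of the scrapbook idea (a hypothetical Python sketch; the formats and judged levels are invented), several parallel assessments of the same concept in different formats reveal a pattern that a single snapshot cannot:

# Judged level (1-4) for the same enduring understanding, from several formats
evidence = {
    "informal check": 3,
    "quiz": 3,
    "academic prompt": 3,
    "performance task": 4,
}

levels = list(evidence.values())
spread = max(levels) - min(levels)

# A tight spread across formats suggests a credible, reliable pattern;
# a wide spread warns that any single score may be misleading.
print(f"levels={levels}, spread={spread}")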
