Can 3 wrongs make a right? Using Test Items to Drive Student Thinking. Sendhil Revuluri, Senior Instructional Specialist, CPS Department of H.S. Teaching & Learning.

Similar presentations
Analyzing Student Work

Classroom Assessment Techniques for Early Alert of Students At Risk Carleen Vande Zande, Ph.D. Academic Leaders Workshop.
Summative Assessment Kansas State Department of Education ASSESSMENT LITERACY PROJECT1.
ASSESSMENT LITERACY PROJECT4 Student Growth Measures - SLOs.
Congruency to Math Standards How do we successfully monitor and support our teachers when we can’t be an expert in every content area?
An Overview of Webb’s Depth of Knowledge
FALs and MDC. Before the Collaborative Activity: Meet as a grade level to collaboratively plan in advance the administration of the pre- assessment As.
Classroom Assessment FOUN 3100 Fall Assessment is an integral part of teaching.
Teaching with Depth An Understanding of Webb’s Depth of Knowledge
Tools for Teachers Linking Assessment and Learning “Assessment should be an integral part of teaching. It is the mechanism whereby teachers can learn how.
How to Take Tests I Background On Testing.
Copyright © 2012 Assessment and Accountability Comprehensive Center & North Central Comprehensive Center at McRel.
Effective Questioning in the classroom
DOK and GRASPS, an Introduction for new staff
Hannah Guldin Chrystol White Aimee Kanemori.  Form an alliance between the teacher and parent “Above all parents need to know that their child’s teacher.
 Inquiry-Based Learning Instructional Strategies Link to Video.
© 2013 UNIVERSITY OF PITTSBURGH LEARNING RESEARCH AND DEVELOPMENT CENTER Supporting Rigorous Mathematics Teaching and Learning Tennessee Department of.
Edmonds School District P12 Summer Institute August
Evaluating Student Growth Looking at student works samples to evaluate for both CCSS- Math Content and Standards for Mathematical Practice.
PSLA 39 TH ANNUAL CONFERENCE APRIL 14, Carolyn Van Etten Beth Sahd Vickie Saltzer – LibGuide Developer.
“We will lead the nation in improving student achievement.” CLASS Keys TM Module 1: Content and Structure Spring 2010 Teacher and Leader Quality Education.
Designing and evaluating good multiple choice items Jack B. Monpas-Huber, Ph.D. Director of Assessment & Student Information.
A Framework for Inquiry-Based Instruction through
Pre-Conference Workshop – June 2007 BUILDING A NATIONAL TEAM: Theatre Education Assessment Models Robert A. Southworth, Jr., Ed.D. TCG Assessment Models.
DOK Depth of Knowledge An Introduction.
Making Group Work Productive PowerPoints available at Click on “Resources”
Welcome! Thanks for coming to this interactive workshop! Please take a handout, join each other, and introduce yourself to someone new. Then get your paper.
The Depth of Knowledge (DOK) Matrix
Robert Kaplinsky Melissa Canham
Webb’s Depth of Knowledge
© 2013 University Of Pittsburgh Supporting Rigorous Mathematics Teaching and Learning Making Sense of Numbers and Operations Fraction Standards via a Set.
2012 Mathematics SOL Institutes General Session October 2012 Michael Bolling, Acting Director, Office of Mathematics and Governor’s Schools Deborah Wickham,
© 2013 UNIVERSITY OF PITTSBURGH LEARNING RESEARCH AND DEVELOPMENT CENTER Study Group 7 - High School Math (Algebra 1 & 2, Geometry) Welcome Back! Let’s.
Accountable Talk Malden Public Schools. What is Accountable Talk “Accountable talk sharpens students' thinking by reinforcing their ability to use and.
Have you implemented “Number Talks” in your classroom? What are the pros? What are the cons? Any suggestions?????
NEW ASSESSMENT. NEW RESULTS SMARTER BALANCED ASSESSMENT What do families need to know? (Insert School Name) (Insert Date) INSERT LOGO.
Using Student & Staff Feedback in Educator Evaluation November 5, 2014 MASC/MASS Joint Conference.
Modified from Depth of Knowledge presentation by Dr. Robin Smith at 2009 PRESA Leadership Conference… Adapted from Kentucky Department of Education, Mississippi.
Developing Assessments for and of Deeper Learning [Day 2b-afternoon session] Santa Clara County Office of Education June 25, 2014 Karin K. Hess, Ed.D.
Smarter Balanced Claims Sue Bluestein Wendy Droke.
WELCOME!! While you are waiting, please complete the Anticipation Guide. 1.Read the statement 2.In the “Before Reading” column, mark whether you agree.
CFN 204 · Diane Foley · Network Leader Math Professional Development September 27, 2013 Presented by: Simi Minhas Math Achievement Coach.
ASSESSING STUDENT ACHIEVEMENT Using Multiple Measures Prepared by Dean Gilbert, Science Consultant Los Angeles County Office of Education.
Developing and Using Meaningful Math Tasks The Key to Math Common Core Take a moment to record on a sticky: What is a meaningful Math Task?
1 Math 413 Mathematics Tasks for Cognitive Instruction October 2008.
Preparation for Final Exam
MH502: Developing Mathematical Proficiency, Geometry and Measurement, K-5 Seminar 1 September 28, 2010.
Developing and Using Meaningful Math Tasks The Key to Math Common Core Take a moment to record on a sticky: What is a meaningful Math Task?
Guided Reading How can we make this really effective for our students?
Professional Development
Using SVMI & SDCOE Resources to Support Transition to the Common Core State Standards in Mathematics.
© 2013 UNIVERSITY OF PITTSBURGH Supporting Rigorous Mathematics Teaching and Learning Shaping Talk in the Classroom: Academically Productive Talk Features.
Teaching with Depth An Understanding of Webb’s Depth of Knowledge.
Classroom Strategies That Work. Questions, Cues, and Advance Organizers Helping Students Activate Prior Knowledge.
Using Multiple Measures ASSESSING STUDENT ACHIEVEMENT.
Chapter 7- Thinking Like an Assessor Amy Broadbent and Mary Beck EDU 5103.
Using Student Assessment Data in Your Teacher Observation and Feedback Process Renee Ringold & Eileen Weber Minnesota Assessment Conference August 5, 2015.
SBAC-Mathematics November 26, Outcomes Further understand DOK in the area of Mathematics Understand how the new SBAC assessments will measure student.
New Hope-Solebury School District. Develop a shared understanding of the concept of cognitive rigor Begin the conversation about Webbs’ Depth of Knowledge.
Chapter 6 Assessing Science Learning Updated Spring 2012 – D. Fulton.
Understanding Depth of Knowledge. Depth of Knowledge (DOK) Adapted from the model used by Norm Webb, University of Wisconsin, to align standards with.
1 Cognitive Demand in Problems  Cognitive demand is a measure of what the instructional question (a question posed during class) or test item requires.
Reviewing, Revising and Writing Mathematics Multiple- Choice Items 1 Writing and scoring test questions based on Ohio’s Academic Content Standards Reviewing,
Depth Of Knowledge Basics © 2010 Measured Progress. All rights reserved. He who learns but does not think is lost. He who thinks but does not learn is.
Teaching with Depth An Understanding of Webb’s Depth of Knowledge
Helping Students Examine Their Reasoning
Taking the TEAM Approach: Writing with a Purpose
Increasing Rigor to Develop Critical Thinking Skills
STAAR: What do we notice?
Building Better Classes
Presentation transcript:

Can 3 wrongs make a right? Using Test Items to Drive Student Thinking Sendhil Revuluri, Senior Instructional Specialist CPS Department of H. S. Teaching & Learning CMSI Annual Conference, May 2, 2009

What is assessment?  Voodoo  Punishment  The bane of my existence  A sadistic plot  A process of reasoning from evidence  All of the above  None of the above

My answer, and some other points  Assessment is a process of reasoning from evidence about student understanding.  Assessment is an essential part of instruction.  There are also other purposes for assessment.  Assessments are usually made for a purpose (but often co-opted for other purposes).  Assessments for different purposes should probably look different.

My perspective on assessment  Former teacher, scheduler, data specialist, …  Currently work with district-wide assessment  I quickly realized how little I knew…  … and asked myself “If I knew then…”  Disclaimer: I’m still no assessment expert  I hope this information helps you be effective

A little jargon (part 1)  Purposes of assessment: formative, summative, predictive, accountability  Scales for assessment  Frequencies for assessment

The assessment triangle: Cognition, Observation, Interpretation

Cognitive demand  (How) does this item make students think?  Low: Recall, recognition, perform procedure  Medium: Represent, multi-step, integrate, apply, solve a problem, compare, justify  High: Plan, analyze, judge, create, abstract, generalize, formulate a problem  What kinds of items can do each?
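
As an illustration (mine, not the presenter's), the same rectangle content that appears later in the deck can be posed at each level: a low-demand version asks for the perimeter of a 3.7 cm by 5.4 cm rectangle; a medium-demand version gives the perimeter and one side and asks students to find and justify the other; a high-demand version asks which of all rectangles with that perimeter has the greatest area, and why.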

Types of items and trade-offs  Open-ended or constructed-response (long or short)  Multiple-choice (how many?)  Items that are in between  What are the pros and cons of each type? What can some items do that others can’t?

Making large-scale assessments  Standards  Assessment framework  Blueprint  Items  Field-testing  Form
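
For a rough sense of how these pieces fit together (illustrative numbers only, not the district's actual figures): the standards and assessment framework say what is eligible for testing; the blueprint then specifies counts, for example 30 items per form with roughly 12 at low, 12 at medium, and 6 at high cognitive demand spread across the content strands; items are written and field-tested against that blueprint; and the surviving items are assembled into the operational form.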

What makes a good item…  … on a classroom test?  … on a large-scale assessment?  In what ways would the answers be the same?  In what ways might they differ? Why?  A non-obvious criterion: alignment

A rectangle has length 3.7 cm and width 5.4 cm. What is its perimeter? A. 8.1 cm B. 9.1 cm C. ___ cm D. ___ cm [a second option set, labeled in cm and cm², followed on the slide]
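
For reference, the arithmetic behind the item (worked out here, not shown on the slide): the perimeter is 2 × (3.7 + 5.4) = 18.2 cm; 9.1 cm is the sum of the two sides without doubling; and 3.7 × 5.4 = 19.98 cm² is the area, presumably the perimeter/area confusion that the cm²-labeled options are meant to surface.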

The goal Kids get the item right for the right reason, and wrong for the right reason. The right reason is understanding of the objective.

Building a multiple-choice item  Figure out what you’re trying to assess  Make a task (stem or prompt) and answer it  What misconceptions most concern you?  Create distractors based on misconceptions  Clean up your item and options  Is it still aligned with the objective?
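
To make these steps concrete, here is a small sketch (my own illustration, not from the talk) of an item recorded together with the misconception behind each distractor, using the perimeter example from the earlier slide; the third distractor and all the rationale wording are hypothetical.

    import random

    # Hypothetical record of a multiple-choice item built by the steps above:
    # the stem targets one objective, and every distractor is tied to a specific
    # misconception rather than being a random wrong number.
    item = {
        "objective": "Find the perimeter of a rectangle from its side lengths",
        "stem": "A rectangle has length 3.7 cm and width 5.4 cm. What is its perimeter?",
        "key": "18.2 cm",  # 2 x (3.7 + 5.4)
        "distractors": {
            "9.1 cm": "added length and width but forgot to double",
            "19.98 cm": "multiplied the sides, i.e. found the area instead",
            "12.8 cm": "doubled only the length: 2 x 3.7 + 5.4 (hypothetical slip)",
        },
    }

    # Print the item with shuffled options, as it might appear on a quiz.
    options = [item["key"]] + list(item["distractors"])
    random.shuffle(options)
    print(item["stem"])
    for letter, option in zip("ABCD", options):
        print(f"  {letter}. {option}")

Keeping the rationale next to each distractor also makes the last step easier: if an option can no longer be traced to a misconception about the objective, the item has probably drifted out of alignment.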

Cleaning up items and options  Clarity  Reading load  Cuing and parallel structure  Considerations for large-scale assessment:  Bias  Danger  Copyright
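
A quick example of cuing (my illustration): if three options on the perimeter item are bare lengths in cm and one reads "the area is 19.98 square centimeters", test-wise students can pick or eliminate the odd option without doing any mathematics; keeping options parallel in grammar, length, and units removes that cue.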

Did we meet the goal?  Did kids get the item right for the right reason, and wrong for the right reason?  Psychometrics  On the item level, what did students answer?  Do stronger students do better on the item?  Can we adapt this to classroom assessment?  Item analysis: turning data into information
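
A minimal sketch of the item analysis the slide gestures at, assuming each student's record holds a total test score and the option chosen on this item; the data, the keyed option, and the upper-third versus lower-third method are illustrative assumptions, not the district's actual procedure.

    from collections import Counter

    # Hypothetical responses: (total test score, option chosen on this item).
    responses = [
        (18, "C"), (17, "C"), (16, "D"), (15, "C"), (14, "B"),
        (12, "C"), (11, "B"), (10, "B"), (9, "D"), (7, "A"),
    ]
    key = "C"  # the keyed (correct) option

    # Difficulty: proportion answering correctly (higher means easier).
    difficulty = sum(1 for _, choice in responses if choice == key) / len(responses)

    # Discrimination: correct rate in the top third of students (by total score)
    # minus the rate in the bottom third. Positive values mean stronger students
    # do better on the item, which is what the slide asks about.
    def correct_rate(group):
        return sum(1 for _, choice in group if choice == key) / len(group)

    ranked = sorted(responses, key=lambda r: r[0], reverse=True)
    third = len(ranked) // 3
    discrimination = correct_rate(ranked[:third]) - correct_rate(ranked[-third:])

    # Distractor analysis: how often each wrong option was chosen. If one
    # distractor draws most of the errors, that misconception deserves class time.
    distractor_counts = Counter(choice for _, choice in responses if choice != key)

    print(f"difficulty (p-value): {difficulty:.2f}")
    print(f"discrimination (upper minus lower third): {discrimination:.2f}")
    print(f"distractor counts: {dict(distractor_counts)}")

The same tallies can be done by hand on a classroom quiz: count how many students chose each option, then compare the choices of students who did well overall with those who struggled.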

Classroom practices, part 1  Grading for work, not just for the answer (also a way to give more feedback per minute)  Build effective test-taking (critical reading) habits: anticipating options, using the information provided  Build understanding of distractors as errors linked to misunderstandings (not random)

Classroom practices, part 2  Openers or launches to go from a multiple-choice item to explaining the errors behind each distractor  Build rigor by having students devise distractors and write rationales, which is more cognitively demanding while still somewhat structured  Can lead up to writing items, if you scaffold at first: at the end of a unit, for prior learning topics (review), in groups, for more “procedural” topics

Classroom practices, part 3  What methods have you used?  Let’s discuss challenges and successes…  What are your open questions?

Did you learn anything?  What’s one idea you’ve gained or one connection you’ve made?  What’s one thing you’re going to try?  What’s one thing you’ll tell someone about?

Thank you!  Please get in touch with feedback, questions, ideas, comments, and more problems and resources!  I’m happy to send you these slides and our handout (and more, from a longer version).