Evidence of Understanding Cherie McCollough VaNTH-PER Workshop June 2, 2003
Conceptual Challenges
- What is the evidence of in-depth understanding, as opposed to that which is superficial or naive?
- What should teachers look for to determine the extent of student understanding?
- What kinds of assessment evidence will anchor a curricular unit, guiding instruction?
Calvin’s Evidence of Understanding
Establishing Curricular Priorities
Skills Compared to Understandings

Skill: Reading text
Understanding: Reading between the lines

Skill: Creating scoring opportunities in soccer
Understanding: Spreading the defense as broadly and deeply as possible

Skill: Speaking persuasively in public
Understanding: Emotionally appealing to the wishes, needs, hopes, and fears of the audience, regardless of how logical and rational the argument
Framing Questions for Understanding: Essential Questions
- Have no one obvious right answer
- Raise other important questions
- Address the philosophical or conceptual foundations of a discipline
- Recur naturally
- Are framed to provoke and sustain student interest
Overarching vs. Topical Questions
Overarching:
- Point beyond the unit to larger, transferable ideas
- Link a topic to other related topics and subjects
Topical:
- Can be answered by uncovering a unit's content – have several plausible answers that can be defended
- Can be answered as a result of in-depth inquiry into a single topic
Tips for Using Unit Questions
- See pp. 112–113, McTighe and Wiggins.
- Small-group discussion, pp. 115–117: What makes these questions overarching vs. topical?
- Complete worksheets 6.1 and 6.2 (pp. 120–121) using ideas from your content area.
General assessment design usually consists of:
Identifying Enduring Understandings (content standards) → Engaging, Effective Design (design of lessons)
Backward design assessment consists of:
Identifying Enduring Understandings → Evidence of Understanding → Engaging, Effective Design
What are effective assessments?
Continuum of Assessment Methods
Informal checks for understanding → Observation/dialog → Quiz/Test → Academic prompt → Performance task/project
Continuum of Assessment Methods
- Scope: simple to complex
- Time frame: short term to long term
- Setting: decontextualized to authentic contexts
- Structure: highly structured to ill-defined
All are the result of ONGOING inquiry and RETHINKING.
“Think of assessment as a dialog between teacher and student, one that is as informative to the student as it was to the teacher if it is done right.” – Laura D’Amico, 1999
- Project binders
- Self-assessments
- Peer-assessments
- More traditional methods
Assessment should be thought of as a collection of evidence over time instead of a single event.
Conceptual Challenges (revisited)
What is the goal of assessment?
- Basic concepts and skills? Written tests and quizzes are generally sufficient.
- Deep understanding? More complex assessment methods are required.
Quiz and Test Items
- Assess factual information, concepts, and discrete skills
- Use selected-response or short-answer formats
- Typically have a single best answer
- Are easily scored
- Are secure
Quizzical Calvin
Academic Prompts
- Open-ended questions that require students to think critically
- Require constructed responses under exam conditions
- Are open: no single best answer or solution strategy
- Involve analysis, synthesis, and/or evaluation
- Require judgment-based scoring
- May or may not be secure
Performance Tasks and Projects
- Involve complex challenges that mirror the issues and problems adults face
- Present authentic challenges
- May be short term, long term, or multi-staged
- Require a tangible product or performance
- Use real or simulated settings similar to circumstances an adult would find
- Allow students greater opportunity to personalize the task
Performance Tasks and Projects, Continued
Differ from academic prompts in that they:
- Use real or simulated settings that are similar to what an adult would find
- Require students to address an identified audience
- Are based on a specific purpose that relates to the audience
- Allow students the opportunity to personalize the task
- Are not secure – task, criteria, product, etc. are known in advance
GRASPS = Goal, Role, Audience, Situation, Product or Performance, and Standards.
Considering a Range of Evidence (pp. 131–132)
- Write the targeted understanding and core performance task in the middle box: what students should understand and be able to do.
- Brainstorm the types of evidence that might be most useful, insightful, and fair for rounding out the picture and producing sufficient evidence of understanding.
Curricular Priorities and Assessment Methods
Teacher Misconception re: Evidence of Understanding
MISCONCEPTION: Evidence of understanding includes only end-of-teaching tests, performance tasks, projects, etc.
TRUTH: Evidence of understanding is EVERYTHING on the continuum, gathered over time:
- Informal checks for understanding
- Observation/dialog
- Quiz/Test
- Academic prompt
- Performance task/project
- Student self-assessments
- Etc.
Teacher Misconception re: Evidence of Understanding
MISCONCEPTION: If evidence is hands-on, the results must be valid.
TRUTH: Hands-on activities can be done without deep understanding.
Understanding backward design requires realizing not only THAT assessment design comes before instruction and clarity of target precedes assessment, but WHY. Multiple sources of evidence and types of performance are required.
Conceptual Challenges (revisited)
Designing Performance Tasks
Must be authentic:
- Is realistic
- Requires judgment and innovation
- Asks a student to “do” the subject
- Replicates or simulates the contexts in which adults are tested in the workplace, the community, and the home
- Assesses a student’s ability to effectively use a repertoire of knowledge and skills to negotiate a complex task
- Allows appropriate opportunities to rehearse, practice, and consult resources; obtain feedback; and refine performances and products
Designing Performance Tasks
Looking for creative inspiration? See pp. 141–143: performance task vignettes. See instructions.
Backward Design (Assessor) vs. Traditional Design (Activity Designer)
The activity designer:
- Looks for interesting and engaging activities on the topic
- Identifies available resources and materials
- Thinks about what students will be doing in and out of class and what assignments will be given
- Wonders whether the activities worked, and why or why not
The assessor:
- Requires sufficient and revealing evidence of understanding
- Distinguishes between those who really understand and those who don’t
- Has criteria for distinguishing work
- Checks for predetermined misunderstandings
How do you design authentic and engaging tasks?
GRASPS = Goal, Role, Audience, Situation, Product or Performance, Standards for Success.
Task design tools:
- Review pp. 147–152
- Think about p. 155 using your own unit of interest
- Complete p. 157 using your own unit of interest
Conceptual Challenges (revisited)
Wrapping It Up
- We have framed essential questions.
- We have defined what constitutes evidence of understanding.
- We have begun designing formative and summative assessments.
What are you wondering about?