
Design Patterns: Supporting Task Design by Scaffolding the Assessment Argument
Robert J. Mislevy, University of Maryland; Geneva Haertel & Britte Haugan Cheng, SRI International
NSF Discovery Research K-12 PI Meeting, November 10, 2009, Washington, D.C.
DR K-12 grant, "Application of Evidence-Centered Design to State Large-Scale Science Assessment." This material is based upon work supported by the National Science Foundation under Grant No. DRL. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Overview
- Design patterns: background
- Evidence-Centered Design
  - Main idea
  - Layers
  - Assessment arguments
- Attributes of Design Patterns
- How they inform task design

Design Patterns
- Design patterns in architecture
- Design patterns in software engineering
- Polti's Thirty-Six Dramatic Situations

Messick's Guiding Questions
- What complex of knowledge, skills, or other attributes should be assessed?
- What behaviors or performances should reveal those constructs?
- What tasks or situations should elicit those behaviors?

Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2), 13-23.

Evidence-Centered Assessment Design
- Organizes the design process formally around Messick's questions
- A principled framework for designing, producing, and delivering assessments
- Conceptual model, object model, and design tools
- Connects design, inference, and the processes that create and deliver assessments
- Particularly useful for new or complex assessments
- Useful to think in terms of layers

Layers in the assessment enterprise (from Mislevy & Riconscente, in press)
- Domain Analysis: What is important about this domain? What work and situations are central in this domain? What knowledge representations (KRs) are central to this domain?
- Domain Modeling: How do we represent key aspects of the domain in terms of an assessment argument? Conceptualization. Examples: assessment argument structures, Design Patterns.
- Conceptual Assessment Framework: Design structures: student, evidence, and task models. Generativity. Examples: psychometric models, automated scoring, task templates, object models, simulation environments.
- Assessment Implementation: Manufacturing the "nuts & bolts": authoring tasks, automated scoring details, statistical models. Reusability. Examples: authoring interfaces, simulation environments, re-usable platforms and elements.
- Assessment Delivery: Students interact with tasks, performances are evaluated, feedback is created. Four-process delivery architecture. Examples: interoperable elements, IMS/QTI, SCORM, feedback / instruction / reporting.

Toulmin's Argument Structure
A claim is supported by data ("so"); the step from data to claim is licensed by a warrant ("since"), which rests on backing; alternative explanations ("unless") may undercut the claim.

Assessment Argument Structure
- A claim about the student is supported by two kinds of data: data concerning the performance and data concerning the situation, both arising from the student acting in the assessment situation.
- Warrants license each step: a warrant for the assessment argument, a warrant for scoring, and a warrant for task design.
- Alternative explanations can undercut the claim.
- Other information concerning the student vis-à-vis the assessment situation also matters, e.g., near or far transfer, familiarity with tools, assessment format, representational forms, evaluation standards, task content and context. Not in measurement models, but crucial to inference.
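To make these components concrete, here is a minimal sketch of the argument structure as a data record. The Python class, field names, and example values are assumptions invented for illustration; they are not part of the PADI object model.

```python
# Illustrative only: a hypothetical record for the assessment argument components above.
from dataclasses import dataclass
from typing import List

@dataclass
class AssessmentArgument:
    claim_about_student: str               # what we want to infer about the student
    data_about_performance: List[str]      # evaluated qualities of the student's work
    data_about_situation: List[str]        # features of the task situation
    other_student_information: List[str]   # e.g., familiarity with tools, task format
    warrant: str                           # why the data support the claim
    alternative_explanations: List[str]    # competing reasons for a poor performance

argument = AssessmentArgument(
    claim_about_student="Student can formulate a model in a genetics investigation",
    data_about_performance=["Quality of the proposed model", "Consistency of the model with the data"],
    data_about_situation=["Task requires forming a model from cross-breeding data"],
    other_student_information=["Familiarity with the data representation"],
    warrant="Students with model-formation skill produce models that account for the data",
    alternative_explanations=["Unfamiliar representational forms", "Unfamiliar task format"],
)
print(argument.claim_about_student)
```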

PADI Design Patterns
- Structured around assessment arguments
- Substance based on recurring principles, ways of thinking, inquiry, etc.
  - E.g., NSES inquiry standards and unifying themes
  - Science education and cognitive psychology research

Some PADI Design Patterns
- Model-Based Reasoning: Model Formation; Evaluation; Revision; Use
- Model-Based Inquiry
- Design under Constraints
- Generate Scientific Explanations
- Troubleshooting (with Cisco)
- Assessing Epistemic Frames (in progress, with David Williamson Shaffer)

The Structure of Assessment Design Patterns
- Focal Knowledge, Skills, Abilities: The primary knowledge/skills/abilities (KSAs) targeted by this design pattern.
- Rationale: How/why this design pattern addresses evidence about the Focal KSAs.
- Additional KSAs: Other knowledge/skills/abilities that may be required by tasks.
- Characteristic features of tasks: Aspects of assessment situations that are needed to evoke evidence about the Focal KSAs.
- Variable features of tasks: Aspects of assessment situations that can be varied to shift difficulty or focus.
- Potential work products: What students actually say, do, or make to produce evidence.
- Potential observations: Aspects of work products we might identify and evaluate as evidence about students' KSAs.
- Potential rubrics: Ways of evaluating work products to produce values of observations.
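To picture how these attributes hang together as a reusable design object, here is a minimal sketch assuming a Python representation; the class and the Model Formation filler values are illustrative assumptions, not the actual PADI implementation.

```python
# Hypothetical sketch: a design pattern's attributes as fields of a structured object.
# Attribute names follow the table above; the class and example content are illustrative only.
from dataclasses import dataclass
from typing import List

@dataclass
class DesignPattern:
    focal_ksas: List[str]                # primary KSAs targeted by the pattern
    rationale: str                       # how/why the pattern yields evidence about focal KSAs
    additional_ksas: List[str]           # other KSAs that tasks may also require
    characteristic_features: List[str]   # task features needed to evoke evidence
    variable_features: List[str]         # task features that can shift difficulty or focus
    potential_work_products: List[str]   # what students say, do, or make
    potential_observations: List[str]    # qualities of work products to evaluate
    potential_rubrics: List[str]         # ways of scoring work products into observations

model_formation = DesignPattern(
    focal_ksas=["Form a scientific model of a situation"],
    rationale="Forming models is a core aspect of model-based reasoning in inquiry",
    additional_ksas=["Content knowledge (e.g., Mendel's laws)", "Familiarity with tools"],
    characteristic_features=["Situation that calls for constructing an explanatory model"],
    variable_features=["Content area", "Amount of scaffolding", "Complexity of the data"],
    potential_work_products=["Constructed model", "Written explanation"],
    potential_observations=["Accuracy and completeness of the model"],
    potential_rubrics=["Rubric rating model quality against the given data"],
)
```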

How Design Patterns Support Thinking about the Assessment Argument
Each attribute of a design pattern supports a component of the assessment argument:
- The design pattern is organized around Focal KSAs. They will be involved in the Claim, although there may be other KSAs included in the target of inference (e.g., Model Formation: but what models, in what context?). Associated with Characteristic Features of Tasks.
- The Rationale provides background on the nature of the Focal KSAs and the kinds of things people do, in what kinds of situations, that evidence them. It contributes to the Warrant in the assessment argument.
- Additional KSAs play multiple roles. You need to think about which ones you really DO want to include as targets of inference (validity) and which ones you really DON'T (invalidity).
  - The Additional KSAs you DO want to include as targets of inference are part of the Claim, e.g., knowing Mendel's laws as well as being able to formulate a model in an investigation. Connected with Variable Features of Tasks.
  - The Additional KSAs you DON'T want to include as targets of inference introduce alternative explanations for poor performance. (Especially important for assessing special populations: UDL and accommodations.) Connected with Variable Features of Tasks and Work Products.
- The Characteristic Features of Tasks help you think about critical data concerning the situation: what you need in order to get evidence about the Focal KSAs.
- Variable Features of Tasks also help you think about data concerning the situation, but now to influence difficulty, or to bring in or reduce demand for Additional KSAs to avoid alternative explanations.
  - Some Variable Features of Tasks help you match features of tasks to students' backgrounds, knowledge, and characteristics: interests, familiarity, previous instruction.
- Potential Work Products help you think about what you want to capture from a performance: product, process, constructed model, written explanation, etc. They can also call attention to demand for Additional KSAs and help avoid alternative explanations (e.g., Stella).
- Potential Observations are possibilities for the qualities of Work Products, i.e., the data concerning the performance.
- And Potential Rubrics are the algorithms, rubrics, or rules for evaluating Work Products to get the data concerning the performance.
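As a toy illustration of that last point, a rubric can be written as a rule that maps a work product onto an observation value. The criteria and scale below are invented for illustration and are not drawn from any actual design pattern.

```python
# Toy example: a rubric as a rule that evaluates a work product (a written explanation)
# to produce an observation value. Criteria and 0-2 scale are invented.
def score_written_explanation(explanation: str) -> int:
    """Return an observation value from 0 to 2 for a written-explanation work product."""
    text = explanation.lower()
    cites_evidence = "because" in text or "data" in text                  # appeals to evidence?
    names_mechanism = any(term in text for term in ("allele", "gene", "dominant"))  # names a mechanism?
    if cites_evidence and names_mechanism:
        return 2  # evidence-based explanation that names a mechanism
    if cites_evidence or names_mechanism:
        return 1  # partially meets the criteria
    return 0      # meets neither criterion

print(score_written_explanation(
    "The offspring are tall because the tall allele is dominant over the short one."
))  # -> 2
```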

For more information…
- PADI: Principled Assessment Designs for Inquiry
  - NSF project, collaboration with SRI et al.
  - Links to follow-on projects
- Bob Mislevy's home page
  - Links to papers on ECD
  - Cisco applications

Now for the Good Stuff…
- Examples of design patterns with content
  - Different projects
  - Different grain sizes
  - Different users
- How they evolved to suit the needs of users
  - Same essential structure
  - Representations, language, emphases, and affordances tuned to users and needs
- How they are being used

Use of Design Patterns in STEM Research and Development Projects
Britte Haugan Cheng and Geneva Haertel
DRK-12 PI Meeting, November 2009

Current Catalog of Design Patterns
- ECD/PADI-related projects have produced over 100 Design Patterns
- Domains include science inquiry, science content, mathematics, economics, and model-based reasoning
- Design Patterns span grade levels
- Organized around themes, models, and processes, not surface features or formats of tasks
- Support the design of scenario-based, multiple-choice, and performance tasks
- The following examples show how projects have used and customized Design Patterns in ways that suit their needs and users

Example 1: DRK-12 Project, An Application of ECD to a State Large-Scale Science Assessment
- Challenge in the Minnesota Comprehensive Assessment of science: how to design scenario-based tasks and technology-enhanced interactions, grounded in standards, both EFFICIENTLY and VALIDLY
- Design Patterns support storyboard writing and task authoring
- Designers are a committee of Minnesota teachers, supported by Pearson
- The project focuses on a small number of Design Patterns for "hard-to-assess" science content and inquiry
  - Based on Minnesota state science standards and benchmarks and the NSES inquiry standards
- Design Patterns are Web-based and interactive

Design Pattern: Observational Investigation
- Relates science content/processes to components of the assessment argument
  - Higher-level, cross-cutting themes, ways of thinking, and ways of using science, rather than many finer-grained standards
  - Related to relevant standards and benchmarks
- Interactive features:
  - Examples and details
  - Activate pedagogical content knowledge
  - Present exemplar assessment tasks
  - Provide selected knowledge representations
  - Links among associated assessment argument components

Design Pattern Observational Investigation

Design Pattern Observational Investigation (cont.)

Interactive Feature: Details

Interactive Feature: Linking assessment argument components

Design Pattern Highlights: Observational Investigation
- Relates science content/processes to components of the assessment argument
  - Higher-level, cross-cutting themes, ways of thinking, and ways of using science, rather than many fine-grained standards
- Interactive features:
  - Examples and details
  - Activates pedagogical content knowledge
  - Presents exemplar assessment tasks
  - Provides selected knowledge representations
  - Relates relevant standards and benchmarks
  - Links among associated assessment argument components

Design Pattern: Reasoning about Complex Systems
- Relates science content/processes to components of the assessment argument
  - Across scientific domains and standards
  - Convergence among the design of instruction, assessment, and technology
- Interactive feature:
  - Explicit support for designing tasks around a multi-year learning progression

Design Pattern Reasoning about Complex Systems

Interactive Feature: Details

Interactive Feature: Linking assessment argument components

Design Pattern Highlights: Reasoning about Complex Systems
- Relates science content/processes to components of the assessment argument
  - Across scientific domains and standards
  - Convergence among the design of instruction, assessment, and technology
- Interactive feature:
  - Explicit support for designing tasks around a multi-year learning progression

Example 2: Principled Assessment Designs for Inquiry, Model-Based Reasoning Suite
- Relates science content/processes to components of the assessment argument
- A suite of seven related Design Patterns supports curriculum-based assessment design
  - Theoretically and empirically motivated by Stewart and Hafner (1994), Research on problem solving: Genetics. In D. L. Gabel (Ed.), Handbook of research on science teaching and learning. New York: Macmillan.
  - Aspects of model-based reasoning, including model formation, model use, model revision, and coordination among aspects of model-based reasoning
  - Multivariate student model: scientific reasoning and science content
- Interactive feature (see the sketch after this list):
  - Supports the design of both:
    - Independent tasks associated with an aspect of model-based reasoning
    - Steps in a larger investigation comprising several aspects, including model conceptualization, model use, and model evaluation
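As a rough sketch of how steps in a larger investigation might each draw on one aspect of model-based reasoning from the suite: the genetics context and the ordering below are assumptions for illustration, not a published sequence from the project.

```python
# Hypothetical: an investigation as an ordered chain of steps, each tied to one
# aspect of model-based reasoning named in the suite on this slide.
investigation_steps = [
    ("Formulate a model of inheritance for the organism under study", "Model Formation"),
    ("Use the model to predict the outcomes of a new cross",          "Model Use"),
    ("Compare the predictions with the observed data",                "Model Evaluation"),
    ("Revise the model to account for any discrepancies",             "Model Revision"),
]

for step, pattern in investigation_steps:
    print(f"{pattern}: {step}")
```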

Design Pattern Model Formation

Design Pattern Model Formation (cont.)

Interactive Feature: Links among Design Patterns

Design Pattern Highlights: Model-Based Reasoning Suite
- Relates science content/processes to components of the assessment argument
  - Facilitates the integration of model-based reasoning skills into any science content area
  - Serves as the basis of a learning progression
- Interactive features:
  - Support the design of both independent tasks associated with an aspect of model-based reasoning and steps in a larger investigation that comprises several aspects, from conceptualization of a model to its use and evaluation
  - Explicit supports (links among Design Patterns) for designing both investigations and focused tasks

Example 3: Principled Science Assessment Designs for Students with Disabilities; Designing and Conducting Scientific Investigations Using Appropriate Methodology
- Relates science content/processes to components of the assessment argument
  - Guides refinement of science assessment tasks across multiple states by identifying and reducing sources of construct-irrelevant variance
  - Integrates six categories of Universal Design for Learning (UDL) into the assessment design process: perceptual, linguistic, cognitive, motoric, executive, affective
- Interactive feature:
  - Highlights relationships among Additional KSAs, Variable Task Features, and Potential Work Products to reduce construct-irrelevant variance in a systematic manner

Design Pattern Designing and Conducting a Scientific Investigation Using Appropriate Methodology

Design Pattern Designing and Conducting a Scientific Investigation Using Appropriate Methodology (cont.)

Interactive Feature: Linking Additional KSAs and Potential Work Products

Design Pattern Highlights: Designing and Conducting a Scientific Investigation Using Appropriate Methodology
- Relates science content/processes to components of the assessment argument
  - Integrates UDL into the assessment design process rather than applying accommodations to an existing task
  - Supports the selection of task features that reduce construct-irrelevant variance and enhance the performance of all test takers
  - Particular attention to knowledge representation and executive processing demands
  - Further customization of Design Patterns to develop assessment tasks for students with particular disabilities
- Interactive feature:
  - Relates the perceptual and expressive capabilities required to complete an assessment task to that task's features (Additional KSAs, Variable Task Features, and Potential Work Products)

Example 4: Alternate Assessments in Mathematics; Describe, Extend, and Make Generalizations about Geometric and Numeric Patterns
- Relates math content/processes to components of the assessment argument
  - Standards-based Design Patterns co-designed across three states to guide the development of statewide assessment tasks for students with significant cognitive disabilities
  - Integration of the six UDL categories into the design process
- Interactive feature:
  - For logistical reasons, a Word document was used to create the Design Patterns
  - Attributes are visualized in accordance with the assessment argument, resulting in increased efficiency and improved quality of the argument
  - A new arrangement is now under development for use in an online system

Design Pattern Describe, extend, and make generalizations about geometric and numeric patterns

Design Pattern Describe, extend, and make generalizations about geometric and numeric patterns (cont.)

Interactive Feature: Horizontal View Aligning Focal KSAs, Potential Observations and Potential Work Products

Interactive Feature: Horizontal View Aligning Additional KSAs and Variable Task Features

Design Pattern Highlights: Describe, Extend, and Make Generalizations about Geometric and Numeric Patterns
- Relates math content/processes to components of the assessment argument
  - Deconstruction of NCTM expectations to identify KSAs that are less difficult, or tasks that assess related cognitive background knowledge
  - Supports the principled alignment of task difficulty and scope with challenges to accessibility
- Interactive feature:
  - Use of multiple views of the Design Pattern to support understanding of the relationships among components of the assessment argument
  - Increased efficiency of design and validity of the assessment argument

Summary
- Design Patterns are organized around assessment arguments and key ideas in science and math, as opposed to surface features of assessment tasks.
  - They support designing tasks that move in the directions NSES and NCTM advocate, in ways that build on research and experience.
- Design Patterns support task design for different purposes and different formats (e.g., learning, summative, classroom, large-scale, hands-on, paper-and-pencil, simulations).
- They are especially important for newer forms of assessment:
  - Technology-based, scenario-based tasks in Minnesota
  - Scenario-based learning and assessment (Foothill-DeAnza project)
  - Simulation-based tasks (network troubleshooting, with Cisco)
  - Game-based assessment (just starting, with the MacArthur project)

Summary (cont.)
- Design Patterns are eclectic: they are not tied to any particular underlying theory of learning or cognition; all psychological perspectives can be represented.
- They document design decisions.
- They can represent hierarchical relationships among Focal KSAs, sequential steps required for the completion of complex tasks, or superordinate, subordinate, and coordinate relations among concepts.
- They are reusable; a family of assessment tasks can be produced from a single Design Pattern.
- They enhance the integration of UDL with the evidence-centered design process.
- Technology makes evident the relationships among Design Pattern attributes and their roles in the assessment argument.