Evidence-Centered Design and Cisco’s Packet Tracer Simulation-Based Assessment
Robert J. Mislevy, Professor, Measurement & Statistics, University of Maryland
December 15, 2009


Slide 1: Evidence-Centered Design and Cisco’s Packet Tracer Simulation-Based Assessment
Robert J. Mislevy, Professor, Measurement & Statistics, University of Maryland
with John T. Behrens & Dennis Frezzo, Cisco Systems, Inc.
December 15, 2009, ADL, Alexandria, VA

Slide 2: Simulation-based & Game-based Assessment
Motivation: cognitive psychology & technology
»Complex combinations of knowledge & skills
»Complex situations
»Interactive, evolving in time, constructive
»Challenge of technology-based environments

Slide 3: Outline
»ECD
»Packet Tracer
»Packet Tracer & ECD

Slide 4: ECD

Slide 5: Evidence-Centered Assessment Design
Messick’s (1994) guiding questions:
»What complex of knowledge, skills, or other attributes should be assessed?
»What behaviors or performances should reveal those constructs?
»What tasks or situations should elicit those behaviors?

Slide 6: Evidence-Centered Assessment Design
A principled framework for designing, producing, and delivering assessments
Process model, object model, design tools
Explicates the connections among assessment designs, inferences about students, and the processes needed to create and deliver these assessments
Particularly useful for new and complex assessments

Layers in the assessment enterprise (from Mislevy & Riconscente, in press)
»Domain Analysis: What is important about this domain? What work and situations are central in this domain? What knowledge representations (KRs) are central to this domain?
»Domain Modeling: How do we represent key aspects of the domain in terms of an assessment argument? Conceptualization.
»Conceptual Assessment Framework: Design structures: student, evidence, and task models. Generativity.
»Assessment Implementation: Manufacturing the “nuts & bolts”: authoring tasks, automated scoring details, statistical models. Reusability.
»Assessment Delivery: Students interact with tasks; performances are evaluated; feedback is created. Four-process delivery architecture.
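The Conceptual Assessment Framework’s design structures (student, evidence, and task models) can be pictured as reusable data objects. The following is a minimal Python sketch of that idea; every class and field name is invented for illustration, as ECD prescribes no particular code representation.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: names are hypothetical, not part of any ECD spec.

@dataclass
class StudentModel:
    """Proficiency variables the assessment makes claims about."""
    proficiencies: dict = field(default_factory=dict)  # e.g., {"routing": 0.0}

@dataclass
class EvidenceModel:
    """Observable features of performance, plus rules for updating proficiencies."""
    observables: list = field(default_factory=list)
    scoring_rules: dict = field(default_factory=dict)  # observable -> proficiency

@dataclass
class TaskModel:
    """Features of the task situation that elicit the desired evidence."""
    situation_features: dict = field(default_factory=dict)
    work_products: list = field(default_factory=list)  # e.g., ["network XML"]
```

Separating the three models in this way is what gives the framework its reusability: the same task model can feed different evidence models, and vice versa.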

Slide 8: Packet Tracer

Slide 9: Cisco’s Packet Tracer
Online tool used in the Cisco Networking Academies
Create, edit, configure, run, and troubleshoot networks
Multiple representations in the logical layer
Inspection tool links to a deeper physical world
Simulation mode
»Detailed visualization and data presentation
Standard support for world authoring
»Library of elements
»Simulation of relevant deep structure
»Copy, paste, save, edit, annotate, lock

Slide 10: Instructors and students can author their own activities

Slide 11: Instructors and students can author their own activities

Explanation

Experimentation

Slide 15: Packet Tracer & ECD


(Layers diagram, as earlier.) Assessment argument structures; Design Patterns.

(Layers diagram, as earlier.) Application to familiar assessments:
»Upfront design of features of the task situation: static, implicit in the task, not usually tagged
»Upfront design of features of response classes
»Fixed competency variables

(Layers diagram, as earlier.) Application to complex assessments:
»More complex evaluations of features of the task situation
»Multiple, perhaps configured-on-the-fly competency variables

(Layers diagram, as earlier.) Familiar vs. complex assessments: in complex assessment, performance unfolds over time in an evolving, interactive situation, with macro and micro features of both the situation and the performance.
»Some features of the task situation are designed up front; others are recognized as they arise (e.g., agents)
»Some features of performance or its effects are designed up front; others are recognized as they arise (e.g., agents)

(Layers diagram, as earlier.) Object models for representing:
»Psychometric models (including competences/proficiencies)
»Simulation environments
»Task templates
»Automated scoring

(Layers diagram, as earlier.) PADI object model for task/evidence models.

(Layers diagram, as earlier.) The graphical representation of the network and its configuration is expressible as a text representation in XML format, serving both as presentation and as work product, to support automated scoring.
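To make the idea concrete, here is a sketch of how such an XML work product might be parsed for automated scoring. The schema, element names, and attribute names below are invented for illustration; Packet Tracer’s actual file format is not shown in this talk.

```python
import xml.etree.ElementTree as ET

# Invented example of a network configuration serialized as XML;
# the real Packet Tracer work-product format differs.
WORK_PRODUCT = """
<network>
  <device name="Router0" type="router">
    <interface id="Fa0/0" ip="192.168.1.1" mask="255.255.255.0"/>
  </device>
  <device name="PC0" type="pc">
    <interface id="Fa0" ip="192.168.1.10" mask="255.255.255.0"/>
  </device>
</network>
"""

def extract_interfaces(xml_text):
    """Return {device name: {interface id: ip}} from the XML representation."""
    root = ET.fromstring(xml_text)
    return {
        dev.get("name"): {
            iface.get("id"): iface.get("ip")
            for iface in dev.findall("interface")
        }
        for dev in root.findall("device")
    }
```

Once the work product is machine-readable like this, evidence rules can inspect it programmatically instead of relying on human raters.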

From Mislevy & Riconscente, in press Assessment Delivery Students interact with tasks, performances evaluated, feedback created. Four- process delivery architecture. Assessment Implementation Conceptual Assessment Framework Domain Modeling Domain Analysis What is important about this domain? What work and situations are central in this domain? What KRs are central to this domain? What is important about this domain? What work and situations are central in this domain? What KRs are central to this domain? How do we represent key aspects of the domain in terms of assessment argument. Conceptualization. Design structures: Student, evidence, and task models. Generativity. Manufacturing “nuts & bolts”: authoring tasks, automated scoring details, statistical models. Reusability. Authoring interfaces Simulation environments Re-usable platforms & elements Standard data structures IMS/QTI, SCORM

In Packet Tracer, the Answer Network serves as the base pattern for work product evaluation.
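One way to picture answer-network evaluation: reduce both the student’s network and the answer network to sets of configuration facts and report the fraction matched. This is a minimal sketch of the idea under that assumption, not Cisco’s actual scoring logic.

```python
# Sketch of evaluating a student work product against an answer network.
# Both arguments are dicts mapping (device, attribute) -> configured value;
# the representation is hypothetical, chosen only to illustrate the idea.

def score_against_answer(student_facts, answer_facts):
    """Fraction of answer-network facts the student's network matches."""
    if not answer_facts:
        return 1.0
    matched = sum(
        1 for key, value in answer_facts.items()
        if student_facts.get(key) == value
    )
    return matched / len(answer_facts)
```

For example, a student who gets the IP right but the subnet mask wrong matches one of two answer facts and scores 0.5.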

Dynamic task models, variable assignment: the Initial Network is similar to the Answer Network tree. When the activity starts, instead of using the initial network as the starting values, the activity configures the network with the contents of the variables.
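The variable-assignment idea can be sketched as template substitution: at activity start, the initial network configuration is filled in from the task’s variables, so one task template generates many task instances. The template syntax and variable names here are invented for illustration.

```python
import string

# Hypothetical initial-network template; in a real dynamic task the
# template would cover the whole network, not one interface.
INITIAL_NETWORK_TEMPLATE = string.Template(
    "interface Fa0/0\n ip address $router_ip $subnet_mask"
)

def instantiate_initial_network(variables):
    """Configure the starting network from the task's variable assignment."""
    return INITIAL_NETWORK_TEMPLATE.substitute(variables)
```

Drawing the variable values at random (within constraints) is what makes such tasks generative: each student can receive a structurally identical but superficially different problem.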


(Layers diagram, as earlier.) Assessment Delivery:
»Interoperable elements (IMS/QTI, SCORM)
»Feedback / instruction / reporting

Slide 30: Conclusion
Behavior in learning environments builds connections for performance environments.
Assessment tasks & features are strongly related to instruction/learning objects & features.
Re-use concepts and code in assessment,
»via arguments, schemas, and data structures that are consonant with instructional objects.
Use data structures that are
»share-able and extensible,
»consistent with delivery processes and design models.

Slide 31: Further information
Bob Mislevy home page
»Links to papers on ECD
»Cisco NetPASS
»Cisco Packet Tracer
PADI: Principled Assessment Design for Inquiry
»NSF project, collaboration with SRI et al.