
1 Some Terminology and Concepts for Simulation-Based Assessment
Robert J. Mislevy, University of Maryland
In collaboration with John T. Behrens & Dennis C. Frezzo, Cisco Systems, Inc.
Presented at the Lindquist Center, University of Iowa, Iowa City, IA, September 19, 2007

2 Examples
Computer-based, interactive, simulation-based assessments, with information-processing and sociocultural roots:
- Packet Tracer (Cisco Systems): computer network design & troubleshooting
- The Architectural Registration Examination: architectural design in a CAD-like environment
- HYDRIVE: troubleshooting the F-15 hydraulics system
- The National Board of Medical Examiners' Primum® Computer Case Simulations (CCS)
- The DISC simulator: simulations for problem-solving in dental hygiene

3 Two Not-Entirely-Satisfactory Ways of Assessing Competence in Computer Networking
The Cisco Networking Academy (CNA) supports 60,000+ academies in schools and supplies curriculum & assessment resources online.
- Standardized multiple-choice exams
  » Adequate for content knowledge, but...
  » say nothing about interacting with functioning networks or troubleshooting in inquiry cycles.
- High-fidelity local hands-on exams
  » Authentic, with high face validity.
  » Huge local variation; unreliable and expensive.

4 Packet Tracer
- The goal is improving instruction by fostering communication and enabling agents.
- The patterns of activity and structure in the software embody both cognitive theory and a theory of the domain's structure.
- Tasks are tasks, whether you are teaching with them or testing with them.

5 Packet Tracer
- Create and edit networks
- Multiple representations in the logical layer
- Inspection tool links to a deeper physical world
- Simulation mode
  » Detailed visualization and data presentation
- Standard support for world authoring
  » Library of elements
  » Simulation of relevant deep structure
  » Copy, paste, save, edit, annotate, lock

6 Instructors and students can author their own activities

7 Explanation

8 Experimentation

9 Two ways to design a simulation-based assessment system when you're looking at thousands of people, with hundreds of tasks, at high cost and with high stakes...
The WRONG way: Ask "How do you score it?" after you've built the simulator and scripted a handful of idiosyncratic scenarios.

10 A RIGHT way: Design the simulator and the scenarios around what you want to make inferences about, what you need to see to ground those inferences, and the structure of their interrelationships.

11 Evidence-Centered Assessment Design
A formal, multiple-layered framework built from Messick's (1994) guiding questions:
- What complex of knowledge, skills, or other attributes should be assessed?
- What behaviors or performances should reveal those constructs?
- What tasks or situations should elicit those behaviors?

12 Evidence-Centered Assessment Design
- A principled framework for designing, producing, and delivering assessments
- Process model, object model, design tools
- Explicates the connections among assessment designs, inferences regarding students, and the processes needed to create and deliver those assessments
- Particularly useful for new and complex assessments

13 Evidence-Centered Assessment Design
- PADI: Principled Assessment Design for Inquiry
  » NSF project, in collaboration with SRI et al.
  » http://padi.sri.com
- Bob Mislevy's home page
  » http://www.education.umd.edu/EDMS/mislevy/
  » Links to papers on ECD
  » Cisco NetPASS

14 Layers in the assessment enterprise (from Mislevy & Riconscente, in press)
- Domain Analysis: What is important about this domain? What work and situations are central in this domain? What knowledge representations (KRs) are central to this domain?
- Domain Modeling: How do we represent key aspects of the domain in terms of an assessment argument? Conceptualization.
- Conceptual Assessment Framework: Design structures: student, evidence, and task models. Generativity.
- Assessment Implementation: Manufacturing the "nuts & bolts": authoring tasks, automated scoring details, statistical models. Reusability.
- Assessment Delivery: Students interact with tasks, performances are evaluated, feedback is created. Four-process delivery architecture.
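As a rough illustration of the Conceptual Assessment Framework layer, the three design structures can be thought of as linked objects. The sketch below uses hypothetical field names chosen for this talk's vocabulary; it is not the formal ECD object model.

```python
# Rough sketch of CAF design structures (student, evidence, and task models)
# as linked objects. Fields are illustrative, not the formal ECD object model.

from dataclasses import dataclass, field
from typing import List

@dataclass
class StudentModel:
    variables: List[str]              # proficiencies we want to make inferences about

@dataclass
class EvidenceModel:
    observables: List[str]            # features identified in student work products
    bears_on: List[str]               # student-model variables those observables update

@dataclass
class TaskModel:
    presentation_features: List[str]  # what the student sees and can do
    work_products: List[str]          # what gets captured for evidence identification
    variable_features: List[str]      # features that vary to generate families of tasks

@dataclass
class AssessmentDesign:
    student_model: StudentModel
    evidence_models: List[EvidenceModel] = field(default_factory=list)
    task_models: List[TaskModel] = field(default_factory=list)
```

The point of the layer is exactly this kind of reusable structure: once the linkage from task features to observables to student-model variables is explicit, new tasks can be generated and scored within the same argument.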

15-20 (The layer diagram from Slide 14, repeated on these slides.)

21 Packet Tracer: Task Models & Design Wizard
- Inline editing of feedback for each assessable item
- Activity-level and observable-level feedback possible
- Scoring based on network configuration, functionality, and/or solution process
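To make the three scoring bases concrete, here is a minimal sketch of what an assessable item with inline feedback might look like as a data structure. The field names, the example check, and the network-state layout are hypothetical illustrations, not Packet Tracer's actual authoring format.

```python
# Illustrative sketch of assessable items with per-item feedback.
# Field names and the example network state are hypothetical.

from dataclasses import dataclass
from typing import Callable

@dataclass
class AssessableItem:
    name: str
    basis: str                       # "configuration", "functionality", or "process"
    check: Callable[[dict], bool]    # evaluates the student's network state or action log
    feedback_correct: str
    feedback_incorrect: str

items = [
    AssessableItem(
        name="Router1 Fa0/0 address",
        basis="configuration",
        check=lambda net: net["Router1"]["Fa0/0"]["ip"] == "192.168.1.1",
        feedback_correct="Interface address is correct.",
        feedback_incorrect="Check the IP address assigned to Router1 Fa0/0.",
    ),
]

def score(network_state: dict) -> list:
    """Return per-item results and the feedback the student would see."""
    results = []
    for item in items:
        ok = item.check(network_state)
        results.append((item.name, ok,
                        item.feedback_correct if ok else item.feedback_incorrect))
    return results

student_network = {"Router1": {"Fa0/0": {"ip": "192.168.1.1"}}}
print(score(student_network))
```

Configuration checks inspect the ending state, functionality checks would probe the simulated network (e.g., whether a ping succeeds), and process checks would read the logged sequence of student actions.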

22 Packet Tracer: Variable Manager
Variable assignment: when the activity starts, instead of using the initial network as the starting values, the activity configures the network with the contents of the variables as entered by the student.
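In other words, the author defines variables, and at activity start their values (possibly entered by the student) are written onto the network in place of a fixed initial configuration. Below is a minimal sketch of that substitution step; the variable names, device configuration, and template syntax are hypothetical, not Packet Tracer's file format.

```python
# Sketch of variable assignment at activity start: the network is configured
# from variable values rather than from a fixed initial file.
# Variable names and the template are hypothetical.

from string import Template

# Values defined by the author or entered by the student when the activity starts.
variables = {"lan_subnet": "10.10.4.0", "router_lan_ip": "10.10.4.1"}

startup_config_template = Template(
    "! LAN subnet for this activity: $lan_subnet\n"
    "interface FastEthernet0/0\n"
    " ip address $router_lan_ip 255.255.255.0\n"
    " no shutdown\n"
)

def configure_network(values: dict) -> str:
    """Produce the startup configuration applied when the activity begins."""
    return startup_config_template.substitute(values)

print(configure_network(variables))
```

This is what makes task families possible: the same activity shell, scored against the same observables, can be instantiated with different addressing schemes for different students.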

23 Sample evidence rule from HYDRIVE
IF an active path which includes the failure has not been created,
AND the student creates an active path which does not include the failure,
AND the edges removed from the problem area are of one power class,
THEN the student strategy is splitting the power path,
ELSE the student strategy is not splitting the power path.
Note the rule's...
- Sequential dependence
- Contextualized definition
- Multiple ways to succeed & fail
- Higher-level definition of what we're looking for
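The rule above is a piece of evidence-identification logic, and it translates almost line for line into code. The sketch below encodes it in Python; the state fields and names are illustrative stand-ins, not HYDRIVE's actual internal representation.

```python
# Hedged sketch of the HYDRIVE "splitting the power path" evidence rule.
# Field names are illustrative, not HYDRIVE's actual data model.

from dataclasses import dataclass
from typing import List

@dataclass
class ActionContext:
    prior_path_includes_failure: bool   # has an active path including the failure already been created?
    new_path_includes_failure: bool     # does the path the student just created include the failure?
    removed_edge_power_classes: List[str]  # power classes of the edges removed from the problem area

def identify_strategy(ctx: ActionContext) -> str:
    """Classify the student's move as splitting the power path or not."""
    one_power_class = len(set(ctx.removed_edge_power_classes)) == 1
    if (not ctx.prior_path_includes_failure
            and not ctx.new_path_includes_failure
            and one_power_class):
        return "splitting the power path"
    return "not splitting the power path"
```

Note how the sequential dependence enters through the context object: the classification depends on what the student has already done in this problem, not on the latest action alone.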

24 NetPass: An Evidence-Model Bayes-Net Fragment for Troubleshooting Tasks
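Such a fragment links observables from a troubleshooting task to student-model proficiency variables. As a hand-rolled illustration of the underlying arithmetic only (the probabilities and variable names are made up, not NetPass's actual model), here is a single proficiency-to-observable link updated by Bayes' rule:

```python
# Minimal sketch: updating belief about a proficiency variable ("Troubleshooting")
# from one observable, via Bayes' rule. All probabilities are illustrative.

prior = {"high": 0.3, "medium": 0.5, "low": 0.2}

# P(observable "efficient fault isolation" is seen | proficiency level)
likelihood = {"high": 0.8, "medium": 0.5, "low": 0.2}

def posterior(prior: dict, likelihood: dict) -> dict:
    """Posterior over proficiency given that the observable was seen."""
    unnorm = {level: prior[level] * likelihood[level] for level in prior}
    z = sum(unnorm.values())
    return {level: p / z for level, p in unnorm.items()}

print(posterior(prior, likelihood))
# roughly {'high': 0.45, 'medium': 0.47, 'low': 0.08}
```

In the full evidence model, several observables from a task feed the same fragment, and the fragment is docked to the student-model Bayes net; inference software handles the bookkeeping that this sketch does by hand.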

25-29 (The layer diagram from Slide 14, repeated on these slides.)

30 Conclusion
- Insights from cognitive research can improve the practice of assessment.
- Doing so requires...
  » a shift in perspective, and
  » a deeper understanding of assessment design.
- "Too many notes"? (Emperor Joseph II, on Mozart)
- Suitable conceptual frameworks, tools, and exemplars are now appearing.
  » Generativity, reusability, and interoperability are keys.

