AERA, April 2005
Models and Tools for Drawing Inferences from Student Work: The BEAR Scoring Engine
Cathleen Kennedy & Mark Wilson, University of California, Berkeley
Overview (© BEAR Center, 2005)
- Features of "complex" tasks
- How PADI addresses complex task features
- The "big" assessment picture and where inferences are drawn (in measurement models implemented in the Scoring Engine)
- An example of the PADI and Scoring Engine views
- Next steps: a Wizard to guide designers in developing a solid chain of reasoning
Example from FOSS
Example from FOSS
Two measures:
- Physics (speed)
- Mathematics
Five responses:
- Equation choice
- Fill-in numbers
- Fill-in units
- Calculate
- Units in answer
Are the responses dependent?
"Complex" Task Features
- Multiple measures of interest
  - Content & inquiry
  - Multiple aspects of inquiry
- Response dependencies
  - Common stimulus
  - Sequence of steps
Complex Measurement Requires a Clear Chain of Reasoning
1. Inferences one wishes to draw (cognition vertex)
2. Evidence required to draw the inferences (interpretation vertex)
3. Observations required to generate evidence (observation vertex)
4. Inferences are then interpretable in the context of the purpose of the assessment
PADI Addresses "Complex" Task Features
Task features:
- Multiple measures of interest (content & inquiry; multiple aspects of inquiry)
- Response dependencies (common stimulus; sequence of steps within a task)
PADI approach:
- Multidimensional IRT measurement model
- Well-defined evaluation phases that model response dependencies (rather than ignoring them)
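The slides name the measurement model but not its form. A minimal sketch of the between-item multidimensional case, assuming dichotomous items that each load on a single dimension (the symbols below are standard IRT notation, not taken from the slides):

```latex
P(X_i = 1 \mid \boldsymbol{\theta})
  = \frac{\exp\left(\theta_{d(i)} - \delta_i\right)}
         {1 + \exp\left(\theta_{d(i)} - \delta_i\right)}
```

where \(\theta_{d(i)}\) is the student's proficiency on the one dimension \(d(i)\) that item \(i\) measures (Physics or Mathematics in the FOSS example), and \(\delta_i\) is the item's difficulty. Bundled, polytomous observables would use a partial-credit generalization of the same form.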
Assessment System Architecture
[Diagram: system components — Design System, Task Specifications, Delivery System, Scoring Engine, Student Database — with the Design Team on the design side and Students on the implementation side.]
Chain of Inferential Reasoning
[Diagram: Design System, Task Specifications, Delivery System, Scoring Engine, and Student Database, linking the Assessment Purpose through Assessment Evidence to inferences about what students know and can do.]
FOSS Example: PADI View
Two student model variables:
- Physics (speed)
- Mathematics
Six observable variables:
- Equation choice (Physics)
- Fill-in numbers (Physics)
- Fill-in units (Physics)
- Calculate (Math)
- Units in answer (Physics)
- Bundled physics items
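The between-item loading structure on this slide can be written down directly as a map from observable variables to student model variables. A minimal sketch in Python, with names paraphrased from the slide:

```python
# Map each observable variable to the single student model variable
# (SMV) it provides evidence for, as listed on the slide.
OBSERVABLE_TO_SMV = {
    "Equation choice": "Physics",
    "Fill-in numbers": "Physics",
    "Fill-in units": "Physics",
    "Calculate": "Math",
    "Units in answer": "Physics",
    "Bundled physics items": "Physics",
}

def observables_for(smv):
    """Return the observable variables that load on one SMV."""
    return [obs for obs, s in OBSERVABLE_TO_SMV.items() if s == smv]
```

In a between-item model each observable appears under exactly one SMV, which is what makes the two proficiency estimates separable later on.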
FOSS Example: Bundling Rules
Defined in the PADI Design System:
- Template (task specification)
  - Activity
  - Evaluation Procedure
  - Evaluation Phase
Implemented in the Delivery System
FOSS Example: Bundling Rules

Equation   Eq. Fill In   Eq. Units   Ans. Units   Bundle
   0            0            0           0          0
   0            0            0           1          1
   0            0            1           0          1
   1            0            0           0          2
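The slide shows only four of the sixteen possible response patterns. A minimal sketch of the bundling rule as a lookup table, encoding just the rows shown (bundle scores for the remaining patterns are not given in the source, so the function deliberately fails on them):

```python
# Bundle scores for the four physics responses, keyed by the pattern
# (equation, eq_fill_in, eq_units, ans_units) with 1 = correct.
# Only the four patterns shown on the slide are encoded.
BUNDLE_SCORES = {
    (0, 0, 0, 0): 0,
    (0, 0, 0, 1): 1,
    (0, 0, 1, 0): 1,
    (1, 0, 0, 0): 2,
}

def bundle_score(equation, eq_fill_in, eq_units, ans_units):
    """Return the bundled physics score for a response pattern.

    Raises KeyError for patterns whose bundle score the slide
    does not show.
    """
    return BUNDLE_SCORES[(equation, eq_fill_in, eq_units, ans_units)]
```

Bundling the four dependent physics responses into one score is what lets the measurement model treat the bundle as a single polytomous observable rather than pretending the responses are independent.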
FOSS Example: Scoring Engine View
Two student model variables:
- Physics (speed)
- Mathematics
Two observable variables:
- Math score
- Bundled physics score
"Between-item" multidimensionality: each observable variable provides evidence of one student model variable (SMV). The Scoring Engine returns two proficiency estimates per student to the Assessment Delivery System.
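The slides say the Scoring Engine returns two proficiency estimates but not how they are computed. One common choice for a between-item model is an expected a posteriori (EAP) estimate per dimension; because each observable loads on exactly one dimension, each SMV can be estimated from its own observables alone. The sketch below assumes a partial credit model with hypothetical step difficulties and a standard normal prior — none of these numbers come from the source:

```python
import math

def pcm_probs(theta, deltas):
    """Partial credit model category probabilities for one item.

    deltas are step difficulties for categories 1..m; category 0
    contributes a cumulative sum of 0. Returns m + 1 probabilities.
    """
    cums = [0.0]
    for d in deltas:
        cums.append(cums[-1] + (theta - d))
    exps = [math.exp(c) for c in cums]
    total = sum(exps)
    return [e / total for e in exps]

def eap_estimate(score, deltas):
    """EAP estimate of theta for one observed item score, using a
    standard normal prior evaluated on a fixed quadrature grid."""
    grid = [-4.0 + 0.1 * i for i in range(81)]
    weights = [math.exp(-0.5 * t * t) * pcm_probs(t, deltas)[score]
               for t in grid]
    norm = sum(weights)
    return sum(t * w for t, w in zip(grid, weights)) / norm

# Hypothetical step difficulties for the bundled physics item (0-2).
physics_deltas = [-0.5, 0.5]
estimates = {s: eap_estimate(s, physics_deltas) for s in (0, 1, 2)}
```

With a single polytomous observable per dimension the estimates are coarse; a real engine would pool every observable loading on a dimension before estimating it.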
Next Steps
Develop a Measurement Model Design Wizard:
- Evaluate the design needs of users (how do they do it now? what would work better?)
- Guide thinking from the "assessment purpose" standpoint
- Align inferences, evidence, and observations