Some Observations on Cognitive Psychology and Educational Assessment
Robert J. Mislevy, University of Maryland
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
NCME, San Diego, CA, April 15, 2004
Outline of the talk
- Themes from cog psych
- How cog psych informs what we assess and how we might assess it (esp. school & work)
- How cog psych helps us understand and organize what we do in assessment
Themes
- Capabilities & limitations
- Reasoning in terms of patterns
- Psychological perspectives
- Acquiring expertise
- Forms of knowledge representation
Capabilities & limitations
- Ways we are the same / different / unique
- Experiential & reflective cognition
- Optical illusions / cognitive illusions
- Limited working memory & attention
- Can think about our thinking (metacognition)
- Benefit from procedures, methods, tools
An Optical Illusion [image]
Reasoning in terms of patterns
- Perception combines input from the environment and patterns from experience
- Chi, Feltovich, & Glaser example
- Narratives / schemas / scripts / mental models: this is how we make sense of the world
  - Some “wired in”
  - Some learned informally and experientially
  - Some through instruction and conscious effort
What is this a picture of? [image]
Reasoning in terms of patterns
- Simultaneous use of patterns at many levels
- Perception / Meaning / Action
- Key role of interacting with the situation
- Inquiry cycle / model-based reasoning
- Interactive tasks (construction, simulation)
- Even in static tasks, focus on perception / explanation / action
Reasoning in terms of patterns
Assessment as evidentiary argument:
- What complex of knowledge, skills, or other attributes should be assessed …?
- What behaviors or performances should reveal those constructs [broadly construed]?
- What tasks or situations should elicit those behaviors?
(Messick, 1994)
Psychological perspectives
- Trait/Differential (Spearman, Carroll)
  - Origin of the machinery of psychometrics
- Behaviorist (e.g., criterion-referenced tests of the 1970s)
- Developmental (Piaget)
- Information-processing (Newell & Simon)
- Sociocultural/situative (Vygotsky, Lave)
(Greeno, Pearson, & Schoenfeld, 1996)
Psychological perspectives
A perspective shapes…
- what you pay attention to;
- what entities and relationships you use in explanations;
- what you see as problems and solutions.
A perspective both enables and constrains thinking.
Psychological perspectives
For assessment, a perspective shapes…
- the inferences you target – the patterns that shape students’ actions (Meaning & Action);
- what you look for in what students say, do, or make (Perception);
- the features of the situation that evoke the evidence you need.
A perspective both enables and constrains what you can learn from an assessment.
What will Jimmie’s path be if he steps off the merry-go-round right now?
Psychological perspectives
- HYDRIVE: information-processing + sociocultural
- AP Studio Art: sociocultural; interpretational
  - Note interpretation of variables in the model
- Task-based language assessment: all perspectives relevant
  - Target language use (Bachman & Palmer)
  - What to stress, how to design situations
Acquiring expertise
- Expertise as overcoming human cognitive processing limitations
- Patterns for perceiving, understanding, acting (incl. sociocultural)
- Use of knowledge representations
- Automating processes to varying degrees
- Metacognitive skills
Acquiring expertise
Examples in assessment:
- Katz, re NCARB simulations, as an example of “design under constraint” (assessment is another such domain!)
- Embretson, as an example from the differential/measurement perspective
- Marshall & Derry, as an example of assessment design based on schemata
- Stevens, re ordered pairs of actions
Forms of knowledge representation
- Symbol sets & manipulation
- Forms of knowledge representation (KRs): maps, diagrams, object models, flow charts
- Central to expertise
- Mediated cognition / distributed cognition
- Nexus between information-processing & sociocultural perspectives
Forms of knowledge representation
Some forms of knowledge representation for designing and using assessments:
- Measurement models & representations
- Argument structures
- Evidence-centered design structures
- Design patterns, templates, object models (sketched below)
- IMS/QTI standards
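As an illustration of the last two kinds of representation, here is a minimal sketch of what a design-pattern object model might look like. The class and attribute names echo common evidence-centered design usage (focal KSAs, potential observations, work products, task features), but they are assumptions for illustration, not a standard schema such as IMS/QTI.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class DesignPattern:
    """Illustrative object model for an assessment design pattern (ECD-style).
    Attribute names follow common evidence-centered design usage; this is a
    sketch, not a standard schema."""
    title: str
    rationale: str
    focal_ksas: List[str] = field(default_factory=list)              # knowledge/skills/abilities targeted
    potential_observations: List[str] = field(default_factory=list)  # what to look for in performances
    potential_work_products: List[str] = field(default_factory=list) # what students say, do, or make
    characteristic_task_features: List[str] = field(default_factory=list)  # features every task needs
    variable_task_features: List[str] = field(default_factory=list)        # features to vary focus/difficulty

# Hypothetical example: troubleshooting in a HYDRIVE-like simulation
troubleshooting = DesignPattern(
    title="Troubleshooting a hydraulic system",
    rationale="Expert troubleshooting uses strategies such as space splitting and serial elimination.",
    focal_ksas=["strategic knowledge", "system knowledge"],
    potential_observations=["sequence of tests narrows the active path", "redundant tests"],
    potential_work_products=["log of troubleshooting actions"],
    characteristic_task_features=["fault hidden in a multi-component system"],
    variable_task_features=["number of components", "availability of gauges"],
)
```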
Forms of knowledge representation
[Diagram: three basic models that embody the assessment argument]
Measurement models:
- Multivariate models for different aspects of knowledge / skill / propensities (MRCMLM; see the sketch below)
- Integration of statistical inference with task design (Tatsuoka, Embretson): cognitive diagnosis, mixed strategies, multilevel models
- Conditional dependence (re interaction)
- Re-interpretation of variables (propensities to act in situations with particular features; rater models)
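For orientation, one common statement of the multidimensional random coefficients multinomial logit model (MRCMLM; Adams, Wilson, & Wang) gives the probability that person n responds in category k of item i. The notation below is a sketch for reference, not a definitive statement of any particular operational model:

```latex
% Sketch of the MRCML response model (notation illustrative):
%   \theta_n        vector of person n's proficiencies on the modeled dimensions
%   \xi             vector of item parameters
%   b_{ik}, a_{ik}  scoring and design vectors for category k of item i
P(X_{nik} = 1 \mid \boldsymbol{\theta}_n) =
  \frac{\exp\left(\mathbf{b}_{ik}^{\top}\boldsymbol{\theta}_n + \mathbf{a}_{ik}^{\top}\boldsymbol{\xi}\right)}
       {\sum_{j=1}^{K_i} \exp\left(\mathbf{b}_{ij}^{\top}\boldsymbol{\theta}_n + \mathbf{a}_{ij}^{\top}\boldsymbol{\xi}\right)}
```

The scoring vectors b_{ik} are one place where task design enters the statistical model: they specify which proficiencies each response category draws on.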
Example: HYDRIVE
Student-model variables in HYDRIVE:
- Motivated by cognitive task analysis
- Scope shaped by purpose
- Grain size determined by instructional options
[Bayes net fragment: Overall Proficiency; Procedural Knowledge; Power System Knowledge; Strategic Knowledge; Use of Gauges; Space Splitting; Electrical Tests; Serial Elimination; Landing Gear Knowledge; Canopy Knowledge; Electronics Knowledge; Hydraulics Knowledge; Mechanical Knowledge]
HYDRIVE, continued
A Bayes net measurement-model fragment, docked with the student model
[Diagram: an observable from a canopy situation where no split is possible, linked to Use of Gauges, Serial Elimination, Canopy Knowledge, Hydraulics Knowledge, and Mechanical Knowledge; drawn from a library of measurement-model fragments]
(A minimal numerical sketch of docking and updating follows.)
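To make the docking idea concrete, here is a minimal, purely illustrative sketch: one student-model variable and one observable, with invented probabilities, updated by Bayes' rule when the observable is seen. The binary coarse-graining and all numbers are assumptions for illustration, not HYDRIVE's actual model.

```python
# Illustrative only: a two-node Bayes-net fragment with invented numbers.
# Student-model variable: SerialElimination in {"expert", "novice"}
# Observable: whether the student's action in a canopy situation removes
# components from consideration ("eliminative") or not ("redundant").

prior = {"expert": 0.5, "novice": 0.5}  # assumed prior on SerialElimination

# Assumed conditional probabilities P(observable | SerialElimination)
likelihood = {
    "expert": {"eliminative": 0.85, "redundant": 0.15},
    "novice": {"eliminative": 0.40, "redundant": 0.60},
}

def update(prior, likelihood, observation):
    """Posterior over the student-model variable given one observed action."""
    unnormalized = {state: prior[state] * likelihood[state][observation]
                    for state in prior}
    total = sum(unnormalized.values())
    return {state: p / total for state, p in unnormalized.items()}

# Docking the fragment: an eliminative action shifts belief toward "expert".
posterior = update(prior, likelihood, "eliminative")
print(posterior)  # {'expert': 0.68, 'novice': 0.32}
```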
Conclusion
- Assessment is a particular kind of narrative: an evidentiary argument about aspects of what students know and can do, based on a handful of particular things they have said, done, or made.
- Assessment integrates perceiving, understanding, and acting.
- Assessment forms both enable and constrain thinking about students.
Conclusion
Cognitive psychology helps us understand what to make inferences about, what we need to see, and what situations can provide us with clues:
- Conceiving targets of assessment
- Explicating and improving the design and use of assessments