
1 UCLA Graduate School of Education & Information Studies
National Center for Research on Evaluation, Standards, and Student Testing
V3, 4/9/07
SIG: Technology as an Agent of Change in Teaching and Learning
Assessment and Technology: Identifying Current Trends and Issues
Using Technology for Measures of Complex Knowledge and Skill
William L. Bewley, UCLA CRESST
American Educational Research Association Annual Meeting, Chicago, IL, April 10, 2007

2 Overview
Assessing the impact of technology on teaching and learning is problematic. As Means and Haertel (2003, p. 42) put it: “We can’t evaluate the impact of technology per se; instead, we must define more specific practices incorporating particular technology tools or supports and evaluate the effects of these.” To do this, we need appropriate learning measures, particularly for complex cognitive skills demonstrated through complex constructed responses. Technology can help us do that, but (stating the obvious) we still need to know what we’re doing.

3 Technology Is Not the Solution
Technology is not the solution to better assessment, any more than it is the solution to better learning and teaching. The solution is good assessment design, supported by appropriate uses of technology.

4 The Solution
Good assessment design (a checklist sketch follows the list):
- Why are we assessing?
- What will be assessed to support this purpose?
- What behaviors will provide evidence?
- What task(s) will elicit the behaviors?
- How is task performance scored?
- How are results interpreted?
Supported by appropriate technology:
- How are the task, performance scoring, and interpretation implemented and delivered?
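A minimal sketch of the design-first stance, assuming nothing beyond the questions above: a record type whose fields mirror the questions, which a delivery system could check for completeness before any technology choice is made. The class and field names are hypothetical, not a CRESST artifact.

```python
from dataclasses import dataclass

@dataclass
class AssessmentDesign:
    purpose: str         # Why are we assessing?
    construct: str       # What will be assessed to support this purpose?
    evidence: list       # What behaviors will provide evidence?
    tasks: list          # What task(s) will elicit the behaviors?
    scoring: str         # How is task performance scored?
    interpretation: str  # How are results interpreted?
    delivery: str = ""   # How is it implemented and delivered (the technology question)

    def is_complete(self) -> bool:
        """True only if every design question has an answer."""
        return all([self.purpose, self.construct, self.evidence,
                    self.tasks, self.scoring, self.interpretation])
```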

5 Approaches We’re Using
- Simulation-based assessment
  - Assessing the product
  - Assessing the process: analyzing the clickstream
- Sensor-based assessment
- Interpretation
(But not the only approaches)

6 Assessing the Product: Evaluation of Shooting Positions

7 Assessing the Product: Link Architecture Planning

8 Assessing the Product: Combat Marksmanship Coaching
As a coach, observe shooters on the firing line.

9 Assessing the Product: Combat Marksmanship Coaching
Fault check: describe what is wrong (“What kind of error did you observe? Sequence, Skipped Step, Wrong Execution; choose all that apply”), or fix it by dragging and dropping to:
- fill in missing step(s)
- correct the execution of a step
- correct the sequence of steps
A scoring sketch for these three error types follows.
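Here is one way such a fault check could be scored automatically. The three error categories come from the slide; the step representation (a name plus an executed-correctly flag) and the function name are illustrative assumptions, not the deck’s actual scoring logic.

```python
def classify_errors(ideal, observed):
    """Classify faults in an observed procedure against the ideal one.

    ideal:    list of step names in the correct order
    observed: list of (step_name, executed_correctly) tuples
    Returns a set drawn from {'skipped_step', 'sequence', 'wrong_execution'}.
    """
    errors = set()
    observed_names = [name for name, _ in observed]

    # Skipped step: an ideal step never appears in the observation.
    if any(step not in observed_names for step in ideal):
        errors.add("skipped_step")

    # Sequence error: the performed steps appear out of the ideal order.
    performed = [s for s in observed_names if s in ideal]
    if performed != [s for s in ideal if s in observed_names]:
        errors.add("sequence")

    # Wrong execution: a step was performed, but incorrectly.
    if any(not ok for _, ok in observed):
        errors.add("wrong_execution")

    return errors

# Step 2 was skipped and the remaining steps are reversed:
print(classify_errors(["step_1", "step_2", "step_3"],
                      [("step_3", True), ("step_1", True)]))
```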

10 Assessing the Process
Assessing only the product misses the processes used, and with them these potential benefits:
- Verifying expected problem-solving behaviors
- Explaining performance differences between subgroups
- Supporting task validation
- Enabling individualized instruction

11 Assessing the Process
Traditional sources of process data (think-alouds, video, observation) can be expensive, may be unreliable, and are not real time. But...

12 Assessing the Process
Simulation-based assessments can do more than deliver the task. They can:
- Unobtrusively measure processes by analyzing the clickstream
- Do it in real time
- Interpret the results in real time to enable on-the-fly diagnosis of knowledge and skill gaps and individualize instruction
A minimal monitoring sketch follows.
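A minimal sketch of real-time clickstream monitoring, assuming a simulation that emits (action, target) events. The class, method names, and the simple measures (latency, expert-order check) are illustrative assumptions; the action names loosely echo the Air Defense Planning task later in the deck.

```python
import time
from collections import Counter

class ClickstreamMonitor:
    def __init__(self, expected_sequence):
        self.expected = expected_sequence  # key actions in expert order
        self.events = []                   # (timestamp, action, target)
        self.action_counts = Counter()

    def record(self, action, target):
        """Called by the simulation on every user interaction."""
        self.events.append((time.time(), action, target))
        self.action_counts[action] += 1

    def latency(self):
        """Seconds between the first and the most recent event."""
        if len(self.events) < 2:
            return 0.0
        return self.events[-1][0] - self.events[0][0]

    def follows_expected_order(self):
        """Did the key actions seen so far occur in the expert order?"""
        seen = []
        for _, action, _ in self.events:
            if action in self.expected and action not in seen:
                seen.append(action)
        return seen == [a for a in self.expected if a in seen]

monitor = ClickstreamMonitor(["check_threat_info", "place_asset"])
monitor.record("check_threat_info", "threat_3")
monitor.record("place_asset", "sector_2")
print(monitor.latency(), monitor.follows_expected_order())
```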

13 Assessing the Process: Known Distance Coaching
[Screenshot: diagnosis display]

14 Assessing the Process: The Decision Analysis Tool
[Screenshot: create options, adjust parameters]

15 Assessing the Process: Air Defense Planning
- Click to check threat information
- Click to view threat/defense geometry
- Place assets (location and role assignment)
- Adjust defense geometry

16 Sensor-Based Assessment
- Pressure sensor on trigger: trigger squeeze
- Eye tracker: eye position data
- Motion sensor: muzzle movement
- Laser strike sensor
A wobble-scoring sketch for the motion sensor follows.
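As an illustration of turning raw motion-sensor output into a measure, here is a sketch that scores muzzle movement as the RMS deviation of accelerometer magnitude. Treating the sensor as a three-axis accelerometer and choosing this particular statistic are assumptions for illustration, not the project’s actual metric.

```python
import math

def muzzle_wobble(samples):
    """RMS deviation of acceleration magnitude over (ax, ay, az) samples.

    Higher values mean more muzzle movement. A hypothetical metric;
    see the note above.
    """
    if not samples:
        return 0.0
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    mean = sum(mags) / len(mags)
    return math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
```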

17 Sensor-Based Assessment
Neurophysiological measures of cognitive overload and anxiety, plus motor response measures:

Metric | Source | Usage
Cognitive overload (neurophysiological) | EEG | Indicator of how well the shooter is processing information and accommodating task demands. Prior work has demonstrated the utility and feasibility of EEG as a measure of cognitive overload.
Anxiety (neurophysiological) | Heartbeat | Measures the degree of stress experienced by the shooter.
Trigger break (motor response) | Switch | Establishes a synchronization point for all measures.
Trigger squeeze (motor response) | Pressure sensor | Examines the quality of the trigger squeeze (slow or rapid). Trigger control is considered a fundamental skill in marksmanship.
Muzzle wobble (motor response) | Accelerometer | Measures the degree of movement in the muzzle of the weapon.
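Since the table says the trigger-break switch establishes the synchronization point for all measures, here is a sketch of that alignment step. The (timestamp, value) stream format, the stream names, and the one-second window are assumptions.

```python
def align_to_trigger_break(streams, trigger_break_t, window=1.0):
    """Re-time each sensor stream relative to the trigger-break event.

    streams: dict mapping a stream name (e.g. 'eye_position',
             'muzzle_accel', 'trigger_pressure') to a list of
             (timestamp, value) samples.
    Returns the samples from the `window` seconds before the break,
    with timestamps expressed relative to the break.
    """
    return {
        name: [(t - trigger_break_t, v)
               for t, v in samples
               if trigger_break_t - window <= t <= trigger_break_t]
        for name, samples in streams.items()
    }
```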

18 Interpretation
- Modeling using ontologies
- Diagnosis using Bayes nets
- Artificial neural nets (current research)
- Hidden Markov models (current research)
- Prescription based on diagnosis and domain ontologies
A toy illustration of the HMM idea follows; the ontology and Bayes net ideas are sketched under the slides below.
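To make the HMM line concrete: one use of hidden Markov models over clickstreams is scoring an observed action sequence under competing models (say, expert vs. novice) with the forward algorithm. Everything below, including the states, probabilities, and action names, is an illustrative toy, not the current research itself.

```python
def forward_likelihood(obs, states, start_p, trans_p, emit_p):
    """P(observed sequence | model), via the standard forward algorithm."""
    alpha = {s: start_p[s] * emit_p[s][obs[0]] for s in states}
    for o in obs[1:]:
        alpha = {s: sum(alpha[prev] * trans_p[prev][s] for prev in states)
                    * emit_p[s][o]
                 for s in states}
    return sum(alpha.values())

# Toy two-state model: experts orient (check information) before acting.
# A parallel novice model would differ in these probabilities, and
# diagnosis would compare the two likelihoods.
STATES = ("orient", "act")
EXPERT = dict(
    start={"orient": 0.9, "act": 0.1},
    trans={"orient": {"orient": 0.5, "act": 0.5},
           "act":    {"orient": 0.2, "act": 0.8}},
    emit={"orient": {"check_threat": 0.8, "place_asset": 0.2},
          "act":    {"check_threat": 0.1, "place_asset": 0.9}},
)

clicks = ["check_threat", "check_threat", "place_asset"]
print(forward_likelihood(clicks, STATES, EXPERT["start"],
                         EXPERT["trans"], EXPERT["emit"]))
```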

19 Interpretation
Domain knowledge: rifle marksmanship

20 Interpretation
A domain ontology for rifle marksmanship: concepts and relations between concepts, stored in a database. A minimal database sketch follows.
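A minimal sketch of “concepts and relations between concepts, in a database,” using SQLite. The two-table schema, the relation names, and the particular prerequisite link are assumptions for illustration; the concepts themselves (rifle marksmanship, trigger control, sight picture) appear elsewhere in the deck.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE concept (id INTEGER PRIMARY KEY, name TEXT UNIQUE);
CREATE TABLE relation (
    source INTEGER REFERENCES concept(id),
    type   TEXT,                      -- e.g. 'part_of', 'prerequisite_of'
    target INTEGER REFERENCES concept(id)
);
""")

def add_concept(name):
    return conn.execute("INSERT INTO concept (name) VALUES (?)", (name,)).lastrowid

def relate(source_id, rel_type, target_id):
    conn.execute("INSERT INTO relation VALUES (?, ?, ?)",
                 (source_id, rel_type, target_id))

marksmanship    = add_concept("rifle marksmanship")
trigger_control = add_concept("trigger control")
sight_picture   = add_concept("sight picture")
relate(trigger_control, "part_of", marksmanship)
relate(sight_picture, "part_of", marksmanship)
relate(sight_picture, "prerequisite_of", trigger_control)  # hypothetical link

# Query: which concepts must be mastered before trigger control?
rows = conn.execute("""
    SELECT c.name FROM relation r JOIN concept c ON c.id = r.source
    WHERE r.type = 'prerequisite_of' AND r.target = ?
""", (trigger_control,)).fetchall()
print([name for (name,) in rows])  # ['sight picture']
```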

21 Diagnosis / Remediation System
[Block diagram] An instrumented indoor weapon system supplies measures of knowledge, fine motor processes, anxiety, background, and performance (shot group accuracy and precision). A model of knowledge dependencies and performance dependencies turns these into diagnosed knowledge and skill gaps; a recommender then uses the diagnosis and the domain ontology to select content. A recommender sketch follows.
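A sketch of the recommender step, assuming remediation content has been indexed by ontology concept. The catalog entries and gap names are hypothetical; the content types (definitions, procedures, pictures, videos) mirror the example remediation slide below.

```python
# Hypothetical content catalog keyed by ontology concept.
CONTENT = {
    "trigger control": ["video: slow, steady trigger squeeze",
                        "procedure: dry-fire drill"],
    "sight picture":   ["picture: correctly aligned sight picture",
                        "definition: sight picture"],
}

def recommend(diagnosed_gaps, content=CONTENT):
    """Map each diagnosed concept gap to its remediation items."""
    return {gap: content.get(gap, []) for gap in diagnosed_gaps}

print(recommend(["sight picture"]))
```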

22 Bayes Net Fragment: Sight Picture at Trigger Break
Evidence enters the fragment from the target, the trigger sensor, self-ratings, the eye tracker (eye position against the ideal position), and the motion sensor. A toy posterior computation follows.
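A hand-rolled toy version of such a fragment: a latent “good sight picture at trigger break” node updated by two of the evidence channels named above. All probabilities are invented for illustration and are not the calibrated model.

```python
LIKELIHOOD = {
    # P(sensor reads True | sight picture is good / bad) -- toy numbers
    "eye_on_target": {"good": 0.9, "bad": 0.3},  # from the eye tracker
    "muzzle_steady": {"good": 0.8, "bad": 0.4},  # from the motion sensor
}

def posterior_good(evidence, prior=0.6):
    """P(good sight picture at trigger break | sensor evidence).

    evidence: dict like {'eye_on_target': True, 'muzzle_steady': False}
    """
    weight = {"good": prior, "bad": 1.0 - prior}
    for channel, observed in evidence.items():
        for state in weight:
            p_true = LIKELIHOOD[channel][state]
            weight[state] *= p_true if observed else 1.0 - p_true
    return weight["good"] / (weight["good"] + weight["bad"])

# Eye on target but muzzle unsteady: the evidence roughly cancels here.
print(posterior_good({"eye_on_target": True, "muzzle_steady": False}))  # 0.6
```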

23 Example Remediation
- Feedback (knowledge of results)
- Definitions, concepts, procedures
- Pictures, videos, and audio

24 Some Issues for Discussion
- Are these appropriate uses of technology for assessment?
- Can you model student knowledge in a domain?
- Interaction of complexity and examinee experience
- Are clickstream data the outcome of meaningful cognitive events?
- Alternative methods for analyzing and interpreting complex data
- Construct-irrelevant variance (e.g., the technology itself)
- The cost
- Equivalence of tasks
- Level of fidelity

25 ©2007 Regents of the University of California

