1
Assessing the impact of computer problem solving coaches: preliminary steps
Qing Xu, Ken Heller, Leon Hsu, Andrew Mason, Anne Loyle-Langholz
University of Minnesota, Twin Cities
AAPT Summer 2011 Meeting, Omaha, NE
Supported by NSF DUE # and DUE # and by the University of Minnesota
2
Preliminary steps: getting ready for the implementation of the coaches. Two parts:
I. Baseline – characterizing how students perform without the coaches.
II. Usage – characterizing the students who use the coaches and how they use them.
3
I. Baseline
Introductory mechanics at the University of Minnesota, Spring 2011: 4 quizzes, 2 problems per quiz.
The problem-solving rubric was applied to a representative sample (38 of 108 students). The rubric assesses each written solution along 5 dimensions: Useful Description, Physics Approach, Specific Application of Physics, Mathematical Procedure, and Logical Progression.
Two expert raters each scored all 304 solutions (38 students × 8 problems).
To make sense of the data, students were grouped into 3 tiers determined by exam scores (tier 1 = high performing, tier 2 = medium performing, tier 3 = low performing); the 38 sampled students were distributed evenly across the tiers.
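A minimal sketch of how this tabulation could be reproduced, assuming the scored solutions are stored one row per (student, problem) with one column per rubric dimension. The file name, column names, and score scale are hypothetical; only the five dimensions, the three tiers, and the 38 × 8 = 304 count come from the slide.

```python
import pandas as pd

# Five rubric dimensions from the slide; the CSV layout is an assumption.
DIMENSIONS = ["Useful Description", "Physics Approach",
              "Specific Application of Physics",
              "Mathematical Procedure", "Logical Progression"]

# Hypothetical file: columns student, tier, problem, plus one column per dimension.
scores = pd.read_csv("rubric_scores.csv")

# Sanity check: 38 students x 8 quiz problems = 304 scored solutions.
assert len(scores) == 38 * 8

# Mean rubric score per tier for each dimension, averaged over the 8 problems.
by_tier = scores.groupby("tier")[DIMENSIONS].mean().round(2)
print(by_tier)
```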
4
Qualitative differences between tiers
Quiz 1 – representative solutions from each tier (tiers determined by exam scores)
5
Evolution of scores
6
What’s next? Increasing the sample size
E&M and fall-semester mechanics baselines; more students will decrease the error bars. (Fall semester rather than spring semester.)
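The sample-size point follows from the standard error of the mean,

\[
\mathrm{SE} = \frac{s}{\sqrt{N}},
\]

so roughly quadrupling the number of scored students halves the error bars on each tier's mean rubric score.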
7
II. Usage – Total time
Reasonable time for completion: 20–40 minutes per module.
Students tend to stay on task: only 2 of 18 had at least one break of more than 5 minutes on a question (Fall 2010 data).
Retention was high in the fall: 18 of 21 students completed all 15 coaches, and 2 others completed 12 of 15. In Spring 2011, 2 of 9 finished all 22 coaches.
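The break statistic implies click-level timing logs. A sketch of how such logs might be screened is below; the log format and function names are assumptions, and only the 5-minute break criterion comes from the slide.

```python
from datetime import timedelta

FIVE_MINUTES = timedelta(minutes=5)

def module_time_and_breaks(click_times):
    """Summarize one student's clicks within one coach module.

    click_times: chronologically sorted list of datetime objects.
    Returns (total elapsed time, number of gaps longer than 5 minutes),
    the 5-minute gap being the break criterion quoted on the slide.
    """
    if len(click_times) < 2:
        return timedelta(0), 0
    gaps = [later - earlier for earlier, later in zip(click_times, click_times[1:])]
    total = click_times[-1] - click_times[0]
    long_breaks = sum(1 for gap in gaps if gap > FIVE_MINUTES)
    return total, long_breaks
```

A student would then count as staying on task in a module if `long_breaks == 0` and the total falls in the 20–40 minute range quoted above.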
8
II. Usage – Automated time

Module                    Student 1 (s)   Student 2 (s)
ElevatorLamp              4               4.5
TrainStunt                3
EmergencyRamp
KineticSculpture
HawkandGoose
SunkenShips
SkateBoard                3.5             1.5
SpringTrain
Average automated time    3.7             3.1
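One reading of this table: clicks that require no decision give each student a characteristic "automated" response time, and its average is that student's baseline speed. A sketch of that averaging follows; the data structure is hypothetical, with only the module names, the two transcribed values, and the idea of a per-student average taken from the table.

```python
from statistics import mean

# Hypothetical automated click times (seconds) for one student, keyed by module.
automated_times = {
    "ElevatorLamp": 4.0,
    "SkateBoard": 3.5,
    # ... remaining modules were not transcribed
}

baseline = mean(automated_times.values())
print(f"Average automated time: {baseline:.1f} s")
```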
9
II. Usage – Normalize to the shortest clicks
10
II. Usage – Normalize to the shortest clicks
11
Summary http://groups.physics.umn.edu/physed
The problem-solving rubric does distinguish students at different levels of problem-solving skill.
Scores achieved by students in all tiers remain constant across all categories as a function of time.
Students found the coaches useful and treated them seriously.
POSTER: PST2C56, Tue 08/02, 6:00–6:45 PM (Kiewit Fitness Center Courts)
12
II. Usage – Student preference

Student preference for each type of coach:

Type     Most useful   2nd most useful   Least useful
Type 1   13            3                 1
Type 2                 9
Type 3   4             5                 8

Faculty tend to disagree with students (they found Type 1 tedious). One student initially preferred Type 1 but switched to Type 3 after gaining familiarity with the physics.
13–16
Qualitative differences between tiers (additional figure slides)
17–20
Evolution of scores (additional figure slides)
21
II. Usage – Normalize to the shortest clicks
The distribution of times (see the automated-time table above) suggests students are taking the tutors seriously. Median time = 4.58 s.
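The slide title suggests comparing each response time with the student's fastest, automated clicks. A plausible sketch of that comparison is below; the specific normalization (dividing each click's time by the student's average automated time) and all names are assumptions, with only the automated-time baseline and the 4.58 s median taken from the slides.

```python
from statistics import median

def normalize_clicks(click_times_s, automated_baseline_s):
    """Divide each click's elapsed time (seconds) by the student's
    average automated click time, e.g. 3.7 s for Student 1 above."""
    return [t / automated_baseline_s for t in click_times_s]

# Hypothetical times: ratios well above 1 mark clicks where the student
# spent real time deciding rather than clicking through automatically.
times = [4.1, 12.0, 35.5, 3.9, 60.2]
ratios = normalize_clicks(times, 3.7)
print(f"median time: {median(times):.2f} s, median ratio: {median(ratios):.2f}")
```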