Viewing AutoTutor as a step-based tutoring system CPI 494, April 2, 2009 Kurt VanLehn
Your reactions to using AutoTutor?
Ravi: Found artificial stupidity. Language processing good. General-to-specific flow good.
Sung Young: Depth of conversation; good script; natural. Switching to another topic didn't work, and that's good. Trying to mislead it generated plausible responses. Server died on "yes". Good for learning general logic/knowledge, but not for complex problem solving.
Andre: An empty answer + submit produces a shift in conversation; nicely natural. Did sometimes repeat questions/prompts. Can bottom it out by repeatedly submitting without text.
Nick: Hard to learn, but OK as assessment. When you jump ahead, it won't follow you; it wants to go at its own (slow) pace.
Darren: Impressive at first. Long, detailed answers were wasted on it. Eventually just gave it random words. Gamed it via keywording.
Robert: Using off-the-wall words got "try again." Highly descriptive answers were lost on it. The agent in the videos: bobbing randomly is annoying; wide eyes; low-cut dress. Eye tracking might reveal distraction. Speech was fine, but not for an hour.
Maria: (version without agent) Technology too old-fashioned. Sometimes could not submit an answer. Scrolling policy was annoying. (with agent) The lady is freaky; the dress was !; the male has a big head, small shoulders, and staring black eyes; mouth movement was not in sync with the text. Grammar and synonym handling could be improved; spelling correction not good enough. Voices not so good.
Javier: Impressed at the beginning; got started right away; no need for UI training. But boring after a while. One misspelled word can lead down a long path. (agent) The woman didn't blink at first / blinked too slowly. The proportions of the man vs. the woman seemed odd. They keep looking at each other.
Agent vs. no agent
Prefer the agent: 2 (but not these agents)
– Can be distracting. Not real enough.
– Lack of gestures on the prompts was noticeable.
– More human-like agents evoke expectations of humans.
– Redundant to have both speech/agent and text.
Prefer no agent: 4
In terms of step-based tutoring, what are AutoTutor's steps? They are not apparent to the student; AutoTutor elicits the steps in a specific order.
In terms of step-based tutoring, what feedback and hints exist for steps? Students noticed the specificity increasing across the hints.
A step-based tutoring system sometimes has a reflective debriefing: Does/should AutoTutor? When students bottomed it out, they got a review.
Does AutoTutor have an outer loop over tasks? Not a smart one; it just works down a list.
What model of the student does AutoTutor have? Not much of one: it tracks which tasks in the list have been done already, and it knows which aspects have been mentioned and thus do not need more elicitation.
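The outer loop and student model described above can be sketched in a few lines. This is an illustrative sketch only, not AutoTutor's actual code; the class and function names (`MinimalStudentModel`, `outer_loop`) are invented for the example.

```python
# Illustrative sketch (hypothetical names, not AutoTutor's implementation):
# an outer loop that "just works down a list" of tasks, plus a minimal
# student model that only records which tasks are done and which
# aspects (expectations) have already been mentioned.

class MinimalStudentModel:
    def __init__(self):
        self.completed_tasks = set()        # tasks already worked
        self.covered_expectations = set()   # aspects already mentioned

    def mark_task_done(self, task):
        self.completed_tasks.add(task)

    def cover(self, expectation):
        self.covered_expectations.add(expectation)

    def needs_elicitation(self, expectation):
        # An aspect already mentioned needs no further elicitation.
        return expectation not in self.covered_expectations


def outer_loop(tasks, model):
    """Pick the next task by simply walking down the list in order."""
    for task in tasks:
        if task not in model.completed_tasks:
            return task
    return None  # all tasks done


# Usage sketch:
model = MinimalStudentModel()
model.mark_task_done("task1")
next_task = outer_loop(["task1", "task2", "task3"], model)
# next_task is "task2": the first not-yet-completed task in list order
```

The point of the sketch is what is missing: task selection ignores mastery, difficulty, and the student's history, which is why the notes call it "not a smart one."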
Would it make sense for AutoTutor to repeat tasks? Repeating might make sense, especially when the intervening discussion might have introduced confusions. Similar tasks might be even better than identical ones, like instruction in speaking a second language, or (similar in spirit) pyramid drills in physical sports.
Principle-based schemas vs. problem-based schemas
Principle: For every force on A due to B, there is a force on B due to A that is equal and opposite.
– Hand pushing wall
– Gravity pulling earth