1
Research Methods Tutoring in the classroom
Peter Wiemer-Hastings, David Allbritton, Elizabeth Arnott, Oussama BenKhadra, Jesse Efron
2
Overview
- Background of AI in Education and Intelligent Tutoring Systems
- On human learning
- AutoTutor
- RMT
- Architecture
- Techniques
- Deployment
- Evaluation
3
Overview of AIEd
The goal: interact with the student to increase learning
The ideal system:
- Models domain knowledge
- Models student knowledge
- Adapts to the student
- Adapts its approaches to optimize pedagogy
4
Some influential systems
- Algebra tutor
- Project Listen
- Scenario-based interaction (e.g. Wolf)
- Inquiry-based learning
5
What affects learning?
- Motivation
- Being “constructive”
- Doing example problems
On the other hand: the Hawthorne effect
6
Bloom's hierarchy (1956)
- Knowledge: recall facts
- Comprehension: understanding info
- Application: ability to use info
- Analysis: see patterns, organization
- Synthesis: use old ideas to create new
- Evaluation: compare, discriminate, assess
7
Human tutors
- Much better than classroom practice: up to 2 standard deviations
- Rely on doing problems with their students
- Don’t actually have deep understanding of their students
- Use basic dialog moves: question, pump, prompt, hint, summarize
- Have a script of topics to cover
8
AutoTutor
- Started in 1997
- Models (non-expert) human tutors
- Talking head “incarnates” the tutor
- Uses the dialog moves above
- Evaluates student answers by comparing them to expected answers
9
Latent Semantic Analysis
- Take a large corpus of texts
- Make a matrix of words in documents (paragraphs)
- Weight by inverse frequency
- Do singular value decomposition to “re-orient” the dimensions and reduce the noise
- Use vectors to represent the “meaning” of words and texts
- Use cosine to compare meanings
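To make these steps concrete, here is a minimal LSA sketch in Python using numpy and scikit-learn. The toy corpus, the tf-idf weighting, and the number of reduced dimensions are illustrative assumptions, not the corpus or parameters actually used by AutoTutor or RMT.

```python
# Minimal LSA sketch (illustrative only, not the actual AutoTutor/RMT code).
# Builds a term-document matrix, applies inverse-frequency (tf-idf) weighting,
# reduces it with SVD, and compares two texts by cosine similarity.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# A toy corpus; in practice this would be a large collection of paragraphs.
corpus = [
    "independent variables are manipulated by the experimenter",
    "dependent variables are measured outcomes of an experiment",
    "reliability concerns the consistency of a measurement",
    "validity concerns whether a test measures what it claims to measure",
]

vectorizer = TfidfVectorizer()        # word-by-document matrix, idf-weighted
X = vectorizer.fit_transform(corpus)

svd = TruncatedSVD(n_components=3)    # "re-orient" the dimensions, drop noise
doc_vectors = svd.fit_transform(X)

def lsa_vector(text):
    """Project a new text into the reduced LSA space."""
    return svd.transform(vectorizer.transform([text]))[0]

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

student_answer = "the experimenter manipulates the independent variable"
expected_answer = "independent variables are manipulated by the experimenter"
print(cosine(lsa_vector(student_answer), lsa_vector(expected_answer)))
```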
10
RMT
- Follows the AutoTutor approach
- Re-implemented in Lisp
- Interacts via the internet
- Includes spell-checking and synonym substitution
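A small illustration of what spell-checking plus synonym substitution might look like as a preprocessing step before the LSA comparison. RMT uses aspell for spelling; the pure-Python correction below and the vocabulary and synonym table are assumptions made only for demonstration.

```python
# Illustrative sketch of response preprocessing: map synonyms to canonical
# terms, then spell-correct against a known vocabulary before LSA comparison.
# (RMT uses aspell for spelling; this toy version is an assumption.)
import difflib

VOCAB = {"independent", "dependent", "variable", "reliability", "validity", "consent"}
SYNONYMS = {"iv": "independent variable", "dv": "dependent variable",
            "permission": "consent"}

def preprocess(response: str) -> str:
    words = []
    for word in response.lower().split():
        word = SYNONYMS.get(word, word)            # canonicalize known synonyms
        if word not in VOCAB:
            match = difflib.get_close_matches(word, VOCAB, n=1, cutoff=0.8)
            if match:                              # likely a misspelled vocab word
                word = match[0]
        words.append(word)
    return " ".join(words)

print(preprocess("the experimenter manipulates the independant varible"))
# -> "the experimenter manipulates the independent variable"
```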
11
Architecture
- Dialog manager handles logins, providing tutor utterances, and collecting student responses
- Responses are evaluated with NLU components (LSA, aspell)
- Dialog Transition Network determines the tutor’s next response
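A highly simplified, self-contained sketch of the control flow this slide describes: the dialog manager presents a question, collects a response, asks the NLU layer to score it, and lets the transition network choose the next move. The function names, the word-overlap scorer, and the thresholds are hypothetical stand-ins, not the actual RMT (Lisp) implementation.

```python
# Toy sketch of the architecture's control flow (hypothetical names; the real
# system is a Lisp web application, and the "NLU" here is a trivial stand-in
# for the LSA + aspell components).
def nlu_evaluate(response: str, expected: str) -> float:
    """Stand-in for the NLU layer: crude word-overlap score in [0, 1]."""
    r, e = set(response.lower().split()), set(expected.lower().split())
    return len(r & e) / max(len(e), 1)

def datn_next_move(score: float) -> str:
    """Stand-in for the Dialog Transition Network's choice of next response."""
    if score >= 0.7:
        return "positive feedback, then summarize"
    if score >= 0.4:
        return "hint"
    return "prompt"

def dialog_manager(question: str, expected: str, student_responses) -> None:
    """Present a question, collect responses, and route them to NLU + DATN."""
    print("TUTOR:", question)
    for response in student_responses:     # stands in for input from the web UI
        print("STUDENT:", response)
        print("TUTOR move:", datn_next_move(nlu_evaluate(response, expected)))

dialog_manager(
    "What does the principle of justice imply about how research should be done?",
    "no group should have to take more risk than another group",
    ["I'm not sure", "no group should take more risk than another group"],
)
```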
14
DATN
- Graphical representation of the tutor’s behavior
- Directly controls the tutor’s choice of response
15
Curriculum Script
- 5 topics: ethics, types of studies, variables, reliability, validity
- Over 4000 lines
- Each item contains:
  - Question
  - (Picture)
  - Target good answers
  - Associated prompts, hints, questions
  - Bad answers
  - Summary
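For illustration, one curriculum-script item could be represented roughly like the data structure below. The field names follow the list above, but the class itself and the example content (drawn from the conceptual-dialog slide later in the deck) are assumptions; the real script is a 4000+ line file, not Python.

```python
# Illustrative data structure for one curriculum-script item, mirroring the
# fields listed above. The example content is invented for demonstration.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class CurriculumItem:
    question: str
    picture: Optional[str] = None        # optional image shown with the question
    good_answers: List[str] = field(default_factory=list)  # targets for LSA matching
    prompts: List[str] = field(default_factory=list)
    hints: List[str] = field(default_factory=list)
    followup_questions: List[str] = field(default_factory=list)
    bad_answers: List[str] = field(default_factory=list)   # common misconceptions
    summary: str = ""

item = CurriculumItem(
    question="What does the principle of justice imply about how research should be done?",
    good_answers=["No group should have to take more risk than another group.",
                  "The benefits of the research should be available to everyone."],
    hints=["Think about who bears the risks and who gets the benefits."],
    summary="Justice requires that the risks and benefits of research be distributed fairly.",
)
print(item.question)
```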
16
RM Curriculum
- Ethics of psychological experiments: privacy, informed consent, ethics code, costs versus benefits
- Types of studies: experiments and correlational (observational, archival, case studies, survey research)
- Variables: independent and dependent, operationalization, levels, causality, confounding, categorical and quantitative variables, scales
- Reliability: systematic and random error, test-retest, internal consistency, factors that affect reliability
- Validity: types of validity (internal, construct, external, statistical), threats to validity
17
Types of student tasks
- Conceptual: facts, definitions, issues
- Analytic: given a scenario, evaluate weaknesses or fill in missing slots
- Synthetic: given a task, create an approach (frame and slots)
18
Example conceptual dialog
Tutor: What does the principle of justice imply about how research should be done?
Student: No group should have to take more risk than another group.
Tutor: Good. What else?
Student: The benefits of the research should be available to everyone.
19
Analytical example
Tutor: I'm going to tell you about a research project someone conducted that has some ethical problems, and then you will tell me what the researcher should have done differently. OK? Here is how the project was done. Joe, a graduate student, did a study of high school students' attitudes about racism. He created a survey about racism and gave it out to 100 high school students. Before he gave out the survey, he got permission from his dissertation advisor and from the principal of the high school to give out an anonymous survey to the students. He also made sure each student read and signed a consent form that was attached to the survey. What should Joe have done differently to make the project comply with the requirements for ethical practices in research?
Student: Huh?
20
Synthetic example Now let's practice designing a study. I'll give you a hypothesis to test, and tell you what resources you have available. Then you design a study to test the hypothesis. Let's say that you want to find out whether reading a humorous greeting card causes people to be more likely to buy humorous magazines - that is your hypothesis. You have an agreement with a local magazine shop that allows you to use some of their customer purchase records for your research. … How will you set up the study?
21
Demo
22
Cool stuff on the way
- Shallow thinking traps
- Dialog chains
- Research supervisor scam
23
Leading students astray
- Failure to “get it right” strengthens learning
- Shallow thinking traps (STTs) in content material provide opportunities for failure
- Dialog chains (DCs) should get students back on track
- A DC has a trigger and a network of student responses and tutor questions
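As a rough sketch of that last point, a dialog chain might be modeled as a trigger (a phrase signaling that the student fell into a shallow thinking trap) plus a small network of tutor questions keyed by the kind of response the student gives next. The trigger phrase, labels, and questions below are hypothetical examples, not actual RMT content.

```python
# Hypothetical sketch of a dialog chain (DC): a trigger that detects a shallow
# thinking trap in the student's answer, plus a small network of follow-up
# tutor questions keyed by the student's next response.
from dataclasses import dataclass
from typing import Dict

@dataclass
class DialogChain:
    trigger: str             # phrase suggesting the student fell into the trap
    network: Dict[str, str]  # response label -> next tutor question

    def triggered_by(self, student_answer: str) -> bool:
        return self.trigger in student_answer.lower()

causation_dc = DialogChain(
    trigger="proves that",   # treating correlational data as proof of causation
    network={
        "start":    "Does a correlation by itself tell us which variable causes which?",
        "says_yes": "Could some third variable explain the relationship instead?",
        "says_no":  "Right. So what kind of study would let us make a causal claim?",
    },
)

answer = "The survey proves that watching TV causes aggression."
if causation_dc.triggered_by(answer):
    print(causation_dc.network["start"])
```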
24
Different tutoring style
- Traditional: the tutor (usually) asks the questions and the student (hopefully) answers
- Pseudo-real world: the tutor is disguised as a research supervisor
  - Asks questions to assess student knowledge (conceptual, analytic)
  - Assigns research tasks (analytic, synthetic)
  - Evaluates the student’s work performance
25
To the classroom
- Use in association with two core courses in Psychology next term
- Students use the system after they’ve covered the topic in the course
26
In-course evaluation
- Pilot testing on various aspects
- 2x2 design:
  - Tutor or static information
  - Traditional tutor or research supervisor
- Pretests and posttests
27
Pilot testing
Questions:
1. How easy/difficult was it for you to use the tutor? VERY EASY 1 - 2 - 3 - 4 - 5 - 6 VERY DIFFICULT
2. Was the tutor’s speech easy/difficult to understand?
3. How easy/difficult was the material presented by the tutor?
N. Did you feel like you may learn/may not learn something during a tutoring session?
28
Early results
In general, subjects disliked the computerized speech. Every subject in the "Miyako" condition mentioned her voice as something they did not like and something that could be improved.
"It was a female tutor with a male voice. Made it annoying."
"Try to use a less computerized voice."
"The animation was fine, but the woman's voice was annoying."
29
More results
In general, subjects liked the animation. The only problem seemed to be that Miyako appears on top of the response box, so subjects have to keep moving her back to the center of the screen. Merlin seemed to be the favorite.
"I liked his facial expressions."
"He was funny looking. Relieved stress in a way."
"Liked his outfit."
30
Even more
In general, subjects liked the feedback given and the hints and prompts, and felt that the level of the material was appropriate.
"It was easy because he asked a question and in case you got it wrong, he will help you to get to the right answer."
"The feedback was helpful."
"It didn't just give you the right answer if you got it wrong."
31
Some suggestions
- Keep questions on the screen in the dialog box in case the student misses what the tutor asks.
- Improve the voices of the animated characters.
- Move Miyako so that she appears in the center of the screen.
- Provide instructions at the beginning; a longer intro.
- More explanation of the right answer.
- Sometimes the tutor asks the question before giving the details of the experiment.
32
Things they liked:
- The feedback
- The hints and prompts
- Multiple chances to get the right answer
- Various ways of asking questions
- "Lead me to the right answer"
- The Merlin animation
33
Issues
- For the research supervisor, does text-only make more sense? If not, what head do we give it?
- Is the supervisor scenario realistic enough to motivate students?
- Will it work?