An introduction to intelligent interactive instructional systems

Presentation on theme: "An introduction to intelligent interactive instructional systems"— Presentation transcript:

1 An introduction to intelligent interactive instructional systems
Kurt VanLehn, ASU

2 Outline
- Tutoring systems
  - Step loop
    - User interface
    - Interpreting student steps
    - Suggesting good steps
    - Feedback and hints
  - Task selection
  - Assessment
  - Authoring and the software architecture
  - Evaluations
  - Dissemination
- Other interactive instructional systems

3 Intelligent “tutoring” system is a misnomer
Almost all are used as seatwork/homework coaches.
The instructor still:
- Lectures
- Leads whole-class, small-group, and lab activities
- Assigns grades; defends grades
- Can assign homework/seatwork problems, or delegate that to the tutoring system
The instructor no longer:
- Grades homework/seatwork
- Tests? For-profit web-based homework-grading services are growing rapidly.

4 If students enter only the answer, call it answer-based tutoring
[Figure: a geometry problem, “What is the value of x?”, with given angles 30°, 40°, and 45°; the student enters only the answer, x = 25°.]

5 If students enter steps that derive the answer, call it step-based tutoring
[Figure: the same problem, “What is the value of x?”, with angles 30°, 40°, and 45°.]
Steps:
- 40+30+y=180
- 70+y=180
- y=110
- x+45+y=180
- x+155=180
- x=180–155
- x=25 (the answer)

6 Def: Feedback is a comment on one of the student’s steps
[Figure: the triangle problem (“What is the value of x?”, angles 30°, 40°, 45°). The student has entered 40+30+y=180 and then y=250; the tutor comments “Oops! Check your arithmetic.” (OK)]

7 Feedback is often given as a hint sequence
[Figure: the same situation; the first hint is “Oops! Check your arithmetic.” (OK)]

8 Hints become more specific
[Figure: the same situation; the next hint is “You seem to have made a sign error.” (OK)]

9 Hints segue from commenting on the student’s step to suggesting a better step
[Figure: the same situation; the next hint is “Try taking a smaller step.” (OK)]

10 …and become more specific
[Figure: the same situation; the next hint is “Try doing just one arithmetic operation per step.” (OK)]

11 Def: A bottom-out hint is the last hint in the sequence, which tells the student what to enter
[Figure: the same situation; the bottom-out hint is “Enter 70+y=180, and keep going from there.” (OK)]
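The hint-sequence behavior can be sketched in a few lines. This is a minimal sketch, not any real tutor's code; the hint strings are the ones from the running example, and the function name is hypothetical.

```python
# Hypothetical hint sequence for one step, ordered from general to specific.
# The last hint is the bottom-out hint, which tells the student what to enter.
HINT_SEQUENCE = [
    "Oops! Check your arithmetic.",
    "You seem to have made a sign error.",
    "Try doing just one arithmetic operation per step.",
    "Enter 70+y=180, and keep going from there.",  # bottom-out hint
]

def next_hint(hints_given: int) -> str:
    """Return the next hint; once the bottom-out hint is reached, repeat it."""
    return HINT_SEQUENCE[min(hints_given, len(HINT_SEQUENCE) - 1)]
```

Each repeated failure (or help request) on the same step advances `hints_given`, so the tutor escalates toward the bottom-out hint rather than cycling.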

12 Def: A next-step help request is another way to start up a hint sequence
[Figure: the same triangle problem. The student enters 40+30+y=180 and then clicks “help”; the tutor responds “Try doing just one arithmetic operation per step.” (OK)]

13 Delayed (as opposed to immediate) feedback occurs when the solution is submitted
[Figure: the same triangle problem. The student works through a full (erroneous) solution: x+45+y=180; x+295=180; x=180–250; x=–70.]

14 Delayed (as opposed to immediate) feedback occurs when the solution is submitted
[Figure: the submitted solution (40+30+y=180; y=250; x+45+y=180; x+295=180; x=180–250; x=–70) with delayed feedback on two of its steps: “Oops! Check your arithmetic.” (OK) and “Can an angle measure be negative?” (OK)]

15 Both step-based tutors and answer-based tutors have a task loop
Repeat:
1. Tutor and/or student selects a task.
2. Tutor poses it to the student.
3. Student does the task and submits an answer.
   - If answer-based tutor, the student works offline.
   - If step-based tutor, the student works online, in the step loop: do step; get feedback/hints; repeat.
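The two loops can be sketched as follows; all function and callback names are hypothetical, not from any real tutor. An answer-based tutor is the degenerate case in which each task has a single "step", the answer itself.

```python
def run_step_based_tutor(tasks, get_steps, check_step):
    """Task loop over tasks; step loop over each task's steps.

    get_steps(task)        -> the student's sequence of step entries (hypothetical)
    check_step(task, step) -> True if the step is correct (hypothetical)
    """
    transcript = []
    for task in tasks:                      # task loop: pose each task
        for step in get_steps(task):        # step loop: do step; get feedback; repeat
            ok = check_step(task, step)
            transcript.append((task, step, "correct" if ok else "hint"))
    return transcript
```

For an answer-based tutor, `get_steps` would return just the final answer, so only the outer loop does real work.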

16 Technical terms/concepts (so far)
- Answer-based tutoring system (= CAI, CBI, …)
- Step-based tutoring system (= ITS, ICAI, …)
- Step
- Next-step help
- Feedback: immediate vs. delayed
- Hint sequence
- Bottom-out hint
- Task loop
- Step loop

17 Andes user interface
[Screenshot: the student reads a physics problem, draws vectors, types in equations, and types in the answer.]

18 Andes feedback and hints
[Screenshot: “What should I do next?” and “What’s wrong with that?” buttons; green entries mean correct, red mean incorrect; a dialogue pane shows hints.]

19 SQL-Tutor (Addison Wesley)
[Screenshot: the problem, several steps, a Submit button, feedback, and the database that the problem refers to.]

20 Cognitive Algebra I Tutor (Carnegie Learning)
[Screenshot: the problem and several kinds of steps: enter an equation, divide both sides, label a column, define an axis, fill in a cell, plot a point.]

21 AutoTutor
[Screenshot: the task. For example, an AutoTutor for Newtonian physics poses a question whose answer is about a paragraph of information; it may take 100 turns between student and tutor to answer it.]
Each tutor turn + student turn in the dialogue is a step; the student’s input is the second half of the step.

22 Introduction: Summary
Main ideas:
- Task loop over tasks
- Step loop over the steps of a task
- Feedback can be immediate or delayed, but it focuses on steps
- Hint sequence
Types of tutoring systems:
- Step-based tutors (ITS): both loops
- Answer-based tutors (CBT, CAI, etc.): task loop only

23 Initial framework
- Step loop
  - User interface
  - Interpreting student actions
  - Suggesting good actions
  - Feedback and hints
- Task selection
- Assessment
- Authoring and the software architecture
- Evaluations
- Dissemination

24 Initial framework
- Step loop
  - User interface: forms with boxes to be filled; dialogue; simulation; etc.
  - Interpreting student steps
  - Suggesting good steps
  - Feedback and hints
- Task selection
- Assessment
- Authoring and the software architecture
- Evaluations
- Dissemination

25 Initial framework
- Step loop
  - User interface
  - Interpreting student steps: equations; typed natural language; actions in a simulation; etc.
  - Suggesting good steps
  - Feedback and hints
- Task selection
- Assessment
- Authoring and the software architecture
- Evaluations
- Dissemination

26 Initial framework
- Step loop
  - User interface
  - Interpreting student steps
  - Suggesting good steps: any correct path vs. the shortest path to the answer? Which steps can be skipped? Recognize the student’s plan and suggest its next step? Etc.
  - Feedback and hints
- Task selection
- Assessment
- Authoring and the software architecture
- Evaluations
- Dissemination

27 Initial framework
- Step loop
  - User interface
  - Interpreting student steps
  - Suggesting good steps
  - Feedback and hints: give a hint before the student attempts a step? Immediate vs. delayed feedback? Feedback on request? How long a hint sequence? When to bottom out immediately? Etc.
- Task selection
- Assessment
- Authoring and the software architecture
- Evaluations
- Dissemination

28 Initial framework
- Step loop
  - User interface
  - Interpreting student steps
  - Suggesting good steps
  - Feedback and hints
- Task selection: keeping the student in the zone of proximal development (ZPD); mastery learning (keep giving similar tasks until the student masters them); choosing a task that suits the learner’s style/attributes; etc.
- Assessment
- Authoring and the software architecture
- Evaluations
- Dissemination

29 Initial framework
- Step loop
  - User interface
  - Interpreting student steps
  - Suggesting good steps
  - Feedback and hints
- Task selection
- Assessment (next)
- Authoring and the software architecture
- Evaluations
- Dissemination

30 Assessment vs. evaluation
“Assessment” of students:
- What does the student know?
- How motivated/interested is the student?
“Evaluation” of instructional treatments:
- Was the treatment implemented as intended?
- Did it produce learning gains in most students?
- Did it produce motivation gains in most students?
- What is the time cost? Other costs?

31 Assessment consists of fitting a model to data about the student
Single-factor model: a single number representing competence/knowledge.
- Probability of a correct answer on a test item = f(competence(student), difficulty(item))
Knowledge-component model: one number per knowledge component (KC) representing its mastery.
- Probability of a correct answer on a test item = f(mastery(KC1), mastery(KC2), mastery(KC3), …), where the KCn are the ones applied in a correct solution
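One common way to make the two f's concrete (an assumption on my part, not necessarily the model any particular tutor uses) is a Rasch/1PL logistic model for the single-factor case, and a conjunctive model for the knowledge-component case, in which a correct answer requires every applied KC to be executed correctly:

```python
import math

def p_correct_single_factor(competence: float, difficulty: float) -> float:
    """Rasch/1PL: probability rises with competence, falls with item difficulty."""
    return 1.0 / (1.0 + math.exp(-(competence - difficulty)))

def p_correct_kc(masteries: list) -> float:
    """Conjunctive KC model: each mastery is the probability that KC fires
    correctly; the answer is correct only if all applied KCs do."""
    p = 1.0
    for m in masteries:
        p *= m
    return p
```

Note the qualitative difference: the single-factor model only says how hard the item is overall, while the KC model attributes success or failure to the specific skills the item exercises.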

32 Example: Answer-based assessment of algebraic equation-solving skill
Test item: Solve 3+2x=10 for x
- KC5: Subtract from both sides & simplify: 3+2x=10 → 2x=7
- KC8: Divide both sides & simplify: 2x=7 → x=3.5
Single-factor model: if the answer is correct, increment competence; else decrement it.
Knowledge-component model:
- If the answer is correct, increment mastery of KC5 & KC8.
- If the answer is incorrect, decrement mastery of KC5 & KC8; the weakest one is the most likely failure, so decrement it more.
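The knowledge-component update for an answer-based tutor might be sketched like this; the ±0.1 step size and the doubled decrement for the weakest KC are arbitrary assumptions, chosen only to illustrate the scheme:

```python
def update_after_answer(mastery: dict, applied_kcs: list, correct: bool,
                        step: float = 0.1) -> None:
    """Update mastery estimates for the KCs applied in a correct solution.

    If the answer was wrong, the weakest applied KC is the most likely
    point of failure, so it is decremented twice as much (an assumption).
    Mastery values are clamped to [0, 1].
    """
    if correct:
        for kc in applied_kcs:
            mastery[kc] = min(1.0, mastery[kc] + step)
    else:
        weakest = min(applied_kcs, key=lambda kc: mastery[kc])
        for kc in applied_kcs:
            extra = step if kc == weakest else 0.0
            mastery[kc] = max(0.0, mastery[kc] - step - extra)
```

On the slide's item, a wrong answer would decrement both KC5 and KC8, with the extra penalty landing on whichever currently has the lower mastery estimate.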

33 Step-based assessment of algebraic equation-solving skill
Solve 3+2x=10 for x
- Step 1: 2x = 7
- Step 2: x = 3.5
Single-factor model: whenever a step is answered correctly without hints, increment competence; else decrement it.
Knowledge-component model: whenever a step is answered correctly without hints, increment its KC’s mastery; else decrement it.

34 Task selection uses assessments
Single-factor model: choose a task at the right level of difficulty, i.e., in the student’s zone of proximal development (ZPD).
Knowledge-component model: choose a task whose solution uses mostly mastered KCs, and only a few KCs that still need to be mastered.
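A KC-based task selector in the spirit of the slide might look like this; the mastery threshold of 0.8 and the target of one unmastered KC per task are assumptions, and the function name is hypothetical:

```python
def pick_task(tasks_to_kcs: dict, mastery: dict, threshold: float = 0.8,
              target_unmastered: int = 1):
    """Choose the task whose solution uses mostly mastered KCs and a small
    number (ideally `target_unmastered`) of KCs still below `threshold`."""
    def unmastered_count(task):
        return sum(1 for kc in tasks_to_kcs[task]
                   if mastery.get(kc, 0.0) < threshold)
    # Prefer the task closest to the target number of unmastered KCs:
    # zero unmastered KCs teaches nothing, too many overwhelms the student.
    return min(tasks_to_kcs, key=lambda t: abs(unmastered_count(t) - target_unmastered))
```

This is one simple way to operationalize "in the ZPD": the chosen task stretches the student on a small number of skills rather than none or many.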

35 Other assessment issues
- What other decisions, besides task selection, can use assessment?
- Assessment of motivation or interest?
- Assessment of learning styles? Disabilities?
- Diagnosis of misconceptions? Bugs?
[Figure: column-subtraction problems (e.g., 87 − 19) illustrating a buggy subtraction procedure.]

36 Should a “Skillometer” display knowledge-component mastery to the student?

37 Initial framework
- Step loop
  - User interface
  - Interpreting student steps
  - Suggesting good steps
  - Feedback and hints
- Task selection
- Assessment
- Authoring and the software architecture (next)
- Evaluations
- Dissemination

38 Authoring
The author creates new tasks:
- Does the author generate all solutions, or does the system? If the system, does it have the same taste as the author?
- Can the author add new problem-solving knowledge?
Who can be an author?
- Instructors? Professional authors? Knowledge engineers?

39 Software architecture & engineering
- Client-server issues
- Platform independence
- Integration with learning management systems (e.g., Blackboard, WebAssign, many others)
- Cheating, privacy
- Quality assurance: software bugs; content & pedagogy bugs

40 Initial framework
- Step loop
  - User interface
  - Interpreting student steps
  - Suggesting good steps
  - Feedback and hints
- Task selection
- Assessment
- Authoring and the software architecture
- Evaluations (next)
- Dissemination

41 Types of evaluations
Analyses of expert human tutors:
- What do they do that the system should emulate?
Formative evaluation:
- What behaviors of the system need to be fixed?
- Have students talk aloud; interviews; teachers…
Summative evaluation:
- Is the system more effective than what it replaces?
- Two-condition experiment: system vs. control/baseline
- Pre-test and post-test (+ other assessments)
Hypothesis testing:
- Why is the system effective?
- Multi-condition experiments: system ± feature(s)

42 Example: Summative evaluation of the Andes physics tutor
- University physics (mechanics), 1 semester
- 2 conditions: homework done with the Andes physics tutor vs. with pencil & paper
- Same teachers (sometimes), text, exams, labs
Results, in terms of effect sizes:
- Experimenter’s post-test: d = 1.2
- Final exam: d = 0.3
where d = (mean_Andes_score − mean_control_score) ÷ pooled_standard_deviation
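The d on this slide is Cohen's d. A short sketch using the usual pooled standard deviation (whether the Andes analyses used exactly this pooling is an assumption):

```python
import math

def cohens_d(mean_exp: float, mean_ctl: float,
             sd_exp: float, sd_ctl: float,
             n_exp: int, n_ctl: int) -> float:
    """Effect size: mean difference divided by the pooled standard deviation."""
    pooled_var = ((n_exp - 1) * sd_exp ** 2 + (n_ctl - 1) * sd_ctl ** 2) \
                 / (n_exp + n_ctl - 2)
    return (mean_exp - mean_ctl) / math.sqrt(pooled_var)
```

By the usual rule of thumb, the post-test result (d = 1.2) is a large effect, while the final-exam result (d = 0.3) is a small-to-medium one.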

43 Aptitude-treatment interaction (ATI)
[Table: learning gains under control instruction vs. experimental instruction. Good learners (e.g., highly motivated, well prepared) show large gains; poor learners show small gains.]

44 Open-response, problem-solving exam scores
[Figure: exam score plotted against grade-point average, with separate lines for Andes and Control students.]

45 Ideal tutoring system adapts to the student’s needs
[Figure: assistance provided (low to high) vs. assistance needed. An ideal tutor matches the assistance provided to the assistance needed, yielding large learning gains; too much assistance leaves students bored & irritated, too little leaves them struggling.]
Assistance provided = task selection, feedback, hints, user interface, …

46 Not-so-good tutoring system helps only some students
[Figure: the same axes; a not-so-good tutor matches assistance to need for only some students, leaving the rest bored & irritated or struggling.]
Assistance provided = task selection, feedback, hints, user interface, …

47 Initial framework
- Step loop
  - User interface
  - Interpreting student steps
  - Suggesting good steps
  - Feedback and hints
- Task selection
- Assessment
- Authoring and the software architecture
- Evaluations
- Dissemination (next)

48 Dissemination = getting the system into widespread use
Routes:
- Post and hope
- Open source
- Commercialization
Issues:
- Instructor acceptance; instructor training
- Student acceptance
- Marketing

49 Outline
- Tutoring systems
  - Step loop
    - User interface
    - Interpreting student steps
    - Suggesting good steps
    - Feedback and hints
  - Task selection
  - Assessment
  - Authoring and the software architecture
  - Evaluations
  - Dissemination
- Other interactive instructional systems (next)

50 Other intelligent interactive instructional systems
- Teachable agent: the student deliberately teaches the system, which is then assessed (in public)
- Learning companion: the system encourages the student as the student works
- Peer learner: the student and the system work & learn together
- To be discovered…

