ILMDA: An Intelligent Learning Materials Delivery Agent and Simulation Leen-Kiat Soh, Todd Blank, L. D. Miller, Suzette Person Department of Computer Science and Engineering University of Nebraska, Lincoln, NE {lksoh, tblank, lmille, sperson}@cse.unl.edu
Introduction Traditional Instruction (illustrative images of a lecturer and an old textbook)
Introduction Intelligent Tutoring Systems – Interact with students – Model student behavior – Decide which materials to deliver – All ITSs are adaptive; only some learn
Related Work Intelligent Tutoring Systems – PACT, ANDES, AutoTutor, SAM These lack machine learning capabilities – They generally do not adapt to new circumstances – Do not self-evaluate and self-configure their own strategies – Do not monitor the usage history of content presented to students
Project Framework Learning material components – A tutorial – A set of related examples – A set of exercise problems
Project Framework Underlying agent assumptions – A student’s behavior is a good indicator of how well the student understands the topic in question – It is possible to determine the extent to which a student understands the topic by presenting different examples
Methodology ILMDA System – Graphical user interface front-end – MySQL database backend – ILMDA reasoning in between
Methodology Overall methodology
Methodology Flow of operations Under the hood – Case-based reasoning – Machine Learning – Fuzzy Logic Retrieval – Outcome Function
Learner Model Student Profiling
– Student background: relatively static; first and last name, major, GPA, interests, etc.
– Student activity: real-time behavior and patterns; average number of mouse clicks, time spent in the tutorial, number of quits after the tutorial, number of successes, etc.
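The two-part learner model could be represented as a pair of simple records. A minimal Python sketch follows (the actual ILMDA implementation is a Java applet backed by MySQL; the class and field names here are illustrative assumptions):

```python
from dataclasses import dataclass, field

@dataclass
class StudentBackground:
    """Relatively static profile data."""
    first_name: str
    last_name: str
    major: str
    gpa: float
    interests: list = field(default_factory=list)

@dataclass
class StudentActivity:
    """Real-time behavior and patterns, aggregated across sessions."""
    avg_mouse_clicks: float = 0.0
    time_in_tutorial_ms: int = 0
    quits_after_tutorial: int = 0
    successes: int = 0
```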
Case-based reasoning Each case contains a problem description and solution parameters The casebase is maintained separately from the examples and problems For each student, the agent chooses the example or problem whose solution parameters are most similar
Solution Parameters
– TimesViewed: the number of times the case has been viewed
– DiffLevel: the difficulty level of the case, between 0 and 10
– MinUseTime: the shortest time, in milliseconds, a single student has viewed the case
– MaxUseTime: the longest time, in milliseconds, a single student has viewed the case
– AveUseTime: the average time, in milliseconds, students have spent viewing the case
– Bloom: Bloom's Taxonomy number
– AveClick: the average number of clicks the interface has recorded for this case
– Length: the number of characters in the course content for this case
– Content: the stored list of interests for this case
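To make retrieval concrete, here is a minimal Python sketch of a case carrying the solution parameters above, with a weighted nearest-neighbor lookup over the casebase. The weights and the plain L1 distance are illustrative assumptions, standing in for ILMDA's fuzzy-logic retrieval:

```python
from dataclasses import dataclass

@dataclass
class Case:
    times_viewed: int
    diff_level: float    # 0..10
    min_use_time: int    # milliseconds
    max_use_time: int
    ave_use_time: float
    bloom: int           # Bloom's Taxonomy number
    ave_click: float
    length: int          # characters of course content
    content: list        # stored list of interests

# Illustrative weights over a subset of the numeric parameters.
WEIGHTS = {"diff_level": 3.0, "bloom": 2.0, "ave_click": 1.0, "ave_use_time": 1.0}

def distance(case: Case, target: dict) -> float:
    """Weighted L1 distance between a case and a target description."""
    return sum(w * abs(getattr(case, name) - target[name])
               for name, w in WEIGHTS.items())

def retrieve(casebase: list, target: dict) -> Case:
    """Return the stored case most similar to the target description."""
    return min(casebase, key=lambda c: distance(c, target))
```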
Adaptation Heuristics Adapt the solution parameters of the old case – Based on the difference between the problem descriptions of the old and new cases – Each heuristic is weighted and responsible for one solution parameter – Heuristics are implemented in a rulebase, which adds flexibility to our design
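A sketch of the rulebase idea in Python: each heuristic owns exactly one solution parameter and nudges it by a weighted difference between the old and new problem descriptions. The rule bodies and the "ability"/"speed" description fields are hypothetical placeholders, not ILMDA's actual heuristics:

```python
# Each heuristic adapts exactly one solution parameter of the retrieved case.
# Keeping the rules in a table lets new heuristics be registered without
# touching the adaptation loop -- the flexibility the slide refers to.

def adapt_diff_level(old_value, old_desc, new_desc, weight=0.5):
    # Hypothetical rule: shift difficulty toward the new student's level.
    return old_value + weight * (new_desc["ability"] - old_desc["ability"])

def adapt_ave_click(old_value, old_desc, new_desc, weight=0.3):
    # Hypothetical rule: scale expected clicks by relative student speed.
    return old_value * (1 + weight * (new_desc["speed"] - old_desc["speed"]))

RULEBASE = {
    "diff_level": adapt_diff_level,
    "ave_click": adapt_ave_click,
}

def adapt(solution: dict, old_desc: dict, new_desc: dict) -> dict:
    """Apply every registered heuristic to the parameter it owns."""
    return {
        name: RULEBASE.get(name, lambda v, *_: v)(value, old_desc, new_desc)
        for name, value in solution.items()
    }
```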
Simulated Annealing Used when the adaptation process selects an old case that has repeatedly led to unsuccessful outcomes Rather than remove the old case, SA is used to refresh its solution parameters
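One reading of the refresh step, as a minimal sketch: perturb the stale case's solution parameters and accept worse configurations with a temperature-controlled probability, so the case can escape the region that kept producing failures. The cooling schedule and the scalar badness() score are assumptions:

```python
import math
import random

def refresh(params: dict, badness, steps: int = 100, t0: float = 1.0) -> dict:
    """Simulated-annealing refresh of a repeatedly unsuccessful case.

    params  -- current solution parameters (name -> float)
    badness -- hypothetical function scoring how poorly params perform
    """
    current, current_cost = dict(params), badness(params)
    for step in range(steps):
        temp = t0 * (1 - step / steps) + 1e-9   # linear cooling schedule
        candidate = {k: v + random.gauss(0, 0.1 * abs(v) + 0.1)
                     for k, v in current.items()}
        delta = badness(candidate) - current_cost
        # Always accept improvements; accept worse moves with prob e^(-delta/T).
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current, current_cost = candidate, current_cost + delta
    return current
```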
Implementation End-to-end ILMDA – Applet-based GUI front-end – CBR-powered agent – Backend database system ILMDA simulator
Simulator Consists of two distinct modules – Student Generator Creates virtual students Nine different student types based on aptitude and speed – Outcome Generator Simulates student interactions and outcomes
Student generator Creates virtual students – Generates all student background values, such as names, GPAs, interests, etc. – Generates the activity profile, such as average time spent per session and average number of mouse clicks, using a Gaussian distribution
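A minimal sketch of the Gaussian sampling described above, using Python's random.gauss; the per-type means and standard deviations are invented for illustration:

```python
import random

# Hypothetical per-type parameters: (mean, std dev) for each activity field.
STUDENT_TYPES = {
    "fast_high_aptitude": {"session_time_s": (300, 60),  "mouse_clicks": (12, 3)},
    "slow_low_aptitude":  {"session_time_s": (900, 120), "mouse_clicks": (30, 8)},
    # ... seven more of the nine aptitude-by-speed types
}

def generate_student(student_type: str) -> dict:
    """Sample one virtual student's activity profile for the given type."""
    spec = STUDENT_TYPES[student_type]
    return {
        field: max(0.0, random.gauss(mean, std))  # clamp: no negative times/clicks
        for field, (mean, std) in spec.items()
    }
```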
Outcome Generator Simulates student interactions and outcomes – Determines the time spent and the number of clicks for one learning material – Also determines whether a virtual student quits the learning material and whether it answers successfully
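A hedged sketch of the outcome generator's role: given a virtual student and a learning material, produce time spent, clicks, a quit decision, and success. The probability model below is an assumption for illustration, and the "aptitude" field is hypothetical:

```python
import random

def simulate_interaction(student: dict, material: dict) -> dict:
    """Simulate one virtual student working through one learning material.

    student  -- activity profile, e.g. from the student generator
    material -- solution parameters, e.g. diff_level in 0..10
    """
    difficulty = material["diff_level"] / 10.0   # normalize to 0..1
    aptitude = student.get("aptitude", 0.5)      # hypothetical field

    time_spent = random.gauss(student["session_time_s"], 30)
    clicks = random.gauss(student["mouse_clicks"], 2)
    quit_early = random.random() < 0.2 + 0.3 * difficulty * (1 - aptitude)
    success = (not quit_early) and random.random() < aptitude * (1 - 0.5 * difficulty)

    return {"time_spent_s": max(0.0, time_spent),
            "clicks": max(0.0, clicks),
            "quit": quit_early,
            "success": success}
```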
Simulation 900 students, 100 from each type – Step 1: 1000 iterations with no learning – Step 2: 100 iterations with learning – Step 3: 1000 iterations again with no learning Results – Between Steps 1 and 3, average problem scores increased from 0.407 to 0.568 – Between Steps 1 and 3, the number of examples given increased twofold
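The three-step protocol could be driven by a loop like the following sketch; agent.set_learning() and agent.ilmda_step() are hypothetical stand-ins for the agent's actual interface:

```python
import random

def run_experiment(agent, students, iters_off=1000, iters_on=100):
    """Three-phase run: baseline, learning enabled, post-learning baseline."""
    phases = {"step1": (False, iters_off),
              "step2": (True, iters_on),
              "step3": (False, iters_off)}
    results = {}
    for name, (learning, iters) in phases.items():
        agent.set_learning(learning)                          # hypothetical toggle
        scores = [agent.ilmda_step(random.choice(students))   # hypothetical API
                  for _ in range(iters)]
        results[name] = sum(scores) / len(scores)
    # Compare results["step1"] vs results["step3"] to measure learning gains.
    return results
```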
Future Work Deploy the ILMDA system in the introductory CS core course – Fall 2004 (done) – Spring 2005 (done) – Fall 2005 Add fault determination capability – Determine whether the student, the agent reasoning, or the content is at fault
Questions
Responses I Bloom's Taxonomy (Cognitive)
– Knowledge: recall of data.
– Comprehension: understand the meaning, translation, interpolation, and interpretation of instructions and problems; state a problem in one's own words.
– Application: use a concept in a new situation, or make unprompted use of an abstraction; apply what was learned in the classroom to novel situations in the workplace.
– Analysis: separate material or concepts into component parts so that the organizational structure may be understood; distinguish between facts and inferences.
– Synthesis: build a structure or pattern from diverse elements; put parts together to form a whole, with emphasis on creating a new meaning or structure.
– Evaluation: make judgments about the value of ideas or materials.
Source: http://www.nwlink.com/~donclark/hrd/bloom.html
Responses II Outcome function (example or problem) – Ranges from 0 to 1 – Quitting at the tutorial or example results in an outcome of 0 – Otherwise, compare the student's average clicks and times with those recorded for the example or problem
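A minimal sketch consistent with this description: a quit scores 0; otherwise the score in [0, 1] reflects how closely the student's clicks and time match the averages recorded for the example or problem. The exact comparison below is an assumption:

```python
def outcome(quit_early: bool, student_clicks: float, student_time: float,
            ave_clicks: float, ave_time: float) -> float:
    """Outcome in [0, 1]; quitting at the tutorial or example scores 0."""
    if quit_early:
        return 0.0
    # Hypothetical comparison: penalize relative deviation from the
    # material's recorded averages, capped so the result stays in [0, 1].
    click_dev = abs(student_clicks - ave_clicks) / max(ave_clicks, 1.0)
    time_dev = abs(student_time - ave_time) / max(ave_time, 1.0)
    return max(0.0, 1.0 - 0.5 * (click_dev + time_dev))
```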