Does Conjunctive Knowledge Tracing Provide Leverage to the Temporal and Location Heuristics in Error Attribution?
Adaeze Nwaigwe, University of Maryland University College
August 4, 2012

Introduction

Previously, we proposed, implemented, and evaluated four computational models for making error attributions in Intelligent Tutoring Systems (Nwaigwe et al., 2007):
- two location-based models, and
- two temporal-based models.

Basis for the models:
- whether the attribution was driven by interface location, and whether or not the student model in the Intelligent Tutor was used;
- whether the attribution was driven by the temporal ordering of events, and again, whether or not the student model in the Intelligent Tutor was used.

Error Attribution Heuristics

EA heuristics:
- Location-based: student model (SM) used / SM not used
- Temporal-based: SM used / SM not used

Doing Error Attribution…

In applying each heuristic to an error transaction, our simplified approach was to apportion blame uniformly to all knowledge components (KCs) needed to generate a successful outcome. However, not all KCs may have been to blame. In the Andes logs, a significant proportion of the time, multiple KCs are needed to generate a single correct step.
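A minimal sketch of this uniform blame apportionment (the KC names below are hypothetical, for illustration only):

```python
def uniform_blame(kcs):
    """Simplified approach: split blame equally across all KCs
    needed to generate the correct step."""
    weight = 1.0 / len(kcs)
    return {kc: weight for kc in kcs}

# A hypothetical Andes-style step needing two KCs gets 0.5 blame on each,
# even if only one of them was actually at fault.
print(uniform_blame(["write-net-force-equation", "define-force-variable"]))
```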

Current Research

The simplified approach (Standard Knowledge Tracing) may cause problem-selection thrashing in an Intelligent Tutor where multiple KCs are required to produce a single response on difficult problems (Koedinger et al., 2011). Conjunctive Knowledge Tracing (CKT) in blame assignment has shown promise (Koedinger et al., 2011) in:
- reducing problem-selection thrashing in a Geometry Cognitive Tutor, and
- improving future task selection, thereby saving students' time.

Our Goals… 1

To investigate whether the four heuristics initially proposed will gain leverage, in terms of the quality of their cognitive models, when combined with CKT.

Our Goals… 2

Since our previous study showed the simple location heuristic to be superior, we wish to see whether combining CKT with the simple location heuristic, versus using the simple location heuristic alone, will make better use of the student's time and will result in:
- an improved inference-procedure update of the student model,
- better future task selection, and
- enhanced student learning.

We intend to conduct our study in a Physics Intelligent Tutor.

Standard Knowledge Tracing

P(Know-KC1 | Error) = P(Error | Know-KC1) * P(Know-KC1) / P(Error)
                    = S * K1 / [K1 * S + (1 - K1) * (1 - G)]  ……. (1)

where
- K1 = P(Know-KC1) = probability of knowing KC1,
- G = probability that the student is correct when they do not know the KC,
- S = probability that the student is incorrect even though they know the KC.

Eq. (1) derives from conditional probability: P(A|B) = P(B|A) * P(A) / P(B).
As seen from eq. (1), SKT blames all KCs equally. This approach is too simplistic (Koedinger et al., 2011).
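A minimal sketch of the standard KT update in eq. (1), with illustrative parameter values of our own choosing (not taken from the paper):

```python
def skt_know_given_error(k, slip, guess):
    """Standard KT, eq. (1): posterior P(Know-KC | Error) via Bayes' rule.
    P(Error | Know) = slip; P(Error | not Know) = 1 - guess."""
    p_error = k * slip + (1 - k) * (1 - guess)
    return slip * k / p_error

# Illustrative values: prior K1 = 0.6, S = 0.1, G = 0.2.
# Under SKT, every KC on a failed step receives this same downgrade.
print(round(skt_know_given_error(0.6, 0.1, 0.2), 3))  # 0.158
```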

Conjunctive Knowledge Tracing

For a step requiring two KCs:

P(Error | Know-KC1) = S1 + [K2*S2 + (1 - K2)(1 - G2)] - S1 * [K2*S2 + (1 - K2)(1 - G2)]  …. (2)

P(Error) = 1 - P(Correct) = 1 - [K1(1 - S1) + (1 - K1)G1] * [K2(1 - S2) + (1 - K2)G2]  …. (3)

Thus,

P(Know-KC1 | Error) = P(Error | Know-KC1) * P(Know-KC1) / P(Error)
                    = {S1 + [K2*S2 + (1 - K2)(1 - G2)] - S1 * [K2*S2 + (1 - K2)(1 - G2)]} * K1
                      / {1 - [K1(1 - S1) + (1 - K1)G1] * [K2(1 - S2) + (1 - K2)G2]}  …….. (4)

CKT uses eq. (4), which takes into account that the student's error may have come from KC1, KC2, or both. Eq. (4) can be generalized to more than two KCs.
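A sketch of the two-KC conjunctive update in eqs. (2)–(4), again with illustrative parameters of our own choosing:

```python
def ckt_know_kc1_given_error(k1, s1, g1, k2, s2, g2):
    """Conjunctive KT for a step requiring KC1 and KC2 (eqs. 2-4):
    posterior P(Know-KC1 | Error)."""
    fail_kc2 = k2 * s2 + (1 - k2) * (1 - g2)              # P(KC2 part of the step fails)
    p_error_given_know1 = s1 + fail_kc2 - s1 * fail_kc2   # eq. (2)
    p_correct = (k1 * (1 - s1) + (1 - k1) * g1) * (k2 * (1 - s2) + (1 - k2) * g2)
    p_error = 1 - p_correct                               # eq. (3)
    return p_error_given_know1 * k1 / p_error             # eq. (4)

# Illustrative values: a well-known KC (prior 0.9) paired with a weak KC (prior 0.3),
# slips 0.1 and guesses 0.2 throughout.
strong = ckt_know_kc1_given_error(0.9, 0.1, 0.2, 0.3, 0.1, 0.2)  # posterior for the strong KC
weak = ckt_know_kc1_given_error(0.3, 0.1, 0.2, 0.9, 0.1, 0.2)    # posterior for the weak KC
print(round(strong, 2), round(weak, 2))  # ~0.86 vs ~0.12
```

In this sketch, the better-known KC is downgraded only slightly while the weak KC absorbs most of the blame, whereas under SKT both would have received the same eq. (1) downgrade.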

What we intend to do….

- Use CKT to compute KC probabilities.
- Use those probabilities to assign blame.
- Measure the quality of:
  - the resulting cognitive model,
  - the inference-procedure update of the student model,
  - future task selection, and
  - student learning.

Thank you!! Questions???