Design of Adaptive Systems for Computer-based Learning

Presentation transcript:

Design of Adaptive Systems for Computer-based Learning Ruth González Novillo, Pedro J. Muñoz-Merino, Carlos Delgado Kloos UNIVERSIDAD CARLOS III DE MADRID

User Modelling: Skills, Cognitive states, Emotions, Motivation, Engagement, Gaming the system

Example: Skill modelling. Inference of the skills of a student: Knowledge Spaces, Item Response Theory, Bayesian networks, Semantic-based approaches

KNOWLEDGE SPACES What does a student know? What is a student ready to know? Structured knowledge. A limited number of knowledge states. Different learning paths. The knowledge states define a tree that shows the relations among exercises and the order in which they must be completed.
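The "what is a student ready to know" question can be answered from the set of knowledge states. A minimal sketch, with a hypothetical three-item space (items `a`, `b`, `c` and the function name `outer_fringe` are illustrative, not from the slides):

```python
# Hypothetical tiny knowledge space: a knowledge state is the set of
# items a student has mastered; only these combinations are valid.
STATES = [set(), {"a"}, {"a", "b"}, {"a", "b", "c"}]

def outer_fringe(state, states=STATES):
    """Items the student is ready to learn next: each item i such that
    state ∪ {i} is again a valid knowledge state."""
    items = set().union(*states)
    return {i for i in items - state if (state | {i}) in states}

print(outer_fringe({"a"}))   # a student who knows 'a' is ready for 'b'
print(outer_fringe(set()))   # a beginner is ready for 'a'
```

The valid states encode the learning paths: here there is a single path a → b → c, but a richer state set would allow several.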

KNOWLEDGE SPACES Different learning paths Personalization

ITEM RESPONSE THEORY Only one ability. One-, two- or three-parameter models: Difficulty, Slope, Guess. Need of calibration. ITEM CHARACTERISTIC CURVE (ICC). Possible question: why these limits? In the literature they are typical values.

ITEM RESPONSE THEORY ITEM RESPONSE FUNCTION (IRF) Probability of answering an item correctly, depending on the ability. Local independence of items. Estimation of the student ability.
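The IRF for the three-parameter model named on the previous slide (difficulty, slope, guess) is the standard 3PL logistic curve. A minimal sketch; the default parameter values below are illustrative, not taken from the slides:

```python
import math

def irf_3pl(theta, a=1.0, b=0.0, c=0.2):
    """Three-parameter logistic item response function.
    theta: student ability; a: slope (discrimination);
    b: difficulty; c: guess (lower asymptote)."""
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# The probability of a correct answer rises with ability:
print(irf_3pl(-2.0))  # low-ability student
print(irf_3pl(0.0))   # ability equal to item difficulty -> 0.6 here
print(irf_3pl(2.0))   # high-ability student
```

Plotting this function over theta gives the item characteristic curve (ICC); calibration means estimating a, b and c for each item from response data.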

BAYESIAN NETWORKS Nodes are usually exercises or skills. Each node has a Conditional Probability Table (CPT) that gives the probability of answering an exercise correctly depending on the states of the parent nodes. Bayes' Theorem for making the inference. Conditional independence.

Bayesian Networks An event (e.g. a student's answer) updates the network; the different probabilities can then be recalculated.
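For the smallest possible network, one skill node with one exercise node as its child, the update reduces to a direct application of Bayes' Theorem. A sketch, with illustrative CPT values (0.9 / 0.3) not taken from the slides:

```python
def posterior_skill(prior, p_correct_given_skill, p_correct_given_no_skill, correct):
    """Bayes' Theorem on a two-node network Skill -> Exercise:
    update P(skill) after observing one correct/incorrect answer."""
    if correct:
        num = p_correct_given_skill * prior
        den = num + p_correct_given_no_skill * (1 - prior)
    else:
        num = (1 - p_correct_given_skill) * prior
        den = num + (1 - p_correct_given_no_skill) * (1 - prior)
    return num / den

# Prior P(skill)=0.5; P(correct|skill)=0.9; P(correct|no skill)=0.3
print(posterior_skill(0.5, 0.9, 0.3, True))   # 0.75 after a correct answer
print(posterior_skill(0.5, 0.9, 0.3, False))  # 0.125 after a wrong answer
```

In a real network with several skills and exercises, the same computation is carried out by a general inference algorithm over all CPTs, exploiting the conditional independences encoded in the graph.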

Semantic-based solutions Richer semantic relationships among contents, e.g. using ontologies
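One common way to hold such relationships is as subject-predicate-object triples, as in RDF-style ontologies. A hypothetical sketch (the content names and predicates below are invented for illustration; the slide names no concrete ontology):

```python
# Hypothetical content ontology as (subject, predicate, object) triples.
TRIPLES = [
    ("fractions", "isPrerequisiteOf", "ratios"),
    ("ratios", "isPrerequisiteOf", "proportions"),
    ("exercise_17", "assessesSkill", "ratios"),
]

def objects(subject, predicate, triples=TRIPLES):
    """Query the triples: all objects related to `subject` by `predicate`."""
    return [o for s, p, o in triples if s == subject and p == predicate]

print(objects("fractions", "isPrerequisiteOf"))   # what does 'fractions' unlock?
print(objects("exercise_17", "assessesSkill"))    # which skill does it assess?
```

The point of the semantic approach is that predicates can go beyond simple prerequisites (e.g. "isExampleOf", "assessesSkill"), giving the adaptive system richer relations to reason over.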

COMPARISON
ITEM RESPONSE THEORY
+ There is no semantic information
+ There is no need to build a network
+ One single skill
+ A priori calibration of parameters
BAYESIAN NETWORKS
+ More semantic information
+ Several skills
+ More complexity
+ Need of making tags, building the structure and assigning probabilities
+ No need of calibration
KNOWLEDGE SPACES
+ Predefined knowledge structure
+ No hidden node

Model for Adaptation Rules Adaptive rules that are atomic, parametric, reusable, interchangeable 1) P.J. Muñoz-Merino, C. Delgado Kloos, M. Muñoz-Organero, & A. Pardo, “A software engineering model for the development of adaptation rules and its application in a hinting adaptive e-learning system,” Computer Science and Information Systems, vol. 12, no. 1, pp. 203-231, 2015
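A very loose sketch of what "atomic, parametric, reusable" rules can look like in code; the rule condition and names below are hypothetical illustrations, not the model from the cited paper:

```python
# Hypothetical sketch: a parametric rule factory. Each call produces an
# atomic rule; the same factory is reused with different parameters.
def make_hint_rule(max_failures):
    """Rule: trigger a hint once a student has failed an exercise
    more than `max_failures` times."""
    def rule(student_state):
        return student_state.get("failures", 0) > max_failures
    return rule

rule_strict = make_hint_rule(2)   # same rule logic, reused ...
rule_lenient = make_hint_rule(5)  # ... with a different parameter
print(rule_strict({"failures": 3}))   # True  -> show the hint
print(rule_lenient({"failures": 3}))  # False -> not yet
```

Because each rule is a self-contained predicate over the student state, rules can be swapped in and out (interchangeable) without touching the rest of the tutoring system.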

ISCARE: Adaptive Competition System P.J. Muñoz-Merino, M. Fernández Molina, M. Muñoz-Organero, & C. Delgado Kloos, “An adaptive and innovative question-driven competition-based intelligent tutoring system for learning,” Expert Systems with Applications, vol. 39, no. 8, pp. 6932-6948, 2012