Machine Learning Instance Based Learning & Case Based Reasoning Exercise Solutions.


Worked example

In the diagram below, the figures next to the + and - signs are the values taken by a real-valued target function. Calculate the value predicted for the target function at the query instance x_q by the 5-Nearest Neighbour learning algorithm.

Worked example (contd)

[Diagram: the labelled instance space with the query point x_q; not reproduced in the transcript. The 5-NN prediction is the mean of the target values of the five training points nearest to x_q.]
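Since the diagram is not reproduced, here is a minimal sketch of the 5-NN regression step itself. The training coordinates and target values below are hypothetical stand-ins for the figure; only the averaging logic reflects the worked example.

```python
# Minimal sketch of 5-NN regression: predict the target at a query point
# as the mean target value of the 5 nearest training points.
# The training data below are hypothetical (the slide's diagram is missing).
import math

train = [
    # (x, y, target value)
    (1.0, 2.0, 4.0),
    (2.0, 1.0, 7.0),
    (3.0, 3.0, 5.0),
    (5.0, 4.0, 2.0),
    (4.0, 0.5, 8.0),
    (6.0, 2.5, 3.0),
]

def knn_regress(query, examples, k=5):
    """Predict the target at `query` as the mean target of the k nearest examples."""
    by_distance = sorted(examples,
                         key=lambda ex: math.dist(query, (ex[0], ex[1])))
    nearest = by_distance[:k]
    return sum(t for _, _, t in nearest) / len(nearest)

x_q = (3.0, 2.0)          # hypothetical query point
print(knn_regress(x_q, train, k=5))
```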

Exercise 1

Assume a Boolean target function and a two-dimensional instance space (shown below). Determine how the k-Nearest Neighbour learning algorithm would classify the new instance x_q for k = 1, 3, 5. The + and - signs in the instance space mark positive and negative examples respectively.

Exercise 1: Solution

[Diagram: the labelled instance space with query point x_q; not reproduced in the transcript.]

1-NN: +
3-NN: -
5-NN: -
7-NN: -
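A minimal sketch of the majority vote behind this solution. The labelled points are hypothetical (the diagram is missing), but they are arranged so that the single nearest neighbour is positive while the larger neighbourhoods are dominated by negatives, reproducing the answers above.

```python
# k-NN classification by majority vote among the k nearest neighbours.
# Hypothetical training points chosen to mirror the slide's solution.
import math
from collections import Counter

train = [
    ((2.0, 2.0), '+'),   # closest point to the query
    ((3.5, 1.0), '-'),
    ((1.0, 3.5), '-'),
    ((4.0, 3.0), '-'),
    ((0.5, 0.5), '-'),
    ((5.0, 0.5), '-'),
    ((4.5, 4.5), '+'),
]

def knn_classify(query, examples, k):
    """Return the majority class label among the k nearest examples."""
    nearest = sorted(examples, key=lambda ex: math.dist(query, ex[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

x_q = (2.2, 2.1)          # hypothetical query point
for k in (1, 3, 5, 7):
    print(f"{k}-NN: {knn_classify(x_q, train, k)}")   # +, -, -, -
```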

Exercise 1 (contd)

How do the efficiency and accuracy of k-Nearest Neighbour search change as k increases?
– If there are sufficiently many examples, the accuracy should increase.
– The time to calculate the prediction will also increase; in that sense the search becomes less efficient.

Exercise 2 (exam question from previous years)

(a) Some machine learning algorithms are described as eager, others as lazy. Choose an example of each type of algorithm and explain in what sense one is eager and the other is lazy.

Answer: k-Nearest Neighbour and Case-Based Reasoning are lazy learning methods because they do their computation only when presented with a new example to classify. By contrast, decision trees, neural networks and Bayesian classifiers are eager methods because they build a model during the training phase, and so need to do little work when presented with new examples to classify.
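A hedged sketch of the contrast in code, with a toy learner of each kind. The class names and the trivial "eager" model are illustrative inventions, not a specific library API.

```python
# Lazy vs. eager learning: where does the work happen?

class LazyNN:
    """Lazy: fit() just stores the data; all computation is deferred to query time."""
    def fit(self, X, y):
        self.X, self.y = X, y          # no model is built here

    def predict(self, x):
        # Distances are computed only now, when a query actually arrives.
        i = min(range(len(self.X)),
                key=lambda j: sum((a - b) ** 2 for a, b in zip(x, self.X[j])))
        return self.y[i]

class EagerThreshold:
    """Eager: fit() builds a (deliberately simple) global model up front."""
    def fit(self, X, y):
        # Summarise the data once: a mean threshold on the first feature.
        pos = [x[0] for x, label in zip(X, y) if label == 1]
        neg = [x[0] for x, label in zip(X, y) if label == 0]
        self.threshold = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

    def predict(self, x):
        return 1 if x[0] >= self.threshold else 0   # very cheap at query time
```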

(b) Describe the essential differences between k-Nearest Neighbour learning and Case-Based Reasoning.

Answer: k-NN uses the Euclidean distance measure to find examples that are close to a test case, so it works with numerical data. CBR can deal with a wide variety of data types for which Euclidean distance is not defined, so a CBR system must define its own measure of closeness for non-numerical objects.
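A minimal sketch of the kind of mixed-type similarity measure a CBR system might define, combining per-attribute ("local") similarities into a weighted whole. The attribute names, ranges and weights are hypothetical.

```python
# A local-global similarity measure over mixed numeric/symbolic case attributes.
# All attribute names and weights below are hypothetical examples.

def numeric_sim(a, b, lo, hi):
    """Similarity of two numbers, scaled by the attribute's known range."""
    return 1.0 - abs(a - b) / (hi - lo)

def symbolic_sim(a, b):
    """Exact-match similarity for symbolic values."""
    return 1.0 if a == b else 0.0

def case_similarity(c1, c2):
    # Weighted average of the per-attribute similarities.
    parts = [
        (0.5, numeric_sim(c1["age"], c2["age"], lo=0, hi=100)),
        (0.3, symbolic_sim(c1["symptom"], c2["symptom"])),
        (0.2, symbolic_sim(c1["blood_type"], c2["blood_type"])),
    ]
    return sum(w * s for w, s in parts) / sum(w for w, _ in parts)

query  = {"age": 42, "symptom": "fever", "blood_type": "A"}
stored = {"age": 45, "symptom": "fever", "blood_type": "O"}
print(case_similarity(query, stored))   # 0.5*0.97 + 0.3*1.0 + 0.2*0.0 = 0.785
```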

(c) Describe in words the R4 model of Case-Based Reasoning.

Answer: The R4 model of CBR consists of four main stages:
– Retrieve: match cases in the case-base against the incoming test case.
– Reuse: if a perfect match occurs, re-use the solution stored in the case-base.
– Revise: if there is no perfect match, apply adaptation rules to adapt a stored case so that it matches the incoming case.
– Retain: if the outcome of the incoming case later becomes known, add the case to the case-base (but only if it is not identical to an existing case, or to an existing case after adaptation).
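A minimal sketch of the R4 cycle, assuming hypothetical `similarity` and `adapt` functions standing in for the domain-specific pieces a real CBR system would supply.

```python
# Sketch of the R4 cycle: Retrieve / Reuse / Revise / Retain.
# `similarity` and `adapt` are hypothetical placeholders.

def r4_solve(query, case_base, similarity, adapt, match_threshold=0.95):
    # Retrieve: find the stored case most similar to the query.
    best = max(case_base, key=lambda case: similarity(query, case["problem"]))
    score = similarity(query, best["problem"])

    if score >= match_threshold:
        # Reuse: a (near-)perfect match, so return the stored solution as-is.
        return best["solution"]
    # Revise: adapt the retrieved solution to fit the new problem.
    return adapt(best["solution"], best["problem"], query)

def r4_retain(query, outcome, case_base, similarity):
    # Retain: once the true outcome is known, store the new case,
    # unless an essentially identical case is already in the base.
    if all(similarity(query, case["problem"]) < 1.0 for case in case_base):
        case_base.append({"problem": query, "solution": outcome})
```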

(d) How do Case-Based Reasoning systems learn?

Answer: Learning occurs by:
– retaining new cases and their outcomes;
– adding or modifying adaptation rules.

(e) Some researchers in the field of machine learning argue that Case-Based Reasoning is closer to human thinking than some other forms of machine learning. Give a real-world example that supports this view.

Answer: For example, a doctor diagnosing a patient by matching the patient's symptoms to those of another patient whose diagnosis is known. Humans often reason by matching new instances to previously experienced (or reported) situations, adapting those situations where necessary.