1er. Escuela Red ProTIC - Tandil, April 18-28, 2006
4. Instance-Based Learning


4.1 Introduction

Instance-Based Learning: local approximation to the target function that applies in the neighborhood of the query instance.
– Cost of classifying new instances can be high: nearly all computation takes place at classification time.
– Examples: k-Nearest Neighbors
– Radial Basis Functions: a bridge between instance-based learning and artificial neural networks

[Figure: K-Nearest Neighbors]

[Figure: Most plausible hypothesis]

[Figure: Now? Or maybe…]

Is the simplest hypothesis always the best one?

4.2 k-Nearest Neighbor Learning

An instance is a feature vector $x = [a_1(x), a_2(x), \ldots, a_n(x)] \in \mathbb{R}^n$, with the Euclidean distance
$d(x_i, x_j) = [(x_i - x_j) \cdot (x_i - x_j)]^{1/2}$.
– Discrete-valued target functions: $f : \mathbb{R}^n \to V = \{v_1, v_2, \ldots, v_s\}$
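A minimal sketch of this representation and distance in NumPy (the function name is illustrative):

```python
import numpy as np

# Instances are real-valued feature vectors x = [a_1(x), ..., a_n(x)];
# the Euclidean distance is d(xi, xj) = [(xi - xj) . (xi - xj)]^(1/2).
def euclidean(xi: np.ndarray, xj: np.ndarray) -> float:
    diff = xi - xj
    return float(np.sqrt(diff @ diff))

print(euclidean(np.array([1.0, 2.0, 3.0]),
                np.array([2.0, 0.0, 3.0])))  # sqrt(1 + 4 + 0) ~= 2.236
```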

Prediction for a new query $x$, with $x_1, \ldots, x_k$ the $k$ nearest neighbors of $x$:
$\hat{f}(x) = \arg\max_{v \in V} \sum_{i=1}^{k} \delta[v, f(x_i)]$
where $\delta[v, f(x_i)] = 1$ if $v = f(x_i)$ and $\delta[v, f(x_i)] = 0$ otherwise.
– Continuous-valued target functions: $\hat{f}(x) = \frac{1}{k} \sum_{i=1}^{k} f(x_i)$
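A minimal sketch of both prediction rules, assuming illustrative names for the data matrix and targets:

```python
from collections import Counter
import numpy as np

# X: (N, n) training matrix; y: targets f(xi) for each row of X.
def knn_predict(X, y, query, k=3, discrete=True):
    dists = np.linalg.norm(X - query, axis=1)   # Euclidean distance to each xi
    nn = np.argsort(dists)[:k]                  # indices of the k nearest
    if discrete:
        # argmax_v sum_i delta[v, f(xi)]: a majority vote over the neighbors
        return Counter(y[nn].tolist()).most_common(1)[0][0]
    return float(np.mean(y[nn]))                # continuous: mean of neighbors

X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [5.0, 5.0]])
y = np.array([0, 0, 0, 1])
print(knn_predict(X, y, np.array([0.2, 0.3]), k=3))  # -> 0
```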


Distance-Weighted k-NN

$\hat{f}(x) = \arg\max_{v \in V} \sum_{i=1}^{k} w_i \, \delta[v, f(x_i)]$ (discrete)
$\hat{f}(x) = \sum_{i=1}^{k} w_i f(x_i) \, / \, \sum_{i=1}^{k} w_i$ (continuous)
with $w_i = [d(x_i, x)]^{-2}$, which weights the closest neighbors more heavily.
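A sketch of the continuous rule with inverse-square weights; the guard against a zero distance (return the stored value when the query coincides with a training instance) is an assumption of this sketch:

```python
import numpy as np

# Distance-weighted k-NN regression with w_i = [d(x_i, x)]^(-2).
def weighted_knn_regress(X, y, query, k=5, eps=1e-12):
    dists = np.linalg.norm(X - query, axis=1)
    nn = np.argsort(dists)[:min(k, len(X))]
    if dists[nn[0]] < eps:          # query coincides with a stored instance:
        return float(y[nn[0]])      # return its value to avoid division by 0
    w = dists[nn] ** -2.0
    return float(np.sum(w * y[nn]) / np.sum(w))
```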

Remarks on k-NN
– Robust to noise
– Quite effective for large training sets
– Inductive bias: the classification of an instance will be most similar to the classification of instances that are nearby in Euclidean distance
– Especially sensitive to the curse of dimensionality
– Irrelevant attributes can be eliminated by a suitably chosen metric: $d(x_i, x_j) = [(x_i - x_j)^\top G \, (x_i - x_j)]^{1/2}$
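A small sketch of this generalized metric; the diagonal choice of $G$ below, which simply down-weights or drops attributes, is an illustrative example:

```python
import numpy as np

# Generalized metric d(xi, xj) = [(xi - xj)^T G (xi - xj)]^(1/2).
# G should be symmetric positive semi-definite; a zero diagonal entry
# removes the corresponding (irrelevant) attribute from the distance.
def metric(xi, xj, G):
    diff = xi - xj
    return float(np.sqrt(diff @ G @ diff))

G = np.diag([1.0, 1.0, 0.0])   # third attribute treated as irrelevant
a = np.array([1.0, 2.0, 9.0])
b = np.array([1.0, 2.0, -4.0])
print(metric(a, b, G))         # 0.0: differs only in the ignored attribute
```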

4.3 Locally Weighted Regression

Builds an explicit approximation to $f(x)$ over a local region surrounding $x$ (usually a linear or quadratic fit to the training examples nearest to $x$).

Locally weighted linear regression:
$f_L(x) = w_0 + w_1 x_1 + \cdots + w_n x_n$
$E(x) = \sum_{i=1}^{k} [f_L(x_i) - f(x_i)]^2$ (over the $k$ nearest neighbors $x_i$ of $x$)

Generalization: fit $f_L(x) = w_0 + w_1 x_1 + \cdots + w_n x_n$ over all $N$ training examples, weighted by their distance to the query:
$E(x) = \sum_{i=1}^{N} K[d(x_i, x)] \, [f_L(x_i) - f(x_i)]^2$
where $K[d(x_i, x)]$ is a kernel function.

Another possibility: $f_Q(x)$, a quadratic function of the $x_j$.
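A sketch of the kernel-weighted fit, solved as weighted least squares; the Gaussian kernel and the bandwidth `tau` are illustrative choices, not fixed by the slides:

```python
import numpy as np

# Locally weighted linear regression: minimize
# E(x) = sum_i K[d(xi, x)] * (fL(xi) - f(xi))^2 by weighted least squares.
def lwlr_predict(X, y, query, tau=1.0):
    K = np.exp(-np.linalg.norm(X - query, axis=1) ** 2 / (2 * tau ** 2))
    A = np.hstack([np.ones((len(X), 1)), X])        # bias column for w0
    # weighted normal equations: (A^T W A) w = A^T W y, with W = diag(K)
    w = np.linalg.solve(A.T @ (K[:, None] * A), A.T @ (K * y))
    return float(np.concatenate(([1.0], query)) @ w)
```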

4.4 Radial Basis Functions

An approach closely related to distance-weighted regression and to artificial neural network learning:
$f_{RBF}(x) = w_0 + \sum_{\mu=1}^{k} w_\mu K[d(x_\mu, x)]$
with the Gaussian kernel $K[d(x_\mu, x)] = \exp[-d^2(x_\mu, x) / 2\sigma_\mu^2]$.
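A direct sketch of this predictor, assuming the centers $x_\mu$, widths $\sigma_\mu$, and weights are already trained (see the training sketch below):

```python
import numpy as np

# f_RBF(x) = w0 + sum_mu w_mu * exp(-d^2(x_mu, x) / (2 sigma_mu^2)).
def rbf_predict(query, centers, sigmas, w):
    d2 = np.sum((centers - query) ** 2, axis=1)    # squared distances to centers
    phi = np.exp(-d2 / (2 * sigmas ** 2))          # Gaussian activations
    return float(w[0] + phi @ w[1:])               # w[0] is the bias w0
```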


Training RBF Networks

1st stage: determine $k$ (the number of basis functions) and the kernel parameters $x_\mu$ and $\sigma_\mu$, e.g. with the Expectation-Maximization (EM) algorithm.
2nd stage: determine the weights $w_\mu$, which is a linear problem.
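A minimal two-stage training sketch. The slides call for EM in the first stage; this sketch substitutes plain k-means for the centers with a shared width, a simpler stand-in and not the slides' method, and solves the second stage as a linear least-squares problem:

```python
import numpy as np

def train_rbf(X, y, k=5, n_iter=20, rng=np.random.default_rng(0)):
    # Stage 1 (stand-in for EM): k-means centers x_mu, shared width sigma
    centers = X[rng.choice(len(X), k, replace=False)].astype(float)
    for _ in range(n_iter):
        labels = np.argmin(((X[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    sigma = np.mean(np.linalg.norm(X - centers[labels], axis=1)) + 1e-9
    sigmas = np.full(k, sigma)
    # Stage 2: with the kernels fixed, the weights solve a linear
    # least-squares problem min_w ||Phi w - y||^2.
    d2 = ((X[:, None] - centers) ** 2).sum(-1)          # (N, k)
    Phi = np.hstack([np.ones((len(X), 1)), np.exp(-d2 / (2 * sigma ** 2))])
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return centers, sigmas, w
```

The returned `(centers, sigmas, w)` plug directly into `rbf_predict` above.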

4.6 Remarks on Lazy and Eager Learning

Lazy learning: stores the data and postpones decisions until a new query is presented.
Eager learning: generalizes beyond the training data before a new query is presented.

Lazy methods may consider the query instance $x$ when deciding how to generalize beyond the training data $D$ (local approximation). Eager methods cannot: they have already chosen their global approximation to the target function.