Chap 8. Instance Based Learning


Artificial Intelligence Lab, Shin Dong-ho

Abstract
Learning methods: nearest neighbor, locally weighted regression, radial basis functions, case-based reasoning
These methods simply store the training examples rather than constructing an explicit general target function.
Lazy learning: the target function is estimated locally for each new query, cf. eager methods, which commit to one target function for the entire instance space.

INTRODUCTION
Learning: simply storing the presented training data
Local approximation: a complex target function is approximated by a collection of less complex local target functions
Disadvantage: the cost of classifying new instances can be high, since nearly all computation is deferred to query time

Basic Schema
"Person": a concept, described as attribute-value pairs
An individual human: an instance
Subgroups such as "White", "Asian", "Black": prototypes, stored the same way as instances

K-Nearest Neighbor Learning
Instances: points in the n-dimensional space, described by the feature vector <a1(x), a2(x), ..., an(x)>
Distance: d(xi, xj) = sqrt( Σ r=1..n ( ar(xi) − ar(xj) )² )
Target function: may be discrete-valued or real-valued
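The Euclidean distance above can be written directly; a minimal sketch (the function name `euclidean` is illustrative, not from the slides):

```python
import math

def euclidean(xi, xj):
    """Distance between two feature vectors <a1(x), ..., an(x)>."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xi, xj)))

print(euclidean([0.0, 0.0], [3.0, 4.0]))  # -> 5.0
```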

Training algorithm: for each training example (x, f(x)), add the example to the list training_examples
Classification algorithm: given a query instance xq to be classified,
let x1, ..., xk denote the k instances from training_examples that are nearest to xq
return f̂(xq) = argmax over v in V of Σ i=1..k δ(v, f(xi)), where δ(a, b) = 1 if a = b and 0 otherwise
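The training and classification steps above can be sketched in Python; `knn_classify` and the toy dataset are illustrative names, not part of the slides:

```python
import math
from collections import Counter

def euclidean(xi, xj):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xi, xj)))

def knn_classify(training_examples, xq, k=3):
    """Majority vote among the k training instances nearest to xq.

    training_examples is a list of (feature_vector, label) pairs;
    "training" is just storing this list (lazy learning).
    """
    neighbors = sorted(training_examples, key=lambda ex: euclidean(ex[0], xq))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

data = [([1.0, 1.0], '+'), ([1.2, 0.8], '+'),
        ([5.0, 5.0], '-'), ([5.5, 4.5], '-')]
print(knn_classify(data, [1.1, 1.0], k=3))  # -> '+'
```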

Distance-Weighted k-NN Algorithm
Give greater weight to closer neighbors: wi = 1 / d(xq, xi)²
Discrete case: f̂(xq) = argmax over v in V of Σ i=1..k wi δ(v, f(xi))
Real-valued case: f̂(xq) = ( Σ i=1..k wi f(xi) ) / ( Σ i=1..k wi )
(If xq exactly matches some xi, return f(xi) directly.)
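The real-valued case can be sketched as follows; `weighted_knn` is an illustrative name, and the exact-match guard handles the zero-distance case noted above:

```python
import math

def euclidean(xi, xj):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(xi, xj)))

def weighted_knn(training_examples, xq, k=3):
    """Distance-weighted k-NN for a real-valued target: w_i = 1/d^2."""
    neighbors = sorted(training_examples, key=lambda ex: euclidean(ex[0], xq))[:k]
    num = den = 0.0
    for x, fx in neighbors:
        d = euclidean(x, xq)
        if d == 0.0:          # query coincides with a stored example
            return fx
        w = 1.0 / d ** 2
        num += w * fx
        den += w
    return num / den

data = [([0.0], 0.0), ([1.0], 1.0), ([2.0], 2.0)]
print(weighted_knn(data, [0.5], k=2))  # -> 0.5
```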

Remarks on the k-NN Algorithm
Robust to noisy training data
Effective given a sufficiently large set of training data
When only a subset of the instance attributes is relevant, the distance can be dominated by irrelevant attributes; remedy: weight each attribute differently, or select a subset of attributes
Efficient retrieval: index the stored training examples, e.g., with a kd-tree

Locally Weighted Regression
Local: the function is approximated based only on data near the query point
Weighted: the contribution of each training example is weighted by its distance from the query point
Regression: approximating a real-valued target function

Locally Weighted Linear Regression
f̂(x) = w0 + w1 a1(x) + ... + wn an(x), where ai(x) is the ith attribute of instance x
Minimize the squared error over the training set D: E = ½ Σ x∈D ( f(x) − f̂(x) )²
Gradient descent training rule: Δwj = η Σ x∈D ( f(x) − f̂(x) ) aj(x), where η is a learning rate
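A minimal sketch of the idea for one input attribute: each example is weighted by a Gaussian kernel of its distance to the query, then a weighted least-squares line is fit in closed form rather than by the gradient descent rule above (`lwlr_predict` and the bandwidth `tau` are illustrative names, not from the slides):

```python
import math

def lwlr_predict(xs, ys, xq, tau=1.0):
    """Locally weighted linear regression with a single attribute.

    Weights each (x, y) example by exp(-(x - xq)^2 / (2 tau^2)),
    fits a weighted least-squares line, and evaluates it at xq.
    """
    w = [math.exp(-(x - xq) ** 2 / (2 * tau ** 2)) for x in xs]
    sw = sum(w)
    xbar = sum(wi * x for wi, x in zip(w, xs)) / sw   # weighted mean of x
    ybar = sum(wi * y for wi, y in zip(w, ys)) / sw   # weighted mean of y
    num = sum(wi * (x - xbar) * (y - ybar) for wi, x, y in zip(w, xs, ys))
    den = sum(wi * (x - xbar) ** 2 for wi, x in zip(w, xs))
    slope = num / den if den else 0.0
    return ybar + slope * (xq - xbar)
```

On exactly linear data (e.g., y = 2x + 1) the local fit recovers the line, so the prediction at any query matches the true value regardless of the kernel width.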

A Local Approximation
Three possible error criteria for fitting f̂ near the query xq:
1. Minimize the squared error over just the k nearest neighbors: E1(xq) = ½ Σ x ∈ k nearest nbrs of xq ( f(x) − f̂(x) )²
2. Minimize the squared error over the entire set D, with each example weighted by a kernel of its distance from xq: E2(xq) = ½ Σ x∈D ( f(x) − f̂(x) )² K(d(xq, x))
3. Combine 1 and 2: E3(xq) = ½ Σ x ∈ k nearest nbrs of xq ( f(x) − f̂(x) )² K(d(xq, x)), with the corresponding gradient descent training rule

Radial Basis Functions
Closely related to both distance-weighted regression and ANNs
f̂(x) = w0 + Σ u=1..k wu Ku(d(xu, x)), where xu is an instance from X and Ku(d(xu, x)) is a kernel function
The contribution from each Ku(d(xu, x)) term is localized to a region near the point xu
Common choice, the Gaussian: Ku(d(xu, x)) = e^( −d²(xu, x) / (2σu²) )
Corresponding two-layer network:
first layer: computes the values of the various Ku(d(xu, x))
second layer: computes a linear combination of the first-layer unit values
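The two-layer computation can be sketched for one-dimensional inputs; `rbf_predict`, the centers, and the weights are illustrative values, and only the forward pass is shown (training the weights is a separate linear fit):

```python
import math

def gaussian_kernel(d, sigma=1.0):
    """K_u(d) = exp(-d^2 / (2 sigma^2)); localized around d = 0."""
    return math.exp(-d ** 2 / (2 * sigma ** 2))

def rbf_predict(x, centers, weights, w0=0.0, sigma=1.0):
    """Two-layer RBF computation.

    First layer: kernel activations K_u(d(x_u, x)) for each center x_u.
    Second layer: linear combination w0 + sum_u w_u K_u(d(x_u, x)).
    """
    hidden = [gaussian_kernel(abs(x - xu), sigma) for xu in centers]
    return w0 + sum(wu * h for wu, h in zip(weights, hidden))

# A unit placed at 0.0 pulls the output up; one at 2.0 pulls it down.
print(rbf_predict(0.0, centers=[0.0, 2.0], weights=[1.0, -1.0]))
```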

RBF Network Training
1. Construct the kernel functions: choose the number k of hidden units and the center xu and width σu of each (e.g., centers at training instances, or found by clustering)
2. Adjust the weights wu to fit the training data; once the kernels are fixed, this is a linear problem
RBF networks provide a global approximation to the target function, represented by a linear combination of many local kernel functions.

Case-Based Reasoning
CBR is a lazy learning method
Classifies new query instances by retrieving similar stored instances
Instances are rich symbolic descriptions, not points in an n-dimensional space

The CADET System
Case-based reasoning applied to the conceptual design of simple mechanical devices, e.g., water faucets

Cont'd
Library: cases stored as functional descriptions
Search: match the new design problem against stored cases; various subgraphs of the function graph may match; rewrite rules derived from general knowledge can transform the problem; the retrieved fragments must then be merged, which is itself a hard problem
Target function f: maps function graphs to the structures that implement them

Remarks on Lazy and Eager Learning
Lazy learning: generalization is deferred until query time
k-NN, locally weighted regression, case-based reasoning
Eager learning: generalization happens at training time
radial basis function networks, Back-Propagation

Differences
Computation time: training, eager > lazy; query, eager < lazy
Classifications produced for new queries:
eager: must commit to a single hypothesis (e.g., one linear function) that covers the entire instance space
lazy: can form a different combination of many local approximations for each query

Summary
Instance-based learning: forms a different local approximation of the target function for each query instance
k-NN: the target value is estimated from the k nearest known examples
Locally weighted regression: an explicit local approximation to the target function
RBF networks: ANNs constructed from localized kernel functions
CBR: instances represented by symbolic/logical descriptions