Learning from Observations Chapter 18 Section 1 – 4

Outline Learning agents Inductive learning Nearest Neighbors

Learning agents Sometimes we want to invest time and effort in observing the feedback our environment gives to our actions, so that we can improve those actions and optimize our utility more effectively in the future.

Learning element Design of a learning element is affected by:
– Which components of the performance element are to be learned (e.g. learning to stop for a traffic light)
– What feedback is available to learn these components (e.g. visual feedback from a camera)
– What representation is used for the components (e.g. logic, probabilistic descriptions, attributes, ...)
Types of feedback:
– Supervised learning: the correct answer (label) is given for each example.
– Unsupervised learning: correct answers are not given.
– Reinforcement learning: occasional rewards.

Two Examples of Learning Object Categories Here is your training set (2 classes): (figure: example images of the two classes)

Here is your test set: Does it belong to one of the above classes? (figure: test image)

Learning from 1 Example (S. Savarese, 2003; copied from P. Perona's talk slides)

P. Bruegel, 1562 (figure: painting)

Inductive learning Simplest form: learn a function from examples. f is the target function; an example is a pair (x, f(x)). Problem: find a hypothesis h such that h ≈ f, given a training set of examples.
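
To make these definitions concrete, here is a minimal sketch (the names f, h, and train are illustrative, not from the slides) of a target function, a training set of (x, f(x)) pairs, and a consistency check for a hypothesis:

```python
# The target function f is unknown in practice; here we know it, for illustration.
def f(x):
    return x ** 2

# A training set: examples are pairs (x, f(x)).
train = [(x, f(x)) for x in range(-3, 4)]

# A candidate hypothesis h.
def h(x):
    return x * x

# h is consistent if it agrees with f on all training examples.
consistent = all(h(x) == y for x, y in train)
print(consistent)  # True: h matches f on the training set
```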

Inductive learning method Construct/adjust h to agree with f on the training set (h is consistent if it agrees with f on all examples). E.g., curve fitting: (figure: a curve fit through the training points)


Which curve is best?

Inductive learning method Ockham’s razor: prefer the simplest hypothesis consistent with the data.
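
A hedged sketch of the curve-fitting picture, assuming roughly linear data and numpy's polynomial fitting: several polynomial hypotheses can fit the same points, and Ockham's razor prefers the lowest-degree one that is adequate.

```python
import numpy as np

# Roughly linear data with a little noise (assumed for the example).
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 8)
y = 2 * x + 1 + rng.normal(scale=0.05, size=x.size)

for degree in (1, 3, 7):
    coeffs = np.polyfit(x, y, degree)                 # least-squares polynomial fit
    residual = np.abs(np.polyval(coeffs, x) - y).max()
    print(f"degree {degree}: max training error = {residual:.4f}")

# The degree-7 curve passes (almost) exactly through all 8 points, but the
# degree-1 hypothesis is simpler and typically generalizes better.
```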

Supervised Learning I Example: imagine you want to classify images of monkeys versus humans. Data: 100 monkey images and 200 human images, labeled with what is what. Each example is a pair (x, y), where x represents the greyscale values of the image pixels and y = 0 means “monkey” while y = 1 means “human”. Task: here is a new image: monkey or human?

1 nearest neighbors (your first ML algorithm!) Idea:
1. Find the picture in the database which is closest to your query image.
2. Check its label.
3. Declare the class of your query image to be the same as that of the closest picture.
(figure: a query image and its closest match in the database)
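
A minimal sketch of this idea in Python (all names are illustrative; images are flattened greyscale vectors as in the earlier slide, with labels 0 = monkey, 1 = human):

```python
import numpy as np

def nn1_classify(train_x, train_y, query):
    """Return the label of the training image closest to the query."""
    dists = np.linalg.norm(train_x - query, axis=1)   # Euclidean distance to each image
    return train_y[np.argmin(dists)]                  # label of the nearest image

# Toy usage: four "images" of 3 pixels each.
train_x = np.array([[0.1, 0.2, 0.1],
                    [0.9, 0.8, 0.9],
                    [0.2, 0.1, 0.2],
                    [0.8, 0.9, 0.8]])
train_y = np.array([0, 1, 0, 1])
print(nn1_classify(train_x, train_y, np.array([0.15, 0.15, 0.1])))  # -> 0
```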

1NN Decision Surface (figure: the decision curve induced by 1NN)

Distance Metric How do we measure what it means to be “close”? Depending on the problem, we should choose an appropriate “distance” metric (or, more generally, a (dis)similarity measure); two common choices are sketched below.
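
For illustration, a short sketch of two common metrics, plus cosine similarity as an example of a more general (dis)similarity measure (the vectors are toy values assumed for the example):

```python
import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 0.0, 3.0])

euclidean = np.linalg.norm(a - b)       # L2: sqrt of the sum of squared differences
manhattan = np.abs(a - b).sum()         # L1: sum of absolute differences
cosine_sim = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))  # a similarity, not a distance

print(euclidean, manhattan, cosine_sim)
```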

Remarks on NN methods We only need to construct a classifier that works locally for each query, so we don’t need to construct a classifier everywhere in space. Classification is done at query time, which can be computationally taxing at exactly the moment you want to be fast. Memory inefficient: the entire training set must be stored. Curse of dimensionality: if many features are irrelevant or noisy, distances are always large. Very flexible, with few prior assumptions. k-NN variants (see the sketch below) are robust against “bad” examples.
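
A hedged sketch of the k-NN variant mentioned above (names are illustrative): a majority vote over the k closest training points makes the prediction robust to a few mislabeled ("bad") examples that would fool plain 1-NN.

```python
import numpy as np

def knn_classify(train_x, train_y, query, k=3):
    dists = np.linalg.norm(train_x - query, axis=1)   # distance to every training point
    nearest = np.argsort(dists)[:k]                   # indices of the k closest points
    labels, counts = np.unique(train_y[nearest], return_counts=True)
    return labels[np.argmax(counts)]                  # majority vote among the k labels

# One mislabeled point near the query no longer decides the answer when k=3:
train_x = np.array([[0.0], [0.1], [0.2], [1.0]])
train_y = np.array([0, 1, 0, 1])                      # the point at 0.1 is "bad"
print(knn_classify(train_x, train_y, np.array([0.05]), k=3))  # -> 0
```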

Summary Learning agent = performance element + learning element. For supervised learning, the aim is to find a simple hypothesis approximately consistent with the training examples. Decision tree learning + boosting. Learning performance = prediction accuracy measured on a test set.