Learning from Data

Focus on Supervised Learning first…
Given previous data, how can we “learn” to classify new data?

Labeled training images: APPLE, APPLE, BANANA, BANANA.
New image: APPLE or BANANA? (Answer: APPLE)

Training
Training Set → Extract features/labels → Train (Decision Trees, Bayesian Learning, Neural Nets, ...) → Learned model/Classifier

Training and Classifying
Training: Training Set → Extract features/labels → Train (Decision Trees, Bayesian Learning, Neural Nets, ...) → Learned model/Classifier
Classifying: Instance/Example → Extract features → Learned model/Classifier → Label
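
A minimal Python sketch of these two phases; the names (train, classify, extract_features, fit) are illustrative placeholders, not from the slides:

    # Training phase: turn raw examples into (features, label) pairs, then fit a model.
    def train(training_set, extract_features, fit):
        pairs = [(extract_features(raw), label) for raw, label in training_set]
        return fit(pairs)  # fit could be a decision-tree, Bayesian, or neural-net learner

    # Classifying phase: the learned model maps a new instance's features to a label.
    def classify(model, instance, extract_features):
        return model(extract_features(instance))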

Inductive Learning
Supervised Learning: training data is a set of (x, y) pairs
  x: input example/instance
  y: output/label
Learn an unknown function f(x) = y
x is represented by a D-dimensional feature vector x = <x1, x2, x3, …, xD>; each dimension is a feature or attribute
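
As a concrete sketch in Python, hypothetically encoding examples like the restaurant data that follows (the attribute names, their order, and the values are illustrative assumptions):

    # Each x is a D-dimensional feature vector <x1, ..., xD>; y is the label.
    # Illustrative feature order: <Patrons, Hungry, Type, WaitEstimate>
    training_data = [
        (("Some", True,  "Thai",    "0-10"),  True),   # one (x, y) pair
        (("Full", True,  "Italian", "30-60"), False),
        (("None", False, "Burger",  "0-10"),  False),
    ]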

Wait for a Table?

Wait for a Table?
T: Positive/Yes examples (better to wait for a table)
F: Negative/No examples (better not to wait)

All examples with Patrons = None were No
All examples with Patrons = Some were Yes
Examples with Patrons = Full depended on other features

Decision Trees

How to classify a new example?
All examples with Patrons = None were No
All examples with Patrons = Some were Yes
Examples with Patrons = Full depended on other features

Classifying a New Example
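
To classify, walk the tree from the root, testing one attribute per internal node until a leaf supplies the label. A Python sketch; the slides only say that Patrons = Full depends on other features, so the Hungry sub-test and its leaves below are assumed for illustration:

    # Internal node: (attribute, {value: subtree}); leaf: the label itself.
    tree = ("Patrons", {
        "None": False,                                   # all examples were No
        "Some": True,                                    # all examples were Yes
        "Full": ("Hungry", {True: True, False: False}),  # assumed sub-test
    })

    def classify(tree, example):
        if not isinstance(tree, tuple):   # reached a leaf: return its label
            return tree
        attribute, branches = tree
        return classify(branches[example[attribute]], example)

    print(classify(tree, {"Patrons": "Full", "Hungry": True}))  # -> True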

Which one is better?
All examples with Patrons = None were No
All examples with Patrons = Some were Yes
Examples with Patrons = Full depended on other features

Better because it is smaller

Decision Trees
How to find the smallest decision tree?

Constructing the “best” decision tree
It is NP-hard to find the smallest tree, so just try to find a “smallish” decision tree.
First: how do we construct any decision tree?
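
The usual answer is greedy, top-down construction: pick a "best" feature, split the examples on its values, and recurse, as the walkthrough on the next slides shows. A Python sketch, assuming each example is an (x, y) pair with x a dict of attribute values; choose_best_feature is sketched after the feature-selection slides below:

    def build_tree(examples, features):
        labels = [y for _, y in examples]
        if all(labels) or not any(labels):        # pure node: make a leaf
            return labels[0]
        if not features:                          # no tests left: majority label
            return max(set(labels), key=labels.count)
        best = choose_best_feature(examples, features)
        branches = {}
        for value in {x[best] for x, _ in examples}:
            subset = [(x, y) for x, y in examples if x[best] == value]
            remaining = [f for f in features if f != best]
            branches[value] = build_tree(subset, remaining)
        return (best, branches)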

Construct Tree Example
[Example row shown on slide: F, Full, $, Italian, 30-60]
Patrons = None (All False)

Construct Tree Example
Patrons = Some (All True)

Construct Tree Example
Patrons = Full (Some True, Some False)

Construct Tree Example
Patrons = Full (Some True, Some False) AND Hungry = False

Construct Tree Example
Patrons = Full (Some True, Some False) AND Hungry = True
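
This walkthrough corresponds to one run of the build_tree sketch above. A hypothetical toy call (the data is invented, and it relies on the choose_best_feature sketch that follows the next slides):

    toy = [
        ({"Patrons": "None", "Hungry": False}, False),
        ({"Patrons": "Some", "Hungry": True},  True),
        ({"Patrons": "Some", "Hungry": False}, True),
        ({"Patrons": "Full", "Hungry": True},  True),
        ({"Patrons": "Full", "Hungry": False}, False),
    ]
    print(build_tree(toy, ["Patrons", "Hungry"]))
    # -> ('Patrons', {'None': False, 'Some': True,
    #                 'Full': ('Hungry', {True: True, False: False})})
    # (branch order may vary)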

Choosing the Best Feature
Compare Type and Patrons: which one seems better?
[Figure: the two candidate splits, with the True examples and False examples under each branch]

Choosing the Best Feature
Compare Type and Patrons: which one seems better?
At each node, select the feature that divides the examples into sets that are almost all positive or almost all negative.
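
"Almost all positive or all negative" is usually made precise with entropy: choose the feature whose split has the lowest weighted entropy (equivalently, the highest information gain). A sketch compatible with the build_tree sketch above:

    import math

    def entropy(labels):
        # 0.0 when all labels agree; 1.0 bit for an even True/False split.
        n = len(labels)
        return -sum(
            (labels.count(c) / n) * math.log2(labels.count(c) / n)
            for c in set(labels)
        )

    def choose_best_feature(examples, features):
        # Greedy choice: minimize the expected entropy after splitting.
        def split_entropy(f):
            total = 0.0
            for value in {x[f] for x, _ in examples}:
                subset = [y for x, y in examples if x[f] == value]
                total += len(subset) / len(examples) * entropy(subset)
            return total
        return min(features, key=split_entropy)

On the restaurant examples this favors Patrons over Type: splitting on Patrons yields nearly pure subsets, while splitting on Type does not.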