
What we will cover here
- What is a classifier
- The difference between learning/training and classifying
- A math reminder for Naïve Bayes
- The tennis example of naïve Bayes
- What may be wrong with your Bayes classifier?

Naïve Bayes Classifier

Quiz: Probability Basics
We have two six-sided dice. When they are rolled, the following events can occur: (A) die 1 lands on side "3", (B) die 2 lands on side "1", and (C) the two dice sum to eight. Answer the following questions about the probabilities of these events and their combinations.
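A quick way to check such answers is to enumerate all 36 equally likely outcomes. A minimal Python sketch (not from the slides) computing P(A), P(B), P(C), P(A,B) and P(C|A):

```python
from fractions import Fraction

# All 36 equally likely outcomes of rolling two six-sided dice.
outcomes = [(d1, d2) for d1 in range(1, 7) for d2 in range(1, 7)]

def prob(event):
    """Probability of an event = fraction of outcomes satisfying it."""
    hits = [o for o in outcomes if event(o)]
    return Fraction(len(hits), len(outcomes))

A = lambda o: o[0] == 3           # die 1 lands on "3"
B = lambda o: o[1] == 1           # die 2 lands on "1"
C = lambda o: o[0] + o[1] == 8    # the two dice sum to eight

print(prob(A))                                   # 1/6
print(prob(B))                                   # 1/6
print(prob(C))                                   # 5/36
print(prob(lambda o: A(o) and B(o)))             # 1/36 = P(A)P(B): A and B are independent
print(prob(lambda o: A(o) and C(o)) / prob(A))   # P(C|A) = (1/36)/(1/6) = 1/6
```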

Outline
- Background
- Probability Basics
- Probabilistic Classification
- Naïve Bayes
- Example: Play Tennis
- Relevant Issues
- Conclusions

Probabilistic Classification

Probabilistic Classification
Establishing a probabilistic model for classification: the discriminative model.
What is a discriminative probabilistic classifier? It models the posterior probabilities P(c|x), c = c1, …, cL, directly for an input x = (x1, …, xn).
Example: C1 = benign mole, C2 = cancer.

Probabilistic Classification
Establishing a probabilistic model for classification (cont.): the generative model.
A generative probabilistic model estimates the class-conditional probabilities P(x|c): one model for Class 1, one for Class 2, …, one for Class L.
Example: given a fruit's features x, the model for the class "orange" outputs the probability that this fruit is an orange, and the model for the class "apple" outputs the probability that this fruit is an apple.

Background: methods to create classifiers
There are three methods to establish a classifier:
a) Model a classification rule directly. Examples: k-NN, decision trees, perceptron, SVM.
b) Model the probability of class memberships given input data. Example: perceptron with the cross-entropy cost.
c) Make a probabilistic model of the data within each class. Examples: naive Bayes, model-based classifiers.
a) and b) are examples of discriminative classification; c) is an example of generative classification; b) and c) are both examples of probabilistic classification.
GOOD NEWS: You can create your own hardware/software classifiers!

LAST LECTURE REMINDER: Probability Basics
We defined prior, conditional and joint probability for random variables:
- Prior probability: P(X)
- Conditional probability: P(X1|X2), P(X2|X1)
- Joint probability: X = (X1, X2), P(X) = P(X1, X2)
- Relationship: P(X1, X2) = P(X2|X1)P(X1) = P(X1|X2)P(X2)
- Independence: P(X2|X1) = P(X2), P(X1|X2) = P(X1), P(X1, X2) = P(X1)P(X2)
- Bayesian rule: P(C|X) = P(X|C)P(C) / P(X), i.e., posterior = likelihood × prior / evidence

Method: Probabilistic Classification with MAP
MAP classification rule (MAP: Maximum A Posteriori): assign x to c* if P(C = c*|X = x) > P(C = c|X = x) for all c ≠ c*, c = c1, …, cL.
Method of generative classification with the MAP rule:
1. Apply the Bayesian rule to convert the class-conditional probabilities P(x|c) and priors P(c) into posterior probabilities: P(c|x) ∝ P(x|c)P(c), since the evidence P(x) is the same for all classes.
2. Then apply the MAP rule.
We use this rule in many applications.
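A minimal sketch of the MAP decision in Python; the priors and likelihoods below are hypothetical numbers for the two-class mole example, not values from the lecture:

```python
# MAP decision: pick the class with the largest likelihood x prior.
# The evidence P(x) is omitted because it is constant across classes.

# Hypothetical numbers for a two-class problem (benign mole vs. cancer).
priors = {"benign": 0.98, "cancer": 0.02}        # P(c)
likelihoods = {"benign": 0.20, "cancer": 0.85}   # P(x|c) for the observed x

# Unnormalized posteriors: P(c|x) is proportional to P(x|c) * P(c)
scores = {c: likelihoods[c] * priors[c] for c in priors}
c_star = max(scores, key=scores.get)             # the MAP class
print(scores, "->", c_star)                      # {'benign': 0.196, 'cancer': 0.017} -> benign
```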

Naïve Bayes

Naïve Bayes
Bayes classification: P(c|x) ∝ P(x|c)P(c) = P(x1, …, xn|c)P(c).
Difficulty: learning the joint probability P(x1, …, xn|c) is infeasible when the number of attributes is large.
Naïve Bayes classification: assume that all input attributes are conditionally independent given the class! Then the previous generative model decomposes into n generative models of a single input each:
P(x1, …, xn|c) = P(x1|c) × P(x2|c) × … × P(xn|c)
MAP classification rule: assign x' = (a1, …, an) to c* if
[P(a1|c*) × … × P(an|c*)] P(c*) > [P(a1|c) × … × P(an|c)] P(c) for all c ≠ c*, c = c1, …, cL
(a product of individual probabilities).

Naïve Bayes Algorithm (for discrete input attributes)
The algorithm has two phases.
1. Learning Phase: Given a training set S, estimate P(C = ci) from S for every class ci, and estimate P(Xj = ajk | C = ci) from S for every value ajk of every attribute Xj and every class ci. Output: conditional probability tables; for Xj, Nj × L elements.
2. Test Phase: Given an unknown instance X' = (a'1, …, a'n), look up the tables to assign the label c* to X' if
[P(a'1|c*) × … × P(a'n|c*)] P(c*) > [P(a'1|c) × … × P(a'n|c)] P(c) for all c ≠ c*.
Learning is easy: just create probability tables. Classification is easy: just multiply probabilities.
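The two phases are simple enough to implement directly. Below is a self-contained Python sketch (illustrative, not from the lecture); the data rows are the standard 14-day Play-Tennis set and reproduce the counts in the probability tables on the next slide:

```python
from collections import Counter, defaultdict

# Learning phase: build P(c) and the conditional probability tables P(X_j = a | c).
def train_naive_bayes(rows, labels):
    class_counts = Counter(labels)
    priors = {c: class_counts[c] / len(labels) for c in class_counts}
    # tables[j][(value, c)] = P(X_j = value | C = c)
    tables = defaultdict(dict)
    for j in range(len(rows[0])):
        for (value, c), count in Counter((r[j], c) for r, c in zip(rows, labels)).items():
            tables[j][(value, c)] = count / class_counts[c]
    return priors, tables

# Test phase: multiply the looked-up probabilities and take the MAP class.
def classify(x, priors, tables):
    scores = {}
    for c, prior in priors.items():
        score = prior
        for j, value in enumerate(x):
            score *= tables[j].get((value, c), 0.0)  # 0 if value never seen with class c
        scores[c] = score
    return max(scores, key=scores.get), scores

# The 14-day Play-Tennis data: (Outlook, Temperature, Humidity, Wind) -> Play
data = [
    (("Sunny", "Hot", "High", "Weak"), "No"),
    (("Sunny", "Hot", "High", "Strong"), "No"),
    (("Overcast", "Hot", "High", "Weak"), "Yes"),
    (("Rain", "Mild", "High", "Weak"), "Yes"),
    (("Rain", "Cool", "Normal", "Weak"), "Yes"),
    (("Rain", "Cool", "Normal", "Strong"), "No"),
    (("Overcast", "Cool", "Normal", "Strong"), "Yes"),
    (("Sunny", "Mild", "High", "Weak"), "No"),
    (("Sunny", "Cool", "Normal", "Weak"), "Yes"),
    (("Rain", "Mild", "Normal", "Weak"), "Yes"),
    (("Sunny", "Mild", "Normal", "Strong"), "Yes"),
    (("Overcast", "Mild", "High", "Strong"), "Yes"),
    (("Overcast", "Hot", "Normal", "Weak"), "Yes"),
    (("Rain", "Mild", "High", "Strong"), "No"),
]
rows, labels = zip(*data)

priors, tables = train_naive_bayes(rows, labels)
label, scores = classify(("Sunny", "Cool", "High", "Strong"), priors, tables)
print(label, scores)  # 'No', with scores approx {'Yes': 0.0053, 'No': 0.0206}
```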

Example: Play Tennis
The training data consist of 14 days, each described by four attributes (Outlook, Temperature, Humidity, Wind) and labeled with Play=Yes or Play=No.

The learning phase for the tennis example
P(Play=Yes) = 9/14, P(Play=No) = 5/14.
We have four attributes; for each we calculate the conditional probability table.

Outlook      Play=Yes  Play=No
Sunny        2/9       3/5
Overcast     4/9       0/5
Rain         3/9       2/5

Temperature  Play=Yes  Play=No
Hot          2/9       2/5
Mild         4/9       2/5
Cool         3/9       1/5

Humidity     Play=Yes  Play=No
High         3/9       4/5
Normal       6/9       1/5

Wind         Play=Yes  Play=No
Strong       3/9       3/5
Weak         6/9       2/5

Formulation of a Classification Problem
Given the data shown on the previous slide, find, for a new point in the attribute space (a vector of values), the group to which it belongs, i.e., classify it.

The test phase for the tennis example
Given a new instance of the variable values, x' = (Outlook=Sunny, Temperature=Cool, Humidity=High, Wind=Strong), look up the calculated tables:
P(Outlook=Sunny|Play=Yes) = 2/9, P(Temperature=Cool|Play=Yes) = 3/9, P(Humidity=High|Play=Yes) = 3/9, P(Wind=Strong|Play=Yes) = 3/9, P(Play=Yes) = 9/14
P(Outlook=Sunny|Play=No) = 3/5, P(Temperature=Cool|Play=No) = 1/5, P(Humidity=High|Play=No) = 4/5, P(Wind=Strong|Play=No) = 3/5, P(Play=No) = 5/14
Use the MAP rule to decide Yes or No:
P(Yes|x') ∝ [P(Sunny|Yes)P(Cool|Yes)P(High|Yes)P(Strong|Yes)]P(Play=Yes) = 0.0053
P(No|x') ∝ [P(Sunny|No)P(Cool|No)P(High|No)P(Strong|No)]P(Play=No) = 0.0206
Given that P(Yes|x') < P(No|x'), we label x' as "No".

Example: software exists
The test-phase computation above need not be done by hand; ready-made implementations reproduce the same table look-up and multiplication.
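For instance, here is a minimal sketch using scikit-learn's CategoricalNB (the slides do not name a specific package; this choice, and the near-zero alpha used to approximate the unsmoothed hand computation, are assumptions):

```python
import numpy as np
from sklearn.preprocessing import OrdinalEncoder
from sklearn.naive_bayes import CategoricalNB

# Same 14-day Play-Tennis data as before.
X = np.array([
    ["Sunny", "Hot", "High", "Weak"], ["Sunny", "Hot", "High", "Strong"],
    ["Overcast", "Hot", "High", "Weak"], ["Rain", "Mild", "High", "Weak"],
    ["Rain", "Cool", "Normal", "Weak"], ["Rain", "Cool", "Normal", "Strong"],
    ["Overcast", "Cool", "Normal", "Strong"], ["Sunny", "Mild", "High", "Weak"],
    ["Sunny", "Cool", "Normal", "Weak"], ["Rain", "Mild", "Normal", "Weak"],
    ["Sunny", "Mild", "Normal", "Strong"], ["Overcast", "Mild", "High", "Strong"],
    ["Overcast", "Hot", "Normal", "Weak"], ["Rain", "Mild", "High", "Strong"],
])
y = np.array(["No", "No", "Yes", "Yes", "Yes", "No", "Yes",
              "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No"])

enc = OrdinalEncoder()            # categorical strings -> integer codes
X_enc = enc.fit_transform(X)

# alpha must be > 0; a tiny value approximates the unsmoothed estimates above.
clf = CategoricalNB(alpha=1e-10)
clf.fit(X_enc, y)

x_new = enc.transform([["Sunny", "Cool", "High", "Strong"]])
print(clf.predict(x_new))         # ['No']
print(clf.predict_proba(x_new))   # normalized posteriors, approx [0.795, 0.205] for [No, Yes]
```

Note that predict_proba returns normalized posteriors: 0.0206 / (0.0053 + 0.0206) ≈ 0.795 for "No".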

Issues Relevant to Naïve Bayes

Issues Relevant to Naïve Bayes
- Violation of the independence assumption
- The zero conditional probability problem

Issues Relevant to Naïve Bayes: First Issue
Violation of the independence assumption: for many real-world tasks the attributes (events) are correlated, so P(x1, …, xn|c) ≠ P(x1|c) × … × P(xn|c). Nevertheless, naïve Bayes works surprisingly well anyway!

Issues Relevant to Naïve Bayes: Second Issue
The zero conditional probability problem: it arises when no training example of class ci contains the attribute value ajk, so that the estimated P(ajk|ci) = 0. In this circumstance, during testing the whole product [P(x1|ci) × … × P(xn|ci)]P(ci) collapses to zero, no matter how probable the other attribute values are. As a remedy, conditional probabilities are estimated with the m-estimate:
P(ajk|ci) = (nc + m·p) / (n + m)
where n is the number of training examples with C = ci, nc is the number of those examples with Xj = ajk, p is a prior estimate (usually p = 1/t for t possible values of Xj), and m is a weight (the equivalent sample size).
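A minimal sketch of the remedy in Python, assuming the common choice m = t and p = 1/t (Laplace smoothing), so the estimate becomes (nc + 1)/(n + t):

```python
from collections import Counter

def smoothed_conditional_probs(values, labels, target_class):
    """Estimate P(X_j = v | c) with Laplace smoothing: (n_c + 1) / (n + t)."""
    domain = sorted(set(values))          # the t possible attribute values
    t = len(domain)
    in_class = [v for v, c in zip(values, labels) if c == target_class]
    n = len(in_class)
    counts = Counter(in_class)
    # Every value gets a nonzero probability, even if unseen for this class.
    return {v: (counts[v] + 1) / (n + t) for v in domain}

# Outlook=Overcast never occurs with Play=No (0/5 in the table above);
# smoothing turns the 0/5 into a small nonzero estimate, 1/8.
outlook = ["Sunny", "Sunny", "Overcast", "Rain", "Rain", "Rain", "Overcast",
           "Sunny", "Sunny", "Rain", "Sunny", "Overcast", "Overcast", "Rain"]
play = ["No", "No", "Yes", "Yes", "Yes", "No", "Yes",
        "No", "Yes", "Yes", "Yes", "Yes", "Yes", "No"]
print(smoothed_conditional_probs(outlook, play, "No"))
# {'Overcast': 0.125, 'Rain': 0.375, 'Sunny': 0.5}
```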

Another Problem: Continuous-valued Input Attributes
What to do in such a case? The attribute has an uncountable number of values, so we cannot build a probability table. The conditional probability is then modeled with the normal distribution:
P(xj|ci) = (1 / (√(2π)·σji)) · exp( −(xj − μji)² / (2·σji²) )
where μji is the mean and σji the standard deviation of attribute Xj over the training examples of class ci.
Learning Phase: for X = (X1, …, Xn) and C = c1, …, cL, estimate these parameters. Output: n × L normal distributions and P(C = ci), i = 1, …, L.
Test Phase: for a new instance X' = (X'1, …, X'n), calculate the conditional probabilities with all the normal distributions, then apply the MAP rule to make a decision.
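A minimal sketch of this Gaussian variant, with made-up temperature/humidity readings (illustrative values, not from the slides):

```python
import math
from collections import defaultdict

# Learning phase: one (mean, std) pair per attribute per class, plus priors.
def train_gaussian_nb(rows, labels):
    priors, params = {}, defaultdict(list)
    for c in set(labels):
        class_rows = [r for r, l in zip(rows, labels) if l == c]
        priors[c] = len(class_rows) / len(rows)
        for j in range(len(rows[0])):
            col = [r[j] for r in class_rows]
            mu = sum(col) / len(col)
            sigma = math.sqrt(sum((v - mu) ** 2 for v in col) / len(col))
            params[c].append((mu, sigma))
    return priors, params

def normal_pdf(x, mu, sigma):
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

# Test phase: multiply the Gaussian densities and the prior, take the MAP class.
def classify(x, priors, params):
    scores = {c: priors[c] * math.prod(normal_pdf(v, mu, s)
              for v, (mu, s) in zip(x, params[c])) for c in priors}
    return max(scores, key=scores.get)

# Hypothetical data: (temperature in F, humidity in %) for Play=Yes/No days.
rows = [(75, 70), (72, 90), (68, 80), (80, 75), (85, 85), (71, 95), (64, 65), (82, 92)]
labels = ["Yes", "No", "Yes", "Yes", "No", "No", "Yes", "No"]
priors, params = train_gaussian_nb(rows, labels)
print(classify((70, 78), priors, params))  # 'Yes'
```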

Conclusion on classifiers
- Naïve Bayes is based on the independence assumption:
  - Training is very easy and fast: it just requires considering each attribute in each class separately.
  - Testing is straightforward: just look up tables or calculate conditional probabilities with normal distributions.
- Naïve Bayes is a popular generative classifier model:
  - Its performance is competitive with most state-of-the-art classifiers, even when the independence assumption is violated.
  - It has many successful applications, e.g., spam mail filtering.
  - It is a good candidate for a base learner in ensemble learning.
  - Apart from classification, naïve Bayes can do more…

Sources: Ke Chen