COMP24111 Machine Learning
Naïve Bayes Classifier
Ke Chen

Outline

– Background
– Probability Basics
– Probabilistic Classification
– Naïve Bayes
  – Principle and Algorithms
  – Example: Play Tennis
– Relevant Issues
– Summary

Background

There are three ways to establish a classifier:
a) Model a classification rule directly. Examples: k-NN, decision trees, perceptron, SVM.
b) Model the probability of class membership given the input data. Example: perceptron with the cross-entropy cost.
c) Build a probabilistic model of the data within each class. Examples: naïve Bayes, model-based classifiers.
a) and b) are examples of discriminative classification, while c) is an example of generative classification; b) and c) are both examples of probabilistic classification.

Probability Basics

Prior, conditional and joint probability for random variables:
– Prior probability: $P(X)$
– Conditional probability: $P(X_1 \mid X_2)$, $P(X_2 \mid X_1)$
– Joint probability: $X = (X_1, X_2)$, $P(X) = P(X_1, X_2)$
– Relationship: $P(X_1, X_2) = P(X_2 \mid X_1)\,P(X_1) = P(X_1 \mid X_2)\,P(X_2)$
– Independence: $P(X_2 \mid X_1) = P(X_2)$ and $P(X_1 \mid X_2) = P(X_1)$, so $P(X_1, X_2) = P(X_1)\,P(X_2)$

Bayesian rule: $P(C \mid X) = \dfrac{P(X \mid C)\,P(C)}{P(X)}$, i.e. posterior = (likelihood × prior) / evidence. The prior and likelihood must be learnt (i.e. estimated from data).

Probability Basics (cont.)

Quiz: We have two six-sided dice. When they are rolled, each of the following events can occur: (A) die 1 lands on side "3", (B) die 2 lands on side "1", and (C) the two dice sum to eight. Compute the prior, conditional and joint probabilities of these events, e.g. P(A), P(B), P(C), P(A | B), P(C | A), P(A, B) and P(A, C).
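A worked sketch of the answers, assuming fair, independent dice and the quantities listed above:

```latex
\begin{align*}
P(A) &= \tfrac{1}{6}, \qquad P(B) = \tfrac{1}{6}\\
P(C) &= \tfrac{5}{36} \quad \text{(five outcomes: } (2,6),(3,5),(4,4),(5,3),(6,2)\text{)}\\
P(A \mid B) &= P(A) = \tfrac{1}{6} \quad \text{(the two dice are independent)}\\
P(C \mid A) &= \tfrac{1}{6} \quad \text{(given die 1 shows 3, die 2 must show 5)}\\
P(A, B) &= P(A)\,P(B) = \tfrac{1}{36}\\
P(A, C) &= P(C \mid A)\,P(A) = \tfrac{1}{36} \neq P(A)\,P(C) \quad \text{(so } A \text{ and } C \text{ are dependent)}
\end{align*}
```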

Probabilistic Classification

Establishing a probabilistic model for classification:
– Discriminative model: model the posterior $P(C \mid X)$ directly, where $C = c_1, \ldots, c_L$ and $X = (X_1, \ldots, X_n)$. A single discriminative probabilistic classifier maps an input $x$ to the posterior probabilities $P(c_1 \mid x), \ldots, P(c_L \mid x)$.

Probabilistic Classification (cont.)

Establishing a probabilistic model for classification (cont.):
– Generative model: model the class-conditional distribution $P(X \mid C)$ for each class, where $C = c_1, \ldots, c_L$ and $X = (X_1, \ldots, X_n)$. One generative probabilistic model is built per class (Class 1, Class 2, ..., Class L), each producing $P(x \mid c_i)$ for an input $x$.

Probabilistic Classification (cont.)

MAP classification rule:
– MAP: Maximum A Posteriori
– Assign $x$ to $c^*$ if $P(C = c^* \mid X = x) > P(C = c \mid X = x)$ for all $c \neq c^*$, $c = c_1, \ldots, c_L$

Generative classification with the MAP rule:
– Apply the Bayesian rule to convert the class-conditional probabilities into posterior probabilities: $P(C \mid X) = \dfrac{P(X \mid C)\,P(C)}{P(X)} \propto P(X \mid C)\,P(C)$ (the evidence $P(X)$ is the same for all classes and can be dropped)
– Then apply the MAP rule
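A minimal sketch of this decision rule (not from the original slides; the priors and likelihood functions below are illustrative assumptions):

```python
# Generative classification with the MAP rule: since P(x) is constant
# across classes, the decision reduces to c* = argmax_c P(x | c) P(c).

def map_classify(x, priors, likelihoods):
    """priors: dict class -> P(c); likelihoods: dict class -> function computing P(x | c)."""
    return max(priors, key=lambda c: likelihoods[c](x) * priors[c])

# Hypothetical two-class example with hand-picked numbers:
priors = {"yes": 0.6, "no": 0.4}
likelihoods = {"yes": lambda x: 0.2 if x == "sunny" else 0.8,
               "no":  lambda x: 0.7 if x == "sunny" else 0.3}
print(map_classify("sunny", priors, likelihoods))  # -> "no", since 0.7*0.4 > 0.2*0.6
```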

Naïve Bayes

Bayes classification: $P(C \mid X) \propto P(X \mid C)\,P(C) = P(X_1, \ldots, X_n \mid C)\,P(C)$
– Difficulty: learning the joint probability $P(X_1, \ldots, X_n \mid C)$ is infeasible.

Naïve Bayes classification:
– Assumption: all input features are conditionally independent given the class, so
  $P(X_1, \ldots, X_n \mid C) = P(X_1 \mid C) \cdots P(X_n \mid C)$
– MAP classification rule: assign $x = (x_1, \ldots, x_n)$ to $c^*$ if
  $[P(x_1 \mid c^*) \cdots P(x_n \mid c^*)]\,P(c^*) > [P(x_1 \mid c) \cdots P(x_n \mid c)]\,P(c)$ for all $c \neq c^*$, $c = c_1, \ldots, c_L$

Naïve Bayes (cont.)

Algorithm: Discrete-Valued Features
– Learning Phase: for each class $c_i$ and each feature value $x_{jk}$, estimate $P(C = c_i)$ and $P(X_j = x_{jk} \mid C = c_i)$ from the training examples by frequency counts; output the estimates as lookup tables.
– Test Phase: given an unknown instance $x' = (a_1, \ldots, a_n)$, look up the tables and assign the label $c^*$ that maximises $[P(a_1 \mid c^*) \cdots P(a_n \mid c^*)]\,P(c^*)$.

Example

Example: Play Tennis (the standard 14-day training set with attributes Day, Outlook, Temperature, Humidity, Wind and class label PlayTennis)

Example (cont.)

Learning Phase

P(Play=Yes) = 9/14    P(Play=No) = 5/14

Outlook      Play=Yes   Play=No
Sunny           2/9       3/5
Overcast        4/9       0/5
Rain            3/9       2/5

Temperature  Play=Yes   Play=No
Hot             2/9       2/5
Mild            4/9       2/5
Cool            3/9       1/5

Humidity     Play=Yes   Play=No
High            3/9       4/5
Normal          6/9       1/5

Wind         Play=Yes   Play=No
Strong          3/9       3/5
Weak            6/9       2/5

Example (cont.)

Test Phase
– Given a new instance, predict its label:
  x' = (Outlook=Sunny, Temperature=Cool, Humidity=High, Wind=Strong)
– Look up the tables obtained in the learning phase:
  P(Outlook=Sunny | Play=Yes) = 2/9        P(Outlook=Sunny | Play=No) = 3/5
  P(Temperature=Cool | Play=Yes) = 3/9     P(Temperature=Cool | Play=No) = 1/5
  P(Humidity=High | Play=Yes) = 3/9        P(Humidity=High | Play=No) = 4/5
  P(Wind=Strong | Play=Yes) = 3/9          P(Wind=Strong | Play=No) = 3/5
  P(Play=Yes) = 9/14                       P(Play=No) = 5/14
– Decision making with the MAP rule:
  P(Yes | x') ∝ [P(Sunny|Yes) P(Cool|Yes) P(High|Yes) P(Strong|Yes)] P(Play=Yes) ≈ 0.0053
  P(No | x') ∝ [P(Sunny|No) P(Cool|No) P(High|No) P(Strong|No)] P(Play=No) ≈ 0.0206
  Since P(Yes | x') < P(No | x'), we label x' as "No".
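A minimal Python sketch of the test-phase computation above; the table values are copied from the learning phase, and exact fractions are used to avoid rounding:

```python
from fractions import Fraction as F

# Conditional probability tables from the learning phase,
# restricted to the feature values of the test instance x'.
cond = {
    "Yes": [F(2, 9), F(3, 9), F(3, 9), F(3, 9)],  # Sunny, Cool, High, Strong
    "No":  [F(3, 5), F(1, 5), F(4, 5), F(3, 5)],
}
prior = {"Yes": F(9, 14), "No": F(5, 14)}

def posterior_score(label):
    # Unnormalised posterior: product of the conditionals times the prior.
    score = prior[label]
    for p in cond[label]:
        score *= p
    return score

scores = {c: posterior_score(c) for c in prior}
print({c: float(s) for c, s in scores.items()})  # ~{'Yes': 0.0053, 'No': 0.0206}
print(max(scores, key=scores.get))               # -> 'No'
```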

Naïve Bayes (cont.)

Algorithm: Continuous-Valued Features
– A continuous feature takes numberless (uncountably many) values, so frequency tables no longer apply.
– The conditional probability is often modeled with the normal (Gaussian) distribution:
  $\hat{P}(X_j \mid C = c_i) = \frac{1}{\sqrt{2\pi}\,\sigma_{ji}} \exp\!\left(-\frac{(X_j - \mu_{ji})^2}{2\sigma_{ji}^2}\right)$,
  where $\mu_{ji}$ is the mean and $\sigma_{ji}$ the standard deviation of feature $X_j$ over the examples of class $c_i$.
– Learning Phase: for $X = (X_1, \ldots, X_n)$ and $C = c_1, \ldots, c_L$, estimate the mean and standard deviation of each feature in each class. Output: $n \times L$ normal distributions and $P(C = c_i)$, $i = 1, \ldots, L$.
– Test Phase: given an unknown instance $x' = (a_1, \ldots, a_n)$, instead of looking up tables, calculate the conditional probabilities with the normal distributions obtained in the learning phase, then apply the MAP rule to make a decision.
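A self-contained sketch of this algorithm under the stated assumptions (pure Python, no library API assumed; the unbiased sample variance is one reasonable choice of estimator):

```python
import math
from collections import defaultdict

def gaussian_pdf(x, mu, sigma):
    # Normal density N(x; mu, sigma^2).
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (math.sqrt(2 * math.pi) * sigma)

def fit(X, y):
    """Learning phase: per class, a prior and a (mean, std) pair per feature."""
    by_class = defaultdict(list)
    for xi, yi in zip(X, y):
        by_class[yi].append(xi)
    model = {}
    for c, rows in by_class.items():
        prior = len(rows) / len(X)
        stats = []
        for j in range(len(rows[0])):
            vals = [r[j] for r in rows]
            mu = sum(vals) / len(vals)
            var = sum((v - mu) ** 2 for v in vals) / (len(vals) - 1)  # unbiased estimate
            stats.append((mu, math.sqrt(var)))
        model[c] = (prior, stats)
    return model

def predict(model, x):
    """Test phase: MAP rule with Gaussian class-conditional densities."""
    def score(c):
        prior, stats = model[c]
        s = prior
        for xj, (mu, sigma) in zip(x, stats):
            s *= gaussian_pdf(xj, mu, sigma)
        return s
    return max(model, key=score)
```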

Naïve Bayes (cont.)

Example: Continuous-Valued Features
– Temperature is naturally continuous-valued.
  Yes: 25.2, 19.3, 18.5, 21.7, 20.1, 24.3, 22.8, 23.1, 19.8
  No: 27.3, 30.1, 17.4, 29.5, 15.1
– Estimate the mean and variance for each class:
  $\hat{\mu} = \frac{1}{N}\sum_{n=1}^{N} x_n$, $\quad \hat{\sigma}^2 = \frac{1}{N-1}\sum_{n=1}^{N} (x_n - \hat{\mu})^2$
– Learning Phase: output two Gaussian models for P(temp | C).
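Working these estimates out from the data above (using the unbiased sample variance, which divides by N−1):

```latex
\begin{align*}
\text{Yes:}\quad \hat{\mu} = \tfrac{194.8}{9} \approx 21.64,\ \hat{\sigma} \approx 2.35
  \;\Rightarrow\; \hat{P}(\text{temp} \mid \text{Yes}) = \tfrac{1}{\sqrt{2\pi}\,(2.35)}
  \exp\!\left(-\tfrac{(\text{temp} - 21.64)^2}{2(2.35)^2}\right)\\
\text{No:}\quad \hat{\mu} = \tfrac{119.4}{5} = 23.88,\ \hat{\sigma} \approx 7.09
  \;\Rightarrow\; \hat{P}(\text{temp} \mid \text{No}) = \tfrac{1}{\sqrt{2\pi}\,(7.09)}
  \exp\!\left(-\tfrac{(\text{temp} - 23.88)^2}{2(7.09)^2}\right)
\end{align*}
```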

Relevant Issues

Violation of the independence assumption
– For many real-world tasks, $P(X_1, \ldots, X_n \mid C) \neq P(X_1 \mid C) \cdots P(X_n \mid C)$.
– Nevertheless, naïve Bayes works surprisingly well anyway!

Zero conditional probability problem
– If no training example of class $c_i$ contains the feature value $x_{jk}$, then $\hat{P}(X_j = x_{jk} \mid C = c_i) = 0$.
– In this circumstance, the whole product $\hat{P}(x_1 \mid c_i) \cdots \hat{P}(x_{jk} \mid c_i) \cdots \hat{P}(x_n \mid c_i) = 0$ during test, no matter how likely the other feature values are.
– As a remedy, conditional probabilities are re-estimated with the m-estimate:
  $\hat{P}(X_j = x_{jk} \mid C = c_i) = \dfrac{n_c + m p}{n + m}$,
  where $n$ is the number of training examples with $C = c_i$, $n_c$ the number of those that also have $X_j = x_{jk}$, $p$ a prior estimate (usually uniform, $p = 1/t$ for $t$ possible values of $X_j$), and $m$ a weight ("equivalent sample size").
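A small sketch of the m-estimate; the helper name and example numbers below are illustrative, not from the slides:

```python
def m_estimate(n_c, n, m=1.0, p=None, t=None):
    """Smoothed conditional probability (n_c + m*p) / (n + m).

    n_c: count of class examples with the given feature value
    n:   count of examples in the class
    p:   prior estimate; defaults to uniform over t possible feature values
    m:   equivalent sample size (m=t with p=1/t gives Laplace add-one smoothing)
    """
    if p is None:
        p = 1.0 / t
    return (n_c + m * p) / (n + m)

# E.g. no "Overcast" day in class Play=No (0 of 5, with 3 possible Outlook values):
print(m_estimate(0, 5, m=3.0, t=3))  # -> 0.125 instead of an exact zero
```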

Summary

Naïve Bayes: based on the conditional independence assumption
– Training is very easy and fast: it only requires considering each attribute in each class separately.
– Testing is straightforward: just look up tables, or calculate conditional probabilities with the estimated distributions.

A popular generative model
– Performance is competitive with most state-of-the-art classifiers, even when the independence assumption is violated.
– Many successful applications, e.g. spam mail filtering.
– A good candidate as a base learner in ensemble learning.
– Apart from classification, naïve Bayes can do more…