Naive Bayesian Classification


Naive Bayesian Classification. Abel Sanchez, John R. Williams

Stunningly Simple The mathematics of Bayes' theorem is stunningly simple. In its most basic form, it is just an equation with three known variables and one unknown. This simple formula can lead to surprising predictive insights.

Bayes and Laplace The intimate connection between probability, prediction, and scientific progress was thus well understood by Bayes and Laplace in the eighteenth century, the period when human societies were beginning to take the explosion of information that had become available with the invention of the printing press several centuries earlier and finally translate it into sustained scientific, technological, and economic progress.

Conditional Probability Bayes' theorem is concerned with conditional probability. That is, it tells us the probability that a hypothesis is true given that some event has happened.

Bayes' Theorem

P(A|B) = P(B|A) × P(A) / P(B)

P(A) and P(B) are the probabilities of A and B independent of each other. P(A|B), a conditional probability, is the probability of A given that B is true. P(B|A) is the probability of B given that A is true.
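When P(B) is not known directly, it can be expanded over the two cases A and not-A (written ¬A) using the law of total probability; this expanded form is the one the underwear example below relies on:

P(A|B) = P(B|A) × P(A) / (P(B|A) × P(A) + P(B|¬A) × P(¬A))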

Example

Probability that your partner is cheating on you, given an event. Event: you come home from a business trip to discover a strange pair of underwear. Condition: you have found the underwear. Hypothesis: you are being cheated on.

p(u|c) - the probability of underwear u given cheating c. The probability of the underwear appearing, conditional on his cheating: 50%.

p(u|¬c) - the probability of the underwear u appearing if there is NO cheating. The probability of the underwear appearing, conditional on the hypothesis being false: 5%.

p(c) - the prior probability of cheating c. What probability would you have assigned to him cheating on you before you found the underwear? 4%.

Underwear Example*

p(c|u) = p(u|c) × p(c) / (p(u|c) × p(c) + p(u|¬c) × p(¬c))

p(c|u) is the probability of cheating c given underwear u. p(u|c) is the probability of underwear u given cheating c. p(c) is the prior probability of cheating c. The denominator is the total probability of the underwear u appearing, with or without cheating. * The Signal and the Noise: Why So Many Predictions Fail--but Some Don't, Nate Silver, 2012

Active Learning

Active Learning – Calculate Cheating Probability. Compute p(c|u), the probability of cheating c given underwear u, from: p(u|c), the probability of underwear u given cheating c: 50%. p(c), the prior probability of cheating c: 4%. p(u|¬c), the probability of the underwear u appearing if there is NO cheating: 5%.
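A minimal JavaScript sketch of this calculation (the code and names are illustrative, not from the original slides):

// Posterior probability of cheating given the underwear, via Bayes' theorem.
// pUGivenC = p(u|c), pC = p(c), pUGivenNotC = p(u|¬c)
function cheatingPosterior(pUGivenC, pC, pUGivenNotC) {
  const numerator = pUGivenC * pC;                     // p(u|c) * p(c)
  const evidence = numerator + pUGivenNotC * (1 - pC); // total probability of u
  return numerator / evidence;
}

console.log(cheatingPosterior(0.50, 0.04, 0.05)); // ≈ 0.294, about a 29% chance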

Classification of Drew

Example: classification of Drew. We have two classes: c1 = male and c2 = female. Classifying Drew as male or female is equivalent to asking: is it more probable that Drew is male or female? To answer, we need: the probability of being named "drew" given that you are male; the probability of being male; and the probability of being named "drew".

Using Data

Name     Gender
Drew     Male
Claudia  Female
Drew     Female
Drew     Female
Alberto  Male
Karin    Female
Nina     Female
Sergio   Male

p(male|drew) = p(drew|male) × p(male) / p(drew) = (1/3 × 3/8) / (3/8) = 0.125 / (3/8)
p(female|drew) = p(drew|female) × p(female) / p(drew) = (2/5 × 5/8) / (3/8) = 0.250 / (3/8)

The denominator p(drew) = 3/8 is the same for both classes, so we compare numerators: 0.250 > 0.125, and Drew is classified as female.
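A quick JavaScript check of the comparison (illustrative; since the shared denominator p(drew) cancels, only the numerators matter):

// Class scores for the name "drew": p(name|class) * p(class)
const scoreMale = (1 / 3) * (3 / 8);   // p(drew|male) * p(male) = 0.125
const scoreFemale = (2 / 5) * (5 / 8); // p(drew|female) * p(female) = 0.250
console.log(scoreFemale > scoreMale ? "female" : "male"); // "female"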

Bayesian Approach The posterior probability is based on the prior probability, updated by a new event.

Classification of Documents

Questions We Can Answer Is this spam? Who wrote which Federalist Papers? Is a movie review positive or negative? What is the subject of this article?

Text Classification Assigning subject categories, topics, or genres; authorship identification; age/gender identification; language identification; sentiment analysis; …

Bayes' Theorem For a document d and a class c:

p(c|d) = p(d|c) × p(c) / p(d)

p(c|d), a conditional probability, is the probability of class c given document d. p(d|c) is the probability of document d given class c. p(c) is the probability of the class c. p(d) is the probability of the document d.

The probability of a word given a class, with add-one smoothing (as used in the example below):

p(word|class) = (count of the word occurring in that class + 1) / (count of all words in that class + size of the vocabulary)

where the vocabulary is the set of unique instances of words. The probability of the class:

p(class) = number of documents with that class / total number of documents

Data

Doc  Words                                Class
Training:
1    chinese beijing chinese              c
2    chinese chinese shanghai             c
3    chinese macao                        c
4    tokyo japan chinese                  j
Test:
5    chinese chinese chinese tokyo japan  ?

Priors:
p(c) = 3/4
p(j) = 1/4

Conditional probabilities (add-one smoothing; class c has 8 words, class j has 3, and the vocabulary has 6 unique words):
p(chinese|c) = (5+1)/(8+6) = 6/14 = 3/7
p(tokyo|c) = (0+1)/(8+6) = 1/14
p(japan|c) = (0+1)/(8+6) = 1/14
p(chinese|j) = (1+1)/(3+6) = 2/9
p(tokyo|j) = (1+1)/(3+6) = 2/9
p(japan|j) = (1+1)/(3+6) = 2/9

Choosing a class (category):
p(c|d5) ∝ (3/4) × (3/7) × (3/7) × (3/7) × (1/14) × (1/14) ≈ 0.0003
p(j|d5) ∝ (1/4) × (2/9) × (2/9) × (2/9) × (2/9) × (2/9) ≈ 0.0001

Since 0.0003 > 0.0001, document 5 is assigned to class c.
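A compact JavaScript sketch reproducing this worked example (not from the original slides; names such as trainNaiveBayes are illustrative):

// Multinomial Naive Bayes with add-one smoothing.
const training = [
  { words: ["chinese", "beijing", "chinese"], cls: "c" },
  { words: ["chinese", "chinese", "shanghai"], cls: "c" },
  { words: ["chinese", "macao"], cls: "c" },
  { words: ["tokyo", "japan", "chinese"], cls: "j" },
];

function trainNaiveBayes(docs) {
  const vocab = new Set();
  const wordCounts = {};  // per class: word -> count
  const totalWords = {};  // per class: total word count
  const docCounts = {};   // per class: number of documents
  for (const { words, cls } of docs) {
    docCounts[cls] = (docCounts[cls] || 0) + 1;
    wordCounts[cls] = wordCounts[cls] || {};
    for (const w of words) {
      vocab.add(w);
      wordCounts[cls][w] = (wordCounts[cls][w] || 0) + 1;
      totalWords[cls] = (totalWords[cls] || 0) + 1;
    }
  }
  return { vocab, wordCounts, totalWords, docCounts, numDocs: docs.length };
}

function score(model, words, cls) {
  // Prior times the product of smoothed word probabilities.
  let p = model.docCounts[cls] / model.numDocs;
  for (const w of words) {
    const count = model.wordCounts[cls][w] || 0;
    p *= (count + 1) / (model.totalWords[cls] + model.vocab.size);
  }
  return p;
}

const model = trainNaiveBayes(training);
const test = ["chinese", "chinese", "chinese", "tokyo", "japan"];
console.log(score(model, test, "c")); // ≈ 0.0003
console.log(score(model, test, "j")); // ≈ 0.0001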

For homework we will use*:

p(language|word) = p(word in language) / (p(word in language) + p(word not in language))

That is, the probability of the language given the word is the probability that the word is in the language, divided by the sum of the probability that the word is in the language and the probability that the word is not in the language. * http://en.wikipedia.org/wiki/Naive_Bayes_spam_filtering

Calculating Probabilities
// probability that word shows up in a language
// probability that word is not in language
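A minimal sketch of how these two quantities might be computed, assuming per-language word counts are available (all names here are illustrative, not the original slide code):

// probability that word shows up in a language:
// occurrences of the word in that language's corpus / total words in that corpus
function probWordInLanguage(word, counts, totalWords) {
  return (counts[word] || 0) / totalWords;
}

// probability that word is not in language:
// occurrences of the word in the other languages / total words in those corpora
function probWordNotInLanguage(word, otherCounts, otherTotalWords) {
  return (otherCounts[word] || 0) / otherTotalWords;
}

// Combined with the homework formula:
// p(language|word) = pIn / (pIn + pNotIn)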

Underflow Prevention Multiplying many probabilities can result in floating-point underflow. Since log(xy) = log(x) + log(y), it is better to sum the logs of the probabilities instead of multiplying the probabilities. Add the probability of each word (per language) using:

sum = sum + ln(p(word))

In JavaScript, ln is Math.log and e^x is Math.exp. At the completion of each language:

p = e^sum
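A short JavaScript sketch of this log-space accumulation (illustrative):

// Sum log-probabilities to avoid underflow; exponentiate once at the end.
function productViaLogs(probabilities) {
  let sum = 0;
  for (const p of probabilities) {
    sum += Math.log(p); // ln is Math.log
  }
  return Math.exp(sum); // e^sum recovers the product
}

// The class-c score from the worked example above:
console.log(productViaLogs([3 / 4, 3 / 7, 3 / 7, 3 / 7, 1 / 14, 1 / 14])); // ≈ 0.0003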