Classify it

Presentation transcript:

Final figure: [Bayesian network for the weather data: play, outlook, temp, humidity, windy]
Classify it: the test instance outlook=sunny, temp=cool, humidity=high, windy=true.

Classification I: Classify it
P(play=yes | outlook=sunny, temp=cool, humidity=high, windy=true)
= α * P(play=yes)
    * P(outlook=sunny | play=yes)
    * P(temp=cool | play=yes, outlook=sunny)
    * P(humidity=high | play=yes, temp=cool)
    * P(windy=true | play=yes, outlook=sunny)
= α * 0.625 * 0.25 * 0.4 * 0.2 * 0.5
= α * 0.00625

Classification II: Classify it
P(play=no | outlook=sunny, temp=cool, humidity=high, windy=true)
= α * P(play=no)
    * P(outlook=sunny | play=no)
    * P(temp=cool | play=no, outlook=sunny)
    * P(humidity=high | play=no, temp=cool)
    * P(windy=true | play=no, outlook=sunny)
= α * 0.375 * 0.5 * 0.167 * 0.333 * 0.4
= α * 0.00417
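To make the arithmetic easy to check, here is a minimal Python sketch (not part of the original slides) that recomputes the two unnormalized scores from Classification I and II using the CPT values quoted above; α is left out because it cancels once the scores are normalized.

```python
from math import prod

# Each factor is one CPT entry quoted on the slides.
score_yes = prod([0.625,   # P(play=yes)
                  0.25,    # P(outlook=sunny | play=yes)
                  0.4,     # P(temp=cool | play=yes, outlook=sunny)
                  0.2,     # P(humidity=high | play=yes, temp=cool)
                  0.5])    # P(windy=true | play=yes, outlook=sunny)

score_no = prod([0.375,    # P(play=no)
                 0.5,      # P(outlook=sunny | play=no)
                 0.167,    # P(temp=cool | play=no, outlook=sunny)
                 0.333,    # P(humidity=high | play=no, temp=cool)
                 0.4])     # P(windy=true | play=no, outlook=sunny)

print(score_yes, score_no)  # ~0.00625, ~0.00417
```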

Classification III: Classify it
P(play=yes | outlook=sunny, temp=cool, humidity=high, windy=true) = α * 0.00625
P(play=no | outlook=sunny, temp=cool, humidity=high, windy=true) = α * 0.00417
α = 1 / (0.00625 + 0.00417) = 95.969
P(play=yes | outlook=sunny, temp=cool, humidity=high, windy=true) = 95.969 * 0.00625 = 0.60
P(play=no | outlook=sunny, temp=cool, humidity=high, windy=true) = 95.969 * 0.00417 = 0.40
So for this instance we predict 'play=yes'.
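Continuing the same illustrative sketch, the normalization step on this slide can be reproduced as follows (the two scores are re-entered so the snippet stands on its own):

```python
# Unnormalized scores from Classification I and II.
score_yes, score_no = 0.00625, 0.00417

alpha = 1 / (score_yes + score_no)   # normalization constant, ~95.97
print(alpha * score_yes)             # P(play=yes | evidence) ~= 0.60
print(alpha * score_no)              # P(play=no  | evidence) ~= 0.40
```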

Classification IV (missing values or hidden variables)
P(play=yes | temp=cool, humidity=high, windy=true)
= α * Σ_outlook [ P(play=yes)
    * P(outlook | play=yes)
    * P(temp=cool | play=yes, outlook)
    * P(humidity=high | play=yes, temp=cool)
    * P(windy=true | play=yes, outlook) ]
= ... (next slide)

Classification V (missing values or hidden variables)
P(play=yes | temp=cool, humidity=high, windy=true)
= α * Σ_outlook P(play=yes) * P(outlook | play=yes) * P(temp=cool | play=yes, outlook) * P(humidity=high | play=yes, temp=cool) * P(windy=true | play=yes, outlook)
= α * [ P(play=yes) * P(outlook=sunny | play=yes) * P(temp=cool | play=yes, outlook=sunny) * P(humidity=high | play=yes, temp=cool) * P(windy=true | play=yes, outlook=sunny)
      + P(play=yes) * P(outlook=overcast | play=yes) * P(temp=cool | play=yes, outlook=overcast) * P(humidity=high | play=yes, temp=cool) * P(windy=true | play=yes, outlook=overcast)
      + P(play=yes) * P(outlook=rainy | play=yes) * P(temp=cool | play=yes, outlook=rainy) * P(humidity=high | play=yes, temp=cool) * P(windy=true | play=yes, outlook=rainy) ]
= α * [ 0.625 * 0.25 * 0.4 * 0.2 * 0.5
      + 0.625 * 0.417 * 0.286 * 0.2 * 0.5
      + 0.625 * 0.33 * 0.333 * 0.2 * 0.2 ]
= α * 0.01645

Classification VI (missing values or hidden variables)
P(play=no | temp=cool, humidity=high, windy=true)
= α * Σ_outlook P(play=no) * P(outlook | play=no) * P(temp=cool | play=no, outlook) * P(humidity=high | play=no, temp=cool) * P(windy=true | play=no, outlook)
= α * [ P(play=no) * P(outlook=sunny | play=no) * P(temp=cool | play=no, outlook=sunny) * P(humidity=high | play=no, temp=cool) * P(windy=true | play=no, outlook=sunny)
      + P(play=no) * P(outlook=overcast | play=no) * P(temp=cool | play=no, outlook=overcast) * P(humidity=high | play=no, temp=cool) * P(windy=true | play=no, outlook=overcast)
      + P(play=no) * P(outlook=rainy | play=no) * P(temp=cool | play=no, outlook=rainy) * P(humidity=high | play=no, temp=cool) * P(windy=true | play=no, outlook=rainy) ]
= α * [ 0.375 * 0.5 * 0.167 * 0.333 * 0.4
      + 0.375 * 0.125 * 0.333 * 0.333 * 0.5
      + 0.375 * 0.375 * 0.4 * 0.333 * 0.75 ]
= α * 0.0208
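The missing-value case sums the hidden variable outlook out of the joint. The sketch below, again using only the CPT values quoted on Classification V and VI, reproduces the two marginalized scores:

```python
from math import prod

# Sum out the hidden variable 'outlook' (sunny, overcast, rainy).
# Each inner list holds the five CPT entries quoted on the slides for one outlook value.
terms_yes = [
    [0.625, 0.25,  0.4,   0.2, 0.5],    # outlook = sunny
    [0.625, 0.417, 0.286, 0.2, 0.5],    # outlook = overcast
    [0.625, 0.33,  0.333, 0.2, 0.2],    # outlook = rainy
]
terms_no = [
    [0.375, 0.5,   0.167, 0.333, 0.4],   # outlook = sunny
    [0.375, 0.125, 0.333, 0.333, 0.5],   # outlook = overcast
    [0.375, 0.375, 0.4,   0.333, 0.75],  # outlook = rainy
]

score_yes = sum(prod(t) for t in terms_yes)  # ~0.01645
score_no  = sum(prod(t) for t in terms_no)   # ~0.0208
print(score_yes, score_no)
```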

Classification VII (missing values or hidden variables)
P(play=yes | temp=cool, humidity=high, windy=true) = α * 0.01645
P(play=no | temp=cool, humidity=high, windy=true) = α * 0.0208
α = 1 / (0.01645 + 0.0208) = 26.846
P(play=yes | temp=cool, humidity=high, windy=true) = 26.846 * 0.01645 = 0.44
P(play=no | temp=cool, humidity=high, windy=true) = 26.846 * 0.0208 = 0.56
That is, P(play=yes | temp=cool, humidity=high, windy=true) is 44% and P(play=no | temp=cool, humidity=high, windy=true) is 56%.
So, we predict 'play=no'.
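Finally, the normalization and decision on this last slide, as an illustrative sketch rather than slide material:

```python
# Marginalized scores from Classification V and VI.
score_yes, score_no = 0.01645, 0.0208

alpha = 1 / (score_yes + score_no)       # ~26.85
p_yes, p_no = alpha * score_yes, alpha * score_no
print(round(p_yes, 2), round(p_no, 2))   # 0.44, 0.56
print("predict play=no" if p_no > p_yes else "predict play=yes")
```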