Intro to Bayesian Learning Exercise Solutions Ata Kaban The University of Birmingham.


You are to be tested for a disease that has a prevalence in the population of 1 in 1000. The lab test used is not always perfect: it has a false-positive rate of 1%. [A false-positive result is when the test is positive although the disease is not present.] The false-negative rate of the test is zero. [A false negative is when the test result is negative while in fact the disease is present.]
a) If you are tested and you get a positive result, what is the probability that you actually have the disease?
b) Under the conditions in the previous question, is it more probable that you have the disease or that you don't?
c) Would the answers to a) and/or b) differ if you used a maximum likelihood rather than a maximum a posteriori hypothesis estimation method? Comment on your answer.

ANSWER a) We have two binary variables: A is the outcome of the test, and B is the presence/absence of the disease. We need to compute P(B=1|A=1). We use Bayes' theorem:
P(B=1|A=1) = P(A=1|B=1) P(B=1) / P(A=1), where P(A=1) = P(A=1|B=1) P(B=1) + P(A=1|B=0) P(B=0).
The required quantities are known from the problem:
- P(A=1|B=1) = 1, i.e. the true-positive rate (zero false negatives)
- P(B=1) = 1/1000, i.e. the prevalence
- P(A=1|B=0) = 0.01, i.e. the false-positive rate
- P(B=0) = 1 - 1/1000 = 0.999
Replacing, we have:
P(B=1|A=1) = (1 × 0.001) / (1 × 0.001 + 0.01 × 0.999) = 0.001 / 0.01099 ≈ 0.091
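As a quick numerical check of the calculation above, here is a minimal sketch in plain Python (variable names are my own, not from the slides); it plugs the given quantities into Bayes' theorem and also prints the complement needed in part b):

# Quantities given in the exercise
p_disease = 1 / 1000           # P(B=1), prevalence
p_healthy = 1 - p_disease      # P(B=0)
p_pos_given_disease = 1.0      # P(A=1|B=1), since the false-negative rate is zero
p_pos_given_healthy = 0.01     # P(A=1|B=0), the false-positive rate

# Total probability of a positive test, P(A=1)
p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * p_healthy

# Bayes' theorem: P(B=1|A=1) = P(A=1|B=1) P(B=1) / P(A=1)
p_disease_given_pos = p_pos_given_disease * p_disease / p_pos

print(round(p_disease_given_pos, 3))      # 0.091
print(round(1 - p_disease_given_pos, 3))  # 0.909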

b) Under the conditions in the previous question, is it more probable that you have the disease or that you don't? ANSWER: P(B=0|A=1) = 1 - P(B=1|A=1) ≈ 1 - 0.091 = 0.909. So clearly it is more probable that the disease is not present.

c) Would the answers to a) and/or b) differ if you used a maximum likelihood rather than a maximum a posteriori hypothesis estimation method? Comment on your answer. ANSWER:
- ML maximises P(D|h) w.r.t. h, whereas MAP maximises P(h|D). MAP therefore includes prior knowledge about the hypothesis, since P(h|D) is proportional to P(D|h)*P(h). This is a good example of how important and influential prior knowledge can be.
- The answer at b) is based on the maximum a posteriori estimate, as we included prior knowledge in the form of the prevalence of the disease. If that were not taken into account, i.e. if both P(B=1)=0.5 and P(B=0)=0.5 were assumed, then the hypothesis estimate would be the maximum likelihood one. In that case the presence of the disease would come out as more probable than its absence. This is an example of how prior knowledge can influence Bayesian decisions. However, more data should be collected in order to produce a more reliable estimate.
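To illustrate the ML vs. MAP point numerically, here is a small sketch in plain Python (the function name and its defaults are my own); it recomputes the posterior for a positive test under the true prior (prevalence 1/1000), which corresponds to the MAP-style answer above, and under a flat prior P(B=1) = P(B=0) = 0.5, which corresponds to the maximum likelihood choice of hypothesis:

def posterior_disease(prior_disease, fp_rate=0.01, fn_rate=0.0):
    # Returns P(B=1|A=1): the probability of disease given a positive test result.
    p_pos_given_disease = 1.0 - fn_rate   # P(A=1|B=1)
    p_pos_given_healthy = fp_rate         # P(A=1|B=0)
    p_pos = (p_pos_given_disease * prior_disease
             + p_pos_given_healthy * (1.0 - prior_disease))
    return p_pos_given_disease * prior_disease / p_pos

print(round(posterior_disease(1 / 1000), 3))  # 0.091 -> absence of disease more probable
print(round(posterior_disease(0.5), 3))       # 0.99  -> presence of disease more probable

With the flat prior the ranking of the two hypotheses flips, which is exactly the difference between the MAP and ML answers discussed above.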