Bayes' theorem p(A|B) = p(B|A) p(A) / p(B)


Bayes' theorem: p(A|B) = p(B|A) p(A) / p(B)
In general:
- p(A|B) (usually read 'probability of A given B') = the probability of outcome A, given that evidence B is present
- p(A) = the probability of the outcome occurring, without knowledge of the new evidence
- p(B) = the probability of the evidence arising, without regard for the outcome
- p(B|A) = the probability of the evidence turning up, given that the outcome holds
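The formula can be written as a one-line function (a minimal sketch; the function and argument names are ours, not from the slides):

```python
def bayes(p_b_given_a, p_a, p_b):
    """Posterior p(A|B) computed via Bayes' theorem: p(B|A) * p(A) / p(B)."""
    return p_b_given_a * p_a / p_b


# Example call with the race data used later in the deck:
posterior = bayes(p_b_given_a=0.6, p_a=5 / 12, p_b=4 / 12)
print(posterior)  # ≈ 0.75
```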

Video: also mentions the base rate fallacy (try not to laugh at "hypothesitis").

An example
There is a race between two horses, Fleetfoot and Dogmeat. They have raced against each other on 12 previous occasions: Dogmeat won 5 and Fleetfoot won 7. Therefore, the estimated probability of Dogmeat winning the next race is 5/12 = 0.417 = 41.7%; Fleetfoot, on the other hand, appears to be the better bet at 7/12 = 58.3%.
BUT: it had rained heavily before 3 of Dogmeat's 5 previous wins, whereas it had rained before only 1 of his 7 losses. On the day of the race in question, it is raining. Which horse should I bet on?
The 4 possible situations:

                 Raining   Not raining
  Dogmeat wins      3          2
  Dogmeat loses     1          6

Source: http://www.kevinboone.net/bayes.html
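The contingency table can be encoded as counts, from which the marginal probabilities used on the next slide fall out directly (a sketch; the variable names are ours):

```python
# Outcomes of the 12 previous races: (outcome, weather) -> number of races.
counts = {
    ("win", "rain"): 3, ("win", "dry"): 2,
    ("lose", "rain"): 1, ("lose", "dry"): 6,
}
total = sum(counts.values())  # 12 races in all

# Marginals: Dogmeat's overall win rate and the overall rate of rainy race days.
p_win = sum(v for (o, w), v in counts.items() if o == "win") / total    # 5/12
p_rain = sum(v for (o, w), v in counts.items() if w == "rain") / total  # 4/12

# Likelihood: probability of rain restricted to the races Dogmeat won.
wins = counts[("win", "rain")] + counts[("win", "dry")]
p_rain_given_win = counts[("win", "rain")] / wins                       # 3/5

print(p_win, p_rain, p_rain_given_win)
```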

So, Bayes' theorem in our example: p(A|B) = p(B|A) p(A) / p(B)
'A' = the outcome in which Dogmeat wins
'B' = the piece of evidence that it is raining
p(A|B) = the probability of Dogmeat winning given that it is raining = what we want to find out
p(B|A) = the probability that it is raining, given that Dogmeat wins = 3/5 = 0.6 (since it was raining on 3 of the 5 occasions that Dogmeat won)
p(A) = 5/12 = 0.417 (because Dogmeat won on 5 of the 12 occasions)
p(B) = 4/12 = 0.333 (since we know it rained on 4 days in total)
Therefore, p(A|B) = p(B|A) * p(A) / p(B) = 0.6 * 0.417 / 0.333 = 0.75
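The calculation above can be checked numerically (a minimal sketch; variable names are ours):

```python
# Posterior probability that Dogmeat wins, given that it is raining.
p_rain_given_win = 3 / 5   # it rained on 3 of Dogmeat's 5 wins
p_win = 5 / 12             # Dogmeat won 5 of the 12 races
p_rain = 4 / 12            # it rained on 4 of the 12 race days

p_win_given_rain = p_rain_given_win * p_win / p_rain
print(p_win_given_rain)  # ≈ 0.75
```

So despite Dogmeat's worse overall record (5/12), the rain raises his win probability to 3/4, and he becomes the better bet. Note this agrees with reading the table directly: of the 4 rainy race days, Dogmeat won 3.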

Patten et al. (2017)

Friston et al. (2014)

The brain as a phantastic organ Friston et al. (2014)