
6.3 Bayes' Theorem

We can use Bayes' Theorem when we know some conditional probabilities but wish to know others. For example: we know P(test positive | have disease), and we wish to know P(have disease | test positive).
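For reference, the standard statement of Bayes' Theorem used throughout this section, with the denominator expanded by the law of total probability (F' denotes the complement of F):

```latex
P(F \mid E) = \frac{P(E \mid F)\,P(F)}{P(E)}
            = \frac{P(E \mid F)\,P(F)}{P(E \mid F)\,P(F) + P(E \mid F')\,P(F')}
```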

Ex. 1 (book Ex. 2, p. 419) Suppose that one person in 100,000 has a particular rare disease for which there is a fairly accurate diagnostic test. This test is correct 99% of the time for someone who has the disease and 99.5% of the time for someone who does not.

Define E, F, E', F'. Let F = event one has the disease, E = event one tests positive. We know that P(F) = 1/100,000 = .00001, P(E|F) = P(positive | disease) = .99, and P(E'|F') = P(negative | doesn't have disease) = .995. Determine P(F|E) = P(has disease | tests positive) = ___ and P(F'|E') = P(does not have disease | tests negative) = ___.

Draw tree diagram starting with F, F’

Find P(F|E) and P(F'|E'). P(F|E) = ___ P(F'|E') = ___
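A worked sketch of these two answers, using Bayes' Theorem with the numbers assumed above (so P(E|F') = 1 − .995 = .005 and P(E'|F) = 1 − .99 = .01):

```latex
P(F \mid E)   = \frac{(.99)(.00001)}{(.99)(.00001) + (.005)(.99999)} \approx .002

P(F' \mid E') = \frac{(.995)(.99999)}{(.995)(.99999) + (.01)(.00001)} \approx .9999999
```

So even after testing positive, the probability of actually having the disease is only about 0.2%, because the disease is so rare.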

Ex. 2: F = studied for the final, E = passed the class. Assume: P(F) = P(studied) = .8, P(E|F) = P(passed | studied) = .9, and P(E|F') = P(passed | didn't study) = .2. Find P(F|E) = P(studied | passed) = ___ and P(F'|E') = P(didn't study | failed) = ___.

Tree diagram, starting with F, F’
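A worked sketch of the answers, again by Bayes' Theorem (here P(F') = .2, P(E'|F') = 1 − .2 = .8, and P(E'|F) = 1 − .9 = .1):

```latex
P(F \mid E)   = \frac{(.9)(.8)}{(.9)(.8) + (.2)(.2)} = \frac{.72}{.76} \approx .947

P(F' \mid E') = \frac{(.8)(.2)}{(.8)(.2) + (.1)(.8)} = \frac{.16}{.24} \approx .667
```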

Ex. 3: Spam filters. Idea: spam has words like “offer”, “special”, “opportunity”, “Rolex”, … while non-spam has words like “mom”, “lunch”, … False negatives: when we fail to detect spam. False positives: when non-spam is flagged as spam. Let S = spam, E = message contains a certain word. Assume P(S) = 0.5.
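Applying Bayes' Theorem with these definitions, and writing p = P(E|S) and q = P(E|S') for the rates at which the word appears in spam and non-spam messages (this p, q shorthand is introduced here, not taken from the slides), the assumption P(S) = P(S') = 0.5 makes the prior cancel:

```latex
P(S \mid E) = \frac{P(E \mid S)\,P(S)}{P(E \mid S)\,P(S) + P(E \mid S')\,P(S')}
            = \frac{p}{p + q} \qquad \text{when } P(S) = P(S') = 0.5
```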

Tree diagram starting with S, S’

Given that a message contains “Rolex”, find the probability it is spam. Suppose “Rolex” occurs in 250 of 2000 spam messages (250/2000 = .125) and in 5 of 1000 non-spam messages (5/1000 = .005). Assume P(S) = 0.5. Then P(S | uses word “Rolex”) = .125/(.125 + .005) ≈ .962.
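A minimal Python sketch of this single-word check (the function name and parameters are illustrative, not from the slides); it reproduces the .962 figure:

```python
def prob_spam_given_word(spam_with_word, spam_total,
                         ham_with_word, ham_total,
                         prior_spam=0.5):
    """Estimate P(spam | message contains the word) via Bayes' Theorem."""
    p = spam_with_word / spam_total   # P(word | spam)
    q = ham_with_word / ham_total     # P(word | not spam)
    prior_ham = 1 - prior_spam
    return (p * prior_spam) / (p * prior_spam + q * prior_ham)

# "Rolex" example: 250 of 2000 spam messages, 5 of 1000 non-spam, P(S) = 0.5
print(round(prob_spam_given_word(250, 2000, 5, 1000), 3))  # 0.962
```

With the flat prior P(S) = 0.5 the prior terms cancel, so this reduces to p/(p + q), matching the formula given earlier.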