Advanced Probability

Probability calculus

1. 1 ≥ Pr(h) ≥ 0
2. If e deductively implies h, then Pr(h|e) = 1.
3. (disjunction rule) If h and g are mutually exclusive, then Pr(h or g) = Pr(h) + Pr(g). ("Mutually exclusive": h and g can't both be true at the same time.)
4. (conditional probability) Pr(h|g) = Pr(h&g)/Pr(g)
5. (conjunction rule) Pr(h&g) = Pr(h) x Pr(g|h)

Probability calculus (continued)

5. (conjunction rule) Pr(h&g) = Pr(h) x Pr(g|h)
6. (conjunction rule for independent events) If h and g are independent, then Pr(h&g) = Pr(h) x Pr(g). ("Independent": the occurrence of one doesn't affect the probability of the other, i.e. Pr(g) = Pr(g|h).)
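These rules can be checked mechanically on a small finite sample space. Below is a minimal Python sketch (my own illustration, not from the slides) using two fair dice; it verifies the conjunction rule and its independent-events version numerically:

    from fractions import Fraction
    from itertools import product

    space = list(product(range(1, 7), repeat=2))  # 36 equally likely two-dice outcomes

    def pr(event):
        # Probability of an event, given as a predicate over outcomes.
        return Fraction(sum(1 for o in space if event(o)), len(space))

    def pr_given(event, given):
        # Rule 4: Pr(event | given) = Pr(event & given) / Pr(given)
        return pr(lambda o: event(o) and given(o)) / pr(given)

    def h(o): return o[0] == 6            # first die shows a 6
    def g(o): return o[0] + o[1] >= 10    # the dice sum to at least 10
    def k(o): return o[1] == 1            # second die shows a 1 (independent of h)

    # Rule 5 (conjunction rule): Pr(h & g) = Pr(h) x Pr(g | h)
    assert pr(lambda o: h(o) and g(o)) == pr(h) * pr_given(g, h)

    # Rule 6 (independent events): Pr(h & k) = Pr(h) x Pr(k)
    assert pr(lambda o: h(o) and k(o)) == pr(h) * pr(k)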


Bayes's theorem

Pr(h|e) = [Pr(e|h) x Pr(h)] / Pr(e)

Bayes's theorem, annotated

Pr(h|e) = [Pr(e|h) x Pr(h)] / Pr(e)

Pr(h|e): the probability of the hypothesis given the evidence (the "posterior probability"). What is the probability of the hypothesis, given this new evidence?
Pr(e|h): the probability of the evidence given the hypothesis (the "likelihood"). How strongly does the hypothesis lead us to expect this evidence?
Pr(h): the "prior probability" of the hypothesis, i.e. the base rate. How plausible was the hypothesis before the new evidence?
Pr(e): the "expectedness" of the evidence. How unsurprising is the new evidence? It expands as Pr(e) = [Pr(e|h) x Pr(h)] + [Pr(e|not-h) x Pr(not-h)].

Suppose there is a 40% chance of rain today and a 90% chance that you get wet if it rains. What is the chance that you get wet?

Pr(r) = .4, Pr(not-r) = .6, Pr(w|r) = .9

Pr(w) = [Pr(w|r) x Pr(r)] + [Pr(w|not-r) x Pr(not-r)]

If Pr(w|not-r) = 0: Pr(w) = (.9 x .4) + (0 x .6) = .36
If Pr(w|not-r) = .2: Pr(w) = (.9 x .4) + (.2 x .6) = .36 + .12 = .48
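The same calculation as a short Python sketch (my addition, not part of the slides; the .2 value is the second case considered above):

    # Law of total probability: Pr(w) = Pr(w|r) x Pr(r) + Pr(w|not-r) x Pr(not-r)
    pr_r = 0.4            # chance of rain
    pr_not_r = 1 - pr_r   # 0.6
    pr_w_given_r = 0.9    # chance of getting wet if it rains

    def pr_wet(pr_w_given_not_r):
        return pr_w_given_r * pr_r + pr_w_given_not_r * pr_not_r

    print(pr_wet(0.0))  # ~0.36: you only get wet if it rains
    print(pr_wet(0.2))  # ~0.48: 20% chance of getting wet even without rain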

Bayes's theorem

Pr(h|e) = [Pr(e|h) x Pr(h)] / Pr(e)
        = [Pr(e|h) x Pr(h)] / ([Pr(e|h) x Pr(h)] + [Pr(e|not-h) x Pr(not-h)])
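As a minimal sketch, the same formula written as a Python function (the name bayes_posterior and its argument names are mine, not from the slides):

    def bayes_posterior(prior, likelihood, false_positive_rate):
        # Pr(h|e), with Pr(e) expanded by total probability:
        # Pr(e) = Pr(e|h) x Pr(h) + Pr(e|not-h) x Pr(not-h)
        pr_e = likelihood * prior + false_positive_rate * (1 - prior)
        return likelihood * prior / pr_e

    # Using the rain example above: Pr(rain | wet) when Pr(w|not-r) = .2
    print(bayes_posterior(0.4, 0.9, 0.2))  # ~0.75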

Bayes and Frequency Trees

The probability that a woman in this group has breast cancer is .8%. If a woman has breast cancer, the probability is 90% that she will have a positive mammogram. If a woman does not have breast cancer, the probability is 7% that she will still have a positive mammogram. x has a positive mammogram. What is the probability that x has breast cancer?

The same problem, with each piece labeled:

Base rate (prior probability): The probability that a woman in this group has breast cancer is .8%.
Sensitivity (the likelihood; the inverse of the false-negative rate): If a woman has breast cancer, the probability is 90% that she will have a positive mammogram.
False positive rate: If a woman does not have breast cancer, the probability is 7% that she will still have a positive mammogram.
Question: x has a positive mammogram. What is the probability that x has breast cancer?

Frequency tree for 1,000 women (base rate = .8%, sensitivity = 90%, false positives = 7%):

8 have cancer: ~7 test positive, ~1 tests negative.
992 don't have cancer: ~70 test positive, ~922 test negative.

The probability of having cancer given a positive screen is 7/77, approximately 9%. Odds are about 10:1 that you don't have cancer!
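A short Python sketch (my addition) that reproduces the tree's rounded counts and compares them with the exact Bayes calculation:

    base_rate = 0.008       # .8% prior probability of cancer
    sensitivity = 0.90      # Pr(positive | cancer)
    false_positive = 0.07   # Pr(positive | no cancer)
    n = 1000                # women in the group

    with_cancer = n * base_rate                      # 8 women
    without_cancer = n - with_cancer                 # 992 women
    true_positives = with_cancer * sensitivity       # 7.2, shown as ~7 on the tree
    false_alarms = without_cancer * false_positive   # 69.44, shown as ~70 on the tree

    posterior = true_positives / (true_positives + false_alarms)
    print(posterior)  # ~0.094, i.e. roughly 9%, close to the tree's rounded 7/77
    # The bayes_posterior sketch above gives the same value for these inputs.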

[Diagram: the women in the group, those who tested positive, and those with cancer, drawn with base rate = .8%, sensitivity = 90%, false positives = 7%.]

Don't confuse sensitivity with posterior probability!

Pr(h|e) = [Pr(e|h) x Pr(h)] / Pr(e)

The likelihood Pr(e|h) is the test's sensitivity; the posterior Pr(h|e) is what the question asks for. In the example above the sensitivity is 90%, but the posterior probability of cancer given a positive mammogram is only about 9%.
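To see why the two come apart, here is a brief sketch (my addition) that holds the test fixed and varies only the base rate; the sensitivity stays at 90% while the posterior swings widely:

    # Same test (sensitivity 90%, false positives 7%), different base rates,
    # applying Bayes's theorem with the expanded denominator directly.
    for base_rate in (0.008, 0.05, 0.5):
        pr_e = 0.90 * base_rate + 0.07 * (1 - base_rate)
        post = 0.90 * base_rate / pr_e
        print(f"base rate {base_rate:.1%} -> posterior {post:.0%}")
    # base rate 0.8%  -> posterior ~9%
    # base rate 5.0%  -> posterior ~40%
    # base rate 50.0% -> posterior ~93%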