ENGG 2040C: Probability Models and Applications Andrej Bogdanov Spring 2013 3. Conditional probability part two.


ENGG 2040C: Probability Models and Applications. Andrej Bogdanov. Spring 2013. 3. Conditional probability part two

Boxes

[Figure: three cups, labeled 1, 2, 3]

I choose a cup at random and then a random ball from that cup. The ball is blue. You need to guess where the ball came from. (a) Which cup would you guess? (b) What is the probability you are correct?

Bayes’ rule

P(F|E) = P(E|F) P(F) / P(E) = P(E|F) P(F) / ( P(E|F) P(F) + P(E|F^c) P(F^c) )

More generally, if F_1, …, F_n partition S, then

P(F_i|E) = P(E|F_i) P(F_i) / ( P(E|F_1) P(F_1) + … + P(E|F_n) P(F_n) )

Medical tests

If you are sick (S), a blood test comes out positive (P) 95% of the time. If you are not sick, the test is positive 1% of the time. Suppose 0.5% of the people in Hong Kong are sick. You take the test and it comes out positive. What are the chances that you are sick?

P(S|P) = P(P|S) P(S) / ( P(P|S) P(S) + P(P|S^c) P(S^c) )
       = (95% ∙ 0.5%) / (95% ∙ 0.5% + 1% ∙ 99.5%)
       ≈ 32.3%
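As a quick sanity check (this snippet is not part of the original slides, and the variable names are my own), the arithmetic above can be reproduced in a few lines of Python:

```python
# Bayes' rule for the medical-test example (illustrative sketch)
p_sick = 0.005           # P(S): 0.5% of the population is sick
p_pos_given_sick = 0.95  # P(P|S): test sensitivity
p_pos_given_well = 0.01  # P(P|S^c): false-positive rate

# Total probability of a positive test
p_pos = p_pos_given_sick * p_sick + p_pos_given_well * (1 - p_sick)

# Bayes' rule: P(S|P)
p_sick_given_pos = p_pos_given_sick * p_sick / p_pos
print(round(p_sick_given_pos, 4))  # ≈ 0.3231, i.e. about 32.3%
```

Despite the accurate test, most positives come from the much larger healthy population.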

Problem for you to think about Urn one has 9 blue balls and 1 red ball. Urn two has 9 red balls and 1 blue ball. I choose an urn at random and draw a ball. It is blue. I draw another ball from the same urn (without replacement). What is the probability it is blue?

Russian roulette

Alice and Bob take turns spinning the six-hole cylinder (loaded with one bullet) and shooting at each other; Alice shoots first. What is the probability that Alice wins (Bob dies)?

Russian roulette

S = { H, MH, MMH, MMMH, MMMMH, … }
E.g. MMH: Alice misses, then Bob misses, then Alice kills

A = “Alice wins” = { H, MMH, MMMMH, … }

Probability model: the outcomes are not equally likely!

Russian roulette

outcome      H     MH          MMH            MMMH           MMMMH
probability  1/6   5/6 ∙ 1/6   (5/6)^2 ∙ 1/6  (5/6)^3 ∙ 1/6  (5/6)^4 ∙ 1/6

P(A) = 1/6 + (5/6)^2 ∙ 1/6 + (5/6)^4 ∙ 1/6 + …
     = 1/6 ∙ (1 + (5/6)^2 + (5/6)^4 + …)
     = 1/6 ∙ 1/(1 – (5/6)^2)
     = 6/11
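The geometric series above can be checked numerically. This is an illustrative sketch, not part of the original slides; exact rational arithmetic keeps the answer as a fraction:

```python
# Check P(A) = 6/11 for Russian roulette with exact fractions
from fractions import Fraction

p_hit = Fraction(1, 6)
p_miss = Fraction(5, 6)

# Alice fires on rounds 1, 3, 5, ...; she wins on round 2k+1 with
# probability (5/6)^(2k) * 1/6. Truncate the series far enough out.
p_alice = sum(p_miss**(2 * k) * p_hit for k in range(200))

# Closed form of the geometric series: (1/6) / (1 - (5/6)^2)
exact = p_hit / (1 - p_miss**2)
print(exact)           # 6/11
print(float(p_alice))  # converges to 6/11 ≈ 0.5454...
```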

Russian roulette

Solution using conditional probabilities:

A = “Alice wins” = { H, MMH, MMMMH, … }
A^c = “Bob wins” = { MH, MMMH, MMMMMH, … }
W_1 = “Alice wins in first round” = { H }

P(A) = P(A|W_1) P(W_1) + P(A|W_1^c) P(W_1^c)
     = 1 ∙ 1/6 + (1 – P(A)) ∙ 5/6

(after Alice misses, Bob is in the shooter’s position, so P(A|W_1^c) = P(A^c) = 1 – P(A))

Then (11/6) P(A) = 1, so P(A) = 6/11.

Infinite sample spaces

Axioms of probability:
1. For every event E, 0 ≤ P(E) ≤ 1
2. P(S) = 1
3. If EF = ∅ then P(E ∪ F) = P(E) + P(F)

For infinite sample spaces, axiom 3 is replaced by countable additivity:
3′. If E_1, E_2, … are pairwise disjoint:
P(E_1 ∪ E_2 ∪ …) = P(E_1) + P(E_2) + …

Problem for you to solve Charlie tosses a pair of dice. Alice wins if the sum is 7. Bob wins if the sum is 8. Charlie keeps tossing until one of them wins. What is the probability that Alice wins?

Independence of two events

Let E_1 be “first coin comes up H” and E_2 be “second coin comes up H”.
Then P(E_2 | E_1) = P(E_2), or equivalently P(E_1 E_2) = P(E_1) P(E_2).

Events A and B are independent if P(AB) = P(A) P(B).

Examples of (in)dependence

Let E_1 be “first die is a 4”, S_6 be “sum of dice is 6”, and S_7 be “sum of dice is 7”.

P(E_1) = 1/6, P(S_6) = 5/36, but P(E_1 S_6) = 1/36 ≠ P(E_1) P(S_6), so E_1, S_6 are dependent.
P(S_7) = 1/6 and P(E_1 S_7) = 1/36 = P(E_1) P(S_7), so E_1, S_7 are independent.
P(S_6 S_7) = 0 ≠ P(S_6) P(S_7), so S_6, S_7 are dependent.
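Since the sample space has only 36 equally likely outcomes, these claims can be verified by brute-force enumeration. A sketch in Python (the helper names are mine, not from the slides):

```python
# Verify (in)dependence of dice events by enumerating all 36 outcomes
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))  # all equally likely

def prob(event):
    # exact probability of an event under the uniform distribution
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

E1 = lambda o: o[0] == 4    # first die is a 4
S6 = lambda o: sum(o) == 6  # sum of dice is 6
S7 = lambda o: sum(o) == 7  # sum of dice is 7

# E1 and S6 are dependent: 1/36 != (1/6)(5/36)
print(prob(lambda o: E1(o) and S6(o)) == prob(E1) * prob(S6))  # False
# E1 and S7 are independent: 1/36 == (1/6)(1/6)
print(prob(lambda o: E1(o) and S7(o)) == prob(E1) * prob(S7))  # True
```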

Reliability of sequential components

[Figure: route CUHK → Shing Mun tunnel → Tsing Ma bridge → Airport]

W_SM: “Shing Mun tunnel is operational”, P(W_SM) = 90%
W_TM: “Tsing Ma bridge is operational”, P(W_TM) = 98%
W: “The road is operational”

Assuming events W_SM and W_TM are independent:
P(W) = P(W_SM W_TM) = P(W_SM) P(W_TM) = 90% ∙ 98% = 88.2%
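A one-line check of the series-system arithmetic (an illustrative sketch; the variable names are mine):

```python
# Series system: operational only if every component is operational
p_sm, p_tm = 0.90, 0.98  # P(W_SM), P(W_TM)
p_road = p_sm * p_tm     # independence lets us multiply
print(round(p_road, 3))  # 0.882
```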

Algebra of independent events

If A and B are independent, then A and B^c are also independent.

Proof: Assume A and B are independent. Then
P(B^c | A) = 1 – P(B | A) = 1 – P(B) = P(B^c),
so B^c and A are independent.

Taking complements preserves independence.

Reliability of parallel components

[Figure: two routes from CUHK to Hung Hom, via the Lion Rock tunnel (85%) or the Tate’s Cairn tunnel (95%)]

Assuming W_LR and W_TC are independent:
P(W) = P(W_LR ∪ W_TC)
P(W^c) = P(W_LR^c W_TC^c) = P(W_LR^c) P(W_TC^c)
P(W) = 1 – P(W_LR^c) P(W_TC^c) = 1 – 15% ∙ 5% = 99.25%
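The complement trick for the parallel system can be checked the same way (a sketch, not from the slides; names are mine):

```python
# Parallel system: fails only if every component fails
p_lr, p_tc = 0.85, 0.95                # P(W_LR), P(W_TC)
p_road = 1 - (1 - p_lr) * (1 - p_tc)   # complement of "both tunnels fail"
print(round(p_road, 4))  # 0.9925
```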

Independence of three events

Events A, B, and C are independent if
P(AB) = P(A) P(B), P(BC) = P(B) P(C), P(AC) = P(A) P(C),
and P(ABC) = P(A) P(B) P(C). This last condition is important!

(In)dependence of three events

Let E_1 be “first die is a 4”, E_2 be “second die is a 3”, and S_7 be “sum of dice is 7”.

P(E_1 E_2) = P(E_1) P(E_2) ✔
P(E_1 S_7) = P(E_1) P(S_7) ✔
P(E_2 S_7) = P(E_2) P(S_7) ✔
P(E_1 E_2 S_7) = 1/36 ≠ P(E_1) P(E_2) P(S_7) = 1/216 ✗

(Each pairwise intersection has probability 1/36, but the triple intersection equals E_1 E_2, since a 4 and a 3 already force the sum to be 7.)
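The pairwise-but-not-triple independence can again be confirmed by enumerating the 36 outcomes (an illustrative sketch; helper names are mine):

```python
# E1, E2, S7 are pairwise independent but not independent as a triple
from fractions import Fraction
from itertools import product

outcomes = list(product(range(1, 7), repeat=2))

def prob(event):
    return Fraction(sum(1 for o in outcomes if event(o)), len(outcomes))

E1 = lambda o: o[0] == 4    # first die is a 4
E2 = lambda o: o[1] == 3    # second die is a 3
S7 = lambda o: sum(o) == 7  # sum of dice is 7

pairs_ok = all(
    prob(lambda o: a(o) and b(o)) == prob(a) * prob(b)
    for a, b in [(E1, E2), (E1, S7), (E2, S7)]
)
triple = prob(lambda o: E1(o) and E2(o) and S7(o))
print(pairs_ok)                                # True
print(triple, prob(E1) * prob(E2) * prob(S7))  # 1/36 vs 1/216
```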

Independence of many events

Events A_1, A_2, … are independent if for every subset A_i1, …, A_ir of the events,
P(A_i1 … A_ir) = P(A_i1) … P(A_ir).

Algebra of independent events: independence is preserved if we replace some event(s) by their complements, intersections, or unions.

For you to think about

[Figure: road network from CUHK to Shek O via the Lion Rock tunnel, the Tate’s Cairn tunnel, and the Eastern Cross-Harbour tunnel, with component reliabilities 85%, 95%, 90%, 70%]

Assuming failures are independent, what is the probability that there is an operational road from CUHK to Shek O?

Playoffs

Alice wins 60% of her ping pong matches against Bob. They meet for a 3-match playoff. What are the chances that Alice will win the playoff?

Probability model: let W_i be the event that Alice wins match i. We assume P(W_1) = P(W_2) = P(W_3) = 0.6, and that W_1, W_2, W_3 are independent.

Playoffs

Probability model

To convince ourselves this is a probability model, let’s redo it the usual way:

S = { AAA, AAB, ABA, ABB, BAA, BAB, BBA, BBB }

The probability of AAA is P(W_1 W_2 W_3) = 0.6^3 = 0.216
                   AAB is P(W_1 W_2 W_3^c) = 0.6^2 ∙ 0.4 = 0.144
                   ABA is P(W_1 W_2^c W_3) = 0.6^2 ∙ 0.4 = 0.144
                   …
                   BBB is P(W_1^c W_2^c W_3^c) = 0.4^3 = 0.064

The probabilities add up to one.

Playoffs

For Alice to win the playoff, she must win at least 2 out of 3 matches. The corresponding event is

A = { AAA, AAB, ABA, BAA }

where AAB, ABA, and BAA each have probability 0.6^2 ∙ 0.4, so

P(A) = 0.6^3 + 3 ∙ 0.6^2 ∙ 0.4 = 0.648

General playoff: Alice wins a p fraction of her ping pong games against Bob. What are the chances Alice beats Bob in an n-match playoff (n odd)?
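A short enumeration over the eight outcomes reproduces the 0.648 figure (an illustrative sketch, not part of the slides):

```python
# Enumerate the 8 outcomes of the 3-match playoff
from itertools import product

p = 0.6  # P(Alice wins a single match); matches are independent
total = 0.0
for outcome in product("AB", repeat=3):  # AAA, AAB, ..., BBB
    wins = outcome.count("A")
    prob = p**wins * (1 - p)**(3 - wins)
    if wins >= 2:                        # Alice takes the playoff
        total += prob
print(round(total, 3))  # 0.648
```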

Playoffs

Solution: the probability model is similar to before. Let
A be the event “Alice wins the playoff”,
A_k be the event “Alice wins exactly k matches”.

P(A_k) = C(n, k) p^k (1 – p)^(n – k)

where C(n, k) is the number of arrangements of k A’s and n – k B’s, and p^k (1 – p)^(n – k) is the probability of each such arrangement.

A = A_(n+1)/2 ∪ … ∪ A_n
P(A) = P(A_(n+1)/2) + … + P(A_n)   (the events A_k are disjoint)

Playoffs

P(A) = ∑_{k = (n+1)/2}^{n} C(n, k) p^k (1 – p)^(n – k)

[Plot: the probability that Alice wins an n-match playoff, as a function of n, for p = 0.6 and p = 0.7]
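The binomial sum can be evaluated directly. A sketch using Python's math.comb (the function name playoff_win_prob is my own, not from the slides):

```python
# Binomial-sum formula for an n-match playoff (n odd)
from math import comb

def playoff_win_prob(p, n):
    # P(Alice wins) = sum over k = (n+1)/2 .. n of C(n,k) p^k (1-p)^(n-k)
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range((n + 1) // 2, n + 1))

print(round(playoff_win_prob(0.6, 3), 3))  # 0.648, matching the 3-match case
print(playoff_win_prob(0.6, 101))          # a longer playoff favors the stronger player
```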

Problem for you The Lakers and the Celtics meet for a 7-game playoff. They play until one team wins four games. Suppose the Lakers win 60% of the time. What is the probability that all 7 games are played?

Gambler’s ruin You have $100. You keep betting $1 on red at roulette. You stop when you win $200, or when you run out of money. What is the probability you win $200?

Gambler’s ruin

Probability model

S = all infinite sequences of Reds and Others
Let R_i be the event of red in the i-th round (there is an R in position i).
Probabilities: P(R_1) = P(R_2) = … = 18/37 (call this p)
R_1, R_2, … are independent

Gambler’s ruin

You have $100. You stop when you win $200 or run out of money.

Let W be the event that you win $200, and let w_n = P(W) when your current capital is $n. Conditioning on the first spin:

w_n = P(W|R_1) P(R_1) + P(W|R_1^c) P(R_1^c) = p w_{n+1} + (1 – p) w_{n–1}

with boundary conditions w_0 = 0 and w_200 = 1.

Gambler’s ruin

w_n = (1 – p) w_{n–1} + p w_{n+1},  w_0 = 0,  w_200 = 1

Rearranging: p (w_{n+1} – w_n) = (1 – p) (w_n – w_{n–1})

Let ρ = (1 – p)/p = 19/18. Then

w_{n+1} – w_n = ρ (w_n – w_{n–1}) = ρ^2 (w_{n–1} – w_{n–2}) = … = ρ^n (w_1 – w_0) = ρ^n w_1

so

w_{n+1} = w_n + ρ^n w_1 = w_{n–1} + ρ^{n–1} w_1 + ρ^n w_1 = … = w_1 + ρ w_1 + … + ρ^n w_1

Gambler’s ruin

w_{n+1} = (1 + ρ + … + ρ^n) w_1 = (ρ^{n+1} – 1)/(ρ – 1) ∙ w_1, where ρ = (1 – p)/p = 19/18

Since w_200 = (ρ^200 – 1)/(ρ – 1) ∙ w_1 = 1, we get

w_{n+1} = (ρ^{n+1} – 1)/(ρ^200 – 1)

You have $100. You stop when you win $200 or run out. The probability you win is
w_100 = (ρ^100 – 1)/(ρ^200 – 1) ≈ 0.0045.
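The closed form can be evaluated with exact rational arithmetic (an illustrative sketch, not from the slides; names are mine):

```python
# Closed form for the gambler's-ruin winning probability
from fractions import Fraction

p = Fraction(18, 37)  # P(red) on a European roulette wheel
rho = (1 - p) / p     # (1-p)/p = 19/18

def win_prob(n, target=200):
    # w_n = (rho^n - 1) / (rho^target - 1), with w_0 = 0 and w_target = 1
    return (rho**n - 1) / (rho**target - 1)

print(float(win_prob(100)))  # ≈ 0.0045: starting with $100, you almost surely go broke
```

Even with a seemingly small house edge, the exponential factor ρ^n makes doubling $100 by $1 bets extremely unlikely.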