3. Independence and Random Variables


Independence of two events

Let E1 be "first coin comes up H" and E2 be "second coin comes up H". Then P(E2 | E1) = P(E2), which is equivalent to P(E2 ∩ E1) = P(E2) P(E1).

Events A and B are independent if P(A ∩ B) = P(A) P(B).
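
One way to check this definition concretely is to enumerate the four equally likely outcomes of two fair coin flips. A minimal sketch (not from the slides; the helper name prob is mine):

    from itertools import product

    # Sample space of two fair coin flips, all outcomes equally likely.
    outcomes = list(product("HT", repeat=2))

    E1 = {w for w in outcomes if w[0] == "H"}   # first coin is H
    E2 = {w for w in outcomes if w[1] == "H"}   # second coin is H

    def prob(event):
        return len(event) / len(outcomes)

    # For independent events, P(E1 ∩ E2) equals P(E1) * P(E2).
    print(prob(E1 & E2), prob(E1) * prob(E2))   # 0.25 0.25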

Examples of (in)dependence

Let E1 be "first die is a 4", S6 be "sum of dice is 6", and S7 be "sum of dice is 7".

Which pairs are independent: E1 and S6? E1 and S7? S6 and S7?
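
These questions can be answered by enumerating all 36 equally likely rolls and comparing P(A ∩ B) with P(A) P(B) for each pair. A sketch along those lines (mine, not the slides'; Fractions keep the comparison exact):

    from itertools import product
    from fractions import Fraction

    rolls = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes

    E1 = {r for r in rolls if r[0] == 4}
    S6 = {r for r in rolls if sum(r) == 6}
    S7 = {r for r in rolls if sum(r) == 7}

    def prob(event):
        return Fraction(len(event), len(rolls))

    pairs = {"E1,S6": (E1, S6), "E1,S7": (E1, S7), "S6,S7": (S6, S7)}
    for name, (A, B) in pairs.items():
        print(name, prob(A & B) == prob(A) * prob(B))
    # E1,S6 False (dependent); E1,S7 True (independent);
    # S6,S7 False (disjoint events are dependent)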

Sequential components

ER: "East Rail Line is working", P(ER) = 70%
MS: "Ma On Shan Line is working", P(MS) = 98%

[The slide shows the two lines connected in sequence; a journey needs both.]
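
A worked step that is not on the slide: assuming the two lines fail independently and the journey needs both segments, P(ER ∩ MS) = 0.70 × 0.98 = 0.686.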

Algebra of independent events

If A and B are independent, then A and Bc are also independent.

Proof: P(A ∩ Bc) = P(A) − P(A ∩ B) = P(A) − P(A) P(B) = P(A)(1 − P(B)) = P(A) P(Bc).

Parallel components

TW: "Tsuen Wan Line is operational", P(TW) = 80%
TC: "Tung Chung Line is operational", P(TC) = 85%

[The slide shows the two lines as alternative routes; a journey needs at least one.]
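
Likewise, a worked step not on the slide: with independent failures and two alternative routes, P(TW ∪ TC) = 1 − P(TWc) P(TCc) = 1 − 0.20 × 0.15 = 0.97.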

Independence of three events

Events A, B, and C are independent if
P(A ∩ B) = P(A) P(B)
P(B ∩ C) = P(B) P(C)
P(A ∩ C) = P(A) P(C)
and P(A ∩ B ∩ C) = P(A) P(B) P(C).

(In)dependence of three events

Let E1 be "first die is a 4", E2 be "second die is a 3", and S7 be "sum of dice is 7". Each of the three events has probability 1/6, and each pairwise intersection has probability 1/36.

Are E1, E2 independent? E1, S7? E2, S7? What about E1, E2, S7 together?

This example shows that pairwise independence is not enough for independence: E1 ∩ E2 already forces S7, so P(E1 ∩ E2 ∩ S7) = 1/36 ≠ (1/6)³.
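
A sketch (mine, not from the slides) that checks all four conditions by enumeration:

    from itertools import product
    from fractions import Fraction

    rolls = list(product(range(1, 7), repeat=2))

    E1 = {r for r in rolls if r[0] == 4}
    E2 = {r for r in rolls if r[1] == 3}
    S7 = {r for r in rolls if sum(r) == 7}

    def prob(event):
        return Fraction(len(event), len(rolls))

    # All three pairwise conditions hold ...
    print(prob(E1 & E2) == prob(E1) * prob(E2))   # True
    print(prob(E1 & S7) == prob(E1) * prob(S7))   # True
    print(prob(E2 & S7) == prob(E2) * prob(S7))   # True
    # ... but the three-way condition fails: 1/36 != (1/6)**3
    print(prob(E1 & E2 & S7) == prob(E1) * prob(E2) * prob(S7))   # False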

(In)dependence of three events

Let A be "first roll is 1, 2, or 3", B be "first roll is 3, 4, or 5", and C be "sum of rolls is 9".

Are A, B independent? A, C? B, C? What about A, B, C together?

This example shows that P(A ∩ B ∩ C) = P(A) P(B) P(C) can hold even though A, B, C are not independent.
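
A worked check, not on the slide: P(A) = P(B) = 1/2 and P(C) = 4/36 = 1/9, so P(A) P(B) P(C) = 1/36, and indeed A ∩ B ∩ C = {(3, 6)} has probability 1/36. But A ∩ B = "first roll is 3" has probability 1/6 ≠ 1/4, so A and B are not even pairwise independent.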

Independence of many events

Events A1, A2, … are independent if, for every subset of the events, the probability of the intersection is the product of their probabilities.

Algebra of independent events: independence is preserved if we replace some event(s) by their complements, intersections, or unions.

Multiple components

P(ER) = 70%   P(WR) = 75%   P(TW) = 85%   P(KT) = 95%

[Two slides show network diagrams combining these four lines in different configurations; the diagrams are not reproduced in this transcript.]

Playoffs

Alice wins 60% of her ping pong matches against Bob. They meet for a 3-match playoff. What are the chances that Alice will win the playoff?

Probability model: let Ai be the event that Alice wins match i. Assume P(A1) = P(A2) = P(A3) = 0.6, and also assume A1, A2, A3 are independent. (This is a valid probability model: it determines the probability of every outcome, consistently with the axioms of probability.)

Playoffs

Let A be the event that Alice wins the playoff, i.e. at least 2 of the 3 matches. Summing the probabilities of the favorable outcomes:

P(A) = 0.6³ + 3 × 0.6² × 0.4 = 0.648
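
A sketch that enumerates the eight outcomes of the three matches (my own check, not from the slides):

    from itertools import product

    p = 0.6
    total = 0.0
    # Each outcome is a triple of match results: True = Alice wins that match.
    for outcome in product([True, False], repeat=3):
        prob = 1.0
        for alice_wins in outcome:
            prob *= p if alice_wins else (1 - p)
        if sum(outcome) >= 2:       # Alice wins the playoff
            total += prob
    print(total)   # ≈ 0.648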

Bernoulli trials

n trials, each of which succeeds independently with probability p. The probability that at least k out of n trials succeed is

∑ from j = k to n of C(n, j) p^j (1 − p)^(n − j)
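
This tail probability can be computed directly with math.comb; a sketch, not part of the slides (the helper name at_least is mine):

    from math import comb

    def at_least(k, n, p):
        """Probability that at least k of n independent trials succeed."""
        return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(k, n + 1))

    print(at_least(2, 3, 0.6))   # ≈ 0.648, Alice's playoff win probability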

Playoffs

[Charts: the probability that Alice wins an n-game tournament, plotted against n, for p = 0.6 and for p = 0.7.]
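
The trend in those charts can be reproduced with the at_least sketch above (again mine, not the slides'); in an n-game tournament Alice must win a majority of the games:

    # Assumes at_least() from the previous sketch and odd n (no ties possible).
    for p in (0.6, 0.7):
        for n in (1, 3, 5, 15, 51):
            print(p, n, round(at_least(n // 2 + 1, n, p), 3))
    # The win probability climbs toward 1 as n grows, faster for p = 0.7.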

The Lakers and the Celtics meet for a 7-game playoff. They play until one team wins four games. Suppose the Lakers win 60% of the time. What is the probability that all 7 games are played?
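
One way to compute it (a worked check, assuming the games are independent): all 7 games are played exactly when the series is tied 3–3 after six games, which has probability C(6, 3) × 0.6³ × 0.4³ = 20 × 0.216 × 0.064 ≈ 0.2765.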

Conditional independence

A and B are independent conditioned on F if P(A ∩ B | F) = P(A | F) P(B | F).

Equivalent formulation (when P(B ∩ F) > 0): P(A | B ∩ F) = P(A | F).

today    tomorrow
☀️        80% ☀️, 20% 🌧
🌧        40% ☀️, 60% 🌧

It is ☀️ on Monday. Will it 🌧 on Wednesday?
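
A worked step that is not on the slide, assuming the day-to-day transitions are conditionally independent given each day's weather: condition on Tuesday, so P(🌧 on Wednesday) = 0.8 × 0.2 + 0.2 × 0.6 = 0.28.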

Conditioning does not preserve independence

Let E1 be "first die is a 4", E2 be "second die is a 3", and S7 be "sum of dice is 7". E1 and E2 are independent, but they are dependent conditioned on S7: P(E1 ∩ E2 | S7) = 1/6, while P(E1 | S7) P(E2 | S7) = 1/36.

Conditioning may destroy dependence

Probability model: a movie is 🇺🇸 or 🇨🇳 with probability ½ each. Given the movie's type, 🙍🏻‍♀️ Alice and 🙎🏻‍♂️ Bob like it (👍) independently, with probability 99% for one type and 1% for the other.

The events A = "Alice likes the movie" and B = "Bob likes the movie" are conditionally independent given the movie's type, but unconditionally dependent: P(A and B) = ½ × 0.99² + ½ × 0.01² ≈ 0.49, very close to ½, while P(A) = P(B) = ½, so P(A) P(B) = ¼.

Random variable

A discrete random variable assigns a discrete value to every outcome in the sample space.

Example: sample space { HH, HT, TH, TT }, N = number of Hs.

Probability mass function

The probability mass function (p.m.f.) of a discrete random variable X is the function p(x) = P(X = x).

Example: for the sample space { HH, HT, TH, TT }, each outcome having probability ¼, and N = number of Hs:
p(0) = P(N = 0) = P({TT}) = 1/4
p(1) = P(N = 1) = P({HT, TH}) = 1/2
p(2) = P(N = 2) = P({HH}) = 1/4
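
A p.m.f. can be tabulated mechanically from the sample space; a sketch of mine, not from the slides:

    from itertools import product
    from collections import Counter
    from fractions import Fraction

    outcomes = list(product("HT", repeat=2))          # HH, HT, TH, TT
    counts = Counter(w.count("H") for w in outcomes)  # value -> #outcomes
    pmf = {n: Fraction(c, len(outcomes)) for n, c in counts.items()}
    print(pmf)   # N = 0, 2 each with probability 1/4; N = 1 with probability 1/2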

Probability mass function

We can describe the p.m.f. by a table or by a chart.

x      0    1    2
p(x)   ¼    ½    ¼

[A bar chart of the same p.m.f. is shown on the slide.]

Two six-sided dice are tossed. Calculate the p.m.f. of the difference D of the rolls. What is the probability that D > 1? That D is odd?

11  12  13  14  15  16
21  22  23  24  25  26
31  32  33  34  35  36
41  42  43  44  45  46
51  52  53  54  55  56
61  62  63  64  65  66

[The 36 equally likely outcomes of the two rolls.]
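
A sketch that answers the exercise by enumeration (my own, with Fractions to keep the probabilities exact):

    from itertools import product
    from collections import Counter
    from fractions import Fraction

    rolls = list(product(range(1, 7), repeat=2))
    counts = Counter(a - b for a, b in rolls)
    pmf = {d: Fraction(c, 36) for d, c in sorted(counts.items())}

    print(pmf)                                            # p.m.f. of D, from -5 to 5
    print(sum(p for d, p in pmf.items() if d > 1))        # P(D > 1) = 5/18
    print(sum(p for d, p in pmf.items() if d % 2 != 0))   # P(D odd) = 1/2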

The binomial random variable

Binomial(n, p): perform n independent trials, each of which succeeds with probability p. X = number of successes.

Examples
Toss n coins. "Number of heads" is Binomial(n, ½).
Toss n dice. "Number of sixes" is Binomial(n, 1/6).

A less obvious example

Toss n coins. Let C be the number of consecutive changes (HT or TH).

Examples:
w           C(w)
HTHHHHT     3
THHHHHT     2
HHHHHHH     0

Then C is Binomial(n − 1, ½): each of the n − 1 adjacent pairs is a change with probability ½, independently of the others.
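
A quick empirical check (a sketch, not from the slides) that C behaves like Binomial(n − 1, ½):

    import random
    from collections import Counter
    from math import comb

    n, trials = 5, 100_000
    counts = Counter()
    for _ in range(trials):
        w = [random.choice("HT") for _ in range(n)]
        changes = sum(w[i] != w[i + 1] for i in range(n - 1))
        counts[changes] += 1

    for k in range(n):
        empirical = counts[k] / trials
        exact = comb(n - 1, k) * 0.5 ** (n - 1)
        print(k, round(empirical, 3), round(exact, 3))   # the two columns should agree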

A non-example

Draw a 10-card hand from a 52-card deck. Let N = number of aces among the drawn cards. Is N a Binomial(10, 1/13) random variable?

No! Trial outcomes are not independent.

Probability mass function

If X is Binomial(n, p), its p.m.f. is

p(k) = P(X = k) = C(n, k) p^k (1 − p)^(n − k)

[Charts of the p.m.f.s of Binomial(10, 0.5), Binomial(50, 0.5), Binomial(10, 0.3), and Binomial(50, 0.3).]

Geometric random variable

Let X1, X2, … be independent trials, each succeeding with probability p. A Geometric(p) random variable N is the time of the first success among X1, X2, …:

N = first (smallest) n such that Xn = 1.

So P(N = n) = P(X1 = 0, …, Xn−1 = 0, Xn = 1) = (1 − p)^(n−1) p. This is the p.m.f. of N.
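
A small sanity check of mine, not the slides': the geometric p.m.f. sums to 1, since ∑ (1 − p)^(n−1) p is a geometric series.

    p = 0.3
    # Partial sum of the geometric p.m.f.; approaches 1 as the cutoff grows.
    print(sum((1 - p) ** (n - 1) * p for n in range(1, 200)))   # ≈ 1.0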

[Charts of the p.m.f.s of Geometric(0.5), Geometric(0.7), and Geometric(0.05).]

Apples

About 10% of the apples on your farm are rotten. You sell 10 apples. How many are rotten?

Probability model: the number of rotten apples you sold is Binomial(n = 10, p = 1/10).

Apples

You improve productivity; now only 5% of apples rot. You can now sell 20 apples. N is now Binomial(20, 1/20).

p.m.f. values at k = 0, 1, 2, 5, 10, 20:

k                      0      1      2      5      10       20
Binomial(10, 1/10)   .349   .387   .194   .001   10^-10     —
Binomial(20, 1/20)   .354   .377   .189   .002   10^-8    10^-26
Poisson(1)           .367   .367   .183   .003   10^-7    10^-19

The Poisson random variable

A Poisson(λ) random variable has this p.m.f.:

p(k) = e^(−λ) λ^k / k!    for k = 0, 1, 2, 3, …

Poisson random variables do not occur "naturally" in the sample spaces we have seen. They approximate Binomial(n, p) random variables when λ = np is fixed and n is large (so p is small):

p_Poisson(λ)(k) = lim as n → ∞ of p_Binomial(n, λ/n)(k)
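
The comparison table above can be regenerated, up to rounding, with a short script; a sketch of mine, not from the slides (helper names binom_pmf and poisson_pmf are mine):

    from math import comb, exp, factorial

    def binom_pmf(k, n, p):
        return comb(n, k) * p**k * (1 - p)**(n - k)

    def poisson_pmf(k, lam):
        return exp(-lam) * lam**k / factorial(k)

    # Columns: k, Binomial(10, 1/10), Binomial(20, 1/20), Poisson(1).
    for k in (0, 1, 2, 5, 10, 20):
        print(k,
              binom_pmf(k, 10, 0.1) if k <= 10 else 0.0,
              binom_pmf(k, 20, 0.05),
              poisson_pmf(k, 1))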

Functions of random variables

If X is a random variable with p.m.f. pX, then Y = f(X) is a random variable with p.m.f.

pY(y) = ∑ over {x: f(x) = y} of pX(x)

Example: the p.m.f. of X is

x      0    1    2
p(x)  1/3  1/3  1/3

What are the p.m.f.s of X − 1 and of (X − 1)²?
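
A sketch (mine, not the slides') of this rule applied to the example; the helper name pushforward is my own:

    from collections import defaultdict
    from fractions import Fraction

    pX = {0: Fraction(1, 3), 1: Fraction(1, 3), 2: Fraction(1, 3)}

    def pushforward(pmf, f):
        """p.m.f. of f(X): sum pX(x) over all x with f(x) == y."""
        pY = defaultdict(Fraction)
        for x, p in pmf.items():
            pY[f(x)] += p
        return dict(pY)

    print(pushforward(pX, lambda x: x - 1))          # uniform on {-1, 0, 1}
    print(pushforward(pX, lambda x: (x - 1) ** 2))   # 0 w.p. 1/3, 1 w.p. 2/3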

Two six-sided dice are tossed, and D is the difference of the rolls. Calculate the p.m.f. of |D|.