ENGG 2040C: Probability Models and Applications. Andrej Bogdanov, Spring 2014. 4. Random variables, part one.

Random variable
A discrete random variable assigns a discrete value to every outcome in the sample space.
Example: sample space { HH, HT, TH, TT }, N = number of Hs.

Probability mass function
The probability mass function (p.m.f.) of a discrete random variable X is the function p(x) = P(X = x).
Example: sample space { HH, HT, TH, TT }, N = number of Hs.
p(0) = P(N = 0) = P({ TT }) = 1/4
p(1) = P(N = 1) = P({ HT, TH }) = 1/2
p(2) = P(N = 2) = P({ HH }) = 1/4

Probability mass function
We can describe the p.m.f. by a table or by a chart.
x:     0     1     2
p(x):  1/4   1/2   1/4
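Not part of the original slides: a minimal Python sketch that reproduces this table by enumerating the four equally likely outcomes; the name pmf_N is just illustrative.

```python
from itertools import product
from fractions import Fraction
from collections import Counter

# Enumerate the equally likely outcomes of two fair coin flips
# and count how many give each value of N = number of H's.
outcomes = list(product("HT", repeat=2))
counts = Counter(o.count("H") for o in outcomes)
pmf_N = {n: Fraction(c, len(outcomes)) for n, c in sorted(counts.items())}
print(pmf_N)   # {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
```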

Balls
We draw 3 balls without replacement from an urn containing nine balls: three marked 1, three marked 0, and three marked -1.
Let X be the sum of the values on the balls. What is the p.m.f. of X?

Balls
X = sum of values on the 3 balls. Write E_abc for the event that we chose balls of types a, b, c.
P(X = 0) = P(E_000) + P(E_1(-1)0) = (1 + 3×3×3)/C(9, 3) = 28/84
P(X = 1) = P(E_100) + P(E_11(-1)) = (3×3 + 3×3)/C(9, 3) = 18/84
P(X = -1) = P(E_(-1)00) + P(E_(-1)(-1)1) = (3×3 + 3×3)/C(9, 3) = 18/84
P(X = 2) = P(E_110) = 3×3/C(9, 3) = 9/84
P(X = -2) = P(E_(-1)(-1)0) = 3×3/C(9, 3) = 9/84
P(X = 3) = P(E_111) = 1/C(9, 3) = 1/84
P(X = -3) = P(E_(-1)(-1)(-1)) = 1/C(9, 3) = 1/84
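A quick check of these numbers (not from the slides): enumerate all C(9, 3) = 84 equally likely draws, assuming the urn holds three balls each of value 1, 0 and -1.

```python
from itertools import combinations
from fractions import Fraction
from collections import Counter

# Assumed urn contents: three balls each of value 1, 0 and -1 (nine balls in total).
balls = [1] * 3 + [0] * 3 + [-1] * 3
draws = list(combinations(range(9), 3))          # all C(9, 3) = 84 unordered draws
counts = Counter(sum(balls[i] for i in d) for d in draws)
pmf_X = {x: Fraction(c, len(draws)) for x, c in sorted(counts.items())}
print(pmf_X)   # 28/84 = 1/3 at 0, 18/84 = 3/14 at ±1, 9/84 = 3/28 at ±2, 1/84 at ±3
```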

Probability mass function
[Chart: p.m.f. of X from the balls example]
The events "X = x" are disjoint and partition the sample space, so for every p.m.f. ∑_x p(x) = 1.

Events from random variables
p.m.f. of X: p(0) = 28/84, p(-1) = p(1) = 18/84, p(-2) = p(2) = 9/84, p(-3) = p(3) = 1/84.
P(X > 0) = 18/84 + 9/84 + 1/84 = 1/3
P(X is even) = 9/84 + 28/84 + 9/84 = 23/42

Example
Two six-sided dice are tossed. Calculate the p.m.f. of the difference D of the outcomes. What is the probability that D > 1? That D is odd?
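As a starting point for this exercise, here is a sketch (not from the slides) that enumerates the 36 equally likely rolls; it assumes D means the absolute difference of the two outcomes.

```python
from itertools import product
from fractions import Fraction
from collections import Counter

rolls = list(product(range(1, 7), repeat=2))       # 36 equally likely ordered rolls
counts = Counter(abs(a - b) for a, b in rolls)     # assuming D = |first roll - second roll|
pmf_D = {d: Fraction(c, 36) for d, c in sorted(counts.items())}

print(pmf_D)
print("P(D > 1)  =", sum(p for d, p in pmf_D.items() if d > 1))
print("P(D odd)  =", sum(p for d, p in pmf_D.items() if d % 2 == 1))
```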

Cumulative distribution function
The cumulative distribution function (c.d.f.) of a discrete random variable X is the function F(x) = P(X ≤ x).
[Charts: p(x) and F(x) for the balls example]
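A minimal sketch (not in the slides) of how a c.d.f. follows from a p.m.f. by accumulating probabilities; the helper name cdf_from_pmf is illustrative, and the p.m.f. used is the one of N above.

```python
from fractions import Fraction

def cdf_from_pmf(pmf):
    """Return F(x) = P(X <= x) at each support point of a finite p.m.f. given as a dict."""
    total, cdf = Fraction(0), {}
    for x in sorted(pmf):
        total += pmf[x]      # F jumps by p(x) at each support point and is flat in between
        cdf[x] = total
    return cdf

pmf_N = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
print(cdf_from_pmf(pmf_N))   # {0: 1/4, 1: 3/4, 2: 1}
```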

Coupon collection

There are n types of coupons. Every day you get one. By when will you get all the coupon types?
Solution
Let X_t be the day you collect the (first) type t coupon.
Let X be the day on which you collect all coupons.
(X ≤ d) = (X_1 ≤ d) and (X_2 ≤ d) and … and (X_n ≤ d)

Coupon collection
Probability model
Let X_1 be the day you collect the type 1 coupon.
Let E_i be the event that you get a type 1 coupon on day i.
Since there are n types, we assume P(E_1) = P(E_2) = … = 1/n.
We also assume E_1, E_2, … are independent.
We are interested in P(X_1 ≤ d).

Coupon collection
(X_1 ≤ d) = E_1 ∪ E_2 ∪ … ∪ E_d
P(X_1 ≤ d) = 1 - P(X_1 > d)
           = 1 - P(E_1^c E_2^c … E_d^c)
           = 1 - P(E_1^c) P(E_2^c) … P(E_d^c)
           = 1 - (1 - 1/n)^d
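A quick sanity check, not in the slides: the closed form above against a simulation of the assumed model (one uniformly random coupon type per day, days independent).

```python
import random

def p_type1_by_day(n, d):
    """Closed form P(X_1 <= d) = 1 - (1 - 1/n)^d under the independence assumption."""
    return 1 - (1 - 1 / n) ** d

def simulate(n, d, trials=100_000):
    """Fraction of trials in which coupon type 0 shows up at least once within d days."""
    hits = sum(any(random.randrange(n) == 0 for _ in range(d)) for _ in range(trials))
    return hits / trials

print(p_type1_by_day(15, 30), simulate(15, 30))   # the two numbers should be close
```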

Coupon collection
There are n types of coupons. Every day you get one. By when will you get all the coupon types?
Solution
Let X be the day on which you collect all coupons, and X_t the day when you get your type t coupon.
(X ≤ d) = (X_1 ≤ d) and (X_2 ≤ d) and … and (X_n ≤ d)
(X > d) = (X_1 > d) ∪ (X_2 > d) ∪ … ∪ (X_n > d), but these events are not independent!

Coupon collection
We calculate P(X > d) by inclusion-exclusion:
P(X > d) = ∑ P(X_t > d) - ∑ P(X_t > d and X_u > d) + …
P(X_1 > d) = (1 - 1/n)^d, and by symmetry P(X_t > d) = (1 - 1/n)^d for every t.
Let F_i be the event "the day i coupon is not of type 1 or 2". Then
P(X_1 > d and X_2 > d) = P(F_1 F_2 … F_d) = P(F_1) P(F_2) … P(F_d)   (independent events)
                       = (1 - 2/n)^d

Coupon collection
P(X > d) = ∑ P(X_t > d) - ∑ P(X_t > d and X_u > d) + …
P(X_1 > d) = (1 - 1/n)^d
P(X_1 > d and X_2 > d) = (1 - 2/n)^d
P(X_1 > d and X_2 > d and X_3 > d) = (1 - 3/n)^d, and so on, so
P(X > d) = C(n, 1) (1 - 1/n)^d - C(n, 2) (1 - 2/n)^d + …
         = ∑_{i=1}^{n} (-1)^{i+1} C(n, i) (1 - i/n)^d
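Not part of the slides: a direct transcription of this inclusion-exclusion sum into Python, using math.comb for C(n, i); the function name is illustrative.

```python
from math import comb

def p_not_all_collected(n, d):
    """P(X > d) by inclusion-exclusion: some coupon type is still missing after d days."""
    return sum((-1) ** (i + 1) * comb(n, i) * (1 - i / n) ** d for i in range(1, n + 1))

print(1 - p_not_all_collected(15, 60))   # P(X <= 60) for n = 15 coupon types
```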

Coupon collection
[Plot: P(X ≤ d), the probability of collecting all n coupons by day d, for n = 15]

Coupon collection
[Plots: P(X ≤ d) against d for n = 5, n = 10, n = 15, n = …]

Coupon collection
[Plot: the day on which the probability of collecting all n coupons first exceeds 1/2, as a function of n, compared with the function n ln(n / ln 2)]

Coupon collection
16 teams, 17 coupons per team: 272 coupons in total.
It takes 1624 days to collect all coupons with probability 1/2.
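A sketch, not from the slides, that recovers this figure by finding the smallest d with P(X ≤ d) ≥ 1/2 and comparing it with n ln(n / ln 2); median_day is an illustrative name, and the inclusion-exclusion helper is repeated so the snippet is self-contained.

```python
from math import comb, log

def p_all_collected(n, d):
    """P(X <= d) from the inclusion-exclusion formula derived above."""
    return 1 - sum((-1) ** (i + 1) * comb(n, i) * (1 - i / n) ** d
                   for i in range(1, n + 1))

def median_day(n):
    """Smallest d such that all n coupons are collected by day d with probability >= 1/2."""
    d = 1
    while p_all_collected(n, d) < 0.5:
        d += 1
    return d

n = 272
print(median_day(n))          # first d with P(X <= d) >= 1/2; the slide quotes 1624 days
print(n * log(n / log(2)))    # about 1624.4, the n ln(n / ln 2) approximation
```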

Expected value
The expected value (expectation) of a random variable X with p.m.f. p is E[X] = ∑_x x p(x).
Example: one coin flip, N = number of Hs.
x:     0     1
p(x):  1/2   1/2
E[N] = 0 · 1/2 + 1 · 1/2 = 1/2

Expected value
Example: two coin flips, N = number of Hs.
x:     0     1     2
p(x):  1/4   1/2   1/4
E[N] = 0 · 1/4 + 1 · 1/2 + 2 · 1/4 = 1
The expectation is the average value the random variable takes when the experiment is done many times.

Expected value
Example: F = face value of a fair 6-sided die.
E[F] = 1 · 1/6 + 2 · 1/6 + 3 · 1/6 + 4 · 1/6 + 5 · 1/6 + 6 · 1/6 = 7/2 = 3.5
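A generic helper, not from the slides, that evaluates E[X] = ∑_x x p(x) for a finite p.m.f. given as a dict, checked against the three examples above; the name expectation is illustrative.

```python
from fractions import Fraction

def expectation(pmf):
    """E[X] = sum over x of x * p(x) for a finite p.m.f. given as a dict."""
    return sum(x * p for x, p in pmf.items())

half, quarter, sixth = Fraction(1, 2), Fraction(1, 4), Fraction(1, 6)
print(expectation({0: half, 1: half}))                    # 1/2  (one coin flip)
print(expectation({0: quarter, 1: half, 2: quarter}))     # 1    (two coin flips)
print(expectation({f: sixth for f in range(1, 7)}))       # 7/2  (fair die)
```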

Chuck-a-luck
You pick a number from 1 to 6 and three dice are rolled. If your number doesn't appear, you lose $1. If it appears k times, you win $k.

Chuck-a-luck
Solution
P = profit
n:     -1         1                2                3
p(n):  (5/6)^3    3(5/6)^2(1/6)    3(5/6)(1/6)^2    (1/6)^3
E[P] = -1 × (5/6)^3 + 1 × 3(5/6)^2(1/6) + 2 × 3(5/6)(1/6)^2 + 3 × (1/6)^3 = -17/216
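Not in the slides: the same computation done two ways, exactly from the p.m.f. and approximately by simulation, assuming the stated rules (lose $1 if your number is absent, win $k if it appears k times).

```python
import random
from fractions import Fraction
from math import comb

# Exact p.m.f. of the profit P: your number appears k times in 3 dice, k ~ Binomial(3, 1/6).
p_appear = Fraction(1, 6)
pmf_P = {(-1 if k == 0 else k): comb(3, k) * p_appear**k * (1 - p_appear)**(3 - k)
         for k in range(4)}
print(sum(n * p for n, p in pmf_P.items()))    # Fraction(-17, 216), the exact expected profit

# Monte Carlo check, betting on the number 6.
def play():
    k = sum(random.randint(1, 6) == 6 for _ in range(3))
    return -1 if k == 0 else k

trials = 200_000
print(sum(play() for _ in range(trials)) / trials)   # close to -17/216, about -0.0787
```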

Utility
Should I come to class next Tuesday?
            not called (11/17)    called (6/17)
Come        5                     -20
Skip        100                   -300
E[C] = 5 × 11/17 - 20 × 6/17 = -3.82…
E[S] = 100 × 11/17 - 300 × 6/17 = -41.17…
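A tiny sketch, not from the slides, comparing the two expected utilities; the payoff numbers (5, -20, 100, -300) are the ones read off the slide's formulas and are assumptions beyond that.

```python
from fractions import Fraction

p_not_called, p_called = Fraction(11, 17), Fraction(6, 17)

E_come = 5 * p_not_called + (-20) * p_called      # expected utility of coming to class
E_skip = 100 * p_not_called + (-300) * p_called   # expected utility of skipping

print(float(E_come), float(E_skip))               # about -3.82 and -41.18
print("better to come" if E_come > E_skip else "better to skip")
```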