5. Continuous Random Variables

Delivery time A package is to be delivered between noon and 1pm. When will it arrive?

Delivery time: a probability model
Sample space S1 = {0, 1, …, 59}, equally likely outcomes.
Random variable X = minute when the package arrives: X(w) = w, so X(0) = 0, X(1) = 1, …, X(59) = 59.
E[X] = 0·1/60 + … + 59·1/60 = 29.5
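This expectation can be checked with a short computation (a sketch in Python; the model is the 60 equally likely minutes above):

```python
# Expected arrival minute under the 60-outcome model:
# each minute 0..59 has probability 1/60.
minutes = range(60)
expectation = sum(m * (1 / 60) for m in minutes)
print(expectation)  # 29.5
```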

Delivery time: a more precise probability model
S2 = {0, 1/60, 2/60, …, 1, 1 1/60, …, 59 59/60}: 3600 equally likely outcomes.
X: minute when the package arrives.
E[X] = 29.9916…

Taking precision to the limit
S = the (continuous) interval [0, 60), equally likely outcomes.
S1 = {0, 1, …, 59}: p = 1/60
S2 = {0, 1/60, …, 59 59/60}: p = 1/3600
S = [0, 60): p = 0

Uncountable sample spaces
In Lecture 2 we said: “The probability of an event is the sum of the probabilities of its elements,” but in S = [0, 60) every element has probability zero! To specify and calculate probabilities, we have to work with the axioms of probability.

The uniform random variable
Sample space: S = [0, 60).
Events of interest: intervals [x, y) ⊆ [0, 60), their intersections, unions, etc.
Probabilities: P([x, y)) = (y – x)/60
Random variable: X(w) = w

How to do calculations
You walk out of the apartment from 12:30 to 12:45. What is the probability you missed the delivery?
Solution
Event of interest: E = [30, 45), i.e. E = “30 ≤ X < 45”
P(E) = (45 – 30)/60 = 1/4

How to do calculations
From 12:08 to 12:12 and from 12:54 to 12:57 the doorbell wasn’t working. What is the probability you missed the delivery?
Event of interest: E = “8 ≤ X < 12” ∪ “54 ≤ X < 57”
P(E) = P([8, 12)) + P([54, 57)) = 4/60 + 3/60 = 7/60
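Both calculations follow the same rule: the total length of the event divided by 60. A small helper function (hypothetical, written for this sketch) makes the rule explicit:

```python
def uniform_prob(intervals, total=60.0):
    """P(E) when E is a union of disjoint subintervals of [0, 60)."""
    return sum(b - a for a, b in intervals) / total

print(uniform_prob([(30, 45)]))           # 0.25
print(uniform_prob([(8, 12), (54, 57)]))  # 7/60 ≈ 0.1167
```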

Cumulative distribution function The probability mass function doesn’t make much sense because P(X = x) = 0 for all x. Instead, we can describe X by its cumulative distribution function (c.d.f.) F: F(x) = P(X ≤ x) The c.d.f. makes sense for discrete as well as continuous random variables.

Cumulative distribution functions
(Plots: p.m.f. f(x) = P(X = x) and c.d.f. F(x) = P(X ≤ x).)

Uniform random variable
If X is uniform over [0, 60), then its c.d.f. is
F(x) = P(X ≤ x) = 0 for x < 0
F(x) = x/60 for x ∈ [0, 60)
F(x) = 1 for x ≥ 60

Cumulative distribution functions
discrete: p.m.f. f(x) = P(X = x); c.d.f. F(x) = P(X ≤ x)
continuous: c.d.f. F(x) = P(X ≤ x); what plays the role of the p.m.f.?

Discrete random variables:
p.m.f. f(x) = P(X = x)
c.d.f. F(x) = P(X ≤ x)
f(x) = F(x) – F(x – d) for small enough d
F(a) = ∑x ≤ a f(x)
Continuous random variables:
The probability density function (p.d.f.) of a random variable with c.d.f. F(x) is
f(x) = lim d → 0 [F(x) – F(x – d)]/d = dF(x)/dx

Discrete random variables:
p.m.f. f(x) = P(X = x); c.d.f. F(x) = P(X ≤ x); F(a) = ∑x ≤ a f(x)
Continuous random variables:
p.d.f. f(x) = dF(x)/dx; c.d.f. F(x) = P(X ≤ x); F(a) = ∫x ≤ a f(x)dx

Uniform random variable
c.d.f.: F(x) = 0 if x < 0
        F(x) = x/60 if x ∈ [0, 60)
        F(x) = 1 if x ≥ 60
p.d.f.: f(x) = dF(x)/dx = 0 if x < 0
        f(x) = 1/60 if x ∈ (0, 60)
        f(x) = 0 if x > 60
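The uniform c.d.f. and p.d.f. can be written directly as functions (a sketch; the names F and f follow the slides, and the endpoints are parameters):

```python
def F(x, a=0.0, b=60.0):
    """c.d.f. of a uniform random variable on [a, b)."""
    if x < a:
        return 0.0
    if x >= b:
        return 1.0
    return (x - a) / (b - a)

def f(x, a=0.0, b=60.0):
    """p.d.f.: the derivative of F away from the endpoints."""
    return 1.0 / (b - a) if a < x < b else 0.0

print(F(30))  # 0.5
print(f(30))  # 1/60 ≈ 0.0167
```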

Cumulative distribution functions discrete c.d.f. F(x) = P(X ≤ x) p.m.f. f(x) = P(X = x) continuous c.d.f. F(x) = P(X ≤ x) p.d.f. f(x) = dF(x)/dx

Uniform random variable
A random variable X is Uniform(0, 1) if its p.d.f. is
f(x) = 1 if x ∈ (0, 1)
f(x) = 0 if x < 0 or x > 1
A Uniform(a, b) random variable has p.d.f.
f(x) = 1/(b – a) if x ∈ (a, b)
f(x) = 0 if x < a or x > b

Some practice
A package is to be delivered between noon and 1pm. It is now 12:30 and the package has not arrived yet. When will it arrive?

Some practice
Probability model: arrival time X is Uniform(0, 60).
We want the c.d.f. of X conditioned on X > 30:
P(X ≤ x | X > 30) = P(X ≤ x and X > 30) / P(X > 30)
                  = P(30 < X ≤ x) / P(X > 30)
                  = [(x – 30)/60] / (1/2)
                  = (x – 30)/30

Some practice The c.d.f. of X conditioned on X > 30 is G(x) = (x – 30)/30 for x in [30, 60) The p.d.f. of X conditioned on X > 30 is g(x) = dG(x)/dx = 1/30 for x in [30, 60) and 0 outside. So X conditioned on X > 30 is Uniform(30, 60).
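That conclusion, that conditioning a Uniform(0, 60) arrival on X > 30 gives Uniform(30, 60), can be sanity-checked by simulation (a Monte Carlo sketch; the seed and sample size are arbitrary choices):

```python
import random

random.seed(0)
samples = [random.uniform(0, 60) for _ in range(200_000)]
# Keep only the outcomes consistent with the condition X > 30.
conditioned = [x for x in samples if x > 30]

mean = sum(conditioned) / len(conditioned)
print(round(mean, 1))  # close to 45.0, the mean of Uniform(30, 60)
```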

Waiting for a friend
Your friend said she’ll show up between 7 and 8, but probably around 7:30. It is now 7:30. What is the probability you have to wait past 7:45?

Waiting for a friend
Probability model: let’s assume the arrival time X (in minutes past 7) has a triangular p.d.f. that peaks at f(30) = 1/30 and falls to 0 at x = 0 and x = 60.
We want to calculate
P(X > 45 | X > 30) = P(X > 45) / P(X > 30)

Waiting for a friend
P(X > 30) = ∫[30, 60] f(x)dx = 1/2
P(X > 45) = ∫[45, 60] f(x)dx = 1/8
so P(X > 45 | X > 30) = (1/8)/(1/2) = 1/4.
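Assuming the triangular p.d.f. described above (peak 1/30 at x = 30, zero at 0 and 60), the two integrals can be checked numerically (a sketch; the midpoint rule and the step count are arbitrary choices):

```python
def f(x):
    # Assumed triangular p.d.f.: rises linearly from 0 at x=0
    # to 1/30 at x=30, then falls back to 0 at x=60.
    if 0 <= x <= 30:
        return x / 900.0
    if 30 < x <= 60:
        return (60.0 - x) / 900.0
    return 0.0

def integrate(g, a, b, n=100_000):
    # Midpoint-rule numerical integration of g over [a, b].
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

p30 = integrate(f, 30, 60)  # ≈ 1/2
p45 = integrate(f, 45, 60)  # ≈ 1/8
print(round(p45 / p30, 3))  # ≈ 0.25
```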

Interpretation of the p.d.f.
The value f(x)·d approximates the probability that X falls in an interval of length d around x:
P(x – d ≤ X < x) = f(x)·d + o(d)
P(x ≤ X < x + d) = f(x)·d + o(d)
Example: if X is uniform over [0, 60), then f(x) = 1/60 and P(x ≤ X < x + d) = d/60.

Discrete versus continuous
          discrete          continuous
          p.m.f. f(x)       p.d.f. f(x)
P(X ≤ a)  ∑x ≤ a f(x)       ∫x ≤ a f(x)dx
E[X]      ∑x x f(x)         ∫x x f(x)dx
E[X2]     ∑x x2 f(x)        ∫x x2 f(x)dx
In both cases, Var[X] = E[(X – E[X])2] = E[X2] – E[X]2.

Uniform random variable
A random variable X is Uniform(0, 1) if its p.d.f. is
f(x) = 1 if x ∈ (0, 1)
f(x) = 0 if x < 0 or x > 1
∫[0, 1] f(x)dx = ∫[0, 1] dx = 1
E[X] = ∫[0, 1] x f(x)dx = x2/2 |[0, 1] = 1/2
E[X2] = ∫[0, 1] x2 f(x)dx = x3/3 |[0, 1] = 1/3
Var[X] = 1/3 – (1/2)2 = 1/12
m = E[X] = 1/2, s = √Var[X] = 1/√12 ≈ 0.289
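The same moments fall out of a quick numerical check (a sketch using midpoint-rule integration over (0, 1), where f(x) = 1):

```python
def integrate(g, a=0.0, b=1.0, n=100_000):
    # Midpoint-rule approximation of the integral of g over [a, b].
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

mean = integrate(lambda x: x)       # E[X]  ≈ 1/2
ex2 = integrate(lambda x: x * x)    # E[X2] ≈ 1/3
var = ex2 - mean ** 2               # Var[X] ≈ 1/12
print(round(var, 4))                # ≈ 0.0833
```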

Uniform random variable
A random variable X is Uniform(a, b) if its p.d.f. is
f(x) = 1/(b – a) if x ∈ (a, b)
f(x) = 0 if x < a or x > b
Then E[X] = (a + b)/2 and Var[X] = (b – a)2/12.

Raindrops again
Rain is falling on your head at an average rate of λ drops/second. How long do we wait until the next drop?

Raindrops again
Probability model: time is divided into intervals of length 1/n. The events Ei = “a raindrop hits in interval i” each have probability p = λ/n and are independent.
X = interval of the first drop
P(X = x) = P(E1c … Ex-1c Ex) = (1 – p)x-1 p

Raindrops again
X = interval of the first drop; T = time (in seconds) of the first drop.
X = x means that (x – 1)/n ≤ T < x/n, so
P((x – 1)/n ≤ T < x/n) = P(X = x) = (1 – p)x-1 p = (1 – λ/n)x-1 (λ/n)

Raindrops again
T = time (in seconds) of the first drop
P((x – 1)/n ≤ T < x/n) = (1 – λ/n)x-1 (λ/n)
If we set t = (x – 1)/n and d = 1/n, we get
P(t ≤ T < t + d) = (1 – λd)t/d (λd) = λd · e–(λd + o(λd)) t/d
f(t) = lim d → 0 P(t ≤ T < t + d)/d = λe–λt

The exponential random variable
The p.d.f. of an Exponential(λ) random variable T is
f(t) = λe–λt if t ≥ 0
f(t) = 0 if t < 0
(Plots: p.d.f. f(t) and c.d.f. F(t) = P(T ≤ t) for λ = 1.)

The exponential random variable
The c.d.f. of T is, for a ≥ 0,
F(a) = ∫[0, a] λe–λt dt = –e–λt |[0, a] = 1 – e–λa
What should the expected value of T be? (Hint: rain falls at λ drops/second; how many seconds until the first drop?)
E[T] = 1/λ, Var[T] = 1/λ2
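Both the c.d.f. formula and E[T] = 1/λ can be checked by sampling (a sketch; λ = 2 and the sample size are arbitrary). Inverting the c.d.f. gives a standard way to sample: if U is Uniform(0, 1), then T = –ln(1 – U)/λ is Exponential(λ).

```python
import math
import random

random.seed(1)
lam = 2.0
# Sample by inversion of F(a) = 1 - e^(-lam*a):  T = -ln(1 - U)/lam.
samples = [-math.log(1 - random.random()) / lam for _ in range(200_000)]

mean = sum(samples) / len(samples)
frac_below_1 = sum(t <= 1 for t in samples) / len(samples)

print(round(mean, 2))          # ≈ 1/λ = 0.5
print(round(frac_below_1, 2))  # ≈ 1 - e^(-2) ≈ 0.86
```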

Poisson vs. exponential
                 Poisson(λ)                           Exponential(λ)
description      number of events within a time unit  time until the first event happens
expectation      λ                                    1/λ
std. deviation   √λ                                   1/λ
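The relationship in the table can be illustrated by simulation (a sketch: generate Exponential(λ) inter-arrival gaps and check that arrivals per unit of time average λ; the seed, λ, and horizon are arbitrary choices):

```python
import random

random.seed(2)
lam, horizon = 4.0, 50_000.0

# Build arrival times from independent Exponential(lam) inter-arrival gaps.
t, num_arrivals = 0.0, 0
while True:
    t += random.expovariate(lam)
    if t >= horizon:
        break
    num_arrivals += 1

rate = num_arrivals / horizon  # arrivals per unit time
print(round(rate, 1))          # ≈ λ = 4.0
```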

Memoryless property
How much time passes between the second and the third drop?
Solution
Start the clock when the second drop falls; what happened before is irrelevant. Then T is Exponential(λ).

Expected time
What is the expected time of the third drop?
Solution
T = T1 + T2 + T3, where each Ti is an Exponential(λ) gap. T is not exponential, but
E[T] = E[T1] + E[T2] + E[T3] = 3/λ
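A Monte Carlo check of E[T] = 3/λ (a sketch; with λ = 2 the expected time of the third drop is 1.5 seconds):

```python
import random

random.seed(3)
lam, trials = 2.0, 100_000

# Time of the third drop = sum of three independent Exponential(lam) gaps.
total = sum(
    random.expovariate(lam) + random.expovariate(lam) + random.expovariate(lam)
    for _ in range(trials)
)
mean_third_drop = total / trials
print(round(mean_third_drop, 2))  # ≈ 3/λ = 1.5
```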