Further Topics on Random Variables: 1


Further Topics on Random Variables: 1. Transforms (Moment Generating Functions) 2. Sum of a Random Number of Independent Random Variables
Berlin Chen, Department of Computer Science & Information Engineering, National Taiwan Normal University
Reference: D. P. Bertsekas and J. N. Tsitsiklis, Introduction to Probability, Sections 4.4 & 4.5

Transforms Also called the moment generating functions (MGFs) of random variables. The transform associated with a random variable X is a function M_X(s) of a free scalar parameter s, defined by M_X(s) = E[e^{sX}]. If X is discrete with PMF p_X: M_X(s) = Σ_x e^{sx} p_X(x). If X is continuous with PDF f_X: M_X(s) = ∫ e^{sx} f_X(x) dx.
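The defining expectation can be checked by simulation. Below is a minimal Monte Carlo sketch (illustrative parameters: an exponential random variable with λ = 2, evaluated at s = 0.5) comparing an estimate of E[e^{sX}] with the closed form λ/(λ − s):

```python
import math
import random

# Monte Carlo sketch of the transform M_X(s) = E[e^{sX}] for an
# exponential random variable with parameter lam (illustrative values).
random.seed(0)
lam, s = 2.0, 0.5                 # require s < lam so the transform is finite
n = 200_000
estimate = sum(math.exp(s * random.expovariate(lam)) for _ in range(n)) / n
closed_form = lam / (lam - s)     # exponential transform: lam / (lam - s)
print(round(estimate, 2), round(closed_form, 4))
```

With 200,000 samples the estimate typically agrees with λ/(λ − s) = 4/3 to two decimal places.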

Illustrative Examples (1/5) Example 4.22. Let X have the PMF p_X(2) = 1/2, p_X(3) = 1/6, p_X(5) = 1/3. Then M_X(s) = (1/2)e^{2s} + (1/6)e^{3s} + (1/3)e^{5s}.

Illustrative Examples (2/5) Example 4.23. The Transform of a Poisson Random Variable. Consider a Poisson random variable X with parameter λ: p_X(k) = e^{−λ} λ^k / k!, k = 0, 1, 2, .... Then M_X(s) = Σ_{k≥0} e^{sk} e^{−λ} λ^k / k! = e^{−λ} Σ_{k≥0} (λe^s)^k / k! = e^{λ(e^s − 1)}.

Illustrative Examples (3/5) Example 4.24. The Transform of an Exponential Random Variable. Let X be an exponential random variable with parameter λ: f_X(x) = λe^{−λx}, x ≥ 0. Then M_X(s) = ∫_0^∞ e^{sx} λe^{−λx} dx = λ/(λ − s), valid for s < λ.

Illustrative Examples (4/5) Example 4.25. The Transform of a Linear Function of a Random Variable. Let M_X(s) be the transform associated with a random variable X. Consider a new random variable Y = aX + b. We then have M_Y(s) = E[e^{s(aX+b)}] = e^{sb} E[e^{saX}] = e^{sb} M_X(as). For example, if X is exponential with parameter λ and Y = 2X + 3, then M_Y(s) = e^{3s} λ/(λ − 2s).

Illustrative Examples (5/5) Example 4.26. The Transform of a Normal Random Variable. Let X be normal with mean μ and variance σ². Completing the square in the integral E[e^{sX}] gives M_X(s) = e^{(σ²s²/2) + μs}.

From Transforms to Moments (1/2) Given a random variable X, we have M_X(s) = Σ_x e^{sx} p_X(x) (if X is discrete) or M_X(s) = ∫ e^{sx} f_X(x) dx (if X is continuous). Taking the derivative with respect to s (for example, in the continuous case): d/ds M_X(s) = ∫ x e^{sx} f_X(x) dx. If we evaluate it at s = 0, we further have d/ds M_X(s)|_{s=0} = ∫ x f_X(x) dx = E[X], the first moment of X.

From Transforms to Moments (2/2) More generally, differentiating M_X(s) n times with respect to s and evaluating at s = 0 yields the n-th moment of X: d^n/ds^n M_X(s)|_{s=0} = E[X^n].
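This moment-generating property is easy to verify numerically. The sketch below differentiates the exponential transform M(s) = λ/(λ − s) at s = 0 by finite differences (λ = 3 is an illustrative choice) and recovers E[X] = 1/λ and E[X²] = 2/λ²:

```python
# Finite-difference sketch: moments of an exponential(lam) variable
# recovered by differentiating its transform M(s) = lam / (lam - s) at s = 0.
lam = 3.0
M = lambda s: lam / (lam - s)
h = 1e-5
first = (M(h) - M(-h)) / (2 * h)            # approximates M'(0)  = E[X]   = 1/lam
second = (M(h) - 2 * M(0) + M(-h)) / h**2   # approximates M''(0) = E[X^2] = 2/lam^2
print(round(first, 4), round(second, 4))
```

Both values match 1/λ = 0.3333 and 2/λ² ≈ 0.2222 to four decimal places.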

Illustrative Examples (1/2) Example 4.27. Given a random variable X with PMF p_X(2) = 1/2, p_X(3) = 1/6, p_X(5) = 1/3: M_X(s) = (1/2)e^{2s} + (1/6)e^{3s} + (1/3)e^{5s}, so E[X] = M'_X(0) = 2·(1/2) + 3·(1/6) + 5·(1/3) = 19/6.

Illustrative Examples (2/2) Example. Given an exponential random variable X with PDF f_X(x) = λe^{−λx}, x ≥ 0: M_X(s) = λ/(λ − s), so E[X] = M'_X(0) = 1/λ and E[X²] = M''_X(0) = 2/λ².

Two Properties of Transforms For any random variable X, we have M_X(0) = E[e^{0·X}] = 1. If the random variable X only takes nonnegative integer values (k = 0, 1, 2, ...), then lim_{s→−∞} M_X(s) = P(X = 0).

Inversion of Transforms Inversion Property: the transform M_X(s) associated with a random variable X uniquely determines the probability law of X, assuming that M_X(s) is finite for all s in some interval [−a, a] with a > 0. Determining the probability law of a random variable means determining its PDF and CDF. In particular, if M_X(s) = M_Y(s) for all s in [−a, a], then the random variables X and Y have the same probability law.

Illustrative Examples (1/2) Example 4.28. We are told that the transform associated with a random variable X is M_X(s) = (1/4)e^{−s} + 1/2 + (1/8)e^{4s} + (1/8)e^{5s}. Matching this against the generic discrete transform Σ_x e^{sx} p_X(x), we conclude that X takes the values −1, 0, 4, 5 with probabilities 1/4, 1/2, 1/8, 1/8, respectively.

Illustrative Examples (2/2) Example 4.29. The Transform of a Geometric Random Variable. We are told that the transform associated with random variable X is of the form M_X(s) = pe^s / (1 − (1 − p)e^s), where 0 < p ≤ 1. Comparing with the transform of the geometric PMF p_X(k) = p(1 − p)^{k−1}, k = 1, 2, ..., we conclude that X is geometric with parameter p.

Mixture of Distributions of Random Variables (1/3) Let X_1, ..., X_n be continuous random variables with PDFs f_{X_1}, ..., f_{X_n}, and let Y be a random variable that is equal to X_i with probability p_i (where p_1 + ... + p_n = 1). Then f_Y(y) = p_1 f_{X_1}(y) + ... + p_n f_{X_n}(y), and M_Y(s) = p_1 M_{X_1}(s) + ... + p_n M_{X_n}(s).
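A simulation sketch of the mixture mechanism (the component weights and exponential rates below are illustrative): drawing the component index first and then sampling from that component reproduces the mixture mean Σ_i p_i E[X_i]:

```python
import random

# Sketch: sample from a two-component exponential mixture and compare the
# sample mean with the exact mixture mean p1*E[X1] + p2*E[X2].
random.seed(1)
ps = [0.3, 0.7]          # mixing probabilities (illustrative)
lams = [2.0, 5.0]        # exponential rates (illustrative)
n = 200_000

def draw():
    i = 0 if random.random() < ps[0] else 1   # pick a component, then sample it
    return random.expovariate(lams[i])

sample_mean = sum(draw() for _ in range(n)) / n
exact_mean = ps[0] / lams[0] + ps[1] / lams[1]   # 0.3*0.5 + 0.7*0.2 = 0.29
print(round(sample_mean, 2), round(exact_mean, 2))
```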

Mixture of Distributions of Random Variables (2/3)

Mixture of Distributions of Random Variables (3/3) Mixture of Gaussian Distributions: complex distributions with multiple local maxima can be approximated by a mixture of Gaussians (each of which is unimodal). A Gaussian mixture with enough mixture components can approximate virtually any distribution.

An Illustrative Example (1/2) Example 4.30. The Transform of a Mixture of Two Distributions. The neighborhood bank has three tellers, two of them fast, one slow. The time to assist a customer is exponentially distributed with parameter λ = 6 at the fast tellers, and λ = 4 at the slow teller. Jane enters the bank and chooses a teller at random, each one with probability 1/3. Find the PDF of the time it takes to assist Jane and the associated transform.

An Illustrative Example (2/2) The service time of each teller is exponentially distributed: 6e^{−6x} at the faster tellers and 4e^{−4x} at the slower teller. The distribution of the time X that a customer spends in the bank is the mixture f_X(x) = (2/3)·6e^{−6x} + (1/3)·4e^{−4x}, x ≥ 0. The associated transform is M_X(s) = (2/3)·6/(6 − s) + (1/3)·4/(4 − s).

Sum of Independent Random Variables Addition of independent random variables corresponds to multiplication of their transforms. Let X and Y be independent random variables, and let Z = X + Y. The transform associated with Z is M_Z(s) = E[e^{sZ}] = E[e^{s(X+Y)}] = E[e^{sX} e^{sY}]. Since X and Y are independent, e^{sX} and e^{sY} are independent (they are functions of X and Y, respectively), so M_Z(s) = E[e^{sX}] E[e^{sY}] = M_X(s) M_Y(s). More generally, if X_1, ..., X_n is a collection of independent random variables and Z = X_1 + ... + X_n, then M_Z(s) = M_{X_1}(s) ··· M_{X_n}(s).
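The multiplication rule can also be checked by simulation. Below is a sketch with two independent exponential(3) variables (an illustrative choice) evaluated at s = 1, where M_X(s)M_Y(s) = (3/2)² = 2.25:

```python
import math
import random

# Sketch: for independent X and Y, E[e^{s(X+Y)}] = M_X(s) * M_Y(s).
# Checked by simulation for two independent exponential(lam) variables.
random.seed(2)
lam, s, n = 3.0, 1.0, 200_000
est = sum(math.exp(s * (random.expovariate(lam) + random.expovariate(lam)))
          for _ in range(n)) / n
product = (lam / (lam - s)) ** 2   # M_X(s) * M_Y(s) = (3/2)^2 = 2.25
print(round(est, 2), product)
```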

Illustrative Examples (1/3) Example 4.10. The Transform of the Binomial. Let X_1, ..., X_n be independent Bernoulli random variables with a common parameter p. Then M_{X_i}(s) = (1 − p) + pe^s for all i. If Y = X_1 + ... + X_n, Y can be thought of as a binomial random variable with parameters n and p, and its corresponding transform is given by M_Y(s) = (1 − p + pe^s)^n.

Illustrative Examples (2/3) Example 4.11. The Sum of Independent Poisson Random Variables is Poisson. Let X and Y be independent Poisson random variables with means λ and μ, respectively. Their transforms are M_X(s) = e^{λ(e^s − 1)} and M_Y(s) = e^{μ(e^s − 1)}. If Z = X + Y, then the transform of Z is M_Z(s) = M_X(s) M_Y(s) = e^{(λ+μ)(e^s − 1)}. From the transform of Z, we can conclude that Z is also a Poisson random variable, with mean λ + μ (cf. the transform of the Poisson derived earlier in this lecture).
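The same conclusion can be verified without transforms by numerically convolving the two PMFs; a sketch with illustrative rates λ = 2 and μ = 3:

```python
import math

# Sketch: the convolution of Poisson(lam) and Poisson(mu) PMFs equals the
# Poisson(lam + mu) PMF, term by term.
def poisson_pmf(k, rate):
    return math.exp(-rate) * rate ** k / math.factorial(k)

lam, mu = 2.0, 3.0
for k in range(10):
    conv = sum(poisson_pmf(j, lam) * poisson_pmf(k - j, mu) for j in range(k + 1))
    assert abs(conv - poisson_pmf(k, lam + mu)) < 1e-12
print("convolution matches Poisson(lam + mu) for k = 0..9")
```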

Illustrative Examples (3/3) Example 4.12. The Sum of Independent Normal Random Variables is Normal. Let X and Y be independent normal random variables with means μ_x, μ_y and variances σ_x², σ_y², respectively. Their transforms are M_X(s) = e^{(σ_x²s²/2) + μ_x s} and M_Y(s) = e^{(σ_y²s²/2) + μ_y s}. If Z = X + Y, then the transform of Z is M_Z(s) = M_X(s) M_Y(s) = e^{((σ_x² + σ_y²)s²/2) + (μ_x + μ_y)s}. From the transform of Z, we can conclude that Z is also normal, with mean μ_x + μ_y and variance σ_x² + σ_y² (cf. the transform of the normal derived earlier in this lecture).

Tables of Transforms (1/2) Transforms of common discrete random variables:
Bernoulli(p): p_X(1) = p, p_X(0) = 1 − p; M_X(s) = 1 − p + pe^s
Binomial(n, p): p_X(k) = C(n, k) p^k (1 − p)^{n−k}, k = 0, ..., n; M_X(s) = (1 − p + pe^s)^n
Geometric(p): p_X(k) = p(1 − p)^{k−1}, k = 1, 2, ...; M_X(s) = pe^s / (1 − (1 − p)e^s)
Poisson(λ): p_X(k) = e^{−λ} λ^k / k!, k = 0, 1, ...; M_X(s) = e^{λ(e^s − 1)}

Tables of Transforms (2/2) Transforms of common continuous random variables:
Uniform(a, b): f_X(x) = 1/(b − a), a ≤ x ≤ b; M_X(s) = (e^{sb} − e^{sa}) / (s(b − a))
Exponential(λ): f_X(x) = λe^{−λx}, x ≥ 0; M_X(s) = λ/(λ − s), s < λ
Normal(μ, σ²): M_X(s) = e^{(σ²s²/2) + μs}

Exercise Given that X is an exponential random variable with parameter λ: (i) Show that the transform (moment generating function) of X can be expressed as M_X(s) = λ/(λ − s), for s < λ. (ii) Find the expectation and variance of X based on its transform. (iii) Given that random variable Y can be expressed as a linear function Y = aX + b, find the transform of Y. (iv) Given that Z is also an exponential random variable with parameter λ, and X and Z are independent, find the transform of the random variable X + Z.

Sum of a Random Number of Independent Random Variables (1/4) Suppose that N is a random variable taking positive integer values, and that X_1, X_2, ... are independent, identically distributed (i.i.d.) random variables (with common mean E[X] and variance var(X)). We also assume that N and any subset of {X_1, X_2, ...} are independent. What are the formulas for the mean, variance, and transform of Y = X_1 + X_2 + ... + X_N? (If N = 0, we let Y = 0.)

Sum of a Random Number of Independent Random Variables (2/4) If we fix some number n, the random variable X_1 + ... + X_n is independent of the random variable N, and E[Y | N = n] = E[X_1 + ... + X_n] = nE[X]. Thus E[Y | N] = N E[X] can be viewed as a function of the random variable N, and is itself a random variable. The mean of Y can be calculated by using the law of iterated expectations: E[Y] = E[E[Y | N]] = E[N E[X]] = E[N] E[X].

Sum of a Random Number of Independent Random Variables (3/4) Similarly, var(Y | N = n) = n var(X), so var(Y | N) = N var(X) can be viewed as a function of the random variable N, and is itself a random variable. The variance of Y can be calculated using the law of total variance: var(Y) = E[var(Y | N)] + var(E[Y | N]) = E[N] var(X) + (E[X])² var(N).

Sum of a Random Number of Independent Random Variables (4/4) Similarly, E[e^{sY} | N = n] = (M_X(s))^n, so E[e^{sY} | N] = (M_X(s))^N can be viewed as a function of the random variable N, and is itself a random variable. The transform of Y can be calculated by using the law of iterated expectations: M_Y(s) = E[e^{sY}] = E[E[e^{sY} | N]] = E[(M_X(s))^N]. That is, M_Y(s) is obtained from the transform M_N(s) = E[e^{sN}] by replacing each occurrence of e^s with M_X(s).

Properties of the Sum of a Random Number of Independent Random Variables For Y = X_1 + ... + X_N as above: E[Y] = E[N] E[X]; var(Y) = E[N] var(X) + (E[X])² var(N); M_Y(s) = E[(M_X(s))^N], i.e., M_N(s) with each occurrence of e^s replaced by M_X(s).
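The mean and variance formulas for a random sum can be sanity-checked by simulation. Below is a sketch with N geometric(p) and X_i exponential(λ), where p = 1/4 and λ = 2 are illustrative choices; the formulas predict E[Y] = E[N]E[X] = 2 and var(Y) = 4:

```python
import random

# Monte Carlo sketch of E[Y] = E[N]E[X] and
# var(Y) = E[N]var(X) + (E[X])^2 var(N), for a geometric(p) number N of
# i.i.d. exponential(lam) terms.
random.seed(3)
p, lam, trials = 0.25, 2.0, 100_000

def geometric(p):
    n = 1
    while random.random() >= p:   # count trials until the first success
        n += 1
    return n

ys = []
for _ in range(trials):
    n = geometric(p)
    ys.append(sum(random.expovariate(lam) for _ in range(n)))

mean = sum(ys) / trials
var = sum((y - mean) ** 2 for y in ys) / trials
exact_mean = (1 / p) * (1 / lam)                                  # E[N] E[X]
exact_var = (1 / p) * (1 / lam ** 2) + (1 / lam) ** 2 * (1 - p) / p ** 2
print(round(mean, 2), round(var, 1), exact_mean, exact_var)
```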

Illustrative Examples (1/5) Example 4.34. A remote village has three gas stations, and each one of them is open on any given day with probability 1/2, independently of the others. The amount of gas available in each gas station is unknown and is uniformly distributed between 0 and 1000 gallons. We wish to characterize the distribution of Y, the total amount of gas available at the gas stations that are open. Here N, the number of open stations, is binomial with parameters n = 3 and p = 1/2, and X_i, the amount of gas at one open station, is uniform over [0, 1000] with transform M_X(s) = (e^{1000s} − 1)/(1000s). Since M_N(s) = ((1/2) + (1/2)e^s)^3, replacing e^s by M_X(s) gives M_Y(s) = ((1/2) + (1/2)M_X(s))^3.

Illustrative Examples (2/5) Example 4.35. Sum of a Geometric Number of Independent Exponential Random Variables. Jane visits a number of bookstores, looking for Great Expectations. Any given bookstore carries the book with probability p, independently of the others. In a typical bookstore visited, Jane spends a random amount of time, exponentially distributed with parameter λ, until she either finds the book or decides that the bookstore does not carry it. Assuming that Jane keeps visiting bookstores until she buys the book and that the time spent in each is independent of everything else, we wish to determine the mean, variance, and PDF of the total time spent in bookstores.

Illustrative Examples (3/5) The total time is Y = X_1 + ... + X_N, where N, the number of bookstores visited (the book is found on visit N), is geometric with parameter p, and X_i, the time spent in bookstore i, is exponential with parameter λ. By the random-sum formulas, E[Y] = E[N] E[X] = 1/(pλ) and var(Y) = E[N] var(X) + (E[X])² var(N) = (1/p)(1/λ²) + (1/λ²)(1 − p)/p² = 1/(p²λ²).

Illustrative Examples (4/5) For the transform, M_N(s) = pe^s/(1 − (1 − p)e^s) and M_X(s) = λ/(λ − s). Replacing e^s by M_X(s) in M_N(s) gives M_Y(s) = pλ/(λ − s) / (1 − (1 − p)λ/(λ − s)) = pλ/(pλ − s). This is the transform of an exponential random variable, so the total time Y is exponentially distributed with parameter pλ.
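A simulation sketch of this standard result (p = 1/2 and λ = 3 are illustrative): a geometric(p) sum of i.i.d. exponential(λ) times is itself exponential with parameter pλ, so the empirical survival probability P(Y > t) should match e^{−pλt}:

```python
import math
import random

# Sketch: a geometric(p) sum of i.i.d. exponential(lam) times is itself
# exponential with parameter p*lam; compare tail probabilities at t = 1.
random.seed(4)
p, lam, trials = 0.5, 3.0, 100_000

def one_total():
    total = random.expovariate(lam)
    while random.random() >= p:        # keep visiting until the book is found
        total += random.expovariate(lam)
    return total

ys = [one_total() for _ in range(trials)]
t = 1.0
empirical = sum(y > t for y in ys) / trials
exact = math.exp(-p * lam * t)         # survival function of exponential(p*lam)
print(round(empirical, 3), round(exact, 3))
```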

Illustrative Examples (5/5) Example 4.36. Sum of a Geometric Number of Independent Geometric Random Variables. This example is a discrete counterpart of the preceding one. We let N be geometrically distributed with parameter p, and each random variable X_i be geometrically distributed with parameter q. We assume that all of these random variables are independent. Replacing e^s by M_X(s) = qe^s/(1 − (1 − q)e^s) in M_N(s) = pe^s/(1 − (1 − p)e^s) and simplifying gives M_Y(s) = pqe^s/(1 − (1 − pq)e^s), so Y is geometric with parameter pq.

Exercise A fair coin is flipped independently until the first head is encountered. For each coin flip, you get a score of 1 with probability 0.4 and a score of 0 with probability 0.6, independently of everything else. Let the random variable Y be the sum of all the scores obtained during the process (including the final flip). Find the following characteristics of Y: (i) mean, (ii) variance, (iii) transform. (Answers: mean = 0.8; variance = 0.8.)
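A quick simulation sketch to cross-check the stated answers (the stopping rule and scoring below follow the exercise; the per-flip score is drawn independently of the coin outcome):

```python
import random

# Sketch: flip a fair coin until the first head; each flip independently adds
# a score of 1 with probability 0.4. Estimate the mean and variance of the
# total score (stated answers: mean = 0.8, variance = 0.8).
random.seed(5)
trials = 200_000
totals = []
for _ in range(trials):
    score = 0
    while True:
        score += 1 if random.random() < 0.4 else 0   # score for this flip
        if random.random() < 0.5:                    # the flip came up heads
            break
    totals.append(score)

mean = sum(totals) / trials
var = sum((t - mean) ** 2 for t in totals) / trials
print(round(mean, 2), round(var, 2))
```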

Probability versus Statistics In probability, we reason from a population (modeled as a random variable or an assumed distribution) and its parameters to the behavior of a sample; in statistics, we use a sample to draw inferences about the population's parameters.

The Central Limit Theorem (1/2) Let X_1, ..., X_n be a sequence of independent, identically distributed random variables with common mean μ and variance σ². Let S_n/n be the (sample) average of these random variables, where S_n = X_1 + ... + X_n is their sum. Then, if n is sufficiently large, S_n/n is approximately normal with mean μ and variance σ²/n, and S_n is approximately normal with mean nμ and variance nσ². Equivalently, Z_n = (S_n − nμ)/(σ√n) is approximately standard normal: P(Z_n ≤ z) ≈ Φ(z).
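A simulation sketch of the theorem with n = 50 uniform(0, 1) summands (an illustrative choice): the standardized sum should satisfy P(Z_n ≤ 1) ≈ Φ(1) ≈ 0.8413:

```python
import math
import random

# Sketch: the standardized sum (S_n - n*mu) / (sigma * sqrt(n)) of n
# uniform(0, 1) variables is approximately standard normal.
random.seed(6)
n, trials = 50, 100_000
mu, sigma = 0.5, math.sqrt(1 / 12)     # mean and std of uniform(0, 1)
count = 0
for _ in range(trials):
    s = sum(random.random() for _ in range(n))
    z = (s - n * mu) / (sigma * math.sqrt(n))
    count += z <= 1.0
phi_1 = 0.5 * (1 + math.erf(1 / math.sqrt(2)))   # Phi(1), about 0.8413
print(round(count / trials, 3), round(phi_1, 4))
```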

The Central Limit Theorem (2/2) Rule of Thumb: for most populations, if the (sample) size n is greater than 30, the Central Limit Theorem approximation is good. [Figure: real distributions of the sample mean (solid lines) versus normal approximations (dashed lines), for populations that are continuous, somewhat skewed to the right, discrete, and highly skewed to the right.]

Chebyshev Inequality If X is a random variable with mean μ and variance σ², then P(|X − μ| ≥ c) ≤ σ²/c² for all c > 0.
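An empirical sketch of the inequality for an exponential(1) variable (an illustrative choice with μ = 1 and σ² = 1): the observed tail probability stays below σ²/c² for every c tried:

```python
import random

# Sketch: check P(|X - mu| >= c) <= sigma^2 / c^2 empirically for an
# exponential(1) random variable, where mu = 1 and sigma^2 = 1.
random.seed(7)
trials = 100_000
xs = [random.expovariate(1.0) for _ in range(trials)]
for c in [1.0, 2.0, 3.0]:
    empirical = sum(abs(x - 1.0) >= c for x in xs) / trials
    bound = 1.0 / c ** 2               # Chebyshev bound sigma^2 / c^2
    assert empirical <= bound
    print(c, round(empirical, 3), bound)
```

The bound is loose here (the true tail decays exponentially), which is typical: Chebyshev trades tightness for complete generality.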