Functions and Transformations of Random Variables


Section 09: Functions and Transformations of Random Variables

Transformation of continuous X
Suppose X is a continuous random variable with pdf f_X(x) and cdf F_X(x), and suppose u(x) is a one-to-one function with inverse v(x), so that v(u(x)) = x.
The random variable Y = u(X) is a transformation of X with pdf f_Y(y) = f_X(v(y)) * |v'(y)|.
If u(x) is a strictly increasing function, then F_Y(y) = F_X(v(y)), and then f_Y(y) = F_Y'(y).
Proof of the cdf identity: F_Y(y) = P[Y <= y] = P[u(X) <= y] = P[X <= v(y)] = F_X(v(y)), since applying the increasing inverse v to both sides preserves the inequality.
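A minimal numerical sketch of the transformation formula (my own illustration, not from the slides): with X ~ Uniform(0, 1) and Y = e^X, the inverse is v(y) = ln(y), so the formula predicts f_Y(y) = 1/y on (1, e).

```python
import numpy as np

rng = np.random.default_rng(0)

# X ~ Uniform(0, 1), Y = u(X) = e^X, so v(y) = ln(y) and |v'(y)| = 1/y.
# The transformation formula gives f_Y(y) = f_X(ln y) * (1/y) = 1/y on (1, e).
x = rng.uniform(0.0, 1.0, size=1_000_000)
y = np.exp(x)

# Compare a histogram of the simulated Y against the predicted pdf 1/y.
counts, edges = np.histogram(y, bins=50, range=(1.0, np.e), density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(counts - 1.0 / mids)))  # close to 0, so the pdf matches
```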

Transformation of discrete X
Again let Y = u(X). Since X is discrete, Y is also discrete, with pmf g(y) = Σ_{x : u(x) = y} f(x).
That is, g(y) is the sum of the probabilities of all x values that u maps to the given value y.
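A small sketch (values assumed by me for illustration): X takes -1, 0, 1 each with probability 1/3 and Y = X^2, so g(1) = f(-1) + f(1) = 2/3.

```python
from collections import defaultdict

# Assumed example: X in {-1, 0, 1} each with probability 1/3, Y = u(X) = X**2.
# Sum f(x) over every x with u(x) equal to the given y.
f = {-1: 1/3, 0: 1/3, 1: 1/3}
g = defaultdict(float)
for x, p in f.items():
    g[x**2] += p
print(dict(g))  # {1: 0.666..., 0: 0.333...}
```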

Transformation of jointly distributed X and Y
Suppose X and Y are jointly distributed with pdf f(x, y), and u and v are functions of x and y. This makes U = u(X, Y) and V = v(X, Y) also random variables with a joint distribution.
To find the joint pdf of U and V, call it g(u, v), we expand the one-variable case: find inverse functions h(u, v) and k(u, v) so that x = h(u(x, y), v(x, y)) and y = k(u(x, y), v(x, y)).
Then the joint pdf is g(u, v) = f(h(u, v), k(u, v)) * |∂h/∂u * ∂k/∂v - ∂h/∂v * ∂k/∂u|, where the absolute-value factor is the Jacobian of the inverse transformation.
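A symbolic sketch of this recipe (example assumed by me: X, Y independent Uniform(0, 1), with U = X + Y and V = X - Y):

```python
import sympy as sp

u, v = sp.symbols('u v')
h = (u + v) / 2   # inverse: x = h(u, v)
k = (u - v) / 2   # inverse: y = k(u, v)

# Jacobian factor |dh/du * dk/dv - dh/dv * dk/du|
jac = sp.Abs(sp.diff(h, u) * sp.diff(k, v) - sp.diff(h, v) * sp.diff(k, u))
f_xy = 1          # joint pdf of two independent Uniform(0, 1) variables
g_uv = sp.simplify(f_xy * jac)
print(g_uv)       # 1/2, valid where 0 < (u+v)/2 < 1 and 0 < (u-v)/2 < 1
```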

Sum of random variables
If Y = X1 + X2, then E[Y] = E[X1] + E[X2] and Var[Y] = Var[X1] + Var[X2] + 2 Cov(X1, X2).
If the Xs are continuous with joint pdf f(x1, x2): f_Y(y) = ∫_{-∞}^{∞} f(x1, y - x1) dx1.
If the Xs are discrete, non-negative, and integer-valued with joint pmf f(x1, x2): P[X1 + X2 = k] = Σ_{x1=0}^{k} f(x1, k - x1).

Convolution method for sums
If X1 and X2 are independent, the convolution method applies in both the discrete and continuous cases.
Discrete: P[X1 + X2 = k] = Σ_{x1=0}^{k} f1(x1) * f2(k - x1)
Continuous: for Y = X1 + X2, f_Y(y) = ∫_{-∞}^{∞} f1(x1) * f2(y - x1) dx1
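The discrete convolution can be checked directly with numpy (distributions assumed by me for illustration): summing independent binomials with the same p should give another binomial, as in the table a few slides below.

```python
import numpy as np
from scipy.stats import binom

# X1 ~ Binomial(3, 0.5), X2 ~ Binomial(2, 0.5), independent.
p1 = binom.pmf(np.arange(4), 3, 0.5)  # pmf of X1 on 0..3
p2 = binom.pmf(np.arange(3), 2, 0.5)  # pmf of X2 on 0..2
p_sum = np.convolve(p1, p2)           # convolution = pmf of X1 + X2 on 0..5

print(np.allclose(p_sum, binom.pmf(np.arange(6), 5, 0.5)))  # True
```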

Sums of random variables
If X1, X2, ..., Xn are random variables and Y = Σ_{i=1}^{n} Xi, then:
E[Y] = Σ E[Xi]
Var[Y] = Σ Var[Xi] + 2 Σ_{i=1}^{n} Σ_{j=i+1}^{n} Cov(Xi, Xj)
If the Xs are mutually independent: Var[Y] = Σ Var[Xi] and M_Y(t) = Π_{i=1}^{n} M_Xi(t)
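A quick Monte Carlo sanity check of the covariance term (correlated variables constructed by me for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# X2 = X1 + noise, so X1 and X2 are correlated with Cov(X1, X2) = Var[X1] = 1.
x1 = rng.normal(size=1_000_000)
x2 = x1 + rng.normal(size=1_000_000)

lhs = np.var(x1 + x2)
rhs = np.var(x1) + np.var(x2) + 2 * np.cov(x1, x2)[0, 1]
print(lhs, rhs)  # both close to 5 = 1 + 2 + 2*1
```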

Central Limit Theorem
Let X1, X2, ..., Xn be independent random variables with the same distribution, having mean μ and standard deviation σ. Let Yn = X1 + X2 + ... + Xn, so that E[Yn] = nμ and Var[Yn] = nσ².
As n increases, the distribution of Yn approaches the normal distribution N(nμ, nσ²).
Questions asking about probabilities for large sums of independent random variables are often asking for the normal approximation (an integer/continuity correction is sometimes necessary).
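A sketch of the normal approximation with the integer correction (numbers chosen by me): Y ~ Binomial(100, 0.5) is a sum of 100 i.i.d. Bernoulli(0.5) variables.

```python
import math
from scipy.stats import binom, norm

n, p = 100, 0.5
mu, sigma = n * p, math.sqrt(n * p * (1 - p))  # 50 and 5

exact = binom.cdf(55, n, p)
approx = norm.cdf((55 + 0.5 - mu) / sigma)  # +0.5 is the integer correction
print(exact, approx)  # ~0.8644 vs ~0.8643
```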

Sums of certain distributions
This table is on page 280 of the Actex manual. For Y = X1 + ... + Xk with the Xi independent:
Distribution of Xi -> Distribution of Y
Bernoulli B(1, p) -> Binomial B(k, p)
Binomial B(ni, p) -> Binomial B(Σni, p)
Poisson λi -> Poisson Σλi
Geometric p -> Negative binomial (k, p)
Normal N(μi, σi²) -> Normal N(Σμi, Σσi²)
There are more than these, but these are the most common and easiest to remember.

Distribution of max or min of random variables
Let X1 and X2 be independent random variables, and define U = max{X1, X2} and V = min{X1, X2}.
F_U(u) = P[U <= u] = P[max(X1, X2) <= u] = P[(X1 <= u) ∩ (X2 <= u)] = F1(u) * F2(u)
F_V(v) = P[V <= v] = 1 - P[V > v] = 1 - P[min(X1, X2) > v] = 1 - P[(X1 > v) ∩ (X2 > v)] = 1 - [1 - F1(v)] * [1 - F2(v)]
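A Monte Carlo check of both identities at a single point (distributions assumed by me for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# X1 ~ Uniform(0, 1) and X2 ~ Exponential(1), independent; test at t = 0.7.
x1 = rng.uniform(size=1_000_000)
x2 = rng.exponential(size=1_000_000)
t = 0.7

F1 = t               # Uniform(0, 1) cdf at t
F2 = 1 - np.exp(-t)  # Exponential(1) cdf at t
print(np.mean(np.maximum(x1, x2) <= t), F1 * F2)                  # max identity
print(np.mean(np.minimum(x1, x2) <= t), 1 - (1 - F1) * (1 - F2))  # min identity
```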

Mixtures of Distributions
Let X1 and X2 be independent random variables. We can define a brand new random variable X as a mixture of these variables: X has the pdf f(x) = a * f1(x) + (1 - a) * f2(x), where 0 <= a <= 1 is the mixing weight.
Expectations, probabilities, and moments follow this weighted-average form:
E[X] = a * E[X1] + (1 - a) * E[X2]
F_X(x) = a * F1(x) + (1 - a) * F2(x)
M_X(t) = a * M_X1(t) + (1 - a) * M_X2(t)
Be careful! Variances do not follow the weighted average. Instead, find the first and second moments of X and subtract: Var[X] = E[X²] - (E[X])². Note also that the mixture X is not the same random variable as aX1 + (1 - a)X2.
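A short sketch of the variance warning (parameters assumed by me): a 50/50 mixture of N(0, 1) and N(10, 1).

```python
# Mixture of X1 ~ N(0, 1) and X2 ~ N(10, 1) with weight a = 0.5.
a = 0.5
m1, v1 = 0.0, 1.0   # mean, variance of X1
m2, v2 = 10.0, 1.0  # mean, variance of X2

ex = a * m1 + (1 - a) * m2                       # E[X] = 5.0
ex2 = a * (v1 + m1**2) + (1 - a) * (v2 + m2**2)  # E[X^2] = 51.0
print(ex2 - ex**2)            # 26.0 -- the correct mixture variance
print(a * v1 + (1 - a) * v2)  # 1.0 -- the naive weighted average, badly wrong
```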

Sample Exam #95 X and Y are independent random variables with common moment generating function M(t) = exp((t^2) / 2). Let W = X + Y and Z = Y-X. Determine the joint moment generating function, M(t1, t2) of W and Z.
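A quick symbolic check (my own working, not from the slides): t1*W + t2*Z = (t1 - t2)X + (t1 + t2)Y, so by independence M(t1, t2) = M(t1 - t2) * M(t1 + t2).

```python
import sympy as sp

t1, t2 = sp.symbols('t1 t2')
# M(t) = exp(t^2 / 2) for each of X and Y, so the joint mgf's exponent is
# (t1 - t2)^2 / 2 + (t1 + t2)^2 / 2.
exponent = sp.expand((t1 - t2)**2 / 2 + (t1 + t2)**2 / 2)
print(sp.exp(exponent))  # exp(t1**2 + t2**2)
```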

Sample Exam #98 Let X1, X2, X3 be a random sample from a discrete distribution with probability function p(x) = 1/3 for x = 0, 2/3 for x = 1, and 0 otherwise. Determine the moment generating function, M(t), of Y = X1 * X2 * X3.
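Since the Xs only take the values 0 and 1, the product Y is 1 exactly when all three equal 1; a brute-force sketch of that probability:

```python
from itertools import product

# P[Y = 1] = P[X1 = X2 = X3 = 1] = (2/3)**3 = 8/27, so
# M(t) = E[e^{tY}] = 19/27 + (8/27) * e^t.
p = {0: 1/3, 1: 2/3}
p_y1 = sum(p[a] * p[b] * p[c]
           for a, b, c in product(p, repeat=3) if a * b * c == 1)
print(p_y1, 1 - p_y1)  # 0.296... (= 8/27) and 0.703... (= 19/27)
```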

Sample Exam #102 A company has two electric generators. The time until failure for each generator follows an exponential distribution with mean 10. The company will begin using the second generator immediately after the first one fails. What is the variance of the total time that the generators produce electricity?
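The total time is T1 + T2 with T1, T2 independent exponentials of mean 10, so the variance is 10² + 10² = 200; a Monte Carlo sanity check:

```python
import numpy as np

rng = np.random.default_rng(2)

# Each generator's lifetime ~ Exponential(mean 10); variances add when independent.
t = rng.exponential(10, size=(1_000_000, 2)).sum(axis=1)
print(t.var())  # close to 200 = 10**2 + 10**2
```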

Let X be uniformly distributed on the range [10, 100], and let Y = 3 * e^(3X). Find f(y), the probability density function of Y. Use f(y) to find the expected value of Y.
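A symbolic sketch using the transformation formula (exact arithmetic, since e^300 overflows floating point): the inverse is v(y) = ln(y/3)/3, and f_X = 1/90 on [10, 100].

```python
import sympy as sp

y = sp.symbols('y', positive=True)
f_x = sp.Rational(1, 90)           # Uniform(10, 100) density
v = sp.log(y / 3) / 3              # inverse of u(x) = 3*e^(3x)
f_y = f_x * sp.Abs(sp.diff(v, y))  # f_Y(y) = f_X(v(y)) * |v'(y)| = 1/(270*y)
print(sp.simplify(f_y))

lo, hi = 3 * sp.exp(30), 3 * sp.exp(300)   # support of Y for x in [10, 100]
print(sp.integrate(y * f_y, (y, lo, hi)))  # E[Y] = (e**300 - e**30)/90
```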

Let f(x, y) = (x + y)/8 for 0 < x < 2 and 0 < y < 2. Let U = 2X + 3Y and V = (X + Y)/2. Find g(u, v), the joint probability density function of U and V.
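A symbolic sketch applying the Jacobian recipe from the earlier slide:

```python
import sympy as sp

x, y, u, v = sp.symbols('x y u v')

# Invert the transformation U = 2X + 3Y, V = (X + Y)/2.
sol = sp.solve([sp.Eq(u, 2*x + 3*y), sp.Eq(v, (x + y) / 2)], [x, y])
h, k = sol[x], sol[y]   # x = 6v - u, y = u - 4v
print(h, k)

jac = sp.Abs(sp.diff(h, u) * sp.diff(k, v) - sp.diff(h, v) * sp.diff(k, u))
g = sp.simplify((h + k) / 8 * jac)  # g(u, v) = f(h, k) * |Jacobian|
print(g)  # v/2, on the image of the square 0 < x, y < 2
```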

Sample Exam #289 For a certain insurance company, 10% of its policies are Type A, 50% are Type B, and 40% are Type C. The annual number of claims for an individual Type A, Type B, and Type C policy follows Poisson distributions with respective means 1, 2, and 10. Let X represent the annual number of claims of a randomly selected policy. Calculate the variance of X.
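This is a mixture, so apply the moment rule from the mixture slide (Poisson second moment: λ + λ²); a short sketch of the arithmetic:

```python
# Weights and Poisson means for Types A, B, C.
weights = [0.1, 0.5, 0.4]
lams = [1, 2, 10]

ex = sum(w * lam for w, lam in zip(weights, lams))              # E[X] = 5.1
ex2 = sum(w * (lam + lam**2) for w, lam in zip(weights, lams))  # E[X^2] = 47.2
print(ex2 - ex**2)  # Var[X] = 21.19
```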

Sample Exam #296 A homeowners insurance policy covers losses due to theft, with a deductible of 3. Theft losses are uniformly distributed on [0,10]. Determine the moment generating function, M(t), for t =/= 0, of the claim payment on a theft.
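The claim payment is Y = max(X - 3, 0) with X ~ Uniform(0, 10), so M(t) = P[X <= 3] * e^0 + ∫ from 3 to 10 of e^{t(x-3)}/10 dx. A symbolic sketch (assuming t > 0; the t < 0 case is analogous):

```python
import sympy as sp

x, t = sp.symbols('x t', positive=True)

# Payment is 0 when X <= 3 (probability 3/10), and X - 3 otherwise.
M = sp.Rational(3, 10) + sp.integrate(sp.exp(t * (x - 3)) / 10, (x, 3, 10))
print(sp.simplify(M))  # equivalent to (3*t + exp(7*t) - 1) / (10*t)
```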