TRANSFORMATION OF FUNCTION OF A RANDOM VARIABLE

UNIVARIATE TRANSFORMATIONS

TRANSFORMATION OF RANDOM VARIABLES If X is an rv with cdf F(x), then Y=g(X) is also an rv. If we write y=g(x), the function g defines a mapping from the original sample space of X, S, to a new sample space, S*, the sample space of the rv Y: g: S → S*, where S* = {y : y = g(x), x ∈ S}.

TRANSFORMATION OF RANDOM VARIABLES Let y=g(x) define a 1-to-1 transformation; that is, the equation y=g(x) can be solved uniquely for x. Ex: Y=X−1 ⇒ X=Y+1 (1-to-1). Ex: Y=X² ⇒ X=±√Y (not 1-to-1). When the transformation is not 1-to-1, find disjoint partitions of S on which the transformation is 1-to-1.

TRANSFORMATION OF RANDOM VARIABLES If X is a discrete rv, then S is countable. The sample space for Y=g(X) is S* = {y : y = g(x), x ∈ S}, which is also countable. The pmf of Y is

$$f_Y(y) = P(Y = y) = \sum_{x:\ g(x) = y} f_X(x), \qquad y \in S^*.$$

Example Let X~GEO(p); that is,

$$f_X(x) = p(1-p)^{x-1}, \qquad x = 1, 2, \ldots$$

Recall: X~GEO(p) is the pmf of the number of Bernoulli trials required to get the first success. Find the pmf of Y=X−1. Solution: X=Y+1, so

$$f_Y(y) = f_X(y+1) = p(1-p)^{y}, \qquad y = 0, 1, 2, \ldots,$$

the pmf of the number of failures before the first success.
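A quick numerical check of this example; a minimal sketch assuming scipy is available and an arbitrary p = 0.3. scipy.stats.geom uses the "trials until first success" convention, matching X, so shifting by one should reproduce p(1−p)^y.

```python
from scipy.stats import geom

p = 0.3
for y in range(6):
    lhs = geom.pmf(y + 1, p)        # P(X = y + 1) under the trials convention
    rhs = p * (1 - p) ** y          # claimed pmf of Y = X - 1
    assert abs(lhs - rhs) < 1e-12
print("pmf of Y = X - 1 matches p(1-p)^y")
```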

Example Let X be an rv with pmf f_X(x) on S = {−2, −1, 0, 1, 2}, and let Y=X². Then S* = {0, 1, 4}, and collecting the x-values that map to each y gives f_Y(0) = f_X(0), f_Y(1) = f_X(−1) + f_X(1), and f_Y(4) = f_X(−2) + f_X(2).
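The discrete rule is mechanical enough to code directly. The sketch below assumes a hypothetical uniform pmf on S = {−2, …, 2}, since the slide's original pmf values are not reproduced in this transcript.

```python
from collections import defaultdict

def transform_pmf(pmf_x, g):
    """Return the pmf of Y = g(X), given the pmf of X as a dict {x: prob}."""
    pmf_y = defaultdict(float)
    for x, p in pmf_x.items():
        pmf_y[g(x)] += p              # sum f_X(x) over {x : g(x) = y}
    return dict(pmf_y)

pmf_x = {-2: 0.2, -1: 0.2, 0: 0.2, 1: 0.2, 2: 0.2}    # assumed uniform pmf
print(transform_pmf(pmf_x, lambda x: x ** 2))          # {4: 0.4, 1: 0.4, 0: 0.2}
```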

FUNCTIONS OF A CONTINUOUS RANDOM VARIABLE Let X be an rv of the continuous type with pdf f. Let y=g(x) be differentiable for all x with g′(x) ≠ 0, so that g is strictly monotone and invertible. Then Y=g(X) is also an rv of the continuous type, with pdf given by

$$f_Y(y) = f_X\big(g^{-1}(y)\big)\left|\frac{d}{dy}\,g^{-1}(y)\right|$$

for y in the range of g, and f_Y(y) = 0 otherwise.

FUNCTIONS OF A CONTINUOUS RANDOM VARIABLE Example: Let X have density f_X(x) and let Y=e^X. The inverse transformation is x = g⁻¹(y) = ln y, so dx = (1/y)dy and

$$f_Y(y) = \frac{f_X(\ln y)}{y}, \qquad y > 0.$$
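A simulation sketch of this formula, under the assumed concrete choice X ~ Exp(1) (so f_Y(y) = e^{−ln y}/y = 1/y² on y > 1); the slide's original density is not reproduced in this transcript.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(1.0, size=500_000)
y = np.exp(x)                                    # Y = e^X, supported on y > 1

counts, edges = np.histogram(y, bins=100, range=(1, 8))
width = edges[1] - edges[0]
centers = 0.5 * (edges[:-1] + edges[1:])
empirical = counts / (len(y) * width)            # histogram density estimate
analytic = 1.0 / centers ** 2                    # f_Y(y) = f_X(log y) / y
print(np.abs(empirical - analytic).max())        # small (about 0.01 or less)
```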

FUNCTIONS OF A CONTINUOUS RANDOM VARIABLE Example: Let X have density f_X(x) and let Y=X². Find the pdf of Y. The transformation is not 1-to-1, so apply the formula separately on x < 0 and x > 0 and add the two contributions:

$$f_Y(y) = \frac{f_X(\sqrt{y}) + f_X(-\sqrt{y})}{2\sqrt{y}}, \qquad y > 0.$$

THE PROBABILITY INTEGRAL TRANSFORMATION Let X have continuous cdf F_X(x) and define the rv Y as Y=F_X(X). Then Y is uniformly distributed on (0,1); that is, P(Y ≤ y) = y, 0<y<1. Equivalently, if U~Uniform(0,1), then X = F_X⁻¹(U) has cdf F_X. This is very commonly used, especially in random number generation procedures.

Example 1 Generate random numbers from X~Exp(1/λ) if you only have numbers from Uniform(0,1). Solution: F_X(x) = 1 − e^{−λx}, so setting U = F_X(X) and solving for X gives X = −(1/λ) ln(1 − U).
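A minimal inverse-transform sketch of Example 1, with an assumed rate λ = 2:

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0                            # assumed rate, so the mean is 1/lam
u = rng.uniform(size=100_000)
x = -np.log(1 - u) / lam             # invert U = F(X) = 1 - exp(-lam * X)
print(x.mean(), 1 / lam)             # both approximately 0.5
```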

Example 2 Generate random numbers from the distribution of X(1)=min(X1,X2,…,Xn), where the Xi are i.i.d. Exp(1/λ), if you only have numbers from Uniform(0,1). Solution: P(X(1) > x) = [e^{−λx}]ⁿ = e^{−nλx}, so X(1) ~ Exp(1/(nλ)) and X(1) = −(1/(nλ)) ln(1 − U).
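A sketch for Example 2 under assumed values n = 5, λ = 2, comparing the one-uniform shortcut against brute-force minima:

```python
import numpy as np

rng = np.random.default_rng(2)
n, lam = 5, 2.0
u = rng.uniform(size=100_000)
x_min_direct = -np.log(1 - u) / (n * lam)                      # one draw per sample
x_min_brute = rng.exponential(1 / lam, size=(100_000, n)).min(axis=1)
print(x_min_direct.mean(), x_min_brute.mean(), 1 / (n * lam))  # all about 0.1
```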

Example 3 Generate random numbers from the following distribution:

CDF method To find the pdf of Y = g(X), first compute the cdf directly, F_Y(y) = P(Y ≤ y) = P(g(X) ≤ y), by rewriting the event {g(X) ≤ y} in terms of X; then differentiate to get f_Y(y) = F_Y′(y).

CDF method Example: Consider a continuous rv X, and Y=X². Find the pdf of Y. Solution:

$$F_Y(y) = P(X^2 \le y) = P(-\sqrt{y} \le X \le \sqrt{y}) = F_X(\sqrt{y}) - F_X(-\sqrt{y}), \qquad y > 0,$$

and differentiating gives

$$f_Y(y) = \frac{f_X(\sqrt{y}) + f_X(-\sqrt{y})}{2\sqrt{y}}, \qquad y > 0,$$

which agrees with the transformation-formula result above.
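A symbolic sketch of the CDF method, assuming the concrete choice X ~ Uniform(−1, 1) (not the slide's original density): F_Y(y) = √y on 0 < y < 1, so f_Y(y) = 1/(2√y).

```python
import sympy as sp

y = sp.symbols("y", positive=True)
F_X = lambda t: (t + 1) / 2                    # cdf of Uniform(-1, 1) on [-1, 1]
F_Y = sp.simplify(F_X(sp.sqrt(y)) - F_X(-sp.sqrt(y)))   # valid for 0 < y < 1
f_Y = sp.simplify(sp.diff(F_Y, y))
print(F_Y, f_Y)                                # sqrt(y), 1/(2*sqrt(y))
```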

TRANSFORMATION OF FUNCTION OF TWO OR MORE RANDOM VARIABLES BIVARIATE TRANSFORMATIONS

DISCRETE CASE Let (X1, X2) be a bivariate random vector with a known joint probability distribution. Consider a new bivariate random vector (U, V) defined by U=g1(X1, X2) and V=g2(X1, X2), where g1 and g2 are some functions of X1 and X2.

DISCRETE CASE If B is any subset of ℝ², then (U,V) ∈ B iff (X1,X2) ∈ A, where A = {(x1,x2) : (g1(x1,x2), g2(x1,x2)) ∈ B}. Then, P((U,V) ∈ B) = P((X1,X2) ∈ A), so the probability distribution of (U,V) is completely determined by the probability distribution of (X1,X2). The joint pmf of (U,V) is

$$f_{U,V}(u,v) = \sum_{(x_1,x_2):\, g_1(x_1,x_2)=u,\ g_2(x_1,x_2)=v} f_{X_1,X_2}(x_1,x_2).$$

EXAMPLE Let X1 and X2 be independent Poisson random variables with parameters λ1 and λ2. Find the distribution of U=X1+X2. Solution sketch: P(U = u) = Σ_{x=0}^{u} P(X1 = x) P(X2 = u − x); the binomial theorem collapses the sum, giving U ~ Poisson(λ1 + λ2).
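A numerical check of this convolution argument (a sketch, with arbitrary assumed rates λ1 = 1.5, λ2 = 2.5):

```python
from scipy.stats import poisson

lam1, lam2 = 1.5, 2.5
for u in range(10):
    conv = sum(poisson.pmf(x, lam1) * poisson.pmf(u - x, lam2)
               for x in range(u + 1))                 # convolution of the pmfs
    assert abs(conv - poisson.pmf(u, lam1 + lam2)) < 1e-12
print("X1 + X2 matches Poisson(lam1 + lam2) on the grid checked")
```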

CONTINUOUS CASE Let X=(X1, X2, …, Xn) have a continuous joint distribution with joint pdf f, and consider the joint pdf of new random variables Y1, Y2,…, Yk defined as

$$Y_i = g_i(X_1, X_2, \ldots, X_n), \qquad i = 1, \ldots, k. \qquad (*)$$

CONTINUOUS CASE If the transformation T is one-to-one and onto, then there is no problem determining the inverse transformation. If A ⊆ ℝⁿ and B ⊆ ℝᵏ with k = n, then T: A → B and T⁻¹(B) = A. It follows that there is a one-to-one correspondence between the points (y1, y2,…,yk) in B and the points (x1, x2,…,xn) in A. Therefore, for (y1, y2,…,yk) ∈ B we can invert the equations in (*) and obtain

$$x_i = h_i(y_1, y_2, \ldots, y_n), \qquad i = 1, \ldots, n. \qquad (**)$$

CONTINUOUS CASE Assume that the partial derivatives ∂xᵢ/∂yⱼ exist at every point (y1, y2,…,y_{k=n}) ∈ B. Under these assumptions, we have the following determinant J:

$$J = \det \begin{bmatrix} \frac{\partial x_1}{\partial y_1} & \cdots & \frac{\partial x_1}{\partial y_n} \\ \vdots & \ddots & \vdots \\ \frac{\partial x_n}{\partial y_1} & \cdots & \frac{\partial x_n}{\partial y_n} \end{bmatrix},$$

CONTINUOUS CASE called the Jacobian of the transformation specified by (**). Then the joint pdf of Y1, Y2,…,Yk can be obtained by using the change-of-variable technique of multiple variables.

CONTINUOUS CASE As a result, the joint pdf g of (Y1,…,Yn) is defined as follows:

$$g(y_1, \ldots, y_n) = f\big(h_1(y_1,\ldots,y_n), \ldots, h_n(y_1,\ldots,y_n)\big)\,|J|, \qquad (y_1,\ldots,y_n) \in B,$$

and g(y1,…,yn) = 0 otherwise.
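The Jacobian bookkeeping is easy to automate symbolically. The sketch below uses an illustrative 1-to-1 map assumed for this example (not from the slides): Y1 = X1 + X2, Y2 = X1/(X1 + X2), whose inverse is X1 = Y1·Y2, X2 = Y1·(1 − Y2); the determinant should come out to −y1, so |J| = y1.

```python
import sympy as sp

y1, y2 = sp.symbols("y1 y2", positive=True)
x1, x2 = y1 * y2, y1 * (1 - y2)                  # inverse transformation (**)
J = sp.Matrix([[sp.diff(x1, y1), sp.diff(x1, y2)],
               [sp.diff(x2, y1), sp.diff(x2, y2)]])
print(sp.simplify(J.det()))                      # -y1, hence |J| = y1
```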

Example Recall that I claimed: Let X1,X2,…,Xn be independent rvs with Xi~Gamma(αi, β). Then,

$$\sum_{i=1}^{n} X_i \sim \mathrm{Gamma}\!\left(\sum_{i=1}^{n} \alpha_i,\ \beta\right).$$

Prove this for n=2 (for simplicity).
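A Monte Carlo sanity check of the n = 2 claim (a sketch, with assumed shapes α1 = 2, α2 = 3 and scale β = 1.5):

```python
import numpy as np
from scipy.stats import gamma, kstest

rng = np.random.default_rng(3)
a1, a2, beta = 2.0, 3.0, 1.5
s = rng.gamma(a1, beta, 100_000) + rng.gamma(a2, beta, 100_000)
res = kstest(s, gamma(a1 + a2, scale=beta).cdf)
print(res.pvalue)        # not systematically small: consistent with the claim
```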

M.G.F. Method If X1,X2,…,Xn are independent random variables with MGFs M_{Xi}(t), then the MGF of Y = X1 + X2 + ⋯ + Xn is

$$M_Y(t) = \prod_{i=1}^{n} M_{X_i}(t).$$

Example Recall that I claimed: Let’s prove this.

Example Recall that I claimed: Let X1,X2,…,Xn be independent rvs with Xi~Gamma(αi, β). Then,

$$\sum_{i=1}^{n} X_i \sim \mathrm{Gamma}\!\left(\sum_{i=1}^{n} \alpha_i,\ \beta\right).$$

We proved this with the transformation technique for n=2. Now, prove this for general n: since M_{Xi}(t) = (1 − βt)^{−αi} for t < 1/β, the product of the MGFs is (1 − βt)^{−Σαi}, which is the MGF of Gamma(Σαi, β).
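The MGF bookkeeping in sympy, as a sketch with three symbolic terms standing in for general n:

```python
import sympy as sp

t, beta, a1, a2, a3 = sp.symbols("t beta a1 a2 a3", positive=True)
mgf = lambda a: (1 - beta * t) ** (-a)       # Gamma(a, beta) mgf, for t < 1/beta
product = mgf(a1) * mgf(a2) * mgf(a3)        # independence: multiply the mgfs
print(sp.powsimp(product, force=True))       # (1 - beta*t)**(-a1 - a2 - a3)
```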

More Examples on Transformations Recall the relationship: If Z~N(0, 1) and X = μ + σZ, then X~N(μ, σ²). Let's prove this.

Example 2 Recall that I claimed: Let X be an rv with X~N(0, 1). Then, X² ~ χ²(1). Let's prove this.
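A simulation sketch of this claim, squaring standard normal draws and comparing against χ²(1):

```python
import numpy as np
from scipy.stats import chi2, kstest

rng = np.random.default_rng(4)
z2 = rng.standard_normal(100_000) ** 2
print(kstest(z2, chi2(df=1).cdf).pvalue)   # not systematically small: consistent
```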

Example 3 Recall that I claimed: If X and Y have independent N(0,1) distributions, then Z=X/Y has a Cauchy distribution with θ=0 and σ=1. Recall the pdf of the Cauchy distribution:

$$f(z) = \frac{1}{\pi\sigma\left[1 + \left(\frac{z-\theta}{\sigma}\right)^{2}\right]}, \qquad -\infty < z < \infty.$$

Let's prove this claim.
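A simulation sketch of Example 3, taking the ratio of independent standard normal draws:

```python
import numpy as np
from scipy.stats import cauchy, kstest

rng = np.random.default_rng(5)
z = rng.standard_normal(100_000) / rng.standard_normal(100_000)
print(kstest(z, cauchy().cdf).pvalue)      # consistent with standard Cauchy
```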

Example 4 See Examples 6.3.12 and 6.3.13 in Bain and Engelhardt (pages 207 & 208 in the 2nd edition). These are two different transformations of the same pair X1 & X2 ~ Exp(1): In Example 6.3.12: Y1=X1−X2, Y2=X1+X2. In Example 6.3.13: Y1=X1, Y2=X1+X2.

Example 5 Let X1 and X2 be independent with N(μ1, σ1²) and N(μ2, σ2²) distributions, respectively. Find the pdf of Y=X1−X2. (The result is Y ~ N(μ1 − μ2, σ1² + σ2²).)
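A quick check of Example 5 under assumed parameter values:

```python
import numpy as np
from scipy.stats import norm, kstest

rng = np.random.default_rng(6)
mu1, s1, mu2, s2 = 1.0, 2.0, -0.5, 1.5
d = rng.normal(mu1, s1, 100_000) - rng.normal(mu2, s2, 100_000)
print(kstest(d, norm(mu1 - mu2, np.hypot(s1, s2)).cdf).pvalue)
# the difference behaves like N(mu1 - mu2, s1**2 + s2**2)
```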

Example 6 Let X~N(μ, σ²) and Y=exp(X). Find the pdf of Y. (Y has the lognormal distribution, with f_Y(y) = f_X(ln y)/y for y > 0.)
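A simulation sketch of Example 6 with assumed μ = 0.5, σ = 0.8, using scipy's lognormal parameterization (s = σ, scale = e^μ):

```python
import numpy as np
from scipy.stats import lognorm, kstest

rng = np.random.default_rng(7)
mu, sigma = 0.5, 0.8
y = np.exp(rng.normal(mu, sigma, 100_000))
print(kstest(y, lognorm(s=sigma, scale=np.exp(mu)).cdf).pvalue)
# consistent with Y = exp(X) being lognormal
```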