EE 5345 Review of Probability & Random Variables (cont)
copyright Robert J. Marks II

Transformation on a RV Given fX(x) and Y = g(X), find fY(y). Solution: solve for the roots of y = g(x); call them x1, x2, ..., xk. The transformation is built from these roots (see the formula below). copyright Robert J. Marks II
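For reference, the standard change-of-variables result for a many-to-one mapping, assuming g is differentiable with g'(xk) ≠ 0 at each root xk of y = g(x):
    f_Y(y) = \sum_k \frac{f_X(x_k)}{|g'(x_k)|}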

Transformation on a RV (cont) Special case: Y = g(X) where y = g(x) is strictly increasing. Note: if a < b then g(a) < g(b). [Figure: plot of y = g(x), marking a, b on the x axis and g(a), g(b) on the y axis.] copyright Robert J. Marks II

Transformation on a RV (cont) Notes (cont): If y = g(x) is strictly increasing, so is x = g^-1(y). [Figure: plots of y = g(x) and the inverse x = g^-1(y).] Example: y = e^x = g(x) and x = ln(y) = g^-1(y). copyright Robert J. Marks II

Transformation on a RV (cont) Transformation when y=g(x) is strictly increasing copyright Robert J. Marks II

Transformation on a RV (cont) Transformation when y=g(x) is strictly increasing (cont) copyright Robert J. Marks II

Transformation on a RV (cont) Transformation when y = g(x) is strictly increasing (cont) [Figure: y = g(x) and its inverse x = g^-1(y).] copyright Robert J. Marks II

Transformation on a RV (cont) More generally, if y = g(x) is strictly increasing or strictly decreasing, the density transforms through the inverse function (see below). copyright Robert J. Marks II
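One standard way to write the result, assuming g is strictly monotone with a differentiable inverse:
    f_Y(y) = f_X\!\left(g^{-1}(y)\right) \left| \frac{d}{dy} g^{-1}(y) \right| = \frac{f_X(g^{-1}(y))}{|g'(g^{-1}(y))|}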

Transformation on a RV (cont) Most general: Given fX(x) and Y = g(X), find fY(y). Solution: solve for the roots of y = g(x); call them x1, x2, x3, ... The transformation sums the contributions of all roots. [Figure: y = g(x) crossing a horizontal level y at the roots x1, x2, x3.] copyright Robert J. Marks II
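As an illustrative sketch (not from the original slides), a Monte Carlo check of the multi-root formula for Y = g(X) = X^2 with X standard normal; the roots of y = x^2 are +sqrt(y) and -sqrt(y), and |g'(x)| = 2|x|:

    import numpy as np

    # Hypothetical check of the multi-root density formula for Y = X^2, X ~ N(0, 1).
    rng = np.random.default_rng(0)
    x = rng.standard_normal(1_000_000)
    y = x**2

    def f_X(t):
        return np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)   # standard normal density

    def f_Y(t):
        r = np.sqrt(t)                                   # roots are +r and -r
        return (f_X(r) + f_X(-r)) / (2 * r)              # sum of f_X(x_k) / |g'(x_k)|

    counts, edges = np.histogram(y, bins=50, range=(0.1, 4.0))
    centers = 0.5 * (edges[:-1] + edges[1:])
    empirical = counts / (len(y) * np.diff(edges))       # empirical density estimate of f_Y
    print(np.abs(empirical - f_Y(centers)).max())        # small: simulation matches the formula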

Transformation on a RV (cont) Other points. Visualization: mapping of probability mass. When y = g(x) is constant over an interval, the probability mass of that interval maps to a Dirac delta in fY(y). [Figure: fX(x) mapped through y = g(x) to fY(y).] copyright Robert J. Marks II

Simple Transformations Transposition: Y = -X Shift: Y = X+s Scale: Y = aX copyright Robert J. Marks II
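Under the usual assumptions (a ≠ 0 for scaling), these give the standard densities:
    \text{transposition: } f_Y(y) = f_X(-y), \qquad \text{shift: } f_Y(y) = f_X(y - s), \qquad \text{scale: } f_Y(y) = \frac{1}{|a|}\, f_X\!\left(\frac{y}{a}\right)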

Simple Transformations (cont) Modulus: Y = |X| Shift & Scale: Y = aX + s copyright Robert J. Marks II
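For reference, the corresponding densities (assuming a ≠ 0; the modulus density holds for y ≥ 0 and is zero otherwise):
    \text{modulus: } f_Y(y) = f_X(y) + f_X(-y), \; y \ge 0, \qquad \text{shift \& scale: } f_Y(y) = \frac{1}{|a|}\, f_X\!\left(\frac{y - s}{a}\right)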

Generating RV's Uniform & Gaussian RV's are readily available; transformation allows generation of other RV's. Ex: when X is uniform on (0,1), Y = -ln(X) gives an exponential RV; when X is uniform on (-π/2, π/2), Y = tan(X) gives a Cauchy RV. Note: Mappings are not unique! copyright Robert J. Marks II
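A minimal sketch of these two generators (illustrative, assuming a standard uniform source):

    import numpy as np

    rng = np.random.default_rng(1)
    u = rng.uniform(size=100_000)           # X ~ Uniform(0, 1)

    exponential = -np.log(u)                # Y = -ln(X) ~ Exponential(1)
    cauchy = np.tan(np.pi * (u - 0.5))      # tan of a Uniform(-pi/2, pi/2) RV ~ standard Cauchy

    print(exponential.mean())   # close to 1, the Exponential(1) mean
    print(np.median(cauchy))    # close to 0; the Cauchy has no mean, so check the median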

Synthesizing RV's (cont) Given a uniform RV X on (0,1), find y = g(x) so that Y is distributed as fY(y). One solution: y = FY^-1(x). The converse problem: given fX(x), find y = g(x) so that Y is uniform on (0,1). One solution: y = FX(x). These solutions are not unique. copyright Robert J. Marks II
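For a continuous, strictly increasing CDF FY, the first solution works because, with X uniform on (0,1):
    \Pr[Y \le y] = \Pr\!\left[F_Y^{-1}(X) \le y\right] = \Pr[X \le F_Y(y)] = F_Y(y)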

RV Expected Values nth moment, mean (center of mass), variance (measure of dispersion). copyright Robert J. Marks II
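In the usual notation for a continuous RV with density fX:
    E[X^n] = \int_{-\infty}^{\infty} x^n f_X(x)\,dx, \qquad \mu = E[X], \qquad \sigma^2 = E[(X - \mu)^2] = E[X^2] - \mu^2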

RV Expected Values (cont.) Some random variables have infinite means. For example, FX(x) = 1 - 1/x for x > 1. Some random variables have infinite second moments (e.g. the Cauchy). copyright Robert J. Marks II
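To see this for the example, the density is fX(x) = 1/x^2 for x > 1, so:
    E[X] = \int_1^{\infty} x \cdot \frac{1}{x^2}\,dx = \int_1^{\infty} \frac{dx}{x} = \infty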

Discrete RV Expected Values For lattice discrete RV's, kn = n. Example lattice discrete RV's: Poisson, Binomial, Bernoulli. copyright Robert J. Marks II

Discrete RV Moments For lattice discrete RV's. 50-50 coin flip: Heads = 0, Tails = 1, E(X) = 1/2. The expected value need not be a realizable value of the random variable. copyright Robert J. Marks II

Discrete RV Moments (cont) Bernoulli's Gambling Problem. Rules: let X be the number of coin flips before the first head, with Pr[X = k] = 2^-k. Payoff: Y = 2^k dollars. Q: If you ran this game in Las Vegas, what would you charge to play? What is E(Y)? A: ∞. Does this make sense? copyright Robert J. Marks II
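The expectation diverges because every term in the sum contributes one dollar:
    E[Y] = \sum_{k=1}^{\infty} 2^k \Pr[X = k] = \sum_{k=1}^{\infty} 2^k \cdot 2^{-k} = \infty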

Characteristic Function Fourier transform of PDF Properties copyright Robert J. Marks II
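In the usual notation (the Fourier-transform sign convention varies across texts), the definition and the moment property are:
    \Phi_X(\omega) = E\!\left[e^{j\omega X}\right] = \int_{-\infty}^{\infty} f_X(x)\, e^{j\omega x}\, dx, \qquad E[X^n] = \frac{1}{j^n} \left. \frac{d^n \Phi_X(\omega)}{d\omega^n} \right|_{\omega = 0}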

Characteristic Function Properties copyright Robert J. Marks II

Characteristic Function If X and Y are independent, the characteristic function of X + Y factors into a product; this follows from the convolution theorem of Fourier transforms. Foundation of the Central Limit Theorem. copyright Robert J. Marks II
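In symbols, since the density of an independent sum is the convolution fX * fY:
    \Phi_{X+Y}(\omega) = E\!\left[e^{j\omega (X+Y)}\right] = \Phi_X(\omega)\, \Phi_Y(\omega)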

Second Characteristic Function Definition: Ψ_X(ω) = ln Φ_X(ω). Properties copyright Robert J. Marks II

Moment Generating Function Laplace transform of the PDF. copyright Robert J. Marks II
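One common definition, as a two-sided transform (some texts write E[e^{-sX}] to match the Laplace-transform sign convention):
    M_X(s) = E\!\left[e^{sX}\right] = \int_{-\infty}^{\infty} f_X(x)\, e^{s x}\, dx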

MGF Properties Moment generation; Taylor series. copyright Robert J. Marks II
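In standard form, the moments come from derivatives at s = 0, and the Taylor series collects them:
    E[X^n] = \left. \frac{d^n M_X(s)}{ds^n} \right|_{s=0}, \qquad M_X(s) = \sum_{n=0}^{\infty} \frac{E[X^n]}{n!}\, s^n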

Probability Generating Function Z-transform of the probability mass function (pmf). copyright Robert J. Marks II
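In the usual notation for a nonnegative integer-valued RV (probability texts typically use powers of z rather than the z^-k of the engineering Z-transform):
    G_X(z) = E\!\left[z^X\right] = \sum_{k=0}^{\infty} p_X(k)\, z^k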

PGF Properties Mean and Variance copyright Robert J. Marks II
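The standard relations, from derivatives of the PGF at z = 1:
    E[X] = G_X'(1), \qquad \operatorname{Var}(X) = G_X''(1) + G_X'(1) - \left[G_X'(1)\right]^2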

Common Models Bernoulli, p = probability of success copyright Robert J. Marks II
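For reference, the standard Bernoulli quantities:
    \Pr[X = 1] = p, \quad \Pr[X = 0] = 1 - p, \qquad E[X] = p, \qquad \operatorname{Var}(X) = p(1 - p), \qquad G_X(z) = 1 - p + pz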

Common Models (cont) Binomial (n repeated Bernoulli trials) copyright Robert J. Marks II
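The standard binomial quantities, with p the per-trial probability of success:
    \Pr[X = k] = \binom{n}{k} p^k (1 - p)^{n - k}, \; k = 0, \dots, n, \qquad E[X] = np, \qquad \operatorname{Var}(X) = np(1 - p), \qquad G_X(z) = (1 - p + pz)^n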

Common Models (cont) Geometric copyright Robert J. Marks II

Common Models (cont) Pascal or Negative Binomial copyright Robert J. Marks II

Common Models (cont) Poisson copyright Robert J. Marks II
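The standard Poisson quantities, with rate parameter λ:
    \Pr[X = k] = \frac{\lambda^k e^{-\lambda}}{k!}, \; k = 0, 1, 2, \dots, \qquad E[X] = \operatorname{Var}(X) = \lambda, \qquad G_X(z) = e^{\lambda (z - 1)}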

Common Models (cont) Uniform copyright Robert J. Marks II

Common Models (cont) Gaussian or Normal copyright Robert J. Marks II
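The standard Gaussian quantities, with mean μ and variance σ²:
    f_X(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x - \mu)^2 / (2\sigma^2)}, \qquad \Phi_X(\omega) = e^{\,j\mu\omega - \sigma^2 \omega^2 / 2}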

Common Models (cont) Exponential copyright Robert J. Marks II
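The standard exponential quantities, with rate λ > 0:
    f_X(x) = \lambda e^{-\lambda x}, \; x \ge 0, \qquad E[X] = \frac{1}{\lambda}, \qquad \operatorname{Var}(X) = \frac{1}{\lambda^2}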

Common Models (cont) Laplace copyright Robert J. Marks II

Common Models (cont) Gamma copyright Robert J. Marks II

Tail Inequalities: Markov If fX(x) = fX(x) u(x) (i.e., X is a nonnegative RV), then Markov's inequality bounds the tail probability (see below). [Figure: fX(x) with the tail beyond x = a shaded.] Andrei Andreyevich Markov, born 14 June 1856 in Ryazan, Russia; died 20 July 1922 in St Petersburg, Russia. Proof: p. 137. copyright Robert J. Marks II
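The inequality itself, in standard form for a nonnegative RV X and any a > 0:
    \Pr[X \ge a] \le \frac{E[X]}{a}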

Markov's Inequality: Proof [Figure: fX(x) versus x.] quod erat demonstrandum copyright Robert J. Marks II
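The standard argument, using nonnegativity to drop part of the integral and then bounding x from below by a on the tail:
    E[X] = \int_0^{\infty} x f_X(x)\,dx \ge \int_a^{\infty} x f_X(x)\,dx \ge a \int_a^{\infty} f_X(x)\,dx = a \Pr[X \ge a]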

Tail Inequalities: Chebyshev [Figure: fX(x) with both tails shaded.] Pafnuty Lvovich Chebyshev, born 16 May 1821 in Okatovo, Russia; died 8 Dec 1894 in St Petersburg, Russia. Proof: p. 137. copyright Robert J. Marks II
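The standard statement, for an RV with mean μ and variance σ², and any a > 0:
    \Pr[\,|X - \mu| \ge a\,] \le \frac{\sigma^2}{a^2}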

Chebyshev’s Inequality: Proof copyright Robert J. Marks II
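The usual route is to apply Markov's inequality to the nonnegative RV (X - μ)²:
    \Pr[\,|X - \mu| \ge a\,] = \Pr\!\left[(X - \mu)^2 \ge a^2\right] \le \frac{E\!\left[(X - \mu)^2\right]}{a^2} = \frac{\sigma^2}{a^2}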

Tail Inequalities: Chebyshev Chebyshev's inequality illustrates the variance as a measure of dispersion. As the variance increases, the bound, indicated by the shaded area of the tails, becomes larger and larger, suggesting the distribution is spreading out more and more. [Figure: fX(x) with the tails outside an interval of width 2a about the mean shaded.] copyright Robert J. Marks II

The Chernoff Bound [Figure: fX(x) with the tail beyond x = a shaded.] copyright Robert J. Marks II
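One common form of the bound; it follows from applying Markov's inequality to e^{sX} for any s > 0, then optimizing over s:
    \Pr[X \ge a] \le e^{-sa}\, E\!\left[e^{sX}\right] = e^{-sa} M_X(s), \qquad \Pr[X \ge a] \le \min_{s > 0}\, e^{-sa} M_X(s)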