STA347 - week 9, slide 1: Random Vectors and Matrices

A random vector is a vector whose elements are random variables. The collective behavior of a p × 1 random vector is described by a joint probability density function f(x_1, x_2, …, x_p) = f(x). If the joint density of a p × 1 random vector can be factored as f(x_1, x_2, …, x_p) = f_1(x_1) f_2(x_2) ⋯ f_p(x_p), then the p continuous random variables X_1, X_2, …, X_p are mutually independent.

STA347 - week 9, slide 2: Mean and Variance of a Random Vector

The expected value of a random vector is the vector of the expected values of its elements. That is, the population mean vector is μ = E(x) = (μ_1, μ_2, …, μ_p)′, where μ_i = E(X_i). The population variance-covariance matrix of a p × 1 random vector x is the p × p symmetric matrix Σ = (σ_ij), where σ_ij = Cov(X_i, X_j) = E[(X_i − μ_i)(X_j − μ_j)]. The population correlation matrix of a p × 1 random vector x is the p × p symmetric matrix ρ = (ρ_ij), where ρ_ij = σ_ij / √(σ_ii σ_jj).
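As a quick numerical illustration (my own addition, not part of the slides), the sample analogues of the mean vector, covariance matrix, and correlation matrix can be computed with NumPy; the particular mean and covariance values below are arbitrary choices for the demonstration:

```python
import numpy as np

# Hypothetical data: 1000 draws of a p = 3 random vector.
rng = np.random.default_rng(0)
X = rng.multivariate_normal(mean=[0.0, 1.0, 2.0],
                            cov=[[2.0, 0.5, 0.0],
                                 [0.5, 1.0, 0.3],
                                 [0.0, 0.3, 1.5]],
                            size=1000)

mu_hat = X.mean(axis=0)              # sample mean vector, shape (p,)
Sigma_hat = np.cov(X, rowvar=False)  # sample covariance matrix, shape (p, p)

# Correlation matrix: rho_ij = sigma_ij / sqrt(sigma_ii * sigma_jj).
d = np.sqrt(np.diag(Sigma_hat))
rho_hat = Sigma_hat / np.outer(d, d)

print(mu_hat)
print(rho_hat)
```

With 1000 draws, the sample quantities should be close to (but not exactly equal to) the population values used to generate the data.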

STA347 - week 9, slide 3: Properties of the Mean Vector and Covariance Matrix

For a constant matrix A and constant vector b, E(Ax + b) = Aμ + b and Cov(Ax + b) = AΣA′. The covariance matrix Σ is positive semidefinite, since Var(a′x) = a′Σa ≥ 0 for every constant vector a.

STA347 - week 9, slide 4: Functions of Random Variables

In some cases we would like to find the distribution of Y = h(X) when the distribution of X is known.

Discrete case: P(Y = y) = P(h(X) = y) = Σ_{x : h(x) = y} P(X = x).

Examples
1. Let Y = aX + b, a ≠ 0.
2. Let …

STA347 - week 9, slide 5: Continuous Case – Examples

1. Suppose X ~ Uniform(0, 1) and let Y = …. The cdf of Y can be found as follows: …. The density of Y is then obtained by differentiating the cdf.
2. Let X have the exponential distribution with parameter λ. Find the density of Y = ….
3. Suppose X is a random variable with density …. Check that this is a valid density, and find the density of Y = ….

STA347 - week 9, slide 6: Theorem

If X is a continuous random variable with density f_X(x) and h is a strictly increasing and differentiable function from R → R, then Y = h(X) has density

f_Y(y) = f_X(h⁻¹(y)) · (d/dy) h⁻¹(y)

for y in the range of h.

Proof: F_Y(y) = P(h(X) ≤ y) = P(X ≤ h⁻¹(y)) = F_X(h⁻¹(y)); differentiating with respect to y gives the result.

STA347 - week 9, slide 7: Theorem

If X is a continuous random variable with density f_X(x) and h is a strictly decreasing and differentiable function from R → R, then Y = h(X) has density

f_Y(y) = −f_X(h⁻¹(y)) · (d/dy) h⁻¹(y)

for y in the range of h.

Proof: F_Y(y) = P(h(X) ≤ y) = P(X ≥ h⁻¹(y)) = 1 − F_X(h⁻¹(y)); differentiating with respect to y gives the result. The minus sign makes the density positive, since (d/dy) h⁻¹(y) < 0 for decreasing h.

STA347 - week 9, slide 8: Summary

If Y = h(X) and h is monotone, then

f_Y(y) = f_X(h⁻¹(y)) · |(d/dy) h⁻¹(y)|.

Example: X has a density …. Let Y = …. Compute the density of Y.
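The monotone change-of-variable formula can be sanity-checked by simulation. The sketch below is my own illustration, not from the slides; it uses X ~ Exponential(1) and the map h(x) = x², which is strictly increasing on the support x > 0, so the theorem gives F_Y(t) = F_X(√t) = 1 − e^(−√t):

```python
import numpy as np

# Monte Carlo check of the change-of-variable result for Y = X^2,
# X ~ Exponential(1) (an illustrative choice of h).
rng = np.random.default_rng(1)
x = rng.exponential(scale=1.0, size=200_000)
y = x ** 2

# Compare the empirical cdf of Y with the theoretical F_Y(t) = 1 - exp(-sqrt(t)).
for t in [0.25, 1.0, 4.0]:
    print(t, np.mean(y <= t), 1.0 - np.exp(-np.sqrt(t)))
```

The empirical and theoretical values should agree to within Monte Carlo error.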

STA347 - week 9, slide 9: Change of Variables for Joint Distributions

Theorem. Let X and Y be jointly continuous random variables with joint density function f_{X,Y}(x, y), and let D_XY = {(x, y) : f_{X,Y}(x, y) > 0}. If the mapping T given by T(x, y) = (u(x, y), v(x, y)) is one-to-one and maps D_XY onto D_UV, then U, V are jointly continuous random variables with joint density function

f_{U,V}(u, v) = f_{X,Y}(x(u, v), y(u, v)) · |J(u, v)|,

where J(u, v) is the Jacobian of T⁻¹, given by the determinant

J(u, v) = (∂x/∂u)(∂y/∂v) − (∂x/∂v)(∂y/∂u),

assuming the derivatives exist and are continuous at all points in D_UV.

STA347 - week 9, slide 10: Example

Let X, Y have joint density function given by …. Find the density function of ….

STA347 - week 9, slide 11: Example

Show that the standard Normal density integrates to 1.

STA347 - week 9, slide 12: Example

A device containing two key components fails when and only when both components fail. The lifetimes, T_1 and T_2, of these components are independent with a common density function given by …. The cost, X, of operating the device until failure is X = 2T_1 + T_2. Find the density function of X.

STA347 - week 9, slide 13: Convolution

Suppose X and Y are jointly distributed random variables. We want to find the probability / density function of Z = X + Y.

Discrete case: X, Y have joint probability function p_{X,Y}(x, y). Z = z whenever X = x and Y = z − x, so the probability that Z = z is the sum over all x of these joint probabilities. That is,

p_Z(z) = Σ_x p_{X,Y}(x, z − x).

If X, Y are independent, then

p_Z(z) = Σ_x p_X(x) p_Y(z − x).

This is known as the convolution of p_X(x) and p_Y(y).

STA347 - week 9, slide 14: Example

Suppose X ~ Poisson(λ_1) independent of Y ~ Poisson(λ_2). Find the distribution of X + Y.
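A numerical check of the discrete convolution formula (my own illustration, not part of the slides): convolving a Poisson(λ_1) pmf with a Poisson(λ_2) pmf reproduces the Poisson(λ_1 + λ_2) pmf term by term, which is the answer to the example above.

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    """Poisson(lam) probability mass function at k."""
    return exp(-lam) * lam ** k / factorial(k)

lam1, lam2 = 2.0, 3.0
N = 60  # truncation point; the tail mass beyond N is negligible here

# Discrete convolution: p_Z(z) = sum over x of p_X(x) * p_Y(z - x).
conv = [sum(poisson_pmf(lam1, x) * poisson_pmf(lam2, z - x) for x in range(z + 1))
        for z in range(N)]

# Direct Poisson(lam1 + lam2) pmf, for comparison.
direct = [poisson_pmf(lam1 + lam2, z) for z in range(N)]
print(max(abs(a - b) for a, b in zip(conv, direct)))
```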

STA347 - week 9, slide 15: Convolution – Continuous Case

Suppose X, Y are random variables with joint density function f_{X,Y}(x, y). We want to find the density function of Z = X + Y. We can find the distribution function of Z and differentiate. How? The cdf of Z can be found as follows:

F_Z(z) = P(X + Y ≤ z) = ∫_{−∞}^{∞} ∫_{−∞}^{z−x} f_{X,Y}(x, y) dy dx.

If ∫_{−∞}^{∞} f_{X,Y}(x, z − x) dx is continuous at z, then the density function of Z is given by

f_Z(z) = ∫_{−∞}^{∞} f_{X,Y}(x, z − x) dx.

If X, Y are independent, then

f_Z(z) = ∫_{−∞}^{∞} f_X(x) f_Y(z − x) dx.

This is known as the convolution of f_X(x) and f_Y(y).

STA347 - week 9, slide 16: Example

X and Y are independent, each having an Exponential distribution with mean 1/λ. Find the density of W = X + Y.
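A quick simulation check of this example (my own sketch): the convolution of two independent Exponential(λ) densities is the Gamma(2, λ) density f_W(w) = λ²we^(−λw), whose cdf is F_W(t) = 1 − e^(−λt)(1 + λt).

```python
import numpy as np

# Simulate W = X + Y for independent Exponential(lam) variables and compare
# the empirical cdf with F_W(t) = 1 - exp(-lam*t) * (1 + lam*t).
lam = 1.5
rng = np.random.default_rng(2)
w = (rng.exponential(scale=1/lam, size=100_000)
     + rng.exponential(scale=1/lam, size=100_000))

for t in [0.5, 1.0, 2.0]:
    print(t, np.mean(w <= t), 1.0 - np.exp(-lam * t) * (1.0 + lam * t))
```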

STA347 - week 9, slide 17: Order Statistics

The order statistics of a set of random variables X_1, X_2, …, X_n are the same random variables arranged in increasing order. Denote
X_(1) = smallest of X_1, X_2, …, X_n,
X_(2) = 2nd smallest of X_1, X_2, …, X_n,
…
X_(n) = largest of X_1, X_2, …, X_n.
Note that even if the X_i's are independent, the X_(i)'s cannot be independent, since X_(1) ≤ X_(2) ≤ … ≤ X_(n). The distributions of the X_i's and the X_(i)'s are NOT the same.

STA347 - week 9, slide 18: Distribution of the Largest Order Statistic X_(n)

Suppose X_1, X_2, …, X_n are i.i.d. random variables with common distribution function F_X(x) and common density function f_X(x). The cdf of the largest order statistic, X_(n), is given by

F_{X_(n)}(x) = P(X_(n) ≤ x) = P(X_1 ≤ x, …, X_n ≤ x) = [F_X(x)]^n.

The density function of X_(n) is then

f_{X_(n)}(x) = n [F_X(x)]^(n−1) f_X(x).

STA347 - week 9, slide 19: Example

Suppose X_1, X_2, …, X_n are i.i.d. Uniform(0, 1) random variables. Find the density function of X_(n).
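As a simulation check of the largest-order-statistic formula (illustrative, not part of the slides): for i.i.d. Uniform(0, 1) variables, F_X(x) = x on (0, 1), so the cdf of X_(n) is x^n.

```python
import numpy as np

# Simulate the maximum of n i.i.d. Uniform(0, 1) draws and compare its
# empirical cdf with the theoretical F_{X_(n)}(x) = x**n.
n = 5
rng = np.random.default_rng(3)
maxima = rng.uniform(0.0, 1.0, size=(100_000, n)).max(axis=1)

for x in [0.5, 0.8, 0.95]:
    print(x, np.mean(maxima <= x), x ** n)
```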

STA347 - week 9, slide 20: Distribution of the Smallest Order Statistic X_(1)

Suppose X_1, X_2, …, X_n are i.i.d. random variables with common distribution function F_X(x) and common density function f_X(x). The cdf of the smallest order statistic, X_(1), is given by

F_{X_(1)}(x) = 1 − P(X_(1) > x) = 1 − P(X_1 > x, …, X_n > x) = 1 − [1 − F_X(x)]^n.

The density function of X_(1) is then

f_{X_(1)}(x) = n [1 − F_X(x)]^(n−1) f_X(x).

STA347 - week 9, slide 21: Example

Suppose X_1, X_2, …, X_n are i.i.d. Uniform(0, 1) random variables. Find the density function of X_(1).

STA347 - week 9, slide 22: Distribution of the kth Order Statistic X_(k)

Suppose X_1, X_2, …, X_n are i.i.d. random variables with common distribution function F_X(x) and common density function f_X(x). The density function of X_(k) is

f_{X_(k)}(x) = [n! / ((k − 1)!(n − k)!)] [F_X(x)]^(k−1) [1 − F_X(x)]^(n−k) f_X(x).

STA347 - week 9, slide 23: Example

Suppose X_1, X_2, …, X_n are i.i.d. Uniform(0, 1) random variables. Find the density function of X_(k).
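A simulation sketch of this example (my own addition): for i.i.d. Uniform(0, 1) variables, the kth-order-statistic density above reduces to the Beta(k, n − k + 1) density, so in particular E[X_(k)] = k / (n + 1).

```python
import numpy as np

# Simulate the k-th order statistic of n i.i.d. Uniform(0, 1) draws and
# compare its sample mean with the Beta(k, n - k + 1) mean k / (n + 1).
n, k = 7, 3
rng = np.random.default_rng(4)
order_stats = np.sort(rng.uniform(0.0, 1.0, size=(100_000, n)), axis=1)
kth = order_stats[:, k - 1]  # k-th smallest value in each row

print(kth.mean(), k / (n + 1))
```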

STA347 - week 9, slide 24: Computer Simulations – Introduction

Modern high-speed computers can be used to perform simulation studies. Computer simulation methods are commonly used in statistical applications; sometimes they replace theory, e.g., bootstrap methods. Computer simulations are becoming more and more common in many applications, such as quality control, marketing, and scientific research.

STA347 - week 9, slide 25: Applications of Computer Simulations

Our main focus is on probabilistic simulations. Examples of applications of such simulations include:
- Simulating probabilities and random variables numerically.
- Approximating quantities that are too difficult to compute mathematically.
- Randomly selecting a sample from a very large data set.
- Encrypting data or generating passwords.
- Generating potential solutions for difficult problems.

STA347 - week 9, slide 26: Steps in Probabilistic Simulations

In most applications, the first step is to specify a certain probability distribution. Once such a distribution is specified, it will be desired to generate one or more random variables having that distribution. The built-in computer device that generates random numbers is called a pseudorandom number generator. It is a device for generating a sequence U_1, U_2, … of random values that are approximately independent and have approximately the uniform distribution on the unit interval [0, 1].

STA347 - week 9, slide 27: Simulating Discrete Distributions – Example

Suppose we wish to generate X ~ Bernoulli(p), where 0 < p < 1. We start by generating U ~ Uniform[0, 1] and then set

X = 1 if U ≤ p, and X = 0 otherwise.

Then clearly X takes two values, 0 and 1. Further, P(X = 1) = P(U ≤ p) = p. Therefore, we have that X ~ Bernoulli(p). This can be generalized to generate Y ~ Binomial(n, p) by generating U_1, U_2, …, U_n, setting each X_i as above, and letting Y = X_1 + ⋯ + X_n.
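The Bernoulli/Binomial construction on this slide can be sketched in a few lines of Python (the function names are my own):

```python
import random

def bernoulli(p):
    """X = 1 if U <= p else 0, where U ~ Uniform[0, 1], so P(X = 1) = p."""
    return 1 if random.random() <= p else 0

def binomial(n, p):
    """Y = X_1 + ... + X_n for independent Bernoulli(p) draws."""
    return sum(bernoulli(p) for _ in range(n))

random.seed(0)
draws = [binomial(10, 0.3) for _ in range(50_000)]
print(sum(draws) / len(draws))  # should be close to n * p = 3
```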

STA347 - week 9, slide 28: Simulating Discrete Distributions

In general, suppose we wish to generate a random variable with probability mass function p. Let x_1 < x_2 < ⋯ be the possible values, with p(x_i) > 0. Let U ~ Uniform[0, 1]. Define Y by:

Y = x_j, where j is the smallest index such that p(x_1) + ⋯ + p(x_j) ≥ U.

Theorem 1: Y is a discrete random variable having probability mass function p.

Proof: Y = x_j exactly when p(x_1) + ⋯ + p(x_{j−1}) < U ≤ p(x_1) + ⋯ + p(x_j), an interval of length p(x_j); since U is uniform on [0, 1], P(Y = x_j) = p(x_j).
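The general discrete rule above can be sketched as follows (the function name and example pmf are my own illustrative choices):

```python
import random

def discrete_inverse(values, probs):
    """Return the smallest x_j whose cumulative probability
    p(x_1) + ... + p(x_j) reaches U ~ Uniform[0, 1]."""
    u = random.random()
    cumulative = 0.0
    for x, p in zip(values, probs):
        cumulative += p
        if u <= cumulative:
            return x
    return values[-1]  # guard against floating-point round-off

random.seed(1)
draws = [discrete_inverse([1, 2, 5], [0.2, 0.5, 0.3]) for _ in range(100_000)]
print(draws.count(1) / len(draws), draws.count(2) / len(draws))
```

The empirical frequencies should match the specified pmf up to Monte Carlo error.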

STA347 - week 9, slide 29: Simulating Continuous Distributions – Example

Suppose we wish to generate X ~ Uniform[a, b]. We start by generating U ~ Uniform[0, 1] and then set

X = a + (b − a)U.

Using the one-dimensional change-of-variable theorem, we can easily show that X ~ Uniform[a, b].
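A minimal sketch of this transformation (illustrative; the function name is mine):

```python
import random

def uniform_ab(a, b):
    """X = a + (b - a) * U maps U ~ Uniform[0, 1] to Uniform[a, b]."""
    return a + (b - a) * random.random()

random.seed(2)
draws = [uniform_ab(2.0, 6.0) for _ in range(100_000)]
print(min(draws), max(draws), sum(draws) / len(draws))  # mean near (a + b) / 2 = 4
```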

STA347 - week 9, slide 30: Simulating Continuous Distributions

In general, simulating a continuous distribution is not an easy task. However, for certain continuous distributions it is not difficult. The general method for simulating a continuous distribution makes use of the inverse cumulative distribution function. The inverse cdf of a random variable X with cumulative distribution function F is defined by

F⁻¹(t) = inf{ x : F(x) ≥ t }, for 0 < t < 1.

STA347 - week 9, slide 31: Inversion Method for Generating Random Variables

Let F be any cumulative distribution function, and let U ~ Uniform[0, 1]. Define a random variable Y by

Y = F⁻¹(U).

Theorem 2: Y has cumulative distribution function given by F. That is, P(Y ≤ y) = F(y).

Proof: P(Y ≤ y) = P(F⁻¹(U) ≤ y) = P(U ≤ F(y)) = F(y), where the middle equality uses the fact that F⁻¹(u) ≤ y if and only if u ≤ F(y).

STA347 - week 9, slide 32: Important Notes

The theorem above is valid for any cumulative distribution function, whether it corresponds to a continuous distribution, a discrete distribution, or a mixture of the two. The inversion method for generating random variables described above can be used whenever the distribution function is not too complicated and has a closed form. For distributions that are too complicated to sample using the inversion method, and for which there is no simple trick, it may still be possible to generate samples using Markov chain methods.

STA347 - week 9, slide 33: Example – Exponential Distribution

Suppose X ~ Exponential(λ). The probability density function of X is f(x) = λe^(−λx) for x > 0. The cdf of X is F(x) = 1 − e^(−λx) for x > 0. Setting u = F(x) and solving for x, we get x = −ln(1 − u)/λ. Therefore, by Theorem 2 above, X = −ln(1 − U)/λ, where U ~ Uniform[0, 1], has an Exponential(λ) distribution.
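The exponential inversion recipe translates directly into code (a sketch; the function name is mine):

```python
import math
import random

def exponential_inv(lam):
    """Inversion method: solve u = 1 - exp(-lam * x) for x, giving
    x = -log(1 - u) / lam; with U ~ Uniform[0, 1] this is Exponential(lam)."""
    u = random.random()
    return -math.log(1.0 - u) / lam

random.seed(3)
lam = 2.0
draws = [exponential_inv(lam) for _ in range(100_000)]
print(sum(draws) / len(draws))  # should be close to the mean 1 / lam = 0.5
```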

STA347 - week 9, slide 34: Example – Standard Normal Distribution

Suppose X ~ Normal(0, 1). The cdf of X is denoted by Φ(x); it is given by

Φ(x) = ∫_{−∞}^{x} (1/√(2π)) e^(−t²/2) dt.

Then, if U ~ Uniform[0, 1], by Theorem 2 above Φ⁻¹(U) has a N(0, 1) distribution. However, since neither Φ nor Φ⁻¹ has a closed form, i.e., they are difficult to compute, the inversion method for generating this random variable is not practical.