Section 5.4


Important Theorems in the Text:

Theorem 5.4-1. If $X_1, X_2, \ldots, X_n$ are independent random variables with respective moment generating functions $M_1(t), M_2(t), \ldots, M_n(t)$, and $a_1, a_2, \ldots, a_n$ are constants, then the moment generating function of $Y = a_1 X_1 + a_2 X_2 + \cdots + a_n X_n = \sum_{i=1}^{n} a_i X_i$ is
$$M_Y(t) = \prod_{i=1}^{n} M_i(a_i t) = M_1(a_1 t)\, M_2(a_2 t) \cdots M_n(a_n t).$$

Corollary 5.4-1. If $X_1, X_2, \ldots, X_n$ comprise a random sample from a distribution with moment generating function $M(t)$, then

(a) the moment generating function of $Y = X_1 + X_2 + \cdots + X_n = \sum_{i=1}^{n} X_i$ is
$$M_Y(t) = \prod_{i=1}^{n} M(t) = [M(t)]^n,$$

(b) the moment generating function of the sample mean $\bar{X} = \dfrac{1}{n}\sum_{i=1}^{n} X_i$ is
$$M_{\bar{X}}(t) = \prod_{i=1}^{n} M(t/n) = [M(t/n)]^n.$$
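Theorem 5.4-1 lends itself to a quick numerical sanity check: the empirical m.g.f. of a linear combination, estimated as a sample average of $e^{tY}$, should match the product of the closed-form m.g.f.s. The following is a minimal Monte Carlo sketch, not from the slides; the choice of an exponential and a standard normal (and the values of $a_1$, $a_2$, $t$) is purely illustrative.

```python
# Minimal Monte Carlo sketch of Theorem 5.4-1:
# the m.g.f. of Y = a1*X1 + a2*X2 should equal M1(a1*t) * M2(a2*t).
import numpy as np

rng = np.random.default_rng(0)
n_draws = 1_000_000
a1, a2, t = 2.0, 3.0, 0.1

# Two independent random variables: X1 ~ exponential(mean 1), X2 ~ N(0, 1).
x1 = rng.exponential(scale=1.0, size=n_draws)
x2 = rng.standard_normal(n_draws)

# Empirical m.g.f. of Y, estimated as the sample average of e^(t*Y).
empirical = np.mean(np.exp(t * (a1 * x1 + a2 * x2)))

# Closed forms: M1(s) = 1/(1 - s) for the exponential (valid for s < 1),
# and M2(s) = e^(s^2 / 2) for N(0, 1).
closed_form = (1.0 / (1.0 - a1 * t)) * np.exp((a2 * t) ** 2 / 2.0)

print(empirical, closed_form)  # the two should agree up to Monte Carlo error
```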

Theorem. If $X_1, X_2, \ldots, X_n$ are independent chi-square random variables with $r_1, r_2, \ldots, r_n$ degrees of freedom respectively, then the random variable
$$Y = \sum_{i=1}^{n} X_i$$
has a $\chi^2(r_1 + r_2 + \cdots + r_n)$ distribution.

Corollary. If $Z_1, Z_2, \ldots, Z_n$ are independent random variables, each with a standard normal ($N(0, 1)$) distribution, then the random variable
$$W = \sum_{i=1}^{n} Z_i^2$$
has a $\chi^2(n)$ distribution.

Corollary. If $X_1, X_2, \ldots, X_n$ are independent random variables with respective normal distributions $N(\mu_i, \sigma_i^2)$ for $i = 1, 2, \ldots, n$, then the random variable
$$W = \sum_{i=1}^{n} \left( \frac{X_i - \mu_i}{\sigma_i} \right)^2$$
has a $\chi^2(n)$ distribution.
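The corollary on sums of squared standard normals is easy to illustrate by simulation. Below is a minimal sketch, assuming NumPy and SciPy are available, that compares the empirical distribution of $W = \sum Z_i^2$ against $\chi^2(n)$ with a Kolmogorov-Smirnov test; $n$ and the number of draws are arbitrary choices.

```python
# Simulation sketch: W = Z1^2 + ... + Zn^2 for independent standard normals
# should have a chi-square distribution with n degrees of freedom.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, n_draws = 5, 200_000

# Each row holds n independent N(0, 1) draws; sum the squares across each row.
w = (rng.standard_normal((n_draws, n)) ** 2).sum(axis=1)

# Kolmogorov-Smirnov test of W against the chi-square(n) c.d.f.
print(stats.kstest(w, stats.chi2(df=n).cdf))  # a large p-value means a good fit
```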

1. (a) Let $X_1, X_2, \ldots, X_n$ be a random sample of size $n$ from an exponential distribution with mean $\theta$. Find the m.g.f. of the random variable $Y = X_1 + X_2 + \cdots + X_n = \sum_{i=1}^{n} X_i$. Is it possible to tell from the m.g.f. what the distribution of $Y$ is? (Note: This is essentially a Text Exercise.)

From Corollary 5.4-1, we have that
$$M_Y(t) = [M(t)]^n = \left( \frac{1}{1 - \theta t} \right)^n = \frac{1}{(1 - \theta t)^n}.$$
From this m.g.f., we recognize that $Y$ must have a gamma$(n, \theta)$ distribution.
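The gamma$(n, \theta)$ conclusion can also be confirmed by simulation. A minimal sketch follows, with illustrative values of $n$ and $\theta$; note that SciPy's gamma distribution uses the same shape/scale parameterization as gamma$(n, \theta)$ here.

```python
# Simulation sketch of Exercise 1(a): a sum of n i.i.d. exponentials with
# mean theta should have a gamma(n, theta) distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, theta, n_draws = 4, 2.0, 200_000

# Y = X1 + ... + Xn for each of n_draws simulated random samples.
y = rng.exponential(scale=theta, size=(n_draws, n)).sum(axis=1)

# Compare against gamma with shape n and scale theta.
print(stats.kstest(y, stats.gamma(a=n, scale=theta).cdf))
```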

(b) Find the m.g.f. of the random variable $\bar{X} = \dfrac{1}{n}\sum_{k=1}^{n} X_k$ (the sample mean). Is it possible to tell from the m.g.f. what the distribution of $\bar{X}$ is?

From Corollary 5.4-1(b), we have that
$$M_{\bar{X}}(t) = \prod_{i=1}^{n} M(t/n) = \left( \frac{1}{1 - \theta(t/n)} \right)^n = \frac{1}{[1 - (\theta/n)t]^n}.$$
From this m.g.f., we recognize that $\bar{X}$ must have a gamma$(n, \theta/n)$ distribution.
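The same Monte Carlo check, adapted for the sample mean (again a sketch with illustrative parameter values):

```python
# Simulation sketch of Exercise 1(b): the sample mean of n i.i.d. exponentials
# with mean theta should have a gamma(n, theta/n) distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
n, theta, n_draws = 4, 2.0, 200_000

xbar = rng.exponential(scale=theta, size=(n_draws, n)).mean(axis=1)
print(stats.kstest(xbar, stats.gamma(a=n, scale=theta / n).cdf))
```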

2. Suppose the joint p.m.f. of random variables $X$ and $Y$ could be graphically displayed as follows:

             x = 1    x = 2
    y = 2     1/4      1/3
    y = 1     1/6      1/4

Can $X$ and $Y$ be treated as a random sample of size $n = 2$? Why or why not?

The marginal p.m.f.s are
$$f_1(x) = \begin{cases} 5/12 & \text{if } x = 1 \\ 7/12 & \text{if } x = 2 \end{cases} \qquad\qquad f_2(y) = \begin{cases} 5/12 & \text{if } y = 1 \\ 7/12 & \text{if } y = 2 \end{cases}$$

Since $f_1(1)\, f_2(1) = (5/12)(5/12) = 25/144 \neq 1/6 = f(1, 1)$, the random variables $X$ and $Y$ cannot be treated as a random sample, because they are not independent.
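The independence check amounts to comparing each joint probability with the product of its marginals, which is mechanical enough to script. A minimal sketch using exact fractions follows; the dictionary encoding of the table is an illustrative choice, not from the slides.

```python
# Independence check for Exercise 2: X and Y are independent
# iff f(x, y) = f1(x) * f2(y) holds in every cell of the table.
from fractions import Fraction as F

# joint[(x, y)] = f(x, y), copied from the table above.
joint = {(1, 1): F(1, 6), (2, 1): F(1, 4), (1, 2): F(1, 4), (2, 2): F(1, 3)}

# Marginal p.m.f.s, obtained by summing the joint p.m.f. over the other variable.
f1 = {x: sum(p for (xx, _), p in joint.items() if xx == x) for x in (1, 2)}
f2 = {y: sum(p for (_, yy), p in joint.items() if yy == y) for y in (1, 2)}

independent = all(joint[x, y] == f1[x] * f2[y] for (x, y) in joint)
print(f1, f2, independent)  # marginals are 5/12 and 7/12; independent is False
```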

3. Suppose $W$, $X$, and $Y$ are mutually independent random variables, each having an exponential distribution, where $E(W) = 1$, $E(X) = 5$, and $E(Y) = 10$.

(a) Can $W$, $X$, and $Y$ be treated as a random sample of size $n = 3$? Why or why not?

The random variables $W$, $X$, and $Y$ cannot be treated as a random sample, because they do not have identical distributions.

(b) Find the m.g.f. of the random variable $U = 10W + 2X + Y$. Is it possible to tell from the m.g.f. what the distribution of $U$ is?

From Theorem 5.4-1, we have that
$$M_U(t) = \prod_{i=1}^{3} M_i(a_i t) = M_W(10t)\, M_X(2t)\, M_Y(t) = \frac{1}{1 - 10t} \cdot \frac{1}{1 - 5(2t)} \cdot \frac{1}{1 - 10t} = \left( \frac{1}{1 - 10t} \right)^3.$$
From this m.g.f., we recognize that $U$ must have a gamma$(3, 10)$ distribution.
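The gamma$(3, 10)$ conclusion also makes intuitive sense: $10W$, $2X$, and $Y$ are each exponential with mean 10, so $U$ is a sum of three i.i.d. exponentials. A simulation sketch (assuming SciPy; the seed and draw count are arbitrary) that tests this:

```python
# Simulation sketch for part (b): since 10W, 2X, and Y are each exponential
# with mean 10, U = 10W + 2X + Y should have a gamma(3, 10) distribution.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n_draws = 200_000

w = rng.exponential(scale=1.0, size=n_draws)   # E(W) = 1
x = rng.exponential(scale=5.0, size=n_draws)   # E(X) = 5
y = rng.exponential(scale=10.0, size=n_draws)  # E(Y) = 10

u = 10 * w + 2 * x + y
print(stats.kstest(u, stats.gamma(a=3, scale=10).cdf))  # should not reject
```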

(c) Find the m.g.f. of the random variable $V = 8W + 5X - 3Y$. Is it possible to tell from the m.g.f. what the distribution of $V$ is?

From Theorem 5.4-1, we have that
$$M_V(t) = \prod_{i=1}^{3} M_i(a_i t) = M_W(8t)\, M_X(5t)\, M_Y(-3t) = \frac{1}{1 - 8t} \cdot \frac{1}{1 - 5(5t)} \cdot \frac{1}{1 - 10(-3t)} = \frac{1}{(1 - 8t)(1 - 25t)(1 + 30t)}.$$
Since we do not recognize this m.g.f., we cannot tell what type of distribution $V$ has.
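Even though the m.g.f. of $V$ matches no named distribution, the product formula itself can still be checked numerically against the empirical m.g.f. at any $t$ small enough that all three factors are defined (here $-1/30 < t < 1/25$). A minimal sketch at $t = 0.01$:

```python
# Sketch for part (c): compare the empirical m.g.f. of V at one point
# against the product formula 1 / ((1 - 8t)(1 - 25t)(1 + 30t)).
import numpy as np

rng = np.random.default_rng(5)
n_draws = 1_000_000
t = 0.01  # keeps 1 - 8t, 1 - 25t, and 1 + 30t all positive

w = rng.exponential(scale=1.0, size=n_draws)
x = rng.exponential(scale=5.0, size=n_draws)
y = rng.exponential(scale=10.0, size=n_draws)

empirical = np.mean(np.exp(t * (8 * w + 5 * x - 3 * y)))
closed_form = 1.0 / ((1 - 8 * t) * (1 - 25 * t) * (1 + 30 * t))
print(empirical, closed_form)  # should agree up to Monte Carlo error
```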