
Week 11, slide 1: Some facts about Power Series

Consider the power series $\sum_{k=0}^{\infty} a_k t^k$ with non-negative coefficients $a_k$. If the series converges for some positive value of t, say for t = r, then it converges for all t in the interval [-r, r] and thus defines a function of t on that interval. For any t in (-r, r), this function is differentiable at t, and the termwise-differentiated series converges to the derivative.

Example: For k = 0, 1, 2, … and -1 < x < 1, differentiating the geometric series $\sum_{j=0}^{\infty} x^j = \frac{1}{1-x}$ k times term by term gives $\sum_{j=k}^{\infty} j(j-1)\cdots(j-k+1)\, x^{j-k} = \frac{k!}{(1-x)^{k+1}}$.
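The termwise differentiation can be checked numerically; a minimal sketch in Python for the k = 1 case (the value of x is arbitrary):

```python
# Check numerically that differentiating the geometric series term by term,
# sum_{k>=1} k x^(k-1), agrees with the closed form 1/(1-x)^2 for |x| < 1.
x = 0.3  # any value in (-1, 1)
partial = sum(k * x ** (k - 1) for k in range(1, 200))
closed_form = 1.0 / (1.0 - x) ** 2
assert abs(partial - closed_form) < 1e-12
```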

Week 11, slide 2: Generating Functions

For a sequence of real numbers $\{a_j\} = a_0, a_1, a_2, \dots$, the generating function of $\{a_j\}$ is $A(t) = \sum_{j=0}^{\infty} a_j t^j$, provided this converges for $|t| < t_0$ for some $t_0 > 0$.

Week 11, slide 3: Probability Generating Functions

Suppose X is a random variable taking the values 0, 1, 2, … (or a subset of the non-negative integers). Let $p_j = P(X = j)$, j = 0, 1, 2, …. This is in fact a sequence $p_0, p_1, p_2, \dots$

Definition: The probability generating function of X is $\pi_X(t) = \sum_{j=0}^{\infty} p_j t^j$. Since $\sum_{j} p_j = 1$, we have $\sum_j |p_j t^j| \le \sum_j p_j = 1$ if $|t| \le 1$, so the pgf converges absolutely at least for $|t| \le 1$. In general, $\pi_X(1) = p_0 + p_1 + p_2 + \cdots = 1$. The pgf of X is expressible as an expectation: $\pi_X(t) = E(t^X)$.
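A quick illustration of the definition on a small made-up pmf (the probability values are hypothetical), showing $\pi_X(1) = 1$ and $\pi_X(0) = P(X = 0)$:

```python
# pgf of a hypothetical pmf on {0, 1, 2}: pi_X(t) = sum_j P(X = j) t^j = E(t^X).
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

def pgf(t):
    return sum(p * t ** j for j, p in pmf.items())

assert abs(pgf(1.0) - 1.0) < 1e-12   # pi_X(1) = p0 + p1 + p2 = 1
assert abs(pgf(0.0) - 0.2) < 1e-12   # pi_X(0) = P(X = 0), since 0^0 = 1
```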

Week 11, slide 4: Examples

X ~ Binomial(n, p): $\pi_X(t) = \sum_{j=0}^{n} \binom{n}{j} (pt)^j q^{n-j} = (pt + q)^n$, where q = 1 - p; this converges for all real t.

X ~ Geometric(p): $\pi_X(t) = \sum_{j=1}^{\infty} p q^{j-1} t^j = \frac{pt}{1 - qt}$, which converges for $|qt| < 1$, i.e. for $|t| < 1/q$. Note: in this case $p_j = p q^{j-1}$ for j = 1, 2, …
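Both closed forms can be checked against the defining sums; a short numeric sketch (the parameter values are arbitrary):

```python
from math import comb

n, p, t = 6, 0.4, 0.7
q = 1 - p

# Binomial(n, p): the defining sum equals (pt + q)^n.
binom_sum = sum(comb(n, j) * p ** j * q ** (n - j) * t ** j for j in range(n + 1))
assert abs(binom_sum - (p * t + q) ** n) < 1e-12

# Geometric(p) on {1, 2, ...} with P(X = j) = p q^(j-1):
# the sum equals pt / (1 - qt) when |qt| < 1.
geom_sum = sum(p * q ** (j - 1) * t ** j for j in range(1, 400))
assert abs(geom_sum - p * t / (1 - q * t)) < 1e-12
```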

Week 11, slide 5: PGF for sums of independent random variables

If X and Y are independent and Z = X + Y, then $\pi_Z(t) = E(t^{X+Y}) = E(t^X)\,E(t^Y) = \pi_X(t)\,\pi_Y(t)$.

Example: Let Y ~ Binomial(n, p). Then we can write $Y = X_1 + X_2 + \cdots + X_n$, where the $X_i$'s are i.i.d. Bernoulli(p). The pgf of each $X_i$ is $\pi_{X_i}(t) = q + pt$. The pgf of Y is then $\pi_Y(t) = (q + pt)^n$.
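The product rule can be verified directly by convolving two small hypothetical pmfs (all values made up) to get the pmf of Z = X + Y:

```python
# For independent X, Y the pgf of Z = X + Y is the product pi_X(t) * pi_Y(t).
px = {0: 0.3, 1: 0.7}            # hypothetical pmf of X
py = {0: 0.5, 1: 0.2, 2: 0.3}    # hypothetical pmf of Y

pz = {}
for j, a in px.items():
    for k, b in py.items():
        pz[j + k] = pz.get(j + k, 0.0) + a * b   # P(Z = j + k) by convolution

def pgf(pmf, t):
    return sum(p * t ** j for j, p in pmf.items())

t = 0.8
assert abs(pgf(pz, t) - pgf(px, t) * pgf(py, t)) < 1e-12
```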

Week 11, slide 6: Use of PGF to find probabilities

Theorem: Let X be a discrete random variable whose possible values are the non-negative integers, and assume $\pi_X(t_0) < \infty$ for some $t_0 > 0$. Then $\pi_X(0) = P(X = 0)$, $\pi_X'(0) = P(X = 1)$, etc. In general,

$P(X = k) = \dfrac{\pi_X^{(k)}(0)}{k!}$,

where $\pi_X^{(k)}$ is the k-th derivative of $\pi_X$ with respect to t.

Proof: Expand $\pi_X(t) = \sum_j p_j t^j$ and differentiate term by term; after k differentiations the constant term is $k!\,p_k$, and evaluating at t = 0 kills all higher-order terms.
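For a finite pmf the theorem can be illustrated exactly, since the pgf is a polynomial whose derivatives we can take coefficient by coefficient (the pmf values below are hypothetical):

```python
from math import factorial

# Write pi_X(t) = sum_j p_j t^j.  Differentiating k times and setting
# t = 0 leaves k! * p_k, so pi^(k)(0) / k! recovers P(X = k).
p = [0.1, 0.4, 0.3, 0.2]  # hypothetical pmf of X on {0, 1, 2, 3}

def derivative(coeffs):
    """Coefficients of the derivative of the polynomial sum coeffs[j] t^j."""
    return [j * c for j, c in enumerate(coeffs)][1:]

for k in range(len(p)):
    c = list(p)
    for _ in range(k):
        c = derivative(c)
    assert abs(c[0] / factorial(k) - p[k]) < 1e-12  # pi^(k)(0) / k! = p_k
```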

Week 11, slide 7: Example

Suppose X ~ Poisson(λ). The pgf of X is given by $\pi_X(t) = \sum_{j=0}^{\infty} e^{-\lambda} \frac{\lambda^j}{j!} t^j = e^{-\lambda} e^{\lambda t} = e^{\lambda(t-1)}$. Using this pgf we have that $\pi_X^{(k)}(t) = \lambda^k e^{\lambda(t-1)}$, so $P(X = k) = \frac{\pi_X^{(k)}(0)}{k!} = \frac{e^{-\lambda}\lambda^k}{k!}$, recovering the Poisson probabilities.

Week 11, slide 8: Finding Moments from PGFs

Theorem: Let X be a discrete random variable whose possible values are the non-negative integers, and suppose $\pi_X(t)$ converges on an interval $|t| < t_0$ for some $t_0 > 1$. Then $\pi_X'(1) = E(X)$, $\pi_X''(1) = E(X(X-1))$, etc. In general,

$\pi_X^{(k)}(1) = E\big(X(X-1)\cdots(X-k+1)\big)$,

where $\pi_X^{(k)}$ is the k-th derivative of $\pi_X$ with respect to t.

Note: $E(X(X-1)\cdots(X-k+1))$ is called the k-th factorial moment of X.

Proof: Differentiate $\sum_j p_j t^j$ k times term by term and set t = 1: each surviving term is $j(j-1)\cdots(j-k+1)\,p_j$.
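A numeric sketch of the first two cases, estimating $\pi_X'(1)$ and $\pi_X''(1)$ by central differences and comparing with the factorial moments computed from a hypothetical pmf:

```python
# Central-difference estimates of pi'(1) and pi''(1) versus the
# factorial moments E(X) and E(X(X-1)).
p = {0: 0.2, 1: 0.3, 2: 0.4, 3: 0.1}   # hypothetical pmf

def pgf(t):
    return sum(pj * t ** j for j, pj in p.items())

h = 1e-5
d1 = (pgf(1 + h) - pgf(1 - h)) / (2 * h)            # ~ pi'(1)
d2 = (pgf(1 + h) - 2 * pgf(1) + pgf(1 - h)) / h**2  # ~ pi''(1)
EX = sum(j * pj for j, pj in p.items())              # E(X) = 1.4
EXX1 = sum(j * (j - 1) * pj for j, pj in p.items())  # E(X(X-1)) = 1.4
assert abs(d1 - EX) < 1e-6
assert abs(d2 - EXX1) < 1e-4
```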

Week 11, slide 9: Example

Suppose X ~ Binomial(n, p). The pgf of X is $\pi_X(t) = (pt + q)^n$. Find the mean and the variance of X using its pgf.
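A sketch of the computation the exercise asks for, with arbitrary parameter values: $\pi_X'(1) = np$ and $\pi_X''(1) = n(n-1)p^2$, and then $\mathrm{Var}(X) = \pi_X''(1) + \pi_X'(1) - \pi_X'(1)^2$:

```python
# Mean and variance of Binomial(n, p) from the derivatives of (pt + q)^n.
n, p = 10, 0.3
q = 1 - p
d1 = n * p * (p + q) ** (n - 1)           # pi'(1) = n p
d2 = n * (n - 1) * p**2 * (p + q) ** (n - 2)  # pi''(1) = n(n-1) p^2
mean = d1
var = d2 + d1 - d1**2                     # E(X(X-1)) + E(X) - E(X)^2
assert abs(mean - n * p) < 1e-12
assert abs(var - n * p * q) < 1e-12       # Var(X) = n p q
```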

Week 11, slide 10: Uniqueness Theorem for PGFs

Suppose X and Y have probability generating functions $\pi_X$ and $\pi_Y$ respectively. Then $\pi_X(t) = \pi_Y(t)$ for all t if and only if $P(X = k) = P(Y = k)$ for k = 0, 1, 2, …

Proof: This follows immediately from a theorem of calculus: if a function is expressible as a power series at x = a, then there is only one such series. A pgf is a power series about the origin which we know exists with radius of convergence of at least 1.

Week 11, slide 11: Moment Generating Functions

The moment generating function of a random variable X is $m_X(t) = E(e^{tX})$; it exists if $m_X(t) < \infty$ for all t in some interval $|t| < t_0$, $t_0 > 0$.

If X is discrete, $m_X(t) = \sum_x e^{tx} p(x)$. If X is continuous, $m_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx$.

Note: $m_X(t) = \pi_X(e^t)$ when X takes values in the non-negative integers.
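The relation $m_X(t) = \pi_X(e^t)$ can be checked numerically for a Poisson random variable, computing $E(e^{tX})$ from the pmf and comparing it with the pgf evaluated at $e^t$ (parameter values arbitrary):

```python
from math import exp, factorial

# m_X(t) = E(e^{tX}) summed from the Poisson pmf, versus pi_X(e^t)
# where pi_X(s) = exp(lam * (s - 1)) is the Poisson pgf.
lam, t = 1.7, 0.4
mgf_sum = sum(exp(t * j) * exp(-lam) * lam ** j / factorial(j) for j in range(80))
pgf_at_exp_t = exp(lam * (exp(t) - 1))
assert abs(mgf_sum - pgf_at_exp_t) < 1e-12
```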

Week 11, slide 12: Examples

X ~ Exponential(λ): the mgf of X is $m_X(t) = \int_0^{\infty} e^{tx} \lambda e^{-\lambda x}\,dx = \frac{\lambda}{\lambda - t}$ for t < λ.

X ~ Uniform(0, 1): the mgf of X is $m_X(t) = \int_0^1 e^{tx}\,dx = \frac{e^t - 1}{t}$ for t ≠ 0, with $m_X(0) = 1$.
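Both integrals can be checked with a simple midpoint Riemann sum; a numeric sketch (grid size and parameter values arbitrary):

```python
from math import exp

lam, t, N = 2.0, 0.5, 100000

# Exponential(lam), truncating the integral at x = 20 (the tail is
# negligible there): m(t) = lam / (lam - t) for t < lam.
h = 20.0 / N
riemann_exp = sum(exp(t * (i + 0.5) * h) * lam * exp(-lam * (i + 0.5) * h) * h
                  for i in range(N))
assert abs(riemann_exp - lam / (lam - t)) < 1e-3

# Uniform(0, 1): m(t) = (e^t - 1) / t for t != 0.
h = 1.0 / N
riemann_unif = sum(exp(t * (i + 0.5) * h) * h for i in range(N))
assert abs(riemann_unif - (exp(t) - 1) / t) < 1e-6
```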

Week 11, slide 13: Generating Moments from MGFs

Theorem: Let X be any random variable whose mgf exists for $|t| < t_0$, some $t_0 > 0$. Then $m_X(0) = 1$, $m_X'(0) = E(X)$, $m_X''(0) = E(X^2)$, etc. In general,

$m_X^{(k)}(0) = E(X^k)$,

where $m_X^{(k)}$ is the k-th derivative of $m_X$ with respect to t.

Proof: Differentiating $E(e^{tX})$ k times under the expectation gives $E(X^k e^{tX})$; setting t = 0 yields $E(X^k)$.

Week 11, slide 14: Example

Suppose X ~ Exponential(λ). Find the mean and variance of X using its moment generating function.
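A sketch of the intended computation: with $m_X(t) = \lambda/(\lambda - t)$, one gets $m_X'(0) = 1/\lambda$ and $m_X''(0) = 2/\lambda^2$, so $E(X) = 1/\lambda$ and $\mathrm{Var}(X) = 2/\lambda^2 - (1/\lambda)^2 = 1/\lambda^2$. Checked by finite differences (λ arbitrary):

```python
lam, h = 2.0, 1e-5

def m(t):
    return lam / (lam - t)   # mgf of Exponential(lam), t < lam

d1 = (m(h) - m(-h)) / (2 * h)            # ~ m'(0) = E(X)
d2 = (m(h) - 2 * m(0) + m(-h)) / h**2    # ~ m''(0) = E(X^2)
assert abs(d1 - 1 / lam) < 1e-6
assert abs(d2 - 2 / lam**2) < 1e-3
var = d2 - d1**2
assert abs(var - 1 / lam**2) < 1e-3      # Var(X) = 1 / lam^2
```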

Week 11, slide 15: Example

Suppose X ~ N(0, 1). Find the mean and variance of X using its moment generating function.
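A sketch of the intended answer: the standard normal mgf is $m(t) = e^{t^2/2}$, so $m'(0) = 0$ and $m''(0) = 1$, giving E(X) = 0 and Var(X) = 1. Finite-difference check:

```python
from math import exp

def m(t):
    return exp(t * t / 2)   # mgf of N(0, 1)

h = 1e-5
d1 = (m(h) - m(-h)) / (2 * h)            # ~ m'(0) = E(X)
d2 = (m(h) - 2 * m(0) + m(-h)) / h**2    # ~ m''(0) = E(X^2)
assert abs(d1 - 0.0) < 1e-6
assert abs(d2 - 1.0) < 1e-3              # Var(X) = E(X^2) - E(X)^2 = 1
```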

Week 11, slide 16: Example

Suppose X ~ Binomial(n, p). Find the mean and variance of X using its moment generating function.
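A numeric sketch of the computation: the Binomial(n, p) mgf is $m(t) = (pe^t + q)^n$, and finite differences at 0 recover $E(X) = np$ and $E(X^2) = npq + (np)^2$, hence Var(X) = npq (parameter values arbitrary):

```python
from math import exp

n, p = 8, 0.25
q, h = 1 - p, 1e-5

def m(t):
    return (p * exp(t) + q) ** n   # mgf of Binomial(n, p)

d1 = (m(h) - m(-h)) / (2 * h)            # ~ m'(0) = E(X)
d2 = (m(h) - 2 * m(0) + m(-h)) / h**2    # ~ m''(0) = E(X^2)
assert abs(d1 - n * p) < 1e-6
assert abs(d2 - (n * p * q + (n * p) ** 2)) < 1e-3
var = d2 - d1**2
assert abs(var - n * p * q) < 1e-3       # Var(X) = n p q
```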

Week 11, slide 17: Properties of Moment Generating Functions

$m_X(0) = 1$.

If Y = a + bX, then the mgf of Y is given by $m_Y(t) = e^{at}\,m_X(bt)$.

If X and Y are independent and Z = X + Y, then $m_Z(t) = m_X(t)\,m_Y(t)$.
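Both properties can be checked numerically with Exponential(λ) as a test case (all parameter values arbitrary). The sketch below also uses two standard facts not stated on the slide: for b > 0, bX ~ Exponential(λ/b), and the sum of two independent Exponential(λ) variables has the Gamma(2, λ) density $\lambda^2 z e^{-\lambda z}$:

```python
from math import exp

lam, t = 1.5, 0.4

def m_X(s):
    return lam / (lam - s)   # mgf of Exponential(lam), s < lam

N, top = 100000, 40.0
h = top / N

# (1) Y = a + b X: integrate e^{t(a + bx)} against the density of X
# and compare with e^{at} m_X(bt).
a, b = 2.0, 0.5
riemann1 = sum(exp(t * (a + b * (i + 0.5) * h)) * lam * exp(-lam * (i + 0.5) * h) * h
               for i in range(N))
assert abs(riemann1 - exp(a * t) * m_X(b * t)) < 1e-3

# (2) Z = X1 + X2 (independent): Z ~ Gamma(2, lam), density lam^2 z e^{-lam z};
# its mgf should equal m_X(t)^2.
riemann2 = sum(exp(t * (i + 0.5) * h) * lam**2 * ((i + 0.5) * h) * exp(-lam * (i + 0.5) * h) * h
               for i in range(N))
assert abs(riemann2 - m_X(t) ** 2) < 1e-3
```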

Week 11, slide 18: Uniqueness Theorem

If a moment generating function $m_X(t)$ exists for t in an open interval containing 0, it uniquely determines the probability distribution.

Week 11, slide 19: Example

Find the mgf of X ~ N(μ, σ²) using the mgf of the standard normal random variable. Then suppose $X_1 \sim N(\mu_1, \sigma_1^2)$ and $X_2 \sim N(\mu_2, \sigma_2^2)$ are independent; find the distribution of $X_1 + X_2$ using the mgf approach.
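A sketch of both parts, assuming the two independent summands are normal, $X_1 \sim N(\mu_1, \sigma_1^2)$ and $X_2 \sim N(\mu_2, \sigma_2^2)$. Writing $X = \mu + \sigma Z$ with Z standard normal gives $m_X(t) = e^{\mu t} m_Z(\sigma t) = e^{\mu t + \sigma^2 t^2/2}$, and the product $m_{X_1}(t)\,m_{X_2}(t) = e^{(\mu_1+\mu_2)t + (\sigma_1^2+\sigma_2^2)t^2/2}$ is the mgf of $N(\mu_1+\mu_2,\ \sigma_1^2+\sigma_2^2)$, which by the uniqueness theorem identifies the distribution of $X_1 + X_2$. Numeric identity check (parameter values arbitrary):

```python
from math import exp

def m_normal(mu, var, t):
    return exp(mu * t + var * t * t / 2)   # mgf of N(mu, var)

mu1, v1, mu2, v2, t = 1.0, 4.0, -0.5, 2.25, 0.3
lhs = m_normal(mu1, v1, t) * m_normal(mu2, v2, t)   # mgf of X1 + X2
rhs = m_normal(mu1 + mu2, v1 + v2, t)               # mgf of N(mu1+mu2, v1+v2)
assert abs(lhs - rhs) < 1e-12
```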