ORDER STATISTICS AND LIMITING DISTRIBUTIONS


ORDER STATISTICS Let X1, X2,…,Xn be a r.s. of size n from a distribution of continuous type having pdf f(x), a<x<b. Let X(1) be the smallest of the Xi, X(2) the second smallest,…, and X(n) the largest. X(i) is called the i-th order statistic.

ORDER STATISTICS It is often useful to consider the ordered random sample. Example: suppose a r.s. of five light bulbs is tested and the failure times are observed as (5, 11, 4, 100, 17). These will actually be observed in the order (4, 5, 11, 17, 100). Interest might be in the kth smallest ordered observation, e.g. to stop the experiment after the kth failure. We might also be interested in joint distributions of two or more order statistics, or in functions of them (e.g. range = max − min).
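As a quick sketch (added here, not part of the slides), sorting the observed sample gives the order statistics directly, and the range follows at once:

```python
# Added illustration of the light-bulb example above: sorting a sample
# yields the order statistics X(1) <= X(2) <= ... <= X(n).
failure_times = [5, 11, 4, 100, 17]          # observed sample
order_stats = sorted(failure_times)          # -> [4, 5, 11, 17, 100]
sample_range = order_stats[-1] - order_stats[0]   # max - min
print(order_stats)
print(sample_range)                          # 96
```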

ORDER STATISTICS If X1, X2,…,Xn is a r.s. of size n from a population with continuous pdf f(x), then the joint pdf of the order statistics X(1), X(2),…,X(n) is f(y1, y2,…, yn) = n! f(y1) f(y2) ⋯ f(yn) for y1 < y2 < ⋯ < yn (and 0 otherwise). Order statistics are not independent, and the joint pdf of the ordered sample is not the same as the joint pdf of the unordered sample. Future reference: for discrete distributions, ties (two X's being equal) must be taken into account. See Casella and Berger, 1990, p. 231.

Example Suppose that X1, X2, X3 is a r.s. from a population with pdf f(x) = 2x for 0 < x < 1. Find the joint pdf of the order statistics and the marginal pdf of the smallest order statistic.
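A Monte Carlo sketch of this example (an added check, not from the slides): here F(x) = x², so the smallest of three observations has cdf 1 − (1 − x²)³, which simulation can confirm at a chosen point:

```python
import random, math

random.seed(1)
# Added Monte Carlo check: draws from f(x) = 2x on (0,1) have cdf
# F(x) = x**2, so the inverse-cdf method gives X = sqrt(U). The minimum
# of three such draws then has cdf F_min(x) = 1 - (1 - x**2)**3.
N, x0 = 200_000, 0.5
hits = 0
for _ in range(N):
    sample = [math.sqrt(random.random()) for _ in range(3)]  # inverse-cdf draws
    hits += min(sample) <= x0
empirical = hits / N
theoretical = 1 - (1 - x0**2) ** 3       # = 0.578125 at x0 = 0.5
print(empirical, theoretical)
```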

ORDER STATISTICS The Maximum Order Statistic: X(n). Since X(n) ≤ y exactly when every Xi ≤ y, its cdf is F_{X(n)}(y) = [F(y)]^n, and differentiating gives the pdf f_{X(n)}(y) = n [F(y)]^(n−1) f(y).

ORDER STATISTICS The Minimum Order Statistic: X(1). Since X(1) > y exactly when every Xi > y, its cdf is F_{X(1)}(y) = 1 − [1 − F(y)]^n, and the pdf is f_{X(1)}(y) = n [1 − F(y)]^(n−1) f(y).
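The cdfs of the maximum and minimum can be sanity-checked by simulation; the sketch below (an addition, assuming a Uniform(0,1) sample so that F(y) = y) compares empirical frequencies against [F(y)]^n and 1 − [1 − F(y)]^n:

```python
import random

random.seed(0)
# Added Monte Carlo check, assuming a Uniform(0,1) sample (so F(y) = y):
# P(X(n) <= y) = y**n  and  P(X(1) <= y) = 1 - (1 - y)**n.
n, N, y0 = 4, 200_000, 0.7
max_hits = min_hits = 0
for _ in range(N):
    s = [random.random() for _ in range(n)]
    max_hits += max(s) <= y0
    min_hits += min(s) <= y0
print(max_hits / N, y0**n)                # both near 0.2401
print(min_hits / N, 1 - (1 - y0) ** n)    # both near 0.9919
```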

ORDER STATISTICS The k-th Order Statistic. Partition the line at the point yk: k−1 of the observations fall below yk, each with probability P(X < yk) = F(yk); one observation falls at yk, contributing the density fX(yk); and n−k fall above yk, each with probability P(X > yk) = 1 − F(yk). The number of possible orderings is n!/{(k−1)! 1! (n−k)!}, so the pdf of the k-th order statistic is f_{X(k)}(yk) = [n!/((k−1)!(n−k)!)] [F(yk)]^(k−1) [1 − F(yk)]^(n−k) fX(yk).

Example The same example, but now using the previous formulas (without taking the integrals): suppose that X1, X2, X3 is a r.s. from a population with pdf f(x) = 2x for 0 < x < 1. Find the marginal pdf of the smallest order statistic.

Example X~Uniform(0,1). A r.s. of size n is taken. Find the p.d.f. of the kth order statistic. Solution: Let Yk be the kth order statistic. Since F(y) = y on (0,1), the general formula gives f_{Yk}(y) = [n!/((k−1)!(n−k)!)] y^(k−1) (1 − y)^(n−k) for 0 < y < 1, i.e. Yk ~ Beta(k, n − k + 1).
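For a quick check (an added sketch, not part of the slides), the Beta(k, n − k + 1) result implies E(Yk) = k/(n + 1), which simulation confirms:

```python
import random

random.seed(2)
# Added check: the kth order statistic of a Uniform(0,1) sample of size n
# is Beta(k, n - k + 1), so its mean is k/(n + 1); here n = 5, k = 2.
n, k, N = 5, 2, 200_000
total = 0.0
for _ in range(N):
    u = sorted(random.random() for _ in range(n))
    total += u[k - 1]          # k-th order statistic (1-indexed)
mean = total / N
print(mean, k / (n + 1))       # both near 1/3
```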

ORDER STATISTICS Joint p.d.f. of the k-th and j-th Order Statistics (for k < j). Partition the line at yk and yj: k−1 observations fall below yk (probability P(X < yk) = F(yk) each), one at yk (density fX(yk)), j−k−1 between yk and yj (probability P(yk < X < yj) = F(yj) − F(yk) each), one at yj (density fX(yj)), and n−j above yj (probability P(X > yj) = 1 − F(yj) each). The number of possible orderings is n!/{(k−1)! 1! (j−k−1)! 1! (n−j)!}, so f_{X(k),X(j)}(yk, yj) = [n!/((k−1)!(j−k−1)!(n−j)!)] [F(yk)]^(k−1) [F(yj) − F(yk)]^(j−k−1) [1 − F(yj)]^(n−j) fX(yk) fX(yj), for yk < yj.

LIMITING DISTRIBUTION

Motivation The p.d.f. of a rv often depends on the sample size n. If X1, X2,…, Xn is a sequence of rvs and Yn = u(X1, X2,…, Xn) is a function of them, sometimes it is possible to find the exact distribution of Yn (what we have been doing lately). Sometimes, however, it is only possible to obtain approximate results when n is large; this leads to limiting distributions.

CONVERGENCE IN DISTRIBUTION Let X1, X2,…, Xn be a sequence of rvs and let Yn = u(X1, X2,…, Xn) have cdf Fn(y) for each n = 1, 2,…. If lim n→∞ Fn(y) = F(y) at every point y where F(y) is continuous, then the sequence Yn is said to converge in distribution to Y (a rv with cdf F).

CONVERGENCE IN DISTRIBUTION Theorem: If lim n→∞ Fn(y) = F(y) for every point y at which F(y) is continuous, then Yn is said to have a limiting distribution with cdf F(y). The term "asymptotic distribution" is sometimes used instead of "limiting distribution". The definition of convergence in distribution requires only that the limiting function agree with the cdf at its points of continuity.

Example Let X1,…, Xn be a random sample from Unif(0,1). Find the limiting distribution of the max order statistic, if it exists.
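A numerical sketch of the answer (added here, not slide text): P(X(n) ≤ y) = y^n tends to 0 for every y < 1 and equals 1 at y = 1, so the limiting distribution is degenerate at 1:

```python
# Added numerical sketch: for Unif(0,1), P(X(n) <= y) = y**n, which tends
# to 0 for any y < 1 and is 1 for y >= 1 -- a distribution degenerate at 1.
for n in (10, 100, 1000):
    print(n, 0.9 ** n)         # shrinks rapidly toward 0
```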

Example Let {Xn} be a sequence of rvs with a given pmf. Find the limiting distribution of Xn, if it exists.

Example Let Yn be the nth order statistic of a random sample X1, …, Xn from Unif(0,θ). Find the limiting distribution of Zn=n(θ -Yn), if it exists.
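As an added simulation sketch: the standard answer here is that Zn converges in distribution to an exponential law with mean θ, since P(Zn ≤ z) = 1 − (1 − z/(nθ))^n → 1 − e^(−z/θ). The check below uses θ = 2 (an arbitrary choice):

```python
import random, math

random.seed(3)
# Added simulation sketch: with Yn the max of n Uniform(0, theta) draws,
# Zn = n*(theta - Yn) is approximately Exp(mean theta) for large n,
# i.e. P(Zn <= z) -> 1 - exp(-z/theta).
theta, n, N, z0 = 2.0, 200, 20_000, 2.0
limit = 1 - math.exp(-z0 / theta)        # = 1 - 1/e at z0 = theta
hits = 0
for _ in range(N):
    yn = max(random.uniform(0, theta) for _ in range(n))
    hits += n * (theta - yn) <= z0
print(hits / N, limit)                   # both near 0.632
```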

Example Let X1,…, Xn be a random sample from Exp(θ). Find the limiting distribution of the min order statistic, if it exists.

Example Suppose that X1, X2, …, Xn are iid from Exp(1). Find the limiting distribution of the max order statistic, if it exists.
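A simulation sketch (added; this is the standard extreme-value result, stated here as background rather than taken from the slides): for iid Exp(1), the centered maximum X(n) − ln n converges in distribution to the standard Gumbel law, whose cdf is exp(−e^(−z)):

```python
import random, math

random.seed(7)
# Added sketch: P(X(n) - ln(n) <= z) = (1 - exp(-z)/n)**n -> exp(-exp(-z)),
# the standard Gumbel cdf; checked at z = 0, where the limit is 1/e.
n, N, z0 = 500, 20_000, 0.0
limit = math.exp(-math.exp(-z0))          # = 1/e at z0 = 0
hits = 0
for _ in range(N):
    mx = max(-math.log(random.random()) for _ in range(n))  # Exp(1) via inverse cdf
    hits += mx - math.log(n) <= z0
print(hits / N, limit)                    # both near 0.368
```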

LIMITING MOMENT GENERATING FUNCTIONS Let rv Yn have an mgf Mn(t) that exists for all n. If lim n→∞ Mn(t) = M(t) for all t in a neighborhood of 0, where M(t) is the mgf of some distribution, then Yn has a limiting distribution, namely the one defined by M(t).

Example Let =np. The mgf of Poisson() The limiting distribution of Binomial rv is the Poisson distribution.

EXAMPLES Let Xn ~ Gamma(n, θ), where θ does not depend on n. Let Yn = Xn/n. Find the limiting distribution of Yn.

EXAMPLES Let the Xi be iid Exp(1) and let X̄n be the sample mean of a r.s. of size n. Find the limiting distribution of X̄n.

CONVERGENCE IN PROBABILITY (STOCHASTIC CONVERGENCE) A rv sequence Yn converges in probability to a rv Y if lim n→∞ P(|Yn − Y| ≥ ε) = 0 for every ε > 0. Special case: Y = c, where c is a constant not depending on n; then the limiting distribution of Yn is degenerate at the point c.

CHEBYSHEV'S INEQUALITY Let X be an rv with E(X) = μ and V(X) = σ². Then for every k > 0, P(|X − μ| ≥ kσ) ≤ 1/k², or equivalently P(|X − μ| ≥ ε) ≤ σ²/ε². Chebyshev's inequality can be used to prove stochastic convergence in many cases.
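A small numerical check of the bound (added, using a Uniform(0,1) variable with μ = 1/2 and σ² = 1/12 as an arbitrary test case):

```python
import random

random.seed(4)
# Added check of Chebyshev's bound P(|X - mu| >= k*sigma) <= 1/k**2 for
# X ~ Uniform(0,1) (mu = 0.5, sigma = sqrt(1/12)); arbitrary test case.
mu, sigma = 0.5, (1 / 12) ** 0.5
k, N = 1.5, 100_000
count = sum(abs(random.random() - mu) >= k * sigma for _ in range(N))
print(count / N, "<=", 1 / k**2)   # empirical ~0.134, bound ~0.444
```

The bound is loose here, as Chebyshev's inequality usually is; its value is that it holds for any distribution with finite variance.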

CONVERGENCE IN PROBABILITY (STOCHASTIC CONVERGENCE) Chebyshev's inequality proves convergence in probability when the following three conditions are satisfied: 1. E(Yn) = μn exists for each n; 2. μn → c as n → ∞; 3. V(Yn) → 0 as n → ∞. Then Yn converges in probability to c.

Example Let X be an rv with E(X) = μ and V(X) = σ² < ∞. For a r.s. of size n, let X̄n be the sample mean. Does X̄n converge in probability to μ?

EXAMPLES Let Zn be a given sequence of rvs and Wn a given function of it. Show that the limiting distribution of Wn is degenerate at 0.

WEAK LAW OF LARGE NUMBERS (WLLN) Let X1, X2,…,Xn be iid rvs with E(Xi) = μ and V(Xi) = σ² < ∞. Define X̄n = (1/n)Σ Xi. Then, for every ε > 0, lim n→∞ P(|X̄n − μ| ≥ ε) = 0; that is, X̄n converges in probability to μ. The WLLN states that the sample mean is a good estimate of the population mean: for large n, the sample mean is close to the population mean with probability close to 1. But more to come on the properties of estimators.
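A simulation sketch of the WLLN (added) for fair-coin tosses (Bernoulli with p = 1/2):

```python
import random

random.seed(5)
# Added WLLN sketch: the proportion of heads in n fair-coin tosses
# settles near the population mean 1/2 as n grows.
def head_proportion(n):
    return sum(random.random() < 0.5 for _ in range(n)) / n

for n in (100, 10_000, 1_000_000):
    print(n, head_proportion(n))
```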

STRONG LAW OF LARGE NUMBERS Let X1, X2,…,Xn be iid rvs with E(Xi) = μ and V(Xi) = σ² < ∞. Define X̄n = (1/n)Σ Xi. Then P(lim n→∞ X̄n = μ) = 1; that is, X̄n converges almost surely to μ.

Relation between convergences Almost sure convergence is the strongest: convergence almost surely ⇒ convergence in probability ⇒ convergence in distribution (the reverse implications are generally not true).

THE CENTRAL LIMIT THEOREM Let X1, X2,…,Xn be a sequence of iid rvs with E(Xi) = μ and V(Xi) = σ² < ∞. Define X̄n = (1/n)Σ Xi and Zn = √n(X̄n − μ)/σ. Then Zn converges in distribution to N(0,1); equivalently, (Σ Xi − nμ)/(σ√n) → N(0,1). A proof can be found in many books, e.g. Bain and Engelhardt, 1992, p. 239.
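An added simulation sketch of the CLT, using Uniform(0,1) summands (μ = 1/2, σ² = 1/12; an arbitrary test case) and checking that P(Zn ≤ 1.96) is near the standard normal value 0.975:

```python
import random, math

random.seed(6)
# Added CLT sketch: standardized means Zn = sqrt(n)*(xbar - mu)/sigma of
# Uniform(0,1) draws should be approximately N(0,1), so
# P(Zn <= 1.96) should be near Phi(1.96) = 0.975.
n, N = 50, 50_000
mu, sigma = 0.5, math.sqrt(1 / 12)
hits = 0
for _ in range(N):
    xbar = sum(random.random() for _ in range(n)) / n
    hits += math.sqrt(n) * (xbar - mu) / sigma <= 1.96
print(hits / N)        # close to 0.975
```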

Example Let X1, X2,…,Xn be iid rvs from Unif(0,1) and let Yn be a given function of them. Find the approximate distribution of Yn, and approximate values of the 90th percentile of Yn. Warning: n = 20 is probably not a big enough sample size.

EXAMPLES Let X̄n be the sample mean from a r.s. of size n = 100. Compute an approximate value of the required probability.

SLUTSKY'S THEOREM If Xn → X in distribution and Yn → a (a constant) in probability, then a) YnXn → aX in distribution; b) Xn + Yn → X + a in distribution.

SOME THEOREMS ON LIMITING DISTRIBUTIONS If Xn → c > 0 in probability, then for any function g(x) continuous at c, g(Xn) → g(c) in probability. If Xn → c in probability and Yn → d in probability, then: aXn + bYn → ac + bd in probability; XnYn → cd in probability; 1/Xn → 1/c in probability for all c ≠ 0.

Example X~Gamma(, 1). Show that

EXAMPLES X ~ Gamma(1, n). Let Zn → N(0,1) in distribution and Yn → c in probability. Find the limiting distribution of each of the following: Wn = YnZn; Un = Zn/n; Vn = Zn + Yn.

Example Consider a random sample of size n from a given p.d.f., and a second random sample of size n, independent of the first, from another given p.d.f. Find the limiting distribution of …