ORDER STATISTICS


ORDER STATISTICS

ORDER STATISTICS Let X1, X2, …, Xn be a random sample (r.s.) of size n from a distribution of continuous type having pdf f(x), a < x < b. Let X(1) be the smallest of the Xi, X(2) the second smallest, …, and X(n) the largest. X(i) is called the i-th order statistic.

ORDER STATISTICS It is often useful to consider the ordered sample. Example: suppose a r.s. of five light bulbs is tested and the failure times are observed as (5, 11, 4, 100, 17). These will actually occur in the order (4, 5, 11, 17, 100). Interest might be in the kth smallest ordered observation, e.g. stopping the experiment after the kth failure. We might also be interested in joint distributions of two or more order statistics, or in functions of them (e.g. range = max - min).
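The ordering in this example amounts to sorting the sample. A minimal sketch in plain Python, using the five failure times from the slide (the variable names are illustrative, not from the slides):

```python
# Order statistics are obtained by sorting the sample.
# Failure times from the light-bulb example:
failure_times = [5, 11, 4, 100, 17]

order_stats = sorted(failure_times)   # X(1) <= X(2) <= ... <= X(n)
smallest = order_stats[0]             # X(1): first failure observed
largest = order_stats[-1]             # X(n): last failure observed
sample_range = largest - smallest     # range = max - min

print(order_stats)    # [4, 5, 11, 17, 100]
print(sample_range)   # 96
```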

ORDER STATISTICS If X1, X2, …, Xn is a r.s. of size n from a population with continuous pdf f(x), then the joint pdf of the order statistics X(1), X(2), …, X(n) is

f(y1, y2, …, yn) = n! f(y1) f(y2) ⋯ f(yn), for a < y1 < y2 < … < yn < b (0 otherwise).

Order statistics are not independent, and the joint pdf of the ordered sample is not the same as the joint pdf of the unordered sample. Future reference: for discrete distributions, we need to take ties into account (two X's being equal); see Casella and Berger, 1990, p. 231.

Example Suppose that X1, X2, X3 is a r.s. from a population with pdf f(x) = 2x for 0 < x < 1. Find the joint pdf of the order statistics and the marginal pdf of the smallest order statistic.
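A Monte Carlo sanity check of this example, as a sketch in plain Python (the sampling scheme and names are mine, not from the slides): since F(x) = x^2 on (0, 1), we can draw X = sqrt(U) with U ~ Unif(0,1) by the inverse-cdf method, take the minimum of three draws, and compare the empirical cdf of the minimum at one point with the analytic value F_min(x) = 1 - (1 - x^2)^3.

```python
import random
import math

# Inverse-cdf sampling for f(x) = 2x on (0,1): F(x) = x^2, so X = sqrt(U).
# For n = 3, the cdf of the minimum is F_min(x) = 1 - (1 - x^2)^3.
random.seed(42)

def sample_min(n=3):
    return min(math.sqrt(random.random()) for _ in range(n))

trials = 200_000
x = 0.5
empirical = sum(sample_min() <= x for _ in range(trials)) / trials
analytic = 1 - (1 - x**2) ** 3   # = 0.578125 at x = 0.5

print(round(empirical, 3), analytic)
```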

ORDER STATISTICS The Maximum Order Statistic: X(n)

F_{X(n)}(y) = P(X(n) ≤ y) = P(all Xi ≤ y) = [F(y)]^n,
f_{X(n)}(y) = n [F(y)]^(n-1) f(y).

ORDER STATISTICS The Minimum Order Statistic: X(1)

F_{X(1)}(y) = P(X(1) ≤ y) = 1 - P(all Xi > y) = 1 - [1 - F(y)]^n,
f_{X(1)}(y) = n [1 - F(y)]^(n-1) f(y).
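Both formulas can be checked by simulation. The sketch below (plain Python; the Exp(1) example is my choice, not the slide's) compares the empirical cdfs of the maximum and minimum of n = 5 Exp(1) draws against [F(y)]^n and 1 - [1 - F(y)]^n:

```python
import random
import math

# Check F_max(y) = F(y)^n and F_min(y) = 1 - (1 - F(y))^n
# for n = 5 iid Exp(1) variables, where F(y) = 1 - e^(-y).
random.seed(0)
n, trials, y = 5, 200_000, 1.0
F = 1 - math.exp(-y)

hits_max = hits_min = 0
for _ in range(trials):
    xs = [random.expovariate(1.0) for _ in range(n)]
    hits_max += max(xs) <= y
    hits_min += min(xs) <= y

print(round(hits_max / trials, 3), round(F ** n, 3))            # empirical vs F(y)^n
print(round(hits_min / trials, 3), round(1 - (1 - F) ** n, 3))  # empirical vs 1-(1-F(y))^n
```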

Example Same example, but now using the previous formulas (without taking the integrals): suppose that X1, X2, X3 is a r.s. from a population with pdf f(x) = 2x for 0 < x < 1. Find the marginal pdf of the smallest order statistic.

ORDER STATISTICS The k-th Order Statistic: X(k)

Place the value y on the axis: k - 1 of the observations fall below y, one falls at y, and n - k fall above y. The number of possible orderings is n!/[(k - 1)! 1! (n - k)!], with probability F(y) = P(X < y) for each of the k - 1 smaller values, 1 - F(y) = P(X > y) for each of the n - k larger values, and density f(y) at y itself. Hence

f_{X(k)}(y) = n!/[(k - 1)!(n - k)!] [F(y)]^(k-1) [1 - F(y)]^(n-k) f(y).

Example X ~ Uniform(0,1). A r.s. of size n is taken. Find the pdf of the kth order statistic. Solution: let Yk be the kth order statistic. Since F(y) = y and f(y) = 1 on (0, 1),

f_{Yk}(y) = n!/[(k - 1)!(n - k)!] y^(k-1) (1 - y)^(n-k), 0 < y < 1,

i.e. Yk ~ Beta(k, n - k + 1).
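Since Yk ~ Beta(k, n - k + 1), its mean is k/(n + 1). A quick Monte Carlo check in plain Python (the choices n = 10, k = 3 are illustrative):

```python
import random

# The k-th order statistic of n Unif(0,1) draws is Beta(k, n-k+1),
# so its mean is k / (n + 1).
random.seed(1)
n, k, trials = 10, 3, 100_000

total = 0.0
for _ in range(trials):
    xs = sorted(random.random() for _ in range(n))
    total += xs[k - 1]            # k-th smallest (1-indexed)

empirical_mean = total / trials
print(round(empirical_mean, 3), k / (n + 1))  # both close to 3/11 = 0.2727...
```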

ORDER STATISTICS Joint pdf of the k-th and j-th Order Statistics (for k < j)

Place yk and yj on the axis with yk < yj: k - 1 observations fall below yk, one falls at yk, j - k - 1 fall between yk and yj, one falls at yj, and n - j fall above yj. The number of possible orderings is n!/[(k - 1)! 1! (j - k - 1)! 1! (n - j)!], with probabilities F(yk) = P(X < yk), F(yj) - F(yk) = P(yk < X < yj), and 1 - F(yj) = P(X > yj), and densities f(yk) and f(yj). Hence, for yk < yj,

f(yk, yj) = n!/[(k - 1)!(j - k - 1)!(n - j)!] [F(yk)]^(k-1) [F(yj) - F(yk)]^(j-k-1) [1 - F(yj)]^(n-j) f(yk) f(yj).

LIMITING DISTRIBUTIONS

Motivation The pdf of a rv often depends on the sample size n. If X1, X2, …, Xn is a sequence of rvs and Yn = u(X1, X2, …, Xn) is a function of them, sometimes it is possible to find the exact distribution of Yn (what we have been doing lately). However, sometimes it is only possible to obtain approximate results when n is large: limiting distributions.

CONVERGENCE IN DISTRIBUTION Let X1, X2, …, Xn be a sequence of rvs and Yn = u(X1, X2, …, Xn) a function of them with cdf Fn(y), n = 1, 2, …, such that, for each point y at which F(y) is continuous,

lim_{n→∞} Fn(y) = F(y).

Then the sequence Yn is said to converge in distribution to Y (a rv with cdf F).

CONVERGENCE IN DISTRIBUTION Theorem: if lim_{n→∞} Fn(y) = F(y) for every point y at which F(y) is continuous, then Yn is said to have a limiting distribution with cdf F(y). The definition of convergence in distribution requires only that the limiting function agree with the cdf at its points of continuity.

CONVERGENCE IN DISTRIBUTION The terms "limiting distribution" and "asymptotic distribution" are used interchangeably.

Example Let X1,…, Xn be a random sample from Unif(0,1). Find the limiting distribution of the max order statistic, if it exists.

Example Let {Xn} be a sequence of rvs with a given pmf. Find the limiting distribution of Xn, if it exists.

Example Let Yn be the nth order statistic of a random sample X1, …, Xn from Unif(0, θ). Find the limiting distribution of Zn = n(θ - Yn), if it exists.
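A simulation sketch for this example (plain Python; the parameter values are mine). For finite n, P(Zn ≤ z) = P(Yn ≥ θ - z/n) = 1 - (1 - z/(nθ))^n, which tends to 1 - e^(-z/θ), the Exponential cdf with mean θ, and the empirical cdf should be close to that limit:

```python
import random
import math

# Z_n = n*(theta - Y_n), where Y_n is the max of n Unif(0, theta) draws.
# The limiting cdf is P(Z <= z) = 1 - exp(-z / theta).
random.seed(7)
theta, n, trials, z = 2.0, 200, 20_000, 2.0

hits = 0
for _ in range(trials):
    y_n = max(random.uniform(0, theta) for _ in range(n))
    hits += n * (theta - y_n) <= z

limit_cdf = 1 - math.exp(-z / theta)
print(round(hits / trials, 3), round(limit_cdf, 3))  # empirical vs limiting cdf
```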

Example Suppose that X1, X2, …, Xn are iid from Exp(1). Find the limiting distribution of the max order statistic, if it exists.

LIMITING MOMENT GENERATING FUNCTIONS Let the rv Yn have an mgf Mn(t) that exists for all n. If

lim_{n→∞} Mn(t) = M(t) for all t in a neighborhood of 0,

where M(t) is itself an mgf, then Yn has the limiting distribution defined by M(t).

Example Let =np. The mgf of Poisson() The limiting distribution of Binomial rv is the Poisson distribution.

CONVERGENCE IN PROBABILITY (STOCHASTIC CONVERGENCE) A rv Yn converges in probability to a rv Y if, for every ε > 0,

lim_{n→∞} P(|Yn - Y| ≥ ε) = 0.

CHEBYSHEV'S INEQUALITY Chebyshev's Inequality can be used to prove stochastic convergence in many cases. Let X be an rv with E(X) = μ and V(X) = σ². Then, for any k > 0,

P(|X - μ| ≥ kσ) ≤ 1/k², or equivalently, for any ε > 0, P(|X - μ| ≥ ε) ≤ σ²/ε².

CONVERGENCE IN PROBABILITY (STOCHASTIC CONVERGENCE) Chebyshev's Inequality proves convergence in probability if the following three conditions are satisfied:
1. E(Yn) = μn;
2. μn → μ as n → ∞;
3. V(Yn) → 0 as n → ∞.
Then Yn converges in probability to μ.

Example Let X be an rv with E(X) = μ and V(X) = σ² < ∞. For a r.s. of size n, X̄ = (1/n) Σ Xi is the sample mean. Does X̄ converge in probability to μ? Yes: E(X̄) = μ and V(X̄) = σ²/n, so by Chebyshev's Inequality P(|X̄ - μ| ≥ ε) ≤ σ²/(nε²) → 0 as n → ∞.
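The Chebyshev bound σ²/(nε²) and the actual deviation probability can both be watched shrink as n grows. A plain-Python sketch using standard normal summands (μ = 0, σ² = 1; this particular distribution is my choice for illustration):

```python
import random
import statistics

# Chebyshev: P(|Xbar - mu| >= eps) <= sigma^2 / (n * eps^2).
# The bound -> 0 as n grows, so Xbar -> mu in probability.
random.seed(3)
mu, eps, trials = 0.0, 0.2, 5000
exceed_prob = {}

for n in (10, 100, 1000):
    exceed = 0
    for _ in range(trials):
        xbar = statistics.fmean(random.gauss(mu, 1.0) for _ in range(n))
        exceed += abs(xbar - mu) >= eps
    exceed_prob[n] = exceed / trials
    bound = 1.0 / (n * eps**2)   # sigma^2 = 1 here; vacuous (>1) for n = 10
    print(n, round(exceed_prob[n], 3), round(bound, 3))
```

Note that the bound is crude (it exceeds 1 for n = 10), but it still forces the probability to zero, which is all the convergence argument needs.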

WEAK LAW OF LARGE NUMBERS (WLLN) Let X1, X2, …, Xn be iid rvs with E(Xi) = μ and V(Xi) = σ² < ∞. Define X̄n = (1/n) Σ Xi. Then, for every ε > 0,

lim_{n→∞} P(|X̄n - μ| ≥ ε) = 0,

that is, X̄n converges in probability to μ. The WLLN states that the sample mean is a good estimate of the population mean: for large n, the sample mean is close to the population mean with high probability. But more to come on the properties of estimators.

STRONG LAW OF LARGE NUMBERS Let X1, X2, …, Xn be iid rvs with E(Xi) = μ and V(Xi) = σ² < ∞. Define X̄n = (1/n) Σ Xi. Then

P( lim_{n→∞} X̄n = μ ) = 1,

that is, X̄n converges almost surely to μ.

Relation between convergences Almost-sure convergence is the strongest:

convergence almost surely ⇒ convergence in probability ⇒ convergence in distribution

(the reverse implications are generally not true).

THE CENTRAL LIMIT THEOREM Let X1, X2, …, Xn be a sequence of iid rvs with E(Xi) = μ and V(Xi) = σ² < ∞. Define X̄n = (1/n) Σ Xi. Then

√n (X̄n - μ)/σ → N(0, 1) in distribution,

or, equivalently, (Σ Xi - nμ)/(σ√n) → N(0, 1) in distribution. A proof can be found in many books, e.g. Bain and Engelhardt, 1992, page 239.
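A quick empirical check of the CLT in plain Python: standardize the mean of n Exp(1) draws (μ = σ = 1) and compare P(Zn ≤ 1) with Φ(1) ≈ 0.8413. The Exp(1) choice and the sample sizes are illustrative:

```python
import random
import math

# CLT check: Z_n = sqrt(n) * (Xbar - mu) / sigma should be close to N(0,1).
# Exp(1) summands have mu = sigma = 1.
random.seed(5)
n, trials = 100, 50_000

hits = 0
for _ in range(trials):
    xbar = sum(random.expovariate(1.0) for _ in range(n)) / n
    hits += math.sqrt(n) * (xbar - 1.0) <= 1.0

phi_1 = 0.5 * (1 + math.erf(1 / math.sqrt(2)))   # standard normal cdf at 1
print(round(hits / trials, 3), round(phi_1, 3))  # empirical vs Phi(1)
```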

Example Let X1, X2, …, Xn be iid rvs from Unif(0, 1), with Yn defined as a function of them. Find the approximate distribution of Yn, and approximate values of quantities such as the 90th percentile of Yn.

SLUTSKY'S THEOREM If Xn → X in distribution and Yn → a, a constant, in probability, then a) YnXn → aX in distribution; b) Xn + Yn → X + a in distribution.

SOME THEOREMS ON LIMITING DISTRIBUTIONS If Xn → c > 0 in probability, then for any function g(x) continuous at c, g(Xn) → g(c) in probability; e.g. √Xn → √c in probability. If Xn → c in probability and Yn → d in probability, then:
aXn + bYn → ac + bd in probability;
XnYn → cd in probability;
1/Xn → 1/c in probability for all c ≠ 0.

Example X ~ Gamma(α, 1). Show that
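The statement to be shown is cut off in the transcript; a standard limit of this type is that (X - α)/√α → N(0, 1) as α → ∞, since Gamma(α, 1) has mean α and variance α. That reading is an assumption here, not the slide's text. For integer α, X is a sum of α iid Exp(1) variables, so the claim is just the CLT; a plain-Python sketch checks P(Z ≤ 0) against Φ(0) = 0.5:

```python
import random
import math

# Assumed claim: (X - alpha) / sqrt(alpha) -> N(0,1) for X ~ Gamma(alpha, 1).
# For integer alpha, X = sum of alpha iid Exp(1) draws.
random.seed(9)
alpha, trials = 100, 20_000

hits = 0
for _ in range(trials):
    x = sum(random.expovariate(1.0) for _ in range(alpha))
    hits += (x - alpha) / math.sqrt(alpha) <= 0.0

print(round(hits / trials, 2))   # close to 0.5 (slight skew remains at finite alpha)
```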