Order Statistics

The order statistics of a set of random variables X_1, X_2, …, X_n are the same random variables arranged in increasing order. Denote by
X_(1) = smallest of X_1, X_2, …, X_n
X_(2) = 2nd smallest of X_1, X_2, …, X_n
…
X_(n) = largest of X_1, X_2, …, X_n.
Note that even if the X_i's are independent, the X_(i)'s cannot be independent, since X_(1) ≤ X_(2) ≤ … ≤ X_(n). The distributions of the X_i's and of the X_(i)'s are NOT the same.
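As a quick illustration (my addition, not from the slides), the observed order statistics of a simulated sample are simply the sorted sample values. A minimal sketch, assuming NumPy is available:

    import numpy as np

    rng = np.random.default_rng(0)
    x = rng.uniform(0.0, 1.0, size=5)      # X_1, ..., X_5, i.i.d. Uniform(0,1)
    x_order = np.sort(x)                   # X_(1) <= X_(2) <= ... <= X_(5)
    print("sample:          ", x)
    print("order statistics:", x_order)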
Distribution of the Largest Order Statistic X_(n)

Suppose X_1, X_2, …, X_n are i.i.d. random variables with common distribution function F_X(x) and common density function f_X(x). The CDF of the largest order statistic, X_(n), is given by
F_{X_(n)}(x) = P(X_(n) ≤ x) = P(X_1 ≤ x, …, X_n ≤ x) = [F_X(x)]^n.
The density function of X_(n) is then
f_{X_(n)}(x) = n·[F_X(x)]^(n−1)·f_X(x).
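A minimal simulation sketch (my addition, assuming NumPy) that checks the CDF formula for Exponential(1) data, where F_X(t) = 1 − e^(−t):

    import numpy as np

    rng = np.random.default_rng(1)
    n, reps = 5, 100_000
    x = rng.exponential(1.0, size=(reps, n))   # each row: an i.i.d. Exponential(1) sample
    maxima = x.max(axis=1)                     # X_(n) for each replication
    t = 2.0
    empirical = (maxima <= t).mean()           # Monte Carlo estimate of P(X_(n) <= t)
    theoretical = (1.0 - np.exp(-t)) ** n      # [F_X(t)]^n
    print(empirical, theoretical)              # the two values should be close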
Example

Suppose X_1, X_2, …, X_n are i.i.d. Uniform(0,1) random variables. Find the density function of X_(n).
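A worked solution (my addition), using the density formula from the previous slide with F_X(x) = x and f_X(x) = 1 on (0, 1):

    f_{X_(n)}(x) = n·[F_X(x)]^(n−1)·f_X(x) = n·x^(n−1),   0 < x < 1,

that is, X_(n) has a Beta(n, 1) distribution.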
Distribution of the Smallest Order Statistic X_(1)

Suppose X_1, X_2, …, X_n are i.i.d. random variables with common distribution function F_X(x) and common density function f_X(x). The CDF of the smallest order statistic, X_(1), is given by
F_{X_(1)}(x) = P(X_(1) ≤ x) = 1 − P(X_1 > x, …, X_n > x) = 1 − [1 − F_X(x)]^n.
The density function of X_(1) is then
f_{X_(1)}(x) = n·[1 − F_X(x)]^(n−1)·f_X(x).
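A classic consequence (my addition, as a worked example of the formula): if the X_i are Exponential with rate λ, then 1 − F_X(x) = e^(−λx), so

    F_{X_(1)}(x) = 1 − [e^(−λx)]^n = 1 − e^(−nλx),   x > 0,

i.e. the minimum of n i.i.d. Exponential(λ) random variables is itself Exponential with rate nλ.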
Example

Suppose X_1, X_2, …, X_n are i.i.d. Uniform(0,1) random variables. Find the density function of X_(1).
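A worked solution (my addition), again with F_X(x) = x and f_X(x) = 1 on (0, 1):

    f_{X_(1)}(x) = n·[1 − F_X(x)]^(n−1)·f_X(x) = n·(1 − x)^(n−1),   0 < x < 1,

that is, X_(1) has a Beta(1, n) distribution.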
Distribution of the kth Order Statistic X_(k)

Suppose X_1, X_2, …, X_n are i.i.d. random variables with common distribution function F_X(x) and common density function f_X(x). The density function of X_(k) is
f_{X_(k)}(x) = [n! / ((k−1)!·(n−k)!)]·[F_X(x)]^(k−1)·[1 − F_X(x)]^(n−k)·f_X(x).
Heuristically, k − 1 of the observations must fall below x, n − k must fall above x, and one must fall at x; the factorial term counts the ways of assigning the n observations to these roles.
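A crude numerical check (my addition, assuming NumPy): simulate the kth order statistic of Uniform(0,1) samples and compare the empirical probability P(X_(k) ≤ t) with a Riemann sum of the density above.

    import numpy as np
    from math import factorial

    rng = np.random.default_rng(2)
    n, k, reps = 7, 3, 100_000
    kth = np.sort(rng.uniform(size=(reps, n)), axis=1)[:, k - 1]   # kth smallest per row

    # density of X_(k) for Uniform(0,1): F_X(x) = x, f_X(x) = 1 on (0, 1)
    grid = np.linspace(0.0, 1.0, 1001)
    coef = factorial(n) / (factorial(k - 1) * factorial(n - k))
    density = coef * grid ** (k - 1) * (1.0 - grid) ** (n - k)

    t = 0.4
    empirical = (kth <= t).mean()
    theoretical = density[grid <= t].sum() * (grid[1] - grid[0])   # crude Riemann sum
    print(round(empirical, 3), round(theoretical, 3))              # should agree closely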
Example

Suppose X_1, X_2, …, X_n are i.i.d. Uniform(0,1) random variables. Find the density function of X_(k).
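A worked solution (my addition), once more with F_X(x) = x and f_X(x) = 1 on (0, 1):

    f_{X_(k)}(x) = [n! / ((k−1)!·(n−k)!)]·x^(k−1)·(1 − x)^(n−k),   0 < x < 1,

which is the Beta(k, n − k + 1) density; the two previous examples are the special cases k = n and k = 1.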
Statistical Model

A statistical model for some data is a set of distributions, one of which corresponds to the true unknown distribution that produced the data. The statistical model corresponds to the information a statistician brings to the application about what the true distribution is, or at least what he or she is willing to assume about it. Formally, the model is a collection {f_θ : θ ∈ Ω} of distributions (or density functions); the variable θ is called the parameter of the model, and the set Ω is called the parameter space. From the definition of a statistical model, we see that there is a unique value θ ∈ Ω such that f_θ is the true distribution that generated the data. We refer to this value as the true parameter value.
Examples

Suppose there are two manufacturing plants for machines. It is known that the life lengths of machines built by the first plant have an Exponential(1) distribution, while machines manufactured by the second plant have life lengths distributed Exponential(1.5). You have purchased five of these machines, and you know that all five came from the same plant but do not know which plant. Further, you observe the life lengths of these machines, obtaining a sample (x_1, …, x_5), and want to make inference about the true distribution of the life lengths of these machines.

Suppose we have observations of heights in cm of individuals in a population, and we feel that it is reasonable to assume that the distribution of height in the population is normal with some unknown mean and variance. The statistical model in this case is {N(μ, σ²) : θ = (μ, σ²) ∈ Ω}, where Ω = R × R^+ and R^+ = (0, ∞).
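A toy sketch (my addition) of the first example's model as a set of candidate distributions indexed by θ; the lifetimes below are hypothetical, and the parameter is treated as a rate (the slides may intend the mean instead):

    import numpy as np

    model = {
        1.0: lambda x: 1.0 * np.exp(-1.0 * x),   # Exponential(1) density
        1.5: lambda x: 1.5 * np.exp(-1.5 * x),   # Exponential(1.5) density
    }
    x = np.array([0.2, 1.1, 0.7, 2.3, 0.4])      # hypothetical observed life lengths
    for theta, f in model.items():
        print(theta, f(x))                       # each candidate density evaluated at the data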
Point Estimate

Most statistical procedures involve estimation of the unknown value of the parameter of the statistical model. A point estimate, θ̂, is an estimate of the parameter θ. It is a statistic based on the sample and therefore it is a random variable with a distribution function. The standard deviation of the sampling distribution of an estimator is usually called the standard error of the estimator. For a given statistical model with unknown parameter θ there can be more than one point estimate. The parameter θ of a statistical model can also have more than one component, e.g. θ = (μ, σ²).
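A minimal sketch (my addition, assuming NumPy) of a point estimate and its estimated standard error, using the sample mean as the estimator of a population mean:

    import numpy as np

    rng = np.random.default_rng(3)
    sample = rng.normal(loc=10.0, scale=2.0, size=25)       # hypothetical data
    theta_hat = sample.mean()                               # point estimate of the mean
    std_error = sample.std(ddof=1) / np.sqrt(len(sample))   # estimated standard error
    print(theta_hat, std_error)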
Properties of Point Estimators

Let θ̂ be a point estimator for a parameter θ. Then θ̂ is an unbiased estimator if E(θ̂) = θ. The bias of a point estimator is given by Bias(θ̂) = E(θ̂) − θ. The variance of a point estimator is Var(θ̂) = E[(θ̂ − E(θ̂))²]. Ideally we would like our estimator to have minimum variance.
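A Monte Carlo sketch (my addition, assuming NumPy) of these quantities for the sample mean of Normal(μ, σ²) data, whose bias is 0 and whose variance is σ²/n:

    import numpy as np

    rng = np.random.default_rng(4)
    mu, sigma, n, reps = 5.0, 2.0, 10, 200_000
    estimates = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)   # sample mean, repeated
    print("bias     ~", estimates.mean() - mu)                       # close to 0
    print("variance ~", estimates.var(), "vs", sigma ** 2 / n)       # close to sigma^2 / n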
Mean Square Error of Point Estimators

The mean square error (MSE) of a point estimator θ̂ is
MSE(θ̂) = E[(θ̂ − θ)²].
Claim: MSE(θ̂) = Var(θ̂) + [Bias(θ̂)]².
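A short derivation of the claim (my addition), writing m = E(θ̂):

    MSE(θ̂) = E[(θ̂ − θ)²]
            = E[((θ̂ − m) + (m − θ))²]
            = E[(θ̂ − m)²] + 2(m − θ)·E[θ̂ − m] + (m − θ)²
            = Var(θ̂) + [Bias(θ̂)]²,

since E[θ̂ − m] = 0 and m − θ = Bias(θ̂).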
Common Point Estimators

A natural estimate of the population mean μ is the sample mean X̄ = (1/n)·Σ X_i (for any distribution). The sample mean is an unbiased estimator of the population mean. A common estimator of the population variance σ² is the sample variance s² = [1/(n − 1)]·Σ (X_i − X̄)².
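A small note in code form (my addition): NumPy's var divides by n by default, so computing the sample variance s² with the n − 1 divisor requires ddof=1.

    import numpy as np

    x = np.array([4.1, 5.3, 3.8, 6.0, 5.1])   # hypothetical sample
    xbar = x.mean()                           # sample mean
    s2 = x.var(ddof=1)                        # sample variance, divides by n - 1
    s2_n = x.var()                            # divides by n instead (biased for sigma^2)
    print(xbar, s2, s2_n)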
Claim

Let X_1, X_2, …, X_n be a random sample of size n from a normal population. The sample variance s² is an unbiased estimator of the population variance σ². Proof…
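One way to fill in the proof (my addition); note that this step only uses E(X_i) = μ, Var(X_i) = σ², and Var(X̄) = σ²/n, so normality is not actually needed for unbiasedness:

    E[Σ (X_i − X̄)²] = E[Σ X_i² − n·X̄²]
                     = Σ E[X_i²] − n·E[X̄²]
                     = n(σ² + μ²) − n(σ²/n + μ²)
                     = (n − 1)·σ²,

and therefore E[s²] = E[Σ (X_i − X̄)² / (n − 1)] = σ².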
Example

Suppose X_1, X_2, …, X_n is a random sample from the Uniform(0, θ) distribution, and let θ̂ = X_(n), the largest order statistic. Find the density of θ̂ and its mean. Is θ̂ unbiased?
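A worked solution (my addition), assuming the estimator is indeed θ̂ = X_(n): here F_X(x) = x/θ and f_X(x) = 1/θ on (0, θ), so

    f_{θ̂}(x) = n·(x/θ)^(n−1)·(1/θ) = n·x^(n−1)/θ^n,   0 < x < θ,

and E(θ̂) = ∫₀^θ x·n·x^(n−1)/θ^n dx = nθ/(n + 1) ≠ θ. Hence θ̂ is not unbiased; on average it underestimates θ.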
Asymptotically Unbiased Estimators

An estimator θ̂_n (based on a sample of size n) is asymptotically unbiased if its bias goes to 0 as n → ∞, i.e. if lim_{n→∞} E(θ̂_n) = θ.
Example: for θ̂ = X_(n) in the Uniform(0, θ) example above, E(θ̂) = nθ/(n + 1) → θ as n → ∞, so θ̂ is asymptotically unbiased even though it is biased for every fixed n.
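A simulation sketch (my addition, assuming NumPy and that the example estimator is X_(n)): the average of X_(n) over many Uniform(0, θ) samples approaches θ as n grows.

    import numpy as np

    rng = np.random.default_rng(5)
    theta, reps = 3.0, 50_000
    for n in (2, 5, 20, 100):
        est = rng.uniform(0.0, theta, size=(reps, n)).max(axis=1)   # X_(n) per replication
        print(n, est.mean())   # approaches theta = 3.0 (theory: n*theta/(n+1))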
Relative Efficiency

Given two estimators θ̂_1 and θ̂_2 of a parameter θ, with variances Var(θ̂_1) and Var(θ̂_2) respectively, the efficiency of θ̂_1 relative to θ̂_2 is the ratio
eff(θ̂_1, θ̂_2) = Var(θ̂_2) / Var(θ̂_1).
Interpretation: if this ratio is greater than 1, then θ̂_1 has the smaller variance and is the more efficient of the two; if it is less than 1, θ̂_2 is more efficient.
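A quick illustration (my addition): to estimate a population mean μ from a sample with variance σ², compare θ̂_1 = X̄ with θ̂_2 = X_1 (a single observation). Then Var(θ̂_1) = σ²/n and Var(θ̂_2) = σ², so eff(θ̂_1, θ̂_2) = σ² / (σ²/n) = n, i.e. the sample mean is n times as efficient as a single observation.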
Example

In the uniform example above, let θ̂_1 and θ̂_2 be two estimators of θ. Which point estimator is more efficient?
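An illustrative comparison (my addition; the slide does not specify the two estimators here, so the choices below are assumptions): take θ̂_1 = ((n + 1)/n)·X_(n) and θ̂_2 = 2·X̄, both of which are unbiased for θ under Uniform(0, θ). A Monte Carlo sketch, assuming NumPy:

    import numpy as np

    rng = np.random.default_rng(6)
    theta, n, reps = 3.0, 10, 200_000
    x = rng.uniform(0.0, theta, size=(reps, n))
    theta1 = (n + 1) / n * x.max(axis=1)     # (n+1)/n * X_(n)
    theta2 = 2.0 * x.mean(axis=1)            # 2 * sample mean (method of moments)
    v1, v2 = theta1.var(), theta2.var()
    print("Var(theta1) ~", v1, "  Var(theta2) ~", v2)
    print("efficiency of theta1 relative to theta2 ~", v2 / v1)   # well above 1

Theory gives Var(θ̂_1) = θ²/(n(n + 2)) and Var(θ̂_2) = θ²/(3n), so under these assumptions θ̂_1 is the more efficient estimator.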