Week 3: The Likelihood Function - Introduction




Slide 1 (week 3): The Likelihood Function - Introduction

Recall: a statistical model for some data is a set of distributions, one of which corresponds to the true unknown distribution that produced the data. The distribution f_θ can be either a probability density function or a probability mass function. The joint probability density function or probability mass function of iid random variables X_1, …, X_n is

f_θ(x_1, …, x_n) = f_θ(x_1) · f_θ(x_2) ⋯ f_θ(x_n) = ∏_{i=1}^{n} f_θ(x_i).
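As a quick numerical illustration of the product form above (not from the slides; the Bernoulli model here is just an assumed example), the joint pmf of an iid sample is computed by multiplying the marginal pmfs:

```python
def bernoulli_pmf(x, theta):
    """Marginal pmf f_theta(x) of a single Bernoulli(theta) observation."""
    return theta if x == 1 else 1.0 - theta

def joint_pmf(xs, theta):
    """Joint pmf of iid observations: the product of the marginal pmfs."""
    p = 1.0
    for x in xs:
        p *= bernoulli_pmf(x, theta)
    return p

# For the sample (1, 0, 1) and theta = 0.5, the joint pmf is 0.5^3 = 0.125.
print(joint_pmf([1, 0, 1], 0.5))  # 0.125
```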

Slide 2 (week 3): The Likelihood Function

Let x_1, …, x_n be sample observations taken on corresponding random variables X_1, …, X_n whose distribution depends on a parameter θ. The likelihood function, defined on the parameter space Ω, is given by

L(θ | x_1, …, x_n) = f_θ(x_1, …, x_n) = ∏_{i=1}^{n} f_θ(x_i)   (for iid observations).

Note that for the likelihood function we are fixing the data x_1, …, x_n and varying the value of the parameter. The value L(θ | x_1, …, x_n) is called the likelihood of θ. It is the probability (or density) of observing the data values we observed, given that θ is the true value of the parameter. It is not the probability of θ given that we observed x_1, …, x_n.

Slide 3 (week 3): Examples

Suppose we toss a coin n = 10 times and observe 4 heads. With no knowledge whatsoever about the probability of getting a head on a single toss, the appropriate statistical model for the data is the Binomial(10, θ) model. The likelihood function is given by

L(θ | 4) = C(10, 4) θ^4 (1 − θ)^6,   θ ∈ [0, 1].

Suppose X_1, …, X_n is a random sample from an Exponential(θ) distribution (taking θ as the rate parameter, so f_θ(x) = θ e^{−θx}). The likelihood function is

L(θ | x_1, …, x_n) = θ^n e^{−θ Σ x_i},   θ > 0.
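A minimal sketch of both examples (the rate parameterization of the exponential is an assumption, as noted above). Evaluating the binomial likelihood over a grid of θ values confirms it is largest at θ = 4/10, the sample proportion:

```python
from math import comb, exp

def binom_likelihood(theta, n=10, k=4):
    """L(theta | data): probability of k heads in n tosses; data fixed, theta varying."""
    return comb(n, k) * theta**k * (1 - theta)**(n - k)

# The likelihood is maximized at theta = k/n = 0.4 (the MLE).
grid = [i / 100 for i in range(1, 100)]
best = max(grid, key=binom_likelihood)
print(best)  # 0.4

def exp_likelihood(theta, xs):
    """L(theta | x_1, ..., x_n) for an iid Exponential sample with rate theta."""
    return theta**len(xs) * exp(-theta * sum(xs))
```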

Slide 4 (week 3): Sufficiency - Introduction

A statistic that summarizes all the information in the sample about the target parameter is called a sufficient statistic. An estimator is sufficient if we get as much information about θ from it as we would from the entire sample X_1, …, X_n. A sufficient statistic T(x_1, …, x_n) for a model is any function of the data x_1, …, x_n such that once we know the value of T(x_1, …, x_n), we can determine the likelihood function.

Slide 5 (week 3): Sufficient Statistic

A sufficient statistic is a function T(x_1, …, x_n) defined on the sample space such that, whenever T(x_1, …, x_n) = T(y_1, …, y_n), then

L(θ | x_1, …, x_n) = c · L(θ | y_1, …, y_n)

for some constant c > 0 that may depend on the data points but not on θ. Typically, T(x_1, …, x_n) will be of lower dimension than x_1, …, x_n, so replacing x_1, …, x_n by T(x_1, …, x_n) is a data reduction, and this simplifies the analysis. Example…
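A small check of this definition, assuming the exponential (rate θ) model: since the likelihood θ^n e^{−θ Σ x_i} depends on the data only through T(x) = Σ x_i, two samples with the same size and sum give proportional likelihood functions (here with constant c = 1):

```python
from math import exp, isclose

def exp_likelihood(theta, xs):
    """Likelihood of an iid Exponential(rate theta) sample; depends on xs only via sum(xs)."""
    return theta**len(xs) * exp(-theta * sum(xs))

x = [0.5, 1.5, 3.0]   # T(x) = sum = 5.0
y = [2.0, 2.0, 1.0]   # T(y) = sum = 5.0, same value of the statistic

# Same T-value => identical likelihood functions at every theta.
for theta in (0.2, 1.0, 2.5):
    assert isclose(exp_likelihood(theta, x), exp_likelihood(theta, y))
```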

Slide 6 (week 3): Minimal Sufficient Statistics

A minimal sufficient statistic T for a model is any sufficient statistic such that once we know a likelihood function L(θ | x_1, …, x_n) for the model and data, we can determine T(x_1, …, x_n). A relevant likelihood function can always be obtained from the value of any sufficient statistic T, but if T is minimal sufficient as well, then we can also obtain the value of T from any likelihood function. It can be shown that a minimal sufficient statistic gives the maximal reduction of the data. Example…

Slide 7 (week 3): Alternative Definition of Sufficient Statistic

Let X_1, …, X_n be a random sample from a distribution with unknown parameter θ. The statistic T(x_1, …, x_n) is said to be sufficient for θ if the conditional distribution of X_1, …, X_n given T does not depend on θ. This definition is much harder to work with, as the conditional distribution of the sample X_1, …, X_n given the sufficient statistic T is often hard to derive.

Slide 8 (week 3): Factorization Theorem

Let T be a statistic based on a random sample X_1, …, X_n. Then T is a sufficient statistic for θ if

f_θ(x_1, …, x_n) = g(T(x_1, …, x_n), θ) · h(x_1, …, x_n),

i.e. if the likelihood function can be factored into two nonnegative functions: one that depends on T(x_1, …, x_n) and θ, and one that depends only on the data x_1, …, x_n. Proof:
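A numerical sketch of the factorization for an assumed iid Bernoulli(θ) sample: the joint pmf factors as g(T(x), θ) · h(x) with T(x) = Σ x_i, g(t, θ) = θ^t (1 − θ)^{n−t}, and h(x) = 1, so T is sufficient for θ:

```python
def joint_pmf(xs, theta):
    """Joint pmf of an iid Bernoulli(theta) sample."""
    p = 1.0
    for x in xs:
        p *= theta if x == 1 else 1.0 - theta
    return p

def g(t, theta, n):
    """Factor depending on the data only through T(x) = sum(x)."""
    return theta**t * (1 - theta)**(n - t)

def h(xs):
    """Factor depending only on the data (here identically 1)."""
    return 1.0

xs = [1, 0, 1, 1, 0]
# The factorization holds at every theta.
for theta in (0.2, 0.5, 0.8):
    assert abs(joint_pmf(xs, theta) - g(sum(xs), theta, len(xs)) * h(xs)) < 1e-12
```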

Slide 9 (week 3): Examples

Slide 10 (week 3): Minimum Variance Unbiased Estimator

The MVUE for θ is the unbiased estimator with the smallest possible variance: we look amongst all unbiased estimators for the one with the smallest variance.

Slide 11 (week 3): The Rao-Blackwell Theorem

Let θ̂ be an unbiased estimator for θ such that Var(θ̂) < ∞. If T is a sufficient statistic for θ, define θ* = E(θ̂ | T). Then, for all θ,

E(θ*) = θ   and   Var(θ*) ≤ Var(θ̂).

Proof:
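A simulation sketch of the theorem, using an assumed Bernoulli(θ) example (not from the slides): θ̂ = X_1 is unbiased for θ, T = Σ X_i is sufficient, and conditioning gives E(X_1 | T) = T/n, the sample mean, which is also unbiased but has much smaller variance:

```python
import random

random.seed(0)
theta, n, reps = 0.3, 20, 5000

x1_vals, rb_vals = [], []
for _ in range(reps):
    sample = [1 if random.random() < theta else 0 for _ in range(n)]
    x1_vals.append(sample[0])          # crude unbiased estimator theta_hat = X_1
    rb_vals.append(sum(sample) / n)    # Rao-Blackwellized estimator E(X_1 | T) = T/n

def var(vals):
    """Plain (population) variance of a list of values."""
    m = sum(vals) / len(vals)
    return sum((v - m) ** 2 for v in vals) / len(vals)

# Both estimators average close to theta, but the conditioned one varies far less.
print(var(x1_vals), var(rb_vals))
```

With the seed fixed, the empirical variance of the Rao-Blackwellized estimator comes out roughly 1/n times that of X_1, matching Var(X̄) = θ(1 − θ)/n.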

