POINT ESTIMATION AND INTERVAL ESTIMATION
DEFINITIONS
An estimator of a population parameter is a random variable that depends on the sample information and whose realizations provide approximations to this unknown parameter. A specific realization of that random variable is called an estimate.
A point estimator of a population parameter is a function of the sample information that yields a single number. The corresponding realization is called the point estimate of the parameter.
DEFINITIONS
POPULATION PARAMETER          ESTIMATOR                        ESTIMATE
Mean (μ)                      Sample mean x̄                    Observed value of x̄
Variance (σ²)                 Sample variance s²               Observed value of s²
Standard deviation (σ)        Sample standard deviation s      Observed value of s
Proportion (P)                Sample proportion p̂              Observed value of p̂
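As a small illustration (not from the slides; the sample values below are made up), each estimator in the table can be computed in Python with NumPy, yielding a single-number point estimate:

```python
import numpy as np

sample = np.array([12.1, 9.8, 11.4, 10.6, 12.9, 10.2, 11.7, 9.5])  # illustrative data

x_bar = sample.mean()            # point estimate of the population mean
s2 = sample.var(ddof=1)          # sample variance (divides by n - 1)
s = sample.std(ddof=1)           # sample standard deviation
p_hat = np.mean(sample > 11.0)   # sample proportion (here: share of values above 11)

print(f"x-bar = {x_bar:.3f}, s^2 = {s2:.3f}, s = {s:.3f}, p-hat = {p_hat:.3f}")
```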
PROPERTIES OF GOOD POINT ESTIMATORS
A good estimator should satisfy three conditions: unbiasedness, efficiency, and consistency.
Unbiased: The estimator θ̂ is said to be an unbiased estimator of the parameter θ if the mean of the sampling distribution of θ̂ is θ, that is, E(θ̂) = θ. In other words, the expected value of the estimator must be equal to the value of the parameter.
UNBIASEDNESS OF SOME ESTIMATORS
The sample mean, variance, and proportion are unbiased estimators of the corresponding population quantities. In general, the sample standard deviation is not an unbiased estimator of the population standard deviation.
Let θ̂ be an estimator of θ. The bias in θ̂ is defined as the difference between its mean and θ; that is, Bias(θ̂) = E(θ̂) − θ. It follows that the bias of an unbiased estimator is 0.
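A minimal simulation sketch (assuming a normal population; the parameter values are illustrative) of these facts: averaged over many samples, the sample mean and sample variance recover μ and σ², while the average of s falls slightly below σ:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 10.0, 2.0, 5, 200_000

samples = rng.normal(mu, sigma, size=(reps, n))
mean_of_xbar = samples.mean(axis=1).mean()          # close to mu = 10
mean_of_s2   = samples.var(axis=1, ddof=1).mean()   # close to sigma^2 = 4
mean_of_s    = samples.std(axis=1, ddof=1).mean()   # tends to fall below sigma = 2

print(mean_of_xbar, mean_of_s2, mean_of_s)
print("bias of s:", mean_of_s - sigma)              # bias = E[estimator] - parameter
```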
EFFICIENCY
Let θ̂₁ and θ̂₂ be two unbiased estimators of θ, based on the same number of sample observations. Then θ̂₁ is said to be more efficient than θ̂₂ if Var(θ̂₁) < Var(θ̂₂). The relative efficiency of one estimator with respect to the other is the ratio of their variances; that is,
Relative efficiency of θ̂₁ with respect to θ̂₂ = Var(θ̂₂) / Var(θ̂₁).
EFFICIENCY
Of two unbiased estimators, the one whose sampling distribution has the smaller variance is the more efficient estimator.
If θ̂ is an unbiased estimator of θ, and no other unbiased estimator has smaller variance, then θ̂ is said to be the most efficient, or minimum variance unbiased, estimator of θ.
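As a hedged illustration (simulated normal data; the sample median is used here only as a second unbiased estimator of the mean for comparison), the relative efficiency can be estimated by simulating both sampling distributions:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, reps = 0.0, 1.0, 25, 100_000

samples = rng.normal(mu, sigma, size=(reps, n))
var_mean   = samples.mean(axis=1).var()        # ~ sigma^2 / n = 0.04
var_median = np.median(samples, axis=1).var()  # ~ (pi/2) * sigma^2 / n, larger

print("Var(sample mean)  :", var_mean)
print("Var(sample median):", var_median)
print("relative efficiency of the mean w.r.t. the median:", var_median / var_mean)
```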
CHOICE OF POINT ESTIMATOR
There are estimation problems for which no unbiased estimator is very satisfactory and for which there may be much to be gained from accepting a little bias. One measure of the expected closeness of an estimator θ̂ to a parameter θ is its mean squared error, the expectation of the squared difference between the estimator and the parameter; that is, MSE(θ̂) = E[(θ̂ − θ)²]. It can be shown that MSE(θ̂) = Var(θ̂) + [Bias(θ̂)]².
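A short numerical check (simulated data, illustrative parameter values) of the decomposition MSE = variance + bias², applied to the variance estimator that divides by n rather than n − 1:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma2, n, reps = 4.0, 10, 200_000

samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
s2_n = samples.var(axis=1, ddof=0)          # divides by n: slightly biased downward

mse  = np.mean((s2_n - sigma2) ** 2)        # average squared error of the estimator
bias = s2_n.mean() - sigma2
var  = s2_n.var()

print("MSE:", mse, " Var + bias^2:", var + bias**2)   # the two agree
```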
CONSISTENCY
It is also desirable that an estimate tend to lie nearer the population characteristic as the sample size becomes larger. This is the basis of the property of consistency. An estimator θ̂ is a consistent estimator of a population characteristic θ if, the larger the sample size, the more likely it is that the estimate will be close to θ.
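A rough sketch of consistency (simulated normal data, with an arbitrary tolerance of 0.5): the probability that the sample mean lands within a fixed distance of μ rises toward 1 as n grows:

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, reps, tol = 5.0, 3.0, 10_000, 0.5

for n in (10, 100, 1000):
    xbars = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
    prob_close = np.mean(np.abs(xbars - mu) < tol)   # fraction of estimates near mu
    print(f"n={n:5d}  P(|x-bar - mu| < {tol}) ~ {prob_close:.3f}")
```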
INTERVAL ESTIMATION
An interval estimator for a population parameter is a rule for determining (based on sample information) a range, or interval, in which the parameter is likely to fall. The corresponding estimate is called an interval estimate.
Let θ be an unknown parameter. Suppose that, on the basis of sample information, we can find random variables A and B such that P(A < θ < B) = 1 − α. If the specific sample realizations of A and B are denoted by a and b, then the interval from a to b is called a 100(1−α)% confidence interval for θ. The quantity 100(1−α)% is called the probability content, or level of confidence, of the interval. If the population were repeatedly sampled a very large number of times, the parameter θ would be contained in 100(1−α)% of intervals calculated this way.
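The repeated-sampling interpretation can be illustrated with a small simulation (assuming a normal population with known σ; the values are made up): about 95% of the intervals constructed this way cover the true mean:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(4)
mu, sigma, n, reps, alpha = 50.0, 8.0, 30, 10_000, 0.05
z = norm.ppf(1 - alpha / 2)                      # 1.96 for a 95% interval

samples = rng.normal(mu, sigma, size=(reps, n))
xbar = samples.mean(axis=1)
half = z * sigma / np.sqrt(n)                    # half-width of each interval
covered = np.mean((xbar - half <= mu) & (mu <= xbar + half))

print("empirical coverage:", covered)            # close to 0.95
```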
ELEMENTS OF CONFIDENCE INTERVAL
A confidence interval has the general form: point estimate ± (reliability factor) × (standard error).
CONFIDENCE LIMITS FOR POPULATION MEAN
FACTORS AFFECTING INTERVAL WIDTH
The width of the interval increases with the data variation (σ), decreases as the sample size n grows, and increases with the level of confidence 100(1−α)%.
CONFIDENCE INTERVALS (σ KNOWN)
When the population is normal (or n is large) and the population standard deviation σ is known, the 100(1−α)% confidence interval for the mean μ is x̄ ± z_(α/2) · σ/√n.
CONFIDENCE INTERVALS (σ KNOWN), continued
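A minimal sketch of the computation (made-up data, σ assumed known, using SciPy for the normal critical value):

```python
import numpy as np
from scipy.stats import norm

data = np.array([101.2, 98.7, 100.5, 99.9, 102.3, 97.8, 100.1, 101.0])  # made-up sample
sigma = 1.5        # population standard deviation, assumed known
alpha = 0.05       # 95% confidence level

x_bar = data.mean()
z = norm.ppf(1 - alpha / 2)               # critical value z_(alpha/2), about 1.96
margin = z * sigma / np.sqrt(len(data))   # margin of error

print(f"95% CI for the mean: ({x_bar - margin:.2f}, {x_bar + margin:.2f})")
```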
CONFIDENCE INTERVALS (σ UNKNOWN)
When σ is unknown, it is estimated by the sample standard deviation s, and the interval uses Student's t distribution with n − 1 degrees of freedom: x̄ ± t_(n−1, α/2) · s/√n.
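A corresponding sketch for the σ-unknown case (made-up data; SciPy supplies the t critical value):

```python
import numpy as np
from scipy.stats import t

data = np.array([4.9, 5.3, 5.1, 4.7, 5.6, 5.0, 4.8, 5.2])  # made-up sample, sigma unknown
alpha = 0.05
n = len(data)

x_bar = data.mean()
s = data.std(ddof=1)                      # sigma is estimated by s
t_crit = t.ppf(1 - alpha / 2, df=n - 1)   # t_(n-1, alpha/2)
margin = t_crit * s / np.sqrt(n)

print(f"95% CI for the mean: ({x_bar - margin:.3f}, {x_bar + margin:.3f})")
```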
STUDENT’S t DISTRIBUTION
STUDENT’S t TABLE
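In place of a printed t table, the critical values can be looked up with SciPy (shown here for a 95% two-sided interval); they approach the normal value 1.96 as the degrees of freedom grow:

```python
from scipy.stats import norm, t

# Upper-tail critical values for a 95% two-sided interval (alpha = 0.05),
# i.e. the numbers a t table lists in its 0.025 column.
for df in (5, 10, 30, 100):
    print(f"df = {df:4d}  t_crit = {t.ppf(0.975, df):.3f}")
print(f"normal       z_crit = {norm.ppf(0.975):.3f}")   # limiting value as df grows
```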
ESTIMATION FOR FINITE POPULATIONS
When the sample is large relative to the population (n/N > 0.05), use the finite population correction factor √((N − n)/(N − 1)), so the standard error of the mean becomes (σ/√n) · √((N − n)/(N − 1)).
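A hedged sketch (made-up summary statistics; a z critical value is used since n is fairly large) of applying the correction factor to the standard error:

```python
import numpy as np
from scipy.stats import norm

N, n = 1000, 80                 # population and sample sizes (n/N = 0.08 > 0.05)
x_bar, s = 52.4, 6.1            # made-up sample mean and standard deviation
alpha = 0.05

fpc = np.sqrt((N - n) / (N - 1))   # finite population correction factor
se = (s / np.sqrt(n)) * fpc        # corrected standard error of the mean
z = norm.ppf(1 - alpha / 2)

print(f"95% CI for the mean: ({x_bar - z * se:.2f}, {x_bar + z * se:.2f})")
```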
CONFIDENCE INTERVALS FOR THE POPULATION PROPORTION
Assumptions: two categorical outcomes (e.g., faulty/not faulty, complex/easy), and the population follows a binomial distribution. The normal approximation can be used if n·p ≥ 5 and n·(1 − p) ≥ 5.
Confidence interval estimate: p̂ ± z_(α/2) · √(p̂(1 − p̂)/n).
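A minimal sketch of the interval for a proportion (made-up counts), checking the normal-approximation condition first:

```python
import numpy as np
from scipy.stats import norm

n, faulty = 200, 34             # made-up sample: 34 faulty items out of 200
alpha = 0.05

p_hat = faulty / n
assert n * p_hat >= 5 and n * (1 - p_hat) >= 5   # normal approximation is reasonable

z = norm.ppf(1 - alpha / 2)
margin = z * np.sqrt(p_hat * (1 - p_hat) / n)

print(f"95% CI for the proportion: ({p_hat - margin:.3f}, {p_hat + margin:.3f})")
```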