Applied Statistics and Probability for Engineers


Applied Statistics and Probability for Engineers, Sixth Edition
Douglas C. Montgomery and George C. Runger
Chapter 7: Point Estimation of Parameters and Sampling Distributions

7 Point Estimation of Parameters and Sampling Distributions
CHAPTER OUTLINE
7-1 Point Estimation
7-2 Sampling Distributions and the Central Limit Theorem
7-3 General Concepts of Point Estimation
7-3.1 Unbiased Estimators
7-3.2 Variance of a Point Estimator
7-3.3 Standard Error: Reporting a Point Estimate
7-3.4 Mean Squared Error of an Estimator
7-4 Methods of Point Estimation
7-4.1 Method of Moments
7-4.2 Method of Maximum Likelihood
7-4.3 Bayesian Estimation of Parameters
Chapter 7 Title and Outline

Learning Objectives for Chapter 7 After careful study of this chapter, you should be able to do the following:
1. Explain the general concepts of estimating the parameters of a population or a probability distribution.
2. Describe the important role of the normal distribution as a sampling distribution, and state the central limit theorem.
3. Explain important properties of point estimators, including bias, variance, and mean squared error.
4. Construct point estimators using the method of moments and the method of maximum likelihood.
5. Compute and explain the precision with which a parameter is estimated.
6. Construct a point estimator using the Bayesian approach.
Chapter 7 Learning Objectives

Point Estimation A point estimate is a reasonable value of a population parameter. X1, X2, …, Xn are random variables, and functions of these random variables, such as the sample mean X̄ and the sample variance S², are themselves random variables called statistics. Each statistic has its own probability distribution, called its sampling distribution. Sec 7-1 Point Estimation

Point Estimator A point estimate of a population parameter θ is a single numerical value θ̂ of a statistic Θ̂, and the statistic Θ̂ is called the point estimator. For example, X̄ is a point estimator of the population mean μ, and a computed value x̄ of X̄ is a point estimate of μ. Sec 7-1 Point Estimation

Some Parameters & Their Statistics There could be choices for the point estimator of a parameter. To estimate the mean of a population, we could choose the: Sample mean. Sample median. Average of the largest & smallest observations in the sample. Sec 7-1 Point Estimation

Some Definitions The random variables X1, X2, …, Xn are a random sample of size n if: The Xi's are independent random variables. Every Xi has the same probability distribution. A statistic is any function of the observations in a random sample. The probability distribution of a statistic is called a sampling distribution. Sec 7-2 Sampling Distributions and the Central Limit Theorem

Central Limit Theorem If X1, X2, …, Xn is a random sample of size n taken from a population (finite or infinite) with mean μ and finite variance σ², and if X̄ is the sample mean, then the limiting form of the distribution of Z = (X̄ − μ)/(σ/√n), as n → ∞, is the standard normal distribution. A sketch of this convergence appears below. Sec 7-2 Sampling Distributions and the Central Limit Theorem
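As a sketch of the theorem in action (not part of the original slides), the code below standardizes sample means drawn from a heavily skewed exponential population and checks how closely they follow a standard normal; all settings here are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
lam, n, reps = 0.5, 40, 50_000          # arbitrary illustration values
mu, sigma = 1 / lam, 1 / lam            # exponential mean and standard deviation

samples = rng.exponential(scale=1 / lam, size=(reps, n))
z = (samples.mean(axis=1) - mu) / (sigma / np.sqrt(n))  # standardized sample means

# If the CLT applies, z should be close to N(0, 1): mean ~ 0, SD ~ 1,
# and roughly 95% of the z-values should fall within +/-1.96.
print(f"mean(z) = {z.mean():.3f}, sd(z) = {z.std(ddof=1):.3f}")
print(f"P(|Z| <= 1.96) ~= {np.mean(np.abs(z) <= 1.96):.3f} "
      f"(normal theory: {stats.norm.cdf(1.96) - stats.norm.cdf(-1.96):.3f})")
```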

Example 7-2: Central Limit Theorem Suppose that a random variable X has a continuous uniform distribution. Find the distribution of the sample mean of a random sample of size n = 40. By the central limit theorem, X̄ is approximately normal with mean equal to the mean μ of the uniform distribution and variance σ²/n = σ²/40. Figure 7-5: The distributions of X and X̄ for Example 7-2. Sec 7-2 Sampling Distributions and the Central Limit Theorem

Sampling Distribution of a Difference in Sample Means If we have two independent populations with means μ1 and μ2 and variances σ1² and σ2², and if X̄1 and X̄2 are the sample means of two independent random samples of sizes n1 and n2 from these populations, then the sampling distribution of
Z = [X̄1 − X̄2 − (μ1 − μ2)] / sqrt(σ1²/n1 + σ2²/n2)
is approximately standard normal if the conditions of the central limit theorem apply. If the two populations are normal, then the sampling distribution of Z is exactly standard normal. Sec 7-2 Sampling Distributions and the Central Limit Theorem

Example 7-3: Aircraft Engine Life The effective life of a component used in a jet-turbine aircraft engine is a random variable with mean 5000 hours and standard deviation 40 hours, and its distribution is close to normal. The engine manufacturer introduces an improvement into the manufacturing process for this component that changes the parameters to a mean of 5050 hours and a standard deviation of 30 hours. Random samples of size n1 = 16 (old process) and n2 = 25 (improved process) are selected. What is the probability that the difference in the two sample means, X̄2 − X̄1, is at least 25 hours? A worked computation is sketched below. Figure 7-6: The sampling distribution of X̄2 − X̄1 in Example 7-3. Sec 7-2 Sampling Distributions and the Central Limit Theorem
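A minimal sketch of the computation: X̄2 − X̄1 is normal with mean 5050 − 5000 = 50 hours and variance σ1²/n1 + σ2²/n2, so the requested probability follows directly from the normal cdf.

```python
from math import sqrt
from scipy.stats import norm

mu1, sigma1, n1 = 5000, 40, 16   # old process
mu2, sigma2, n2 = 5050, 30, 25   # improved process

mean_diff = mu2 - mu1                              # 50 hours
sd_diff = sqrt(sigma1**2 / n1 + sigma2**2 / n2)    # sqrt(100 + 36) ~= 11.66

# P(X2bar - X1bar >= 25) = P(Z >= (25 - 50) / 11.66)
p = norm.sf((25 - mean_diff) / sd_diff)
print(f"P(difference >= 25 hours) = {p:.4f}")      # ~= 0.984
```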

Unbiased Estimators Defined The point estimator Θ̂ is an unbiased estimator of the parameter θ if E(Θ̂) = θ. If the estimator is not unbiased, the difference E(Θ̂) − θ is called the bias of the estimator Θ̂. Sec 7-3.1 Unbiased Estimators

Example 7-4: Sample Mean & Variance Are Unbiased-1 X is a random variable with mean μ and variance σ². Let X1, X2, …, Xn be a random sample of size n. Show that the sample mean X̄ is an unbiased estimator of μ: E(X̄) = E[(X1 + X2 + … + Xn)/n] = [E(X1) + E(X2) + … + E(Xn)]/n = nμ/n = μ. Sec 7-3.1 Unbiased Estimators

Example 7-4: Sample Mean & Variance Are Unbiased-2 Show that the sample variance S² = Σ(Xi − X̄)²/(n − 1) is an unbiased estimator of σ². Expanding the sum gives E[Σ(Xi − X̄)²] = E(ΣXi²) − nE(X̄²) = n(μ² + σ²) − n(μ² + σ²/n) = (n − 1)σ², so E(S²) = σ². A simulation check appears below. Sec 7-3.1 Unbiased Estimators
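As an illustrative check (not from the slides), this simulation contrasts the unbiased divisor n − 1 with the biased divisor n; the population and replication settings are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
sigma2, n, reps = 4.0, 10, 100_000   # arbitrary illustration values

samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
s2_unbiased = samples.var(axis=1, ddof=1)  # divides by n - 1
s2_biased = samples.var(axis=1, ddof=0)    # divides by n

print(f"true sigma^2 = {sigma2}")
print(f"average of S^2 (divide by n-1) = {s2_unbiased.mean():.4f}")  # ~ 4.0
print(f"average (divide by n)          = {s2_biased.mean():.4f}")    # ~ 3.6 = (n-1)/n * 4
```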

Minimum Variance Unbiased Estimators If we consider all unbiased estimators of θ, the one with the smallest variance is called the minimum variance unbiased estimator (MVUE). If X1, X2, …, Xn is a random sample of size n from a normal distribution with mean μ and variance σ², then the sample mean X̄ is the MVUE for μ. Sec 7-3.2 Variance of a Point Estimator

Standard Error of an Estimator The standard error of an estimator Θ̂ is its standard deviation, σ_Θ̂ = sqrt(V(Θ̂)). If the standard error involves unknown parameters that can be estimated, substituting those estimates into σ_Θ̂ produces the estimated standard error. For the sample mean, the standard error is σ_X̄ = σ/√n, estimated by s/√n when σ is unknown. Sec 7-3.3 Standard Error: Reporting a Point Estimate

Example 7-5: Thermal Conductivity These observations are 10 measurements of the thermal conductivity of Armco iron. Since σ is not known, we use s to calculate the estimated standard error: the sample mean is x̄ = 41.924 and the estimated standard error is s/√n = 0.0898. Since the standard error is about 0.2% of the mean, the mean estimate is fairly precise. We can be very confident that the true population mean lies within 41.924 ± 2(0.0898), or between 41.744 and 42.104. The computation is sketched below. Sec 7-3.3 Standard Error: Reporting a Point Estimate
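A minimal computation, assuming the ten measurements listed in the code (the example's data table did not survive transcription; the values used here reproduce the quoted x̄ = 41.924 and standard error 0.0898):

```python
import numpy as np

# Ten thermal-conductivity measurements assumed for this example; they
# reproduce the quoted mean 41.924 and estimated standard error 0.0898.
x = np.array([41.60, 41.48, 42.34, 41.95, 41.86,
              42.18, 41.72, 42.26, 41.81, 42.04])

xbar = x.mean()
se = x.std(ddof=1) / np.sqrt(len(x))   # s / sqrt(n), the estimated standard error

print(f"x-bar = {xbar:.3f}, estimated SE = {se:.4f}")
print(f"approx. 95% interval: {xbar - 2*se:.3f} to {xbar + 2*se:.3f}")
```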

Mean Squared Error The mean squared error of an estimator Θ̂ of the parameter θ is defined as MSE(Θ̂) = E(Θ̂ − θ)². Writing Θ̂ − θ = [Θ̂ − E(Θ̂)] + [E(Θ̂) − θ] and expanding the square gives the conclusion: the mean squared error of an estimator is equal to the variance of the estimator plus the squared bias, MSE(Θ̂) = V(Θ̂) + (bias)². Sec 7-3.4 Mean Squared Error of an Estimator

Relative Efficiency The MSE is an important criterion for comparing two estimators. The relative efficiency of Θ̂1 to Θ̂2 is the ratio MSE(Θ̂1)/MSE(Θ̂2). If the relative efficiency is less than 1, we conclude that the first estimator is superior to the second. Sec 7-3.4 Mean Squared Error of an Estimator

Optimal Estimator A biased estimator can be preferred to an unbiased estimator if it has a smaller MSE. Biased estimators are occasionally used in linear regression. An estimator whose MSE is smaller than that of any other estimator is called an optimal estimator. A small simulation of this trade-off follows. Figure 7-8: A biased estimator that has a smaller variance than the unbiased estimator. Sec 7-3.4 Mean Squared Error of an Estimator
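As an illustrative sketch (not from the text), a shrunken sample mean c·X̄ with 0 < c < 1 is biased but can have smaller MSE than X̄ when the true mean is near zero; the shrinkage factor and all other settings below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(3)
mu, sigma, n, reps = 0.5, 2.0, 10, 100_000   # arbitrary illustration values
c = 0.8                                      # hypothetical shrinkage factor

xbar = rng.normal(mu, sigma, size=(reps, n)).mean(axis=1)
unbiased = xbar          # E = mu, zero bias, variance sigma^2/n = 0.4
shrunk = c * xbar        # E = c*mu, biased but lower variance

for name, est in [("unbiased X-bar", unbiased), ("shrunk 0.8*X-bar", shrunk)]:
    print(f"{name:17s} bias = {est.mean() - mu:+.4f}, MSE = {np.mean((est - mu)**2):.4f}")
```

Here the biased estimator trades a bias of −0.05 for a much smaller variance, so its MSE (about 0.27) beats the unbiased estimator's (about 0.40).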

Moments Defined Let X1, X2, …, Xn be a random sample from the probability distribution f(x), where f(x) can be either a discrete probability mass function or a continuous probability density function. The kth population moment (or distribution moment) is E(X^k), k = 1, 2, …, and the kth sample moment is (1/n)ΣXi^k, k = 1, 2, …. If k = 1 (the first moment), the population moment is μ and the sample moment is x̄. The sample mean is the moment estimator of the population mean. Sec 7-4.1 Method of Moments

Moment Estimators Let X1, X2, …, Xn be a random sample from either a probability mass function or a probability density function with m unknown parameters θ1, θ2, …, θm. The moment estimators Θ̂1, Θ̂2, …, Θ̂m are found by equating the first m population moments to the first m sample moments and solving the resulting equations for the unknown parameters. Sec 7-4.1 Method of Moments

Example 7-8: Normal Distribution Moment Estimators Suppose that X1, X2, …, Xn is a random sample from a normal distribution with parameters μ and σ², where E(X) = μ and E(X²) = μ² + σ². Equating E(X) to X̄ and E(X²) to (1/n)ΣXi² gives the moment estimators μ̂ = X̄ and σ̂² = (1/n)ΣXi² − X̄² = Σ(Xi − X̄)²/n. Sec 7-4.1 Method of Moments

Example 7-9: Gamma Distribution Moment Estimators-1 Suppose that X1, X2, …, Xn is a random sample from a gamma distribution with parameters r and λ, where E(X) = r/λ and E(X²) = r(r + 1)/λ². Equating these to the first two sample moments and solving gives the moment estimators r̂ = X̄²/[(1/n)ΣXi² − X̄²] and λ̂ = X̄/[(1/n)ΣXi² − X̄²]. Sec 7-4.1 Method of Moments

Example 7-9: Gamma Distribution Moment Estimators-2 Using the time-to-failure data in the table, we can estimate the parameters of the gamma distribution by substituting the sample moments into the expressions above; a sketch of the computation follows. Sec 7-4.1 Method of Moments
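Since the example's data table did not survive transcription, this sketch applies the moment estimators to synthetic gamma data with known parameters so the estimates can be checked against the truth; the sample here is hypothetical.

```python
import numpy as np

rng = np.random.default_rng(5)
r_true, lam_true, n = 2.0, 0.1, 200        # hypothetical "true" parameters
x = rng.gamma(shape=r_true, scale=1 / lam_true, size=n)

xbar = x.mean()
m2 = np.mean(x**2)                          # second sample moment, (1/n) * sum(x_i^2)

lam_hat = xbar / (m2 - xbar**2)             # moment estimator of lambda
r_hat = xbar**2 / (m2 - xbar**2)            # moment estimator of r

print(f"r-hat = {r_hat:.3f} (true {r_true}), lambda-hat = {lam_hat:.4f} (true {lam_true})")
```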

Maximum Likelihood Estimators Suppose that X is a random variable with probability distribution f(x;θ), where θ is a single unknown parameter. Let x1, x2, …, xn be the observed values in a random sample of size n. Then the likelihood function of the sample is: L(θ) = f(x1;θ) ∙ f(x2; θ) ∙…∙ f(xn; θ) Note that the likelihood function is now a function of only the unknown parameter θ. The maximum likelihood estimator (MLE) of θ is the value of θ that maximizes the likelihood function L(θ). Sec 7-4.2 Method of Maximum Likelihood

Example 7-10: Bernoulli Distribution MLE Let X be a Bernoulli random variable. The probability mass function is f(x; p) = p^x (1 − p)^(1−x), x = 0, 1, where p is the parameter to be estimated. The likelihood function of a random sample of size n is L(p) = Π p^xi (1 − p)^(1−xi) = p^Σxi (1 − p)^(n−Σxi). Taking logs, differentiating with respect to p, and setting the derivative to zero gives the MLE p̂ = (1/n)ΣXi, the sample proportion. A numerical check appears below. Sec 7-4.2 Method of Maximum Likelihood
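A minimal numerical check (not from the slides): maximize the Bernoulli log-likelihood on a grid and compare with the closed-form MLE p̂ = x̄; the sample is hypothetical.

```python
import numpy as np

x = np.array([1, 0, 1, 1, 0, 1, 0, 1, 1, 1])   # hypothetical Bernoulli sample

p_grid = np.linspace(0.001, 0.999, 9_999)
# log L(p) = sum(x) * log(p) + (n - sum(x)) * log(1 - p)
loglik = x.sum() * np.log(p_grid) + (len(x) - x.sum()) * np.log(1 - p_grid)

print(f"grid maximizer  = {p_grid[np.argmax(loglik)]:.3f}")
print(f"closed-form MLE = {x.mean():.3f}")      # p-hat = sample proportion = 0.7
```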

Example 7-11: Normal Distribution MLE for μ Let X be a normal random variable with unknown mean μ and known variance σ². The likelihood function of a random sample of size n is L(μ) = (2πσ²)^(−n/2) exp[−Σ(xi − μ)²/(2σ²)]. Maximizing the log-likelihood over μ is equivalent to minimizing Σ(xi − μ)², which gives the MLE μ̂ = X̄. Sec 7-4.2 Method of Maximum Likelihood

Example 7-12: Exponential Distribution MLE Let X be an exponential random variable with parameter λ. The likelihood function of a random sample of size n is L(λ) = Π λe^(−λxi) = λ^n e^(−λΣxi). Setting the derivative of ln L(λ) = n ln λ − λΣxi to zero gives the MLE λ̂ = n/ΣXi = 1/X̄. A quick check with scipy follows. Sec 7-4.2 Method of Maximum Likelihood
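As a sketch, scipy's exponential fit (with the location fixed at zero) should return a scale equal to the sample mean, matching λ̂ = 1/X̄; the data here are simulated.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
lam_true = 0.25
x = rng.exponential(scale=1 / lam_true, size=500)   # simulated sample

loc, scale = stats.expon.fit(x, floc=0)             # MLE fit; scale = 1 / lambda-hat
print(f"lambda-hat (scipy)   = {1 / scale:.4f}")
print(f"lambda-hat (1/x-bar) = {1 / x.mean():.4f}")  # identical by theory
```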

Example 7-13: Normal Distribution MLEs for μ & σ² Let X be a normal random variable with both mean μ and variance σ² unknown. The likelihood function of a random sample of size n is L(μ, σ²) = (2πσ²)^(−n/2) exp[−Σ(xi − μ)²/(2σ²)]. Setting the partial derivatives of ln L with respect to μ and σ² equal to zero gives the MLEs μ̂ = X̄ and σ̂² = Σ(Xi − X̄)²/n. Sec 7-4.2 Method of Maximum Likelihood

Properties of an MLE Under very general and nonrestrictive conditions, when the sample size n is large, the MLE θ̂ is: (1) approximately unbiased, (2) nearly of minimum variance among all estimators, and (3) approximately normally distributed. Notes: Mathematical statisticians often prefer MLEs because of these properties. Properties (1) and (2) state that, for large samples, MLEs are approximately MVUEs. To use MLEs, the distribution of the population must be known or assumed. Sec 7-4.2 Method of Maximum Likelihood

Invariance Property Let Θ̂1, Θ̂2, …, Θ̂k be the maximum likelihood estimators of the parameters θ1, θ2, …, θk. Then the MLE of any function h(θ1, θ2, …, θk) of these parameters is the same function h(Θ̂1, Θ̂2, …, Θ̂k) of the estimators. This property is illustrated in Example 7-14, using the MLEs from Example 7-13. Sec 7-4.2 Method of Maximum Likelihood

Example 7-14: Invariance For the normal distribution, the MLEs were μ̂ = X̄ and σ̂² = Σ(Xi − X̄)²/n. By the invariance property, the MLE of the standard deviation σ is the square root of the MLE of σ², σ̂ = [Σ(Xi − X̄)²/n]^(1/2); note that this is not the sample standard deviation S. Sec 7-4.2 Method of Maximum Likelihood

Complications of the MLE Method The method of maximum likelihood is an excellent technique; however, there are two complications: 1. It may not always be easy to maximize the likelihood function, because the equation obtained by setting the derivative to zero may be difficult to solve algebraically. 2. It may not always be possible to use calculus methods directly to determine the maximum of L(θ). The following example illustrates the first complication. Sec 7-4.2 Method of Maximum Likelihood

Example 7-16: Gamma Distribution MLE-1 Let X1, X2, …, Xn be a random sample from a gamma distribution. The log of the likelihood function is ln L(r, λ) = nr ln λ − n ln Γ(r) + (r − 1)Σ ln xi − λΣxi. Setting the partial derivatives with respect to r and λ equal to zero gives λ̂ = r̂/X̄ together with n ln λ̂ + Σ ln xi = n Γ′(r̂)/Γ(r̂); there is no closed-form solution for r̂, so the equations must be solved numerically, as in the sketch that follows the figure. Sec 7-4.2 Method of Maximum Likelihood

Example 7-16: Gamma Distribution MLE-2 Figure 7-11: Log-likelihood for the gamma distribution using the failure-time data. (a) Log-likelihood surface. (b) Contour plot. Sec 7-4.2 Method of Maximum Likelihood
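Since the example's failure-time data were not transcribed, this sketch fits a gamma distribution by maximum likelihood to simulated data using scipy; with the location fixed at zero, gamma.fit maximizes the likelihood iteratively and returns the shape r̂ and scale 1/λ̂.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(21)
r_true, lam_true = 1.5, 0.05                     # hypothetical "true" parameters
x = rng.gamma(shape=r_true, scale=1 / lam_true, size=300)

# Numerical MLE: no closed form exists for r, so scipy maximizes the
# likelihood iteratively. floc=0 pins the location at zero (two-parameter gamma).
r_hat, loc, scale = stats.gamma.fit(x, floc=0)
print(f"r-hat = {r_hat:.3f} (true {r_true}), lambda-hat = {1 / scale:.4f} (true {lam_true})")
```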

Bayesian Estimation of Parameters-1 The moment and maximum likelihood methods interpret probabilities as relative frequencies (objective probabilities). In the Bayesian approach, the random variable X has a probability distribution f(x|θ) that depends on the parameter θ. Additional knowledge about θ is summarized in the prior distribution f(θ), with mean μ0 and variance σ0²; probabilities associated with f(θ) are subjective probabilities. The conditional distribution of the sample given θ is f(x1, x2, …, xn|θ). The posterior distribution f(θ|x1, x2, …, xn) expresses our degree of belief about θ after observing the sample data. 7-4.3 Bayesian Estimation of Parameters

Bayesian Estimation of Parameters-2 The joint probability distribution of the sample and θ is f(x1, x2, …, xn, θ) = f(x1, x2, …, xn|θ) ∙ f(θ). The marginal distribution of the sample is f(x1, x2, …, xn) = Σθ f(x1, x2, …, xn, θ) in the discrete case, or ∫ f(x1, x2, …, xn, θ) dθ in the continuous case. The desired posterior distribution is f(θ|x1, x2, …, xn) = f(x1, x2, …, xn, θ) / f(x1, x2, …, xn). 7-4.3 Bayesian Estimation of Parameters

Example 7-17: Bayes Estimator for the Mean of a Normal Distribution-1 Let X1, X2, …, Xn be a random sample from a normal distribution with unknown mean μ and known variance σ². Assume that the prior distribution for μ is normal with mean μ0 and variance σ0². The joint probability distribution of the sample, given μ, is f(x1, x2, …, xn|μ) = (2πσ²)^(−n/2) exp[−Σ(xi − μ)²/(2σ²)]. 7-4.3 Bayesian Estimation of Parameters

Example 7-17: Bayes Estimator for the Mean of a Normal Distribution-2 Now the joint probability distribution of the sample and μ is f(x1, x2, …, xn, μ) = f(x1, x2, …, xn|μ) f(μ). Upon completing the square in the exponent, the terms involving μ collect into the exponential of a quadratic in μ, 7-4.3 Bayesian Estimation of Parameters

Example 7-17: Bayes Estimator for the Mean of a Normal Distribution-3 which is recognized as a normal probability density function with posterior mean [(σ²/n)μ0 + σ0² x̄] / (σ0² + σ²/n) and posterior variance (σ0²)(σ²/n) / (σ0² + σ²/n). The Bayes estimator of μ is the posterior mean, a weighted average of the prior mean μ0 and the sample mean x̄. 7-4.3 Bayesian Estimation of Parameters

Example 7-17: Bayes Estimator for the Mean of a Normal Distribution-4 To illustrate, suppose the prior parameters are μ0 = 0 and σ0² = 1, and the sample gives n = 10, x̄ = 0.75, with σ² = 4. The posterior mean is [(4/10)(0) + (1)(0.75)] / (1 + 4/10) = 0.75/1.4 ≈ 0.536, and the posterior variance is (1)(4/10)/(1 + 4/10) ≈ 0.286. The same computation in code follows. 7-4.3 Bayesian Estimation of Parameters
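A minimal sketch of the posterior update for this example, using the closed-form expressions above:

```python
def normal_posterior(mu0, var0, xbar, var, n):
    """Posterior mean and variance for a normal mean with known variance
    and a conjugate normal prior."""
    w = var / n                                  # sampling variance of x-bar
    post_mean = (w * mu0 + var0 * xbar) / (var0 + w)
    post_var = (var0 * w) / (var0 + w)
    return post_mean, post_var

# Example 7-17 values: prior N(0, 1); n = 10, x-bar = 0.75, sigma^2 = 4.
mean, var = normal_posterior(mu0=0.0, var0=1.0, xbar=0.75, var=4.0, n=10)
print(f"posterior mean = {mean:.3f}, posterior variance = {var:.3f}")  # 0.536, 0.286
```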

Important Terms & Concepts of Chapter 7 Bayes estimator Parameter estimation Bias in parameter estimation Point estimator Central limit theorem Population or distribution moments Estimator vs. estimate Posterior distribution Likelihood function Prior distribution Maximum likelihood estimator Sample moments Mean square error of an estimator Sampling distribution Minimum variance unbiased estimator An estimator has a: Standard error Moment estimator Estimated standard error Normal distribution as the sampling distribution of the: Statistic Statistical inference sample mean Unbiased estimator difference in two sample means Chapter 7 Summary