Statistical Model

Presentation transcript:

Statistical Model A statistical model for some data is a set of distributions, one of which corresponds to the true unknown distribution that produced the data. The statistical model corresponds to the information a statistician brings to the application about what the true distribution is, or at least what he or she is willing to assume about it. The variable θ is called the parameter of the model, and the set Ω is called the parameter space. From the definition of a statistical model, we see that there is a unique value θ ∈ Ω such that fθ is the true distribution that generated the data. We refer to this value as the true parameter value. STA248 week 3

Examples Suppose there are two manufacturing plants for machines. It is known that the life lengths of machines built by the first plant have an Exponential(1) distribution, while machines manufactured by the second plant have life lengths distributed Exponential(1.5). You have purchased five of these machines; you know that all five came from the same plant, but you do not know which plant. You observe the life lengths of these machines, obtaining a sample (x1, …, x5), and want to make inference about the true distribution of the life lengths of these machines. Suppose instead we have observations of heights in cm of individuals in a population, and we feel it is reasonable to assume that the distribution of height in the population is normal with some unknown mean and variance. The statistical model in this case is {N(μ, σ²) : θ = (μ, σ²) ∈ Ω}, where Ω = R×R+ and R+ = (0, ∞).
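The two-plant example can be sketched numerically. Here the parameter space is just Ω = {1, 1.5}, and we compare how probable a sample is under each plant's distribution. The five lifetimes below are invented for illustration, and Exponential(λ) is taken to be the rate parameterization with density λe^(−λx); neither detail comes from the slides.

```python
import math

# Hypothetical observed life lengths (x1, ..., x5); invented, not from the slides.
lifetimes = [0.8, 1.3, 0.4, 2.1, 0.9]

def exp_likelihood(rate, data):
    """Joint density of i.i.d. Exponential(rate) data: prod of rate * exp(-rate * x)."""
    return math.prod(rate * math.exp(-rate * x) for x in data)

# The parameter space here is Omega = {1, 1.5}: one value per plant.
L1 = exp_likelihood(1.0, lifetimes)
L2 = exp_likelihood(1.5, lifetimes)
more_plausible = "plant 1" if L1 > L2 else "plant 2"
print(L1, L2, more_plausible)
```

This is exactly the kind of comparison the likelihood function (introduced later in these slides) formalizes: fix the data, vary the parameter.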

Goals of Statistics Estimate the unknown parameters of the underlying probability distribution. Measure the errors of these estimates. Test whether the data give evidence that the parameters are (or are not) equal to a certain value, or that the probability distribution has a particular form.

Point Estimation Most statistical procedures involve estimation of the unknown value of the parameter of the statistical model. A point estimator of the parameter θ is a function of the underlying random variables, and so it is a random variable with a distribution function. A point estimate of the parameter θ is a function of the data; it is a statistic. For a given sample, an estimate is a number. Notation…

What Makes a Good Estimator? Unbiased. Consistent. Minimum variance. Known probability distribution.

Properties of Point Estimators - Unbiased Let θ̂ be a point estimator for a parameter θ. Then θ̂ is an unbiased estimator if E(θ̂) = θ. There may not always exist an unbiased estimator for θ. Note: θ̂ unbiased for θ does not mean that g(θ̂) is unbiased for g(θ).

Example - Common Point Estimators A natural estimate for the population mean μ is the sample mean x̄ (for any distribution). The sample mean is an unbiased estimator of the population mean. There are two common estimators for the population variance…
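The two variance estimators alluded to here are the "divide by n" and "divide by n − 1" versions. A small simulation (a sketch, with an arbitrary N(10, 2²) population and sample size chosen for illustration) shows the divide-by-n version is biased low while the divide-by-(n − 1) version is unbiased on average:

```python
import random

random.seed(1)
TRUE_VAR = 4.0   # variance of the N(10, 2^2) population used below
n = 5            # small sample size, so the bias is visible

def var_biased(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)        # divide by n

def var_unbiased(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # divide by n - 1

reps = 20000
samples = [[random.gauss(10, 2) for _ in range(n)] for _ in range(reps)]
mean_biased = sum(var_biased(s) for s in samples) / reps
mean_unbiased = sum(var_unbiased(s) for s in samples) / reps
print(mean_biased, mean_unbiased)  # near (n-1)/n * 4 = 3.2 vs near 4
```

The averages over many samples estimate E of each estimator, which is what unbiasedness is about.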

Claim Let X1, X2,…, Xn be a random sample of size n from a population with mean μ and variance σ². The sample variance s² = Σ(Xi − X̄)² / (n − 1) is an unbiased estimator of the population variance σ². Proof…
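The proof indicated on the slide is the standard argument, filled in here for completeness:

```latex
\begin{align*}
E\!\left[\sum_{i=1}^{n}(X_i-\bar{X})^2\right]
  &= E\!\left[\sum_{i=1}^{n} X_i^2 \;-\; n\bar{X}^2\right]
   = \sum_{i=1}^{n} E(X_i^2) - nE(\bar{X}^2) \\
  &= n(\sigma^2+\mu^2) - n\!\left(\frac{\sigma^2}{n}+\mu^2\right)
   = (n-1)\,\sigma^2 .
\end{align*}
```

Dividing by n − 1 gives E(s²) = σ², so s² is unbiased. (Dividing by n instead gives expectation (n − 1)σ²/n, which is why that estimator is biased.)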

Example Suppose X1, X2,…, Xn is a random sample from a U(0, θ) distribution. Let θ̂ = X(n) = max(X1, …, Xn). Find the density of θ̂ and its mean. Is θ̂ unbiased?
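Assuming the estimator in question is the sample maximum (the usual choice for this example), its density is n t^(n−1)/θ^n on (0, θ) and its mean is nθ/(n + 1) < θ, so it is biased low. A quick simulation, with θ = 10 and n = 5 chosen arbitrarily, illustrates the bias:

```python
import random

random.seed(2)
theta, n, reps = 10.0, 5, 20000

# theta_hat = X_(n) = max of the sample (the sample maximum)
estimates = [max(random.uniform(0, theta) for _ in range(n)) for _ in range(reps)]
mean_est = sum(estimates) / reps

# Theory: E(theta_hat) = n * theta / (n + 1), so theta_hat underestimates theta.
expected = n * theta / (n + 1)  # 50/6, about 8.33
print(mean_est, expected)
```

Note that (n + 1)X(n)/n corrects the bias, which connects to the next slide on asymptotic unbiasedness.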

Asymptotically Unbiased Estimators An estimator θ̂n is asymptotically unbiased if E(θ̂n) → θ as n → ∞. Example: for the U(0, θ) sample maximum, E(X(n)) = nθ/(n + 1) → θ.

Consistency An estimator θ̂n is a consistent estimator of θ if θ̂n converges in probability to θ, i.e., if for every ε > 0, P(|θ̂n − θ| > ε) → 0 as n → ∞.
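Consistency can be seen empirically by watching P(|θ̂n − θ| > ε) shrink as n grows. The sketch below does this for the sample mean of an N(5, 2²) population (population, ε, and sample sizes are all arbitrary choices for illustration):

```python
import random

random.seed(3)
mu, eps, reps = 5.0, 0.5, 2000

def prob_far(n):
    """Monte Carlo estimate of P(|x_bar - mu| > eps) for N(mu, 2^2), sample size n."""
    count = 0
    for _ in range(reps):
        xbar = sum(random.gauss(mu, 2) for _ in range(n)) / n
        if abs(xbar - mu) > eps:
            count += 1
    return count / reps

probs = [prob_far(n) for n in (5, 20, 80)]
print(probs)  # decreasing toward 0 as n grows
```

This is the (weak) law of large numbers in action: the sample mean is a consistent estimator of the population mean.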

Minimum Variance An estimator for θ is a function of the underlying random variables, and so it is a random variable with its own probability distribution. This distribution is called the sampling distribution of the estimator. We can use the sampling distribution to obtain the variance of an estimator. Among unbiased estimators, one with smaller variance is better: it is more likely to produce estimates close to the true value of the parameter. The standard deviation of the sampling distribution of an estimator is usually called the standard error of the estimator.
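For the sample mean, the standard error is σ/√n, and simulating the sampling distribution recovers it. A sketch, with an arbitrary N(0, 3²) population and n = 25:

```python
import math
import random

random.seed(4)
sigma, n, reps = 3.0, 25, 20000

# Build the sampling distribution of the sample mean empirically.
means = [sum(random.gauss(0, sigma) for _ in range(n)) / n for _ in range(reps)]
grand = sum(means) / reps
sd_of_means = math.sqrt(sum((m - grand) ** 2 for m in means) / (reps - 1))

theory_se = sigma / math.sqrt(n)  # standard error of the mean: 3/5 = 0.6
print(sd_of_means, theory_se)
```

The empirical standard deviation of the simulated means matches the theoretical standard error closely.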

Examples

How to find estimators? There are two main methods for finding estimators: 1) The method of moments. 2) The method of maximum likelihood. Sometimes the two methods give the same estimator.

Method of Moments The method of moments is a very simple procedure for finding an estimator for one or more parameters of a statistical model. It is one of the oldest methods for deriving point estimators. Recall: the kth moment of a random variable X is μk = E(X^k). These will very often be functions of the unknown parameters. The corresponding kth sample moment is the average mk = (1/n) Σ xi^k. The method of moments estimators are the solutions to the equations μk = mk.
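As a sketch of the recipe, take an N(μ, σ²) sample: the first two population moments are μ1 = μ and μ2 = σ² + μ², so matching them to the sample moments and solving gives μ̂ = m1 and σ̂² = m2 − m1². (The true parameter values 7 and 2 below are arbitrary illustration choices.)

```python
import random

random.seed(5)
data = [random.gauss(7, 2) for _ in range(5000)]  # simulated N(7, 2^2) sample
n = len(data)

m1 = sum(data) / n                 # first sample moment
m2 = sum(x * x for x in data) / n  # second sample moment

# Solve mu_1 = mu and mu_2 = sigma^2 + mu^2 for the parameters:
mu_hat = m1
sigma2_hat = m2 - m1 ** 2
print(mu_hat, sigma2_hat)  # near 7 and near 4
```

Note that the method-of-moments variance estimator here is the divide-by-n version, which is biased but asymptotically unbiased.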

Examples

The Likelihood Function Let x1, …, xn be sample observations taken on corresponding random variables X1, …, Xn whose distribution depends on a parameter θ. The likelihood function defined on the parameter space Ω is given by L(θ | x1, …, xn) = fθ(x1, …, xn), the joint density (or probability function) of the data evaluated at the observed values. Note that for the likelihood function we fix the data, x1, …, xn, and vary the value of the parameter. The value L(θ | x1, …, xn) is called the likelihood of θ. It is the probability of observing the data values we observed given that θ is the true value of the parameter. It is not the probability of θ given that we observed x1, …, xn.
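A concrete sketch of "fix the data, vary the parameter": for i.i.d. Bernoulli(θ) data (invented coin-flip data below, not from the slides), L(θ | x) = Π θ^xi (1 − θ)^(1 − xi), and evaluating it on a grid of θ values shows which parameter values make the observed sample most probable:

```python
import math

# Hypothetical coin-flip data (1 = head); theta = P(head), Omega = (0, 1).
x = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # 7 heads in 10 tosses

def likelihood(theta):
    """L(theta | x): probability of observing exactly this sample under theta."""
    return math.prod(theta if xi == 1 else 1 - theta for xi in x)

grid = [i / 100 for i in range(1, 100)]
best = max(grid, key=likelihood)
print(best, likelihood(best))  # the grid maximum sits at 7/10
```

The grid maximum lands at the sample proportion 7/10, previewing the maximum likelihood estimator introduced next.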

Maximum Likelihood Estimators In the likelihood function, different values of θ attach different probabilities to a particular observed sample. The likelihood function, L(θ | x1, …, xn), can be maximized over θ to give the parameter value that attaches the highest possible probability to the observed sample. We can maximize the likelihood function to find an estimator of θ. This estimator is a statistic: it is a function of the sample data. It is denoted by θ̂.

The log likelihood function l(θ) = ln(L(θ)) is the log likelihood function. Both the likelihood function and the log likelihood function attain their maximum at the same value of θ. It is often easier to maximize l(θ).
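A standard worked case (not taken from these slides) showing why the log transform helps: for an i.i.d. Exponential(λ) sample in the rate parameterization, the product in L collapses to a sum in l,

```latex
L(\lambda \mid x_1,\dots,x_n) = \prod_{i=1}^{n} \lambda e^{-\lambda x_i}
  = \lambda^{n} e^{-\lambda \sum_{i} x_i},
\qquad
l(\lambda) = n\ln\lambda - \lambda \sum_{i=1}^{n} x_i ,
\qquad
l'(\lambda) = \frac{n}{\lambda} - \sum_{i=1}^{n} x_i .
```

Setting l′(λ) = 0 gives λ̂ = n / Σ xi = 1/x̄, which is much easier to reach through l than by differentiating L directly. (A second-derivative check confirms this critical point is a maximum.)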

Examples

Properties of MLE Maximum likelihood estimators (MLEs) are consistent. The MLE of any parameter is asymptotically unbiased. The MLE has variance that is nearly as small as can be achieved by any estimator (asymptotically). The distribution of MLEs is approximately normal (asymptotically).
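These asymptotic properties can be checked by simulation. For Exponential(λ) data (rate parameterization), the MLE is λ̂ = 1/x̄, and its asymptotic distribution is approximately N(λ, λ²/n). The sketch below, with arbitrary λ = 2 and n = 200, compares the simulated mean and spread of λ̂ to the theory:

```python
import math
import random

random.seed(6)
lam, n, reps = 2.0, 200, 5000

# MLE for an i.i.d. Exponential(rate lam) sample is lam_hat = 1 / x_bar.
mles = []
for _ in range(reps):
    xbar = sum(random.expovariate(lam) for _ in range(n)) / n
    mles.append(1 / xbar)

mean_mle = sum(mles) / reps
sd_mle = math.sqrt(sum((m - mean_mle) ** 2 for m in mles) / (reps - 1))

# Asymptotic theory: lam_hat is approximately N(lam, lam^2 / n).
print(mean_mle, sd_mle, lam / math.sqrt(n))
```

The simulated mean sits close to λ (asymptotic unbiasedness; the exact finite-sample bias is of order 1/n) and the simulated standard deviation matches λ/√n.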