Econ 140 Lecture 6, Slide 1: Inference about a Mean

Econ 140 Lecture 6, Slide 2: Today's Plan
- Start to explore more concrete ideas of statistical inference
- Look at the process of generalizing from the sample mean to the population value
- Consider properties of the sample mean as a point estimator
- Properties of an estimator: BLUE (Best Linear Unbiased Estimator)

Econ 140 Lecture 6, Slide 3: Sample and Population Differences
- So far we've seen how weights connect a sample and a population
- But what if all we have is a sample without any weights? What can estimation of the mean tell us?
- We need the sample to be composed of independently and identically distributed observations

Econ 140 Lecture 6, Slide 4: Estimating the Expected Value
- We've dealt with the expected value µ_y = E(Y) and the variance V(Y) = σ²
- Previously, our estimator of the expected value was the sample mean Ȳ = (1/n) Σ Y_i
  - But this is only a good estimate of the true expected value if the sample is an unbiased representation of the population
  - What does the actual estimator tell us?

Econ 140 Lecture 6, Slide 5: BLUE
- We need to consider the properties of Ȳ as a point estimator of µ
- Three properties of an estimator: BLUE
  - Best (efficiency)
  - Linearity
  - Unbiasedness
  - (also Consistency)
- We'll look at linearity first, then unbiasedness and efficiency

Econ 140 Lecture 6, Slide 6: BLUE: Linearity
- Ȳ is a linear function of the sample observations
- The values of Y are added up in a linear fashion, so that every Y_i appears with equal weight 1/n: Ȳ = Σ (1/n) Y_i
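The linearity property can be checked numerically. A minimal sketch (the population parameters and seed below are made up for illustration): the sample mean is exactly a weighted sum of the observations with equal weights 1/n.

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(loc=5.0, scale=2.0, size=100)  # hypothetical sample

# Ybar is linear in the observations: every Y_i gets the same weight 1/n.
n = len(y)
weights = np.full(n, 1.0 / n)
ybar_linear = weights @ y  # sum of (1/n) * Y_i

print(ybar_linear)
```

The dot product with constant weights reproduces `y.mean()` exactly, which is the sense in which the estimator is "linear".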

Econ 140 Lecture 6, Slide 7: BLUE: Unbiasedness
- Proving that Ȳ is an unbiased estimator of µ
- We can rewrite the equation for Ȳ as Ȳ = Σ c_i Y_i, with c_i = 1/n
  - This expression says that each Y_i has an equal weight of 1/n
- Since c_i is a constant, the expectation of Ȳ is E(Ȳ) = Σ c_i E(Y_i) = n (1/n) µ = µ
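Unbiasedness can be checked with a quick Monte Carlo sketch (the population parameters and sample size below are hypothetical): averaging the sample means from many independent samples should recover µ.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n = 10.0, 3.0, 25  # hypothetical population and sample size

# Draw 20,000 independent samples of size n and compute each sample mean.
sample_means = rng.normal(mu, sigma, size=(20_000, n)).mean(axis=1)

# E[Ybar] = mu, so the grand average of the sample means should sit near mu.
print(sample_means.mean())
```

Any single sample mean misses µ, but the average over repeated samples does not drift away from it; that is what unbiasedness means.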

Econ 140 Lecture 6, Slide 8: Proving Unbiasedness
- Let's examine an estimator that is biased and inefficient
- We can define some other estimator m as m = Σ c_i' Y_i, with weights c_i' that need not equal 1/n
- We can then plug the weights c_i' into the equation for m and take its expectation: E(m) = µ Σ c_i'
- The expected value of this new estimator m is biased if Σ c_i' ≠ 1

Econ 140 Lecture 6, Slide 9: BLUE: Best (Efficiency)
- To look at efficiency, we want to consider the variance of Ȳ
- We can redefine Ȳ as Ȳ = Σ c_i Y_i, with c_i = 1/n
- The variance can be written as V(Ȳ) = Σ_i c_i² V(Y_i) + Σ_h Σ_i c_h c_i C(Y_h, Y_i)
  - The last term is the covariance term
  - The covariance term cancels out because we are assuming the sample was constructed under independence, so there should be no covariance between the Y values
  - Note: we'll see later in the semester that the covariance will not always be zero

Econ 140 Lecture 6, Slide 10: BLUE: Best (Efficiency) (2)
- So how did we get the equation for the variance of Ȳ?
- With independent observations and c_i = 1/n: V(Ȳ) = Σ (1/n)² V(Y_i) = n (1/n²) σ² = σ²/n

Econ 140 Lecture 6, Slide 11: Variance
- Our expression for variance shows that the variance of Ȳ depends on the sample size n
- How is this different from the variance of Y?

Econ 140 Lecture 6, Slide 12: Variance (2)
- Before, when we were considering the distribution around µ_y, we were considering the distribution of Y
- Now we are considering Ȳ as a point estimator for µ_y
  - The estimator Ȳ will have its own probability distribution, much like Y had its own
  - The difference is that the distribution of Ȳ has a variance of σ²/n, whereas Y has a variance of σ²
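The contrast between V(Y) = σ² and V(Ȳ) = σ²/n can be simulated directly (the values of µ, σ, and n below are made up so that σ²/n works out to a round 0.25):

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n = 0.0, 2.0, 16  # hypothetical values: sigma^2 = 4, sigma^2/n = 0.25

samples = rng.normal(mu, sigma, size=(50_000, n))

var_y = samples.var()                   # variance of Y itself: near sigma^2
var_ybar = samples.mean(axis=1).var()   # variance of Ybar: near sigma^2 / n
print(var_y, var_ybar)
```

The individual draws keep the full population variance, while the sample means cluster n times more tightly around µ.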

Econ 140 Lecture 6, Slide 13: Proving Efficiency
- The variance of m looks like this: V(m) = Σ_i c_i'² V(Y_i) + Σ_h Σ_i c_h' c_i' C(Y_h, Y_i)
- Why is this not the most efficient estimator?
- We have an inefficient estimator if we use anything other than c_i = 1/n for the weights
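Efficiency can be illustrated by pitting the equal-weight sample mean against another linear unbiased estimator with unequal weights. The particular unequal weights below are arbitrary, chosen only so that they sum to one (keeping the estimator unbiased):

```python
import numpy as np

rng = np.random.default_rng(3)
n, sigma = 10, 1.0
samples = rng.normal(0.0, sigma, size=(50_000, n))

w_equal = np.full(n, 1.0 / n)                      # c_i = 1/n: the sample mean
w_unequal = np.array([0.3, 0.2] + [0.5 / 8] * 8)   # sums to 1, but unequal

var_equal = (samples @ w_equal).var()      # theory: sigma^2 / n = 0.10
var_unequal = (samples @ w_unequal).var()  # theory: sigma^2 * sum(w^2) ~ 0.161
print(var_equal, var_unequal)
```

Both estimators are unbiased, but only the equal weights minimize Σ c_i² subject to Σ c_i = 1, so the sample mean has the smaller variance: it is "best" among linear unbiased estimators.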

Econ 140 Lecture 6, Slide 14: Consistency
- This isn't directly a part of BLUE
- The idea is that an optimal estimator is best, linear, and unbiased
- But an estimator can be biased or unbiased and still be consistent
- Consistency means that with repeated sampling, as the sample size grows the estimator tends toward the true population value

Econ 140 Lecture 6, Slide 15: Consistency (2)
- We write our estimator of µ as Ȳ = (1/n) Σ Y_i
- We can write a second estimator of µ as Y* = Σ Y_i / (n + 1)
- The expected value of Y* is E(Y*) = [n/(n + 1)] µ

Econ 140 Lecture 6, Slide 16: Consistency (3)
- If n is small, say 10:
  - Y* will be a biased estimator of µ, since E(Y*) = (10/11) µ
  - But Y* will be a consistent estimator
  - So as n approaches infinity, Y* becomes an unbiased estimator of µ
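The biased-but-consistent behaviour can be simulated with the textbook estimator Y* = Σ Y_i / (n + 1) (the population mean of 5 below is hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)
mu = 5.0

def y_star(sample):
    # Biased-but-consistent estimator: E[Y*] = n/(n+1) * mu.
    return sample.sum() / (len(sample) + 1)

# Small n: the bias mu/(n+1) is clearly visible (average over many samples).
est_small = np.mean([y_star(rng.normal(mu, 1.0, 10)) for _ in range(2_000)])
# Large n: the bias is negligible.
est_large = y_star(rng.normal(mu, 1.0, 200_000))
print(est_small, est_large)
```

With n = 10 the estimator centers on (10/11)·5 ≈ 4.55 rather than 5, but with a very large n it lands essentially on µ: the bias vanishes as n grows.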

Econ 140 Lecture 6, Slide 17: Law of Large Numbers
- Think of this picture: as you draw samples of larger and larger size, the law of large numbers says that the sample mean becomes a better approximation of µ_y
- The law only holds if you are drawing random samples
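The "picture" described on this slide can be reproduced as a running sample mean; the skewed population below (exponential with mean 2) is a made-up example:

```python
import numpy as np

rng = np.random.default_rng(5)
mu = 2.0
draws = rng.exponential(scale=mu, size=100_000)  # skewed population, mean mu

# Sample mean recomputed after each additional random draw.
running_mean = np.cumsum(draws) / np.arange(1, draws.size + 1)
print(running_mean[9], running_mean[-1])  # noisy early on, near mu at the end
```

Plotting `running_mean` against the sample size traces the classic picture: wild swings for small n, then a curve settling onto µ_y as n grows.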

Econ 140 Lecture 6, Slide 18: Central Limit Theorem
- Even if the underlying population is not normally distributed, the sampling distribution of the mean tends to normality as the sample size increases
  - For small samples (n < 30), the normal approximation may be unreliable unless the population itself is near-normal

Econ 140 Lecture 6, Slide 19: What Have We Done Today?
- Examined the properties of an estimator for an unknown population mean
- Desirable properties are BLUE: Best Linear Unbiased Estimator
- These should also include consistency