Week 2: Statistical Assumptions for Simple Linear Regression (SLR)


Statistical Assumptions for SLR

Recall, the simple linear regression model is Y_i = β_0 + β_1 X_i + ε_i, for i = 1, …, n.

The assumptions for the simple linear regression model are:
1) E(ε_i) = 0
2) Var(ε_i) = σ² (the same for all i)
3) The ε_i's are uncorrelated.

These assumptions are also called the Gauss-Markov conditions. They can equivalently be stated in terms of the Y's: E(Y_i) = β_0 + β_1 X_i, Var(Y_i) = σ², and the Y_i's are uncorrelated.
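As a minimal sketch of the model under these conditions, the simulation below (all parameter values are illustrative choices, not from the course) generates data satisfying the Gauss-Markov conditions and computes the least-squares estimates directly from their defining formulas:

```python
import numpy as np

rng = np.random.default_rng(0)  # illustrative seed

# Simulate data satisfying the Gauss-Markov conditions:
# errors have mean 0, constant variance sigma^2, and are uncorrelated.
n, beta0, beta1, sigma = 50, 2.0, 3.0, 1.5
x = rng.uniform(0, 10, n)
eps = rng.normal(0, sigma, n)          # E(eps)=0, Var(eps)=sigma^2
y = beta0 + beta1 * x + eps            # Y_i = beta0 + beta1*X_i + eps_i

# Least-squares estimates b0, b1
xbar, ybar = x.mean(), y.mean()
b1 = np.sum((x - xbar) * (y - ybar)) / np.sum((x - xbar) ** 2)
b0 = ybar - b1 * xbar
print(b0, b1)  # should be close to the true values 2.0 and 3.0
```

With n = 50 observations the estimates typically land within a few tenths of the true coefficients.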

Possible Violations of Assumptions

- A straight-line model may be inappropriate for the relationship between X and Y.
- Var(Y_i) may increase with X_i (non-constant error variance).
- A single linear model may not be appropriate for all of the data.
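A quick illustrative check for the second violation, non-constant variance: simulate data whose error standard deviation grows with x (the numbers below are assumptions for illustration) and compare the residual spread at small versus large x:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.uniform(1, 10, n)
# Violation: error standard deviation grows with x (heteroscedasticity)
y = 2 + 3 * x + rng.normal(0, 0.5 * x)

# Fit by least squares, then compare residual spread in low- vs high-x halves
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
resid = y - (b0 + b1 * x)
low, high = resid[x < 5.5], resid[x >= 5.5]
print(low.std(), high.std())  # spread is visibly larger for larger x
```

In practice this pattern is usually spotted from a residual-vs-x plot: the residuals fan out as x grows.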

Properties of Least Squares Estimates

The least-squares estimates b_0 and b_1 are linear in the Y's. That is, there exist constants c_i, d_i such that b_1 = Σ c_i Y_i and b_0 = Σ d_i Y_i. Proof: Exercise.

The least squares estimates are unbiased estimators for β_0 and β_1. Proof:…
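The linearity claim can be verified numerically. For the standard weights c_i = (x_i − x̄)/S_xx and d_i = 1/n − x̄·c_i (which depend only on the x's, not the Y's), the weighted sums Σ c_i Y_i and Σ d_i Y_i reproduce b_1 and b_0 exactly; the simulated data below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30
x = rng.uniform(0, 5, n)
y = 1 + 2 * x + rng.normal(0, 1, n)   # illustrative data

Sxx = np.sum((x - x.mean()) ** 2)
c = (x - x.mean()) / Sxx              # weights giving b1
d = 1 / n - x.mean() * c              # weights giving b0

b1_direct = np.sum((x - x.mean()) * (y - y.mean())) / Sxx
b0_direct = y.mean() - b1_direct * x.mean()

# b1 and b0 are linear combinations of the Y's with fixed weights c_i, d_i
assert np.isclose(np.sum(c * y), b1_direct)
assert np.isclose(np.sum(d * y), b0_direct)
```

Linearity in the Y's is exactly what makes the estimates tractable: their means and variances follow from the rules for linear combinations of random variables.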

Gauss-Markov Theorem

The least-squares estimates are BLUE (Best Linear Unbiased Estimators): of all the possible linear, unbiased estimators of β_0 and β_1, the least-squares estimates have the smallest variance. The variances of the least-squares estimates are Var(b_1) = σ² / Σ(X_i − X̄)² and Var(b_0) = σ² (1/n + X̄² / Σ(X_i − X̄)²).
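The variance formula for b_1 can be checked by Monte Carlo: holding the design (the x's) fixed and redrawing the errors many times, the empirical variance of b_1 across replications should match σ²/S_xx (all simulation settings below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n, beta0, beta1, sigma = 40, 1.0, 2.0, 1.0
x = rng.uniform(0, 10, n)              # fixed design across replications
Sxx = np.sum((x - x.mean()) ** 2)

b1s = []
for _ in range(5000):
    y = beta0 + beta1 * x + rng.normal(0, sigma, n)
    b1s.append(np.sum((x - x.mean()) * (y - y.mean())) / Sxx)

print(np.var(b1s), sigma ** 2 / Sxx)   # empirical vs theoretical Var(b1)
```

With 5000 replications the two numbers agree to within a few percent.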

Estimation of Error Term Variance σ²

The variance σ² of the error terms ε_i needs to be estimated to obtain an indication of the variability of the probability distribution of Y. Further, a variety of inferences concerning the regression function and the prediction of Y require an estimate of σ².

Recall, for a random variable Z, the estimates of the mean and variance of Z based on n realizations of Z are Z̄ = (1/n) Σ Z_i and S² = Σ(Z_i − Z̄)² / (n − 1).

Similarly, the estimate of σ² is S² = Σ e_i² / (n − 2) = SSE / (n − 2), where e_i = Y_i − Ŷ_i are the residuals. S² is called the MSE (Mean Square Error); it is an unbiased estimator of σ² (proof in Chapter 5).
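Unbiasedness of the MSE can also be seen by simulation: averaging SSE/(n − 2) over many replications recovers σ² (the true σ² = 4 below is an illustrative choice):

```python
import numpy as np

rng = np.random.default_rng(4)
n, sigma = 25, 2.0
x = rng.uniform(0, 10, n)

mses = []
for _ in range(4000):
    y = 1 + 2 * x + rng.normal(0, sigma, n)
    b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
    b0 = y.mean() - b1 * x.mean()
    e = y - (b0 + b1 * x)                 # residuals e_i = Y_i - Yhat_i
    mses.append(np.sum(e ** 2) / (n - 2)) # MSE = SSE / (n - 2)

print(np.mean(mses), sigma ** 2)  # average MSE should be close to 4
```

Note the divisor n − 2 rather than n − 1: two degrees of freedom are lost because both β_0 and β_1 are estimated.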

Normal Error Regression Model

In order to make inference we need one more assumption about the ε_i's: we assume that the ε_i's have a Normal distribution, that is, ε_i ~ N(0, σ²). The Normality assumption implies that the errors are independent (since, for jointly Normal random variables, uncorrelated is equivalent to independent). Under the Normality assumption on the errors, the least-squares estimates of β_0 and β_1 are equivalent to their maximum likelihood estimators. This gives them the additional nice properties of MLEs: they are consistent, sufficient and MVUE.
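The equivalence can be demonstrated numerically: with Normal errors, maximizing the likelihood in (β_0, β_1) is the same as minimizing the sum of squared residuals, so a numerical maximizer of the likelihood lands on the closed-form least-squares solution (the data below is simulated for illustration):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(5)
n = 60
x = rng.uniform(0, 10, n)
y = 1.5 + 0.8 * x + rng.normal(0, 1.0, n)  # illustrative data

# Closed-form least-squares estimates
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# Under Normal errors, maximizing the log-likelihood over (beta0, beta1)
# is equivalent to minimizing the sum of squared residuals.
def nll(theta):
    resid = y - theta[0] - theta[1] * x
    return np.sum(resid ** 2)

mle = minimize(nll, x0=[0.0, 0.0]).x
assert np.allclose(mle, [b0, b1], atol=1e-3)
```

So under Normality nothing changes computationally; what changes is that exact distributional results (t and F inference) become available.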

Example: Calibrating a Snow Gauge

Researchers wish to measure snow density in mountains using gamma-ray transmission; the detector reading is called the "gain". The measuring device needs to be calibrated, which is done with polyethylene blocks of known density. We want to know what density of snow results in particular readings from the gamma-ray detector. The variables are: Y = gain, X = density. Data: 9 densities in g/cm³, with 10 measurements of gain at each density.
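A minimal sketch of the calibration fit, using simulated stand-in data (the study's actual gain/density measurements are not reproduced here; the density grid, the linear toy model, and all numbers below are assumptions for illustration only):

```python
import numpy as np

rng = np.random.default_rng(6)

# Hypothetical stand-in data: 9 known block densities (g/cm^3), with
# 10 simulated gain readings per density, from a toy linear model.
densities = np.linspace(0.1, 0.7, 9)
x = np.repeat(densities, 10)               # n = 90 observations
y = 400 - 450 * x + rng.normal(0, 15, x.size)

# Least-squares calibration line: gain = b0 + b1 * density
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()

# Inverse use of the calibration: what density produces a reading of 200?
density_hat = (200 - b0) / b1
print(b0, b1, density_hat)
```

The inversion in the last step is the point of calibration: the line is fit with density as X, but in the field the gauge reports a gain and the density must be read back off the fitted line.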