CHAPTER 3: TWO VARIABLE REGRESSION MODEL: THE PROBLEM OF ESTIMATION


ECONOMETRICS I. Textbook: Damodar N. Gujarati (2004), Basic Econometrics, 4th edition, The McGraw-Hill Companies.

3.1 THE METHOD OF ORDINARY LEAST SQUARES
PRF: Yi = β1 + β2Xi + ui
SRF: Yi = β̂1 + β̂2Xi + ûi, so that ûi = Yi − Ŷi
How is the SRF determined? We do not minimize the sum of the residuals. Why not? Because positive and negative residuals cancel: Σûi can be zero even when the individual residuals are large and the line fits the data poorly.

3.1 THE METHOD OF ORDINARY LEAST SQUARES
We adopt the least-squares criterion: we want to minimize the sum of the squared residuals,
Σûi² = Σ(Yi − β̂1 − β̂2Xi)²,
which is a function of the estimated parameters β̂1 and β̂2. Differentiating with respect to β̂1 and β̂2 and setting the derivatives to zero yields the normal equations:
ΣYi = nβ̂1 + β̂2ΣXi
ΣXiYi = β̂1ΣXi + β̂2ΣXi²

3.1 THE METHOD OF ORDINARY LEAST SQUARES
Solving the normal equations simultaneously, we obtain the following:
β̂2 = Σ(Xi − X̄)(Yi − Ȳ) / Σ(Xi − X̄)² = Σxiyi / Σxi², where xi = Xi − X̄ and yi = Yi − Ȳ are deviations from the sample means
β̂1 = Ȳ − β̂2X̄
β̂2 can alternatively be expressed in terms of raw sums:
β̂2 = (nΣXiYi − ΣXiΣYi) / (nΣXi² − (ΣXi)²)
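The two expressions for β̂2 can be checked against each other numerically. A minimal sketch in Python; the X and Y values here are made up for illustration, not taken from the textbook:

```python
# Numerical check of the OLS formulas, using made-up data.
X = [1, 2, 3, 4, 5]
Y = [2, 3, 5, 4, 6]
n = len(X)
x_bar = sum(X) / n
y_bar = sum(Y) / n

# beta2_hat = sum(x_i * y_i) / sum(x_i^2), with x_i, y_i in deviation form
b2 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(X, Y)) / \
     sum((xi - x_bar) ** 2 for xi in X)
# beta1_hat = Y_bar - beta2_hat * X_bar
b1 = y_bar - b2 * x_bar

# Equivalent "raw sums" form of beta2_hat
b2_alt = (n * sum(xi * yi for xi, yi in zip(X, Y)) - sum(X) * sum(Y)) / \
         (n * sum(xi ** 2 for xi in X) - sum(X) ** 2)

print(b1, b2)  # intercept and slope; both formulas for b2 agree
```

For this sample, both formulas give β̂2 = 0.9 and β̂1 = 1.3.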

Three Statistical Properties of OLS Estimators I. The OLS estimators are expressed solely in terms of the observable quantities X and Y; therefore they can easily be computed. II. They are point estimators (not interval estimators): given the sample, each estimator provides only a single (point) value of the relevant population parameter. III. Once the OLS estimates are obtained from the sample data, the sample regression line can be easily obtained.

The properties of the regression line
1. It passes through the sample means of Y and X, i.e. through the point (X̄, Ȳ).
2. The mean of the fitted values Ŷi equals the mean of the actual values: mean(Ŷi) = Ȳ.
3. The mean value of the residuals ûi is zero: Σûi = 0.
4. The residuals are uncorrelated with the fitted values: ΣûiŶi = 0.
5. The residuals are uncorrelated with Xi: ΣûiXi = 0.
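All of these properties follow mechanically from the normal equations, so they can be verified on any sample. A short sketch with made-up data:

```python
# Numerical check of the regression-line properties, using made-up data.
X = [1, 2, 3, 4, 5]
Y = [2, 3, 5, 4, 6]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
b2 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(X, Y)) / \
     sum((xi - x_bar) ** 2 for xi in X)
b1 = y_bar - b2 * x_bar

Y_hat = [b1 + b2 * xi for xi in X]              # fitted values
u_hat = [yi - yh for yi, yh in zip(Y, Y_hat)]   # residuals

assert abs((b1 + b2 * x_bar) - y_bar) < 1e-9    # line passes through (X_bar, Y_bar)
assert abs(sum(Y_hat) / n - y_bar) < 1e-9       # mean of fitted Y equals mean of Y
assert abs(sum(u_hat)) < 1e-9                   # residuals average to zero
assert abs(sum(u * yh for u, yh in zip(u_hat, Y_hat))) < 1e-9  # sum(u_hat * Y_hat) = 0
assert abs(sum(u * xi for u, xi in zip(u_hat, X))) < 1e-9      # sum(u_hat * X) = 0
print("all five properties hold")
```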

3.2 The Classical Linear Regression Model: The Assumptions Underlying the Method of Least Squares
Assumption 1: The regression model is linear in the parameters.
Assumption 2: X values are fixed in repeated sampling (X is non-stochastic).
Assumption 3: The disturbance term has zero mean given Xi: E(ui | Xi) = 0.
Assumption 4: Homoscedasticity: var(ui | Xi) = σ², the same for all i.
Assumption 5: No autocorrelation between the disturbances: cov(ui, uj | Xi, Xj) = 0 for i ≠ j.
Assumption 6: Zero covariance between ui and Xi: cov(ui, Xi) = 0.
Assumption 7: The number of observations n must be greater than the number of parameters to be estimated.
Assumption 8: Variability in X values: the Xi values in a given sample must not all be the same.
Assumption 9: The regression model is correctly specified.
Assumption 10: There is no perfect multicollinearity, i.e. no exact linear relationship among the explanatory variables. Example of perfect multicollinearity: X1 = 2X2 + X3. For instance, X2 = 2 and X3 = 1 give X1 = 5 exactly; X1 then carries no information beyond X2 and X3, and its separate effect cannot be estimated.
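Perfect multicollinearity makes the normal equations unsolvable because the X'X matrix becomes singular. A sketch of this with made-up data (the particular X2 and X3 values are illustrative):

```python
import numpy as np

# When X1 = 2*X2 + X3 holds exactly, one column of the design matrix is a
# linear combination of the others, so X'X is singular and (X'X)^(-1)
# does not exist: OLS has no unique solution.
X2 = np.array([2.0, 4.0, 8.0, 1.0, 6.0])
X3 = np.array([1.0, 2.0, 3.0, 5.0, 4.0])
X1 = 2 * X2 + X3                          # exact linear dependence
ones = np.ones_like(X1)

X = np.column_stack([ones, X1, X2, X3])   # design matrix with intercept
XtX = X.T @ X

print(np.linalg.matrix_rank(XtX))         # rank 3, not 4: one column is redundant
```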

PRECISION OR STANDARD ERRORS OF LEAST SQUARES ESTIMATES
var(β̂2) = σ² / Σxi²    se(β̂2) = σ / √(Σxi²)
var(β̂1) = σ²ΣXi² / (nΣxi²)    se(β̂1) = √(var(β̂1))
where var denotes the variance, se the standard error, σ² the constant homoscedastic variance of ui, and xi = Xi − X̄. Since σ² is unknown, it is replaced by its OLS estimator σ̂² = Σûi² / (n − 2); σ̂ is known as the standard error of the estimate (the standard error of the regression).
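The variance and standard-error formulas above can be computed directly from a sample. A minimal sketch with made-up data:

```python
import math

# Standard errors of the OLS estimates, using made-up data.
X = [1, 2, 3, 4, 5]
Y = [2, 3, 5, 4, 6]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
Sxx = sum((xi - x_bar) ** 2 for xi in X)        # sum of x_i^2 in deviation form
b2 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(X, Y)) / Sxx
b1 = y_bar - b2 * x_bar

u_hat = [yi - (b1 + b2 * xi) for xi, yi in zip(X, Y)]
sigma2_hat = sum(u ** 2 for u in u_hat) / (n - 2)   # estimator of sigma^2, n-2 df

var_b2 = sigma2_hat / Sxx                            # var(b2) = sigma^2 / sum(x_i^2)
var_b1 = sigma2_hat * sum(xi ** 2 for xi in X) / (n * Sxx)
se_b1, se_b2 = math.sqrt(var_b1), math.sqrt(var_b2)
print(se_b1, se_b2)
```

Note the divisor n − 2: two degrees of freedom are lost in estimating β̂1 and β̂2.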

Gauss-Markov Theorem An estimator, say the OLS estimator β̂2, is said to be a best linear unbiased estimator (BLUE) of β2 if the following hold:
1. It is linear, i.e. a linear function of a random variable such as the dependent variable Y.
2. It is unbiased: its expected value E(β̂2) equals the true value β2.
3. It has minimum variance in the class of all linear unbiased estimators.
Given the assumptions of the classical linear regression model, the least-squares estimators are BLUE.

The coefficient of determination r²
The total variation in Y decomposes as TSS = ESS + RSS, where
TSS = Σ(Yi − Ȳ)² (total sum of squares)
ESS = Σ(Ŷi − Ȳ)² (explained sum of squares)
RSS = Σûi² (residual sum of squares)

The coefficient of determination r²
r² = ESS / TSS = 1 − RSS / TSS
The quantity r² thus defined is known as the (sample) coefficient of determination and is the most commonly used measure of the goodness of fit of a regression line. Verbally, r² measures the proportion or percentage of the total variation in Y explained by the regression model. It lies between 0 and 1: r² = 1 means a perfect fit, while r² = 0 means no linear relationship between the regressand and the regressor.
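The decomposition TSS = ESS + RSS, and hence the two equivalent expressions for r², can be verified on any sample. A sketch with made-up data:

```python
# TSS = ESS + RSS and r^2 = ESS/TSS = 1 - RSS/TSS, using made-up data.
X = [1, 2, 3, 4, 5]
Y = [2, 3, 5, 4, 6]
n = len(X)
x_bar, y_bar = sum(X) / n, sum(Y) / n
b2 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(X, Y)) / \
     sum((xi - x_bar) ** 2 for xi in X)
b1 = y_bar - b2 * x_bar
Y_hat = [b1 + b2 * xi for xi in X]

TSS = sum((yi - y_bar) ** 2 for yi in Y)               # total variation in Y
ESS = sum((yh - y_bar) ** 2 for yh in Y_hat)           # explained by the regression
RSS = sum((yi - yh) ** 2 for yi, yh in zip(Y, Y_hat))  # residual (unexplained)

assert abs(TSS - (ESS + RSS)) < 1e-9                   # the decomposition holds
r2 = ESS / TSS
print(r2)
```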

The coefficient of correlation r
r is the sample correlation coefficient, a measure of the degree of linear association between two variables. It can be computed either as r = ±√r² (taking the sign of β̂2) or directly as r = Σxiyi / √(Σxi² Σyi²).

Some of the properties of r
1. It can be positive or negative, the sign depending on the sign of the sample covariance of the two variables.
2. It lies between −1 and +1.
3. It is symmetrical in nature: r(X, Y) = r(Y, X).
4. It is independent of the origin and scale of measurement.
5. It is a measure of linear association only; it is not meaningful for describing nonlinear relations.
6. Although it measures linear association, it does not necessarily imply any cause-and-effect relationship.
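Properties 2 through 4 are easy to demonstrate numerically. A sketch with made-up data:

```python
import math

def corr(a, b):
    """Sample correlation coefficient r = sum(x*y) / sqrt(sum(x^2)*sum(y^2))."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    return cov / math.sqrt(sum((x - ma) ** 2 for x in a) *
                           sum((y - mb) ** 2 for y in b))

X = [1, 2, 3, 4, 5]
Y = [2, 3, 5, 4, 6]
r = corr(X, Y)
assert -1 <= r <= 1                                    # property 2
assert abs(corr(X, Y) - corr(Y, X)) < 1e-12            # property 3: symmetry
X_rescaled = [10 * xi + 7 for xi in X]                 # change of origin and scale
assert abs(corr(X_rescaled, Y) - r) < 1e-12            # property 4: r unchanged
print(r)
```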

Homework
Study the numerical example on pages 87-90. There will be questions on the midterm exam similar to the ones in this example. Data on page 88: