FIN357 Li1 Multiple Regression Analysis y = β0 + β1x1 + β2x2 + … + βkxk + u. 1. Estimation.

FIN357 Li1 Multiple Regression Analysis y = β0 + β1x1 + β2x2 + … + βkxk + u. 1. Estimation

FIN357 Li2 Similar to Simple Regression
β0 is still the intercept.
β1 to βk are all called slope parameters.
u is still the error term (or disturbance).
We still assume that E(u|x1, x2, …, xk) = 0.
We still minimize the sum of squared residuals, so we have k+1 first-order conditions.
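As an illustrative sketch (not from the slides; the data, seed, and coefficient values are made up), the k+1 first-order conditions X'(y − Xb) = 0 solve to the familiar OLS formula b = (X'X)^{-1} X'y:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 200, 2
# Design matrix: a column of ones (the intercept) plus k regressors
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])
beta_true = np.array([1.0, 2.0, -0.5])
y = X @ beta_true + rng.normal(size=n)  # u is standard normal noise

# The k+1 first-order conditions X'(y - Xb) = 0 solve to b = (X'X)^{-1} X'y
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
residuals = y - X @ beta_hat

print(beta_hat)         # close to beta_true
print(X.T @ residuals)  # essentially zero: the first-order conditions hold
```

Because the intercept column of ones is included, the first of these conditions forces the residuals to sum to zero.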

FIN357 Li3 Interpreting the coefficient

FIN357 Li4 Simple vs. Multiple Regression Estimates
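The body of this slide was an image and did not survive the transcript. As a hedged sketch of the standard comparison (simulated data; the coefficient choices are my own): when x1 and x2 are correlated, the simple-regression slope of y on x1 alone differs from the multiple-regression slope, and the multiple-regression slope can be recovered by first partialling x2 out of x1 (the Frisch-Waugh result):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)  # x2 is correlated with x1
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

# Simple-regression slope of y on x1 alone (absorbs part of x2's effect)
b_simple = np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)

# Multiple regression of y on an intercept, x1, and x2
X = np.column_stack([np.ones(n), x1, x2])
b_multi = np.linalg.lstsq(X, y, rcond=None)[0]

# Frisch-Waugh: residualize x1 on [1, x2], then regress y on that residual
Z = np.column_stack([np.ones(n), x2])
r1 = x1 - Z @ np.linalg.lstsq(Z, x1, rcond=None)[0]
b_fwl = (r1 @ y) / (r1 @ r1)

print(b_simple)           # roughly 2 + 3*0.5 = 3.5, not 2
print(b_multi[1], b_fwl)  # both close to 2, and equal to each other
```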

FIN357 Li5 Goodness-of-Fit

FIN357 Li6 Goodness-of-Fit (continued)
We can compute the fraction of the total sum of squares (SST) that is explained by the model, where SSE is the explained sum of squares and SSR is the residual sum of squares:
R² = SSE/SST = 1 − SSR/SST
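A small numeric sketch (simulated data; the variable names and values are my own, not the slides'): with an intercept included, SST = SSE + SSR, so the two formulas for R² agree:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = 1.0 + 2.0 * x1 - 0.5 * x2 + rng.normal(size=n)

X = np.column_stack([np.ones(n), x1, x2])
beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
y_hat = X @ beta_hat

sst = np.sum((y - y.mean()) ** 2)      # total sum of squares
sse = np.sum((y_hat - y.mean()) ** 2)  # explained sum of squares
ssr = np.sum((y - y_hat) ** 2)         # residual sum of squares

r2 = sse / sst
print(r2, 1 - ssr / sst)  # the two expressions match
```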

FIN357 Li7 Too Many or Too Few Variables
What happens if we include variables in our specification that don't belong? The OLS estimators remain unbiased, and our tests will still be valid, but less powerful (t-statistics may be smaller in magnitude and less likely to detect a significant relationship).
What if we exclude a variable from our specification that does belong? The OLS estimators will usually be biased.

FIN357 Li8 Omitted Variable Bias

FIN357 Li9 Omitted Variable Bias (cont)

FIN357 Li10 Omitted Variable Bias Summary
Two cases where there is no bias when using a simple regression:
1. β2 = 0, that is, x2 doesn't really belong in the model.
2. x1 and x2 are uncorrelated in the sample.
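A simulation sketch (all numbers here are my own choices, not the slides'): if the truth is y = 1 + 2·x1 + β2·x2 + u but we regress y on x1 alone, the short-regression slope picks up β2 times the slope of x2 on x1 (here 3.0 × 0.8 = 2.4 of bias), and the bias disappears in exactly the two cases listed above:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000  # large n so sampling noise is negligible

def short_slope(x1, x2, beta2):
    """Slope of y on x1 alone when the truth is y = 1 + 2*x1 + beta2*x2 + u."""
    y = 1.0 + 2.0 * x1 + beta2 * x2 + rng.normal(size=n)
    return np.cov(x1, y)[0, 1] / np.var(x1, ddof=1)

x1 = rng.normal(size=n)
x2_corr = 0.8 * x1 + rng.normal(size=n)  # x2 correlated with x1 (slope 0.8)
x2_uncorr = rng.normal(size=n)           # x2 independent of x1

b_biased = short_slope(x1, x2_corr, 3.0)    # near 2 + 3*0.8 = 4.4: biased
b_nobias1 = short_slope(x1, x2_corr, 0.0)   # beta2 = 0: near 2, no bias
b_nobias2 = short_slope(x1, x2_uncorr, 3.0) # uncorrelated: near 2, no bias
print(b_biased, b_nobias1, b_nobias2)
```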

FIN357 Li11 Assumptions for Unbiasedness
1. The population model is linear in parameters: y = β0 + β1x1 + β2x2 + … + βkxk + u.
2. We use a random sample of size n, {(xi1, xi2, …, xik, yi): i = 1, 2, …, n}, from the population.
3. E(u|x1, x2, …, xk) = 0.
4. None of the x's is constant, and there are no exact linear relationships among the x's.

FIN357 Li12 Given the above 4 assumptions, the OLS estimators of the coefficients will be unbiased.

FIN357 Li13 Variance of the OLS Estimators
Assume Var(u|x1, x2, …, xk) = σ² (homoskedasticity).
The 4 assumptions for unbiasedness (previous page) plus this homoskedasticity assumption are known as the Gauss-Markov assumptions.
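Under homoskedasticity, Var(b | X) = σ²(X'X)^{-1}, with σ² estimated by SSR/(n − k − 1). A sketch with simulated data (the choice σ² = 4 and the coefficients are mine, not the slides'):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])  # intercept + 2 regressors
sigma2 = 4.0
y = X @ np.array([1.0, 2.0, -0.5]) + rng.normal(scale=np.sqrt(sigma2), size=n)

beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
resid = y - X @ beta_hat

# Unbiased estimator of sigma^2 uses n - (k+1) degrees of freedom
sigma2_hat = resid @ resid / (n - X.shape[1])

# Var(beta_hat | X) = sigma^2 (X'X)^{-1} under homoskedasticity
var_beta = sigma2_hat * np.linalg.inv(X.T @ X)
se = np.sqrt(np.diag(var_beta))

print(sigma2_hat)  # close to 4
print(se)          # standard errors of the three coefficients
```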

FIN357 Li14 The Gauss-Markov Theorem
Given our 5 Gauss-Markov assumptions, it can be shown that OLS is "BLUE": the Best Linear Unbiased Estimator. Thus, if the assumptions hold, use OLS.