
Chapter 6 Introduction to Multiple Regression

2 Outline
1. Omitted variable bias
2. Causality and regression analysis
3. Multiple regression and OLS
4. Measures of fit
5. Sampling distribution of the OLS estimator

3 Omitted Variable Bias (SW Section 6.1)

4 Omitted variable bias, ctd.


8 The omitted variable bias formula:
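The formula itself did not survive the transcript (it was an image on the slide). In Stock and Watson's notation, the standard statement is that if the omitted variable is correlated with the regressor, OLS is inconsistent:

```latex
\hat{\beta}_1 \;\xrightarrow{\;p\;}\; \beta_1 + \rho_{Xu}\,\frac{\sigma_u}{\sigma_X}
```

Here ρ_Xu is the correlation between X and the error u (which absorbs the omitted variable). The bias is nonzero exactly when the omitted variable is both a determinant of Y and correlated with X.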


10 Digression on causality and regression analysis

11 Ideal Randomized Controlled Experiment
- Ideal: subjects all follow the treatment protocol – perfect compliance, no errors in reporting, etc.!
- Randomized: subjects from the population of interest are randomly assigned to a treatment or control group (so there are no confounding factors)
- Controlled: having a control group permits measuring the differential effect of the treatment
- Experiment: the treatment is assigned as part of the experiment; the subjects have no choice, so there is no "reverse causality" in which subjects choose the treatment they think will work best
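The contrast between observational data and randomized assignment can be sketched in a short simulation (entirely synthetic data; the coefficient values are illustrative, not from the slides). When the regressor is correlated with an omitted determinant of Y, the bivariate OLS slope is biased; when the regressor is randomly assigned, the correlation with the omitted variable vanishes and the bias disappears:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Observational data: the regressor x is correlated with an omitted
# variable w, and both affect y (true effect of x on y is 2.0).
w = rng.normal(size=n)
x_obs = 0.8 * w + rng.normal(size=n)          # x and w are correlated
y_obs = 1.0 + 2.0 * x_obs + 3.0 * w + rng.normal(size=n)

def ols_slope(x, y):
    """Slope from a bivariate OLS regression of y on x."""
    x_c, y_c = x - x.mean(), y - y.mean()
    return (x_c @ y_c) / (x_c @ x_c)

biased = ols_slope(x_obs, y_obs)    # picks up part of w's effect: well above 2

# Experimental data: x is randomly assigned, hence independent of w.
x_rand = rng.normal(size=n)
y_rand = 1.0 + 2.0 * x_rand + 3.0 * w + rng.normal(size=n)
unbiased = ols_slope(x_rand, y_rand)          # close to the true value 2.0

print(biased, unbiased)
```

Random assignment plays the same role here that including w as a regressor would: it breaks the correlation between the regressor and the error term.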

12 Back to class size:


14 Return to omitted variable bias

15 The Population Multiple Regression Model (SW Section 6.2)
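The model on this slide (an image in the original) is, in SW's notation, a linear model with k regressors:

```latex
Y_i = \beta_0 + \beta_1 X_{1i} + \beta_2 X_{2i} + \cdots + \beta_k X_{ki} + u_i,
\qquad i = 1,\dots,n
```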

16 Interpretation of coefficients in multiple regression
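The interpretation stated on this slide can be summarized as the partial effect: each slope coefficient is the expected change in Y for a unit change in one regressor, holding the other regressors constant:

```latex
\beta_1 = \frac{\Delta Y}{\Delta X_1}
\quad \text{holding } X_2, \dots, X_k \text{ constant.}
```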


18 The OLS Estimator in Multiple Regression (SW Section 6.3)

19 Example: the California test score data

20 Multiple regression in STATA
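The regression on these two slides (test scores on the student-teacher ratio and the percentage of English learners, using the 420 California districts) can be sketched with synthetic data. The variable names and coefficients below are illustrative stand-ins patterned on the SW example, not the actual California estimates; OLS is computed directly as the least-squares solution:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 420  # the California data set has 420 districts; the data here are synthetic

# Hypothetical stand-ins for the SW variables: student-teacher ratio,
# percent English learners, and the district test score.
str_ratio = rng.uniform(14, 26, size=n)
el_pct = rng.uniform(0, 86, size=n)
testscr = 686.0 - 1.10 * str_ratio - 0.65 * el_pct + rng.normal(0, 14, size=n)

# OLS: stack a column of ones for the intercept and solve
# min_b ||y - Xb||^2 by least squares.
X = np.column_stack([np.ones(n), str_ratio, el_pct])
beta, *_ = np.linalg.lstsq(X, testscr, rcond=None)
print(beta)  # [intercept, coef on str_ratio, coef on el_pct]
```

The estimated coefficients recover the values used to generate the data, up to sampling error.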

21 Measures of Fit for Multiple Regression (SW Section 6.4)

22 SER and RMSE
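The definitions on this slide (an image in the original) are, in SW's notation, both based on the sum of squared residuals; the SER uses a degrees-of-freedom correction while the RMSE divides by n:

```latex
SER = s_{\hat u} = \sqrt{\frac{1}{n-k-1}\sum_{i=1}^{n}\hat u_i^2},
\qquad
RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\hat u_i^2}
```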

23 R² and adjusted R²

24 R² and adjusted R², ctd.
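The formulas these two slides compare (images in the original) are, in SW's notation: the R² never falls when a regressor is added, whereas the adjusted R² penalizes the extra regressor through the degrees-of-freedom factor:

```latex
R^2 = 1 - \frac{SSR}{TSS},
\qquad
\bar R^2 = 1 - \frac{n-1}{n-k-1}\,\frac{SSR}{TSS}
```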

25 Measures of fit, ctd.

26 The Least Squares Assumptions for Multiple Regression (SW Section 6.5)

27 Assumption #1: the conditional mean of u given the included X’s is zero.
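Formally, this first least squares assumption is:

```latex
E\bigl(u_i \mid X_{1i}, X_{2i}, \dots, X_{ki}\bigr) = 0
```

It says that whatever the included regressors are, the other factors collected in u average out to zero, so there is no omitted variable bias from the included X's.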


31 The Sampling Distribution of the OLS Estimator (SW Section 6.6)
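The content of this slide is not in the transcript; the headline result in SW is that, under the least squares assumptions, the OLS estimators are unbiased and consistent, and in large samples each is approximately normally distributed:

```latex
\hat\beta_j \;\overset{approx.}{\sim}\; N\!\bigl(\beta_j,\ \sigma^2_{\hat\beta_j}\bigr),
\qquad j = 0, 1, \dots, k
```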

32 Multicollinearity, Perfect and Imperfect (SW Section 6.7)

33 The dummy variable trap

34 Perfect multicollinearity, ctd.
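The dummy variable trap on these slides can be demonstrated numerically (the three-category grouping below is hypothetical). Including an intercept plus a full set of group indicators makes the regressor matrix rank-deficient, because the indicators sum to the constant column; dropping one indicator restores full rank, with the omitted group as the base category:

```python
import numpy as np

# Hypothetical three-category regressor (e.g., region), 4 observations each.
group = np.array([0, 1, 2] * 4)
dummies = np.eye(3)[group]               # one 0/1 indicator column per group

# Dummy variable trap: intercept + ALL three dummies are perfectly
# collinear (the dummies sum to the constant column), so X'X is singular
# and the OLS coefficients cannot be computed.
X_trap = np.column_stack([np.ones(len(group)), dummies])
trap_rank = np.linalg.matrix_rank(X_trap.T @ X_trap)
print(trap_rank, X_trap.shape[1])        # rank 3 < 4 columns

# Standard fix: drop one dummy; the omitted group becomes the base
# category, and each remaining coefficient is relative to it.
X_ok = np.column_stack([np.ones(len(group)), dummies[:, 1:]])
ok_rank = np.linalg.matrix_rank(X_ok.T @ X_ok)
print(ok_rank, X_ok.shape[1])            # full rank: 3 == 3
```

Equivalent fixes are dropping the intercept instead, or imposing a restriction on the coefficients; statistical packages such as Stata typically drop one collinear column automatically.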

35 Imperfect multicollinearity

36 Imperfect multicollinearity, ctd.
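The consequence of imperfect multicollinearity can be seen in a small Monte Carlo sketch (synthetic data; the design is illustrative). The coefficients remain estimable and unbiased, but when two regressors are highly correlated, the sampling variance of their coefficients is much larger than when the regressors are uncorrelated:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500

def slope_sd(x1, x2, reps=200):
    """Std. dev. of the OLS coefficient on x1 across simulated samples
    from y = 1 + x1 + x2 + noise."""
    b1 = []
    for _ in range(reps):
        y = 1.0 + 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)
        X = np.column_stack([np.ones(n), x1, x2])
        b = np.linalg.lstsq(X, y, rcond=None)[0]
        b1.append(b[1])
    return float(np.std(b1))

x1 = rng.normal(size=n)
x2_indep = rng.normal(size=n)                     # uncorrelated with x1
x2_corr = 0.98 * x1 + 0.02 * rng.normal(size=n)   # corr(x1, x2) near 1

se_indep = slope_sd(x1, x2_indep)
se_corr = slope_sd(x1, x2_corr)
print(se_indep, se_corr)   # the correlated design inflates the std. error
```

Intuitively, when x1 and x2 move together, the data contain little independent variation in x1 with which to pin down its separate effect, so the coefficient is imprecisely estimated even though OLS remains valid.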