CHAPTER 8 MULTIPLE REGRESSION ANALYSIS: THE PROBLEM OF INFERENCE


ECONOMETRICS I. Textbook: Damodar N. Gujarati (2004) Basic Econometrics, 4th edition, The McGraw-Hill Companies

8.1 THE NORMALITY ASSUMPTION ONCE AGAIN We continue to assume that the disturbances ui follow the normal distribution with zero mean and constant variance σ2. The OLS estimators of the partial regression coefficients are best linear unbiased estimators (BLUE) by the Gauss–Markov theorem even without normality; what the normality assumption adds is that the estimators are themselves normally distributed, so their exact sampling distributions are known and the t and F tests of this chapter can be applied.
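The role the normality assumption plays can be seen in a small Monte Carlo sketch (not from the slides; all numbers are invented for illustration): with normal disturbances, the OLS slope estimator is unbiased, and roughly 95% of its draws fall within 1.96 true standard errors of the true value.

```python
import numpy as np

rng = np.random.default_rng(0)
n, reps = 50, 2000
beta1, beta2, sigma = 1.0, 2.0, 1.0          # invented true parameters
x = np.arange(n, dtype=float)                # fixed regressor values
sxx = np.sum((x - x.mean()) ** 2)
se_b2 = sigma / np.sqrt(sxx)                 # true standard error of the OLS slope

estimates = []
for _ in range(reps):
    u = rng.normal(0.0, sigma, n)            # normal disturbances (the assumption)
    y = beta1 + beta2 * x + u
    b2 = np.sum((x - x.mean()) * (y - y.mean())) / sxx   # OLS slope estimate
    estimates.append(b2)
estimates = np.array(estimates)

# Share of draws within 1.96 standard errors of the true slope (should be near 0.95)
coverage = np.mean(np.abs(estimates - beta2) <= 1.96 * se_b2)
```

Unbiasedness holds with or without normality; it is the near-exact 95% coverage that reflects the normal sampling distribution.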

8.2 EXAMPLE 8.1: CHILD MORTALITY EXAMPLE REVISITED

8.3 HYPOTHESIS TESTING IN MULTIPLE REGRESSION: GENERAL COMMENTS

8.4 HYPOTHESIS TESTING ABOUT INDIVIDUAL REGRESSION COEFFICIENTS

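The individual t test divides each estimated coefficient by its standard error and compares the result with the t distribution with n − k degrees of freedom. A hypothetical numerical sketch (the data and coefficient values below are simulated, not from the child mortality example):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 64
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
y = 1.0 + 0.8 * x2 + 0.0 * x3 + rng.normal(size=n)   # invented model; x3 is irrelevant

X = np.column_stack([np.ones(n), x2, x3])
k = X.shape[1]
b = np.linalg.solve(X.T @ X, X.T @ y)                # OLS estimates
resid = y - X @ b
sigma2 = resid @ resid / (n - k)                     # unbiased estimate of sigma^2
se = np.sqrt(np.diag(sigma2 * np.linalg.inv(X.T @ X)))

t_stats = b / se                                     # H0: beta_j = 0 for each j
p_vals = 2 * stats.t.sf(np.abs(t_stats), n - k)      # two-sided p values
```

With these invented parameters the coefficient on x2 should be strongly significant, while x3's p value will typically be large.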

8.5 TESTING THE OVERALL SIGNIFICANCE OF THE SAMPLE REGRESSION

The Analysis of Variance Approach to Testing the Overall Significance of an Observed Multiple Regression: The F Test


Testing the Overall Significance of a Multiple Regression: The F Test


An Important Relationship between R2 and F


An Important Relationship between R2 and F Using the definition R2 = ESS/TSS, the overall F statistic can be written as

F = [R2/(k − 1)] / [(1 − R2)/(n − k)]

This equation shows that F and R2 vary directly. When R2 = 0, F is zero; the larger R2 is, the greater the F value; and in the limit, as R2 approaches 1, F increases without bound. Thus the F test, which is a measure of the overall significance of the estimated regression, is also a test of the significance of R2. In other words, testing the null hypothesis (8.5.9) is equivalent to testing the null hypothesis that the population R2 is zero.
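The identity can be checked numerically. In this sketch (simulated data, invented coefficients), F is computed once from the ANOVA decomposition ESS/RSS and once from R2, and the two agree:

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 40, 3                       # k = number of parameters incl. the intercept
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 0.5, -0.3]) + rng.normal(size=n)   # invented model

b = np.linalg.lstsq(X, y, rcond=None)[0]
yhat = X @ b
tss = np.sum((y - y.mean()) ** 2)  # total sum of squares
rss = np.sum((y - yhat) ** 2)      # residual sum of squares
ess = tss - rss                    # explained sum of squares
r2 = ess / tss

F_anova = (ess / (k - 1)) / (rss / (n - k))          # ANOVA form
F_r2 = (r2 / (k - 1)) / ((1 - r2) / (n - k))         # R-squared form
```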


Testing the Overall Significance of a Multiple Regression in Terms of R2

The “Incremental” or “Marginal” Contribution of an Explanatory Variable


This F value is highly significant: the computed p value is 0.0008.


This F value is highly significant, suggesting that adding FLR to the model significantly increases the ESS and hence the R2 value. FLR should therefore be added to the model. Note also that if you square the t value of the FLR coefficient in the multiple regression (8.2.1), which is (−10.6293)², you obtain the F value of (8.5.17).

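For a single added regressor, the incremental-contribution F statistic equals the square of that regressor's t statistic in the unrestricted model, which is the t² = F relationship noted above for FLR. A simulated check (all data and coefficients invented):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
y = 2.0 + 1.0 * x2 + 0.7 * x3 + rng.normal(size=n)   # invented model

def ols(X, y):
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ b
    return b, resid @ resid                          # estimates and RSS

XR = np.column_stack([np.ones(n), x2])               # restricted: x3 excluded
XU = np.column_stack([np.ones(n), x2, x3])           # unrestricted: x3 included
_, rss_r = ols(XR, y)
bu, rss_ur = ols(XU, y)

k = XU.shape[1]
F = ((rss_r - rss_ur) / 1) / (rss_ur / (n - k))      # one restriction

sigma2 = rss_ur / (n - k)
se3 = np.sqrt(sigma2 * np.linalg.inv(XU.T @ XU)[2, 2])
t3 = bu[2] / se3                                     # t for the added regressor
```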

8.6 TESTING THE EQUALITY OF TWO REGRESSION COEFFICIENTS

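The test statistic for H0: β2 = β3 is t = (b2 − b3)/se(b2 − b3), where the standard error of the difference uses var(b2) + var(b3) − 2 cov(b2, b3) from the estimated variance–covariance matrix. A sketch on simulated data (all names and numbers invented; the null is true in the data-generating process):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 60
x2 = rng.normal(size=n)
x3 = rng.normal(size=n)
y = 1.0 + 0.5 * x2 + 0.5 * x3 + rng.normal(size=n)   # beta2 = beta3 by construction

X = np.column_stack([np.ones(n), x2, x3])
k = X.shape[1]
b = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ b
V = (resid @ resid / (n - k)) * np.linalg.inv(X.T @ X)   # var-cov matrix of b

# t = (b2 - b3) / sqrt(var(b2) + var(b3) - 2 cov(b2, b3)), df = n - k
se_diff = np.sqrt(V[1, 1] + V[2, 2] - 2 * V[1, 2])
t = (b[1] - b[2]) / se_diff
```

The covariance term matters: ignoring it misstates the standard error of the difference whenever the two estimates are correlated.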

8.7 RESTRICTED LEAST SQUARES: TESTING LINEAR EQUALITY RESTRICTIONS

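A common linear equality restriction is constant returns to scale, β2 + β3 = 1, in a log-linear Cobb–Douglas-type model. The restricted model can be estimated by substitution, and the F statistic compares the restricted and unrestricted residual sums of squares. A simulated sketch (data invented; the restriction holds in the data-generating process):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 80
lnx2 = rng.normal(size=n)
lnx3 = rng.normal(size=n)
# invented log-linear model with b2 + b3 = 1
lny = 0.3 + 0.6 * lnx2 + 0.4 * lnx3 + rng.normal(scale=0.2, size=n)

def rss(X, y):
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ b
    return r @ r

XU = np.column_stack([np.ones(n), lnx2, lnx3])       # unrestricted model
rss_ur = rss(XU, lny)

# restricted model via substitution: ln y - ln x3 = b1 + b2 (ln x2 - ln x3) + u
XR = np.column_stack([np.ones(n), lnx2 - lnx3])
rss_r = rss(XR, lny - lnx3)

m, k = 1, 3                                          # m restrictions, k parameters
F = ((rss_r - rss_ur) / m) / (rss_ur / (n - k))
```

Because the restricted fit minimizes RSS over a subset of the parameter space, RSS_R ≥ RSS_UR always, so F is nonnegative by construction.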

General F Testing

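The general F test of m linear restrictions Rβ = r can also be computed in Wald form, F = (Rb − r)′[R(X′X)⁻¹R′ σ̂²]⁻¹(Rb − r)/m, which is algebraically identical to the restricted-versus-unrestricted RSS comparison. A simulated check for the joint exclusion H0: β2 = β3 = 0, whose restricted model is intercept-only (all data invented):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 45
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 0.4, -0.2]) + rng.normal(size=n)   # invented model

k = X.shape[1]
b = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ b
rss_ur = resid @ resid
sigma2 = rss_ur / (n - k)

R = np.array([[0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])       # H0: beta2 = 0 and beta3 = 0
r = np.zeros(2)
m = R.shape[0]
d = R @ b - r
F_wald = d @ np.linalg.solve(R @ np.linalg.inv(X.T @ X) @ R.T * sigma2, d) / m

rss_r = np.sum((y - y.mean()) ** 2)   # restricted model: intercept only, RSS = TSS
F_rss = ((rss_r - rss_ur) / m) / (rss_ur / (n - k))
```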

8.8 TESTING FOR STRUCTURAL OR PARAMETER STABILITY OF REGRESSION MODELS: THE CHOW TEST

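The Chow test compares the pooled-sample RSS (parameters restricted to be equal across subperiods) with the sum of the subperiod RSSs, using F with (k, n1 + n2 − 2k) degrees of freedom. A simulated sketch (two invented "periods" with deliberately different parameters, so the test should reject):

```python
import numpy as np

rng = np.random.default_rng(7)
n1, n2, k = 30, 30, 2                 # k parameters per subperiod regression
x1 = rng.normal(size=n1)
x2 = rng.normal(size=n2)
y1 = 1.0 + 0.5 * x1 + rng.normal(scale=0.5, size=n1)   # period 1 (invented)
y2 = 3.0 + 1.5 * x2 + rng.normal(scale=0.5, size=n2)   # period 2: structural break

def rss(x, y):
    X = np.column_stack([np.ones(len(x)), x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    r = y - X @ b
    return r @ r

# restricted: one regression on the pooled sample
s_pooled = rss(np.concatenate([x1, x2]), np.concatenate([y1, y2]))
# unrestricted: separate regressions for each subperiod
s1, s2 = rss(x1, y1), rss(x2, y2)

F = ((s_pooled - (s1 + s2)) / k) / ((s1 + s2) / (n1 + n2 - 2 * k))
```

A large F rejects parameter stability; with this deliberately broken data-generating process the statistic far exceeds conventional critical values.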