Presentation transcript:

Assumptions of Regression Analysis
1. The independent variables do not form a linearly dependent set, i.e., the explanatory variables are not perfectly correlated.
2. Homoscedasticity: the probability distributions of the error term have a constant variance for all values of the independent variables (the X_i's).

Perfect multicollinearity is a violation of assumption (1). Heteroscedasticity is a violation of assumption (2).

Suppose we wanted to estimate the following specification using quarterly time series data:

Auto Sales_t = β_0 + β_1 Income_t + β_2 Prices_t

where Income_t is (nominal) income in quarter t and Prices_t is an index of auto prices in quarter t. Multicollinearity is a problem with time series regression: the data reveal a strong (positive) correlation between nominal income and car prices.

[Figure: scatter plot of (nominal) income against car prices — approximate linear relationship between explanatory variables]

Why is multicollinearity a problem? In the case of perfectly collinear explanatory variables, OLS does not work. In the case where there is an approximate linear relationship among the explanatory variables (the X_i's), the estimates of the coefficients are still unbiased, but you run into the following problems:
– High standard errors of the estimates of the coefficients, and thus low t-ratios.
– Co-mingling of the effects of explanatory variables.
– Estimates of the coefficients tend to be "unstable."
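To make the perfect-collinearity case concrete, here is a minimal pure-Python sketch (the data are made up for illustration): when one regressor is an exact multiple of another, the X'X matrix that OLS must invert is singular (determinant zero), so the OLS estimates do not exist.

```python
# Minimal sketch with hypothetical data: perfect collinearity makes X'X
# singular, so OLS cannot solve for the coefficients.

def xtx_det(x1, x2):
    """Determinant of the 2x2 X'X matrix for a two-regressor, no-intercept model."""
    s11 = sum(a * a for a in x1)
    s22 = sum(b * b for b in x2)
    s12 = sum(a * b for a, b in zip(x1, x2))
    return s11 * s22 - s12 * s12

x1 = [1, 2, 3, 4, 5]
x2_collinear = [2 * a for a in x1]   # exactly 2 * x1: perfectly correlated
x2_distinct = [2, 3, 7, 8, 14]       # no exact linear relationship with x1

print(xtx_det(x1, x2_collinear))  # 0 -> X'X cannot be inverted
print(xtx_det(x1, x2_distinct))   # nonzero -> OLS estimates exist
```

With approximate (rather than perfect) collinearity, the determinant is close to zero instead of exactly zero, which is what inflates the standard errors described above.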

What to do about multicollinearity
– Increase sample size.
– Delete one or more explanatory variables.
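Before deleting a variable, it helps to gauge how severe the near-collinearity is. One common diagnostic (not from the slides, but standard practice) is the variance inflation factor, which for two explanatory variables reduces to 1 / (1 − r²), where r is their correlation. A pure-Python sketch with hypothetical data:

```python
import math

# Sketch with made-up data: the variance inflation factor (VIF) for a
# regressor; with two regressors it is 1 / (1 - r^2), where r is their
# sample correlation. A VIF above the rule-of-thumb cutoff of 10 signals
# troublesome multicollinearity.

def correlation(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

income = [1.0, 2.0, 3.0, 4.0, 5.0]   # hypothetical regressor 1
prices = [1.1, 1.9, 3.2, 3.9, 5.1]   # nearly a linear function of regressor 1

r = correlation(income, prices)
vif = 1.0 / (1.0 - r ** 2)
print(round(r, 3), round(vif, 1))  # r near 1 pushes the VIF far above 10
```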

Understanding heteroscedasticity. This problem pops up when using cross-sectional data.

Consider the following model: Y_i = β_0 + β_1 X_i + ε_i, where β_0 + β_1 X_i is the "determined" part of the equation and ε_i is the error term. Remember that in regression we assume E(ε_i) = 0.
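As a concrete sketch of this decomposition (the numbers are hypothetical), the following fits the "determined" part by least squares and checks the sample analogue of E(ε_i) = 0: when the model includes an intercept, the residuals always sum to zero.

```python
# Sketch with hypothetical data: fit Y_i = b0 + b1 * X_i by least squares
# and verify that the residuals (estimates of the error term) sum to zero.

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 7.8, 10.0]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
b1 = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
b0 = my - b1 * mx

residuals = [b - (b0 + b1 * a) for a, b in zip(x, y)]
print(round(b0, 2), round(b1, 2))   # the fitted "determined" part
print(abs(sum(residuals)) < 1e-9)   # True: residuals average to zero
```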

[Figure: JAR #1 and JAR #2, each centered at μ = 0 — two distributions with the same mean and different variances]

[Figure: error densities f(x) drawn at X_1 and X_2 along the regression line, with the spread widening as X increases — the disturbance distributions under heteroscedasticity]

[Figure: scatter diagram of household income against spending for electronics — ascending heteroscedasticity]

Why is heteroscedasticity a problem? Heteroscedasticity does not give us biased estimates of the coefficients; however, it does make the standard errors of the estimates unreliable. That is, we will understate the standard errors. Because of this, t-tests cannot be trusted: we run the risk of rejecting a null hypothesis that should not be rejected.
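A quick way to see (and screen for) this problem, sketched in pure Python with simulated data: generate errors whose spread grows with X, fit OLS, and compare the residual spread in the low-X and high-X halves of the sample — the idea behind the Goldfeld–Quandt test. All names and numbers here are assumptions for illustration.

```python
import random

# Simulated illustration of heteroscedasticity: the error standard
# deviation grows with x, so after an OLS fit the residual spread is
# larger in the high-x half of the sample.

random.seed(42)
n = 200
x = [1.0 + 9.0 * i / (n - 1) for i in range(n)]              # x runs from 1 to 10
y = [2.0 + 3.0 * a + random.gauss(0.0, 0.8 * a) for a in x]  # error sd rises with x

mx, my = sum(x) / n, sum(y) / n
b1 = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
b0 = my - b1 * mx
resid = [b - (b0 + b1 * a) for a, b in zip(x, y)]

low = sum(e * e for e in resid[: n // 2]) / (n // 2)    # mean squared residual, small x
high = sum(e * e for e in resid[n // 2:]) / (n // 2)    # mean squared residual, large x
print(low < high)  # expect True: residual variance grows with x
```

In practice one would follow such a diagnostic with heteroscedasticity-robust standard errors or weighted least squares rather than trusting the conventional t-tests.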