Welcome to Econ 420 Applied Regression Analysis Study Guide Week Twelve.


Answer Key to Assignment 9 (30 points) 1.11, Page 131 No. The correlation coefficient r is not a slope from a line, as B is. It shows how STUDY and LIBRARY move together on a numerical scale from –1 to 1; B is not confined to such a scale. B shows the change in Y associated with a one-unit change in X. If there is more than one independent variable, B is measured holding the other independent variables constant; when r is calculated, none of the other independent variables in the model are held constant.

Answer Key to Assignment 9 (30 points) 2.13, page 132 a. –The correlation coefficient between INCOME and WEALTH is 0.82, which is high enough to indicate that there could be a multicollinearity problem, but it is not overwhelming evidence. –Running a regression where INCOME is the dependent variable and WEALTH is the independent variable (or vice versa) gives an F-statistic (and a t-statistic) that is statistically significant at a 1% error level. –The R² of that regression is about 0.67 (= 0.82²). This provides only mild support that there is a multicollinearity problem. –The variance inflation factor is 1/(1 – 0.67) ≈ 3. This indicates that multicollinearity is only a small problem, if it is a problem at all. b. There is some evidence of multicollinearity, but it does not seem to be a big problem.
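The arithmetic in part (a) can be checked directly. With only two regressors, the auxiliary R² equals the squared pairwise correlation, so the VIF follows from r alone. A minimal sketch (the function name is ours, not the textbook's):

```python
# Sketch: variance inflation factor implied by a pairwise correlation.
# With exactly two regressors, the auxiliary R-squared is simply r**2.
def vif_from_correlation(r):
    r_squared = r ** 2              # R^2 of regressing one regressor on the other
    return 1.0 / (1.0 - r_squared)  # VIF = 1 / (1 - R^2)

vif = vif_from_correlation(0.82)    # correlation between INCOME and WEALTH
print(round(vif, 2))                # about 3, matching the answer key
```

A VIF near 3 is well below the usual rule-of-thumb alarm levels (5 or 10), which is why the answer key calls this at most a small problem.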

Answer Key to Assignment 9 (30 points) 2.13, page 132 c. INCOME and WEALTH should be correlated to some extent, since most people who have higher income will eventually accumulate more wealth. d. It might seem that the answers are contradictory, but they are not. If you have INCOME and WEALTH in the same model, you might expect multicollinearity, but in this particular data set there is enough independent variation between INCOME and WEALTH that multicollinearity is not a big problem. There must be some people in the data set who have high income but have not accumulated as much wealth as you might expect, and perhaps others who have lower income but more wealth than you would expect, because they are especially thrifty or they inherited wealth. As stated in the chapter, multicollinearity is a characteristic of the data: for any given model, one sample could give results that exhibit multicollinearity while a different sample might not.

Autocorrelation (Chapter 7, up to page 145) Suppose we are using time series data to estimate consumption (C) as a function of income (Y) and other factors: C_t = B_1 + B_2 Y_t + … + e_t, where t = 1, 2, …, T. –This means that C_1 = B_1 + B_2 Y_1 + … + e_1, C_2 = B_1 + B_2 Y_2 + … + e_2, …, C_T = B_1 + B_2 Y_T + … + e_T. One of the classical assumptions regarding the error terms is no correlation among the error terms. If this assumption is violated, autocorrelation becomes a problem.

First Order Autocorrelation e_2 = ρ e_1 + u_2 –That is, the error term in period 2 depends on the error term in period 1 –Where u_2 is a normally distributed error with mean of zero and constant variance

Second Order Autocorrelation e_3 = ρ_1 e_2 + ρ_2 e_1 + u_3 –That is, the error term in period 3 depends on the error term in period 2 and the error term in period 1 –Where u_3 is a normally distributed error with mean of zero and constant variance

Higher Order Autocorrelation e_t = ρ_1 e_{t-1} + ρ_2 e_{t-2} + ρ_3 e_{t-3} + … + u_t –That is, the error term in period t depends on the error term in period t-1, the error term in period t-2, the error term in period t-3, etc. –Where u_t is a normally distributed error with mean of zero and constant variance
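The first-order case of the scheme above (ρ_2 = ρ_3 = … = 0) is easy to simulate, which makes the "pattern in the errors" idea on the next slides concrete. A minimal sketch, with names of our own choosing:

```python
import random

# Sketch: simulate first-order autocorrelated errors e_t = rho * e_{t-1} + u_t,
# where each u_t is a fresh, well-behaved shock (mean 0, constant variance).
def simulate_ar1_errors(rho, T, seed=420):
    rng = random.Random(seed)
    errors = [rng.gauss(0, 1)]               # starting error
    for _ in range(1, T):
        u = rng.gauss(0, 1)                  # today's fresh shock
        errors.append(rho * errors[-1] + u)  # part of yesterday's error carries over
    return errors

positive_ar = simulate_ar1_errors(rho=0.8, T=200)
```

With rho = 0.8 the simulated errors tend to come in runs of the same sign, which is exactly the positive-autocorrelation pattern described below; with rho = 0 they are just independent shocks.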

Types of Autocorrelation 1. Positive –Errors form a pattern: a positive error is usually followed by another positive error, and a negative error is usually followed by another negative error –More common

Example of positive autocorrelation

Types of Autocorrelation 2. Negative –A positive error is usually followed by a negative error, and vice versa –Rare

Example of negative autocorrelation

Causes of Autocorrelation –Wrong functional form –Omitted variables –Data errors –Shocks that linger over time

Consequences of Autocorrelation –The estimates remain unbiased, but the standard errors are wrong –In the case of positive autocorrelation, the estimated standard errors of the coefficients are biased downward –Consequences?

Should we suspect Autocorrelation? If you are using time series data, definitely. It is easy to check: 1. Run the regression 2. Plot the residuals 3. If they appear to form a pattern, suspect autocorrelation

A Formal Test For First Order Autocorrelation: the Durbin-Watson test. It can be shown that the Durbin-Watson statistic (d) is approximately equal to 2(1 − ρ). What is d under perfect positive correlation? ρ = 1, so d = 0. What is d under perfect negative correlation? ρ = −1, so d = 4. What is d under no autocorrelation? ρ = 0, so d = 2. What is the range of values for d? 0 to 4.
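The statistic itself is just a ratio of sums computed from the regression residuals: d = Σ(e_t − e_{t−1})² / Σe_t². A minimal sketch (our own function, not EViews output):

```python
# Sketch: the Durbin-Watson statistic computed from a list of residuals.
# d = sum((e_t - e_{t-1})^2) / sum(e_t^2); values near 2 suggest no
# first-order autocorrelation.
def durbin_watson(residuals):
    numerator = sum((residuals[t] - residuals[t - 1]) ** 2
                    for t in range(1, len(residuals)))
    denominator = sum(e ** 2 for e in residuals)
    return numerator / denominator

# Perfectly alternating residuals mimic negative autocorrelation (rho near -1):
d = durbin_watson([1, -1, 1, -1, 1, -1])   # equals 4(n-1)/n, approaching 4
```

For this short alternating series d = 20/6 ≈ 3.33; as the series lengthens, d approaches the theoretical limit of 4 for ρ = −1.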

EViews calculates the d statistic. If d > 2, you will want to test for negative autocorrelation. Null and alternative hypotheses –H0: ρ ≥ 0 –HA: ρ < 0 Choose the level of significance (1% or 5%) and find the critical values d_L and d_U from the Durbin-Watson table. Decision rule –If d > 4 − d_L, reject H0: there is significant negative first order autocorrelation –If d < 4 − d_U, don't reject H0: there is no evidence of significant autocorrelation –If d is between 4 − d_U and 4 − d_L, the test is inconclusive.

If d < 2, test for positive autocorrelation. Null and alternative hypotheses –H0: ρ ≤ 0 –HA: ρ > 0 Choose the level of significance (1% or 5%) and find the critical values d_L and d_U from the Durbin-Watson table. Decision rule –If d < d_L, reject H0: there is significant positive first order autocorrelation –If d > d_U, don't reject H0: there is no evidence of significant autocorrelation –If d is between d_L and d_U, the test is inconclusive.
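The two decision rules above can be collected into one function. A sketch; the critical values in the example call are hypothetical stand-ins, not values from the textbook's table:

```python
# Sketch of the Durbin-Watson decision rules from the two slides above.
# d_lower and d_upper are the table critical values d_L and d_U (the numbers
# used in the example call below are hypothetical, not real table entries).
def dw_decision(d, d_lower, d_upper):
    if d < 2:                                  # suspect positive autocorrelation
        if d < d_lower:
            return "reject H0: positive first-order autocorrelation"
        if d > d_upper:
            return "do not reject H0"
        return "inconclusive"
    # d >= 2: suspect negative autocorrelation
    if d > 4 - d_lower:
        return "reject H0: negative first-order autocorrelation"
    if d < 4 - d_upper:
        return "do not reject H0"
    return "inconclusive"

print(dw_decision(0.95, d_lower=1.20, d_upper=1.41))
# → reject H0: positive first-order autocorrelation
```

Note the inconclusive zones (between d_L and d_U, and between 4 − d_U and 4 − d_L): the Durbin-Watson test, unlike most tests, can simply fail to deliver a verdict.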

Assignment 10 (30 points) Due: Before 10 PM, Friday, November 16 Problems #5, #6, and #9, page 156