
Marietta College Week 13

Tuesday, April 5
Exam 3: Monday, April 25, 12-2:30 PM

Leadership Q&A: David Leonhardt, economics journalist, Washington Bureau, The New York Times
TONIGHT, 7:30 PM, McDonough Gallery
Cosponsored by the McDonough Center for Leadership & Business and the Economic Roundtable of the Ohio Valley
Is anyone interested in going to breakfast with him tomorrow, 8 AM, at the Lafayette Hotel?

This is the last bonus opportunity of this semester:
- 2 points for attending
- 2-5 points per question
- 2-10 points per summary
Summaries are due before 5 PM on Friday, April 8, via an attachment to me. Total bonus points will be divided by 3 and added to your exams.

Return and discuss Asst 18: #12, Page 240
a) The estimated coefficients are all in the expected direction, but R̄² seems fairly low. Always check significance at 10 percent or better: the coefficients of A, A², and S are significant. You can only interpret the magnitude of a coefficient if it passes the t-test of significance. Significance has to do with the t-test; importance has to do with the absolute value of the coefficient.

b) It implies that wages rise at a declining rate with respect to age and eventually fall: with W = β₀ + β₁A + β₂A² + …, the quadratic peaks where its derivative is zero, at A = −β₁/(2β₂). This does not imply perfect collinearity; the correlation between A and A² is nonlinear.
c) A semilog form (ln W) is a possibility. The slope coefficient then represents the percentage change in the wage caused by a one-unit increase in the independent variable (holding constant all the other independent variables). Since pay raises are often discussed in percentage terms, this makes sense. Phil and Yuan ask: but what about the meaning of the coefficient of A² in a semilog function? (great point) Linda says it depends on the purpose of the study (great point).
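A hedged sketch of this semilog interpretation on simulated data (the wage, age, and schooling numbers and names are invented for illustration, not taken from the assignment):

```python
# Semilog (ln W) wage equation on simulated data -- illustrative only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 500
age = rng.uniform(18, 65, size=n)
school = rng.uniform(8, 20, size=n)
# Wages rise with age at a declining rate (negative A^2 term) and with schooling
ln_w = (1.0 + 0.06 * age - 0.0006 * age**2 + 0.08 * school
        + rng.normal(scale=0.2, size=n))

X = sm.add_constant(np.column_stack([age, age**2, school]))
res = sm.OLS(ln_w, X).fit()
# With ln(W) on the left, the coefficient on schooling is (approximately) the
# percentage change in the wage from one more year, holding age and age^2 fixed
print(res.params)
```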

d) It's a good habit to ignore the estimated intercept (except to make sure that one exists), even if it looks too large or too small. The intercept picks up the mean of the error term, and that is affected by omitted variables.
e) The poor fit and the insignificant estimated coefficient of union membership are all reasons for being extremely cautious about using this regression to draw any conclusions about union membership.

Collect Asst 19: #5, Page 234, including Part e (data are available online under STOCK in Chapter 7)

Imperfect Multicollinearity
What is it? Say you estimate a regression equation: what makes you suspicious of a possible multicollinearity problem? What are the two formal tests we talked about before?

Example: Income = f(wage rate, tax rate, hours of work, …)
Wage rate, tax rate, and hours of work may all be highly correlated with each other.
Problem: the simple correlation coefficient may not capture this. Sometimes 3 or more independent variables are jointly correlated.

Test of Multicollinearity among 3 or More Independent Variables
Regress each independent variable (say X₁) on the other independent variables (X₂, X₃, X₄). Then calculate the variance inflation factor:
VIF = 1/(1 − R²)
If VIF > 5 → X₁ is highly correlated with the other independent variables. Do the same for each of the other independent variables.
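A minimal Python sketch of this auxiliary-regression procedure on simulated data (the names x1-x3, the sample size, and the induced correlation are all illustrative assumptions; the slides run the same regressions in EViews):

```python
# VIF by auxiliary regressions: regress each X on the others, VIF = 1/(1 - R^2)
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(420)
n = 100
x1 = rng.normal(size=n)
x2 = 0.9 * x1 + rng.normal(scale=0.3, size=n)   # deliberately collinear with x1
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

for j in range(X.shape[1]):
    others = np.delete(X, j, axis=1)            # all regressors except x_j
    r2 = sm.OLS(X[:, j], sm.add_constant(others)).fit().rsquared
    print(f"x{j+1}: auxiliary R^2 = {r2:.2f}, VIF = {1/(1 - r2):.2f}")  # VIF > 5 flags trouble
```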

Asst 20
Data set: DRUGS (Chapter 5, p. 157). Estimate Equation 5.1.
1. Before you run any formal tests, do you suspect an imperfect multicollinearity problem? Why or why not?
2. Examine the absolute values of the correlation coefficients between the independent variables included in Equation 5.1. Do you find any evidence of a multicollinearity problem? Discuss.
3. Examine the VIFs of the two most suspicious independent variables in Equation 5.1, based on what you found in Part 2 above. Do you find any evidence of a multicollinearity problem? Discuss.

Thursday, April 7
Exam 3: Monday, April 25, 12-2:30 PM
If you asked David Leonhardt questions on Tuesday night, write them down and give them to me today. Summaries are due before 5 PM tomorrow via an attachment. Bring laptops to class on Tuesday.

Return and discuss Asst 19: #5, Page 234, including Part e (data are available online under STOCK in Chapter 7)

(a) You are correct, but note that the null and alternative hypotheses are not about beta-hats; they are about betas.
(b) It's unusual to have a lagged variable in a cross-sectional model. BETA is lagged.

Part C: Should we include EARN in the set of our independent variables? (The numeric checks are sketched in code below.)
1. Does the theory call for its inclusion? Yes, but a version of it is in the dependent variable → exclude EARN.
2. Is the estimated coefficient of EARN significant in the right direction? No → exclude EARN.
3. When you include EARN, does the adjusted R² go up? Yes → include EARN.
4. When you include EARN, do the other variables' coefficients change significantly? They change somewhat!
5. When you include EARN, do AIC and SC go down? No → exclude EARN.
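A hedged sketch of checks 2, 3, and 5 in Python (simulated data; the names y, X_base, and earn are placeholders, not the STOCK variables):

```python
# Compare a model with and without a candidate regressor.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60
X_base = rng.normal(size=(n, 2))               # stand-ins for the kept regressors
earn = rng.normal(size=n)                      # the candidate variable
y = 1.0 + X_base @ np.array([0.5, -0.3]) + rng.normal(size=n)

without = sm.OLS(y, sm.add_constant(X_base)).fit()
with_c = sm.OLS(y, sm.add_constant(np.column_stack([X_base, earn]))).fit()

print("p-value on candidate:", with_c.pvalues[-1])                  # check 2
print("adj R^2:", without.rsquared_adj, "->", with_c.rsquared_adj)  # check 3
print("AIC:", without.aic, "->", with_c.aic)                        # check 5
print("SC (BIC):", without.bic, "->", with_c.bic)
```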

(d) The functional form is semilog left (the log is on the dependent-variable side), which is appropriate both on theoretical grounds and because two of the independent variables are expressed as percentages.

(e) EARN, DIV, and BETA can all be negative, so we can't take their logs.

Return and discuss Asst 20
Data set: DRUGS (Chapter 5, p. 157). Estimate Equation 5.1.
1. Before you run any formal tests, do you suspect an imperfect multicollinearity problem? Why or why not?
2. Examine the absolute values of the correlation coefficients between the independent variables included in Equation 5.1. Do you find any evidence of a multicollinearity problem? Discuss.
3. Examine the VIFs of the two most suspicious independent variables in Equation 5.1, based on what you found in Part 2 above. Do you find any evidence of a multicollinearity problem? Discuss.

EViews output:
Dependent Variable: P
Method: Least Squares
Sample: 1 32; Included observations: 32
Variables: C, GDPN, CVN, PP, DPC, IPC
[The Coefficient, Std. Error, t-Statistic, Prob., R-squared, and Adjusted R-squared values did not survive transcription]
Do we suspect a multicollinearity problem? What should we look for? R̄² is high, but we have two insignificant variables.

Correlation Matrix (GDPN, CVN, PP, DPC, IPC)
[The correlation values did not survive transcription; only the upper triangle was filled in, and the diagonal entries are all 1]
Why are the diagonal values all 1? Why did I eliminate the values in the bottom half of the table? Is multicollinearity a problem?
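The same check can be run outside EViews; a sketch with pandas (the 32 observations here are simulated stand-ins, since the actual DRUGS values did not survive transcription):

```python
# Correlation matrix of the regressors, upper triangle only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
df = pd.DataFrame(rng.normal(size=(32, 5)),
                  columns=["GDPN", "CVN", "PP", "DPC", "IPC"])
corr = df.corr()
# The matrix is symmetric and its diagonal is 1, so showing the lower half
# adds no information -- mask it, as on the slide
upper = corr.where(np.triu(np.ones(corr.shape, dtype=bool)))
print(upper.round(2))
```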

Auxiliary regression for GDPN (EViews output):
Dependent Variable: GDPN
Method: Least Squares
Sample: 1 32; Included observations: 32
Regressors: C, CVN, PP, DPC, IPC [coefficient values did not survive transcription]
R² = 0.77
VIF = 1/(1 − 0.77) ≈ 4.34
VIF < 5 → no serious multicollinearity problem

Auxiliary regression for CVN (EViews output):
Dependent Variable: CVN
Method: Least Squares
Sample: 1 32; Included observations: 32
Regressors: C, DPC, GDPN, IPC, PP [coefficient values did not survive transcription]
R² = 0.78
VIF = 1/(1 − 0.78) ≈ 4.54
VIF < 5 → no serious multicollinearity problem
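For reference, statsmodels ships a helper that computes exactly these auxiliary-regression VIFs in one call (again on simulated stand-in data, not the real DRUGS series):

```python
# variance_inflation_factor(exog, i) regresses column i on the rest and
# returns 1/(1 - R^2), matching the two hand computations above.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

rng = np.random.default_rng(7)
df = pd.DataFrame(rng.normal(size=(32, 5)),
                  columns=["GDPN", "CVN", "PP", "DPC", "IPC"])

exog = sm.add_constant(df).to_numpy()          # column 0 is the constant
for i, name in enumerate(df.columns, start=1):
    print(name, round(variance_inflation_factor(exog, i), 2))
```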

Remedies for Multicollinearity
1. If your main goal is to use the equation for forecasting and you don't want to do specific t-tests on each estimated coefficient, then do nothing. This is because multicollinearity does not affect the predictive power of your equation.
2. If it seems that you have a redundant variable, drop it. Examples: you don't need both real and nominal interest rates in your model, and you don't need both nominal and real GDP in your model.

Remedies for Multicollinearity (continued)
3. If all variables need to stay in the equation, transform the multicollinear variables (a small simulated sketch follows below).
Example: Number of domestic cars sold = β₀ + β₁(average price of domestic cars) + β₂(average price of foreign cars) + … + ε
Problem: prices of domestic and foreign cars are highly correlated.
Solution: Number of domestic cars sold = β₀ + β₁(ratio of average price of domestic cars to average price of foreign cars) + … + ε
4. Increase the sample size or choose a different random sample.
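A small simulated sketch of remedy 3 (the prices, correlation strength, and names are invented for illustration):

```python
# Two collinear price series collapse into one ratio regressor.
import numpy as np

rng = np.random.default_rng(1)
p_foreign = 20 + rng.normal(scale=2, size=200)
p_domestic = 0.9 * p_foreign + rng.normal(scale=0.5, size=200)  # highly correlated

print("corr(domestic, foreign):",
      round(np.corrcoef(p_domestic, p_foreign)[0, 1], 2))

# Use the relative price as a single regressor instead of both price levels
rel_price = p_domestic / p_foreign
```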

Asst 21: Due Tuesday in class
Use the data set FISH in Chapter 8 (p. 274) to run the following regression equation: F = f(PF, PB, Yd, P, N)
1) Conduct all 3 tests of the imperfect multicollinearity problem and report your results.
2) If you find evidence of an imperfect multicollinearity problem, suggest and implement a reasonable solution.

Chapter 9 (Autocorrelation or Serial Correlation)
Suppose we are using time series data to estimate consumption (C) as a function of income (Y) and other factors:
Cₜ = β₁ + β₂Yₜ + … + εₜ, where t = 1, 2, 3, …, T
This means that
C₁ = β₁ + β₂Y₁ + … + ε₁
C₂ = β₁ + β₂Y₂ + … + ε₂
and so on, through period T.
One of the classical assumptions regarding the error terms is: no correlation among the error terms in the theoretical equation. If this assumption is violated, there is a problem of pure serial correlation (autocorrelation).

First-Order Pure Autocorrelation
ε₂ = ρε₁ + u₂
That is, the error term in period 2 depends on the error term in period 1, where u₂ is a normally distributed error with mean zero and constant variance.

Second-Order Pure Autocorrelation
ε₃ = ρ₁ε₂ + ρ₂ε₁ + u₃
That is, the error term in period 3 depends on the error terms in periods 2 and 1, where u₃ is a normally distributed error with mean zero and constant variance.

Higher-Order Pure Autocorrelation
εₜ = ρ₁εₜ₋₁ + ρ₂εₜ₋₂ + ρ₃εₜ₋₃ + … + uₜ
That is, the error term in period t depends on the error terms in periods t−1, t−2, t−3, etc., where uₜ is a normally distributed error with mean zero and constant variance.
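A minimal simulation of the first-order process defined above (ρ = 0.8 and T = 100 are arbitrary choices, not course values):

```python
# AR(1) errors: eps_t = rho * eps_{t-1} + u_t, with well-behaved shocks u_t.
import numpy as np

rng = np.random.default_rng(9)
T, rho = 100, 0.8
u = rng.normal(size=T)                 # mean zero, constant variance
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = rho * eps[t - 1] + u[t]   # each error inherits part of the last one
# With rho > 0, positive errors tend to follow positive errors (positive
# autocorrelation); a negative rho would produce the alternating pattern.
```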

What Is Impure Serial Correlation?
Impure serial correlation arises when the true (theoretical) regression line does not have an autocorrelation problem but our estimated equation does. Why?
1. Specification error
2. Wrong functional form
3. Data error

Types of Serial Correlation
1. Positive: errors form a pattern; a positive error is usually followed by another positive error, and a negative error by another negative error. This is the more common type.

Example of positive autocorrelation [figure not reproduced in this transcript]

Types of Serial Correlation (continued)
2. Negative: a positive error is usually followed by a negative error, and vice versa. This is the less common type.

Example of negative autocorrelation [figure not reproduced in this transcript]

EViews lets you see a graph of the residuals:
After you estimate the regression equation, click on View in your regression output, then click on Actual, Fitted, Residual Table.

[EViews Actual, Fitted, Residual table: the numeric columns and residual plot did not survive transcription]
What type of serial correlation may we have? Negative residuals seem to be followed by other negative residuals → suspect positive autocorrelation.
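The same eyeball test can be reproduced outside EViews. A hedged Python sketch on simulated data (the regressor, sample size, and ρ = 0.8 are illustrative assumptions):

```python
# Fit OLS, then plot the residuals in time order and look for runs.
import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
T = 60
x = rng.normal(size=T).cumsum()        # an illustrative trending regressor
u = rng.normal(size=T)
eps = np.zeros(T)
for t in range(1, T):
    eps[t] = 0.8 * eps[t - 1] + u[t]   # build in positive autocorrelation
y = 2 + 0.5 * x + eps

res = sm.OLS(y, sm.add_constant(x)).fit()
plt.plot(res.resid, marker="o")
plt.axhline(0, linewidth=1)
plt.title("Residuals in time order: runs above/below zero suggest positive autocorrelation")
plt.show()
```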