Lecture Week 3: Topics in Regression Analysis
Overview
– Multiple regression
– Dummy variables
– Tests of restrictions
– 2nd hour: some issues in cost of capital
Multiple Regression
Same principle as simple regression – but difficult to draw. Key elements:
– Definition of variables: source, measurement, number of observations, raw, log, squared, etc.
– Method of estimation: typically ordinary least squares for linear forms
– The output
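A minimal sketch of a multiple regression in Python using statsmodels (the data, the variable names x1, x2, y, and the chosen transformations are made up for illustration; they are not from the lecture):

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"x1": rng.uniform(1, 10, 200), "x2": rng.normal(0, 1, 200)})
df["y"] = 2 + 0.5 * np.log(df["x1"]) + 1.5 * df["x2"] + rng.normal(0, 0.3, 200)

# Ordinary least squares with raw, log and squared regressors,
# mirroring the "definition of variables" point above
model = smf.ols("y ~ np.log(x1) + x2 + I(x2 ** 2)", data=df).fit()
print(model.summary())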
Log-log formulation
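The heading refers to the standard log-log specification, in which the slope coefficient is interpreted as an elasticity; for a single regressor it is typically written as

$$\ln Y_i = \alpha + \beta \ln X_i + \varepsilon_i, \qquad \beta = \frac{\partial \ln Y}{\partial \ln X} \;\approx\; \text{elasticity of } Y \text{ with respect to } X.$$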
Defining an equation in Eviews
Once you’ve created a workfile and loaded or imported your data, click on:
Objects…New Objects…Equation
Then type in the list of variables, e.g. xssretsp c xsretftse
Once you have the first result you can use Proc…Specify/Estimate to adjust the equation
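For comparison, a rough Python analogue of the EViews specification "xssretsp c xsretftse" (dependent variable first, c for the constant, then the regressors). The file name below is purely illustrative; the series names are those used in the slide:

import pandas as pd
import statsmodels.api as sm

data = pd.read_csv("excess_returns.csv")    # hypothetical data file
X = sm.add_constant(data["xsretftse"])      # "c" = the intercept term
results = sm.OLS(data["xssretsp"], X).fit()
print(results.summary())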
Looking at multiple regression output
[EViews output: only the layout survives here; numerical values other than those shown are omitted]

Dependent Variable: FTSE100      Method: Least Squares
Date: 02/04/04   Time: 13:49
Sample (adjusted): 1/02/ /20/2003
Included observations: 208 after adjusting endpoints

Variable        Coefficient   Std. Error   t-Statistic   Prob.
C
FTSE100(-1)
MON

R-squared            0.9594    Mean dependent var
Adjusted R-squared             S.D. dependent var
S.E. of regression             Akaike info criterion
Sum squared resid              Schwarz criterion
Log likelihood                 F-statistic
Durbin-Watson stat             Prob(F-statistic)
What the third box means
– R-squared: proportion of the variance of Y explained by the regression
– Adjusted R-squared: takes account of the number of variables
– S.E. of regression: s.d. of the residuals
– Log likelihood: a function of log(sum of squared residuals)
– Durbin-Watson: test for 1st-order autocorrelation
– Akaike and Schwarz information criteria: used for selecting between non-nested models with different numbers of parameters
– F-statistic: tests the null that all slope coefficients are zero
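As a sketch of where these quantities live in statsmodels (the data are simulated purely for illustration; the attribute names are those of a fitted OLS results object):

import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson

rng = np.random.default_rng(3)
X = sm.add_constant(rng.normal(size=(208, 2)))   # constant + two regressors
y = X @ [1.0, 0.5, -0.3] + rng.normal(size=208)
results = sm.OLS(y, X).fit()

print("R-squared:          ", results.rsquared)
print("Adjusted R-squared: ", results.rsquared_adj)
print("S.E. of regression: ", results.mse_resid ** 0.5)   # s.d. of residuals
print("Sum squared resid:  ", results.ssr)
print("Log likelihood:     ", results.llf)
print("Akaike criterion:   ", results.aic)
print("Schwarz criterion:  ", results.bic)
print("F-statistic:        ", results.fvalue, " p =", results.f_pvalue)
print("Durbin-Watson:      ", durbin_watson(results.resid))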
Dummy variables
Intercept dummy: Y = a + bX + c·Dummy
– When Dummy = 0 the fitted line is Y = a + bX; when Dummy = 1 the intercept shifts by c to Y = (a + c) + bX.
[Figure: Y against X, two parallel lines labelled Dummy = 0 and Dummy = 1, a vertical distance c apart]
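A minimal sketch of an intercept dummy in Python (variable names and data are invented for illustration); the estimated coefficient on the dummy is the shift c between the two parallel lines:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({"x": rng.uniform(0, 10, 300),
                   "dummy": rng.integers(0, 2, 300)})
df["y"] = 1.0 + 0.8 * df["x"] + 2.5 * df["dummy"] + rng.normal(0, 0.5, 300)

fit = smf.ols("y ~ x + dummy", data=df).fit()
print(fit.params)   # Intercept ≈ a, x ≈ b, dummy ≈ c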
A few points on tests
– Classical tests are based on “nested” hypotheses
– Many of them ask: “Do the data support the alternative hypothesis against the null?”
– Core methodology: difference in fit between the alternative hypothesis (typically the unrestricted form) and the null hypothesis (typically the restricted form)
– t tests, F tests and likelihood-function-based tests are all ultimately based on what happens to the sum of squared residuals in the presence/absence of the restriction
– Example (see the sketch below): unrestricted – b1, b2, b3 can be anything; restricted (null) – b2 = 0
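A sketch of the restricted-versus-unrestricted comparison in Python (simulated data, illustrative names): fit both forms, compare the sums of squared residuals with an F statistic, and check against statsmodels' built-in test of the same restriction.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy import stats

rng = np.random.default_rng(2)
df = pd.DataFrame({"x1": rng.normal(size=200), "x2": rng.normal(size=200),
                   "x3": rng.normal(size=200)})
df["y"] = 1 + 0.5 * df["x1"] + 0.7 * df["x3"] + rng.normal(size=200)

unrestricted = smf.ols("y ~ x1 + x2 + x3", data=df).fit()   # b1, b2, b3 free
restricted   = smf.ols("y ~ x1 + x3", data=df).fit()        # imposes b2 = 0

q = 1                                   # number of restrictions
df_resid = unrestricted.df_resid        # n minus number of estimated parameters
F = ((restricted.ssr - unrestricted.ssr) / q) / (unrestricted.ssr / df_resid)
p = 1 - stats.f.cdf(F, q, df_resid)
print(F, p)

# The same restriction tested directly on the unrestricted fit:
print(unrestricted.f_test("x2 = 0"))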