
F TESTS RELATING TO GROUPS OF EXPLANATORY VARIABLES

1 We now come to more general F tests of goodness of fit. This is a test of the joint explanatory power of a group of variables when they are added to a regression model.

2 For example, in the original specification, Y may be written as a simple function of X2. In the second, we add X3 and X4.

3 The null hypothesis is that neither X3 nor X4 belongs in the model. The alternative hypothesis is that at least one of them does, perhaps both.
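Written out formally (a sketch in standard notation; the disturbance term u and the beta coefficients are implicit in the slides but not reproduced in the transcript):

\[
\text{Original: } Y = \beta_1 + \beta_2 X_2 + u
\qquad
\text{Extended: } Y = \beta_1 + \beta_2 X_2 + \beta_3 X_3 + \beta_4 X_4 + u
\]
\[
H_0:\ \beta_3 = \beta_4 = 0
\qquad
H_1:\ \beta_3 \neq 0 \ \text{or}\ \beta_4 \neq 0, \ \text{or both}
\]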

4 When new variables are added to the model, RSS cannot rise. In general, it will fall. If the new variables are irrelevant, it will fall only by a random amount. The test evaluates whether the fall in RSS is greater than would be expected on a pure chance basis.

5 The appropriate test is an F test. For this test, and for several others which we will encounter, it is useful to think of the F statistic as having the structure shown below:

F(cost in d.f., d.f. remaining) = (reduction in RSS / cost in d.f.) / (RSS remaining / degrees of freedom remaining)

6 The ‘reduction in RSS’ is the reduction when the change is made, in this case, when the group of new variables is added.

7 The ‘cost in d.f.’ is the reduction in the number of degrees of freedom remaining after making the change. In the present case it is equal to the number of new variables added, because that many new parameters have to be estimated.

8 (Remember that the number of degrees of freedom in a regression equation is the number of observations, less the number of parameters estimated. In this example, it would fall from n – 2 to n – 4 when X3 and X4 are added.)

9 The ‘RSS remaining’ is the residual sum of squares after making the change.

10 The ‘degrees of freedom remaining’ is the number of degrees of freedom remaining after making the change.
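In symbols, the F statistic for this change has the form below (a sketch in our own notation: RSS_1 and RSS_2 are the residual sums of squares of the original and extended specifications, k is the number of parameters in the extended model, and the cost is 2 degrees of freedom because two variables are added):

\[
F(2,\ n-k) \;=\; \frac{(RSS_1 - RSS_2)/2}{RSS_2/(n-k)}
\]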

11 We will illustrate the test with an educational attainment example. Here is S regressed on ASVABC using Data Set 21. We make a note of the residual sum of squares.

. reg S ASVABC
[Stata output for the regression of S on ASVABC; the numerical values were not preserved in the transcript]

12 Now we have added the highest grade completed by each parent. Does parental education have a significant impact? Well, we can see that a t test would show that SF has a highly significant coefficient, but we will perform the F test anyway. We make a note of RSS.

. reg S ASVABC SM SF
[Stata output for the regression of S on ASVABC, SM, and SF; the numerical values were not preserved in the transcript]
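In practice the two residual sums of squares can be captured from Stata's stored results rather than copied by hand. A minimal do-file sketch, assuming Data Set 21 is in memory with the variables S, ASVABC, SM, and SF (the scalar names RSS1, RSS2, and dfr are ours):

reg S ASVABC
scalar RSS1 = e(rss)        // RSS of the original specification
reg S ASVABC SM SF
scalar RSS2 = e(rss)        // RSS after adding SM and SF
scalar dfr  = e(df_r)       // degrees of freedom remaining (540 - 4 = 536)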

13 The improvement in the fit on adding the parental variables is the reduction in the residual sum of squares.

14 The cost is 2 degrees of freedom because 2 additional parameters have been estimated.

15 The ‘RSS remaining’ is the residual sum of squares left unexplained after adding SM and SF.

16 The number of degrees of freedom remaining is n – k, that is, 540 – 4 = 536.

17 Substituting these quantities into the formula gives the F statistic.

18 The F statistic exceeds the critical value of F(2,500) at the 0.1% level given in the tables. The critical value of F(2,536) must be lower, so we reject H0 and conclude that the parental education variables do have significant joint explanatory power.
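Continuing the do-file sketch above, the F statistic and its critical value can be computed explicitly, or the whole test can be left to Stata's test command (invFtail and test are standard Stata; the scalar names are ours):

scalar F = ((RSS1 - RSS2)/2) / (RSS2/dfr)
display F                          // the test statistic
display invFtail(2, dfr, 0.001)    // 0.1% critical value of F(2, 536)
* equivalently, immediately after the second regression:
test SM SF                         // joint test that the coefficients of SM and SF are both zero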

19 This sequence will conclude by showing that t tests are equivalent to marginal F tests when the additional group of variables consists of just one variable.

20 Suppose that in the original model Y is a function of X2 and X3, and that in the revised model X4 is added.

21 The null hypothesis for the F test of the explanatory power of the additional ‘group’ is that all the new slope coefficients are equal to zero. There is of course only one new slope coefficient, β4.

22 The F test has the usual structure. We will illustrate it with an educational attainment model where S depends on ASVABC and SM in the original model and on SF as well in the revised model.
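For a group consisting of a single variable the formula specializes as follows (a sketch in our own notation: RSS_1 is the residual sum of squares without SF, RSS_2 the residual sum of squares with SF, and n - k = 536 in this example):

\[
F(1,\ n-k) \;=\; \frac{RSS_1 - RSS_2}{RSS_2/(n-k)}
\]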

23 Here is the regression of S on ASVABC and SM. We make a note of the residual sum of squares.

. reg S ASVABC SM
[Stata output for the regression of S on ASVABC and SM; the numerical values were not preserved in the transcript]

24 Now we add SF and again make a note of the residual sum of squares.

. reg S ASVABC SM SF
[Stata output for the regression of S on ASVABC, SM, and SF; the numerical values were not preserved in the transcript]

25 The reduction in the residual sum of squares is the reduction on adding SF.

26 The cost is just the single degree of freedom lost when estimating β4.

27 The RSS remaining is the residual sum of squares after adding SF.

28 The number of degrees of freedom remaining after adding SF is 540 – 4 = 536.

29 Hence the F statistic follows by substituting these values into the formula.

30 The F statistic exceeds the critical value of F at the 0.1% significance level with 500 degrees of freedom given in the tables, and the critical value with 536 degrees of freedom must be lower, so we reject H0 at the 0.1% level.

31 The null hypothesis we are testing is exactly the same as for a two-sided t test on the coefficient of SF.

32 We will perform the t test, using the t statistic reported for the coefficient of SF in the regression output.

33 The t statistic exceeds the critical value of t at the 0.1% level with 500 degrees of freedom given in the tables, and the critical value with 536 degrees of freedom must be lower. So we reject H0 again.

34 It can be shown that the F statistic for the F test of the explanatory power of a ‘group’ of one variable must be equal to the square of the t statistic for that variable. (The difference in the last digit is due to rounding error.)

35 It can also be shown that the critical value of F must be equal to the square of the critical value of t. (The critical values shown are for 500 degrees of freedom, but this must also be true for 536 degrees of freedom.)
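Both equivalences can be checked directly in Stata. A do-file sketch (_b[SF] and _se[SF] are the stored coefficient and standard error, r(F) is stored by the test command, and invFtail and invttail are Stata's inverse tail functions):

reg S ASVABC SM SF
test SF                              // marginal F test of the coefficient of SF
display r(F)                         // F statistic from the test
display (_b[SF]/_se[SF])^2           // square of the t statistic: identical apart from rounding
display invFtail(1, e(df_r), 0.001)  // 0.1% critical value of F(1, 536)
display invttail(e(df_r), 0.0005)^2  // square of the two-sided 0.1% critical value of t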

36 Hence the conclusions of the two tests must coincide.

37 This result means that the t test of the coefficient of a variable is a test of its marginal explanatory power, after all the other variables have been included in the equation.

38 If the variable is correlated with one or more of the other variables, its marginal explanatory power may be quite low, even if it genuinely belongs in the model.

39 If all the variables are correlated, it is possible for all of them to have low marginal explanatory power and for none of the t tests to be significant, even though the F test for their joint explanatory power is highly significant.

40 If this is the case, the model is said to be suffering from the problem of multicollinearity discussed in the previous sequence.

Copyright Christopher Dougherty.

These slideshows may be downloaded by anyone, anywhere for personal use. Subject to respect for copyright and, where appropriate, attribution, they may be used as a resource for teaching an econometrics course. There is no need to refer to the author.

The content of this slideshow comes from Section 3.5 of C. Dougherty, Introduction to Econometrics, fourth edition 2011, Oxford University Press. Additional (free) resources for both students and instructors may be downloaded from the OUP Online Resource Centre.

Individuals studying econometrics on their own who feel that they might benefit from participation in a formal course should consider the London School of Economics summer school course EC212 Introduction to Econometrics or the University of London International Programmes distance learning course EC2020 Elements of Econometrics.