Econ 140 Lecture 15, Slide 1: Multiple Regression Applications
Econ 140 Lecture 15, Slide 2: Today's Plan
Two topics and how they relate to multiple regression:
– Multicollinearity
– Dummy variables
Econ 140 Lecture 15, Slide 3: Multicollinearity
Suppose we have the following regression equation: Y = a + b1 X1 + b2 X2 + e. Multicollinearity occurs when some or all of the independent X variables are linearly related.
Different forms of multicollinearity:
– Perfect: OLS estimation will not work.
– Non-perfect: arises in applied work; it presents problems for inference and interpretation of the results.
– There is no formal test for detection; it is only possible to compare alternative specified forms of the model.
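A minimal numerical sketch of the two cases (made-up data, not from the lecture spreadsheets): with perfect collinearity the X'X matrix is singular and OLS has no unique solution, while near-perfect collinearity leaves OLS computable but numerically fragile.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)

# Perfect collinearity: x2 is an exact linear function of x1
x2_perfect = 2.0 * x1
X_perfect = np.column_stack([np.ones(n), x1, x2_perfect])
print(np.linalg.matrix_rank(X_perfect))   # 2, not 3: X'X is singular, OLS has no unique solution

# Non-perfect collinearity: x2 is x1 plus a little noise
x2_near = 2.0 * x1 + rng.normal(scale=0.01, size=n)
X_near = np.column_stack([np.ones(n), x1, x2_near])
y = 1.0 + 0.5 * x1 + 0.5 * x2_near + rng.normal(size=n)

# OLS is still computable, but X'X is close to singular, which is what
# inflates the variances of the estimated coefficients
beta = np.linalg.solve(X_near.T @ X_near, X_near.T @ y)
print(beta)
print(np.linalg.cond(X_near.T @ X_near))  # very large condition number
```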
Econ 140 Lecture 15, Slide 4: Multicollinearity Example
Again we'll use returns to education, where:
– the dependent variable Y is (log) wages
– the independent variables (the X's) are age, experience, and years of schooling
Experience is defined as years in the labor force, or the difference between age and years of schooling:
Experience = Age - Years of schooling
What's the problem with this?
Econ 140 Lecture 15, Slide 5: Multicollinearity Example (2)
Note that we've expressed experience as the difference of two of our other independent variables:
– by constructing experience in this manner, we create a collinear dependence between age and experience
– the relationship between age and experience is linear: as age increases, for given years of schooling, experience also increases
We can write the regression equation for this example as:
ln(Wages) = a + b1 Experience + b2 Age + e
Econ 140 Lecture 15, Slide 6: Multicollinearity Example (3)
Recall the formula for our estimate of b1, where x1 = experience and x2 = age.
The problem is that x1 and x2 are linearly related:
– as we get closer to perfect collinearity, the denominator goes to zero
– OLS won't work!
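For reference, the standard OLS estimator of b1 in a two-regressor model, with x1, x2, and y measured as deviations from their sample means, is:

$$\hat{b}_1 = \frac{\left(\sum x_1 y\right)\left(\sum x_2^2\right) - \left(\sum x_2 y\right)\left(\sum x_1 x_2\right)}{\left(\sum x_1^2\right)\left(\sum x_2^2\right) - \left(\sum x_1 x_2\right)^2}$$

The denominator shrinks toward zero as the correlation between x1 and x2 approaches plus or minus one, which is the point the slide is making.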
Econ 140 Lecture 15, Slide 7: Multicollinearity Example (4)
Recall the formula for the estimated variance of the estimate of b1:
– as x1 and x2 approach perfect collinearity, the denominator goes to zero and the estimated variance of the estimate increases
Implications:
– with multicollinearity, you will get large standard errors on the partial coefficients
– your t-ratios, under the null hypothesis that a coefficient's value is zero, will be small
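For reference, the standard expression for that estimated variance in the same deviation-form notation is:

$$\widehat{\operatorname{Var}}(\hat{b}_1) = \frac{\hat{\sigma}^2 \sum x_2^2}{\left(\sum x_1^2\right)\left(\sum x_2^2\right) - \left(\sum x_1 x_2\right)^2} = \frac{\hat{\sigma}^2}{\sum x_1^2 \,(1 - r_{12}^2)}$$

where r12 is the sample correlation between x1 and x2; as r12 approaches plus or minus one the denominator goes to zero, producing the large standard errors and small t-ratios described above.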
Econ 140 Lecture 15, Slide 8: More Multicollinearity Examples
In L15_1.xls we have individual data on age, years of education, weekly earnings, school age, and experience:
– we can perform a regression to calculate returns given age and experience
– we can also estimate bivariate models including only age, only experience, and only years of schooling
– we expect that the problem is that experience is related to age; to test this, we can regress age on experience, and if the slope coefficient on experience is 1, there is perfect multicollinearity
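A sketch of that auxiliary regression, assuming the L15_1.xls data have been exported to a CSV with columns named age and experience (the file and column names are placeholders, not the actual spreadsheet layout):

```python
import numpy as np
import pandas as pd

# Hypothetical export of L15_1.xls; column names are assumptions
df = pd.read_csv("L15_1.csv")

# Auxiliary regression of age on experience: age = c0 + c1*experience + u
X = np.column_stack([np.ones(len(df)), df["experience"].to_numpy()])
y = df["age"].to_numpy()
c0, c1 = np.linalg.lstsq(X, y, rcond=None)[0]

# A slope near 1 signals that age and experience move almost
# one-for-one, i.e. near-perfect multicollinearity
print(f"slope on experience: {c1:.3f}")
```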
Econ 140 Lecture 15, Slide 9: More Multicollinearity Examples (2)
On L15_2.xls there is a made-up example of perfect multicollinearity:
– OLS is unable to calculate the slope coefficients
– calculating the products and cross-products, we find that the denominator for the slope coefficients is zero, as predicted
If an applied problem has these properties (high but not perfect collinearity):
1) OLS is still unbiased
2) variances and standard errors are large, and hypothesis testing is difficult
3) there are few significant coefficients but a high R²
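A small simulation of symptom 3 (made-up data, not L15_2.xls): two nearly identical regressors can deliver a high R² while each individual t-ratio looks insignificant.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 50
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.05, size=n)   # x2 nearly duplicates x1
y = 1.0 + 1.0 * x1 + 1.0 * x2 + rng.normal(size=n)

X = sm.add_constant(np.column_stack([x1, x2]))
res = sm.OLS(y, X).fit()

# Typical pattern under near-perfect collinearity:
# the overall fit is strong, but the individual t-ratios are small
print("R-squared:", round(res.rsquared, 3))
print("t-ratios:", np.round(res.tvalues, 2))
```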
Econ 140 Lecture 15, Slide 10: More Multicollinearity Examples (3)
What to do with L15_1.xls?
– there's simply not enough variation
– we can collect more data or rethink the model
– we can test for partial correlations between the X variables
– always try specification checks
– alternatively, try to re-scale the variables so that they are not so strongly correlated
Econ 140 Lecture 15, Slide 11: Dummy Variables
Dummy variables allow you to include qualitative variables (variables that otherwise cannot be quantified) in your regression:
– examples include gender, race, marital status, and religion
– they also become important when looking at "regime shifts", which may be new policy initiatives, economic change, or seasonality
We will look at some examples:
– using female as a qualitative variable
– using marital status as a qualitative variable
– using the Phillips curve to demonstrate a regime shift
Econ 140 Lecture 15, Slide 12: Qualitative Example: Female
We'll construct a dummy variable:
D_i = 0 if not female, D_i = 1 if female, for i = 1, ..., n
– we can do this with any qualitative variable
– note: assigning the values for the dummy variable is an arbitrary choice
On L15_3.xls there is a sample from the current CPS:
– to create the dummy variable "female" we assign the values one and zero to the CPS's values of two and one for sex, respectively
– we can include the dummy variable in the regression equation like we would any other variable
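A sketch of that recoding in pandas, assuming the CPS extract has a column named sex coded 1 = male and 2 = female (the file and column names are placeholders):

```python
import pandas as pd

# Hypothetical export of the L15_3.xls CPS sample; column names are assumptions
df = pd.read_csv("L15_3.csv")

# CPS codes sex as 1 = male, 2 = female; recode it to a 0/1 dummy
df["female"] = (df["sex"] == 2).astype(int)

# The dummy then enters the regression like any other regressor,
# e.g. log earnings on a constant and the female dummy
```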
Econ 140 Lecture 15, Slide 13: Qualitative Example: Female (2)
We estimate an equation of the form Y_i = a + b D_i + e_i; the estimates used below are a = 5.975 and b = -0.485.
Now we can ask: what are the expected earnings given that a person is male?
E(Y_i | D_i = 0) = a = 5.975
Similarly, what are the expected earnings given that a person is female?
E(Y_i | D_i = 1) = a + b(1) = a + b = 5.975 - 0.485 = 5.490
Econ 140 Lecture 15, Slide 14: Qualitative Example: Female (4)
We can use other variables to extend our analysis; for example, we can include age to get the equation:
Y = a + b1 D_i + b2 X_i + e
– where X_i can be any or all relevant variables
– D_i and its coefficient b1 indicate how much less, on average, females earn than males
– for males the intercept will be a
– for females the intercept will be a + b1
Econ 140 Lecture 15, Slide 15: Qualitative Example: Female (5)
The estimated regression is reported on the spreadsheet.
The expected weekly earnings for men follow from setting D_i = 0 in the estimated equation, and the expected weekly earnings for women from setting D_i = 1.
Econ 140 Lecture 15, Slide 16: Qualitative Example: Female (6)
An important note: we cannot include dummy variables for both male and female in the same regression equation.
– suppose we have Y = a + b1 D1i + b2 D2i + e
– where: D1i = 0 if male, D1i = 1 if female; D2i = 0 if female, D2i = 1 if male
– OLS won't be able to estimate the regression coefficients because D1i and D2i sum to one for every observation, so they are perfectly collinear with the intercept a
So if a qualitative variable has m categories, you should include (m - 1) dummy variables in the regression equation.
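A quick numerical check of this dummy-variable trap (made-up data): with a constant plus both dummies the design matrix loses a rank, so OLS cannot separate the three coefficients.

```python
import numpy as np

female = np.array([1, 0, 1, 1, 0, 0])
male = 1 - female                      # D1 + D2 = 1 for every observation

X = np.column_stack([np.ones(6), female, male])
print(np.linalg.matrix_rank(X))        # 2, not 3: perfect multicollinearity

# Dropping one of the two dummies restores full column rank
X_ok = np.column_stack([np.ones(6), female])
print(np.linalg.matrix_rank(X_ok))     # 2 columns, rank 2
```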
Econ 140 Lecture 15, Slide 17: Example: Marital Status
The spreadsheet (L15_3.xls) also estimates a regression equation that uses two distinct dummy variables:
– D1i = 0 if male, D1i = 1 if female
– D2i = 0 if not married, D2i = 1 if married
Using this regression equation we can create four categories: married males, unmarried males, married females, and unmarried females.
Econ 140 Lecture 15, Slide 18: Example: Marital Status (2)
The slide works out expected earnings for each of the four groups: unmarried males, unmarried females, married males, and married females.
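Assuming the estimated equation has the additive form Y_i = a + b1 D1i + b2 D2i + e_i (the spreadsheet's own coefficient estimates are not reproduced here), the four group expectations are:

$$\begin{aligned}
E(Y \mid D_1 = 0, D_2 = 0) &= a && \text{(unmarried males)}\\
E(Y \mid D_1 = 1, D_2 = 0) &= a + b_1 && \text{(unmarried females)}\\
E(Y \mid D_1 = 0, D_2 = 1) &= a + b_2 && \text{(married males)}\\
E(Y \mid D_1 = 1, D_2 = 1) &= a + b_1 + b_2 && \text{(married females)}
\end{aligned}$$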
Econ 140 Lecture 15, Slide 19: Interactive Terms
So far we've only used dummy variables to change the intercept. We can also use dummy variables to alter the partial slope coefficients.
Let's think about this model: ln(W_i) = a + b1 Age_i + b2 Married_i + e
– we could argue that these coefficients would be different for males and females
– we want to think about two sub-sample groups: males and females
– we can test the hypothesis that the partial slope coefficients are different for these two groups
Econ 140 Lecture 15, Slide 20: Interactive Terms (2)
To test our hypothesis we'll estimate the regression equation for the whole sample and then for the two sub-sample groups.
We test whether our estimated coefficients are the same for males and females.
Our null hypothesis is:
H0: a_M = a_F, b_1M = b_1F, b_2M = b_2F
Econ 140 Lecture 15, Slide 21: Interactive Terms (3)
We have an unrestricted form and a restricted form:
– unrestricted: used when we estimate the two sub-sample groups separately
– restricted: used when we estimate for the whole sample
What type of statistic will we use to carry out this test?
– an F-statistic, with q = k, the number of parameters in the model, and n = n1 + n2, where n is the complete sample size
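For reference, the standard form of this F-statistic (the Chow test) in the notation above, with q = k restrictions and n = n1 + n2 observations, is:

$$F^{*} = \frac{\left(SSR_R - SSR_U\right)/k}{SSR_U/\left(n_1 + n_2 - 2k\right)}$$

where SSR_R comes from the pooled (restricted) regression and SSR_U from the two separate regressions; with k = 3 parameters per equation this gives the (3, 27) degrees of freedom used two slides below.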
Econ 140 Lecture 15, Slide 22: Interactive Terms (4)
The sum of squared residuals for the unrestricted form will be SSR_U = SSR_M + SSR_F.
On L15.4.xls:
– the data are sorted according to the dummy variable "female"
– there is a second dummy variable for marital status
– there are 3 estimated regression equations, one each for the total sample, the male sub-sample, and the female sub-sample
Econ 140 Lecture 15, Slide 23: Interactive Terms (5)
The output allows us to gather the necessary sums of squared residuals and sample sizes to construct the test statistic F*.
– since F(0.05, 3, 27) = 2.96 > F*, we cannot reject the null hypothesis that the partial slope coefficients are the same for males and females
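A sketch of the arithmetic behind F*, with placeholder SSR values (the real ones come from the three regressions on L15.4.xls) and the same (3, 27) degrees of freedom:

```python
import scipy.stats as st

# Placeholder values; the actual SSRs come from the spreadsheet output
ssr_restricted = 40.0      # pooled regression (hypothetical)
ssr_male = 18.0            # male sub-sample (hypothetical)
ssr_female = 19.0          # female sub-sample (hypothetical)
k = 3                      # parameters per equation: intercept, Age, Married
n = 33                     # n1 + n2, so n - 2k = 27

ssr_unrestricted = ssr_male + ssr_female
f_star = ((ssr_restricted - ssr_unrestricted) / k) / (ssr_unrestricted / (n - 2 * k))
f_crit = st.f.ppf(0.95, k, n - 2 * k)    # about 2.96, as on the slide

print(f"F* = {f_star:.2f}, critical value = {f_crit:.2f}")
print("reject H0" if f_star > f_crit else "cannot reject H0")
```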
Econ 140 Lecture 15, Slide 24: Interactive Terms (6)
What if F* > F(0.05, 3, 27)? How do we read that result?
– there is a difference between the two sub-samples, and therefore we should estimate the wage equations separately
– or we could interact the dummy variable with the other variables
To interact the dummy variable with the age and marital status variables, we multiply the dummy variable by each of them to get:
W_i = a + b1 Age_i + b2 Married_i + b3 D_i + b4 (D_i * Age_i) + b5 (D_i * Married_i) + e_i
Econ 140 Lecture 15, Slide 25: Interactive Terms (7)
Using L15.4.xls you can construct the interactive terms by multiplying the FEMALE column by the AGE and MARRIED columns:
– one way to see whether the two sub-samples are different is to look at the t-ratios on the interactive terms
– in this example, neither of the t-ratios is statistically significant, so we cannot reject the null hypothesis
We now know how to use dummy variables to capture the importance of sub-sample groups within the data:
– dummy variables are also useful for testing for structural breaks or regime shifts
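A sketch of the interacted regression, assuming L15.4.xls has been exported to a CSV with columns LNWAGE, AGE, MARRIED, and FEMALE (the file and column names are placeholders):

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical export of L15.4.xls; column names are assumptions
df = pd.read_csv("L15_4.csv")

# Interaction terms: multiply the female dummy by age and marital status
df["FEM_AGE"] = df["FEMALE"] * df["AGE"]
df["FEM_MARRIED"] = df["FEMALE"] * df["MARRIED"]

X = sm.add_constant(df[["AGE", "MARRIED", "FEMALE", "FEM_AGE", "FEM_MARRIED"]])
res = sm.OLS(df["LNWAGE"], X).fit()

# The t-ratios on FEM_AGE and FEM_MARRIED indicate whether the slopes
# differ between the male and female sub-samples
print(res.tvalues[["FEM_AGE", "FEM_MARRIED"]])
```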
Econ 140 Lecture 15, Slide 26: Interactive Terms (8)
If we want to estimate the equation for the first sub-sample (males), we take the expectation of the wage equation with the dummy variable for female set to zero:
E(W_i | D_i = 0) = a + b1 Age_i + b2 Married_i
We can do the same for the second sub-sample (females):
E(W_i | D_i = 1) = (a + b3) + (b1 + b4) Age_i + (b2 + b5) Married_i
We can see that by using only one regression equation, we have allowed the intercept and the partial slope coefficients to vary by sub-sample.