Linear Models
Alan Lee
Sample presentation for STATS 760
Contents
– The problem
– Typical data
– Exploratory analysis
– The model
– Estimation and testing
– Diagnostics
– Software
– A worked example
The Problem
– To model the relationship between a continuous variable Y and several explanatory variables x1, …, xk.
– Given values of x1, …, xk, to predict the value of Y.
Typical Data
Data on 5000 motor vehicle insurance policies having at least one claim. Variables are
– Y: log(amount of claim)
– x1: sex of policy holder
– x2: age of policy holder
– x3: age of car
– x4: car type (1-20 score, 1 = Toyota Corolla, 20 = Porsche)
Exploratory Analysis
– Plot Y against the other variables
– Scatterplot matrix
– Smooth as necessary
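A minimal base-R sketch of these plots, assuming the data sit in a data frame called cars.df (a hypothetical name) with the variable names used later in the worked example (logad, CARAGE, PRIMAGEN, gender):

## exploratory plots for the insurance data (cars.df is a hypothetical data frame)
pairs(cars.df)                                  # scatterplot matrix of all the variables
scatter.smooth(cars.df$CARAGE, cars.df$logad,
               xlab = "car age", ylab = "log(claim amount)")   # Y vs one x, with a smooth added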
Log claims vs car age
The Model
– The relationship is modelled using the conditional distribution of Y given x1, …, xk (the covariates).
– Assume the conditional distribution of Y is N(μ, σ²), where μ depends on the covariates.
The Model (2)
– If all covariates are “continuous”, then μ = β0 + β1 x1 + … + βk xk.
– In addition, all Y’s are assumed independent.
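Written out in full for observation i, the model is equivalent to the following (standard notation, stated with an explicit error term rather than as a conditional distribution):

Y_i = \beta_0 + \beta_1 x_{i1} + \cdots + \beta_k x_{ik} + \varepsilon_i,
\qquad \varepsilon_1, \dots, \varepsilon_n \ \text{independent } N(0, \sigma^2)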
Estimation and Testing
– Estimate the β’s
– Estimate the error variance σ²
– Test if the β’s are 0
– Check goodness-of-fit
Least Squares
– Estimate the β’s by the values that minimize the sum of squares (least squares estimates, LSE’s)
– The minimizing values are the solution of the normal equations
– The minimum value is the residual sum of squares (RSS); σ² is estimated by RSS/(n - k - 1)
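As an illustration only (not from the slides), these quantities can be computed directly in R from a hypothetical design matrix X (with a leading column of 1’s) and response vector y; lm() does all of this, more stably, internally:

beta.hat   <- solve(t(X) %*% X, t(X) %*% y)   # solve the normal equations (X'X) b = X'y
y.hat      <- X %*% beta.hat                  # fitted values
RSS        <- sum((y - y.hat)^2)              # residual sum of squares
sigma2.hat <- RSS / (nrow(X) - ncol(X))       # = RSS/(n - k - 1), since ncol(X) = k + 1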
Goodness of Fit
– Goodness of fit is measured by R²: 0 ≤ R² ≤ 1 (why?)
– R² = 1 iff perfect fit (data all on a plane)
Prediction
– Y is predicted by Ŷ = β̂0 + β̂1 x1 + … + β̂k xk, where the hats indicate the LSE’s
– Standard errors: 2 kinds, one for the mean value of Y for a set of x’s, the other for an individual Y for a particular set of x’s
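In R the two kinds of interval come from predict(). A sketch using the cars.lm fit from the worked example later in the slides; the new covariate values are hypothetical, and "M" is assumed to be a level of gender:

new.x <- data.frame(CARAGE = 5, PRIMAGEN = 40, gender = "M")   # hypothetical values
predict(cars.lm, newdata = new.x, interval = "confidence")     # for the mean value of Y at these x's
predict(cars.lm, newdata = new.x, interval = "prediction")     # for an individual new Y at these x's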
Interpretation of Coefficients
– The LSE for variable xj is the amount we expect Y to increase if xj is increased by a unit amount, assuming all the other x’s are held fixed
– The test of βj = 0 tests the hypothesis that variable j makes no contribution to the fit, given that all the other variables are in the model
Checking Assumptions (1)
Tools are the residuals, fitted values and hat matrix diagonals
– Fitted values: the values of Y predicted by the fitted model
– Residuals: observed Y minus fitted value
– Hat matrix diagonals: measure the effect of an observation on its own fitted value
Checking Assumptions (2)
Assumptions are
– Mean linear in the x’s (plot residuals v fitted values, partial residual plot, CERES plots)
– Constant variance (plot squared residuals v fitted values)
– Independence (time series plot, residuals v preceding residual)
– Normality/outliers (normal plot)
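A minimal base-R sketch of these checks for a fitted model object fit (a generic name):

res <- residuals(fit)                  # residuals
fv  <- fitted(fit)                     # fitted values
h   <- hatvalues(fit)                  # hat matrix diagonals (leverages)
plot(fv, res)                          # mean linear in the x's?
plot(fv, res^2)                        # constant variance?
plot(res, type = "l")                  # time series plot, for independence
qqnorm(res); qqline(res)               # normality / outliers
## partial residual and CERES plots are available in add-on packages (e.g. car::crPlots, car::ceresPlots)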
Remedial Action
– Transform variables
– Delete outliers
– Weighted least squares
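For the weighted least squares option, lm() takes a weights argument; a sketch using the worked-example formula and a hypothetical weight vector w (e.g. inverse variances):

fit.wls <- lm(logad ~ poly(CARAGE, 2) + PRIMAGEN + gender,
              weights = w)             # minimizes the weighted sum of squares sum(w * residual^2)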
Software
– SAS: PROC REG, PROC GLM
– S-PLUS, R: lm
– Usage: lm(model formula, dataframe, weights, …)
Model Formula
– Assume k = 3
– If x1, x2, x3 all continuous, fit a plane: Y ~ x1 + x2 + x3
– If x1 categorical (e.g. gender) and x2, x3 continuous, fit a different plane/curve in x2, x3 for each level of x1:
   Y ~ x1 + x2 + x3 (planes parallel)
   Y ~ x1 + x2 + x3 + x1:x2 + x1:x3 (planes different)
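A sketch of fitting and comparing these two formulas in R, with hypothetical variables (x1 a factor, x2 and x3 numeric, y the response, all in a data frame dat):

fit.par  <- lm(y ~ x1 + x2 + x3, data = dat)                   # parallel planes
fit.diff <- lm(y ~ x1 + x2 + x3 + x1:x2 + x1:x3, data = dat)   # different planes
## equivalent shorthand for the second model: y ~ x1 * (x2 + x3)
anova(fit.par, fit.diff)                                       # F test: do the planes differ?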
Insurance Example (1)
cars.lm <- lm(logad ~ poly(CARAGE, 2) + PRIMAGEN + gender)
summary(cars.lm)
[summary output (numeric values omitted): the intercept and both poly(CARAGE, 2) terms are highly significant (p < 2e-16, and of order 1e-09 and 1e-11), PRIMAGEN is significant at the 1% level, gender is not significant; residual standard error on 4995 degrees of freedom; F-statistic on 4 and 4995 DF, p-value < 2.2e-16]
Insurance Example (2)
> plot(cars.lm)
(produces the standard lm diagnostic plots: residuals vs fitted, normal Q-Q, scale-location, residuals vs leverage)