Lectures 15,16 – Additive Models, Trees, and Related Methods Rice ECE697 Farinaz Koushanfar Fall 2006
Summary: Generalized Additive Models; Tree-Based Methods; PRIM – Bump Hunting; Multivariate Adaptive Regression Splines (MARS); Missing Data
Additive Models: In real life, effects are often nonlinear; additive models capture this by replacing each linear term with a smooth function f_j(X_j). Note: some slides are borrowed from Tibshirani.
Examples
The Price for Additivity: Data from a study of diabetic children, predicting log C-peptide (a blood measurement).
Generalized Additive Models (GAM): e.g., two-class logistic regression, where the logit of the class probability is modeled additively, \log[\mu(X)/(1-\mu(X))] = \alpha + f_1(X_1) + \cdots + f_p(X_p).
Other Examples
Fitting Additive Models: Given observations (x_i, y_i), a criterion like the penalized sum of squares can be specified for this problem, PRSS(\alpha, f_1, \ldots, f_p) = \sum_{i=1}^N \big( y_i - \alpha - \sum_{j=1}^p f_j(x_{ij}) \big)^2 + \sum_{j=1}^p \lambda_j \int f_j''(t_j)^2 \, dt_j, where the \lambda_j \ge 0 are tuning parameters. The error term \varepsilon in the model Y = \alpha + \sum_j f_j(X_j) + \varepsilon has mean zero!
The Backfitting Algorithm for Additive Models. Initialize: \hat\alpha = \frac{1}{N}\sum_{i=1}^N y_i, \hat f_j \equiv 0 for all j. Cycle: for j = 1, 2, \ldots, p, 1, 2, \ldots, p, 1, \ldots, set \hat f_j \leftarrow S_j\big[ \{ y_i - \hat\alpha - \sum_{k \neq j} \hat f_k(x_{ik}) \}_1^N \big] and recenter \hat f_j \leftarrow \hat f_j - \frac{1}{N}\sum_{i=1}^N \hat f_j(x_{ij}). Repeat until the functions \hat f_j change less than a prespecified threshold.
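As a concrete illustration, here is a minimal Python sketch of the backfitting cycle, assuming a simple Gaussian-kernel smoother plays the role of the scatterplot smoother S_j; the names (backfit, kernel_smooth) and the bandwidth are illustrative choices, not part of the lecture.

import numpy as np

def kernel_smooth(x, r, bandwidth=0.5):
    # Nadaraya-Watson smoother: smooth partial residuals r against predictor x.
    d = (x[:, None] - x[None, :]) / bandwidth
    w = np.exp(-0.5 * d ** 2)                 # Gaussian kernel weights
    return (w @ r) / w.sum(axis=1)

def backfit(X, y, n_iter=100, tol=1e-6):
    n, p = X.shape
    alpha = y.mean()                          # initialize: alpha-hat = mean(y)
    f = np.zeros((n, p))                      # initialize: f_j = 0 for all j
    for _ in range(n_iter):
        f_old = f.copy()
        for j in range(p):                    # cycle j = 1, ..., p
            partial = y - alpha - f.sum(axis=1) + f[:, j]   # partial residual
            f[:, j] = kernel_smooth(X[:, j], partial)
            f[:, j] -= f[:, j].mean()         # recenter for identifiability
        if np.abs(f - f_old).max() < tol:     # stop when the f_j stabilize
            break
    return alpha, f

# Usage on an additive truth: y = sin(x1) + x2^2 + noise
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(200, 2))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.normal(size=200)
alpha, f = backfit(X, y)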
Fitting Additive Models (Cont’d)
Example: Penalized Least Squares
Example: Fitting GAM for Logistic Regression (Newton-Raphson Algorithm)
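A minimal sketch of how the Newton-Raphson step looks in code, reusing the backfit() routine sketched above: each outer iteration forms the working response of iteratively reweighted least squares and backfits it. For simplicity the observation weights p(1-p) are dropped here, which is a crude approximation to the full local scoring algorithm; a faithful version would use a weighted smoother.

import numpy as np

def logistic_gam(X, y, n_outer=10):
    # Initialize alpha at the logit of the base rate, all f_j = 0.
    alpha = np.log(y.mean() / (1.0 - y.mean()))
    f = np.zeros_like(X, dtype=float)
    for _ in range(n_outer):
        eta = alpha + f.sum(axis=1)           # current additive predictor
        p = 1.0 / (1.0 + np.exp(-eta))        # fitted probabilities
        p = np.clip(p, 1e-6, 1 - 1e-6)        # guard the Newton step
        z = eta + (y - p) / (p * (1.0 - p))   # working response
        alpha, f = backfit(X, z)              # unweighted backfit (simplification)
    return alpha, f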
Example: Predicting Spam. Data from 4601 email messages, with spam = 1 and email = 0; the filter is trained for each user separately. Goal: predict whether an email is spam (junk mail) or good. Input features: the relative frequencies, in a message, of 57 of the most commonly occurring words and punctuation marks across the training set. Not all errors are equal: we want to avoid filtering out good email, while letting spam get through is undesirable but less serious in its consequences.
Predictors
Details
Some Important Features
Results: Test-data confusion matrix for the additive logistic regression model fit to the spam training data. The overall test error rate is 5.3%.
Summary of Additive Logistic Fit: Significant predictors from the additive model fit to the spam training data. The coefficients represent the linear part of \hat f_j, along with their standard errors and Z-scores. The nonlinear p-value tests the nonlinearity of \hat f_j.
Example: Plots for Spam Analysis Figure 9.1. Spam analysis: estimated functions for significant predictors. The rug plot along the bottom of each frame indicates the observed values of the corresponding predictor. For many predictors, the nonlinearity picks up the discontinuity at zero.
In Summary: Additive models are a useful extension of linear models, making them more flexible. The backfitting procedure is simple and modular. Limitations for large data-mining applications: backfitting fits all predictors, which is not desirable when a large number are available.
Tree-Based Methods: partition the feature space into a set of rectangles by recursive binary splitting, then fit a simple model (such as a constant) in each rectangle.
Node Impurity Measures: for node m with class proportions \hat p_{mk}, the common choices are the misclassification error 1 - \hat p_{m k(m)} with k(m) = \arg\max_k \hat p_{mk}, the Gini index \sum_k \hat p_{mk}(1 - \hat p_{mk}), and the cross-entropy (deviance) -\sum_k \hat p_{mk} \log \hat p_{mk}.
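For reference, the three impurity measures in code; props holds the class proportions \hat p_{mk} in a node, and the function names are illustrative.

import numpy as np

def misclassification_error(props):
    return 1.0 - props.max()                  # 1 - p_mk for the majority class

def gini_index(props):
    return np.sum(props * (1.0 - props))

def cross_entropy(props):
    p = props[props > 0]                      # avoid log(0)
    return -np.sum(p * np.log(p))

# Usage: a node with class proportions (0.7, 0.2, 0.1)
props = np.array([0.7, 0.2, 0.1])
print(misclassification_error(props), gini_index(props), cross_entropy(props))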
Results for Spam Example
Pruned Tree for the Spam Example
Classification Rules Fit to the Spam Data
PRIM – Bump Hunting: the Patient Rule Induction Method seeks boxes in feature space within which the response average is high (bumps).
Number of Observations in a Box
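A minimal sketch of PRIM's top-down peeling, assuming at each step a fraction alpha of the observations is peeled from whichever face of the box leaves the highest mean response; pasting and cross-validated selection of the final box are omitted, and all names are illustrative.

import numpy as np

def prim_peel(X, y, alpha=0.1, min_obs=10):
    inside = np.ones(len(y), dtype=bool)      # start with all observations in the box
    trajectory = [(int(inside.sum()), y.mean())]
    while inside.sum() > min_obs:
        best_mean, best_inside = -np.inf, None
        for j in range(X.shape[1]):           # try peeling each face of the box
            lo, hi = np.quantile(X[inside, j], [alpha, 1.0 - alpha])
            for keep in (X[:, j] >= lo, X[:, j] <= hi):
                cand = inside & keep
                if 0 < cand.sum() < inside.sum() and y[cand].mean() > best_mean:
                    best_mean, best_inside = y[cand].mean(), cand
        if best_inside is None:               # nothing left to peel
            break
        inside = best_inside
        trajectory.append((int(inside.sum()), best_mean))
    return trajectory                         # (observations in box, mean response)

# Usage: the response is high inside the box [0.6, 1] x [0.6, 1]
rng = np.random.default_rng(2)
X = rng.uniform(size=(500, 2))
y = ((X[:, 0] > 0.6) & (X[:, 1] > 0.6)).astype(float)
print(prim_peel(X, y)[-1])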
Basis Functions: MARS builds its model from reflected pairs of piecewise-linear functions (x - t)_+ and (t - x)_+, with a knot t at each observed value x_{ij} of each predictor.
MARS Forward Modeling Procedure
Multiplication of Basis Functions: new candidate terms are formed by multiplying a basis function already in the model by a reflected pair, so interactions between predictors enter the model as products.
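A minimal sketch of one greedy step of the forward procedure, assuming candidate knots at the observed values and least-squares refitting: the reflected pair that most reduces the residual sum of squares is added to the basis. Products with existing non-constant terms (the interactions just described) are omitted to keep the sketch short; names are illustrative.

import numpy as np

def hinge(x, t):
    # The reflected pair (x - t)_+ and (t - x)_+.
    return np.maximum(x - t, 0.0), np.maximum(t - x, 0.0)

def forward_step(X, y, basis):
    best_rss, best_basis = np.inf, basis
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):          # knots at observed values x_ij
            h1, h2 = hinge(X[:, j], t)
            B = np.column_stack([basis, h1, h2])
            beta = np.linalg.lstsq(B, y, rcond=None)[0]
            rss = np.sum((y - B @ beta) ** 2)
            if rss < best_rss:
                best_rss, best_basis = rss, B
    return best_basis

# Usage: start from the constant model M = {1} and add one reflected pair
rng = np.random.default_rng(1)
X = rng.uniform(size=(100, 3))
y = 2.0 * np.maximum(X[:, 0] - 0.5, 0.0) + 0.05 * rng.normal(size=100)
basis = forward_step(X, y, np.ones((100, 1)))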
MARS on Spam Example