Purpose of Regression Analysis
Regression analysis is used primarily to model causality and provide prediction:
–Predicts the value of a dependent (response) variable based on the value of at least one independent (explanatory) variable
–Explains the effect of the independent variables on the dependent variable
Types of Regression Models
–Positive linear relationship
–Negative linear relationship
–Relationship not linear
–No relationship
Simple Linear Regression Model
–Relationship between the variables is described by a linear function
–A change in one variable causes a change in the other variable
–A dependency of one variable on the other
Population Linear Regression
The population regression line is a straight line that describes the dependence of the average value (conditional mean) of one variable on the other:

    Yᵢ = β₀ + β₁Xᵢ + εᵢ

where β₀ is the population Y intercept, β₁ the population slope coefficient, εᵢ the random error, Y the dependent (response) variable, and X the independent (explanatory) variable.
Population Linear Regression (continued)
[Scatter diagram: an observed value Yᵢ lies a random error εᵢ above or below its conditional mean β₀ + β₁Xᵢ on the population regression line.]
Sample Linear Regression
The sample regression line provides an estimate of the population regression line as well as a predicted value of Y:

    Ŷᵢ = b₀ + b₁Xᵢ

where b₀ is the sample Y intercept, b₁ the sample slope coefficient, and eᵢ = Yᵢ − Ŷᵢ the residual. The sample regression line is also called the fitted regression line, and Ŷᵢ is the predicted value.
Sample Linear Regression (continued)
b₀ and b₁ are obtained by finding the values of b₀ and b₁ that minimize the sum of the squared residuals, Σeᵢ². b₀ provides an estimate of β₀; b₁ provides an estimate of β₁.
Sample Linear Regression (continued)
[Scatter diagram: observed values Yᵢ plotted against X with the fitted line Ŷᵢ = b₀ + b₁Xᵢ.]
Interpretation of the Slope and the Intercept
–β₀ is the average value of Y when the value of X is zero.
–β₁ measures the change in the average value of Y as a result of a one-unit change in X.
Interpretation of the Slope and the Intercept (continued)
–b₀ is the estimated average value of Y when the value of X is zero.
–b₁ is the estimated change in the average value of Y as a result of a one-unit change in X.
Simple Linear Regression: Example
You want to examine the linear dependency of the annual sales of produce stores on their size in square footage. Sample data for seven stores were obtained. Find the equation of the straight line that fits the data best.

Store   Square Feet   Annual Sales ($1000)
1       1,726         3,681
2       1,542         3,395
3       2,816         6,653
4       5,555         9,543
5       1,292         3,318
6       2,208         5,563
7       1,313         3,760
Scatter Diagram: Example
[Excel scatter plot of annual sales vs. square feet for the seven stores.]
Equation for the Sample Regression Line: Example
From the Excel printout:

    Ŷᵢ = 1636.415 + 1.487Xᵢ
Excel Output

Regression Statistics
Multiple R           0.970557
R Square             0.941981
Adjusted R Square    0.930378
Standard Error       611.7515
Observations         7

ANOVA         df   SS         MS         F          Significance F
Regression    1    30380456   30380456   81.17909   0.000281
Residual      5    1871200    374239.9
Total         6    32251656

              Coefficients   Standard Error   t Stat     P-value    Lower 95%   Upper 95%
Intercept     1636.415       451.4953         3.624433   0.015149   475.8109    2797.019
X Variable 1  1.486634       0.164999         9.009944   0.000281   1.06249     1.910777
Graph of the Sample Regression Line: Example
[Plot of the seven data points with the fitted line Ŷᵢ = 1636.415 + 1.487Xᵢ.]
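As a cross-check on the Excel printout, here is a minimal numpy sketch (variable names are ours) that reproduces the fitted line from the seven data points:

```python
import numpy as np

# Produce-store data from the slide: size (sq ft) and annual sales ($1000s)
sqft  = np.array([1726, 1542, 2816, 5555, 1292, 2208, 1313], dtype=float)
sales = np.array([3681, 3395, 6653, 9543, 3318, 5563, 3760], dtype=float)

# Least-squares estimates: b1 = S_xy / S_xx, b0 = ybar - b1 * xbar
xbar, ybar = sqft.mean(), sales.mean()
b1 = np.sum((sqft - xbar) * (sales - ybar)) / np.sum((sqft - xbar) ** 2)
b0 = ybar - b1 * xbar

print(f"b0 = {b0:.3f}, b1 = {b1:.3f}")  # expect b0 ≈ 1636.415, b1 ≈ 1.487
```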
Interpretation of Results: Example
The slope of 1.487 means that for each one-unit increase in X, the average value of Y is predicted to increase by an estimated 1.487 units. The model estimates that for each additional square foot of store size, expected annual sales increase by $1,487.
How Good Is the Regression?
–r²
–Confidence intervals
–Residual plots
–Analysis of variance
–Hypothesis (t) tests
Measure of Variation: The Sum of Squares
SST = SSR + SSE
Total sample variability = Explained variability + Unexplained variability
Measure of Variation: The Sum of Squares (continued)
–SST = total sum of squares: measures the variation of the Yᵢ values around their mean Ȳ
–SSR = regression sum of squares: explained variation attributable to the relationship between X and Y
–SSE = error sum of squares: variation attributable to factors other than the relationship between X and Y
Measure of Variation: The Sum of Squares (continued)

    SST = Σ(Yᵢ − Ȳ)²
    SSR = Σ(Ŷᵢ − Ȳ)²
    SSE = Σ(Yᵢ − Ŷᵢ)²
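Continuing the numpy sketch above, the three sums of squares can be computed directly from the fitted values and checked against the identity SST = SSR + SSE:

```python
# Continuing from the fit above (b0, b1, sqft, sales already defined)
yhat = b0 + b1 * sqft                      # fitted values on the regression line

sst = np.sum((sales - sales.mean()) ** 2)  # total variation around the mean
ssr = np.sum((yhat - sales.mean()) ** 2)   # variation explained by the line
sse = np.sum((sales - yhat) ** 2)          # residual (unexplained) variation

print(np.isclose(sst, ssr + sse))          # True: SST = SSR + SSE
print(f"r^2 = {ssr / sst:.6f}")            # expect ≈ 0.941981, as in the Excel output
```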
The Coefficient of Determination

    r² = SSR / SST

Measures the proportion of variation in Y that is explained by the independent variable X in the regression model.
Coefficients of Determination (r²) and Correlation (r)
[Four panels, each showing data with the fitted line Ŷᵢ = b₀ + b₁Xᵢ:]
–r² = 1, r = +1: all points on an upward-sloping line
–r² = 1, r = −1: all points on a downward-sloping line
–r² = .8, r = +0.9: points scattered closely around an upward-sloping line
–r² = 0, r = 0: no linear relationship
Linear Regression Assumptions
1. Linearity
2. Normality
–Y values are normally distributed for each X
–Probability distribution of error is normal
3. Homoscedasticity (constant variance)
4. Independence of errors
Residual Analysis
Purposes:
–Examine linearity
–Evaluate violations of assumptions
Graphical analysis of residuals:
–Plot residuals vs. Xᵢ, Yᵢ, and time
Residual Analysis for Linearity
[Two pairs of panels: Y vs. X and residuals e vs. X. When the relationship is not linear, the residuals show a curved pattern; when it is linear, the residuals scatter randomly around zero.]
Residual Analysis for Homoscedasticity
[Two pairs of panels: Y vs. X and standardized residuals (SR) vs. X. Under heteroscedasticity the spread of the residuals changes with X, e.g., fans out; under homoscedasticity the spread is constant.]
Variation of Errors around the Regression Line
–Y values are normally distributed around the regression line.
–For each X value, the "spread" or variance around the regression line is the same.
[Figure: normal error distributions f(e) of equal spread centered on the sample regression line at X₁ and X₂.]
Residual Analysis: Excel Output for Produce Stores Example
[Excel residual output and residual plot for the seven-store regression.]
Residual Analysis for Independence
The residual is plotted against time to detect any autocorrelation (graphical approach).
–Not independent: cyclical pattern in e vs. time
–Independent: no particular pattern in e vs. time
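A minimal matplotlib sketch of the residual plots described above, continuing from the produce-store fit (for a time plot, the x-axis would be the observation order instead):

```python
import matplotlib.pyplot as plt

# Residuals from the produce-store fit above, plotted against X;
# a patternless band around zero is consistent with the assumptions
residuals = sales - (b0 + b1 * sqft)

plt.scatter(sqft, residuals)
plt.axhline(0.0, linestyle="--")
plt.xlabel("Square feet (X)")
plt.ylabel("Residual e_i")
plt.title("Residuals vs. X")
plt.show()
```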
Inference about the Slope: t Test
t test for a population slope: is there a linear dependency of Y on X?
–Null and alternative hypotheses: H₀: β₁ = 0 (no linear dependency); H₁: β₁ ≠ 0 (linear dependency)
–Test statistic: t = (b₁ − β₁) / s_b₁, with d.f. = n − 2
Example: Produce Store
Data for seven stores:

Store   Square Feet   Annual Sales ($1000)
1       1,726         3,681
2       1,542         3,395
3       2,816         6,653
4       5,555         9,543
5       1,292         3,318
6       2,208         5,563
7       1,313         3,760

Estimated regression equation: Ŷᵢ = 1636.415 + 1.487Xᵢ. The slope of this model is 1.487. Does the square footage of the store affect its annual sales?
Inferences about the Slope: t Test Example
H₀: β₁ = 0   H₁: β₁ ≠ 0   α = .05   df = 7 − 2 = 5   Critical values: ±2.5706
Test statistic: from the Excel printout, t = 9.0099 with p-value = 0.000281.
Decision: reject H₀ (t = 9.0099 falls in the rejection region beyond 2.5706).
Conclusion: there is evidence that square footage affects annual sales.
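The t statistic can also be reproduced from the data alone; a sketch continuing from the fit above, with scipy supplying the critical value and p-value:

```python
from scipy import stats

# Standard error of the slope: s = sqrt(SSE / (n - 2)), s_b1 = s / sqrt(S_xx)
n = len(sqft)
s = np.sqrt(np.sum((sales - (b0 + b1 * sqft)) ** 2) / (n - 2))
se_b1 = s / np.sqrt(np.sum((sqft - sqft.mean()) ** 2))

t_stat = b1 / se_b1                           # test of H0: beta1 = 0
t_crit = stats.t.ppf(1 - 0.05 / 2, df=n - 2)  # two-sided, alpha = .05
p_val = 2 * stats.t.sf(abs(t_stat), df=n - 2)

print(f"t = {t_stat:.4f}, critical = ±{t_crit:.4f}, p = {p_val:.6f}")
# expect t ≈ 9.0099 > 2.5706 and p ≈ 0.000281, so H0 is rejected
```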
The Multiple Regression Model
Relationship between one dependent and two or more independent variables is a linear function:

    Yᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + … + βₖXₖᵢ + εᵢ

where β₀ is the population Y-intercept, β₁ … βₖ the population slopes, and εᵢ the random error. In the sample model, Yᵢ is the dependent (response) variable, X₁ᵢ … Xₖᵢ are the independent (explanatory) variables, and eᵢ is the residual.
Population Multiple Regression Model
Bivariate model:

    Yᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + εᵢ

[Figure: the population response plane in (X₁, X₂, Y) space.]
Sample Multiple Regression Model
Bivariate model:

    Ŷᵢ = b₀ + b₁X₁ᵢ + b₂X₂ᵢ

[Figure: the sample regression plane fitted to the observations.]
Simple and Multiple Regression Compared
–Coefficients in a simple regression pick up the impact of that variable plus the impacts of other variables that are correlated with it and with the dependent variable.
–Coefficients in a multiple regression net out the impacts of the other variables in the equation.
Simple and Multiple Regression Compared: Example
–Two simple regressions: Ŷ = b₀ + b₁X₁ and Ŷ = b₀ + b₂X₂
–Multiple regression: Ŷ = b₀ + b₁X₁ + b₂X₂
Multiple Linear Regression Equation
Too complicated by hand! Ouch!
Interpretation of Estimated Coefficients
–Slope (bᵢ): the average value of Y is estimated to change by bᵢ for each one-unit increase in Xᵢ, holding all other variables constant (ceteris paribus). Example: if b₁ = −2, then fuel oil usage (Y) is expected to decrease by an estimated 2 gallons for each 1-degree increase in temperature (X₁), given the inches of insulation (X₂).
–Y-intercept (b₀): the estimated average value of Y when all Xᵢ = 0.
Multiple Regression Model: Example
Develop a model for estimating the heating oil used for a single-family home in the month of January, based on average temperature (°F) and amount of insulation (inches).
Sample Multiple Regression Equation: Example
[Excel output for the heating-oil regression.]
For each degree increase in temperature, the estimated average amount of heating oil used decreases by 5.437 gallons, holding insulation constant. For each additional inch of insulation, the estimated average use of heating oil decreases by 20.012 gallons, holding temperature constant.
Confidence Interval Estimate for the Slope
Provide the 95% confidence interval for the population slope β₁ (the effect of temperature on oil consumption):

    −6.169 ≤ β₁ ≤ −4.704

The estimated average consumption of oil is reduced by between 4.704 and 6.169 gallons for each 1°F increase in temperature.
Coefficient of Multiple Determination
Proportion of total variation in Y explained by all the X variables taken together:

    r²_{Y·12…k} = SSR / SST

–Never decreases when a new X variable is added to the model
–A disadvantage when comparing models
Adjusted Coefficient of Multiple Determination
Proportion of variation in Y explained by all the X variables, adjusted for the number of X variables used:

    r²_adj = 1 − (1 − r²)(n − 1) / (n − k − 1)

–Penalizes excessive use of independent variables
–Smaller than r²
–Useful in comparing among models
Coefficient of Multiple Determination: Excel Output
[Excel output.] Adjusted r² reflects the number of explanatory variables and the sample size; it is smaller than r².
Interpretation of Coefficient of Multiple Determination
–r² = .9656: 96.56% of the total variation in heating oil use can be explained by temperature and amount of insulation.
–r²_adj = .9599: 95.99% of the total fluctuation in heating oil use can be explained by temperature and amount of insulation, after adjusting for the number of explanatory variables and the sample size.
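The adjusted value follows from the formula above. In this sketch the sample size is an assumption: the slide does not show n, and n = 15 is the value that makes the two numbers agree:

```python
def adjusted_r2(r2: float, n: int, k: int) -> float:
    """r2_adj = 1 - (1 - r2) * (n - 1) / (n - k - 1)."""
    return 1 - (1 - r2) * (n - 1) / (n - k - 1)

# Heating-oil example: r^2 = 0.9656 with k = 2 explanatory variables.
# n = 15 is an ASSUMPTION -- the slide does not state the sample size.
print(f"{adjusted_r2(0.9656, n=15, k=2):.4f}")  # ≈ 0.9599, matching the slide
```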
Using the Model to Make Predictions
Predict the amount of heating oil used for a home if the average temperature is 30°F and the insulation is six inches. The predicted heating oil used is 278.97 gallons.
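A sketch of this prediction. The slides give only the two slopes (−5.437 and −20.012); the intercept below is inferred so that the stated prediction of 278.97 gallons is reproduced, and should be read as an assumption:

```python
# Heating-oil model: slopes from the slides; intercept INFERRED (assumption)
# so that the slide's prediction of 278.97 gallons comes out
b0_oil, b_temp, b_insul = 562.15, -5.437, -20.012

temp_f, insulation_in = 30.0, 6.0
oil = b0_oil + b_temp * temp_f + b_insul * insulation_in
print(f"Predicted heating oil: {oil:.2f} gallons")  # ≈ 278.97
```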
Testing for Overall Significance
–Shows whether there is a linear relationship between all of the X variables together and Y
–Uses the F test statistic
–Hypotheses: H₀: β₁ = β₂ = … = βₖ = 0 (no linear relationship); H₁: at least one βᵢ ≠ 0 (at least one independent variable affects Y)
–The null hypothesis is a very strong statement; it is almost always rejected
Test for Significance: Individual Variables
–Shows whether there is a linear relationship between the variable Xᵢ and Y
–Uses the t test statistic
–Hypotheses: H₀: βᵢ = 0 (no linear relationship); H₁: βᵢ ≠ 0 (linear relationship between Xᵢ and Y)
Residual Plots
–Residuals vs. X₁: may need to transform the variable
–Residuals vs. X₂: may need to transform the variable
–Residuals vs. time: may have autocorrelation
Residual Plots: Example
[Excel residual plots for the heating-oil model: no discernible pattern in one plot; maybe some non-linear relationship in the other.]
The Quadratic Regression Model
–Relationship between the response variable and the explanatory variable is a quadratic polynomial function
–Useful when the scatter diagram indicates a non-linear relationship
–Quadratic model: Yᵢ = β₀ + β₁X₁ᵢ + β₂X₁ᵢ² + εᵢ
–The second explanatory variable is the square of the first variable
Quadratic Regression Model (continued)
Quadratic models may be considered when the scatter diagram takes on one of the following shapes:
[Four panels of Y vs. X₁: curves opening upward when β₂ > 0 and downward when β₂ < 0, where β₂ is the coefficient of the quadratic term.]
Testing for Significance: Quadratic Model
–Testing for overall relationship: similar to the test for the linear model; F test statistic = MSR / MSE
–Testing the quadratic effect: compare the quadratic model with the linear model
–Hypotheses: H₀: β₂ = 0 (no 2nd-order polynomial term); H₁: β₂ ≠ 0 (2nd-order polynomial term is needed)
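A minimal numpy sketch of fitting a quadratic model: the squared term is simply added as a second column of the design matrix (the data here are synthetic, invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 + 1.5 * x - 0.3 * x**2 + rng.normal(0.0, 1.0, x.size)  # synthetic

# The "second explanatory variable" is just the square of the first
X = np.column_stack([np.ones_like(x), x, x**2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
print(b)  # roughly [2.0, 1.5, -0.3]
```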
Dummy Variable Models
–Categorical explanatory variable (dummy variable) with two or more levels: yes or no, on or off, male or female, etc.
–Coded as 0 or 1
–Only the intercepts are different; assumes equal slopes across categories
–The number of dummy variables needed is (number of levels − 1)
–Regression model has the same form
Dummy-Variable Models (with 2 Levels)
Given: Ŷᵢ = b₀ + b₁X₁ᵢ + b₂X₂ᵢ, where
–Y = assessed value of house
–X₁ = square footage of house
–X₂ = desirability of neighborhood: 0 if undesirable, 1 if desirable
The two levels (desirable, X₂ = 1; undesirable, X₂ = 0) give two lines with the same slope.
Dummy-Variable Models (with 2 Levels) (continued)
[Figure: assessed value Y vs. square footage X₁; two parallel lines (same slopes, different intercepts): intercept b₀ for an undesirable location and b₀ + b₂ for a desirable location.]
Interpretation of the Dummy Variable Coefficient (with 2 Levels)
Example: Y = annual salary of a college graduate in thousand $, X₁ = GPA, X₂ = 0 if female, 1 if male. With an estimated dummy coefficient of 6, male college graduates are, on average, making an estimated six thousand dollars more than female college graduates with the same GPA.
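A numpy sketch of the salary example. The data are invented for illustration; only the model form (GPA plus a 0/1 gender dummy) comes from the slide:

```python
import numpy as np

gpa    = np.array([2.8, 3.0, 3.2, 3.4, 3.6, 3.8, 3.0, 3.5])
male   = np.array([0.0, 1.0, 0.0, 1.0, 0.0, 1.0, 1.0, 0.0])  # dummy: 1 = male
salary = np.array([44., 52., 47., 55., 50., 58., 53., 49.])  # $1000s, invented

X = np.column_stack([np.ones_like(gpa), gpa, male])
b, *_ = np.linalg.lstsq(X, salary, rcond=None)
print(f"b0 = {b[0]:.2f}, b1 (GPA) = {b[1]:.2f}, b2 (male dummy) = {b[2]:.2f}")
# b2 estimates the male-female salary gap at equal GPA (about 6 on the slide)
```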
Dummy-Variable Models (with 3 Levels)
With three levels (e.g., house style: Condo, Ranch, Split-level), two dummy variables are needed; one level (here, Condo) serves as the baseline.
Interpretation of the Dummy Variable Coefficients (with 3 Levels)
–With the same square footage, a Split-level will have an estimated average assessed value 18.84 thousand dollars higher than a Condo.
–With the same square footage, a Ranch will have an estimated average assessed value 23.53 thousand dollars higher than a Condo.
Dummy Variables
Predict weekly sales in a grocery store.
–Possible independent variables: price, grocery chain
–Data set: Grocery.xls
–Interaction effect?
Interaction Regression Model
–Hypothesizes interaction between pairs of X variables: the response to one X variable varies at different levels of another X variable
–Contains two-way cross-product terms: Yᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃X₁ᵢX₂ᵢ + εᵢ
–Can be combined with other models, e.g., the dummy variable model
Effect of Interaction
–Given: Yᵢ = β₀ + β₁X₁ᵢ + β₂X₂ᵢ + β₃X₁ᵢX₂ᵢ + εᵢ
–Without the interaction term, the effect of X₁ on Y is measured by β₁
–With the interaction term, the effect of X₁ on Y is measured by β₁ + β₃X₂
–The effect changes as X₂ increases
Interaction Example
Given: Y = 1 + 2X₁ + 3X₂ + 4X₁X₂
–When X₂ = 1: Y = 1 + 2X₁ + 3(1) + 4X₁(1) = 4 + 6X₁
–When X₂ = 0: Y = 1 + 2X₁ + 3(0) + 4X₁(0) = 1 + 2X₁
The effect (slope) of X₁ on Y does depend on the value of X₂, as the sketch below verifies.
[Figure: the two lines plotted over 0 ≤ X₁ ≤ 1.5.]
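The two slopes can be checked directly from the slide's equation:

```python
# The slide's interaction equation: Y = 1 + 2*X1 + 3*X2 + 4*X1*X2
def y(x1: float, x2: float) -> float:
    return 1 + 2 * x1 + 3 * x2 + 4 * x1 * x2

# Slope of X1 at fixed X2 is beta1 + beta3 * X2 = 2 + 4 * X2
print(y(1.0, 0.0) - y(0.0, 0.0))  # 2.0 -> slope when X2 = 0
print(y(1.0, 1.0) - y(0.0, 1.0))  # 6.0 -> slope when X2 = 1
```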
Interaction Regression Model Worksheet
Multiply X₁ by X₂ to get X₁X₂, then run the regression with Y, X₁, X₂, and X₁X₂.
Evaluating Presence of Interaction
–Hypothesize interaction between pairs of independent variables
–Contains two-way product terms
Using Transformations
–Requires data transformation
–Either or both the independent and the dependent variables may be transformed
–Can be based on theory, logic, or scatter diagrams
Inherently Linear Models
–Non-linear models that can be expressed in linear form
–Can be estimated by least squares in linear form
–Require data transformation
Transformed Multiplicative Model (Log-Log)
Original multiplicative model: Yᵢ = β₀X₁ᵢ^β₁ εᵢ. Taking logs gives the linear form

    ln Yᵢ = ln β₀ + β₁ ln X₁ᵢ + ln εᵢ

Similarly for X₂.
Square Root Transformation

    Yᵢ = β₀ + β₁√X₁ᵢ + εᵢ

[Panels: curve rising for β₁ > 0, falling for β₁ < 0.]
Similarly for X₂. Transforms a non-linear model into one that appears linear. Often used to overcome heteroscedasticity.
Linear-Logarithmic Transformation

    Yᵢ = β₀ + β₁ ln X₁ᵢ + εᵢ

[Panels: curve rising for β₁ > 0, falling for β₁ < 0.]
Similarly for X₂. Transformed from an original multiplicative model.
Exponential Transformation (Log-Linear)
Original model: Yᵢ = e^(β₀ + β₁X₁ᵢ) εᵢ. Transformed into:

    ln Yᵢ = β₀ + β₁X₁ᵢ + ln εᵢ

[Panels: curve rising for β₁ > 0, falling for β₁ < 0.]
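A numpy sketch of the log-log case: fit by ordinary least squares after taking logs of both sides (the data are synthetic, invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x1 = np.linspace(1.0, 20.0, 40)
y = 3.0 * x1**0.7 * np.exp(rng.normal(0.0, 0.05, x1.size))  # multiplicative

# ln Y = ln(beta0) + beta1 * ln(X1): linear in the transformed variables
A = np.column_stack([np.ones_like(x1), np.log(x1)])
coef, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)
print(f"beta0 ≈ {np.exp(coef[0]):.2f}, beta1 ≈ {coef[1]:.2f}")  # ≈ 3.0 and 0.7
```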
Model Building / Model Selection
Find "the best" set of explanatory variables among all the ones given.
–"Best subset" regression (only linear models): requires a lot of computation (2^N regressions)
–"Stepwise regression"
–"Common sense" methodology: run a regression with all variables; throw out variables that are not statistically significant; "adjust" the model by including some excluded variables, one at a time (see the sketch below)
Tradeoff: parsimony vs. fit
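A sketch of the "throw out variables not statistically significant" step using statsmodels; the function name and the pandas DataFrame input are our assumptions, not from the slides:

```python
import statsmodels.api as sm

def backward_eliminate(y, X, alpha=0.05):
    """Repeatedly drop the least significant variable until all remaining
    p-values are below alpha. X is a pandas DataFrame of candidate variables."""
    cols = list(X.columns)
    while cols:
        fit = sm.OLS(y, sm.add_constant(X[cols])).fit()
        pvals = fit.pvalues.drop("const")   # ignore the intercept
        worst = pvals.idxmax()
        if pvals[worst] < alpha:
            return fit                      # every remaining variable is significant
        cols.remove(worst)                  # throw out the worst variable
    return None                             # nothing survived

# Re-adding excluded variables one at a time, per the slide, would be a second loop.
```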
Association ≠ Causation!
Regression Limitations
–R² measures the association between independent and dependent variables; association ≠ causation!
–Be careful about making predictions that involve extrapolation
–Inclusion/exclusion of independent variables is subject to type I / type II errors
Multicollinearity
What?
–When one independent variable is highly correlated ("collinear") with one or more other independent variables
–Examples: square feet and square meters as independent variables to predict house price (1 sq ft is roughly 0.09 sq m); "total rooms" and bedrooms plus bathrooms for a house
How to detect?
–Run a regression with the "not-so-independent" independent variable (in the examples above: square feet, or total rooms) as a function of all the other independent variables, e.g.: X₁ = β₀ + β₂X₂ + … + βₖXₖ
–If the R² of this regression is > 0.8, then one suspects multicollinearity to be present (see the sketch below)
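A numpy sketch of the auxiliary regression described above (the helper name is ours); the related variance inflation factor is VIF = 1 / (1 − R²):

```python
import numpy as np

def auxiliary_r2(X: np.ndarray, j: int) -> float:
    """R^2 from regressing column j of X on all the other columns of X."""
    target = X[:, j]
    others = np.delete(X, j, axis=1)
    A = np.column_stack([np.ones(X.shape[0]), others])
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    resid = target - A @ coef
    return 1.0 - resid @ resid / np.sum((target - target.mean()) ** 2)

# Usage sketch: r2 = auxiliary_r2(X, 0); suspect multicollinearity if r2 > 0.8
```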
Multicollinearity (continued)
What effect?
–Coefficient estimates are unreliable
–The model can still be used for predicting values of Y
–If possible, delete the "not-so-independent" independent variable
When to check?
–When one suspects that two variables measure the same thing, or that the two variables are highly correlated
–When one suspects that one independent variable is a (linear) function of the other independent variables