Chapter Topics
- Types of Regression Models
- Determining the Simple Linear Regression Equation
- Measures of Variation in Regression and Correlation
- Assumptions of Regression and Correlation
- Residual Analysis and the Durbin-Watson Statistic
- Estimation of Predicted Values
- Correlation: Measuring the Strength of the Association
Purpose of Regression and Correlation Analysis
- Regression analysis is used primarily for prediction: a statistical model used to predict the values of a dependent (response) variable based on the values of at least one independent (explanatory) variable.
- Correlation analysis is used to measure the strength of the association between numerical variables.
Types of Regression Models
[Scatter plots illustrating four cases: positive linear relationship, negative linear relationship, relationship not linear, and no relationship]
Simple Linear Regression Model
The relationship between the variables is a linear function, the straight line that best fits the data:
$Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i$
where $Y_i$ is the dependent (response) variable, $X_i$ is the independent (explanatory) variable, $\beta_0$ is the Y intercept, $\beta_1$ is the slope, and $\varepsilon_i$ is the random error.
Sample Linear Regression Model
$\hat{Y}_i = b_0 + b_1 X_i$
where $\hat{Y}_i$ is the predicted value of Y for observation i, $X_i$ is the value of X for observation i, $b_0$ is the sample Y intercept (used as an estimate of the population $\beta_0$), and $b_1$ is the sample slope (used as an estimate of the population $\beta_1$).
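A minimal sketch of how the sample slope and intercept are computed from data. The square-footage and sales values below are made-up placeholders, not the produce-store data used later in the deck:

```python
import numpy as np

# Placeholder data: square footage (X) and annual sales in $000 (Y).
# These are illustrative values only, used to demonstrate the formulas
# b1 = S_XY / S_XX and b0 = Ybar - b1 * Xbar.
x = np.array([1500.0, 1800.0, 2400.0, 3000.0, 3500.0, 4000.0, 5000.0])
y = np.array([3600.0, 4100.0, 5200.0, 6100.0, 6900.0, 7800.0, 9200.0])

x_bar, y_bar = x.mean(), y.mean()
b1 = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)  # sample slope
b0 = y_bar - b1 * x_bar                                            # sample Y intercept
print(f"Yhat = {b0:.3f} + {b1:.3f} * X")
```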
Simple Linear Regression Equation: Example
[Table: square footage and annual sales in $000 for a sample of 7 produce stores]
You wish to examine the relationship between the square footage of produce stores and their annual sales. Sample data for 7 stores were obtained. Find the equation of the straight line that best fits the data.
Equation for the Best Straight Line
From the Excel printout, the fitted equation is $\hat{Y}_i = b_0 + b_1 X_i$, with the intercept and slope taken from the coefficients column (the slope estimate, 1.487, is interpreted below).
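A sketch of how a comparable printout could be produced in Python with statsmodels, reusing the same placeholder values (not the actual produce-store data):

```python
import numpy as np
import statsmodels.api as sm

# Placeholder data, as in the earlier sketch (not the original store values).
x = np.array([1500.0, 1800.0, 2400.0, 3000.0, 3500.0, 4000.0, 5000.0])
y = np.array([3600.0, 4100.0, 5200.0, 6100.0, 6900.0, 7800.0, 9200.0])

X = sm.add_constant(x)        # adds the intercept column, like Excel's default setup
results = sm.OLS(y, X).fit()  # ordinary least squares fit
print(results.summary())      # table analogous to the Excel regression printout:
                              # coefficients, standard errors, t stats, R-squared, ...
```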
Graph of the Best Straight Line
[Scatter plot of annual sales versus square footage for the 7 stores, with the fitted line $\hat{Y}_i = b_0 + b_1 X_i$ drawn through the points]
Interpreting the Results
$\hat{Y}_i = b_0 + b_1 X_i$ with slope $b_1 = 1.487$. A slope of 1.487 means that for each increase of one unit in X, Y is estimated to increase by 1.487 units: for each additional square foot of store size, the model predicts that expected annual sales increase by $1,487.
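A short worked version of that interpretation; the 1,000-square-foot increase is chosen only for illustration, and sales are measured in thousands of dollars:

```latex
% Sales are in $000, so a slope of 1.487 means 1.487 thousand dollars ($1,487)
% of additional expected annual sales per extra square foot.
\Delta \hat{Y} = b_1 \,\Delta X = 1.487 \times 1000 = 1487 \ (\$000) = \$1{,}487{,}000
```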
Measures of Variation: The Sum of Squares
- SST (total sum of squares): measures the variation of the $Y_i$ values around their mean $\bar{Y}$.
- SSR (regression sum of squares): the explained variation, attributable to the relationship between X and Y.
- SSE (error sum of squares): the variation attributable to factors other than the relationship between X and Y.
Measures of Variation: The Sum of Squares
$SST = \sum (Y_i - \bar{Y})^2 = SSR + SSE$, where $SSR = \sum (\hat{Y}_i - \bar{Y})^2$ (explained) and $SSE = \sum (Y_i - \hat{Y}_i)^2$ (unexplained), with $\hat{Y}_i = b_0 + b_1 X_i$.
[Figure: scatter plot showing, at a point $X_i$, the deviation $Y_i - \bar{Y}$ split into the explained part $\hat{Y}_i - \bar{Y}$ and the residual $Y_i - \hat{Y}_i$]
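A brief sketch of the decomposition on the placeholder data used earlier (not the produce-store values), confirming that the three quantities satisfy SST = SSR + SSE:

```python
import numpy as np

# Placeholder data and fit, continued from the earlier sketch.
x = np.array([1500.0, 1800.0, 2400.0, 3000.0, 3500.0, 4000.0, 5000.0])
y = np.array([3600.0, 4100.0, 5200.0, 6100.0, 6900.0, 7800.0, 9200.0])
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
y_hat = b0 + b1 * x

sst = np.sum((y - y.mean()) ** 2)      # total variation of Y around its mean
ssr = np.sum((y_hat - y.mean()) ** 2)  # variation explained by the regression
sse = np.sum((y - y_hat) ** 2)         # unexplained (error) variation
print(sst, ssr + sse)                  # equal up to rounding: SST = SSR + SSE
```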
The Sum of Squares: Example
[Excel ANOVA output for the produce stores, highlighting SSR, SSE, and SST]
The Coefficient of Determination
$r^2 = \dfrac{SSR}{SST} = \dfrac{\text{regression sum of squares}}{\text{total sum of squares}}$
Measures the proportion of the variation in Y that is explained by the independent variable X in the regression model.
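Equivalently, since SST = SSR + SSE, the coefficient of determination can also be written in terms of the unexplained variation:

```latex
r^2 \;=\; \frac{SSR}{SST} \;=\; 1 - \frac{SSE}{SST}, \qquad 0 \le r^2 \le 1
```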
Coefficients of Determination (r²) and Correlation (r)
[Four scatter plots with fitted lines $\hat{Y}_i = b_0 + b_1 X_i$, illustrating: r² = 1 with r = +1; r² = 1 with r = -1; r² = .8 with r = +0.9; and r² = 0 with r = 0]
Standard Error of Estimate
$S_{YX} = \sqrt{\dfrac{SSE}{n-2}} = \sqrt{\dfrac{\sum (Y_i - \hat{Y}_i)^2}{n-2}}$, the standard deviation of the observations around the regression line.
Measures of Variation: Example
[Excel output for the produce stores, showing $S_{YX}$ and $r^2 = .94$]
$r^2 = .94$: 94% of the variation in annual sales can be explained by the variability in the size of the store, as measured by square footage.
Linear Regression Assumptions
For linear models:
1. Normality: the Y values are normally distributed for each X (the probability distribution of the error is normal).
2. Homoscedasticity (constant variance).
3. Independence of errors.
Variation of Errors Around the Regression Line
The Y values are normally distributed around the regression line, and for each X value the "spread" (variance) around the regression line is the same.
[Figure: normal error distributions f(e) with equal spread drawn at X1 and X2 around the regression line]
Residual Analysis
Purposes:
- Examine linearity
- Evaluate violations of the regression assumptions
Graphical analysis of residuals (a sketch follows this list):
- Plot the residuals versus the $X_i$ values, where a residual is the difference between the actual $Y_i$ and the predicted $\hat{Y}_i$
- Studentized residuals allow the magnitude of each residual to be taken into account
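A minimal sketch of such a residual plot, again using the placeholder data rather than the produce-store values; statsmodels supplies the studentized residuals directly:

```python
import numpy as np
import matplotlib.pyplot as plt
import statsmodels.api as sm

# Placeholder data (illustrative values only).
x = np.array([1500.0, 1800.0, 2400.0, 3000.0, 3500.0, 4000.0, 5000.0])
y = np.array([3600.0, 4100.0, 5200.0, 6100.0, 6900.0, 7800.0, 9200.0])

results = sm.OLS(y, sm.add_constant(x)).fit()
residuals = results.resid                            # actual Y minus predicted Y
influence = results.get_influence()
studentized = influence.resid_studentized_internal   # residuals scaled by their standard error

plt.scatter(x, studentized)
plt.axhline(0, linestyle="--")
plt.xlabel("Square feet (X)")
plt.ylabel("Studentized residual")
plt.show()
```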
Residual Analysis for Linearity
[Two residual-versus-X plots: a curved, non-random pattern in the residuals indicates the relationship is not linear; a random scatter around zero indicates a linear fit is adequate]
Residual Analysis for Homoscedasticity
[Two plots of standardized residuals versus X: a fan-shaped, changing spread indicates heteroscedasticity; a constant spread indicates homoscedasticity]
The Durbin-Watson Statistic
Used when data are collected over time to detect autocorrelation (residuals in one time period are related to residuals in another period), which is a violation of the independence assumption. The statistic should be close to 2; if it is not, examine the model for autocorrelation.
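A sketch of the statistic itself, assuming the residuals are ordered in time; the manual formula is the standard one, and statsmodels provides the same computation:

```python
import numpy as np
from statsmodels.stats.stattools import durbin_watson

def durbin_watson_manual(residuals: np.ndarray) -> float:
    """D = sum of squared successive differences / sum of squared residuals."""
    diffs = np.diff(residuals)
    return np.sum(diffs ** 2) / np.sum(residuals ** 2)

# Illustrative residuals, ordered in time (placeholder values).
e = np.array([1.2, -0.8, 0.5, -0.3, 0.9, -1.1, 0.4])
print(durbin_watson_manual(e))   # a value close to 2 suggests no autocorrelation
print(durbin_watson(e))          # statsmodels' built-in version of the same statistic
```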
Residual Analysis for Independence
[Two plots of standardized residuals versus X: a cyclical pattern indicates the residuals are not independent; a random pattern indicates independence]
Inferences about the Slope: t Test
t test for a population slope: is there a linear relationship between X and Y?
Null and alternative hypotheses: $H_0\!: \beta_1 = 0$ (no linear relationship) versus $H_1\!: \beta_1 \neq 0$ (linear relationship).
Test statistic: $t = \dfrac{b_1 - \beta_1}{S_{b_1}}$, where $S_{b_1} = \dfrac{S_{YX}}{\sqrt{\sum (X_i - \bar{X})^2}}$ and $df = n - 2$.
Example: Produce Stores
Data for 7 stores: [table of square footage and annual sales in $000].
Regression model obtained: $\hat{Y}_i = b_0 + b_1 X_i$ with slope $b_1 = 1.487$.
Is there a linear relationship between the square footage of a store and its annual sales?
Inferences about the Slope: t Test
$H_0\!: \beta_1 = 0$ versus $H_1\!: \beta_1 \neq 0$, with $\alpha = .05$ and $df = 7 - 2 = 5$.
Critical values: $t = \pm 2.5706$ (rejection regions of .025 in each tail).
Test statistic: taken from the Excel printout; it falls in the rejection region.
Decision: reject $H_0$. Conclusion: there is evidence of a linear relationship between square footage and annual sales.
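A sketch of the decision rule using the figures that appear in this deck; the standard error is backed out of the reported 95% confidence interval, so it is approximate rather than read from the printout:

```python
from scipy import stats

n = 7
df = n - 2                                  # 7 - 2 = 5
t_crit = stats.t.ppf(1 - 0.05 / 2, df)      # two-tailed critical value, about 2.5706

# With a fitted statsmodels model these would be results.params[1] and results.bse[1].
b1 = 1.487                                  # slope reported in the text
se_b1 = (1.911 - 1.062) / 2 / 2.5706        # SE implied by the reported 95% CI (about 0.165)
t_stat = b1 / se_b1                         # test statistic under H0: beta1 = 0
p_value = 2 * stats.t.sf(abs(t_stat), df)
print(t_crit, t_stat, p_value, abs(t_stat) > t_crit)
```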
Inferences about the Slope: Confidence Interval Example
Confidence interval estimate of the slope: $b_1 \pm t_{n-2}\, S_{b_1}$.
From the Excel printout for the produce stores, the 95% confidence interval for the slope is (1.062, 1.911). The interval does not include 0, so conclude that there is a significant linear relationship between annual sales and the size of the store.
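The same interval can be reproduced numerically; the standard error is again implied by the reported bounds rather than taken from the printout:

```python
from scipy import stats

# Manual 95% confidence interval for the slope: b1 +/- t_{n-2} * SE(b1).
n, alpha = 7, 0.05
t_crit = stats.t.ppf(1 - alpha / 2, n - 2)        # 2.5706 for df = 5
b1 = 1.487                                        # slope reported in the text
se_b1 = (1.911 - 1.062) / 2 / t_crit              # SE implied by the reported interval
print(b1 - t_crit * se_b1, b1 + t_crit * se_b1)   # reproduces roughly (1.062, 1.911)

# With a fitted statsmodels OLS result, the same interval is results.conf_int(alpha=0.05).
```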
Estimation of Predicted Values
Confidence interval estimate for $\mu_{Y|X}$, the mean of Y at a particular $X_i$:
$\hat{Y}_i \pm t_{n-2}\, S_{YX} \sqrt{\tfrac{1}{n} + \tfrac{(X_i - \bar{X})^2}{\sum (X_i - \bar{X})^2}}$
The width of the interval varies with the distance of $X_i$ from the mean $\bar{X}$; $S_{YX}$ is the standard error of the estimate, and the t value comes from the table with $df = n - 2$.
Estimation of Predicted Values
Confidence interval estimate for an individual response $Y_i$ at a particular $X_i$:
$\hat{Y}_i \pm t_{n-2}\, S_{YX} \sqrt{1 + \tfrac{1}{n} + \tfrac{(X_i - \bar{X})^2}{\sum (X_i - \bar{X})^2}}$
The addition of the 1 under the square root makes this interval wider than the corresponding interval for the mean of Y.
Interval Estimates for Different Values of X
[Figure: the fitted line $\hat{Y}_i = b_0 + b_1 X_i$ with the confidence interval for the mean of Y and the wider interval for an individual $Y_i$ at a given X, both widening as X moves away from $\bar{X}$]
Example: Produce Stores
Data for 7 stores: [table of square footage and annual sales in $000].
Regression model obtained: $\hat{Y}_i = b_0 + b_1 X_i$ with slope $b_1 = 1.487$.
Predict the annual sales for a store with 2,000 square feet.
Estimation of Predicted Values: Example
Confidence interval estimate for the mean of Y: find the 95% confidence interval for the average annual sales of all stores with 2,000 square feet.
Predicted sales $\hat{Y}_i = b_0 + b_1 X_i$ evaluated at $X_i = 2{,}000$ (in thousands of dollars), with $t_{n-2} = t_5 = 2.5706$, and $\bar{X}$ and $S_{YX}$ taken from the data and the regression output.
Confidence interval for the mean of Y: $\hat{Y}_i \pm t_5\, S_{YX} \sqrt{\tfrac{1}{n} + \tfrac{(2000 - \bar{X})^2}{\sum (X_i - \bar{X})^2}}$.
Estimation of Predicted Values: Example
Confidence interval estimate for an individual Y: find the 95% confidence interval for the annual sales of one particular store with 2,000 square feet.
Predicted sales: the same $\hat{Y}_i$ at $X_i = 2{,}000$, with $t_5 = 2.5706$ and $S_{YX}$ as before.
Confidence interval for an individual Y: $\hat{Y}_i \pm t_5\, S_{YX} \sqrt{1 + \tfrac{1}{n} + \tfrac{(2000 - \bar{X})^2}{\sum (X_i - \bar{X})^2}}$, which is wider than the interval for the mean.
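A sketch of how both intervals at X = 2,000 square feet could be obtained in Python, again on placeholder data rather than the actual store values:

```python
import numpy as np
import statsmodels.api as sm

# Placeholder data (illustrative values only).
x = np.array([1500.0, 1800.0, 2400.0, 3000.0, 3500.0, 4000.0, 5000.0])
y = np.array([3600.0, 4100.0, 5200.0, 6100.0, 6900.0, 7800.0, 9200.0])
results = sm.OLS(y, sm.add_constant(x)).fit()

new_x = sm.add_constant(np.array([2000.0]), has_constant="add")  # store of 2,000 sq ft
pred = results.get_prediction(new_x).summary_frame(alpha=0.05)
print(pred[["mean", "mean_ci_lower", "mean_ci_upper"]])   # 95% CI for the mean of Y
print(pred[["obs_ci_lower", "obs_ci_upper"]])             # 95% interval for an individual Y
```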
Correlation: Measuring the Strength of Association
Answers the question: how strong is the linear relationship between two variables?
- The coefficient of correlation is used; the population correlation coefficient is denoted $\rho$ (rho).
- Values range from -1 to +1.
- It measures the degree of association between the variables.
- In simple linear regression, its absolute value is the square root of the coefficient of determination, with the sign given by the slope.
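A short sketch on the placeholder data showing that the sample correlation coefficient equals the signed square root of r²:

```python
import numpy as np

# Placeholder data (illustrative values only).
x = np.array([1500.0, 1800.0, 2400.0, 3000.0, 3500.0, 4000.0, 5000.0])
y = np.array([3600.0, 4100.0, 5200.0, 6100.0, 6900.0, 7800.0, 9200.0])

r = np.corrcoef(x, y)[0, 1]                  # sample correlation coefficient
r_squared = r ** 2                           # coefficient of determination (simple regression)
print(r, np.sign(r) * np.sqrt(r_squared))    # identical, by definition
```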
Test of Coefficient of Correlation
Tests whether there is a linear relationship between two numerical variables; it reaches the same conclusion as testing the population slope $\beta_1$.
Hypotheses: $H_0\!: \rho = 0$ (no correlation) versus $H_1\!: \rho \neq 0$ (correlation exists).
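The test statistic, a standard result not spelled out on the slide, uses the sample correlation r with n - 2 degrees of freedom and is numerically identical to the t statistic for the slope in simple linear regression:

```latex
t \;=\; \frac{r\sqrt{n-2}}{\sqrt{1-r^{2}}}, \qquad df = n - 2
```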