1
Linear Regression and Correlation Analysis
2
Methods for Studying Relationships
Chapter Goals: To understand the methods for displaying and describing the relationship between two variables. Methods for Studying Relationships. Models: linear regression, correlations, frequency tables. Graphical: scatter plots, line plots, 3-D plots.
3
Two Quantitative Variables
The response variable, also called the dependent variable, is the variable we want to predict, and is usually denoted by y. The explanatory variable, also called the independent variable, is the variable that attempts to explain the response, and is denoted by x.
4
Scatter Plots and Correlation
A scatter plot (or scatter diagram) is used to show the relationship between two variables. Correlation analysis is used to measure the strength of the association (linear relationship) between two variables; it is concerned only with the strength of the relationship, and no causal effect is implied.
5
Example: The following graph shows the scatter plot of Exam 1 score (x) and Exam 2 score (y) for 354 students in a class. Is there a relationship between x and y?
6
Scatter Plot Examples: linear relationships and curvilinear relationships (scatter plots of y vs. x).
7
Scatter Plot Examples (continued): no relationship (scatter plots of y vs. x).
8
Correlation Coefficient
(continued) The population correlation coefficient ρ (rho) measures the strength of the association between the variables The sample correlation coefficient r is an estimate of ρ and is used to measure the strength of the linear relationship in the sample observations
9
Features of ρ and r: unit free; range between −1 and 1.
The closer to −1, the stronger the negative linear relationship; the closer to +1, the stronger the positive linear relationship; the closer to 0, the weaker the linear relationship.
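As a quick illustration (not from the slides), the sketch below computes r for a small made-up set of exam-score pairs, both from the defining formula and with NumPy's built-in routine; the data values are hypothetical.

```python
import numpy as np

x = np.array([55, 62, 70, 78, 85, 91])   # hypothetical Exam 1 scores
y = np.array([58, 60, 72, 74, 88, 90])   # hypothetical Exam 2 scores

# r = sum((x - x_bar)(y - y_bar)) / sqrt(sum((x - x_bar)^2) * sum((y - y_bar)^2))
r = np.sum((x - x.mean()) * (y - y.mean())) / np.sqrt(
    np.sum((x - x.mean()) ** 2) * np.sum((y - y.mean()) ** 2)
)
print(round(r, 3))                # close to +1: strong positive linear relationship
print(np.corrcoef(x, y)[0, 1])    # NumPy's built-in correlation gives the same value
```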
10
Examples of Approximate r Values
Tag each scatter plot with the appropriate value: −1, −0.6, 0, +0.3, +1. (Scatter plots of y vs. x shown on the slide.)
11
Earlier Example
12
Questions? What kind of relationship would you expect in the following situations: age (in years) of a car and its price; number of calories consumed per day and weight; height and IQ of a person.
13
Exercise: Identify the two variables that vary and decide which should be the independent variable and which the dependent variable. Sketch a graph that you think best represents the relationship between the two variables. 1. The size of a person's vocabulary over his or her lifetime. 2. The distance from the ceiling to the tip of the minute hand of a clock hung on the wall.
14
Introduction to Regression Analysis
Regression analysis is used to: Predict the value of a dependent variable based on the value of at least one independent variable. Explain the impact of changes in an independent variable on the dependent variable. Dependent variable: the variable we wish to explain. Independent variable: the variable used to explain the dependent variable.
15
Simple Linear Regression Model
Only one independent variable, x. Relationship between x and y is described by a linear function. Changes in y are assumed to be caused by changes in x.
16
Types of Regression Models
Positive Linear Relationship Negative Linear Relationship Relationship NOT Linear No Relationship
17
Population Linear Regression
The population regression model: y = β0 + β1x + ε, where y is the dependent variable, x is the independent variable, β0 is the population y-intercept, β1 is the population slope coefficient, and ε is the random error term (residual). β0 + β1x is the linear component and ε is the random error component.
18
Linear Regression Assumptions
The error terms εi, i = 1, 2, …, n, are independent and εi ~ Normal(0, σ²). The error terms εi have constant variance σ². The underlying relationship between the x variable and the y variable is linear.
19
Population Linear Regression
(continued) Diagram: for a given xi, the observed value of yi differs from the predicted value on the population line (y-intercept β0, slope β1) by the random error εi for that xi value.
20
Estimated Regression Model
The estimated regression function: ŷ = b0 + b1x. The sample regression line provides an estimate of the population regression function: ŷ is the estimated (or predicted) y value, b0 is the estimate of the regression y-intercept, b1 is the estimate of the regression slope, and x is the independent variable. The individual random error terms ei have a mean of zero.
21
Earlier Example
22
Least Squares Criterion
Residual: a residual is the difference between the observed response yi and the predicted response ŷi. Thus, for each pair of observations (xi, yi), the ith residual is ei = yi − ŷi = yi − (b0 + b1xi). Least Squares Criterion: b0 and b1 are chosen as the values that minimize the sum of the squared residuals, Σei² = Σ(yi − b0 − b1xi)².
23
Interpretation of the Slope and the Intercept
b0 is the estimated average value of y when the value of x is zero. b1 is the estimated change in the average value of y as a result of a one-unit change in x.
24
The Least Squares Equation
The formulas for b1 and b0 are: b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)², with the algebraic equivalent b1 = (Σxiyi − (Σxi)(Σyi)/n) / (Σxi² − (Σxi)²/n), and b0 = ȳ − b1x̄.
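A minimal sketch of these formulas, using a small made-up data set (the x and y values below are hypothetical), with NumPy's polynomial fit as a cross-check:

```python
import numpy as np

# made-up paired data (x, y)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# b1 = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2),  b0 = y_bar - b1 * x_bar
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
print(b0, b1)

# np.polyfit(x, y, 1) returns [slope, intercept] and should agree with the formulas
print(np.polyfit(x, y, 1))
```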
25
Finding the Least Squares Equation
The coefficients b0 and b1 will usually be found using computer software, such as Excel, Minitab, or SPSS. Other regression measures will also be computed as part of computer-based regression analysis.
26
Simple Linear Regression Example
A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet). A random sample of 10 houses is selected. Dependent variable (y) = house price in $1000s; independent variable (x) = square feet.
27
Sample Data for House Price Model
House Price in $1000s (y)    Square Feet (x)
245    1400
312    1600
279    1700
308    1875
199    1100
219    1550
405    2350
324    2450
319    1425
255    1700
28
SPSS Output. The regression equation is: house price = 98.248 + 0.10977 (square feet).
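For readers who want to reproduce this output, the following sketch fits the sample data from the earlier table with NumPy; note that the 10th square-footage value was not legible in the source, so 1700, the value consistent with the reported coefficients, is assumed.

```python
import numpy as np

# house-price sample: price in $1000s (y) and square feet (x); the 10th square-
# footage value is assumed to be 1700, consistent with the reported coefficients
sqft  = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])

b1, b0 = np.polyfit(sqft, price, 1)          # slope, intercept
print(f"house price = {b0:.3f} + {b1:.5f} (square feet)")   # about 98.248 + 0.10977 x
```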
29
Graphical Presentation
House price model: scatter plot and regression line. Slope = 0.110, intercept = 98.248.
30
Interpretation of the Intercept, b0
b0 is the estimated average value of y when the value of x is zero (if x = 0 is in the range of observed x values). Here, no houses had 0 square feet, so b0 = 98.248 just indicates that, for houses within the range of sizes observed, $98,248 is the portion of the house price not explained by square feet.
31
Interpretation of the Slope Coefficient, b1
b1 measures the estimated change in the average value of y as a result of a one-unit change in x. Here, b1 = 0.10977 tells us that the average value of a house increases by 0.10977($1000) = $109.77, on average, for each additional square foot of size.
32
Least Squares Regression Properties
The sum of the residuals from the least squares regression line is 0 (Σei = 0). The sum of the squared residuals is a minimum (Σei² is minimized). The simple regression line always passes through the point (x̄, ȳ), the means of the x and y variables. The least squares coefficients b0 and b1 are unbiased estimates of β0 and β1.
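A small sketch (made-up data, not the house example) that checks the first and third properties numerically and illustrates the second by perturbing the intercept:

```python
import numpy as np

# made-up data; any data set would do
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
b1, b0 = np.polyfit(x, y, 1)
residuals = y - (b0 + b1 * x)

print(np.isclose(residuals.sum(), 0.0))           # property 1: residuals sum to ~0
print(np.isclose(b0 + b1 * x.mean(), y.mean()))   # property 3: line passes through (x_bar, y_bar)

# property 2: any other intercept/slope gives a larger sum of squared residuals
sse = np.sum(residuals ** 2)
sse_other = np.sum((y - ((b0 + 0.1) + b1 * x)) ** 2)
print(sse < sse_other)                            # True
```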
33
Exercise: The growth of children from early childhood through adolescence generally follows a linear pattern. Data on the heights of female Americans during childhood, from four to nine years old, were compiled, and the least squares regression line was obtained as ŷ = 32 + 2.4x, where ŷ is the predicted height in inches and x is age in years. 1. Interpret the value of the estimated slope, b1 = 2.4. 2. Would interpretation of the value of the estimated y-intercept, b0 = 32, make sense here? 3. What would you predict the height to be for a female American at 8 years old? 4. What would you predict the height to be for a female American at 25 years old? How does the quality of this answer compare to the previous question?
34
Coefficient of Determination, R²
The coefficient of determination is the portion of the total variation in the dependent variable that is explained by variation in the independent variable. The coefficient of determination is also called R-squared and is denoted R².
35
Coefficient of Determination, R²
(continued) Note: in the single independent variable case, the coefficient of determination is R² = r², where R² = coefficient of determination and r = simple correlation coefficient.
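A brief sketch (made-up data) confirming that R² computed from the sums of squares equals r²:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # made-up data
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])
b1, b0 = np.polyfit(x, y, 1)
y_hat = b0 + b1 * x

ss_res = np.sum((y - y_hat) ** 2)          # unexplained variation
ss_tot = np.sum((y - y.mean()) ** 2)       # total variation
r_squared = 1 - ss_res / ss_tot

r = np.corrcoef(x, y)[0, 1]                # sample correlation coefficient
print(r_squared, r ** 2)                   # the two values agree
```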
36
Examples of Approximate R² Values
(Scatter plots of y vs. x illustrating different approximate R² values.)
37
Examples of Approximate R² Values
No linear relationship between x and y: the value of y does not depend on x (none of the variation in y is explained by variation in x). R² = 0.
38
SPSS Output
39
Standard Error of Estimate
The standard deviation of the variation of observations around the regression line is called the standard error of estimate. The standard error of the regression slope coefficient (b1) is denoted sb1.
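A sketch of the usual formulas for these two quantities, sε = sqrt(SSE / (n − 2)) and sb1 = sε / sqrt(Σ(xi − x̄)²), applied to the house-price data (10th square footage assumed to be 1700, as before):

```python
import numpy as np

sqft  = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])
n = len(sqft)
b1, b0 = np.polyfit(sqft, price, 1)

sse  = np.sum((price - (b0 + b1 * sqft)) ** 2)           # sum of squared residuals
s_e  = np.sqrt(sse / (n - 2))                            # standard error of estimate
s_b1 = s_e / np.sqrt(np.sum((sqft - sqft.mean()) ** 2))  # standard error of the slope
print(s_e, s_b1)                                         # s_b1 is about 0.033 here
```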
40
SPSS Output
41
Comparing Standard Errors
Variation of observed y values around the regression line; variation in the slope of regression lines from different possible samples. (Scatter plots of y vs. x.)
42
Inference about the Slope: t-Test
t-test for a population slope: is there a linear relationship between x and y? Null and alternative hypotheses: H0: β1 = 0 (no linear relationship); H1: β1 ≠ 0 (linear relationship does exist). Test statistic: t = (b1 − β1) / sb1, with n − 2 degrees of freedom, where b1 = sample regression slope coefficient, β1 = hypothesized slope, and sb1 = estimator of the standard error of the slope.
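Applied to the house-price data (same assumption about the 10th square footage), a hedged sketch of this test using NumPy and SciPy:

```python
import numpy as np
from scipy import stats

sqft  = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])
n = len(sqft)
b1, b0 = np.polyfit(sqft, price, 1)

s_e  = np.sqrt(np.sum((price - (b0 + b1 * sqft)) ** 2) / (n - 2))
s_b1 = s_e / np.sqrt(np.sum((sqft - sqft.mean()) ** 2))

t_stat  = (b1 - 0) / s_b1                          # hypothesized slope is 0 under H0
p_value = 2 * stats.t.sf(abs(t_stat), df=n - 2)    # two-tailed p-value, d.f. = n - 2
print(t_stat, p_value)                             # t is roughly 3.33
```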
43
Example: Inference about the Slope: t Test
(continued) Estimated regression equation: house price = 98.248 + 0.10977 (square feet), based on the sample of 10 houses shown earlier. The slope of this model is 0.10977. Does square footage of the house affect its sales price?
44
Inferences about the Slope: t-Test Example (continued)
H0: β1 = 0; HA: β1 ≠ 0. From the output, b1 = 0.10977 with standard error sb1 = 0.03297, so the test statistic is t = 0.10977 / 0.03297 = 3.329, with d.f. = 10 − 2 = 8. At α = .05 (α/2 = .025 in each tail), the critical values are ±tα/2 = ±2.3060. Since t = 3.329 > 2.3060, the decision is to reject H0. Conclusion: there is sufficient evidence that square footage affects house price.
45
Confidence Interval for the Slope
Confidence interval estimate of the slope: b1 ± tα/2 sb1, with d.f. = n − 2. From the Excel printout for house prices, at the 95% level of confidence the confidence interval for the slope is (0.0337, 0.1858).
46
Confidence Interval for the Slope
Since the units of the house price variable are $1000s, we are 95% confident that the average impact on sales price is between $33.70 and $185.80 per additional square foot of house size. This 95% confidence interval does not include 0. Conclusion: there is a significant relationship between house price and square feet at the .05 level of significance.
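A sketch reproducing this interval as b1 ± tα/2 · sb1 with SciPy's t quantile (same data assumption about the 10th square footage as before):

```python
import numpy as np
from scipy import stats

sqft  = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])
n = len(sqft)
b1, b0 = np.polyfit(sqft, price, 1)

s_e  = np.sqrt(np.sum((price - (b0 + b1 * sqft)) ** 2) / (n - 2))
s_b1 = s_e / np.sqrt(np.sum((sqft - sqft.mean()) ** 2))

t_crit = stats.t.ppf(0.975, df=n - 2)             # 2.3060 for 8 degrees of freedom
print(b1 - t_crit * s_b1, b1 + t_crit * s_b1)     # roughly (0.034, 0.186)
```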
47
Residual Analysis
Purposes: examine the linearity assumption; examine for constant variance across all levels of x; evaluate the normal distribution assumption. Graphical analysis of residuals: plot residuals vs. x; create a histogram of residuals to check for normality.
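A hedged sketch of these two graphical checks for the house-price data, assuming matplotlib is available (same assumption about the 10th square footage):

```python
import numpy as np
import matplotlib.pyplot as plt

sqft  = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])
b1, b0 = np.polyfit(sqft, price, 1)
residuals = price - (b0 + b1 * sqft)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3.5))
ax1.scatter(sqft, residuals)                  # residuals vs. x: look for curvature
ax1.axhline(0, linestyle="--")                # and for changing spread around zero
ax1.set_xlabel("Square feet")
ax1.set_ylabel("Residual")
ax2.hist(residuals, bins=5)                   # rough check of the normality assumption
ax2.set_xlabel("Residual")
plt.tight_layout()
plt.show()
```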
48
Residual Analysis for Linearity
(Residuals plotted against x for two cases: not linear and linear.)
49
Residual Analysis for Constant Variance
(Residuals plotted against x for two cases: non-constant variance and constant variance.)
50
Example: Residual Output
Table of predicted house prices and residuals for observations 1 through 10 (values computed from the fitted line; see the sketch below).
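Since the numeric values did not survive extraction, the sketch below recomputes the predicted prices and residuals from the fitted line (same data assumption about the 10th square footage as before):

```python
import numpy as np

sqft  = np.array([1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700])
price = np.array([245, 312, 279, 308, 199, 219, 405, 324, 319, 255])
b1, b0 = np.polyfit(sqft, price, 1)

for i, (x_i, y_i) in enumerate(zip(sqft, price), start=1):
    y_hat = b0 + b1 * x_i   # predicted price for this house
    print(f"{i:2d}  predicted = {y_hat:7.2f}  residual = {y_i - y_hat:7.2f}")
```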