Correlation and Regression

Chapter 9 Correlation and Regression Larson/Farber 4th ed.

Chapter Outline 9.1 Correlation 9.2 Linear Regression 9.3 Measures of Regression and Prediction Intervals 9.4 Multiple Regression Larson/Farber 4th ed.

Section 9.1 Correlation Larson/Farber 4th ed.

Section 9.1 Objectives Introduce linear correlation, independent and dependent variables, and the types of correlation Find a correlation coefficient Test a population correlation coefficient ρ using a table Perform a hypothesis test for a population correlation coefficient ρ Distinguish between correlation and causation Larson/Farber 4th ed.

Correlation A correlation is a relationship between two variables. The data can be represented by ordered pairs (x, y), where x is the independent (or explanatory) variable and y is the dependent (or response) variable. Larson/Farber 4th ed.

Correlation A scatter plot can be used to determine whether a linear (straight line) correlation exists between two variables. [Example: scatter plot of ordered pairs (x, y).] Larson/Farber 4th ed.

Types of Correlation Negative linear correlation: as x increases, y tends to decrease. Positive linear correlation: as x increases, y tends to increase. No correlation. Nonlinear correlation. Larson/Farber 4th ed.

Example: Constructing a Scatter Plot A marketing manager conducted a study to determine whether there is a linear relationship between money spent on advertising and company sales. The data are shown in the table. Display the data in a scatter plot and determine whether there appears to be a positive or negative linear correlation or no linear correlation.
Advertising expenses ($1000), x: 2.4, 1.6, 2.0, 2.6, 1.4, 1.6, 2.0, 2.2
Company sales ($1000), y: 225, 184, 220, 240, 180, 184, 186, 215
Larson/Farber 4th ed.

Solution: Constructing a Scatter Plot [Scatter plot: advertising expenses (in thousands of dollars) on the x-axis, company sales on the y-axis.] There appears to be a positive linear correlation. As the advertising expenses increase, the sales tend to increase. Larson/Farber 4th ed.

Example: Constructing a Scatter Plot Using Technology Old Faithful, located in Yellowstone National Park, is the world’s most famous geyser. The duration (in minutes) of several of Old Faithful’s eruptions and the times (in minutes) until the next eruption are shown in the table. Using a TI-83/84, display the data in a scatter plot. Determine the type of correlation.
Duration (min), x: 1.8, 1.82, 1.9, 1.93, 1.98, 2.05, 2.13, 2.3, 2.37, 2.82, 3.13, 3.27, 3.65, 3.78, 3.83, 3.88, 4.1, 4.27, 4.3, 4.43, 4.47, 4.53, 4.55, 4.6, 4.63
Time until next eruption (min), y: 56, 58, 62, —, 57, —, 60, —, 61, 73, 76, 77, —, 79, 85, 80, 89, 90, —, —, 86, —, —, 92, 91 (— indicates a value missing from the transcript)
Larson/Farber 4th ed.

Solution: Constructing a Scatter Plot Using Technology Enter the x-values into list L1 and the y-values into list L2 (STAT > Edit…). Use Stat Plot (STATPLOT) to construct the scatter plot. [TI-83/84 screen: scatter plot with a window of roughly 1 to 5 on the x-axis and 50 to 100 on the y-axis.] From the scatter plot, it appears that the variables have a positive linear correlation. Larson/Farber 4th ed.

Correlation Coefficient A measure of the strength and the direction of a linear relationship between two variables. The symbol r represents the sample correlation coefficient. A formula for r is
r = [nΣxy − (Σx)(Σy)] / [√(nΣx² − (Σx)²) · √(nΣy² − (Σy)²)]
where n is the number of data pairs. The population correlation coefficient is represented by ρ (rho). Larson/Farber 4th ed.

Correlation Coefficient The range of the correlation coefficient is −1 to 1. If r = −1, there is a perfect negative correlation. If r is close to 0, there is no linear correlation. If r = 1, there is a perfect positive correlation. Larson/Farber 4th ed.

Linear Correlation r = −0.91: strong negative correlation. r = 0.88: strong positive correlation. r = 0.42: weak positive correlation. r = 0.07: nonlinear correlation (no linear correlation). Larson/Farber 4th ed.

Calculating a Correlation Coefficient In Words / In Symbols 1. Find the sum of the x-values: Σx. 2. Find the sum of the y-values: Σy. 3. Multiply each x-value by its corresponding y-value and find the sum: Σxy. Larson/Farber 4th ed.

Calculating a Correlation Coefficient In Words / In Symbols 4. Square each x-value and find the sum: Σx². 5. Square each y-value and find the sum: Σy². 6. Use these five sums to calculate the correlation coefficient: r = [nΣxy − (Σx)(Σy)] / [√(nΣx² − (Σx)²) · √(nΣy² − (Σy)²)]. Larson/Farber 4th ed.

Example: Finding the Correlation Coefficient Calculate the correlation coefficient for the advertising expenditures and company sales data. What can you conclude?
Advertising expenses ($1000), x: 2.4, 1.6, 2.0, 2.6, 1.4, 1.6, 2.0, 2.2
Company sales ($1000), y: 225, 184, 220, 240, 180, 184, 186, 215
Larson/Farber 4th ed.

Solution: Finding the Correlation Coefficient
x      y     xy      x²     y²
2.4    225   540     5.76   50,625
1.6    184   294.4   2.56   33,856
2.0    220   440     4.00   48,400
2.6    240   624     6.76   57,600
1.4    180   252     1.96   32,400
1.6    184   294.4   2.56   33,856
2.0    186   372     4.00   34,596
2.2    215   473     4.84   46,225
Σx = 15.8   Σy = 1634   Σxy = 3289.8   Σx² = 32.44   Σy² = 337,558
Larson/Farber 4th ed.

Solution: Finding the Correlation Coefficient Σx = 15.8, Σy = 1634, Σxy = 3289.8, Σx² = 32.44, Σy² = 337,558
r = [8(3289.8) − (15.8)(1634)] / [√(8(32.44) − (15.8)²) · √(8(337,558) − (1634)²)] = 501.2 / [√9.88 · √30,508] ≈ 0.9129
r ≈ 0.913 suggests a strong positive linear correlation. As the amount spent on advertising increases, the company sales also increase. Larson/Farber 4th ed.
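The computation above can be checked with a short script. The following Python sketch (not part of the original slides; the variable names are illustrative) applies the five-sum formula to the advertising/sales data:

# Sketch: sample correlation coefficient r for the advertising/sales data,
# using the five-sum formula from this section.
import math

x = [2.4, 1.6, 2.0, 2.6, 1.4, 1.6, 2.0, 2.2]   # advertising expenses ($1000)
y = [225, 184, 220, 240, 180, 184, 186, 215]   # company sales ($1000)

n = len(x)
sum_x, sum_y = sum(x), sum(y)
sum_xy = sum(xi * yi for xi, yi in zip(x, y))
sum_x2 = sum(xi ** 2 for xi in x)
sum_y2 = sum(yi ** 2 for yi in y)

r = (n * sum_xy - sum_x * sum_y) / (
    math.sqrt(n * sum_x2 - sum_x ** 2) * math.sqrt(n * sum_y2 - sum_y ** 2))
print(round(r, 4))   # about 0.9129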

Example: Using Technology to Find a Correlation Coefficient Use a technology tool to calculate the correlation coefficient for the Old Faithful data (shown in the earlier scatter plot example). What can you conclude? Larson/Farber 4th ed.

Solution: Using Technology to Find a Correlation Coefficient To calculate r on a TI-83/84, first enter the DiagnosticOn command, found in the Catalog menu, and then use the STAT > CALC menu. The result r ≈ 0.979 suggests a strong positive linear correlation. Larson/Farber 4th ed.

Using a Table to Test a Population Correlation Coefficient ρ Once the sample correlation coefficient r has been calculated, we need to determine whether there is enough evidence to decide that the population correlation coefficient ρ is significant at a specified level of significance. Use Table 11 in Appendix B. If |r| is greater than the critical value, there is enough evidence to decide that the correlation coefficient ρ is significant. Larson/Farber 4th ed.

Using a Table to Test a Population Correlation Coefficient ρ Determine whether ρ is significant for five pairs of data (n = 5) at a level of significance of α = 0.01. From Table 11, the critical value is 0.959. If |r| > 0.959, the correlation is significant. Otherwise, there is not enough evidence to conclude that the correlation is significant. Larson/Farber 4th ed.

Using a Table to Test a Population Correlation Coefficient ρ In Words / In Symbols 1. Determine the number of pairs of data in the sample: determine n. 2. Specify the level of significance: identify α. 3. Find the critical value: use Table 11 in Appendix B. Larson/Farber 4th ed.

Using a Table to Test a Population Correlation Coefficient ρ In Words / In Symbols 4. Decide whether the correlation is significant: if |r| > critical value, the correlation is significant; otherwise, there is not enough evidence to support that the correlation is significant. 5. Interpret the decision in the context of the original claim. Larson/Farber 4th ed.

Example: Using a Table to Test a Population Correlation Coefficient ρ Using the Old Faithful data (shown in the earlier scatter plot example), you used 25 pairs of data to find r ≈ 0.979. Is the correlation coefficient significant? Use α = 0.05. Larson/Farber 4th ed.

Solution: Using a Table to Test a Population Correlation Coefficient ρ With n = 25 and α = 0.05, the critical value from Table 11 is 0.396. Because |r| ≈ 0.979 > 0.396, there is enough evidence at the 5% level of significance to conclude that there is a significant linear correlation between the duration of Old Faithful’s eruptions and the time between eruptions. Larson/Farber 4th ed.
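For readers without Table 11 at hand, its critical values can be reproduced from the t-distribution. The Python sketch below is an illustration only, not part of the original slides, and assumes SciPy is installed; critical_r is a hypothetical helper name.

# Sketch: reproduce a Table 11-style critical value for n pairs at level alpha,
# then apply the |r| > critical value decision rule.
from math import sqrt
from scipy.stats import t

def critical_r(n, alpha=0.05):
    # two-tailed t critical value with n - 2 degrees of freedom,
    # converted to a critical value for r
    df = n - 2
    tc = t.ppf(1 - alpha / 2, df)
    return tc / sqrt(tc ** 2 + df)

r, n = 0.979, 25
cv = critical_r(n, alpha=0.05)     # about 0.396
print(cv, abs(r) > cv)             # True -> the correlation is significant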

Hypothesis Testing for a Population Correlation Coefficient ρ A hypothesis test can also be used to determine whether the sample correlation coefficient r provides enough evidence to conclude that the population correlation coefficient ρ is significant at a specified level of significance. A hypothesis test can be one-tailed or two-tailed. Larson/Farber 4th ed.

Hypothesis Testing for a Population Correlation Coefficient ρ Left-tailed test: H0: ρ ≥ 0 (no significant negative correlation), Ha: ρ < 0 (significant negative correlation). Right-tailed test: H0: ρ ≤ 0 (no significant positive correlation), Ha: ρ > 0 (significant positive correlation). Two-tailed test: H0: ρ = 0 (no significant correlation), Ha: ρ ≠ 0 (significant correlation). Larson/Farber 4th ed.

The t-Test for the Correlation Coefficient Can be used to test whether the correlation between two variables is significant. The test statistic is r, and the standardized test statistic t = r / √[(1 − r²)/(n − 2)] follows a t-distribution with d.f. = n − 2. In this text, only two-tailed hypothesis tests for ρ are considered. Larson/Farber 4th ed.

Using the t-Test for ρ In Words / In Symbols 1. State the null and alternative hypotheses: state H0 and Ha. 2. Specify the level of significance: identify α. 3. Identify the degrees of freedom: d.f. = n − 2. 4. Determine the critical value(s) and rejection region(s): use Table 5 in Appendix B. Larson/Farber 4th ed.

Using the t-Test for ρ In Words / In Symbols 5. Find the standardized test statistic: t = r / √[(1 − r²)/(n − 2)]. 6. Make a decision to reject or fail to reject the null hypothesis: if t is in the rejection region, reject H0; otherwise fail to reject H0. 7. Interpret the decision in the context of the original claim. Larson/Farber 4th ed.

Example: t-Test for a Correlation Coefficient Previously you calculated r ≈ 0.9129. Test the significance of this correlation coefficient. Use α = 0.05.
Advertising expenses ($1000), x: 2.4, 1.6, 2.0, 2.6, 1.4, 1.6, 2.0, 2.2
Company sales ($1000), y: 225, 184, 220, 240, 180, 184, 186, 215
Larson/Farber 4th ed.

Solution: t-Test for a Correlation Coefficient H0: ρ = 0; Ha: ρ ≠ 0; α = 0.05; d.f. = 8 − 2 = 6. Rejection region: t < −2.447 or t > 2.447. Test statistic: t = 0.9129 / √[(1 − 0.9129²)/6] ≈ 5.478. Decision: Because 5.478 > 2.447, reject H0. At the 5% level of significance, there is enough evidence to conclude that there is a significant linear correlation between advertising expenses and company sales. Larson/Farber 4th ed.
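A small Python sketch (not from the slides; variable names are illustrative) that reproduces this test statistic and decision:

# Sketch: standardized test statistic t = r / sqrt((1 - r^2) / (n - 2))
# for the advertising/sales data, compared with the two-tailed critical value.
from math import sqrt

r, n, alpha = 0.9129, 8, 0.05
t_stat = r / sqrt((1 - r ** 2) / (n - 2))
print(round(t_stat, 3))          # about 5.478

t_critical = 2.447               # from a t-table, d.f. = 6, two-tailed, alpha = 0.05
print(abs(t_stat) > t_critical)  # True -> reject H0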

Correlation and Causation The fact that two variables are strongly correlated does not in itself imply a cause-and-effect relationship between the variables. If there is a significant correlation between two variables, you should consider the following possibilities. Is there a direct cause-and-effect relationship between the variables? Does x cause y? Larson/Farber 4th ed.

Correlation and Causation Is there a reverse cause-and-effect relationship between the variables? Does y cause x? Is it possible that the relationship between the variables can be caused by a third variable or by a combination of several other variables? Is it possible that the relationship between two variables may be a coincidence? Larson/Farber 4th ed.

Section 9.1 Summary Introduced linear correlation, independent and dependent variables and the types of correlation Found a correlation coefficient Tested a population correlation coefficient ρ using a table Performed a hypothesis test for a population correlation coefficient ρ Distinguished between correlation and causation Larson/Farber 4th ed.

Section 9.2 Linear Regression Larson/Farber 4th ed.

Section 9.2 Objectives Find the equation of a regression line Predict y-values using a regression equation Larson/Farber 4th ed.

Regression Lines After verifying that the linear correlation between two variables is significant, the next step is to determine the equation of the line that best models the data (the regression line). It can be used to predict the value of y for a given value of x. Larson/Farber 4th ed.

Residuals Residual: the difference between the observed y-value and the predicted y-value for a given x-value on the line. For a given x-value, di = (observed y-value) − (predicted y-value). [Graph: residuals d1 through d6 between observed and predicted y-values.] Larson/Farber 4th ed.

Regression Line Regression line (line of best fit): the line for which the sum of the squares of the residuals is a minimum. The equation of a regression line for an independent variable x and a dependent variable y is ŷ = mx + b, where ŷ is the predicted y-value for a given x-value, m is the slope, and b is the y-intercept. Larson/Farber 4th ed.

The Equation of a Regression Line ŷ = mx + b, where the slope is m = [nΣxy − (Σx)(Σy)] / [nΣx² − (Σx)²] and the y-intercept is b = ȳ − m x̄ = Σy/n − m(Σx/n). Here ȳ is the mean of the y-values in the data and x̄ is the mean of the x-values. The regression line always passes through the point (x̄, ȳ). Larson/Farber 4th ed.

Example: Finding the Equation of a Regression Line Find the equation of the regression line for the advertising expenditures and company sales data.
Advertising expenses ($1000), x: 2.4, 1.6, 2.0, 2.6, 1.4, 1.6, 2.0, 2.2
Company sales ($1000), y: 225, 184, 220, 240, 180, 184, 186, 215
Larson/Farber 4th ed.

Solution: Finding the Equation of a Regression Line Recall from Section 9.1 (see the table of sums there): Σx = 15.8, Σy = 1634, Σxy = 3289.8, Σx² = 32.44, Σy² = 337,558. Larson/Farber 4th ed.

Solution: Finding the Equation of a Regression Line Σx = 15.8, Σy = 1634, Σxy = 3289.8, Σx² = 32.44, Σy² = 337,558
m = [8(3289.8) − (15.8)(1634)] / [8(32.44) − (15.8)²] = 501.2 / 9.88 ≈ 50.729
b = ȳ − m x̄ = 1634/8 − (50.729)(15.8/8) ≈ 204.25 − 100.190 ≈ 104.061
Equation of the regression line: ŷ = 50.729x + 104.061 Larson/Farber 4th ed.
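The slope and intercept above can be verified with a short Python sketch (an illustration, not part of the original slides; the variable names are mine):

# Sketch: least-squares slope and intercept for the advertising data,
# using the sums computed in Section 9.1.
x = [2.4, 1.6, 2.0, 2.6, 1.4, 1.6, 2.0, 2.2]
y = [225, 184, 220, 240, 180, 184, 186, 215]

n = len(x)
sum_x, sum_y = sum(x), sum(y)
sum_xy = sum(xi * yi for xi, yi in zip(x, y))
sum_x2 = sum(xi ** 2 for xi in x)

m = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
b = sum_y / n - m * (sum_x / n)
print(round(m, 3), round(b, 3))   # about 50.729 and 104.061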

Solution: Finding the Equation of a Regression Line To sketch the regression line, use any two x-values within the range of the data and calculate the corresponding y-values from the regression line. [Graph: regression line drawn over the scatter plot of advertising expenses (in thousands of dollars) versus company sales.] Larson/Farber 4th ed.

Example: Using Technology to Find a Regression Equation Use a technology tool to find the equation of the regression line for the Old Faithful data (shown in the earlier scatter plot example). Larson/Farber 4th ed.

Solution: Using Technology to Find a Regression Equation [TI-83/84 output: the regression line plotted over the scatter plot, window of roughly 1 to 5 on the x-axis and 50 to 100 on the y-axis.] Larson/Farber 4th ed.

Example: Predicting y-Values Using Regression Equations The regression equation for the advertising expenses (in thousands of dollars) and company sales (in thousands of dollars) data is ŷ = 50.729x + 104.061. Use this equation to predict the expected company sales for the following advertising expenses. (Recall from section 9.1 that x and y have a significant linear correlation.) 1.5 thousand dollars 1.8 thousand dollars 2.5 thousand dollars Larson/Farber 4th ed.

Solution: Predicting y-Values Using Regression Equations ŷ = 50.729x + 104.061 1.5 thousand dollars ŷ =50.729(1.5) + 104.061 ≈ 180.155 When the advertising expenses are $1500, the company sales are about $180,155. 1.8 thousand dollars ŷ =50.729(1.8) + 104.061 ≈ 195.373 When the advertising expenses are $1800, the company sales are about $195,373. Larson/Farber 4th ed.

Solution: Predicting y-Values Using Regression Equations 2.5 thousand dollars ŷ =50.729(2.5) + 104.061 ≈ 230.884 When the advertising expenses are $2500, the company sales are about $230,884. Prediction values are meaningful only for x-values in (or close to) the range of the data. The x-values in the original data set range from 1.4 to 2.6. So, it would not be appropriate to use the regression line to predict company sales for advertising expenditures such as 0.5 ($500) or 5.0 ($5000). Larson/Farber 4th ed.
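A brief Python sketch, illustrative only and not from the slides (the function name predict_sales is hypothetical), that makes these predictions and refuses to extrapolate outside the range of the data, as cautioned above:

# Sketch: predict company sales from advertising expenses with the regression
# equation, guarding against x-values far outside the original data range.
def predict_sales(x, x_min=1.4, x_max=2.6):
    if not (x_min <= x <= x_max):
        raise ValueError("x is outside the range of the data; prediction not meaningful")
    return 50.729 * x + 104.061

for expense in (1.5, 1.8, 2.5):
    print(expense, round(predict_sales(expense), 2))
# about 180.15, 195.37, and 230.88 (thousands of dollars)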

Section 9.2 Summary Found the equation of a regression line Predicted y-values using a regression equation Larson/Farber 4th ed.

Section 9.3 Measures of Regression and Prediction Intervals Larson/Farber 4th ed.

Section 9.3 Objectives Interpret the three types of variation about a regression line Find and interpret the coefficient of determination Find and interpret the standard error of the estimate for a regression line Construct and interpret a prediction interval for y Larson/Farber 4th ed.

Variation About a Regression Line Three types of variation about a regression line: total variation, explained variation, and unexplained variation. To find the total variation, you must first calculate the total deviation, the explained deviation, and the unexplained deviation. Larson/Farber 4th ed.

Variation About a Regression Line Total deviation = yi − ȳ. Explained deviation = ŷi − ȳ. Unexplained deviation = yi − ŷi. [Graph: for a point (xi, yi), the total deviation splits into the explained deviation, measured to the predicted point (xi, ŷi), and the unexplained deviation.] Larson/Farber 4th ed.

Variation About a Regression Line Total variation: the sum of the squares of the differences between the y-value of each ordered pair and the mean of y; Total variation = Σ(yi − ȳ)². Explained variation: the sum of the squares of the differences between each predicted y-value and the mean of y; Explained variation = Σ(ŷi − ȳ)². Larson/Farber 4th ed.

Variation About a Regression Line Unexplained variation: the sum of the squares of the differences between the y-value of each ordered pair and each corresponding predicted y-value; Unexplained variation = Σ(yi − ŷi)². The sum of the explained and unexplained variation is equal to the total variation: Total variation = Explained variation + Unexplained variation. Larson/Farber 4th ed.

Coefficient of Determination The ratio of the explained variation to the total variation. Denoted by r²: r² = (Explained variation) / (Total variation). Larson/Farber 4th ed.

Example: Coefficient of Determination The correlation coefficient for the advertising expenses and company sales data as calculated in Section 9.1 is r ≈ 0.913. Find the coefficient of determination. What does this tell you about the explained variation of the data about the regression line? About the unexplained variation? Solution: r² ≈ (0.913)² ≈ 0.834. About 83.4% of the variation in the company sales can be explained by the variation in the advertising expenditures. About 16.6% of the variation is unexplained. Larson/Farber 4th ed.
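A short Python sketch (not part of the slides; variable names are mine) that computes the three variations directly for the advertising data and confirms that r² is about 0.83:

# Sketch: total, explained, and unexplained variation for the advertising data,
# and the coefficient of determination r^2 = explained / total.
x = [2.4, 1.6, 2.0, 2.6, 1.4, 1.6, 2.0, 2.2]
y = [225, 184, 220, 240, 180, 184, 186, 215]

m, b = 50.729, 104.061
y_hat = [m * xi + b for xi in x]
y_bar = sum(y) / len(y)

total = sum((yi - y_bar) ** 2 for yi in y)
explained = sum((yh - y_bar) ** 2 for yh in y_hat)
unexplained = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))

r_squared = explained / total
print(round(r_squared, 3))   # about 0.83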

The Standard Error of Estimate The standard deviation of the observed yi-values about the predicted ŷ-value for a given xi-value. Denoted by se: se = √[Σ(yi − ŷi)² / (n − 2)], where n is the number of ordered pairs in the data set. The closer the observed y-values are to the predicted y-values, the smaller the standard error of estimate will be. Larson/Farber 4th ed.

The Standard Error of Estimate In Words / In Symbols 1. Make a table that includes the column headings xi, yi, ŷi, and (yi − ŷi)². 2. Use the regression equation to calculate the predicted y-values: ŷi = mxi + b. 3. Calculate the sum of the squares of the differences between each observed y-value and the corresponding predicted y-value: Σ(yi − ŷi)². 4. Find the standard error of estimate: se = √[Σ(yi − ŷi)² / (n − 2)]. Larson/Farber 4th ed.

Example: Standard Error of Estimate The regression equation for the advertising expenses and company sales data as calculated in section 9.2 is ŷ = 50.729x + 104.061 Find the standard error of estimate. Solution: Use a table to calculate the sum of the squared differences of each observed y-value and the corresponding predicted y-value. Larson/Farber 4th ed.

Solution: Standard Error of Estimate
x     y     ŷi       (yi − ŷi)²
2.4   225   225.81   (225 − 225.81)² = 0.6561
1.6   184   185.23   (184 − 185.23)² = 1.5129
2.0   220   205.52   (220 − 205.52)² = 209.6704
2.6   240   235.96   (240 − 235.96)² = 16.3216
1.4   180   175.08   (180 − 175.08)² = 24.2064
1.6   184   185.23   (184 − 185.23)² = 1.5129
2.0   186   205.52   (186 − 205.52)² = 381.0304
2.2   215   215.66   (215 − 215.66)² = 0.4356
Σ = 635.3463 (unexplained variation)
Larson/Farber 4th ed.

Solution: Standard Error of Estimate n = 8, Σ(yi − ŷi)² = 635.3463, so se = √[635.3463 / (8 − 2)] ≈ 10.290. The standard error of estimate of the company sales for a specific advertising expense is about $10,290. Larson/Farber 4th ed.
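The same value can be reproduced with a few lines of Python (an illustrative sketch, not from the slides):

# Sketch: standard error of estimate s_e = sqrt(sum((y_i - y_hat_i)^2) / (n - 2))
# for the advertising/sales data.
from math import sqrt

x = [2.4, 1.6, 2.0, 2.6, 1.4, 1.6, 2.0, 2.2]
y = [225, 184, 220, 240, 180, 184, 186, 215]
m, b = 50.729, 104.061

residual_ss = sum((yi - (m * xi + b)) ** 2 for xi, yi in zip(x, y))
s_e = sqrt(residual_ss / (len(x) - 2))
print(round(s_e, 2))   # about 10.29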

Prediction Intervals Two variables have a bivariate normal distribution if for any fixed value of x, the corresponding values of y are normally distributed and for any fixed values of y, the corresponding x-values are normally distributed. Larson/Farber 4th ed.

Prediction Intervals A prediction interval can be constructed for the true value of y. Given a linear regression equation ŷ = mx + b and x0, a specific value of x, a c-prediction interval for y is ŷ − E < y < ŷ + E, where E = tc·se·√[1 + 1/n + n(x0 − x̄)² / (nΣx² − (Σx)²)]. The point estimate is ŷ and the margin of error is E. The probability that the prediction interval contains y is c. Larson/Farber 4th ed.

Constructing a Prediction Interval for y for a Specific Value of x In Words / In Symbols 1. Identify the number of ordered pairs in the data set, n, and the degrees of freedom: d.f. = n − 2. 2. Use the regression equation and the given x-value to find the point estimate ŷ. 3. Find the critical value tc that corresponds to the given level of confidence c: use Table 5 in Appendix B. Larson/Farber 4th ed.

Constructing a Prediction Interval for y for a Specific Value of x In Words / In Symbols 4. Find the standard error of estimate se. 5. Find the margin of error: E = tc·se·√[1 + 1/n + n(x0 − x̄)² / (nΣx² − (Σx)²)]. 6. Find the left and right endpoints and form the prediction interval. Left endpoint: ŷ − E. Right endpoint: ŷ + E. Interval: ŷ − E < y < ŷ + E. Larson/Farber 4th ed.

Example: Constructing a Prediction Interval Construct a 95% prediction interval for the company sales when the advertising expenses are $2100. What can you conclude? Recall, n = 8, ŷ = 50.729x + 104.061, se = 10.290. Solution: Point estimate: ŷ = 50.729(2.1) + 104.061 ≈ 210.592. Critical value: d.f. = n − 2 = 8 − 2 = 6, tc = 2.447. Margin of error: E = tc·se·√[1 + 1/n + n(x0 − x̄)² / (nΣx² − (Σx)²)] = (2.447)(10.290)·√[1 + 1/8 + 8(2.1 − 1.975)² / (8(32.44) − (15.8)²)] ≈ 26.857. Larson/Farber 4th ed.

Solution: Constructing a Prediction Interval Left Endpoint: ŷ – E Right Endpoint: ŷ + E 210.592 – 26.857 ≈ 183.735 210.592 + 26.857 ≈ 237.449 183.735 < y < 237.449 You can be 95% confident that when advertising expenses are $2100, the company sales will be between $183,735 and $237,449. Larson/Farber 4th ed.
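A Python sketch (illustrative, not from the slides; variable names are mine) that reproduces this prediction interval from the margin-of-error formula in this section:

# Sketch: 95% prediction interval for company sales when advertising expenses
# are $2100 (x0 = 2.1).
from math import sqrt

x = [2.4, 1.6, 2.0, 2.6, 1.4, 1.6, 2.0, 2.2]
n = len(x)
x0, t_c, s_e = 2.1, 2.447, 10.290
y_hat = 50.729 * x0 + 104.061

x_bar = sum(x) / n
sum_x, sum_x2 = sum(x), sum(xi ** 2 for xi in x)
E = t_c * s_e * sqrt(1 + 1 / n + n * (x0 - x_bar) ** 2 / (n * sum_x2 - sum_x ** 2))

print(round(y_hat - E, 3), round(y_hat + E, 3))   # about 183.7 and 237.4 (thousands of dollars)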

Section 9.3 Summary Interpreted the three types of variation about a regression line Found and interpreted the coefficient of determination Found and interpreted the standard error of the estimate for a regression line Constructed and interpreted a prediction interval for y Larson/Farber 4th ed.

Section 9.4 Multiple Regression Larson/Farber 4th ed.

Section 9.4 Objectives Use technology to find a multiple regression equation, the standard error of estimate and the coefficient of determination Use a multiple regression equation to predict y-values Larson/Farber 4th ed.

Multiple Regression Equation In many instances, a better prediction can be found for a dependent (response) variable by using more than one independent (explanatory) variable. For example, a more accurate prediction for the company sales discussed in previous sections might be made by considering the number of employees on the sales staff as well as the advertising expenses. Larson/Farber 4th ed.

Multiple Regression Equation ŷ = b + m1x1 + m2x2 + m3x3 + … + mkxk, where x1, x2, x3, …, xk are independent variables, b is the y-intercept, and y is the dependent variable. *Because the mathematics associated with this concept is complicated, technology is generally used to calculate the multiple regression equation. Larson/Farber 4th ed.

Example: Finding a Multiple Regression Equation A researcher wants to determine how employee salaries at a certain company are related to the length of employment, previous experience, and education. The researcher selects eight employees from the company and obtains the data shown on the next slide. Use Minitab to find a multiple regression equation that models the data. Larson/Farber 4th ed.

Example: Finding a Multiple Regression Equation Employee: Salary, y; Employment (yrs), x1; Experience (yrs), x2; Education (yrs), x3.
A: 57,310, 10, 2, 16; B: 57,380, 5, 6; C: 54,135, 3, 1, 12; D: 56,985, 14; E: 58,715, 8; F: 60,620, 20; G: 59,200, 4, 18; H: 60,320, 17 (several x-values are missing from the transcript)
Larson/Farber 4th ed.

Solution: Finding a Multiple Regression Equation Enter the y-values in C1 and the x1-, x2-, and x3-values in C2, C3 and C4 respectively. Select “Regression > Regression…” from the Stat menu. Use the salaries as the response variable and the remaining data as the predictors. Larson/Farber 4th ed.

Solution: Finding a Multiple Regression Equation The regression equation is ŷ = 49,764 + 364x1 + 228x2 + 267x3 Larson/Farber 4th ed.

Predicting y-Values After finding the equation of the multiple regression line, you can use the equation to predict y-values over the range of the data. To predict y-values, substitute the given value for each independent variable into the equation, then calculate ŷ. Larson/Farber 4th ed.

Example: Predicting y-Values Use the regression equation ŷ = 49,764 + 364x1 + 228x2 + 267x3 to predict an employee’s salary given 12 years of current employment, 5 years of experience, and 16 years of education. Solution: ŷ = 49,764 + 364(12) + 228(5) + 267(16) = 59,544 The employee’s predicted salary is $59,544. Larson/Farber 4th ed.
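The slides rely on Minitab for the fit. The Python sketch below is illustrative only: it assumes numpy is available, the helper names fit_multiple_regression and predict are hypothetical, and the data matrix passed to the fitter is a placeholder rather than the full salary data set. It shows the general least-squares pattern and reproduces the salary prediction from the reported equation.

# Sketch: general pattern for fitting y-hat = b + m1*x1 + ... + mk*xk with
# numpy's least-squares solver, plus prediction with fitted coefficients.
import numpy as np

def fit_multiple_regression(X_rows, y):
    # prepend a column of ones so the first coefficient is the intercept b
    X = np.column_stack([np.ones(len(X_rows)), np.asarray(X_rows, dtype=float)])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(y, dtype=float), rcond=None)
    return coeffs   # [b, m1, m2, ..., mk]

def predict(coeffs, x_values):
    return coeffs[0] + np.dot(coeffs[1:], x_values)

# Using the equation reported by Minitab on the previous slides:
b, m1, m2, m3 = 49_764, 364, 228, 267
print(b + m1 * 12 + m2 * 5 + m3 * 16)   # 59544, i.e. a predicted salary of $59,544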

Section 9.4 Summary Used technology to find a multiple regression equation, the standard error of estimate and the coefficient of determination Used a multiple regression equation to predict y-values Larson/Farber 4th ed.