Applied Statistics and Probability for Engineers Sixth Edition Douglas C. Montgomery George C. Runger Chapter 11 Simple Linear Regression and Correlation

11 Simple Linear Regression and Correlation
CHAPTER OUTLINE
11-1 Empirical Models
11-2 Simple Linear Regression
11-3 Properties of the Least Squares Estimators
11-4 Hypothesis Tests in Simple Linear Regression
  11-4.1 Use of t-tests
  11-4.2 Analysis of variance approach to test significance of regression
11-5 Confidence Intervals
  11-5.1 Confidence intervals on the slope and intercept
  11-5.2 Confidence interval on the mean response
11-6 Prediction of New Observations
11-7 Adequacy of the Regression Model
  11-7.1 Residual analysis
  11-7.2 Coefficient of determination (R²)
11-8 Correlation
11-9 Regression on Transformed Variables
11-10 Logistic Regression
Chapter 11 Title and Outline

Learning Objectives for Chapter 11
After careful study of this chapter, you should be able to do the following:
1. Use simple linear regression for building empirical models for engineering and scientific data.
2. Understand how the method of least squares is used to estimate the parameters in a linear regression model.
3. Analyze residuals to determine whether the regression model is an adequate fit to the data and whether any underlying assumptions are violated.
4. Test statistical hypotheses and construct confidence intervals on the regression model parameters.
5. Use the regression model to predict a future observation and construct an appropriate prediction interval on that observation.
6. Apply the correlation model.
7. Use simple transformations to achieve a linear regression model.
Chapter 11 Learning Objectives

11-1: Empirical Models Many problems in engineering and science involve exploring the relationships between two or more variables. Regression analysis is a statistical technique that is very useful for these types of problems. For example, in a chemical process, suppose that the yield of the product is related to the process-operating temperature. Regression analysis can be used to build a model to predict yield at a given temperature level. Sec 11-1 Empirical Models

11-2: Simple Linear Regression
Simple linear regression considers a single regressor or predictor x and a dependent or response variable Y. The expected value of Y at each level of x is
E(Y|x) = β0 + β1x
We assume that each observation Y can be described by the model
Y = β0 + β1x + ε
where ε is a random error term with mean zero and (unknown) variance σ².
Sec 11-2 Simple Linear Regression

11-2: Simple Linear Regression
Least Squares Estimates
The least-squares estimates of the intercept and slope in the simple linear regression model are
β̂0 = ȳ − β̂1 x̄   (11-1)
β̂1 = Sxy / Sxx   (11-2)
where ȳ = (1/n) Σ yi and x̄ = (1/n) Σ xi.
Sec 11-2 Simple Linear Regression
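The computation behind Equations 11-1 and 11-2 is short enough to sketch directly. The following is a minimal Python illustration with made-up data (the arrays and the helper name are ours, not the Table 11-1 values):

```python
# Minimal sketch of the least-squares estimates (Equations 11-1 and 11-2).
# The x and y arrays below are hypothetical, not the Table 11-1 data.
import numpy as np

def fit_simple_linear_regression(x, y):
    """Return (b0, b1): the least-squares intercept and slope."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    xbar, ybar = x.mean(), y.mean()
    sxx = np.sum((x - xbar) ** 2)           # S_xx
    sxy = np.sum((x - xbar) * (y - ybar))   # S_xy
    b1 = sxy / sxx                          # Equation 11-2
    b0 = ybar - b1 * xbar                   # Equation 11-1
    return b0, b1

x = [1.0, 1.1, 1.2, 1.3, 1.4, 1.5]          # hypothetical regressor values
y = [89.2, 90.6, 91.1, 93.4, 93.9, 95.2]    # hypothetical responses
b0, b1 = fit_simple_linear_regression(x, y)
print(f"y-hat = {b0:.3f} + {b1:.3f} x")
```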

11-2: Simple Linear Regression
The fitted or estimated regression line is therefore
ŷ = β̂0 + β̂1 x   (11-3)
Note that each pair of observations satisfies the relationship
yi = β̂0 + β̂1 xi + ei,  i = 1, 2, …, n
where ei = yi − ŷi is called the residual. The residual describes the error in the fit of the model to the ith observation yi.
Sec 11-2 Simple Linear Regression

11-2: Simple Linear Regression
Notation
Sxx = Σ (xi − x̄)² = Σ xi² − (Σ xi)²/n
Sxy = Σ (xi − x̄)(yi − ȳ) = Σ xi yi − (Σ xi)(Σ yi)/n
Sec 11-2 Simple Linear Regression

EXAMPLE 11-1 Oxygen Purity
We will fit a simple linear regression model to the oxygen purity data in Table 11-1. The following quantities may be computed:
n = 20, x̄ = 1.1960, ȳ = 92.1605, Sxx = 0.68088, and Sxy = 10.17744
Sec 11-2 Simple Linear Regression

EXAMPLE 11-1 Oxygen Purity - continued
Therefore, the least squares estimates of the slope and intercept are
β̂1 = Sxy/Sxx = 10.17744/0.68088 = 14.947
and
β̂0 = ȳ − β̂1 x̄ = 92.1605 − (14.947)(1.196) = 74.283
The fitted simple linear regression model (with the coefficients reported to three decimal places) is
ŷ = 74.283 + 14.947x
Sec 11-2 Simple Linear Regression

11-2: Simple Linear Regression
Estimating σ²
The error sum of squares is
SSE = Σ ei² = Σ (yi − ŷi)²
It can be shown that the expected value of the error sum of squares is E(SSE) = (n − 2)σ².
Sec 11-2 Simple Linear Regression

11-2: Simple Linear Regression
Estimating σ²
An unbiased estimator of σ² is
σ̂² = SSE/(n − 2)   (11-4)
where SSE can be easily computed using
SSE = SST − β̂1 Sxy   (11-5)
with SST = Σ (yi − ȳ)² = Σ yi² − nȳ².
Sec 11-2 Simple Linear Regression
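A short sketch of Equations 11-4 and 11-5, continuing the same hypothetical setup (plain NumPy, no fitted-model object assumed):

```python
# Sketch of Equations 11-4 and 11-5: estimate sigma^2 via SSE = SST - b1*Sxy.
import numpy as np

def sigma2_hat(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    n = x.size
    sxx = np.sum((x - x.mean()) ** 2)
    sxy = np.sum((x - x.mean()) * (y - y.mean()))
    sst = np.sum((y - y.mean()) ** 2)        # total sum of squares
    b1 = sxy / sxx                           # slope estimate
    sse = sst - b1 * sxy                     # Equation 11-5
    return sse / (n - 2)                     # Equation 11-4, unbiased for sigma^2

x = [1.0, 1.1, 1.2, 1.3, 1.4, 1.5]           # hypothetical data again
y = [89.2, 90.6, 91.1, 93.4, 93.9, 95.2]
print(sigma2_hat(x, y))
```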

11-3: Properties of the Least Squares Estimators
Slope Properties
E(β̂1) = β1   V(β̂1) = σ²/Sxx
Intercept Properties
E(β̂0) = β0   V(β̂0) = σ²[1/n + x̄²/Sxx]
Sec 11-3 Properties of the Least Squares Estimators

11-4: Hypothesis Tests in Simple Linear Regression
11-4.1 Use of t-Tests
Suppose we wish to test
H0: β1 = β1,0
H1: β1 ≠ β1,0
An appropriate test statistic would be
T0 = (β̂1 − β1,0)/√(σ̂²/Sxx)   (11-6)
Sec 11-4 Hypothesis Tests in Simple Linear Regression

11-4: Hypothesis Tests in Simple Linear Regression
11-4.1 Use of t-Tests
The test statistic could also be written as
T0 = (β̂1 − β1,0)/se(β̂1)
We would reject the null hypothesis if |t0| > tα/2,n-2.
Sec 11-4 Hypothesis Tests in Simple Linear Regression

11-4: Hypothesis Tests in Simple Linear Regression
11-4.1 Use of t-Tests
Suppose we wish to test
H0: β0 = β0,0
H1: β0 ≠ β0,0
An appropriate test statistic would be
T0 = (β̂0 − β0,0)/√(σ̂²[1/n + x̄²/Sxx])   (11-7)
Sec 11-4 Hypothesis Tests in Simple Linear Regression

11-4: Hypothesis Tests in Simple Linear Regression
11-4.1 Use of t-Tests
We would reject the null hypothesis if |t0| > tα/2,n-2.
Sec 11-4 Hypothesis Tests in Simple Linear Regression
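As an arithmetic check, both t-statistics can be reproduced from the summary statistics quoted in this chapter. A sketch assuming SciPy is available, using the oxygen purity values from Examples 11-1 and 11-2:

```python
# Reproducing the slope and intercept t-statistics (Equations 11-6 and 11-7)
# from the oxygen purity summary statistics quoted in Examples 11-1 and 11-2.
from math import sqrt
from scipy import stats

n, x_bar = 20, 1.196
sxx = 0.68088
b1, b0 = 14.947, 74.283
sigma2 = 1.18                                # sigma^2-hat

se_b1 = sqrt(sigma2 / sxx)                   # est. standard error of the slope
se_b0 = sqrt(sigma2 * (1 / n + x_bar**2 / sxx))

t_slope = b1 / se_b1                         # ~ 11.35 (tests H0: beta1 = 0)
t_intercept = b0 / se_b0                     # ~ 46.6  (tests H0: beta0 = 0)
p_slope = 2 * stats.t.sf(abs(t_slope), n - 2)
print(t_slope, t_intercept, p_slope)
```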

11-4: Hypothesis Tests in Simple Linear Regression
11-4.1 Use of t-Tests
An important special case of the hypotheses of Equation 11-6 is
H0: β1 = 0
H1: β1 ≠ 0
These hypotheses relate to the significance of regression. Failure to reject H0 is equivalent to concluding that there is no linear relationship between x and Y.
Sec 11-4 Hypothesis Tests in Simple Linear Regression

11-4: Hypothesis Tests in Simple Linear Regression
EXAMPLE 11-2 Oxygen Purity Tests of Coefficients
We will test for significance of regression using the model for the oxygen purity data from Example 11-1. The hypotheses are
H0: β1 = 0
H1: β1 ≠ 0
and we will use α = 0.01. From Example 11-1 and Table 11-2 we have
β̂1 = 14.947, n = 20, Sxx = 0.68088, σ̂² = 1.18
so the t-statistic in Equation 11-6 becomes
t0 = β̂1/√(σ̂²/Sxx) = 14.947/√(1.18/0.68088) = 11.35
Practical Interpretation: Since the reference value of t is t0.005,18 = 2.88, the value of the test statistic is very far into the critical region, implying that H0: β1 = 0 should be rejected. The P-value for this test is P ≈ 1.23 × 10⁻⁹, obtained manually with a calculator.
Table 11-2 presents the Minitab output for this problem. Notice that the t-statistic value for the slope is computed as 11.35 and that the reported P-value is P = 0.000. Minitab also reports the t-statistic for testing the hypothesis H0: β0 = 0. This statistic is computed from Equation 11-7, with β0,0 = 0, as t0 = 46.62. Clearly, then, the hypothesis that the intercept is zero is rejected.
Sec 11-4 Hypothesis Tests in Simple Linear Regression

11-4: Hypothesis Tests in Simple Linear Regression
11-4.2 Analysis of Variance Approach to Test Significance of Regression
The analysis of variance identity is
Σ (yi − ȳ)² = Σ (ŷi − ȳ)² + Σ (yi − ŷi)²   (11-8)
Symbolically,
SST = SSR + SSE   (11-9)
Sec 11-4 Hypothesis Tests in Simple Linear Regression

11-4: Hypothesis Tests in Simple Linear Regression
11-4.2 Analysis of Variance Approach to Test Significance of Regression
If the null hypothesis H0: β1 = 0 is true, the statistic
F0 = (SSR/1)/(SSE/(n − 2)) = MSR/MSE   (11-10)
follows the F1,n-2 distribution, and we would reject H0 if f0 > fα,1,n-2.
Sec 11-4 Hypothesis Tests in Simple Linear Regression

11-4: Hypothesis Tests in Simple Linear Regression
11-4.2 Analysis of Variance Approach to Test Significance of Regression
The quantities MSR and MSE are called mean squares. Note that MSE = σ̂².
Analysis of variance table:

Source of Variation   Sum of Squares        Degrees of Freedom   Mean Square   F0
Regression            SSR = β̂1 Sxy          1                    MSR           MSR/MSE
Error                 SSE = SST − β̂1 Sxy    n − 2                MSE
Total                 SST                   n − 1

Sec 11-4 Hypothesis Tests in Simple Linear Regression
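A sketch of the ANOVA computation, using the oxygen purity summary numbers from Example 11-3 below (SciPy assumed for the F-distribution tail probability):

```python
# ANOVA F-test for significance of regression (Equation 11-10), using the
# oxygen purity summary numbers quoted in Example 11-3.
from scipy import stats

n = 20
sst, b1, sxy = 173.38, 14.947, 10.17744
ssr = b1 * sxy                               # ~ 152.13
sse = sst - ssr                              # ~ 21.25
msr = ssr / 1                                # regression mean square
mse = sse / (n - 2)                          # error mean square = sigma^2-hat
f0 = msr / mse                               # ~ 128.86
p_value = stats.f.sf(f0, 1, n - 2)           # upper-tail F probability
print(f0, p_value)
```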

11-4: Hypothesis Tests in Simple Linear Regression
EXAMPLE 11-3 Oxygen Purity ANOVA
We will use the analysis of variance approach to test for significance of regression using the oxygen purity data model from Example 11-1. Recall that SST = 173.38, β̂1 = 14.947, Sxy = 10.17744, and n = 20. The regression sum of squares is
SSR = β̂1 Sxy = (14.947)(10.17744) = 152.13
and the error sum of squares is
SSE = SST − SSR = 173.38 − 152.13 = 21.25
The analysis of variance for testing H0: β1 = 0 is summarized in the Minitab output in Table 11-2. The test statistic is f0 = MSR/MSE = 152.13/1.18 = 128.86, for which we find that the P-value is P ≈ 1.23 × 10⁻⁹, so we conclude that β1 is not zero.
There are frequently minor differences in terminology among computer packages. For example, sometimes the regression sum of squares is called the "model" sum of squares, and the error sum of squares is called the "residual" sum of squares.
Sec 11-4 Hypothesis Tests in Simple Linear Regression

11-5: Confidence Intervals
11-5.1 Confidence Intervals on the Slope and Intercept
Definition
Under the assumption that the observations are normally and independently distributed, a 100(1 − α)% confidence interval on the slope β1 in simple linear regression is
β̂1 − tα/2,n-2 √(σ̂²/Sxx) ≤ β1 ≤ β̂1 + tα/2,n-2 √(σ̂²/Sxx)   (11-11)
Similarly, a 100(1 − α)% confidence interval on the intercept β0 is
β̂0 − tα/2,n-2 √(σ̂²[1/n + x̄²/Sxx]) ≤ β0 ≤ β̂0 + tα/2,n-2 √(σ̂²[1/n + x̄²/Sxx])   (11-12)
Sec 11-5 Confidence Intervals

11-5: Confidence Intervals
EXAMPLE 11-4 Oxygen Purity Confidence Interval on the Slope
We will find a 95% confidence interval on the slope of the regression line using the data in Example 11-1. Recall that β̂1 = 14.947, Sxx = 0.68088, and σ̂² = 1.18 (see Table 11-2). Then, from Equation 11-11 we find
β̂1 ± t0.025,18 √(σ̂²/Sxx)
or
14.947 − 2.101 √(1.18/0.68088) ≤ β1 ≤ 14.947 + 2.101 √(1.18/0.68088)
This simplifies to
12.181 ≤ β1 ≤ 17.713
Practical Interpretation: This CI does not include zero, so there is strong evidence (at α = 0.05) that the slope is not zero. The CI is reasonably narrow (±2.766) because the error variance is fairly small.
Sec 11-5 Confidence Intervals
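The interval is easy to reproduce numerically; a minimal sketch, assuming SciPy for the t quantile:

```python
# Equation 11-11 with the Example 11-4 numbers; should give ~ (12.181, 17.713).
from math import sqrt
from scipy import stats

n, b1, sxx, sigma2 = 20, 14.947, 0.68088, 1.18
t_crit = stats.t.ppf(0.975, n - 2)           # t_{0.025,18} ~ 2.101
half_width = t_crit * sqrt(sigma2 / sxx)     # ~ 2.766
print(b1 - half_width, b1 + half_width)
```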

11-5: Confidence Intervals
11-5.2 Confidence Interval on the Mean Response
Definition
A 100(1 − α)% confidence interval on the mean response at the value of x, say x0, is given by
μ̂Y|x0 − tα/2,n-2 √(σ̂²[1/n + (x0 − x̄)²/Sxx]) ≤ μY|x0 ≤ μ̂Y|x0 + tα/2,n-2 √(σ̂²[1/n + (x0 − x̄)²/Sxx])   (11-13)
where μ̂Y|x0 = β̂0 + β̂1 x0 is computed from the fitted regression model.
Sec 11-5 Confidence Intervals

11-5: Confidence Intervals
Example 11-5 Oxygen Purity Confidence Interval on the Mean Response
We will construct a 95% confidence interval about the mean response for the data in Example 11-1. The fitted model is μ̂Y|x0 = 74.283 + 14.947x0, and the 95% confidence interval on μY|x0 is found from Equation 11-13 as
μ̂Y|x0 ± 2.101 √(1.18[1/20 + (x0 − 1.196)²/0.68088])
Suppose that we are interested in predicting mean oxygen purity when the hydrocarbon level is x0 = 1.00%. Then μ̂Y|1.00 = 74.283 + 14.947(1.00) = 89.23, and the 95% confidence interval is
89.23 ± 2.101 √(1.18[1/20 + (1.00 − 1.196)²/0.68088])
or 89.23 ± 0.75. Therefore, the 95% CI on μY|1.00 is
88.48 ≤ μY|1.00 ≤ 89.98
This is a reasonably narrow CI.
Sec 11-5 Confidence Intervals
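The same interval in code, a sketch with the Example 11-5 numbers:

```python
# Equation 11-13 at x0 = 1.00 for the oxygen purity model (Example 11-5);
# should give roughly 88.48 <= mu <= 89.98.
from math import sqrt
from scipy import stats

n, x_bar, sxx, sigma2 = 20, 1.196, 0.68088, 1.18
b0, b1, x0 = 74.283, 14.947, 1.00
mu_hat = b0 + b1 * x0                        # ~ 89.23
se_mean = sqrt(sigma2 * (1 / n + (x0 - x_bar) ** 2 / sxx))
t_crit = stats.t.ppf(0.975, n - 2)
print(mu_hat - t_crit * se_mean, mu_hat + t_crit * se_mean)
```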

11-6: Prediction of New Observations
Prediction Interval
A 100(1 − α)% prediction interval on a future observation Y0 at the value x0 is given by
ŷ0 − tα/2,n-2 √(σ̂²[1 + 1/n + (x0 − x̄)²/Sxx]) ≤ Y0 ≤ ŷ0 + tα/2,n-2 √(σ̂²[1 + 1/n + (x0 − x̄)²/Sxx])   (11-14)
The value ŷ0 is computed from the regression model ŷ0 = β̂0 + β̂1 x0.
Sec 11-6 Prediction of New Observations

11-6: Prediction of New Observations
EXAMPLE 11-6 Oxygen Purity Prediction Interval
To illustrate the construction of a prediction interval, suppose we use the data in Example 11-1 and find a 95% prediction interval on the next observation of oxygen purity at x0 = 1.00%. Using Equation 11-14 and recalling from Example 11-5 that ŷ0 = 89.23, we find that the prediction interval is
89.23 − 2.101 √(1.18[1 + 1/20 + (1.00 − 1.196)²/0.68088]) ≤ Y0 ≤ 89.23 + 2.101 √(1.18[1 + 1/20 + (1.00 − 1.196)²/0.68088])
which simplifies to
86.83 ≤ y0 ≤ 91.63
This is a reasonably narrow prediction interval.
Sec 11-6 Prediction of New Observations
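In code, the only change from the mean-response interval is the extra "1 +" under the square root; a sketch:

```python
# Equation 11-14 at x0 = 1.00 (Example 11-6); should give ~ (86.83, 91.63).
# Note the extra "1 +" relative to the mean-response interval, reflecting
# the variance of the new observation Y0 itself.
from math import sqrt
from scipy import stats

n, x_bar, sxx, sigma2 = 20, 1.196, 0.68088, 1.18
y0_hat, x0 = 89.23, 1.00
se_pred = sqrt(sigma2 * (1 + 1 / n + (x0 - x_bar) ** 2 / sxx))
t_crit = stats.t.ppf(0.975, n - 2)
print(y0_hat - t_crit * se_pred, y0_hat + t_crit * se_pred)
```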

11-7: Adequacy of the Regression Model
Fitting a regression model requires several assumptions:
1. Errors are uncorrelated random variables with mean zero;
2. Errors have constant variance; and
3. Errors are normally distributed.
The analyst should always consider the validity of these assumptions to be doubtful and conduct analyses to examine the adequacy of the model.
Sec 11-7 Adequacy of the Regression Model

11-7: Adequacy of the Regression Model 11-7.1 Residual Analysis The residuals from a regression model are ei = yi - ŷi , where yi is an actual observation and ŷi is the corresponding fitted value from the regression model. Analysis of the residuals is frequently helpful in checking the assumption that the errors are approximately normally distributed with constant variance, and in determining whether additional terms in the model would be useful. Sec 11-7 Adequacy of the Regression Model
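Residual diagnostics of this kind are easy to script; a minimal sketch assuming Matplotlib and SciPy are available, with placeholder data rather than the Table 11-1 values:

```python
# Sketch of the two standard residual plots; placeholder data, not Table 11-1.
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

x = np.array([1.0, 1.1, 1.2, 1.3, 1.4, 1.5, 1.6, 1.7])
y = np.array([89.2, 90.6, 91.1, 93.4, 93.9, 95.2, 96.0, 97.9])
b1 = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
b0 = y.mean() - b1 * x.mean()
fitted = b0 + b1 * x
residuals = y - fitted                        # e_i = y_i - yhat_i

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
stats.probplot(residuals, plot=axes[0])       # normal probability plot
axes[1].scatter(fitted, residuals)            # residuals vs fitted values
axes[1].axhline(0.0, linestyle="--")
axes[1].set_xlabel("fitted value")
axes[1].set_ylabel("residual")
plt.show()
```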

11-7: Adequacy of the Regression Model
EXAMPLE 11-7 Oxygen Purity Residuals
The regression model for the oxygen purity data in Example 11-1 is ŷ = 74.283 + 14.947x. Table 11-4 presents the observed and predicted values of y at each value of x from this data set, along with the corresponding residual. These values were computed using Minitab and show the number of decimal places typical of computer output. A normal probability plot of the residuals is shown in Fig. 11-10. Since the residuals fall approximately along a straight line in the figure, we conclude that there is no severe departure from normality. The residuals are also plotted against the predicted value ŷi in Fig. 11-11 and against the hydrocarbon levels xi in Fig. 11-12. These plots do not indicate any serious model inadequacies.
Sec 11-7 Adequacy of the Regression Model

11-7: Adequacy of the Regression Model
Example 11-7 [Table 11-4: observed values, predicted values, and residuals for the oxygen purity data]
Sec 11-7 Adequacy of the Regression Model

11-7: Adequacy of the Regression Model Example 11-7 Figure 11-10 Normal probability plot of residuals, Example 11-7. Sec 11-7 Adequacy of the Regression Model

11-7: Adequacy of the Regression Model Example 11-7 Figure 11-11 Plot of residuals versus predicted oxygen purity, ŷ, Example 11-7. Sec 11-7 Adequacy of the Regression Model

11-7: Adequacy of the Regression Model
11-7.2 Coefficient of Determination (R²)
The quantity
R² = SSR/SST = 1 − SSE/SST
is called the coefficient of determination and is often used to judge the adequacy of a regression model.
0 ≤ R² ≤ 1
We often refer (loosely) to R² as the amount of variability in the data explained or accounted for by the regression model.
Sec 11-7 Adequacy of the Regression Model

11-7: Adequacy of the Regression Model 11-7.2 Coefficient of Determination (R2) For the oxygen purity regression model, R2 = SSR/SST = 152.13/173.38 = 0.877 Thus, the model accounts for 87.7% of the variability in the data. Sec 11-7 Adequacy of the Regression Model

11-8: Correlation
We assume that the joint distribution of Xi and Yi is the bivariate normal distribution presented in Chapter 5, where μY and σ²Y are the mean and variance of Y, μX and σ²X are the mean and variance of X, and ρ is the correlation coefficient between Y and X. Recall that the correlation coefficient is defined as
ρ = σXY/(σX σY)   (11-15)
where σXY is the covariance between Y and X.
The conditional distribution of Y for a given value of X = x is normal with mean β0 + β1x and variance σ²Y|x = σ²Y(1 − ρ²)   (11-16)
where
β0 = μY − μX β1   (11-17)
β1 = (σY/σX) ρ   (11-18)
Sec 11-8 Correlation

11-8: Correlation
It is possible to draw inferences about the correlation coefficient ρ in this model. The estimator of ρ is the sample correlation coefficient
R = Sxy/√(Sxx SST)   (11-19)
Note that
β̂1 = (SST/Sxx)^(1/2) R   (11-20)
We may also write
R² = β̂1 Sxy/SST = SSR/SST
Sec 11-8 Correlation

11-8: Correlation
It is often useful to test the hypotheses
H0: ρ = 0
H1: ρ ≠ 0
The appropriate test statistic for these hypotheses is
T0 = R√(n − 2)/√(1 − R²)   (11-21)
which has the t distribution with n − 2 degrees of freedom if H0: ρ = 0 is true. Reject H0 if |t0| > tα/2,n-2.
Sec 11-8 Correlation
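A sketch of this test, using the wire bond values that appear in Example 11-8 below (r = 0.9818, n = 25), with SciPy for the t tail probability:

```python
# Equation 11-21 with the wire bond values from Example 11-8.
from math import sqrt
from scipy import stats

r, n = 0.9818, 25
t0 = r * sqrt(n - 2) / sqrt(1 - r**2)        # ~ 24.8
p = 2 * stats.t.sf(abs(t0), n - 2)           # two-sided P-value
print(t0, p)
```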

11-8: Correlation
The test procedure for the hypothesis H0: ρ = ρ0, where ρ0 ≠ 0, is somewhat more complicated. In this case, the appropriate test statistic is
Z0 = (arctanh R − arctanh ρ0)(n − 3)^(1/2)   (11-22)
Reject H0 if |z0| > zα/2.
Sec 11-8 Correlation

11-8: Correlation
The approximate 100(1 − α)% confidence interval on ρ is
tanh(arctanh r − zα/2/√(n − 3)) ≤ ρ ≤ tanh(arctanh r + zα/2/√(n − 3))   (11-23)
Sec 11-8 Correlation

11-8: Correlation EXAMPLE 11-8 Wire Bond Pull Strength In Chapter 1 (Section 1-3) an application of regression analysis is described in which an engineer at a semiconductor assembly plant is investigating the relationship between pull strength of a wire bond and two factors: wire length and die height. In this example, we will consider only one of the factors, the wire length. A random sample of 25 units is selected and tested, and the wire bond pull strength and wire length are observed for each unit. The data are shown in Table 1-2. We assume that pull strength and wire length are jointly normally distributed. Figure 11-13 shows a scatter diagram of wire bond strength versus wire length. We have used the Minitab option of displaying box plots of each individual variable on the scatter diagram. There is evidence of a linear relationship between the two variables. The Minitab output for fitting a simple linear regression model to the data is shown below. Sec 11-8 Correlation

11-8: Correlation Figure 11-13 Scatter plot of wire bond strength versus wire length, Example 11-8. Sec 11-8 Correlation

11-8: Correlation
Minitab Output for Example 11-8
Regression Analysis: Strength versus Length
The regression equation is
Strength = 5.11 + 2.90 Length

Predictor   Coef     SE Coef   T       P
Constant    5.115    1.146     4.46    0.000
Length      2.9027   0.1170    24.80   0.000

S = 3.093   R-Sq = 96.4%   R-Sq(adj) = 96.2%
PRESS = 272.144   R-Sq(pred) = 95.54%

Analysis of Variance
Source           DF   SS       MS       F        P
Regression       1    5885.9   5885.9   615.08   0.000
Residual Error   23   220.1    9.6
Total            24   6105.9

Sec 11-8 Correlation

11-8: Correlation
Example 11-8 (continued)
Now Sxx = 698.56 and Sxy = 2027.7132, and the sample correlation coefficient is
r = Sxy/√(Sxx SST) = 2027.7132/[(698.56)(6105.9)]^(1/2) = 0.9818
Note that r² = (0.9818)² = 0.9640 (which is reported in the Minitab output), or that approximately 96.40% of the variability in pull strength is explained by the linear relationship to wire length.
Sec 11-8 Correlation

11-8: Correlation
Example 11-8 (continued)
Now suppose that we wish to test the hypotheses
H0: ρ = 0
H1: ρ ≠ 0
with α = 0.05. We can compute the t-statistic of Equation 11-21 as
t0 = r√(n − 2)/√(1 − r²) = 0.9818 √23/√(1 − 0.9640) = 24.8
This statistic is also reported in the Minitab output as a test of H0: β1 = 0. Because t0.025,23 = 2.069, we reject H0 and conclude that the correlation coefficient ρ ≠ 0.
Sec 11-8 Correlation

11-8: Correlation
Example 11-8 (continued)
Finally, we may construct an approximate 95% confidence interval on ρ from Equation 11-23. Since arctanh r = arctanh 0.9818 = 2.3452, Equation 11-23 becomes
tanh(2.3452 − 1.96/√22) ≤ ρ ≤ tanh(2.3452 + 1.96/√22)
which reduces to
0.9585 ≤ ρ ≤ 0.9921
Sec 11-8 Correlation
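This interval is easy to verify; a minimal sketch of Equation 11-23, assuming SciPy for the normal quantile:

```python
# Equation 11-23 with r = 0.9818, n = 25; should reduce to ~ (0.9585, 0.9921).
from math import atanh, tanh, sqrt
from scipy import stats

r, n = 0.9818, 25
z = stats.norm.ppf(0.975)                    # ~ 1.96
lo = tanh(atanh(r) - z / sqrt(n - 3))
hi = tanh(atanh(r) + z / sqrt(n - 3))
print(lo, hi)
```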

11-9: Transformation and Logistic Regression
We occasionally find that the straight-line regression model Y = β0 + β1x + ε is inappropriate because the true regression function is nonlinear. Sometimes nonlinearity is visually determined from the scatter diagram, and sometimes, because of prior experience or underlying theory, we know in advance that the model is nonlinear. Occasionally, a scatter diagram will exhibit an apparent nonlinear relationship between Y and x. In some of these situations, a nonlinear function can be expressed as a straight line by using a suitable transformation. Such nonlinear models are called intrinsically linear.
Sec 11-9 Transformation and Logistic Regression

11-9: Transformation and Logistic Regression
EXAMPLE 11-9 Windmill Power
A research engineer is investigating the use of a windmill to generate electricity and has collected data on the DC output from this windmill and the corresponding wind velocity. The data are plotted in Figure 11-14 and listed in Table 11-5.

Table 11-5 Observed Values and Regressor Variable for Example 11-9
 i   Wind Velocity (mph), xi   DC Output, yi
 1    5.00   1.582
 2    6.00   1.822
 3    3.40   1.057
 4    2.70   0.500
 5   10.00   2.236
 6    9.70   2.386
 7    9.55   2.294
 8    3.05   0.558
 9    8.15   2.166
10    6.20   1.866
11    2.90   0.653
12    6.35   1.930
13    4.60   1.562
14    5.80   1.737
15    7.40   2.088
16    3.60   1.137
17    7.85   2.179
18    8.80   2.112
19    7.00   1.800
20    5.45   1.501
21    9.10   2.303
22   10.20   2.310
23    4.10   1.194
24    3.95   1.144
25    2.45   0.123

Sec 11-9 Transformation and Logistic Regression

11-9: Transformation and Logistic Regression Example 11-9 (Continued) Figure 11-14 Plot of DC output y versus wind velocity x for the windmill data. Figure 11-15 Plot of residuals ei versus fitted values for the windmill data. Sec 11-9 Transformation and Logistic Regression

11-9: Transformation and Logistic Regression Example 11-9 (Continued) Figure 11-16 Plot of DC output versus x’ = 1/x for the windmill data. Sec 11-9 Transformation and Logistic Regression
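Since Table 11-5 lists the raw data, the transformed fit can be reproduced directly; a minimal sketch using the regressor x′ = 1/x (the coefficient values in the final comment are approximate):

```python
# Least-squares fit of y on the transformed regressor x' = 1/x, using the
# Table 11-5 windmill data from Example 11-9.
import numpy as np

x = np.array([5.00, 6.00, 3.40, 2.70, 10.00, 9.70, 9.55, 3.05, 8.15, 6.20,
              2.90, 6.35, 4.60, 5.80, 7.40, 3.60, 7.85, 8.80, 7.00, 5.45,
              9.10, 10.20, 4.10, 3.95, 2.45])
y = np.array([1.582, 1.822, 1.057, 0.500, 2.236, 2.386, 2.294, 0.558, 2.166,
              1.866, 0.653, 1.930, 1.562, 1.737, 2.088, 1.137, 2.179, 2.112,
              1.800, 1.501, 2.303, 2.310, 1.194, 1.144, 0.123])
xp = 1.0 / x                                 # transformed regressor x' = 1/x
b1 = np.sum((xp - xp.mean()) * (y - y.mean())) / np.sum((xp - xp.mean()) ** 2)
b0 = y.mean() - b1 * xp.mean()
print(f"y-hat = {b0:.4f} + {b1:.4f} * (1/x)")  # roughly 2.98 - 6.93/x
```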

11-9: Transformation and Logistic Regression
Example 11-9 (Continued)
Figure 11-17 Plot of residuals versus fitted values for the transformed model for the windmill data.
Figure 11-18 Normal probability plot of the residuals for the transformed model for the windmill data.
A plot of the residuals from the transformed model versus the fitted values ŷi is shown in Figure 11-17. This plot does not reveal any serious problem with inequality of variance. The normal probability plot, shown in Figure 11-18, gives a mild indication that the errors come from a distribution with heavier tails than the normal (notice the slight upward and downward curve at the extremes). This normal probability plot has the z-score value plotted on the horizontal axis. Since there is no strong signal of model inadequacy, we conclude that the transformed model is satisfactory.
Sec 11-9 Transformation and Logistic Regression

Important Terms & Concepts of Chapter 11
Analysis of variance test in regression
Confidence interval on mean response
Confidence intervals on model parameters
Correlation coefficient
Empirical model
Intrinsically linear model
Least squares estimation of regression model parameters
Logistic regression
Model adequacy checking
Odds ratio
Prediction interval on a future observation
Regression analysis
Residual plots
Residuals
Scatter diagram
Simple linear regression model
Standard error
Statistical tests on model parameters
Transformations
Chapter 11 Summary