Probability & Statistics for Engineers & Scientists, by Walpole, Myers, Myers & Ye ~ Chapter 11 Notes

Class notes for ISE 201, San Jose State University, Industrial & Systems Engineering Dept. Steve Kennedy

Simple Linear Regression
If there is a linear relationship between an independent variable x and a dependent variable Y, then

Y = β₀ + β₁x + ε,

where β₀ is the intercept and β₁ is the slope of the linear relationship, and ε is the random error, assumed to be normally distributed with mean μ = 0 and variance σ². Residuals: Given regression data points (xᵢ, yᵢ), i = 1, 2, ..., n, if ŷᵢ = a + bxᵢ is the estimate of yᵢ using the linear model, then the residual eᵢ is given by

eᵢ = yᵢ − ŷᵢ.

The residual for each data point is the vertical distance of the point from the line in the y direction. We will use the "least squares" technique to minimize the sum of the squares of the residuals.
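As a concrete illustration, here is a minimal NumPy sketch that simulates data from this model. The parameter values (β₀ = 2, β₁ = 1.5, σ = 0.5) and the data are hypothetical, chosen only for the example; the least-squares fit and residuals for these data appear in the sketch after the next slide.

```python
import numpy as np

# Simulate the model Y = beta0 + beta1*x + eps with assumed parameters;
# eps is normal with mean 0 and standard deviation sigma.
rng = np.random.default_rng(0)
beta0, beta1, sigma = 2.0, 1.5, 0.5   # hypothetical "true" values
x = np.linspace(0.0, 10.0, 20)
y = beta0 + beta1 * x + rng.normal(0.0, sigma, size=x.size)
```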

Least Squares Method
We wish to find a and b to minimize the sum of the squares of the errors (residuals),

SSE = Σᵢ eᵢ² = Σᵢ (yᵢ − a − bxᵢ)².

To minimize, differentiate with respect to a and b, and set each result to 0. This generates two simultaneous equations (called the normal equations) in the two unknowns. Solving for a and b gives

b = Σᵢ (xᵢ − x̄)(yᵢ − ȳ) / Σᵢ (xᵢ − x̄)²  and  a = ȳ − bx̄.

These a and b are the coefficients of the "best fit" straight line through the data points, the line that minimizes SSE.
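Continuing the simulation sketch above, a short continuation computes the closed-form solutions for b and a, then the fitted values, residuals, and SSE:

```python
# Least-squares estimates for the simulated data above.
x_bar, y_bar = x.mean(), y.mean()
b = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)
a = y_bar - b * x_bar

y_hat = a + b * x       # fitted values a + b*x_i
e = y - y_hat           # residuals e_i = y_i - y_hat_i
sse = np.sum(e ** 2)    # sum of squared residuals
print(f"a = {a:.3f}, b = {b:.3f}, SSE = {sse:.3f}")
```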

Coefficient of Determination (R²)
The coefficient of determination R² is a measure of the proportion of variability explained by the fitted model, and thus a measure of the quality of the linear fit. Recall from the previous slide that SSE is the sum of the squares of the errors (residuals), the amount of variation left unexplained by the straight line. SST, the total sum of squares, SST = Σᵢ (yᵢ − ȳ)², is the total variability in the data. Then R² (the square of the correlation coefficient) is defined as

R² = 1 − SSE/SST.

R² tells us the proportion of the total variation in the data explained by the straight-line relationship. If R² ≈ 1, all points lie very close to the line.
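Continuing the same sketch, SST and R² for the fit above follow directly from the definitions:

```python
# Coefficient of determination for the fit above.
sst = np.sum((y - y_bar) ** 2)   # total sum of squares
r2 = 1.0 - sse / sst             # R^2 = 1 - SSE/SST
print(f"R^2 = {r2:.3f}")
```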

Data Transformations for Regression
If the relationship between the variables is other than linear, we can first transform the dependent variable, the independent variable, or both, and then perform a linear regression on the transformed variables. For example:
Exponential: If y = αe^(βx), use y* = ln y, and regress y* against x.
Power: If y = αx^β, use y* = ln y and x* = ln x, and regress y* against x*.
Reciprocal: If y = α + β(1/x), use x* = 1/x, and regress y against x*.
Hyperbolic: If y = x/(α + βx), use y* = 1/y and x* = 1/x, and regress y* against x*.
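As one illustration of the power case, a small sketch with hypothetical data assumed to follow y = αx^β; fitting a straight line on the log-log scale recovers estimates of α and β:

```python
import numpy as np

# Hypothetical data assumed to follow the power model y = alpha * x**beta.
x = np.array([1.0, 2.0, 4.0, 8.0, 16.0])
y = np.array([3.1, 5.8, 12.4, 24.0, 47.9])

# Transform: ln y = ln(alpha) + beta * ln(x), then fit a straight line.
x_star, y_star = np.log(x), np.log(y)
beta, ln_alpha = np.polyfit(x_star, y_star, 1)  # slope, intercept
alpha = np.exp(ln_alpha)
print(f"alpha = {alpha:.3f}, beta = {beta:.3f}")
```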

Multiple Linear Regression
In a multiple linear regression model, we have k independent variables x₁, x₂, ..., xₖ. The model is

y = β₀ + β₁x₁ + β₂x₂ + ... + βₖxₖ + ε.

The least-squares estimates of the coefficients can be calculated as with simple linear regression, except that there are k + 1 simultaneous normal equations to solve (use matrix inversion). R² still describes the goodness of the linear relationship. Multiple linear regression can also be used to calculate the least-squares coefficients for a polynomial model of the form

y = β₀ + β₁x + β₂x² + ... + βₘxᵐ

by first calculating the square, cube, etc., of the independent variable and then performing a multiple linear regression.
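A sketch of the polynomial case with hypothetical data: the columns 1, x, and x² are treated as the independent variables of a multiple linear regression, and the k + 1 = 3 normal equations are solved by matrix methods, as described above.

```python
import numpy as np

# Hypothetical data; fit y = b0 + b1*x + b2*x^2 by treating x and x^2
# as two "independent variables" in a multiple linear regression.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 2.9, 7.1, 13.2, 21.0, 31.3])

X = np.column_stack([np.ones_like(x), x, x ** 2])  # design matrix

# Solve the normal equations (X'X) b = X'y for the coefficients.
b = np.linalg.solve(X.T @ X, X.T @ y)
print("b0, b1, b2 =", b.round(3))
```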