Linear Regression: Method of Least Squares (y = a + bx)

Presentation transcript:

Linear Regression: Method of Least Squares

The Method of Least Squares is a procedure to determine the best-fit line to data; the proof uses simple calculus and linear algebra. The basic problem is to find the best-fit straight line y = a + bx, where a is the y-intercept and b is the slope, given that the pairs (xn, yn) are observed for n ϵ {1, …, N}. The coefficients are chosen to minimize the sum of squares of the errors

S = Σ (yn − a − b·xn)²,  summed over n = 1, …, N,

which leads to the closed-form solution

b = (N·Σxn·yn − Σxn·Σyn) / (N·Σxn² − (Σxn)²)   (slope)
a = (Σyn − b·Σxn) / N = ȳ − b·x̄                  (y-intercept)

where x̄ and ȳ are the means of the observed x and y values.
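As an illustration of these formulas (not part of the original slides), here is a minimal Matlab sketch that computes the slope and intercept directly from the sums above; the data vectors x and y are made-up example values, and any vectors of equal length could be used instead:

% Minimal sketch: least-squares slope and intercept from the closed-form formulas.
% x and y are hypothetical example data of equal length.
x = [1 2 3 4 5];
y = [2.1 3.9 6.2 8.1 9.8];
N = length(x);
b = (N*sum(x.*y) - sum(x)*sum(y)) / (N*sum(x.^2) - sum(x)^2);  % slope
a = mean(y) - b*mean(x);                                       % y-intercept
fprintf('y = %.3f + %.3f*x\n', a, b)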

Example 1: Find a 1st-order polynomial y = a + bx for the values given in the table.

x: -5, 2, 7
y: -2, 4, 3.5

The least-squares solution is a = 1.188, b = 0.484, i.e. y = 1.188 + 0.484x.

With Matlab:

clc; clear
x = [-5, 2, 7];
y = [-2, 4, 3.5];
p = polyfit(x, y, 1)            % fit a 1st-order polynomial (straight line)
x1 = -5:0.01:7;
yx = polyval(p, x1);            % evaluate the fitted line over the plotting range
plot(x, y, 'ro', x1, yx, 'b')   % data points (red) and fitted curve (blue)
xlabel('x value')
ylabel('y value')
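Note that polyfit returns the coefficients in descending powers of x, so for this example p should come back as approximately [0.484, 1.188], matching the slope b and intercept a quoted above.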

Example 2: Fit y = a + bx to the data in the table.

x: 0, 3, 5, 8, 10
y: 200, 230, 240, 270, 290

The least-squares fit is y = 200.13 + 8.82x.

With Matlab:

clc; clear
x = [0, 3, 5, 8, 10];
y = [200, 230, 240, 270, 290];
p = polyfit(x, y, 1)            % 1st-order least-squares fit
x1 = -1:0.01:12;
yx = polyval(p, x1);
plot(x, y, 'ro', x1, yx, 'b')   % data points (red) and fitted curve (blue)
xlabel('x value')
ylabel('y value')
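As a quick check of this result against the closed-form formulas (arithmetic worked out here, not shown on the original slide):

N = 5, Σx = 26, Σy = 1230, Σxy = 6950, Σx² = 198
b = (5·6950 − 26·1230) / (5·198 − 26²) = 2770 / 314 ≈ 8.82
a = (1230 − 8.82·26) / 5 ≈ 200.13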

Example 3: Tensile tests were performed on a composite material containing a crack in order to calculate the fracture toughness. Using the method of least squares, obtain a linear relationship (slope and intercept) between the breaking load F and the crack length a. The measured F and a values are the x and y vectors in the Matlab script below.

with Matlab:

clc; clear
x = [10, 9.25, 9.1, 9.4, 8.5];     % breaking load F
y = [0.5, 0.4, 0.35, 0.45, 0.28];  % crack length a
p = polyfit(x, y, 1)               % least-squares line: a = p(1)*F + p(2)
F = 8:0.01:12;
a = polyval(p, F);                 % fitted crack length over the plotting range
plot(x, y, 'ro', F, a, 'b')        % data points (red) and fitted line (blue)
xlabel('x value')
ylabel('y value')
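For this data set, evaluating the closed-form formulas (or polyfit) gives approximately a ≈ 0.154·F − 1.027. These numbers are a check worked out here; the original slide does not state the fitted coefficients.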

Example 4: The change in the interior temperature of an oven with respect to time is given in the figure. It is desired to model the relationship between the temperature T and the time t by a first-order polynomial T = c1·t + c2, where c1 is the slope and c2 is the intercept. Determine the coefficients c1 and c2.

t (min): 0, 5, 10, 15
T (°C): 175, 204, 200, 212

with Matlab:

clc; clear
x = [0, 5, 10, 15];          % time t (min)
y = [175, 204, 200, 212];    % temperature T (°C)
p = polyfit(x, y, 1)         % p(1) = c1 (slope), p(2) = c2 (intercept)
t = 0:0.01:15;
T = polyval(p, t);
plot(x, y, 'ro', t, T, 'b')  % data points (red) and fitted line (blue)
xlabel('x value')
ylabel('y value')
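Working through the closed-form formulas with this data gives c1 ≈ 2.14 °C/min and c2 ≈ 181.7 °C, i.e. T ≈ 2.14·t + 181.7. These values are a check computed here, not figures taken from the original slide.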