Curve Fitting. Session (Pertemuan) 10. Course: S0262 Analisis Numerik (Numerical Analysis). Year: 2010.



Material Outline
Curve Fitting
– Least-squares fit
– Quantification of error
– Coefficient of determination
– Coefficient of correlation

CURVE FITTING
In curve fitting, n pairs of numbers (x1, y1), (x2, y2), …, (xn, yn) are given, typically obtained from observations or field measurements of some quantity. The objective is to find a function f that relates the pairs, f(xj) ≈ yj. In other words, when the function is plotted, the resulting curve best fits the pairs of numbers.

CURVE FITTING
One way to find a fitting function for n pairs of observed values is to minimize the discrepancy between the n observations and the curve. Minimizing this discrepancy is known as least-squares regression, which takes two forms here:
– Linear regression
– Polynomial regression

LINEAR REGRESSION
In linear regression, n pairs of observations or field measurements are fitted to a straight line. A straight line can be written as:
y = a0 + a1 x + E
where
a0 : intercept
a1 : slope (gradient)
E : error (discrepancy) between a data point and the chosen straight-line model.
The equation can be rearranged as E = y - a0 - a1 x. From this form it can be seen that the error E is the difference between the true value y and the approximation a0 + a1 x.

LINEAR REGRESSION
E = y - a0 - a1 x
There are several criteria for defining the "best fit":
1. Minimize the sum of the residuals (errors), ΣE
2. Minimize the sum of the absolute values of the residuals, Σ|E|
3. Minimize the sum of the squares of the residuals, ΣE²
Of these three, the best criterion is to minimize the sum of the squared residuals, ΣE². One advantage of this criterion is that the resulting line is unique for each set of n data pairs. This approach is known as the least-squares fit.
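The weakness of criterion 1 can be verified numerically: any line that passes through the centroid (x̄, ȳ) makes ΣE exactly zero, so the criterion does not single out a unique line. A small sketch with made-up sample data:

```python
# Demonstration: the plain sum of residuals is a poor fitting criterion.
# Any line through the centroid (x_mean, y_mean) makes the sum zero,
# regardless of its slope. (Sample data is made up for illustration.)
x = [1, 2, 3, 4, 5]
y = [1.0, 2.0, 1.5, 3.5, 2.0]

x_mean = sum(x) / len(x)
y_mean = sum(y) / len(y)

def sum_of_residuals(a1):
    """Sum of E = y - a0 - a1*x for a line of slope a1 through the centroid."""
    a0 = y_mean - a1 * x_mean
    return sum(yi - a0 - a1 * xi for xi, yi in zip(x, y))

# Two very different slopes, yet both give a residual sum of zero:
print(round(sum_of_residuals(0.0), 10))  # 0.0
print(round(sum_of_residuals(5.0), 10))  # 0.0
```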

Least-Squares Fit
The coefficients a0 and a1 are determined by minimizing the sum of the squared errors (residuals):
Sr = Σ (yi - a0 - a1 xi)²
To minimize, set the partial derivatives to zero (calculus):
∂Sr/∂a0 = -2 Σ (yi - a0 - a1 xi) = 0
∂Sr/∂a1 = -2 Σ (yi - a0 - a1 xi) xi = 0

Least-Squares Fit
Solving the two equations above for a0 and a1 gives:
a1 = (n Σ xi yi - Σ xi Σ yi) / (n Σ xi² - (Σ xi)²)
a0 = ȳ - a1 x̄
where x̄ and ȳ are the means of the xi and yi values.
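The two formulas above translate directly into code; a minimal sketch (the sample data is made up for illustration):

```python
# Least-squares fit of a straight line y = a0 + a1*x.
def linear_fit(x, y):
    """Return (a0, a1) from the least-squares formulas."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = sy / n - a1 * sx / n  # a0 = y_mean - a1 * x_mean
    return a0, a1

# Sample data lying exactly on y = 1 + 2x, so the fit recovers it:
a0, a1 = linear_fit([0, 1, 2, 3], [1.0, 3.0, 5.0, 7.0])
print(a0, a1)  # 1.0 2.0
```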

QUANTIFICATION OF ERROR OF LINEAR REGRESSION
The standard deviation between the prediction model and the data distribution (the standard error of the estimate) can be quantified as:
sy/x = √( Sr / (n - 2) )

QUANTIFICATION OF ERROR OF LINEAR REGRESSION
In addition to the sum of the squares of the residuals (Sr), there is the sum of the squares around the mean value, St = Σ (yi - ȳ)². The difference between St and Sr quantifies the improvement, or error reduction, obtained by using linear regression rather than the average value. Two coefficients quantify this improvement: the coefficient of determination (r²) and the correlation coefficient (r). They measure the goodness of fit of the linear regression.

QUANTIFICATION OF ERROR OF LINEAR REGRESSION
Coefficient of determination:
r² = (St - Sr) / St
Correlation coefficient: r = √r², with 0 ≤ r ≤ 1.
r = 1 → perfect fit (Sr = 0)
r = 0 → no improvement over the mean (St = Sr)
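These error measures translate directly into code; a minimal sketch (the function and variable names are my own):

```python
import math

# Error measures for a fitted line y = a0 + a1*x.
def regression_errors(x, y, a0, a1):
    """Return (sy_x, r2): standard error of the estimate and r^2."""
    n = len(x)
    y_mean = sum(y) / n
    s_t = sum((yi - y_mean) ** 2 for yi in y)                     # spread around the mean
    s_r = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))   # spread around the line
    sy_x = math.sqrt(s_r / (n - 2))     # standard error of the estimate
    r2 = (s_t - s_r) / s_t              # coefficient of determination
    return sy_x, r2

# For data lying exactly on the fitted line, Sr = 0, so r^2 = 1:
sy_x, r2 = regression_errors([0, 1, 2, 3], [1.0, 3.0, 5.0, 7.0], 1.0, 2.0)
print(sy_x, r2)  # 0.0 1.0
```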

Least-Squares Fit
Example: Find the linear regression line that fits the following data, and estimate the standard error of the estimate.

i  xi  yi
1  1   0.5
2  2   2.5
3  3   2.0
4  4   4.0
5  5   3.5
6  6   6.0
7  7   5.5

Answer:

i  xi  yi   xi·yi  xi²  yi - a0 - a1·xi
1  1   0.5  …      …    …
2  2   2.5  …      …    …
3  3   2.0  …      …    …
4  4   4.0  …      …    …
5  5   3.5  …      …    …
6  6   6.0  …      …    …
7  7   5.5  …      …    …
Σ = …

Answer (continued): after completing the table above, substitute the column sums into the formulas for a0 and a1, then compute sy/x.
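As a check, the sums and coefficients the example asks for can be computed directly; a sketch, assuming the decimal commas in the tabulated data denote decimal points:

```python
import math

# Data from the worked example above.
x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]

n = len(x)
sx, sy = sum(x), sum(y)                          # 28, 24.0
sxy = sum(xi * yi for xi, yi in zip(x, y))       # 119.5
sxx = sum(xi ** 2 for xi in x)                   # 140

# Least-squares coefficients.
a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
a0 = sy / n - a1 * sx / n

# Standard error of the estimate.
s_r = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))
sy_x = math.sqrt(s_r / (n - 2))

print(round(a0, 4), round(a1, 4))  # 0.0714 0.8393
print(round(sy_x, 4))              # 0.7734
```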

POLYNOMIAL REGRESSION
For most cases, the linear regression just discussed is appropriate for fitting the data distribution. For some cases, however, it is not; polynomial functions can then be used as an alternative.
A polynomial function can be written as:
y = a0 + a1 x + a2 x² + … + am x^m + E
As before, the sum of the squares of the errors can be written as:
Sr = Σ (yi - a0 - a1 xi - a2 xi² - … - am xi^m)²

POLYNOMIAL REGRESSION
In the polynomial function given above there are m+1 unknown quantities: a0, a1, …, am. These are determined by minimizing the sum of the squared errors Sr, setting each partial derivative to zero:
∂Sr/∂ak = 0, for k = 0, 1, …, m
This yields m+1 simultaneous linear equations (the normal equations), from which the parameters a0, a1, …, am can be determined.
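The m+1 normal equations can be assembled and solved directly; a minimal sketch in pure Python, using naive Gaussian elimination with partial pivoting to solve the system (the function name is my own):

```python
# Polynomial least squares via the normal equations.
def poly_fit(x, y, m):
    """Return [a0, a1, ..., am] minimizing sum (y - a0 - a1*x - ... - am*x^m)^2."""
    n = m + 1
    # Normal-equation matrix A[i][j] = sum x^(i+j) and right-hand side b[i] = sum y*x^i.
    A = [[float(sum(xi ** (i + j) for xi in x)) for j in range(n)] for i in range(n)]
    b = [float(sum(yi * xi ** i for xi, yi in zip(x, y))) for i in range(n)]
    # Forward elimination with partial pivoting.
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(A[r][k]))
        A[k], A[p] = A[p], A[k]
        b[k], b[p] = b[p], b[k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for c in range(k, n):
                A[r][c] -= f * A[k][c]
            b[r] -= f * b[k]
    # Back substitution.
    a = [0.0] * n
    for k in range(n - 1, -1, -1):
        a[k] = (b[k] - sum(A[k][c] * a[c] for c in range(k + 1, n))) / A[k][k]
    return a

# Sample data lying exactly on y = 1 + x^2; a quadratic fit recovers it.
coeffs = poly_fit([0, 1, 2, 3], [1.0, 2.0, 5.0, 10.0], 2)
print([round(c, 6) for c in coeffs])
```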