P M V Subbarao Professor Mechanical Engineering Department

Curve Fitting: An Optimization Method to Develop a Model for an Instrument

Fewer Unknowns & More Equations (scatter plot of data y vs. x)

Quantifying error in a curve fit: positive and negative errors of the same magnitude should count equally (whether a data point lies above or below the line), and greater errors should be weighted more heavily. Squaring the error satisfies both requirements.

Hunting for a Shape & Geometric Model to Represent a Data Set

Minimum Energy Method: Least Squares Method. Suppose our fit is a straight line, y = ax + b.

The ‘best’ line has minimum total error between the line and the data points. This is called the least squares approach, since the sum of squared errors is minimized. Take the derivatives of the error with respect to a and b, and set each to zero.

Solve for a and b so that the previous two equations both equal zero.

Put these into matrix form:

[ n    Σxi  ] [ b ]   [ Σyi   ]
[ Σxi  Σxi² ] [ a ] = [ Σxiyi ]
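The 2×2 normal equations can be solved directly. A minimal sketch in Python with NumPy (the data values are made up for illustration):

```python
import numpy as np

def fit_line(x, y):
    """Fit y = a*x + b by solving the 2x2 normal equations."""
    n = len(x)
    A = np.array([[n,       x.sum()],
                  [x.sum(), (x ** 2).sum()]])
    rhs = np.array([y.sum(), (x * y).sum()])
    b, a = np.linalg.solve(A, rhs)  # intercept b, slope a
    return a, b

x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 3.0, 5.0, 7.0])  # lies exactly on y = 2x + 1
a, b = fit_line(x, y)
```

For real data the points will not fall exactly on a line; the solve then returns the slope and intercept minimizing the sum of squared errors.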

Is a straight line suitable for each of these cases?

The Least-Squares mth Degree Polynomial. When using an mth degree polynomial to approximate the given set of data (x1,y1), (x2,y2), …, (xn,yn), where n ≥ m+1, the best-fitting curve minimizes the least square error E = Σi (yi − Σj aj xi^j)², with j running from 0 to m.

To obtain the least square error, the unknown coefficients a0, a1, …, am must make the first partial derivatives of the squared-error sum E vanish: ∂E/∂ak = −2 Σi xi^k (yi − Σj aj xi^j) = 0, for k = 0, 1, …, m.

Expanding the previous equations gives the normal equations: Σj aj Σi xi^(j+k) = Σi yi xi^k, for k = 0, 1, …, m. The unknown coefficients can hence be obtained by solving these linear equations.

No matter what the order m, we always get equations that are LINEAR with respect to the coefficients. This means we can use the same matrix solution method as before.
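A minimal sketch of the polynomial normal equations in Python with NumPy, using a Vandermonde matrix V so that the system is (Vᵀ V) a = Vᵀ y (the quadratic test data is made up for illustration):

```python
import numpy as np

def polyfit_normal(x, y, m):
    """Least-squares m-th degree polynomial via the (m+1)x(m+1) normal equations."""
    V = np.vander(x, m + 1, increasing=True)    # columns: 1, x, x^2, ..., x^m
    coeffs = np.linalg.solve(V.T @ V, V.T @ y)  # solve (V^T V) a = V^T y
    return coeffs                               # a0, a1, ..., am

x = np.linspace(0.0, 2.0, 6)
y = 1.0 + 2.0 * x + 3.0 * x ** 2  # exact quadratic, so the fit recovers it
a = polyfit_normal(x, y, 2)
```

For high orders the normal equations become ill-conditioned, which is why library routines such as `np.polyfit` solve the least squares problem by orthogonal methods instead; the normal-equation form is shown here because it mirrors the derivation above.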

Selection of Order of Fit. The 2nd and 6th order fits look similar, but the 6th order fit has a ‘squiggle’ to it. Is it required or not?

Under Fit or Over Fit: Picking an Appropriate Order
Underfit: the order is too low to capture obvious trends in the data.
Overfit: the order is too high, over-doing the requirement for the fit to ‘match’ the data trend.
Polynomials become more ‘squiggly’ as their order increases; a ‘squiggly’ appearance comes from inflections in the function.
General rule: pick a polynomial order at least several orders lower than the number of data points. Start with linear and add order until the trends are matched.
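One point worth demonstrating: on the fitted points themselves, the residual never increases with order, which is why a small residual alone cannot reveal overfitting. A quick illustration with synthetic noisy sine data (made up for this sketch):

```python
import numpy as np

# Synthetic data: a sine trend plus a little noise (illustrative only)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 12)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

def sse(degree):
    """Sum of squared errors of the least-squares polynomial of given degree."""
    c = np.polyfit(x, y, degree)
    return float(np.sum((np.polyval(c, x) - y) ** 2))

# Nested model classes: each higher order can reproduce every lower-order fit,
# so the residual on the fitted data is non-increasing in the order.
errors = {d: sse(d) for d in (1, 3, 7)}
```

Judging the order therefore needs more than the residual: visual inspection for ‘squiggles’, or validation on points held out of the fit.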

Linear Regression Analysis
Linear curve fitting
Polynomial curve fitting
Power-law curve fitting: y = ax^b, so ln(y) = ln(a) + b·ln(x)
Exponential curve fitting: y = ae^(bx), so ln(y) = ln(a) + bx
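The power-law case reduces to a straight-line fit in log-log coordinates. A minimal sketch in Python with NumPy (the data values are made up for illustration):

```python
import numpy as np

def fit_power_law(x, y):
    """Fit y = a * x**b by linear least squares on ln(y) = ln(a) + b*ln(x)."""
    lnx, lny = np.log(x), np.log(y)
    b, lna = np.polyfit(lnx, lny, 1)  # straight-line fit: slope b, intercept ln(a)
    return np.exp(lna), b

x = np.array([1.0, 2.0, 4.0, 8.0])
y = 3.0 * x ** 1.5          # exact power law, so the fit recovers a=3, b=1.5
a, b = fit_power_law(x, y)
```

The exponential case works the same way, fitting ln(y) against x instead of ln(x). Note that linearizing minimizes the squared errors of the logarithms, not of y itself, so the result can differ slightly from a direct nonlinear fit.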

Goodness of fit and the correlation coefficient: a measure of how good the regression line is as a representation of the data. It is possible to fit two lines to the data: (a) treating x as the independent variable and y as the dependent variable, y = ax + b; or (b) treating y as the independent variable and x as the dependent variable, described by a relation of the form x = a'y + b'. The procedure followed earlier can be followed again to find the best values of a' and b'.

Put these into matrix form:

[ n    Σyi  ] [ b' ]   [ Σxi   ]
[ Σyi  Σyi² ] [ a' ] = [ Σxiyi ]

Recast the second fit line x = a'y + b' as y = x/a' − b'/a'. The slope of this second line, 1/a', is not the same as the slope a of the first line.

The ratio of the slopes of the two lines, a/(1/a') = a·a', is a measure of how good the form of the fit is to the data. In view of this, the correlation coefficient ρ is defined through the relation ρ² = a·a', with the sign of ρ taken from the sign of the slope.

Correlation Coefficient
The sign of the correlation coefficient is determined by the sign of the covariance: it is negative if the regression line has a negative slope and positive if the regression line has a positive slope.
The correlation is said to be perfect if ρ = ±1 and poor if ρ ≈ 0. The absolute value of the correlation coefficient should be greater than 0.5 to indicate that y and x are related.
In the case of a non-linear fit, a quantity known as the index of correlation is defined to determine the goodness of the fit. The fit is termed good if the variance of the deviates is much less than the variance of the y's; the index of correlation should be close to ±1 for the fit to be considered good.
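The slope-product definition ρ² = a·a' agrees with the usual Pearson correlation coefficient. A minimal check in Python with NumPy (the synthetic data is made up for illustration):

```python
import numpy as np

def correlation_from_slopes(x, y):
    """rho from the two regression slopes: rho**2 = a * a', sign from the slope."""
    a, _ = np.polyfit(x, y, 1)    # slope of y = a*x + b
    a_p, _ = np.polyfit(y, x, 1)  # slope of x = a'*y + b'
    return np.sign(a) * np.sqrt(a * a_p)

# Synthetic linearly related data with noise (illustrative only)
rng = np.random.default_rng(1)
x = rng.standard_normal(50)
y = 2.0 * x + 0.5 * rng.standard_normal(50)

rho = correlation_from_slopes(x, y)
r = np.corrcoef(x, y)[0, 1]  # Pearson correlation, for comparison
```

Since a = Sxy/Sxx and a' = Sxy/Syy, their product is Sxy²/(Sxx·Syy), which is exactly the square of the Pearson coefficient.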

[Example fits with r² = 1.000, 0.991, 0.904, 0.821, 0.493 and 0.0526, from a perfect fit to a very poor one.]

Multi-Variable Regression Analysis. The cases considered so far involved one independent variable and one dependent variable. Sometimes the dependent variable may be a function of more than one variable. For example, a relation expressing the mass flow rate as a product of powers of several variables is a common type of relationship for flow through an orifice or Venturi: the mass flow rate is the dependent variable and the others are independent variables.

Set up a mathematical model as y = a·l^b·m^c·n^d·o^e·p^f. Taking logarithms of both sides: ln(y) = ln(a) + b·ln(l) + c·ln(m) + d·ln(n) + e·ln(o) + f·ln(p), where y is the dependent variable, l, m, n, o and p are the independent variables, and a, b, c, d, e, f are the fit parameters.

The least squares method may be used to determine the fit parameters. Let the data be available for N sets of values of y, l, m, n, o, p. The quantity to be minimized is S = Σi [ln(yi) − ln(a) − b·ln(li) − c·ln(mi) − d·ln(ni) − e·ln(oi) − f·ln(pi)]². What is the permissible value of N? With six fit parameters, N ≥ 6 is required.

The normal linear equations are obtained by the usual process of setting the first partial derivatives with respect to the fit parameters to zero.

These equations are solved simultaneously to get the six fit parameters. We may also calculate the index of correlation as an indicator of the quality of the fit. This calculation is left to you!
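The simultaneous solution above is a linear least squares problem in the logs of the variables. A minimal sketch in Python with NumPy, shown for two independent variables to keep it short (the data values are made up for illustration):

```python
import numpy as np

def fit_loglinear(y, *predictors):
    """Fit y = a * l**b * m**c * ... by linear least squares on the logarithms."""
    # Design matrix: a column of ones (for ln a) plus ln of each predictor
    X = np.column_stack([np.ones(len(y))] + [np.log(v) for v in predictors])
    beta, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
    return np.exp(beta[0]), beta[1:]  # coefficient a, then the exponents

# Synthetic noiseless data obeying y = 2 * l**0.5 * m**(-1) (illustrative only)
rng = np.random.default_rng(2)
l = rng.uniform(1.0, 5.0, 20)
m = rng.uniform(1.0, 5.0, 20)
y = 2.0 * l ** 0.5 * m ** -1.0

a, exps = fit_loglinear(y, l, m)
```

Extending to the five-variable model of the slides just means passing five predictor arrays; N = 20 data sets here comfortably exceeds the number of fit parameters.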