Data manipulation II: Least Squares Fitting


Data manipulation II: Least Squares Fitting http://mathworld.wolfram.com/LeastSquaresFitting.htm Suppose you have measured a set of data points {xi, yi}, i = 1, 2, …, N, and you know that, when the yi are plotted against the xi, they should lie approximately on a straight line of the form y = a x + b. We wish to find the values of a and b that give the best fit to the data set.

Vertical offset Let f(xi; a, b) = a xi + b, where a and b are the parameters to be determined. Vertical least squares fitting proceeds by minimizing the sum of the squares of the vertical deviations, \(R^2 = \sum_{i=1}^{N} [y_i - f(x_i; a, b)]^2\), over the N data points.

Brute force minimisation The values of a and b at which R2 is minimized are the best-fit values. You can find them using the "brute force method", i.e. minimize R2 by scanning the parameter space of a and b (see brute_force_linearfit.nb), or be smarter and use a more intelligent approach.
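The brute-force scan described above can be sketched in a few lines. The notebook referred to (brute_force_linearfit.nb) is in Mathematica; here is a hedged Python sketch of the same idea, using made-up sample data (roughly y = 2x + 1) in place of the course data file:

```python
# Brute-force fit: scan a grid of (a, b) values and keep the pair that
# minimizes R^2 = sum_i (y_i - (a*x_i + b))^2.
# The data below are hypothetical stand-ins for the course's data set.

def r_squared(a, b, xs, ys):
    """Sum of squared vertical deviations from the line y = a*x + b."""
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

def brute_force_fit(xs, ys, a_range, b_range, steps=201):
    """Scan a steps-by-steps grid over a_range x b_range."""
    best = None
    for i in range(steps):
        a = a_range[0] + i * (a_range[1] - a_range[0]) / (steps - 1)
        for j in range(steps):
            b = b_range[0] + j * (b_range[1] - b_range[0]) / (steps - 1)
            r2 = r_squared(a, b, xs, ys)
            if best is None or r2 < best[0]:
                best = (r2, a, b)
    return best[1], best[2]

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]   # roughly y = 2x + 1
a, b = brute_force_fit(xs, ys, (0, 4), (-2, 2))
```

The accuracy of the answer is limited by the grid spacing, which is exactly why the analytic approach of the following slides is preferable.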

Least Squares Minimisation At the minimum of R2 the partial derivatives with respect to both parameters vanish:

\[\frac{\partial R^2}{\partial a} = -2\sum_{i=1}^{N} x_i\,[y_i - (a x_i + b)] = 0, \qquad \frac{\partial R^2}{\partial b} = -2\sum_{i=1}^{N} [y_i - (a x_i + b)] = 0.\]

In matrix form These two conditions can be rearranged into the normal equations and written in matrix form:

\[\begin{pmatrix} N & \sum_i x_i \\ \sum_i x_i & \sum_i x_i^2 \end{pmatrix} \begin{pmatrix} b \\ a \end{pmatrix} = \begin{pmatrix} \sum_i y_i \\ \sum_i x_i y_i \end{pmatrix} \qquad \text{Eq. (1)}\]

Solving the normal equations gives the closed-form best-fit parameters, with \(\Delta = N\sum_i x_i^2 - \left(\sum_i x_i\right)^2\):

\[a = \frac{N\sum_i x_i y_i - \sum_i x_i \sum_i y_i}{\Delta} \qquad \text{Eq. (2)}\]

\[b = \frac{\sum_i y_i \sum_i x_i^2 - \sum_i x_i \sum_i x_i y_i}{\Delta} \qquad \text{Eq. (3)}\]

Standard errors With the residual variance \(s^2 = \frac{1}{N-2}\sum_i [y_i - (a x_i + b)]^2\), the standard errors of the fitted parameters are

\[\sigma_a = s\sqrt{\frac{N}{\Delta}} \qquad \text{Eq. (4)}\]

\[\sigma_b = s\sqrt{\frac{\sum_i x_i^2}{\Delta}} \qquad \text{Eq. (5)}\]

Exercise Write a Mathematica code to calculate a, b and their standard errors based on Eqs. (2)–(5). Use the data file: http://www2.fizik.usm.my/tlyoon/teaching/ZCE111/1415SEM2/notes/data_for_linear_fit.dat
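The exercise asks for a Mathematica solution; as a cross-check of the formulas themselves, here is a hedged Python sketch of Eqs. (2)–(5), with illustrative sample points standing in for data_for_linear_fit.dat:

```python
import math

# Closed-form least-squares fit y = a*x + b with standard errors,
# following the slope/intercept formulas (Eqs. (2)-(3)) and the
# standard-error formulas (Eqs. (4)-(5)).
# The data points are placeholders for data_for_linear_fit.dat.

def linear_fit(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    delta = n * sxx - sx * sx
    a = (n * sxy - sx * sy) / delta          # slope, Eq. (2)
    b = (sy * sxx - sx * sxy) / delta        # intercept, Eq. (3)
    # residual variance s^2 with N - 2 degrees of freedom
    s2 = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys)) / (n - 2)
    sigma_a = math.sqrt(n * s2 / delta)      # Eq. (4)
    sigma_b = math.sqrt(s2 * sxx / delta)    # Eq. (5)
    return a, b, sigma_a, sigma_b

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]
a, b, sa, sb = linear_fit(xs, ys)
```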

Matrix Manipulation The best values of a and b in the fitting equation can also be obtained by solving the matrix equation, Eq. (1), directly. Develop a Mathematica code to implement the matrix calculation. See least_sq_fit_matrix.nb.
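Since the system in Eq. (1) is only 2×2, it can be solved by Cramer's rule without any linear-algebra library. This is a hedged Python stand-in for the Mathematica matrix solution in least_sq_fit_matrix.nb, again with placeholder data:

```python
# Solving the normal equations in matrix form, Eq. (1):
#   [[N,  Sx ]    [[b],    [[Sy ],
#    [Sx, Sxx]] .  [a]]  =  [Sxy]]
# A 2x2 solve by Cramer's rule stands in for Mathematica's LinearSolve;
# the data points are illustrative placeholders.

def solve2x2(m, v):
    """Solve m . u = v for a 2x2 matrix m by Cramer's rule."""
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    u0 = (v[0] * m[1][1] - m[0][1] * v[1]) / det
    u1 = (m[0][0] * v[1] - v[0] * m[1][0]) / det
    return u0, u1

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.1, 2.9, 5.2, 6.8, 9.1]
n, sx = len(xs), sum(xs)
sxx = sum(x * x for x in xs)
sy = sum(ys)
sxy = sum(x * y for x, y in zip(xs, ys))
b, a = solve2x2([[n, sx], [sx, sxx]], [sy, sxy])
```

Both routes, the closed-form formulas and the matrix solve, must agree; comparing them is a useful sanity check on your code.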

Mathematica's built-in functions for data fitting See Math_built_in_linearfit.nb. Syntax: Take[], FindFit[], LinearModelFit, Normal, "BestFit", "ParameterTable". These are Mathematica's built-in functions for fitting a set of data against a linear formula, such as y = a + b x; they also automatically provide the errors of the best-fit parameters – a very handy way to fit a set of data against any linear formula.

Treating and manipulating real experimental data 2GP1, VARIABLE_G_PENDULUM.PDF. See the file varpendulum.nb.

Gaussian function A Gaussian function has the form

\[g(x) = \frac{1}{s\sqrt{2\pi}}\,\exp\!\left(-\frac{(x-m)^2}{2s^2}\right).\]

It is parametrised by two parameters, m (the average) and s (the width, or square root of the variance).

Download and ListPlot the data file, gaussian.dat.

Interpolation[] These data points can be automatically joined by a smooth curve using the Mathematica built-in function Interpolation. See interpolation_gaussian_data.nb.

NonlinearModelFit Now, how would you ask Mathematica to find the values of s and m that best fit the data points against a Gaussian function? Use NonlinearModelFit. See nonlinearfit_gaussian.nb for the use of this built-in function to fit a set of data points to a non-linear function, such as the Gaussian distribution.
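NonlinearModelFit does the real work in the notebook; a useful independent cross-check, sketched here in Python under the assumption that the data are noiseless samples of a Gaussian curve, is to treat the sampled curve as an unnormalised density and read m and s off its weighted moments (synthetic points stand in for gaussian.dat):

```python
import math

# Moment-based estimate of Gaussian parameters: if y_i ~ g(x_i) for a
# Gaussian g, then the y-weighted mean of the x_i recovers m and the
# y-weighted spread recovers s -- no nonlinear fitter needed.
# The synthetic points below stand in for gaussian.dat.

def gaussian(x, m, s):
    return math.exp(-(x - m) ** 2 / (2 * s ** 2)) / (s * math.sqrt(2 * math.pi))

m_true, s_true = 1.5, 0.8
xs = [m_true + (i - 200) * 0.02 for i in range(401)]   # dense grid, ~±5 sigma
ys = [gaussian(x, m_true, s_true) for x in xs]

w = sum(ys)
m_est = sum(x * y for x, y in zip(xs, ys)) / w                       # weighted mean
s_est = math.sqrt(sum((x - m_est) ** 2 * y for x, y in zip(xs, ys)) / w)  # weighted width
```

For noisy real data this moment trick degrades quickly, which is exactly when a proper nonlinear fit such as NonlinearModelFit earns its keep.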

Lagrange Interpolating Polynomial Given a set of n data points, how do you perform an interpolation, i.e., generate a function that goes through all the points (without using the built-in function Interpolation)? A simple way is to use the Lagrange interpolating polynomial. http://mathworld.wolfram.com/LagrangeInterpolatingPolynomial.html

Explicit form of Lagrange interpolating polynomial The Lagrange interpolating polynomial is the polynomial P(x) of degree at most n − 1 that passes through the n points (x1, y1), (x2, y2), …, (xn, yn):

\[P(x) = \sum_{j=1}^{n} y_j \prod_{\substack{k=1 \\ k \neq j}}^{n} \frac{x - x_k}{x_j - x_k}\]

Exercise: Write a Mathematica code to realise the Lagrange interpolating polynomial, using the sample data http://www2.fizik.usm.my/tlyoon/teaching/ZCE111/1415SEM2/notes/gaussian.dat
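The product formula above translates almost line-for-line into code. The exercise asks for Mathematica; this is a hedged Python sketch of the same construction, with sample points (taken from y = x² + 1) standing in for gaussian.dat:

```python
# Lagrange interpolating polynomial, assembled directly from its
# defining product formula.  The sample points below are placeholders
# for gaussian.dat; any four points of y = x^2 + 1 are reproduced
# exactly, since the unique cubic through them is that quadratic.

def lagrange(points, x):
    """Evaluate the Lagrange polynomial through `points` at x."""
    total = 0.0
    for j, (xj, yj) in enumerate(points):
        term = yj
        for k, (xk, _) in enumerate(points):
            if k != j:
                term *= (x - xk) / (xj - xk)   # basis polynomial factor
        total += term
    return total

pts = [(0.0, 1.0), (1.0, 2.0), (2.0, 5.0), (3.0, 10.0)]  # y = x^2 + 1
```

Note that each basis factor vanishes at every node except its own, which is what guarantees P(xj) = yj.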

Syntax: Product[]. See http://www2.fizik.usm.my/tlyoon/teaching/ZCE111/1415SEM2/notes/Lagrange_polynomial.nb

Polynomial Interpolation A more rigorous method to perform interpolation is by way of polynomial interpolation. http://en.wikipedia.org/wiki/Polynomial_interpolation Given a set of n + 1 data points (xi, yi) where no two xi are the same, one looks for a polynomial p of degree at most n with the property p(xi) = yi, i = 0, 1, 2, …, n.

Polynomial Interpolation Suppose that the interpolation polynomial is in the form \(p(x) = \sum_{k=0}^{n} a_k x^k\). The statement that p interpolates the data points means that p(xi) = yi for all i. Substituting the polynomial into these conditions gives a system of linear equations in the coefficients ak, \(\sum_{k=0}^{n} a_k x_i^k = y_i\), which in matrix-vector form reads \(V \mathbf{a} = \mathbf{y}\), where V is the Vandermonde matrix.

Vandermonde matrix

\[V = \begin{pmatrix} 1 & x_0 & x_0^2 & \cdots & x_0^n \\ 1 & x_1 & x_1^2 & \cdots & x_1^n \\ \vdots & \vdots & \vdots & & \vdots \\ 1 & x_n & x_n^2 & \cdots & x_n^n \end{pmatrix}\]

For derivation of the Vandermonde matrix, see the lecture note by Dmitriy Leykekhman, University of Connecticut.

Exercise Using the sample data, monomial.dat: write a code to construct the Vandermonde matrix; solve the matrix equation for the coefficients ak; obtain the resulting interpolating polynomial; overlay the interpolating polynomial on the raw data to see whether they agree; and compare your interpolating polynomial with the one obtained using Mathematica.
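The exercise calls for Mathematica; as a hedged Python sketch of the same steps, the block below builds the Vandermonde matrix and solves V a = y by Gaussian elimination with partial pivoting, using sample points of y = x² + 1 in place of monomial.dat:

```python
# Vandermonde-matrix interpolation: build V with V[i][k] = x_i**k and
# solve V . a = y for the polynomial coefficients a_k.
# Sample points below stand in for monomial.dat.

def vandermonde(xs):
    n = len(xs)
    return [[x ** k for k in range(n)] for x in xs]

def gauss_solve(m, v):
    """Solve m . a = v by Gaussian elimination with partial pivoting."""
    n = len(v)
    m = [row[:] for row in m]   # work on copies
    v = v[:]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, n):
            f = m[r][col] / m[col][col]
            for c in range(col, n):
                m[r][c] -= f * m[col][c]
            v[r] -= f * v[col]
    a = [0.0] * n
    for r in range(n - 1, -1, -1):   # back substitution
        a[r] = (v[r] - sum(m[r][c] * a[c] for c in range(r + 1, n))) / m[r][r]
    return a

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 2.0, 5.0, 10.0]                 # samples of y = x^2 + 1
coeffs = gauss_solve(vandermonde(xs), ys)  # a_k for p(x) = sum a_k x^k
```

For these points the recovered coefficients should describe x² + 1 itself; in Mathematica, LinearSolve on the same Vandermonde matrix should give the same coefficients. Be warned that Vandermonde matrices become badly conditioned as the number of points grows, which is one reason Lagrange or built-in interpolation is often preferred in practice.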