Numerical Analysis – Data Fitting
Hanyang University, Jong-Il Park

Department of Computer Science and Engineering, Hanyang University

Fitting
- Exact fit: interpolation, extrapolation
- Approximate fit: allows some errors; optimality depends on the noise model
[Figure: scattered data points and an approximate fit to a line, with per-point error e]

E.g. Interpolation vs. Data Fitting

Regression errors (I): zero-order model vs. first-order model

Regression errors (II): small errors vs. large errors

Least-Square Data Fitting
Problem statement: given N data points (x_i, y_i), i = 1, ..., N, and a model y(x; a_1, ..., a_M) with M adjustable parameters, find the parameter values that minimize
  S = Σ_{i=1..N} [y_i − y(x_i; a_1, ..., a_M)]²
※ Maximum Likelihood Estimation: ML ≡ least squares if each error e_i is independently Gaussian distributed.

Fitting data to a straight line
Model: y = a + b x; error e_i = y_i − (a + b x_i)
Sum of squared errors: S(a, b) = Σ_i (y_i − a − b x_i)²
At the minimum, ∂S/∂a = 0 and ∂S/∂b = 0, which gives the normal equations
  N a + (Σ x_i) b = Σ y_i
  (Σ x_i) a + (Σ x_i²) b = Σ x_i y_i
whose solution is the optimum (a*, b*).
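The two normal equations above can be solved in closed form by Cramer's rule. A minimal sketch; the data values are illustrative, not from the course:

```python
# Least-squares straight-line fit y = a + b*x via the normal equations.
def fit_line(x, y):
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    # Solve  [n  sx; sx sxx][a; b] = [sy; sxy]  by Cramer's rule.
    det = n * sxx - sx * sx
    a = (sy * sxx - sx * sxy) / det
    b = (n * sxy - sx * sy) / det
    return a, b

x = [0.0, 1.0, 2.0, 3.0]
y = [1.0, 3.0, 5.0, 7.0]   # lies exactly on y = 1 + 2x
a, b = fit_line(x, y)
print(a, b)  # → 1.0 2.0
```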

Data fitting to a polynomial (I)
Model: y = c_0 + c_1 x + c_2 x² + ... + c_n x^n (linear in the coefficients c_k)
At the minimum of S = Σ_i (y_i − Σ_k c_k x_i^k)², setting ∂S/∂c_k = 0 for k = 0, ..., n gives (n+1) simultaneous linear equations in the (n+1) unknowns c_0, ..., c_n.

Data fitting to a polynomial (II)
Rewriting the equations in matrix form, A c = b, where A_{jk} = Σ_i x_i^{j+k} and b_j = Σ_i y_i x_i^j. This is a linear equation system, solvable by standard methods such as Gaussian elimination.
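A sketch of forming and solving those normal equations with NumPy. The data are illustrative; in practice an SVD/QR-based solver (e.g. numpy.polyfit) is preferable, since A = VᵀV can be badly conditioned:

```python
# Polynomial least-squares fit by forming the normal equations A c = b.
import numpy as np

def polyfit_normal(x, y, degree):
    x, y = np.asarray(x, float), np.asarray(y, float)
    V = np.vander(x, degree + 1, increasing=True)  # columns: 1, x, x^2, ...
    A = V.T @ V          # A[j, k] = sum_i x_i^(j+k)
    b = V.T @ y          # b[j]   = sum_i y_i * x_i^j
    return np.linalg.solve(A, b)

x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.0, 2.0, 5.0, 10.0, 17.0]   # lies exactly on y = 1 + x^2
c = polyfit_normal(x, y, 2)
print(c)   # ≈ [1, 0, 1]
```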

General linear least-square (I)
Model: y(x) = Σ_{k=1..M} a_k X_k(x), where the basis functions X_k may be nonlinear in x; "linear" refers to the parameters a_k.
Error: e_i = y_i − Σ_k a_k X_k(x_i); sum of squared errors S = Σ_i e_i²
At the minimum, ∂S/∂a_k = 0 for all k.

General linear least-square (II)
Matrix form: S = ‖y − A a‖², where the design matrix has entries A_{ik} = X_k(x_i). Since ∇S = −2 Aᵀ(y − A a) = 0 at the minimum, we obtain the normal equations
  Aᵀ A a = Aᵀ y,  i.e.  a = (Aᵀ A)⁻¹ Aᵀ y
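The same result for arbitrary basis functions, using NumPy's SVD-based least-squares solver rather than inverting AᵀA. The basis choice (1, sin x, cos x) and the data are illustrative assumptions:

```python
# General linear least squares: y(x) = sum_k a_k * X_k(x).
import numpy as np

def general_lls(x, y, basis):
    A = np.column_stack([f(x) for f in basis])  # A[i, k] = X_k(x_i)
    a, *_ = np.linalg.lstsq(A, y, rcond=None)   # minimizes ||y - A a||^2
    return a

x = np.linspace(0.0, 2.0 * np.pi, 50)
y = 2.0 + 3.0 * np.sin(x) - 1.0 * np.cos(x)     # true a = (2, 3, -1)
basis = [lambda t: np.ones_like(t), np.sin, np.cos]
a = general_lls(x, y, basis)
print(a)   # ≈ [2, 3, -1]
```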

Chi-Square Fitting
If each data point (x_i, y_i) has its own known standard deviation σ_i, we can formulate a weighted least-square problem:
  χ² = Σ_{i=1..N} [(y_i − y(x_i; a)) / σ_i]²
called the "chi-square". Points with large σ_i are weighted less; points with small σ_i are weighted more.

E.g. Chi-square fitting to a line y(x) = a + b x
At the minimum, ∂χ²/∂a = 0 and ∂χ²/∂b = 0. Defining
  S = Σ 1/σ_i²,  S_x = Σ x_i/σ_i²,  S_y = Σ y_i/σ_i²,  S_xx = Σ x_i²/σ_i²,  S_xy = Σ x_i y_i/σ_i²,  Δ = S S_xx − S_x²
then we obtain
  a = (S_xx S_y − S_x S_xy) / Δ,  b = (S S_xy − S_x S_y) / Δ
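A direct transcription of those closed-form sums. The data and the σ_i values are illustrative assumptions:

```python
# Chi-square (weighted) straight-line fit using the sums S, Sx, Sy, Sxx, Sxy.
def chi2_line_fit(x, y, sigma):
    w = [1.0 / s ** 2 for s in sigma]          # per-point weights 1/sigma^2
    S   = sum(w)
    Sx  = sum(wi * xi for wi, xi in zip(w, x))
    Sy  = sum(wi * yi for wi, yi in zip(w, y))
    Sxx = sum(wi * xi * xi for wi, xi in zip(w, x))
    Sxy = sum(wi * xi * yi for wi, xi, yi in zip(w, x, y))
    delta = S * Sxx - Sx * Sx
    a = (Sxx * Sy - Sx * Sxy) / delta
    b = (S * Sxy - Sx * Sy) / delta
    return a, b

x = [0.0, 1.0, 2.0, 3.0]
y = [0.5, 2.5, 4.5, 6.5]          # lies exactly on y = 0.5 + 2x
sigma = [0.1, 0.2, 0.1, 0.3]      # per-point standard deviations
a, b = chi2_line_fit(x, y, sigma)
print(a, b)   # ≈ 0.5 2.0 (exact data, so the weights do not change the fit)
```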

Multi-Dimensional Fit
Model: y = y(x; a) with vector input x; error e_i = y_i − y(x_i; a)
Similarly to the derivation for the 1D fits, we get the same normal equations. The only difference is that the dimension of x is generalized to an arbitrary dimension.

Nonlinear Models
Model: y = y(x; a), a nonlinear function of the parameters a (e.g. y = a_1 e^{a_2 x})
Problem: given data (x_i, y_i), find the least-square solution for a.

Nonlinear fitting: easy case
Some nonlinear functions can be transformed into a linear form, e.g. y = a e^{bx} → ln y = ln a + b x, which is linear in (ln a, b).
Simple, BUT not an optimum fitting: least squares on the transformed data minimizes the transformed errors, not the original ones.

Nonlinear fitting by linearization
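A sketch of the linearization approach: fit y = a e^{bx} by an ordinary linear fit of ln y against x. The data are illustrative, and the method assumes all y_i > 0:

```python
# Fit y = a * exp(b*x) by linearization: ln y = ln a + b*x.
import math

def exp_fit_linearized(x, y):
    ly = [math.log(yi) for yi in y]    # requires y_i > 0
    n = len(x)
    sx, sy = sum(x), sum(ly)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * li for xi, li in zip(x, ly))
    det = n * sxx - sx * sx
    lna = (sy * sxx - sx * sxy) / det  # intercept = ln a
    b = (n * sxy - sx * sy) / det      # slope = b
    return math.exp(lna), b

x = [0.0, 1.0, 2.0, 3.0]
y = [2.0 * math.exp(0.5 * xi) for xi in x]   # true a = 2, b = 0.5
a, b = exp_fit_linearized(x, y)
print(a, b)   # ≈ 2.0 0.5
```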

Nonlinear Least-Square Fitting
Model: y = y(x; a)
Problem: minimize the cost function
  χ²(a) = Σ_{i=1..N} [(y_i − y(x_i; a)) / σ_i]²
Since χ² is nonlinear in a, the minimization must proceed iteratively: a_next = a_cur + δa, repeated until χ² stops decreasing.

Levenberg-Marquardt Method (I)
Some insight: write the update term δa in terms of the gradient ∇χ² and the Hessian matrix D of χ².
- Inverse-Hessian method: δa = −D⁻¹ ∇χ²; good near the minimum, where χ² is approximately quadratic.
- Steepest-descent method: δa = −c ∇χ² for a small constant c; good far from the minimum.

Levenberg-Marquardt Method (II)
Idea: start with the steepest-descent method (far from the minimum) and end with the inverse-Hessian method (near the minimum), gradually changing between the two via a single parameter λ: large λ → steepest descent, small λ → inverse Hessian.

Levenberg-Marquardt Method (III)
Algorithm:
i) Guess an initial parameter set a.
ii) Compute χ²(a).
iii) Pick a modest value for λ, say λ = 0.001.
iv) Solve the augmented normal equations
  (α + λ diag(α)) δa = β   … ①
for δa, where α is the (approximate) Hessian and β = −½ ∇χ², then evaluate χ²(a + δa).
v) If χ²(a + δa) ≥ χ²(a), increase λ by a factor of 10 and go to iv); else decrease λ by a factor of 10, update a ← a + δa, and go to iv).
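A minimal sketch of steps i)-v), using the Gauss-Newton approximation α ≈ JᵀJ (see the Hessian slide below for why the second-derivative term is dropped). The model y = a_1 e^{a_2 x}, unit σ_i, and the stopping rule are illustrative assumptions:

```python
# Levenberg-Marquardt sketch for fitting y = a1 * exp(a2 * x), unit sigmas.
import numpy as np

def model(x, a):
    return a[0] * np.exp(a[1] * x)

def jacobian(x, a):
    e = np.exp(a[1] * x)
    return np.column_stack([e, a[0] * x * e])   # dy/da1, dy/da2

def chi2(x, y, a):
    r = y - model(x, a)
    return float(r @ r)

def lm_fit(x, y, a0, lam=1e-3, iters=100):
    a = np.asarray(a0, float)
    c = chi2(x, y, a)
    for _ in range(iters):
        J = jacobian(x, a)
        r = y - model(x, a)
        alpha = J.T @ J                 # Gauss-Newton approx. of the Hessian
        beta = J.T @ r                  # = -(1/2) grad chi^2
        A = alpha + lam * np.diag(np.diag(alpha))   # augmented system, eq. (1)
        da = np.linalg.solve(A, beta)
        c_new = chi2(x, y, a + da)
        if c_new >= c:
            lam *= 10.0                 # reject step: more gradient-like
        else:
            lam /= 10.0                 # accept step: more Newton-like
            a, c = a + da, c_new
        if c < 1e-20:
            break
    return a

x = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(0.7 * x)       # noise-free data, true a = (2, 0.7)
a = lm_fit(x, y, a0=[1.0, 0.0])
print(a)   # ≈ [2.0, 0.7]
```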

Levenberg-Marquardt Method (IV)
※ For large λ, the matrix (α + λ diag(α)) in eq. ① is diagonally dominant, so eq. ① is close to a (small) steepest-descent step; as λ decreases, eq. ① approaches the inverse-Hessian update.

Calculation of the gradient
Gradient of χ²:
  ∂χ²/∂a_k = −2 Σ_{i=1..N} [(y_i − y(x_i; a)) / σ_i²] ∂y(x_i; a)/∂a_k

Calculation of the Hessian
  ∂²χ²/∂a_k ∂a_l = 2 Σ_{i=1..N} (1/σ_i²) [ (∂y_i/∂a_k)(∂y_i/∂a_l) − (y_i − y(x_i; a)) ∂²y_i/∂a_k ∂a_l ]
The second-derivative term is usually ignored: it is multiplied by the residual, which is small near a good fit and sensitive to noise. The resulting approximate Hessian (sums of products of first derivatives) is symmetric.

Homework #8: Programming [Due: Dec. 3]
Data files are given on the course homepage (fitdata1.dat, fitdata2.dat, fitdata3.dat).