Lecture 5 Polynomial Approximation

Presentation transcript:

Lecture 5: Polynomial Approximation

- Polynomial interpolation: example and limitations
- Polynomial approximation (fitting): line fitting, quadratic curve fitting, polynomial fitting

數值方法 (Numerical Methods)

Polynomials

Construct the polynomial with zeros [-4, 0, 3] and plot it:

P = poly([-4 0 3]);
x = linspace(-5,5);
y = polyval(P,x);
plot(x,y);

Sampling of paired data

Produce the polynomial with zeros [-4, 0, 3], sample it at random points, and plot the paired data:

P = poly([-4 0 3]);
x = rand(1,6)*10-5;
y = polyval(P,x);
plot(x,y,'ro');

Polynomial interpolation

Produce the polynomial with zeros [-4, 0, 3] and sample it at n random points:

P = poly([-4 0 3]);
n = 6;
x = rand(1,n)*10-5;
y = polyval(P,x);
plot(x,y,'ro');

Plot the interpolating polynomial:

v = linspace(min(x),max(x));
z = int_poly(v,x,y);
plot(v,z,'k');
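int_poly is the course's own interpolation helper, whose source is not reproduced in this transcript. As a runnable cross-check of the same experiment, here is a NumPy sketch in which polyfit with degree n-1 plays the role of the interpolating polynomial (the seed and sampling are illustrative choices, not from the slides):

```python
import numpy as np

# Polynomial with zeros -4, 0, 3 (NumPy analogue of MATLAB's poly([-4 0 3]))
P = np.poly([-4, 0, 3])          # coefficients of x^3 + x^2 - 12x

rng = np.random.default_rng(0)
n = 6
x = rng.uniform(-5, 5, n)        # n sample points in [-5, 5]
y = np.polyval(P, x)

# Interpolating polynomial of degree n-1 through all n points
p = np.polyfit(x, y, n - 1)

# Degree-3 data interpolated by a degree-5 polynomial: the two leading
# coefficients vanish (up to round-off) and the cubic is recovered
print(np.round(p, 6))
```

Since the data are noise-free samples of a cubic, the interpolant reproduces the target polynomial exactly.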

demo_ip: source code

Runs of demo_ip with m=3 and increasing data size (the plots did not survive the transcript):

>> demo_ip, data size n=5
>> demo_ip, data size n=6
>> demo_ip, data size n=10
>> demo_ip, data size n=12

Sampling with noise

Produce the polynomial with zeros [-4, 0, 3], sample it, and add uniform noise in [-0.25, 0.25]:

P = poly([-4 0 3]); n = 6;
x = rand(1,n)*10-5;
y = polyval(P,x);
nois = rand(1,n)*0.5-0.25;
plot(x,y+nois,'ro');

Plot the polynomial interpolating the noisy data:

v = linspace(min(x),max(x));
z = int_poly(v,x,y+nois);
plot(v,z,'k');

demo_ip_noisy: source code

Noisy data

Noise ratio (mean of abs(noise)/abs(signal)): noise_ratio = 0.0074

Numerical results

Even small noise makes the interpolation error intolerable:

>> demo_ip_noisy
data size n: 12
noise_ratio = 0.0053

Numerical results

Even small noise makes the interpolation error intolerable:

>> demo_ip_noisy
data size n: 13
noise_ratio = 0.0031

Interpolation vs. approximation

- An interpolating polynomial is required to satisfy every constraint posed by the paired data.
- An interpolating polynomial therefore cannot recover the original target function when the paired data are noisy.
- For noisy paired data, the goal is revised from interpolation to polynomial fitting: minimize the mean square approximation error.

Polynomial approximation

Given paired data (xi, yi), i = 1,…,n, the approximating polynomial f is required to minimize the mean square error of approximating yi by f(xi).
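The error formula itself did not survive the transcript; in the notation above, the standard mean square error being minimized is:

```latex
E = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - f(x_i)\bigr)^2
```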

Polynomial fitting: fa1d_polyfit.m

Line fitting

Target: minimize the mean square approximation error.

Line fitting

>> fa1d_polyfit
input a function: x.^2+cos(x) :x+5
keyin sample size: 300
polynomial degree: 1
E = 8.3252e-004

Red: approximating polynomial. True line: a=1, b=5.

Line fitting

>> fa1d_polyfit
input a function: x.^2+cos(x) :3*x+1/2
keyin sample size: 30
polynomial degree: 1
E = 0.0010

Red: approximating polynomial. True line: a=3, b=1/2.

Objective function I

Line fitting target:
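The target formula is missing from the transcript; for a line f(x) = ax + b it is the standard objective:

```latex
E_1(a,b) = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - (a x_i + b)\bigr)^2
```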

E1 is a quadratic function of a and b. Setting the derivatives of E1 with respect to a and b to zero leads to a linear system.

A linear system:
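The equations appeared as slide images and are missing from the transcript; setting the derivatives of E1 with respect to a and b to zero gives the standard 2x2 normal equations (a reconstruction of the standard result):

```latex
\begin{pmatrix} \sum_{i} x_i^2 & \sum_{i} x_i \\ \sum_{i} x_i & n \end{pmatrix}
\begin{pmatrix} a \\ b \end{pmatrix}
=
\begin{pmatrix} \sum_{i} x_i y_i \\ \sum_{i} y_i \end{pmatrix}
```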

Objective function II

Quadratic polynomial fitting target:
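The target formula is missing from the transcript; for a quadratic f(x) = ax^2 + bx + c it is the standard objective:

```latex
E_2(a,b,c) = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - (a x_i^2 + b x_i + c)\bigr)^2
```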

E2 is a quadratic function of a, b, and c. Setting the derivatives of E2 to zero leads to a linear system.

A linear system:
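The equations appeared as slide images and are missing from the transcript; setting the derivatives of E2 with respect to a, b, and c to zero gives the standard 3x3 normal equations (a reconstruction of the standard result):

```latex
\begin{pmatrix}
\sum_{i} x_i^4 & \sum_{i} x_i^3 & \sum_{i} x_i^2 \\
\sum_{i} x_i^3 & \sum_{i} x_i^2 & \sum_{i} x_i \\
\sum_{i} x_i^2 & \sum_{i} x_i   & n
\end{pmatrix}
\begin{pmatrix} a \\ b \\ c \end{pmatrix}
=
\begin{pmatrix} \sum_{i} x_i^2 y_i \\ \sum_{i} x_i y_i \\ \sum_{i} y_i \end{pmatrix}
```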

Quadratic polynomial fitting

Target: minimize the mean square approximation error.

Quadratic polynomial fitting

>> fa1d_polyfit
input a function: x.^2+cos(x) :3*x.^2-2*x+1
keyin sample size: 20
polynomial degree: 2
E = 6.7774e-004

True coefficients: a=3, b=-2, c=1.

Data-driven polynomial approximation

Minimize the mean square error (MSE) with f a polynomial pm whose degree m is less than the data size n.

Special case: polynomial interpolation (m = n-1, where the fitted polynomial passes through every data point).

POLYFIT: fit a polynomial to data

polyfit(x,y,m)
x : input vector (predictors)
y : desired outputs
m : degree of the fitted polynomial

Choose a small m to prevent over-fitting and gain tolerance to noise.

POLYFIT: fit a polynomial to data

Produce the polynomial with zeros [-4, 0, 3] and sample it with noise:

P = poly([-4 0 3]);
n = 20; m = 3;
x = rand(1,n)*10-5;
y = polyval(P,x);
nois = rand(1,n)*0.5-0.25;
plot(x,y+nois,'ro');

Plot the fitted polynomial:

v = linspace(min(x),max(x));
p = polyfit(x,y+nois,m); hold on;
plot(v,polyval(p,v),'k');
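For reference, the same experiment can be sketched in NumPy, whose polyfit mirrors MATLAB's. The deterministic perturbation below stands in for the random noise (an illustrative choice, not from the slides):

```python
import numpy as np

P = np.poly([-4, 0, 3])                  # true cubic: x^3 + x^2 - 12x
n, m = 20, 3
x = np.linspace(-5, 5, n)
noise = 0.25 * np.cos(5 * np.arange(n))  # bounded perturbation in [-0.25, 0.25]
y = np.polyval(P, x) + noise

p = np.polyfit(x, y, m)                  # degree-3 least-squares fit
print(np.round(p, 3))                    # close to the true coefficients [1, 1, -12, 0]
```

With a degree matching the true polynomial, the least-squares fit recovers the original coefficients despite the noise.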

Non-polynomial target: sin

fx = inline('sin(x)');
n = 20; m = 3;
x = rand(1,n)*10-5;
y = fx(x);
nois = rand(1,n)*0.5-0.25;
plot(x,y+nois,'ro'); hold on

Under-fitting, m=3

v = linspace(min(x),max(x));
p = polyfit(x,y+nois,m); hold on;
plot(v,polyval(p,v),'k');

Under-fitting: a low-degree polynomial approximating a non-polynomial target.

Intolerable MSE

>> mean((polyval(p,x)-(y+nois)).^2)
ans = 0.0898

Under-fitting causes an intolerable mean square error.
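The under-fitting effect on the sin target can be quantified by sweeping the polynomial degree. This NumPy sketch (noise omitted for clarity; not part of the original slides) shows the MSE shrinking as the degree grows:

```python
import numpy as np

n = 20
x = np.linspace(-5, 5, n)
y = np.sin(x)                    # non-polynomial target

mse = {}
for m in (1, 3, 5, 7):
    p = np.polyfit(x, y, m)      # degree-m least-squares fit
    mse[m] = np.mean((np.polyval(p, x) - y) ** 2)
print(mse)                       # MSE decreases steadily with the degree
```

Because the fitted spaces are nested, the MSE can only decrease as m grows; it never reaches zero here (for m < n-1) since sin is not a polynomial.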

Under-fitting, m=4

m = 4;
v = linspace(min(x),max(x));
p = polyfit(x,y+nois,m); hold on;
plot(v,polyval(p,v),'b');

Fitting a non-polynomial

>> fa1d_polyfit
input a function: x.^2+cos(x) :sin(x)
keyin sample size: 50
polynomial degree: 5
E = 0.0365

Fitting a non-polynomial

>> fa1d_polyfit
input a function: x.^2+cos(x) :tanh(x+2)+sech(x)
keyin sample size: 30
polynomial degree: 5
E = 0.0097