Non-linear Least-Squares
Mariolino De Cecco, Nicolò Biasi
Why non-linear?
- The model is non-linear (e.g. joints, position, ...)
- The error function is non-linear
Setup
Let $f$ be a function such that $b = f(a, x)$, where $x$ is a vector of parameters. Let $\{a_k, b_k\}$ be a set of measurements/constraints. We fit $f$ to the data by solving:
$$\min_x \sum_k \big( f(a_k, x) - b_k \big)^2$$
Example
In our case:
- $x$ is the set of superquadric parameters: $\epsilon_1, \epsilon_2$ (shape) and $a_1, a_2, a_3$ (size)
- $a$ is a 3D point
- $f$ is the superquadric in implicit form
The residuals are: $r_k = f(a_k, x) - 1$
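As an illustration, here is a minimal NumPy sketch of the inside-outside function of an axis-aligned, origin-centred superquadric in Barr's standard form; the parameter layout `x = (e1, e2, a1, a2, a3)` and all function names are our assumptions, not the authors' code:

```python
import numpy as np

def superquadric_implicit(points, x):
    """Inside-outside function of an axis-aligned superquadric centred at
    the origin. Assumed parameter layout: x = (e1, e2, a1, a2, a3)."""
    e1, e2, a1, a2, a3 = x
    px = np.abs(points[:, 0]) / a1
    py = np.abs(points[:, 1]) / a2
    pz = np.abs(points[:, 2]) / a3
    xy = px ** (2.0 / e2) + py ** (2.0 / e2)
    return xy ** (e2 / e1) + pz ** (2.0 / e1)

def residuals(x, points):
    # r_k = f(a_k, x) - 1: zero when the point lies exactly on the surface
    return superquadric_implicit(points, x) - 1.0
```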
Overview
- Existence and uniqueness of a minimum
- Steepest descent
- Newton's method
- Gauss-Newton method
- Levenberg-Marquardt method
A non-linear function: the Rosenbrock function
$$f(x, y) = (1 - x)^2 + 100\,(y - x^2)^2$$
Global minimum at $(1, 1)$.
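For reference, a short sketch of the Rosenbrock function and its standard least-squares decomposition, which the Gauss-Newton and Levenberg-Marquardt sketches later reuse (the function names are ours):

```python
import numpy as np

def rosenbrock(p):
    x, y = p
    return (1.0 - x) ** 2 + 100.0 * (y - x ** 2) ** 2

def rosenbrock_residuals(p):
    # Least-squares form: f = r1^2 + r2^2 with these residuals
    x, y = p
    return np.array([1.0 - x, 10.0 * (y - x ** 2)])

def rosenbrock_jacobian(p):
    # Jacobian of the residual vector with respect to (x, y)
    x, _ = p
    return np.array([[-1.0,       0.0],
                     [-20.0 * x, 10.0]])
```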
Existence of minimum
A local minimum $x^*$ is characterized by:
$$\nabla f(x^*) = 0, \qquad \nabla^2 f(x^*) \ \text{positive semi-definite}$$
Descent algorithm
- Start at an initial position $x_0$
- Until convergence:
  - Find a minimizing step $\delta x_k$ using a local approximation of $f$
  - $x_{k+1} = x_k + \delta x_k$
- This produces a sequence $x_0, x_1, \ldots, x_n$ such that $f(x_0) > f(x_1) > \ldots > f(x_n)$
A generic version of this loop is sketched below.
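A minimal sketch of the descent loop in Python; the step rule is left abstract so the later methods can plug in (all names and the convergence test are illustrative):

```python
import numpy as np

def descent(f, step, x0, tol=1e-8, max_iter=1000):
    """Generic descent loop. `step` is any rule returning the step dx_k
    (steepest descent, Newton, Gauss-Newton, ...)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = step(x)
        if np.linalg.norm(dx) < tol:   # convergence: step has become tiny
            break
        x = x + dx                     # x_{k+1} = x_k + dx_k
    return x
```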
Approximation using Taylor series
This is the general formulation:
$$f(x + \delta x) \approx f(x) + \nabla f(x)^T \delta x + \tfrac{1}{2}\, \delta x^T\, \nabla^2 f(x)\, \delta x$$
Approximation using Taylor series
But in the case of a least-squares problem, $f(x) = \sum_k r_k(x)^2$, the gradient and Hessian have special structure:
$$\nabla f(x) = 2\, J(x)^T r(x), \qquad \nabla^2 f(x) = 2\, J(x)^T J(x) + 2 \sum_k r_k(x)\, \nabla^2 r_k(x)$$
where $J$ is the Jacobian of the residual vector $r = (r_1, \ldots, r_m)$.
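As a sanity check on the formulas above, the sketch below compares the analytic gradient $2 J^T r$ with central finite differences, reusing the Rosenbrock helpers defined earlier (all names are ours):

```python
import numpy as np

def numeric_gradient(f, x, h=1e-6):
    # Central finite differences, only for verifying the analytic formula
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

p = np.array([-1.2, 1.0])               # classic Rosenbrock start point
r = rosenbrock_residuals(p)
J = rosenbrock_jacobian(p)
print(2.0 * J.T @ r)                    # analytic: grad f = 2 J^T r
print(numeric_gradient(rosenbrock, p))  # should match closely
```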
Steepest descent
Step: $\delta x_k = -\alpha_k\, \nabla f(x_k)$, where $\alpha_k$ is chosen such that
$$\alpha_k = \arg\min_{\alpha > 0} f\big(x_k - \alpha\, \nabla f(x_k)\big)$$
using a line search algorithm.
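A sketch of steepest descent with a backtracking (Armijo) line search; backtracking is our stand-in for the line-search algorithm, which the slides do not specify:

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-8, max_iter=10000):
    """Steepest descent with a simple backtracking (Armijo) line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            return x
        alpha = 1.0
        # shrink alpha until a sufficient decrease is obtained
        while f(x - alpha * g) > f(x) - 1e-4 * alpha * (g @ g):
            alpha *= 0.5
        x = x - alpha * g
    return x
```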
[Figure: the error function in the plane of the steepest-descent direction]
[Figure: steepest descent on the Rosenbrock function, 1000 iterations]
Newton's method
Determine the step $\delta x_k$ by a second-order approximation:
$$N(\delta x) = f(x_k) + \nabla f(x_k)^T \delta x + \tfrac{1}{2}\, \delta x^T\, \nabla^2 f(x_k)\, \delta x$$
At the minimum of $N$, $\nabla N(\delta x) = 0$, giving:
$$\nabla^2 f(x_k)\, \delta x_k = -\nabla f(x_k)$$
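A minimal Newton step in Python (names are ours); it solves the linear system above rather than inverting the Hessian:

```python
import numpy as np

def newton_step(grad, hess, x):
    # Solve  H(x) dx = -grad f(x)  instead of forming the inverse
    return np.linalg.solve(hess(x), -grad(x))
```

This plugs directly into the generic descent loop from before, e.g. `descent(rosenbrock, lambda x: newton_step(grad, hess, x), x0)`, assuming `grad` and `hess` functions are available.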
Problem
If $\nabla^2 f$ is not positive semi-definite, then $\delta x = -(\nabla^2 f)^{-1} \nabla f$ is not necessarily a descent direction: the step can increase the error function.
Remedy: use a positive semi-definite approximation of the Hessian based on the Jacobian (quasi-Newton methods).
Gauss-Newton method
Step: use the approximate Hessian $\nabla^2 f \approx 2\, J^T J$ (dropping the second-order term $2 \sum_k r_k \nabla^2 r_k$) and solve:
$$J^T J\, \delta x_k = -J^T r$$
Advantages:
- No second-order derivatives are needed
- $J^T J$ is positive semi-definite
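A minimal Gauss-Newton loop under the approximation above, reusing the Rosenbrock residuals and Jacobian defined earlier (the loop structure is our sketch, not the authors' code):

```python
import numpy as np

def gauss_newton(res, jac, x0, tol=1e-10, max_iter=100):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        r, J = res(x), jac(x)
        dx = np.linalg.solve(J.T @ J, -J.T @ r)  # normal equations
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

print(gauss_newton(rosenbrock_residuals, rosenbrock_jacobian, [-1.2, 1.0]))
# -> approximately [1. 1.]
```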
[Figure: Gauss-Newton on the Rosenbrock function, 48 evaluations]
Problem with Gauss-Newton
When the function is locally highly non-linear, the Gauss-Newton step (based on the linearized residuals) can overshoot and fail to converge.
Levenberg-Marquardt algorithm
Blends steepest descent and Gauss-Newton. At each step, solve for the descent direction:
$$\big( J^T J + \mu I \big)\, \delta x_k = -J^T r$$
For large $\mu$ the step approaches (scaled) steepest descent; for small $\mu$ it approaches Gauss-Newton.
Managing the damping parameter $\mu$
General approach:
- If the step fails, increase the damping until a step is successful
- If the step succeeds, decrease the damping to take a larger step
An improved damping strategy follows.
Improved damping
Replace $\mu I$ with $\mu\, \mathrm{diag}(J^T J)$:
$$\big( J^T J + \mu\, \mathrm{diag}(J^T J) \big)\, \delta x_k = -J^T r$$
This means that in the steepest-descent phase, if a direction has a low second-order derivative, the step along that direction is increased.
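A sketch of the full Levenberg-Marquardt loop with the diagonal scaling and the increase/decrease schedule from the previous slides, again reusing the Rosenbrock helpers; the specific factors (0.5 and 10) are common choices, not taken from the slides:

```python
import numpy as np

def levenberg_marquardt(res, jac, x0, mu=1e-3, tol=1e-10, max_iter=200):
    """Levenberg-Marquardt with Marquardt's diagonal scaling and a
    simple success/failure damping schedule."""
    x = np.asarray(x0, dtype=float)
    r = res(x)
    cost = r @ r
    for _ in range(max_iter):
        J = jac(x)
        A = J.T @ J
        dx = np.linalg.solve(A + mu * np.diag(np.diag(A)), -J.T @ r)
        if np.linalg.norm(dx) < tol:
            break
        r_new = res(x + dx)
        cost_new = r_new @ r_new
        if cost_new < cost:   # step succeeded: accept it, reduce damping
            x, r, cost = x + dx, r_new, cost_new
            mu *= 0.5
        else:                 # step failed: reject it, increase damping
            mu *= 10.0
    return x

print(levenberg_marquardt(rosenbrock_residuals, rosenbrock_jacobian, [-1.2, 1.0]))
```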
[Figure: Levenberg-Marquardt on the Rosenbrock function, 90 evaluations]