1
Non-linear Least-Squares
Mariolino De Cecco, Nicolò Biasi
2
Why non-linear? The model is non-linear (e.g. joints, positions, ...)
The error function is non-linear
3
Setup Let f be a function such that $b = f(a, x)$, where $x$ is a vector of parameters.
Let $\{a_k, b_k\}$ be a set of measurements/constraints. We fit f to the data by solving:
$$\min_x \frac{1}{2} \sum_k \big(f(a_k, x) - b_k\big)^2$$
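As a minimal sketch of this setup in Python (assuming NumPy and SciPy; the exponential model and the synthetic data are made-up stand-ins, not the lecture's):

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical model b = f(a, x) = x0 * exp(-x1 * a), for illustration only;
# the lecture's f is a superquadric (next slide).
def residuals(x, a, b):
    return x[0] * np.exp(-x[1] * a) - b   # r_k = f(a_k, x) - b_k

a = np.linspace(0.0, 4.0, 50)
b = 2.5 * np.exp(-1.3 * a) + 0.05 * np.random.default_rng(0).normal(size=a.size)

# Minimizes (1/2) * sum_k r_k^2 over the parameter vector x
fit = least_squares(residuals, x0=np.ones(2), args=(a, b))
print(fit.x)  # close to the true parameters (2.5, 1.3)
```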
4
Example In our case x is the set of shape parameters e1, e2 and size parameters a1, a2, a3 of a superquadric; a is a 3D point; f is the superquadric in implicit form. The residuals are: $r_k = f(a_k, x) - 1$
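A sketch of this residual, assuming the standard superquadric inside-outside function (the slides' exact convention may differ; the absolute values guard against negative coordinates):

```python
import numpy as np

def superquadric_residual(p, x):
    """Residual r = F(p; x) - 1 for one 3D point p = (px, py, pz).

    x = (e1, e2, a1, a2, a3). F is the standard superquadric
    inside-outside function (assumed form):
        F = (|px/a1|^(2/e2) + |py/a2|^(2/e2))^(e2/e1) + |pz/a3|^(2/e1)
    """
    e1, e2, a1, a2, a3 = x
    px, py, pz = p
    F = (np.abs(px / a1) ** (2 / e2) + np.abs(py / a2) ** (2 / e2)) ** (e2 / e1) \
        + np.abs(pz / a3) ** (2 / e1)
    return F - 1.0

# A point on the unit sphere (e1 = e2 = 1, a1 = a2 = a3 = 1) gives r = 0
print(superquadric_residual((1.0, 0.0, 0.0), (1, 1, 1, 1, 1)))  # -> 0.0
```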
5
Overview
Existence and uniqueness of minimum
Steepest descent
Newton's method
Gauss-Newton method
Levenberg-Marquardt method
6
A non-linear function: the Rosenbrock function
$$f(x_1, x_2) = (1 - x_1)^2 + 100\,(x_2 - x_1^2)^2$$
[Figure: surface plot and zoomed view; global minimum at (1, 1)]
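In Python, the function and its analytic gradient (used by the descent methods on the following slides) are:

```python
import numpy as np

def rosenbrock(x):
    """f(x1, x2) = (1 - x1)^2 + 100 (x2 - x1^2)^2, minimum f(1, 1) = 0."""
    return (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2

def rosenbrock_grad(x):
    """Analytic gradient of the Rosenbrock function."""
    return np.array([
        -2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0] ** 2),
        200 * (x[1] - x[0] ** 2),
    ])

print(rosenbrock(np.array([1.0, 1.0])))        # 0.0
print(rosenbrock_grad(np.array([1.0, 1.0])))   # [0. 0.]
```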
7
Existence of minimum A local minimum $x^*$ is characterized by:
$$\nabla f(x^*) = 0, \qquad \nabla^2 f(x^*) \text{ positive semi-definite}$$
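A quick numerical check of these conditions at (1, 1), using the analytic Hessian of the Rosenbrock function (the gradient was shown to vanish there above):

```python
import numpy as np

def rosenbrock_hess(x):
    """Analytic Hessian of the Rosenbrock function."""
    return np.array([
        [2 - 400 * (x[1] - 3 * x[0] ** 2), -400 * x[0]],
        [-400 * x[0], 200.0],
    ])

H = rosenbrock_hess(np.array([1.0, 1.0]))
print(np.linalg.eigvalsh(H))  # both eigenvalues positive -> strict local minimum
```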
9
Descent algorithm
Start at an initial position $x_0$
Until convergence: find a minimizing step $\delta x_k$ using a local approximation of f, and set $x_{k+1} = x_k + \delta x_k$
This produces a sequence $x_0, x_1, \ldots, x_n$ such that $f(x_0) > f(x_1) > \ldots > f(x_n)$
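The loop itself is method-agnostic. A sketch, where `step` computes $\delta x_k$ from a local approximation of f (steepest descent, Newton, Gauss-Newton, and Levenberg-Marquardt all fit this mould):

```python
import numpy as np

def descent(f, step, x0, tol=1e-8, max_iter=1000):
    """Generic descent loop: `step` returns dx_k from a local
    approximation of f at x_k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = step(x)
        if np.linalg.norm(dx) < tol:   # convergence test on the step size
            break
        x = x + dx                     # the methods below construct dx so f decreases
    return x
```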
11
Approximation using Taylor series
$$f(x + \delta x) \approx f(x) + \nabla f(x)^T \delta x + \tfrac{1}{2}\, \delta x^T \nabla^2 f(x)\, \delta x$$
This is the general formulation
12
Approximation using Taylor series
But in the case of a least squares problem, $f(x) = \tfrac{1}{2}\, r(x)^T r(x)$ with residual vector $r$ and Jacobian $J = \partial r / \partial x$, so:
$$\nabla f(x) = J^T r, \qquad \nabla^2 f(x) = J^T J + \sum_k r_k \nabla^2 r_k$$
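For example, the Rosenbrock function can be written in this least-squares form with $r(x) = (\sqrt{2}(1 - x_1),\; 10\sqrt{2}(x_2 - x_1^2))$ (the scaling is a choice made here so that $\tfrac{1}{2} r^T r$ equals f):

```python
import numpy as np

# Rosenbrock as a least-squares problem: f = (1/2) r^T r
def r(x):
    return np.array([np.sqrt(2) * (1 - x[0]),
                     10 * np.sqrt(2) * (x[1] - x[0] ** 2)])

def J(x):
    # Jacobian dr/dx, one row per residual
    return np.array([[-np.sqrt(2), 0.0],
                     [-20 * np.sqrt(2) * x[0], 10 * np.sqrt(2)]])

x = np.array([-1.2, 1.0])
grad = J(x).T @ r(x)   # gradient J^T r
H_gn = J(x).T @ J(x)   # approximate Hessian, drops the sum_k r_k * hess(r_k) term
```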
14
Steepest descent Step $\delta x = -\alpha \nabla f(x)$, where $\alpha$ is chosen such that:
$$\alpha = \arg\min_{\alpha > 0} f\big(x - \alpha \nabla f(x)\big)$$
using a line search algorithm.
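A sketch of one such step, replacing the exact minimization over $\alpha$ with a backtracking (Armijo) sufficient-decrease test, a common practical line search:

```python
import numpy as np

def steepest_descent_step(f, grad, x, beta=0.5, c=1e-4):
    """One steepest-descent step with a backtracking (Armijo) line search."""
    g = grad(x)
    alpha = 1.0
    # Shrink alpha until the step gives a sufficient decrease in f
    while f(x - alpha * g) > f(x) - c * alpha * g @ g:
        alpha *= beta
    return -alpha * g
```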
17
[Figure: behaviour of f in the plane of the steepest descent direction]
18
[Figure: steepest descent on the Rosenbrock function, 1000 iterations]
19
Newton's method Determine the step $\delta x$ by a second order approximation:
$$N(\delta x) = f(x) + \nabla f(x)^T \delta x + \tfrac{1}{2}\, \delta x^T H\, \delta x, \qquad H = \nabla^2 f(x)$$
At the minimum of N: $\nabla N(\delta x) = H\, \delta x + \nabla f(x) = 0 \;\Rightarrow\; H\, \delta x = -\nabla f(x)$
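The resulting update, as a sketch (assuming `grad` and `hess` callables, e.g. the Rosenbrock derivatives defined above):

```python
import numpy as np

def newton_step(grad, hess, x):
    """Newton step: solve H dx = -grad(x); solving the linear system
    is preferred over forming H^{-1} explicitly."""
    return np.linalg.solve(hess(x), -grad(x))
```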
21
Problem If $H = \nabla^2 f$ is not positive semi-definite, then $\delta x = -H^{-1} \nabla f(x)$
may not be a descent direction: the step can increase the error function. The remedy is to use a positive semi-definite approximation of the Hessian based on the Jacobian (quasi-Newton methods).
22
Gauss-Newton method Step: solve $J^T J\, \delta x = -J^T r$ with the approximate Hessian $H \approx J^T J$. Advantages:
No second order derivatives are needed; $J^T J$ is always positive semi-definite
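A sketch of the full iteration; with the least-squares form of the Rosenbrock function given earlier, it reaches (1, 1) in a couple of steps:

```python
import numpy as np

def gauss_newton(r, J, x0, tol=1e-10, max_iter=100):
    """Gauss-Newton: at each iterate solve (J^T J) dx = -J^T r."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        Jx, rx = J(x), r(x)
        dx = np.linalg.solve(Jx.T @ Jx, -Jx.T @ rx)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# gauss_newton(r, J, [-1.2, 1.0]) -> approximately (1, 1)
```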
23
[Figure: Gauss-Newton on the Rosenbrock function, 48 evaluations]
24
Problem with Gauss-Newton
When the function is locally highly nonlinear, the second order solution can overshoot and lead to non-convergence.
25
Levenberg-Marquardt algorithm
Blends steepest descent and Gauss-Newton. At each step, solve for the descent direction $\delta x$:
$$(J^T J + \mu I)\, \delta x = -J^T r$$
Large damping $\mu$ yields a short steepest-descent step; small $\mu$ yields the Gauss-Newton step.
26
Managing the damping parameter
General approach:
If a step fails, increase the damping until a step is successful
If a step succeeds, decrease the damping to take a larger step
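A sketch of the whole scheme with this damping strategy (the acceptance test and the factors 0.5 and 10 are common choices, not prescribed by the slides):

```python
import numpy as np

def levenberg_marquardt(r, J, x0, mu=1e-3, tol=1e-10, max_iter=200):
    """Levenberg-Marquardt: solve (J^T J + mu I) dx = -J^T r and adapt mu.
    The 'improved damping' on the next slide replaces I with diag(J^T J)."""
    x = np.asarray(x0, dtype=float)
    cost = 0.5 * r(x) @ r(x)
    for _ in range(max_iter):
        Jx, rx = J(x), r(x)
        A = Jx.T @ Jx
        dx = np.linalg.solve(A + mu * np.eye(len(x)), -Jx.T @ rx)
        new_cost = 0.5 * r(x + dx) @ r(x + dx)
        if new_cost < cost:            # step succeeds: accept, damp less
            x, cost, mu = x + dx, new_cost, mu * 0.5
            if np.linalg.norm(dx) < tol:
                break
        else:                          # step fails: reject, damp more
            mu *= 10.0
    return x
```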
27
Improved damping: $(J^T J + \mu\, \mathrm{diag}(J^T J))\, \delta x = -J^T r$
This means that in the steepest descent phase, if a direction has a low second order derivative, the step in that direction is increased.
28
[Figure: Levenberg-Marquardt on the Rosenbrock function, 90 evaluations]