CS5321 Numerical Optimization
10 Least-Squares Problems
5/9/2019
Least-squares problems
- Linear least-squares problems
  - QR method
- Nonlinear least-squares problems
  - Gradient and Hessian of nonlinear LSP
  - Gauss-Newton method
  - Levenberg-Marquardt method
  - Methods for large-residual problems
Example of linear least squares
Fit y = β1 + β2·x to data (example from Wikipedia). The least-squares solution is β1 = 3.5, β2 = 1.4, so the fitted line is y = 3.5 + 1.4x.
Linear least-squares problems
A linear least-squares problem is f(x) = (1/2)‖Ax − y‖². Its gradient is ∇f(x) = Aᵀ(Ax − y). The optimal solution is where ∇f(x) = 0, i.e. AᵀAx = Aᵀy. The equation AᵀAx = Aᵀy is called the normal equation. Perform a QR decomposition A = QR; when A has full column rank, R is invertible and the solution is x = R⁻¹Qᵀy.
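The QR route can be sketched in a few lines of NumPy. The data below are the four points of the Wikipedia example mentioned earlier (a small hypothetical fit of y = β1 + β2·x):

```python
import numpy as np

# Solve min_x 1/2 ||Ax - y||^2 via QR. Columns of A: intercept and slope.
A = np.array([[1.0, 1.0],
              [1.0, 2.0],
              [1.0, 3.0],
              [1.0, 4.0]])
y = np.array([6.0, 5.0, 7.0, 10.0])

Q, R = np.linalg.qr(A)            # reduced QR: Q is m-by-n, R is n-by-n
x = np.linalg.solve(R, Q.T @ y)   # x = R^{-1} Q^T y (R invertible: full column rank)

# The same x also solves the normal equation A^T A x = A^T y.
assert np.allclose(A.T @ A @ x, A.T @ y)
print(x)  # -> [3.5 1.4]
```

Solving with R rather than forming AᵀA avoids squaring the condition number of A.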
Example of nonlinear LS
Model: φ(x; t) = x1 + x2·t + x3·t² + x4·e^(x5·t). Find (x1, x2, x3, x4, x5) to minimize (1/2) Σ_{j=1..m} [φ(x; t_j) − y_j]².
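A minimal sketch of evaluating this objective, assuming the model reconstructs as φ(x; t) = x1 + x2·t + x3·t² + x4·e^(x5·t); the time grid, parameter values, and data below are made up for illustration:

```python
import numpy as np

# Hypothetical model: phi(x; t) = x1 + x2*t + x3*t^2 + x4*exp(x5*t)
def phi(x, t):
    return x[0] + x[1]*t + x[2]*t**2 + x[3]*np.exp(x[4]*t)

def objective(x, t, y):
    r = phi(x, t) - y          # residuals r_j = phi(x; t_j) - y_j
    return 0.5 * np.dot(r, r)  # f(x) = 1/2 sum_j r_j^2

t = np.linspace(0.0, 1.0, 6)
x_true = np.array([1.0, -0.5, 0.2, 0.3, 2.0])
y = phi(x_true, t)             # noiseless data, so the objective is zero at x_true
print(objective(x_true, t, y))
```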
Gradient and Hessian of LSP
The objective function of a least-squares problem is f(x) = (1/2) Σ_{j=1..m} r_j²(x), where each r_j is a function of n variables. Define R(x) = (r_1(x), r_2(x), …, r_m(x))ᵀ.
- Jacobian: J(x), the m×n matrix whose j-th row is ∇r_j(x)ᵀ
- Gradient: ∇f(x) = Σ_{j=1..m} r_j(x)∇r_j(x) = J(x)ᵀR(x)
- Hessian: ∇²f(x) = J(x)ᵀJ(x) + Σ_{j=1..m} r_j(x)∇²r_j(x)
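These relations can be checked numerically. The residual model, data, and parameter values below are hypothetical, and the finite-difference comparison is only a sanity check of ∇f = JᵀR:

```python
import numpy as np

# Hypothetical residuals r_j(x) = phi(x; t_j) - y_j for
# phi(x; t) = x1 + x2*t + x3*t^2 + x4*exp(x5*t)
def residuals(x, t, y):
    return x[0] + x[1]*t + x[2]*t**2 + x[3]*np.exp(x[4]*t) - y

def jacobian(x, t):            # row j of J is grad r_j(x)^T
    e = np.exp(x[4]*t)
    return np.column_stack([np.ones_like(t), t, t**2, e, x[3]*t*e])

t = np.linspace(0.0, 1.0, 6)
y = np.zeros_like(t)
x = np.array([1.0, -0.5, 0.2, 0.3, 2.0])

R = residuals(x, t, y)
J = jacobian(x, t)
grad = J.T @ R                 # gradient: sum_j r_j grad r_j = J^T R
H_gn = J.T @ J                 # first (Gauss-Newton) part of the Hessian

# Central finite differences on f(x) = 1/2 ||R(x)||^2 agree with J^T R.
eps = 1e-6
f = lambda xv: 0.5 * np.sum(residuals(xv, t, y)**2)
fd = np.array([(f(x + eps*e) - f(x - eps*e)) / (2*eps) for e in np.eye(5)])
assert np.allclose(grad, fd, atol=1e-4)
```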
Gauss-Newton method
Gauss-Newton uses the Hessian approximation ∇²f(x) ≈ JᵀJ, dropping the second term of ∇²f(x) = JᵀJ + Σ_{j=1..m} r_j∇²r_j. It is a good approximation if ‖R‖ is small. JᵀJ is the matrix of the normal equation JᵀJp = −JᵀR, so each step replaces f(x) = (1/2) Σ_{j=1..m} r_j²(x) with the linear least-squares subproblem min_p (1/2)‖Jp + R‖². Usually combined with a line-search technique.
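A bare-bones Gauss-Newton loop on a made-up exponential-fit problem. It takes pure GN steps (no line search), so the starting guess is chosen near the solution; all names and data are illustrative:

```python
import numpy as np

# Hypothetical model x1 * exp(x2 * t); residuals and Jacobian by hand.
def residuals(x, t, y):
    return x[0] * np.exp(x[1] * t) - y

def jacobian(x, t):
    e = np.exp(x[1] * t)
    return np.column_stack([e, x[0] * t * e])

t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
y = 2.0 * np.exp(0.7 * t)        # zero-residual data generated from x = (2, 0.7)

x = np.array([1.8, 0.6])         # start near the solution (no line search here)
for _ in range(20):
    R = residuals(x, t, y)
    J = jacobian(x, t)
    # GN step: solve min_p 1/2 ||J p + R||^2, i.e. J^T J p = -J^T R
    p, *_ = np.linalg.lstsq(J, -R, rcond=None)
    x = x + p

print(x)  # -> approximately [2, 0.7]
```

Because the residual at the solution is zero, the JᵀJ approximation is exact there and the iteration converges rapidly.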
Convergence of Gauss-Newton
Suppose each r_j is Lipschitz continuously differentiable in a neighborhood N of the level set {x | f(x) ≤ f(x_0)}, and the Jacobians satisfy ‖J(x)z‖ ≥ γ‖z‖ for some γ > 0. Then the Gauss-Newton method, with step lengths α_k that satisfy the Wolfe conditions, has
lim_{k→∞} J_kᵀR_k = 0.
Levenberg-Marquardt method
Gauss-Newton + trust region. The problem becomes min_p (1/2)‖Jp + R‖² subject to ‖p‖ ≤ Δ_k. Optimality condition (recall from Chapter 4): (JᵀJ + λI)p = −JᵀR for some λ ≥ 0. This is equivalent to the linear least-squares problem
min_p (1/2) ‖ [J; √λ·I] p + [R; 0] ‖²
where J and √λ·I are stacked vertically and R is padded with zeros.
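The equivalence between the damped normal equation and the augmented least-squares problem can be verified directly; J, R, and λ below are arbitrary stand-ins for the quantities at one iterate:

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 8, 3
J = rng.standard_normal((m, n))   # stand-in Jacobian at the current iterate
R = rng.standard_normal(m)        # stand-in residual vector
lam = 0.5                         # LM damping parameter lambda

# Direct solve of the damped normal equation (J^T J + lam I) p = -J^T R.
p_direct = np.linalg.solve(J.T @ J + lam * np.eye(n), -J.T @ R)

# Equivalent augmented problem: min_p 1/2 || [J; sqrt(lam) I] p + [R; 0] ||^2.
J_aug = np.vstack([J, np.sqrt(lam) * np.eye(n)])
R_aug = np.concatenate([R, np.zeros(n)])
p_lstsq, *_ = np.linalg.lstsq(J_aug, -R_aug, rcond=None)

assert np.allclose(p_direct, p_lstsq)
```

The augmented form is preferred in practice: it can be solved by QR without forming JᵀJ.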
Convergence of Levenberg-Marquardt
Suppose L = {x | f(x) ≤ f(x_0)} is bounded and each r_j is Lipschitz continuously differentiable in a neighborhood N of L. Assume for each k the approximate solution p_k of the Levenberg-Marquardt subproblem satisfies
m_k(0) − m_k(p_k) ≥ c_1 ‖J_kᵀR_k‖ min(Δ_k, ‖J_kᵀR_k‖ / ‖J_kᵀJ_k‖)
for some constant c_1 > 0, and ‖p_k‖ ≤ γΔ_k for some γ ≥ 1. Then
lim_{k→∞} J_kᵀR_k = 0.
Large-residual problems
The Hessian is ∇²f(x) = JᵀJ + Σ_{j=1..m} r_j∇²r_j. When the second term of the Hessian is large, use a quasi-Newton method to approximate that term. The secant equation for ∇²r_j(x) is
(B_j)_{k+1}(x_{k+1} − x_k) = ∇r_j(x_{k+1}) − ∇r_j(x_k).
The secant equation for the whole second term and the update formula are on the next slide.
Dennis-Gay-Welsch update formula
Approximate the second term by S_{k+1} ≈ Σ_{j=1..m} r_j(x_{k+1}) B_j, which satisfies the secant equation S_{k+1}s = z, where
s = x_{k+1} − x_k,  y = J_{k+1}ᵀR_{k+1} − J_kᵀR_k,  z = J_{k+1}ᵀR_{k+1} − J_kᵀR_{k+1}.
The Dennis-Gay-Welsch update is
S_{k+1} = S_k + [(z − S_k s)yᵀ + y(z − S_k s)ᵀ] / (yᵀs) − [(z − S_k s)ᵀs / (yᵀs)²] yyᵀ.
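A sketch of the update as reconstructed above, with random vectors standing in for s, y, z and a random symmetric S_k; it checks that the formula preserves symmetry and satisfies the secant equation S_{k+1}s = z:

```python
import numpy as np

# Dennis-Gay-Welsch update (as reconstructed from the slide).
def dgw_update(S, s, y, z):
    w = z - S @ s                       # secant-equation residual z - S_k s
    ys = y @ s                          # scalar y^T s (assumed nonzero)
    return (S + (np.outer(w, y) + np.outer(y, w)) / ys
              - (w @ s) / ys**2 * np.outer(y, y))

rng = np.random.default_rng(1)
n = 4
S = rng.standard_normal((n, n))
S = 0.5 * (S + S.T)                     # symmetric starting matrix S_k
s, y, z = (rng.standard_normal(n) for _ in range(3))

S_new = dgw_update(S, s, y, z)
assert np.allclose(S_new, S_new.T)      # symmetry is preserved
assert np.allclose(S_new @ s, z)        # secant equation S_{k+1} s = z holds
```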