Steepest Descent Optimization

Presentation transcript:

Steepest Descent Optimization

Outline
- Newton Method -> Steepest Descent
- Convergence Rates
- Exact and Numerical Line Search
- Regularization

Newton -> Steepest Descent Method

Given: f(x) ≈ f(x0) + dx^T g + (1/2) dx^T H dx    (1)

Find: a stationary point x* such that ∇f(x*) = 0

Solution: Newton's method,

    x_new = x_old - alpha * H^-1 g    (2)

where alpha is a step length that should be calculated so we never go uphill. Plain steepest descent replaces H^-1 in (2) with the identity, i.e. it steps directly along -g.
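
As a minimal sketch of the two updates (not from the slides; the Hessian, iterate, and step length below are made-up illustrative values), the Newton step premultiplies the gradient by the inverse Hessian, while steepest descent uses the gradient directly:

% Sketch: Newton step vs. steepest-descent step for a quadratic model.
% H, x, and alpha are illustrative values, not from the slides.
H = [1 0; 0 0.001];               % Hessian of the quadratic model
x = [1; 1];                       % current iterate
g = H * x;                        % gradient of f(x) = 0.5*x'*H*x at x
alpha = 0.5;                      % step length (normally chosen by a line search)
x_newton = x - alpha * (H \ g);   % Newton: premultiply g by inv(H)
x_sd     = x - alpha * g;         % steepest descent: use g directly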

Assume H is diagonally dominant.

Well-conditioned: f(x) ≈ dx^2 + dy^2, with H proportional to [1 0; 0 1]
Ill-conditioned: f(x) ≈ dx^2 + 0.001 dy^2, with H proportional to [1 0; 0 0.001]

So (H^-1)_ij ≈ δ_ij / H_ii. Therefore (2) becomes Preconditioned Steepest Descent if g is premultiplied by this approximation to the inverse Hessian.
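
A sketch of the diagonal preconditioning described above (variable names and step length are assumptions of mine): approximate inv(H) by the reciprocal of its diagonal and premultiply the gradient before stepping.

% Sketch: preconditioned steepest descent with a diagonal approximation to inv(H).
H = [1 0; 0 0.001];            % ill-conditioned example Hessian
x = [1; 1];
g = H * x;                     % gradient of f(x) = 0.5*x'*H*x
alpha = 1.0;                   % illustrative step length
d = 1 ./ diag(H);              % (H^-1)_ij ~ delta_ij / H_ii
x_new = x - alpha * (d .* g);  % premultiply g by the approximate inverse Hessian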

Steepest Descent

for k = 1:niter
    ...
end

[Figure: steepest-descent iterates on ill-conditioned vs. well-conditioned contours]

Poorly conditioned equations converge slowly.
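
A runnable sketch of this loop on the two quadratics above (the starting point, iteration count, and step-length formula are my choices, not the slide's): with the ill-conditioned H the iterate is still far from the minimizer after 100 steps, while the well-conditioned case converges almost immediately.

% Sketch: steepest descent on f(x) = 0.5*x'*H*x.
H = [1 0; 0 0.001];            % ill-conditioned; try H = eye(2) for comparison
x = [1; 1000];                 % starting point
niter = 100;
for k = 1:niter
    g = H * x;                       % gradient
    alpha = (g'*g) / (g'*H*g);       % exact step length for a quadratic
    x = x - alpha * g;               % steepest-descent update
end
disp(norm(x))                  % still large for the ill-conditioned H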

Outline
- Newton Method -> Steepest Descent
- Convergence Rates
- Exact and Numerical Line Search
- Regularization

Steepest Descent

for k = 1:niter
    ...
end

[Figure: steepest-descent iterates on ill-conditioned vs. well-conditioned contours]

Poorly conditioned equations converge slowly.

Outline
- Newton Method -> Steepest Descent
- Convergence Rates
- Exact and Numerical Line Search
- Regularization

Exact Line Search
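
The formulas for this slide did not survive the transcript. For the quadratic model in (1), the usual exact line search along the steepest-descent direction (standard derivation, not necessarily the slide's exact wording) goes as follows: define z(alpha) = f(x - alpha*g) ≈ f(x) - alpha*(g'*g) + (1/2)*alpha^2*(g'*H*g); setting dz/dalpha = 0 gives alpha = (g'*g) / (g'*H*g), which is the step length used in the loop sketch above.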

Numerical Line Search

Problem 1: Solve the above equation for alpha.
Problem 2: Write Matlab code for Newton's method with Rosenbrock, but now compute the step length.
Problem 3: Write Matlab code for the Steepest Descent method with Rosenbrock, but now compute the step length and preconditioning.
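
As one possible starting point for Problem 3, a sketch only (the function handles, tolerances, and backtracking constants are my own choices, and the actual assignment may require something different): steepest descent on the Rosenbrock function with a numerical backtracking line search.

% Sketch: steepest descent on Rosenbrock with a backtracking line search.
f  = @(x) (1 - x(1))^2 + 100*(x(2) - x(1)^2)^2;
gf = @(x) [-2*(1 - x(1)) - 400*x(1)*(x(2) - x(1)^2); 200*(x(2) - x(1)^2)];
x = [-1.2; 1];                           % common Rosenbrock starting point
for k = 1:5000
    g = gf(x);
    if norm(g) < 1e-6, break; end        % converged
    alpha = 1;                           % shrink alpha until f decreases enough
    while f(x - alpha*g) > f(x) - 1e-4 * alpha * (g'*g)
        alpha = alpha / 2;
    end
    x = x - alpha * g;
end
disp(x)                                  % should move toward the minimizer [1; 1]

Preconditioning could be added by premultiplying g with an approximate inverse Hessian before the update, as in the earlier diagonal-preconditioning sketch.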