Optimization Part II G.Anuradha.

Review of previous lecture: Steepest Descent. Choose the next step so that the function decreases: F(x_{k+1}) < F(x_k). For small changes in x we can approximate F(x) with a first-order expansion: F(x_{k+1}) = F(x_k + Δx_k) ≈ F(x_k) + g_k^T Δx_k, where g_k = ∇F(x) evaluated at x = x_k. If we want the function to decrease we need g_k^T Δx_k = α_k g_k^T p_k < 0, and we can maximize the decrease by choosing the search direction p_k = -g_k, giving the update x_{k+1} = x_k - α_k g_k.
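The update above can be sketched in a few lines of Python; the quadratic test function and the fixed step size alpha are illustrative assumptions, not taken from the slides:

```python
import numpy as np

def steepest_descent(grad, x0, alpha, n_iter=100):
    """Minimize F by stepping along the negative gradient: x_{k+1} = x_k - alpha * g_k."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        x = x - alpha * grad(x)
    return x

# Assumed example: F(x) = x1^2 + 2*x2^2, gradient [2*x1, 4*x2], minimum at the origin
grad = lambda x: np.array([2 * x[0], 4 * x[1]])
x_min = steepest_descent(grad, [2.0, 1.0], alpha=0.1)
```

With a suitable alpha the iterates contract toward the minimizer; a poor alpha produces the oscillation discussed on the learning-rate slide below.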

Example

Plot

Necessary and sufficient conditions for a function of a single variable: at a relative minimum x*, the necessary condition is f'(x*) = 0; the sufficient condition is f'(x*) = 0 together with f''(x*) > 0 (f''(x*) < 0 instead gives a relative maximum).

Functions of two variables. Necessary condition: the gradient vanishes, ∇F(x*) = 0. Sufficient condition: ∇F(x*) = 0 and the Hessian matrix at x* is positive definite.

Stationary Points: points x* where the gradient ∇F(x*) = 0; they may be minima, maxima, or saddle points.
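A stationary point can be classified from the eigenvalues of the Hessian; a minimal sketch (the helper name is an assumption for illustration):

```python
import numpy as np

def classify_stationary_point(H):
    """Classify a stationary point from the eigenvalues of the (symmetric) Hessian H."""
    lam = np.linalg.eigvalsh(H)
    if np.all(lam > 0):
        return "minimum"        # positive definite: sufficient condition for a minimum
    if np.all(lam < 0):
        return "maximum"        # negative definite: relative maximum
    return "saddle or indefinite"
```

For example, a Hessian with eigenvalues of mixed sign marks a saddle point, where the gradient vanishes but the point is neither a minimum nor a maximum.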

Effect of the learning rate: as the learning rate increases, the trajectory becomes oscillatory, and too large a value makes the algorithm unstable. For quadratic functions an upper limit on the learning rate can be derived.

Stable Learning Rates (Quadratic). For a quadratic function F(x) = ½ x^T A x + d^T x + c, the steepest-descent update becomes x_{k+1} = [I - αA] x_k - αd, so stability is determined by the eigenvalues of the matrix [I - αA]. If λ_i are the eigenvalues of A, the eigenvalues of [I - αA] are (1 - αλ_i). Stability requirement: |1 - αλ_i| < 1 for all i, which gives α < 2/λ_max.
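The stability bound can be computed directly from the Hessian; the matrix A below is an assumed example:

```python
import numpy as np

# Hessian of an assumed quadratic F(x) = 0.5 x^T A x + d^T x + c
A = np.array([[2.0,  0.0],
              [0.0, 50.0]])

lam_max = np.max(np.linalg.eigvalsh(A))
alpha_max = 2.0 / lam_max   # steepest descent is stable only for alpha < 2 / lambda_max
```

Here λ_max = 50, so any learning rate at or above 0.04 makes the iteration diverge along the corresponding eigenvector direction.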

Example

Newton’s Method. Start from the second-order approximation F(x_{k+1}) ≈ F(x_k) + g_k^T Δx_k + ½ Δx_k^T A_k Δx_k, where A_k is the Hessian at x_k. Take the gradient of this approximation with respect to Δx_k and set it equal to zero to find the stationary point: g_k + A_k Δx_k = 0, which gives the update x_{k+1} = x_k - A_k^{-1} g_k.
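One Newton iteration can be sketched as follows; solving the linear system rather than inverting A_k is the standard numerical choice, and the quadratic test function is an assumed example:

```python
import numpy as np

def newton_step(grad, hess, x):
    """One Newton iteration: x_{k+1} = x_k - A_k^{-1} g_k (solved, not inverted)."""
    return x - np.linalg.solve(hess(x), grad(x))

# For a quadratic function Newton's method reaches the stationary point in one step.
# Assumed example: F(x) = x1^2 + x1*x2 + 2*x2^2, minimum at the origin.
grad = lambda x: np.array([2 * x[0] + x[1], x[0] + 4 * x[1]])
hess = lambda x: np.array([[2.0, 1.0], [1.0, 4.0]])
x_new = newton_step(grad, hess, np.array([3.0, -2.0]))
```

The one-step convergence on quadratics is exactly what the worked example on the following slide illustrates; on non-quadratic functions the step must be iterated.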

Example

Plot

Line minimization methods and their stopping criteria: Initial bracketing; Line searches: Newton’s method, Secant method, Sectioning methods.
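Of the line searches listed, the secant method can be sketched briefly: it applies the secant approximation to f′ to locate f′(x) = 0, so no second derivative is needed (the test function below is an assumed example):

```python
def secant_minimize(df, x0, x1, tol=1e-8, max_iter=100):
    """Secant method on the derivative df = f': replaces Newton's f'' with a
    finite-difference slope built from the two most recent iterates."""
    for _ in range(max_iter):
        d0, d1 = df(x0), df(x1)
        if d1 == d0:            # flat secant: cannot proceed
            break
        x2 = x1 - d1 * (x1 - x0) / (d1 - d0)
        if abs(x2 - x1) < tol:  # stopping criterion: step size below tolerance
            return x2
        x0, x1 = x1, x2
    return x1

# Assumed example: minimize f(x) = (x - 4)^2, so df(x) = 2*(x - 4)
x_star = secant_minimize(lambda x: 2 * (x - 4), 1.0, 2.0)
```

Because it never evaluates f″, the secant method trades Newton’s quadratic convergence for a cheaper per-iteration cost.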

Initial Bracketing: helps in finding the range which contains the relative minimum. The assumed minimum must be bracketed inside the starting interval before a line search can begin. Two schemes are used for this purpose.
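One common bracketing scheme, the expanding-step (forward-march) search, can be sketched as follows; the slides do not specify which two schemes are intended, so this is one representative assumption:

```python
def bracket_minimum(f, x0=0.0, step=0.1, factor=2.0, max_iter=50):
    """Expanding-step bracketing: march downhill, growing the step each time,
    until f rises again; a relative minimum then lies inside the returned interval."""
    a, fa = x0, f(x0)
    b, fb = x0 + step, f(x0 + step)
    if fb > fa:                      # wrong direction: search the other way
        a, b, fa, fb = b, a, fb, fa
        step = -step
    for _ in range(max_iter):
        step *= factor
        c, fc = b + step, f(b + step)
        if fc > fb:                  # function turned upward: minimum bracketed
            return (min(a, c), max(a, c))
        a, fa = b, fb
        b, fb = c, fc
    raise RuntimeError("no bracket found")

# Assumed example: f(x) = (x - 3)^2, minimum at x = 3
lo, hi = bracket_minimum(lambda x: (x - 3) ** 2)
```

The returned interval is then handed to a sectioning method, which shrinks it around the minimum.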

Sectioning methods: once a bracket is found, these methods shrink the interval by evaluating the function at interior points; the golden-section search is the most common example.
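A minimal sketch of the golden-section search, the standard sectioning method (the test function is an assumed example):

```python
import math

def golden_section(f, a, b, tol=1e-6):
    """Golden-section search: shrink the bracket [a, b] by the ratio ~0.618 per
    iteration, reusing one interior function evaluation each time."""
    inv_phi = (math.sqrt(5) - 1) / 2           # ~0.618, the golden ratio minus one
    x1 = b - inv_phi * (b - a)
    x2 = a + inv_phi * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 < f2:                            # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = b - inv_phi * (b - a)
            f1 = f(x1)
        else:                                  # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + inv_phi * (b - a)
            f2 = f(x2)
    return (a + b) / 2

# Assumed example: f(x) = (x - 2)^2 bracketed in [0, 5]
x_star = golden_section(lambda x: (x - 2) ** 2, 0.0, 5.0)
```

Each iteration needs only one new function evaluation, which is why golden-section is preferred when the function is expensive and derivatives are unavailable.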