Advanced Topics in Optimization

Presentation transcript:

Advanced Topics in Optimization: Direct and Indirect Search Methods
D Nagesh Kumar, IISc
Optimization Methods: M8L4

Introduction
Real-world problems often lead to nonlinear optimization with complicated objective functions, for which an analytical solution is rarely practical. A possible solution approach is a search algorithm.

Flowchart for a Search Algorithm
1. Start with a trial solution Xi; set i = 1.
2. Compute the objective function f(Xi).
3. Generate a new solution Xi+1 and compute f(Xi+1).
4. Convergence check: if not converged, set i = i + 1 and return to step 3.
5. If converged, stop: the optimal solution is Xopt = Xi.
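A minimal sketch of this loop in Python (generate_new, the rule for proposing Xi+1, is a hypothetical placeholder; each method below fills it in differently):

    import numpy as np

    def generic_search(f, x0, generate_new, tol=1e-8, max_iter=1000):
        """Skeleton of the flowchart: propose a new point, evaluate, check convergence."""
        x = np.asarray(x0, dtype=float)
        fx = f(x)                            # objective at the trial solution X_i
        for _ in range(max_iter):
            x_new = generate_new(x)          # generate new solution X_{i+1}
            fx_new = f(x_new)                # objective at X_{i+1}
            if abs(fx_new - fx) < tol:       # convergence check
                return x_new
            x, fx = x_new, fx_new            # set i = i + 1 and repeat
        return x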

Classification of Search Algorithms
Direct search algorithms consider only the objective function; partial derivatives of the objective are not used. They are also called nongradient or zeroth-order methods.
Indirect search algorithms use the partial derivatives of the objective function. They are also called descent methods.

Direct Search Algorithms
- Random Search Method
- Grid Search Method
- Pattern Directions
- Univariate Method
- Rosenbrock's Method
- Simplex Method

Direct Search Algorithms (contd.)
Random Search Method: generates trial solutions for the decision variables. Variants: random jump, random walk, and random walk with direction exploitation.
- Random jump: generates a large number of points, assuming a uniform distribution of the decision variables, and selects the best one.
- Random walk: generates trial solutions with sequential improvements, using a scalar step length and a unit random direction vector.
- Random walk with direction exploitation: an improved version of the random walk in which a successful direction is identified and further steps are taken along it. The first two variants are sketched below.
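A minimal sketch of the first two variants, assuming a box [lower, upper] for the random jump; the names, step length, and sample counts are illustrative choices, not values from the slides:

    import numpy as np

    rng = np.random.default_rng(0)

    def random_jump(f, lower, upper, n_points=10000):
        """Random jump: sample uniformly over the box and keep the best point."""
        pts = rng.uniform(lower, upper, size=(n_points, len(lower)))
        values = np.apply_along_axis(f, 1, pts)
        return pts[np.argmin(values)]

    def random_walk(f, x0, step=0.5, max_iter=2000):
        """Random walk: scalar step length along a unit random direction vector,
        keeping a move only if it improves the objective."""
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        for _ in range(max_iter):
            u = rng.standard_normal(len(x))
            u /= np.linalg.norm(u)           # unit random direction
            x_new = x + step * u
            fx_new = f(x_new)
            if fx_new < fx:                  # sequential improvement
                x, fx = x_new, fx_new
        return x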

Direct Search Algorithms (contd.)
Grid Search Method: set up a grid over the decision space, evaluate the objective function at every grid point, and take the point with the best objective value as the optimum. Major drawback: the number of grid points grows exponentially with the number of decision variables.
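A sketch under the same assumption of box bounds; the n_per_axis**d count in the docstring is exactly the exponential growth noted above:

    import numpy as np
    from itertools import product

    def grid_search(f, bounds, n_per_axis=50):
        """Evaluate f at every point of a regular grid over the box `bounds`.
        With d decision variables the grid has n_per_axis**d points, which is
        the method's major drawback."""
        axes = [np.linspace(lo, hi, n_per_axis) for lo, hi in bounds]
        best_x, best_f = None, np.inf
        for point in product(*axes):         # every grid point
            fx = f(np.array(point))
            if fx < best_f:
                best_x, best_f = np.array(point), fx
        return best_x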

Direct Search Algorithms (contd.)
Univariate Method: generates trial solutions for one decision variable at a time, keeping all the others fixed. The best value of each decision variable is found with the others held constant, and the whole cycle is repeated iteratively until convergence.
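One possible sketch, using SciPy's minimize_scalar for each one-variable subproblem (the sweep limit and tolerance are illustrative):

    import numpy as np
    from scipy.optimize import minimize_scalar

    def univariate_search(f, x0, sweeps=50, tol=1e-8):
        """Cyclically minimize over one decision variable at a time,
        keeping all the others fixed, until a full sweep barely helps."""
        x = np.asarray(x0, dtype=float)
        for _ in range(sweeps):
            f_before = f(x)
            for i in range(len(x)):
                def f_1d(t, i=i):            # f along the i-th coordinate axis
                    x_trial = x.copy()
                    x_trial[i] = t
                    return f(x_trial)
                x[i] = minimize_scalar(f_1d).x
            if f_before - f(x) < tol:        # sweep produced no real improvement
                return x
        return x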

Direct Search Algorithms (contd.)
Pattern Directions: in the univariate method the search directions coincide with the coordinate axes, which makes convergence slow. Pattern-direction methods instead search along a direction pointing towards the best solution. Popular methods: Hooke and Jeeves' method, which alternates exploratory moves and pattern moves, and Powell's method, which uses conjugate directions.
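A simplified sketch of Hooke and Jeeves' exploratory and pattern moves (step size and shrink factor are illustrative, and the full method includes refinements omitted here):

    import numpy as np

    def explore(f, x, step):
        """Exploratory move: probe +/- step along each coordinate axis."""
        x = x.copy()
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x.copy()
                trial[i] += delta
                if f(trial) < f(x):
                    x = trial
                    break
        return x

    def hooke_jeeves(f, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=10000):
        base = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            if step <= tol:
                break
            x = explore(f, base, step)
            if f(x) < f(base):
                pattern = x + (x - base)     # pattern move along the successful direction
                x2 = explore(f, pattern, step)
                base = x2 if f(x2) < f(x) else x
            else:
                step *= shrink               # exploration failed: refine the step
        return base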

Direct Search Algorithms (contd.)
Rosenbrock's Method of Rotating Coordinates: a modified version of Hooke and Jeeves' method. The coordinate system is rotated at each stage so that the first axis points along the locally estimated direction of the best solution, with the remaining axes kept mutually orthogonal and normal to the first.
Simplex Method: a classical direct search algorithm. It maintains a geometric figure (simplex) of N+1 points in N-dimensional space, compares the objective function values at the N+1 vertices, and moves the simplex towards the optimum iteratively. The basic movements of the simplex are reflection, contraction, and expansion.
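The simplex method described here is essentially the Nelder-Mead algorithm, which SciPy exposes directly; a small usage example on the Rosenbrock test function (the function, not the rotating-coordinates method above):

    import numpy as np
    from scipy.optimize import minimize

    def banana(x):                           # Rosenbrock's classic 2-D test function
        return (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2

    # Nelder-Mead maintains a simplex of N+1 vertices in N dimensions and moves it
    # towards the optimum by reflection, expansion, and contraction steps.
    result = minimize(banana, x0=np.array([-1.2, 1.0]), method='Nelder-Mead')
    print(result.x)                          # approaches the minimum at (1, 1)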

Indirect Search Algorithms
- Steepest Descent Method
- Conjugate Gradient Method
- Newton's Method
- Quasi-Newton Methods
- Marquardt Method

Indirect Search Algorithms (contd.)
Steepest Descent Method: the search starts from an initial trial point X1 and iteratively moves along the steepest descent direction (the negative of the gradient) until the optimum point is found. Drawback: the solution may get stuck at a local optimum.
Conjugate Gradient Method: searches along mutually conjugate directions built from the gradients, and exploits the property of quadratic convergence: for a quadratic objective it terminates in a finite number of steps.
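A minimal steepest-descent sketch with a backtracking line search; the caller supplies the gradient, and the Armijo constant and shrink factor are common illustrative choices:

    import numpy as np

    def steepest_descent(f, grad, x0, tol=1e-6, max_iter=5000):
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:      # gradient near zero: stationary point
                break
            d = -g                           # steepest descent direction
            t = 1.0
            # backtracking line search: shrink t until f decreases enough (Armijo)
            while t > 1e-12 and f(x + t * d) > f(x) - 0.5 * t * np.dot(g, g):
                t *= 0.5
            x = x + t * d
        return x

SciPy's minimize(fun, x0, jac=grad, method='CG') provides a nonlinear conjugate gradient counterpart.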

Indirect Search Algorithms (contd.)
Newton's Method: based on the second-order Taylor series expansion of the objective about the current point Xi,
f(X) \approx f(X_i) + \nabla f(X_i)^T (X - X_i) + \tfrac{1}{2}(X - X_i)^T [J_i] (X - X_i),
where [J_i] = [J]|_{X_i} is the Hessian matrix of f evaluated at Xi. Setting the partial derivatives of this model to zero gives the minimum of f(X), and combining the two equations yields the improved solution
X_{i+1} = X_i - [J_i]^{-1} \nabla f(X_i).
The process continues until convergence.
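A sketch of the resulting iteration, with the gradient and Hessian supplied by the caller; the Newton step is obtained by a linear solve rather than forming the inverse explicitly:

    import numpy as np

    def newton_method(grad, hess, x0, tol=1e-8, max_iter=100):
        x = np.asarray(x0, dtype=float)
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:      # converged: gradient ~ 0
                break
            # X_{i+1} = X_i - [J_i]^{-1} grad f(X_i), computed via a linear solve
            x = x - np.linalg.solve(hess(x), g)
        return x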

Indirect Search Algorithms (contd.)
Marquardt Method: a combination of the steepest descent algorithm and Newton's method. The diagonal elements of the Hessian matrix are modified iteratively by adding a damping term, so the step blends Newton and steepest-descent behaviour, and the optimum solution is obtained.
Quasi-Newton Methods: based on Newton's method, but they approximate the Hessian matrix, or its inverse, in order to reduce the amount of computation per iteration. The Hessian approximation is updated using the secant equation, a generalization of the secant method to multidimensional problems.
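A sketch of the Marquardt idea: damp the Hessian's diagonal with lam * I, shrinking lam towards a Newton step on success and growing it towards steepest descent on failure. The halving/doubling schedule is a common choice assumed here, not necessarily the one on the slides:

    import numpy as np

    def marquardt(f, grad, hess, x0, lam=1e3, tol=1e-6, max_iter=200):
        x = np.asarray(x0, dtype=float)
        identity = np.eye(len(x))
        for _ in range(max_iter):
            g = grad(x)
            if np.linalg.norm(g) < tol:
                break
            # modify the diagonal of the Hessian: [J_i] + lam * I
            step = np.linalg.solve(hess(x) + lam * identity, g)
            if f(x - step) < f(x):
                x, lam = x - step, lam * 0.5  # success: move towards a Newton step
            else:
                lam *= 2.0                    # failure: move towards steepest descent
        return x

For quasi-Newton methods, SciPy's minimize(fun, x0, jac=grad, method='BFGS') applies the standard BFGS secant update to an approximation of the inverse Hessian.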

Constrained Optimization
Search algorithms cannot be directly applied to constrained optimization. A constrained problem can be converted into an unconstrained one using a penalty function that activates when a constraint is violated:
\phi(X) = f(X) + s \, M \, P(X),
where P(X) > 0 measures the constraint violation and P(X) = 0 when all constraints are satisfied; s = 1 for a minimization problem and s = -1 for a maximization problem; and M is a dummy variable with a very high value.
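A sketch of the conversion for a minimization problem (s = 1) with one inequality constraint g(X) <= 0; the quadratic penalty on the violation is a common choice assumed here, and f, g, and M are illustrative:

    import numpy as np
    from scipy.optimize import minimize

    M = 1e6                                  # dummy variable with a very high value

    def f(x):                                # illustrative objective to minimize
        return (x[0] - 2.0)**2 + (x[1] - 1.0)**2

    def g(x):                                # illustrative constraint g(x) <= 0
        return x[0] + x[1] - 2.0

    def phi(x):
        """Penalized objective: phi = f + s * M * P, with s = 1 and P = violation^2."""
        violation = max(g(x), 0.0)           # nonzero only when the constraint is violated
        return f(x) + M * violation**2

    # Any unconstrained search algorithm can now be applied to phi.
    result = minimize(phi, x0=np.array([0.0, 0.0]), method='Nelder-Mead')
    print(result.x)                          # near the constrained optimum (1.5, 0.5)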

Summary
Nonlinear optimization problems with complicated objective functions or constraints can be solved by search algorithms. Care should be taken that the search does not get stuck at a local optimum. Search algorithms are not directly applicable to constrained optimization; penalty functions can be used to convert a constrained problem into an unconstrained one.

Thank You