Optimization Methods
Unconstrained optimization of an objective function F
– Deterministic, gradient-based methods
  – Running a PDE: will cover later in the course
  – Gradient-based (ascent/descent) methods
– Stochastic methods
  – Simulated annealing (theoretically but not practically interesting)
  – Evolutionary (genetic) algorithms
– Multiscale methods
  – Mean field annealing, graduated nonconvexity, etc.
Constrained optimization
– Lagrange multipliers

Our Assumptions for Optimization Methods
With objective function F(p):
– dim(p) >> 1, and frequently quite large
– Evaluating F at any p is very expensive
– Evaluating D¹F at any p is very, very expensive
– Evaluating D²F at any p is extremely expensive
These assumptions hold in most image analysis and graphics applications.

Order of Convergence for Iterative Methods
|ε_{i+1}| = k |ε_i|^α in the limit; α is the order of convergence
– The major factor in the speed of convergence
– N steps of a method of order α have order of convergence α^N
– Thus the issue is linear convergence (α = 1) vs. superlinear convergence (α > 1)
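
A quick numerical illustration of this definition, assuming Newton's method on a 1-D root-finding problem (an assumption for illustration, not from the slides); the estimated α should approach 2:

```python
import math

# Newton's method for f(x) = x^2 - 2 (root at sqrt(2)); errors eps_i = |x_i - sqrt(2)|.
x, root = 3.0, math.sqrt(2.0)
errs = []
for _ in range(6):
    errs.append(abs(x - root))
    x = x - (x * x - 2.0) / (2.0 * x)

# |eps_{i+1}| = k |eps_i|^alpha  =>  alpha ~= log(eps_{i+2}/eps_{i+1}) / log(eps_{i+1}/eps_i)
for e0, e1, e2 in zip(errs, errs[1:], errs[2:]):
    print(math.log(e2 / e1) / math.log(e1 / e0))   # approaches 2: quadratic convergence
```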

Ascent/Descent Methods
At a maximum, D¹F (i.e., ∇F) = 0.
– Pick a direction of ascent/descent
– Find an approximate maximum in that direction: two possibilities
  – Calculate a step size that will approximately reach the maximum
  – In the search direction, find the actual max within some range

Gradient Ascent/Descent Methods
Direction of ascent/descent is ±D¹F
– If you move to the optimum in that direction, the next direction will be orthogonal to this one
  – Guarantees zigzag
  – Bad behavior for narrow ridges (valleys) of F
  – Linear convergence
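
A minimal sketch of the zigzag behavior, assuming an illustrative ill-conditioned quadratic F(p) = ½ pᵀAp with exact line search (the matrix, starting point, and names are assumptions for illustration):

```python
import numpy as np

# Illustrative ill-conditioned quadratic: F(p) = 0.5 p^T A p, so D1F(p) = A p.
A = np.diag([1.0, 50.0])      # elongated valley: curvatures differ by a factor of 50

def grad_F(p):
    return A @ p

p = np.array([10.0, 1.0])
for i in range(10):
    d = -grad_F(p)                      # steepest-descent direction
    alpha = (d @ d) / (d @ (A @ d))     # exact line search along d (quadratic case)
    p = p + alpha * d                   # next gradient is orthogonal to d: zigzag
    print(i, p)
```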

Newton and Secant Ascent/Descent Methods for F(p)
We are solving D¹F = 0
– Use a Newton or secant equation-solving method to solve it
  – Newton to solve f(p) = 0 iterates p_{i+1} = p_i − [D¹f(p_i)]⁻¹ f(p_i)
Newton
– Move from p to p − (D²F)⁻¹ D¹F
– Is the direction of ascent/descent the gradient direction D¹F? No: methods that ascend/descend in the D¹F (gradient) direction are inferior
– The real direction of ascent/descent is (D²F)⁻¹ D¹F, which also gives you the step size in that direction
Secant
– Same as Newton, except replace D²F and D¹F by discrete approximations built from this and the last n iterates
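
A minimal sketch of the Newton update p ← p − (D²F)⁻¹ D¹F, assuming the gradient and Hessian can be evaluated; the test function and all names are illustrative assumptions:

```python
import numpy as np

def newton_optimize(grad, hess, p, iters=20, tol=1e-10):
    """Solve D1F(p) = 0 by Newton's method: p <- p - (D2F)^-1 D1F."""
    for _ in range(iters):
        g = grad(p)
        if np.linalg.norm(g) < tol:
            break
        p = p - np.linalg.solve(hess(p), g)   # direction AND step size in one solve
    return p

# Illustrative test: F(p) = (p0 - 1)^2 + 10 (p1 + 2)^2, minimum at (1, -2)
grad = lambda p: np.array([2.0 * (p[0] - 1.0), 20.0 * (p[1] + 2.0)])
hess = lambda p: np.array([[2.0, 0.0], [0.0, 20.0]])
print(newton_optimize(grad, hess, np.zeros(2)))   # -> [ 1. -2.] in one step
```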

Conjugate gradient method
Preferable to gradient descent/ascent methods. Two major aspects:
– Successive directions for descent/ascent are conjugate: ⟨h_i, (D²F) h_{i+1}⟩ = 0 in the limit for convex F
  – If true at all steps (quadratic F), convergence in at most n steps, with n = dim(p)
  – Improvements available using more previous directions
– In the search direction, find the actual max/min within some range
  – Quadratic convergence depends on ⟨D¹F, h_i⟩ = 0, i.e., F at a local minimum in the h_i direction
References
– Shewchuk, An Introduction to the Conjugate Gradient Method Without the Agonizing Pain ( gradient.pdf)
– Numerical Recipes
– Polak, Computational Methods in Optimization, Academic Press
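
A minimal sketch of conjugate gradient on a quadratic F(p) = ½ pᵀAp − bᵀp, where the exact line search along each direction has a closed form; the matrix, right-hand side, and names are illustrative assumptions:

```python
import numpy as np

def conjugate_gradient(A, b, p, iters=None):
    """Minimize F(p) = 0.5 p^T A p - b^T p; the residual r = b - A p is -D1F."""
    r = b - A @ p
    h = r.copy()                       # first search direction = steepest descent
    for _ in range(iters or len(b)):
        Ah = A @ h
        alpha = (r @ r) / (h @ Ah)     # exact minimizer along h
        p = p + alpha * h
        r_new = r - alpha * Ah
        beta = (r_new @ r_new) / (r @ r)   # Fletcher-Reeves update
        h = r_new + beta * h           # new direction is A-conjugate to the old ones
        r = r_new
    return p

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b, np.zeros(2)))   # converges in n = 2 steps
print(np.linalg.solve(A, b))                   # same answer
```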

Conjugate gradient method issues
Preferable to gradient descent/ascent methods, but must find a local minimum in the search direction. Will have trouble with:
– Bumpy objective functions
– Extremely elongated minimum/maximum regions

Multiscale Gradient-Based Optimization
To avoid local optima:
– Smooth the objective function to put the initial estimate on the hillside of its global optimum
  – E.g., by using larger-scale measurements
– Find its optimum
– Iterate:
  – Decrease the scale of the objective function
  – Use the previous optimum as the starting point for the new optimization
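
A minimal sketch of this coarse-to-fine iteration on a 1-D bumpy objective; the objective, its closed-form Gaussian smoothing, and the scale schedule are illustrative assumptions:

```python
import math

# Illustrative bumpy objective: F(x) = 0.1 x^2 - cos(3x), with many local minima.
# Smoothing cos(3x) with a Gaussian N(0, sigma^2) scales it by exp(-9 sigma^2 / 2),
# so the smoothed gradient has a closed form (the x^2 term only gains a constant).
def grad_F_smoothed(x, sigma):
    return 0.2 * x + 3.0 * math.exp(-4.5 * sigma**2) * math.sin(3.0 * x)

x = 4.0                                   # initial estimate near a spurious local minimum
for sigma in [2.0, 1.0, 0.5, 0.25, 0.0]:  # decreasing scale; sigma = 0 is the original F
    for _ in range(200):                  # simple gradient descent at this scale
        x -= 0.1 * grad_F_smoothed(x, sigma)
    print(sigma, x)                       # previous optimum seeds the next scale; x -> 0
```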

Multiscale Gradient-Based Optimization: Example Methods
General methods
– Graduated non-convexity [Blake & Zisserman, 1987]
– Mean field annealing [Bilbro, Snyder, et al., 1992]
In image analysis
– Vary the degree of globality of the geometric representation

Optimization under Constraints by Lagrange Multiplier(s)
To optimize F(p) over p subject to g_i(p) = 0, i = 1, 2, …, N, with p having n parameters:
– Create the function F(p) + Σ_i λ_i g_i(p)
– Find a critical point for it over p and λ
  – Solve D¹_{p,λ}[F(p) + Σ_i λ_i g_i(p)] = 0
  – n + N equations in n + N unknowns
  – N of the equations are just g_i(p) = 0, i = 1, 2, …, N
– The critical point will need to be an optimum w.r.t. p
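
A small worked instance, assuming the illustrative problem of optimizing F(p) = p1 + p2 subject to g(p) = p1² + p2² − 1 = 0 (so n = 2, N = 1), with sympy used only to solve the stationarity system:

```python
import sympy as sp

p1, p2, lam = sp.symbols('p1 p2 lambda', real=True)
F = p1 + p2                           # objective
g = p1**2 + p2**2 - 1                 # constraint g(p) = 0 (the unit circle)

L = F + lam * g                       # F(p) + sum_i lambda_i g_i(p)
eqs = [sp.diff(L, v) for v in (p1, p2, lam)]   # n + N = 3 equations in 3 unknowns
# The dL/dlambda equation is just g(p) = 0; solving yields both critical points,
# the constrained max and min at p = +/-(1/sqrt(2), 1/sqrt(2)).
print(sp.solve(eqs, [p1, p2, lam]))
```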

Stochastic Methods
Needed when the objective function is bumpy, has many variables, or has a gradient that is hard to compute.
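
A minimal simulated annealing sketch (one of the stochastic methods listed in the outline), reusing the bumpy 1-D objective from the multiscale sketch; the cooling schedule and proposal width are illustrative assumptions:

```python
import math, random

random.seed(0)

def F(x):                               # bumpy objective; no gradient required
    return 0.1 * x * x - math.cos(3.0 * x)

x, fx = 4.0, F(4.0)
T = 2.0
while T > 1e-3:
    x_new = x + random.gauss(0.0, 0.5)  # random proposal
    f_new = F(x_new)
    # Always accept downhill moves; accept uphill moves with probability exp(-dF/T)
    if f_new < fx or random.random() < math.exp((fx - f_new) / T):
        x, fx = x_new, f_new
    T *= 0.99                           # geometric cooling schedule
print(x, fx)                            # typically lands near the global minimum at x = 0
```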