Local search algorithms
Most local search algorithms use derivatives to guide the search. For differentiable functions it has been shown that, even if you need to calculate the derivatives by finite differences, derivative-based algorithms are better than ones that do not use derivatives. However, we will start with the Nelder-Mead sequential simplex algorithm, which can deal with non-differentiable functions and even with some discontinuities.
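To make the finite-difference option above concrete, here is a minimal Python sketch of a central-difference gradient (the test function and step size are illustrative, not from the slides):

```python
def fd_gradient(f, x, h=1e-6):
    """Approximate the gradient of f at x by central differences."""
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h          # step forward in coordinate i
        xm[i] -= h          # step backward in coordinate i
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

# Example: f(x, y) = x^2 + 3y; the exact gradient at (1, 2) is (2, 3)
g = fd_gradient(lambda v: v[0]**2 + 3*v[1], [1.0, 2.0])
```

A derivative-based method can then use such a gradient whenever analytic derivatives are unavailable.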

Nelder-Mead Sequential Simplex Algorithm
An old local-search algorithm that contains the ingredients of modern search techniques:
1. No derivatives
2. Population based
3. Simple idea that does not require much mathematics
It is the basis for Matlab's fminsearch, a testimonial to its robustness. A simplex is the simplest body in n dimensions: it has n+1 vertices (a triangle in 2D, a tetrahedron in 3D).

Sequential Simplex Method (mostly from Wikipedia and Matlab)
1. In n-dimensional space, start with the n+1 vertices of a selected simplex, evaluate the function there, and order the points by function value, so that x_1 is the best and x_(n+1) is the worst.
2. Calculate x_0, the center of gravity (centroid) of all the points except x_(n+1).
3. Reflect x_(n+1) about x_0 to obtain a new point.
4. If the new point is better than the 2nd worst, but not better than the best, use it to replace the worst point and go back to step 1.
Read about expansion and contraction on the next slides.

Expansion and contraction
Expansion (new point is the best so far): move further along the reflection direction.
Contraction (new point is worse than the 2nd worst): try a point between the centroid and the worst point.

Reduction (shrinkage)
If the contracted point is not better than the worst point, then contract all points about the best one.
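The reflection, expansion, contraction, and reduction steps described on the last few slides can be collected into a minimal Python sketch. This is an illustration of the idea, not Matlab's fminsearch implementation; the coefficients (1, 2, 0.5, 0.5) are the conventional defaults:

```python
def nelder_mead(f, simplex, alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5, iters=200):
    """Minimal Nelder-Mead sketch; simplex is a list of n+1 points in n-D."""
    for _ in range(iters):
        simplex.sort(key=f)                  # order vertices by function value
        best, worst = simplex[0], simplex[-1]
        n = len(best)
        # centroid of all vertices except the worst
        x0 = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        # reflect the worst vertex about the centroid
        xr = [x0[i] + alpha * (x0[i] - worst[i]) for i in range(n)]
        if f(best) <= f(xr) < f(simplex[-2]):
            simplex[-1] = xr                 # accept the reflected point
        elif f(xr) < f(best):                # expansion: reflected point is best
            xe = [x0[i] + gamma * (xr[i] - x0[i]) for i in range(n)]
            simplex[-1] = xe if f(xe) < f(xr) else xr
        else:                                # contraction: worse than 2nd worst
            xc = [x0[i] + rho * (worst[i] - x0[i]) for i in range(n)]
            if f(xc) < f(worst):
                simplex[-1] = xc
            else:                            # reduction: shrink toward the best
                simplex = [best] + [
                    [best[i] + sigma * (p[i] - best[i]) for i in range(n)]
                    for p in simplex[1:]]
    return min(simplex, key=f)

# Quick check on a smooth quadratic with minimum at (1, 2)
best = nelder_mead(lambda p: (p[0] - 1)**2 + (p[1] - 2)**2,
                   [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
```

Note that the only information the algorithm ever uses is the ordering of the function values, which is why it tolerates non-differentiable functions.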

Problems: Nelder-Mead operators
1. For a 2D problem, the current simplex has f(0,1)=3, f(0,0)=1, f(1,0)=2. Where will you evaluate f next?
2. If the next two points gave us function values of 4 and 5, respectively, where will you evaluate the function next?
3. If instead, the next point gave us a function value of 0, where will you evaluate the function next?
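For problem 1, the arithmetic of the reflection step can be checked directly: the worst vertex is reflected about the centroid of the other two. A short Python sketch of that computation:

```python
# vertices of the current simplex with their function values (problem 1)
points = {(0, 1): 3, (0, 0): 1, (1, 0): 2}

worst = max(points, key=points.get)        # vertex with the highest f
others = [p for p in points if p != worst]

# centroid of the remaining vertices
centroid = tuple(sum(c) / len(others) for c in zip(*others))

# reflection of the worst vertex about the centroid: x_r = 2*x_0 - x_worst
reflected = tuple(2 * c - w for c, w in zip(centroid, worst))
```

Running this reproduces the reflection step: the worst point (0,1) is reflected about the centroid (0.5, 0) of the two better points.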

Rosenbrock banana function
Various versions exist; this one is from Wikipedia (minimum at (1,1)):
f(x) = 100*(x2 - x1^2)^2 + (1 - x1)^2

Generating the first 20 points for the banana function

function [y] = banana(x)
global z1 z2 yg count
y = 100*(x(2)-x(1)^2)^2 + (1-x(1))^2;
z1(count) = x(1);
z2(count) = x(2);
yg(count) = y;
count = count + 1;

Driver script:
global z1 z2 yg count
count = 1;
options = optimset('MaxFunEvals',20);
[x,fval] = fminsearch(@banana, [... 1], options);  % first component of the starting point was lost in the transcript
mat = [z1; z2; yg]  % the recorded evaluations (the numeric columns of the output are omitted here)
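The same evaluation bookkeeping can be sketched in Python with a closure instead of the global variables used in the Matlab code (illustrative only; any optimizer that calls the wrapped function will fill the history):

```python
def banana(x):
    """Rosenbrock banana function, same formula as the Matlab version."""
    return 100 * (x[1] - x[0]**2)**2 + (1 - x[0])**2

def make_logged(f):
    """Wrap f so every evaluation (x1, x2, y) is recorded in 'history'."""
    history = []
    def wrapped(x):
        y = f(x)
        history.append((x[0], x[1], y))
        return y
    return wrapped, history

logged, history = make_logged(banana)
logged([-1.2, 1.0])   # each call appends one row to history
logged([1.0, 1.0])    # the minimum: f(1,1) = 0
```

Passing `logged` to an optimizer in place of `banana` then records every function evaluation the search makes, which is exactly what the global arrays z1, z2, yg do above.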

Reflection and expansion

Next iteration

Complete search examples from Wikipedia
http://en.wikipedia.org/wiki/Nelder-Mead_method
Shows progress for the banana function as well as for Himmelblau's function.

Problem: fminsearch
Track and plot the first few iterations of fminsearch on Himmelblau's function starting from (1,1).
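As a starting point for this problem, here is Himmelblau's function in Python, with its four commonly quoted minima for checking where the search ends up (the non-integer coordinates are approximate standard values, not computed here):

```python
def himmelblau(x):
    """Himmelblau's function: four local minima, all with f = 0."""
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

# approximate locations of the four minima
minima = [(3.0, 2.0), (-2.805118, 3.131312),
          (-3.779310, -3.283186), (3.584428, -1.848126)]
```

Wrapping `himmelblau` with an evaluation logger, as shown for the banana function, gives the sequence of points to plot.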