Downhill Simplex Search (Nelder-Mead Method)


Downhill Simplex Search (Nelder-Mead Method) J.-S. Roger Jang (張智星) jang@mirlab.org http://mirlab.org/jang MIR Lab, CSIE Dept. National Taiwan University 2019/4/11

Downhill Simplex Search (DSS) DSS: a heuristic search method that tries to find a minimizing point of a function of several variables. Proposed by John Nelder & Roger Mead in 1965. AKA the amoeba method or the Nelder-Mead method. Concept: use a simplex of n+1 points to explore an objective function with n inputs. (A simplex is an interval in 1D, a triangle in 2D, a tetrahedron in 3D, etc.) The simplex uses several operations to move toward and pinpoint the minimum: reflection, expansion, contraction, and shrink. Quiz!
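The choice of initial simplex is left open by the method itself. A common recipe (reportedly what MATLAB's fminsearch uses; sketched here in Python since the slides contain no code, and the step sizes below are assumptions) perturbs each coordinate of an initial guess in turn:

```python
def initial_simplex(x0, step=0.05, zero_step=0.00025):
    """Build n+1 starting vertices from an initial guess x0.

    Each of the n extra vertices perturbs one coordinate of x0:
    by 5% of the component, or by a small constant if it is zero
    (step sizes follow a commonly cited fminsearch-style recipe).
    """
    simplex = [list(x0)]
    for i in range(len(x0)):
        v = list(x0)
        v[i] = v[i] * (1 + step) if v[i] != 0 else zero_step
        simplex.append(v)
    return simplex

print(initial_simplex([1.0, 0.0]))
```

Any non-degenerate simplex works; this recipe simply guarantees the n+1 points span all n directions.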

Basic Operation: Reflection Reflect the worst point (the one with the highest function value) through the centroid of the remaining points; the location probed by the reflection step lies in the downhill direction.
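As a quick numerical sketch (Python, with made-up points not taken from the slides), the reflected point for a small 2-D simplex can be computed as:

```python
def reflect(simplex, f_values, R=1.0):
    """Reflect the worst vertex through the centroid of the rest."""
    worst = max(range(len(simplex)), key=lambda i: f_values[i])
    rest = [p for i, p in enumerate(simplex) if i != worst]
    n = len(rest)
    xm = [sum(c) / n for c in zip(*rest)]  # centroid of the n best points
    xw = simplex[worst]
    return [m + R * (m - w) for m, w in zip(xm, xw)]

# Best points (0,0) and (1,0), worst point (0,1): centroid is (0.5, 0),
# so with R = 1 the reflection lands at (1, -1).
print(reflect([[0, 0], [1, 0], [0, 1]], [0.0, 1.0, 5.0]))  # -> [1.0, -1.0]
```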

Basic Operation: Expansion If the location probed by the reflection step yielded the lowest value so far, try an expansion: probe a point even farther along the same direction.

Basic Operation: Contraction If the location probed by the reflection step still yielded the highest value, try a contraction: probe a point closer to the simplex.

Basic Operation: Shrink Shrink the simplex around the best point if all else fails.
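A self-contained sketch of all four probe points on a toy 2-D simplex (the points are our own numbers; the coefficients R = 1, E = 2, K = 0.5, S = 0.5 match the parameter slide that follows):

```python
# Toy 2-D simplex: best (0,0), next-best (1,0), worst (0,1)
best, second, worst = [0.0, 0.0], [1.0, 0.0], [0.0, 1.0]
R, E, K, S = 1.0, 2.0, 0.5, 0.5  # reflection / expansion / contraction / shrink

xm = [(b + s) / 2 for b, s in zip(best, second)]        # centroid of all but worst
xr = [m + R * (m - w) for m, w in zip(xm, worst)]       # reflection
xe = [m + E * (r - m) for m, r in zip(xm, xr)]          # expansion
xoc = [m + K * (r - m) for m, r in zip(xm, xr)]         # outside contraction
xic = [m - K * (m - w) for m, w in zip(xm, worst)]      # inside contraction
shrunk = [[b + S * (p - b) for b, p in zip(best, pt)]   # shrink toward best
          for pt in (second, worst)]
print(xr, xe, xoc, xic, shrunk)
```

Note how each operation is just a weighted combination of the centroid and one vertex; only the coefficient changes.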

Parameters for DSS R = 1 (reflection), E = 2 (expansion), K = 0.5 (contraction), S = 0.5 (shrinkage)

Steps in DSS (1/4) Sort by function value: order the vertices to satisfy f1 ≤ f2 ≤ … ≤ fn+1. Compute the centroid xm = (1/n) Σ xi of all the points except the worst, xn+1. Reflection: compute xr = xm + R(xm − xn+1) and evaluate fr = f(xr). If f1 ≤ fr < fn, accept xr and terminate the iteration.

Steps in DSS (2/4) Expansion: if fr < f1, calculate xe = xm + E(xr − xm) and evaluate fe = f(xe). If fe < fr, accept xe; otherwise accept xr. Terminate the iteration.

Steps in DSS (3/4) Contraction: if fr ≥ fn, perform a contraction between xm and the better of xr and xn+1. Outside: if fn ≤ fr < fn+1, calculate xoc = xm + K(xr − xm) and evaluate foc = f(xoc). If foc ≤ fr, accept xoc and terminate the iteration; otherwise do a shrink. Inside: if fr ≥ fn+1, calculate xic = xm − K(xm − xn+1) and evaluate fic = f(xic). If fic < fn+1, accept xic and terminate the iteration; otherwise do a shrink.

Steps in DSS (4/4) Shrink: evaluate f at the n points vi = x1 + S(xi − x1), i = 2, …, n+1. The vertices of the simplex at the next iteration are x1, v2, …, vn+1.
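The four steps above can be assembled into a minimal sketch (Python, since the slides contain no code; the initial-simplex recipe and the stopping rule are our own simple choices, not part of DSS itself):

```python
def nelder_mead(f, x0, R=1.0, E=2.0, K=0.5, S=0.5, max_iter=500, tol=1e-8):
    """Minimal downhill simplex search following the four DSS steps.

    Written for clarity, not efficiency (f is re-evaluated freely).
    """
    n = len(x0)
    # Initial simplex: x0 plus n vertices perturbed along each axis
    simplex = [list(x0)] + [
        [x0[j] + (0.1 if j == i else 0.0) for j in range(n)] for i in range(n)
    ]
    for _ in range(max_iter):
        simplex.sort(key=f)                  # step 1: best vertex first
        fvals = [f(x) for x in simplex]
        if abs(fvals[-1] - fvals[0]) < tol:  # simple stopping rule (our choice)
            break
        xw = simplex[-1]                     # worst vertex x_{n+1}
        xm = [sum(x[j] for x in simplex[:-1]) / n for j in range(n)]  # centroid
        xr = [xm[j] + R * (xm[j] - xw[j]) for j in range(n)]          # reflection
        fr = f(xr)
        if fvals[0] <= fr < fvals[-2]:       # f1 <= fr < fn: accept reflection
            simplex[-1] = xr
        elif fr < fvals[0]:                  # step 2: expansion
            xe = [xm[j] + E * (xr[j] - xm[j]) for j in range(n)]
            simplex[-1] = xe if f(xe) < fr else xr
        else:                                # step 3: contraction
            if fr < fvals[-1]:               # outside contraction
                xc = [xm[j] + K * (xr[j] - xm[j]) for j in range(n)]
                ok = f(xc) <= fr
            else:                            # inside contraction
                xc = [xm[j] - K * (xm[j] - xw[j]) for j in range(n)]
                ok = f(xc) < fvals[-1]
            if ok:
                simplex[-1] = xc
            else:                            # step 4: shrink toward the best
                x1 = simplex[0]
                simplex = [x1] + [
                    [x1[j] + S * (x[j] - x1[j]) for j in range(n)]
                    for x in simplex[1:]
                ]
    return min(simplex, key=f)

xmin = nelder_mead(lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2, [0.0, 0.0])
print(xmin)  # close to [1, 2]
```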

List of DSS Moves in 2D

Example of DSS (1/2)

Example of DSS (2/2)

MATLAB for Optimization Basic functions for optimization: fminbnd for a single variable, fminsearch for multiple variables (which uses DSS). For examples of using these functions, see Section 8.4, "Minima of Functions" (函數的極小值), of "MATLAB Programming: Advanced Topics" (MATLAB 程式設計進階篇); slides and a recording are available. For more options, try the Optimization Toolbox and the Global Optimization Toolbox.
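For readers working in Python rather than MATLAB, the closest counterpart to fminsearch is SciPy's minimize with method='Nelder-Mead' (assuming SciPy is installed; this library is not mentioned in the slides):

```python
from scipy.optimize import minimize

# Minimize a simple quadratic bowl with SciPy's Nelder-Mead
# implementation, SciPy's counterpart to MATLAB's fminsearch
res = minimize(lambda x: (x[0] - 1) ** 2 + (x[1] - 2) ** 2,
               x0=[0.0, 0.0], method='Nelder-Mead')
print(res.x)  # close to [1, 2]
```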

Summary of DSS Strengths: intuitive concept; easy to implement; no gradient or derivative needed; does not care about smoothness of the objective function. Weaknesses: slow; can get trapped in local minima. Hints: start with various initial guesses to avoid local minima; improvise your own version of DSS! Quiz!

References DSS in Wikipedia (note the many variants!). Some of our slides come from: http://physics.ujep.cz/~mmaly/vyuka/MPVT_II/Heuristiky/NelderMead.ppt http://www.cs.princeton.edu/courses/archive/spr06/cos323/notes/lecture03_optimization/cos323_s06_lecture03_optimization.ppt