Evolutionary Computational Intelligence
Lecture 1: Basic Concepts
Ferrante Neri, University of Jyväskylä
Introductory Example: Radio Tuning
The position (e.g. angular) of the radio knob is the candidate solution.
We want a clear signal, that is:
Maximize the signal
Minimize the background noise
Optimization Problem
Candidate solution: a vector of decision (or design) variables
Variable bounds define the decision space
Objective function, or fitness function, assigns a value to each candidate solution (the behaviour taken by the fitness over the decision space is called the fitness landscape)
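As a minimal illustration of these concepts, the sketch below encodes a two-variable problem; the bounds and the sphere fitness function are assumptions chosen for the example, not taken from the slides.

```python
import numpy as np

# Illustrative two-variable problem: bounds and fitness are assumptions for this example.
lower = np.array([-5.0, -5.0])   # lower variable bounds
upper = np.array([ 5.0,  5.0])   # upper variable bounds; together they define the decision space

def fitness(x):
    """Sphere function: a simple fitness (objective) function to be minimized."""
    return float(np.sum(np.asarray(x) ** 2))

x = np.array([1.0, -2.0])        # a candidate solution (vector of decision variables)
print(fitness(x))                # its fitness value: 5.0
```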
Real-World Optimization Problems
Optimization problems are often rather easy to formulate but very hard to solve when the problem comes from an application. In fact, some features characterizing the problem can make it extremely challenging. These features are summarized in the following slides.
Highly nonlinear fitness function
Optimization problems are usually characterized by nonlinear fitness functions. In real-world optimization problems, the physical phenomenon, due to its nature (e.g. saturation phenomena, or systems which employ electronic components), cannot be approximated by a linear function, not even in limited areas of the decision space.
Highly multimodal fitness landscape
It often happens that the fitness landscape contains many local optima, and that many of these have an unsatisfactory performance (fitness value).
Such fitness landscapes are usually rather difficult to handle, since optimization algorithms which employ gradient-based information to detect the search direction can easily converge to a suboptimal basin of attraction.
Basin of attraction: the set of points of the decision space which, taken as initial conditions, dynamically evolve towards a particular attractor.
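As an illustration, the Rastrigin function is a standard example of a highly multimodal landscape; the slide does not name a specific function, so this choice is purely for demonstration.

```python
import numpy as np

def rastrigin(x):
    """Rastrigin function: a classic highly multimodal fitness landscape.

    A single global optimum at the origin is surrounded by a regular grid of
    local optima, so a gradient-based descent started from a random point is
    likely to end up in a suboptimal basin of attraction.
    """
    x = np.asarray(x, dtype=float)
    return float(10.0 * x.size + np.sum(x ** 2 - 10.0 * np.cos(2.0 * np.pi * x)))

print(rastrigin([0.0, 0.0]))   # global optimum: 0.0
print(rastrigin([1.0, 1.0]))   # close to one of the many local optima: 2.0
```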
Optimization in Noisy Environments
Uncertainties in optimization can be categorized into three classes:
Noisy fitness function. Noise in fitness evaluations may come from many different sources, such as sensory measurement errors or randomized simulations.
Approximated fitness function. When the fitness function is very expensive to evaluate, or an analytical fitness function is not available, an approximated fitness function is often used instead. These approximated models implicitly introduce a noise, namely the difference between the approximated value and the real fitness value, which is unknown.
Robustness. Often, when a solution is implemented, the design variables or the environmental parameters are subject to perturbations or changes (e.g. control problems).
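A minimal sketch of the first class, a noisy fitness function, together with the common countermeasure of averaging repeated evaluations; the Gaussian noise model and the sample size are assumptions, not taken from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_fitness(x):
    """Underlying fitness (unknown in practice): a simple sphere function."""
    return float(np.sum(np.asarray(x) ** 2))

def noisy_fitness(x, sigma=0.5):
    """Each evaluation is corrupted by additive Gaussian noise (an assumed model)."""
    return true_fitness(x) + rng.normal(0.0, sigma)

def averaged_fitness(x, samples=20):
    """Re-sampling: average several noisy evaluations to reduce the uncertainty."""
    return float(np.mean([noisy_fitness(x) for _ in range(samples)]))

x = [1.0, 1.0]
print(true_fitness(x), noisy_fitness(x), averaged_fitness(x))
```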
Computationally expensive problems
Optimization problems can be computationally expensive for two reasons:
a decision space of high cardinality (usually combinatorial)
a computationally expensive fitness function (e.g. the design of on-line electric drives)
Real-World Problems and Classical Methods
When such features are present in an optimization problem, the application of exact methods is usually infeasible, since their hypotheses are not satisfied. Moreover, the application of classical deterministic algorithms is also questionable, since their use could easily lead to suboptimal solutions (e.g. a hill climber on a highly multimodal function) or return completely unreliable results (e.g. a deterministic optimizer in a noisy environment).
Rosenbrock Algorithm (1960)
Originated from a chemical application
Well-defined decision space
No analytical expression and no derivatives required
A modification of the steepest descent method
Rosenbrock Algorithm
The search is executed along each coordinate direction (orthogonal search)
The search continues by enlarging the step size along successful directions and reducing it along unsuccessful ones
The current stage of the search stops once the trial has been successful in all the directions
Rosenbrock Algorithm
Under these conditions, a new set of directions is determined by means of the Gram-Schmidt procedure, and the search starts over
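A compact sketch of the rotating-directions procedure for minimization; the expansion factor 3, contraction factor -0.5, and the stopping rule are common textbook choices, not stated on the slides.

```python
import numpy as np

def rosenbrock_method(f, x0, step=0.5, alpha=3.0, beta=-0.5, max_iter=100, tol=1e-8):
    """Rosenbrock's rotating-directions search (minimization); parameters are illustrative."""
    n = len(x0)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    D = np.eye(n)                           # current orthonormal search directions (rows)
    d = np.full(n, step)                    # step size along each direction
    for _ in range(max_iter):
        lam = np.zeros(n)                   # accumulated successful displacement per direction
        success = np.zeros(n, dtype=bool)
        fail = np.zeros(n, dtype=bool)
        for _ in range(100):                # one stage: explore the directions cyclically
            if np.all(success & fail):      # every direction had a success followed by a failure
                break
            for i in range(n):
                trial = x + d[i] * D[i]
                ft = f(trial)
                if ft <= fx:                # success: accept and enlarge the step
                    x, fx = trial, ft
                    lam[i] += d[i]
                    d[i] *= alpha
                    success[i] = True
                else:                       # failure: reverse and shrink the step
                    d[i] *= beta
                    fail[i] = True
        if np.linalg.norm(lam) < tol:
            break
        # Gram-Schmidt: build a new orthonormal basis whose first axis follows the overall move
        A = [np.sum(lam[i:, None] * D[i:], axis=0) for i in range(n)]
        Q = []
        for a in A:
            for q in Q:
                a = a - np.dot(a, q) * q
            norm = np.linalg.norm(a)
            if norm > 1e-12:
                Q.append(a / norm)
        if len(Q) == n:                     # keep the old basis if the new one is degenerate
            D = np.array(Q)
        d = np.full(n, step)
    return x, fx

# Usage on a simple quadratic fitness
best_x, best_f = rosenbrock_method(lambda v: float(np.sum((v - 1.0) ** 2)), [3.0, -2.0])
print(best_x, best_f)
```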
Hooke Jeeves Algorithm (1961)
Exploratory radius h and an initial candidate solution x
An n × n direction exploratory matrix U (e.g. diag(w(1), w(2), ..., w(n)), where w(i) is the width of the range of variability of the i-th variable)
U(i,:) denotes the i-th row of the matrix
Hooke Jeeves Algorithm
Exploratory Move: samples the solutions x + hU(i,:) ("+" move) with i = 1, 2, ..., n, and the solutions x - hU(i,:) ("-" move) only along those directions which turned out unsuccessful during the "+" move. Directions are analyzed separately!
Hooke Jeeves Algorithm
Pattern Move: an aggressive attempt of the algorithm to exploit promising search directions. Rather than centering the following exploration at the most promising explored candidate solution, the HJA tries to move further: it makes a double step and centers the subsequent exploratory move at the resulting point. If this second exploratory move fails, the algorithm steps back and repeats the exploratory move from the previous point. A compact sketch of both moves follows.
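A minimal sketch of the exploratory and pattern moves for minimization; the identity matrix for U, the halving rule for the radius h, and the stopping threshold are common choices assumed here, not taken from the slides.

```python
import numpy as np

def exploratory_move(f, x, fx, h, U):
    """Perturb the current point along each row of U separately ('+' move, then '-' move)."""
    x = x.copy()
    for i in range(len(x)):
        trial = x + h * U[i]          # "+" move along the i-th direction
        ft = f(trial)
        if ft >= fx:                  # unsuccessful: try the "-" move
            trial = x - h * U[i]
            ft = f(trial)
        if ft < fx:
            x, fx = trial, ft
    return x, fx

def hooke_jeeves(f, x0, h=0.5, h_min=1e-6, shrink=0.5, max_iter=10_000):
    """Hooke-Jeeves direct search (minimization); parameter values are illustrative."""
    n = len(x0)
    U = np.eye(n)                     # coordinate directions as the exploratory matrix
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    for _ in range(max_iter):
        xe, fe = exploratory_move(f, x, fx, h, U)
        if fe < fx:
            # Pattern move: double step through the improved point, then explore there
            xp = xe + (xe - x)
            xq, fq = exploratory_move(f, xp, f(xp), h, U)
            if fq < fe:
                x, fx = xq, fq        # pattern move succeeded
            else:
                x, fx = xe, fe        # step back to the exploratory result
        else:
            h *= shrink               # no improvement: reduce the exploratory radius
            if h < h_min:
                break
    return x, fx

# Usage on a simple quadratic fitness
best_x, best_f = hooke_jeeves(lambda v: float(np.sum((v - 1.0) ** 2)), [4.0, -3.0])
print(best_x, best_f)
```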
Nelder Mead Algorithm (1965)
Works on a set of n + 1 solutions; in order to perform the local search, it employs an exploratory logic based on the dynamic construction of a polyhedron (simplex)
The n + 1 solutions x0, x1, ..., xn are sorted according to their fitness values (i.e. x0 is the best and xn the worst); the NMA attempts to improve xn
Nelder Mead Algorithm
The centroid xm of the points x0, ..., xn−1 is calculated
1st step: reflection, generating the point xr by reflecting xn through xm. If the reflected point outperforms x0, replacement occurs
Nelder Mead Algorithm
2nd step: expansion. If the reflection was successful (i.e. the reflected point is better than x0), the algorithm calculates an expanded point further along the same direction; if the expansion is also successful, a new replacement occurs
Nelder Mead Algorithm
If xr did not improve upon x0: if f(xr) < f(xn−1), then xr replaces xn. If this trial is also unsuccessful: if f(xr) < f(xn), xr replaces xn and the 3rd step, outside contraction, is performed
Nelder Mead Algorithm
If xr does not outperform even xn, then the 4th step, inside contraction, is performed: if the contraction is successful, the contracted point xc replaces xn
Nelder Mead Algorithm
If there is no way to improve x0, the 5th step, shrinking, is performed: n new points are sampled and the process starts over. A compact sketch of the whole procedure follows
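A compact sketch of the whole procedure for minimization; the coefficients (reflection 1, expansion 2, contraction 0.5, shrink 0.5) and the stopping rule are the usual defaults, assumed here rather than taken from the slides.

```python
import numpy as np

def nelder_mead(f, simplex, max_iter=500, tol=1e-8,
                alpha=1.0, gamma=2.0, rho=0.5, sigma=0.5):
    """Nelder-Mead simplex search (minimization) over n + 1 initial points."""
    pts = [np.asarray(p, dtype=float) for p in simplex]
    for _ in range(max_iter):
        pts.sort(key=f)                       # x0 is the best point, the last one (xn) the worst
        best, worst, second_worst = pts[0], pts[-1], pts[-2]
        if abs(f(worst) - f(best)) < tol:
            break
        xm = np.mean(pts[:-1], axis=0)        # centroid of the n best points
        xr = xm + alpha * (xm - worst)        # 1st step: reflection
        if f(xr) < f(best):
            xe = xm + gamma * (xr - xm)       # 2nd step: expansion
            pts[-1] = xe if f(xe) < f(xr) else xr
        elif f(xr) < f(second_worst):
            pts[-1] = xr                      # reflected point is acceptable
        elif f(xr) < f(worst):
            xc = xm + rho * (xr - xm)         # 3rd step: outside contraction
            pts[-1] = xc if f(xc) <= f(xr) else xr
        else:
            xc = xm - rho * (xm - worst)      # 4th step: inside contraction
            if f(xc) < f(worst):
                pts[-1] = xc
            else:                             # 5th step: shrink towards the best point
                pts = [best + sigma * (p - best) for p in pts]
    pts.sort(key=f)
    return pts[0], f(pts[0])

# Usage: an initial simplex of n + 1 = 3 points for a 2-variable quadratic fitness
simplex0 = [np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])]
best_x, best_f = nelder_mead(lambda v: float(np.sum((v - 2.0) ** 2)), simplex0)
print(best_x, best_f)
```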
Comparative Analysis
The three algorithms do not require derivatives, nor an explicit analytical expression of the fitness function
Rosenbrock and Hooke Jeeves are fully deterministic, while Nelder Mead has some randomness
Rosenbrock and Nelder Mead move in the space along all the directions simultaneously (e.g. diagonally in 2D), while Hooke Jeeves moves along one direction at a time
Fundamental Points in Comparative Analysis
Rosenbrock and Hooke Jeeves have a mathematically proven convergence, while Nelder Mead does not!
Rosenbrock and Hooke Jeeves have “local properties”, while Nelder Mead has “global properties”
Two-Phase Nozzle Design (Experimental)
Experimental design optimisation: optimise efficiency.
[Figure: the nozzle design evolves from the initial design to the final design]
Final design: 32% improvement in efficiency.