Evolutionary Computational Intelligence


Evolutionary Computational Intelligence, Lecture 1: Basic Concepts. Ferrante Neri, University of Jyväskylä.

Introductory Example: Radio Tuning. The (e.g. angular) position of the radio knob is the candidate solution. We want a clear signal, that is: maximize the signal and minimize the background noise.

Optimization Problem. A candidate solution is a vector of decision (or design) variables; the variable bounds define the decision space. The objective function, or fitness function, assigns a value to each candidate solution (the behaviour of the fitness over the decision space is called the fitness landscape).
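To make these terms concrete, here is a minimal Python sketch (the two-variable bounds and the sphere objective are illustrative assumptions, not taken from the slides): a candidate solution is a vector of decision variables, the bounds define the decision space, and the fitness function assigns a value to every point of that space.

import numpy as np

# Hypothetical two-variable problem: the bounds define the decision space.
lower = np.array([-5.0, -5.0])
upper = np.array([5.0, 5.0])

def fitness(x):
    # Objective (fitness) function; the sphere function is used as a stand-in.
    return np.sum(x ** 2)

# A candidate solution: one point of the decision space, sampled within the bounds.
x = np.random.uniform(lower, upper)
print(x, fitness(x))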

Real-World Optimization Problems. Optimization problems are often easy to formulate but very hard to solve when they come from an application, because certain characterizing features can make them extremely challenging. These features are summarized in the following slides.

Highly nonlinear fitness function. Optimization problems are usually characterized by nonlinear functions. In real-world problems the physical phenomenon, due to its nature (e.g. saturation effects, or systems employing electronic components), cannot be approximated by a linear function even in limited areas of the decision space.

Highly multimodal fitness landscape. The fitness landscape often contains many local optima, many of which have an unsatisfactory performance (fitness value). Such landscapes are usually difficult to handle, since optimization algorithms that employ gradient-based information to choose the search direction can easily converge to a suboptimal basin of attraction. Basin of attraction: the set of points of the decision space that, when chosen as initial conditions, dynamically evolve towards a particular attractor.
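As an illustration of this point (the Rastrigin function, step size, and starting points below are assumptions chosen for the example, not taken from the slides), a plain steepest-descent iteration converges to whichever basin of attraction the initial point lies in:

import numpy as np

def rastrigin(x):
    # Highly multimodal test function: many local optima, global optimum at the origin.
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def gradient(x):
    # Analytical gradient of the Rastrigin function.
    return 2 * x + 20 * np.pi * np.sin(2 * np.pi * x)

def steepest_descent(x, step=0.001, iters=5000):
    # Gradient-based search: follows the local slope only.
    for _ in range(iters):
        x = x - step * gradient(x)
    return x

# Starting inside the global basin vs. inside a suboptimal basin of attraction.
print(steepest_descent(np.array([0.3, -0.2])))  # converges near the global optimum (0, 0)
print(steepest_descent(np.array([2.8, 3.1])))   # converges to a suboptimal local optimum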

Optimization in Noisy Environment. Uncertainties in optimization can be categorized into three classes. Noisy fitness function: noise in fitness evaluations may come from many different sources, such as sensory measurement errors or randomized simulations. Approximated fitness function: when the fitness function is very expensive to evaluate, or an analytical fitness function is not available, approximated fitness functions are often used instead; these approximated models implicitly introduce a noise, namely the (unknown) difference between the approximated and the real fitness value. Robustness: often, when a solution is implemented, the design variables or the environmental parameters are subject to perturbations or changes (e.g. control problems).
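A common countermeasure for the first class, shown here only as an illustrative Python sketch (the noise level, sample size, and sphere objective are assumptions, not from the slides), is explicit resampling: the noisy fitness is evaluated several times and the sample mean is used as the estimate, trading extra evaluations for a lower-variance fitness value.

import numpy as np

rng = np.random.default_rng(0)

def noisy_fitness(x, sigma=0.5):
    # True objective (sphere) corrupted by additive Gaussian measurement noise.
    return np.sum(x ** 2) + rng.normal(0.0, sigma)

def resampled_fitness(x, samples=20):
    # Average several noisy evaluations to reduce the variance of the estimate.
    return np.mean([noisy_fitness(x) for _ in range(samples)])

x = np.array([1.0, -2.0])
print(noisy_fitness(x))      # a single noisy evaluation
print(resampled_fitness(x))  # averaged estimate, closer to the true value 5.0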

Computationally expensive problems. Optimization problems can be computationally expensive for two reasons: a high-cardinality decision space (usually combinatorial) or a computationally expensive fitness function (e.g. the design of on-line electric drives).

Real-World Problems and Classical Methods. When such features are present in an optimization problem, the application of exact methods is usually infeasible since their hypotheses are not satisfied. Moreover, the application of classical deterministic algorithms is also questionable, since their use can easily lead to suboptimal solutions (e.g. a hill climber on highly multimodal functions) or return completely unreliable results (e.g. a deterministic optimizer in noisy environments).

Rosenbrock Algorithm (1960). Originating from a chemical application; well-defined decision space; no analytical expression and no derivatives available; a modification of a steepest descent method.

Rosenbrock Algorithm. The search is executed along each coordinate direction (orthogonal search). It continues by enlarging the step size along successful directions and reducing it along unsuccessful ones. The stage is stopped once a successful trial has been obtained in all the directions.

Rosenbrock Algorithm. Under these conditions, a new set of directions is determined by means of the Gram-Schmidt procedure and the search is started over.
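A simplified Python sketch of this logic follows (step sizes, expansion/contraction factors, and stopping rule are illustrative assumptions; this is not the original 1960 formulation): the search probes each current direction, grows or shrinks the step per direction, and rebuilds the direction set with Gram-Schmidt (here via a QR factorization) once every direction has seen both a success and a failure.

import numpy as np

def rosenbrock_method(f, x, step=0.5, expand=3.0, contract=-0.5, tol=1e-8, max_stages=100):
    # Simplified Rosenbrock rotating-coordinates search (illustrative sketch).
    n = len(x)
    directions = np.eye(n)              # start from the coordinate axes
    best = f(x)
    for _ in range(max_stages):
        steps = np.full(n, step)
        progress = np.zeros(n)          # accumulated successful move per direction
        success = np.zeros(n, dtype=bool)
        failure = np.zeros(n, dtype=bool)
        # Orthogonal search: probe each direction, enlarging the step on success
        # and reducing (and reversing) it on failure.
        for _ in range(200):
            if np.all(success & failure):
                break
            for i in range(n):
                trial = x + steps[i] * directions[i]
                f_trial = f(trial)
                if f_trial <= best:
                    x, best = trial, f_trial
                    progress[i] += steps[i]
                    steps[i] *= expand
                    success[i] = True
                else:
                    steps[i] *= contract
                    failure[i] = True
        if np.linalg.norm(progress) < tol:
            break
        # Build a new orthonormal direction set (Gram-Schmidt via QR) so that the
        # first direction points along the overall progress of the stage.
        a = np.array([np.sum(progress[i:, None] * directions[i:], axis=0) for i in range(n)])
        q, _ = np.linalg.qr(a.T)
        directions = q.T
    return x, best

# Example: minimizing the sphere function from an arbitrary start point.
x_best, f_best = rosenbrock_method(lambda v: np.sum(v ** 2), np.array([3.0, -2.0]))
print(x_best, f_best)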

Rosenbrock Algorithm (figure slide).

Hooke Jeeves Algorithm (1961). Inputs: an exploratory radius h, an initial candidate solution x, and an n × n exploratory direction matrix U (e.g. diag(w(1), w(2), ..., w(n)), where w(i) is the width of the range of variability of the i-th variable); U(i,:) denotes the i-th row of the matrix.

Hooke Jeeves Algorithm. Exploratory Move: samples the solutions x(i) + hU(i,:) ("+" move) for i = 1, 2, ..., n and the solutions x(i) − hU(i,:) ("-" move), the latter only along those directions that turned out unsuccessful during the "+" move. Directions are analyzed separately!

Hooke Jeeves Algorithm. Pattern Move: an aggressive attempt of the algorithm to exploit promising search directions. Rather than centering the following exploration at the most promising candidate solution found so far, the HJA tries to move further: it makes a double step and centers the subsequent exploratory move there. If this second exploratory move fails, the algorithm steps back and performs the exploratory move from the previous point.
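The following Python sketch condenses both moves (U is taken as the identity matrix and the radius-reduction factor is an assumption; this is a simplification, not the exact original formulation):

import numpy as np

def hooke_jeeves(f, x, h=0.5, shrink=0.5, tol=1e-8, max_iter=10000):
    # Simplified Hooke-Jeeves pattern search (U = identity, illustrative parameters).
    n = len(x)
    x = np.asarray(x, dtype=float)

    def exploratory_move(base, f_base):
        # Probe +h along each coordinate; probe -h only if the "+" move failed.
        x_new, f_new = base.copy(), f_base
        for i in range(n):
            for sign in (+1.0, -1.0):
                trial = x_new.copy()
                trial[i] += sign * h
                f_trial = f(trial)
                if f_trial < f_new:
                    x_new, f_new = trial, f_trial
                    break
        return x_new, f_new

    fx = f(x)
    for _ in range(max_iter):
        x_e, f_e = exploratory_move(x, fx)
        if f_e < fx:
            # Pattern move: double step along the promising direction x_e - x,
            # then centre a new exploratory move at the pattern point.
            x_p = x_e + (x_e - x)
            x_pe, f_pe = exploratory_move(x_p, f(x_p))
            if f_pe < f_e:
                x, fx = x_pe, f_pe
            else:
                x, fx = x_e, f_e   # step back: keep the plain exploratory result
        else:
            h *= shrink            # no improvement: reduce the exploratory radius
            if h < tol:
                break
    return x, fx

# Example: minimizing the sphere function.
x_best, f_best = hooke_jeeves(lambda v: np.sum(v ** 2), [2.3, -1.7])
print(x_best, f_best)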

Hooke Jeeves Algorithm (figure slide).

Nelder Mead Algorithm (1965). Works on a set of n + 1 solutions and, in order to perform the local search, employs an exploratory logic based on the dynamic construction of a polyhedron (simplex). The n + 1 solutions x0, x1, ..., xn are sorted from best to worst according to their fitness values (i.e. x0 is the best); the NMA attempts to improve xn.

Nelder Mead Algorithm. The centroid xm of the best n points is calculated: xm = (1/n) Σ_{i=0..n−1} xi. 1st step: reflection, xr = xm + α(xm − xn) (typically α = 1). If the reflected point outperforms x0, replacement occurs.

Nelder Mead Algorithm. 2nd step: expansion. If the reflection was successful (i.e. the reflected point is better than x0), the expanded point xe = xm + γ(xr − xm) (typically γ = 2) is calculated; if the expansion is also successful, a new replacement occurs.

Nelder Mead Algorithm. If xr did not improve upon x0: if f(xr) < f(xn−1), then xr replaces xn. If this trial is also unsuccessful but f(xr) < f(xn), xr replaces xn and the 3rd step, outside contraction, is performed.

Nelder Mead Algorithm. If xr does not outperform even xn, the 4th step, inside contraction, is performed: if the contraction was successful, then xc replaces xn.

Nelder Mead Algorithm. If there is no way to improve x0, the 5th step, shrinking, is performed: n new points are sampled and the process is started over.
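The five steps can be condensed into the following Python sketch (written for minimization, with the standard coefficients α = 1, γ = 2, ρ = 0.5, σ = 0.5 assumed rather than taken from the slides):

import numpy as np

def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    # Condensed Nelder-Mead simplex sketch (standard coefficients assumed).
    alpha, gamma, rho, sigma = 1.0, 2.0, 0.5, 0.5
    n = len(x0)
    # Initial simplex: x0 plus one point perturbed along each coordinate.
    simplex = [np.array(x0, dtype=float)]
    for i in range(n):
        p = np.array(x0, dtype=float)
        p[i] += step
        simplex.append(p)
    simplex = np.array(simplex)
    fvals = np.array([f(p) for p in simplex])

    for _ in range(max_iter):
        order = np.argsort(fvals)                 # ascending: best first (minimization)
        simplex, fvals = simplex[order], fvals[order]
        if fvals[-1] - fvals[0] < tol:
            break
        centroid = np.mean(simplex[:-1], axis=0)  # centroid of all points but the worst

        # 1) Reflection
        xr = centroid + alpha * (centroid - simplex[-1])
        fr = f(xr)
        if fr < fvals[0]:
            # 2) Expansion
            xe = centroid + gamma * (xr - centroid)
            fe = f(xe)
            simplex[-1], fvals[-1] = (xe, fe) if fe < fr else (xr, fr)
        elif fr < fvals[-2]:
            simplex[-1], fvals[-1] = xr, fr
        else:
            # 3)/4) Contraction: outside if xr beats the worst point, inside otherwise
            if fr < fvals[-1]:
                xc = centroid + rho * (xr - centroid)
            else:
                xc = centroid - rho * (centroid - simplex[-1])
            fc = f(xc)
            if fc < min(fr, fvals[-1]):
                simplex[-1], fvals[-1] = xc, fc
            else:
                # 5) Shrinking: resample all points towards the best one
                simplex[1:] = simplex[0] + sigma * (simplex[1:] - simplex[0])
                fvals[1:] = [f(p) for p in simplex[1:]]

    order = np.argsort(fvals)
    return simplex[order][0], fvals[order][0]

# Example: minimizing the sphere function.
x_best, f_best = nelder_mead(lambda v: np.sum(v ** 2), [2.0, 3.0])
print(x_best, f_best)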

Comparative Analysis. The three algorithms require neither derivatives nor an explicit analytical expression of the fitness. Rosenbrock and Hooke Jeeves are fully deterministic, while Nelder Mead has some randomness. Rosenbrock and Nelder Mead can move in the space along all the directions simultaneously (e.g. diagonally in 2D), while Hooke Jeeves moves along one direction at a time.

Fundamental Points in Comparative Analysis. Rosenbrock and Hooke Jeeves have a mathematically proven convergence, while Nelder Mead does not! Rosenbrock and Hooke Jeeves have "local properties", while Nelder Mead has "global properties".

Two-Phase Nozzle Design (Experimental). Experimental design optimisation: optimise efficiency. (Figure: the initial design evolves into the final design, giving a 32% improvement in efficiency.)