“Hard” Optimization Problems


“Hard” Optimization Problems
Goal: find x* = argmin_{x in S} f(x), where S is often multi-dimensional; real-valued or binary. Many classes of optimization problems (and algorithms) exist. When might it be worthwhile to consider metaheuristic or machine learning approaches?
Marcus Gallagher - MASCOS Symposium, 26/11/04

- Finding an “exact” solution is intractable.
- Limited knowledge of f():
  - No derivative information.
  - May be discontinuous, noisy, …
- Evaluating f() is expensive in terms of time or cost.
- f() is known or suspected to contain nasty features:
  - Many local minima, plateaus, ravines.
- The search space is high-dimensional.
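A standard benchmark that exhibits several of these features at once is the Rastrigin function: it has a regular grid of local minima, and adding noise to its evaluation mimics an expensive, noisy black box. A minimal sketch (the function names and noise level are illustrative choices, not from the slides):

```python
import math
import random

def rastrigin(x):
    """Classic multimodal test function: global minimum 0 at the origin,
    surrounded by many local minima induced by the cosine term."""
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

def noisy_rastrigin(x, sigma=0.1, rng=random):
    """Rastrigin with additive Gaussian noise: every evaluation of f()
    returns a slightly different value, as with a noisy black box."""
    return rastrigin(x) + rng.gauss(0.0, sigma)
```

Derivative-free methods are attractive here precisely because the noisy version gives no reliable gradient information.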

What is the “practical” goal of (global) optimization? “There exists a goal (e.g. to find as small a value of f() as possible), there exist resources (e.g. some number of trials), and the problem is how to use these resources in an optimal way.”
A. Törn and A. Žilinskas, Global Optimization. Springer-Verlag, 1989. Lecture Notes in Computer Science, Vol. 350.

Heuristics
Heuristic (or approximate) algorithms aim to find a good solution to a problem in a reasonable amount of computation time, but with no guarantee of “goodness” or “efficiency” (cf. exact or complete algorithms). Broad classes of heuristics:
- Constructive methods
- Local search methods
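The local search class can be sketched as a greedy hill climber: repeatedly move to the best improving neighbour and stop at a local optimum. The function names and the ONEMAX example below are illustrative, not taken from the slides:

```python
import random

def local_search(f, x, neighbors, max_iters=1000):
    """Greedy local search (minimisation): move to the best improving
    neighbour; stop at a local optimum or after max_iters passes."""
    for _ in range(max_iters):
        best = min(neighbors(x), key=f, default=None)
        if best is None or f(best) >= f(x):
            return x  # no improving neighbour: local optimum
        x = best
    return x

def bit_flip_neighbors(x):
    """All solutions at Hamming distance 1 from a binary string x."""
    return [x[:i] + (1 - x[i],) + x[i + 1:] for i in range(len(x))]

# Minimise the number of zeros (i.e. solve ONEMAX) from a random start.
rng = random.Random(0)
start = tuple(rng.randint(0, 1) for _ in range(10))
result = local_search(lambda x: x.count(0), start, bit_flip_neighbors)
```

On a problem with many local minima (unlike ONEMAX), this procedure stops at the first local optimum it reaches, which motivates the metaheuristics discussed next.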

Metaheuristics
Metaheuristics are (roughly) high-level strategies that combine lower-level techniques for exploration and exploitation of the search space. It is an overarching term for algorithms including Evolutionary Algorithms, Simulated Annealing, Tabu Search, Ant Colony, Particle Swarm, Cross-Entropy, …
C. Blum and A. Roli. Metaheuristics in Combinatorial Optimization: Overview and Conceptual Comparison. ACM Computing Surveys, 35(3), 2003, pp. 268-308.
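Simulated Annealing, one of the metaheuristics listed above, illustrates the exploration/exploitation trade-off directly: worse moves are accepted with probability exp(-delta/T), and the temperature T is cooled over time. A minimal sketch, with the 1-D objective, parameter values and cooling schedule chosen purely for illustration:

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0=1.0, cooling=0.995, steps=5000, seed=0):
    """Minimal simulated annealing for minimisation: always accept
    improvements; accept worse moves with probability exp(-delta/T),
    cooling T geometrically each step."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best, fbest = x, fx
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = f(y)
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy          # move (possibly uphill)
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling               # reduce the temperature
    return best, fbest

# Hypothetical 1-D objective with many local minima.
f = lambda x: x * x + 10 * math.sin(5 * x)
step = lambda x, rng: x + rng.gauss(0.0, 0.5)
best, fbest = simulated_annealing(f, 5.0, step)
```

At high T the algorithm explores almost freely; as T falls it degenerates into the greedy local search sketched earlier.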

Learning/Modelling for Optimization
Most optimization algorithms make some (explicit or implicit) assumptions about the nature of f(). Many algorithms vary their behaviour during execution (e.g. simulated annealing). In some optimization algorithms the search is adaptive:
- Future search points evaluated depend on previous points searched (and/or their f() values, derivatives of f(), etc.).
Learning/modelling can be implicit (e.g., adapting the step size in gradient descent, or the population in an EA).
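A concrete example of implicit step-size adaptation is the (1+1) evolution strategy with the classical 1/5th success rule: the mutation step size grows when offspring keep improving and shrinks when they don't, so the algorithm learns an appropriate scale from its own search history. A sketch under illustrative parameter choices (the multipliers 1.1/0.98 and the sphere test function are assumptions, not from the slides):

```python
import random

def one_plus_one_es(f, x0, sigma0=1.0, iters=2000, seed=1):
    """(1+1) evolution strategy with a 1/5th-success-rule style adaptation:
    the mutation step size sigma is tuned online from success/failure."""
    rng = random.Random(seed)
    x, fx, sigma = list(x0), f(x0), sigma0
    for _ in range(iters):
        y = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
            sigma *= 1.1   # success: expand the step size
        else:
            sigma *= 0.98  # failure: contract the step size
    return x, fx

sphere = lambda x: sum(xi * xi for xi in x)
x, fx = one_plus_one_es(sphere, [3.0, -2.0])
```

Here the "model" is just a single scalar, sigma; EDAs, discussed next, make the model of the search explicit and much richer.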

EDAs: Probabilistic Modelling for Optimization
The idea is to convert the optimization problem into a search over probability distributions. The probabilistic model is, in some sense, an explicit model of the (currently) promising regions of the search space.
P. Larrañaga and J. A. Lozano (eds.). Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation. Kluwer Academic Publishers, 2002.
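The simplest instance of this idea is UMDA (the Univariate Marginal Distribution Algorithm): the model is a vector of independent Bernoulli probabilities, re-estimated each generation from the selected individuals and then sampled to produce the next population. A sketch on ONEMAX, with population sizes and the 0.05/0.95 clamp chosen for illustration:

```python
import random

def umda(f, n_bits, pop_size=60, n_select=20, iters=40, seed=2):
    """Univariate marginal EDA (UMDA), maximising f: fit independent
    Bernoulli marginals to the selected individuals, then sample a
    fresh population from the model."""
    rng = random.Random(seed)
    p = [0.5] * n_bits  # the explicit probabilistic model
    for _ in range(iters):
        pop = [[int(rng.random() < pi) for pi in p] for _ in range(pop_size)]
        pop.sort(key=f, reverse=True)
        selected = pop[:n_select]
        # Re-estimate each marginal, clamped away from 0/1 to keep exploring.
        p = [min(0.95, max(0.05, sum(ind[i] for ind in selected) / n_select))
             for i in range(n_bits)]
    return p

# Maximise ONEMAX: the model should concentrate near p_i = 1 for every bit.
model = umda(sum, n_bits=20)
```

The final vector p is exactly the "explicit model of (currently) promising regions": it can be printed, plotted, and inspected, unlike a raw population in a conventional EA.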

Summary
The field of metaheuristics (including Evolutionary Computation) has produced:
- A large variety of optimization algorithms.
- Demonstrated good performance on a range of real-world problems.
Metaheuristics are considerably more general: they can even be applied when there isn’t a “true” objective function (coevolution), and can evolve non-numerical objects.

Summary
EDAs take an explicit modelling approach to optimization:
- Existing statistical models and model-fitting algorithms can be employed.
- Potential for solving challenging problems.
- The model can be more easily visualized/interpreted than a dynamic population in a conventional EA.
Although the field is highly active, it is still relatively immature:
- Improve the quality of experimental results.
- Make sure research goals are well-defined.
- Lots of preliminary ideas, but a lack of comparative/follow-up research.
- Difficult to keep up with the literature and see connections with other fields.