Language editing: dr inż. J. Jarnicki


Internet Engineering
Czesław Smutnicki
Discrete Mathematics – Discrete Optimization

CONTENTS
- Numerical troubles
- Packages
- Tools
- Useful methods

OPTIMIZATION TROUBLES. A NICE BEGINNING TO BAD NEWS
Find the extremes of the De Jong test function. [Figure: 1D and 2D plots of the De Jong test function]

OPTIMIZATION TROUBLES. MULTIPLE EXTREMES
Find the extremes of the Griewank test function. [Figure: 2D plot of the Griewank test function]
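For reference, the test functions on these slides have compact closed forms. The sketch below gives De Jong's sphere function and the Griewank function, both with global minimum 0 at the origin (the standard textbook formulas, not taken from the slides):

```python
import math

def de_jong(x):
    """De Jong's first (sphere) function: unimodal, minimum f(0) = 0."""
    return sum(xi * xi for xi in x)

def griewank(x):
    """Griewank function: many regularly spaced local minima, minimum f(0) = 0."""
    s = sum(xi * xi for xi in x) / 4000.0
    p = math.prod(math.cos(xi / math.sqrt(i + 1)) for i, xi in enumerate(x))
    return 1.0 + s - p
```

The product of cosines is what makes Griewank deceptive: it superimposes a fine oscillation on an otherwise smooth bowl.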

OPTIMIZATION TROUBLES. EXPONENTIAL NUMBER OF EXTREMES
Find the extremes of the Langermann test function. [Figure: 2D plot of the Langermann test function]

OPTIMIZATION TROUBLES. DECEPTION POINTS
Find the extremes of the fox-holes test function. [Figure: plot of the fox-holes test function]

OPTIMIZATION TROUBLES. TIME AND COST OF CALCULATIONS
The curse of dimensionality and NP-hardness: a lab instance has 5..20 variables, while an instance from practice is a nonlinear function of 1980 variables ("Please wait. Calculations will last 3 289 years.")

OPTIMIZATION TROUBLES. SIZE OF THE SOLUTION SPACE
The smallest practical instance FT10 of the job-shop scheduling problem (which waited 25 years to be solved) consists of 10 jobs, 10 machines, and 100 operations; its solution space contains about 10^48 discrete feasible solutions, each of dimension 90. The largest benchmarks currently in use have dimension 1980. If each solution were printed as a dot of 0.01 x 0.01 mm, the solution space of FT10 would correspond to a printed area of 10^32 km² (Jupiter's surface is about 10^10 km²).
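The printed-area comparison can be checked with a few lines of integer arithmetic, assuming 10^48 feasible solutions and one 0.01 mm x 0.01 mm dot per solution as stated on the slide:

```python
solutions = 10 ** 48                     # feasible solutions of FT10 (slide's estimate)
dots_per_mm2 = 100 * 100                 # 0.01 mm dot side -> 100 dots per mm
total_mm2 = solutions // dots_per_mm2    # printed area in mm^2
total_km2 = total_mm2 // 10 ** 12        # 1 km^2 = 10^12 mm^2
jupiter_km2 = 10 ** 10                   # order of magnitude of Jupiter's surface
```

This gives 10^32 km², twenty-two orders of magnitude beyond Jupiter's surface.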

OPTIMIZATION TROUBLES. DISTRIBUTION OF GOAL FUNCTION VALUES
Example: job-shop scheduling problem. Goal function values are distributed approximately normally over the solution space, and so are the relative Hamming distances DIST between a feasible solution and the "best" solution.

OPTIMIZATION TROUBLES. FUR
Example: job-shop scheduling problem. [Figure: simulation of goal function values toward the center of the space]

OPTIMIZATION TROUBLES. ZOOM IN ON THE FUR
Example: job-shop scheduling problem. [Figure: simulation of goal function values toward the center of the space, zoomed in]

OPTIMIZATION TROUBLES. STONE FOREST Transformation of a sample of random solutions from the 90D space into 2D space.

PROPERTIES OF THE SOLUTION SPACE LANDSCAPE
- BIG VALLEY: positive correlation between the goal function value and the distance to the optimal (or best found) solution; inside the big valley the concentration of local extremes is high. The valley is usually small relative to the whole solution space.
- RUGGEDNESS: a measure of the diversity of goal function values among related (neighboring) solutions; ruggedness at a point is greater where the goal function varies more in its neighborhood, while little variation means a flat landscape.
- THE NUMBER OF LOCAL EXTREMES (peaks) relative to the size of the solution space.
- DISTRIBUTION OF LOCAL EXTREMES, determined experimentally.
- OTHER MEASURES: autocorrelation function, correlation between random trajectories, statistical isotropy of the landscape, fractal landscape, correlation between genes (epistasis), fitness-distance correlation.

CURRENT STATE OF DISCRETE OPTIMIZATION
- Packages and solvers (LINDO, CPLEX, ILOG, ...)
- Exact methods (B&B, DP, ILP, BLP, MILP, SUB, ...)
- Approximate methods: heuristics, metaheuristics, meta²heuristics
- Quality measures of approximation (absolute, relative, ...)
- Analysis of quality measures (worst-case, probabilistic, experimental)
- Calculation cost (pessimistic, average, experimentally tested)
- Approximation schemes (AS, polynomial-time PTAS, fully polynomial-time FPTAS)
- Inapproximability
- Useful experimental methods
- The "no free lunch" theorem
- Public benchmarks
- Parallel and distributed methods: a new class of algorithms

OPTIMIZATION HISTORY AND TRENDS
- Priority rules
- Theory of NP-completeness
- Polynomial-time algorithms
- Exact methods (B&B, DP, ILP, BLP, ...)
- Approximation methods: quality analysis
- Approximation schemes (AS, PTAS, FPTAS, ...)
- Inapproximability theory
- Competitive analysis (on-line algorithms)
- Metaheuristics
- Theoretical foundations of metaheuristics
- Parallel metaheuristics
- Theoretical foundations of parallel metaheuristics

APPROXIMATE METHODS. METHODS RESISTANT TO LOCAL EXTREMES
constructive/improvement, priority rules, random search, greedy randomized adaptive search, simulated annealing, simulated jumping, estimation of distribution, tabu search, adaptive memory search, variable neighborhood search, evolutionary and genetic search, differential evolution, biochemical methods, immunological methods, ant colony optimization, particle swarm optimization, neural networks, threshold accepting, path search, beam search, scatter search, harmony search, path relinking, adaptive search, constraint satisfaction, descending (hill climbing), multi-agent methods, memetic search, bee search, intelligent water drops

EVOLUTION: DARWIN'S VIEW. GENETIC ALGORITHMS
GOAL OF NATURE? optimization, fitness, continuity preservation, following changes
SUCCESSION: genetic material carries the data for body construction
EVOLUTION: crossing over, mutation
SELECTION: soft/hard
individual = solution = genotype ≠ phenotype
individual, gene, chromosome, trait
population (structure, size, composition)
crossing-over (what is the key to progress?)
mutation (insurance?)
sex? democracy/elitism
theoretical properties
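A minimal sketch of the genetic loop described above (selection, crossing-over, mutation), assuming bit-string genotypes and the OneMax toy fitness; all parameter values here are illustrative assumptions, not taken from the lecture:

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=30, generations=100,
                      p_cross=0.9, p_mut=0.02, seed=0):
    """Tournament selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        def tournament():
            a, b = rng.sample(pop, 2)          # binary tournament (soft selection)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < p_cross:         # one-point crossing-over
                cut = rng.randrange(1, n_bits)
                child = p1[:cut] + p2[cut:]
            else:
                child = p1[:]
            child = [1 - g if rng.random() < p_mut else g for g in child]  # mutation
            children.append(child)
        pop = children
        best = max(children + [best], key=fitness)   # remember the best so far
    return best

# OneMax: fitness = number of ones; the optimum is the all-ones string.
best = genetic_algorithm(sum)
```

The genotype/phenotype distinction is hidden here only because the coding is trivial; for a scheduling problem the decoder from chromosome to schedule would carry real weight.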

EVOLUTION: DARWIN'S VIEW. COMPONENTS
genotype, phenotype, chromosome, gene, solution, coding, gene expression, selection scheme, mating pool, crossing-over operator, mutation, lethality, feasibility repairing, control of population dynamics, big valley phenomenon, intensification, MSXF, and more

EVOLUTION: DARWIN'S VIEW. COPYING FROM NATURE
- control of population dynamics / preserving diversity
- parent matching strategies: a sharing function to prevent mating of too closely related parents; incest prevention using the Hamming distance to evaluate genotype similarity
- structures of the population (migration and diffusion models)
- social behavior patterns (satisfied, glad, disappointed -> cloning, crossing-over, mutation)
- adaptive mutation
- gene expression
- distributed populations
- ...

EVOLUTION: DARWIN'S VIEW. MULTI-STEP CROSSOVER FUSION (MSXF)
Starting from a source solution (one parent), successive neighborhoods are searched stochastically, biased by the distance to the target solution (the other parent); the resulting trajectory is a goal-oriented path leading from the neighborhood of the source toward the neighborhood of the target.

EVOLUTION: LAMARCK/BALDWIN'S VIEW. MEMETIC ALGORITHMS
GOAL OF NATURE? optimization, fitness, continuity preservation, following changes, transferring knowledge to successors
SUCCESSION: genetic material carries the data for body building plus acquired knowledge
EVOLUTION: crossing over, mutation, learning
SELECTION: soft/hard
individual = solution = memotype ≠ phenotype
individual, meme, chromosome, trait
population (structure, size, composition, learning)
crossing-over, mutation, learning
theoretical properties?

DIFFERENTIAL EVOLUTION
Differential evolution (DE) is a subclass of genetic search methods. The democratic creation of successors through crossover and mutation in genetic search (GS) is replaced in DE by directed changes that probe the solution space. DE starts from a random population of individuals (solutions). In each iteration something similar to mutation and crossover is performed, but in a completely different way than in GS: for each solution x in the population, an offspring y is generated as a trial solution extending a randomly selected solution a along the direction defined by two further randomly selected solutions b and c (the analogy to parents). The generation is a linear combination with random parameters, and a separate mechanism prevents generating an offspring that simply copies the parent. Mutation plays a significant role: thanks to this specific strategy it is self-adaptive and goal-oriented with respect to direction, scale, and range. If the trial solution is better, it is accepted; otherwise it is discarded. Iterations are repeated until a number of iterations fixed a priori has been reached or stagnation has been detected. The method has some specific tuning parameters (differential weight, crossover probability, ...) selected experimentally.
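A minimal sketch of the scheme just described, in its common DE/rand/1/bin form; the sphere objective, bounds, and the values of the differential weight F and crossover probability CR are illustrative assumptions:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           iterations=200, seed=1):
    """Minimise f over a box; F = differential weight, CR = crossover prob."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(iterations):
        for i in range(pop_size):
            # three distinct parents a, b, c, all different from pop[i]
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = rng.randrange(dim)      # forces at least one mutated gene,
                                             # so the trial never copies the parent
            trial = [a[j] + F * (b[j] - c[j])
                     if (rng.random() < CR or j == j_rand) else pop[i][j]
                     for j in range(dim)]
            trial_cost = f(trial)
            if trial_cost <= cost[i]:        # greedy one-to-one replacement
                pop[i], cost[i] = trial, trial_cost
    i_best = min(range(pop_size), key=cost.__getitem__)
    return pop[i_best], cost[i_best]

x_best, f_best = differential_evolution(lambda v: sum(t * t for t in v),
                                        bounds=[(-5.0, 5.0)] * 3)
```

The difference vector F*(b - c) is what makes the mutation self-adaptive: as the population contracts, so do the steps.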

ARTIFICIAL IMMUNE SYSTEM
antibody = solution algorithm; antigen = problem or instance; library of antibodies; fitness; recombination
An antigen (an invasive protein) represents a new problem to solve, or a new (possibly temporary) set of constraints for a problem already solved. The variety of possible antigens is huge, frequently infinite, and the sequence in which antigens are presented is not known a priori. An antibody (a protein blocking the antigen, directed against the intruder) corresponds to an algorithm that produces a solution to the problem. The variety of antibodies is usually small, but mechanisms of aggregation and recombination exist to produce new antibodies with various properties. Patterns of antibodies are collected in a library, which constitutes the memory of the system. Matching (fitness) is the selection of an antibody for an antigen. A match is ideal if the antibody generates a globally optimal solution of the problem under the given constraints; otherwise a defined measure is used to evaluate the quality of the match. A bad match forces the system to seek new types of antibodies, usually through evolution.

ANT SEARCH. COOPERATIVE SWARMS
An ant (control system, pheromone generator, pheromone detectors, moving drive):
- seeks food
- leaves pheromone on the trail
- moves at random, but prefers pheromone trails
- pheromone density decreases over time

ANT SEARCH. SEEKING FOOD. DISCOVERING THE PATH
[Figure: ants discovering the shorter path between nest and food through nodes A, B, C, D, E, H]

ANT SEARCH. PHEROMONE DISTRIBUTION
[Figure: pheromone distribution along the discovered paths]
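The trail-choice and evaporation rules above can be illustrated on the classic two-bridge experiment: a short path and a long one between nest and food. The path lengths, evaporation rate, and the deposit rule 1/length below are illustrative assumptions:

```python
import random

def double_bridge(rho=0.1, n_ants=20, iterations=100, seed=3):
    """Ants choose a path with probability proportional to its pheromone;
    each deposits 1/length on its path; pheromone evaporates at rate rho."""
    rng = random.Random(seed)
    lengths = [1.0, 2.0]          # short bridge, long bridge
    tau = [1.0, 1.0]              # pheromone on each path, initially equal
    for _ in range(iterations):
        deposits = [0.0, 0.0]
        for _ in range(n_ants):
            p_short = tau[0] / (tau[0] + tau[1])   # biased random choice
            k = 0 if rng.random() < p_short else 1
            deposits[k] += 1.0 / lengths[k]        # shorter path, bigger deposit
        tau = [(1.0 - rho) * t + d for t, d in zip(tau, deposits)]
    return tau

tau_final = double_bridge()
```

Because the short path earns larger deposits per ant, the biased choice and the evaporation together make the colony concentrate on it, which is the positive-feedback mechanism behind ant colony optimization.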

PARTICLE SWARM OPTIMIZATION
- a swarm is a large set of individuals (particles) moving together
- each individual follows a search trajectory in the solution space
- trajectories are distributed and correlated, and take into account the experience of individuals
- the location of an individual (a solution) is described by the location vector x; changes of location are described by the velocity vector v
- the velocity equation contains an inertia term and two directional terms weighted by random parameters
- the location of an individual depends on: its recent (previous) position, its own experience (best location so far), the location of the swarm leader, and the best solution found so far, which sets the most promising direction of the search
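The velocity equation described above (an inertia term plus two randomly weighted directional terms) can be sketched as follows; the sphere objective and the coefficient values w, c1, c2 are illustrative assumptions:

```python
import random

def pso(f, bounds, n_particles=20, iterations=200,
        w=0.7, c1=1.5, c2=1.5, seed=5):
    """v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x);  x <- x + v."""
    rng = random.Random(seed)
    dim = len(bounds)
    x = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    v = [[0.0] * dim for _ in range(n_particles)]
    pbest = [xi[:] for xi in x]                  # each particle's own best
    pcost = [f(xi) for xi in x]
    g = min(range(n_particles), key=pcost.__getitem__)
    gbest, gcost = pbest[g][:], pcost[g]         # the swarm leader
    for _ in range(iterations):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                v[i][d] = (w * v[i][d]                      # inertia
                           + c1 * r1 * (pbest[i][d] - x[i][d])   # own experience
                           + c2 * r2 * (gbest[d] - x[i][d]))     # leader
                x[i][d] += v[i][d]
            cost = f(x[i])
            if cost < pcost[i]:
                pbest[i], pcost[i] = x[i][:], cost
                if cost < gcost:
                    gbest, gcost = x[i][:], cost
    return gbest, gcost

best_pos, best_cost = pso(lambda p: sum(t * t for t in p), [(-5.0, 5.0)] * 2)
```

The inertia weight w trades exploration for exploitation: close to 1 the particles overshoot and roam, close to 0 they collapse quickly onto the leader.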

BEE SEARCH
bee trajectory = solution; nectar amount = goal function; visited site = neighborhood; hive, bees, flowers and nectar
Neighborhood search combined with random search and supported by cooperation (learning):
- the bee swarm collects honey in the hive
- each bee performs a random path (solution) to a search region of nectar
- selected elite bees perform a "waggle dance" in the hive to inform the other bees about promising search regions (direction, distance, quality); the waggle dance is the distribution of knowledge

TABU SEARCH
Mimics human thinking in the process of seeking a solution:
- the "best in local neighborhood" move, repeated from the best recently found solution
- returning to solutions already visited is forbidden, to prevent cycling (wandering around); short-term memory
Starting from an initial solution, successive neighborhoods are explored exhaustively.
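A minimal sketch of the scheme above with a bit-flip neighborhood, a short-term tabu list of recently flipped positions, and the usual aspiration criterion; the pattern-matching toy objective and the tenure value are illustrative assumptions:

```python
from collections import deque

def tabu_search(f, x0, tenure=5, iterations=100):
    """Best admissible move in the full neighborhood (possibly uphill);
    recently flipped positions are tabu unless the move would improve
    the best solution found so far (aspiration criterion)."""
    x = list(x0)
    best, best_cost = x[:], f(x)
    tabu = deque(maxlen=tenure)              # short-term memory
    for _ in range(iterations):
        candidates = []
        for i in range(len(x)):
            y = x[:]
            y[i] = 1 - y[i]                  # bit-flip neighborhood
            cy = f(y)
            if i not in tabu or cy < best_cost:
                candidates.append((cy, i, y))
        if not candidates:
            break
        cy, i, y = min(candidates)           # best admissible move
        x = y
        tabu.append(i)                       # oldest entry expires automatically
        if cy < best_cost:
            best, best_cost = y[:], cy
    return best, best_cost

# Toy objective (an assumption for illustration): match a hidden bit pattern.
target = [1, 0, 1, 1, 0, 1, 0, 1]
hamming = lambda b: sum(bi != ti for bi, ti in zip(b, target))
best, c = tabu_search(hamming, [0] * 8)
```

Note that the loop keeps moving even after reaching the optimum: the current solution wanders uphill, but the best solution is remembered separately.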

ADAPTIVE MEMORY SEARCH
Mimics the gathering of data in the human brain during the process of seeking a solution:
- the "best" move in the current neighborhood (a few solutions relatively close to the current one), repeated from the best recently found; intensification of the search
- operational (short-term) memory: prohibition of returning to solutions already visited, to prevent wandering
- tactical memory: sets the direction of the search
- strategic memory: selects search regions (basins of attraction); diversification
- recency-based and frequency-based memory

INTELLIGENT WATER DROPS
Based on the dynamics of river systems and the actions and reactions among water drops in rivers:
- a drop has some parameters, namely velocity and soil; these parameters may change during its lifetime
- drops flow from a source to a destination
- a drop starts with some initial velocity and zero soil
- during the flow, a drop removes some soil from the environment; the soil is gathered in the drop
- the speed of a drop increases non-linearly, inversely to the amount of soil: a path with less soil is faster than a path with more soil
- a drop statistically prefers paths with less soil

SIMULATED ANNEALING. COOLING SCHEMES
Annealing = the slow cooling of a ferromagnetic or antiferromagnetic solid in order to eliminate internal stresses. Cooling schemes: Boltzmann (harmonic), logarithmic (Hajek's lemma), geometric.
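The three schedules named on this slide are usually written as follows; the exact forms and constants are the standard ones from the SA literature rather than taken from the slide, and the labels follow the slide (elsewhere "Boltzmann" sometimes denotes the logarithmic schedule instead):

```latex
\begin{aligned}
\text{geometric:} &\quad T_{k+1} = \alpha\, T_k, \qquad 0 < \alpha < 1,\\[2pt]
\text{Boltzmann (harmonic):} &\quad T_k = \frac{T_0}{1 + k},\\[2pt]
\text{logarithmic:} &\quad T_k = \frac{c}{\ln(1 + k)}.
\end{aligned}
```

Hajek's result concerns the logarithmic schedule: the annealing chain converges in probability to the set of global minima if and only if the constant $c$ is at least the depth of the deepest non-global local minimum.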

SIMULATED ANNEALING. AUTO-TUNING
- random starting solution
- a sequence of k trial moves in the space
- k steps at each fixed temperature
- starting temperature adjusted automatically
- adaptive cooling speed, p ≈ 0.9
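A sketch of the loop this slide outlines: k trial moves at each fixed temperature, then geometric cooling with a factor of about 0.9. The one-dimensional multimodal objective and all numeric values are illustrative assumptions:

```python
import math
import random

def simulated_annealing(f, x0, neighbor, t0=10.0, alpha=0.9,
                        k_steps=50, t_min=1e-3, seed=7):
    """k trial moves per temperature level; worse moves accepted with
    probability exp(-delta / T); geometric cooling T <- alpha * T."""
    rng = random.Random(seed)
    x, cost = x0, f(x0)
    best, best_cost = x, cost
    t = t0
    while t > t_min:
        for _ in range(k_steps):
            y = neighbor(x, rng)
            y_cost = f(y)
            # Metropolis acceptance rule
            if y_cost <= cost or rng.random() < math.exp((cost - y_cost) / t):
                x, cost = y, y_cost
                if cost < best_cost:
                    best, best_cost = x, cost
        t *= alpha                      # the slide's adaptive factor p ~ 0.9
    return best, best_cost

g = lambda x: x * x + 10.0 * math.sin(3.0 * x)   # multimodal 1-D toy objective
x_best, c_best = simulated_annealing(g, 4.0,
                                     lambda x, r: x + r.uniform(-0.5, 0.5))
```

At high temperature nearly every move is accepted (random search); as T falls the rule degenerates into pure descent, which is exactly the trajectory behaviour discussed on the later comparison slides.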

SIMULATED JUMPING
Annealing by successive heating and cooling, in order to eliminate the internal stresses of a spin-glass solid (a mixed ferromagnetic and antiferromagnetic material); the aim is to penetrate the high barriers that exist between domains.

DISCRETE OPTIMIZATION. SOLUTION SPACE PROPERTIES
Distance measures in the solution space, for permutations π and σ of n elements:
- move type A (adjacent swap): D_A(π, σ) = the number of inversions in π⁻¹ ∘ σ
- move type S (swap): D_S(π, σ) = n minus the number of cycles in π⁻¹ ∘ σ
- move type I (insert): D_I(π, σ) = n minus the length of the maximal increasing subsequence in π⁻¹ ∘ σ
Each measure has a known mean, variance, and computational complexity.
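Under this reading of the table (A = adjacent swap, S = arbitrary swap, I = insert), the three distances can be computed directly on the composition π⁻¹ ∘ σ; the sketch below assumes 0-based permutations:

```python
from bisect import bisect_left

def compose_inverse(pi, sigma):
    """delta = pi^{-1} o sigma, the permutation the distances are measured on."""
    inv = [0] * len(pi)
    for i, p in enumerate(pi):
        inv[p] = i
    return [inv[s] for s in sigma]

def d_adjacent(pi, sigma):
    """A (adjacent swap): number of inversions in pi^{-1} o sigma."""
    d = compose_inverse(pi, sigma)
    return sum(d[i] > d[j] for i in range(len(d)) for j in range(i + 1, len(d)))

def d_swap(pi, sigma):
    """S (arbitrary swap): n minus the number of cycles (Cayley distance)."""
    d = compose_inverse(pi, sigma)
    seen, cycles = [False] * len(d), 0
    for i in range(len(d)):
        if not seen[i]:
            cycles += 1
            while not seen[i]:
                seen[i] = True
                i = d[i]
    return len(d) - cycles

def d_insert(pi, sigma):
    """I (insert): n minus the length of the longest increasing subsequence."""
    d = compose_inverse(pi, sigma)
    tails = []                               # patience-sorting LIS in O(n log n)
    for v in d:
        k = bisect_left(tails, v)
        if k == len(tails):
            tails.append(v)
        else:
            tails[k] = v
    return len(d) - len(tails)
```

For the reversal of the identity on 4 elements, for example, the three measures give 6, 2, and 3: adjacent swaps are the weakest move, inserts the strongest.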

SELECTED INSTANCES. BIG VALLEY
There exists a strong correlation between the quality of the goal function value (RE) and the distance to the best solution (DIST); this correlation is preserved after transforming the solutions to x/y coordinates. [Figure: trajectory from start to best inside the big valley]

SELECTED METHODS. RANDOM SEARCH
Random search converges slowly toward good solutions because it uses no information about the structure of the solution space. [Figure: random search trajectory from start to best]

SELECTED METHODS. SIMULATED ANNEALING
Simulated annealing converges toward good solutions at moderate speed; its trajectory resembles random search much more than a goal-oriented search. [Figure: simulated annealing trajectory from start to best]

SELECTED METHODS. TABU SEARCH
Tabu search converges quickly to good solutions; it is a fast descent method supported by adaptive memory. [Figure: tabu search trajectory from start to best]

PARALLEL OPTIMIZATION: A NEW CLASS OF ALGORITHMS
- Theoretical models of parallel computation: SISD, SIMD, MISD, MIMD
- Theoretical models of memory access: EREW, CREW, CRCW
- Parallel computing environments: hardware, software, GPGPU
- Shared-memory programming: Pthreads (C), Java threads, OpenMP (Fortran, C, C++)
- Distributed-memory programming, message passing, object-based and Internet computing: PVM, MPI, sockets, Java RMI, CORBA, Globus, Condor
- Quality measures of parallel algorithms: runtime, speedup, efficiency, cost
- Single/multiple search threads; granularity
- Independent/cooperative search threads
- Distributed (reliable) calculations in the network

PARALLEL OPTIMIZATION: A FESTIVAL OF APPROACHES
SIMULATED ANNEALING:
- single thread, conventional SA, parallel calculation of the goal function value; fine grain; theory of convergence
- single thread, pSA, parallel moves: a subset of random trial solutions selected in the neighborhood and evaluated in parallel; theory of convergence
- exploration of the equilibrium state at a fixed temperature in parallel
- multiple independent threads; coarse grain
- multiple cooperative threads; coarse grain
GENETIC SEARCH:
- single thread, conventional GA, parallel calculation of the goal function value; fine grain; theory of convergence
- single thread, parallel evaluation of the population
- multiple cooperative threads, distributed subpopulations: migration, diffusion, island models
- ...

Thank you for your attention.
DISCRETE MATHEMATICS
Czesław Smutnicki