Multi-Objective Optimization
- NP-hard problems with conflicting objectives, for example:
  - Flow shop with both minimum-makespan and minimum-tardiness objectives
  - TSP with minimum distance, time, and cost objectives
  - Container management: balancing volume, weight, and value
- There is no single optimal solution but a set of solutions called the Pareto optimal set
- A solution is Pareto optimal if it is not possible to improve one objective without deteriorating another objective
- The goal is to find the Pareto optimal set and the corresponding Pareto front
- Metaheuristics can be used to approximate the Pareto optimal set; both S-metaheuristics and P-metaheuristics are used (a dominance-check sketch follows below)
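As an illustration of the Pareto-dominance idea above, here is a minimal sketch (not from the text) that filters a list of candidate objective vectors down to the non-dominated set, assuming all objectives are minimized; the function names and the two-objective example data are hypothetical.

```python
# Minimal sketch, assuming all objectives are to be minimized.
# Names (dominates, pareto_front) and the example data are illustrative only.

def dominates(a, b):
    """True if objective vector a Pareto-dominates b:
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    """Keep only the non-dominated objective vectors."""
    front = []
    for s in solutions:
        if not any(dominates(other, s) for other in solutions if other != s):
            front.append(s)
    return front

# Example: (makespan, tardiness) pairs for four candidate schedules
candidates = [(10, 5), (8, 7), (9, 9), (12, 4)]
print(pareto_front(candidates))   # (9, 9) is dominated by (8, 7) and is dropped
```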
Metaheuristics for Multi-Objective Optimization
- Key components:
  - Fitness assignment: assign a scalar value to the quality of each solution
  - Diversity preservation: generate a diverse set of solutions
  - Elitism: carry the best set of solutions forward at every step
- General strategies:
  - Aggregation: use an aggregation method (e.g., a weighted sum) to convert the problem into a mono-objective one (see the sketch below)
  - Weighted metric: preselect a reference value for the objective vector, measure each solution's distance from this reference, and minimize that distance
  - Parallel approach: treat each objective individually, then crossover and mutate the solutions from each objective to find a compromise
  - Sequential approach: search the objectives in a preference order
  - Dominance-based: search using a dominance criterion set by the final user
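A minimal sketch of the aggregation strategy mentioned above: a weighted sum collapses the objective vector into a single scalar so that any single-objective metaheuristic can be applied. The weights and the two-objective example values are hypothetical.

```python
# Minimal sketch of weighted-sum aggregation; weights and objectives are illustrative.

def aggregate(objectives, weights):
    """Scalarize an objective vector with fixed, user-chosen weights (all minimized)."""
    return sum(w * f for w, f in zip(weights, objectives))

# Hypothetical two-objective schedule evaluation: (makespan, tardiness)
weights = (0.7, 0.3)
solution_a = (10, 5)
solution_b = (8, 9)
# The mono-objective search simply prefers the smaller aggregated value.
print(aggregate(solution_a, weights))  # 0.7*10 + 0.3*5 = 8.5
print(aggregate(solution_b, weights))  # 0.7*8  + 0.3*9 = 8.3
```

Sweeping over different weight vectors and re-running the search traces out different points of the Pareto front (only its convex portions can be reached this way).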
Hybrid Metaheuristics
- Combining an S-metaheuristic with a P-metaheuristic, or two S-metaheuristics
- Combining metaheuristics with other mathematical programming methods
- Combining metaheuristics with AI techniques
- Main classification:
  - Relay: sequential, one method's output seeds the next (see the sketch below)
  - Teamwork: cooperative search
- Example: embedding a metaheuristic search within other optimization solution methods
  - Branch and bound: the upper bound at a node can be obtained using a metaheuristic, which also yields a partial solution up to the given node
  - Dynamic programming: if the state-action space is large, a metaheuristic can reduce the action space by performing a local search among the set of all possible actions for a state
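A minimal sketch of the relay (sequential) hybrid style described above: a randomized construction phase (GRASP-style) runs first, and each constructed tour is handed to a local search for improvement. All function names and the cost matrix are illustrative placeholders, not an implementation from the text.

```python
import random

# Relay hybrid sketch: Phase 1 constructs a solution, Phase 2 improves it.
# The construction here is a simple random shuffle standing in for a proper
# greedy randomized construction.

def greedy_randomized_construction(cost, rng):
    tour = list(range(len(cost)))
    rng.shuffle(tour)
    return tour

def tour_length(tour, cost):
    return sum(cost[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def local_search(tour, cost):
    """Improve the relayed solution with pairwise exchanges until no move helps."""
    improved = True
    while improved:
        improved = False
        for i in range(len(tour)):
            for j in range(i + 1, len(tour)):
                candidate = tour[:]
                candidate[i], candidate[j] = candidate[j], candidate[i]
                if tour_length(candidate, cost) < tour_length(tour, cost):
                    tour, improved = candidate, True
    return tour

def relay_hybrid(cost, starts=5, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(starts):
        s = greedy_randomized_construction(cost, rng)   # Phase 1: construction
        s = local_search(s, cost)                       # Phase 2: improvement (relay)
        if best is None or tour_length(s, cost) < tour_length(best, cost):
            best = s
    return best

cost = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]]
print(relay_hybrid(cost))
```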
Parallel Metaheuristics
- Goals: speed up the search, improve solution quality, solve large NP-hard problems
- Parallel designs:
  - Algorithmic level: independent or cooperating self-contained metaheuristics run in parallel
  - Iteration level: at each iteration, the search is carried out in several neighborhoods by different processors to speed it up, and the solutions are reconciled periodically
  - Solution level: the evaluation of the objective function and the check for constraint violations are done in parallel for a set of solutions generated by one search (see the sketch below)
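A minimal sketch of the solution-level design above: candidate solutions produced by a single search are evaluated in parallel, using Python's standard concurrent.futures module. The evaluate function, its objective, its constraint, and the candidate data are hypothetical placeholders.

```python
from concurrent.futures import ProcessPoolExecutor

# Solution-level parallelism sketch: only the (expensive) evaluations are
# distributed across worker processes; the search itself stays sequential.

def evaluate(solution):
    """Stand-in objective evaluation plus a constraint check."""
    value = sum(x * x for x in solution)          # hypothetical objective
    feasible = sum(solution) <= 100               # hypothetical constraint
    return value, feasible

def evaluate_population(candidates, workers=4):
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(evaluate, candidates))

if __name__ == "__main__":
    candidates = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]
    print(evaluate_population(candidates))
```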
Elements of the Heuristic Approach
- Representation of the solution space:
  - Vector of binary values: 0/1 knapsack, 0/1 IP problems
  - Vector of discrete values: location and assignment problems
  - Vector of continuous values on a real line: continuous (parameter) optimization
  - Permutation: sequencing, scheduling, TSP
- Defining the neighborhood and the neighbors:
  - Flip operator: binary, or over a range of numbers (+1 or -1, as in the knapsack problem)
  - Permutation operators (sketched below):
    - Pair-wise exchange operator
    - Insertion operator: 12345 -> 14235
    - Exchange operator: 12345 -> 14325
    - Inversion operator: 123456 -> 154326
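Minimal sketches of the three permutation moves listed above, reproducing the slide's own examples. The function names and 0-based index convention are my own.

```python
def insertion(perm, i, j):
    """Remove the element at position i and reinsert it at position j."""
    p = list(perm)
    p.insert(j, p.pop(i))
    return p

def exchange(perm, i, j):
    """Swap the elements at positions i and j."""
    p = list(perm)
    p[i], p[j] = p[j], p[i]
    return p

def inversion(perm, i, j):
    """Reverse the segment between positions i and j (inclusive)."""
    p = list(perm)
    p[i:j + 1] = reversed(p[i:j + 1])
    return p

print(insertion([1, 2, 3, 4, 5], 3, 1))        # [1, 4, 2, 3, 5]
print(exchange([1, 2, 3, 4, 5], 1, 3))         # [1, 4, 3, 2, 5]
print(inversion([1, 2, 3, 4, 5, 6], 1, 4))     # [1, 5, 4, 3, 2, 6]
```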
Implementation - Chapters 14-15 Summary
Elements of the Heuristic Approach
- Defining the initial solution: random or greedy
- Choosing the method (algorithm for the iterative search):
  - Off-the-shelf or tailor-made heuristic
  - Single-start, multistart (still single-solution, but several independent starts), or population-based (solutions interact with one another)
- Strategies for escaping local optima: balance diversification and intensification of the search
- Objective function evaluation: full or partial; at every iteration or after a set of iterations
- Stopping criteria (see the skeleton below):
  - Number of iterations
  - Time limit
  - Number of non-improving solutions in consecutive iterations
- Remember: there is a lot of flexibility in setting up the above. Optimality cannot be proved; all you are looking for is a good solution given the resource (time, money, and computing power) constraints.
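A minimal local-search skeleton tying the elements above together: a supplied initial solution, a neighborhood move, and the three stopping criteria from the slide. All function names and parameter values are illustrative assumptions, not a prescribed implementation.

```python
import random, time

def local_search(objective, random_solution, random_neighbor,
                 max_iters=10_000, max_seconds=5.0, max_no_improve=200):
    current = random_solution()                 # initial solution (random or greedy)
    best, best_val = current, objective(current)
    start, no_improve = time.time(), 0
    for _ in range(max_iters):                  # stop: iteration budget
        if time.time() - start > max_seconds:   # stop: time budget
            break
        if no_improve >= max_no_improve:        # stop: consecutive non-improving moves
            break
        candidate = random_neighbor(current)
        if objective(candidate) < best_val:     # accept only improving neighbors here
            current = candidate
            best, best_val = candidate, objective(candidate)
            no_improve = 0
        else:
            no_improve += 1
    return best, best_val

# Example use on a toy continuous problem (minimize a sum of squares):
dim = 5
print(local_search(
    objective=lambda x: sum(v * v for v in x),
    random_solution=lambda: [random.uniform(-10, 10) for _ in range(dim)],
    random_neighbor=lambda x: [v + random.uniform(-0.5, 0.5) for v in x],
))
```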
Single-Solution (S-) Metaheuristics
- Accepting non-improving neighbors: tabu search and simulated annealing (see the acceptance sketch below)
- Iterating with different initial solutions: multistart local search, greedy randomized adaptive search procedure (GRASP), iterated local search
- Changing the neighborhood: variable neighborhood search
- Changing the objective function or the input to the problem in an effort to solve the original problem more effectively: guided local search
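A minimal sketch of the simulated-annealing acceptance rule referred to above: improving neighbors are always accepted, while non-improving neighbors are accepted with probability exp(-delta / T), which shrinks as the temperature cools. The parameter values and cooling factor are illustrative.

```python
import math, random

def accept(delta, temperature, rng=random):
    """delta = candidate_cost - current_cost (minimization)."""
    if delta <= 0:
        return True                              # improving move: always accept
    return rng.random() < math.exp(-delta / temperature)

# Toy cooling loop showing how a lower temperature reduces the acceptance probability
T = 10.0
for step in range(5):
    print(f"T={T:.2f}, P(accept delta=+2) ~ {math.exp(-2 / T):.3f}")
    T *= 0.5                                     # geometric cooling schedule
```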
Population-Based (P-) Metaheuristics
- Nature-inspired
- General loop:
  - Initialize a population
  - Generate a new population of solutions
  - Integrate the new population into the current one by replacement, i.e., a selection process over the new and current solutions
  - Continue until a stopping criterion is reached
- The generation and replacement process can be memoryless, or some search memory may be used
- Families:
  - Evolutionary algorithms: genetic algorithms, estimation of distribution algorithms (EDA), scatter search, evolutionary programming, genetic programming (a minimal genetic algorithm loop is sketched below)
  - Swarm intelligence: ant colony optimization, particle swarm optimization (PSO), bee colony algorithms, artificial immune systems (AIS)
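A minimal sketch of one P-metaheuristic from the list above: a genetic algorithm on a binary encoding, following the initialize / generate / replace loop. The one-max fitness function, population size, and rates are illustrative assumptions.

```python
import random

def fitness(bits):                    # stand-in objective: count the 1s (maximize)
    return sum(bits)

def tournament(pop, rng, k=2):        # selection
    return max(rng.sample(pop, k), key=fitness)

def crossover(a, b, rng):             # one-point crossover
    cut = rng.randrange(1, len(a))
    return a[:cut] + b[cut:]

def mutate(bits, rng, rate=0.05):     # bit-flip mutation
    return [1 - b if rng.random() < rate else b for b in bits]

def genetic_algorithm(n_bits=20, pop_size=30, generations=50, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        elite = max(pop, key=fitness)                       # elitism
        children = [mutate(crossover(tournament(pop, rng),
                                     tournament(pop, rng), rng), rng)
                    for _ in range(pop_size - 1)]
        pop = [elite] + children                            # replacement
    return max(pop, key=fitness)

print(fitness(genetic_algorithm()))
```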
Summary of Other Heuristics from the Text: see the course webpage and the text.
Next semester: Dynamic Programming
Next fall: Approximate DP
Thank you!