Particle Swarm Procedure for the Capacitated Open Pit Mining Problem. Jacques A. Ferland, University of Montreal; Jorge Amaya, University of Chile; Melody Suzy Djuimo, University of Montreal.


Particle Swarm Procedure for the Capacitated Open Pit Mining Problem Jacques A. Ferland, University of Montreal Jorge Amaya, University of Chile Melody Suzy Djuimo, University of Montreal ICARA, December 2006

RIOT Mining Problem web site: Maximal Open Pit problem: determine the maximal gain expected from the extraction, where the objective function is expressed in terms of the net value of extracting each block i.
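The objective function itself appeared as an image on the original slide. A standard formulation of the maximal open pit problem, consistent with the surrounding notation (writing b_i for the net value of extracting block i, B_i for its set of predecessor blocks, and x_i = 1 if block i is extracted), would read:

```latex
\max \sum_{i \in N} b_i x_i
\quad \text{s.t.} \quad
x_i \le x_j \quad \forall i \in N,\ \forall j \in B_i,
\qquad
x_i \in \{0,1\} \quad \forall i \in N.
```

The constraints x_i ≤ x_j are the pit slope constraints: every predecessor of an extracted block must be extracted as well.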

Maximal pit slope constraints identify the set B_i of predecessor blocks that have to be removed before block i.


Scheduling block extraction. Account for the operational constraints: C_t, the maximal weight that can be extracted during period t, and for the discount factor over the extraction horizon (a discount rate per period).

Notation: the net value of extracting block i; p_i, the weight of block i. N can be replaced by the maximal open pit N* = (S – {s}).
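The (SBE) model was shown as an image on the slide. A plausible reconstruction, using the notation above (b_i the net value and p_i the weight of block i, C_t the capacity of period t, x_{it} = 1 if block i is extracted during period t, and an assumed discount rate θ per period), is:

```latex
\max \sum_{t=1}^{T} \sum_{i \in N^*} \frac{b_i}{(1+\theta)^{\,t-1}}\, x_{it}
\quad \text{s.t.} \quad
\sum_{i \in N^*} p_i x_{it} \le C_t \;\; \forall t, \qquad
\sum_{t=1}^{T} x_{it} \le 1 \;\; \forall i, \qquad
x_{it} \le \sum_{\tau \le t} x_{j\tau} \;\; \forall i,\ \forall j \in B_i, \qquad
x_{it} \in \{0,1\}.
```

The first family of constraints enforces the period capacities, the second extracts each block at most once, and the third carries over the pit slope (precedence) constraints period by period.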

Scheduling block extraction ↔ RCPSP. Open pit extraction ↔ project. Each block extraction ↔ activity. Precedence relationships are derived from the maximal pit slope constraints.

The reward associated with activity (block) i depends on the extraction period t.

Genotype representation of a solution. Similar to Hartmann's priority-value encoding for RCPSP: component i of the vector gives the priority of scheduling the extraction of block i.

Decoding of a representation PR into a solution x. Serial decoding schedules blocks for extraction sequentially, one by one. To initiate the first extraction period t = 1: remove the block with the highest priority among those having no predecessor (i.e., in the top layer). During any period t, at any stage of the decoding scheme: the next block to be removed is one with the highest priority among those having all their predecessors already extracted and whose extraction does not exceed the capacity C_t. If no such block exists, then a new extraction period (t + 1) is initiated.
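The decoding scheme above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation; the dictionary-based data layout and a capacity held constant over periods are assumptions:

```python
def serial_decode(priority, predecessors, weight, capacity):
    """Serial decoding of a priority vector PR into an extraction schedule.

    priority:     dict block -> priority value (higher = extracted sooner)
    predecessors: dict block -> set of blocks that must be removed first
    weight:       dict block -> extraction weight p_i
    capacity:     maximal weight C_t extractable in one period
                  (assumed constant over periods; every block is assumed
                  to fit within an empty period's capacity)
    Returns a dict mapping each block to its extraction period.
    """
    remaining = set(priority)
    extracted = set()
    schedule = {}
    period, load = 1, 0.0
    while remaining:
        # blocks whose predecessors are all extracted and that still fit in period t
        eligible = [i for i in remaining
                    if predecessors[i] <= extracted
                    and load + weight[i] <= capacity]
        if not eligible:
            # no block can be removed: initiate a new extraction period t + 1
            period, load = period + 1, 0.0
            continue
        block = max(eligible, key=lambda i: priority[i])  # highest priority first
        schedule[block] = period
        load += weight[block]
        remaining.discard(block)
        extracted.add(block)
    return schedule
```

For example, with two top-layer blocks and one block beneath them, the bottom block is deferred to period 2 once the period-1 capacity is filled, even if it carries the highest priority.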

Priority of a block. Consider its net value b_i and its impact on the extraction of other blocks in future periods. The block lookahead value (Tolwinski and Underwood) is determined by referring to the spanning cone SC_i of block i.
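As a rough illustration of the idea, the spanning cone SC_i can be computed as the transitive closure of the successor relation, and a lookahead value derived from the net values inside it. The exact Tolwinski and Underwood definition is not reproduced on the slide, so the scoring rule below (net value of i plus the positive net values in its cone) is only a hypothetical proxy:

```python
def spanning_cone(i, successors):
    """Blocks whose extraction requires block i to be removed first
    (transitive closure of the successor relation), including i itself."""
    cone, stack = {i}, [i]
    while stack:
        j = stack.pop()
        for s in successors.get(j, ()):
            if s not in cone:
                cone.add(s)
                stack.append(s)
    return cone

def lookahead(i, successors, net_value):
    """Hypothetical lookahead proxy: the net value of block i plus the
    positive net values that removing i helps to unlock in SC_i."""
    return net_value[i] + sum(max(net_value[j], 0.0)
                              for j in spanning_cone(i, successors)
                              if j != i)
```

Under this proxy, a low-value block sitting above valuable ore receives a high lookahead value, which is exactly the effect the slide motivates.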

Genotype priority vector generation. Several different genotype priority vectors can be randomly generated with a GRASP procedure biased to give higher priorities to blocks i having larger lookahead values. Several feasible solutions of (SBE) can then be obtained by decoding different genotype vectors generated with the GRASP procedure.
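A minimal sketch of such a biased generation, assuming the common GRASP mechanism of drawing each block from a restricted candidate list holding the β% best remaining blocks by lookahead value (this reading of β matches the later remark that β = 100 amounts to assigning priorities randomly):

```python
import random

def grasp_priorities(lookahead_value, beta):
    """Randomly assign priorities n, n-1, ..., 1 to blocks, biased so that
    blocks with larger lookahead values tend to receive higher priorities.

    beta (in percent) controls the greediness: each block is drawn uniformly
    from the beta% best remaining blocks; beta = 100 is a uniform draw over
    all remaining blocks, small beta is nearly deterministic.
    """
    remaining = sorted(lookahead_value, key=lookahead_value.get, reverse=True)
    priority = {}
    for rank in range(len(remaining), 0, -1):
        rcl_size = max(1, round(len(remaining) * beta / 100))
        chosen = random.choice(remaining[:rcl_size])  # restricted candidate list
        priority[chosen] = rank
        remaining.remove(chosen)
    return priority
```

Decoding several vectors produced this way yields the pool of feasible (SBE) solutions used to seed the swarm.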

Particle Swarm Procedure. An evolutionary process evolving in the set of genotype vectors to converge to an improved feasible solution of (SBE). An initial population P of M genotype vectors (individuals) is generated using GRASP. Denote the best achievement of individual k up to the current iteration, and the best overall genotype vector PRb achieved up to the current iteration.

Particle Swarm Procedure. The individual vector k is modified at each iteration.

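The update rule itself appeared as an image on the slide. A minimal sketch of a standard particle swarm update applied to the priority vector, assuming the usual velocity/position formula with inertia weight w and acceleration coefficients c1 and c2 (the values w = 0.7, c1 = c2 = 1.4 used later in the slides are taken as defaults):

```python
import random

def pso_update(pr_k, v_k, pr_best_k, pr_best, w=0.7, c1=1.4, c2=1.4):
    """One swarm iteration on the priority (genotype) vector of individual k.

    pr_k:      current priority vector of individual k (the "position")
    v_k:       current velocity vector of individual k
    pr_best_k: best vector achieved by individual k so far
    pr_best:   best overall vector PRb achieved by the population so far
    Returns the updated (position, velocity) pair.
    """
    new_pr, new_v = [], []
    for x, v, pb, gb in zip(pr_k, v_k, pr_best_k, pr_best):
        r1, r2 = random.random(), random.random()
        # inertia + attraction to the individual best + attraction to PRb
        v2 = w * v + c1 * r1 * (pb - x) + c2 * r2 * (gb - x)
        new_v.append(v2)
        new_pr.append(x + v2)
    return new_pr, new_v
```

The updated vector is then decoded with the serial scheme to evaluate its (SBE) objective value, and the individual and overall bests are refreshed accordingly.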

Numerical Results. 20 problems randomly generated over a two-dimensional grid having 20 layers and being 60 blocks wide. The 10 problems having the smaller optimal pits are used to analyse the impact of the parameters. We compare the results for 12 different sets of parameter values l.

For each set of parameter values l, each problem ρ is solved 5 times to determine: va_lρ, the average of the best values v(PRb) achieved; vb_lρ, the best value v(PRb) achieved; %_lρ, the average % of improvement; and it_lρ, the last iteration where an improvement of PRb occurs. Then for each set of parameters l, we compute the average values va_l, vb_l, %_l, and it_l over the 10 problems.

Results

Impact of β in GRASP. Comparing rows 1, 2, and 3, we observe that the values of va_l and vb_l decrease while the value of %_l increases as the value of β increases. The same observations apply to rows 4, 5, and 6. The bias toward giving higher priority to blocks with larger lookahead values grows as β decreases (β = 100 is equivalent to assigning priorities randomly). Individual genotype vectors in the initial population tend to be better as β decreases.

Impact of population size M. The value va_l in row 1 is larger than in row 4. The same is true if we compare rows 2 and 5, and rows 3 and 6. This indicates that the values of the solutions generated are better when the size of the population is larger. This makes sense since we generate a larger number of different solutions.

Impact of particle swarm parameters. Comparing the results in rows 2, 7, 8, and 9, and those in rows 5, 10, 11, and 12, there is no clear impact of modifying the values of the parameters w, c1, and c2. Note that the values w = 0.7, c1 = 1.4, and c2 = 1.4 were selected according to the authors in [19], who showed that setting the parameters close to these values gives acceptable results.

Impact of the particle swarm process. Each of the 10 other, larger problems is solved 5 times with the parameter values in set 1, recording the average value va_1ρ, the best value vb_1ρ, and the worst value vw_1ρ. Let v_greedy denote the value of the solution generated by decoding the genotype vector in which the priorities of blocks are proportional to their lookahead values. The worst value vw_1ρ is better than v_greedy for all problems, and the percentage of improvement of va_1ρ over v_greedy ranges from 2.32% to 52.24%.

Future work. Solve larger problems. Include other operational constraints found in real-world applications. Compare with other evolutionary approaches (genetic algorithm).