
1 A Novel Binary Particle Swarm Optimization

2 Binary PSO- One version In this version of PSO, each solution in the population is a binary string. –Each binary string is of dimension n and is decoded to give parameter values. –In the binary PSO, each binary string represents a particle. –Strings are updated bit by bit, based on each bit's current value, the value of that bit in the best position (by fitness) that particle has found to date, and the best value of that bit to date among its neighbors.

3 Binary PSO- What is a neighbor? For binary strings, neighbors can be selected in one of several ways. Some examples, for a neighborhood of size k: –Neighbors are the k binary strings whose Hamming distance to the particle is minimum. For equal Hamming distances, the choice is arbitrary. –In the beginning, arbitrarily assign groups of k strings to neighborhoods. –Let the neighborhood size be the population size.
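The first option above (the k strings nearest in Hamming distance) can be sketched as follows; the function names and the tie-breaking rule (population order) are illustrative choices, not from the slides:

```python
def hamming(a, b):
    """Number of bit positions in which two equal-length binary strings differ."""
    return sum(x != y for x, y in zip(a, b))

def k_nearest_neighbors(particle, population, k):
    """Pick the k strings closest to `particle` in Hamming distance.
    Ties are broken arbitrarily (here: by population order), as the slide notes."""
    others = [p for p in population if p != particle]
    return sorted(others, key=lambda p: hamming(p, particle))[:k]

population = ["01011", "01111", "10000", "01010"]
print(k_nearest_neighbors("01011", population, k=2))  # → ['01111', '01010']
```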

4 BPSO In regular (real-valued) PSO, everything is expressed in terms of a velocity. In BPSO, how does one define a velocity for a single bit? –Generally, the velocity is defined in terms of the probability of the bit changing. You will see in a minute how this works.

5 BPSO As just noted, in BPSO, bit-by-bit updates are done probabilistically. –In other words, a chosen bit (d) in a chosen string (i) is changed to a 1 with a probability P that is a function of its predisposition to be a 1, the value of that bit in the particle's own best to date, and the best value among its neighbors. –1 − P is the probability of changing to a 0. –Once P is determined, we generate a random number R; if R < P, then the bit becomes a 1, otherwise it becomes a 0.
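The rule above (draw R, compare against P) can be sketched as a small Python function; the name `update_bit` and its parameter are illustrative, since the slides give no code:

```python
import random

def update_bit(p_one, rng=random):
    """Stochastic bit update: return 1 with probability `p_one`, else 0.
    `p_one` plays the role of P from the slide (predisposition toward 1)."""
    return 1 if rng.random() < p_one else 0

random.seed(0)
samples = [update_bit(0.8) for _ in range(10000)]
print(sum(samples) / len(samples))  # close to 0.8, as expected
```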

6 BPSO The formula for an individual bit's update is:
P(x_id(t+1) = 1) = f(x_id(t), v_id(t), p_id, p_gd)
The function P is a probability, and thus once this value is computed for a given particle bit, we must generate a uniform random number to see whether the bit should be a 1 or a 0.

7 BPSO

8 The challenge is to come up with the f() from the previous slide. The value of v_id(t) determines a string's propensity to choose a 1 or a 0 for that bit. –Higher values of v_id(t) make a 1 more likely; similarly, lower values make a 0 more likely.

9 BPSO For the function
P(x_id(t+1) = 1) = f(x_id(t), v_id(t), p_id, p_gd)
we are saying that this probability is a function of the bit's current value, its "velocity", the best value of the bit to date, and the best value of the bit in the neighborhood to date. –Remember, the best to date for a bit is simply a 0 or a 1.

10 BPSO Since f will be a probability value, we know it must range between 0 and 1. There are several measures or expressions used for f; one that is commonly used is the sigmoid function:
s(v_id(t)) = 1 / (1 + e^(−v_id(t)))
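A minimal sketch of the sigmoid, showing that it maps any velocity into (0, 1) and that higher velocities push the probability toward 1:

```python
import math

def sigmoid(v):
    """Squash a velocity into a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-v))

# Higher velocity -> higher probability of the bit becoming a 1.
print(sigmoid(0.0))   # 0.5
print(sigmoid(4.0))   # ~0.982
print(sigmoid(-4.0))  # ~0.018
```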

11 BPSO In the preceding, the velocity itself is updated as
v_id(t+1) = v_id(t) + φ_1 (p_id − x_id(t)) + φ_2 (p_gd − x_id(t))
Sometimes the parameters φ_1 and φ_2 are chosen from a uniform distribution on 0–2, such that the sum of their two upper limits is 4.0.
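Assuming the velocity update takes the classic Kennedy–Eberhart form implied by the parameter description (an assumption, since the slide's own formula was an image), a sketch:

```python
import random

def update_velocity(v, x, p_best, g_best, rng=random):
    """One-bit velocity update in the classic binary-PSO form:
    v <- v + phi1*(p_best - x) + phi2*(g_best - x),
    with phi1, phi2 drawn from Uniform(0, 2) so their upper limits sum to 4.0."""
    phi1 = rng.uniform(0.0, 2.0)
    phi2 = rng.uniform(0.0, 2.0)
    return v + phi1 * (p_best - x) + phi2 * (g_best - x)

random.seed(7)
# Bit currently 0 while both bests are 1: the velocity is pushed upward.
print(update_velocity(0.0, x=0, p_best=1, g_best=1))
```

Note that when the bit already matches both bests, the attraction terms vanish and the velocity is unchanged.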

12 BPSO Example As an example, let's say that we are dealing with 5-bit binary particles and a population of 4 particles. We are updating particle 2 (01011), bit 3 (currently a 0).

13 BPSO Example Furthermore, we will assume some current propensity (velocity) of this bit to be a 1. Assume also some best value of this particle (to date), and that the best value of the whole population (to date) was 01111.

14 BPSO Example Thus we have:

15 BPSO Example Now, with the value for f, we generate a random number; if it is < f, then bit x becomes a 1; otherwise, it becomes a 0.
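The whole example can be traced in code. The personal best and the starting velocity below are ASSUMED placeholders (the slides' numeric values did not survive in the transcript), so the output is only illustrative:

```python
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

random.seed(42)

x = [0, 1, 0, 1, 1]        # particle 2 from the slides: 01011
d = 2                      # bit 3 (1-indexed) -> index 2
g_best = [0, 1, 1, 1, 1]   # population best from the slides: 01111
p_best = [0, 1, 0, 1, 0]   # ASSUMED personal best (the slide's value is lost)
v = 0.2                    # ASSUMED current velocity of this bit

# Velocity update, then sigmoid, then the stochastic bit decision.
phi1, phi2 = random.uniform(0, 2), random.uniform(0, 2)
v_new = v + phi1 * (p_best[d] - x[d]) + phi2 * (g_best[d] - x[d])
P = sigmoid(v_new)
R = random.random()
x[d] = 1 if R < P else 0
print(v_new, P, x)
```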

16 BPSO - Parameters Sometimes the v value is limited so that f does not approach too closely to 0.0 or 1.0. –In this case, constant limits [V_min, V_max] are used. –When v_id > V_max, v_id is set to V_max. –When v_id < V_min, v_id is set to V_min.
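A sketch of the clamping rule; the bound ±4.0 is an assumed common choice, not a value from the slides:

```python
def clamp(v, v_min=-4.0, v_max=4.0):
    """Keep the velocity inside [V_min, V_max] so the sigmoid never saturates:
    with |v| <= 4, the probability stays roughly within [0.018, 0.982]."""
    return max(v_min, min(v, v_max))

print(clamp(10.0), clamp(-7.5), clamp(1.3))  # 4.0 -4.0 1.3
```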

17 BPSO - Initializing There are a few things that need to be initialized. –Initial population (particle) values: just randomly generate binary strings. –Initial velocities can also be generated at random.
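A sketch of the initialization step. The slide's exact velocity expression was an image; a common choice, assumed here, is to draw each initial velocity uniformly over [V_min, V_max]:

```python
import random

def init_swarm(n_particles, n_bits, v_min=-4.0, v_max=4.0, rng=random):
    """Random binary particles plus per-bit velocities.
    ASSUMED: velocities uniform over [V_min, V_max] (a common choice)."""
    positions = [[rng.randint(0, 1) for _ in range(n_bits)]
                 for _ in range(n_particles)]
    velocities = [[v_min + (v_max - v_min) * rng.random() for _ in range(n_bits)]
                  for _ in range(n_particles)]
    return positions, velocities

random.seed(1)
pos, vel = init_swarm(4, 5)  # 4 particles of 5 bits, as in the earlier example
print(pos)
```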

Main problems with binary PSO Parameters of the binary PSO –The effects of these parameters are the opposite of those for the real-valued PSO. –Values of w < 1 prevent convergence: for values of −1 < w < 1, v_ij becomes 0 over time, so s(v_ij) tends toward 0.5 and each bit becomes equally likely to be a 1 or a 0. Memory of the BPSO 18
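The convergence problem can be seen by iterating only the homogeneous part of the velocity update (the attraction terms are dropped here, an illustrative simplification): for |w| < 1 the velocity contracts to 0, so the sigmoid drifts to 0.5 and the bit is decided by a coin flip.

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

# v(t+1) = w * v(t): the homogeneous part of the update only.
w, v = 0.9, 4.0
for _ in range(100):
    v = w * v
print(v, sigmoid(v))  # v is tiny; the bit is a 1 or a 0 with ~50% probability
```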

Proposed Binary Particle Swarm Optimization Two velocities for PSO 19
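The transcript does not spell out the two-velocity scheme, so the sketch below is one ASSUMED realization of the general idea: keep one velocity measuring the tendency of the bit to become 1 and another measuring its tendency to become 0, and let the velocity that would *flip* the current bit drive the probability of change.

```python
import math
import random

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def update_bit_two_velocity(x, v1, v0, rng=random):
    """Sketch of a two-velocity bit update (ASSUMED realization; the
    transcript gives no formulas). v1 is the tendency of the bit to
    become 1, v0 its tendency to become 0. The velocity corresponding
    to changing the current bit sets the probability of flipping."""
    v_change = v0 if x == 1 else v1
    if rng.random() < sigmoid(v_change):
        return 1 - x  # flip the bit
    return x          # keep the bit
```

Unlike the single-velocity rule, this update conditions explicitly on the bit's current state, which is one way to give the particle the "memory" the previous slide says the basic BPSO lacks.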

Results 20
