1 Logic Aided Lamarckian Evolution
Evelina Lamma (1), Fabrizio Riguzzi (2), Luís Moniz Pereira (3)
(1) DEIS, University of Bologna, Italy
(2) DI, University of Ferrara, Italy
(3) CENTRIA, Departamento de Informática, Universidade Nova de Lisboa, Portugal

2 Summary
- Genetic algorithms
- Lamarckian operator
- Multi-agent genetic algorithms
- Genes and Memes
- Multi-agent crossover
- Belief revision
- Evolutionary approach to belief revision
- Example
- Experiments
- Conclusions

3 Genetic Algorithms
- Darwinian operators:
  - selection
  - mutation
  - crossover

4 Lamarckian operator
- Given a chromosome:
  - express it as a phenotype
  - modify the phenotype in order to improve its fitness
  - translate the phenotype back into a genotype
- A model of cultural evolution
- Concept of "meme"
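The slides give no code for this step; the following is a minimal Python sketch of the generic Lamarckian pattern, assuming a bitstring genotype and caller-supplied decode/improve/encode functions (all assumptions of the sketch, not taken from the paper).

def lamarckian_step(genotype, decode, improve, encode):
    """Generic Lamarckian step, following slide 4:
    1. express the chromosome as a phenotype,
    2. modify the phenotype so as to improve its fitness,
    3. translate the improved phenotype back into a genotype.
    decode/improve/encode are problem-specific callables supplied by the
    caller; they are assumptions of this sketch, not given in the slides."""
    phenotype = decode(genotype)
    improved = improve(phenotype)          # e.g. hill climbing or a repair step
    return encode(improved)                # acquired improvements become heritable

# Example 'improve' for a bitstring phenotype: greedy single-bit hill climbing.
def hill_climb(bits, fitness):
    current = list(bits)
    for i in range(len(current)):
        flipped = current[:i] + [1 - current[i]] + current[i + 1:]
        if fitness(flipped) > fitness(current):
            current = flipped
    return current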

5 GAs in Multi-Agent Systems
- MAS: communication of knowledge by means of explicit messages
- Added here: communication of knowledge by exchange of genes and memes
- Two settings: if the number of agents is fixed, each agent has a pool of chromosomes of its own; alternatively, each agent is a single chromosome and there is a single pool of agents

6 Genetic Operators
- Crossover: used to exchange genes and memes among agents
  - a chromosome in an agent is crossed with chromosomes from other agents
- Lamarckian operator: used to locally improve fitness by experience-directed self-mutation

7 Genes and Memes
- Genes are modified only by Darwinian operators
  - individual "physical" features are fixed
  - inherited irrespective of parental learning
- Memes are modified by both Darwinian and Lamarckian operators
  - individual "cultural" features are changeable
  - inherited via parental learning

8 Asymmetrical flow of memes
- Memes only go from teacher to learner
- In crossover:
  - genes are copied from both parents
  - memes are copied from another agent only if that agent has "accessed" and "tagged" them:
    - accessed: confirmed or modified after an application of the Lamarckian operator
    - tagged: an extra bit is associated with each meme in order to code whether the meme has been accessed
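As an illustration only (not code from the paper), a chromosome carrying genes, memes, and the per-meme "accessed" tag bit could be represented as follows; the field names are assumptions.

from dataclasses import dataclass, field
from typing import List

@dataclass
class Chromosome:
    """Bitstring chromosome split into genes and memes (slides 7-8).
    'accessed' holds the extra tag bit per meme: it becomes True once the
    Lamarckian operator has confirmed or modified that meme."""
    genes: List[int]                       # fixed 'physical' features
    memes: List[int]                       # changeable 'cultural' features
    accessed: List[bool] = field(default_factory=list)

    def __post_init__(self):
        if not self.accessed:
            # memes start out unaccessed until the Lamarckian operator touches them
            self.accessed = [False] * len(self.memes)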

9 Multi-agent crossover
- A new offspring is produced from two parent chromosomes
  - one parent comes from the pool of another agent
  - bits from each parent are copied according to a mask
- The mask is such that:
  - genes are selected randomly, half from each parent
  - memes are selected randomly, half from the memes of the other agent, but only if they have been accessed
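A minimal sketch of this mask-based crossover, reusing the Chromosome class from the previous sketch; the exact mask construction is not specified in the slides beyond "randomly, half from each parent", so the coin flips below are an assumption.

import random

def multiagent_crossover(own, other):
    """Cross a chromosome from this agent's pool ('own') with one taken from
    another agent ('other'), following slide 9:
    - each gene is copied from a randomly chosen parent (about half each);
    - each meme is taken from 'other' only if 'other' has accessed it,
      and is otherwise kept from 'own'."""
    gene_mask = [random.random() < 0.5 for _ in own.genes]
    child_genes = [o if take_other else s
                   for s, o, take_other in zip(own.genes, other.genes, gene_mask)]

    child_memes, child_accessed = [], []
    for s, o, o_acc, s_acc in zip(own.memes, other.memes, other.accessed, own.accessed):
        take_other = o_acc and random.random() < 0.5   # asymmetric meme flow
        child_memes.append(o if take_other else s)
        child_accessed.append(o_acc if take_other else s_acc)

    return Chromosome(child_genes, child_memes, child_accessed)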

10 Multi-agent crossover
[Figure: mask-based crossover of chromosomes from agents Ag1 and Ag2, producing a child in Ag1's pool; the mask spans both the gene and the meme segments.]

11 Genetic algorithm
GA(max_gen, p, r, m, l, Fitness)
- max_gen: maximum number of generations before termination
- p: number of individuals in the population
- r: fraction of the population to be replaced by crossover at each step
- m: fraction of the population to be mutated
- l: fraction of the population that evolves Lamarckianly
- Fitness: fitness function F(h_i)

12 Genetic algorithm
GA(max_gen, p, r, m, l, Fitness)
  Initialize the population: P := set of p randomly generated hypotheses
  gen := 0
  while gen <= max_gen
    generate P_S by applying the following operators to P:
      selection, crossover, mutation, Lamarck
    update: P := P_S
    gen := gen + 1
  return the hypothesis from P_S with the highest fitness
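A hedged Python sketch of this loop (not the authors' code): it assumes a random_hypothesis() factory for fresh Chromosomes and a list other_agents of the other agents' pools, and it calls operator helpers (select, crossover, mutate, lamarck) that are sketched after the next slide.

def genetic_algorithm(max_gen, p, r, m, l, fitness, random_hypothesis, other_agents):
    """Main GA loop of slide 12; parameter meanings follow slide 11.
    random_hypothesis and other_agents are assumptions of this sketch."""
    P = [random_hypothesis() for _ in range(p)]
    for gen in range(max_gen + 1):
        P_S = select(P, int((1 - r) * p), fitness)             # copy survivors
        P_S += crossover(P, other_agents, int(r * p), fitness)  # multi-agent offspring
        mutate(P_S, m)
        lamarck(P_S, int(l * p))
        P = P_S
    return max(P, key=fitness)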

13 Genetic operators
- select: select (1-r)·p hypotheses from P with a probability Pb proportional to their fitness and add them to P_S
- crossover: for i := 1 to r·p
  - select h_1 from P with probability Pb
  - select h_2 from another agent chosen at random
  - cross h_1 with h_2 obtaining h'_1, and add h'_1 to P_S
- mutate: choose m percent of the members of P_S with uniform probability and, for each, invert one randomly chosen bit
- Lamarck: choose l·p hypotheses from P_S with uniform probability and apply the Lamarckian operator to them
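Possible Python implementations of these operators, under the same assumptions as the previous sketches (roulette-wheel selection for the fitness-proportional probability Pb, mutation acting on meme bits, and multiagent_crossover from the slide 9 sketch); they are illustrations, not the authors' code.

import random

def select(P, n, fitness):
    """Fitness-proportional (roulette-wheel) selection of n hypotheses."""
    weights = [fitness(h) for h in P]
    return random.choices(P, weights=weights, k=n)

def crossover(P, other_agents, n, fitness):
    """Produce n offspring; the second parent comes from another agent."""
    offspring = []
    for _ in range(n):
        h1 = select(P, 1, fitness)[0]
        other_pool = random.choice(other_agents)
        h2 = random.choice(other_pool)
        offspring.append(multiagent_crossover(h1, h2))
    return offspring

def mutate(P_S, m):
    """Invert one random meme bit in a fraction m of the population, in place."""
    for h in random.sample(P_S, int(m * len(P_S))):
        i = random.randrange(len(h.memes))
        h.memes[i] = 1 - h.memes[i]

def lamarck(P_S, n):
    """Apply the Lamarckian operator to n uniformly chosen hypotheses
    (the operator itself is sketched at the assumption-set level after
    slide 27; here it is assumed to act directly on each chromosome)."""
    for h in random.sample(P_S, n):
        lamarckian_operator(h)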

14 Belief Revision
- An important functionality of agents.
- Problem definition. Given:
  - an extended logic program containing integrity constraints, i.e. rules of the form
    ⊥ ← B_1, …, B_n, not C_1, …, not C_m
  - a set of revisable literals, i.e. literals for which revision is allowed
    - they must not have any definition in the program

15 Belief Revision
- Find: a truth value for the revisable literals such that the program is not contradictory, i.e., ⊥ does not belong to the model of the program

16 GAs for Belief Revision
- Genetic algorithms can be used for belief revision:
  - each revisable is encoded with a meme
  - the meme has value 1 if the revisable is true and 0 if it is false
  - each set of assumptions about the values of the revisables is coded as a chromosome
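For example (an illustration using the c17 gate names that appear later in the slides; the ordering of revisables is an assumption), the meme encoding could look like this:

# Revisable literals for the c17 example: ab(g) for each gate (order assumed).
REVISABLES = ["ab(g10)", "ab(g11)", "ab(g16)", "ab(g19)", "ab(g22)", "ab(g23)"]

def decode_memes(memes):
    """Map a meme bitstring to the set of revisables assumed true."""
    return {lit for lit, bit in zip(REVISABLES, memes) if bit == 1}

# Example: memes [0, 0, 0, 0, 1, 0] encode the assumption set {ab(g22)},
# i.e. all gates behave normally except g22.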

17 Fitness function
- n_i: number of integrity constraints satisfied by hypothesis h_i
- n: total number of integrity constraints
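The formula itself did not survive the transcript; given the two quantities above, a plausible reconstruction (an assumption, not taken from the slides) is the fraction of satisfied constraints:

F(h_i) = n_i / n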

18 Example
- Digital circuit diagnosis
- Revisable literals indicate the assumed behaviour mode of each gate:
  - not ab(gate): the gate behaves normally
  - ab(gate): the gate behaves abnormally

19 Example: circuit c17
[Figure: schematic of the c17 benchmark circuit, with NAND gates g10, g11, g16, g19, g22, g23, primary inputs g1, g2, g3, g6, g7, and the observed outputs.]

20 Example: circuit c17
val(in(Type, Name, Nr), V) :-
    conn(in(Type, Name, Nr), out(Type2, Name2)),
    val(out(Type2, Name2), V).

val(out(nand, Name), V) :-
    not ab(Name),
    val(in(nand, Name, 1), W1),
    val(in(nand, Name, 2), W2),
    nand_table(W1, W2, V).
nand_table(0, 0, 1).
...

val(out(nand, Name), V) :-
    ab(Name),
    val(in(nand, Name, 1), W1),
    val(in(nand, Name, 2), W2),
    and_table(W1, W2, V).

val(out(inpt0, Name), V) :-
    obs(out(inpt0, Name), V).

21 Topology
conn(in(nand, g10, 1), out(inpt0, g1)).
conn(in(nand, g10, 2), out(inpt0, g3)).
conn(in(nand, g11, 1), out(inpt0, g3)).
conn(in(nand, g11, 2), out(inpt0, g6)).
conn(in(nand, g16, 1), out(inpt0, g2)).
conn(in(nand, g16, 2), out(nand, g11)).
conn(in(nand, g19, 1), out(nand, g11)).
conn(in(nand, g19, 2), out(inpt0, g7)).
conn(in(nand, g22, 1), out(nand, g10)).
conn(in(nand, g22, 2), out(nand, g16)).
conn(in(nand, g23, 1), out(nand, g16)).
conn(in(nand, g23, 2), out(nand, g19)).

22 Observations and constraints
⊥ :- obs(out(nand, g22), 0), val(out(nand, g22), 1).
⊥ :- obs(out(nand, g22), 1), val(out(nand, g22), 0).
⊥ :- obs(out(nand, g23), 0), val(out(nand, g23), 1).
⊥ :- obs(out(nand, g23), 1), val(out(nand, g23), 0).

obs(out(inpt0, g1), 0).
obs(out(inpt0, g2), 1).
obs(out(inpt0, g3), 0).
obs(out(inpt0, g6), 0).
obs(out(inpt0, g7), 0).
obs(out(nand, g22), 0).
obs(out(nand, g23), 1).

23 Diagnosis
- One of the integrity constraints is violated:
  - the observed output for g22 is different from the computed output
- The contradiction is removed by assuming ab(g22), which is a diagnosis for the circuit
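To make the example executable, here is a hedged Python re-encoding of the c17 model above (not the authors' Prolog): it simulates the circuit under a given set of abnormality assumptions and counts how many observed outputs are reproduced, from which the number n_i of satisfied constraints follows.

# Gate connections and input observations, transcribed from slides 21-22.
GATES = {                        # gate -> (input1, input2)
    "g10": ("g1", "g3"), "g11": ("g3", "g6"), "g16": ("g2", "g11"),
    "g19": ("g11", "g7"), "g22": ("g10", "g16"), "g23": ("g16", "g19"),
}
INPUTS = {"g1": 0, "g2": 1, "g3": 0, "g6": 0, "g7": 0}
OBSERVED_OUTPUTS = {"g22": 0, "g23": 1}

def simulate(ab):
    """Compute all gate outputs; an abnormal NAND behaves as an AND (slide 20)."""
    val = dict(INPUTS)
    for g in ["g10", "g11", "g16", "g19", "g22", "g23"]:   # topological order
        a, b = (val[i] for i in GATES[g])
        conj = a & b
        val[g] = conj if g in ab else 1 - conj             # AND if abnormal, else NAND
    return val

def matched_outputs(ab):
    """Observed outputs reproduced under assumption set ab. Each mismatch
    violates exactly one of the two ICs written for that output on slide 22,
    so n_i = n - (number of mismatched outputs)."""
    val = simulate(ab)
    return sum(val[g] == obs for g, obs in OBSERVED_OUTPUTS.items())

# matched_outputs(set())     == 1  -> one IC violated (g22), as on slide 23
# matched_outputs({"g22"})   == 2  -> assuming ab(g22) removes the contradiction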

24 Belief Revision
- Support set: a support set of a literal L of a program P, denoted by SS(L), is a set of revisables sufficient to support a derivation of L in P

25 Belief Revision
- Hitting set: a hitting set for a collection of support sets SS(L) is the union of one non-empty subset taken from each SS(L). It is minimal iff no proper subset of it is a hitting set.
- A contradiction removal set is a hitting set for the SS(⊥).
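A small, hedged sketch of how such hitting sets could be enumerated. It is simplified to taking a single literal (a singleton non-empty subset) from each support set, which always yields a hitting set under the definition above; the full definition also admits larger subsets.

from itertools import product

def hitting_sets(support_sets_of_bottom):
    """Enumerate hitting sets by taking one literal from each support set,
    then keep only the minimal ones."""
    candidates = {frozenset(choice) for choice in product(*support_sets_of_bottom)}
    return [hs for hs in candidates
            if not any(other < hs for other in candidates)]

# Example with the two Lamarckian support sets of slide 30:
# hitting_sets([{"not ab(g11)", "not ab(g19)", "ab(g16)", "not ab(g23)"},
#               {"not ab(g11)", "ab(g16)", "ab(g10)", "not ab(g22)"}])
# contains e.g. {"not ab(g11)"} and {"ab(g16)"}, each hitting both sets.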

26 Lamarckian operator
- The Lamarckian operator uses techniques similar to the BR ones.
- It differs from BR in that it starts from an arbitrary chromosome C.
- The Lamarckian support sets are all the support sets that are subsets of the current chromosome C.

27 Lamarckian operator
- find all the Lamarckian support sets for ⊥ with respect to C
- find a hitting set HS(⊥) for them
- change in C all of its literals that are in HS(⊥)
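Putting the pieces together, a hedged sketch of the Lamarckian operator at the assumption-set level (it assumes the support sets of ⊥ are available, e.g. computed as in the sketch after slide 36, and reuses hitting_sets and the literal syntax from the earlier sketches).

import random

def lamarckian_operator_on_assumptions(C, all_support_sets):
    """Lamarckian operator of slide 27 on an assumption set C, e.g.
    C = {"ab(g10)", "not ab(g11)", ...}."""
    # 1. Lamarckian support sets: those entirely contained in C (slide 26).
    lamarckian_ss = [ss for ss in all_support_sets if set(ss) <= C]
    if not lamarckian_ss:
        return C                                   # nothing to repair
    # 2. Pick one hitting set; the slides return one changed C per hitting
    #    set, here a single one is chosen at random for simplicity.
    hs = random.choice(hitting_sets(lamarckian_ss))
    # 3. Flip in C every literal that occurs in the hitting set.
    flipped = {complement(l) for l in hs}
    return (C - hs) | flipped

def complement(literal):
    """Default complement of a revisable literal (assumed string syntax)."""
    return literal[4:] if literal.startswith("not ") else "not " + literal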

28 Example: circuit c17
- Suppose, initially:
  C = {ab(g10), not ab(g11), ab(g16), not ab(g19), not ab(g22), not ab(g23)}
- In this case, two constraints are violated, because out(g22) = 1 and out(g23) = 0

29 Example: circuit c17
- A BR operator would return as changes to C:
  {not ab(g10), not ab(g11), not ab(g16), not ab(g19), ab(g22), not ab(g23)}
- these assumptions are consistent with both ICs

30 Example: circuit c17
- Lamarckian support sets of ⊥:
  [not ab(g11), not ab(g19), not ab(g11), ab(g16), not ab(g23)]
  [not ab(g11), ab(g16), ab(g10), not ab(g22)]
- Lamarck returns these changes to C, one for each hitting set:
  C = {ab(g10), ab(g11), ab(g16), not ab(g19), not ab(g22), not ab(g23)}
  C = {ab(g10), not ab(g11), not ab(g16), not ab(g19), not ab(g22), not ab(g23)}
- in either case, one constraint is still violated

31 Experiments
- ISCAS85 collection of benchmark digital circuits
- Four algorithms considered:
  - S-L: single-agent GA without the Lamarckian operator
  - M-L: as S-L, but multi-agent
  - M+L-A: as M-L plus the Lamarckian operator, without asymmetry
  - M+L+A: as M+L-A plus asymmetry

32 Results
- alu4_flat circuit
  - 100 gates (100 revisables)
  - 8 outputs (16 constraints)
- 4 agents, with the same observations and constraints
- 10 chromosomes each, l = 0.6
- 5 experiments
- Average fitness: [plot not preserved in the transcript]

33 Conclusions
- A framework for solving problems represented with logic:
  - belief revision
  - dynamic world, control of observable outputs
- Performance is improved by:
  - distributed agents
  - the Lamarckian operator
  - asymmetric crossover on memes

34 Future work
- Situations where:
  - agents do not have the same observations, constraints or revisables
  - observations change over time
- Three-valued memes for expressing irrelevancy
- Integrating Lamarckism with other agent features

35
[ab(c11gat), ab(c19gat), ab(c11gat), ab(c16gat), ab(c23gat)],
[not ab(c11gat), ab(c19gat), ab(c11gat), ab(c16gat), ab(c23gat)],
[ab(c11gat), not ab(c19gat), ab(c11gat), ab(c16gat), ab(c23gat)],
[not ab(c11gat), not ab(c19gat), ab(c11gat), ab(c16gat), ab(c23gat)],
[ab(c11gat), ab(c19gat), not ab(c11gat), ab(c16gat), ab(c23gat)],
[not ab(c11gat), ab(c19gat), not ab(c11gat), ab(c16gat), ab(c23gat)],
[ab(c11gat), ab(c19gat), ab(c11gat), not ab(c16gat), ab(c23gat)],
[not ab(c11gat), ab(c19gat), ab(c11gat), not ab(c16gat), ab(c23gat)],
[ab(c11gat), ab(c19gat), not ab(c11gat), not ab(c16gat), ab(c23gat)],
[not ab(c11gat), ab(c19gat), not ab(c11gat), not ab(c16gat), ab(c23gat)],
[ab(c11gat), not ab(c19gat), not ab(c11gat), not ab(c16gat), ab(c23gat)],
[not ab(c11gat), not ab(c19gat), not ab(c11gat), not ab(c16gat), ab(c23gat)],
[ab(c11gat), not ab(c19gat), not ab(c11gat), ab(c16gat), not ab(c23gat)],
[not ab(c11gat), not ab(c19gat), not ab(c11gat), ab(c16gat), not ab(c23gat)],
[ab(c11gat), not ab(c19gat), ab(c11gat), not ab(c16gat), not ab(c23gat)],
[not ab(c11gat), not ab(c19gat), ab(c11gat), not ab(c16gat), not ab(c23gat)],
[not ab(c11gat), ab(c16gat), not ab(c10gat), ab(c22gat)],
[ab(c11gat), not ab(c16gat), not ab(c10gat), ab(c22gat)],
[ab(c11gat), ab(c16gat), ab(c10gat), not ab(c22gat)],
[not ab(c11gat), ab(c16gat), ab(c10gat), not ab(c22gat)],
[ab(c11gat), not ab(c16gat), ab(c10gat), not ab(c22gat)],
[not ab(c11gat), not ab(c16gat), ab(c10gat), not ab(c22gat)],
[ab(c11gat), ab(c16gat), not ab(c10gat), not ab(c22gat)],
[not ab(c11gat), not ab(c16gat), not ab(c10gat), not ab(c22gat)]

36 Belief Revision
- Support set: a support set of a literal L of a program P, denoted by SS(L), is obtained as follows:
  - if L is not a revisable literal, then, for each rule L ← B in P, there is one SS(L) given by the union of SS(B_i) for each B_i ∈ B; if B is empty then SS(L) = {}
  - if L is a revisable literal, then SS(L) = {L}
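A hedged Python sketch of this recursive definition (an illustration, not the authors' implementation): the program is given as a mapping from each non-revisable literal to its list of rule bodies, and every combination of support sets of the body literals yields one support set of the head. It assumes an acyclic program, which holds for the c17 example.

from itertools import product

def support_sets(literal, rules, revisables):
    """All support sets SS(literal), following the definition on slide 36.
    rules maps each non-revisable literal to a list of rule bodies (each a
    list of literals); revisables is the set of revisable literals."""
    if literal in revisables:
        return [frozenset([literal])]              # SS(L) = {L}
    result = []
    for body in rules.get(literal, []):
        if not body:                               # empty body: SS(L) = {}
            result.append(frozenset())
            continue
        # one SS per combination of support sets of the body literals
        for combo in product(*(support_sets(b, rules, revisables) for b in body)):
            result.append(frozenset().union(*combo))
    return result

# Toy usage (hypothetical program): bottom <- p, not ab(a).   p <- not ab(b).
# rules = {"bottom": [["p", "not ab(a)"]], "p": [["not ab(b)"]]}
# revisables = {"not ab(a)", "ab(a)", "not ab(b)", "ab(b)"}
# support_sets("bottom", rules, revisables) == [frozenset({"not ab(a)", "not ab(b)"})]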

37 Lamarckian operator
- Lamarckian support set: given a hypothesis C, a Lamarckian support set of a literal L of a program P, denoted by SS(L), is obtained as follows:
  - if L is not a revisable literal, then, for each rule L ← B in P, there is one SS(L) given by the union of SS(B_i) for each B_i ∈ B; if B is empty then SS(L) = {}
  - if L is a revisable literal, then:
    - if L belongs to C, then SS(L) = {L}
    - if L is not in C, or the default complement of L belongs to C, then the SS(L) under construction is not a support set

38 Results, single agent
- Single agent, with and without the Lamarckian operator
- Fitness function:
  - f_i: number of revisables of h_i that are false