Estimation of Distribution Algorithm based on Markov Random Fields
Siddhartha Shakya
School of Computing, The Robert Gordon University
24-02-2006
Outline
From GAs to EDAs
Probabilistic graphical models in EDAs
–Bayesian networks
–Markov random fields
Fitness modelling approach to estimating and sampling an MRF in EDA
–Gibbs distribution, energy function and modelling the fitness
–Estimating parameters (fitness modelling approach)
–Sampling the MRF (several different approaches)
Conclusion
Genetic Algorithms (GAs)
Population-based optimisation technique
Based on Darwin's theory of evolution
A solution is encoded as a string of symbols known as a chromosome
A population of solutions is generated
Genetic operators are then applied to the population to produce the next generation, which replaces the parent population
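The steps above can be sketched as a minimal GA. This is an illustrative sketch, not code from the talk: the OneMax fitness, tournament selection, one-point crossover and all parameter values are assumptions.

```python
import random

def onemax(chromosome):
    """Fitness: number of 1s in the chromosome."""
    return sum(chromosome)

def tournament(pop, fitness, k=2):
    """Pick the best of k randomly chosen parents."""
    return max(random.sample(pop, k), key=fitness)

def ga(n=20, pop_size=30, generations=50, p_mut=0.05, seed=0):
    """Minimal generational GA: select, cross over, mutate, replace."""
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            a, b = tournament(pop, onemax), tournament(pop, onemax)
            cut = random.randrange(1, n)                  # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            new_pop.append(child)
        pop = new_pop                                     # replace the parents
    return max(pop, key=onemax)
```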
Simple GA simulation
GA to EDA
Simple EDA simulation
[Worked example: from the selected bitstrings 01111, 10101, 00101 and 01000, a vector of univariate marginal probabilities is estimated, and a new population (00111, 11101, 10101, 01111) is sampled from it.]
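The simulation above (estimate a probability vector from the selected set, then sample the next population from it) can be sketched as a minimal univariate EDA. The OneMax fitness, truncation selection and the parameter values are illustrative assumptions.

```python
import random

def onemax(x):
    return sum(x)

def umda(n=5, pop_size=20, generations=30, seed=1):
    """Univariate EDA: estimate a probability vector from the fittest
    half of the population, then sample the next population from it."""
    random.seed(seed)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        selected = sorted(pop, key=onemax, reverse=True)[:pop_size // 2]
        # marginal frequency of a 1 at each position
        p = [sum(x[i] for x in selected) / len(selected) for i in range(n)]
        pop = [[1 if random.random() < p[i] else 0 for i in range(n)]
               for _ in range(pop_size)]
    return max(pop, key=onemax)
```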
Joint Probability Distribution (JPD)
A solution is viewed as a set of random variables
The joint probability distribution (JPD) is defined over these variables
It is exponential in the number of variables, and therefore not feasible to calculate in most cases
It needs simplification!
Factorisation of the JPD
Univariate model: no interactions; the simplest model
Bivariate model: pair-wise interactions
Multivariate model: interactions of more than two variables
Typical estimation and sampling of the JPD in EDAs
Learn the interactions between the variables in the solution
Learn the probabilities associated with the interacting variables
This specifies the JPD: p(x)
Sample the JPD (i.e. the learned probabilities)
Probabilistic Graphical Models
An efficient tool to represent the factorisation of the JPD
A marriage between probability theory and graph theory
Consist of two components
–Structure
–Parameters
Two types of PGM
–Directed PGM (Bayesian networks)
–Undirected PGM (Markov random fields)
Directed PGM (Bayesian networks)
Structure: directed acyclic graph (DAG)
Independence relationship: a variable is conditionally independent of the rest of the variables given its parents
Parameters: conditional probabilities
[Example DAG over X1, X2, X3, X4, X5]
Bayesian networks
The factorisation of the JPD, encoded in terms of conditional probabilities, is the JPD for a BN:
p(x) = ∏i p(xi | pa(Xi))
[Example DAG over X1, X2, X3, X4, X5]
Estimating a Bayesian network
Estimate the structure
Estimate the parameters
This completely specifies the JPD
The JPD can then be sampled
BN-based EDAs
1. Initialise parent solutions
2. Select a set from the parent solutions
3. Estimate a BN from the selected set
   a. Estimate the structure
   b. Estimate the parameters
4. Sample the BN to generate the new population
5. Replace the parents with the new set and go to 2 until the termination criterion is satisfied
How to estimate and sample a BN in EDAs
Estimating the structure
–Score + search techniques
–Conditional independence tests
Estimating the parameters
–Trivial in EDAs: the dataset is complete
–Estimate the probabilities of parents before children
Sampling
–Probabilistic logic sampling (sample parents before children)
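Probabilistic logic sampling can be sketched as follows: visit the variables in an ancestral order and sample each from its conditional distribution given the already-sampled parents. The three-variable network and its probabilities below are made-up illustrations, not from the slides.

```python
import random

# Toy BN: X1 -> X2, X1 -> X3 (structure and CPT values are illustrative)
p_x1 = 0.7                                  # p(X1 = 1)
p_x2_given_x1 = {0: 0.2, 1: 0.9}            # p(X2 = 1 | X1)
p_x3_given_x1 = {0: 0.5, 1: 0.4}            # p(X3 = 1 | X1)

def sample_bn():
    """Ancestral sampling: parents are sampled before their children."""
    x1 = int(random.random() < p_x1)
    x2 = int(random.random() < p_x2_given_x1[x1])
    x3 = int(random.random() < p_x3_given_x1[x1])
    return x1, x2, x3
```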
BN-based EDAs
A well-established approach in EDAs: BOA, EBNA, LFDA, MIMIC, COMIT, BMDA
References
–Larrañaga and Lozano 2002
–Pelikan 2002
Markov Random Fields (MRFs)
Structure: undirected graph
Local independence: a variable is conditionally independent of the rest of the variables given its neighbours
Global independence: two sets of variables are conditionally independent of each other if there is a third set that separates them
Parameters: potential functions defined on the cliques
[Example undirected graph over X1, X2, X3, X4, X5, X6]
Markov random fields
The factorisation of the JPD, encoded in terms of potential functions over the maximal cliques, is the JPD for an MRF:
p(x) = (1/Z) ∏c ψc(xc)
[Example undirected graph over X1, X2, X3, X4, X5, X6]
Estimating a Markov random field
Estimate the structure from data
Estimate the parameters
–Requires the potential functions to be numerically defined
This completely specifies the JPD
The JPD can then be sampled
–There is no specific ordering of variables (it is not a DAG), so sampling is a bit problematic
MRF in EDA
Has recently been proposed as an estimation-of-distribution technique in EDA
–Shakya et al. 2004, 2005
–Santana et al. 2003, 2005
MRF-based EDA
1. Initialise parent solutions
2. Select a set from the parent solutions
3. Estimate an MRF from the selected set
   a. Estimate the structure
   b. Estimate the parameters
4. Sample the MRF to generate the new population
5. Replace the parents with the new solutions and go to 2 until the termination criterion is satisfied
How to estimate and sample an MRF in EDA
Learning the structure
–Conditional independence tests (MN-EDA, MN-FDA)
–Linkage detection algorithm (LDFA)
Learning the parameters
–Junction tree approach (FDA)
–Junction graph approach (MN-FDA)
–Kikuchi approximation approach (MN-EDA)
–Fitness modelling approach (DEUM)
Sampling
–Probabilistic logic sampling (FDA, MN-FDA)
–Probability vector approach (DEUMpv)
–Direct sampling of the Gibbs distribution (DEUMd)
–Metropolis sampler (Is-DEUMm)
–Gibbs sampler (Is-DEUMg, MN-EDA)
Fitness modelling approach
Hammersley-Clifford theorem: the JPD for any MRF follows a Gibbs distribution (a)
The energy of the Gibbs distribution is defined in terms of the potential functions over the cliques
Assume the probability of a solution is proportional to its fitness (b)
From (a) and (b) a model of the fitness function, the MRF fitness model (MFM), is derived
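The slide's own equations were shown as images; a sketch of the derivation, using T for the temperature and U(x) for the energy:

```latex
% (a) Gibbs distribution for the MRF (Hammersley-Clifford)
p(\mathbf{x}) = \frac{e^{-U(\mathbf{x})/T}}{\sum_{\mathbf{y}} e^{-U(\mathbf{y})/T}}
% (b) probability of a solution proportional to its fitness
p(\mathbf{x}) = \frac{f(\mathbf{x})}{\sum_{\mathbf{y}} f(\mathbf{y})}
% equating (a) and (b) and taking logarithms gives the MRF fitness model
\ln f(\mathbf{x}) = -\frac{U(\mathbf{x})}{T} + \mathrm{const}
```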
MRF fitness model (MFM)
Properties
–Completely specifies the JPD for the MRF
–Negative relationship between fitness and energy, i.e. minimising the energy = maximising the fitness
Tasks
–Find the structure for the MRF
–Numerically define the clique potential functions
MRF fitness model (MFM)
Let us start with the simplest model, the univariate model; this eliminates structure learning :)
For the univariate model there are n singleton cliques
For each singleton clique, assign a potential function
The corresponding MFM relates fitness to a sum of singleton terms, -ln f(x) = U(x) = α1 x1 + α2 x2 + … + αn xn (up to the temperature and a normalising constant), and p(x) is the corresponding Gibbs distribution
[Univariate structure over X1, X2, X3, X4, X5, X6]
Estimating MRF parameters using the MFM
Each chromosome gives us one linear equation
Applying it to a set of selected solutions gives us a system of linear equations
Solving this system gives an approximation to the MRF parameters
Knowing the MRF parameters completely specifies the JPD
The next step is to sample the JPD
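The system of equations can be solved by least squares (via the SVD mentioned later in the cost analysis). A sketch, assuming the univariate MFM -ln f(x) = α0 + Σi αi si with bits {0, 1} mapped to spins {-1, +1} (bit 1 → spin +1); the constant term α0 and the encoding are assumptions.

```python
import math
import numpy as np

def estimate_alphas(population, fitnesses):
    """Fit the univariate MFM  -ln f(x) = a0 + sum_i a_i * s_i.
    Each solution contributes one linear equation; np.linalg.lstsq
    solves the resulting system via the SVD."""
    A = np.array([[1.0] + [1.0 if b else -1.0 for b in x] for x in population])
    rhs = np.array([-math.log(f) for f in fitnesses])   # fitness must be > 0
    alphas, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return alphas                                       # [a0, a1, ..., an]
```

With this encoding a negative αi indicates that bit i should be set to 1 to minimise the energy, matching the sign rule used by the probability vector approach later.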
General DEUM framework
Distribution Estimation Using MRF algorithm (DEUM)
1. Initialise the parent population P
2. Select a set D from P (can use D = P!)
3. Build an MFM, fit it to D, and estimate the MRF parameters
4. Sample the MRF to generate the new population
5. Replace P with the new population and go to 2 until the termination criterion is satisfied
How to sample the MRF
Probability vector approach
Direct sampling of the Gibbs distribution
Metropolis sampling
Gibbs sampling
Probability vector approach to sampling the MRF
Minimise U(x) to maximise f(x)
To minimise U(x), each αi xi should be at its minimum
This suggests that if αi is negative, the corresponding xi should be positive
We could obtain an optimal chromosome for the current population just by looking at the α values
However, the current population does not always contain enough information to generate the optimum
So we use the sign of each αi to update a vector of probabilities
DEUM with probability vector (DEUMpv)
Updating rule
Uses the sign of each MRF parameter to direct the search towards the value of the respective variable that minimises the energy U(x)
A learning rate controls convergence
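A sketch of such a rule. The exact DEUMpv update is not shown on the slide, so the form below (a fixed-size step in the direction given by the sign of αi, clamped to [0, 1]) is an assumption.

```python
def update_probability_vector(p, alphas, learning_rate=0.1):
    """Nudge each p_i towards the bit value that minimises the energy:
    a negative alpha_i favours x_i = 1, a positive one favours x_i = 0.
    (Illustrative rule; the published DEUMpv update may differ.)"""
    new_p = []
    for p_i, a_i in zip(p, alphas):
        step = learning_rate if a_i < 0 else -learning_rate if a_i > 0 else 0.0
        new_p.append(min(1.0, max(0.0, p_i + step)))    # clamp to [0, 1]
    return new_p
```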
Simulation of DEUMpv
[Worked example: MRF parameters (α values such as 0.05, -0.05 and -0.625) estimated from the solutions 01111, 10101, 00101 and 01000 with fitnesses 4, 3, 2 and 1 drive updates to the probability vector (entries such as 0.4, 0.6 and 0.5).]
Results: OneMax problem
Results: F6 function optimisation
Results: Trap-5 function (a deceptive problem); no solution found
Sampling the MRF
Probability vector approach
Direct sampling of the Gibbs distribution
Metropolis sampling
Gibbs sampling
Direct sampling of the Gibbs distribution
In the probability vector approach, only the signs of the MRF parameters are used
However, one could sample directly from the Gibbs distribution and make use of the values of the MRF parameters
A temperature coefficient can also be used to manipulate the probabilities
Direct sampling of the Gibbs distribution
Direct sampling of the Gibbs distribution
The temperature coefficient plays an important role
Decreasing T cools each probability towards 1 or 0, depending on the sign and value of the corresponding α
This forms the basis of the DEUM algorithm based on direct sampling of the Gibbs distribution (DEUMd)
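For the univariate MFM with spins si in {-1, +1}, the Gibbs marginal for a single variable reduces to a logistic function of αi and T. A sketch under that encoding assumption (mapping spin +1 to bit 1 is also an assumption):

```python
import math
import random

def p_one(alpha, T):
    """Marginal p(x_i = 1) under the univariate Gibbs distribution:
    p(+1) = e^(-a/T) / (e^(-a/T) + e^(a/T)) = 1 / (1 + e^(2a/T))."""
    return 1.0 / (1.0 + math.exp(2.0 * alpha / T))

def sample_solution(alphas, T):
    """Sample each bit independently from its marginal."""
    return [1 if random.random() < p_one(a, T) else 0 for a in alphas]
```

As T → 0, p_one saturates to 1 for negative α and to 0 for positive α, which is the cooling behaviour described above.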
DEUM with direct sampling (DEUMd)
1. Generate an initial population P of size M
2. Select the N fittest solutions, N ≤ M
3. Calculate the MRF parameters
4. Generate M new solutions by sampling the univariate distribution
5. Replace P with the new population and go to 2 until complete
DEUMd simulation
[Worked example: from the solutions 01111, 10101, 00101 and 01000 with fitnesses 4, 3, 2 and 1, MRF parameters (such as 0.05, -0.05 and -0.625) are estimated and new solutions 01111, 10111, 01101 and 01010 with fitnesses 4, 4, 3 and 2 are sampled.]
Experimental results: OneMax problem
Experimental results: F6 function optimisation
Experimental results: Plateau problem (n = 180)
Experimental results: Checkerboard problem (n = 100)
Experimental results: Trap function of order 5 (n = 60)
Experimental results (mean ± standard deviation)

Problem         Measure       GA                      UMDA                    PBIL                    DEUMd
Checker Board   Fitness       254.68 ± 4.39           233.79 ± 9.2            243.5 ± 8.7             254.1 ± 5.17
                Evaluations   427702.2 ± 1098959.3    50228.2 ± 9127          191476.8 ± 37866.65     33994 ± 13966.75
Equal-Products  Fitness       211.59 ± 1058.47        5.03 ± 18.29            9.35 ± 43.36            2.14 ± 6.56
                Evaluations   1000000 ± 0             1000000 ± 0             1000000 ± 0             1000000 ± 0
Colville        Fitness       0.61 ± 1.02             40.62 ± 102.26          2.69 ± 2.54             0.61 ± 0.77
                Evaluations   1000000 ± 0             62914.56 ± 6394.58      1000000 ± 0             1000000 ± 0
Six Peaks       Fitness       99.1 ± 9                98.58 ± 3.37            99.81 ± 1.06            100 ± 0
                Evaluations   49506 ± 4940            121333.76 ± 14313.44    58210 ± 3659.15         26539 ± 1096.45
Analysis of results
For univariate problems (OneMax), given a population size of 1.5n, D = P and T → 0, the solution was found in a single generation
For problems with low-order dependency between variables (Plateau and Checkerboard), performance was significantly better than that of other univariate EDAs
For deceptive problems with higher-order dependency (Trap function and Six Peaks), DEUMd was deceived, but by slowing the cooling rate it was able to find the solution for the Trap function of order 5
For problems where the optimum was not known, performance was comparable to that of the GA and other EDAs, and better in some cases
Cost-benefit analysis (the cost)
Polynomial cost of estimating the distribution, compared to the linear cost of other univariate EDAs
–Cost of computing the univariate marginal frequencies (linear)
–Cost of computing the SVD (polynomial)
Cost-benefit analysis (the benefit)
DEUMd can significantly reduce the number of fitness evaluations
Solution quality was better for DEUMd than for the other EDAs compared
DEUMd should be tried on problems where the increased solution quality outweighs the computational cost
Sampling the MRF
Probability vector approach
Direct sampling of the Gibbs distribution
Metropolis sampling
Gibbs sampling
Example problem: 2D Ising spin glass
Given the coupling constants J, find the value of each spin that minimises the energy H
The MRF fitness model follows directly from H
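The slide's H was shown as an image; assuming the standard Ising form H = -Σ Jij si sj over the coupled pairs (the sign convention is an assumption), the energy can be computed as:

```python
def ising_energy(spins, couplings):
    """H = -sum_{(i,j)} J_ij * s_i * s_j over the coupled pairs.
    `couplings` maps an edge (i, j) to J_ij; spins take values +1/-1."""
    return -sum(J * spins[i] * spins[j] for (i, j), J in couplings.items())
```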
Metropolis sampler
Difference in energy
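A Metropolis sampler only needs the energy difference of a proposed single-variable flip. A sketch for the univariate MFM U(x) = Σ αi si, where the difference is local and cheap to compute (the Is-DEUM version works on the Ising MFM instead, with the local ΔH); the parameter values are illustrative.

```python
import math
import random

def metropolis(alphas, T=1.0, steps=1000, seed=0):
    """Metropolis sampling of U(x) = sum_i a_i * s_i, s_i in {-1, +1}.
    Each step proposes flipping one spin; the flip is accepted if it
    lowers the energy, otherwise with probability e^(-dU/T)."""
    random.seed(seed)
    n = len(alphas)
    s = [random.choice((-1, 1)) for _ in range(n)]
    for _ in range(steps):
        i = random.randrange(n)
        dU = -2.0 * alphas[i] * s[i]        # energy change of flipping s_i
        if dU <= 0 or random.random() < math.exp(-dU / T):
            s[i] = -s[i]
    return s
```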
DEUM with Metropolis sampler
Results
Sampling the MRF
Probability vector approach
Direct sampling of the Gibbs distribution
Metropolis sampling
Gibbs sampling
Conditionals from the Gibbs distribution
For the 2D Ising spin glass problem, the conditional distribution of each spin given its neighbours follows from the Gibbs distribution
Gibbs sampler
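A Gibbs sampler resamples each spin in turn from its conditional given its neighbours. A sketch for the Ising MFM, assuming H = -Σ Jij si sj, which gives the conditional p(si = +1 | rest) = 1 / (1 + e^(-2 hi / T)) with local field hi = Σj Jij sj; the grid, parameters and sweep schedule are illustrative.

```python
import math
import random

def gibbs_sample_ising(couplings, n, T=1.0, sweeps=100, seed=0):
    """Gibbs sampler for H = -sum J_ij s_i s_j over n spins.
    `couplings` maps an edge (i, j) to J_ij."""
    random.seed(seed)
    # adjacency: neighbours[i] = list of (j, J_ij)
    neighbours = {i: [] for i in range(n)}
    for (i, j), J in couplings.items():
        neighbours[i].append((j, J))
        neighbours[j].append((i, J))
    s = [random.choice((-1, 1)) for _ in range(n)]
    for _ in range(sweeps):
        for i in range(n):
            h = sum(J * s[j] for j, J in neighbours[i])   # local field
            p_up = 1.0 / (1.0 + math.exp(-2.0 * h / T))
            s[i] = 1 if random.random() < p_up else -1
    return s
```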
DEUM with Gibbs sampler
Results
Summary
From GA to EDA
The PGM approach to modelling and sampling the distribution in EDA
DEUM: an MRF approach to modelling and sampling
Learning the structure: no structure learning so far (fixed models are used)
Learning the parameters: fitness modelling approach
Sampling the MRF:
–Probability vector approach
–Direct sampling of the Gibbs distribution
–Metropolis sampler
–Gibbs sampler
Results are encouraging, and there is a lot more to explore