Presentation transcript:

Approaches to Selection and their Effect on Fitness Modelling in an Estimation of Distribution Algorithm
A. E. I. Brownlee, J. A. McCall, Q. Zhang and D. F. Brown
Presented by Kim, Kwonill

At a Glance
Does selection affect the fitness-modelling ability of an EDA? Yes!
- Selection strategies compared: Top, Bottom, Top & Bottom
- Fitness function modelling by DEUM (Distribution Estimation Using Markov network)
- Measure: correlation between predicted and true fitnesses, under various conditions

Contents
- Selection in EDA
- DEUM
- Correlation coefficient
- Experiment method
- Result & Conclusion
- Summary & QnA

Selection in EDA
- Selection (e.g. truncation selection) discards individuals, so information about the fitness function is lost from the population.
- Related works exist; the topic of this paper: analyse the effects of selection on the information about the fitness function contained in a population.

Distribution Estimation Using Markov network (DEUM) (1/3)
- Uses a Markov network to model fitness as an energy distribution over the solution space.
- Markov network:
  - Random variable → node
  - Interaction → undirected edge
  - Only neighbourhood interactions
  - A set of cliques

Distribution Estimation Using Markov network (DEUM) (2/3)
- Joint probability distribution given by the Gibbs distribution:
  - U(x): sum of clique potential functions
  - x: an individual
  - c, α: parameters which define the Markov network
- Markov Fitness Model (MFM): maximizing fitness = minimizing U(x)
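A sketch of the distribution, reconstructed from the definitions above in the form used in the DEUM literature (the temperature T and the exact univariate form are assumptions, since the slide's equation image is not in the transcript):

p(x) = \frac{e^{-U(x)/T}}{\sum_{y} e^{-U(y)/T}}, \qquad U(x) = \sum_{c \in C} V_c(x)

In the univariate case the clique potentials reduce to a linear model of the bits, roughly -\ln f(x) \approx U(x) = \alpha_0 + \sum_i \alpha_i x_i, so maximizing f(x) corresponds to minimizing U(x).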

Distribution Estimation Using Markov network (DEUM) (3/3)
- Example: MAXSAT problem. An individual x = {0011} has fitness f(x) = 2 (two clauses satisfied).
- Parameter determination: by singular value decomposition (SVD).
- Fitness prediction: the fitted MFM is used to predict the fitness of new individuals.
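A minimal sketch of SVD-based parameter fitting and fitness prediction for a univariate MFM, assuming a {0,1}→{-1,+1} bit encoding; the function names and encoding are illustrative, not taken from the paper (numpy's lstsq solves the least-squares system via SVD):

import numpy as np

def fit_mfm_univariate(population, fitnesses):
    # Build one linear equation per individual:
    #   alpha_0 + sum_i alpha_i * s_i = -ln f(x),  with s_i = 2*x_i - 1,
    # and solve the system in the least-squares sense (np.linalg.lstsq uses SVD).
    X = np.asarray(population, dtype=float)
    S = 2.0 * X - 1.0                          # bits {0,1} -> spins {-1,+1} (assumed encoding)
    A = np.hstack([np.ones((len(X), 1)), S])   # constant term plus one term per bit
    b = -np.log(np.asarray(fitnesses, dtype=float) + 1e-9)  # small offset avoids log(0)
    alpha, *_ = np.linalg.lstsq(A, b, rcond=None)
    return alpha

def predict_fitness(alpha, x):
    # Predicted fitness of one individual under the fitted model.
    s = 2.0 * np.asarray(x, dtype=float) - 1.0
    u = alpha[0] + float(np.dot(alpha[1:], s))
    return np.exp(-u)

For the MAXSAT example, population would hold the selected 0/1 strings and fitnesses their satisfied-clause counts.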

Measure: Correlation Coefficient
- Product-moment correlation coefficient between:
  - x: the set of true fitnesses calculated by the fitness function
  - y: the set of predicted fitnesses calculated by the model
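The product-moment (Pearson) coefficient over the pairs of true fitnesses x_i and predicted fitnesses y_i is the standard one:

r = \frac{\sum_i (x_i - \bar{x})(y_i - \bar{y})}{\sqrt{\sum_i (x_i - \bar{x})^2}\,\sqrt{\sum_i (y_i - \bar{y})^2}}

A value near 1 means the model's predictions track the true fitness closely; a value near 0 means the model carries little fitness information.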

EXPERIMENTS & RESULTS

Correlation: C_m & C_r
1. Generate a random initial population.
2. Select a subset σ of the population.
3. Use σ to build the MFM.
4. For each member of σ, repeat:
   4.1 Mutate one bit in the individual.
   4.2 Use the MFM to predict the fitness of the individual.
   4.3 Use the fitness function to determine the true fitness.
5. Generate a complete new random population, equal in size to the first.
6. For each member of this population:
   6.1 Use the MFM to predict fitness.
   6.2 Use the fitness function to determine the true fitness.
C_m (steps 4.1-4.3): ability of the MFM to predict the fitness of solutions "near to" the current population.
C_r (steps 5-6): ability of the model to predict the fitness of randomly generated individuals, i.e. how closely the MFM models the fitness function.
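A sketch of the two measurements, reusing the hypothetical fit/predict helpers above and assuming the reading that the m subscript refers to one-bit mutants and the r subscript to a fresh random population (the slide's layout leaves this pairing slightly ambiguous):

import numpy as np

def correlation_cm(alpha, fitness_fn, selected, rng):
    # C_m: correlation between predicted and true fitness for one-bit
    # mutants of the current individuals ("near to" the population).
    mutants = []
    for x in selected:
        y = np.array(x, copy=True)
        y[rng.integers(len(y))] ^= 1          # flip one randomly chosen bit
        mutants.append(y)
    pred = np.array([predict_fitness(alpha, y) for y in mutants])
    true = np.array([fitness_fn(y) for y in mutants])
    return np.corrcoef(pred, true)[0, 1]

def correlation_cr(alpha, fitness_fn, n_bits, pop_size, rng):
    # C_r: correlation on a completely new random population of the same size
    # (how closely the MFM models the fitness function as a whole).
    pop = rng.integers(0, 2, size=(pop_size, n_bits))
    pred = np.array([predict_fitness(alpha, x) for x in pop])
    true = np.array([fitness_fn(x) for x in pop])
    return np.corrcoef(pred, true)[0, 1]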

Selection Strategies
- Top selection: from a population of size n, select the best p·n individuals.
- Bottom selection: from a population of size n, select the worst p·n individuals.
- Top & Bottom selection: select p·n/2 individuals from each of the top and the bottom of the population.
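A sketch of the three strategies for a maximisation problem; argument names are illustrative, and Top & Bottom is read here as taking p·n/2 from each end so that all three strategies return selected sets of comparable size:

import numpy as np

def select(population, fitnesses, p, strategy="top"):
    # Sort indices by fitness, worst first, then take the requested slice(s).
    order = np.argsort(fitnesses)
    k = max(1, int(p * len(population)))
    if strategy == "top":                    # best p*n individuals
        idx = order[-k:]
    elif strategy == "bottom":               # worst p*n individuals
        idx = order[:k]
    elif strategy == "top_bottom":           # best half and worst half of the budget
        idx = np.concatenate([order[:k // 2], order[-(k - k // 2):]])
    else:
        raise ValueError(f"unknown strategy: {strategy}")
    return [population[i] for i in idx]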

Underspecified & Overspecified
- N > n: underspecified system
- N ≤ n: overspecified system
  (n: population size; N: number of terms in the MFM)
- Example: Onemax problem
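As a worked illustration with assumed numbers (not taken from the slide): a univariate MFM for an ℓ-bit Onemax problem has roughly N = ℓ + 1 terms (one coefficient per bit plus a constant), so for ℓ = 100 a population of n = 50 gives N > n (underspecified), while n = 200 gives N ≤ n (overspecified).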

Perfect & Imperfect
- Perfect model structure: the MFM includes all interactions defined by the problem.
- Imperfect model structure: the MFM misses some interactions (the realistic situation).
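Schematically, for a pairwise problem such as the Ising spin glass below (notation assumed, not from the slide):

U_{\text{perfect}}(x) = \sum_i \alpha_i s_i + \sum_{(i,j) \in E} \beta_{ij} s_i s_j,
\qquad
U_{\text{imperfect}}(x) = \sum_i \alpha_i s_i

where E is the set of interacting pairs defined by the problem; the imperfect structure drops the pairwise terms.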

Problems
- Onemax
- Ising Spin Glass
- MAXSAT
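For concreteness, sketches of the three fitness functions (the Ising and MAXSAT encodings here are generic illustrations, not the exact instances used in the paper):

import numpy as np

def onemax(x):
    # Number of 1-bits; the optimum is the all-ones string.
    return int(np.sum(x))

def ising(x, J):
    # Ising spin glass: sum of coupling terms J[(i, j)] * s_i * s_j over
    # interacting pairs, with spins s = 2*x - 1 (sign convention assumed).
    s = 2 * np.asarray(x) - 1
    return sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

def maxsat(x, clauses):
    # Number of satisfied clauses; a clause is a list of DIMACS-style
    # literals: +i means bit i-1 must be 1, -i means bit i-1 must be 0.
    return sum(
        any(x[abs(l) - 1] == (1 if l > 0 else 0) for l in clause)
        for clause in clauses
    )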

Perfect + Overspecified (results figure not included in the transcript)

Perfect + Underspecified (results figure not included in the transcript)

Imperfect + Overspecified (results figure not included in the transcript)

Conclusion
Selection is particularly important when the fitness model is unlikely to perfectly match the fitness function.

Summary & QnA
- Selection, information loss & the ability of an EDA
- Fitness modelling by DEUM (Markov network)
- Measurement: correlation coefficients C_m & C_r
- Experiments:
  - Overspecified & underspecified systems
  - Perfect & imperfect model structures
  - Top, Bottom, Top & Bottom selection strategies

THANK YOU!!
© 2008, SNU CSE BioIntelligence Lab