Evolving RBF Networks via GP for Estimating Fitness Values using Surrogate Models
Ahmed Kattan, Edgar Galvan

The Motivation
Evolutionary Computation is ill-suited to expensive optimisation problems, because it typically requires a large number of costly fitness evaluations. Surrogate Models (SMs) mimic the behaviour of the simulation model as closely as possible while being computationally cheaper to evaluate. Because of this, SMs can be seen as heuristics that estimate the fitness of a candidate solution without having to evaluate it.

Contributions of this work
1. We propose a new SM based on Genetic Programming (GP) and Radial Basis Function Networks (RBFN), called GP-RBFN.
2. We use GP to evolve both the structure of an RBF and its parameters.

Our Proposed Solution
Standard RBFN: the predicted fitness of a query point x is f(x) = μ + Σ_i w_i φ(d(x, c_i)), where μ is the mean of the training fitness values, x is the point to be predicted, each centre c_i is a point used to train the model, the w_i are the weights, and φ is a radial basis function whose spread is controlled by the width parameter β. Since candidate solutions are bit strings, we replaced the Euclidean distance with the Hamming distance.
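As a minimal sketch (not the authors' code) of how such a prediction could be computed, the snippet below assumes Gaussian basis functions of the form exp(−d²/β), a bias equal to the mean training fitness, and Hamming distance over bit strings; all names and the exact basis-function form are illustrative assumptions.

```python
import numpy as np

def hamming(a, b):
    """Hamming distance between two equal-length bit strings (0/1 numpy arrays)."""
    return int(np.sum(np.asarray(a) != np.asarray(b)))

def rbfn_predict(x, centres, weights, beta, mean_fitness):
    """Standard RBFN surrogate prediction: the mean training fitness plus a
    weighted sum of Gaussian basis functions, one per centre. The Euclidean
    distance is replaced by the Hamming distance for bit-string solutions."""
    dists = np.array([hamming(x, c) for c in centres], dtype=float)
    phi = np.exp(-dists ** 2 / beta)      # one common Gaussian form; beta is the width
    return mean_fitness + float(np.dot(weights, phi))
```

With the centres taken from the already-evaluated training points and the weights fitted as on the next slide, such a model gives a cheap fitness estimate for any unevaluated bit string.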

Setting the Weights
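The slide's equations are not reproduced in the transcript. As a hedged illustration, one standard way to set the weights is to solve the interpolation system Φw = f over the training points in the least-squares sense, as sketched below; this is a common RBFN training step, not necessarily the exact formulation used in the paper.

```python
import numpy as np

def fit_weights(centres, train_fitness, beta, mean_fitness):
    """Least-squares fit of the RBFN weights so that the surrogate reproduces
    the known fitness values at the training points (which act as the centres)."""
    centres = np.asarray(centres)
    # Pairwise Hamming distances between training points.
    dist = np.array([[np.sum(ci != cj) for cj in centres] for ci in centres], dtype=float)
    phi = np.exp(-dist ** 2 / beta)                      # Gaussian design matrix (assumed form)
    targets = np.asarray(train_fitness, dtype=float) - mean_fitness  # the bias absorbs the mean
    weights, *_ = np.linalg.lstsq(phi, targets, rcond=None)
    return weights
```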

GP-RBFN
There are two main problems associated with training an RBFN: determining the centres, and determining the optimal parameters. In this work, we let GP evolve the RBFN parameters. In particular, we search for the best value of β, i.e., the RBF radius (width) parameter.

GP-RBFN
1. We train the RBFN using standard procedures (without fixing the β value).
2. We run GP to optimise the trained RBFN; the trained RBF node is given to GP as a primitive.
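A rough sketch of step 2 follows (hypothetical structure, not the authors' exact representation): the trained RBF node, with its weights already fixed, is handed to GP as a primitive, and each GP individual is an expression that combines this node with evolved constants, including the width β.

```python
import numpy as np

def make_rbf_node(centres, weights):
    """Wrap the trained RBFN (centres and weights already fixed) as a GP primitive:
    a function of the query point and of the evolved width beta."""
    centres = np.asarray(centres)
    weights = np.asarray(weights, dtype=float)
    def rbf_node(x, beta):
        dists = np.array([np.sum(np.asarray(x) != c) for c in centres], dtype=float)
        return float(np.dot(weights, np.exp(-dists ** 2 / beta)))
    return rbf_node

# One possible evolved individual: GP has settled on a beta value and a small
# arithmetic wrapper around the RBF node (structure and constants are placeholders).
def evolved_surrogate(x, rbf_node, mean_fitness):
    beta = 6.0                                   # evolved constant
    return mean_fitness + rbf_node(x, beta)
```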

GP-RBFN
GP minimises the training error.
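A minimal sketch of this evaluation step, under the assumption that the error is measured on the already-evaluated training points; the later slide on the GP fitness function discusses why a plain average absolute error on its own can be misleading.

```python
import numpy as np

def training_error(surrogate, train_points, train_fitness):
    """Fitness of a candidate surrogate program: its average absolute prediction
    error over the already-evaluated training points (to be minimised by GP)."""
    preds = np.array([surrogate(x) for x in train_points], dtype=float)
    return float(np.mean(np.abs(preds - np.asarray(train_fitness, dtype=float))))
```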

GP-RBFN
The best evolved individual is then applied to each query point, using the centres of the trained model, to predict its fitness.

GP Fitness Function
It is worth mentioning that programs which produce low prediction errors are not necessarily able to approximate the real objective function and may not be able to guide the search. Preliminary experiments showed that when the fitness function is guided by the average absolute error of the predictions, GP tends to evolve programs that output values close to the training-set average. This biases the system towards programs that simply return the average value of the training set.
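A tiny numerical illustration of this bias, with hypothetical numbers: a constant predictor that always outputs the training-set average can score a low average absolute error yet be useless for ranking candidates, whereas a predictor with larger errors but the correct ordering can still guide the search.

```python
import numpy as np

true_fitness = np.array([1.0, 2.0, 3.0, 4.0, 5.0])               # hypothetical fitnesses

constant_pred = np.full_like(true_fitness, true_fitness.mean())  # always predicts the average
ordered_pred  = np.array([0.0, 1.5, 4.0, 6.0, 9.0])              # larger errors, correct ordering

for name, pred in [("constant (training mean)", constant_pred),
                   ("order-preserving", ordered_pred)]:
    mae = float(np.mean(np.abs(pred - true_fitness)))
    best = int(np.argmax(pred))            # which candidate would this surrogate promote?
    print(f"{name}: mean abs error = {mae:.2f}, picks candidate index {best} as best")
```

The constant predictor scores the lower error yet promotes the worst candidate; the order-preserving one has a higher error but picks the true best, which is exactly the distinction the fitness function has to capture.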

GP-RBFN in Action

Experiments
As a reference of performance, we compared our results against:
- Standard RBFN surrogate: to validate our approach against the standard surrogate model.
- Standard GA search: to verify whether our approach can find better solutions, using a small number of evaluations, than a standard (expensive) GA.
- (1+1) Evolution Strategy: to compare our approach with a standard search algorithm.
- Random search: to validate our main hypothesis that a surrogate is better than random sampling, as it makes rational use of the available information and points out the next promising points in the search space.
For a fair comparison, all algorithms search the given problem within exactly the same number of expensive evaluations.
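To make the shared-budget protocol concrete, here is a hedged sketch of a generic surrogate-assisted loop (function and parameter names are illustrative, not taken from the paper): every expensive evaluation is logged, the surrogate is retrained on the archive of truly evaluated points, and each new expensive evaluation is spent on the candidate the surrogate ranks highest.

```python
def surrogate_assisted_search(expensive_eval, random_solution, mutate,
                              train_surrogate, budget=100, screen_size=50, seed_size=10):
    """Generic surrogate-assisted loop under a fixed budget of expensive evaluations:
    the surrogate pre-screens candidates so the budget is spent on promising ones."""
    archive = []                                       # (solution, true fitness) pairs
    for _ in range(min(seed_size, budget)):            # seed with truly evaluated points
        s = random_solution()
        archive.append((s, expensive_eval(s)))
    while len(archive) < budget:
        surrogate = train_surrogate(archive)           # e.g. the GP-RBFN model
        parent = max(archive, key=lambda p: p[1])[0]   # best solution found so far
        candidates = [mutate(parent) for _ in range(screen_size)]
        promising = max(candidates, key=surrogate)     # cheap pre-screening
        archive.append((promising, expensive_eval(promising)))
    return max(archive, key=lambda p: p[1])
```

The baselines in the comparison (GA, (1+1) ES, random search) simply spend the same number of expensive evaluations without the pre-screening step.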

Experiments (results figures)

Conclusions
In this paper, we proposed the GP-RBFN surrogate. The proposed model uses GP to construct better RBF expressions for the RBFN surrogate model. GP receives the standard RBF, together with its weights, as a primitive and uses it to construct surrogate programs. We let the GP system find the best radius parameter value for the RBF. Our proposed approach was tested on one of the most studied NP-complete problems (MAX-SAT) to validate its effectiveness.

Future Work
An attractive idea is to let GP decide the weights as well as the radius parameter. Another direction is to explore the possibility of letting GP find the best number of RBF centres.

Thank you for your attention!