
Evolutionary Computation Evolving Neural Network Topologies

Project Problem: There is a class of problems that are not linearly separable; the XOR function is a member of this class. The backpropagation algorithm can express a variety of these non-linear decision surfaces.
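For reference, the four XOR training patterns cover the entire input space. A minimal MATLAB definition of this data set (an illustrative sketch, not taken from the project code) is:

X = [0 0; 0 1; 1 0; 1 1];   % all four input patterns, one per row
T = [0; 1; 1; 0];           % XOR targets
% No single line w1*x1 + w2*x2 = theta puts the two 1-targets on one side
% and the two 0-targets on the other, which is why one perceptron fails here.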

Project Problem: Non-linear decision surfaces cannot be learned with a single perceptron. Backprop uses a multi-layer approach in which there are a number of input units, a number of "hidden" units, and the corresponding output units.

Parametric Optimization: Parametric optimization was the goal of this project. The parameter to be optimized was the number of hidden units in the hidden layer of a backprop network used to learn the output for the XOR benchmark function.

Tool Boxes: A neural network toolbox developed by Herve Abdi (available from MATLAB Central) was used for the backprop application. The Genetic Algorithm for Function Optimization (GAOT) toolbox was used for the GA application.

Graphical Illustrations: The next plot shows the randomness associated with successive runs of the backprop algorithm, invoked as xx = mytestbpg(1,5000,0.25), where 1 is the number of hidden units, 5000 is the number of training iterations, and 0.25 is the learning rate.
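mytestbpg is the author's own wrapper around the Abdi toolbox and is not reproduced in the transcript. A minimal plain-MATLAB stand-in with the same signature (hidden units, training iterations, learning rate), offered only as a sketch of what such a routine typically does, might look like this:

function err = mytestbpg_sketch(nHidden, nIter, eta)
% Train a 2-nHidden-1 logistic network on XOR with batch backprop and
% return the per-epoch sum-of-squares error (illustrative sketch only).
X = [0 0; 0 1; 1 0; 1 1];          % input patterns, one per row
T = [0; 1; 1; 0];                  % XOR targets
sig = @(a) 1 ./ (1 + exp(-a));     % logistic activation

W1 = 0.5 * randn(2, nHidden);  b1 = zeros(1, nHidden);   % input -> hidden
W2 = 0.5 * randn(nHidden, 1);  b2 = 0;                    % hidden -> output
err = zeros(nIter, 1);

for n = 1:nIter
    H = sig(X * W1 + repmat(b1, 4, 1));   % hidden activations, 4 x nHidden
    O = sig(H * W2 + b2);                 % network outputs, 4 x 1
    err(n) = sum((T - O).^2);             % error at this epoch

    dO = (O - T) .* O .* (1 - O);         % output-layer deltas
    dH = (dO * W2') .* H .* (1 - H);      % hidden-layer deltas

    W2 = W2 - eta * (H' * dO);   b2 = b2 - eta * sum(dO);
    W1 = W1 - eta * (X' * dH);   b1 = b1 - eta * sum(dH, 1);
end
end

Calling err = mytestbpg_sketch(1, 5000, 0.25) and plotting err against the epoch number gives the kind of error curve shown on the next slides; the random weight initialization is what makes successive runs differ.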

Graphical Illustrations (plot: training error versus epoch for repeated backprop runs with one hidden unit)

The error in the plot above (and in those that follow) is calculated as follows:

Graphical Illustrations: E(n) is the error at training epoch n. T_i(n) = [0 1 1 0] is the output of the target function for the input sets (0,0), (0,1), (1,0), and (1,1) at epoch n. O_i(n) = [o_1 o_2 o_3 o_4] is the output of the backprop network at epoch n. Notice that the training examples cover the entire state space of this function.
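The formula itself appeared as an image on the slide and is not reproduced in the transcript. From the definitions above, a standard sum-of-squared-errors form (an assumption; the original may include a factor of 1/2 or an average) is:

E(n) = \sum_{i=1}^{4} \left( T_i(n) - O_i(n) \right)^2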

Graphical Illustrations: The next few plots show the effect of the number of hidden units on the rate of error convergence.
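A short MATLAB fragment that would generate such a comparison, again using the hypothetical mytestbpg_sketch above, is:

hold on
for h = [1 2 3 5 50]                        % hidden-layer sizes compared below
    plot(mytestbpg_sketch(h, 5000, 0.25));  % one error curve per size
end
xlabel('training epoch'); ylabel('error');
legend('1', '2', '3', '5', '50');
hold off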

Graphical Illustrations-1 hidden

Graphical Illustrations-1,2 hidden

Graphical Illustrations-1,2,3,5 hidden

Graphical Illustrations-1,2,3,5,50 hidden

Parametric Optimization: The GA supplied by the GAOT toolbox was used to optimize the number of hidden units needed for the XOR backprop benchmark. A real-valued representation (floating point rather than binary) was used in conjunction with the selection, mutation, and crossover operators.
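The actual GA comes from the GAOT toolbox and is not shown in the transcript. The following minimal real-valued GA, with hypothetical operator choices (tournament selection, arithmetic crossover, Gaussian mutation, elitism), is only a sketch of the kind of loop described above; fitnessFn is a placeholder evaluated once per candidate, and one possible form of it is sketched after the Fitness Function slide below.

popSize = 20;  maxGen = 50;                % values used later in the slides
lo = 1;  hi = 50;                          % assumed bounds on hidden-unit count
pop = lo + (hi - lo) * rand(popSize, 1);   % real-valued genes, one per individual
fit = zeros(popSize, 1);

for g = 1:maxGen
    for k = 1:popSize
        fit(k) = fitnessFn(round(pop(k)));        % evaluate each individual
    end
    newPop = zeros(popSize, 1);
    for k = 1:popSize
        c = randi(popSize, 2, 2);                 % two 2-way tournaments
        [~, i1] = max(fit(c(:, 1)));  p1 = pop(c(i1, 1));
        [~, i2] = max(fit(c(:, 2)));  p2 = pop(c(i2, 2));
        a = rand;                                 % arithmetic crossover
        child = a * p1 + (1 - a) * p2;
        if rand < 0.1                             % Gaussian mutation
            child = child + randn;
        end
        newPop(k) = min(max(child, lo), hi);      % clamp to the allowed range
    end
    [~, best] = max(fit);
    newPop(1) = pop(best);                        % elitism: carry the best forward
    pop = newPop;
end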

Parametric Optimization: The fitness (evaluation) function is the driving factor for the GA in the GAOT toolbox. The fitness function is specific to the problem at hand.

Fitness Function (formula shown as an image on the slide; not reproduced in the transcript)
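Since the slide's formula is not in the transcript, the following is purely an assumed, illustrative fitness: it trains a network with the candidate number of hidden units (using the hypothetical mytestbpg_sketch and the fixed backprop settings from the next slide) and rewards a low final error.

function f = fitnessFn(nHidden)
% Hypothetical fitness: higher is better when the final XOR error is lower.
% The project's real fitness function may differ (e.g. it may also penalize
% larger networks).
err = mytestbpg_sketch(nHidden, 5000, 0.25);   % fixed training settings
f = 1 / (1 + err(end));
end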

Parametric Optimization: The authors of the NEAT paper give the following results for their neuroevolutionary implementation: optimum number of hidden nodes (average value): 2.35; average number of generations: 32.

Parametric Optimization: Results of my experimentation: optimum number of hidden units (average value): 2.9; convergence to this value after approximately 17 generations.

Parametric Optimization: GA parameters: population size of 20; maximum number of generations: 50. Backprop fixed parameters: 5000 training iterations; learning rate of 0.25. Note: approximately 20 minutes per run of the GA with these parameter values, running MATLAB on a 2.2 GHz machine with 768 MB of RAM.

Initial Population Values

Final Population Values

Paths of Best and Average Solution for a Run of the GA

Target and Backprop Output

Conclusion: The experimental results are interesting and agree with other researchers' results. The GA is a good tool to use for parameter optimization; however, the results depend on a good fitness function for the task at hand.