CAP6938 Neuroevolution and Artificial Embryogeny Evolving Adaptive Neural Networks Dr. Kenneth Stanley March 1, 2006.

Remember This Thing? What’s missing from current neural models?

An ANN Link is a Synapse (from Dr. George Johnson)

What Happens at Synapses? Weighted signal transmission, but also:
–Strengthening
–Weakening
–Sensitization
–Habituation
–Hebbian learning
None of these lifetime weight changes happen in static models!

Why Should Weights Change? The world changes:
–Evolution cannot predict all future possibilities
–Evolution can succeed with less accurate innate behavior if learning compensates
The Baldwin Effect:
–Learning smooths the fitness landscape
–Traits that initially require learning eventually become instinct if the environment is consistent
If the mind is static, you can’t learn!

How Should Weights Change? Remember Hebbian learning? (lecture 3)
–Weight update based on the correlation between pre- and postsynaptic activity
–Incremental version: apply a small correlated update at each time step
How can this be made to evolve?
–Which weights should be adaptive? Which rule should each follow if there is more than one?
–Which weights should be fixed?
–To what degree should each adapt? Evolve an alpha (learning rate) parameter on each link
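The per-link scheme above can be sketched in a few lines. This is a minimal illustration (the function name, shapes, and values are my own, not from the lecture): each connection carries its own evolvable alpha, and alpha = 0 recovers an ordinary fixed weight.

```python
import numpy as np

def hebbian_step(w, pre, post, alpha):
    """One incremental Hebbian update.

    w     : current weight matrix, shape (n_post, n_pre)
    pre   : presynaptic activations, shape (n_pre,)
    post  : postsynaptic activations, shape (n_post,)
    alpha : per-connection learning rates, same shape as w;
            each entry is a gene that evolution can tune,
            and alpha == 0 makes that connection fixed.
    """
    # Correlation-based update: delta_w = alpha * (post outer pre)
    return w + alpha * np.outer(post, pre)

# A connection with alpha > 0 adapts; one with alpha = 0 never changes.
w = np.zeros((1, 2))
alpha = np.array([[0.1, 0.0]])          # first link adaptive, second fixed
w = hebbian_step(w, pre=np.array([1.0, 1.0]),
                 post=np.array([1.0]), alpha=alpha)
# w is now [[0.1, 0.0]]
```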

Floreano’s Weight Update Equations
–Plain Hebb rule: strengthens the synapse when the pre- and postsynaptic nodes fire together
–Postsynaptic rule: weakens the synapse if the postsynaptic node fires alone
–Presynaptic rule: weakens the synapse if the presynaptic node fires alone
–Covariance rule: strengthens when activities are correlated, weakens when they are not
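The equations themselves appeared as figures in the original slides. As a reconstruction from Floreano and Urzelai's adaptive-synapse formulation (weights kept in [0, 1], presynaptic activity x, postsynaptic activity y, evolved learning rate η), the four rules are approximately:

```latex
\begin{align*}
\text{Plain Hebb:} \quad
  &\Delta w = \eta\,(1 - w)\,x y\\
\text{Postsynaptic:} \quad
  &\Delta w = \eta\,\bigl[w\,(-1 + x)\,y + (1 - w)\,x y\bigr]\\
\text{Presynaptic:} \quad
  &\Delta w = \eta\,\bigl[w\,x\,(-1 + y) + (1 - w)\,x y\bigr]\\
\text{Covariance:} \quad
  &\Delta w = \eta \cdot
   \begin{cases}
     (1 - w)\,\mathcal{F}(x, y) & \text{if } \mathcal{F}(x, y) > 0\\
     w\,\mathcal{F}(x, y) & \text{otherwise}
   \end{cases}\\
  &\text{with } \mathcal{F}(x, y) = \tanh\!\bigl(4\,(1 - |x - y|) - 2\bigr)
\end{align*}
```

The (1 − w) and w factors saturate the weight toward 1 and 0 respectively, which keeps every rule bounded without clipping.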

Floreano’s Genetic Encoding

Experiment: Light-Switching Task: Go to the black area to turn on the light, then go to the area under the light
Requires a policy change in mid-task: reconfigure weights for the new policy
Fully recurrent network
Blynel, J. and Floreano, D. (2002). Levels of Dynamics and Adaptive Behavior in Evolutionary Neural Controllers. In B. Hallam, D. Floreano, J. Hallam, G. Hayes, and J.-A. Meyer, editors, From Animals to Animats 7: Proceedings of the Seventh International Conference on Simulation of Adaptive Behavior, MIT Press.

Results: Adaptive-synapse networks evolved straighter and faster trajectories
Rapid and appropriate weight modifications occur at the moment of change

However, It’s Not That Simple A recurrent network with fixed synapses can change its policy too The activation levels cycling through the network are a kind of memory that can affect its functioning Do we need synaptic adaptation at all? Experiment in paper: Kenneth O. Stanley, Bobby D. Bryant, and Risto Miikkulainen (2003). Evolving Adaptive Neural Networks with and without Adaptive Synapses, Proceedings of the 2003 IEEE Congress on Evolutionary Computation (CEC-2003).
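The point that recurrent activation alone can act as memory is easy to demonstrate. The following toy example (my own construction, not from the paper) shows a single neuron with a fixed self-connection whose activation persists long after a brief input pulse, with no synaptic change at all:

```python
import math

def step(state, inp, w_in=1.0, w_rec=2.0):
    """One update of a single recurrent neuron with fixed weights.
    The self-connection (w_rec) lets past activation persist,
    so the activation level itself serves as a memory trace."""
    return math.tanh(w_in * inp + w_rec * state)

# A brief input pulse flips the unit into a self-sustaining state:
s = 0.0
s = step(s, 1.0)        # single pulse of input
for _ in range(20):
    s = step(s, 0.0)    # no further input, yet the state persists
# s settles near a stable fixed point of s = tanh(2s), roughly 0.96
```

A policy that reads this unit behaves differently before and after the pulse, which is exactly the kind of mid-task switch the light-switching experiment demands.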

Experimental Domain: Dangerous Food Foraging
–Food may or may not be poisonous
–There is no way to tell at birth; the only way to tell is to try one
–The policy should then depend on whether “pain” was experienced

Condensed Floreano Rules: Two adaptation rules, one for excitatory connections and one for inhibitory
The first term is Hebbian; the second term is a decay term
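A Hebbian-plus-decay rule of this shape can be sketched as follows. This is only an illustration of the slide's description, with coefficients and the excitatory/inhibitory mirroring chosen by me, not the paper's exact rule:

```python
def condensed_update(w, pre, post, eta=0.1, decay=0.05, excitatory=True):
    """One update under a Hebbian-plus-decay rule (a sketch of the
    slide's description, not the paper's exact coefficients).

    The first term strengthens the connection when pre- and
    postsynaptic activity are correlated; the second term pulls
    the weight back toward zero. Inhibitory connections use the
    mirrored Hebbian term.
    """
    hebb = eta * pre * post
    if not excitatory:
        hebb = -hebb
    return w + hebb - decay * w

# Correlated activity strengthens an excitatory weight faster than
# the decay erodes it: 0.5 + 0.1*1*1 - 0.05*0.5 = 0.575
w_next = condensed_update(0.5, pre=1.0, post=1.0)
```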

NEAT Trick: Use “Traits” to Prevent Dimensionality Multiplication
–One shared set of rules (traits)
–Each connection gene points to one of the rules
–Rules evolve in parallel with the network
–Weights evolve as usual
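The trait indirection looks roughly like this in code. A hypothetical minimal encoding (field names and values are mine, not NEAT's actual data structures): genes hold an index into a small shared rule table instead of carrying rule parameters themselves.

```python
from dataclasses import dataclass

@dataclass
class RuleTrait:
    """A shared adaptation rule: evolved once, referenced by many genes."""
    eta: float     # Hebbian learning rate
    decay: float   # decay rate

@dataclass
class ConnectionGene:
    src: int
    dst: int
    weight: float
    trait_id: int  # pointer into the shared trait table

# One small table of rules instead of per-connection rule parameters:
traits = {0: RuleTrait(eta=0.1, decay=0.05),
          1: RuleTrait(eta=0.3, decay=0.01)}

genes = [ConnectionGene(0, 2,  0.7, trait_id=0),
         ConnectionGene(1, 2, -0.4, trait_id=0),  # shares rule 0
         ConnectionGene(2, 3,  0.2, trait_id=1)]

# Mutating traits[0].eta changes the rule for every gene that points to
# it, so the search space grows with the number of rules, not the
# number of connections.
```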

Robot NNs

Surprising Result Fixed-weight recurrent networks could evolve a solution more efficiently! Adaptive networks found solutions, but more slowly and less reliably

Explanation: The fixed networks evolved a “trick”: a strong inhibitory recurrent connection on the left-turn output causes it to stay on until the robot experiences pain. Then it turns off, and the robot spins (driven by the right-turn output) until it no longer sees food, then runs to the wall.
In the adaptive network, 22% of connections diverge after pain, causing the robot to spin in place: a holistic change

Discussion: Adaptive neurons are not for everything, not even for all adaptive tasks
–In non-adaptive tasks, they only add unnecessary dimensions to the search space
–In adaptive tasks, they may be best for tasks requiring holistic solutions. What are those?
Don’t underestimate the power of recurrence

Homework due 3/8/06 (see next slide)
Next Topic: Leaky Integrator Neurons, CTRNNs, and Pattern Generators
–Real neurons encode information as spikes and spike trains with differing rates
–Dendrites may integrate spike trains at different rates
–Rate differences can create central pattern generators without a clock!
Reading:
–Levels of Dynamics and Adaptive Behavior in Evolutionary Neural Controllers by Blynel, J., and Floreano, D. (2002)
–Evolution of Central Pattern Generators for Bipedal Walking in a Real-Time Physics Environment by Torsten Reil and Phil Husbands (2002)
–Optional: Evolution and Analysis of Model CPGs for Walking I. Dynamical Modules by Chiel, H.J., Beer, R.D., and Gallagher, J.C. (1999)
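As a preview of the next topic, a leaky integrator network can be simulated with a simple Euler step. This sketch is my own (parameter values are illustrative, not from the readings); the key idea is that each neuron has its own time constant, so different parts of the network integrate at different rates:

```python
import math

def ctrnn_step(y, tau, w, theta, dt=0.01):
    """One Euler step of a continuous-time recurrent neural network:

        tau_i * dy_i/dt = -y_i + sum_j w[j][i] * sigmoid(y_j + theta_j)

    The -y_i term is the "leak"; different time constants tau_i let
    neurons integrate incoming spikes at different rates."""
    sig = [1.0 / (1.0 + math.exp(-(yj + tj))) for yj, tj in zip(y, theta)]
    return [yi + dt * (-yi + sum(w[j][i] * sig[j] for j in range(len(y))))
                 / tau[i]
            for i, yi in enumerate(y)]

# Two mutually coupled neurons with mismatched time constants:
y = [0.1, -0.1]
tau = [1.0, 2.5]                         # different integration rates
w = [[4.5, -2.0], [2.0, 4.5]]            # w[j][i]: connection j -> i
theta = [-2.0, -4.0]
for _ in range(1000):
    y = ctrnn_step(y, tau, w, theta)
```

With suitable (evolved) weights and time constants, such a network can settle into a self-sustaining rhythm, which is how CTRNNs act as central pattern generators without any external clock.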

Homework Due 3/8/06: Genetic operators all working
–Mating two genomes: mate_multipoint, mate_multipoint_avg, others
–Compatibility measuring: return the distance between two genomes based on the coefficients in the compatibility equation and historical markings
–Structural mutations: mutate_add_link, mutate_add_node, others
–Weight/parameter mutations: mutate_link_weights, mutating other parameters
–Special mutations: mutate_link_enable_toggle (toggle the enable flag), etc.
–Special restrictions: control the probability of certain types of mutations, such as adding a recurrent connection vs. a feedforward connection
Turn in a summary, code, and examples demonstrating that all functions work. Must include checks that phenotypes built from new or altered genotypes are created properly and work.
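For the compatibility-measuring item, the NEAT distance combines excess genes, disjoint genes, and the weight differences of matching genes. A minimal sketch, assuming a simplified genome encoding (a dict mapping innovation number to weight, which is not NEAT's actual data structure):

```python
def compatibility(genes_a, genes_b, c1=1.0, c2=1.0, c3=0.4):
    """NEAT compatibility distance between two genomes:

        delta = c1*E/N + c2*D/N + c3*Wbar

    E: excess genes (beyond the other genome's last innovation)
    D: disjoint genes (non-matching, within the shared range)
    Wbar: mean weight difference of matching genes
    N: size of the larger genome
    Genomes here are dicts: innovation number -> weight."""
    innov_a, innov_b = set(genes_a), set(genes_b)
    matching = innov_a & innov_b
    cutoff = min(max(innov_a), max(innov_b))
    mismatched = innov_a ^ innov_b
    excess = sum(1 for i in mismatched if i > cutoff)
    disjoint = len(mismatched) - excess
    wbar = (sum(abs(genes_a[i] - genes_b[i]) for i in matching)
            / len(matching)) if matching else 0.0
    n = max(len(genes_a), len(genes_b))
    return c1 * excess / n + c2 * disjoint / n + c3 * wbar

a = {1: 0.5, 2: -0.3, 4: 0.8}
b = {1: 0.5, 3: 0.1, 4: 0.8, 6: 0.2}
# matching {1, 4} with zero weight difference, disjoint {2, 3},
# excess {6}: delta = 1/4 + 2/4 + 0 = 0.75
d = compatibility(a, b)
```

Genomes whose distance falls below a threshold are placed in the same species, which is what this measure is used for in speciation.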

Project Milestones (25% of grade)
–2/6: Initial proposal and project description
–2/15: Domain and phenotype code and examples
–2/27: Genes and genotype-to-phenotype mapping
–3/8: Genetic operators all working
–3/27: Population level and main loop working
–4/10: Final project and presentation due (75% of grade)