CAP6938 Neuroevolution and Developmental Encoding Leaky Integrator Neurons and CTRNNs Dr. Kenneth Stanley October 25, 2006.

Artificial Neurons are a Model
Standard activation model
But a real neuron doesn't have an activation level:
–Real neurons fire in spike trains
–Spikes/second is a rate
–Therefore, standard activation can be thought of as outputting a firing rate at discrete timesteps (i.e. rate encoding)
[Figure: standard activation model; after Wolfgang Maass]
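The standard activation model the slide contrasts with real neurons can be sketched as follows (a minimal illustration, not from the lecture; the weighted-sum-plus-sigmoid form is the conventional one):

```python
import math

def activate(inputs, weights, bias=0.0):
    """Standard activation model: sigmoid of a weighted sum.

    The single output can be read as a firing rate (spikes/second)
    reported at one discrete timestep -- no individual spike times
    are represented, which is exactly what rate encoding means.
    """
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-net))
```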

What is Lost in Rate Encoding?
Timing information
Synchronization
Activity between discrete timesteps
[Figure: 30 neurons firing in a monkey's striate cortex, from Krüger and Aiple (1988); reprinted from www.igi.tugraz.at/maass/123/node2.html]

Spikes Can Be Encoded Explicitly
Leaky integrate-and-fire neurons encode each individual spike:
–Time is represented exactly; each spike has an associated time
–The timing of recent incoming spikes determines whether a neuron will fire
–Computationally expensive
Can we do almost as well without encoding every single spike?
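A minimal Euler-integration sketch of a leaky integrate-and-fire neuron (the parameter values here are illustrative choices of mine, not from the lecture) shows why every spike carries an exact time:

```python
def lif_step(v, input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    """One Euler step of a leaky integrate-and-fire neuron.

    The membrane potential v leaks toward rest while integrating input;
    when it crosses v_thresh the neuron emits a spike with an exact
    time, then resets -- tracking every such event is what makes
    explicit spike encoding computationally expensive.
    """
    v = v + (dt / tau) * (-v + input_current)
    if v >= v_thresh:
        return v_reset, True   # spike emitted at this timestep
    return v, False

def spike_times(current, steps=200, dt=1.0):
    """Drive the neuron with a constant current; return exact spike times."""
    v, times = 0.0, []
    for step in range(steps):
        v, spiked = lif_step(v, current, dt=dt)
        if spiked:
            times.append(step * dt)
    return times
```

With a constant suprathreshold drive the neuron produces a regular spike train; a subthreshold drive lets the potential settle below threshold and no spikes occur.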

Yes: Leaky Integrator Neurons (CTRNNs: Continuous-Time Recurrent Neural Networks)
Idea: calculate activation at discrete steps but describe its rate of change on a continuous scale
–Instead of activating only based on input, include a temporal component of activation that controls the rate at which activation goes up or down
–Then the neuron can react to changes in a temporal manner, like spikes do

Activation Rate Builds and Decays
Incoming activation causes the output level to climb over time
We can sample the rate at any discrete granularity desired
This gives a view of temporal dynamics without full spike-event simulation
[Figure: input to neuron and output over time; axes: activation level (i.e. spike rate) vs. time]

What is Leaking in a Leaky Integrator?
The neuron loses potential at a defined rate
Each neuron leaks at its own constant rate
Each neuron integrates at that same constant rate as well
[Figure: leaking activation level (membrane potential); axes: activation level (i.e. spike rate) vs. time]
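The effect of a per-neuron leak/integration rate can be seen in a tiny sketch (the time-constant values below are my own illustrative choices): given the same constant drive, a neuron with a small time constant tracks the input quickly, while one with a large time constant responds slowly.

```python
def run_neuron(tau, drive, steps, dt=0.05):
    """Integrate one leaky-integrator neuron toward a constant drive.

    tau sets both how fast activation leaks away and how fast new
    input is integrated: dy/dt = (-y + drive) / tau.
    """
    y = 0.0
    for _ in range(steps):
        y += (dt / tau) * (-y + drive)
    return y

fast = run_neuron(tau=0.5, drive=1.0, steps=40)  # nearly at the target
slow = run_neuron(tau=5.0, drive=1.0, steps=40)  # still climbing
```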

Leaky Integrator Equations
Expressing the rate of change of the activation level (tau_i is neuron i's time constant, governing its leak; I_i is external input; sigma is the activation function):

    tau_i * dy_i/dt = -y_i + sum_j w_ij * sigma(y_j) + I_i

Apply Euler integration to derive the discrete-time equivalent, expressing current activation in terms of activation on the previous discrete timestep (delta_t is the real time between steps):

    y_i(t + delta_t) = y_i(t) + (delta_t / tau_i) * (-y_i(t) + sum_j w_ij * sigma(y_j(t)) + I_i)

Equations from: Blynel, J., and Floreano, D. (2002). Levels of dynamics and adaptive behaviour in evolutionary neural controllers. In Proceedings of the Seventh International Conference on Simulation of Adaptive Behavior (From Animals to Animats 7), 272–281.
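The Euler-discretized update can be written directly in code (a sketch of the standard CTRNN update with hypothetical parameter names; the single-neuron check below simply confirms the state settles at a constant input):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def ctrnn_step(y, weights, ext_inputs, taus, dt=0.05):
    """One Euler step of a CTRNN.

    Implements y_i(t+dt) = y_i(t) + (dt/tau_i) *
        (-y_i(t) + sum_j w[i][j]*sigmoid(y_j(t)) + I_i),
    where weights[i][j] is the connection from neuron j to neuron i
    and taus[i] is neuron i's (evolvable) time constant.
    """
    n = len(y)
    return [
        y[i] + (dt / taus[i]) * (
            -y[i]
            + sum(weights[i][j] * sigmoid(y[j]) for j in range(n))
            + ext_inputs[i]
        )
        for i in range(n)
    ]
```

A single neuron with no recurrent weight and constant input I = 1 converges to y = 1, since the leak term -y exactly balances the input there.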

What Can a CTRNN Do?
With the right time constants for each neuron, complex temporal patterns can be generated
That is, the time constants are a new parameter (inside nodes) that can evolve
More powerful than a regular RNN: capable of generating complex temporal patterns with no input and no clock

Pattern Generation for What?
Walking gaits with no input!
Reil, T., and Husbands, P. (2002). Evolution of central pattern generators for bipedal walking in a real-time physics environment. IEEE Transactions on Evolutionary Computation.

Reil and Husbands Went on to Found the Company NaturalMotion

Pattern Generation for What?
Salamander walking gait
Wing flapping
Ijspeert, A.J. (2001). A connectionist central pattern generator for the aquatic and terrestrial gaits of a simulated salamander. Biological Cybernetics, 84(5).
Mouret, J.-B., Doncieux, S., Muratet, L., Druot, T., et al. (2004). Evolution of neuro-controllers for flapping-wing animats. Proceedings of the Journées MicroDrones, Toulouse.

Maybe Good for Other Things with Temporal Patterning
Music?
Tasks that we typically do not conceive of in terms of patterns?
Learning tasks (better than a simple RNN? See the Blynel and Floreano 2002 paper)
Largely unexplored
How far away is this from the benefits of a true spiking model?

Leaky NEAT
There is a rough, largely untested leakyNEAT in the NEAT Users Group files section:
–Introduces a new activation function and a new time-constant parameter in the nodes
A new leaky-rtNEAT will soon be available too
Note: the topology of most CTRNNs in the past was determined completely by the researcher

Next Topic: Non-Neural NEAT, Closing Remarks on Survey Portion of Class
Complexification and protection of innovation in non-neural structures
Example: cellular automata neighborhood functions
What have we learned, what is its significance, and where does the field stand?
Reading: Mitchell textbook, pp. (Evolving Cellular Automata)
Think about: how would NEAT apply to this task?