CAP6938 Neuroevolution and Artificial Embryogeny Leaky Integrator Neurons and CTRNNs Dr. Kenneth Stanley March 6, 2006.


Artificial Neurons are a Model Standard activation model, but a real neuron does not have an activation level: –Real neurons fire in spike trains –Spikes/second is a rate –Therefore, standard activation can be thought of as outputting a firing rate at discrete timesteps (i.e. rate encoding) (Wolfgang Maass)
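The rate-encoding view above can be sketched in a few lines: a standard artificial neuron outputs one number per discrete timestep, and that number is read as a normalized firing rate rather than a membrane state. Function and parameter names here are illustrative, not from any particular library:

```python
import math

def activation(inputs, weights, bias=0.0):
    """Standard rate-coded neuron: weighted sum squashed by a sigmoid.

    The output can be read as a normalized firing rate (spikes/second),
    not a membrane potential -- this is the rate-encoding view.
    """
    net = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-net))  # rate in [0, 1]
```

All timing information inside a timestep is collapsed into this single rate value, which motivates the next slide.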

What is Lost in Rate Encoding? Timing information, synchronization, and activity between discrete timesteps. [Figure: 30 neurons firing in a monkey's striate cortex, from Krüger and Aiple (1988); reprinted from www.igi.tugraz.at/maass/123/node2.html]

Spikes Can Be Encoded Explicitly Leaky integrate-and-fire neurons encode each individual spike: –Time is represented exactly; each spike has an associated time –The timing of recent incoming spikes determines whether a neuron will fire –Computationally expensive Can we do almost as well without encoding every single spike?
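As a rough, hypothetical sketch of the leaky integrate-and-fire idea described above (parameter names and values are made up for illustration): the membrane potential integrates input, leaks back toward rest, and a spike time is recorded whenever the potential crosses threshold:

```python
def simulate_lif(input_current, dt=1.0, tau=10.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron: membrane potential v leaks toward
    rest and integrates input; a spike is emitted when v crosses threshold.
    Returns the list of spike times (illustrative parameters)."""
    v = 0.0
    spikes = []
    for step, i_t in enumerate(input_current):
        v += (dt / tau) * (-v + i_t)  # Euler step: leak plus input
        if v >= v_thresh:
            spikes.append(step * dt)  # record the exact spike time
            v = v_reset               # reset after firing
    return spikes
```

Note that the simulation must be stepped finely enough to place every spike, which is why this approach is expensive compared with rate coding.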

Yes: Leaky Integrator Neurons (CTRNNs: Continuous Time Recurrent Neural Networks) Idea: calculate activation at discrete steps, but describe its rate of change on a continuous scale. Instead of activating based only on input, include a temporal component of activation that controls the rate at which activation goes up or down. The neuron can then react to changes in a temporal manner, like spikes.

Activation Rate Builds and Decays Incoming activation causes the output level to climb over time. We can sample the rate at any discrete granularity desired, creating a view of temporal dynamics without full spike-event simulation. [Figure: activation level (i.e. spike rate) over time, showing the input to the neuron and its output over time]

What is Leaking in a Leaky Integrator? The neuron loses potential at a defined rate. Each neuron leaks at its own constant rate, and integrates incoming activation at that same rate (both are governed by the neuron's time constant). [Figure: leaking activation level (membrane potential), i.e. spike rate, over time]

Leaky Integrator Equations Expressing the rate of change of the activation level in continuous, real time:

    τ_i · dy_i/dt = −y_i + Σ_j w_ij σ(y_j) + I_i

where τ_i is neuron i's time constant (controlling the leak), w_ij are connection weights, σ is the activation function, and I_i is external input. Applying Euler integration yields the discrete-time equivalent, expressing the current activation in terms of the activation on the previous discrete timestep:

    y_i(t + Δt) = y_i(t) + (Δt/τ_i) [ −y_i(t) + Σ_j w_ij σ(y_j(t)) + I_i(t) ]

Equations from: Blynel, J., and Floreano, D. (2002). Levels of dynamics and adaptive behaviour in evolutionary neural controllers. In From Animals to Animats 7: Proceedings of the Seventh International Conference on Simulation of Adaptive Behavior, 272–281.
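A minimal sketch of the discrete-time update, transcribed directly from the Euler-integrated equation (names are illustrative):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def ctrnn_step(y, weights, inputs, tau, dt):
    """One Euler step of the CTRNN equation
        tau_i * dy_i/dt = -y_i + sum_j w_ij * sigmoid(y_j) + I_i
    y: neuron states; weights[i][j] = w_ij; tau[i] = per-neuron time constant.
    Returns the new state vector after a timestep of length dt."""
    new_y = []
    for i in range(len(y)):
        net = sum(weights[i][j] * sigmoid(y[j]) for j in range(len(y)))
        dy = (-y[i] + net + inputs[i]) / tau[i]
        new_y.append(y[i] + dt * dy)
    return new_y
```

With no input and no incoming activation, the state decays exponentially toward zero at a rate set by τ_i, which is exactly the "leak" of the previous slide.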

What Can a CTRNN Do? With the right time constants for each neuron, complex temporal patterns can be generated. That is, the time constants are a new parameter (inside the nodes) that can evolve. More powerful than a regular RNN: capable of generating complex temporal patterns with no input and no clock.
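To illustrate the claim that a CTRNN's internal dynamics continue with no input, here is a small self-contained simulation of two mutually connected leaky integrator neurons driven only by their own state. The weights, time constants, and initial state are arbitrary choices for illustration, not an evolved pattern generator:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def run_ctrnn(steps, dt=0.05):
    """Simulate two coupled leaky integrator neurons with zero external
    input; returns the trajectory of neuron 0's state."""
    y = [0.5, -0.5]                    # arbitrary initial state
    w = [[0.0, 5.0], [-5.0, 0.0]]      # mutual excitation/inhibition
    tau = [1.0, 2.0]                   # per-neuron time constants
    trajectory = []
    for _ in range(steps):
        new_y = []
        for i in range(2):
            net = sum(w[i][j] * sigmoid(y[j]) for j in range(2))
            new_y.append(y[i] + dt * (-y[i] + net) / tau[i])  # no input term
        y = new_y
        trajectory.append(y[0])
    return trajectory
```

The leak term keeps the state bounded while the recurrent weights keep it moving; evolving w and τ is what shapes this free-running activity into a useful pattern.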

Pattern Generation for What? Walking gaits with no input! Reil, T., and Husbands, P. (2002). Evolution of central pattern generators for bipedal walking in a real-time physics environment. IEEE Transactions on Evolutionary Computation.

Reil and Husbands Went on to Found the Company NaturalMotion

Pattern Generation for What? Salamander walking gaits and wing flapping. Ijspeert, A. J. (2001). A connectionist central pattern generator for the aquatic and terrestrial gaits of a simulated salamander. Biological Cybernetics, 84:5. Mouret, J.-B., Doncieux, S., Muratet, L., Druot, T., and J. A. … (2004). Evolution of neuro-controllers for flapping-wing animats. Proceedings of the Journées MicroDrones, Toulouse.

Maybe Good for Other Things with Temporal Patterning Music? Picture drawing? (certain types of patterns) Tasks that we typically do not conceive of in terms of patterns? Learning tasks (better than a simple RNN? See the Blynel and Floreano 2002 paper). Largely unexplored. How far away is this from the benefits of a true spiking model?

Leaky NEAT There is a rough, largely untested leakyNEAT in the NEAT Users Group files section: –Introduces a new activation function and a new time constant parameter in the nodes. Note that in most past work, the topology of CTRNNs was determined completely by the researcher.
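One way to picture the change leakyNEAT makes to the genome (a hypothetical sketch; the field and function names are made up, and only the idea of a per-node, evolvable time constant comes from the slide): each node gene carries its own τ, which mutation can perturb alongside the usual weights:

```python
import random
from dataclasses import dataclass

@dataclass
class NodeGene:
    """Illustrative node gene for a leaky-integrator NEAT variant.
    Not the actual leakyNEAT code from the Users Group files."""
    node_id: int
    node_type: str        # 'input', 'hidden', or 'output'
    time_constant: float  # tau_i, used by the leaky activation function

def mutate_time_constant(gene, power=0.1, floor=0.05):
    """Perturb tau like a weight mutation, keeping it strictly positive."""
    gene.time_constant = max(floor, gene.time_constant + random.gauss(0.0, power))
    return gene
```

Because the time constant lives in the node gene, NEAT's historical markings and structural mutations carry over unchanged; only activation and mutation need to know about τ.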

Homework due 3/8/06 (see next slide). Next Topic: Non-neural NEAT, Closing Remarks on Survey Portion of Class –Complexification and protection of innovation in non-neural structures –Example: cellular automata neighborhood functions –What have we learned, what is its significance, and where does the field stand? Read for 3/8/06: Mitchell textbook, pp. (Evolving Cellular Automata); think about: how would NEAT apply to this task?

Homework Due 3/8/06 Genetic operators all working: –Mating two genomes: mate_multipoint, mate_multipoint_avg, others –Compatibility measuring: return the distance between two genomes based on the coefficients in the compatibility equation and historical markings –Structural mutations: mutate_add_link, mutate_add_node, others –Weight/parameter mutations: mutate_link_weights, mutating other parameters –Special mutations: mutate_link_enable_toggle (toggle enable flag), etc. –Special restrictions: control the probability of certain types of mutations, such as adding a recurrent connection vs. a feedforward connection Turn in a summary, code, and examples demonstrating that all functions work. Must include checks that phenotypes built from new or altered genotypes are created properly and work.

Project Milestones (25% of grade) –2/6: Initial proposal and project description –2/15: Domain and phenotype code and examples –2/27: Genes and genotype-to-phenotype mapping –3/8: Genetic operators all working –3/27: Population level and main loop working –4/10: Final project and presentation due (75% of grade)