
1 Backpropagation for Population-Temporal Coded Spiking Neural Networks
July 18 2006 - WCCI/IJCNN 2006
Benjamin Schrauwen and Jan Van Campenhout
Electronics and Information Systems Department, Ghent University
www.elis.UGent.be/SNN/

2 Problem
Current learning rules for spiking neurons have convergence problems and only support a single coding scheme.
Outline:
- Introduction
- Population-Temporal Coding
- Analog Spiking Neuron Approximation
- ASNA-Prop learning rule
- Results
- Conclusions

3 Introduction
Spiking Neural Networks:
- Use spikes to communicate
- Neuron model is approximately a leaky integrator with a threshold
- Computationally more powerful than analog neurons [Maass]
- Intrinsically able to process temporal information
- Biologically more plausible
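As a rough illustration of the "leaky integrator with threshold" neuron mentioned on this slide, here is a minimal leaky integrate-and-fire sketch. The time constant, threshold, and input statistics are illustrative assumptions, not the exact model or parameters from the paper.

```python
# Minimal sketch of a leaky integrate-and-fire neuron: the membrane potential
# leaks towards rest, integrates weighted input spikes, and emits a spike when
# it crosses a hard threshold. Parameters are illustrative only.
import numpy as np

def lif_neuron(input_spikes, weights, dt=1e-3, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """input_spikes: (n_steps, n_inputs) binary array; returns the output spike train."""
    n_steps, _ = input_spikes.shape
    v = 0.0
    out = np.zeros(n_steps)
    for t in range(n_steps):
        v += dt / tau * (-v) + input_spikes[t] @ weights  # leak + integrate
        if v >= v_thresh:                                  # hard threshold
            out[t] = 1.0
            v = v_reset                                    # reset after a spike
    return out

# Example: one neuron driven by two random (Poisson-like) input channels
rng = np.random.default_rng(0)
spikes_in = (rng.random((1000, 2)) < 0.05).astype(float)
print(lif_neuron(spikes_in, np.array([0.5, 0.5])).sum(), "output spikes")
```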

4 Introduction
But learning is still a problem. Already published:
- Unsupervised Spike Timing Dependent Plasticity
- Genetic Algorithm, Evolutionary Strategy and Simulated Annealing
- Correlation-based ALOPEX algorithm
- Supervised, gradient-based SpikeProp learning rule:
  - Only supports time-to-first-spike coding (value = 1/Δt)
  - Troubled by convergence problems
  - Sensitive to exact weight initialisation
  - Only able to move spikes; has no notion of creation and removal of spikes due to parameter changes
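A small sketch of the time-to-first-spike readout hinted at by "value = 1/Δt" above: the encoded value is taken from the latency of the first spike. The reference time, time step, and the no-spike behaviour here are my assumptions; the paper's exact scaling may differ.

```python
# Hedged illustration of time-to-first-spike (TTFS) decoding: the value is
# read out as 1/Delta_t, where Delta_t is the latency of the first spike.
import numpy as np

def ttfs_decode(spike_train, dt=1e-3, t_ref=0.0):
    """Return 1/Delta_t for the first spike after t_ref (0.0 if no spike)."""
    idx = np.flatnonzero(spike_train)
    if idx.size == 0:
        return 0.0                       # no spike: no value encoded (assumption)
    delta_t = idx[0] * dt - t_ref
    return 1.0 / delta_t if delta_t > 0 else float("inf")

train = np.zeros(100)
train[25] = 1                            # first spike at 25 ms
print(ttfs_decode(train))                # -> 40.0  (1 / 0.025 s)
```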

5 Introduction
SpikeProp has a very discontinuous error landscape, resulting in convergence problems.
[Figure: SpikeProp error landscape; desired spike times]

6 Population-Temporal Coding
Many spike coding hypotheses exist: rate, population, time-to-first-spike, rank, filter, ...
This work uses Population-Temporal Coding:
- A combination of both temporal and population representation
- Embodies a large range of different coding possibilities
[Figure: population-temporal coding diagram with z_1, z_2, z_3, filters h_11...h_32 and φ_11...φ_32, and outputs o_1, o_2]
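A sketch of the decoding side of a population-temporal code, under the assumption that an analog value is reconstructed by filtering each neuron's spike train with its own temporal kernel and summing across the population. The exponential kernels and their time constants are illustrative, not the filters h_ij or φ_ij from the paper.

```python
# Sketch of population-temporal decoding: each spike train in the population is
# convolved with its own temporal kernel and the results are summed.
import numpy as np

def pt_decode(spike_trains, kernels):
    """spike_trains: (n_neurons, n_steps); kernels: list of 1-D filter arrays."""
    n_neurons, n_steps = spike_trains.shape
    out = np.zeros(n_steps)
    for i in range(n_neurons):
        # convolve each spike train with its temporal decoding filter
        out += np.convolve(spike_trains[i], kernels[i])[:n_steps]
    return out

# Example: 3 neurons, exponential decoding kernels with different time constants
t = np.arange(0, 0.05, 1e-3)
kernels = [np.exp(-t / tau) for tau in (5e-3, 10e-3, 20e-3)]
spikes = (np.random.default_rng(1).random((3, 200)) < 0.05).astype(float)
print(pt_decode(spikes, kernels).shape)  # (200,)
```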

7 Analog Spiking Neuron Approximation
To solve the SpikeProp problems due to the hard thresholding:
- Approximate SNNs by an analog approximation with a 'soft' threshold
- Allows the calculation of gradients at every time step
- Use this model to construct the learning rule, but apply it to the original spiking neuron model
- The analog approximation does not need to be simulated!
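To make the 'soft' threshold idea concrete, here is a minimal sketch in which the hard spike condition is replaced by a sigmoid, so the output is differentiable in the membrane potential. The sigmoid form and the steepness parameter beta are my assumptions, not the approximation actually used in the paper.

```python
# A soft (differentiable) threshold: gradients exist at every time step,
# unlike the hard threshold of the original spiking neuron.
import numpy as np

def soft_spike(v, v_thresh=1.0, beta=10.0):
    """Differentiable, sigmoid-shaped spike output as a function of the membrane potential."""
    return 1.0 / (1.0 + np.exp(-beta * (v - v_thresh)))

def soft_spike_grad(v, v_thresh=1.0, beta=10.0):
    """Gradient of the soft threshold with respect to the membrane potential."""
    s = soft_spike(v, v_thresh, beta)
    return beta * s * (1.0 - s)

v = np.linspace(0.0, 2.0, 5)
print(soft_spike(v))       # smooth 0 -> 1 transition around the threshold
print(soft_spike_grad(v))  # non-zero gradient near the threshold
```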

8 Analog Spiking Neuron Approximation

9 ASNA-Prop learning rule
- The analog spiking neuron approximation is similar to an Output Feedback, Locally Recurrent, Globally Feed-forward neural network
- Learning rules for OF-LRNNs were presented by Campolucci and Piazza
- Used these ideas, but constructed new rules for the ASNA
- ASNA-Prop is derived from the analog approximation but applied to the spiking model!
- See the publication for the actual math...

10 ASNA-Prop learning rule
- ASNA-Prop has a notion of spike removal and creation
- Supports multiple coding schemes
- Smooth convergence
- Performance greatly improved by using Resilient PROPagation:
  - Heuristic first-order backpropagation speedup
  - Was also tried for SpikeProp but did not work due to the discontinuous error landscape
- Not sensitive to weight initialisation
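Since the slide credits much of the speedup to Resilient Propagation, here is a standard RProp step for reference (the variant without weight backtracking). The step-size constants are the usual textbook defaults, not values taken from the paper.

```python
# Standard Resilient Propagation (RProp-) update: only the sign of the gradient
# is used; per-weight step sizes grow when the sign is stable and shrink when it flips.
import numpy as np

def rprop_step(w, grad, prev_grad, step, eta_plus=1.2, eta_minus=0.5,
               step_min=1e-6, step_max=50.0):
    """One RProp- update; returns the new weights, gradient memory, and step sizes."""
    sign_change = grad * prev_grad
    step = np.where(sign_change > 0, np.minimum(step * eta_plus, step_max), step)
    step = np.where(sign_change < 0, np.maximum(step * eta_minus, step_min), step)
    grad = np.where(sign_change < 0, 0.0, grad)      # skip the update after a sign flip
    w = w - np.sign(grad) * step
    return w, grad, step

w, prev_grad, step = np.zeros(3), np.zeros(3), np.full(3, 0.1)
for g in (np.array([1.0, -1.0, 0.5]), np.array([0.8, 1.0, -0.5])):
    w, prev_grad, step = rprop_step(w, g, prev_grad, step)
print(w, step)
```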

11 Results
SpikeProp's time-to-first-spike coding only allows timeless functions y = f(x); implicit temporal processing is not used.
PTR coding allows temporal processing: input and output are functions of time.
First test: non-temporal XOR
- Rate-based input coding (length 0.1 s; a '1' is coded as 500 Hz, a '0' by no signal)
- Output PTR: one output, with Gaussian temporal filtering
- 3-4-1 architecture
- Converges in 80 epochs; enhanced SpikeProp in 120 epochs (but it has 16 times more connections!)
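A sketch of the rate-based input coding described on this slide: each logical input is a 0.1 s spike train, with a '1' encoded at roughly 500 Hz and a '0' as silence. The Poisson-style spike generation and the 1 ms time step are my assumptions, not the paper's exact procedure.

```python
# Rate coding of XOR inputs: '1' -> ~500 Hz spike train over 0.1 s, '0' -> no spikes.
import numpy as np

def rate_encode(bit, duration=0.1, rate=500.0, dt=1e-3, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    n_steps = int(duration / dt)
    if bit == 0:
        return np.zeros(n_steps)                              # '0' -> no signal
    return (rng.random(n_steps) < rate * dt).astype(float)    # '1' -> ~500 Hz

rng = np.random.default_rng(2)
for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    x = np.stack([rate_encode(a, rng=rng), rate_encode(b, rng=rng)])
    print(a, b, "-> spikes per channel:", x.sum(axis=1), " target:", a ^ b)
```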

12 Results
[Figure: results over 20 runs]

13 Results
Temporal XOR: 40 XORs per example, length 50 ms
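A sketch of a temporal XOR stream in the spirit of this slide: a sequence of XOR problems presented back to back, with the target at each moment being the XOR of the two current inputs. Reading "40 XORs per example, length 50 ms" as 40 segments of 50 ms each, and using piecewise-constant inputs, are my assumptions about the setup, not details taken from the paper.

```python
# Build one temporal XOR example: 40 consecutive 50 ms segments, each with a
# random input pair and the corresponding XOR target (assumed layout).
import numpy as np

def temporal_xor_example(n_xors=40, seg_len=0.05, dt=1e-3, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    steps_per_seg = int(seg_len / dt)
    bits = rng.integers(0, 2, size=(n_xors, 2))
    inputs = np.repeat(bits, steps_per_seg, axis=0).astype(float)        # piecewise-constant inputs
    target = np.repeat(bits[:, 0] ^ bits[:, 1], steps_per_seg).astype(float)
    return inputs, target

x, y = temporal_xor_example()
print(x.shape, y.shape)  # (2000, 2) (2000,)
```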

14 Results
Transcoding: ISI encoding as input, PTR as output
Spiking neurons can speak multiple languages

15 Conclusions
- A new learning rule that eliminates all the problems of SpikeProp
- Supports a broad range of coding schemes
- Performance is much better than SpikeProp
Future work:
- Apply to much larger benchmark problems
- Implement rules for other coding schemes (time-to-first-spike, ISI, ...)
- Train more parameters to improve performance
- The influence of alpha needs to be further investigated
www.elis.UGent.be/SNN/

