1
Linking non-binned spike train kernels to several existing spike train metrics
ESANN 2006, 27 April 2006. Benjamin Schrauwen and Jan Van Campenhout, Electronics and Information Systems Department, Ghent University
2
Overview
- Spike train processing
- Non-binned spike train kernel
- Linking to existing spike train metrics
- Toy application
- Conclusion
3
Spike train processing
Spike train = a set of events.
Types of spike train processing:
- Physiological data: classification/clustering used for better understanding of biological neural systems, information coding and representation, neural man-machine interfaces
- 'Engineering' spike trains: SNNs are more powerful than ANNs [Maass], but representing information and training such networks is still difficult
(Figure: stimulus → biological neural system → detect stimulus)
Need for state-of-the-art classification techniques that can operate on spike trains.
4
Non-binned spike train kernel
Linear discriminant solution can be written in inner-product form:
h(x) = Σᵢ αᵢ yᵢ (xᵢᵀ x) + b
Can easily be made non-linear by mapping to a feature space:
Φ: X → F with x = (x₁, x₂, ..., xₙ) ↦ (φ₁(x), φ₂(x), ..., φ_N(x))
h(x) = Σᵢ αᵢ yᵢ (Φ(xᵢ)ᵀ Φ(x)) + b = Σᵢ αᵢ yᵢ K(xᵢ, x) + b (a small sketch of evaluating this form follows below)
The feature space does not need to be explicit: for Mercer kernels, K(xᵢ, x) implicitly defines Φ.
Many classification/clustering techniques can be written in inner-product form and can thus easily be 'kernelised': SVM (kernel maximum-margin linear classifier), kernel PCA, kernel k-means, ...
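To make the dual form above concrete, here is a minimal Python sketch of evaluating h(x) = Σᵢ αᵢ yᵢ K(xᵢ, x) + b for an arbitrary Mercer kernel; all function and parameter names are illustrative, not taken from the paper.

```python
def kernel_discriminant(x, train_x, alpha, labels, b, K):
    """Evaluate h(x) = sum_i alpha_i * y_i * K(x_i, x) + b.

    train_x : list of training inputs x_i (vectors, spike trains, ...)
    alpha   : dual coefficients alpha_i found by the learning algorithm
    labels  : class labels y_i in {-1, +1}
    K       : a Mercer kernel function K(x_i, x)
    """
    return sum(a * y * K(xi, x) for a, y, xi in zip(alpha, labels, train_x)) + b
```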
5
Non-binned spike train kernel
Classic techniques first convert the set of events to a vector by binning (see the sketch below): based on the rate coding hypothesis.
Recently, much physiological evidence of temporal coding: exact spike times do matter.
So spike trains should stay sets of events, not vectors.
Need for (kernel) techniques that are able to cope with sets.
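For contrast with the non-binned kernels that follow, a small sketch of the classic binning step mentioned above; the spike times and the 100 ms bin width are made up for illustration.

```python
import numpy as np

# A spike train as a set of spike times in seconds (values made up for illustration).
spikes = np.array([0.012, 0.087, 0.090, 0.450, 0.731])

# Classic preprocessing: convert the event set to a fixed-length vector by binning.
bin_width = 0.1                                     # 100 ms bins, an arbitrary choice
edges = np.arange(0.0, 1.0 + bin_width, bin_width)
rate_vector, _ = np.histogram(spikes, bins=edges)
# rate_vector holds one spike count per bin; the exact timing within a bin is lost.
```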
6
Non-binned spike train kernel
Kernels for spike trains: based on sets, not vectors.
Compare all set elements to all set elements using an exponential measure. For spike trains x = (t₁, t₂, ..., tᵢ, ..., t_N) and z = (t'₁, t'₂, ..., t'ⱼ, ..., t'_M):
K(x, z) = Σᵢ Σⱼ exp(−λ |tᵢ − t'ⱼ|) (Laplacian), or with (tᵢ − t'ⱼ)² in the exponent (Gaussian).
λ sets the kernel 'width'.
Can easily be proved to be a Mercer kernel.
Fast implementation possible for λ » 1.
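A minimal NumPy sketch of the two all-to-all exponential kernels, assuming spike trains are passed as 1-D arrays of spike times; this is my reading of the kernels described above, not code from the authors.

```python
import numpy as np

def laplacian_spike_kernel(x, z, lam):
    """Sum over all spike pairs of exp(-lam * |t_i - t'_j|)."""
    diffs = np.abs(x[:, None] - z[None, :])   # N x M matrix of pairwise time differences
    return np.exp(-lam * diffs).sum()

def gaussian_spike_kernel(x, z, lam):
    """Same all-to-all comparison, with a Gaussian decay of the time differences."""
    diffs = x[:, None] - z[None, :]
    return np.exp(-lam * diffs**2).sum()

# Example: two short spike trains (times in seconds, values made up).
x = np.array([0.010, 0.120, 0.300])
z = np.array([0.015, 0.290])
print(laplacian_spike_kernel(x, z, lam=100.0))
```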
7
Linking to existing spike train metrics
- Spike train algebra
- L2-norm for Gaussian filtered spike trains
- Spike train metric spaces
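One way to see such links is through the distance that any Mercer kernel induces in its feature space. The sketch below (reusing the hypothetical spike train kernels from the previous sketch) computes this kernel-induced distance, which then plays the role of a spike train metric.

```python
import numpy as np

def kernel_distance(x, z, K):
    """Feature-space distance induced by a Mercer kernel K:
    d(x, z)^2 = K(x, x) - 2 K(x, z) + K(z, z)."""
    d2 = K(x, x) - 2.0 * K(x, z) + K(z, z)
    return np.sqrt(max(d2, 0.0))   # clip tiny negative values from rounding

# e.g. kernel_distance(x, z, lambda a, b: laplacian_spike_kernel(a, b, 100.0))
```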
8
Linking to existing spike train metrics
- Spikernel
- Alignment score kernel
9
Toy application
Fact 1: Acquiring physiological data is expensive. So: neurobiology departments don't give away their datasets.
Fact 2: Our department is not a neurobiology department. So: our only option is to use artificial data.
'Toy' application: jittered spike train template matching (see the sketch below):
- Generate a small set of random spike trains and call them templates
- Generate a large set of training/test data by randomly jittering the spikes of the templates
- Try to detect the original templates
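A sketch of how such toy data could be generated with NumPy; the 5 ms jitter width and the two-class setup are illustrative assumptions, while the 1 s duration and 10 Hz rate are taken from the next slide.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_template(rate_hz=10.0, duration_s=1.0):
    """Random template: a homogeneous Poisson spike train on [0, duration_s]."""
    n_spikes = rng.poisson(rate_hz * duration_s)
    return np.sort(rng.uniform(0.0, duration_s, size=n_spikes))

def jittered_copy(template, sigma_s=0.005):
    """Training/test example: every spike of the template gets Gaussian time jitter."""
    return np.sort(template + rng.normal(0.0, sigma_s, size=template.shape))

templates = [poisson_template() for _ in range(2)]        # e.g. two template classes
train = [(jittered_copy(t), label)                        # 250 jittered copies per class
         for label, t in enumerate(templates) for _ in range(250)]
```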
10
Toy application
1 s templates, 10 Hz Poisson firing rate, 500 training and 200 test examples.
Soft margin SVM (C = 10).
Laplacian and Gaussian kernels perform very similarly.
Biologically realistic kernel width: corresponds to a membrane time constant of 10 ms.
(Results plot: comparison with an SNN learning rule [Booij2005] and LSM-based template classification [Maass2002])
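A sketch of how this experiment could be wired up with scikit-learn's precomputed-kernel SVM, reusing laplacian_spike_kernel and the toy data from the sketches above; scikit-learn and the λ = 1/(10 ms) choice are my assumptions about one possible setup, not necessarily the original implementation.

```python
import numpy as np
from sklearn.svm import SVC

def gram_matrix(A, B, kernel, lam):
    """Pairwise kernel evaluations between two lists of spike trains."""
    return np.array([[kernel(a, b, lam) for b in B] for a in A])

# Reuse the toy data of the previous sketch: 500 training and 200 test examples.
X_train = [x for x, _ in train]
y_train = [y for _, y in train]
test = [(jittered_copy(t), label) for label, t in enumerate(templates) for _ in range(100)]
X_test = [x for x, _ in test]
y_test = [y for _, y in test]

lam = 1.0 / 0.010                        # kernel width matching a 10 ms time constant

K_train = gram_matrix(X_train, X_train, laplacian_spike_kernel, lam)
K_test = gram_matrix(X_test, X_train, laplacian_spike_kernel, lam)

clf = SVC(C=10.0, kernel="precomputed").fit(K_train, y_train)
print("test accuracy:", clf.score(K_test, y_test))
```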
11
Conclusions
- Two kernels that operate on spike trains have been presented
- These kernels are closely related to several existing spike train metrics (which all embody the natural spike train variability)
- Because of this, the presented kernels are also biologically relevant
- The applicability of the kernels is demonstrated on an artificial spike train classification problem
Future work:
- Test the technique on physiological data
- Spike-based post-processing for the Liquid State Machine
- Develop a variant based on inter-spike intervals (ISIs) to solve preprocessing problems