27 April 2006, ESANN 2006. Benjamin Schrauwen and Jan Van Campenhout


Linking non-binned spike train kernels to several existing spike train metrics
27 April 2006, ESANN 2006
Benjamin Schrauwen and Jan Van Campenhout
Electronics and Information Systems Department, Ghent University
www.elis.UGent.be/SNN/

Overview
- Spike train processing
- Non-binned spike train kernel
- Linking to existing spike train metrics
- Toy application
- Conclusion

Spike train processing
A spike train is a set of events. Types of spike train processing:
- Physiological data: classification/clustering used for a better understanding of biological neural systems, of information coding and representation, and for neural man-machine interfaces
- 'Engineering' spike trains: spiking neural networks (SNNs) are more powerful than classic ANNs [Maass], but representing information and training such networks is still difficult
(Figure: stimulus → biological neural system → detect stimulus)
There is a need for state-of-the-art classification techniques that can operate directly on spike trains.

Non-binned spike train kernel
The linear discriminant solution can be written in inner-product form:
  h(x) = Σ_i α_i y_i (x_i · x) + b
It can easily be made non-linear by mapping to a feature space Φ: X → F with
  x = (x_1, x_2, ..., x_n) → (φ_1(x), φ_2(x), ..., φ_N(x))
so that
  h(x) = Σ_i α_i y_i (Φ(x_i) · Φ(x)) + b = Σ_i α_i y_i K(x_i, x) + b
The feature space Φ does not need to be explicit: a Mercer kernel K(x_i, x) implicitly defines Φ.
Many classification/clustering techniques can be written in inner-product form and can thus easily be 'kernelised': the SVM (a kernel maximum-margin linear classifier), kernel PCA, kernel k-means, ...
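The dual form on this slide can be sketched in a few lines. This is an illustrative snippet, not the talk's implementation: the RBF kernel choice, the support set, and the coefficients are all made up for demonstration.

```python
import math

def rbf_kernel(u, v, gamma=1.0):
    """Gaussian RBF, a standard Mercer kernel: K(u, v) = exp(-gamma * ||u - v||^2)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

def discriminant(x, support, alphas, labels, b, kernel=rbf_kernel):
    """Kernelised linear discriminant: h(x) = sum_i alpha_i * y_i * K(x_i, x) + b."""
    return sum(a * y * kernel(xi, x)
               for a, y, xi in zip(alphas, labels, support)) + b

# Two support vectors, one per class; a query near the positive one.
support = [[0.0, 0.0], [2.0, 2.0]]
h = discriminant([0.1, 0.1], support, alphas=[1.0, 1.0], labels=[+1, -1], b=0.0)
print(h > 0)  # True: the query lands on the positive side
```

Note that only kernel evaluations appear: the feature map Φ is never computed explicitly.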

Non-binned spike train kernel
Classic techniques first convert the set of events to a vector by binning, based on the rate-coding hypothesis. Recently there is much physiological evidence for temporal coding: exact spike times do matter. Spike trains should thus stay sets of events, not vectors, and we need (kernel) techniques that are able to cope with sets.
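To make the binning objection concrete, here is a small sketch (bin count and spike times are illustrative choices) showing how a 2 ms jitter of a single spike flips the binned representation entirely:

```python
def bin_spike_train(spikes, duration, n_bins):
    """Rate-coding view: collapse a set of spike times into per-bin spike counts."""
    width = duration / n_bins
    counts = [0] * n_bins
    for t in spikes:
        counts[min(int(t / width), n_bins - 1)] += 1
    return counts

# A tiny jitter moves the spike across a bin boundary: the binned vector
# changes abruptly even though the spike timing barely changed.
print(bin_spike_train([0.499], duration=1.0, n_bins=2))  # [1, 0]
print(bin_spike_train([0.501], duration=1.0, n_bins=2))  # [0, 1]
```

A kernel defined directly on the event times has no such discontinuity.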

Non-binned spike train kernel
Kernels for spike trains: based on sets, not vectors. Compare all set elements to all set elements using an exponential measure. For spike trains x = (t_1, t_2, ..., t_i, ..., t_N) and z = (t'_1, t'_2, ..., t'_j, ..., t'_M):
  K(x, z) = Σ_i Σ_j exp(−λ |t_i − t'_j|)
λ sets the kernel 'width'. K can easily be proved to be a Mercer kernel, and a fast implementation is possible for λ » 1.
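A minimal sketch of such an all-pairs exponential kernel (the parameter names are my own; the Laplacian/Gaussian pair follows the results slide later in the talk):

```python
import math

def spike_kernel(x, z, lam=10.0, gaussian=False):
    """All-pairs exponential spike train kernel on raw spike times:
    Laplacian: sum_i sum_j exp(-lam * |t_i - t_j|)
    Gaussian:  sum_i sum_j exp(-lam * (t_i - t_j)**2)
    No binning: the inputs are sets of event times of arbitrary length."""
    total = 0.0
    for ti in x:
        for tj in z:
            d = ti - tj
            total += math.exp(-lam * (d * d if gaussian else abs(d)))
    return total

x = [0.10, 0.30, 0.50]
z = [0.11, 0.29, 0.52]          # a slightly jittered copy of x
far = [0.80, 0.85, 0.90]
print(spike_kernel(x, z) > spike_kernel(x, far))  # True: similar trains score higher
```

Since the kernel is a sum of Mercer kernels over all spike pairs, it satisfies the Cauchy-Schwarz inequality K(x,z)² ≤ K(x,x)·K(z,z), a quick sanity check on any implementation.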

Linking to existing spike train metrics
- Spike train algebra
- L2-norm for Gaussian-filtered spike trains
- Spike train metric spaces

Linking to existing spike train metrics
- Spikernel
- Alignment score kernel

Toy application
Fact 1: acquiring physiological data is expensive, so neurobiology departments don't give away their datasets. Fact 2: our department is not a neurobiology department. Our only option: use artificial data.
'Toy' application: jittered spike train template matching (figure: detect original template 1/2):
- Generate a small set of random spike trains and call them templates
- Generate a large set of training/test data by randomly jittering the spikes of the templates
- Try to detect the original templates
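The data-generation recipe above can be sketched as follows; the jitter width and set sizes here are illustrative choices, not the talk's exact parameters:

```python
import random

def poisson_train(rate_hz, duration_s, rng):
    """Homogeneous Poisson spike train: exponential inter-spike intervals."""
    t, spikes = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)
        if t >= duration_s:
            return spikes
        spikes.append(t)

def jitter(template, sigma_s, rng):
    """Gaussian-jittered copy of a template, clipped to stay non-negative."""
    return sorted(max(0.0, t + rng.gauss(0.0, sigma_s)) for t in template)

rng = random.Random(0)
templates = [poisson_train(10.0, 1.0, rng) for _ in range(2)]   # 10 Hz, 1 s
train = [(jitter(tpl, 0.005, rng), label)                       # 5 ms jitter
         for label, tpl in enumerate(templates) for _ in range(10)]
print(len(train))  # 20
```

Each example keeps the spike count of its template; only the timings move.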

Toy application
Setup: 1 s templates, 10 Hz Poisson firing rate, 500 training and 200 test examples, soft-margin SVM (C = 10). The Laplacian and Gaussian kernels perform very similarly. A biologically realistic kernel width corresponds to a membrane time constant of 10 ms. Results are compared to an SNN learning rule [Booij2005] and to LSM-based template classification [Maass2002].
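The talk uses a soft-margin SVM (C = 10) on top of the spike train kernel. As a dependency-free stand-in for the full SVM, the sketch below substitutes a nearest-template classifier in the kernel-induced distance; it illustrates the same pipeline (raw spike times in, kernel evaluations only), not the talk's actual classifier.

```python
import math

def spike_kernel(x, z, lam=10.0):
    """Laplacian all-pairs spike train kernel (as on the earlier slide)."""
    return sum(math.exp(-lam * abs(ti - tj)) for ti in x for tj in z)

def kernel_dist_sq(x, z, k=spike_kernel):
    """Squared distance induced by the kernel's implicit feature map:
    ||phi(x) - phi(z)||^2 = K(x,x) - 2*K(x,z) + K(z,z)."""
    return k(x, x) - 2.0 * k(x, z) + k(z, z)

def classify(x, templates):
    """Assign x to the template that is nearest in the kernel-induced metric."""
    return min(range(len(templates)), key=lambda i: kernel_dist_sq(x, templates[i]))

templates = [[0.1, 0.3, 0.5], [0.2, 0.6, 0.9]]
print(classify([0.11, 0.31, 0.49], templates))  # 0: a jittered copy of template 0
```

Because the distance is expressed entirely through K, swapping in the Gaussian variant (or plugging the precomputed Gram matrix into an SVM) changes nothing else in the pipeline.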

Conclusions
- Two kernels that operate directly on spike trains have been presented
- These kernels are closely related to several existing spike train metrics (which all embody the natural spike train variability); the presented kernels are therefore also biologically relevant
- The applicability of the kernels is demonstrated on an artificial spike train classification problem
Future work:
- Test the technique on physiological data
- Spike-based post-processing for the Liquid State Machine
- A technique based on inter-spike intervals (ISIs) to solve preprocessing problems
www.elis.UGent.be/SNN/