Evolutionary Path to Biological Kernel Machines
Magnus Jändel, Swedish Defence Research Agency
Brain Inspired Cognitive Systems, 15 July 2010

Summary
It is comparatively easy for organisms to implement support vector machines. Biological support vector machines provide efficient and cost-effective pattern recognition with one-shot learning [1]. The support vector machine hypothesis is consistent with the architecture of the olfactory system [1]. Bursts in the thalamocortical system may be related to support vector machine pattern recognition [2]. An efficient implementation reuses machinery for learning action sequences [3].
1) Jändel, M.: A neural support vector machine. Neural Networks 23 (2010).
2) Jändel, M.: Thalamic bursts mediate pattern recognition. Proceedings of the 4th International IEEE EMBS Conference on Neural Engineering, 562–565 (2009).
3) Jändel, M.: Pattern recognition as an internalized motor programme. To appear in Proc. of ICNN.

Outline
Support vector machine definition
Evolutionary path to a neural SVM
Conclusions and olfactory model

Support vector machine definition

Maximum margin linear classification
Consider binary classification with m training examples $\{(\mathbf{x}_i, y_i)\}_{i=1}^{m}$, where $\mathbf{x}_i \in \mathbb{R}^n$ and $y_i \in \{-1,+1\}$. The maximum margin classifier is the linear decision boundary that separates the two classes with the largest possible margin.

Transform to high-dimensional feature space
Map the input to a high-dimensional feature space, $\mathbf{x} \mapsto \Phi(\mathbf{x})$, where the classes are more likely to be linearly separable, and express all inner products through a kernel function $k(\mathbf{x},\mathbf{x}') = \Phi(\mathbf{x})\cdot\Phi(\mathbf{x}')$. Zero-bias SVM: the separating hyperplane passes through the origin of feature space (no bias term), so the classification function is
$$y(\mathbf{x}) = \operatorname{sign}\Big(\sum_{i=1}^{m} \alpha_i y_i \, k(\mathbf{x}_i,\mathbf{x})\Big).$$
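The kernel trick can be checked numerically. Below is a minimal Python sketch, assuming a quadratic kernel $(\mathbf{x}\cdot\mathbf{x}'+1)^2$ in two dimensions; the kernel value equals the inner product of explicit feature vectors without ever forming them in the classifier (the names `phi` and `k` are illustrative, not from the talk).

```python
import numpy as np

def phi(x):
    """Explicit feature map for the quadratic kernel (x.x' + 1)^2 in 2-D."""
    x1, x2 = x
    return np.array([1.0,
                     np.sqrt(2) * x1, np.sqrt(2) * x2,
                     x1**2, x2**2,
                     np.sqrt(2) * x1 * x2])

def k(x, xp):
    """The same inner product, computed directly in input space."""
    return (x @ xp + 1.0) ** 2

x, xp = np.array([1.0, 2.0]), np.array([3.0, -1.0])
assert np.isclose(phi(x) @ phi(xp), k(x, xp))   # identical results
```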

Zero-bias ν-SVM
Maximize:
$$W(\boldsymbol{\alpha}) = -\tfrac{1}{2}\sum_{i,j=1}^{m} \alpha_i \alpha_j y_i y_j \, k(\mathbf{x}_i,\mathbf{x}_j)$$
Subject to:
$$0 \le \alpha_i \le \tfrac{1}{m} \quad\text{and}\quad \sum_{i=1}^{m}\alpha_i \ge \nu,$$
where $k$ is the kernel. Solve by iterative gradient ascent in the $\alpha$-space hyperplane where $\sum_i \alpha_i = \nu$. The gradient component
$$\frac{\partial W}{\partial \alpha_i} = -\,y_i \sum_j \alpha_j y_j \, k(\mathbf{x}_i,\mathbf{x}_j)$$
is minus the margin of the i:th example in feature space! Classification function:
$$y(\mathbf{x}) = \operatorname{sign}\Big(\sum_{i=1}^{m} \alpha_i y_i \, k(\mathbf{x}_i,\mathbf{x})\Big)$$
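A minimal NumPy sketch of the iterative gradient ascent described above, under stated assumptions: a Gaussian kernel, a fixed learning rate, and a simple clip step that enforces the box constraints only approximately after the projected step. All function names are illustrative, not from the original talk.

```python
import numpy as np

def rbf_gram(X, gamma=1.0):
    """Gram matrix K_ij = exp(-gamma * |x_i - x_j|^2) for a Gaussian kernel."""
    sq = np.sum(X**2, axis=1)
    return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2.0 * X @ X.T))

def train_zero_bias_nu_svm(K, y, nu=0.5, lr=0.01, steps=2000):
    """Projected gradient ascent on the zero-bias nu-SVM dual objective."""
    m = len(y)
    alpha = np.full(m, nu / m)                  # start on sum(alpha) = nu
    for _ in range(steps):
        margins = y * (K @ (alpha * y))         # margin of each example
        grad = -margins                         # dW/dalpha_i = -margin_i
        alpha += lr * (grad - grad.mean())      # step projected onto the hyperplane
        alpha = np.clip(alpha, 0.0, 1.0 / m)    # box constraints (approximate)
    return alpha

def classify(alpha, y, X, x, gamma=1.0):
    """Zero-bias decision function sign(sum_i alpha_i y_i k(x_i, x))."""
    k = np.exp(-gamma * np.sum((X - x) ** 2, axis=1))
    return np.sign((alpha * y) @ k)
```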

Evolutionary Path

Stage 1
A sensor system feeds a simple hard-wired pattern recognizer (SSPR).

Stage 2
A sensory memory (SM) is added to the stage 1 architecture; it stores the sensory input x.

Stage 3
An associative memory (AM) is added alongside the sensory memory and the simple hard-wired pattern recognizer.

Stage 4
Significant patterns and the associated valence are stored in the AM. Sufficiently similar inputs make the AM recall the valence y′ of a stored pattern.

Stage 5
Significant patterns and the associated valence are stored in the AM, and sufficiently similar inputs make the AM recall the stored pattern x′ together with its valence y′. The pattern recognizer modulates the recalled valence y′ with a similarity measure comparing the input x with the stored pattern x′, according to $y = y'\,k(\mathbf{x},\mathbf{x}')$, where $k$ is the similarity kernel.

Stage 6
The associative memory is replaced by an oscillating associative memory (OM) that oscillates between memory states, presenting the stored examples $(\mathbf{x}_i, y_i)$. The pattern recognizer computes a weighted average over the valences of all stored examples,
$$y(\mathbf{x}) = \operatorname{sign}\Big(\sum_i \alpha_i y_i \, k(\mathbf{x}_i,\mathbf{x})\Big),$$
where the weight $\alpha_i$ reflects the time the OM spends in state i. Stage 6 implements the classification function of a zero-bias SVM.
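A schematic Python sketch of this classification process, assuming the OM dwells in state i with probability proportional to the weight $\alpha_i$; the accumulated sum then converges to the zero-bias SVM decision sum. The names `om_classify`, `examples`, and `kernel` are illustrative assumptions.

```python
import numpy as np

def om_classify(x, examples, alphas, kernel, rng, n_oscillations=1000):
    """Classify x by time-averaging over oscillating memory states.

    Each oscillation the OM presents one stored example (x_i, y_i).
    Dwelling in state i with probability proportional to alpha_i makes
    the accumulated sum converge to sum_i alpha_i y_i k(x_i, x).
    """
    p = np.asarray(alphas, dtype=float)
    p /= p.sum()                                  # state probabilities
    acc = 0.0
    for _ in range(n_oscillations):
        i = rng.choice(len(examples), p=p)        # OM settles in attractor i
        x_i, y_i = examples[i]
        acc += y_i * kernel(x_i, x)               # PR accumulates weighted valence
    return np.sign(acc)

# Example use (illustrative): a Gaussian similarity kernel
gauss = lambda a, b: np.exp(-np.sum((a - b) ** 2))
```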

Oscillating Associative Memory
Hopfield associative memory: N neurons with binary output $z_i$. Update rule: $z_i \leftarrow \operatorname{sgn}\big(\sum_j w_{ij} z_j\big)$. Imprinting m memory patterns $\mathbf{x}^{(k)}$ sets $w_{ij} = \frac{1}{N}\sum_{k=1}^{m} x_i^{(k)} x_j^{(k)}$, which is one-shot learning!
Oscillating memory: firing cell nuclei are exhausted and active synapses are depleted, which yields modes with perpetual oscillation between attractors.
OM model: m memory patterns; each oscillation selects the next state with uniform probability, and the average endurance time of state i is $T_i$. The probability of finding the OM in state i is $P_i = T_i / \sum_j T_j$.
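For concreteness, a minimal NumPy sketch of the Hopfield memory named on the slide, with Hebbian one-shot imprinting and the asynchronous sign update rule. The function names are illustrative, and the exhaustion dynamics that drive the oscillations are not modelled here.

```python
import numpy as np

def imprint(patterns):
    """Hebbian one-shot imprinting: w_ij = (1/N) sum_k x_i^(k) x_j^(k)."""
    m, n = patterns.shape            # m patterns over N = n binary (+/-1) units
    W = (patterns.T @ patterns) / n
    np.fill_diagonal(W, 0.0)         # no self-connections
    return W

def recall(W, z, steps=1000, rng=None):
    """Asynchronous updates z_i <- sgn(sum_j w_ij z_j) from cue z."""
    rng = rng or np.random.default_rng()
    z = z.copy()
    for _ in range(steps):
        i = rng.integers(len(z))     # pick a random unit to update
        z[i] = 1 if W[i] @ z >= 0 else -1
    return z
```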

Stage 7
Learning feedback $B_{ij}$ tunes the memory weights, and real-world experiments are required: $\mathbf{x}_i$ is the present example presented by the OM, $\mathbf{x}_j$ is the sensory input, and $y_j$ is the valence of $\mathbf{x}_j$ as learnt from hard-earned experience feedback. For each OM oscillation, learning rules that update $B_{ij}$ are applied.

Stage 8
OM patterns are set up in the sensory memory while sleeping, and the OM weights $B_{ij}$ are tuned in virtual experiments, so there is no need for external feedback. Stage 8 implements a zero-bias ν-SVM.

Learning SVM weights
For each OM oscillation the stage 7 learning rules are applied. Averaging over "trapped examples" with the probability distribution of OM states shows that the expected weight change reproduces gradient ascent on the zero-bias ν-SVM dual objective, so the oscillating memory learns the SVM weights, as the sketch below illustrates.
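The exact per-oscillation rule is given in [1]. As an illustration of how such updates can average to the dual gradient (an assumption-laden sketch, not necessarily the rule from the paper), a one-sample estimator that picks the trapped example in proportion to its weight has the right expectation:

```python
import numpy as np

def oscillation_step(alpha, y, K, nu, lr, rng):
    """One OM oscillation as a stochastic dual-gradient ascent step.

    With sum(alpha) = nu, the OM presents trapped example i with
    probability alpha_i / nu; the update -nu * y_j * y_i * K[i, j]
    then has expectation -sum_i alpha_i y_i y_j K[i, j] = dW/dalpha_j,
    so on average the weights follow gradient ascent on the dual.
    """
    p = alpha / alpha.sum()                      # state probabilities
    i = rng.choice(len(alpha), p=p)              # trapped example i
    grad_est = -nu * y * y[i] * K[i]             # one-sample gradient estimate
    step = lr * (grad_est - grad_est.mean())     # stay on the sum(alpha) = nu plane
    return np.clip(alpha + step, 0.0, 1.0 / len(alpha))
```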

Conclusions and olfactory model

Summary of support vector machine implementation
Three processes run on the shared architecture (SSPR, SM, and OM with weights $B_{ij}$): the classification process, learning new training examples, and learning the weights of the training examples. Together they implement a zero-bias ν-SVM and define the research program.

Olfactory model
The model is mapped onto the olfactory system (diagram: the trap, CL, and OM are identified with the anatomical structures below).
APC – Anterior piriform cortex
PPC – Posterior piriform cortex
AOC – Anterior olfactory cortex
OB – Olfactory bulb
HOBS – Higher-order brain systems

Questions?