1 Combining Multiple Modes of Information using Unsupervised Neural Classifiers
Neural Computing Group, Department of Computing, School of Electronics and Physical Sciences, University of Surrey
Khurshid Ahmad, Bogdan Vrusias, Matthew Casey, Panagiotis Saragiotis

2 Content
Report on preliminary experiments to:
– Attempt to improve classification through combining modalities of information
– Use a modular co-operative neural network system combining unsupervised learning techniques
Tested using:
– Scene-of-crime images and collateral text
– Number magnitude and articulation

3 Background
Consider how we may improve classification through combination:
– Combining like classifiers (e.g. ensemble systems)
– Combining expert classifiers (e.g. modular systems)
We concentrate on a modular approach to combining modalities of information
– For example, Kittler et al. (1998): personal identity verification using frontal face, face profile and voice inputs

4 Multi-net Systems
The concept of combining neural network systems has been discussed for a number of years
– Both ensemble and modular systems
– Ensembles are the more prevalent
The term 'multi-net systems' has been promoted by Sharkey (1999, 2002), who recently advocated the use of modular systems
– For example, the mixture-of-experts of Jacobs et al. (1991)

5 Multi-net Systems
Neural network techniques for classification tend to subscribe to the supervised learning paradigm:
– Ensemble methods
– Mixture-of-experts
Exceptions include Lawrence et al. (1997) and Ahmad et al. (2002)
Unsupervised techniques give rise to problems of interpretation

6 Self-organised Combinations
Our approach is based upon the combination of different Hebbian-like learning systems
Hebb's neurophysiological postulate (1949):
– The strength of a connection is increased when both sides of the connection are active

7 Self-organised Combinations
Willshaw & von der Malsburg (1976):
– Used Hebbian learning to associate patterns of activity in a 2-d pre-synaptic (input) layer and a 2-d post-synaptic (output) layer
– Pre-synaptic neurons become associated with post-synaptic neurons
Kohonen (1997) extended this in his Self-Organising Map (SOM):
– Statistical approximation of the input space
– Topological map showing the relatedness of input patterns
– Clusters used to show classes
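
The slides give no code, but the standard Kohonen update they build on is easy to sketch. The following minimal Python sketch uses a Gaussian neighbourhood with the hyper-parameter values reported on slide 13 (learning rate 0.9 to 0.1, radius 8 to 1); the exact decay law is an assumption, and this is illustrative rather than the authors' implementation.

```python
import numpy as np

def som_step(weights, x, t, t_max, lr0=0.9, lr1=0.1, radius0=8.0):
    """One Kohonen update: find the best-matching unit, then pull every
    neuron's codebook vector towards the input, weighted by a Gaussian
    neighbourhood centred on the winner."""
    rows, cols, _ = weights.shape
    bmu = np.unravel_index(
        np.argmin(np.linalg.norm(weights - x, axis=2)), (rows, cols))
    frac = t / max(t_max - 1, 1)
    lr = lr0 * (lr1 / lr0) ** frac                # 0.9 -> 0.1, assumed exponential
    radius = radius0 * (1.0 / radius0) ** frac    # 8 neurons -> 1 neuron
    ii, jj = np.indices((rows, cols))
    d2 = (ii - bmu[0]) ** 2 + (jj - bmu[1]) ** 2  # squared grid distance to winner
    h = np.exp(-d2 / (2.0 * radius ** 2))         # Gaussian neighbourhood function
    weights += lr * h[..., None] * (x - weights)
    return bmu
```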

8 Self-organised Combinations
Our architecture builds further on this using the multi-net paradigm
– Can be compared to Hebb's superordinate combination of cell assemblies
Two SOMs linked by Hebbian connections:
– One SOM learns to classify a primary modality of information
– One SOM learns to classify a collateral modality of information
– Hebbian connections associate patterns of activity in each SOM

9 Self-organised Combinations
SOMs and Hebbian connections are trained synchronously
[Architecture diagram: primary vector → primary SOM ↔ bi-directional Hebbian network ↔ collateral SOM ← collateral vector]

10 Self-organised Combinations
Hebbian connections associate neighbourhoods of activity (see the sketch below):
– Not just a one-to-one linear association
– Each SOM's output is formed by a pattern of activity centred on the winning neuron for the primary or collateral input
Training is complete when both SOM classifiers have learned to classify their respective inputs
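
A hypothetical synchronous training step, building on the `som_step` sketch above: both SOMs are updated, a Gaussian activity bump is formed around each winner, and the Hebbian matrix is strengthened by their outer product. The activity radius of 2 and the whole-matrix normalisation are assumptions; the slides say only that the Hebbian weights are normalised.

```python
import numpy as np

def activity(shape, winner, radius=2.0):
    """Gaussian bump of activity centred on the winning neuron, so that a
    whole neighbourhood, not a single unit, takes part in the association."""
    ii, jj = np.indices(shape)
    d2 = (ii - winner[0]) ** 2 + (jj - winner[1]) ** 2
    return np.exp(-d2 / (2.0 * radius ** 2))

def train_step(w_prim, w_coll, hebb, x_prim, x_coll, t, t_max):
    """One synchronous step: update both SOMs, then strengthen the Hebbian
    weights between the co-active neighbourhoods (Hebb, 1949)."""
    a_p = activity(w_prim.shape[:2], som_step(w_prim, x_prim, t, t_max)).ravel()
    a_c = activity(w_coll.shape[:2], som_step(w_coll, x_coll, t, t_max)).ravel()
    hebb += np.outer(a_p, a_c)        # co-active units strengthen together
    hebb /= np.linalg.norm(hebb)      # 'Hebbian connection weights normalised'
    return hebb
```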

11 Classifying Images and Text
Class | Primary Image | Collateral Text
Body | [full-length image of body] | "Full length shot of body"
Single objects (close-up) | [image of pistol] | "Nine millimetre Browning high power self-loading pistol"

12 Classifying Images and Text
Classify images based upon both images and texts
Primary modality of information:
– 66 images from the scene-of-crime domain
– 112-d vector based upon colour, edge and texture features
Collateral modality of information:
– 66 texts describing image content
– 50-d binary vector based upon term-frequency analysis
8 expert-defined classes
58 vector pairs used for training, 8 for testing
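
As an illustration of the collateral representation, a binary term-presence vector over a fixed domain vocabulary might be built as follows; the actual 50 terms and the details of the term-frequency analysis are not given in the slides, so the vocabulary here is invented.

```python
import numpy as np

def collateral_vector(text, vocabulary):
    """Binary presence vector: bit i is 1 if vocabulary term i occurs in
    the collateral text (hypothetical scheme; terms are illustrative)."""
    tokens = set(text.lower().split())
    return np.array([1.0 if term in tokens else 0.0 for term in vocabulary])

vocab = ["pistol", "body", "shot", "close-up"]   # illustrative terms only
print(collateral_vector(
    "Nine millimetre Browning high power self-loading pistol", vocab))
# -> [1. 0. 0. 0.]
```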

13 Training
Image SOM: 15 by 15 neurons
Text SOM: 15 by 15 neurons
Initial random weights
Gaussian neighbourhood function with initial radius of 8 neurons, reducing to 1 neuron
Exponentially decreasing learning rate, initially 0.9, reducing to 0.1
Hebbian connection weights normalised
Trained for 1000 epochs
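
The slides give only the endpoints of these schedules. Assuming a standard exponential interpolation between them over the 1000 epochs, they could be computed as:

```python
import numpy as np

epochs = 1000
t = np.arange(epochs)
lr = 0.9 * (0.1 / 0.9) ** (t / (epochs - 1))       # 0.9 -> 0.1
radius = 8.0 * (1.0 / 8.0) ** (t / (epochs - 1))   # 8 neurons -> 1 neuron
assert np.isclose(lr[0], 0.9) and np.isclose(lr[-1], 0.1)
assert np.isclose(radius[0], 8.0) and np.isclose(radius[-1], 1.0)
```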

14 Testing
Tested with 8 image and text vectors
– Classification counts as successful if the test vector's winner corresponds with the identified cluster for its class
Image SOM: correctly classified 4 of the 8 images
Text SOM: correctly classified 5 of the 8 texts

15 Testing
For misclassified images:
– The text classification was determined
– Then translated into an image classification via Hebbian activation
Similarly for misclassified texts
Image SOM: a further 3 of the 4 misclassified images were correctly classified (total 7 out of 8)
Text SOM: a further 2 of the 3 misclassified texts were correctly classified (total 7 out of 8)
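
A sketch of this cross-modal recovery step, with hypothetical names: project the collateral (text) SOM's activity pattern through the Hebbian weights onto the primary (image) SOM, then read off the class of the most strongly activated unit.

```python
import numpy as np

def recover_image_class(hebb, a_text, image_class_of_unit):
    """Hypothetical recall: hebb has shape (n_image_units, n_text_units),
    a_text is the flattened activity over the text SOM, and
    image_class_of_unit maps flattened image-SOM unit index -> class label."""
    a_image = hebb @ a_text                  # induced activity on the image SOM
    winner = int(np.argmax(a_image))
    return image_class_of_unit.get(winner)   # None if the unit is unlabelled
```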

16 Comparison
Contrast with single-modality classification in the image or text SOM
Compared with a single SOM classifier:
– 15 by 15 neurons
– Trained on combined image and text vectors (162-d vectors)
– Only 3 out of 8 test vectors correctly classified

17 Classifying Number
Classify numbers based upon (normalised) image or articulation?
Primary modality of information:
– Magnitude representation of the numbers 1 to 22
– 66-d binary vector with 3 bits per magnitude
Collateral modality of information:
– Articulation representation of the numbers 1 to 22
– 16-d vector representing phonemes
22 different numbers to classify
16 vector pairs used for training, 6 for testing
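
One possible layout for the 66-d magnitude code (22 numbers × 3 bits); the slides fix the dimensions but not the bit layout, so the disjoint-block scheme below is purely an assumption (a thermometer-style code would be equally consistent with the slide).

```python
import numpy as np

def magnitude_vector(n, n_numbers=22, bits=3):
    """Hypothetical layout: number n activates its own block of 3 bits,
    giving a 22 x 3 = 66-d binary vector."""
    v = np.zeros(n_numbers * bits)
    v[(n - 1) * bits : n * bits] = 1.0
    return v

assert magnitude_vector(1).size == 66 and magnitude_vector(22).sum() == 3
```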

18 Training
Magnitude SOM: 66 by 1 neurons
Articulation SOM: 16 by 16 neurons
Initial random weights
Gaussian neighbourhood function with initial radius of 33 (primary) and 8 (collateral) neurons, reducing to 1 neuron
Exponentially decreasing learning rate, initially 0.5
Hebbian connection weights normalised
Trained for 1000 epochs

19 Testing
Tested with 6 magnitude and articulation vectors
– Classification counts as successful if the test vector's winner corresponds with the identified cluster for its class
Magnitude SOM:
– Correctly classified all 6 magnitudes
– Magnitudes arranged in a 'number line'
Articulation SOM:
– Similar phonetic responses, but essentially misclassified all 6 articulations

20 Testing
For misclassified articulation vectors:
– The magnitude classification was determined
– Then translated into an articulation classification via Hebbian activation
Articulation SOM:
– 3 of the 6 articulation vectors correctly classified
– The remaining 3 demonstrate that Hebbian association is not sufficient to guarantee better classification

21 Comparison
Contrast with single-modality classification in the magnitude or articulation SOM
Compared with a single SOM classifier:
– 16 by 16 neurons
– Trained on combined magnitude and articulation vectors (82-d vectors)
– Misclassified all 6 articulation vectors
– The SOM shows the test numbers are similar in 'sound' to numbers in the training set
– The combined SOM does not exhibit the 'number line' and cannot capitalise upon it

22 Summary
Preliminary results show that:
– A modular co-operative multi-net system using unsupervised learning techniques can improve classification with multiple modalities
– Hebb's superordinate combination of cell assemblies?
Future work:
– Evaluate against larger data sets
– Develop further understanding of clustering and classification in SOMs
– Further explore the linkage of neighbourhoods, which is more than a one-to-one mapping, and the theory underlying the model

23 Acknowledgements
Supported by the EPSRC Scene of Crime Information System project (Grant No. GR/M89041):
– University of Sheffield
– University of Surrey
– Five UK police forces
Images supplied by the UK Police Training College at Hendon, with text transcribed by Chris Handy

24 References
Ahmad, K., Casey, M.C. & Bale, T. (2002). Connectionist Simulation of Quantification Skills. Connection Science, vol. 14(3).
Jacobs, R.A., Jordan, M.I. & Barto, A.G. (1991). Task Decomposition through Competition in a Modular Connectionist Architecture: The What and Where Vision Tasks. Cognitive Science, vol. 15, pp. 219-250.
Hebb, D.O. (1949). The Organization of Behavior: A Neuropsychological Theory. New York: John Wiley & Sons.
Kittler, J., Hatef, M., Duin, R.P.W. & Matas, J. (1998). On Combining Classifiers. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20(3), pp. 226-239.
Kohonen, T. (1997). Self-Organizing Maps, 2nd Ed. Berlin, Heidelberg, New York: Springer-Verlag.
Lawrence, S., Giles, C.L., Tsoi, A.C. & Back, A.D. (1997). Face Recognition: A Convolutional Neural Network Approach. IEEE Transactions on Neural Networks, vol. 8(1), pp. 98-113.
Sharkey, A.J.C. (1999). Combining Artificial Neural Nets: Ensemble and Modular Multi-Net Systems. Berlin, Heidelberg, New York: Springer-Verlag.
Sharkey, A.J.C. (2002). Types of Multinet System. In Roli, F. & Kittler, J. (Eds), Proceedings of the Third International Workshop on Multiple Classifier Systems (MCS 2002). Berlin, Heidelberg, New York: Springer-Verlag.
Willshaw, D.J. & von der Malsburg, C. (1976). How Patterned Neural Connections can be set up by Self-Organization. Proceedings of the Royal Society, Series B, vol. 194, pp. 431-445.

25 Combining Multiple Modes of Information using Unsupervised Neural Classifiers
Neural Computing Group, Department of Computing, School of Electronics and Physical Sciences, University of Surrey
Khurshid Ahmad, Bogdan Vrusias, Matthew Casey, Panagiotis Saragiotis

26 Multi-net Systems
Sharkey (2002): Types of Multi-net System