SOMTIME: AN ARTIFICIAL NEURAL NETWORK FOR TOPOLOGICAL AND TEMPORAL CORRELATION FOR SPATIOTEMPORAL PATTERN LEARNING



Outline SOMTIME: spatio-temporal pattern recognition. Architecture 1: Nielsen threads. Architecture 2: recurrent approximation. Conclusions.

Spatio-Temporal Pattern Recognition Why spatio-temporal recognition? It has several applications: –Speech processing. –Video processing. –Sonar processing. –Radar processing. All of these signals vary in both space and time.

Spatio-Temporal Pattern Recognition Why neural networks? Useful properties: –Adaptivity: the network adapts to changes in the surrounding world. –Nonlinearity: the speech signal is inherently nonlinear. –Likelihood: the output can be read as a confidence (percentage) that a signal is detected. –Contextual information: every neuron is affected by the others.

Spatio-Temporal Pattern Recognition Two architectures are proposed, sharing a common topological analysis but differing in temporal analysis. Topological analysis: SOM. Temporal analysis: –Feedforward accumulation of activity: Nielsen threads. –Recurrent neural network.

Nielsen vs. Recurrent Nielsen: –Architecture: SOM + Nielsen thread. –Recognition: transmission of an impulse along the thread. –Learning: back-propagation (one level). Recurrent: –Architecture: SOM + recurrent thread. –Recognition: convolution of the neuron outputs with the desired output. –Learning: RTRL (Real-Time Recurrent Learning).

Preprocessor for Topology Extraction (SOM)

Kohonen Feature Map Characteristics Type: feedforward/feedback. Neuron layers: 1 input layer, 1 map layer. Input value types: binary, real. Activation function: sigmoid. Learning method: unsupervised. Learning algorithm: self-organization. Mainly used in: pattern classification, optimization problems, simulation.
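A minimal self-organizing map of this kind can be sketched in pure Python. The map size, learning-rate schedule, and neighborhood width below are illustrative assumptions, not values from the slides:

```python
import math
import random

def train_som(data, map_size=8, dim=2, epochs=20, lr0=0.5, sigma0=3.0):
    """Train a 1-D Kohonen map (hypothetical parameter values)."""
    random.seed(0)
    # One weight vector per map neuron, randomly initialized.
    w = [[random.random() for _ in range(dim)] for _ in range(map_size)]
    for epoch in range(epochs):
        lr = lr0 * (1 - epoch / epochs)               # decaying learning rate
        sigma = sigma0 * (1 - epoch / epochs) + 0.5   # shrinking neighborhood
        for x in data:
            # Competition: find the winning (best-matching) neuron.
            bmu = min(range(map_size),
                      key=lambda i: sum((w[i][k] - x[k]) ** 2 for k in range(dim)))
            # Cooperation + adaptation: pull the winner and its
            # topological neighbors toward the input.
            for i in range(map_size):
                h = math.exp(-((i - bmu) ** 2) / (2 * sigma ** 2))
                for k in range(dim):
                    w[i][k] += lr * h * (x[k] - w[i][k])
    return w

def winner(w, x):
    """Return the index of the best-matching neuron for input x."""
    return min(range(len(w)),
               key=lambda i: sum((w[i][k] - x[k]) ** 2 for k in range(len(x))))
```

After training on two well-separated clusters, the two clusters should map to different winning neurons, which is what the thread architectures rely on.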

Nielsen Threads Architecture A one-directional thread. There are as many neurons as samples in the pattern.

Nielsen Threads Inputs enter the SOM. The SOM yields a winning neuron. Thread neurons, excited in order, transfer an impulse from left to right until the output is reached.
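The impulse-transfer recognition step can be sketched as follows; the matching rule and function names are simplifying assumptions, not the slides' exact mechanism:

```python
def nielsen_thread_recognize(winner_sequence, template):
    """Sketch of one-directional Nielsen-thread recognition
    (hypothetical simplification): the thread stores one neuron per
    sample of the pattern.  The impulse starts at the left end and
    is passed to the next neuron only when the SOM winner for the
    current input matches that neuron's stored winner; the pattern
    is recognized if the impulse reaches the right end."""
    pos = 0  # index of the thread neuron currently holding the impulse
    for w in winner_sequence:
        if pos < len(template) and w == template[pos]:
            pos += 1  # impulse transferred one neuron to the right
    return pos == len(template)
```

For example, the winner sequence [3, 1, 4, 1, 5] contains the stored sequence [3, 4, 5] in order, so the impulse reaches the output; a reversed sequence does not.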

Recurrent Architecture A bi-directional thread. Interaction between neurons is differential.

Recurrent Neural Network Inputs enter the SOM. The SOM yields a winning neuron. The recurrent network, excited in order, yields a sequence of high activation values for each neuron.
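A toy version of such a bi-directional thread can be sketched as a leaky recurrent state; the decay, drive, and coupling constants below are assumptions for illustration only:

```python
def recurrent_thread_response(winner_sequence, template, decay=0.5):
    """Sketch (hypothetical parameters): each thread neuron keeps a
    leaky state fed by its left AND right neighbors (bi-directional
    thread), gated by whether the current SOM winner matches its
    stored sample.  When winners arrive in the stored order, activity
    accumulates along the thread and the last neuron's state peaks."""
    state = [0.0] * len(template)
    for w in winner_sequence:
        prev = state[:]
        for i in range(len(template)):
            left = prev[i - 1] if i > 0 else 1.0   # constant drive into neuron 0
            right = prev[i + 1] if i + 1 < len(prev) else 0.0
            gate = 1.0 if w == template[i] else 0.0
            state[i] = decay * prev[i] + gate * (left + 0.1 * right)
    # Recognition would compare this activation trace against the
    # desired output (the convolution step in the original slides).
    return state
```

With an in-order winner sequence the final neuron ends up far more active than with a scrambled one, which is the signal the recognition stage exploits.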

Training the Net... Unsupervised (the SOM) and supervised (the temporal thread).

Training the Nielsen Thread Training is performed on a neuron-by-neuron, sample-by-sample basis: each neuron is trained individually on its corresponding sample.

Training the Recurrent Thread RTRL: Real-Time Recurrent Learning. We first transform the architecture into its canonical form so that the algorithm can be applied.

Training the Recurrent Thread RTRL: Real-Time Recurrent Learning. Canonical form.

Training the Recurrent Thread RTRL: Real-Time Recurrent Learning.
1. Set the synaptic weights to small values drawn from a uniform distribution.
2. Set the initial value of the state vector X(0) = 0.
3. Set Λ_j(0) = 0 for j = 1, 2, …, dim(state space).
Computation: for n = 0, 1, 2, …, compute
Λ_j(n+1) = Φ(n)[W_a(n) Λ_j(n) − W′_a(n) Λ_j(n−1) + U_j(n)]
e(n) = d(n) − y(n) = d(n) − C X(n)
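A toy RTRL implementation in the spirit of these steps is sketched below. The network size, task, and learning rate are assumptions, and it uses the standard RTRL recursion Λ_j(n+1) = Φ(n)[W_a(n)Λ_j(n) + U_j(n)] without the extra W′_a term shown on the slide:

```python
import math
import random

def rtrl_train(seq, n_state=3, eta=0.1, epochs=30, seed=0):
    """Minimal RTRL sketch (hypothetical sizes and task): a fully
    recurrent net with state x(n+1) = tanh(W z(n)), z = [x; u; 1],
    and output y(n) = x_0(n) (i.e. C selects the first state unit).
    The sensitivities p[k][i][j] = dx_k/dW_ij play the role of the
    Lambda_j matrices and are propagated forward with the state."""
    rng = random.Random(seed)
    n_z = n_state + 2                      # state units + input + bias
    W = [[rng.uniform(-0.3, 0.3) for _ in range(n_z)] for _ in range(n_state)]
    errors = []
    for _ in range(epochs):
        x = [0.0] * n_state
        p = [[[0.0] * n_z for _ in range(n_state)] for _ in range(n_state)]
        sq = 0.0
        for u, d in seq:                   # (input, desired output) pairs
            z = x + [u, 1.0]
            s = [sum(W[k][l] * z[l] for l in range(n_z)) for k in range(n_state)]
            x_new = [math.tanh(v) for v in s]
            # Forward-propagate the sensitivities dx_k/dW_ij.
            p_new = [[[0.0] * n_z for _ in range(n_state)] for _ in range(n_state)]
            for k in range(n_state):
                dphi = 1.0 - x_new[k] ** 2             # tanh'(s_k)
                for i in range(n_state):
                    for j in range(n_z):
                        rec = sum(W[k][l] * p[l][i][j] for l in range(n_state))
                        p_new[k][i][j] = dphi * (rec + (z[j] if k == i else 0.0))
            e = d - x_new[0]               # e(n) = d(n) - C X(n)
            sq += e * e
            # Real-time gradient step on every weight.
            for i in range(n_state):
                for j in range(n_z):
                    W[i][j] += eta * e * p_new[0][i][j]
            x, p = x_new, p_new
        errors.append(sq / len(seq))
    return errors
```

Trained on a simple delay-by-one task (output the previous input), the per-epoch mean squared error should fall as the recurrent weights learn to hold the input in the state.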

Why Nielsen or Recurrent? Nielsen: easy training; intuitive dynamics; but too many neurons per thread. Recurrent: a small thread provides the functionality; but complicated training and complicated dynamics. Which way? We can think of the recurrent architecture as a generalization of the Nielsen thread.

Next Steps... Learning interaction with HMMs. A suitable recognition figure of merit for continuous speech recognition. Control of convergence. Improving the training set.