SOMTIME: AN ARTIFICIAL NEURAL NETWORK FOR TOPOLOGICAL AND TEMPORAL CORRELATION FOR SPATIOTEMPORAL PATTERN LEARNING
SOMTIME — Outline
– Spatio-temporal pattern recognition.
– Architecture 1: Nielsen threads.
– Architecture 2: Recurrent approximation.
– Conclusions.
Spatio-Temporal Pattern Recognition
Why spatio-temporal recognition? Several applications:
– Speech processing.
– Video processing.
– Sonar processing.
– Radar processing.
All of these signals vary in both space and time.
Spatio-Temporal Pattern Recognition
Why neural networks? Properties:
– Adaptivity: the network adapts to changes in the surrounding world.
– Nonlinearity: the speech signal is inherently nonlinear.
– Likelihood: the output provides a measure of confidence in detecting a signal.
– Contextual information: every neuron is affected by the others.
Spatio-Temporal Pattern Recognition
Two architectures proposed: common topological analysis, different temporal analysis.
Topological analysis: SOM.
Temporal analysis:
– Feedforward accumulation of activity: Nielsen threads.
– Recurrent neural network.
Nielsen vs. Recurrent
Nielsen:
– Architecture: SOM + Nielsen thread.
– Recognition: transmission of the impulse.
– Learning: back-propagation (one level).
Recurrent:
– Architecture: SOM + recurrent thread.
– Recognition: convolution of neuron outputs with the desired output.
– Learning: RTRL (Real-Time Recurrent Learning).
Preprocessor for Topology Extraction (SOM)
Kohonen Feature Map Characteristics
– Type: feedforward / feedback.
– Neuron layers: 1 input layer, 1 map layer.
– Input value types: binary, real.
– Activation function: sigmoid.
– Learning method: unsupervised.
– Learning algorithm: self-organization.
– Mainly used in: pattern classification, optimization problems, simulation.
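Not in the original slides: a minimal NumPy sketch of the self-organization step, assuming a 1-D map, Euclidean distance, a Gaussian neighborhood, and linearly decaying learning rate and neighborhood width. The function name `train_som` and all parameter values are illustrative, not taken from SOMTIME.

```python
import numpy as np

def train_som(data, map_size=10, epochs=50, lr0=0.5, sigma0=3.0, seed=0):
    """Unsupervised self-organization of a 1-D Kohonen feature map.

    data: array of shape (n_samples, n_features).
    """
    rng = np.random.default_rng(seed)
    n, d = data.shape
    # One weight vector per map neuron, initialized randomly in the data range.
    weights = rng.uniform(data.min(), data.max(), size=(map_size, d))
    positions = np.arange(map_size)

    for epoch in range(epochs):
        frac = epoch / epochs
        lr = lr0 * (1.0 - frac)              # learning rate decays over time
        sigma = sigma0 * (1.0 - frac) + 0.5  # neighborhood shrinks over time
        for x in data[rng.permutation(n)]:
            # Winning neuron: map unit whose weights are closest to the input.
            winner = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
            # Gaussian neighborhood centered on the winner.
            h = np.exp(-((positions - winner) ** 2) / (2.0 * sigma ** 2))
            # Pull the winner and its neighbors toward the input.
            weights += lr * h[:, None] * (x - weights)
    return weights
```

Recognition then maps each input frame to its winning neuron, `np.argmin(np.linalg.norm(weights - x, axis=1))`; this index sequence is what the temporal thread consumes.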
Nielsen Threads Architecture
– One-directional thread.
– There are as many neurons as there are samples.
Nielsen Threads
Inputs enter the SOM. The SOM yields a winning neuron. Neurons, excited in order, transfer an impulse from left to right until the output is reached.
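Not from the original: a highly simplified sketch of the impulse-transmission idea, assuming each thread neuron is labeled with the SOM winner it expects and passes the impulse rightward when that winner fires. The function and variable names are illustrative.

```python
def nielsen_recognize(winner_sequence, thread_labels):
    """Pass an impulse left to right along a Nielsen thread.

    winner_sequence: SOM winning-neuron indices, one per input sample.
    thread_labels:   expected winner for each thread neuron, in order
                     (one thread neuron per sample of the learned pattern).
    Returns True when the impulse reaches the thread output.
    """
    impulse = 0  # index of the thread neuron currently holding the impulse
    for winner in winner_sequence:
        if impulse < len(thread_labels) and winner == thread_labels[impulse]:
            impulse += 1  # the excited neuron transfers the impulse rightward
    return impulse == len(thread_labels)
```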
Recurrent Architecture
– Bi-directional thread.
– Interaction between neurons is differential.
Recurrent Neural Network
Inputs enter the SOM. The SOM yields a winning neuron. The recurrent network, excited in order, yields a sequence of high output values from each neuron.
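Not in the original: the comparison slide scores recognition by convolving neuron outputs with the desired output. A minimal sketch of one such matching score follows, implemented here as a normalized correlation; the exact figure of merit used in SOMTIME is not specified in these slides.

```python
import numpy as np

def match_score(outputs, desired):
    """Correlate the recurrent thread's output sequence with the
    desired output sequence; values near 1 indicate a good match.

    outputs, desired: arrays of shape (T,).
    """
    o = (outputs - outputs.mean()) / (outputs.std() + 1e-12)
    d = (desired - desired.mean()) / (desired.std() + 1e-12)
    return float(np.dot(o, d)) / len(o)
```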
Training the Net...
– SOM: unsupervised.
– Temporal thread: supervised.
Training the Nielsen Thread
Training is performed on a neuron-by-neuron (sample-by-sample) basis: each neuron is trained individually on its corresponding sample.
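Not in the original: a sketch of the one-level back-propagation mentioned on the comparison slide, assuming each thread neuron is a single sigmoid unit trained with the delta rule on data for its own sample. All names and parameter values are illustrative.

```python
import numpy as np

def train_thread_neuron(samples, targets, lr=0.1, epochs=100, seed=0):
    """Delta-rule training of one sigmoid thread neuron on its own sample.

    samples: (n, d) input vectors belonging to this neuron's sample.
    targets: (n,) desired outputs (e.g. 1 when the neuron should fire).
    """
    rng = np.random.default_rng(seed)
    w = rng.uniform(-0.1, 0.1, size=samples.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(samples, targets):
            y = 1.0 / (1.0 + np.exp(-(w @ x + b)))  # sigmoid activation
            delta = (t - y) * y * (1.0 - y)          # error * sigmoid'
            w += lr * delta * x                      # one-level weight update
            b += lr * delta
    return w, b
```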
Training the Recurrent Thread
RTRL: Real-Time Recurrent Learning.
The architecture has to be transformed into its canonical form so that the algorithm can be applied.
Training the Recurrent Thread
RTRL: Real-Time Recurrent Learning.
[Diagram: the recurrent thread redrawn in canonical (state-space) form.]
Training the Recurrent Thread
RTRL: Real-Time Recurrent Learning.
1. Set the synaptic weights of the algorithm to small values selected from a uniform distribution.
2. Set the initial value of the state vector: $x(0) = 0$.
3. Set $\Lambda_j(0) = 0$ for $j = 1, 2, \ldots, \dim(\text{state space})$, where $\Lambda_j(n) = \partial x(n) / \partial w_j$ is the matrix of partial derivatives of the state with respect to the $j$-th weight vector.
Computation: for $n = 0, 1, 2, \ldots$ compute
$$\Lambda_j(n+1) = \Phi(n)\left[\,W_a(n)\,\Lambda_j(n) - W'_a(n)\,\Lambda_j(n-1) + U_j(n)\,\right]$$
$$e(n) = d(n) - y(n) = d(n) - C\,x(n)$$
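Not in the original: a minimal NumPy sketch of standard (Williams–Zipser) RTRL for a small fully recurrent network with tanh units and a fixed linear readout $C$. It follows the initialization steps above but uses the textbook sensitivity recursion; the slide's extra backward-weight term $W'_a(n)\Lambda_j(n-1)$ is omitted here, and all names and parameter values are illustrative.

```python
import numpy as np

def rtrl_train(inputs, targets, n_state=4, lr=0.05, seed=0):
    """Real-Time Recurrent Learning for x(n+1) = tanh(W z(n)), y = C x.

    inputs:  (T, n_in) input sequence u(n).
    targets: (T, n_out) desired output sequence d(n).
    """
    rng = np.random.default_rng(seed)
    T, n_in = inputs.shape
    n_out = targets.shape[1]
    n_z = n_state + n_in + 1                  # state + input + bias

    # 1. Small synaptic weights drawn from a uniform distribution.
    W = rng.uniform(-0.1, 0.1, size=(n_state, n_z))
    C = np.eye(n_out, n_state)                # fixed linear readout

    x = np.zeros(n_state)                     # 2. x(0) = 0
    P = np.zeros((n_state, n_state, n_z))     # 3. sensitivities, all zero:
                                              #    P[k, i, j] = dx_k / dW_ij
    for n in range(T):
        z = np.concatenate([x, inputs[n], [1.0]])
        x_new = np.tanh(W @ z)
        phi = 1.0 - x_new ** 2                # diagonal of Phi(n): tanh'

        # Sensitivity recursion:
        # P_kij(n+1) = phi_k * [ sum_l W_kl P_lij(n) + delta_ki z_j(n) ]
        P_new = np.einsum('kl,lij->kij', W[:, :n_state], P)
        P_new[np.arange(n_state), np.arange(n_state), :] += z
        P_new *= phi[:, None, None]

        # Error e(n) = d(n) - C x(n) and gradient-descent weight update.
        e = targets[n] - C @ x_new
        W += lr * np.einsum('o,ok,kij->ij', e, C, P_new)

        x, P = x_new, P_new
    return W, C
```

The per-step cost of carrying the full sensitivity tensor is what makes RTRL expensive; that is the price of the "real-time" (purely forward) gradient the slides rely on.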
Why Nielsen or Recurrent?
Nielsen:
– Easy training.
– Intuitive dynamics.
– Too many neurons for each thread.
Recurrent:
– A small thread provides the functionality.
– Complicated training.
– Complicated dynamics.
Which way? The recurrent architecture can be viewed as a generalization of the Nielsen thread.
Next Steps...
– Learning interaction with HMMs.
– A suitable recognition figure of merit for continuous speech recognition.
– Control of convergence.
– Improving the training set.