1
Associative Learning Memories - SOLAR_A
Matlab code presentation
2
Introduction: Associating SOLAR (SOLAR_A)
SOLAR_A structures are hierarchically organized and have the ability to classify patterns in a network of sparsely connected neurons.
3
Association
Training: Neurons learn associations between a pattern and its code. Once training is completed, the network is capable of making the necessary associations.
Testing: When the network is presented with the pattern only, it drives the associated inputs to the code values that represent the observed pattern.
4
Signal definition
The internal signals in the network range from 0 to 1. A signal is a determinate low if its value is 0 and a determinate high if its value is 1. Values between 0 and 0.5 are weak low, values between 0.5 and 1 are weak high, and 0.5 itself is "inactive" or "high impedance".
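A minimal MATLAB sketch of this level map (the function name signal_level is ours, not part of the SOLAR_A sources):

    % Illustrative helper (name is ours): map a signal value in [0,1]
    % to the levels defined above.
    function level = signal_level(s)
        if s == 0
            level = 'determinate low';
        elseif s == 1
            level = 'determinate high';
        elseif s == 0.5
            level = 'inactive / high impedance';
        elseif s < 0.5
            level = 'weak low';
        else
            level = 'weak high';
        end
    end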
5
Neurons' definitions
If a neuron is able to observe any statistical correlation among its inputs, it functions as an associative neuron. Otherwise it is a transmitting neuron.
6
Associative neuron
A neuron is called an associative neuron when its inputs I1 and I2 are associated. Inputs I1 and I2 are associated if and only if I2 can be implied from I1 and, simultaneously, I1 can be implied from I2.
7
Associative neuron
A low I1 is associated with a low I2, and a high I1 with a high I2, where I1 and I2 are the inputs the neuron received in training. I1 and I2 are most likely to be simultaneously low or high, although there is some noise. This can be verified by estimating P(I2 | I1) and P(I1 | I2), and implying the value of I2 from I1 and of I1 from I2.
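A hedged MATLAB sketch of this check (the sample data, the threshold, and all variable names are illustrative assumptions, not taken from the SOLAR_A code):

    % Estimate both conditional probabilities from binary training
    % observations of I1 and I2, then test the association both ways.
    I1 = [1 1 1 0 0 0 1 0];          % observed values of input I1
    I2 = [1 1 0 0 0 0 1 0];          % follows I1 except one noisy sample
    p21 = mean(I2(I1 == 1));         % estimate of P(I2 high | I1 high)
    p12 = mean(I1(I2 == 1));         % estimate of P(I1 high | I2 high)
    threshold = 0.7;                 % assumed association threshold
    is_associated = (p21 >= threshold) && (p12 >= threshold);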
8
Network Structure
Hierarchical structure: in the horizontal direction, neurons in one layer connect only to neurons in the previous layer.
9
Network Structure
Connections in the vertical direction follow a mixture distribution: 80% Gaussian with standard deviation 2, plus 20% uniform.
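A minimal MATLAB sketch of sampling such vertical connections (all variable names and the clipping to valid rows are our assumptions):

    % 80% of source rows come from a Gaussian (std 2) around the
    % connecting neuron's own row, 20% from a uniform distribution.
    rows = 168;                    % number of rows (input pattern length)
    r0   = 84;                     % row of the neuron making connections
    n    = 1000;                   % connections to sample
    gaussian = rand(n, 1) < 0.8;   % choose mixture component per connection
    src = zeros(n, 1);
    src(gaussian)  = round(r0 + 2 * randn(nnz(gaussian), 1));  % std 2
    src(~gaussian) = randi(rows, nnz(~gaussian), 1);           % uniform
    src = min(max(src, 1), rows);  % clip to the valid row range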
10
Network Structure The network uses feedback signals to pass information backwards to the associated inputs.
11
Testing
During testing, the missing parts of the data must be recovered from the existing data through association. For example, in a pattern recognition problem, the associated code inputs are unknown and are therefore set to 0.5.
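For example, assuming a 168-bit feature code and a 168-bit class code (a sketch; the names and placeholder data are ours):

    feature_bits = randi([0 1], 1, 168);          % placeholder coded features
    test_row = [feature_bits, 0.5*ones(1, 168)];  % unknown class code set to 0.5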
12
Neuron Feedback Scheme
13
Iris Plants Database The Iris database has:
3 classes (Iris Setosa, Iris Versicolour, and Iris Virginica)
4 numeric attributes (petal length, petal width, sepal length, sepal width)
150 instances, 50 for each class, where each class refers to a type of iris plant
The classification objective: identify the class ID based on the input feature (attribute) values
14
Coding of the database
The 4 features were scaled linearly and coded using a sliding bar code: input bits from (V-Min)+1 to (V-Min)+L are set high and the remaining bits low, where the total code length N satisfies N - L = Max - Min.
15
Coding of the database
We scaled the 4 features of the Iris database to the range 0-30 and set the bar length L to 12. The total code length for each feature is therefore 42 bits, and the feature input requires 4 × 42 = 168 bits.
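A sketch of the sliding bar code in MATLAB (the function name bar_code is ours, not from the SOLAR_A sources):

    % Sliding bar code sketch: bits (V-Min)+1 .. (V-Min)+L are high,
    % the rest low, with total length N = L + (Max - Min).
    function code = bar_code(V, L, Min, Max)
        N = L + (Max - Min);
        code = zeros(1, N);
        code((V - Min) + 1 : (V - Min) + L) = 1;
    end

With the Iris settings (Min = 0, Max = 30, L = 12) this gives N = 42, matching the 42-bit feature codes above.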
16
Coding of the database
In order to increase the probability that each feature is associated with the sample's class code, we merged the 4 features.
17
Coding of the database
18
Coding of the database
There are 3 classes in total. We use 3M bits to code the class ID, maximizing the Hamming distance between the codes: the grey part of each code (in the slide figure) is an M-bit string of 1s whose position depends on the class, and the white part is filled by a 2M-bit string of 0s.
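A sketch of this class code in MATLAB (the function name class_code is ours):

    % Class ID code sketch: class k of 3 gets an M-bit block of 1s,
    % with the remaining 2M bits set to 0; any two class codes then
    % differ in 2M positions (maximal Hamming distance for this scheme).
    function code = class_code(k, M)
        code = zeros(1, 3*M);
        code((k-1)*M + 1 : k*M) = 1;
    end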
19
Iris database simulation
(Figure: simulation input matrix, with rows for the class ID code and rows for the features.)
20
Iris database simulation
21
Glass identification database
22
Simulation of mixed features and class ID code
23
Simulation of mixed features and class ID code
Iris database
24
Image recovery
Examples of training patterns
Testing results and recovered images of the letters B and J
25
Coding example
Samples from the Iris database:
5.1, 3.5, 1.4, 0.2, Iris-setosa (class 1)
7.0, 3.2, 4.7, 1.4, Iris-versicolor (class 2)
6.3, 3.3, 6.0, 2.5, Iris-virginica (class 3)
26
Coding example
Coding: 5.1, 3.5, 1.4, 0.2, Iris-setosa (class 1)
Pre-preparing (features × 10): 51, 35, 14, 2, 1
Scaling the features (51, 35, 14, 2) to the range 0 to 30
After scaling: 7, 19, 2, 2, 1
27
Coding example
Feature value 7 → 000000011111111111100000000000000000000000 (7 leading 0 bits, a 12-bit string of 1s, then trailing 0 bits; 42 bits in total)
Class ID → a 56-bit string of 1s placed according to the class, with the remaining 112 bits set to 0 (M = 56, 2M = 112).
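Using the bar_code sketch from the coding slides above (with the same assumed Min = 0, Max = 30, L = 12), the feature part of this example can be reproduced:

    sprintf('%d', bar_code(7, 12, 0, 30))
    % ans = '000000011111111111100000000000000000000000'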
28
Coded data matrix (input)
Each input row concatenates the feature code and the class ID code. The input matrix consists of M rows of training data followed by N rows of testing data.
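A sketch of assembling this matrix (the row counts, placeholder data, and variable names are our assumptions; in the actual project the rows come from the coding routine):

    M_train = 120;  N_test = 30;              % assumed split of the 150 samples
    F = 168;  C = 168;                        % feature bits, class-code bits
    train_feat  = randi([0 1], M_train, F);   % placeholder coded features
    train_class = randi([0 1], M_train, C);   % placeholder class codes
    test_feat   = randi([0 1], N_test,  F);
    input_matrix = [train_feat, train_class;           % training rows
                    test_feat,  0.5*ones(N_test, C)];  % test rows: code = 0.5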
29
Matlab user interface
main.m – main function
training2.m – training function
testing2.m – testing function
catchassociating.m – actively associative neurons
generate_input – coding the database
30
Parameters
columns – depth (number of layers)
rows – length of an input pattern
stdr – standard deviation in the vertical direction
stdc – standard deviation in the horizontal direction
n_tests – number of tests
31
training.m
r_distribution(meanr,stdr,rows,columns,width) – defines the distribution in the vertical direction
normrnd(meanr,stdc,rows,columns) – defines the distribution in the horizontal direction