B.Macukow, Neural Networks, Lecture 4
McCulloch symbolism

The symbolism introduced by McCulloch, based on simplified Venn diagrams, is very useful in the analysis of logical networks. Two areas X1 and X2 correspond to a two-argument logic function. The symbol X1 means the input signal x1 = 1; its complement ~X1 means the signal x1 = 0. The same holds for X2.
We have four fragments, denoted: X1 X2, X1 ~X2, ~X1 X2 and ~X1 ~X2.
Instead of circles we can use crosses.
Symbolic notation: a cross with dots. For example, for X1 ∨ X2 the diagram marks three regions:

X1 ∨ X2 = (X1 ~X2) ∨ (~X1 X2) ∨ (X1 X2)

Here juxtaposition denotes conjunction (the AND operation) and ∨ denotes disjunction (the OR operation).
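The decomposition of the disjunction into the three marked regions of the diagram can be checked exhaustively; a small Python sketch (the function name is ours, for illustration):

```python
from itertools import product

def or_via_minterms(x1, x2):
    # Disjunction of the three shaded regions of the Venn diagram:
    # (X1 and not X2) or (not X1 and X2) or (X1 and X2)
    return (x1 and not x2) or (not x1 and x2) or (x1 and x2)

# Exhaustive check over all four input combinations
for x1, x2 in product([False, True], repeat=2):
    assert or_via_minterms(x1, x2) == (x1 or x2)
```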
Symbolic notation, continued: ~X2 is negation (the NOT operation); X1 → X2 is implication.
The function depending on the parameters. (Diagrams of the symbols for x1, x2, x1 x2 and ~x1 x2.)
Operations performed on the symbols. (Diagram: worked examples combining x1, x2 and x3 with negation, conjunction and disjunction.)
The whole algebra can be expressed in the symbolism. Proof: (diagram, a step-by-step transformation of expressions in x1 and x2 using negation, conjunction and disjunction).
Analysis of simple nets composed of logical neurons (inputs x1, x2, x3, x4; output x_out).
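The logical neurons in such nets can be modelled as McCulloch-Pitts threshold units; a minimal Python sketch (the particular weights and thresholds are the standard textbook choices, shown here as an illustration):

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fires (returns 1) iff the weighted sum
    of its binary inputs reaches the threshold."""
    s = sum(w * x for w, x in zip(weights, inputs))
    return 1 if s >= threshold else 0

# AND: both inputs must be active (threshold 2)
AND = lambda x1, x2: mp_neuron([x1, x2], [1, 1], 2)
# OR: one active input suffices (threshold 1)
OR  = lambda x1, x2: mp_neuron([x1, x2], [1, 1], 1)
# NOT: a single inhibitory weight with threshold 0
NOT = lambda x: mp_neuron([x], [-1], 0)

assert [AND(a, b) for a, b in [(0,0),(0,1),(1,0),(1,1)]] == [0, 0, 0, 1]
assert [OR(a, b)  for a, b in [(0,0),(0,1),(1,0),(1,1)]] == [0, 1, 1, 1]
assert [NOT(x) for x in (0, 1)] == [1, 0]
```

Larger nets such as the one on this slide are compositions of such units.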
Simplified notation (inputs x1, x2, x3, x4; output x_out).
The middle cross denotes an operation performed on the two symbols on either side. For example, the operation below means: write down as the result the operation that is entered in neither the symbol on the left nor the symbol on the right. Proof: (diagram, expressions in x1, x2, x3 and x4).
(Continuation of the proof: symbolic transformations of expressions in x1, x2, x3 and x4.)
The net with loops.
The influence of the threshold on the neuron's reaction.
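The threshold alone decides which logic function a unit computes; a short sketch for a two-input unit with unit weights (the particular threshold values are illustrative):

```python
def fires(x1, x2, threshold):
    """Two-input unit with unit weights: fires iff x1 + x2 >= threshold."""
    return 1 if x1 + x2 >= threshold else 0

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]

# threshold 1 -> the unit computes OR
print([fires(a, b, 1) for a, b in inputs])  # [0, 1, 1, 1]
# threshold 2 -> the same unit computes AND
print([fires(a, b, 2) for a, b in inputs])  # [0, 0, 0, 1]
# threshold 3 -> the unit never fires
print([fires(a, b, 3) for a, b in inputs])  # [0, 0, 0, 0]
```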
Applications of McCulloch symbols; operations on the symbols.
Use of McCulloch symbols to denote the function of a neuron.
Network for modeling the conditioned reflex, realized by a single unconditioned reflex (UR) and a conditioned reflex (CR).
McCulloch symbolism for three outputs: Venn diagram and McCulloch symbols for three outputs. The unknowns are marked A, B and C.
McCulloch symbolism for four outputs: Venn diagram and McCulloch symbols for four outputs. The unknowns are marked A, B, C and D.
Logical functions of two unknowns
Function | Formula | 00 01 10 11
Constant 1 | 1 = (A B) ∨ (A ~B) ∨ (~A B) ∨ (~A ~B) | 1 1 1 1
NAND | ~(A B) = (A ~B) ∨ (~A B) ∨ (~A ~B) | 1 1 1 0
Implication A → B | (A B) ∨ (~A B) ∨ (~A ~B) | 1 1 0 1
Negation of A | ~A = (~A B) ∨ (~A ~B) | 1 1 0 0
Function | Formula | 00 01 10 11
Implication B → A | (A B) ∨ (A ~B) ∨ (~A ~B) | 1 0 1 1
Negation of B | ~B = (A ~B) ∨ (~A ~B) | 1 0 1 0
Equivalence A ↔ B | (A B) ∨ (~A ~B) | 1 0 0 1
NOR | ~(A ∨ B) = (~A ~B) | 1 0 0 0
Function | Formula | 00 01 10 11
Disjunction A ∨ B | (A B) ∨ (A ~B) ∨ (~A B) | 0 1 1 1
Non-equivalence ~(A ↔ B) | (A ~B) ∨ (~A B) | 0 1 1 0
B | (A B) ∨ (~A B) | 0 1 0 1
Negation of implication ~(B → A) | (~A B) | 0 1 0 0
Function | Formula | 00 01 10 11
A | (A B) ∨ (A ~B) | 0 0 1 1
Negation of implication ~(A → B) | (A ~B) | 0 0 1 0
Conjunction A B | (A B) | 0 0 0 1
Constant 0 | 0 | 0 0 0 0
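The tables above can be verified mechanically: each two-argument function is the disjunction of some subset of the four minterms, and the 16 subsets give exactly the 16 functions. A small Python sketch (the helper names are ours, for illustration):

```python
from itertools import combinations

inputs = [(0, 0), (0, 1), (1, 0), (1, 1)]   # truth-table columns 00, 01, 10, 11

def from_minterms(minterms):
    """Build a two-argument function as the disjunction of the given minterms."""
    return lambda a, b: int((a, b) in minterms)

def row(f):
    """The truth-table row of f, in column order 00 01 10 11."""
    return ''.join(str(f(a, b)) for a, b in inputs)

# A few rows of the tables above, written as minterm sets
NAND = from_minterms({(0, 0), (0, 1), (1, 0)})
IMPL = from_minterms({(0, 0), (0, 1), (1, 1)})   # A -> B
XOR  = from_minterms({(0, 1), (1, 0)})           # non-equivalence

assert row(NAND) == '1110'
assert row(IMPL) == '1101'
assert row(XOR)  == '0110'

# The 16 subsets of the four minterms yield all 16 distinct functions
all_rows = {row(from_minterms(set(s)))
            for r in range(5) for s in combinations(inputs, r)}
assert len(all_rows) == 16
```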
Neural Networks
At the beginning there was the idea that it is enough to build a net of many randomly connected elements to obtain a model of brain operation. Question: how many elements are necessary for the process of self-organization?
Research of McCulloch, Lettvin, Maturana, Hartline and Ratliff: research on the frog's eye and especially on the compound eye of the horseshoe crab, Limulus. Hubel and Wiesel's research on the mammalian visual system: some parts are constructed in a very special, regular way.
Two-layer chain structure.
The input layer consists of photoreceptors; the layer of processing elements locates possible changes in the excitation distribution. Connection rule: each receptor cell excites one element (the one exactly below it). In addition to the excitatory connections there are also inhibitory connections (for simplicity, to the adjacent cells only), which reduce the signal to the neighbours.
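The connection rule can be written as a weight matrix; a minimal sketch, assuming illustrative weights of +1 for the excitatory link straight down and -0.5 for each inhibitory link (the lecture does not fix the values):

```python
import numpy as np

n = 10
W = np.eye(n)                      # each receptor excites the element below it
for i in range(n):
    if i > 0:
        W[i, i - 1] = -0.5         # inhibitory connection to the left neighbour
    if i < n - 1:
        W[i, i + 1] = -0.5         # and to the right neighbour

uniform = np.ones(n)
out = W @ uniform
print(out)  # interior elements: +1 excitation, -0.5 - 0.5 inhibition = 0
```

With these weights a uniform input is exactly balanced away in the interior; only the boundary elements (which have a single neighbour) respond.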
The inhibition range can differ. This is known as lateral inhibition.
As can easily be seen, a uniform excitation of the first layer will not excite the second layer: the excitatory and inhibitory signals are balanced. A step signal is a step change in the spatial distribution. The distribution of the output signal is not a copy of the input distribution; it is the convolution of the input signal with the weighting function.
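The convolution view can be illustrated directly; a sketch assuming an illustrative weighting function of +1 at the centre and -0.5 at each neighbour:

```python
import numpy as np

# Step change in the spatial distribution of excitation
step = np.array([0.0] * 5 + [1.0] * 5)

# Illustrative lateral-inhibition weighting function
kernel = np.array([-0.5, 1.0, -0.5])

out = np.convolve(step, kernel, mode='same')
# Uniform stretches map to 0; the step itself is marked by the pair -0.5, +0.5
# (the nonzero final entry is only a boundary artifact of the finite array)
print(out)
```

The output is flat wherever the input is uniform and carries a signed pulse exactly at the step, which is the edge-enhancement effect described above.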
The point at which the step change occurs is exaggerated on each side, by increasing the signal on one side of the step and decreasing it on the other, resulting in a distinctive signal at the point of the step.
Elements' transfer function. (Figure: input signals and the corresponding output signals.)
As you can see, such a network makes it possible to locate the points where the changes in excitation were sufficiently large (terminations, inflections, bends, etc.). From neurophysiology we know of the existence of the opposite operation, lateral excitation. Such nets allow the detection of crossings, branchings, etc.
The lateral inhibition rule can be realized by a one-dimensional net with negative feedback. Attention: the elements are nonlinear and the feedback loops make the analysis difficult; such networks can be unstable, and the distribution of the output signals does not depend univocally on the input signals.
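A minimal sketch of such a recurrent net (the inhibition strength k, the stimulus, and the ReLU-style nonlinearity are illustrative assumptions, not taken from the lecture; for k large enough the iteration would indeed fail to settle, illustrating the stability warning above):

```python
import numpy as np

def recurrent_lateral_inhibition(stimulus, k=0.4, steps=50):
    """Iterate x <- max(0, stimulus - k * inhibition from neighbours).
    For this k the map is a contraction, so the iteration converges;
    with stronger feedback convergence is not guaranteed."""
    x = stimulus.copy()
    n = len(x)
    for _ in range(steps):
        inhib = np.zeros(n)
        inhib[1:] += x[:-1]       # inhibition from the left neighbour
        inhib[:-1] += x[1:]       # inhibition from the right neighbour
        x = np.maximum(0.0, stimulus - k * inhib)
    return x

stim = np.array([0.0, 0.0, 1.0, 1.0, 1.0, 0.0, 0.0])
result = recurrent_lateral_inhibition(stim)
print(result)  # edges of the bump end up stronger than its centre
```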
Other simple neural nets