Slide 1: Brain-like design of sensory-motor programs for robots. G. Palm, Uni-Ulm
Slide 2: The cerebral cortex is a huge associative memory, or rather a large network of associatively connected topographical areas. Associations between patterns are formed by Hebbian learning. Even simple tasks require the interaction of many cortical areas.
Slide 3: Modelling Cortical Areas with Associative Memories. Andreas Knoblauch and Günther Palm, Department of Neural Information Processing, University of Ulm, Germany
Slide 4: Overview
- Introduction
- Neural associative memory: Willshaw model; spiking associative memory (SAM)
- Modeling cortical areas for the MirrorBot project: cortical areas for the minimal scenario “Bot show plum!”; implementation of the language areas using SAM
- Summary and discussion
Slide 5: Associative Memory (AM)
(1) Learning patterns: the AM stores patterns P_1, P_2, ..., P_M.
(2) Retrieving patterns: addressing with one or more noisy patterns,
    P_X = P_i1 + P_i2 + ... + P_im + noise,
the AM returns (P_i1, P_i2, ..., P_im).
Slide 6: Neural Associative Memory (NAM)
Binary Willshaw model (Willshaw 1969, Palm 1980, Hopfield 1982):
- Sparse coding: pattern P in {0,1}^n with k active units, k = O(log n), k << n
- O(n^2 / log^2 n) patterns can be stored
- Memory capacity ln 2 ≈ 0.7 bit/synapse
Extensions:
- Iterative retrieval (Schwenker/Sommer/Palm 1996, 1999)
- Spiking associative memory (Wennekers/Palm 1997), mainly for biological modelling
Slides 7-8: Binary Willshaw NAM: Learning
Patterns P^(k) in {0,1}^n, k = 1, ..., M, are stored in the binary memory matrix
A_ij = min(1, sum_k P_i^(k) P_j^(k)).
Example with n = 8:
P^(1) = 1 1 1 1 0 0 0 0
P^(2) = 0 0 1 1 1 1 0 0
Resulting memory matrix A (rows i, columns j):
1 1 1 1 0 0 0 0
1 1 1 1 0 0 0 0
1 1 1 1 1 1 0 0
1 1 1 1 1 1 0 0
0 0 1 1 1 1 0 0
0 0 1 1 1 1 0 0
0 0 0 0 0 0 0 0
0 0 0 0 0 0 0 0
Slides 9-11: Binary Willshaw NAM: Retrieving
Learned patterns:
P^(1) = 1 1 1 1 0 0 0 0
P^(2) = 0 0 1 1 1 1 0 0
Address pattern:
P_X   = 0 1 1 0 0 0 0 0
Neuron potentials: x = A P_X = (2, 2, 2, 2, 1, 1, 0, 0)
Retrieval result with threshold θ = 2: P_R = 1 1 1 1 0 0 0 0 = P^(1)
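The learning and retrieval steps on these slides can be reproduced in a few lines. This is a minimal sketch of the binary Willshaw model, not the authors' implementation; the patterns, the address, and the threshold θ = 2 are taken from the example above.

```python
# Binary Willshaw NAM sketch: clipped Hebbian learning and one-step
# threshold retrieval, using the example patterns from the slides.

def learn(patterns, n):
    """Memory matrix A_ij = min(1, sum_k P_i^(k) * P_j^(k))."""
    A = [[0] * n for _ in range(n)]
    for p in patterns:
        active = [i for i in range(n) if p[i]]
        for i in active:
            for j in active:
                A[i][j] = 1  # clipped (binary) Hebbian update
    return A

def retrieve(A, x, theta):
    """Neuron i fires if its potential (A x)_i reaches the threshold theta."""
    pot = [sum(a * b for a, b in zip(row, x)) for row in A]
    return [1 if p >= theta else 0 for p in pot]

P1 = [1, 1, 1, 1, 0, 0, 0, 0]
P2 = [0, 0, 1, 1, 1, 1, 0, 0]
A = learn([P1, P2], 8)

PX = [0, 1, 1, 0, 0, 0, 0, 0]  # noisy address: a fragment of P1
pot = [sum(a * b for a, b in zip(row, PX)) for row in A]
# pot == [2, 2, 2, 2, 1, 1, 0, 0]; thresholding at theta = 2 recovers P1
PR = retrieve(A, PX, 2)        # -> [1, 1, 1, 1, 0, 0, 0, 0]
```

Thresholding at the number of ones in the address (here 2) is the usual Willshaw retrieval rule: a stored pattern unit is connected to every address unit, so only pattern members reach the maximal potential.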
Slides 12-15: NAM and Problems with Superpositions
- Learning as before
- Classical retrieval: addressing with half a pattern (k/2 ones) plus noise (f)
- Superposition: addressing with two half patterns plus noise
Possible solutions:
- Spiking neuron models
- Iterative retrieval
- A combination of both?
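The superposition problem can be made concrete with a small experiment: addressing one Willshaw matrix with halves of two stored patterns at once yields their union, so one-step thresholding cannot separate the two patterns. The patterns below are illustrative (chosen non-overlapping for clarity), not taken from the slides.

```python
# Sketch of the superposition problem in a binary Willshaw memory:
# two stored patterns, addressed with half of each simultaneously.

P1 = [1, 1, 1, 1, 0, 0, 0, 0]
P2 = [0, 0, 0, 0, 1, 1, 1, 1]  # non-overlapping, for clarity

n = len(P1)
# Clipped Hebbian memory matrix for both patterns.
A = [[min(1, P1[i] * P1[j] + P2[i] * P2[j]) for j in range(n)] for i in range(n)]

# Address with half of P1 and half of P2 at once (k/2 = 2 ones each).
x = [1, 1, 0, 0, 1, 1, 0, 0]
pot = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]
result = [1 if p >= 2 else 0 for p in pot]
# result is the union of P1 and P2: both patterns pop out together,
# and the memory cannot tell which units belong to which pattern.
```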
Slide 16: Working Principle of Spiking Associative Memory (e.g. Wennekers/Palm 1997)
Interpretation of the classical potentials x as temporal dynamics dx/dt:
- The most excited neurons fire first
- Feedback breaks the symmetry: one pattern pops out, the others are suppressed
Problem: how to achieve threshold control?
Slide 17: Counter Model of Spiking Associative Memory (Knoblauch/Palm 2001)
States (“counters”) of neuron i:
- C^H_i(t): number of spikes received heteroassociatively until time t
- C^A_i(t): number of spikes received autoassociatively until time t
- C(t): number of all spikes until time t
Instantaneous Willshaw retrieval strategy at time t: neuron i is probably part of the pattern to be retrieved if C^A_i(t) ≈ C(t).
Simple linear example: dx_i/dt = a C^H_i + b (C^A_i - C), with b >> a > 0.
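A toy simulation can illustrate the counter idea; this is a simplified sketch, not the authors' model. Neurons fire one at a time in order of their heteroassociative input, each spike feeds back through the autoassociative matrix A, and a neuron stays a candidate only while it has received every spike emitted so far (C^A_i(t) = C(t)). The patterns and the biased address below are illustrative.

```python
# Toy event-based sketch of counter-model retrieval: with an address biased
# toward P1, the feedback consistency test C_A(i) == C suppresses P2.

P1 = [1, 1, 1, 1, 0, 0, 0, 0]
P2 = [0, 0, 1, 1, 1, 1, 0, 0]
n = len(P1)
A = [[min(1, P1[i] * P1[j] + P2[i] * P2[j]) for j in range(n)] for i in range(n)]

x = [1, 1, 0, 1, 0, 1, 0, 0]  # superposed address: 3 units of P1, 1 of P2
hetero = [sum(A[i][j] * x[j] for j in range(n)) for i in range(n)]  # C_H(i)

fired = [0] * n
auto = [0] * n                 # C_A(i): spikes received autoassociatively
C = 0                          # C: all spikes emitted so far
for _ in range(n):
    # Candidates: not yet fired and consistent with every spike so far.
    cand = [i for i in range(n) if not fired[i] and auto[i] == C]
    if not cand:
        break
    i = max(cand, key=lambda i: hetero[i])  # most excited neuron fires first
    if hetero[i] == 0:
        break
    fired[i] = 1
    C += 1
    for j in range(n):
        auto[j] += A[j][i]     # spike feeds back through A
# fired now equals P1: the dominant pattern popped out, P2 was suppressed.
```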
Slide 18: Overview
- A minimal cortical model for a very simple scenario, “Bot show plum!”
- Information flow and binding in the model: hearing and understanding “Bot show plum!”; reacting: seeking the plum and pointing to the plum
Slides 19-29: Minimal model “Bot show plum!”: Overview (built up area by area)
- 3 auditory sensory areas
- 2 grammar areas
- 3 visual sensory areas
- 1 somatic sensory area
- 1 visual attention area
- 4 goal areas
- 3 motor areas
Together 17 cortical areas (including 2 sequence areas A4/G1), plus evaluation fields and activation fields.
Slides 30-31: Minimal model “Bot show plum!”: Integration
- Input from robot sensors, robot software, and the simulation environment
- Output to robot actuators, robot software, and the simulation environment
Slide 32: Minimal model “Bot show plum!”: Connectivity
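Connectivity between two model areas can be sketched as a heteroassociative Willshaw matrix that projects a pattern in one area onto the associated pattern in another, e.g. from an auditory word area to a visual object area. The area sizes, patterns, and the word/object pairing below are invented for illustration; they are not taken from the MirrorBot model.

```python
# Heteroassociative Willshaw projection between two model "areas".

def hetero_learn(pairs, n_in, n_out):
    """A_ij = min(1, sum_k Q_i^(k) * P_j^(k)) for pattern pairs (P, Q)."""
    A = [[0] * n_in for _ in range(n_out)]
    for p, q in pairs:
        for i in range(n_out):
            if q[i]:
                for j in range(n_in):
                    if p[j]:
                        A[i][j] = 1
    return A

def project(A, x, theta):
    """Propagate a pattern from the input area into the output area."""
    return [1 if sum(a * b for a, b in zip(row, x)) >= theta else 0 for row in A]

# Hypothetical example: 'plum' in a 6-unit auditory area maps to a
# 6-unit visual-object pattern.
word_plum = [1, 0, 1, 0, 0, 0]
obj_plum  = [0, 1, 0, 0, 1, 0]
A = hetero_learn([(word_plum, obj_plum)], 6, 6)
# project(A, word_plum, 2) reproduces obj_plum in the target area.
```

Chaining such projections (auditory area to grammar area to goal area to motor area) is one way to realize the information flow shown in the connectivity diagram.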
Slide 33: Bot listens to “Bot show plum!”
Slides 34-43: “Bot show plum!”: Processing of ‘bot’, animation steps (1)-(10)
Slides 44-54: “Bot show plum!”: Processing of ‘show’, animation steps (1)-(12)
Slides 55-65: “Bot show plum!”: Processing of ‘plum’, animation steps (1)-(11)
Slide 66: After listening to “Bot show plum!”, Bot finally knows what to do (G1/G3)
Slide 67: Bot’s reaction to “Bot show plum!”
Slides 68-78: “Bot show plum!”: Seek plum: activate motor areas, animation steps (1)-(9); the motor areas are activated.
Slides 79-83: “Bot show plum!”: Seek plum: activate visual attention, animation steps (1)-(4); attention is active.
Slides 84-93: “Bot show plum!”: Seek plum: check whether the plum is visible (steps 1-3) and wait until it is; the plum becomes visible, animation steps (1)-(6).
Slide 94: “Bot show plum!”: The plum is found; now point to the plum.
Slides 95-105: “Bot show plum!”: Point to plum: activate motor areas, animation steps (1)-(9); the motor areas are activated.
Slides 106-111: “Bot show plum!”: Point to plum: activate hand position control, animation steps (1)-(5); hand position control is active.
Slides 112-116: “Bot show plum!”: Point to plum: the hand moves to the correct position and reaches it.
Slide 117: “Bot show plum!”: Bot has completed the task!
Slide 118: Summary and Discussion
Summary:
- We have proposed a minimal model for “Bot show plum!”
- It is in principle implementable using biological neurons and associative memories
Discussion:
- Is it biologically realistic?
- Model extensions?
- More complex scenarios?
- Learning?
- Mirror system?