Brain-like design of sensory-motor programs for robots G. Palm, Uni-Ulm.


The cerebral cortex is a huge associative memory, or rather a large network of associatively connected topographical areas. Associations between patterns are formed by Hebbian learning. Even simple tasks require the interaction of many cortical areas.
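
As a point of reference, the Hebbian rule behind these associations, and the clipped binary form of it used in the Willshaw model presented below, can be written as follows (the notation x_i, y_j for pre- and postsynaptic activity is mine, not from the slides):

```latex
\Delta w_{ij} \propto x_i \, y_j
\qquad\Longrightarrow\qquad
w_{ij} = \min\Bigl(1,\ \sum_{k} x_i^{(k)}\, y_j^{(k)}\Bigr)
\quad\text{(clipped binary Hebb)}
```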

Modelling Cortical Areas with Associative Memories
Andreas Knoblauch & Günther Palm, Department of Neural Information Processing, University of Ulm, Germany

Overview
- Introduction
- Neural associative memory: the Willshaw model
- Spiking associative memory (SAM)
- Modeling cortical areas for the MirrorBot project
- Cortical areas for the minimal scenario “Bot show plum!”
- Implementation of the language areas using SAM
- Summary and discussion

Associative Memory (AM)
(1) Learning patterns: P_1, P_2, ..., P_M are stored in the AM.
(2) Retrieving patterns: the AM is addressed with one or more noisy patterns, P^X = P_i1 + P_i2 + ... + P_im + noise, and returns (P_i1, P_i2, ..., P_im).

Neural Associative Memory (NAM): the binary Willshaw model (Willshaw 1969, Palm 1980, Hopfield 1982)
- Sparse coding: patterns P ∈ {0,1}^n with k = O(log n) active units among n neurons
- O(n²/log²n) patterns can be stored; memory capacity ln 2 ≈ 0.7 bit/synapse
- Extensions: iterative retrieval (Schwenker/Sommer/Palm 1996, 1999); spiking associative memory (Wennekers/Palm 1997), mainly for biological modelling
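
To give the capacity statement concrete scale, a rough evaluation at an assumed network size of n = 10^6 (the number is illustrative, not from the slides):

```latex
k = \log_2 n \approx 20, \qquad
M \approx \ln 2 \cdot \frac{n^2}{k^2}
  \approx 0.69 \cdot \frac{10^{12}}{400}
  \approx 1.7 \times 10^{9}\ \text{patterns, at } \ln 2 \approx 0.69\ \text{bit/synapse.}
```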

Binary Willshaw NAM: Learning
Patterns P^(k) ∈ {0,1}^n, k = 1, ..., M, e.g. P^(1) and P^(2)
Memory matrix: A_ij = min(1, Σ_k P_i^(k) · P_j^(k))
(figures: the memory matrix after storing P^(1), then after adding P^(2))

Binary Willshaw NAM: Retrieving
Learned patterns P^(1), P^(2); address pattern P^X (a partial or noisy version of a stored pattern)
Neuron potentials: x = A · P^X
Retrieval result: P^R = [x ≥ Θ] (in the example, threshold Θ = 2)
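
The learning and retrieval steps above map almost line for line onto a few lines of NumPy. The following minimal sketch (the pattern sizes and the random test setup are illustrative assumptions) stores two sparse patterns and retrieves one of them from a half address:

```python
import numpy as np

rng = np.random.default_rng(42)
n, k = 64, 4                       # n neurons, k active units per pattern

# Two random k-sparse binary patterns P(1), P(2)
P = np.zeros((2, n), dtype=np.uint8)
for m in range(2):
    P[m, rng.choice(n, size=k, replace=False)] = 1

# Learning: A_ij = min(1, sum_k P_i^(k) * P_j^(k))  (clipped Hebb)
A = np.clip(P.T @ P, 0, 1)

# Retrieval: address with half of P(1)
PX = P[0].copy()
PX[np.flatnonzero(PX)[k // 2:]] = 0    # keep only k/2 active units

x = A @ PX                             # neuron potentials x = A * P^X
theta = PX.sum()                       # threshold = number of address units
PR = (x >= theta).astype(np.uint8)     # retrieval result P^R

print("address units:", np.flatnonzero(PX))
print("retrieved    :", np.flatnonzero(PR))
print("stored P(1)  :", np.flatnonzero(P[0]))
```

Setting the threshold Θ to the number of active address units means a neuron survives exactly when it is connected to every address unit, which is the Willshaw retrieval rule shown above.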

NAM and Problems with Superpositions: Learning

NAM and Problems with Superpositions: Retrieving
- Classical: addressing with 1/2 pattern (k/2 active units) + noise (f)
- Superposition: addressing with 2 × 1/2 patterns + noise

NAM and Problems with Superpositions: Solutions?
- Spiking neuron models
- Iterative retrieval?
- A combination of both?
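
Of the candidate solutions, iterative retrieval is the simplest to sketch: the thresholded result is fed back as the next address until the state stabilizes. The toy example below (setup values assumed; the maximum-potential threshold is one common strategy, not prescribed by the slides) also shows the limitation that motivates spiking models: addressed with a superposition, the iteration converges to the superposition of both patterns rather than selecting one.

```python
import numpy as np

rng = np.random.default_rng(7)
n, k = 64, 4

# Two stored patterns, learned as in the previous sketch.
P = np.zeros((2, n), dtype=np.uint8)
for m in range(2):
    P[m, rng.choice(n, size=k, replace=False)] = 1
A = np.clip(P.T @ P, 0, 1)

# Superposition address: half of P(1) plus half of P(2).
PX = np.zeros(n, dtype=np.uint8)
PX[np.flatnonzero(P[0])[: k // 2]] = 1
PX[np.flatnonzero(P[1])[: k // 2]] = 1

for step in range(10):
    x = A @ PX
    PR = (x >= x.max()).astype(np.uint8)   # threshold at the maximum potential
    if np.array_equal(PR, PX):
        break                              # state has stabilized
    PX = PR

print("final:", np.flatnonzero(PR))
print("P(1) :", np.flatnonzero(P[0]))
print("P(2) :", np.flatnonzero(P[1]))
```

Under a symmetric threshold the superposition of both patterns is itself a stable state, so plain iteration cleans up noise but does not pick one pattern; that is precisely the symmetry which the spiking dynamics on the next slide break.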

Working Principle of Spiking Associative Memory
Interpreting the classical potentials x as dx/dt yields temporal dynamics:
- the most excited neurons fire first
- feedback breaks the symmetry
- one pattern pops out, the others are suppressed
Problem: how to achieve threshold control? (e.g. Wennekers/Palm ’97)

Counter Model of Spiking Associative Memory (Knoblauch/Palm 2001)
States (“counters”) of neuron i:
- C^H_i(t): number of spikes received heteroassociatively until time t
- C^A_i(t): number of spikes received autoassociatively until time t
- C^Σ(t): total number of spikes
Instantaneous Willshaw retrieval strategy at time t: neuron i is probably part of the pattern to be retrieved if C^A_i(t) ≈ C^Σ(t).
Simple linear example: dx_i/dt = a·C^H_i + b·(C^A_i − λ·C^Σ), with b >> a > 0 and λ ≈ 1.
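
A discrete-time sketch of the counter model follows. The parameter values, the one-spike-per-step scheduling, and the use of the autoassociative matrix to deliver the address as “hetero” input are all ad hoc assumptions for illustration, not the published model:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, M = 128, 6, 12

# Store M random k-sparse patterns in an autoassociative Willshaw matrix.
P = np.zeros((M, n), dtype=np.uint8)
for m in range(M):
    P[m, rng.choice(n, size=k, replace=False)] = 1
A = np.clip(P.T @ P, 0, 1)

# Hetero input: the half address of P(0), delivered through A
# as a stand-in for a real heteroassociative projection.
address = P[0].copy()
address[np.flatnonzero(address)[k // 2:]] = 0
C_H = (A @ address).astype(float)   # heteroassociative spike counters

a, b, lam = 1.0, 10.0, 1.0          # b >> a > 0, lam ~ 1 (ad hoc values)
x = np.zeros(n)                     # membrane potentials
C_A = np.zeros(n)                   # autoassociative spike counters
C_sum = 0.0                         # total number of spikes fired so far
fired = np.zeros(n, dtype=bool)

for t in range(k):                  # let one neuron fire per time step
    x = x + a * C_H + b * (C_A - lam * C_sum)   # dx/dt = a C_H + b (C_A - lam C_sum)
    x[fired] = -np.inf              # each neuron fires at most once here
    i = int(np.argmax(x))           # the most excited neuron fires first
    fired[i] = True
    C_sum += 1.0
    C_A += A[:, i]                  # neurons connected to i receive an auto spike

print("retrieved:", np.flatnonzero(fired))
print("stored   :", np.flatnonzero(P[0]))
```

Pattern neurons keep C_A in step with C_Σ, so the penalty term vanishes for them and their hetero input dominates; for all other neurons C_A falls behind C_Σ and the strongly weighted penalty suppresses them, which is the pop-out behaviour described above.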

Overview
- A minimal cortical model for a very simple scenario, “Bot show plum!”
- Information flow and binding in the model:
  – hearing and understanding “Bot show plum!”
  – reacting: seeking the plum, and pointing to the plum

Minimal model “Bot show plum!” - Overview
- 3 auditory sensory areas
- 2 grammar areas
- 3 visual sensory areas
- 1 somatic sensory area
- 1 visual attention area
- 4 goal areas
- 3 motor areas
Together 17 cortical areas (incl. 2 sequence areas A4/G1), plus evaluation fields and activation fields.

Minimal model “Bot show plum!” - Integration
Input from: robot sensors, robot software, simulation environment.
Output to: robot actuators, robot software, simulation environment.

Minimal model “Bot show plum!” - Connectivity
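
As a toy sketch of this connectivity: each area can be modelled as a Willshaw memory with an autoassociative matrix plus heteroassociative projections from other areas. The two-area example below (the area names A1 and G3 and all patterns are used illustratively; this is not the actual MirrorBot implementation) shows a word pattern in an auditory area retrieving the associated pattern in a goal area:

```python
import numpy as np

class Area:
    """One cortical area: an autoassociative Willshaw memory plus
    heteroassociative projections arriving from other areas."""
    def __init__(self, n):
        self.n = n
        self.auto = np.zeros((n, n), dtype=np.uint8)
        self.projections = []          # list of (source Area, hetero matrix)
        self.activity = np.zeros(n, dtype=np.uint8)

    def learn_auto(self, p):
        self.auto |= np.outer(p, p)    # clipped Hebb within the area

    def connect(self, src, pairs):
        """Hetero-learn (source pattern -> own pattern) associations."""
        H = np.zeros((self.n, src.n), dtype=np.uint8)
        for p_src, p_own in pairs:
            H |= np.outer(p_own, p_src)
        self.projections.append((src, H))

    def step(self):
        """One retrieval step: sum hetero and auto input, threshold at max."""
        x = sum(H @ src.activity for src, H in self.projections)
        x = x + self.auto @ self.activity
        if np.max(x) == 0:
            return
        self.activity = (x >= np.max(x)).astype(np.uint8)

rng = np.random.default_rng(1)
def sparse(n, k):
    v = np.zeros(n, dtype=np.uint8)
    v[rng.choice(n, size=k, replace=False)] = 1
    return v

# Two areas: A1 "hears" a word pattern, G3 retrieves the associated goal.
A1, G3 = Area(64), Area(64)
word_plum, goal_plum = sparse(64, 4), sparse(64, 4)
G3.learn_auto(goal_plum)
G3.connect(A1, [(word_plum, goal_plum)])

A1.activity = word_plum        # auditory input arrives in A1
G3.step()                      # goal area retrieves the associated pattern
print("goal correct:", np.array_equal(G3.activity, goal_plum))
```

In the full model, the 17 areas listed above would be wired up in the same fashion, with the activation and evaluation fields presumably gating when each area performs a retrieval step.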

Bot listens to “Bot show plum!”

“Bot show plum!”: Processing of ‘bot’ (animation steps 1-10)

“Bot show plum!”: Processing of ‘show’ (animation steps 1-12)

“Bot show plum!”: Processing of ‘plum’ (animation steps 1-11)

After listening to “Bot show plum!”: the Bot finally knows what to do (G1/G3)

Bot’s reaction to “Bot show plum!”

“Bot show plum!”: Seek plum - Activate motor areas (animation steps 1-9)

“Bot show plum!”: Seek plum - Motor areas are activated!

“Bot show plum!”: Seek plum - Activate visual attention (animation steps 1-4)

“Bot show plum!”: Seek plum - Attention is active!

“Bot show plum!”: Seek plum - Check if plum is visible (animation steps 1-3)

“Bot show plum!”: Seek plum - Wait until plum is visible!

“Bot show plum!”: Seek plum - plum is visible (animation steps 1-6)

“Bot show plum!”: Plum is found, now point to plum

“Bot show plum!”: point to plum - activate motor areas (animation steps 1-9)

“Bot show plum!”: point to plum - motor areas are activated!

“Bot show plum!”: point to plum - activate hand position control (animation steps 1-5)

“Bot show plum!”: point to plum - hand position control is active

“Bot show plum!”: point to plum - hand moves to correct position (animation steps 1-2)

“Bot show plum!”: point to plum - hand is in correct position (animation steps 1-2)

“Bot show plum!”: Bot has completed the task!

Summary:
- We have proposed a minimal model for “Bot show plum!”
- The model is in principle implementable using biological neurons and associative memories.

Discussion:
- Biologically realistic?
- Model extensions?
- More complex scenarios?
- Learning?
- Mirror system?