Controlling Mobile Robots with Distributed Neuro-Biological Systems
Sebastian Gutierrez-Nolasco (UCI), Nalini Venkatasubramanian (UCI), Alfredo Weitzenfeld (ITAM)
Contact: seguti@ics.uci.edu
Biologically Inspired Robotic Systems
Nature has always been a source of inspiration for the development of autonomous robotic systems.
Ethology: simulation based on observed animal behavior. Interaction with the environment is usually oversimplified, and these systems lack both a strong biological basis for their working assumptions and formal underpinnings for their simulation results.
Neuroethology: behavior related to its underlying neurobiological structure. Replicating brain models provides credible and general animal behavior, offers inspiration for further robotic architectures, and yields systems more complex and accurate than purely ethological ones. It also enables experimentation, which in turn requires real-time performance.
Neuroethological Robotic Systems
Super robots: incorporate extensive on-board processing capabilities, but are bulky and expensive.
Inexpensive robots: smaller, cheaper robots connected to a network of processing nodes.
Concerns: real-time performance, and an unpredictable communication environment that affects robot performance.
Challenges of Biologically Inspired Autonomous Robots
A neural model may take hours of processing time; simulating multiple neural networks requires a distributed processing environment. A typical retina model may consist of more than 100,000 neurons and 500,000 interconnections.
Biologically inspired robotics demands sophisticated image processing techniques, and the resulting tasks are communication intensive.
Autonomous robotic agents have real-time and processing restrictions as well as power-awareness requirements; battery usage is a major concern in mobile robots.
Developing Biologically Inspired Robot Architectures
Developing software for autonomous mobile robots is complex: it must handle highly heterogeneous methods for capturing and processing sensor information, multiple sensory input devices, and sensory input at multiple granularities.
Communication is error-prone due to unpredictable interference and failures (partial and complete), unreliability and disconnection, and varying available bandwidth.
Our Approach
Develop an embedded architecture capable of conducting neuroethological robotic experimentation:
Inexpensive small robots communicate wirelessly with distributed computational resources.
Neural models are distributed across multiple processing nodes.
Adaptive robotic middleware optimizes robot communication in response to varying network conditions.
Structure of the Talk
1. Neuroethological Modeling: study animal behavior and the corresponding neural structure as inspiration for robotic architectures.
2. Embedded Mobile Robots: develop distributed wireless robot architectures capable of efficient neural processing.
3. Adaptive Middleware: achieve real-time computation and adapt the embedded architecture to varying network conditions.
4. Internet-Based Robotics: enable remote robot task development and experimentation.
1. Neuroethological Modeling: study animal behavior and the corresponding neural structure as inspiration for robotic architectures.
Behavior: Praying Mantis - Chantlitaxia [Arkin, Ali, Weitzenfeld and Cervantes, 2000]
Behavior: Frog and Toad - Rana Computatrix [Arbib 1987, Cervantes 1990]
(Schema diagram: perceptual schemas (PS) for moving objects (prey, predator, mate) and for non-moving objects feed main schemas (S) such as Prey-Acq (prey acquisition), Pred-Av (predator avoidance), Mate-Pair, and Find-Loc, with excitatory (+) and inhibitory (-) interactions among them.)
Behavior: Toad Prey Acquisition [Cervantes 1985]
Stimuli: mobile visual stimulus in the lateral visual field (monocular perception); mobile visual stimulus in the binocular visual field (short distance); mechanical stimulus in mouth and pharynx receptors.
Responses: orientation, binocular fixation, attack, snap, clean.
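Purely as an illustration, the stimulus-to-response dispatch can be sketched as below. The pairing of stimuli and responses is an assumption based on the standard toad prey-catching sequence, and all names are hypothetical rather than taken from the model.

    // Hypothetical sketch: dispatching toad prey-acquisition responses from the
    // perceived stimulus. The stimulus/response pairing is an assumption based on
    // the classic prey-catching sequence; names are illustrative only.
    enum Stimulus { PREY_LATERAL_MONOCULAR, PREY_BINOCULAR_CLOSE, MOUTH_PHARYNX_CONTACT }
    enum Response { ORIENT, FIXATE, ATTACK, SNAP, CLEAN }

    class PreyAcquisition {
        // Map a stimulus to the motor response(s) it is assumed to trigger.
        static Response[] respond(Stimulus s) {
            switch (s) {
                case PREY_LATERAL_MONOCULAR: return new Response[] { Response.ORIENT };
                case PREY_BINOCULAR_CLOSE:   return new Response[] { Response.FIXATE, Response.ATTACK, Response.SNAP };
                case MOUTH_PHARYNX_CONTACT:  return new Response[] { Response.CLEAN };
                default:                     return new Response[0];
            }
        }
    }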
Behavior: Toad Prey Acquisition with Detour Behavior, Before and After Learning [Corbacho and Arbib 1995]
(Diagram panels: 10 cm barrier; 20 cm barrier before learning; 20 cm barrier after learning.)
Schema Computational Model
(Diagram: schemas at the schema level exchange data through "data in"/"data out" ports with other processes and with lower schema levels; at the neural level, a neural schema maps input ports din 1..n to output ports dout 1..m.)
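As an illustration of this port-based schema abstraction, a minimal sketch follows; the class and method names are hypothetical and deliberately simpler than the actual NSL/ASL interfaces.

    import java.util.HashMap;
    import java.util.Map;

    // Minimal sketch of a port-based schema: named data-in/data-out ports plus a
    // step() that recomputes outputs from inputs. Illustrative only; the real
    // NSL/ASL module interface differs.
    abstract class Schema {
        protected final Map<String, double[]> din = new HashMap<>();   // input ports (din 1..n)
        protected final Map<String, double[]> dout = new HashMap<>();  // output ports (dout 1..m)

        void write(String port, double[] data) { din.put(port, data); }
        double[] read(String port) { return dout.get(port); }

        // One simulation step: compute dout from din.
        abstract void step(double dt);
    }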
Neural-Based Behavior: Toad Prey Acquisition and Predator Avoidance
(Architecture diagram. Schema level: visual and tactile input, depth/stereo, a moving-stimulus selector, prey recognizer, predator recognizer, and static-object recognizer feeding prey approach, predator avoid, and static-object avoidance, which drive the motor heading map and the motor schemas forward, orient, sidestep, and backward. Neural level: retina cell classes R1-R2, R3, R4; tectum (T5_2); pretectum/thalamus (TH10); MaxSelector.)
Toad Prey Acquisition with Detour: Simulation Results
(Simulated trajectories with numbered steps 1-16 for the 10 cm barrier, the 20 cm barrier before learning, and the 20 cm barrier after learning.)
2. Embedded Mobile Robots: develop distributed wireless robot architectures capable of efficient neural processing.
Embedded Mobile Robots: Robot Hardware (LEGO and OOPic platforms)
Embedded Mobile Robots: Distributed Embedded Architecture
(Diagram: autonomous robots 1..N connect over wireless links and the Internet to an Internet server and to instances 1..N of the remote computational system.)
Embedded Mobile Robots: Distributed Embedded Architecture
Time-consuming processes are carried out in the (neural) computational system: neural processing and image processing.
Limited tasks are carried out on the robot hardware: sensory input, motor output, and a default behavior.
Communication and data transformation are managed by the adaptive middleware.
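To illustrate this division of labor, a minimal robot-side control loop might look like the sketch below; the Link/Sensors/Motors interfaces and the command encoding are hypothetical, not part of the original system.

    // Hypothetical robot-side loop: forward sensor data to the remote
    // computational system, apply returned motor commands, and fall back to a
    // default behavior when the wireless link is unavailable.
    interface Link { boolean isConnected(); void send(double[] frame); double[] receiveCommand(); }
    interface Sensors { double[] read(); }
    interface Motors { void apply(double[] command); }

    class RobotController {
        private static final double[] DEFAULT_BEHAVIOR = { 0.0, 0.0 }; // e.g. stop or slow forward
        private final Link link; private final Sensors sensors; private final Motors motors;

        RobotController(Link link, Sensors sensors, Motors motors) {
            this.link = link; this.sensors = sensors; this.motors = motors;
        }

        void runOnce() {
            double[] frame = sensors.read();             // sensory input stays on the robot
            if (link.isConnected()) {
                link.send(frame);                        // heavy neural/image processing is remote
                motors.apply(link.receiveCommand());     // motor output computed by the remote model
            } else {
                motors.apply(DEFAULT_BEHAVIOR);          // default behavior on disconnection
            }
        }
    }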
Embedded Mobile Robots: Distributed Embedded Architecture
(Hardware diagram. Robot: camera, tactile sensors, motor and servo with power stage, OOPic CPU, and transceiver. Remote computational system: PC with frame grabber and transceiver, connected to the robot over the wireless link.)
Embedded Mobile Robots: Distributed Embedded Architecture
(Software diagram. The remote computational system runs NSL (Neural Simulation Language) and ASL (Abstract Schema Language) models fed by a video server with video/image processing, a tactile server, and a motor server; the robot's camera, tactile sensors, and motors connect to these servers through the transceiver over the wireless link.)
Embedded Mobile Robots: Processing Cycle
Video capture → video processing → model simulation → model output → navigation control (d, r, c).
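A minimal sketch of the corresponding remote-side cycle follows, using hypothetical interfaces for each stage; since the encoding of (d, r, c) is not spelled out here, the navigation command is kept as an opaque triple.

    // Hypothetical remote-side processing cycle: grab a frame, run image
    // processing and the neural model, then send back a navigation command.
    class ProcessingCycle {
        interface Camera { double[][] capture(); }                      // video capture
        interface Vision { double[][] process(double[][] frame); }      // video/image processing
        interface NeuralModel { double[] simulate(double[][] input); }  // model simulation + output
        interface Robot { void navigate(double d, double r, double c); } // navigation control

        static void runOnce(Camera cam, Vision vision, NeuralModel model, Robot robot) {
            double[][] frame = cam.capture();
            double[][] features = vision.process(frame);
            double[] out = model.simulate(features);   // e.g. a motor heading map read-out
            robot.navigate(out[0], out[1], out[2]);    // (d, r, c)
        }
    }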
3. Adaptive Middleware: achieve real-time computation and adapt the embedded architecture to varying network conditions.
Distributed Systems Middleware
Enables the modular interconnection of distributed software and abstracts over the low-level mechanisms used to implement resource management services.
Concurrent object-oriented model: separation of concerns and reuse of services.
Customizable, composable middleware frameworks: provide dynamic network and system customization, dynamic invocation/revocation/installation of services, and concurrent execution of multiple resource management policies.
Core Resource Management Services
Core services are basic services where interactions between the application and the system can occur; they act as building blocks for other services and reduce interactions among many services to interactions between a few simple ones.
Core services are chosen from commonly observed patterns: recreation of data/services at a remote site, capturing an approximation of distributed state at multiple sites, and interactions with a global repository.
TLAM: The Two-Level Meta-Architecture
(Diagram: the system (meta) level provides core services such as distributed snapshot, remote creation, and directory services, on top of which replication, migration, distributed garbage collection (DGC), checkpointing, and access control are built; applications run at the base level.)
Adaptive Robotic Middleware (ARM)
ARM extends the TLAM to optimize the information flow between robots and the computational system:
Determine how, when, and what information should be modified in order to match fluctuations in the communication environment.
Compose communication protocols to obtain their combined benefits despite conflicting requirements; this demands explicit knowledge of how communication protocols compose and interact.
Adapt protocols and mechanisms to changing communication and power constraints.
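As one hedged illustration of deciding "how, when and what information should be modified", the middleware could trade video resolution and frame rate against measured bandwidth; the thresholds and names below are invented for illustration and are not taken from ARM.

    // Illustrative adaptation rule: degrade the video stream to match the
    // currently available bandwidth. Threshold values and names are invented.
    class VideoAdaptation {
        static final double FULL_RATE_KBPS = 512.0;
        static final double LOW_RATE_KBPS = 128.0;

        // Returns {scaleFactor, framesPerSecond} for the video sent to the servers.
        static double[] select(double measuredKbps) {
            if (measuredKbps >= FULL_RATE_KBPS) return new double[] { 1.0, 15.0 };  // full resolution
            if (measuredKbps >= LOW_RATE_KBPS)  return new double[] { 0.5, 10.0 };  // half resolution
            return new double[] { 0.25, 5.0 };                                      // aggressive degradation
        }
    }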
ARM: Distributed Embedded Architecture
(Diagram: as in the earlier architecture, the remote computational system runs NSL (Neural Simulation Language) and ASL (Abstract Schema Language) models with video, tactile, and motor servers, and the robot's camera, tactile sensors, and motors connect through the transceiver over the wireless link; ARM now sits between the robot and the servers, managing the communication.)
ARM: Components
Communication manager: provides and enforces application-level requirements. Its components are an oracle that determines the most suitable protocol implementation in terms of coverage and efficiency, a set of communication protocols, a protocol installer/uninstaller, and a resident ARM module running in the robot ("resident evil").
Adaptation manager: provides adaptation and monitoring mechanisms operating at different levels of abstraction. Reactive adaptation is triggered when a failure to achieve the intended communication goal is detected; proactive adaptation is triggered when a more efficient communication can be achieved under the current environment conditions.
Adaptation repository: determines the most suitable adaptation strategy to be applied.
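The interplay between these components could be sketched as follows; the interfaces and names are hypothetical and do not reproduce the ARM API.

    // Rough sketch of the ARM component interactions; hypothetical interfaces.
    interface Protocol { String name(); void install(); void uninstall(); }

    interface Oracle {
        // Pick the most suitable protocol for the current network and power conditions.
        Protocol choose(double bandwidthKbps, double lossRate, double batteryLevel);
    }

    interface AdaptationRepository {
        // Map an observed condition (e.g. "goal-missed", "better-option") to a strategy.
        Runnable strategyFor(String condition);
    }

    class AdaptationManager {
        private final Oracle oracle;
        private final AdaptationRepository repository;
        private Protocol current;

        AdaptationManager(Oracle oracle, AdaptationRepository repository, Protocol initial) {
            this.oracle = oracle; this.repository = repository; this.current = initial;
        }

        // Reactive path: a communication goal was missed, so apply a recovery strategy.
        void onGoalMissed() { repository.strategyFor("goal-missed").run(); }

        // Proactive path: periodically check whether a better protocol is available.
        void onMonitorTick(double bandwidthKbps, double lossRate, double batteryLevel) {
            Protocol best = oracle.choose(bandwidthKbps, lossRate, batteryLevel);
            if (!best.name().equals(current.name())) {
                current.uninstall();
                best.install();
                current = best;
            }
        }
    }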
ARM: Example
4. Internet-Based Robotics: enable remote robot task development and experimentation.
Internet-Based Robotics: Web Access
Experimental Results: Two Prey
Experimental Results: Two Prey and a Predator
Embedded Mobile Robots: Experimental Results: Prey Acquisition with 10 cm Barrier
(Image sequence, panels A-D.)
Embedded Mobile Robots: Experimental Results: Prey Acquisition with 20 cm Barrier
(Image sequence, panels A-H.)
Neural-Based Behavior: Prey Acquisition (10 cm barrier)
(Neural activity panels: visual fields, barrier (PreTectum), prey (Tectum), integrated and heading maps (MHM), tactile input.)
Neural-Based Behavior: Prey Acquisition (20 cm barrier, before bumping)
(Neural activity panels: visual fields, barrier (PreTectum), prey (Tectum), integrated and heading maps (MHM), tactile input.)
Neural-Based Behavior: Prey Acquisition (20 cm barrier, after bumping)
(Neural activity panels: visual fields, barrier (PreTectum), prey (Tectum), integrated and heading maps (MHM), tactile input.)
Neural-Based Behavior: Prey Acquisition (20 cm barrier, after learning)
(Neural activity panels: visual fields, barrier (PreTectum), prey (Tectum), integrated and heading maps (MHM), tactile input.)
Future Work
Complete the Internet-based system.
Develop middleware adaptation capabilities.
Build smaller robotic systems.
Extend to multiple-robot tasks.
Extend the vision system to "true" moving forms.
Extend the biological models.
Video
Bonus Section
Research Cycle
(Diagram: neuroscience experiments supply data and hypotheses to brain theory (modeling); formal models feed robotics; results from experiments with physical devices return new ideas and new hypotheses, and expose gaps in knowledge that drive further experiments.)
Neural Maps [Scalia and Fite 1974]
T: temporal, D: dorsal, N: nasal, V: ventral. O: optic tectum, B: nucleus of Bellonci, C: lateral geniculate nucleus, P: thalamic pretectal neuropil, X: basal optic root.
Neuron Model
A neuron with input s and output mf.
mp: membrane potential, dmp(t)/dt = f(s, mp, t)
mf: firing rate, mf(t) = σ(mp(t))
Leaky integrator: τ dm(t)/dt = -m(t) + s
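A minimal sketch of how such a leaky-integrator neuron can be stepped with explicit Euler integration; the threshold-step output and parameter handling loosely mirror the Ulayer code shown later in the deck, and the class itself is illustrative only.

    // Leaky-integrator neuron, integrated with an explicit Euler step.
    // tau * dm/dt = -m + s ; the firing rate here is a simple threshold step.
    class LeakyIntegratorNeuron {
        private double mp = 0.0;        // membrane potential
        private final double tau;       // time constant
        private final double threshold; // firing threshold

        LeakyIntegratorNeuron(double tau, double threshold) {
            this.tau = tau; this.threshold = threshold;
        }

        // Advance the membrane potential by dt given input s; return firing rate mf.
        double step(double s, double dt) {
            mp += (dt / tau) * (-mp + s);
            return mp > threshold ? 1.0 : 0.0; // step non-linearity (cf. nslStep)
        }
    }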
Retina-Thalamus-Tectum
(Circuit diagram: retinal inputs R2, R3, and R4 project through excitatory (+) and inhibitory (-) synapses onto the thalamic and tectal cell types. Legend: TP: thalamus-pretectum, GL: glomerulus, SN: stellate neurons, SP: small pear-shaped neurons, LP: large pear-shaped neurons, PY: pyramidal neurons.)
Max Selector [Didday 1976]
Max Selector Model
(Module diagram: a MaxSelectorStimulus feeds s_out into the MaxSelector's s_in and into the MaxSelectorOutput; the MaxSelector's out drives the output module's uf display. Internally the MaxSelector contains a Ulayer (uf) and a Vlayer (vf).)

    nslModel MaxSelectorModel() extends NslModel() {
        private MaxSelector maxselector(10);
        private MaxSelectorStimulus stimulus(10);
        private MaxSelectorOutput output();

        public void initSys() {
            system.setRunTime(10.0);
            system.setRunDelta(0.1);
        }

        public void makeConn() {
            nslConnect(stimulus.s_out, maxselector.s_in);
            nslConnect(stimulus.s_out, output.s_in);
            nslConnect(maxselector.out, output.uf);
        }
    }
Max Selector Module
(Module diagram: the MaxSelector's in port is relabeled to the Ulayer's s_in, Ulayer.uf connects to Vlayer.u_in, Vlayer.vf connects back to Ulayer.v_in, and Ulayer.uf is relabeled as the module's out port.)

    nslModule MaxSelector(int size) extends NslModule() {
        public Ulayer u1(size);
        public Vlayer v1(size);
        public NslDinDouble1 in(size);
        public NslDoutDouble1 out(size);

        public void makeConn() {
            nslRelabel(in, u1.s_in);
            nslConnect(v1.vf, u1.v_in);
            nslConnect(u1.uf, v1.u_in);
            nslRelabel(u1.uf, out);
        }
    }
Ulayer and Vlayer Modules

    nslModule Ulayer(int size) extends NslModule() {
        public NslDinDouble1 s_in(size);
        public NslDinDouble0 v_in();
        public NslDoutDouble1 uf(size);
        private NslDouble1 up(size);
        private NslDouble0 hu();
        private double tau;

        public void initRun() {
            up = 0; uf = 0;
            hu = 0.1; tau = 1.0;
        }

        public void simRun() {
            up = nslDiff(up, tau, -up + uf - v_in - hu + s_in);
            uf = nslStep(up, 0.1, 0.0, 1.0);
        }
    }

    nslModule Vlayer(int size) extends NslModule() {
        public NslDinDouble1 u_in(size);
        public NslDoutDouble0 vf();
        private NslDouble0 vp();
        private NslDouble0 hv();
        private double tau;

        public void initRun() {
            vp = 0; vf = 0;
            hv = 0.5; tau = 1.0;
        }

        public void simRun() {
            vp = nslDiff(vp, tau, -vp + nslSum(u_in) - hv);
            vf = nslRamp(vp);
        }
    }
Neuron (detailed)
(Anatomical diagram: a neuron with dendrites, soma, and axon; a synapse showing the spine, synaptic cleft, axon terminal, and receptors; and intracellular elements such as vesicles, the calcium mechanism, NMDA and AMPA receptors, pumps, diffusion, channels, and buffers.)
Neural-Based Behavior: Prey Acquisition (without barrier)
(Neural activity panels: visual fields, predator (PreTectum), prey (Tectum), integrated and heading maps (MHM).)
Neuroscience: Autonomous Biological Agents
Sensors: vision, sound, smell, touch. Actuators: legs, wings, fins.
Robotics: Autonomous Robotic Agents
Sensors: vision, sound, smell, touch. Actuators: legs, wings, fins, wheels.