MICHAEL MILFORD, DAVID PRASSER, AND GORDON WYETH FOLAMI ALAMUDUN GRADUATE STUDENT COMPUTER SCIENCE & ENGINEERING TEXAS A&M UNIVERSITY RatSLAM on the Edge: Revealing a Coherent Representation from an Overloaded Rat Brain

OUTLINE Overview RatSLAM Experience Mapping Goal Recall Using Experience Maps Experiment Results Discussion

OVERVIEW In order for a robot to navigate intelligently: It must possess a means of acquiring and storing information about past experiences; and It must possess the ability to make decisions based on this information.

OVERVIEW What is SLAM? Simultaneous Localization and Mapping. Determine the state of the world: What does the world look like? Determine location in the observed world: Where in the world am I?

OUTLINE Overview RatSLAM Experience Mapping Goal Recall Using Experience Maps Experiment Results Discussion

ratSLAM Why are we SLAM-ing? Maps are used to depict the environment for an overview and to determine location within the perceived environment. Localizing and mapping in the presence of error and noise is very complex; this is the problem of simultaneous localization and mapping (SLAM).

ratSLAM Inspired by computational models of the hippocampus in rodents. The hippocampus is a part of the brain that plays an important role in long-term memory and spatial navigation. Neurons in the rat and mouse hippocampus respond as place cells. Place cells exhibit a high rate of firing whenever an animal is in a location in an environment corresponding to the cell's "place field". Place fields are patterns of neural activity that correspond to locations in space.

RatSLAM is an implementation of a hippocampal model of robot control: To provide a new and effective method for the mobile robot problem of simultaneous localization and mapping (SLAM); and To reproduce a high-level brain function in a robot in order to increase the understanding of memory and learning in mammals.

ratSLAM Architecture for RatSLAM. Local View and Pose Cell arrangement for artificial landmarks

ratSLAM – Local View Local View (LV): A representation processed from the vision information in camera images. Calibrates the robot’s state information. Stored and associated with the currently active pose cells. If familiar, the current visual scene also causes activity to be injected into the pose cells associated with the currently active LV cells.
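As an illustration of the local view comparison step, the sketch below matches the current camera-derived scene against stored local view templates using a mean absolute difference. The metric, threshold, and function names are assumptions for illustration only, not the RatSLAM vision pipeline itself.

```python
import numpy as np

def match_local_view(current_scene, stored_views, threshold=0.1):
    """Return the index of the best-matching stored local view template,
    or None if the scene is unfamiliar and a new LV cell should be created.
    current_scene and each stored view are normalized intensity arrays."""
    if not stored_views:
        return None
    # Mean absolute difference between the current scene and each template.
    errors = [np.mean(np.abs(current_scene - view)) for view in stored_views]
    best = int(np.argmin(errors))
    return best if errors[best] < threshold else None
```

A familiar scene (a non-None index) is what allows activity to be injected back into the pose cells associated with that local view cell.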

ratSLAM – Pose Cell 3-D pose cell model. Each dimension corresponds to one of the three state variables of a ground-based robot: x′, y′, and θ′.

ratSLAM – Pose Cell Pose Cells: A three-dimensional structured competitive attractor neural network. Combines the characteristics of place and head-direction cells. Each axis of the structure corresponds to a different state variable: x′, y′, and θ′.

ratSLAM How it works: Wheel encoder information is used to perform path integration by shifting the current pose cell activity. Vision information is converted into a local view. The local view cell is associated with the currently active pose cells. If familiar, activity is injected into the particular pose cells associated with the currently active local view cells.
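The following is a minimal sketch of one such update step, assuming the pose cell matrix is held as a 3-D NumPy array. The integer cell shifts and injection strength are illustrative simplifications of the continuous attractor dynamics (local excitation, global inhibition, sub-cell shifts) used in RatSLAM.

```python
import numpy as np

NX, NY, NTH = 40, 20, 36                    # pose cell matrix: x', y', theta'
pose_cells = np.zeros((NX, NY, NTH))
pose_cells[20, 10, 0] = 1.0                 # initial activity packet

def path_integrate(cells, dx, dy, dth):
    """Shift the activity packet by whole cells (from wheel odometry), wrapping at the edges."""
    return np.roll(cells, shift=(dx, dy, dth), axis=(0, 1, 2))

def inject_local_view(cells, linked_pose, strength=0.3):
    """Inject activity at the pose cells previously associated with a familiar local view."""
    cells[linked_pose] += strength
    return cells / cells.max()              # crude normalization of the packet

pose_cells = path_integrate(pose_cells, dx=1, dy=0, dth=2)
pose_cells = inject_local_view(pose_cells, linked_pose=(21, 10, 2))
```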

ratSLAM – Pose Cell The first test environment was a two by two metre arena

ratSLAM – Pose Cell Floor plan and robot trajectory for initial goal navigation experiments.

ratSLAM – Pose Cell The temporal map cells after recall of the first goal.

ratSLAM – Pose Cell The path the robot followed to reach the first goal.

ratSLAM Hashing collisions in the Pose Cells: Vision information starts to cause more frequent loop closures. Leads to discontinuities in the pose cell matrix. Multiple representations of the same physical areas in the environment. Clusters of pose cells become associated with more than one pose. Hashing collisions within pose cells are unavoidable.

ratSLAM Floor plan of large indoor environment

ratSLAM Dominant packet path for a 40 × 20 × 36 pose cell matrix.

ratSLAM Temporal map for the large indoor environment

OUTLINE Overview RatSLAM Experience Mapping Goal Recall Using Experience Maps Experiment Results Discussion

EXPERIENCE MAPPING The experience mapping algorithm is the creation and maintenance of a collection of experiences and inter-experience links. This produces a spatially continuous map, free of collisions, from the messy representations found in the pose cells. It does this by combining information from the pose cells with the local view cells and the robot's current behavior.

EXPERIENCE MAPPING Experience map co-ordinate space

EXPERIENCE MAPPING Experience Mapping: The algorithm uses output from pose cells and local view cells to create an experience map: a graph-like map containing nodes (experiences) and links between experiences. Each node represents a snapshot of the activity within pose cells and local view cells. New experience nodes are created as needed.

EXPERIENCE MAPPING Experience Generation: Each experience has its own (x′, y′, θ′, V), where x′, y′, and θ′ are the three state variables and V describes the visual scene associated with the experience. Output from the pose cells and local view cells is used to create a map made up of robot experiences. Inter-experience links store temporal, behavioral, and odometric information about the robot's movement between experiences.
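A minimal sketch of this graph structure is given below; the field names and types are assumptions chosen to mirror the description above (pose state, associated visual scene, and odometric/behavioral link data), not the authors' actual data layout.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Link:
    """Inter-experience link: odometric, behavioral, and temporal information."""
    target: int            # index of the experience this link leads to
    dx: float              # relative position of the target experience (odometry)
    dy: float
    dtheta: float
    behavior: str = ""     # behavior active during the transition
    dt: float = 0.0        # time taken for the transition

@dataclass
class Experience:
    """One node of the experience map."""
    x: float               # map position, initialized from the pose cell state
    y: float
    theta: float
    view_id: int           # V: the local view associated with this experience
    links: List[Link] = field(default_factory=list)

experiences: List[Experience] = []   # the map grows as the robot explores
```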

EXPERIENCE MAPPING Experience zone of influence: Activity is dependent on how close the activity peaks in the pose cells and local view cells are to the cells associated with the experience.

EXPERIENCE MAPPING Zone of influence symbols: x′_PC, y′_PC, and θ′_PC are the coordinates of the dominant activity packet; x′_i, y′_i, and θ′_i are the coordinates of the associated experience i; r_a is the zone constant for the (x′, y′) plane; and θ_a is the zone constant for the θ′ dimension.

EXPERIENCE MAPPING Experience zone (visual component): V is the current visual scene; V_i is the visual scene associated with experience i; E_V is the visual scene energy component.

EXPERIENCE MAPPING Total Energy Level: The total energy level of experience i is E_i = E_V × (E_xy + E_θ).
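A sketch of this energy calculation is shown below. The linear fall-off within the zone constants r_a and θ_a and the binary visual term are assumed illustrative forms of the zone of influence, not the paper's exact equations.

```python
import numpy as np

def experience_energy(packet, exp_pose, r_a=5.0, theta_a=3.0):
    """E_i = E_V x (E_xy + E_theta) for one experience.

    packet   = (x'_PC, y'_PC, theta'_PC, view): dominant pose cell packet plus current scene.
    exp_pose = (x'_i, y'_i, theta'_i, V_i): pose cell / view state stored when
               the experience was created."""
    x_pc, y_pc, th_pc, view = packet
    x_i, y_i, th_i, v_i = exp_pose
    e_xy = max(0.0, 1.0 - np.hypot(x_pc - x_i, y_pc - y_i) / r_a)   # (x', y') zone of influence
    e_th = max(0.0, 1.0 - abs(th_pc - th_i) / theta_a)              # theta' zone of influence
    e_v = 1.0 if view == v_i else 0.0                               # visual scene component
    return e_v * (e_xy + e_th)
```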

EXPERIENCE MAPPING Experience Mapping: As the robot moves around a novel environment, it needs to generate experiences to form a representation of the world. Learning of new experiences is triggered not only by exploring new areas of an environment, but also by visual changes in areas the robot has already explored.
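One plausible form of this trigger, reusing the hypothetical experience_energy() helper sketched above: if no existing experience is energized above a threshold by the current pose cell and local view state, a new experience is learned. The threshold value is illustrative, not taken from the paper.

```python
import numpy as np

NEW_EXPERIENCE_THRESHOLD = 0.5           # illustrative value

def update_experiences(packet, experiences):
    """Return the index of the active experience, creating a new one if needed.
    Here `experiences` is a list of stored (x', y', theta', V) tuples."""
    energies = [experience_energy(packet, exp) for exp in experiences]
    if not energies or max(energies) < NEW_EXPERIENCE_THRESHOLD:
        experiences.append(packet)       # learn a new experience from the current state
        return len(experiences) - 1
    return int(np.argmax(energies))      # otherwise the most energized experience is active
```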

OUTLINE Overview RatSLAM Experience Mapping Goal Recall Using Experience Maps Experiment Results Discussion

GOAL RECOLLECTION Experience Transitions: Transitions represent the physical movement of the robot in the world as it moves from one experience to another.

GOAL RECOLLECTION Experience Transitions: dp_ij is a vector describing the position and orientation of experience j relative to experience i.

GOAL RECOLLECTION Map Correction: Discrepancies between a transition's odometric information and the linked experiences' coordinates are minimized through a process of map correction.
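A generic relaxation pass of this kind is sketched below; the update rule, learning rate, and data layout are assumptions (the paper's exact correction equation is not reproduced here), and the orientation correction is omitted for brevity.

```python
import numpy as np

def correct_map(positions, links, alpha=0.5):
    """One correction pass over the experience map.

    positions: dict {experience index: np.array([x, y])} in experience map space.
    links:     list of (i, j, dp_ij) where dp_ij is the odometric offset of
               experience j relative to experience i."""
    delta = {i: np.zeros(2) for i in positions}
    count = {i: 0 for i in positions}
    for i, j, dp in links:
        error = (positions[i] + dp) - positions[j]   # map vs. odometry discrepancy
        delta[i] -= error                            # pull experience i back along the error
        delta[j] += error                            # pull experience j forward along the error
        count[i] += 1
        count[j] += 1
    for i in positions:
        if count[i]:
            positions[i] += alpha * delta[i] / count[i]
    return positions
```

Repeated passes, run as new links are added (for example at loop closures), progressively reduce the discrepancy between odometry and the experience map coordinates.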

OUTLINE Overview RatSLAM Experience Mapping Goal Recall Using Experience Maps Experiment Results Discussion

EXPERIMENTAL RESULTS

DISCUSSION Experience maps are localized: Cartesian properties are not guaranteed beyond the local area. For instance, straight corridors may be slightly curved in the experience map.