Robot Companion Localization at home and in the office
Arnoud Visser, Jürgen Sturm, Frans Groen
University of Amsterdam, Informatics Institute

Overview
– Mobile robotics
– Robot localization
– Presentation of the panorama approach
– Results
– Demonstration videos

Mobile robotics
– SICO at Kosair Children's Hospital (Dometic, Louisville, Kentucky)
– Sony Aibos playing soccer (Cinekids, De Balie, Amsterdam)
– Robot cranes and trucks unloading ships (Port of Rotterdam)
– RC3000, the robocleaner (Kärcher)

The localization problem
Robot localization is the problem of estimating the robot's pose relative to a map of the environment.
– Position tracking
– Global localization
– Kidnapping problem

Localization
– Sensors: odometry, GPS, laser scanner, camera, …
– Feature space
– World representation: topological graphs, grid-based maps
– Filters: Kalman filters, particle filters

Classical approaches
– Special environments: (visual) landmarks, (electro-magnetic) guiding lines
– Special sensors: GPS, laser scanners, omni-directional cameras
– Special requirements: computationally heavy (offline computation)

New approach
– Natural environments: human environments, unstructured and/or unknown to the robot
– Normal sensors: camera
– Reasonable requirements: real-time, moderate hardware requirements

Platform: Sony Aibo
– Internal camera: 30 fps, 208x160 pixels
– Computer: 64-bit RISC processor, 567 MHz, 64 MB RAM, 16 MB memory stick, WLAN
– Actuators: legs (4 x 3 joints), head (3 joints)

Demo: Compass Library, University of Amsterdam

Synopsis

Color segmentation (sidetrack: color calibration)
– Robot collects colors from the environment
– Colors are clustered using an EM algorithm
– A color-to-color-class lookup table is created for faster access
(Figures: raw image, color class image)
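A minimal Python sketch of this calibration step; the use of scikit-learn's GaussianMixture for the EM clustering and the 5-bit quantisation of the lookup table are illustrative choices, not taken from the slides:

```python
# Sketch of the color-calibration step: cluster sampled pixel colors with an
# EM-fitted Gaussian mixture, then bake the result into a color -> color-class
# lookup table for fast per-pixel segmentation.
import numpy as np
from sklearn.mixture import GaussianMixture

def build_color_lut(sampled_pixels: np.ndarray, n_classes: int = 8, bits: int = 5):
    """sampled_pixels: (N, 3) array of colors collected by the robot."""
    gmm = GaussianMixture(n_components=n_classes, covariance_type="full")
    gmm.fit(sampled_pixels)                      # EM clustering of the colors

    # Enumerate every quantised color once and store its color class.
    levels = np.arange(2 ** bits) << (8 - bits)  # e.g. 32 levels per channel
    grid = np.stack(np.meshgrid(levels, levels, levels, indexing="ij"), axis=-1)
    lut = gmm.predict(grid.reshape(-1, 3)).reshape(grid.shape[:3]).astype(np.uint8)
    return lut

def segment(image: np.ndarray, lut: np.ndarray, bits: int = 5) -> np.ndarray:
    """Map a raw (H, W, 3) uint8 image to a color-class image via the table."""
    idx = image >> (8 - bits)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]]
```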

Mathematics
(Slide figure: rotation, translation, feature vector, ideal world model, learned world model)
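The equations on this slide are not preserved in the transcript; the LaTeX below only restates, with illustrative notation, how the listed quantities fit together according to the surrounding slides:

```latex
% Hedged sketch (notation illustrative): the heading \varphi is estimated
% from the sector feature vectors z^{(s)} of color-class transition counts,
% using the learned per-sector world model.
\[
  \hat{\varphi} \;=\; \arg\max_{\varphi}\; P\!\left(z \mid \varphi\right)
  \;=\; \arg\max_{\varphi}\; \prod_{s} P_{\mathrm{model}}\!\left(z^{(s)} \mid \varphi\right),
\]
% where the ideal world model is replaced by the distribution of transition
% counts learned from training images taken at a known heading.
```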

Feature space conversion

Feature vectors and world model
– The feature vector consists of color transition counts between the n color classes
(Figure: world model distribution)
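A short sketch of how such sector-based transition counts could be computed; counting transitions between vertically adjacent pixels is an assumption, as the slides do not specify the scan direction:

```python
# Sketch of the sector-based feature extraction: the color-class image is
# split into vertical sectors and, per sector, an n x n matrix of
# color-class transition counts is accumulated.
import numpy as np

def sector_features(class_img: np.ndarray, n_classes: int, n_sectors: int) -> np.ndarray:
    """class_img: (H, W) array of color-class labels. Returns (n_sectors, n, n)."""
    h, w = class_img.shape
    feats = np.zeros((n_sectors, n_classes, n_classes), dtype=np.int32)
    for s in range(n_sectors):
        cols = class_img[:, s * w // n_sectors:(s + 1) * w // n_sectors]
        upper, lower = cols[:-1, :], cols[1:, :]     # vertically adjacent pixels
        np.add.at(feats[s], (upper.ravel(), lower.ravel()), 1)
    return feats
```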

Feature space conversion (2)
(Figures: raw image → color class image → sector-based feature vectors)

Learning
– The distribution of a single color-class transition is updated by incrementing its constituent counters
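A sketch of this learning step under the same assumptions, keeping the world model as raw counters; the Laplace prior of one and the simple sector alignment are illustrative:

```python
# Sketch of the learning step: the world model is the accumulated transition
# counters per (heading sector, from-class, to-class); each training frame
# taken at a known heading adds its counts, and the transition distribution
# is obtained by normalising the counters.
import numpy as np

class WorldModel:
    def __init__(self, n_classes: int, n_sectors: int):
        self.counts = np.ones((n_sectors, n_classes, n_classes))  # Laplace prior

    def update(self, heading_sector: int, frame_feats: np.ndarray) -> None:
        """frame_feats: (image_sectors, n, n) counts from sector_features()."""
        n_sectors = self.counts.shape[0]
        for s in range(frame_feats.shape[0]):
            world_s = (heading_sector + s) % n_sectors  # align image to world sector
            self.counts[world_s] += frame_feats[s]

    def distribution(self) -> np.ndarray:
        return self.counts / self.counts.sum(axis=(1, 2), keepdims=True)
```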

Matching
– Likelihood of a single sector
– Rotation estimate
– Confidence estimate
– Adjacent sectors
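A sketch of the matching step, building on the WorldModel sketch above: per-sector multinomial log-likelihoods are summed for every candidate heading, and the margin over the runner-up serves as a simple confidence value (both choices are assumptions):

```python
# Sketch of the matching step: score every candidate heading by the summed
# per-sector log-likelihood of the observed transition counts, return the
# best heading and a margin-based confidence.
import numpy as np

def match(frame_feats: np.ndarray, model: "WorldModel"):
    dist = model.distribution()                   # (n_sectors, n, n)
    n_sectors = dist.shape[0]
    log_dist = np.log(dist)
    scores = np.empty(n_sectors)
    for heading in range(n_sectors):
        ll = 0.0
        for s in range(frame_feats.shape[0]):     # includes adjacent sectors
            ll += np.sum(frame_feats[s] * log_dist[(heading + s) % n_sectors])
        scores[heading] = ll
    best = int(np.argmax(scores))
    runner_up = np.partition(scores, -2)[-2]
    confidence = scores[best] - runner_up         # margin in log-likelihood
    return best, confidence
```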

Post-processing: Compass
Idea: smooth the rotational estimate over multiple frames
+ removes outliers
+ stabilizes the estimate
+ integrates (rotational) odometry
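A minimal sketch of such smoothing in the style of a complementary filter; the gain, the outlier threshold and the boolean confidence flag are illustrative, not the slides' actual filter:

```python
# Sketch of the compass post-processing: odometry rotation is integrated
# between frames, and the visual heading estimate corrects the state only
# when it is confident and not an outlier.
import math

class CompassFilter:
    def __init__(self, gain: float = 0.2, outlier_thresh: float = math.radians(45)):
        self.heading = 0.0
        self.gain = gain
        self.outlier_thresh = outlier_thresh

    @staticmethod
    def _wrap(a: float) -> float:
        return (a + math.pi) % (2 * math.pi) - math.pi

    def update(self, odo_delta: float, visual_heading: float, confident: bool) -> float:
        self.heading = self._wrap(self.heading + odo_delta)     # predict with odometry
        error = self._wrap(visual_heading - self.heading)
        if confident and abs(error) < self.outlier_thresh:      # reject outliers
            self.heading = self._wrap(self.heading + self.gain * error)
        return self.heading
```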

Results: Compass Brightly illuminated living room

Results: Compass Daylight office environment

Results: Compass Outdoor soccer field

Results: Compass Robocup 4-Legged soccer field

Signal degradation (w.r.t. distance) Robocup 4-Legged soccer field

Post-processing: Grid localization
Idea: learn multiple spots, then use the confidence value to estimate the robot's position in between
– fixed grid (better: self-learned graph based on confidence)
– difficult to integrate odometry
+ proof of concept
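A proof-of-concept sketch of the interpolation idea; the confidence-weighted averaging is an illustrative choice, and odometry is not integrated here, matching the limitation noted above:

```python
# Sketch of the grid-localization idea: the model is trained at several known
# spots; at run time each spot reports a match confidence, and the position is
# interpolated as the confidence-weighted average of the spot coordinates.
import numpy as np

def interpolate_position(spot_xy: np.ndarray, confidences: np.ndarray) -> np.ndarray:
    """spot_xy: (K, 2) learned spot positions [cm]; confidences: (K,) match scores."""
    w = np.clip(confidences, 0.0, None)
    w = w / w.sum() if w.sum() > 0 else np.full(len(w), 1.0 / len(w))
    return w @ spot_xy            # (2,) estimated x, y
```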

Demo: Grid localization Robocup 4-Legged soccer field

Results: Grid localization (Robocup 4-Legged soccer field)
(Plot: positioning accuracy, x [cm] vs. y [cm]; the robot walks back to the center after being kidnapped)

Conclusions
Novel approach to localization:
– works in unstructured environments
– tested at various locations
– an interesting approach for mobile robots at home and in the office

Questions?