Zereik E., Biggio A., Merlo A. and Casalino G. EUCASS 2011 – 4-8 July, St. Petersburg, Russia.

Agenda
- Introduction
- TBRA Overview
- Objectives
- Architecture / Technical Description
- Perception Capability
- DEM and Data Fusion
- Test Results
- Conclusion

Introduction
Rover GNC:
- Navigation: reach the goal along a safe and best path
- Localisation
- Trajectory control
- Environment perception

TBRA Overview
TBRA innovative aspects:
- Flexibility: modular software architecture; each module provides a key functionality of the GNC
- Modularity: resulting from the standardization of the interfaces and data structures exchanged between the software modules
- No "ad-hoc development" typical of space applications

Project Objectives
- Acquisition of experience in robotics for space exploration
  - Team growth
- Validation and testing of GNC algorithms
  - Complete mobile robotic platform development
- New solutions for the space market
  - No "start from scratch"
  - Support to program, project, and study development
- Anticipate future exploration mission needs:
  - Innovative sensors (omni-camera, time-of-flight camera, ...)
  - Robot swarms
  - Manipulation
  - Sample fetching and handling
  - Opportunistic science
  - Reactive navigation
  - Simultaneous Localization and Mapping (SLAM)
  - ...

Functional Overview

Architecture

Robotic Setup

Single Functionality Testing
- Localisation – visual odometry
- Navigation – path planning
- Navigation – traversability
- Navigation – perception

Perception Capability

Single Perception

Single Reconstruction

PTU Model
- The PTU frames are located at the same point
- The camera pose is a function of the pan/tilt angles
- The pan/tilt motion also contributes to the translation, since the camera does not rotate about its own reference-frame origin
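The last point above can be sketched as a homogeneous transform: because the camera optical centre sits at an offset from the PTU rotation point, rotating the pan/tilt axes also translates the camera. This is a minimal sketch, assuming pan about z and tilt about y; the function names and axis order are illustrative, not from the slides:

```python
import numpy as np

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def ptu_camera_pose(pan, tilt, cam_offset):
    """Homogeneous transform of the camera frame for given pan/tilt angles.

    cam_offset is the camera optical centre expressed in the PTU frame
    (an assumed parameter): since it is non-zero, the rotation induces a
    translation of the camera as well.
    """
    R = rot_z(pan) @ rot_y(tilt)   # pan about z, then tilt about y (assumed order)
    t = R @ cam_offset             # off-axis camera is swept along by the rotation
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T
```

With zero pan/tilt the transform reduces to a pure translation by the camera offset, which makes the "rotation contributes to translation" effect easy to check.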

Multiple Perception

Multiple Reconstruction

Camera Digital Elevation Model
- Blue means navigable, red means obstacle, black means unknown
- 400 x 400 (pxl) planar image, with the rover located at the image centre
- Each pixel represents a world area of 5 cm x 5 cm
- Black pixels just behind the box are due to the occlusion caused by the box itself
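The grid geometry above (400 x 400 pixels, 5 cm cells, rover at the centre) implies a simple world-to-pixel conversion. A minimal sketch, assuming x points right and y points forward in the rover frame (the axis convention and function name are assumptions):

```python
def world_to_dem(x_m, y_m, size=400, cell_m=0.05):
    """Map rover-centred world coordinates (metres) to DEM pixel indices.

    The rover sits at the image centre; each pixel covers 5 cm x 5 cm,
    so the 400 x 400 grid spans a 20 m x 20 m area around the rover.
    """
    col = size // 2 + int(round(x_m / cell_m))
    row = size // 2 - int(round(y_m / cell_m))  # image rows grow downwards
    if 0 <= row < size and 0 <= col < size:
        return row, col
    return None  # point falls outside the mapped area
```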

3DLS Digital Elevation Model
- Points become sparse after 2-3 m
- 0.25 deg 3DLS angular resolution
- 10 deg/s sweep speed
- Resolution limited to reduce perception time
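The trade-off behind the last bullet can be made concrete with two back-of-the-envelope formulas: sweep time grows with the angular span, and the lateral spacing between neighbouring beams grows linearly with range, which is why returns thin out after a few metres. The 180 deg sweep in the example is an assumption, not stated on the slide:

```python
import math

def scan_time_s(sweep_deg, speed_deg_s=10.0):
    """Time for one 3DLS sweep at the given rotation speed (deg/s)."""
    return sweep_deg / speed_deg_s

def point_spacing_m(range_m, ang_res_deg=0.25):
    """Approximate lateral spacing between neighbouring beams at a range."""
    return range_m * math.radians(ang_res_deg)
```

At 10 deg/s a half-turn sweep already costs tens of seconds, so halving the angular resolution to speed up perception directly doubles the point spacing at every range.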

Data Fusion
Data fusion policies:
- Visual DEM or laser DEM only (no fusion at all)
- Both DEMs, but vision has the higher priority
- Both DEMs, but laser has the higher priority
- Merging of both DEMs
Filtering strategy to cope with:
- Noise
- Mismatched correspondences
- Bad illumination

Single Sensor DEM Filtering (Median, ...)
- Apply the filtering to each point from the sensor
- Assign each computed value to the final DEM
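The two steps above can be sketched as a median filter that skips invalid cells. This is a minimal sketch, assuming NaN marks invalid DEM cells (the representation of invalid cells is an assumption):

```python
import numpy as np

def median_filter_dem(dem, k=3):
    """Median-filter a DEM height grid, ignoring NaN (invalid) cells."""
    pad = k // 2
    out = np.full_like(dem, np.nan)
    # Pad with NaN so border windows simply have fewer valid samples
    padded = np.pad(dem, pad, constant_values=np.nan)
    h, w = dem.shape
    for i in range(h):
        for j in range(w):
            win = padded[i:i + k, j:j + k]
            vals = win[~np.isnan(win)]
            if vals.size:                    # keep NaN where nothing is valid
                out[i, j] = np.median(vals)
    return out
```

Filling a cell from its valid neighbours is what lets the median both smooth noise and patch isolated invalid pixels.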

Multiple Sensor DEM
- Apply the filtering to each point from the higher-priority sensor
- If the pixel is invalid, filter the corresponding value from the other sensor
- Assign the computed value to the final DEM
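The priority rule above (take the higher-priority sensor where valid, fall back otherwise) reduces to a per-pixel selection. A minimal sketch, again assuming NaN marks invalid cells:

```python
import numpy as np

def priority_fuse(primary, secondary):
    """Fuse two aligned DEMs: keep the primary value where it is valid,
    fall back to the secondary sensor where the primary cell is NaN."""
    fused = primary.copy()
    invalid = np.isnan(fused)
    fused[invalid] = secondary[invalid]
    return fused
```

Swapping the argument order implements the "vision first" versus "laser first" policies from the data-fusion slide.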

Weighted Merging
- Each pixel of each sensor DEM is assigned a confidence according to some criteria; points count less when they are:
  - far away from the camera
  - located at the very edge of the camera field of view
- Perform a weighted fusion
- Data filtering improves DEM accuracy and avoids errors such as false obstacle detection, BUT be careful not to delete obstacles!
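The weighted fusion step can be sketched as a per-pixel confidence-weighted average, where invalid (NaN) cells get zero weight so they cannot pull the estimate. How the confidence maps are computed (e.g. decreasing with range and with distance from the optical axis) is left to the criteria on the slide; this sketch only shows the merge itself:

```python
import numpy as np

def weighted_merge(dem_a, conf_a, dem_b, conf_b):
    """Confidence-weighted fusion of two aligned DEMs.

    conf_a / conf_b are per-pixel confidence maps; NaN height cells are
    forced to zero weight. Cells with no valid contribution stay NaN.
    """
    wa = np.where(np.isnan(dem_a), 0.0, conf_a)
    wb = np.where(np.isnan(dem_b), 0.0, conf_b)
    num = np.nan_to_num(dem_a) * wa + np.nan_to_num(dem_b) * wb
    den = wa + wb
    return np.where(den > 0, num / np.maximum(den, 1e-9), np.nan)
```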

Conclusions
Rover:
- Autonomous navigation both indoors and outdoors
- Many GNC sensors integrated
- New GNC sensors (and functionalities) to be added:
  - Omni-camera
  - Time-of-flight camera
  - Manipulator
- Focus on algorithms only
- Portability to different platforms
- No "ad hoc" development
- No "from scratch" development
- Testing and validation tool

Questions?