AUTONOMOUS GUIDANCE FOR A UAS ALONG A STAIRCASE. 14/12/2015. Olivier De Meyst, Thijs Goethals, Haris Balta, Geert De Cubber, Rob Haelterman. Belgian Royal Military Academy (RMA)

PROJECT CONTEXT: ICARUS

PROBLEM STATEMENT Indoor flight is becoming possible. However, staircases remain a problem.

P ROBLEM S TATEMENT & C ONSTRAINTS Design a UAS control architecture able to guide a UAS over a staircase Use a low-cost UAV: Parrot AR-Drone Use as much as possible existing (ROS) components  Focus on system integration

SYSTEM ARCHITECTURE Components: Perception (2D Detection, 3D Detection, 2D/3D Data Fusion) and Navigation.

2D DETECTION PIPELINE: 1) LINE DETECTION 1. Line Segment Detector (LSD) 2. Filter out parallel lines (RANSAC) 3. Detect vanishing points
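The slides do not include code for this stage; the sketch below is an illustrative OpenCV/numpy version of steps 1-2 only (LSD segment detection plus a RANSAC-style search for the dominant group of near-parallel segments). Function and parameter names are introduced here and are not taken from the authors' implementation; the vanishing-point step is omitted, and cv2.createLineSegmentDetector is only present in OpenCV builds that ship the LSD code.

```python
# Sketch of the 2D line-detection stage (OpenCV-based, illustrative only).
import cv2
import numpy as np

def detect_stair_lines(gray, angle_tol_deg=10.0, n_iters=200):
    """Detect line segments with LSD and keep the dominant near-parallel group."""
    lsd = cv2.createLineSegmentDetector()        # only in builds that include LSD
    segments = lsd.detect(gray)[0]               # N x 1 x 4 array: x1, y1, x2, y2
    if segments is None or len(segments) == 0:
        return np.empty((0, 4), dtype=np.float32)
    segments = segments.reshape(-1, 4)

    # Orientation of every segment, in degrees.
    angles = np.degrees(np.arctan2(segments[:, 3] - segments[:, 1],
                                   segments[:, 2] - segments[:, 0]))

    # RANSAC-style search for the dominant orientation (stair edges are parallel).
    best_inliers = np.zeros(len(segments), dtype=bool)
    rng = np.random.default_rng(0)
    for _ in range(n_iters):
        candidate = angles[rng.integers(len(angles))]
        diff = np.abs((angles - candidate + 90.0) % 180.0 - 90.0)  # wrap-around diff
        inliers = diff < angle_tol_deg
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return segments[best_inliers]
```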

2D DETECTION PIPELINE: 2) OBJECT DETECTION AdaBoost-based object detector, trained for staircase models
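The slide only names an AdaBoost-based detector; one common way to realize such a detector is OpenCV's boosted cascade classifier, shown below purely as an assumption. The cascade file name is hypothetical and would come from training on staircase images (e.g. with opencv_traincascade).

```python
# Illustrative use of an OpenCV boosted cascade as the staircase object detector.
# "staircase_cascade.xml" is a hypothetical model, not the authors' trained detector.
import cv2

def detect_staircases(gray, cascade_path="staircase_cascade.xml"):
    cascade = cv2.CascadeClassifier(cascade_path)
    # Returns a list of (x, y, w, h) rectangles around candidate staircases.
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=3,
                                    minSize=(60, 60))
```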

2D DETECTION PIPELINE: 3) COMBINING LINES AND OBJECTS Retain only detections where both detectors agree: verify whether the detected lines fall inside the object-detection rectangles, using the Cohen-Sutherland clipping algorithm.
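A minimal implementation of the clipping test named on this slide: a 2D line detection is kept only if Cohen-Sutherland clipping against at least one object-detection rectangle returns a non-empty segment. The rectangle format (xmin, ymin, xmax, ymax) and the function names are illustrative.

```python
# Cohen-Sutherland clipping against a detection rectangle (xmin, ymin, xmax, ymax).
INSIDE, LEFT, RIGHT, BOTTOM, TOP = 0, 1, 2, 4, 8

def _region_code(x, y, rect):
    xmin, ymin, xmax, ymax = rect
    code = INSIDE
    if x < xmin:
        code |= LEFT
    elif x > xmax:
        code |= RIGHT
    if y < ymin:
        code |= BOTTOM
    elif y > ymax:
        code |= TOP
    return code

def clip_segment(x1, y1, x2, y2, rect):
    """Return the part of the segment inside rect, or None if it lies fully outside."""
    xmin, ymin, xmax, ymax = rect
    c1, c2 = _region_code(x1, y1, rect), _region_code(x2, y2, rect)
    while True:
        if not (c1 | c2):            # both endpoints inside: accept
            return x1, y1, x2, y2
        if c1 & c2:                  # both endpoints share an outside region: reject
            return None
        c_out = c1 if c1 else c2     # pick an endpoint that is outside
        if c_out & TOP:
            x, y = x1 + (x2 - x1) * (ymax - y1) / (y2 - y1), ymax
        elif c_out & BOTTOM:
            x, y = x1 + (x2 - x1) * (ymin - y1) / (y2 - y1), ymin
        elif c_out & RIGHT:
            x, y = xmax, y1 + (y2 - y1) * (xmax - x1) / (x2 - x1)
        else:                        # LEFT
            x, y = xmin, y1 + (y2 - y1) * (xmin - x1) / (x2 - x1)
        if c_out == c1:
            x1, y1, c1 = x, y, _region_code(x, y, rect)
        else:
            x2, y2, c2 = x, y, _region_code(x, y, rect)

# A detected line is retained only if clip_segment(...) is not None for at least
# one staircase rectangle returned by the object detector.
```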

3D DETECTION PIPELINE Sparse 3D data is collected via LSD-SLAM and projected to a depth image for further processing.
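LSD-SLAM itself is not reproduced here; assuming its semi-dense points are available in the camera frame with known pinhole intrinsics (fx, fy, cx, cy and the image size below are placeholders), a minimal sketch of the projection to a sparse depth image could look like this:

```python
import numpy as np

def points_to_depth_image(points_cam, fx, fy, cx, cy, width, height):
    """Project 3D points (N x 3, camera frame, Z forward) into a sparse depth image."""
    depth = np.full((height, width), np.nan, dtype=np.float32)
    X, Y, Z = points_cam[:, 0], points_cam[:, 1], points_cam[:, 2]
    valid = Z > 0
    u = np.round(fx * X[valid] / Z[valid] + cx).astype(int)
    v = np.round(fy * Y[valid] / Z[valid] + cy).astype(int)
    inside = (u >= 0) & (u < width) & (v >= 0) & (v < height)
    u, v, z = u[inside], v[inside], Z[valid][inside]
    # Keep the closest point per pixel (simple z-buffer for a sparse cloud):
    # far points are written first so that nearer points overwrite them.
    order = np.argsort(-z)
    depth[v[order], u[order]] = z[order]
    return depth
```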

2D-3D COMBINATION 1. Back-project the 2D lines into the 3D data (PCL SACMODEL_LINE) 2. Fit 3D planes 3. Agglomerate the planes to create a model of the staircase 4. Apply hierarchical clustering to reduce the number of hits and improve detection accuracy
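The slides name PCL (SACMODEL_LINE) and hierarchical clustering; the numpy/scipy sketch below only illustrates the two underlying geometric ideas, fitting a plane to the 3D points gathered around a back-projected line and agglomerating nearby staircase hypotheses. It is not the authors' PCL-based implementation, and all names and thresholds are illustrative.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

def fit_plane(points):
    """Least-squares plane through N x 3 points: returns (normal, d) with n.x + d = 0."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                       # direction of smallest variance
    return normal, -normal.dot(centroid)

def cluster_detections(centers, max_dist=0.3):
    """Agglomerate nearby staircase hypotheses (N x 3 centers) into averaged detections."""
    centers = np.asarray(centers)
    if len(centers) < 2:
        return centers
    labels = fcluster(linkage(centers, method="average"),
                      t=max_dist, criterion="distance")
    return np.array([centers[labels == k].mean(axis=0) for k in np.unique(labels)])
```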

STAIRCASE MODELING Projection model:

STAIRCASE MODELING The staircase can be characterized by expressing the ratio between the observed distances d between the projected lines P_i of the staircase in the 2D image model: f(c_x, c_y, s_d, s_h, i, θ) → f(c_x, i)
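As a hedged reconstruction of this projection model (the slide gives it only schematically), assume a pinhole camera with focal length f whose optical axis is roughly horizontal (θ ≈ 0), with the first step at horizontal offset c_x and vertical offset c_y from the camera, step depth s_d and step height s_h. The i-th step edge then projects to image row

  v_i = f (c_y + i s_h) / (c_x + i s_d)

so the observed distance between consecutive projected lines is

  d_i = v_{i+1} - v_i = f (s_h c_x - s_d c_y) / ((c_x + i s_d)(c_x + (i+1) s_d))

and the ratio of consecutive distances,

  d_i / d_{i+1} = (c_x + (i+2) s_d) / (c_x + i s_d),

no longer contains f, c_y or s_h. After normalizing distances by the step depth s_d, this ratio depends only on c_x and i, which is one way to read the slide's reduction f(c_x, c_y, s_d, s_h, i, θ) → f(c_x, i).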

AUTONOMOUS NAVIGATION Based on the ROS tum_ardrone package: 1. When the developed staircase algorithm has enough detected points, it calculates the central point using an agglomerative clustering algorithm 2. The top step of the staircase is identified 3. A point 1.5 meters above the top step is calculated and chosen as the goal position 4. A goto x y z w command is issued to tum_ardrone (w is the yaw angle in degrees; the coordinates (x, y, z) are in m) 5. The tum_ardrone navigation controller ensures the autonomous navigation of the UAS from the start to the goal position by using four built-in PID controllers.
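A minimal sketch of step 4, assuming a ROS/rospy setup and tum_ardrone's string-command interface on the /tum_ardrone/com topic ("c goto x y z yaw", coordinates in metres, yaw in degrees). The node name and goal coordinates below are placeholders, not values from the experiments.

```python
#!/usr/bin/env python
# Sketch: send a goto command to tum_ardrone's autopilot over its command topic.
import rospy
from std_msgs.msg import String

def send_goal(x, y, z, yaw_deg):
    pub = rospy.Publisher("/tum_ardrone/com", String, queue_size=1)
    rospy.sleep(0.5)                     # give the publisher time to connect
    # tum_ardrone parses string commands; "c goto x y z yaw" flies to the goal pose.
    pub.publish(String(data="c goto %.2f %.2f %.2f %.1f" % (x, y, z, yaw_deg)))

if __name__ == "__main__":
    rospy.init_node("staircase_goal_sender")
    # e.g. a goal 1.5 m above the detected top step (placeholder coordinates).
    send_goal(0.0, 2.0, 1.5, 0.0)
```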

RESULTS Limitation of the system: off-board processing. Frame rate: 5-10 FPS (i7). Initialization of the tum_ardrone package proved to be the most difficult part of this methodology. Planar scenes with a lot of varied texture worked fine, with a success rate of over 80 percent. A scene with a lot of depth or self-similar texture (e.g. a concrete wall) did not work at all.

RESULTS Outdoor staircases had a higher success rate, mainly because of better light conditions, which result in superior performance of lsd_slam. tum_ardrone is very sensitive to yaw movements: sudden movements can cause the tracking to be lost, and this can only be solved by a re-initialization. Once tum_ardrone and lsd_slam were successfully initialized, a 100 percent detection success rate was obtained and 75 percent of the staircases were successfully climbed.

RESULTS: INDOOR NAVIGATION

RESULTS: OUTDOOR NAVIGATION

RESULTS: OUTDOOR NAVIGATION

CONCLUSIONS & FUTURE WORK A methodology integrating multiple state-of-the-art approaches was presented, enabling a UAS to climb stairs. This is a first step; multiple improvements are required: a better UAS platform and sensors (e.g. dedicated depth sensors), on-line processing, higher robustness, dealing with narrower staircases, and faster initialisation. Future evolution: a hybrid system (UAV-UGV).

14/12/2015 AUTONOMOUS GUIDANCE FOR A UAS ALONG A STAIRCASE Olivier De Meyst, Thijs Goethals, Haris Balta, Geert De Cubber, Rob Haelterman. Belgian Royal Military Academy (RMA)