Proprioceptive Visual Tracking of a Humanoid Robot Head Motion


Proprioceptive Visual Tracking of a Humanoid Robot Head Motion
João Peixoto [1], Vitor Santos [1,2], and Filipe Silva [1,2]
1 Universidade de Aveiro; 2 Institute for Electronics Engineering and Informatics of Aveiro - IEETA
{joao.peixoto,vitor,fmsilva}@ua.pt

The Problem
Develop a balance algorithm based on the motion of the head;
Measure the motion of the head;
Improve measurement accuracy by merging different sources of data.

Experimental Setup
Inertial sensors;
Visual sensors;
Processing Unit.

Experimental Setup
Sensor A: RAZOR 9DOF - SEN 10736

Experimental Setup
Sensor B: POLOLU - MinIMU9DOF v2

Experimental Setup
Processing Unit: Arduino UNO R3

Experimental Setup
Fire-wire Camera: Firefly MV-03MTC - Pointgrey

Experimental Setup
Experiment design problem: lack of accurate ground truth.

Experimental Setup
FANUC LR Mate 200iB:
High repeatability;
High end-effector position accuracy;
Reliable ground truth;
Easy experiment design.

(Figure: the Fanuc LR Mate 200iB carrying the POLOLU MinIMU9DOF v2, the RAZOR 9DOF - SEN 10736, the Arduino UNO R3 and the Fire-wire Camera.)

Obtaining Inertial Data
Orientation angles $\theta_x$, $\theta_y$, $\theta_z$.

Obtaining Inertial Data
At each step $k$ the inertial sensors provide the angles $\theta_x^k$, $\theta_y^k$, $\theta_z^k$.
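The per-step inertial estimate can be illustrated with a minimal sketch: integrating rate-gyro readings over one sampling interval to propagate the three head angles. This is an assumption-laden illustration (function and variable names are mine, not from the slides, and real IMUs need bias and drift handling):

```python
import numpy as np

def integrate_gyro(theta_prev, omega, dt):
    """Propagate the head orientation (theta_x, theta_y, theta_z, in
    degrees) one step forward using rate-gyro readings omega (deg/s)."""
    theta_prev = np.asarray(theta_prev, dtype=float)
    omega = np.asarray(omega, dtype=float)
    return theta_prev + omega * dt

# Example: a constant 10 deg/s rotation about y, sampled for 0.1 s
theta = integrate_gyro([0.0, 0.0, 0.0], [0.0, 10.0, 0.0], 0.1)
print(theta)  # -> [0. 1. 0.]
```

In practice the RAZOR and POLOLU boards also fuse accelerometer and magnetometer data onboard; the sketch only shows the gyro-integration part of such a pipeline.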

Obtaining Visual Data
Blob Detection Method;
Feature Extraction Method.

Blob Detection

Blob Detection: $\theta_y = 0^\circ$

Blob Detection: $\theta_y = 55^\circ$

Blob Detection
Advantages:
Direct measure of angular position;
Not dependent on previous measures.
Disadvantages:
Lack of robustness.
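The direct angular measure of blob detection can be sketched as follows: find the centroid of a binary blob mask, then map its pixel column to an angle with a pinhole camera model. This is a minimal, assumed pipeline (the slides do not give the actual detector; all names and the pinhole mapping are illustrative):

```python
import math

def blob_centroid(mask):
    """Centroid (row, col) of the nonzero pixels in a binary mask,
    given as a list of lists of 0/1; None if the blob is absent."""
    rs, cs, n = 0.0, 0.0, 0
    for r, row in enumerate(mask):
        for c, v in enumerate(row):
            if v:
                rs += r
                cs += c
                n += 1
    if n == 0:
        return None
    return rs / n, cs / n

def pixel_to_angle(u, cx, focal_px):
    """Pinhole model: horizontal angle (deg) subtended by pixel
    column u relative to the principal point cx."""
    return math.degrees(math.atan2(u - cx, focal_px))

print(blob_centroid([[0, 1], [0, 1]]))  # -> (0.5, 1.0)
```

Because each frame yields the angle directly from the blob position, no previous measurement is needed, which matches the advantage listed above; losing the blob (centroid `None`) is exactly the robustness failure listed as the disadvantage.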

Feature Extraction

Feature Extraction: inter-frame rigid transform $T$.

Feature Extraction
$\Delta\theta_T^k = \theta_T^k - \theta_T^{k-1}$
$T = \begin{bmatrix} \cos\theta_T & -\sin\theta_T & d_z \\ \sin\theta_T & \cos\theta_T & d_x \\ 0 & 0 & 1 \end{bmatrix}$
$\theta_y^k = \Delta\theta_T^k + \theta_y^{k-1}$
$\dot{\theta}_y^k = \Delta\theta_T^k / \Delta t^k$
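Given an estimated inter-frame rigid transform $T$, the rotation angle $\theta_T$ falls out of the first column of the rotation block. A minimal sketch (the transform-estimation step itself, e.g. from matched features, is assumed to happen upstream):

```python
import math

def angle_from_transform(T):
    """Recover the rotation angle theta_T (deg) from a 2-D rigid
    transform T = [[cos, -sin, dz], [sin, cos, dx], [0, 0, 1]],
    given as a 3x3 nested list."""
    return math.degrees(math.atan2(T[1][0], T[0][0]))

def update_theta_y(theta_T_k, theta_T_prev, theta_y_prev):
    """Accumulate the inter-frame rotation onto the previous
    estimate: theta_y^k = delta_theta_T^k + theta_y^{k-1}."""
    return (theta_T_k - theta_T_prev) + theta_y_prev
```

`atan2` on the (sin, cos) pair is the standard way to recover the angle with the correct sign and quadrant; the accumulation step shows why this method relies on previous measurements, as the next slide notes.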

Feature Extraction
Advantages:
More robust method, able to operate in several environments;
Can be used for various tasks, such as mapping or scene recognition.
Disadvantages:
Relies on previous measurements.

Visual Tracking

Results

Kalman Filter
$x_k = A\, x_{k-1} + B\, u_{k-1} + w_k$
$y_k = C\, x_k + v_k$
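One predict/update cycle of this linear filter can be sketched in numpy (a generic textbook implementation, not the authors' code; all names are mine):

```python
import numpy as np

def kalman_step(x, P, A, B, u, C, Q, R, y):
    """One cycle of the linear Kalman filter for the model
    x_k = A x_{k-1} + B u_{k-1} + w_k,  y_k = C x_k + v_k,
    with process noise covariance Q and measurement noise covariance R."""
    # Predict: propagate state and covariance through the model
    x_pred = A @ x + B @ u
    P_pred = A @ P @ A.T + Q
    # Update: blend the prediction with the measurement y
    S = C @ P_pred @ C.T + R                # innovation covariance
    G = P_pred @ C.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + G @ (y - C @ x_pred)
    P_new = (np.eye(len(x)) - G @ C) @ P_pred
    return x_new, P_new
```

With equal prior and measurement variances the update lands halfway between prediction and measurement, which is a quick sanity check on the gain computation.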

Data Merging Using Kalman Filter
State variables:
$x_k = \begin{bmatrix}\theta_x & \theta_y & \theta_z & \dot{\theta}_x & \dot{\theta}_y & \dot{\theta}_z\end{bmatrix}^T$

Data Merging Using Kalman Filter
Model definition:
$\theta_k = \theta_{k-1} + \dot{\theta}_{k-1}\,\Delta t + 0.5\,\ddot{\theta}_{k-1}\,\Delta t^2$
$\dot{\theta}_k = \dot{\theta}_{k-1} + \ddot{\theta}_{k-1}\,\Delta t$
Input (two variants): $\ddot{\theta}_{k-1} = u_k = 0$ or $u_k = (\dot{\theta}_{k-1} - \dot{\theta}_{k-2})/\Delta t_k$

Data Merging Using Kalman Filter
Model definition ($x_k = A\, x_{k-1} + B\, u_{k-1} + w_k$):
$A = \begin{bmatrix} I_3 & \Delta t\, I_3 \\ 0 & I_3 \end{bmatrix}$,  $B = \begin{bmatrix} 0.5\,\Delta t^2\, I_3 \\ \Delta t\, I_3 \end{bmatrix}$

Data Merging Using Kalman Filter
$\begin{bmatrix} \theta^i \\ \dot{\theta}^i \end{bmatrix} = \begin{bmatrix} I_3 & \Delta t\, I_3 \\ 0 & I_3 \end{bmatrix} \begin{bmatrix} \theta^{i-1} \\ \dot{\theta}^{i-1} \end{bmatrix} + \begin{bmatrix} 0.5\,\Delta t^2\, I_3 \\ \Delta t\, I_3 \end{bmatrix} \ddot{\theta}^{i-1}$,  with $\theta = [\theta_x\ \theta_y\ \theta_z]^T$
(prediction step of $x_k = A\, x_{k-1} + B\, u_{k-1} + w_k$)
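Building the constant-acceleration model matrices for the three head angles can be sketched as below. Note the $\Delta t\, I_3$ block in $B$ is inferred from the velocity-update equation $\dot{\theta}_k = \dot{\theta}_{k-1} + \ddot{\theta}_{k-1}\Delta t$ rather than read directly off the slide, and the function name is mine:

```python
import numpy as np

def make_model(dt):
    """State-transition A (6x6) and input B (6x3) for the
    constant-acceleration model over (theta_x, theta_y, theta_z)
    and their rates."""
    I3 = np.eye(3)
    Z3 = np.zeros((3, 3))
    A = np.block([[I3, dt * I3],
                  [Z3, I3]])
    B = np.vstack([0.5 * dt**2 * I3,   # angles pick up 0.5*a*dt^2
                   dt * I3])           # rates pick up a*dt
    return A, B
```

A quick check: with `dt = 0.1`, a unit angular acceleration moves each angle by 0.005 and each rate by 0.1 per step, matching the two model equations.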

Data Merging Using Kalman Filter
The inertial angles $\theta_x^k$, $\theta_y^k$, $\theta_z^k$ enter the measurement vector:
$y_k = \dots$
($y_k = C\, x_k + v_k$)

Data Merging Using Kalman Filter
$y_k = \begin{bmatrix}\theta_x & \theta_y & \theta_z\end{bmatrix}^T$

Data Merging Using Kalman Filter
$y_k = \begin{bmatrix}\theta_x & \theta_y & \theta_z & \dot{\theta}_x & \dot{\theta}_y & \dot{\theta}_z & \dots\end{bmatrix}^T$

Data Merging Using Kalman Filter
$y_k = \begin{bmatrix}\theta_x & \theta_y & \theta_z & \dot{\theta}_x & \dot{\theta}_y & \dot{\theta}_z & \theta_{C_y} & \dot{\theta}_{C_y} & \dots\end{bmatrix}^T$

Data Merging Using Kalman Filter
$y_k = \begin{bmatrix}\theta_x & \theta_y & \theta_z & \dot{\theta}_x & \dot{\theta}_y & \dot{\theta}_z & \theta_{C_y} & \dot{\theta}_{C_y} & \dots\end{bmatrix}^T$
$C = \begin{bmatrix} 1&0&0&0&0&0 \\ 0&1&0&0&0&0 \\ 0&0&1&0&0&0 \\ 0&0&0&1&0&0 \\ 0&0&0&0&1&0 \\ 0&0&0&0&0&1 \\ 0&1&0&0&0&0 \\ 0&0&0&0&1&0 \\ & & \vdots & & & \end{bmatrix}$
$x_k = \begin{bmatrix}\theta_x & \theta_y & \theta_z & \dot{\theta}_x & \dot{\theta}_y & \dot{\theta}_z\end{bmatrix}^T$
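Assembling this measurement matrix can be sketched as follows: the first six rows pass the inertial angles and rates straight through, and the two extra rows re-observe $\theta_y$ and $\dot{\theta}_y$ from the camera. A minimal sketch (function name and row layout are my reading of the slide):

```python
import numpy as np

def make_C():
    """Measurement matrix C (8x6): rows 0-5 pick the inertial
    angles and rates off the state; rows 6-7 map the camera's
    theta_y (blob detection) and theta_y rate (feature extraction)."""
    C = np.zeros((8, 6))
    C[:6, :6] = np.eye(6)
    C[6, 1] = 1.0   # camera theta_y observes state component theta_y
    C[7, 4] = 1.0   # camera rate observes state component theta_y-dot
    return C
```

Because the camera only sees rotation about one axis in this setup, only the $\theta_y$ rows are duplicated; observing more axes would add further rows of the same form.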

Data Merging Using Kalman Filter
(Diagram: inertial (I) and visual (V) data merged (MD) through the Kalman Filter.)
When no visual data is available, the entries of the Kalman gain $G_k$ associated with the visual measurements are set to zero, so the estimate falls back on the inertial data alone.
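The fallback described above can be sketched as a simple masking of the gain: with a 6-state filter and 8 measurements, the last two gain columns correspond to the camera, and zeroing them makes the update ignore the visual channels. The column indices are an assumption tied to the measurement ordering on the previous slide:

```python
import numpy as np

def mask_gain(G, vision_ok):
    """Zero the Kalman-gain columns tied to the camera measurements
    (assumed to be columns 6-7 of a 6x8 gain) when no visual data is
    available, so only the inertial rows influence the update."""
    G = np.asarray(G, dtype=float).copy()
    if not vision_ok:
        G[:, 6:8] = 0.0
    return G
```

Since the update is $x + G(y - Cx)$, a zeroed column makes the corresponding innovation component irrelevant, which is equivalent to dropping that measurement for the step.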

Results

Results
(Plots compare Inertial Data, Visual Data, Merged Data and Ground Truth, for three treatments: no treatment, Kalman Filter with $u_k = 0$, and Kalman Filter with $u_k = (\dot{\theta}_{k-1} - \dot{\theta}_{k-2})/\Delta t_k$.)

Experiment 1

Experiment 1 (Error)

Experiment 2

Experiment 2 (Error)

Conclusions
The FANUC LR Mate 200iB provides accurate and reliable ground truth;
Merging inertial and visual data yields better results than either source by itself;
The Kalman Filter is robust to noise;
The worst cases show the greatest improvement;
The tool/approach is extensible.