Performance Evaluation of Feature Detection for Local Optical Flow Tracking Tobias Senst, Brigitte Unger, Ivo Keller and Thomas Sikora.

Performance Evaluation of Feature Detection for Local Optical Flow Tracking Tobias Senst, Brigitte Unger, Ivo Keller and Thomas Sikora

Overview: Motivation, Evaluation Methodology, Evaluation Results, Conclusion

Motivation: The Kanade-Lucas-Tomasi (KLT) feature tracker, dating back to 1991, still remains a widely accepted and utilized method for sparse motion estimation due to its high computational efficiency, e.g. for crowd flow analysis [Ali & Shah], stereo SLAM [Herath et al.], vehicle tracking [Källén et al.] and trajectory clustering [Fradet et al.].

Saad Ali and Mubarak Shah, "A Lagrangian Particle Dynamics Approach for Crowd Flow Segmentation and Stability Analysis", CVPR 2007.
D. C. Herath, Sarath Kodagoda and Gamini Dissanayake, "Simultaneous Localisation and Mapping: A Stereo Vision Based Approach", IEEE/RSJ IROS 2006.
Hanna Källén, Håkan Ardö and Olof Enqvist, "Tracking and Reconstruction of Vehicles for Accurate Position Estimation", WACV 2011.
Fradet, Robert and Pérez, "Clustering point trajectories with various life-spans", CVMP 2009.

Motivation: the KLT feature tracker consists first of a feature detector, Good Features To Track [Shi & Tomasi, 1994].

Motivation: the detected features are then tracked by local optical flow, the pyramidal Lucas-Kanade method [Bouguet, 2000].
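As a rough illustration of what the tracker computes, here is one Lucas-Kanade iteration at a single pyramid level in plain NumPy. The window radius and the synthetic Gaussian-blob test image are illustrative assumptions, not the implementation evaluated in the paper:

```python
import numpy as np

def lucas_kanade_step(I0, I1, x, y, r=7):
    """One Lucas-Kanade iteration for the feature at integer position
    (x, y): solve G*d = b, where G is the 2x2 structure tensor and b the
    image-mismatch vector, accumulated over a (2r+1)x(2r+1) window."""
    w0 = I0[y - r:y + r + 1, x - r:x + r + 1].astype(float)
    w1 = I1[y - r:y + r + 1, x - r:x + r + 1].astype(float)
    Iy, Ix = np.gradient(w0)            # spatial gradients (rows = y, cols = x)
    It = w1 - w0                        # temporal difference
    G = np.array([[(Ix * Ix).sum(), (Ix * Iy).sum()],
                  [(Ix * Iy).sum(), (Iy * Iy).sum()]])
    b = -np.array([(Ix * It).sum(), (Iy * It).sum()])
    return np.linalg.solve(G, b)        # displacement estimate (dx, dy)

# Illustrative check: a Gaussian blob shifted by one pixel in x.
xs = np.arange(64)
X, Y = np.meshgrid(xs, xs)
I0 = np.exp(-((X - 32.0) ** 2 + (Y - 30.0) ** 2) / 50.0)
I1 = np.exp(-((X - 33.0) ** 2 + (Y - 30.0) ** 2) / 50.0)
dx, dy = lucas_kanade_step(I0, I1, 36, 30)  # a point on the blob's flank
```

A pyramidal implementation repeats this step coarse-to-fine, shifting the window by the accumulated displacement between levels, which is what allows larger motions to be handled.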

Motivation: finally, unreliable tracks are filtered by a rejection step, e.g. an SSD (sum of squared differences) test between the feature patches [Shi & Tomasi, 1994; Bouguet, 2000].

Motivation: the complete pipeline is thus feature detection (Good Features To Track [Shi & Tomasi, 1994]), local optical flow tracking and rejection. Several tracker/rejection variants exist: the pyramidal Lucas-Kanade method with SSD rejection [Bouguet, 2000], LMedS-based local optical flow [Kim et al., 2004], the gain-adaptive PLK [Zach et al., 2008] and the Robust Local Optical Flow (modified L2 norm) with bidirectional-flow ("Bi-Direct.") rejection [Senst et al., 2011].
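The bidirectional rejection idea can be sketched as a forward-backward consistency check: track a point forward, track the result back, and reject if the round trip drifts. The tracker functions and the threshold below are placeholder assumptions for illustration, not the paper's code:

```python
import numpy as np

def forward_backward_ok(track, I0, I1, p, eps=0.5):
    """Bidirectional rejection: track point p forward from I0 to I1, track
    the result back from I1 to I0, and keep the feature only if the round
    trip returns within eps pixels of the starting position."""
    q = track(I0, I1, p)
    p_back = track(I1, I0, q)
    return float(np.hypot(p[0] - p_back[0], p[1] - p_back[1])) <= eps

# Toy stand-ins: the "images" are scalars that only encode direction.
# A consistent tracker shifts by (dst - src); a drifting one always by +1.
good = lambda src, dst, p: (p[0] + (dst - src), p[1])
bad = lambda src, dst, p: (p[0] + 1.0, p[1])

ok_good = forward_backward_ok(good, 0.0, 1.0, (5.0, 5.0))
ok_bad = forward_backward_ok(bad, 0.0, 1.0, (5.0, 5.0))
```

The appeal of this criterion is that it needs no appearance model: a feature whose motion estimate is unreliable rarely tracks back to where it started.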

Local Optical Flow Tracking: the Good Features To Track detector [Shi & Tomasi, 1994] deals with the aperture problem. But what else should the detector account for?

Local Optical Flow Tracking: Good Features To Track [Shi & Tomasi, 1994] deals with the aperture problem, but does not deal with the generalized aperture problem: local optical flow / KLT trackers are based on the motion constancy assumption within the window W, which may in fact contain multiple motions.
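The Good Features To Track criterion selects points where the structure tensor has two large eigenvalues, so gradients exist in two directions and the aperture problem cannot occur. A minimal NumPy sketch (window size and test image are illustrative assumptions):

```python
import numpy as np

def shi_tomasi_score(I, x, y, r=3):
    """Shi-Tomasi 'good feature' score: the smaller eigenvalue of the
    structure tensor over a (2r+1)x(2r+1) window. Both eigenvalues being
    large means the point is trackable along both axes."""
    w = I[y - r:y + r + 1, x - r:x + r + 1].astype(float)
    Iy, Ix = np.gradient(w)
    G = np.array([[(Ix * Ix).sum(), (Ix * Iy).sum()],
                  [(Ix * Iy).sum(), (Iy * Iy).sum()]])
    return np.linalg.eigvalsh(G)[0]     # minimum eigenvalue

# Illustrative image: a bright square gives a flat region, a straight
# edge and a corner to compare.
I = np.zeros((32, 32))
I[:16, :16] = 1.0
s_flat = shi_tomasi_score(I, 26, 26)    # textureless: untrackable
s_edge = shi_tomasi_score(I, 8, 16)     # straight edge: aperture problem
s_corner = shi_tomasi_score(I, 16, 16)  # corner: good feature
```

A detector then keeps points whose score exceeds a threshold; note that this score says nothing about whether the window obeys the motion constancy assumption, which is exactly the generalized aperture problem above.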

Local Optical Flow Tracking: Good Features To Track [Shi & Tomasi, 1994] deals with the aperture problem, but accounts neither for the generalized aperture problem nor for the pyramidal implementation of the tracker.

Local Optical Flow Tracking: what about other feature detectors? The detector feeding the Robust Local Optical Flow tracker (modified L2 norm, bidirectional rejection) [Senst et al., 2011] need not be Good Features To Track [Shi & Tomasi, 1994]; candidates include FAST [Rosten & Drummond, 2006], SIFT [Lowe, 1999], MSER [Matas, 2002], and others. In general, the best detector/tracker combination is found by performing a benchmark.

Evaluation Methodology: how can we measure the performance of a local optical flow based feature tracker?

Evaluation Methodology: the Middlebury optical flow dataset [Baker et al., 2007] (http://vision.middlebury.edu/flow/) is an established benchmark for dense motion data. Local optical flow methods, however, are not built for dense motion estimation; they trade density for efficiency and accuracy. For feature tracking we want to pick some points and track them, and these points should capture as much of the motion information of the sequence as possible.

Evaluation Methodology: what are the favored behaviors of a local optical flow based feature tracker? It should be accurate, efficient and descriptive.

Evaluation Methodology: the covering measure r is the mean density of trajectories in a motion segment. The features should be distributed over the image with respect to the moving objects; we expect a high density of feature vectors in every segment.
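A sketch of how such a covering measure could be computed from tracked points and motion-segment masks; the data layout (one boolean mask per segment, (x, y) point tuples) is an assumption for illustration, not the paper's exact definition:

```python
import numpy as np

def covering(points, segment_masks):
    """Covering measure: mean trajectory density over the motion segments.
    A detector that spreads features across every moving object scores
    higher than one clustering them all on a single textured region."""
    densities = []
    for mask in segment_masks:                  # one boolean mask per segment
        inside = sum(bool(mask[int(y), int(x)]) for x, y in points)
        densities.append(inside / mask.sum())   # features per segment pixel
    return float(np.mean(densities))

# Toy example: two 4-pixel segments; two features land in the first
# segment and one in the second.
m1 = np.zeros((4, 4), bool); m1[0, :] = True
m2 = np.zeros((4, 4), bool); m2[3, :] = True
pts = [(0, 0), (2, 0), (1, 3)]      # (x, y) feature positions
r = covering(pts, [m1, m2])         # (2/4 + 1/4) / 2 = 0.375
```

Averaging per-segment densities, rather than counting features globally, is what penalizes detectors that ignore small moving objects.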

Evaluation Methodology: the measures are the covering r (mean density of trajectories in a motion segment), the accuracy (median endpoint error, MEE), and the efficiency (feature detection runtime td, overall runtime t, and percentage of rejected feature trajectories h). Dataset: Middlebury flow test sequences, 2 frames each.
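The accuracy measure can be sketched as follows; the median is used rather than the mean so that a few gross tracking failures do not dominate the score (the array layout is an illustrative assumption):

```python
import numpy as np

def median_endpoint_error(flow_est, flow_gt):
    """Median endpoint error (MEE): median Euclidean distance between the
    estimated and ground-truth displacement vectors of the tracked features."""
    d = np.asarray(flow_est, float) - np.asarray(flow_gt, float)
    return float(np.median(np.hypot(d[:, 0], d[:, 1])))

est = [[1.0, 0.0], [0.0, 1.0], [3.0, 4.0]]
gt = [[1.0, 0.0], [0.0, 0.0], [0.0, 0.0]]
mee = median_endpoint_error(est, gt)    # per-feature errors 0, 1, 5 -> median 1.0
```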

Evaluation Results: feature detection runtime on Grove3 (640x480), AMD Phenom II X4 960, 2.00 GHz: FAST 5 ms, GFT 35 ms, PGFT 74 ms, SIFT 272 ms, SURF 158 ms, STAR 18 ms, MSER 54 ms.

Evaluation Results: Good Features To Track (GFT)

~200 Feature Points:
Method  td(ms)  t(ms)  h(%)  MEE    r(%)
GFT       26     77    2.2   0.055  0.06
PGFT      55    106    1.8   0.059  0.07
FAST       1     50    0.5   0.051  0.08
SIFT     183    233    1.5   0.188  –
SURF     113    162    4.3   0.180  0.11
STAR      13     62    3.3   0.127  0.10
MSER      51    101    3.0   0.075  0.24

~2500 Feature Points:
Method  td(ms)  t(ms)  h(%)  MEE    r(%)
GFT       27     89    3.2   0.057  0.99
PGFT      56    119    –     –      1.12
FAST       4     63    2.2   0.050  1.25
SIFT     414    476    1.9   –      0.78
SURF     237    299    3.3   0.066  1.06
STAR      18     74    2.0   –      0.94
MSER      93    148    2.3   0.051  0.54

Evaluation Results: Features from Accelerated Segment Test (FAST); see the FAST rows of the result tables above.

Evaluation Results: Speeded-Up Robust Features (SURF); see the SURF rows of the result tables above.

Conclusion: Our goal was to provide a set of baseline results for tracking features with alternative detectors. We provide an evaluation methodology based on accuracy, efficiency and descriptive (covering) measures. We observed that FAST is an efficient alternative to the GFT detector. SIFT shows limited improvements, but an improved GFT method could benefit from its scale-space analysis. The feature tracking evaluation would benefit from a dataset with more than two frames, e.g. to measure the life-spans of trajectories.

Thank you!