
1 Real-Time Tracking Axel Pinz Image Based Measurement Group EMT – Institute of Electrical Measurement and Measurement Signal Processing TU Graz – Graz University of Technology http://www.emt.tugraz.at/~tracking http://www.emt.tugraz.at/~pinz axel.pinz@tugraz.at

2 Defining the Terms
Real-Time
– Task dependent, "in-the-loop"
– Navigation: "on-time"
– Video rate: 30 Hz
– High-speed tracking: several kHz
Tracking
– DoF: Degrees of Freedom
– 2D: images, videos → 2 / 3 DoF
– 3D: scenes, object pose → 6 DoF

3 Example: High-speed, 2D

4 Applications
Surveillance
Augmented reality
Surgical navigation
Motion capture (MoCap)
Autonomous navigation
Telecommunication
Many industrial applications

5 Example: Augmented Reality [ARToolkit, Billinghurst, Kato, Demo at ISAR2000, Munich] http://www.hitl.washington.edu/research/shared_space/download/

6 Agenda: Structure of the SSIP Lecture
Intro, terminology, applications
2D motion analysis
Geometry
3D motion analysis
Practical considerations
Existing systems
Summary, conclusions

7 2D Motion Analysis
Change detection (a minimal sketch follows below)
– Can be anything (not necessarily motion)
Optical flow computation
– What is moving in which direction?
– Hard in real time
Data reduction required!
– Interest operators
– Points, lines, regions, contours
Modeling required
– Motion models, object models
– Probabilistic modeling, prediction
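To make the change-detection item concrete, here is a minimal frame-differencing detector, a sketch assuming OpenCV and a webcam as input; the threshold and blur size are arbitrary choices, not values from the lecture:

```python
import cv2

def detect_change(prev_gray, curr_gray, thresh=25):
    """Binary change mask by absolute frame differencing.
    Note: this responds to any intensity change (illumination, noise),
    not only to motion, exactly as the slide points out."""
    diff = cv2.absdiff(prev_gray, curr_gray)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    mask = cv2.medianBlur(mask, 5)      # suppress isolated noise pixels
    return mask

cap = cv2.VideoCapture(0)               # hypothetical live camera
ok, frame = cap.read()
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    curr = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mask = detect_change(prev, curr)
    prev = curr
```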

8 Change Detection [Pinz, Bildverstehen, 1994]

9 Optical Flow (1) [Brox, Bruhn, Papenberg, Weickert] ECCV04 best paper award
Estimating the displacement field
Assumptions:
– Gray value constancy
– Gradient constancy
– Smoothness...
Error minimization

10 Optical Flow (2) [Brox, Bruhn, Papenberg, Weickert] ECCV04 best paper award !! Not in real-time !!
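The Brox et al. method above is variational and, as the slide stresses, not real-time. Purely to illustrate what a dense displacement field looks like in code, the sketch below uses OpenCV's Farnebäck flow, a different and much coarser method; the file names and parameter values are assumptions:

```python
import cv2

prev = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)   # hypothetical file names
curr = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

# Dense displacement field: flow[y, x] = (dx, dy) for every pixel.
# Positional args: pyr_scale=0.5, levels=3, winsize=15,
#                  iterations=3, poly_n=5, poly_sigma=1.2, flags=0
flow = cv2.calcOpticalFlowFarneback(prev, curr, None, 0.5, 3, 15, 3, 5, 1.2, 0)

# "What is moving in which direction?": per-pixel speed and direction
mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])
```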

11 Interest Operators
Reduce the amount of data
Track only salient features
Support region – ROI (region of interest)
Feature in ROI: Edge / Line, Blob, Corner, Contour

12 2D Point Tracking [Univ. Erlangen, VAMPIRE, EU-IST-2001-34401]
Corner detection → Initialization
– Calculate "cornerness" c
– Threshold → sensitivity, # of corners
– E.g.: "Harris" / "Plessey" corners in ROI
Cross-correlation in ROI
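A minimal sketch of the two steps named on this slide, assuming OpenCV: a Harris cornerness map with a threshold for initialization, and normalized cross-correlation of a small template inside a search ROI for the tracking step. The patch sizes, threshold factor and input file are assumptions:

```python
import cv2
import numpy as np

gray = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)     # hypothetical input

# 1) Initialization: Harris/Plessey cornerness map, then threshold -> candidates
c = cv2.cornerHarris(np.float32(gray), 2, 3, 0.04)       # blockSize=2, ksize=3, k=0.04
corners = np.argwhere(c > 0.01 * c.max())                # (row, col) of salient corners

# 2) Tracking step: cross-correlate a template around a corner inside a larger ROI
#    (the ROI would normally come from the next frame; border effects ignored here)
y, x = corners[0]
templ = gray[y - 8:y + 8, x - 8:x + 8]                   # 16x16 patch around the corner
roi   = gray[y - 24:y + 24, x - 24:x + 24]               # search window (ROI)
score = cv2.matchTemplate(roi, templ, cv2.TM_CCORR_NORMED)
_, _, _, best = cv2.minMaxLoc(score)                     # best-matching position in the ROI
```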

13 2D Point Tracking [Univ. Erlangen, VAMPIRE, EU-IST-2001-34401]

14 Edge Tracking [Rapid 95, Harris, RoRapid 95, Armstrong, Zisserman]

15 Blob Tracking [Mean Shift 03, Comaniciu, Meer]
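For the mean-shift tracker cited here, OpenCV ships a generic implementation; the sketch below shows the usual hue-histogram back-projection followed by one mean-shift step. The initial window, bin count and file names are assumptions, and this is the stock OpenCV routine, not the authors' original code:

```python
import cv2

frame = cv2.imread("frame.png")                     # hypothetical first frame
x, y, w, h = 200, 150, 60, 60                       # assumed initial blob window

hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
roi_hist = cv2.calcHist([hsv[y:y + h, x:x + w]], [0], None, [180], [0, 180])
cv2.normalize(roi_hist, roi_hist, 0, 255, cv2.NORM_MINMAX)

# Next frame: back-project the hue histogram and shift the window
# to the local density maximum (mean shift).
next_frame = cv2.imread("frame_next.png")
hsv2 = cv2.cvtColor(next_frame, cv2.COLOR_BGR2HSV)
prob = cv2.calcBackProject([hsv2], [0], roi_hist, [0, 180], 1)
term = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1)
_, (x, y, w, h) = cv2.meanShift(prob, (x, y, w, h), term)
```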

16 Contour Tracking [CONDENSATION 98-02, Isard, Toyama, Blake]

17 CONDENSATION (2) CONditional DENSity propagATION
Requires a good initialization
Works with active contours
Maintains / adapts a contour model
Can keep more than one hypothesis
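CONDENSATION is a particle filter over contour state. The NumPy sketch below shows one generic propagate, weight, resample cycle with a toy 2D position state and a placeholder likelihood; in the real algorithm the state parameterizes an active contour and the likelihood scores contour-to-edge agreement. All numbers are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
particles = rng.normal([320.0, 240.0], 20.0, size=(N, 2))   # initial hypotheses (assumed)
weights = np.full(N, 1.0 / N)

def likelihood(p):
    """Placeholder: in CONDENSATION this would score how well the contour model,
    placed at state p, matches image edges."""
    return np.exp(-np.sum((p - np.array([330.0, 245.0]))**2) / (2 * 30.0**2))

# One propagate -> weight -> resample cycle
particles += rng.normal(0.0, 5.0, size=particles.shape)     # dynamic model + process noise
weights = np.array([likelihood(p) for p in particles])
weights /= weights.sum()                                     # conditional density over states
idx = rng.choice(N, size=N, p=weights)                       # factored sampling / resampling
particles = particles[idx]                                   # multiple hypotheses are kept
estimate = particles.mean(axis=0)                            # single-hypothesis readout
```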

18 Agenda: Structure of the SSIP Lecture
Intro, terminology, applications
2D motion analysis
Geometry
3D motion analysis
Practical considerations
Existing systems
Summary, conclusions

19 Geometry
Having motion in images:
– What does it mean?
– What can be measured?
Projective camera
Algebraic projective geometry
Camera calibration
Computer Vision
– Reconstruction from uncalibrated views
There are excellent textbooks [Faugeras 1994, Hartley+Zisserman 2001, Ma et al. 2003]

20 Projective Camera (1)
Pinhole camera model:
– p = (x,y)^T is the image of P = (X,Y,Z)^T
– (x,y)... image-, (X,Y,Z)... scene-coordinates
– o... center of projection
– (x,y,z)... camera coordinate system
– (x,y,-f)... image plane
[Figure: pinhole geometry with camera axes x, y, z, focal length f, image point p(x,y), scene point P(X,Y,Z), center of projection o]

21 Projective Camera (2)
Pinhole camera model:
– If scene- = camera-coordinate system
[Figure: side view of the projection, with center of projection o, focal length f, image coordinate x and scene coordinates X, Z]

22 Projective Camera (3)
Frontal pinhole camera model:
– (x,y,+f)... image plane
– Normalized camera: f = +1
[Figure: frontal pinhole geometry with camera axes x, y, z, focal length f, image point p, scene point P(X,Y,Z), center of projection o]

23 Projective Camera (4)
"Real" camera:
– 5 intrinsic parameters (K)
– Lens distortion
– 6 extrinsic parameters (M: R, t)
– λ ... arbitrary scale: the projection λ x = K [R | t] X is defined only up to this scale

24 Algebraic Projective Geometry [Semple&Kneebone 52]
Homogeneous coordinates
Duality points ↔ lines
Homography H describes any planar projective transformation
– E.g.: image → image transform: x' = Hx
– All such transforms can be described by 3x3 matrices
– Combination of transformations: matrix product
– Examples: translation, rotation
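A small NumPy sketch of the homogeneous-coordinate mechanics: applying a 3x3 homography to an image point and composing a translation with a rotation by matrix product. The numeric values are illustrative only:

```python
import numpy as np

def apply_homography(H, x, y):
    """x' = H x in homogeneous coordinates, then dehomogenize."""
    xh = H @ np.array([x, y, 1.0])
    return xh[0] / xh[2], xh[1] / xh[2]

theta = np.deg2rad(10.0)
T = np.array([[1, 0,  5.0],                 # translation by (5, -3)
              [0, 1, -3.0],
              [0, 0,  1.0]])
R = np.array([[np.cos(theta), -np.sin(theta), 0],   # rotation by 10 degrees
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])

H = T @ R                                    # combination of transforms = matrix product
print(apply_homography(H, 100.0, 50.0))
```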

25 Camera Calibration (1)
Recover the 11 camera parameters:
– 5 intrinsic parameters (K: f s_x, f s_y, f s_θ, u_0, v_0)
– 6 extrinsic parameters (M: R, t)
Calibration target:
– At least 6 point correspondences → system of linear equations
Direct (initial) solution for K and M

26 Camera Calibration (2)
Iterative optimization
– K, M, lens distortion
– E.g. Levenberg-Marquardt
Practical solutions require more points
– Many algorithms [Tsai 87, Zhang 98, Heikkilä 00]
Overdetermined systems
Robustness against outliers
– E.g. RANSAC
Refer to [Hartley, Zisserman, 2001]
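For a planar target, the direct solution plus iterative (Levenberg-Marquardt) refinement described on these two slides is packaged in OpenCV's calibrateCamera; a sketch under the assumption of a 9x6 checkerboard and images named calib_*.png:

```python
import cv2
import numpy as np
import glob

pattern = (9, 6)                                     # inner corners of an assumed checkerboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)   # target points, Z = 0 plane

obj_points, img_points = [], []
for fname in glob.glob("calib_*.png"):               # hypothetical image set
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

# Returns intrinsics K, lens distortion, and per-view extrinsics (rvec, tvec);
# internally a direct initial solution is refined iteratively.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
```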

27 What can be measured...
… with a calibrated camera
– Viewing directions
– Angles between viewing directions
– 3D reconstruction: more than 1 view required
… with uncalibrated camera(s)
– Computer Vision research of the past decade
– Hierarchy of geometries: projective – oriented projective – affine – similarity – Euclidean

28 Agenda: Structure of the SSIP Lecture
Intro, terminology, applications
2D motion analysis
Geometry
3D motion analysis
Practical considerations
Existing systems
Summary, conclusions

29 3D Motion Analysis: Location and Orientation
[Figure: rigid transformation (R, t) between the head coordinate system and the scene coordinate system]
6 DoF pose in real-time → extrinsic parameters in real-time

30 3D Motion Analysis: Tracking technologies, terminology
Camera pose (PnP)
Stereo, E, F, epipolar geometry
Model-based tracking
– Confluence of 2D and 3D
Fusion
Kalman Filter

31 Tracking Technologies (1)
Mechanical tracking
"Magnetic tracking"
Acoustic – time of flight
"Optical" → vision-based
Compass
GPS, …
External effort required! No "self-contained" system
[Allen, Bishop, Welch. Tracking: Beyond 15 minutes of thought. SIGGRAPH'01]

32 Tracking Technologies (2) Examples [Allen, Bishop, Welch. Tracking: Beyond 15 minutes of thought. SIGGRAPH’01]

33 Research at EMT: Hybrid Tracking – HT
Combine 2 technologies:
Vision-based
+ Good results for slow motion
– Motion blur, occlusion, wrong matches
Inertial
+ Good results for fast motion
– Drift, noise, long-term stability
Fusion of complementary sensors!
Mimics human cognition!

34 Vision-Based Tracking: More Terminology
Measure position and orientation in real-time
Obtain trajectories of object(s)
Moving observer, egomotion – "inside-out" tracking
Stationary observer – "outside-in" tracking
Combinations of the above
Degrees of Freedom – DoF
– 3 DoF (mobile robot)
– 6 DoF (head tracking in AR)

35 Inside-out Tracking
monocular
exterior parameters
6 DoF from ≥ 4 points
wearable, fully mobile
Features: corners, blobs, natural landmarks

36 Outside-in Tracking
stereo-rig
IR-illumination
no cables
1 marker/device: 3 DoF
2 markers: 5 DoF
3 markers: 6 DoF
[Figure: tracked devices]

37 Camera Pose Estimation
Pose estimation: estimate extrinsic parameters from a known / unknown scene → find R, t
Linear algorithms [Quan, Lan, 1999]
Iterative algorithms [Lu et al., 2000]
Point-based methods
– No geometry, just 3D points
Model-based methods
– Object model, e.g. CAD

38 PnP (1) Perspective n-Point Problem
Calibrated camera K, C = (KK^T)^-1
n point correspondences scene → image
Known scene coordinates of the p_i, and known distances d_ij = ||p_i – p_j||
Each pair (p_i, p_j) defines an angle θ_ij, which can be measured (2 lines of sight, calibrated camera) → constraint for the distances ||c – p_i||
[Figure: camera centre c, unknown depths x_i, x_j along the lines of sight, scene points p_i, p_j, angle θ_ij, distance d_ij]
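The constraint mentioned above is the law of cosines in the triangle (c, p_i, p_j), with x_i = ||c - p_i|| the unknown depths and θ_ij the angle measured from the calibrated image points u_i, u_j:

```latex
d_{ij}^2 = x_i^2 + x_j^2 - 2\, x_i x_j \cos\theta_{ij},
\qquad
\cos\theta_{ij} = \frac{u_i^{\top} C\, u_j}
                       {\sqrt{u_i^{\top} C\, u_i}\,\sqrt{u_j^{\top} C\, u_j}},
\qquad C = (K K^{\top})^{-1}
```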

39 PnP (2)
[Figure: PnP geometry with camera centre c, image points u_i, u_j, unknown depths x_i, x_j, scene points p_i, p_j and known distance d_ij]

40 PnP (3)
P3P, 3 points: ambiguous, up to 4 solutions
P4P, 4 points: overdetermined, 6 equations, 4 unknowns
– 4 x P3P, then find a common solution
General problem: PnP, n points

41 PnP (4)
Once the x_i have been solved:
1) Project the image points back into the scene: p'_i = x_i K^-1 u_i
2) Find a common R, t for p'_i → p_i (point correspondences → solve a simple system of linear equations)
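In practice both steps are usually wrapped in a single PnP call; a sketch using OpenCV's solvePnP with made-up scene points, image points and intrinsics:

```python
import cv2
import numpy as np

# Known scene points p_i (e.g. corners of a known object, in metres) ...
object_pts = np.array([[0, 0, 0], [0.1, 0, 0], [0.1, 0.1, 0],
                       [0, 0.1, 0], [0, 0, 0.1], [0.1, 0, 0.1]], dtype=np.float64)
# ... and their measured image projections u_i (pixels); values are illustrative
image_pts = np.array([[320, 240], [400, 238], [402, 310],
                      [322, 312], [318, 180], [398, 178]], dtype=np.float64)

K = np.array([[800.0,   0.0, 320.0],       # assumed intrinsics from calibration
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
dist = np.zeros(5)                          # assume negligible lens distortion

ok, rvec, tvec = cv2.solvePnP(object_pts, image_pts, K, dist)
R, _ = cv2.Rodrigues(rvec)                  # extrinsic rotation R and translation tvec
```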

42 Stereo Reconstruction
Elementary stereo geometry in "canonical configuration"
2h … "baseline" b
P_r - P_l … "disparity" d
There is only column disparity
Depth computation: z = f · b / d
[Figure: canonical stereo configuration with camera centres C_l, C_r, focal length f, baseline 2h, image points P_l, P_r and scene point P(x,y,z)]
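A worked instance of the depth formula with illustrative numbers:

```python
# Canonical stereo configuration: depth from disparity, z = f * b / d
f = 800.0        # focal length in pixels (assumed)
b = 0.12         # baseline 2h in metres (assumed)
d = 16.0         # measured column disparity in pixels
z = f * b / d    # -> 6.0 metres
```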

43 Stereo (2)
2 cameras, general configuration: epipolar geometry
[Figure: epipolar geometry with camera centres C_l, C_r, scene point X, image points u_l, u_r, epipoles e_l, e_r and epipolar lines]

44 Stereo (3)
Uncalibrated cameras: Fundamental matrix F
Calibrated cameras: Essential matrix E
3x3, rank 2
Many algorithms
– Normalized 8-point [Hartley 97]
– 5-point (structure and motion) [Nister 03]
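A sketch of both cases in OpenCV: F from uncalibrated correspondences with the normalized 8-point algorithm, and E plus relative pose for a calibrated camera (OpenCV's essential-matrix estimator is based on the 5-point solver). The point arrays and K below are placeholders, not real measurements:

```python
import cv2
import numpy as np

pts1 = np.random.rand(20, 2) * 640           # placeholder correspondences in image 1
pts2 = pts1 + np.random.rand(20, 2) * 2      # ... and in image 2 (illustrative only)
K = np.array([[800.0, 0, 320.0],             # assumed intrinsics
              [0, 800.0, 240.0],
              [0, 0, 1.0]])

# Uncalibrated: fundamental matrix via the normalized 8-point algorithm
F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_8POINT)

# Calibrated: essential matrix (5-point + RANSAC), then relative pose R, t
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K)
```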

45 Model-Based Tracking Confluence of 2D and 3D [Deutscher, Davison, Reid 01]

46 3D Motion Analysis: Tracking technologies, terminology
Camera pose (PnP)
Stereo, E, F, epipolar geometry
Model-based tracking
– Confluence of 2D and 3D
Fusion
Kalman Filter
1. General considerations
2. Kalman Filter
3. → EMT HT project

47 General Considerations
We have:
– Several sensors (vision, inertial,...)
– Algorithms to deliver pose streams for each sensor (at discrete times; rates may vary depending on sensor, processor, load,...)
Thus, we need:
– Algorithms for sensor fusion (weighting the confidence in a sensor, sensor accuracy,...)
– Pose estimation including a temporal model
[Allen, Bishop, Welch. Tracking: Beyond 15 minutes of thought. SIGGRAPH'01]

48 Sensor Fusion
Dealing with ignorance:
– Imprecision, ambiguity, contradiction,...
Mathematical models
– Fuzzy sets
– Dempster-Shafer evidence theory
– Probability theory
Probabilistic modeling in Computer Vision
– The topic of this decade!
– Examples: CONDENSATION, mean shift

49 Kalman Filter (1) http://www.cs.unc.edu/~welch/kalman [Welch, Bishop. An Introduction to the Kalman Filter. SIGGRAPH’01]

50 Kalman Filter (2)
Estimate a process with a measurement
x ∈ R^n... state of the process
z ∈ R^m... measurement
p, v... process and measurement noise (zero mean)
A... n x n matrix, relates the previous with the current time step
B... n x l matrix, relates optional control input u to x
H... m x n matrix, relates state x to measurement z

51 Kalman Filter (3) Definitions Then:

52 Kalman Filter (4) Compute with
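Slides 51 and 52 showed the filter definitions and equations as images that did not survive the transcript. For completeness, here are the standard discrete Kalman filter equations in the Welch and Bishop formulation that the lecture cites (the slide's process noise p corresponds to w below):

```latex
% Process and measurement models (slide 50):
x_k = A x_{k-1} + B u_{k-1} + w_{k-1}, \qquad z_k = H x_k + v_k,
\qquad w \sim \mathcal{N}(0, Q), \; v \sim \mathcal{N}(0, R)

% Time update (predict):
\hat{x}_k^- = A \hat{x}_{k-1} + B u_{k-1}, \qquad
P_k^- = A P_{k-1} A^{\top} + Q

% Measurement update (correct):
K_k = P_k^- H^{\top} \left( H P_k^- H^{\top} + R \right)^{-1}, \qquad
\hat{x}_k = \hat{x}_k^- + K_k \left( z_k - H \hat{x}_k^- \right), \qquad
P_k = (I - K_k H) P_k^-
```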

53 Kalman Filter (5) http://www.cs.unc.edu/~welch/kalman

54 Kalman Filter (6) http://www.cs.unc.edu/~welch/kalman

55 EMT Hybrid Tracker HT Project
Ingredients of hybrid tracking:
Camera(s)
Inertial sensors
Feature extraction
Pose estimation
Structure estimation
Real-time
Synchronisation
Kalman filter
Sensor fusion

56 Hybrid Tracking 6 DoF vision-based tracker 6 DoF inertial tracker Fusion by a Kalman filter

57 “Structure and Motion” “Tracking” + “Structure from Motion”

58 Research Prototype Tracking subsystem Visualization subsystem Sensors + HMD

59 HT Application Example

60 Agenda: Structure of the SSIP Lecture
Intro, terminology, applications
2D motion analysis
Geometry
3D motion analysis
Practical considerations
Existing systems
Summary, conclusions

61 Practical Considerations
There are critical configurations!
Projective geometry vs. discrete pixels
– Rays do not intersect!
– Error minimization algorithms required
Robustness (many points) vs. real-time
– Outlier detection can become difficult!
Precision (iterative) vs. real-time (linear)
Combination of diverse features
– Points, lines, curves
Jitter, lag
Debugging of a real-time system!

62 Existing Systems (1)
VR/AR
– Intersense, Polhemus, A.R.T.
MoCap
– Vicon, A.R.T.
Medical tracking
– MedTronic, A.R.T.
Fiducial tracker (Intersense)
Research systems
– KLT (Kanade, Lucas, Tomasi)
– ARToolkit (Billinghurst)
– XVision (Hager)

63 Existing Systems (2)

64 Open Issues
Tracking of natural landmarks
First success in online structure and motion
– [Nister CVPR03, ICCV03, ECCV04]
(Re-)Initialisation in highly complex scenes
Usability!

65 Future Applications
Can pose (position and orientation) be exploited?
– What is the user looking at?
– Architecture, city guide, museum, emergency, …
From bulky gear and HMD → PDA
– Wireless communication
– Camera(s)
– Inertial sensors (+ compass, + GPS, …)
Automotive!
– Driver assistance
– Autonomous vehicles, mobile robot navigation, …
Medicine!
– Surgical navigation
– Online fusion (temporal genesis, sensory modes, …)

66 Summary, Conclusions
Real-time pose (6 DoF)
2D and 3D motion analysis
Geometry
Probabilistic modeling
High potential for future developments

67 Acknowledgements
EU-IST-2001-34401 Vampire - Visual Active Memory Processes and Interactive Retrieval
FWF P15748 Smart Tracking
FWF P14470 Mobile Collaborative AR
Christian Doppler Laboratory for Automotive Measurement Research
Markus Brandner, Harald Ganster, Bettina Halder, Jochen Lackner, Peter Lang, Ulrich Mühlmann, Miguel Ribo, Hannes Siegl, Christoph Stock, Georg Teichtmeister, Jürgen Wolf

68 Further Reading
R. Hartley, A. Zisserman. Multiple View Geometry in Computer Vision. Cambridge University Press, 2nd ed., 2003.
Y. Ma, S. Soatto, J. Kosecka, S. Shankar Sastry. An Invitation to 3D Vision. Springer, 2004.
B.D. Allen, G. Bishop, G. Welch. Tracking: Beyond 15 Minutes of Thought. SIGGRAPH 2001, Course 11. See http://www.cs.unc.edu/~welch
G. Welch, G. Bishop. An Introduction to the Kalman Filter. SIGGRAPH 2001, Course 8. See http://www.cs.unc.edu/~welch

