
1 Tracking and Data Fusion for 4D Visualization
Ulrich Neumann, Suya You
Computer Science Department, Integrated Media Systems Center, University of Southern California
MURI Review, June 2002

2 Research Goals
Combine all manner of 3D models, images, video, and data in a coherent dynamic visualization to support spatio-temporal data understanding, information extraction, and dynamic scene change detection.
Diagram: data sources (3D models from LiDAR, laser, and stereo; video/images from satellite, aerial, and ground sensors; data such as text, hyper-links, and communications) feed into information fusion & visualization.
Needs: data fusion, tracking, model refinement.

3 Research Highlights
Robust tracking for outdoor unprepared environments
– Portable hybrid tracking and data acquisition/fusion system
– Natural feature tracking using vision sensors
Fusion of 2D video/images and 3D models
– LiDAR data tessellation and model reconstruction
– Real-time video texture projection and visualization
6DOF auto-calibration technology
– Detect and calibrate scene features (points and lines) to refine models and aid in tracking
We pursue basic algorithm research and testbed implementations that are feasible with current or near-term technology.

4 Tracking in Unprepared Environments
People with sensors (or unmanned sensors) moving through the environment provide textures and data for visualizations. Where are they? Where are they looking?
Need 6DOF pose tracking over a wide area outdoors
– Varying sensor data availability and data rates: vision, GPS, inertial sensors
– Varying certainty of measurements: spatial and temporal noise and precision
– Fusion models and algorithms: an underdetermined system needs constraints; real-time acquisition and execution on portable systems
Developed two tracking systems (a minimal pose-assembly sketch follows this slide)
– Portable hybrid tracking and real-time data acquisition system
– Natural feature tracking for computing the motion of the video sensor
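As a rough illustration of how DGPS and inertial measurements combine into a single 6DOF pose, the following NumPy sketch assembles a 4x4 world-from-sensor transform from a position fix and Euler angles. It is not the authors' fusion algorithm; the function name, Euler convention, and local ENU frame are illustrative assumptions.

```python
import numpy as np

def pose_from_gps_inertial(gps_enu, roll, pitch, yaw):
    """Assemble a 6DOF sensor pose from a DGPS position fix (metres, local
    ENU frame) and a 3DOF inertial orientation (radians).
    Returns a 4x4 world-from-sensor transform. Illustrative only."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Z-Y-X Euler convention (yaw, then pitch, then roll) -- an assumption.
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    T = np.eye(4)
    T[:3, :3] = Rz @ Ry @ Rx          # orientation from the inertial tracker
    T[:3, 3] = gps_enu                # position from DGPS
    return T

# Example: sensor 10 m east, 5 m north, 1.5 m up, heading 30 degrees.
print(pose_from_gps_inertial([10.0, 5.0, 1.5], 0.0, 0.0, np.radians(30)))
```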

5 Portable Acquisition System
System configuration:
– RTK differential GPS (Ashtech Z-Sensor base/mobile)
– 3D inertial tracker (Intersense IS300)
– Real-time stereo head (SRI International)
– PIII 866 MHz laptop
Diagram: the DGPS receiver, 3DOF gyro sensor, and stereo camera head connect to the laptop (Com1, Com2, FireWire) for data fusion and storage.

6 Real-Time Data Acquisition
Hybrid DGPS and inertial sensors provide real-time 6DOF pose tracking
High-resolution digital camera pairs capture video streams for texture projection and façade reconstruction
Complete self-contained system in a backpack
Acquisition in real time: all data streams are time-stamped and synchronized (see the resampling sketch below)
Includes data capture and playback tools
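To illustrate what "time-stamped and synchronized" implies in practice, the sketch below aligns an asynchronously logged sensor stream (e.g. 5 Hz DGPS) onto the timestamps of another stream (e.g. 30 Hz video) by linear interpolation. The function and variable names are hypothetical; the slides do not describe the acquisition software at this level.

```python
import numpy as np

def align_stream(t_query, t_samples, values):
    """Resample a time-stamped sensor stream onto query timestamps by
    per-channel linear interpolation. t_samples must be sorted ascending."""
    values = np.asarray(values, dtype=float)
    return np.stack([np.interp(t_query, t_samples, values[:, k])
                     for k in range(values.shape[1])], axis=1)

# Example: 5 Hz GPS positions resampled onto 30 Hz video frame times.
t_gps = np.arange(0.0, 2.0, 0.2)                       # 5 Hz samples
gps_xyz = np.column_stack([t_gps, 2.0 * t_gps, np.zeros_like(t_gps)])
t_video = np.arange(0.0, 1.8, 1.0 / 30.0)              # 30 Hz frame times
print(align_stream(t_video, t_gps, gps_xyz)[:3])
```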

7 Natural Feature Tracking Using Vision Sensors
Problem: most vision tracking methods require a priori knowledge about the environment
– Pre-calibrated landmarks
– Active sensors
– Scene models
Active control or modification of an outdoor environment is unrealistic.
Our approach:
– Detect and use naturally occurring features
– Robust tracking of 1D (point) and 2D (region) features
– SFM (structure from motion) from the tracked features: neither the camera ego-motion nor the structure is known in advance

8 Natural Feature Tracking Using Vision Sensors
The approach provides camera pose and structure estimates
– The resulting pose tracking can be used directly for augmented reality overlays
– Structure estimates allow continual tracking and can also be used to improve/refine existing models
– The framework allows further sensor fusion (GPS, gyroscopes) for absolute pose reconstruction
Pipeline (diagram): video streams feed 2D feature detection (new features) and 2D feature tracking; pose and structure are estimated and verified, maintaining a feature list. A sketch of one iteration of this loop follows below.
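A minimal sketch of one iteration of such a detect/track/verify loop, using OpenCV's pyramidal Lucas-Kanade tracker and essential-matrix RANSAC. It recovers relative pose up to scale and is not the authors' tracker, just one common way to realize the same pipeline.

```python
import cv2
import numpy as np

def track_and_estimate(prev_gray, gray, prev_pts, K):
    """One iteration of the 2D feature tracking + pose/structure loop:
    track features with pyramidal Lucas-Kanade optical flow, reject
    outliers with essential-matrix RANSAC (feature verification), then
    recover the relative camera pose (up to scale)."""
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, prev_pts, None)
    good_prev = prev_pts[status.ravel() == 1]
    good_next = nxt[status.ravel() == 1]
    E, inliers = cv2.findEssentialMat(good_prev, good_next, K,
                                      method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, good_prev, good_next, K, mask=inliers)
    return R, t, good_next

# New features are (re)detected when the track count drops, e.g.:
# pts = cv2.goodFeaturesToTrack(gray, maxCorners=300,
#                               qualityLevel=0.01, minDistance=10)
```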

9 Natural Feature Tracking

10 Vision Tracking Used for AR Overlays

11 Fusion of 2D Video/Images and 3D Models
Imagine dozens of video streams from people, UAVs, and sensors distributed and moving through the scene.
Use sensor models and 3D models of the scene to integrate video/image data from different sources
Visualize the imagery in unique, innovative ways that maximize information extraction and comprehension
Produce dynamic visualizations from arbitrary viewpoints
– Static textures: pre-computed mapping of images onto models
– Dynamic projections onto models, like "slide projectors"

12 3D Model: LiDAR from Flyover
LiDAR provides accurate 3D position samples (sub-meter horizontal, cm-level height accuracy)
– Used as the base model/context for video visualization
Raw LiDAR comes as a 3D point cloud
– Need tools for data resampling, tessellation, and model reconstruction

13 Data Tessellation and Model Reconstruction
Data tessellation
– Data re-sampling (irregular sample cloud to regular grid)
– Surface interpolation (hole filling)
3D models represented as triangle meshes
– Easily converted to many other geometric representations
– Support many level-of-detail techniques
– Easily carry photometric information (texture projections)
– Hardware acceleration for fast image rendering
Use VRML as the standard model representation
– Supports web applications and an open tool base
A resampling sketch follows below.
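A minimal sketch of the resampling and hole-filling step using SciPy; the grid spacing, interpolation methods, and function name are assumptions, not the tools described in the slides.

```python
import numpy as np
from scipy.interpolate import griddata

def lidar_to_range_grid(points_xyz, cell=1.0):
    """Resample an irregular LiDAR point cloud (N x 3, metres) onto a
    regular elevation grid; a nearest-neighbour pass fills the holes
    (NaNs) left by the linear interpolation."""
    xy, z = points_xyz[:, :2], points_xyz[:, 2]
    xs = np.arange(xy[:, 0].min(), xy[:, 0].max(), cell)
    ys = np.arange(xy[:, 1].min(), xy[:, 1].max(), cell)
    gx, gy = np.meshgrid(xs, ys)
    grid = griddata(xy, z, (gx, gy), method='linear')
    holes = np.isnan(grid)
    grid[holes] = griddata(xy, z, (gx[holes], gy[holes]), method='nearest')
    return gx, gy, grid

# Each grid cell then yields two triangles of the reconstructed mesh,
# e.g. (i,j)-(i+1,j)-(i,j+1) and (i+1,j)-(i+1,j+1)-(i,j+1).
```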

14 Data Resampling and Model Reconstruction
LiDAR data acquired for the USC campus (figures: resampled range image; reconstructed 3D model).

15 Image/Video Texture Projection
Texture projection vs. texture map
– Dynamic vs. static
– With projection, the texture image and its position both change each video frame
(Diagram: image texture projected onto the 3D model.)

16 (Video) Virtual Texture Projector
Dynamic view and placement control during visualization
Update the pose and "paint" the scene each frame
Hardware texture transformation during rendering
Real-time visualization with hardware acceleration

17 Dynamic Texture Projection
Putting it all together:
– Accurate 3D models
– Accurate 3D sensor models (calibration and 6DOF tracking)
– A projection transformation computes the texture mapping during rendering (see the sketch below)
– Visibility and occlusion processing: a multi-pass algorithm ensures that only visible surfaces are textured
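The projection transformation amounts to projecting each model vertex through the tracked camera, as in this NumPy sketch; in the actual system this is done by the hardware texture matrix, and true occlusion requires the multi-pass visibility test mentioned above. Function and parameter names are illustrative.

```python
import numpy as np

def projector_texcoords(vertices, K, R, t, width, height):
    """Project model vertices through the tracked camera ("slide projector")
    to obtain per-vertex texture coordinates for the current video frame.
    Vertices behind the projector or outside its frustum are flagged so a
    later pass can leave them untextured (occlusion needs an extra pass)."""
    Xc = (R @ vertices.T + t.reshape(3, 1)).T      # world -> camera frame
    uvw = (K @ Xc.T).T                             # pinhole projection
    uv = uvw[:, :2] / uvw[:, 2:3]                  # perspective divide
    in_front = uvw[:, 2] > 0
    in_image = (uv[:, 0] >= 0) & (uv[:, 0] < width) & \
               (uv[:, 1] >= 0) & (uv[:, 1] < height)
    texcoords = uv / np.array([width, height])     # normalise to [0, 1]
    return texcoords, in_front & in_image
```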

18 video texture projected on USC model (GPS/inertial tracking)

19 video texture projected on USC model (vision/GPS/inertial tracking)

20 video texture projected on USC LiDAR/model (vision/GPS/inertial)

21 LiDAR/Projection Rendering System
Visualizations from arbitrary viewpoints
User control of the viewpoint as well as image inclusion (blending and projection parameters)
Multi-texture projectors simultaneously visualize many image sources projected onto the same model
Hardware acceleration with a GeForce graphics card, using pixel-shader features
Visualization wall (8x10-foot tracked stereo)

22 6DOF Autocalibration
Autocalibration computes 3D scene information during tracking
– Allows tracking in regions beyond existing models or landmarks
– Provides the scale-factor data needed to recover the sixth DOF that is lacking from vision-only tracking
– Provides absolute pose data for stabilizing the multi-sensor data fusion
– Provides estimated 3D information on structure features (points, lines, and edges) to improve/refine models

23 6DOF Autocalibration
Camera pose and features are estimated simultaneously along the motion path
– Point features (developed in the past 2-3 years)
– Line features (this past year)
Pose is estimated from 3D calibrated lines or points
Autocalibration of 3D lines
– Unique, minimal line representation: four parameters, two plane normals N1 and N2 each of the form (x, y, 1), which for a given pair of camera poses (R1 T1, R2 T2) uniquely determine a 3D line L
– EKF-based estimator: updates per feature or measurement; adapts to different sample rates (a generic update sketch follows below)
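The slides do not give the filter equations; the sketch below is only the generic EKF measurement update that such a per-feature estimator would apply, with the state taken to be, for example, camera pose plus line parameters. All names are illustrative.

```python
import numpy as np

def ekf_update(x, P, z, h_x, H, R_meas):
    """Generic EKF measurement update, applied per feature as measurements
    arrive (which also accommodates sensors with different sample rates).
    x, P   : state estimate and covariance (e.g. camera pose + line params)
    z      : measurement (e.g. an observed image line)
    h_x    : predicted measurement h(x)
    H      : Jacobian of the measurement model at x
    R_meas : measurement noise covariance"""
    S = H @ P @ H.T + R_meas                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x_new = x + K @ (z - h_x)                 # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P      # corrected covariance
    return x_new, P_new
```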

24 Simulation Results
Simulation with a 100-inch volume and 50 lines

25 Autocalibration Results
Tracked line features are marked in green; red lines are the projections of the auto-calibrated lines
A virtual lamp and chair are inserted into the real scene based on the estimated camera pose

26

27 Future Plans
Automate steps in data capture and fusion for vision/GPS/inertial tracking and visualization
– Do 8-12 captures around 3-4 buildings on campus
– Visualize the streams simultaneously
– Correspondences from video to LiDAR/models: edges, windows, ...
Model refinement (w/ Cal, GT)
– Constrained autocalibration for estimating building features: simplification and definition of edges, planar faces, windows, ...
Temporal data management (w/ Cal, GT, UCSC, USyr)
– Accumulation of persistent textures
– User management and temporal blending (updates)

28

29 Dynamic Texture Projection
Benefits and capabilities
– Real-time multi-source data fusion: models, imagery, video, maps, ...
– Enhanced data understanding: dynamic control of the viewpoint as well as image inclusion; rapid updates to reflect the most recent information; highly interactive real-time perspective-view capability
– 3D data editing using photogrammetric methods: enables reuse of models and incremental refinement

30 Projection on LiDAR Data
Aerial view of projected image texture (campus of Purdue University)
(Figure labels: sensor, image plane, view frustum.)

