Driver’s View and Vehicle Surround Estimation using Omnidirectional Video Stream

Kohsia S. Huang, Mohan M. Trivedi, and Tarak Gandhi
khuang@ucsd.edu, mtrivedi@ucsd.edu, tgandhi@ece.ucsd.edu
Computer Vision & Robotics Research (CVRR) Laboratory
University of California at San Diego, La Jolla, CA 92093-0434

Abstract
Our research is focused on the development of novel machine vision-based telematic systems that provide non-intrusive probing of the state of the driver and of the driving conditions. In this paper we present a system that allows simultaneous capture of the driver's head pose, the driving view, and the surroundings of the vehicle. The integrated machine vision system utilizes a video stream with a full 360-degree panoramic field of view. The processing modules include perspective transformation, feature extraction, head detection, head pose estimation, driving view synthesis, and motion segmentation. The paper presents a multi-state statistical decision model with Kalman-filter-based tracking for head pose detection and face orientation estimation. The basic feasibility and robustness of the approach are demonstrated with a series of systematic experimental studies.

Keywords: driver head tracking, face orientation estimation, driver’s view generation, surround vehicle detection.

Research Objective
Accurate, real-time estimation of the driver’s face orientation, the driver’s view, and the vehicle surround for a driver assistance system.

Driver Head Detection and Tracking
The head detector operates on a perspective transformation of the driver’s-seat region of the omni image. Each frame is sub-sampled, converted to grayscale, and equalized; edges are detected and constrained ellipse detection by randomized Hough transform (RHT) is run inside an ellipse search window. Head candidates are extracted and classified as face or non-face by distance from feature space (DFFS). A Kalman filter is updated with each accepted detection, predicts the head location in the next frame, and passes the tracked head on to face orientation estimation. (Code sketches of the DFFS test and the tracking filter follow the result tables below.)

Computation of Driver’s Viewing Direction
The driver’s viewing direction is expressed in the omnicamera’s panoramic coordinate frame (0 to 360 degrees, with 0 degrees at the camera) by combining the direction of the driver and the direction of the car with the estimated face orientation. (A sketch of this angle bookkeeping is given below.)

Face Orientation Estimation
Scheme 1: the detected and tracked head is tilt-compensated, projected into a feature subspace, and evaluated against a bank of view-based Gaussian likelihood functions; head pose and face orientation are obtained by maximum-likelihood (ML) estimation over the state sequence. (See the sketch below.)
Scheme 2 (future): the tilt-compensated head is scored with view-based face orientation likelihood functions and combined with the pan/tilt angles to the camera and the omnicamera viewing direction in a Kalman filter that outputs the orientation estimate.

Driver’s View Generation
Perspective views along the estimated viewing direction are synthesized from the same omni-video stream.

Results

Head detection and tracking, average performance:
DFFS bound | False positive
2500       | 9%
2000       | 7%

Head detection before Kalman filtering (DFFS bound = 2500):
Detection scheme             | Setup 1 (side view) | Setup 2 (front view)
Rough RHT, 1 epoch           | 32%                 | 50%
Rough RHT, 2 epochs          | 52%                 | 61%
Extensive RHT, 10 epochs     | 71%                 | 79%
RHT + feedback, 10→1 epoch   | 64%                 | 73%
RHT + feedback, 10→2 epochs  | 67%                 | 87%
Head detection after Kalman filtering: 100%

Face orientation estimation error:
Clip | Frames | Before KF (mean / std) | After KF (mean / std) | Note
#1   | 200    | -1° / 8°               | -1° / 7°              |
#2   | 75     | -19° / 27°             | 18° / 24°             | Uneven illumination
#3   | 70     | 1° / 7°                | 0° / 8°               |
#4   | 30     | 16° / 28°              | -15° / 16°            | Face occlusion
#5   | 15     | 0° / 19°               | 4° / 7°               |
#6   | 15     | -3° / 8°               | -2° / 3°              |
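The face/non-face classification step accepts head candidates whose distance from feature space (DFFS) falls below a bound; the results above quote bounds of 2500 and 2000. A minimal sketch of that test, assuming a PCA face subspace trained offline on example face patches; the function names, patch handling, and default bound here are illustrative rather than taken from the poster.

```python
import numpy as np

def fit_face_subspace(face_patches, k=20):
    """PCA 'face space' from flattened training patches (one patch per row)."""
    X = np.asarray(face_patches, dtype=float)
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]                     # mean patch and top-k eigenvectors

def dffs(patch, mean, basis):
    """Distance from feature space: reconstruction error after projection."""
    x = patch.ravel().astype(float) - mean
    coeffs = basis @ x                      # project into the face subspace
    residual = x - basis.T @ coeffs         # component outside the subspace
    return float(residual @ residual)

def is_face(patch, mean, basis, bound=2500.0):
    # Candidates whose DFFS is below the bound are accepted as faces;
    # the poster reports false-positive rates for bounds of 2500 and 2000.
    return dffs(patch, mean, basis) < bound
```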
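The tracking stage updates a Kalman filter with each accepted head detection and predicts the head location, and hence the ellipse search window, for the next frame. A sketch of a constant-velocity filter in that role; the state layout and noise levels are assumptions, not values from the poster.

```python
import numpy as np

class HeadTracker:
    """Constant-velocity Kalman filter over the head centre (u, v, du, dv)."""
    def __init__(self, u, v, q=1.0, r=4.0):
        self.x = np.array([u, v, 0.0, 0.0])          # state
        self.P = np.eye(4) * 100.0                   # state covariance
        self.F = np.array([[1, 0, 1, 0],
                           [0, 1, 0, 1],
                           [0, 0, 1, 0],
                           [0, 0, 0, 1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], dtype=float)
        self.Q = np.eye(4) * q                       # process noise
        self.R = np.eye(2) * r                       # measurement noise

    def predict(self):
        """Predict the head location in the next frame (centre of the search window)."""
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, u, v):
        """Update with the ellipse centre found by RHT + DFFS in the current frame."""
        z = np.array([u, v])
        y = z - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]
```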
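Scheme 1's decision step can be read as: project the tilt-compensated head image into the feature subspace and pick the view whose Gaussian likelihood is highest. A sketch of that maximum-likelihood choice, assuming per-view Gaussian parameters learned offline; the names and data layout are illustrative.

```python
import numpy as np

def ml_face_orientation(feature_vec, views):
    """Pick the view whose Gaussian gives the highest log-likelihood.

    views: list of (angle_deg, mean, cov) learned from training images
           of the head at known orientations.
    """
    best_angle, best_ll = None, -np.inf
    for angle, mean, cov in views:
        d = feature_vec - mean
        _, logdet = np.linalg.slogdet(cov)
        ll = -0.5 * (d @ np.linalg.solve(cov, d) + logdet
                     + len(d) * np.log(2 * np.pi))
        if ll > best_ll:
            best_angle, best_ll = angle, ll
    return best_angle, best_ll
```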
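The viewing-direction computation is an exercise in angle bookkeeping in the omnicamera's 0 to 360 degree frame. A minimal sketch under an assumed sign convention; in particular, the 180 degree offset between the direction toward the driver and the driver's frontal direction is my assumption, not stated on the poster.

```python
def driver_view_direction(face_orientation_deg, driver_dir_deg, car_dir_deg=0.0):
    """Viewing direction in the panoramic (0-360 degree) camera frame.

    face_orientation_deg: face orientation relative to the driver's frontal pose
    driver_dir_deg:       direction of the driver as seen by the omnicamera
    car_dir_deg:          direction of the car's heading in the camera frame
    """
    # Assumed convention: when the face orientation is zero, the driver looks
    # back toward the camera side, i.e. 180 degrees away from his own bearing.
    view = (driver_dir_deg + 180.0 + face_orientation_deg) % 360.0
    # Relative to the car's heading, if needed by the view-synthesis module.
    view_rel_car = (view - car_dir_deg) % 360.0
    return view, view_rel_car
```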
Surround Vehicle Detection
Pipeline: omni-video stream → flat-plane transform → ego-motion compensation (using motion transform parameters derived from calibration and the CAN bus, with delay compensation) → normalized frame difference → spatial/temporal gradients (g_x, g_y, g_t) → motion parameter correction (prior x⁻, P⁻ updated to x, P through measurement matrix H) → post-processing & clustering → inverse flat-plane transform → obstacle positions.

Result panels: current frame with the estimated image motion in the area of interest; points used for estimation of ego-motion (gray: inliers, white: outliers, black: unused); normalized frame difference in the area of interest; output after post-processing and clustering.

Bayesian Correction to Motion Parameters
- Approximate motion parameters are obtained from calibration and the CAN bus.
- Ego-motion is removed with the planar motion compensation equation.
- The optical flow constraint is satisfied under favorable conditions.
- Image motion is expressed parametrically in terms of the motion parameters for a number of image points.
- The correction is performed by an update similar to an iterated extended Kalman filter.
(The standard forms of these equations and a code sketch of the update are given after the Summary.)

Summary
Simultaneous driver head detection, driver face pose estimation, driving view generation, and surround vehicle monitoring in real time from a single omni-video stream. Suitable for novel televiewing interfaces, driver assistance systems, and driver distraction studies.
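The equations referenced in the Bayesian-correction list did not survive transcription. The standard forms they presumably correspond to, written with the poster's symbols (g_x, g_y, g_t for spatial/temporal gradients; x⁻, P⁻ and x, P for prior and corrected parameter estimates; H for the measurement Jacobian), are sketched here; the exact parameterization of the planar motion model on the poster may differ.

```latex
% Planar motion compensation: the previous frame is warped with the planar
% motion warp w(.;\theta) before differencing (up to normalization):
\[
  \hat{I}_{t-1}(\mathbf{x}) = I_{t-1}\!\big(w(\mathbf{x};\theta)\big),\qquad
  D(\mathbf{x}) = \big|\,I_t(\mathbf{x}) - \hat{I}_{t-1}(\mathbf{x})\,\big|
\]
% Optical flow constraint at an image point with image motion (\dot u, \dot v):
\[
  g_x\,\dot{u} + g_y\,\dot{v} + g_t = 0
\]
% Image motion expressed parametrically in the motion parameters \theta at point i,
% giving one flow-constraint residual per selected point:
\[
  \begin{pmatrix}\dot{u}_i\\ \dot{v}_i\end{pmatrix} = f_i(\theta),\qquad
  h_i(\theta) = g_{x,i}\,\dot{u}_i(\theta) + g_{y,i}\,\dot{v}_i(\theta) + g_{t,i} \approx 0
\]
% Correction of the approximate parameters by an update similar to an iterated
% extended Kalman filter, with x_0 = x^- and H_k = \partial h/\partial\theta |_{x_k}:
\[
  K_k = P^- H_k^\top\!\left(H_k P^- H_k^\top + R\right)^{-1},\qquad
  x_{k+1} = x^- - K_k\!\left[h(x_k) + H_k\,(x^- - x_k)\right],\qquad
  P = (I - K_k H_k)\,P^-
\]
```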
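And a code sketch of the same correction step: the image motion predicted from the current parameter estimate is relinearized at each pass and the parameters are refined with an iterated-EKF-style update. The helpers `predict_flow` and `jac_flow` are hypothetical placeholders for whatever planar motion parameterization the system uses.

```python
import numpy as np

def correct_motion_params(x_prior, P_prior, gx, gy, gt, predict_flow, jac_flow,
                          r=1.0, iters=3):
    """Refine approximate planar-motion parameters from image gradients.

    x_prior, P_prior : prior parameter estimate and covariance (e.g. CAN bus / calibration)
    gx, gy, gt       : spatial and temporal gradients at N selected image points
    predict_flow(x)  -> (u_dot, v_dot) arrays of predicted image motion at those points
    jac_flow(x)      -> (N, 2, p) Jacobian of (u_dot, v_dot) w.r.t. the p parameters
    """
    n = len(gx)
    R = np.eye(n) * r                  # measurement noise on flow-constraint residuals
    x = x_prior.copy()
    for _ in range(iters):             # iterated EKF: relinearize about the current estimate
        u_dot, v_dot = predict_flow(x)
        h = gx * u_dot + gy * v_dot + gt                             # residuals, ideally ~0
        J = jac_flow(x)                                              # (N, 2, p)
        H = gx[:, None] * J[:, 0, :] + gy[:, None] * J[:, 1, :]     # (N, p) measurement Jacobian
        S = H @ P_prior @ H.T + R
        K = P_prior @ H.T @ np.linalg.inv(S)
        # IEKF measurement update with target measurement 0 (residuals driven to zero)
        x = x_prior - K @ (h + H @ (x_prior - x))
    P = (np.eye(len(x)) - K @ H) @ P_prior
    return x, P
```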

