Signal and Image Processing Lab


Andrew and Erna Viterbi Faculty of Electrical Engineering

Distance Estimation of Marine Vehicles Using a Monocular Video Camera
Ran Gladstone and Avihai Barel, supervised by Yair Moshe

Introduction
Unmanned surface vehicles (USVs) are vessels that operate on the surface of the water without a crew. Distance estimation of marine vehicles is a critical requirement for their autonomous navigation, yet USV size limitations prevent the use of multiple sensors.
[Figure: a USV at sea]

Goals
- Estimate the distance of marine vehicles from a USV based on video
- Use a single monocular visual camera as input
- Treat varying environmental conditions robustly
- Keep the algorithm computationally efficient

Challenges
- No prior information on object shape, size, or velocity
- Fast-changing environment: waves, weather, sea traffic
- Camera movement
- Sun glare
- No suitable solutions in the literature

Horizon Detection
A Hough-transform-based solution gives accurate and consistent horizon detection. Adaptive thresholds help in handling varying environmental conditions, such as sun glare.
Pipeline: Input Frame → Noise Reduction → Histogram Equalization → Canny Edge Detection → Morphological Erosion → Hough Transform → Choosing the Maximal Non-Vertical Line → Temporal Median Filter → Horizon Line
[Figure: horizon line detection in two different settings]

ROI Detection
The tracker supplies a pixel location somewhere on or near the object. The ROI (region of interest) should contain the contact point of the vehicle with the sea surface.
Maximally Stable Extremal Regions (MSER) is an efficient feature extraction algorithm suggested in [Matas et al., 2004]. It thresholds the image under a sequence of increasing threshold values and looks for stable connected components, i.e., components whose area remains nearly unchanged over a certain number of thresholds.
Pipeline: Input Frame and Object Tracking → Determine Threshold Increment → Extract MSER Regions → Choose Closest Region → ROI
[Figure: MSER regions for a frame of a sailboat. The coordinates supplied by the tracker are marked with a red cross. The ROI is chosen to be the region closest to the tracker.]

Distance Estimation
The distance of each pixel in the ROI from the horizon line is calculated, and the pixel with the maximal distance is chosen as the contact point with the sea.
Pipeline: Input Frame and Object Tracking → Horizon Detector → ROI Detection → Calculating Distances from the Horizon → Choosing the Pixel with Maximal Distance → Translate Distance to Meters

Distance to Horizon
Pixels in the image represent angles. With h the camera height above the water, φ the angle of the start of the camera's field of view, and α the angle between the tracked object and the beginning of the field of view, the distance in meters is

d = h·tan(α + φ)

[Figure: the distance of two pixels in the ROI to the horizon, in pixels and in meters]

Results
An experiment with multiple marine vessels was conducted. For comparison to ground truth, the videos were synchronized with GPS coordinate measurements for each vessel. Results show a mean absolute relative error of 7.1% with a standard deviation of 5.8%.
[Figure: estimated distance vs. GPS-measured distance]

Conclusions
- Successful estimation of distance, with accuracy suitable for navigation applications
- Accurate horizon detection
- Robust selection of the nearest point of the tracked vehicle
- Computationally efficient algorithm, feasible to run in real time on a USV

In collaboration with [partner logo]
September 2016
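The Hough-transform step of the horizon-detection pipeline described above can be sketched in a few lines. This is a minimal numpy illustration, not the poster's implementation: it assumes a binary edge map has already been produced (in the real pipeline this comes from Canny edge detection after noise reduction, histogram equalization, and morphological erosion), and the synthetic test frame and the 60-degree non-vertical cutoff are illustrative choices.

```python
import numpy as np

def strongest_nonvertical_line(edges, n_theta=180):
    """Hough voting in (rho, theta) space over a boolean edge map.
    Returns (rho, theta) of the strongest line that is not near-vertical,
    mirroring the 'Choosing the Maximal Non-Vertical Line' step."""
    h, w = edges.shape
    diag = int(np.ceil(np.hypot(h, w)))            # max possible |rho|
    thetas = np.deg2rad(np.arange(n_theta))        # 0..179 degrees
    acc = np.zeros((2 * diag, n_theta), dtype=int) # rho offset by +diag
    ys, xs = np.nonzero(edges)
    for x, y in zip(xs, ys):
        # Each edge pixel votes for all lines x*cos(t) + y*sin(t) = rho.
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int)
        acc[rhos + diag, np.arange(n_theta)] += 1
    # theta near 0 or 180 deg parameterizes a vertical line; suppress those
    # and keep only |theta - 90 deg| < 60 deg (an illustrative cutoff).
    keep = np.abs(np.arange(n_theta) - 90) < 60
    acc[:, ~keep] = 0
    rho_i, theta_i = np.unravel_index(np.argmax(acc), acc.shape)
    return rho_i - diag, thetas[theta_i]

# Synthetic edge map: a horizontal "horizon" along row 40.
edges = np.zeros((100, 120), dtype=bool)
edges[40, :] = True
rho, theta = strongest_nonvertical_line(edges)
# A horizontal line at y = 40 corresponds to theta = 90 deg, rho = 40.
```

The temporal median filter of the real pipeline would then smooth (rho, theta) across consecutive frames to stabilize the detected horizon.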
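The MSER stability criterion used for ROI detection can be shown in miniature. The sketch below is not the component-tree algorithm of [Matas et al., 2004]; it only illustrates the idea the poster states: threshold the image at increasing levels and keep components whose area barely changes. The BFS labeling, the 10% stability tolerance, and the synthetic 4x4 "vessel" are all illustrative assumptions.

```python
import numpy as np
from collections import deque

def components(binary):
    """4-connected components of a boolean image via BFS flood fill."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    comps = []
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] and not seen[sy, sx]:
                comp, q = set(), deque([(sy, sx)])
                seen[sy, sx] = True
                while q:
                    y, x = q.popleft()
                    comp.add((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and binary[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
                comps.append(comp)
    return comps

def stable_region_areas(img, thresholds, tol=0.1):
    """Area of the largest component at each threshold. A region counts as
    'stable' when its area changes by less than `tol` (relative) between
    consecutive thresholds -- the MSER criterion in miniature."""
    areas = [max((len(c) for c in components(img >= t)), default=0)
             for t in thresholds]
    stable = [abs(a2 - a1) <= tol * max(a1, 1)
              for a1, a2 in zip(areas, areas[1:])]
    return areas, stable

# Synthetic frame: a bright 4x4 "vessel" (value 200) on a dark sea (value 30).
img = np.full((12, 12), 30)
img[4:8, 4:8] = 200
areas, stable = stable_region_areas(img, thresholds=[50, 100, 150])
# The vessel region keeps area 16 across all thresholds, so it is stable.
```

In the real pipeline the stable region closest to the tracker's pixel coordinates is then chosen as the ROI.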
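The "Translate Distance to Meters" step applies d = h·tan(α + φ) directly. The sketch below assumes, as the poster's "pixels represent angles" remark suggests, that each image row spans an equal slice of the vertical field of view, so a pixel row maps linearly to α; the camera height, angles, and resolution in the example are illustrative numbers, not values from the poster.

```python
import math

def distance_to_object(h, phi_deg, row_from_fov_start, vfov_deg, img_height_px):
    """Poster formula d = h * tan(alpha + phi).
    h                 -- camera height above the water, in meters
    phi_deg           -- angle of the start of the camera's field of view
                         (assumed measured from the vertical)
    row_from_fov_start-- object's pixel row, counted from the start of the
                         field of view
    alpha is recovered from the pixel row assuming each row spans an equal
    slice of the vertical field of view."""
    alpha_deg = row_from_fov_start * vfov_deg / img_height_px
    return h * math.tan(math.radians(alpha_deg + phi_deg))

# Illustrative numbers: camera 3 m above the water, field of view starting
# 60 degrees from vertical, 30-degree vertical FOV, 480 image rows; the
# object's contact pixel sits halfway down the field of view.
d = distance_to_object(h=3.0, phi_deg=60.0, row_from_fov_start=240,
                       vfov_deg=30.0, img_height_px=480)
# Here alpha = 15 deg, so d = 3 * tan(75 deg), roughly 11.2 m.
```

Note the contact pixel fed into this formula is the ROI pixel with the maximal distance from the horizon line, found in the previous step.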