Distance Estimation, by Ohad Eliyahoo and Ori Zakin. Introduction: Current range estimation techniques require the use of an active device such as a laser or radar.

Presentation transcript:

Distance Estimation Ohad Eliyahoo And Ori Zakin

Introduction
Current range estimation techniques require the use of an active device such as a laser or radar. Drawbacks of an active approach are:
- Expensive
- Use in military scenarios can compromise the measurer's position
- Requires dedicated hardware

Introduction (cont.)
Our system uses two cameras to triangulate the object's position and estimate the distance between the camera plane and the object. Advantages:
- Passive, based only on image processing
- Cheap: requires only two cameras
- Can be implemented in any system that has two cameras by adding the appropriate software. For example, a robot can use this approach to estimate the distance to objects in its path.
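The transcript does not preserve the implementation, but under the standard parallel pinhole stereo model the two-camera idea above reduces to distance = focal length x baseline / disparity. A minimal sketch (the function name and all numbers are illustrative assumptions, not from the slides):

```python
def stereo_distance(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance from the camera plane to the object for two parallel
    pinhole cameras: Z = f * B / d, where f is the focal length in
    pixels, B the baseline between the cameras in metres, and d the
    horizontal disparity of the object between the two images."""
    if disparity_px <= 0:
        raise ValueError("object must appear shifted between the two images")
    return focal_px * baseline_m / disparity_px

# An 800 px focal length, 10 cm baseline, and 8 px disparity give 10 m.
print(stereo_distance(800, 0.10, 8))
```

Note that for a fixed disparity, doubling the baseline doubles the measured range, which is why the camera separation is a configuration parameter of the system.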

Introduction (cont.)
The approach's main disadvantage is lower accuracy compared to active systems.

Algorithm Phases
1. System configuration
2. Supply two different images of the same object
3. Select an object for distance estimation
4. Identify the selected object in the second image using image processing
5. Perform the calculations and return the result
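The transcript does not say how phase 4 identifies the selected object in the second image; one common way is a sum-of-squared-differences search along the image rows, sketched here with NumPy (a hypothetical helper, not the authors' code):

```python
import numpy as np

def find_disparity(patch: np.ndarray, right_rows: np.ndarray, x_left: int) -> int:
    """Slide the user-selected patch (taken from the left image at
    column x_left) along the matching rows of the right image and
    return the horizontal disparity of the best SSD match."""
    h, w = patch.shape
    costs = [np.sum((right_rows[:, x:x + w] - patch) ** 2)
             for x in range(right_rows.shape[1] - w + 1)]
    best_x = int(np.argmin(costs))
    return x_left - best_x
```

For rectified cameras the object lies on the same scanline in both images, so the search is one-dimensional; the resulting disparity feeds directly into the triangulation step.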

System Configuration
Configured parameters: the camera's focal length and the camera's diameter.

Angle of View
[Slide diagram: angle of view, distance, and the congruent area between the two cameras' fields of view.]

Supply two different images of the same object
[Slide figure: user selection of the target object in the image.]

Calculations
[Slide diagram with labels R and D illustrating the triangulation geometry.]
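The calculation slide's geometry is not recoverable from the transcript; a standard similar-triangles derivation for two parallel cameras (the symbols f, B, X, x_l, x_r are assumptions, not taken from the slides) would be:

```latex
% An object at depth D and lateral offset X projects to columns x_l and
% x_r in the left and right images. Similar triangles on each camera give
% x_l = f X / D and x_r = f (X - B) / D, hence
x_l - x_r = \frac{f B}{D}
\quad\Longrightarrow\quad
D = \frac{f B}{x_l - x_r}
% i.e. depth is inversely proportional to the measured disparity.
```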

Results
[Slide table with columns: distance between cameras, real distance, estimated distance, delta.]

Summary
- Increasing the distance between the cameras increases the accuracy of the system.
- Our system measures distance on the horizontal plane only.
- Our experiments were limited by our setup; a more controlled environment is needed to achieve more accurate results.
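The summary's first point follows from first-order error propagation on D = f B / d: a fixed disparity-matching error of dd pixels produces a depth error of roughly D^2 * dd / (f B), so doubling the baseline halves the error at a given range. A small sketch (the numbers are illustrative, not from the experiments):

```python
def depth_error(distance_m: float, focal_px: float, baseline_m: float,
                disparity_err_px: float = 1.0) -> float:
    """First-order depth error for a given disparity-matching error:
    dD = D**2 * dd / (f * B). Doubling the baseline halves the error."""
    return distance_m ** 2 * disparity_err_px / (focal_px * baseline_m)

# At 10 m with an 800 px focal length: a 10 cm baseline gives a 1.25 m
# error per pixel of disparity error; a 20 cm baseline gives 0.625 m.
print(depth_error(10, 800, 0.10), depth_error(10, 800, 0.20))
```

The quadratic growth with distance also explains why passive stereo loses accuracy quickly at long range compared to active devices.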