Tele-Operation and Vision System for Moon Exploration

Presentation transcript:

Tele-Operation and Vision System for Moon Exploration
Students: Aaron Roney, Albert Soto, Brian Kuehner, David Taylor, Mark Hibbeler, Nicholas Logan, Stephanie Herd
NASA JSC Mentors: Dr. Bob Savely, Mike Goza
Project Mentor: Dr. Giovanni Giardini
Project Advisor: Prof. Tamás Kalmár-Nagy

Project Members: Aaron Roney, Albert Soto, Brian Kuehner, David Taylor, Mark Hibbeler, Nicholas Logan, Stephanie Herd
Majors represented: Nuclear Engineering, Mechanical Engineering, Aerospace Engineering, and Computer Engineering; class years range from freshman to senior

Motivations
 Lunar surface exploration
   From a human perspective
   Safely, with low risk
 3D environment reconstruction
 Self-localization with an artificial vision system

Objectives: Tele-Operation System with Visual Feedback
 Vision system
   Ego-motion estimation
   Environment reconstruction
 Tele-operation system
   Remote vehicle control
 Hardware and mechanical implementation

[System overview diagram: the visual system runs onboard the vehicle and communicates with the ground station over a wireless network, alongside the vehicle hardware]

Theory
[Figure: a point p observed in the left and right images, with image coordinates (u_left^p, v_left^p) and (u_right^p, v_right^p)]
 The 3D coordinates of an object cannot be computed from a single image
 Solution: stereo cameras
 Disparity computation enables 3D reconstruction (the standard relation is given below)
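For a standard rectified stereo pair this relation can be written explicitly; the following form, with baseline B and focal length f (both assumed here rather than taken from the slides), is the usual one:

\[
d^{p} = u^{p}_{\mathrm{left}} - u^{p}_{\mathrm{right}}, \qquad Z^{p} = \frac{f\,B}{d^{p}}
\]

A larger disparity therefore means a closer point, and disparities for matched pixels across the whole image yield a dense depth map.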

Environment Reconstruction
 Main goal: 3D reconstruction of a digital environment
   Object detection (i.e., obstacles)
   High-level planning
   Self-localization
 Stereo cameras are used to generate the 3D environment

Environment Reconstruction
 Disparity map computation: given two images, it is a collection of per-pixel disparities
 Point distances can be calculated from the disparities
 The environment can be reconstructed from the disparity map (a minimal sketch follows below)
[Figures: left image, right image, and the resulting disparity map]
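A minimal disparity sketch, assuming a rectified grayscale stereo pair and OpenCV's block matcher. The filenames and the calibration values f and B are placeholders, and this is an illustration of the technique rather than the team's actual pipeline:

```python
# Disparity map via block matching, then depth from Z = f*B/d.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)      # placeholder filenames
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # pixels

f, B = 700.0, 0.12                 # assumed focal length [px] and baseline [m]
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f * B / disparity[valid]                   # metres
```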

Ego-Motion Estimation
 Main goal: evaluate the motion (translation and rotation) of the vehicle from sequences of images
 Optical flow is related to vehicle movement through the perspective projection equation
 A least-squares solution gives the velocities of the vehicle (the underlying equations are sketched below)
[Figure: optical flow example]
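One standard way to write this relation is through the motion-field equations for a calibrated camera with unit focal length; the notation and sign convention here are assumed, not taken from the slides. An image point (x, y) at depth Z, observed from a vehicle with translational velocity T and angular velocity Ω, produces the optical flow

\[
\begin{aligned}
\dot{x} &= \frac{x\,T_z - T_x}{Z} + \Omega_x\,xy - \Omega_y\,(1 + x^2) + \Omega_z\,y,\\
\dot{y} &= \frac{y\,T_z - T_y}{Z} + \Omega_x\,(1 + y^2) - \Omega_y\,xy - \Omega_z\,x.
\end{aligned}
\]

With Z known from the stereo reconstruction, each tracked point contributes two linear equations in the six unknowns (T, Ω), and many points give an overdetermined system that can be solved by least squares.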

Ego-Motion: Example
[Figures: optical flow computed on the left and right images]
[Table: reference vs. detected motion for Tx, Ty, Tz [mm] and Ωx, Ωy, Ωz; e.g. Tx: reference 0, detected 4.5; Ωz: reference 0, detected 0]
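A minimal numpy sketch of that least-squares step, assuming normalized image coordinates and per-point depth from stereo. Variable names are illustrative and not from the project code:

```python
# Least-squares ego-motion from optical flow, following the motion-field
# equations above: two linear equations per tracked point in (T, Omega).
import numpy as np

def estimate_ego_motion(pts, flow, depth):
    """pts: (N,2) normalized coords, flow: (N,2) flow vectors, depth: (N,) Z."""
    rows, rhs = [], []
    for (x, y), (u, v), Z in zip(pts, flow, depth):
        rows.append([-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y])
        rows.append([0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x])
        rhs.extend([u, v])
    sol, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return sol[:3], sol[3:]        # T = (Tx, Ty, Tz), Omega = (Ox, Oy, Oz)
```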

Calibration and Filtering
 Calibration: removes image distortion (a minimal sketch follows below)
 Filtering process:
   Improves image quality
   Increases the robustness of the vision system
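A minimal calibration and undistortion sketch with OpenCV and a chessboard target. This is a generic illustration under assumed board size and image paths, not the project's actual calibration procedure:

```python
# Estimate camera intrinsics and distortion from chessboard images,
# then remove the distortion from an image.
import glob
import cv2
import numpy as np

pattern = (9, 6)                                   # assumed inner-corner count
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for name in glob.glob("calib/*.png"):              # placeholder image folder
    gray = cv2.imread(name, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

_, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, gray.shape[::-1], None, None)
undistorted = cv2.undistort(gray, K, dist)         # distortion removed
```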

Tele-Operations
[Architecture diagram: laptop on the TAMUBOT and TAMUBOT control system, wireless router and Tropos router, control PC and picture PC]

Vehicle
[Figure: vehicle (courtesy of Prof. Dezhen Song) and horizontal view of the stereo geometry, showing the baseline, distances D and L, fields of view FOV 1 and FOV 2, and angle α]
 Camera support system
 3-DOF mechanical neck:
   Panoramic rotation
   Tilt rotation
   Telescopic capability
 Controlled height and baseline length (see the note below)
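A short note on why the baseline matters, based on standard stereo-geometry reasoning rather than anything stated on the slides: with depth Z = fB/d, a disparity error Δd maps to a depth error of roughly

\[
\Delta Z \approx \frac{Z^{2}}{f\,B}\,\Delta d,
\]

so a longer baseline (together with an adjustable camera height) improves depth precision at long range, which is presumably one motivation for controlling the baseline length.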

Conclusions and Future Work
 Demonstrated:
   Ego-motion estimation
   Environment reconstruction
   Vehicle control and movement
 Future developments:
   System integration
   Filtering and improving results

Acknowledgements
Thanks to: Prof. Tamás Kalmár-Nagy, Dr. Giovanni Giardini, Prof. Dezhen Song, Change Young Kim, Magda Lagoudas, Tarek Elgohary, Pedro Davalos