Artificial Vision-Based Tele-Operation for Lunar Exploration

Students: Aaron Roney, Albert Soto, Brian Kuehner, David Taylor, Bonnie Stern, Nicholas Logan, Stephanie Herd
NASA JSC Mentors: Dr. Bob Savely, Dr. Mike Goza
Project Mentor: Dr. Giovanni Giardini
Project Advisor: Prof. Tamás Kalmár-Nagy

Project Members
- Names: Nicholas Logan, Stephanie Herd, Aaron Roney, Albert Soto, Bonnie Stern, Brian Kuehner, David Taylor
- Majors: Electrical, Computer, Nuclear, Mechanical, and Aerospace Engineering
- Class standing: Freshman through Senior

Outline
- Motivation and Objectives
- Ego-Motion Theory
- Code Flow
- Calibration and Rectification
- Hardware
- Testing Results
- Future Work

Motivation
- Lunar surface exploration
  - From a human perspective
  - Safely, with low risk
- 3D environment reconstruction
- Self-localization with an artificial vision system

Objectives
- Vision system
  - Ego-motion estimation
  - Environment reconstruction
  - Visual feedback system for tele-operation
- Tele-operation system
  - Remotely controlled mobile unit
  - Hardware and mechanical implementation

[System diagram: Visual System (onboard the vehicle), Vehicle Hardware, Wireless Network, Ground Station]

Ego-Motion Theory

3D Reconstruction Theory
[Figure: a point p projects to (u_p^left, v_p^left) in the left image and (u_p^right, v_p^right) in the right image]
- The 3D coordinates of an object cannot be computed from a single image
- Solution: stereo cameras
- Disparity computation → 3D reconstruction
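
The transcript drops the triangulation formulas behind "disparity computation → 3D reconstruction". For a rectified stereo pair the standard relations are as follows (a sketch assuming a focal length f in pixels, baseline b, and principal point (c_u, c_v), none of which appear in the transcript):

\[
d = u_p^{\mathrm{left}} - u_p^{\mathrm{right}}, \qquad
Z = \frac{f\,b}{d}, \qquad
X = \frac{Z\,(u_p^{\mathrm{left}} - c_u)}{f}, \qquad
Y = \frac{Z\,(v_p^{\mathrm{left}} - c_v)}{f}
\]

The larger the disparity d, the closer the point, which is why a stereo pair recovers the depth a single image cannot.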

Environment Reconstruction
- Disparity map computation: given two images, a collection of per-pixel disparities
- Point distances can be calculated from the disparities
- The environment can be reconstructed from the disparity map
[Figure: left image, right image, and the computed disparity map]
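
As an illustration of the disparity-map step, here is a minimal Python/OpenCV sketch (the project itself ran Matlab code; the file names and the f and b values below are assumptions, not the project's calibration):

import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # rectified left image
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # rectified right image

# Block-matching stereo; OpenCV returns disparities in fixed point (x16).
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Depth from disparity: Z = f * b / d (f in pixels, baseline b in meters).
f, b = 700.0, 0.12                       # assumed calibration values
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f * b / disparity[valid]  # per-pixel point distances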

Ego-Motion Estimation
- Main goal: evaluate the motion (translation and rotation) of the vehicle from sequences of images
- Optical flow is related to vehicle movement through the perspective projection equation
- Solving it (a least-squares solution) gives the change in position of the vehicle
[Figures: the perspective projection equation and an optical flow example]
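
The equation itself was lost in the transcript; in its standard motion-field form (assuming focal length f, per-feature depth Z from stereo, translation (t_x, t_y, t_z), and rotation (ω_x, ω_y, ω_z)), the optical flow (u̇, v̇) of a feature at image position (u, v) relates to the vehicle motion by:

\[
\begin{aligned}
\dot{u} &= \frac{t_z u - t_x f}{Z} + \omega_x \frac{uv}{f} - \omega_y\left(f + \frac{u^2}{f}\right) + \omega_z v, \\
\dot{v} &= \frac{t_z v - t_y f}{Z} + \omega_x\left(f + \frac{v^2}{f}\right) - \omega_y \frac{uv}{f} - \omega_z u.
\end{aligned}
\]

Each tracked feature contributes two linear equations in the six motion unknowns, so tracking many features yields an overdetermined system that is solved in the least-squares sense.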

Code Flow

Image Processing Code (Sony VAIO, Pentium 4; Logitech QuickCam Deluxe cameras)
[Flowchart: Calibration Parameters → Acquire Images → Rectify Images → Ego-Motion Estimation → Wireless Network → Ground Station]

Mobile Unit Detailed Code
[Flowchart: Acquire Image (snapshot → image matrix; grayscale, 640x480, …; T = 0.15 s) → Rectify Images (apply distortion coefficients to the image matrix; T = 0.5 s) → rectified image matrix → Save Image → Ego-Motion Estimation → Wireless Network → Ground Station]
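
As a concrete illustration of the acquire → rectify → save loop, here is a minimal Python/OpenCV sketch (the project's code was Matlab; the camera index and the K and dist values are placeholders, not the project's calibration):

import cv2
import numpy as np

# Calibration parameters produced offline (assumed values for illustration).
K = np.array([[700.0, 0.0, 320.0],
              [0.0, 700.0, 240.0],
              [0.0, 0.0, 1.0]])           # intrinsic matrix
dist = np.array([-0.2, 0.05, 0.0, 0.0])   # distortion coefficients

cap = cv2.VideoCapture(0)                 # acquire image (~0.15 s per snapshot)
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.resize(gray, (640, 480))       # image parameters: grayscale, 640x480
    rectified = cv2.undistort(gray, K, dist)  # apply distortion coefficients (~0.5 s)
    cv2.imwrite("rectified.png", rectified)   # save image for ego-motion estimation
cap.release()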

Ego-Motion Estimation Overview (T = 3 s)
[Flowchart: find features in the right image, the left image, the new right image, and the new left image; track the right-image features into the left, new right, and new left images; discard all points not identical across all images; from the resulting image feature matrix and the calibration parameters, compute the displacement vector (X, Y, Z, X-rot, Y-rot, Z-rot) and send it over the wireless network]
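
A sketch of the four-image matching step, using OpenCV's KLT tracker as a hypothetical stand-in for the project's Matlab implementation (file names and parameters are assumptions):

import cv2
import numpy as np

right = cv2.imread("right_t0.png", cv2.IMREAD_GRAYSCALE)
left = cv2.imread("left_t0.png", cv2.IMREAD_GRAYSCALE)
new_right = cv2.imread("right_t1.png", cv2.IMREAD_GRAYSCALE)
new_left = cv2.imread("left_t1.png", cv2.IMREAD_GRAYSCALE)

# Find features in the right image (100 features, per the testing slides).
pts = cv2.goodFeaturesToTrack(right, maxCorners=100, qualityLevel=0.01, minDistance=10)

# Track the right-image features into the other three images.
keep = np.ones(len(pts), dtype=bool)
tracked = {}
for name, img in [("left", left), ("new_right", new_right), ("new_left", new_left)]:
    p, status, _err = cv2.calcOpticalFlowPyrLK(right, img, pts, None)
    tracked[name] = p
    keep &= status.ravel().astype(bool)   # discard points lost in any image

# Only features matched identically in all four images enter the motion solve.
feature_matrix = {name: p[keep] for name, p in tracked.items()}
feature_matrix["right"] = pts[keep]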

Calibration and Rectification

- Calibration: uses Matlab tools to determine the image distortion associated with each camera
- Rectification: removes that distortion from the images
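
For readers without the Matlab toolbox, the same calibration step looks like this in Python/OpenCV (a sketch only: the chessboard size and image folder are assumptions):

import cv2
import glob
import numpy as np

pattern = (9, 6)  # inner-corner count of the assumed chessboard target
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

objpoints, imgpoints = [], []
for path in glob.glob("calib/*.png"):      # assumed calibration images
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        objpoints.append(objp)
        imgpoints.append(corners)

# K and dist are the "calibration parameters" consumed by rectification.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    objpoints, imgpoints, gray.shape[::-1], None, None)
rectified = cv2.undistort(gray, K, dist)   # rectification: remove the distortion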

Hardware

[Diagram: Mobile Unit (TROPOS router, laptop, web cameras) connected wirelessly to Base Station (Linksys router, operator computer, command computer)]

Improvements Implemented in the System
- Improved the robustness of the software
- Implemented a menu-driven system for the operator using Matlab's network handling (a sketch of such a loop follows below):
  - Take pictures
  - Run ego-motion estimation
  - Send all results to the operator
  - Display the optical flow graphically
- Reduced crashing
- Achieved greater mobile unit control
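
A minimal sketch of what such a menu-driven operator loop could look like, with Python sockets standing in for Matlab's network handling (the host, port, and command names are all invented for illustration):

import socket

COMMANDS = {"1": b"TAKE_PICTURES", "2": b"RUN_EGOMOTION", "3": b"SEND_RESULTS"}

with socket.create_connection(("mobile-unit.local", 5000)) as conn:
    choice = input("1) Take pictures  2) Run ego-motion  3) Send results > ")
    if choice in COMMANDS:
        conn.sendall(COMMANDS[choice] + b"\n")  # send the selected command
        reply = conn.recv(4096)                 # status or result payload
        print(reply.decode(errors="replace"))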

Mobile Unit (vehicle courtesy of Prof. Dezhen Song)
[Figure: horizontal view of the stereo head geometry, labeling the baseline, distances D and L, fields of view FOV 1 and FOV 2, and angle α]
- Camera support system
- 3-DOF mechanical neck:
  - Panoramic rotation
  - Tilt rotation
  - Telescopic capability
- Controlled height and baseline length

Testing Results

Test Environment
[Photo: simulated lunar environment]
- Light to simulate solar exposure
- Black background to eliminate background features
- Walls to eliminate stray light and side shadows
- Measured displacements

Test Setup
- 25 pictures taken at each location (0, 5, 10, and 15 cm) along the Z direction (perpendicular to the camera focal plane), unidirectional movement:
  - Set 1: 25 images at Z = 0 cm
  - Set 2: 25 images at Z = 5 cm
  - Set 3: 25 images at Z = 10 cm
  - Set 4: 25 images at Z = 15 cm
- Distances were measured with a tape measure
- The cameras were mounted on a semi-rigid fixture

Determining the Number of Features
- The standard deviation decreases as more features are used
- But the accuracy of the results decreases
- 100 features were selected
[Plot: results for the 5 cm displacement, using all 100 images and comparing each set to the previous one]

Ego-Motion: Example
[Figures: optical flow in the left and right images]
[Table: RANSAC results (in degrees) for the 5 cm, 10 cm, and 15 cm displacements, each with its standard deviation]

Problems
- Images were not rectified
- Possible motion of the cameras between images
- No image filtering
- Camera mounting is misaligned
- Images acquired from the right camera appear blurry

Conclusions and Future Work
- Demonstrated:
  - Ego-motion estimation
  - Environment reconstruction
  - Vehicle control and movement
  - System integration
- Future developments:
  - Filtering and improving results
  - Increasing the robustness of the vision system
  - Creating a visual 3D environment map

Acknowledgements
- Thanks to:
  - Prof. Tamás Kalmár-Nagy
  - Dr. Giovanni Giardini
  - Prof. Dezhen Song
  - Change Young Kim
  - Magda Lagoudas
  - Tarek Elgohary
  - Pedro Davalos