Artificial Vision-Based Tele-Operation for Lunar Exploration
Students: Aaron Roney, Albert Soto, Brian Kuehner, David Taylor, Bonnie Stern, Nicholas Logan, Stephanie Herd
NASA JSC Mentors: Dr. Bob Savely, Dr. Mike Goza
Project Mentor: Dr. Giovanni Giardini
Project Advisor: Prof. Tamás Kalmár-Nagy
Project Members: Nicholas Logan, Stephanie Herd, Aaron Roney, Albert Soto, Bonnie Stern, Brian Kuehner, David Taylor (electrical, computer, nuclear, mechanical, and aerospace engineering; freshman through senior)
Outline
Motivation and Objectives
Ego-Motion Theory
Code Flow
Calibration and Rectification
Hardware
Testing Results
Future Work
Motivation
Lunar surface exploration from a human perspective, safely and with low risk
3D environment reconstruction
Self-localization with an artificial vision system
Objectives
Vision System: ego-motion estimation, environment reconstruction, visual feedback system for tele-operations
Tele-Operation System: remote control of the mobile unit, hardware and mechanical implementation
Visual System (onboard the Vehicle) Ground Station Vehicle Hardware Wireless Network
Ego-Motion Theory
3D Reconstruction Theory
A point p projects to (u_p^left, v_p^left) in the left image and (u_p^right, v_p^right) in the right image.
It is impossible to compute the 3D coordinates of an object from a single image.
Solution: stereo cameras, disparity computation, 3D reconstruction
Disparity map computation: given two images, it is a collection of pixel disparities
Point distances can be calculated from disparities
The environment can be reconstructed from the disparity map
(Figure: left image, right image, disparity map, environment reconstruction)
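The disparity-to-distance relation described above can be sketched as follows; this is a minimal illustration, not the project's code, and the focal length and baseline values are hypothetical:

```python
# Depth from stereo disparity: for a rectified stereo pair with focal length
# f (in pixels) and baseline B (in meters), a point with pixel disparity d
# lies at distance Z = f * B / d from the cameras.
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulate the distance of one matched pixel pair."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible point")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: f = 800 px, baseline 0.10 m, disparity 16 px -> Z = 5.0 m
z = depth_from_disparity(16.0, 800.0, 0.10)
```

Note that distance is inversely proportional to disparity: nearby points produce large disparities, so depth resolution degrades with range.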
Ego-Motion Estimation
Main goal: evaluate the motion (translation and rotation) of the vehicle from sequences of images
Optical flow is related to vehicle movement through the perspective projection equation
Solving it (least-squares solution) gives the change in position of the vehicle
(Figure: optical flow example)
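The least-squares solution of the perspective-projection/optical-flow equations can be sketched as below. The interaction-matrix form and the synthetic numbers are illustrative assumptions, not the project's MATLAB implementation:

```python
import numpy as np

# For a feature at normalized image coordinates (x, y) with depth Z, the
# perspective projection equation linearly relates the camera twist
# [vx, vy, vz, wx, wy, wz] (translation + rotation rates) to the image
# flow (du, dv) through a 2x6 "interaction" matrix.
def interaction_matrix(x, y, Z):
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def estimate_twist(points, depths, flows):
    """Stack one 2x6 block per tracked feature and solve A @ twist = flow
    in the least-squares sense (three or more features over-determine it)."""
    A = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    b = np.concatenate(flows)
    twist, *_ = np.linalg.lstsq(A, b, rcond=None)
    return twist
```

With noise-free synthetic flow, the recovered twist matches the motion that generated it exactly; with real tracked features, the least-squares fit averages out measurement noise.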
Code Flow
Onboard hardware: Sony VAIO (Pentium 4), Logitech QuickCam Deluxe cameras
Image processing code: calibration parameters → acquire images → rectify images → ego-motion estimation → ground station via wireless network
Mobile Unit Detailed Code
Inputs: calibration parameters
Acquire image (T = 0.15 s): snapshot → image matrix; image parameters: grayscale (640x480), …
Rectify images (T = 0.5 s): apply distortion coefficients to the image matrix → rectified image matrix → save image
Ego-motion estimation → ground station via wireless network
Ego-Motion Estimation Overview (T = 3 s)
Inputs: calibration parameters; right and left images, new right and new left images
Find features in the right image; track them in the left image, the new right image, and the new left image (features are also found in each of the other images)
Discard all non-identical points across the images → image feature matrix
Output: displacement vector (X, Y, Z, X-rot, Y-rot, Z-rot), sent over the wireless network
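One standard way to recover the displacement vector from matched 3D feature points in consecutive stereo frames is a least-squares rigid fit (the Kabsch/SVD method). This is a hedged sketch of that technique, not the project's actual code:

```python
import numpy as np

# Given Nx3 arrays P (features in the old frame) and Q (the same features in
# the new frame), find the rotation R and translation t minimizing
# sum || (R @ p + t) - q ||^2  -- the rigid displacement of the vehicle.
def rigid_transform(P, Q):
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)       # centroids
    H = (P - cP).T @ (Q - cQ)                      # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

The three rotation angles (X-rot, Y-rot, Z-rot) of the displacement vector can then be extracted from R, and t gives the (X, Y, Z) translation directly.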
Calibration and Rectification
Calibration: uses Matlab tools to determine the image distortion associated with the camera
Rectification: removes the distortion from the images
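Undistorting a point with the calibration-derived coefficients can be sketched as below, assuming the common radial distortion model with coefficients k1 and k2; this is an illustration of the idea, not the Matlab toolbox code:

```python
# The radial model distorts a normalized image point (x, y) as
#   x_d = x * (1 + k1*r^2 + k2*r^4),  r^2 = x^2 + y^2.
# Rectification must invert this; since there is no closed form, a common
# approach is fixed-point iteration starting from the distorted point.
def undistort_point(x_d, y_d, k1, k2, iters=20):
    x, y = x_d, y_d
    for _ in range(iters):
        r2 = x * x + y * y
        scale = 1.0 + k1 * r2 + k2 * r2 * r2
        x, y = x_d / scale, y_d / scale
    return x, y
```

For the mild distortion typical of webcams the iteration converges in a handful of steps; in practice the whole image is remapped once through a precomputed lookup table rather than point by point.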
Hardware
Mobile unit: TROPOS router, laptop, web cameras
Base station: Linksys router, operator computer, command computer
Wireless link between the mobile unit and the base station
Improvements Implemented in the System
Improved robustness of the software
Implemented a menu-driven system for the operator using Matlab's network handling protocol: taking pictures, running ego-motion, sending all results to the operator, graphical display of optical flow
Reduced crashing
Achieved greater mobile unit control
Mobile Unit
Vehicle courtesy of Prof. Dezhen Song
Camera support system, a 3-DOF mechanical neck: panoramic rotation, tilt rotation, telescopic capability; controlled height and baseline length
(Figure: baseline, distance D, length L, fields of view FOV1 and FOV2, angle α, horizontal view)
Testing Results
Test Environment (simulated lunar environment)
Light to simulate solar exposure
Black background to eliminate background features
Walls to eliminate stray light and side shadows
Measured displacements
Test Setup
25 pictures taken at each location (0, 5, 10, and 15 cm) in the Z direction (perpendicular to the camera focal plane), unidirectional movement:
Set 1: 25 images at Z = 0 cm
Set 2: 25 images at Z = 5 cm
Set 3: 25 images at Z = 10 cm
Set 4: 25 images at Z = 15 cm
Distances were measured with a tape measure; the cameras were mounted on a semi-rigid fixture
Determining the Number of Features
The standard deviation decreases as more features are used, but the accuracy of the results decreases
100 features were selected as a compromise
Results for a 5 cm displacement: used all 100 images, comparing each set to the previous one
Ego-Motion: Example
(Figures: optical flow in the left image; optical flow in the right image)
(Table: RANSAC results, with standard deviations for the 5 cm, 10 cm, and 15 cm displacements)
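RANSAC, used in these results, rejects outlier feature matches before the motion fit by repeatedly hypothesizing a model from a random sample and keeping the hypothesis with the most inliers. A minimal sketch for a pure-translation model follows; the points, threshold, and seed are illustrative, not the project's parameters:

```python
import random

# src and dst are lists of matched (x, y) feature positions in two frames.
# A single correct match fully determines a 2-D translation hypothesis, so
# the RANSAC sample size here is one match.
def ransac_translation(src, dst, thresh=0.05, iters=100, seed=0):
    rng = random.Random(seed)
    best_t, best_inliers = None, -1
    for _ in range(iters):
        i = rng.randrange(len(src))
        t = (dst[i][0] - src[i][0], dst[i][1] - src[i][1])  # hypothesis
        inliers = sum(
            1 for (sx, sy), (dx, dy) in zip(src, dst)
            if abs(dx - sx - t[0]) < thresh and abs(dy - sy - t[1]) < thresh
        )
        if inliers > best_inliers:
            best_t, best_inliers = t, inliers
    return best_t, best_inliers
```

After RANSAC selects the inlier set, the final motion is re-estimated from all inliers by least squares, which is why the tabulated standard deviations shrink compared to the raw feature tracks.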
Problems
Images were not rectified
Possible motion of the cameras between images
No image filtering
Camera mounting is misaligned
Images acquired from the right camera appear blurry
Conclusions and Future Work
Demonstrated: ego-motion estimation, environment reconstruction, vehicle control and movement, system integration
Future developments: filtering and improving results, increasing the robustness of the vision system, creating a visual 3D environment map
Acknowledgements
Thanks to:
– Prof. Tamás Kalmár-Nagy
– Dr. Giovanni Giardini
– Prof. Dezhen Song
– Change Young Kim
– Magda Lagoudas
– Tarek Elgohary
– Pedro Davalos