Astrobiology Science and Technology for Exploring Planets (ASTEP) Mid-Year Review
August 4, 2004

Robust Autonomous Instrument Placement for Rovers (JPL: Visual Target Tracking)

Issa A.D. Nesnas (PI), Max Bajracharya, Richard Madison
Section 348, Jet Propulsion Laboratory

Robust Autonomous Instrument Placement for Rovers

Objectives:
- Demonstrate, in a single command cycle, autonomous placement of a science instrument on a target designated from ten rover lengths away
- JPL: Develop and demonstrate algorithms to visually track a target designated from ten rover lengths away in realistic, Mars-like terrain conditions

FY03–FY05 Milestones:
- FY03: Software prototype of the 2D/3D target feature tracker
- FY04: Demonstrate the tracker in the JPL Mars Yard on the Rocky 8 rover, using a fixed mast similar to the K9 rover at ARC
- FY05: Tune tracker performance based on feedback from an independent validation task; deliver to ARC

Funding Profile ($K): [not recovered]
Task Manager: Issa A.D. Nesnas, (818)
Participating Organizations: Jet Propulsion Laboratory; Ames Research Center (PI: Liam Pedersen)
Facilities: Rocky 8 rover and CLARAty software; JPL Mars Yard; CLARAty testbed

[Figure: tracker data flow through motion and stereo correlators; photos of the K9 and Rocky 8 rovers]

Problem Statement

Problem:
- Track a target designated from 10 m away (10 rover lengths)
- Maintain lock on the target while the rover navigates across rough terrain
- Keep the target within one-pixel accuracy (1 cm final placement)

Key Challenges:
- Continuous, high-frame-rate visual tracking is not possible due to slow image acquisition and limited, shared computational resources
- Both the image and the 3D reconstructed model change significantly:
  - The target grows as the rover moves toward it
  - Sudden changes in field of view occur as the rover tilts or drops off rocks
- Must operate alongside obstacle avoidance and pose estimators

Key Challenges

[Figure: (a) designated target in the 1st frame at time t1; (b) the same target in the 37th frame after a 10 m traverse, at time t2, while avoiding an obstacle]

Mission Relevance and State of the Art

Mission Relevance:
- Key component technology for single-sol instrument placement
- Relevant to MSL and future rover missions
- Enabling technology that reduces the operation from 3 sols (MER) to 1 sol (MSL)

State of the Art:
- MER baseline: no designated-target tracking or single-sol instrument placement
- MER non-baseline: visual odometry for improved pose estimation
- Previous and current related research efforts:
  - ARC single-cycle instrument placement (ASTEP) (FY02–FY05)
  - FIDO instrument placement using homography transforms (FY00)
  - Planetary Dexterous Manipulation: sample acquisition and instrument placement (FY95–99)
  - ARC visual servoing on Marsokhod (FY96–97)
- Military tracking is relevant but not directly applicable: it can safely assume high frame rates and some knowledge of the tracked target (human-made objects)
- Most trackers require small motion between frames or a well-estimated motion
- Visual odometry alone accumulates errors that exceed the 1 cm tracking error budget over a 10 m traverse (see the sketch below)

Our task:
- Tracks natural terrain features not known a priori
- Uses visual odometry for pose estimation
- Combines 2D and 3D information to improve accuracy
- Degrades gracefully in the absence of good stereo
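A back-of-the-envelope illustration of the drift argument above (a minimal sketch; the 0.5% drift rate is an assumed representative figure for rover visual odometry, not a number from this review):

```python
# Visual odometry error grows with distance traveled, so dead reckoning alone
# blows through a centimeter-level placement budget over a 10 m approach.
DRIFT_RATE = 0.005   # assumed: VO drift of 0.5% of distance traveled
TRAVERSE_M = 10.0    # designation distance (10 rover lengths)
BUDGET_CM = 1.0      # tracking error budget from the problem statement

drift_cm = DRIFT_RATE * TRAVERSE_M * 100.0
print(f"accumulated drift ~{drift_cm:.0f} cm vs. {BUDGET_CM:.0f} cm budget")
# -> accumulated drift ~5 cm vs. 1 cm budget
```

Bounding the error therefore requires matching back to the originally designated template rather than only integrating motion estimates.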

Technical Approach

- Use visual information to maintain lock on the target as the rover moves
- Combine 2D imaging techniques with 3D stereo vision information
- Bound error by affine matching to the originally selected feature
- Increase accuracy by tracking from cameras with different fields of view
- Develop precise mast kinematics to control the gaze of the cameras (see the pointing sketch below)
- Seed the algorithm with good pose estimates (visual odometry)
- Integrate software into the CLARAty infrastructure
- Adapt to the Rocky 8 rover
- Test in the JPL Mars Yard
- Deliver software (through CLARAty) to the MTP validation task
- Tune the algorithm based on feedback from the MTP validation task
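As an illustration of the mast-pointing step, a minimal gaze-control sketch: given the tracked target's 3D position expressed in the mast frame, compute the pan/tilt angles that center the cameras on it. The frame convention (x forward, y left, z up) and the function name are assumptions for illustration, not the CLARAty interface.

```python
import math

def mast_pan_tilt(x, y, z):
    """Pan/tilt angles (radians) to point the mast cameras at a 3D target.

    Assumes (x, y, z) is already expressed in the mast pan/tilt frame:
    x forward, y left, z up (an assumed convention, not CLARAty's).
    """
    pan = math.atan2(y, x)                  # rotation about the vertical axis
    tilt = math.atan2(z, math.hypot(x, y))  # elevation toward the target
    return pan, tilt

# Example: target 10 m ahead, 1 m to the left, 0.5 m below the mast head.
pan, tilt = mast_pan_tilt(10.0, 1.0, -0.5)
print(f"pan {math.degrees(pan):.1f} deg, tilt {math.degrees(tilt):.1f} deg")
```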

Milestones and Deliverables

FY03:
- Software prototype of a combined target feature tracker that uses both 2D and 3D visual information to keep a user-selected feature tracked

FY04:
- Adapted version of the combined feature tracker operating on the Rocky 8 rover with a fixed mast, similar to the K9 rover at ARC, and tested in the JPL Mars Yard
- Deliverable: progress report to the program office/sponsor

FY05:
- Tuned tracker parameters and improved tracking performance based on feedback from an independent validation task; clearances for all software for ARC access; support ARC for integration on the K9 rover
- Deliverables (to ARC; PI: Liam Pedersen):
  1. The combined 2D/3D visual tracker software integrated into the CLARAty environment
  2. The necessary clearances for the PI to access the technology component listed in item 1 and all its dependent component technologies (contingency: all dependent components are also cleared by the JPL and Caltech IP offices)

FY03–FY04 Accomplishments

Rover Platforms

[Photos: Rocky 8 (JPL; Intel x86, VxWorks) and K9 (Ames; Intel x86, Linux)]

FY03 Experimental Setup

[Photos: Rocky 8 mast head; Rocky 8 rover with mast head]

Vision System Used for Tracking:

Camera     | Placed on | Baseline | Lens   | FOV | CCD resolution | Pixel size (μm)
-----------|-----------|----------|--------|-----|----------------|----------------
Navigation | Mast      | 19 cm    | 4 mm   | 60° | 640x…          | …
Panoramic  | Mast      | 23 cm    | 16 mm  | 17° | 1024x…         | …
Hazard     | Body      | 8.6 cm   | 2.8 mm | 90° | 640x480        | 9.9

FY04 Experimental Setup

Vision System Used for Tracking:

Camera     | Placed on | Baseline | Lens   | FOV | CCD resolution | Pixel size (μm)
-----------|-----------|----------|--------|-----|----------------|----------------
Navigation | Mast      | 20 cm    | 6 mm   | 45° | 1024x…         | …
Panoramic  | Mast      | 30 cm    | 16 mm  | 17° | 1024x…         | …
Hazard     | Body      | 8.6 cm   | 2.8 mm | 90° | 640x480        | 9.9

[Photo: fixed Rocky 8 mast]
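To connect these camera parameters to the one-pixel / 1 cm accuracy goal, a minimal sketch of the per-pixel lateral resolution at the 10 m designation range (small-angle approximation; an illustration using the table values, not a calculation from the slides):

```python
import math

def lateral_mm_per_pixel(fov_deg, width_px, range_m):
    """Approximate ground distance spanned by one pixel at a given range."""
    rad_per_px = math.radians(fov_deg) / width_px
    return range_m * rad_per_px * 1000.0  # small-angle approximation, in mm

# FY03 navigation cameras (4 mm lens, 60 deg FOV, 640 px wide) at 10 m:
print(f"{lateral_mm_per_pixel(60, 640, 10):.1f} mm/px")   # ~16 mm/px
# Panoramic cameras (16 mm lens, 17 deg FOV, 1024 px wide) at 10 m:
print(f"{lateral_mm_per_pixel(17, 1024, 10):.1f} mm/px")  # ~2.9 mm/px
```

This is consistent with the reported results: roughly 2 cm final accuracy for one pixel of error in the 4 mm cameras, and under 1 cm for the 16 mm cameras.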

Integrated 2D/3D Tracker

[Block diagram: The operator designates the target in the first left image (a one-time, non-autonomous step), and affine (KLT) templates of various sizes are created around it (never updated). On every step, the pose estimator fuses wheel odometry with visual odometry from the hazard cameras (falling back to wheel odometry if visual odometry fails) to predict the new target location and drive the mast pointing kinematics (pan/tilt), which are updated every step. Given the locomotor command from the navigator, the target tracker acquires mast stereo images, localizes the target by normalized cross-correlation (NCC), refines it by 2D affine tracking over multiple pyramid levels (starting with the smallest template), and verifies the 2D and 3D (stereo) locations, falling back to the NCC result, the predicted 3D location, or the predicted pan/tilt angles when a verification fails, before driving a step toward the 2D/3D tracked target.]
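The stereo and 3D-verification blocks rest on standard rectified-pair triangulation. A minimal sketch under the usual rectified-stereo assumption (parameter names are illustrative, not from the task's code):

```python
def stereo_point(u_left, u_right, v, fx, cx, cy, baseline_m):
    """Triangulate a 3D point (camera frame) from a rectified stereo match.

    u_left/u_right: column of the target in the left/right image (pixels)
    v:              row of the target (same in both images after rectification)
    fx:             focal length in pixels; (cx, cy): principal point
    baseline_m:     stereo baseline in meters
    """
    disparity = u_left - u_right          # pixels; must be > 0
    z = fx * baseline_m / disparity       # depth along the optical axis
    x = (u_left - cx) * z / fx            # lateral offset
    y = (v - cy) * z / fx                 # vertical offset (square pixels assumed)
    return x, y, z
```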

Feature Tracker

Target designation and per-step tracking:
1. The rover acquires stereo images from the tracking cameras (e.g., navigation)
2. The rover sends one (left) image to the ground system
3. The operator selects a point in the image and sends the point to the rover
4. The rover receives the target and computes its 3D location using stereo vision
5. The tracker creates KLT template windows of various sizes around the target
6. For each drive command, the algorithm (see the matching sketch after this list):
   a. Creates a normalized cross-correlation (NCC) template of the target
   b. Moves the rover one step
   c. Estimates the change in rover pose (using visual odometry from the hazard cameras)
   d. Points the tracking cameras
   e. Acquires an image pair
   f. Matches the NCC template across a search window in the new image
   g. Verifies that the 3D location of the target is within an error bound based on rover pose; if this fails, uses the location predicted from rover pose alone
   h. Matches three different-size affine (KLT) templates across the search window:
      - Starts with the largest original template for a coarse match
      - Verifies the 2D location
      - [Verifies the 3D location]
      - If the match fails, uses the previous result (the NCC result when using the largest template, the previous template's result when using smaller ones)
      - Repeats with smaller KLT templates to refine the target position

End-to-end procedure with camera handoff:
7. The scientist selects a single point to track in one panoramic camera (mast-mounted 16 mm camera)
8. Compute the 3D point location using stereo
9. Grow the point to a template window that maximizes flat area
10. Command rover motion (approx. 25 cm, max. 10° heading change)
11. Estimate rover motion using visual odometry
12. If visual odometry fails, apply adaptive view-based matching
13. Point the 4 mm mast cameras using the rover motion estimate (step 11) and the 3D target location (step 8, or the latest triangulation from step 15)
14. Track the target in the 4 mm cameras:
    - Use normalized cross-correlation between consecutive frames (allows large motion of the target between frames)
    - Update affine parameters between the single original template and the current image (accurate localization with no drift)
15. Triangulate the 3D target location using the 4 mm camera images
16. Repeat steps 13–15 with the 16 mm cameras
17. Repeat from step 10
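A minimal sketch of the per-step 2D matching core described above, using OpenCV as a stand-in: the coarse match uses NCC template matching, and the paper's KLT-style affine update is approximated here with OpenCV's ECC affine alignment. All names, thresholds, and the window handling are illustrative assumptions, not the task's code.

```python
import cv2
import numpy as np

def track_step(orig_template, ncc_template, image, search_window):
    """One drive-step update: coarse NCC match, then affine refinement
    against the ORIGINAL designation template (never updated -> no drift).

    orig_template: grayscale patch saved when the operator designated the target
    ncc_template:  same-size patch cut from the previous frame (step 6a)
    image:         new grayscale image from the tracking cameras
    search_window: (x, y, w, h) region predicted from the rover pose estimate
    """
    x, y, w, h = search_window
    roi = image[y:y + h, x:x + w]        # bounds checks omitted in this sketch

    # Coarse localization: NCC tolerates the large inter-frame target motion
    # caused by slow, low-rate image acquisition.
    scores = cv2.matchTemplate(roi, ncc_template, cv2.TM_CCOEFF_NORMED)
    _, best, _, (mx, my) = cv2.minMaxLoc(scores)
    if best < 0.5:                       # illustrative threshold
        return None                      # caller falls back to pose prediction

    # Refinement: align the original template to the matched patch with an
    # affine warp (OpenCV's ECC, standing in for the task's KLT-style update).
    th, tw = orig_template.shape
    patch = roi[my:my + th, mx:mx + tw]
    warp = np.eye(2, 3, dtype=np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-4)
    try:
        _, warp = cv2.findTransformECC(orig_template, patch, warp,
                                       cv2.MOTION_AFFINE, criteria)
    except cv2.error:
        pass                             # keep the coarse NCC location
    cx, cy = warp @ np.array([tw / 2.0, th / 2.0, 1.0])
    return x + mx + cx, y + my + cy      # target center in full-image coords
```

Because the affine refinement always aligns against the single original template rather than an updated one, localization error stays bounded instead of drifting over the traverse.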

Tracking Results over Rough Terrain

[Tracking video: views from the 4 mm and 16 mm cameras]

Same Traverse, Different Target

[Video: views from the 4 mm and 16 mm cameras]

Publications

White paper:
- I. Nesnas, M. Bajracharya, E. Bandari, R. Madison, C. Kunz, M. Deans, M. Bualat, "Visual Target Tracking for Rover-based Planetary Exploration," submitted to the IEEE Aerospace Conference, Big Sky, Montana, March 2004.

New Technology Report:
- "2D/3D Visual Tracker for Rover Platforms," NTR No. 40696.

Tracking a Target Designated from 10 m

Demonstrated tracking of a target designated from 10 m away on the Rocky 8 rover in rough terrain. Enables single-cycle instrument placement for MSL, reducing 3 sols to 1 sol.
- Tracked targets in the 4 mm cameras
- Tracked targets in the 16 mm cameras
- While the rover drove over rocks up to one wheel diameter in size
- Final accuracy within 2 cm with the 4 mm cameras (one pixel of error from the designated target)
- Shorter runs with the 16 mm cameras had < 1 cm error
- Tracks even when no stereo information is available
- Integrated into CLARAty