DARPA Subterranean Challenge “Software” Projects (first version)

LeGO-LOAM Projects: Background
LeGO-LOAM is open-source code for odometry and local map making that we will use for the foreseeable future. In Visual Odometry (VO), two LiDAR scans of the robot's environment are taken at successive (and nearby) robot locations. Features are matched across the two scans in order to estimate the robot's displacement between the adjacent locations (hence the term odometry). Based on the odometry estimates, a local map of the environment is built. We want to make several improvements to LeGO-LOAM; the alignment step at the heart of the odometry is sketched below.
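
The displacement estimate above reduces to a rigid alignment of matched feature points. The following is a minimal sketch of that step (the SVD-based Kabsch solution), written from scratch for illustration; it is not LeGO-LOAM's actual code, and all names are hypothetical.

import numpy as np

def estimate_displacement(P, Q):
    """Rigid transform (R, t) aligning matched feature sets P -> Q.

    P, Q: (N, 3) arrays of corresponding feature points from two scans.
    Returns R (3x3) and t (3,) such that Q is approximately P @ R.T + t.
    """
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t

The recovered (R, t) between successive scans is the per-step odometry increment from which the local map is assembled.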

Project: LeGO-LOAM #1: IMU Fusion
Goal: Develop an estimator (e.g., a Kalman filter or a particle filter) that fuses IMU data with the visual odometry estimates.
Motivation: The fusion of these two forms of proprioception produces better maps. Currently, LeGO-LOAM's use of an IMU (Inertial Measurement Unit) is very unsophisticated.
Activity: Develop an estimator that fuses these two streams of sensor data, and test it on a ground or aerial vehicle; a sketch of the basic filter structure follows.
Contacts: Caltech: Anushri Dixit, adixit@caltech.edu; Joel Burdick, jwb@robotics.caltech.edu
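
A minimal sketch of the predict/correct structure such an estimator could take: a 1-D Kalman filter that propagates with IMU accelerations and corrects with VO position fixes. All models, noise values, and names are illustrative assumptions, not the project's design.

import numpy as np

dt = 0.01                                  # assumed IMU period (s)
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity state model
B = np.array([0.5 * dt**2, dt])            # maps acceleration into the state
Hm = np.array([[1.0, 0.0]])                # VO measures position only
Q = np.diag([1e-4, 1e-3])                  # process noise (IMU drift, assumed)
R = np.array([[1e-2]])                     # VO measurement noise (assumed)

def predict(x, P, accel):
    """Propagate state x = [position, velocity] with one IMU sample."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, vo_position):
    """Correct the state with one VO position fix."""
    y = vo_position - Hm @ x               # innovation
    S = Hm @ P @ Hm.T + R
    K = P @ Hm.T @ np.linalg.inv(S)        # Kalman gain
    return x + K @ y, (np.eye(2) - K @ Hm) @ P

x, P = np.zeros(2), np.eye(2)
x, P = predict(x, P, accel=0.2)            # one IMU sample
x, P = update(x, P, vo_position=0.001)     # one VO fix

A full solution would estimate 3-D pose (and likely IMU biases), e.g., with an error-state Kalman filter or a particle filter, but the predict/update loop has the same shape.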

Project: LeGO-LOAM #2: Improved Feature Extraction
Goal: Develop a better feature extractor that finds planes (e.g., walls, ceilings, floors) and other useful features in the LiDAR returns.
Motivation: When features are present in LiDAR data, they can be used to improve VO estimates and map quality. Planes are common features in man-made environments, and we believe LeGO-LOAM's feature detector can be improved.
Activity: Develop feature extractors for Velodyne LiDARs, and test them on vehicles; a toy plane-fitting baseline is sketched below.
Contacts: Caltech: Anushri Dixit, adixit@caltech.edu; Joel Burdick, jwb@robotics.caltech.edu
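
One simple baseline for plane finding is RANSAC on the raw point cloud. The sketch below is a toy implementation under assumed thresholds; a real extractor would exploit the Velodyne's ring structure, and LeGO-LOAM's own detector instead selects features by local smoothness.

import numpy as np

def ransac_plane(points, n_iters=200, inlier_thresh=0.05, rng=None):
    """Fit one dominant plane to an (N, 3) cloud.

    Returns (normal, d, inlier_mask), with the plane normal . p + d = 0.
    """
    rng = rng or np.random.default_rng()
    best_mask, best_model = None, None
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:                    # degenerate (collinear) sample
            continue
        n /= norm
        d = -n @ sample[0]
        dist = np.abs(points @ n + d)      # point-to-plane distances
        mask = dist < inlier_thresh
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_model = mask, (n, d)
    return best_model[0], best_model[1], best_mask

Repeatedly fitting and removing the dominant plane yields the walls, floor, and ceiling candidates that the VO stage can then match between scans.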

Project: LeGO-LOAM #3: Aerial Version
Goal: Modify LeGO-LOAM so that it works with flying vehicles.
Motivation: LeGO-LOAM was developed for ground robots. Several of its algorithms take shortcuts, or modify the analysis, in ways that are appropriate only for ground vehicles. We would like a version that works for aerial vehicles.
Activity: Identify LeGO-LOAM's implicit ground-robot assumptions (one is illustrated below), upgrade the code to remove them, and test on aerial vehicles.
Contacts: Caltech: Anushri Dixit, adixit@caltech.edu; Joel Burdick, jwb@robotics.caltech.edu
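
To make one ground-vehicle bias concrete: a ground robot's odometry can often be accumulated as planar (x, y, yaw) increments, while an aerial vehicle needs full 6-DoF (SE(3)) composition. The sketch below, using SciPy's rotation utilities with placeholder values, shows the SE(3) accumulation an aerial port would need.

import numpy as np
from scipy.spatial.transform import Rotation

def compose(pose, increment):
    """Apply a body-frame 6-DoF increment (R_inc, t_inc) to pose (R, t)."""
    R, t = pose
    R_inc, t_inc = increment
    return (R * R_inc, t + R.apply(t_inc))

# Accumulate a few scan-to-scan increments (placeholder values).
pose = (Rotation.identity(), np.zeros(3))
increments = [
    (Rotation.from_euler('zyx', [0.10, 0.02, 0.01]), np.array([0.5, 0.0, 0.05])),
    (Rotation.from_euler('zyx', [0.05, -0.01, 0.00]), np.array([0.5, 0.0, -0.02])),
]
for inc in increments:
    pose = compose(pose, inc)
# A planar (x, y, yaw) state would silently discard the z, roll, and
# pitch components accumulated here.

Beyond the state representation, LeGO-LOAM's ground-plane segmentation step (the "ground-optimized" part of its name) also needs rethinking for a vehicle that is not on the ground.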

Project: LeGO-LOAM #4: Multi-Robot Mapping
Goal: Develop a multi-robot version of LeGO-LOAM.
Motivation: LeGO-LOAM currently incorporates data from a single robot. To map a large space, we must integrate the data from all vehicles and their heterogeneous sensors into a coherent, unified map.
Activity: Develop a multi-robot mapping algorithm that builds on the GTSAM and LeGO-LOAM frameworks; a toy pose-graph example follows. This is Ph.D.-level research.
Contacts: Caltech: Anushri Dixit, adixit@caltech.edu; Joel Burdick, jwb@robotics.caltech.edu
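
A toy two-robot pose graph in GTSAM's Python API (gtsam 4.x assumed) shows the kind of structure involved: each robot contributes an odometry chain, and one inter-robot measurement ties the two maps into a common frame. All keys, measurements, and noise values are placeholders.

import numpy as np
import gtsam

graph = gtsam.NonlinearFactorGraph()
odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.05]))
prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([1e-3, 1e-3, 1e-3]))

A = lambda i: gtsam.symbol('a', i)         # robot A's pose keys
B = lambda i: gtsam.symbol('b', i)         # robot B's pose keys

# Anchor robot A's first pose; robot B is tied to A's frame only through
# the inter-robot factor below.
graph.add(gtsam.PriorFactorPose2(A(0), gtsam.Pose2(0, 0, 0), prior_noise))

# Per-robot odometry chains (e.g., from each robot's LiDAR odometry).
graph.add(gtsam.BetweenFactorPose2(A(0), A(1), gtsam.Pose2(1, 0, 0), odom_noise))
graph.add(gtsam.BetweenFactorPose2(B(0), B(1), gtsam.Pose2(1, 0, 0), odom_noise))

# Inter-robot constraint, e.g., both robots observed the same junction.
graph.add(gtsam.BetweenFactorPose2(A(1), B(0), gtsam.Pose2(0, 2, 0), odom_noise))

initial = gtsam.Values()
for k, p in [(A(0), (0, 0)), (A(1), (1, 0)), (B(0), (1, 2)), (B(1), (2, 2))]:
    initial.insert(k, gtsam.Pose2(p[0], p[1], 0))

result = gtsam.GaussNewtonOptimizer(graph, initial).optimize()
print(result.atPose2(B(1)))                # robot B's pose, in robot A's frame

The research challenge is producing reliable inter-robot factors (e.g., map-to-map loop closures) from heterogeneous sensors, not the optimization itself.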

Project: Ground Robot Motion Planning System
Goal: Develop a local motion planner for a Husky robot equipped with a Velodyne LiDAR, an IMU, and Intel RealSense sensors. The planner should take a target location from a human operator (usually a few tens of meters away) and construct and execute a collision-free path to that goal.
Motivation: During the SubT competition, ground robots must autonomously reach targets given by the human operator.
Activity: Start with an existing ROS planning package (see the sketch below) and adapt it as necessary for the Husky robots.
Contacts: Caltech: Drew Singletary (grad student), drewsingletary1@gmail.com; JPL: Ali Agha (team lead), aliakbar.aghamohammadi@jpl.nasa.gov
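
A common starting point is the stock ROS navigation stack: move_base already chains a global and a local planner and accepts operator goals through an action interface. The snippet below shows that standard actionlib pattern; the frame name and coordinates are placeholders.

import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

rospy.init_node('subt_goal_sender')
client = actionlib.SimpleActionClient('move_base', MoveBaseAction)
client.wait_for_server()

goal = MoveBaseGoal()
goal.target_pose.header.frame_id = 'map'   # assumed global frame
goal.target_pose.header.stamp = rospy.Time.now()
goal.target_pose.pose.position.x = 20.0    # operator target, ~tens of meters out
goal.target_pose.pose.position.y = 5.0
goal.target_pose.pose.orientation.w = 1.0

client.send_goal(goal)
client.wait_for_result()
rospy.loginfo('Result state: %s', client.get_state())

Adapting this to SubT mostly means building the costmaps from the Velodyne and RealSense data and tuning the local planner for the Husky's footprint and tunnel-scale clearances.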