Simultaneous Localization and Map Building System for Prototype Mars Rover CECS 398 Capstone Design I October 24, 2001.

Project Mentors: Dr. Jeffrey Uhlmann, Dr. Marjorie Skubic
Project Members: Karen Casey, Kenneth Estes, Laura Heffernan

Overview
- Problem Definition
- Background
- Goals and Objectives

Overview
- Requirements Analysis
- System Components
- Constraints
- Requirements
- Alternative Approaches
- Testing Methods
- Scheduling

Background: Covariance Intersection Research and Development
- Naval Research Laboratory
  - Decentralized data fusion problems
  - Dynamic map building and localization
- NASA Mars Rover
  - Onboard data fusion system
  - Simultaneous Localization and Map Building System (SLAM)

Background: Covariance Intersection Research and Development
[Image: the Sojourner rover on Mars, photographed by the Mars Pathfinder lander]

Background: Covariance Intersection Research and Development
What is data fusion?
- The problem of combining information from multiple sources into a single estimate
- Covariance Intersection (CI) consistently provides a conservatively fused estimate of its inputs

Background: Kalman Filter vs. CI Filter
Kalman Filter:
- Represents information about estimated or measured quantities in terms of a mean and a covariance matrix
- Can combine estimates with a known degree of independence

Background: Kalman Filter vs. CI Filter
CI Filter:
- Does not make any assumptions about the degree of independence between the information it fuses
- Exhibits considerably more stable behavior than Kalman filters

Background: Kalman Filter vs. CI Filter
[Equations for combining estimates {a, A} and {b, B} with the Kalman filter and with the CI filter]
The two sets of equations differ only by the parameter ω.
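The combination equations on this slide were rendered as images and did not survive transcription. The standard information-form equations from the CI literature, consistent with the slide's note that the two filters differ only in the parameter ω, are as follows (this is a reconstruction, not recovered slide content):

```latex
% Kalman filter: fuse {a, A} and {b, B} assuming uncorrelated errors
C^{-1} = A^{-1} + B^{-1}, \qquad
C^{-1} c = A^{-1} a + B^{-1} b

% Covariance Intersection: no correlation assumption, \omega \in [0, 1]
C^{-1} = \omega A^{-1} + (1 - \omega)\, B^{-1}, \qquad
C^{-1} c = \omega A^{-1} a + (1 - \omega)\, B^{-1} b
```

Setting ω to 1 or 0 recovers one input estimate exactly; intermediate values blend the two without ever claiming more certainty than the inputs justify.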

Background: Kalman Filter vs. CI Filter
Principal advantage of CI:
- Permits filtering and data fusion without the need to know the degree of correlation between the estimates being fused
- Applications include simultaneous map building and localization for autonomous vehicles
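The CI update described above is small enough to sketch directly. This is an illustrative implementation, not the project's actual code; choosing ω to minimize the trace of the fused covariance is one common criterion and is an assumption here:

```python
import numpy as np

def covariance_intersection(a, A, b, B, n_omega=101):
    """Fuse estimates {a, A} and {b, B} whose cross-correlation is unknown.

    Sweeps omega over [0, 1] and keeps the fusion with the smallest
    trace of the fused covariance (a common selection criterion).
    Assumes A and B are symmetric positive definite.
    """
    A_inv, B_inv = np.linalg.inv(A), np.linalg.inv(B)
    best = None
    for omega in np.linspace(0.0, 1.0, n_omega):
        # CI information form: C^-1 = w A^-1 + (1-w) B^-1
        C = np.linalg.inv(omega * A_inv + (1.0 - omega) * B_inv)
        if best is None or np.trace(C) < best[0]:
            c = C @ (omega * A_inv @ a + (1.0 - omega) * B_inv @ b)
            best = (np.trace(C), c, C)
    return best[1], best[2]
```

When one input is strictly tighter in every direction, the trace criterion simply keeps that input, which is the conservative behavior CI is designed for.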

Goals
- Prove that CI is a reliable solution to the SLAM problem
- Use a test robot to identify predefined beacons
- Use CI on the gathered information to simultaneously localize the robot and build a map

Objectives
- Research and obtain a robot that can move in two dimensions; we will also need speed information from the robot in order to dependably update our map
- Construct a small, controlled test-bed environment with identifiable beacons

Objectives
- Develop software that takes the image and extracts the relative beacon positions needed to build a map
- Develop software that controls the robot's movements and navigates it around its environment so it can map the beacons in the test bed
- Use Covariance Intersection to estimate the beacon locations so that the relative map can be updated

System Components
- Palm Pilot Robot
- Handspring Visor Prism
- Eyemodule2 camera
- CI software

System Components: Palm Pilot Robot
- Constructed from the Palm Pilot Robot Kit
- Uses a Palm Pilot to move around and sense the nearby environment

System Components: Handspring Visor Prism
Used for:
- Robot motion control
- Object avoidance
- Image capture and processing system
- CI program
- Localization
- Map building

System Components: Eyemodule2 Camera
- Serves as the integrated image capture device
- Connects directly to the Visor Prism
- Produces color images

System Components: CI Software
The modular program will include:
- Robot movement and motor control
- Image processing
- Coordinate triangulation
- Covariance matrix calculations
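The slides list "coordinate triangulation" without detail. One minimal interpretation, assumed here for illustration, is converting a beacon observation made in the robot's frame (range and bearing) into map coordinates using the robot's current pose estimate:

```python
import math

def beacon_global_position(robot_x, robot_y, robot_heading, rng, bearing):
    """Convert a range/bearing observation in the robot frame to map
    coordinates.

    robot_heading and bearing are in radians; bearing is measured from
    the robot's forward axis, counter-clockwise positive.
    """
    theta = robot_heading + bearing
    return (robot_x + rng * math.cos(theta),
            robot_y + rng * math.sin(theta))
```

The covariance of this position would then be fed, together with any prior estimate of the same beacon, into the CI update.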

Constraints
- Vision system
  - Environment
    - Beacon size and height
    - Walls and floor
  - Color recognition
- Hardware interface connections
- Maintenance

Constraints: Vision System
Environment:
- Beacons of the same size and height
- White walls
- White, level floor
Color recognition:
- Beacons of different colors, with the largest threshold difference between them, for individual beacon identification
The objective is to minimize noise in the vision system.

Constraints: Hardware Interface Connections
- Need a serial connection between the Visor and the robot
- Need communication between the development software and the Visor to download code

Constraints: Maintenance
Minimal maintenance:
- Connections and wiring
- C program for Palm OS
- OS version
- Visor model
- Camera model
- NASA testing and modifications

Requirements: Cost Requirements
Time:
- Research the hardware/software components
- Implement the robot and software
- Meet with project mentors
Resources:
- Mentors will serve as our basic reference for background and implementation information

Requirements: Mentors as a Resource
Dr. Marjorie Skubic, Assistant Professor
Specialties:
- Sensory perception
- Pattern recognition
- Intelligent control
- Robotics

Requirements: Mentors as a Resource
Dr. Jeffrey Uhlmann, Assistant Professor
Specialties:
- Kalman filtering
- Statistical algorithms
- Autonomous vehicles and robotics
- Large-scale simulation

Requirements: Cost Requirements
Resources:
- Reference books (CodeWarrior)
- Visor developer support for code to access the structure that contains pixel information
- Covariance Intersection software libraries
Facilities:
- EBW Lab 222 with Artemis

Requirements: Cost Requirements
Money:
- Purchase a serial connection converter to handle communication between the robot and the Visor
- Materials for the testing environment
Performance Requirements:
- No required time limit for system response

Alternative Solutions for Finding Beacon Distance
Infrared LEDs and sensors
Pros:
- Inexpensive
- Proven to work
Cons:
- Need filters
- Occupy too much space
- Add complexity

Alternative Solutions for Finding Beacon Distance
Sonar emitter and sensors
Pros:
- More information about the environment
Cons:
- Too much information
- Adds complexity

Testing: Navigation and Control
- Test the limitations of robot control
- Successful if the robot is able to move about the environment and move within visual range of the beacons for identification

Testing: Identification of Beacons
- Test the ability of our software to distinguish between the background and a beacon
- Need to be able to identify the size of the beacon and the position of its center
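A minimal sketch of the beacon/background test described above, assuming the captured image is available as a grid of RGB tuples (the actual Eyemodule2 pixel format is not specified in the slides):

```python
def beacon_blob(pixels, target, tol):
    """Find the size (pixel count) and center of a beacon in an RGB grid.

    pixels: 2D list of (r, g, b) tuples; target: the beacon's nominal
    color; tol: per-channel threshold. Returns (size, (row, col)) for the
    blob center, or (0, None) if no matching pixels are found.
    """
    hits = [(r, c)
            for r, row in enumerate(pixels)
            for c, px in enumerate(row)
            if all(abs(px[i] - target[i]) <= tol for i in range(3))]
    if not hits:
        return 0, None
    n = len(hits)
    return n, (sum(r for r, _ in hits) / n, sum(c for _, c in hits) / n)
```

The white-walls/white-floor constraint above is what makes this simple threshold test plausible: any strongly colored pixel can be attributed to a beacon.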

Testing: Identification of Beacons
- Use trial and error to build a hash table associating the size of a beacon in the image with the distance to the beacon
- Successful if the software can identify beacons, determine the distance to them, and reduce the information to a specific XYZ coordinate for each beacon
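The size-to-distance hash table described above might look like the following. The calibration values here are made-up placeholders, since the real entries would come from the trial-and-error measurements:

```python
# Calibration table: apparent beacon size in pixels -> distance.
# These numbers are illustrative placeholders only; a beacon that
# appears larger is closer.
SIZE_TO_DISTANCE = {400: 20.0, 100: 40.0, 25: 80.0}

def distance_from_size(size_px):
    """Look up the calibration entry whose pixel size is nearest the
    observed one. A refinement would interpolate between entries."""
    key = min(SIZE_TO_DISTANCE, key=lambda s: abs(s - size_px))
    return SIZE_TO_DISTANCE[key]
```

Nearest-entry lookup keeps the table small; the granularity of the table directly sets the quantization error in the range measurement fed to CI.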

Testing: CI Solution
- The hardware is used to test our CI solution
- Successful when the robot is able to identify beacons and move around while maintaining an acceptably accurate map

Schedule

Conclusion
- Further refine the scope of the project
- Familiarize ourselves with the operation of the system components
- Design the algorithm for image processing
- Continue research into CI