Mapping for all
Elaine Shaver, Peter Mawhorter, Zeke Koziol, Zach Dodds
… all platforms … all students

Robot mapping has been a compelling recent success of the AI and robotics communities. Yet, even as modern mapping algorithms form a great deal of current work at venues such as AAAI, RSS, ICRA, and IROS, these new spatial-reasoning approaches do not always receive commensurate attention in AI and robotics courses, projects, and assignments, particularly at the undergraduate level. One reason for this is that curricular resources for robot mapping are scarce. This work fills the gap between research code, e.g., at openslam.org, and low-cost autonomous platforms. Its benefits include (1) a lightweight installation with no dependencies, (2) cross-platform compatibility, and (3) full implementations, suitable for investigating or assigning the algorithms in their own right or as subroutines to other tasks.

FastSLAM 1.0 on the iRobot Create
Consistent mapping requires only the ability to re-identify landmarks; with an on-board PC, an iRobot Create has plenty of processing power and can wander on its own using bump sensors. Strap an iSight camera to the top, and vision code running on the laptop can recognize blobs of color as landmarks, making it very simple to run FastSLAM on the Create. To date, we have implemented FastSLAM 1.0 and extended it with unknown data association. The algorithm builds consistent maps using only vision and landmarks of known shape and distinguishable color. Each particle represents one possible robot path, and within each particle a Kalman filter tracks the position of every landmark.

SLAM in Simulation
Because the mapping, control, visualization, and vision components are all built separately, each piece can be replaced by a simulation. In the picture at right, the vision and control components have been simulated. This particular software setup was used to develop and test the algorithm.
Figure: a single particle's observation and map; in this case, the estimated pose is away from the robot's real location.

Mapping with Myro: the Scribbler & Fluke
The Scribbler as it comes off the shelf is a long way from mapping the world. It provides infrared detectors for obstacle detection, but it lacks a camera to identify landmarks. We have used the IPRE Fluke, a camera that interfaces with the Scribbler and provides additional control capabilities in Python. With the Fluke's images, the ability to wander, and an accessible coding platform, the Scribbler ends up being a very welcoming platform for FastSLAM. Blob extraction and processing are handled by the open-source OpenCV library, and landmark position estimates are derived from each region's width and lateral position. Although these estimates are very noisy, the FastSLAM algorithm filters the resulting uncertainty, reducing variance through repeated measurements.

Qwerk! The Qwerkbot
While both the Scribbler and the iRobot Create have proven to be adept platforms, it often feels as though we must fight for every skerrick of progress. Enter the Qwerk, whose capabilities (sonar, panning motors, substantial room for expansion) allow additional autonomous behavior. It can run the same FastSLAM implementation as the other platforms; further, it enables evidence-grid mapping and keener obstacle avoidance.
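To make the FastSLAM 1.0 step described above concrete, here is a minimal Python sketch of the per-particle update: each particle carries one small Kalman filter per landmark, initialized on first sighting and refined on re-observation, with the measurement likelihood serving as that particle's importance weight. The class names, the range-bearing measurement model, and the noise matrix Q are illustrative assumptions, not the poster's actual code.

import numpy as np

class Landmark:
    """One EKF: mean and covariance of a single landmark's 2-D position."""
    def __init__(self, mu, sigma):
        self.mu = mu
        self.sigma = sigma

class Particle:
    """One hypothesized robot path, plus an independent EKF per landmark."""
    def __init__(self, pose):
        self.pose = pose        # (x, y, theta)
        self.landmarks = {}     # landmark id -> Landmark
        self.weight = 1.0

def expected_measurement(pose, lm_mu):
    """Predicted (range, bearing) of a landmark from this particle's pose."""
    dx, dy = lm_mu[0] - pose[0], lm_mu[1] - pose[1]
    return np.array([np.hypot(dx, dy), np.arctan2(dy, dx) - pose[2]])

def measurement_jacobian(pose, lm_mu):
    """Jacobian of (range, bearing) with respect to the landmark position."""
    dx, dy = lm_mu[0] - pose[0], lm_mu[1] - pose[1]
    q = dx * dx + dy * dy
    r = np.sqrt(q)
    return np.array([[dx / r, dy / r],
                     [-dy / q, dx / q]])

def update_landmark(p, lm_id, z, Q):
    """EKF update for one sighted landmark; returns this sighting's weight."""
    if lm_id not in p.landmarks:
        # First sighting: place the landmark where the measurement says it is.
        r, b = z
        mu = np.array([p.pose[0] + r * np.cos(b + p.pose[2]),
                       p.pose[1] + r * np.sin(b + p.pose[2])])
        H_inv = np.linalg.inv(measurement_jacobian(p.pose, mu))
        p.landmarks[lm_id] = Landmark(mu, H_inv @ Q @ H_inv.T)
        return 1.0
    lm = p.landmarks[lm_id]
    H = measurement_jacobian(p.pose, lm.mu)
    S = H @ lm.sigma @ H.T + Q                    # innovation covariance
    K = lm.sigma @ H.T @ np.linalg.inv(S)         # Kalman gain
    innov = z - expected_measurement(p.pose, lm.mu)
    innov[1] = (innov[1] + np.pi) % (2 * np.pi) - np.pi   # wrap the bearing
    lm.mu = lm.mu + K @ innov
    lm.sigma = (np.eye(2) - K @ H) @ lm.sigma
    # Importance weight: likelihood of the observation under this particle.
    return float(np.exp(-0.5 * innov @ np.linalg.solve(S, innov))
                 / np.sqrt(np.linalg.det(2 * np.pi * S)))

In a full filter, each particle first samples a new pose from the motion model, this update then runs once per sighted landmark, and the particle set is resampled in proportion to the product of the returned weights.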
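As a companion to the blob-based sensing described under Mapping with Myro, the following sketch shows one plausible way to turn a colored region's lateral position and width into bearing and range estimates with OpenCV. The HSV thresholds, focal length, landmark width, and the function name detect_landmarks are hypothetical values chosen for illustration, not calibrated numbers from the project.

import cv2
import numpy as np

FOCAL_PX = 600.0         # assumed focal length of the camera, in pixels
LANDMARK_WIDTH_M = 0.10  # assumed physical width of each colored landmark

def detect_landmarks(frame_bgr, hsv_lo=(100, 120, 60), hsv_hi=(130, 255, 255)):
    """Return (bearing_rad, range_m) estimates for blobs of one color."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    # OpenCV 4.x signature: findContours returns (contours, hierarchy).
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    img_h, img_w = mask.shape
    estimates = []
    for c in contours:
        x, y, blob_w, blob_h = cv2.boundingRect(c)
        if blob_w < 5:                 # discard tiny, noisy regions
            continue
        center_x = x + blob_w / 2.0
        # Bearing from the blob's lateral offset (positive = left of center).
        bearing = np.arctan2(img_w / 2.0 - center_x, FOCAL_PX)
        # Range from the blob's apparent width, via a pinhole-camera model.
        rng = LANDMARK_WIDTH_M * FOCAL_PX / blob_w
        estimates.append((bearing, rng))
    return estimates

Estimates of this kind are exactly the noisy range-bearing measurements that the FastSLAM update above is designed to filter.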
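The evidence-grid mapping mentioned for the Qwerk can be illustrated by a small log-odds occupancy-grid update driven by a single sonar reading. The cell size, log-odds increments, and the simple ray walk below are assumptions made for the sketch, not the Qwerkbot's implementation.

import numpy as np

CELL_M = 0.05               # assumed grid resolution, meters per cell
L_OCC, L_FREE = 0.9, -0.4   # log-odds increments for hit / traversed cells

class EvidenceGrid:
    def __init__(self, size=200):
        self.logodds = np.zeros((size, size))   # 0 = unknown
        self.origin = size // 2                 # robot starts at grid center

    def cell(self, wx, wy):
        """World coordinates (meters) to grid indices."""
        return (self.origin + int(round(wx / CELL_M)),
                self.origin + int(round(wy / CELL_M)))

    def update(self, pose, sonar_range, max_range=3.0):
        """Walk along the sonar beam: traversed cells accumulate free-space
        evidence; the endpoint accumulates obstacle evidence if the beam hit."""
        x, y, theta = pose
        hit = sonar_range < max_range
        steps = int(min(sonar_range, max_range) / CELL_M)
        for i in range(steps + 1):
            d = i * CELL_M
            gx, gy = self.cell(x + d * np.cos(theta), y + d * np.sin(theta))
            if not (0 <= gx < self.logodds.shape[0]
                    and 0 <= gy < self.logodds.shape[1]):
                break
            if i == steps and hit:
                self.logodds[gx, gy] += L_OCC
            else:
                self.logodds[gx, gy] += L_FREE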