ROSsified flying

Bridgette Eichelburger ’14, David Lingenbrink ’14, Yael Mayer ’11, Obosa Obazuaye ’14, Becca Thomas ’14, Maia Valcarce ’13, Joshua Vasquez ’14, Garrett Wong ’14, Tim Yee ’13, Christine Alvarado, Zachary Dodds, Geoff Kuenning, and Ran Libeskind-Hadas

Ground and aerial robots offer complementary strengths. Flying vehicles offer novel sensing vantage points, while wheeled platforms can carry sensors with higher power demands and permit more accurate odometry and control. This project focused on the aerial facets of coordinated ground/aerial autonomy. We used off-the-shelf hardware and Willow Garage’s freely available ROS software, including several project-specific contributions.

Mutual support: (left) an iRobot Create leads an ARDrone up to an obstacle; the drone’s overhead view then provides the correct direction to navigate (right) factorial-finding and image segmentation

Hardware

This project explored the capabilities of the inexpensive ($300) Parrot ARDrone, a quadrotor helicopter. The iRobot Create was the foundation for our ground robots. The ARDrone is sensitive to its surroundings and requires careful scaffolding to make deliberate movements autonomously. The Create offers an extensible base for a Kinect, laptop, and shelf.

(top) the $300 ARDrone (bottom) Mudd’s ground vehicle, based on Michael Ferguson’s and Willow Garage’s designs

Software: Python, plus…

We use Willow Garage’s Robot Operating System, or ROS, for its many drivers, its OpenCV vision library, and its communications infrastructure. We contributed a Python-based ROS ARDrone driver with support for both of the copter’s cameras. In addition, OpenCV made it easy to create a Python-based video-processing module that supports live and captured streams equally well. The navigation and localization routines used ROS’s C++ APIs and interfaces.

Image matching: (left) April Tags incorporated into ROS (right) OpenCV’s SURF features

Hula-hoop hopping: (left) our visualization of a larger graph of locations (right) a two-node graph of locations indicated by April Tags within hula hoops

The demonstrations and algorithms shown here were implemented with sliding-scale autonomy, so that a human can intervene, if necessary or desired, at almost any step of the process.

Tagless Localization?

Ideally, the April Tags and other markers used in these tasks could be partially or wholly replaced with uncontrived image cues. We explored the system’s ability to localize itself within a visual map using the drone’s forward-facing camera.

SURF feature matching: two pairs and their respective scores

A summary of the interfaces used and the accuracy attained with SURF-based (tagless) localization.

Outreach

This project’s resources are accessible enough to support many research and educational objectives. Western State colleagues joined us to develop Kinect-based control of their drone and Create. Interactive demos for visitors offer hands-on insight into the challenges underlying aerial autonomy.

Acknowledgments

This project both leveraged and contributed to ROS’s vision+control codebase. We gratefully acknowledge support from HHMI, The Rose Hills Foundation, the NSF CPATH project #, and funds provided by Harvey Mudd.
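As a concrete sketch of the video-processing pipeline described under Software, the following minimal Python node subscribes to one of the drone’s camera streams and hands each frame to OpenCV. This is an illustration under assumptions, not the project’s actual driver code: the topic name /ardrone/front/image_raw is hypothetical, and the conversion shown uses the imgmsg_to_cv2 call from later cv_bridge releases.

    #!/usr/bin/env python
    # Sketch: pipe the ARDrone's video into OpenCV via ROS.
    # Assumptions: a driver publishing sensor_msgs/Image on
    # /ardrone/front/image_raw (hypothetical topic name), and the
    # imgmsg_to_cv2 API from later cv_bridge releases.
    import rospy
    import cv2
    from cv_bridge import CvBridge
    from sensor_msgs.msg import Image

    bridge = CvBridge()

    def on_frame(msg):
        # Convert the ROS Image message to an OpenCV (numpy) array.
        frame = bridge.imgmsg_to_cv2(msg, desired_encoding="bgr8")
        # Stand-in processing step: grayscale conversion, the usual
        # precursor to marker detection or feature extraction.
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        cv2.imshow("ardrone front camera", gray)
        cv2.waitKey(1)

    if __name__ == "__main__":
        rospy.init_node("drone_video_listener")
        rospy.Subscriber("/ardrone/front/image_raw", Image, on_frame)
        rospy.spin()

Because the callback sees only sensor_msgs/Image messages, the same node runs unchanged whether the stream arrives live from the drone or is replayed from a recorded bag file, which is what lets live and captured streams be handled equally well.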
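The SURF-based matching behind the tagless-localization experiments can likewise be sketched with OpenCV. The snippet below detects SURF keypoints in two images, keeps matches that pass Lowe’s ratio test, and reports a simple score; the hessianThreshold and ratio values and the score definition are illustrative choices rather than the project’s exact parameters, and SURF requires OpenCV’s non-free xfeatures2d (opencv-contrib) module.

    # Sketch: score the similarity of two views with SURF features.
    # Requires opencv-contrib (cv2.xfeatures2d); thresholds and the
    # scoring rule are illustrative, not the project's settings.
    import cv2

    def surf_match_score(path_a, path_b, hessian=400, ratio=0.7):
        img_a = cv2.imread(path_a, cv2.IMREAD_GRAYSCALE)
        img_b = cv2.imread(path_b, cv2.IMREAD_GRAYSCALE)
        surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian)
        kp_a, des_a = surf.detectAndCompute(img_a, None)
        kp_b, des_b = surf.detectAndCompute(img_b, None)
        if des_a is None or des_b is None:
            return 0.0  # one image yielded no features
        # For each descriptor in A, find its two nearest neighbors in
        # B and keep the match only if the best is clearly better than
        # the runner-up (Lowe's ratio test), discarding ambiguous ones.
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        pairs = matcher.knnMatch(des_a, des_b, k=2)
        good = [p[0] for p in pairs
                if len(p) == 2 and p[0].distance < ratio * p[1].distance]
        # Score: fraction of A's keypoints with a confident partner in B.
        return len(good) / float(len(kp_a))

    if __name__ == "__main__":
        print(surf_match_score("view1.png", "view2.png"))

Ranking stored map images by such a score against the current forward-camera frame is one plausible way to pick the drone’s most likely location within the visual map.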