A Robotic Platform for Exploring Emergent Behavior

Felix Duvallet, Aaron Johnson, Ryan Kellogg, James Kong, Eugene Marinelli, Alexander May, Suresh Nidhiry, Iain Proctor, Gregory Tress, Kevin Woo

Abstract
In nature, some of the most successful species are those that form colonies. While individually vulnerable, myopic, and simple-minded, the ant is remarkably effective in groups. We take the same philosophy in designing robots: we seek to explore how complex the behavior of a robot colony can become when the constituent robots are, like the ant, limited in their sensory and processing capabilities yet inexpensive and abundant. To this end, we have developed a platform consisting of a homogeneous group of robots that has enabled us to explore simple behaviors. The platform will also support further research in many robotics topics, including multi-robot localization, large-scale emergent behaviors, communication, and multi-robot cooperation.

Why a colony?
- Accomplish tasks that are not feasible with a single robot.
- Redundancy as a way of achieving robustness: robots can often fail, especially in dangerous environments, but an individual failure will not halt the task, and the capabilities of the colony degrade gracefully.
- Interesting research questions: emergent behaviors, multi-robot planning and control, cooperation, and decentralized Simultaneous Localization and Mapping (SLAM).

Our Colony
- Several homogeneous robotic agents working together to solve a problem.
- Small size and inexpensive design.
- Leader-less group.
- Sensors and a wireless communication network.
- Complex global behaviors created from simple local actions.

Emergent Behavior
- Complex global patterns arise from simple local rules.
- Each robot has limited sensory and processing capabilities.
- Multiple simple agents combine to create intricate behaviors.

Base Design and Microcontroller
- The Firefly+ is the main microcontroller board of each robot, a custom design built specifically for the Colony project.
- ATmega128 microcontroller; code is compiled with avr-gcc.
- Backlit LCD provides status and debugging information.
- Easily accessible analog and digital inputs and outputs, with interfaces for sensors and other hardware devices.
- Serial connection to a computer.
- User input: two buttons and a potentiometer.
- Light and sound elements; a multi-colored LED (the "orb") and other LEDs provide user feedback and status indications.

Sensors
- Bump sensors are attached to the front and rear of the robot.
- A Sharp IR rangefinder measures distances to objects.
- The Bearing and Orientation Module (BOM) allows a robot to determine the relative angular position of other robots.

BOM specifications:
- Ring of IR emitter-detector pairs.
- Emitter mode: all emitters are powered simultaneously, acting as a beacon.
- Detector mode: each detector is polled for an analog intensity reading; the most excited detector points toward the closest robot currently in emitter mode.
- The BOMs are coplanar among colony robots and require line-of-sight visibility.

Communication
- Self-assembling, ad-hoc wireless network built on XBee wireless modules (low power, low cost).
- Leader-less: no single robot has more responsibility than the others.
- The network acts as a token ring, with each robot periodically transmitting; all robots are always listening, but only one transmits at a time.
- The wireless network is closely tied to BOM operation: bearing data can be linked to wireless packets containing a robot's identification, so each robot can gather bearing information for every other robot. This information can be used in a localization algorithm to create a map of the robot positions (a minimal sketch of gathering these bearings follows below).

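To illustrate how bearing and identity are combined, the sketch below shows one way a listening robot could record a bearing for whichever robot currently holds the transmit slot: poll every BOM detector, take the most excited one, and file the resulting angle under the transmitter's ID. This is a minimal sketch under assumed names; the detector count and the helpers bom_read_detector() and wireless_current_transmitter() are illustrative stand-ins, not the Colony firmware's actual API.

#include <stdio.h>
#include <stdint.h>

#define NUM_DETECTORS 16   /* assumed size of the BOM detector ring */
#define NUM_ROBOTS     8   /* assumed colony size for this example  */

/* Bearing (in degrees) to each other robot, indexed by robot ID. */
static int16_t bearing_to[NUM_ROBOTS];

/* Stub: on the real robot this would poll the ADC channel for one
 * IR detector in the BOM ring; here it returns canned data so the
 * sketch is self-contained. */
static uint16_t bom_read_detector(uint8_t i) {
    static const uint16_t sample[NUM_DETECTORS] =
        {12, 15, 20, 90, 240, 310, 220, 80, 30, 18, 14, 11, 10, 9, 11, 13};
    return sample[i % NUM_DETECTORS];
}

/* Stub: ID of the robot that currently holds the token and is
 * therefore the only one transmitting (and emitting on its BOM). */
static uint8_t wireless_current_transmitter(void) { return 3; }

/* Poll every detector, find the most excited one, and convert its
 * index on the ring to a bearing relative to the robot's heading. */
static int16_t bom_estimate_bearing(void) {
    uint8_t best = 0;
    uint16_t best_val = 0;
    for (uint8_t i = 0; i < NUM_DETECTORS; i++) {
        uint16_t v = bom_read_detector(i);
        if (v > best_val) { best_val = v; best = i; }
    }
    return (int16_t)((best * 360) / NUM_DETECTORS);
}

int main(void) {
    /* One token-ring slot: the transmitting robot's BOM acts as a
     * beacon, so the bearing we measure belongs to that robot ID. */
    uint8_t who = wireless_current_transmitter();
    bearing_to[who] = bom_estimate_bearing();
    printf("robot %u is at bearing %d degrees\n", who, bearing_to[who]);
    return 0;
}
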
Behaviors

Implemented Behaviors
- Hunter/Prey: the hunter chases the prey, and the two switch roles when the prey is caught. Uses a combination of the BOM, the Sharp IR rangefinder, the bump sensors, and the wireless network (a minimal control-loop sketch follows after this list).
- Hunger: the robot simulates hunger, feeding, and emotion.
- Follow the leader: robots follow each other.

Simulated Behaviors
- Follow the leader: robots follow one another in a set order. Each robot runs a very simple behavior, and the combination creates an interesting large-scale behavior.
- Hunter/Prey: one prey (red), many hunters (blue). The prey moves, and the hunters converge on it.
- Formation control: the robots create a shape (in this case a circle). If any robot is moved, the others move to compensate and retain the formation.

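The Hunter/Prey behavior can be summarized as a small control loop: steer toward the prey's BOM bearing, and swap roles when the bump sensors or the IR rangefinder indicate contact. The sketch below is a hypothetical rendering of that loop; the helpers bom_bearing_to(), ir_range_mm(), bump_pressed(), motors_set(), and become_prey(), along with the thresholds, are invented for illustration and are not the Colony library's real interface.

#include <stdio.h>
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical sensor/actuator stubs with canned values so the
 * sketch compiles and runs off the robot. */
static int16_t  bom_bearing_to(uint8_t robot_id) { (void)robot_id; return 22; } /* signed degrees, positive = prey to the left */
static uint16_t ir_range_mm(void)                { return 180; }  /* Sharp IR rangefinder */
static bool     bump_pressed(void)               { return false; }
static void motors_set(int8_t left, int8_t right) { printf("motors L=%d R=%d\n", left, right); }
static void become_prey(void) { printf("caught the prey: switching roles\n"); }

#define PREY_ID        2    /* assumed ID of the current prey      */
#define CATCH_RANGE_MM 100  /* assumed "close enough" IR distance  */

/* One iteration of the hunter's behavior: turn toward the prey's
 * BOM bearing, drive forward, and swap roles on contact. */
static void hunter_step(void) {
    if (bump_pressed() || ir_range_mm() < CATCH_RANGE_MM) {
        motors_set(0, 0);
        become_prey();                    /* roles switch when the prey is caught */
        return;
    }

    int16_t bearing = bom_bearing_to(PREY_ID);
    if (bearing > 10)       motors_set(10, 40);   /* prey to the left: arc left  */
    else if (bearing < -10) motors_set(40, 10);   /* prey to the right: arc right */
    else                    motors_set(40, 40);   /* roughly ahead: drive straight */
}

int main(void) {
    for (int i = 0; i < 3; i++)   /* on the robot this would loop forever */
        hunter_step();
    return 0;
}
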


Simulation
- Simulation can ease behavior development and enables testing behaviors with many more robots.
- Player/Stage, an open-source robot simulator, is used with additions that simulate our custom sensors.
- In the future, the same code will be used for the simulated behavior and the behavior on the actual robots (see the sketch below).
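One common way to let a single behavior run both in simulation and on the robots is to hide the sensors and motors behind a small interface and swap the backend. The sketch below illustrates that idea only; the robot_api struct, its function names, and the trivial simulated backend are assumptions made for this example and do not come from the Colony code or the Player/Stage API.

#include <stdio.h>
#include <stdint.h>

/* A hypothetical hardware-abstraction interface: the behavior code
 * only sees these function pointers, so the same behavior can be
 * linked against a simulator backend or the real robot drivers. */
struct robot_api {
    uint16_t (*ir_range_mm)(void);
    void     (*motors_set)(int8_t left, int8_t right);
};

/* Behavior written once against the interface: stop short of obstacles. */
static void avoid_step(const struct robot_api *r) {
    if (r->ir_range_mm() < 150)
        r->motors_set(0, 0);
    else
        r->motors_set(40, 40);
}

/* Trivial "simulated" backend with canned data; a Player/Stage or
 * on-robot backend would implement the same two functions. */
static uint16_t sim_ir(void) { return 120; }
static void sim_motors(int8_t l, int8_t r) { printf("sim motors L=%d R=%d\n", l, r); }

int main(void) {
    struct robot_api sim = { sim_ir, sim_motors };
    avoid_step(&sim);   /* with a real backend, only this table would change */
    return 0;
}
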

Future Behaviors and Applications
- More complex formation control and movement
- Obstacle detection
- Simultaneous Localization and Mapping (SLAM)
- Environment mapping and exploration


Acknowledgements
The project was funded in part by Carnegie Mellon's Undergraduate Research Office and the Ford Motor Corporation. These results represent the views of the authors and not those of Carnegie Mellon University. We would like to thank our advisor Howie Choset, Peggy Martin for her help, and Steven Shamlian, Brian Kirby, and Tom Lauwers for their support and contributions to the project.