Human-Robot “Pickup” Teams with Language-Based Interaction
Faculty: Manuela Veloso, Anthony Stentz, Alexander Rudnicky, Brett Browning, M. Bernardine Dias
Students: Thomas Harris, Brenna Argall, Gil Jones, Satanjeev Banerjee
Sponsored by The Boeing Company
Project Goals
Robots that discover and understand each other’s capabilities (see the sketch below)
Robots that can team together and coordinate their activities
Human-robot teams that collaborate to accomplish tasks
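To make the first goal concrete, here is a minimal Python sketch, assuming robots advertise their capabilities to a shared registry that teammates can query; the names (RobotProfile, CapabilityRegistry, advertise, find_robots_with) are hypothetical and not taken from the project's software.

```python
# Minimal sketch (not project code): robots advertise their capabilities to a
# shared registry so teammates can discover what each platform can do.
from dataclasses import dataclass, field


@dataclass
class RobotProfile:
    name: str                                   # e.g. "pioneer1", "segway1"
    capabilities: set[str] = field(default_factory=set)


class CapabilityRegistry:
    """Shared lookup table that team members query before assigning tasks."""

    def __init__(self) -> None:
        self._profiles: dict[str, RobotProfile] = {}

    def advertise(self, profile: RobotProfile) -> None:
        # A robot (re)announces itself, e.g. when it joins the team.
        self._profiles[profile.name] = profile

    def find_robots_with(self, capability: str) -> list[str]:
        # Names of all robots that claim the given capability.
        return [p.name for p in self._profiles.values()
                if capability in p.capabilities]


if __name__ == "__main__":
    registry = CapabilityRegistry()
    registry.advertise(RobotProfile("pioneer1", {"search", "grasp"}))
    registry.advertise(RobotProfile("segway1", {"search", "carry"}))
    print(registry.find_robots_with("search"))   # ['pioneer1', 'segway1']
```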
Domain: Treasure Hunt
Human-robot teams competing to locate “treasure” in an unknown environment
Example utterance: “Team 3, report your location”
Treasure Hunt scenarios
One human and two robots search for a treasure and return it to base [Y1]
  Stationary human; Pioneer/Segway close-coupled team
Semi-mobile human (partially accessible zones) [Y2]
Human and two teams search for treasure(s) [Y2-3]
Two teams of humans/robots compete to locate and retrieve treasure(s) [Y4]
Language Interface
Integration
  Map GUI integrated: graphical display of information from robots; mixed speech/gesture inputs
  Flexible architecture: dynamic incorporation of additional robots; improved communications protocols
Language and Dialog
  Navigation and search language
  Clarification and confirmation keyed to robot capability (see the sketch below)
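The clarification-and-confirmation bullet can be illustrated with a small sketch: the dialog side checks a parsed command against the addressed robot's declared capabilities and either confirms or asks a clarifying question. The capability table, frame fields, and function name are invented for illustration; this is not the TeamTalk implementation.

```python
# Minimal sketch: confirmation vs. clarification keyed to what the addressed
# robot can actually do. Capability strings and frame fields are hypothetical.
ROBOT_CAPABILITIES = {
    "robot1": {"goto", "search_area", "report_location"},
    "robot2": {"goto", "report_location"},
}


def respond(frame: dict) -> str:
    """Return a confirmation or a clarification prompt for a parsed command frame."""
    robot, action = frame["robot"], frame["action"]
    if action in ROBOT_CAPABILITIES.get(robot, set()):
        return f"{robot}: confirming, executing {action}."
    # Capability mismatch: ask the user to redirect rather than silently failing.
    capable = [r for r, caps in ROBOT_CAPABILITIES.items() if action in caps]
    if capable:
        return f"{robot} cannot {action}. Should I ask {capable[0]} instead?"
    return f"No robot on the team can {action}."


print(respond({"robot": "robot2", "action": "search_area"}))
# -> "robot2 cannot search_area. Should I ask robot1 instead?"
```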
Architecture
[Architecture diagram] Three columns: user interface (Speech, GUI, Tablet), dialog control (the Olympus components Sphinx, Phoenix, Helios, RavenClaw 1, RavenClaw 2, Rosetta, Kalliope), and robot system (Backend Client, OpTrader, Map Server, BoingLib, Robot 1, Robot 2).
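Reading the diagram left to right, and using the component roles named on the Technology Transfer slide (Sphinx for recognition, Phoenix for parsing, RavenClaw for dialog management, Rosetta for generation), the data flow can be sketched as a simple pipeline. The function names and frame format below are illustrative stubs, not Olympus APIs.

```python
# Minimal sketch of the dialog pipeline implied by the architecture diagram.
def recognize(audio: bytes) -> str:
    # Sphinx: audio -> recognition hypothesis (stubbed here).
    return "robot one go to the base"

def parse(hypothesis: str) -> dict:
    # Phoenix: hypothesis -> semantic frame (stubbed here).
    return {"robot": "robot1", "action": "goto", "target": "base"}

def manage_dialog(frame: dict) -> dict:
    # RavenClaw (one instance per robot): decide the next system action.
    return {"type": "execute", "frame": frame}

def dispatch(action: dict) -> str:
    # Backend client: forward the command to the robot system and return a
    # status string for Rosetta to verbalize back to the user.
    return f"{action['frame']['robot']} moving to {action['frame']['target']}"

print(dispatch(manage_dialog(parse(recognize(b"...")))))
```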
Multi-modal interface
Some classes of information are communicated more effectively by gesture than by language
User can specify a search path with the stylus, or ask robots to identify themselves (see the sketch below)
Components
  Java GUI
  TeamTalk dialog system
  Fujitsu Stylistic 5500 tablet PC
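One way to picture the speech/gesture combination is the sketch below: a stylus-drawn path from the map GUI is cached, and a spoken command that refers to "this path" is resolved against it. The types and the fusion rule are assumptions for illustration, not the Java GUI or TeamTalk code.

```python
# Minimal sketch (hypothetical): fusing a stylus-drawn path from the map GUI
# with a spoken command that refers to it.
from dataclasses import dataclass
from typing import Optional


@dataclass
class GestureEvent:
    kind: str                                  # e.g. "path" or "point"
    waypoints: list[tuple[float, float]]


def fuse(spoken_frame: dict, last_gesture: Optional[GestureEvent]) -> dict:
    """Resolve a deictic reference ("this path") against the most recent gesture."""
    if spoken_frame.get("target") == "this_path":
        if last_gesture is None or last_gesture.kind != "path":
            return {"action": "clarify", "prompt": "Which path do you mean?"}
        spoken_frame = {**spoken_frame, "target": last_gesture.waypoints}
    return spoken_frame


gesture = GestureEvent("path", [(0.0, 0.0), (3.0, 1.5), (6.0, 1.5)])
command = {"robot": "robot2", "action": "search", "target": "this_path"}
print(fuse(command, gesture))
```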
Video of Human-Robot interaction
Near-term goals
Dynamic updating of map information
Access to robot capability data
Goal conflict resolution (play / direct command); see the sketch below
Better OpTrader state transparency
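For the play / direct-command conflict, one plausible policy is sketched below: a direct human command preempts the active play, which is resumed once the command completes. This priority rule and the GoalArbiter class are assumptions for illustration, not the project's documented behavior.

```python
# Minimal sketch of one possible arbitration policy between a running play
# and a direct human command.
class GoalArbiter:
    def __init__(self) -> None:
        self.active_play = None       # name of the play currently running, if any
        self.preempted_play = None    # play stashed while a direct command runs

    def start_play(self, play: str) -> str:
        self.active_play = play
        return f"running play: {play}"

    def direct_command(self, command: str) -> str:
        # Direct human commands win; stash the play for later resumption.
        if self.active_play is not None:
            self.preempted_play, self.active_play = self.active_play, None
        return f"executing direct command: {command}"

    def command_done(self) -> str:
        # Resume the interrupted play, if any.
        if self.preempted_play is not None:
            play, self.preempted_play = self.preempted_play, None
            return self.start_play(play)
        return "idle"


arbiter = GoalArbiter()
print(arbiter.start_play("coordinated_search"))
print(arbiter.direct_command("report your location"))
print(arbiter.command_done())   # resumes coordinated_search
```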
Publications
T. K. Harris, S. Banerjee, and A. I. Rudnicky. Heterogeneous Multi-Robot Dialogues for Search Tasks (2005). AAAI Spring Symposium: Dialogical Robots, Palo Alto, California.
T. K. Harris, S. Banerjee, A. Rudnicky, J. Sison, Kerry B., and A. Black. A Research Platform for Multi-Agent Dialogue Dynamics (2004). Proceedings of the IEEE International Workshop on Robot and Human Interactive Communication, Kurashiki, Japan.
Demonstration Videos
Robot Coordination (video)
Human-Robot multi-modal interaction (video)
Ongoing challenges
Search in cluttered environments
Learn to dynamically select tactics, integrating information from team members (human and robot) and knowledge of the environment
Extend play coordination to include complete state machines with additional synchronization primitives and human input
Develop multiple model-based object and teammate tracking
Extend grounding interactions between human and robot
Improve robot infrastructure so that system operations become fully routine
Y2 Plan
Develop techniques to enable robots to form pickup teams with dynamic sub-team formation and execute coordinated actions (see the sketch below)
Formalize requirements for robot team participation
Incorporate a new (Boeing) robot into the team
Explore different human-robot team compositions (team size, robot types)
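Dynamic sub-team formation could work, for example, as a single-round auction in the spirit of market-based allocation (cf. the OpTrader component in the architecture): each available robot bids its estimated cost for a task, and the cheapest bidders form the sub-team. The sketch below is a simplified illustration, not OpTrader's actual logic.

```python
# Minimal sketch of sub-team formation as a single-round, lowest-cost auction.
import math


def bid(robot_pos: tuple[float, float], task_pos: tuple[float, float]) -> float:
    # Cost estimate: straight-line distance to the task.
    return math.dist(robot_pos, task_pos)


def form_subteam(robots: dict[str, tuple[float, float]],
                 task_pos: tuple[float, float],
                 team_size: int) -> list[str]:
    """Pick the team_size cheapest robots for the task."""
    ranked = sorted(robots, key=lambda name: bid(robots[name], task_pos))
    return ranked[:team_size]


robots = {"pioneer1": (0.0, 0.0), "pioneer2": (10.0, 2.0), "segway1": (3.0, 4.0)}
print(form_subteam(robots, task_pos=(2.0, 3.0), team_size=2))
# -> ['segway1', 'pioneer1']
```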
Y2 Plan (continued)
Extend language and agent interfaces to allow humans to interact efficiently with pickup robot teams
  Incorporate visual feedback from robots
  Access richer robot state information
  Extend domain ontology
  Extend capability for clarification sub-dialogs
  Introduce simple landmark grounding capability
Y2 Plan (continued)
Investigate extensions to the Y1 scenario
  Size and complexity of the environment
  Dynamic environments
Identify a final Y2 demo scenario
Technology Transfer to Boeing
TeamTalk spoken language interface
  Includes the Sphinx recognition system, Phoenix semantic parser, RavenClaw dialogue manager, and Rosetta language generator; Boeing has its own synthesizer (Theta)
Updates provided over time
Working with Boeing to adapt the system
Questions?