Spatial Reasoning with Guinness
University of Missouri, Columbia, MO
THE STARTING ARCHITECTURE
[Architecture diagram. Components: mapserver, SRserver, continuous localization, speech, PDA (NRL), GUI (EUT), Trulla/VFH navigation, short-term and long-term maps, and the robot. Data flows include sensor data, encoder readings, pose corrections, user commands and responses, speech and gesture commands, and robot commands.]
THE CURRENT ARCHITECTURE
[Architecture diagram. Adds to the starting architecture: imageserver, robot_spatial, Cortex, PDA (MU), sketch log files, and spatialview, alongside mapserver, SRserver, continuous localization, speech, GUI (EUT), Trulla/VFH, short-term and long-term maps, and the robot. Data flows include sensor data, pose corrections, user commands and responses, speech and gesture commands, sketch log files, and robot commands.]
THE PLANNED ARCHITECTURE
[Architecture diagram. Adds to the current architecture: spatial behaviors and obstacle avoidance modules, SR & map info, query & label exchanges, and sketch directives & feedback, alongside imageserver, mapserver, SRserver, continuous localization, speech, Cortex, PDA (MU), GUI (EUT), Trulla/VFH, short-term and long-term maps, and the robot.]
User: How many objects do you see?
Robot: I am sensing four objects.
User: Object 2 is a table.
User: Describe the scene.
Robot: There are objects on my front right. Object number 4 is mostly in front of me. The table is behind me.
User: Go behind the table.
[Figure: the "behind the table" region computed by the SRserver.]
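The qualitative directions in the dialog ("front right", "behind") can be sketched as a classification of an object's bearing relative to the robot's heading. The eight-sector scheme and function below are an illustrative assumption, not the SRserver's actual histogram-of-forces computation:

```python
import math

def qualitative_direction(robot_x, robot_y, robot_heading, obj_x, obj_y):
    """Classify an object's position relative to the robot into one of
    eight qualitative sectors. Angles are in radians; heading 0 points
    along the robot's forward axis, +pi/2 is to the robot's left."""
    angle = math.atan2(obj_y - robot_y, obj_x - robot_x) - robot_heading
    # normalize to (-pi, pi]
    angle = (angle + math.pi) % (2 * math.pi) - math.pi
    # each sector spans 45 degrees, centered on the sector direction
    sectors = ["front", "front left", "left", "rear left",
               "rear", "rear right", "right", "front right"]
    idx = int(round(angle / (math.pi / 4))) % 8
    return sectors[idx]
```

An object directly ahead maps to "front"; one directly behind maps to "rear", which would be verbalized as "behind me" as in the dialog above.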
[Figure: computing "between object 1 and object 2" three ways:]
- using the midpoint between closest points
- using the midpoint between centroids
- using the CFMD
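The first two "between" reference points can be sketched directly; the CFMD variant is omitted here since it depends on the histogram-of-forces machinery. The point representation (each object as a set of 2-D points) is an assumption for illustration:

```python
import numpy as np

def midpoint_between_centroids(pts_a, pts_b):
    """'Between' reference point: midpoint of the two object centroids."""
    ca = np.mean(np.asarray(pts_a, dtype=float), axis=0)
    cb = np.mean(np.asarray(pts_b, dtype=float), axis=0)
    return (ca + cb) / 2.0

def midpoint_between_closest_points(pts_a, pts_b):
    """'Between' reference point: midpoint of the closest pair of object
    points, found by brute-force search over all pairs."""
    best, best_d = None, float("inf")
    for p in np.asarray(pts_a, dtype=float):
        for q in np.asarray(pts_b, dtype=float):
            d = np.linalg.norm(p - q)
            if d < best_d:
                best_d, best = d, (p + q) / 2.0
    return best
```

The two methods disagree when the objects are irregular: centroids weight the whole shapes, while closest points are dominated by the nearest boundary features.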
Image Server
Understanding Sketched Route Maps
PATH DESCRIPTION GENERATED FROM THE SKETCHED ROUTE MAP
1. When table is mostly on the right and door is mostly to the rear (and close), Then move forward
2. When chair is in front or mostly in front, Then turn right
3. When table is mostly on the right and chair is to the left rear, Then move forward
4. When cabinet is mostly in front, Then turn left
5. When ATM is in front or mostly in front, Then move forward
6. When cabinet is mostly to the rear and tree is mostly on the left and ATM is mostly in front, Then stop
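A path description of this form can be executed as an ordered list of (condition, command) rules matched against the robot's current qualitative view of the landmarks. The state dictionary, rule encoding, and landmark labels below are illustrative assumptions, not the system's actual representation:

```python
# Each rule pairs a qualitative condition on landmark states with a
# robot command, mirroring a subset of the generated path description.
ROUTE_RULES = [
    (lambda s: s.get("table") == "mostly right" and s.get("door") == "mostly rear",
     "move forward"),
    (lambda s: s.get("chair") in ("front", "mostly front"),
     "turn right"),
    (lambda s: s.get("cabinet") == "mostly front",
     "turn left"),
    (lambda s: s.get("cabinet") == "mostly rear" and s.get("ATM") == "mostly front",
     "stop"),
]

def next_command(state):
    """Return the command of the first rule whose condition matches the
    current qualitative landmark state; default to 'stop' if none fires."""
    for condition, command in ROUTE_RULES:
        if condition(state):
            return command
    return "stop"
```

Ordering matters: the rules fire in route sequence, so an earlier condition shadows a later one when both hold.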
References
[1] M. Skubic, P. Matsakis, G. Chronis and J. Keller, "Generating Multi-Level Linguistic Spatial Descriptions from Range Sensor Readings Using the Histogram of Forces," Autonomous Robots, Vol. 14, No. 1, Jan. 2003.
[2] M. Skubic, D. Perzanowski, S. Blisard, A. Schultz, W. Adams, M. Bugajska and D. Brock, "Spatial Language for Human-Robot Dialogs," IEEE Transactions on SMC, Part C, to appear in the special issue on Human-Robot Interaction.
[3] M. Skubic, S. Blisard, C. Bailey, J.A. Adams and P. Matsakis, "Qualitative Analysis of Sketched Route Maps: Translating a Sketch into Linguistic Descriptions," IEEE Transactions on SMC, Part B, to appear.
[4] G. Chronis and M. Skubic, "Sketch-Based Navigation for Mobile Robots," In Proc. of the IEEE 2003 Intl. Conf. on Fuzzy Systems, May 2003, St. Louis, MO.
[5] G. Scott, J.M. Keller, M. Skubic and R.H. Luke III, "Face Recognition for Homeland Security: A Computational Intelligence Approach," In Proc. of the IEEE 2003 Intl. Conf. on Fuzzy Systems, May 2003, St. Louis, MO.
Guinness and Gang
From left to right: George Chronis, Grant Scott, Dr. Marge Skubic, Matt Williams, Craig Bailey, Bob Luke, Charlie Huggard and Sam Blisard. Missing: Dr. Jim Keller.
Sketch-Based Navigation The sketched route map The robot traversing the sketched route
Sketch-Based Navigation The digitized sketched route map The robot traversing the sketched route
Acknowledgements
This work has been supported by ONR and the U.S. Naval Research Laboratory. Natural language understanding is accomplished using Nautilus, a system developed by NRL [Wauchope, 2000]. We also thank Dr. Pascal Matsakis for his help.
NRL’s Multimodal Robot Interface