A Sketch Interface for Mobile Robots


A Sketch Interface for Mobile Robots Marjorie Skubic Craig Bailey George Chronis Computational Intelligence Research Lab University of Missouri-Columbia

Outline
- Motivation and context
- Route maps
- The PDA sketch interface
- Experimental study and results
- Conclusions and future work

Spatial Reasoning with Guinness
- References
- Acknowledgements

Route Maps
- Tversky's work
  - Depictions vs. descriptions
  - Extraction of route descriptions
  - 1-to-1 correlation
- Michon and Denis
  - Landmarks and critical nodes

The Sketch Interface
- Objects
- Labels
- Paths
- Delete
- Start
- Move
- Undo
- Send

Objects
- Closed polygons of any shape or size
- Thresholds determine gap closure
- Feedback on recognition: sound and color
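The gap-closure check can be sketched as a simple endpoint-distance test. The function name and threshold values below are illustrative assumptions, not the ones used in the original interface:

```python
import math

def is_closed_polygon(stroke, gap_threshold=12.0, min_points=4):
    """Treat a sketched stroke as a closed shape when the gap between
    its first and last points falls below a pixel threshold.

    `stroke` is a list of (x, y) screen coordinates. The threshold
    values are illustrative, not those of the original interface."""
    if len(stroke) < min_points:
        return False
    (x0, y0), (xn, yn) = stroke[0], stroke[-1]
    return math.hypot(xn - x0, yn - y0) <= gap_threshold

# A roughly square stroke whose endpoints nearly meet is accepted,
# regardless of its shape or size:
square = [(0, 0), (50, 0), (50, 50), (0, 50), (1, 2)]
```

On a successful recognition the interface would then give its sound and color feedback.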

Labels
- Default numbering for object labels
- Tap on screen to edit
- Can use Palm OS Graffiti recognition or a software keyboard

Paths
- Limited to one path per sketch
- A minimum length is required
- Color feedback

Path Direction
- Default direction is the direction the path is drawn
- User can specify the direction with a sketched "blob" denoting the start of the path
- The blob is recognized by:
  - Number of points
  - Average distance of all points
  - Proximity to a path endpoint
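A blob classifier along these lines might combine the three cues as follows; the thresholds, the use of the centroid as the reference for "average distance", and the function name are all illustrative assumptions:

```python
import math

def is_start_blob(stroke, path_start, min_points=10,
                  max_avg_radius=6.0, max_endpoint_dist=15.0):
    """Classify a stroke as a start-of-path 'blob' using three cues:
    point count, average distance of the points (here, from their
    centroid), and proximity to a path endpoint. All threshold
    values are illustrative assumptions."""
    if len(stroke) < min_points:
        return False
    cx = sum(x for x, _ in stroke) / len(stroke)
    cy = sum(y for _, y in stroke) / len(stroke)
    avg_radius = sum(math.hypot(x - cx, y - cy) for x, y in stroke) / len(stroke)
    if avg_radius > max_avg_radius:
        return False  # too spread out to be a blob
    return math.hypot(cx - path_start[0], cy - path_start[1]) <= max_endpoint_dist
```

A tight scribble near the path's first point passes; a long stroke or a mark far from either endpoint does not.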

Delete
- An intuitive delete gesture: cross out an object
- Recognized by:
  - Two consecutive strokes
  - Both strokes shorter than a path
  - The strokes cross
- Color feedback
- Search for the closest object or path

Determining Crossed Marks
- Use the slope (parametric) equations of the lines
- The endpoints of the strokes determine the lines: (X1,Y1)-(X2,Y2) and (X3,Y3)-(X4,Y4)
- A pair of decision parameters can be computed:

  ua = [(X4-X3)(Y1-Y3) - (Y4-Y3)(X1-X3)] / [(Y4-Y3)(X2-X1) - (X4-X3)(Y2-Y1)]
  ub = [(X2-X1)(Y1-Y3) - (Y2-Y1)(X1-X3)] / [(Y4-Y3)(X2-X1) - (X4-X3)(Y2-Y1)]

- IF (0 < ua < 1) AND (0 < ub < 1) THEN the two strokes intersect
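This test translates directly into code; it is the standard parametric segment-intersection formulation (the function name is mine):

```python
def strokes_cross(p1, p2, p3, p4):
    """Return True when segment p1-p2 strictly crosses segment p3-p4.

    ua locates the intersection along p1-p2 and ub along p3-p4; the
    segments cross only when both parameters lie strictly between
    0 and 1."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    denom = (y4 - y3) * (x2 - x1) - (x4 - x3) * (y2 - y1)
    if denom == 0:
        return False  # parallel or collinear strokes never cross
    ua = ((x4 - x3) * (y1 - y3) - (y4 - y3) * (x1 - x3)) / denom
    ub = ((x2 - x1) * (y1 - y3) - (y2 - y1) * (x1 - x3)) / denom
    return 0 < ua < 1 and 0 < ub < 1
```

An X-shaped pair of strokes, e.g. `strokes_cross((0, 0), (10, 10), (0, 10), (10, 0))`, yields ua = ub = 0.5 and is accepted.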

Menu Commands
Also accessible through Graffiti strokes:
- m → Move
- u → Undo
- c → Clear
- t → Transmit
- f → Configure

“Digitizing” the Sketch

User Evaluation
- Tested how well the interface performed with real users
- Pre-experimental questionnaire
- Tasks: sketch tasks and re-sketch tasks
- Task scores
- Post-experimental questionnaire
- Questionnaires contain Likert-style statements (Likert, 1932) along with several open-ended questions

Statistical Analysis
- 2 groups, 2 scenes:
  - Compared by scene sketched
  - Compared by course level of participant
- Means compared with the t test
- Null hypothesis: there are no differences when compared by sketched scene or course level
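The slide names the t test but not the variant. As one plausible reading, Welch's two-sample t statistic can be computed directly from the group scores (a sketch, not necessarily the study's exact procedure):

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic for comparing group means
    without assuming equal variances. `variance` here is the
    sample (n-1) variance."""
    se = (variance(a) / len(a) + variance(b) / len(b)) ** 0.5
    return (mean(a) - mean(b)) / se
```

Under the null hypothesis of no difference, the statistic is compared against a t distribution to obtain p values like those reported on the following slides.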

Participants
- 26 students from CS courses
- One participant's scores were not used
- Only 5 owned a PDA
- Students sketching Scene B rated themselves significantly better at giving directions (p = 0.02)
- No differences when compared by course level

Scene A

Example Sketches of Scene A

Scene B

Example Sketches of Scene B

Post-Experimental Survey: Landmark Scores (1 = very difficult; 5 = very easy)
- Creating landmarks: 4.6 ± 0.6
- Deleting landmarks: 4.2 ± 0.9
- Usefulness of deleting: 4.7 ± 0.6
- Usefulness of labeling: 4.8 ± 0.6

Post-Experimental Survey: Path Scores (1 = very difficult; 5 = very easy)
- Creating a path: 4.4 ± 1.0
- Deleting a path
- Usefulness of deleting: 4.7 ± 0.7
- Usefulness of the starting point: 4.2 ± 0.9

Post-Experimental Survey: Overall Scores
- Usefulness of changing the sketch: 4.8 ± 0.4
- Usefulness of deleting the sketch: 4.2 ± 1.0
- How well the sketch represents the environment: 83.6 ± 7.4
- Overall ease of the interface: 4.4 ± 0.6

Usability Results
- Only two significant differences (p <= 0.05) were found among the scores:
  - Usefulness of deleting, by scene (p = 0.0)
  - Final sketch rating, by scene (p = 0.05)
- In both cases, students sketching Scene B rated higher; this is the same group that rated themselves better at giving directions
- No differences were found when compared by course level
- The null hypothesis is accepted

Task Score Results
Collected sketches were scored as follows:
- +1 for the starting landmark
- +1 for each correct turn
- +1 for a landmark at each turn
- +1 for each correct straight segment
- +1 for the ending landmark
- -1 for extra turns or straight segments
No significant differences were found (p = 0.12):
- Sketch task score: 0.91 ± 0.11
- Re-sketch task score: 0.82 ± 0.26
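The rubric can be written down directly. Scores near 0.9 suggest the raw total is normalized by the maximum attainable score for the route; that normalization, and the function signature, are assumptions of this sketch:

```python
def task_score(start_lm, correct_turns, turn_lms, correct_straights,
               end_lm, extras, max_points):
    """Apply the scoring rubric: +1 for the starting landmark, each
    correct turn, each landmark at a turn, each correct straight
    segment, and the ending landmark; -1 per extra turn or segment.
    Division by `max_points` (the best possible raw score for the
    route) is an assumed normalization."""
    raw = (int(start_lm) + correct_turns + turn_lms
           + correct_straights + int(end_lm) - extras)
    return raw / max_points

# A perfect sketch of a route with 2 turns and 3 straight segments:
# raw = 1 + 2 + 2 + 3 + 1 = 9, normalized score = 9 / 9 = 1.0
```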

Conclusions
- Created a new sketch-based interface on a handheld computer
- Intuitive, with little reliance on traditional menus and icons
- In the user evaluation, participants rated the interface as easy to use as pencil and paper by a 2:1 margin

Future Work
- Continue integration into the Guinness system
- Recognition of more sketched symbols
- Recognition of turning rate
- Creation of 3D virtual environments with libraries of objects

Email: SkubicM@missouri.edu
Web: www.cecs.missouri.edu/~skubic
Funded by the Naval Research Lab

Architecture
[System diagram: the PDA sketch/gesture GUI exchanges sketch directives and feedback, query and label commands, and user commands and responses with the robot Cortex. Cortex coordinates the SRserver, image server, map server, and speech modules, and drives spatial behaviors, obstacle avoidance, and Trulla/VFH planning over a short-term map; continuous localization uses sensor data, encoders, and robot pose to issue corrections.]

SRserver: "Behind the Table" Dialog
User: How many objects do you see?
Robot: I am sensing four objects.
User: Object 2 is a table.
User: Describe the scene.
Robot: There are objects on my front right. Object number 4 is mostly in front of me. The table is behind me.
User: Go behind the table.

[Figure: candidate interpretations of "between object 1 and object 2": using the midpoint between closest points, using the midpoint between centroids, and using the CFMD.]

Image Server

Understanding Sketched Route Maps
Path description generated from the sketched route map:
1. When the table is mostly on the right and the door is mostly to the rear (and close), then move forward.
2. When the chair is in front or mostly in front, then turn right.
3. When the table is mostly on the right and the chair is to the left rear, then move forward.
4. When the cabinet is mostly in front, then turn left.
5. When the ATM is in front or mostly in front, then move forward.
6. When the cabinet is mostly to the rear, the tree is mostly on the left, and the ATM is mostly in front, then stop.
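A minimal way to execute such a description is a rule table over qualitative spatial relations, preferring the most specific matching rule. The state encoding and matching scheme below are my own illustration; the real system derives these relations from sensor readings (e.g., via the histogram of forces):

```python
# (object, relation) pairs stand in for the qualitative spatial states;
# a rule is applicable when all of its conditions currently hold.
RULES = [
    ({("table", "mostly-right"), ("door", "mostly-rear")}, "move forward"),
    ({("chair", "mostly-front")}, "turn right"),
    ({("table", "mostly-right"), ("chair", "left-rear")}, "move forward"),
    ({("cabinet", "mostly-front")}, "turn left"),
    ({("atm", "mostly-front")}, "move forward"),
    ({("cabinet", "mostly-rear"), ("tree", "mostly-left"),
      ("atm", "mostly-front")}, "stop"),
]

def next_action(perceived):
    """Among rules whose conditions are a subset of the perceived
    relations, return the action of the most specific one (largest
    condition set)."""
    matches = [(len(cond), action) for cond, action in RULES
               if cond <= perceived]
    return max(matches)[1] if matches else "move forward"
```

Preferring the most specific match lets the final stop rule win over the simpler "ATM mostly in front" rule once all three of its conditions hold.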

References
[1] M. Skubic, P. Matsakis, G. Chronis and J. Keller, "Generating Multi-Level Linguistic Spatial Descriptions from Range Sensor Readings Using the Histogram of Forces," Autonomous Robots, vol. 14, no. 1, Jan. 2003, pp. 51-69.
[2] M. Skubic, D. Perzanowski, S. Blisard, A. Schultz, W. Adams, M. Bugajska and D. Brock, "Spatial Language for Human-Robot Dialogs," IEEE Transactions on SMC, Part C, to appear in the special issue on Human-Robot Interaction.
[3] M. Skubic, S. Blisard, C. Bailey, J.A. Adams and P. Matsakis, "Qualitative Analysis of Sketched Route Maps: Translating a Sketch into Linguistic Descriptions," IEEE Transactions on SMC, Part B, to appear.
[4] G. Chronis and M. Skubic, "Sketch-Based Navigation for Mobile Robots," in Proc. of the IEEE 2003 Intl. Conf. on Fuzzy Systems, May 2003, St. Louis, MO.
[5] G. Scott, J.M. Keller, M. Skubic and R.H. Luke III, "Face Recognition for Homeland Security: A Computational Intelligence Approach," in Proc. of the IEEE 2003 Intl. Conf. on Fuzzy Systems, May 2003, St. Louis, MO.

Guinness and Gang
From left to right: George Chronis, Grant Scott, Dr. Marge Skubic, Matt Williams, Craig Bailey, Bob Luke, Charlie Huggard and Sam Blisard. Missing: Dr. Jim Keller.

Sketch-Based Navigation
- The sketched route map
- The robot traversing the sketched route

Sketch-Based Navigation
- The digitized sketched route map
- The robot traversing the sketched route

Acknowledgements This work has been supported by ONR and the U.S. Naval Research Lab. Natural language understanding is accomplished using a system developed by NRL, called Nautilus [Wauchope, 2000]. We also want to acknowledge the help of Dr. Pascal Matsakis.

NRL’s Multimodal Robot Interface