1
Improving Human-Robot Interaction
Jill Drury, The MITRE Corporation
Collaborators: Holly Yanco, UMass Lowell; Jean Scholtz, NIST; Mike Baker, Bob Casey, Dan Hestand, Brenden Keyes, Phil Thoren (UML)
2
Methodology for Evaluating HRI
Two approaches: field and laboratory
Field work: so far, robotics competitions
- See many different user interfaces, but have no control over what the operator does
- Difficult to collect data
- Can see what operators did, but there isn't time to determine why
- Best used to get an idea of the difficulties in the real world
- Can identify "critical events" but don't know for certain whether the operator was aware of them
3
Methodology for Evaluating HRI
Laboratory studies
- Take what we learned in the real world and isolate factors to determine effects
- Repeatability is still difficult to achieve due to the fragile nature of robots
4
Analysis Frameworks
- Taxonomy to define the human/robot system
- Detailed definition of human-robot interaction awareness
- Coding scheme/metrics for analyzing data
- Scholtz's evaluation guidelines
5
Taxonomy
- Autonomy
- Amount of intervention
- Human-robot ratio
- Level of shared interaction
- Composition of robot teams
- Available sensors
- Sensor fusion
- Criticality
- Time
- Space
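As a rough sketch, each evaluated human-robot system could be described as one record with a field per taxonomy dimension. The class name, field names, and value types below are illustrative assumptions, not part of the published taxonomy.

```python
from dataclasses import dataclass
from enum import Enum, auto


class Autonomy(Enum):
    """Illustrative autonomy levels; the taxonomy does not fix these names."""
    TELEOPERATED = auto()
    SHARED_CONTROL = auto()
    FULLY_AUTONOMOUS = auto()


@dataclass
class HumanRobotSystem:
    """One record per evaluated system, one field per taxonomy dimension."""
    autonomy: Autonomy
    intervention_fraction: float   # amount of intervention, 0.0-1.0 (assumed scale)
    humans: int                    # human-robot ratio, number of humans
    robots: int                    # human-robot ratio, number of robots
    shared_interaction: str        # level of shared interaction, e.g. "one operator per robot"
    team_composition: str          # homogeneous vs. heterogeneous robot team
    sensors: list[str]             # available sensors
    sensor_fusion: bool            # whether fused displays are provided
    criticality: str               # e.g. "high" for search and rescue
    time: str                      # synchronous vs. asynchronous interaction
    space: str                     # collocated vs. remote
```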
6
Awareness as a concept from CSCW
CSCW software: "...makes the user aware that he is part of a group, while most other software seeks to hide and protect users from each other" [Lynch et al. 1990]
HRI software: makes humans aware of robots' status and activities via the interface
7
What is "awareness"?
No standard definition in the CSCW field; we've seen at least 16 different definitions!
There are many different types of awareness, e.g.:
- Concept awareness
- Conversational awareness
- Group-structural awareness
- Informal awareness
- Peripheral awareness
- Situation awareness
- Social awareness
- Task awareness
- Workspace awareness
Common thread: the understanding that participants have of each other in a shared environment
8
But CSCW systems are different from robotic systems…
CSCW: multiple humans interacting via a CSCW system
Robotics: one or more humans interacting with one or more robots
Non-symmetrical relationships between humans and robots, e.g., differences in:
- Free will
- Cognition
9
Tailoring an awareness definition for HRI: a base case
Given one human and one robot, HRI awareness is:
- the understanding that the human has of the location, activities, status, and surroundings of the robot; and
- the knowledge that the robot has of the human's commands necessary to direct its activities and the constraints under which it must operate
10
An awareness framework: general case
Given n humans and m robots working together on a synchronous task, HRI awareness consists of five components:
- Human-robot awareness
- Human-human awareness
- Robot-human awareness
- Robot-robot awareness
- Humans' overall mission awareness
11
General case: a detailed look
- Human-robot: the understanding that the humans have of the locations, identities, activities, status, and surroundings of the robots; further, the understanding of the certainty with which humans know this information
- Human-human: the understanding that the humans have of the locations, identities, and activities of their fellow human collaborators
12
General case, concluded
- Robot-human: the robots' knowledge of the humans' commands needed to direct activities and any human-delineated constraints that may require command noncompliance or a modified course of action
- Robot-robot: the knowledge that the robots have of the commands given to them, if any, by other robots; the tactical plans of the other robots; and the robot-to-robot coordination necessary to dynamically reallocate tasks among robots if necessary
- Humans' overall mission awareness: the humans' understanding of the overall goals of the joint human-robot activities and the measurement of the moment-by-moment progress obtained against the goals
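One way to make the five components concrete is to record them per evaluation session as a small structure. The 0.0-1.0 scoring scale and field names below are assumptions for illustration; the framework itself does not prescribe numeric ratings.

```python
from dataclasses import dataclass


@dataclass
class HRIAwareness:
    """Evaluator-assigned scores for the five awareness components of an
    n-human, m-robot task (assumed 0.0-1.0 scale)."""
    human_robot: float   # humans' grasp of robots' locations, identities, activities, status, surroundings
    human_human: float   # humans' grasp of their fellow human collaborators
    robot_human: float   # robots' knowledge of human commands and constraints
    robot_robot: float   # robots' knowledge of other robots' commands, plans, coordination
    mission: float       # humans' grasp of overall goals and moment-by-moment progress


# Example: operators track the robot well but lose track of mission progress.
session = HRIAwareness(human_robot=0.8, human_human=0.9,
                       robot_human=0.7, robot_robot=0.5, mission=0.4)
```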
13
Coding Scheme: Problems Relating to Critical Incidents
Critical incident: the robot has caused, or could cause, harm or damage
Types of problems (see the sketch after this list):
- Local navigation
- Global navigation
- Obstacle encounter
- Vehicle state
- Victim identification (specific to search and rescue)
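A minimal sketch of how coded critical incidents might be represented when analyzing run data. The class names, fields, units, and example values are assumptions, not part of the published coding scheme.

```python
from dataclasses import dataclass
from enum import Enum, auto


class ProblemType(Enum):
    """Problem categories from the coding scheme."""
    LOCAL_NAVIGATION = auto()
    GLOBAL_NAVIGATION = auto()
    OBSTACLE_ENCOUNTER = auto()
    VEHICLE_STATE = auto()
    VICTIM_IDENTIFICATION = auto()   # specific to search and rescue


@dataclass
class CriticalIncident:
    """One logged event where the robot caused, or could have caused, harm or damage."""
    timestamp_s: float       # seconds since start of run (assumed unit)
    problem: ProblemType
    operator_aware: bool     # did the operator notice the incident?
    caused_damage: bool      # actual harm/damage vs. a near miss
    notes: str = ""


# Example: a near-miss obstacle encounter the operator never noticed.
incident = CriticalIncident(timestamp_s=312.5,
                            problem=ProblemType.OBSTACLE_ENCOUNTER,
                            operator_aware=False,
                            caused_damage=False,
                            notes="Rear bumper grazed debris while reversing.")
```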
14
Some Metrics for HRI
- Time spent navigating, on UI overhead, and avoiding obstacles
- Amount of space covered
- Number of victims found
- Critical incidents
  - Positive outcomes
  - Negative outcomes
- Operator interventions (see the sketch below)
  - Amount of time the robot needs help
  - Time to acquire situation awareness
  - Reason for intervention
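As a sketch of how the intervention-related metrics might be computed from a run log, assuming a simple record per intervention; the field names, units, and thresholds are illustrative, not taken from the original study.

```python
from dataclasses import dataclass


@dataclass
class Intervention:
    """One operator intervention (illustrative fields and units)."""
    start_s: float           # when the operator took over
    end_s: float             # when autonomy resumed
    sa_acquisition_s: float  # time spent regaining situation awareness before acting
    reason: str              # e.g. "robot stuck on debris"


def intervention_metrics(interventions: list[Intervention], run_length_s: float) -> dict:
    """Summarize how much of a run the robot needed help, and how much of that
    time the operator spent re-acquiring situation awareness."""
    helped = sum(i.end_s - i.start_s for i in interventions)
    sa_time = sum(i.sa_acquisition_s for i in interventions)
    return {
        "intervention_count": len(interventions),
        "fraction_of_run_helped": helped / run_length_s if run_length_s else 0.0,
        "mean_sa_acquisition_s": sa_time / len(interventions) if interventions else 0.0,
    }


# Example: two interventions in a 20-minute (1200 s) run.
log = [Intervention(100, 160, 25, "robot stuck on debris"),
       Intervention(700, 790, 40, "lost global orientation")]
print(intervention_metrics(log, run_length_s=1200.0))
```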
15
Scholtz's Guidelines (tailored)
- Is sufficient status and robot location information available so that the operator knows the robot is operating correctly and avoiding obstacles?
- Is the information coming from the robots presented in a manner that minimizes operator memory load, including the amount of information fusion that needs to be performed in the operators' heads?
- Are the means of interaction provided by the interface efficient and effective for the human and the robot (e.g., are shortcuts provided for the human)?
- Does the interface support the operator directing the actions of more than one robot simultaneously?
- Will the interface design allow for adding more sensors and more autonomy?
16
Design Guidelines
Enhance awareness
- Provide a map of where the robot has been
- Provide more spatial information about the robot in the environment to make operators more aware of their robot's immediate surroundings
Lower cognitive load
- Provide fused sensor information to avoid making the user fuse data mentally
- Display important information near or fused with the video image
17
Design Guidelines, concluded
Increase efficiency
- Provide user interfaces that support multiple robots in a single window, if possible
- In general, minimize the use of multiple windows and maximize use of the primary viewing area
Provide help in choosing robot modality
- Give the operator assistance in determining the most appropriate level of robot autonomy at any given time
18
Fusing Information
Victims can be missed in video images
19
Fusing Infrared and Color Video
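The original slides show example imagery here. As a rough illustration of the idea, a fused display might alpha-blend a thermal (infrared) frame over the registered color frame so that warm regions, such as victims, stand out in the video. The blending weight, colormap, and file names below are assumptions, and a real system would also need the two cameras to be calibrated and registered rather than simply resized.

```python
import cv2
import numpy as np


def fuse_ir_color(color_bgr: np.ndarray, ir_gray: np.ndarray, alpha: float = 0.4) -> np.ndarray:
    """Overlay a single-channel IR frame on a color frame of the same scene.

    Resizing stands in for proper camera registration, which a real
    system would have to perform.
    """
    ir_resized = cv2.resize(ir_gray, (color_bgr.shape[1], color_bgr.shape[0]))
    ir_colored = cv2.applyColorMap(ir_resized, cv2.COLORMAP_JET)  # warm areas become red/yellow
    return cv2.addWeighted(color_bgr, 1.0 - alpha, ir_colored, alpha, 0.0)


if __name__ == "__main__":
    # Hypothetical file names, for demonstration only.
    color = cv2.imread("color_frame.png")
    ir = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)
    fused = fuse_ir_color(color, ir)
    cv2.imwrite("fused_frame.png", fused)
```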
22
Other Sensor Modalities for USAR
- CO₂ detection
- Audio
23
Overlay of four sensor modalities