Slide 1: Study & Conclusions
Slide 2: Perspectives on Face-to-face Interaction
- Success at anticipating the actions of the other implies the need for:
  - A model of the user that supports prediction of actions
  - Recognition of the actions predicted
  - Prescription of an appropriate response
- Success at responding to unanticipated actions of the other implies the need for:
  - Maximizing sensitivity to the actions actually taken, by minimizing predetermined sequences of machine behavior
Slide 3: Expert Help System
- Engineering an appropriate response is limited by:
  - The designers' ability to predict the user's actions
  - The system's access to, and ability to make sense of, user actions
- Goal: more situated than documentation, but reusable
Slide 4: Plans and Predictable Action
- Backward chaining is used to locate the current step within a plan (see the sketch after this slide)
  - Allows skipping steps whose outcomes are already detected
- Problems arise with intermediate steps in plans
  - The human and the machine each hold their own interpretation of the current state
- No history of actions is kept, though one could have been used
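A minimal sketch of how such backward chaining over a fixed plan could look, assuming each plan step exposes a detector for its outcome; the names `PlanStep` and `next_step` are illustrative, not taken from the system described:

```python
from dataclasses import dataclass
from typing import Callable, Optional, Sequence

@dataclass
class PlanStep:
    instruction: str                       # text presented to the user
    outcome_detected: Callable[[], bool]   # True if this step's effect is already observable

def next_step(plan: Sequence[PlanStep]) -> Optional[PlanStep]:
    """Chain backward from the goal: the last step whose outcome is already
    detected fixes the user's position in the plan, and the step after it is
    the one to prescribe next. Steps whose outcomes already hold are skipped."""
    for i in range(len(plan) - 1, -1, -1):
        if plan[i].outcome_detected():
            # Everything up to and including step i is treated as done.
            return plan[i + 1] if i + 1 < len(plan) else None  # None means the plan is complete
    return plan[0] if plan else None  # nothing detected yet: start at the beginning
```

Because only the currently detectable state is consulted, already-completed steps can be skipped, but the procedure cannot tell how that state was reached; that is exactly the missing action history noted above.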
Slide 5: The User's Resource: Situated Inquiry
- The procedural assumption of always providing the next step does not allow for repair or abandonment
- The context of "help" requests is ambiguous
  - At what level of the plan should the request be interpreted?
- The designer cannot predict all methods of failure, and thus cannot predict all types of help needed
- Instructions and physical objects mutually reinforce one another's interpretation
Slide 6: Conditional Relevance of Response
- Conversational paradigm (see the sketch after this slide):
  - A new response indicates a successful action
  - No response indicates an incomplete action
  - A repeated response is ambiguous: it could be iteration, or trouble to be repaired
- A slow response can be interpreted as no response
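These rules can be read as a small interpretive protocol. Below is a hedged sketch in Python; the display comparison, the `Reading` names, and the five-second cutoff are assumptions made for illustration, not details from the study:

```python
from enum import Enum, auto
from typing import Optional

class Reading(Enum):
    ACTION_SUCCEEDED = auto()    # a new response followed the user's action
    ACTION_INCOMPLETE = auto()   # no response (or too slow a response) yet
    AMBIGUOUS = auto()           # a repeated response: iteration, or trouble to repair?

SLOW_RESPONSE_SECONDS = 5.0      # assumed cutoff: slower than this reads as "no response"

def read_response(previous_display: Optional[str], current_display: Optional[str],
                  seconds_elapsed: float) -> Reading:
    """Apply the conversational paradigm to the machine's latest display."""
    if current_display is None or seconds_elapsed > SLOW_RESPONSE_SECONDS:
        return Reading.ACTION_INCOMPLETE
    if current_display == previous_display:
        return Reading.AMBIGUOUS
    return Reading.ACTION_SUCCEEDED
```

The ambiguous case is the one the next slide takes up: a repeated response alone cannot tell the user whether the machine intends iteration or has run into trouble.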
Slide 7: Iteration vs. Trouble
- In human-human interaction, loops and problems are identified and escaped via:
  - Exploratory questions ("Hello. Hello. … Are you there?")
  - New explanations of the same idea
  - Assertions of completion
- How can the system know the difference?
- How can the user access the plan?
Slide 8: Communication Breakdowns
- False alarm: a response to a correct action is interpreted as a failure
  - May occur when there are conflicting indicators, or due to differing interpretations of the plan/task
- Garden path: an incorrect action is not discovered until later, losing the context of the original problem
  - The machine can interpret the action as correct for some alternative path
  - Users can assume they know the process without the machine
- Trivial breaches of understanding can become "fatal"
Slide 9: Lessons from Theory/Studies
- The situation is not something to be avoided by systems that determine action
- Mutual intelligibility of actions relies on the situation
- Communication practices maximize the use of context
- Face-to-face communication includes resources for recognizing and correcting trouble
Slide 10: Considerations for Interface Design
- Asymmetry of human-machine communication
  - Increase the machine's access to user actions and context
  - Make clear the limits of the machine
  - Compensate for lack of access with computation
- User models and coaching
  - Differential modeling and deviation from the ideal: depends on the match to, and the boundaries of, the domain (a sketch follows this slide)
  - Diagnostic inconsistencies: the machine/design must consider the potential for its own mistakes
  - Local vs. global interpretations: kept separate to support short-term and long-term goals
  - Constructive use of trouble: trouble can be used as an opportunity to "teach"
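As one way to make "differential modeling" concrete, here is a hedged sketch: compare the user's action against what an idealized expert model would do in the same state, and coach only on the deviation. The `coach` function, its parameters, and the message text are hypothetical, not from the original design:

```python
from typing import Callable, Optional

def coach(state: dict, user_action: str,
          expert_action: Callable[[dict], str]) -> Optional[str]:
    """Return a coaching hint when the user deviates from the ideal,
    or None when the user's action matches the expert model."""
    ideal = expert_action(state)
    if user_action == ideal:
        return None  # no deviation: stay quiet rather than interrupt
    # Treat the deviation constructively, as an opportunity to teach rather than an error.
    return (f"You chose '{user_action}'; in this situation an expert "
            f"would typically choose '{ideal}'.")
```

Coaching only on deviations keeps the machine quiet when the user matches the ideal, and turns each deviation into the kind of constructive, teachable trouble noted in the last bullet.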
Slide 11: On Communication
"Communication … is not a symbolic process that happens to go on in real-world settings, but a real-world activity in which we make use of language to delineate the collective relevance of our shared environment."
Slide 12: On Plans
"While plans can be elaborated indefinitely, they elaborate actions just to the level that elaboration is useful; they are vague with respect to the details of action precisely at the level at which it makes sense to forego abstract representation, and rely on the availability of a particular, embodied response."