1
Human trust in technology and etiquette
John D. Lee & Katrina See
Department of Mechanical and Industrial Engineering, University of Iowa
Sponsored by NSF Grant IIS
2
Beyond theoretical importance
Misuse/disuse
3
Affect, decision making, and reliance
Damasio (1994): Neurological basis for affect as a key element in effective decision making
Norman (2002): Affect and cognition interact with technology to influence behavior
Picard (1997): Affective computing can enhance acceptance and performance
4
The strange case of Phineas Gage
Left intellectual abilities intact, but greatly impaired decision making and emotional response
5
Affect, decision making, and reliance
Kramer (1999): Trust as an important social decision heuristic in organizations
Nass (1996): People respond to technology as they do to people
Zuboff (1987): Operators report the role of trust in reliance when working with computerized controllers
Lee & Moray (1992): Trust influences reliance on automation
Trust: a critical social lubricant that mitigates the cognitive complexity of increasingly unstructured relationships
6
Trust in technology?
Trust: an attitude that reflects a person's expectation that technology will achieve his or her goals, based on:
Performance of the system (what it has done)
Process that governs its operation (how it works)
Purpose of the design (why it was created)
Similar to trust between people?
7
Belief → Attitude → Intention → Behavior
8
Calibration and resolution of trust
9
Calibration of trust in automation
Consider cognitive complexity and social constraints to support trustable behavior (make the algorithm understandable)
Reveal behavior to calibrate trust (make behavior understandable)
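As a rough illustration of what "calibrating trust" could mean quantitatively, the sketch below (not part of the original slides; the metrics, variable names, and values are illustrative assumptions) treats calibration as the match between an operator's trust ratings and the automation's actual reliability, and resolution as how strongly changes in reliability are reflected in changes in trust.

```python
# Hypothetical sketch: quantifying trust calibration and resolution.
# Trust ratings (0-10) and automation reliability (0-1) are made-up
# illustrative numbers, not data from the pasteurization-plant study.

from statistics import correlation, mean  # correlation requires Python 3.10+

trust = [8.0, 7.5, 4.0, 3.0, 6.5]          # operator's subjective trust (0-10)
reliability = [0.9, 0.85, 0.5, 0.4, 0.7]    # automation's true capability (0-1)

# Calibration: how closely trust (rescaled to 0-1) tracks true capability.
# Smaller mean absolute error = better-calibrated trust.
calibration_error = mean(abs(t / 10 - r) for t, r in zip(trust, reliability))

# Resolution: how much variation in capability is mirrored by variation in trust.
# Higher correlation = finer resolution (trust distinguishes good from poor automation).
resolution = correlation([t / 10 for t in trust], reliability)

print(f"calibration error: {calibration_error:.3f}")
print(f"resolution (correlation): {resolution:.3f}")
```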
10
An experiment with a supervisory control microworld
Evaluate the role of sonification in calibrating trust to enhance supervisory control
Investigate individual differences in trusting tendency
"it sounded 'hollow' and didn't feel right so we got out"
11
Pasteurization plant microworld
12
Day One | Day Two | Day Three (column length indicates fault severity)
The experiment spanned three days; day one was for training. Participants were randomly assigned a fault order: some experienced the manual fault first and then the automatic fault; others the reverse. Conditions: automatic controller fault, manual fault, no fault.
13
Performance as a function of sound and automatic controller fault
14
Sonification and reliance during automatic controller faults
15
Sonification and trust during automatic pump controller faults
16
Increasingly complex automation and human-automation interactions
17
Conclusions
Trust influences reliance as an attitude in the context of beliefs, attitudes, intentions, and behavior
Reliance is not a good measure of trust
Designers should focus on calibrating trust, not enhancing trust
Trust calibration may depend on the design of the algorithm and feedback
Systems that mimic human characteristics may lead to inappropriate trust, as human characteristics create false expectations
18
Layers of etiquette
Pragmatic (communicate, co…)
Role identification
Empathetic, affective
19
Thoughts?
Mapping function to define system etiquette: Human Etiquette / Machine Etiquette / System Etiquette
Layers of etiquette: pragmatic, role specification, empathetic/affective
Level of abstraction in etiquette translation: literal vs. figurative; functional vs. physical
"Please, can I interrupt?"