Human trust in technology and etiquette
John D. Lee & Katrina See
Department of Mechanical and Industrial Engineering, University of Iowa
Sponsored by NSF Grant IIS 01-17494
Beyond theoretical importance
Misuse/disuse
Affect, decision making, and reliance
- Damasio (1994): Neurological basis for affect as a key element in effective decision making
- Norman (2002): Affect and cognition interact with technology to influence behavior
- Picard (1997): Affective computing can enhance acceptance and performance
The strange case of Phineas Gage
His injury left intellectual abilities intact, but greatly impaired decision making and emotional response
Affect, decision making, and reliance
- Kramer (1999): Trust as an important social decision heuristic in organizations
- Nass (1996): People respond to technology as they do to people
- Zuboff (1987): Operators report the role of trust in their reliance on computerized controllers
- Lee & Moray (1992): Trust influences reliance on automation
Trust: a critical social lubricant that mitigates the cognitive complexity of increasingly unstructured relationships
Trust in technology?
Trust: an attitude that reflects a person's expectation that technology will achieve his or her goals, based on:
- Performance of the system (what it has done)
- Process that governs operation (how it works)
- Purpose of the design (why it was created)
Similar to trust between people?
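As a rough illustration (not part of the original slides), the three information bases named above can be sketched as a simple data structure, with the trust attitude as some combination of them. The linear weighting below is purely an assumption for the sketch, not a model proposed in the talk:

```python
from dataclasses import dataclass

@dataclass
class TrustBases:
    """The three information bases named on the slide (0-1 scales assumed)."""
    performance: float  # what the system has done
    process: float      # how it works
    purpose: float      # why it was created

def trust_attitude(b: TrustBases, w=(0.5, 0.3, 0.2)) -> float:
    """Combine the bases into a single trust attitude.
    The linear weighting is an illustrative assumption, not a claim from the talk."""
    return w[0] * b.performance + w[1] * b.process + w[2] * b.purpose

print(trust_attitude(TrustBases(performance=0.9, process=0.6, purpose=0.8)))  # ≈ 0.79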
Belief → Attitude → Intention → Behavior
Calibration and resolution of trust
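In these terms, calibration is the correspondence between trust and the automation's actual capability, while resolution is how well differences in capability map onto differences in trust. A minimal sketch (Python 3.10+) of how the two notions could be quantified, using hypothetical ratings; the operationalization is an assumption, not the talk's:

```python
import statistics

def calibration_error(trust, capability):
    """Mean absolute gap between trust and actual capability (0 = well calibrated).
    Trust above capability invites misuse; trust below it invites disuse."""
    return statistics.fmean(abs(t - c) for t, c in zip(trust, capability))

def resolution(trust, capability):
    """How well differences in capability map onto differences in trust,
    estimated here (an assumption) as the Pearson correlation of the two series."""
    return statistics.correlation(trust, capability)

# Hypothetical trust ratings and capability levels for five automation conditions
capability = [0.9, 0.8, 0.6, 0.4, 0.2]
trust      = [0.85, 0.80, 0.55, 0.50, 0.30]
print(calibration_error(trust, capability))  # ≈ 0.06
print(resolution(trust, capability))         # ≈ 0.98 (high resolution)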
Calibration of trust in automation
- Consider cognitive complexity and social constraints to support trustable behavior (make the algorithm understandable)
- Reveal behavior to calibrate trust (make behavior understandable)
An experiment with a supervisory control microworld
- Evaluate the role of sonification in calibrating trust to enhance supervisory control
- Investigate individual differences in trusting tendency
“it sounded ‘hollow’ and didn’t feel right so we got out”
Pasteurization plant microworld
[Figure: experiment schedule across Day One, Day Two, and Day Three, trials 1-38; bar length indicates severity of fault; conditions: automatic controller fault, manual fault, no fault.]
Day One was for training. Participants were randomly assigned the order of the fault conditions: some experienced the manual fault first and then the automatic fault, others the reverse.
Performance as a function of sound and automatic controller fault
Sonification and reliance during automatic controller faults
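In microworld studies like this one, reliance is often operationalized as the proportion of time the operator leaves the automatic controller engaged. A minimal sketch of that measure, with a hypothetical log format (not the study's actual instrumentation):

```python
def reliance(mode_log):
    """Fraction of sampled control intervals in which the automatic controller
    was engaged; mode_log is a hypothetical sequence of 'auto'/'manual' samples."""
    return sum(mode == 'auto' for mode in mode_log) / len(mode_log)

# e.g., an operator who switches to manual control after detecting a fault
print(reliance(['auto'] * 30 + ['manual'] * 10))  # 0.75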
Sonification and trust during automatic pump controller faults
Increasingly complex automation and human-automation interactions
Conclusions
- Trust influences reliance as an attitude in the context of beliefs, attitudes, intentions, and behavior
- Reliance is not a good measure of trust
- Designers should focus on calibrating trust, not enhancing it
- Trust calibration may depend on the design of both the algorithm and the feedback
- Systems that mimic human characteristics may lead to inappropriate trust, because human-like characteristics create false expectations
Layers of etiquette
- Pragmatic (communicate, co…)
- Role identification
- Empathetic, affective
Thoughts?
- Mapping function to define system etiquette: Human Etiquette + Machine Etiquette → System Etiquette
- Layers of etiquette: pragmatic; role specification; empathetic/affective
- Level of abstraction in etiquette translation: literal vs. figurative; functional vs. physical
“Please, can I interrupt”
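One way to read the "mapping function" idea (a speculative sketch, not anything specified in the talk): treat each etiquette layer as a key, and map the human-side and machine-side conventions at that layer into a system-level convention. The layer names follow the slide; everything else, including the example conventions, is assumed:

```python
from enum import Enum

class Layer(Enum):
    """Etiquette layers named on the slide."""
    PRAGMATIC = "pragmatic"
    ROLE = "role specification"
    AFFECTIVE = "empathetic/affective"

def system_etiquette(human: dict[Layer, str], machine: dict[Layer, str]) -> dict[Layer, str]:
    """Hypothetical mapping function: for each layer, combine the human-side and
    machine-side conventions into a system-level convention. A simple string join
    stands in for whatever the real mapping would be."""
    return {layer: f"{human[layer]} / {machine[layer]}" for layer in Layer}

# Hypothetical example: handling an interruption ("Please, can I interrupt") at each layer
human = {Layer.PRAGMATIC: "ask before interrupting",
         Layer.ROLE: "assistant defers to operator",
         Layer.AFFECTIVE: "apologetic tone"}
machine = {Layer.PRAGMATIC: "queue alert until a pause in the task",
           Layer.ROLE: "announce automation status",
           Layer.AFFECTIVE: "soften alarm urgency"}
print(system_etiquette(human, machine)[Layer.PRAGMATIC])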