1
The Responses of People to Virtual Humans in an Immersive Virtual Environment
Maia Garau, Mel Slater, David-Paul Pertaub, Sharif Razzaque. Presented by Eugene Khokhlov.
2
The Responses of People to Virtual Humans
An experiment investigating the responses of people to virtual humans. The responses investigated are:
Presence (the sense of being in the virtual environment)
Copresence (the sense of being together with other people)
Heart rate and electrodermal activity (EDA)
3
Clarifications Agents = virtual humans
Hypothesis: the higher the agents' (virtual humans') responsiveness, the more participants would respond to them as if they were human. The virtual environment for this paper is a library setting in which virtual students are studying at a table.
4
Experimental Design Condition 1 (Static): All agents (virtual humans) were static, frozen in a reading pose.
5
Experimental Design Condition 2 (Moving): The agents were animated. Movements included fidgeting, turning pages, and looking around. They were not responsive to the participant.
6
Experimental Design Condition 3 (Responsive): Condition 2 plus responsiveness to the participant's location in the space. When approached, an agent would change posture and engage in gazing behavior. Four zones:
Intimate (the participant is right next to the agent)
Personal (the participant is close to the agent)
Social-consultative (between personal and public)
Public (the participant is far from the agent)
7
Zones Explained If the participant is in the public zone, the agent moves as in condition 2. As the participant moves through each zone toward the agent, the agent's behavior changes: gaze increases, posture becomes more upright, and the frequency of gaze and posture shifts increases.
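The slides describe this responsive behavior only qualitatively. A minimal Python sketch of the idea follows; the zone distances, behavior values, and names are assumptions for illustration, not numbers from the paper. The participant's distance to an agent selects a proxemic zone, and the zone selects how often the agent gazes at the participant and how upright it sits.

```python
# Hypothetical sketch of condition 3 (responsive agents): the participant's
# distance selects a proxemic zone, and the zone selects a behavior profile.
# Zone boundaries and parameter values are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Behavior:
    gaze_frequency: float       # gazes per minute toward the participant
    posture_uprightness: float  # 0.0 = slumped over the book, 1.0 = fully upright

# Ordered from closest to farthest; distances in meters (assumed values).
ZONES = [
    ("intimate",            0.5,          Behavior(12.0, 1.0)),
    ("personal",            1.2,          Behavior(8.0,  0.8)),
    ("social-consultative", 3.5,          Behavior(4.0,  0.5)),
    ("public",              float("inf"), Behavior(1.0,  0.2)),
]

def behavior_for_distance(distance_m: float) -> tuple[str, Behavior]:
    """Return the zone name and behavior profile for the participant's distance.

    In the public zone the agent keeps its condition-2 idle animations; as the
    participant crosses into closer zones, gaze frequency and posture
    uprightness increase, matching the description on this slide.
    """
    for name, max_distance, behavior in ZONES:
        if distance_m <= max_distance:
            return name, behavior
    return ZONES[-1][0], ZONES[-1][2]

# Example: a participant standing 1.0 m from an agent falls in the personal zone.
print(behavior_for_distance(1.0))
```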
8
Condition 3 (Responsive)
9
Condition 4 (Talking): the approached agent would briefly speak to the participant. The language is not recognizable, but the tonality suggests a question, a pause, then another question. After a few seconds the agent would say "OK" and turn back to the table.
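This exchange can be pictured as a short scripted sequence. The sketch below is an assumption-laden illustration: the step durations, clip names, and function names are invented for clarity and are not taken from the paper.

```python
# Hypothetical sketch of the condition 4 (talking) interaction as a timed
# script: the agent plays an unintelligible utterance with question-like
# intonation, pauses, asks again, then after a few seconds says "OK" and
# turns back to the table. All durations and names are assumed.

import time

def play_utterance(agent_id: str, clip: str) -> None:
    # Placeholder for whatever audio/animation call the real system used.
    print(f"[{agent_id}] plays clip: {clip}")

def talking_interaction(agent_id: str) -> None:
    play_utterance(agent_id, "gibberish_question_1")  # rising, question-like intonation
    time.sleep(1.5)                                   # brief pause (assumed duration)
    play_utterance(agent_id, "gibberish_question_2")  # second question-like utterance
    time.sleep(3.0)                                   # wait a few seconds for a reaction
    play_utterance(agent_id, "ok")                    # agent says "OK"
    print(f"[{agent_id}] turns back to the table")    # resume idle (condition 2) behavior

talking_interaction("agent_3")
```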
10
Participants Paid $7.50 per hour for a 1-hour experiment.
Condition        Male   Female
1. Static          7      4
2. Moving          6
3. Responsive
4. Talking         5
11
Apparatus Two rooms: the 1st was a reception room where participants were greeted and completed questionnaires; the 2nd was a laboratory containing the Cave and the PC for monitoring participants' physiological responses.
12
Physiological monitoring
Participants were fitted with EKG sensors on their torso and EDA sensors on their non-dominant hand. Cables were secured in a pack strapped to the participant's back and linked to the PC via serial cable.
13
Environment
14
Agents & Virtual Environment
Agents (Virtual Humans): three male and two female agents sit at the library table, studying. Additional behaviors such as blinking and leaning forward to read a book were implemented.
15
Task: "Observe your surroundings and we will ask you a few questions about what you experienced." Participants had four minutes to explore the room. No mention was made of the characters. A post-experiment questionnaire was then given.
16
Post Questionnaire
Copresence – the extent to which participants had the sense of being with other people.
Participant behavior – the extent to which participants reported that they altered their behavior in response to the agents.
Agent awareness – the extent to which participants perceived that the agents were aware of them.
17
Electrodermal Activity (EDA)
Studies show that EDA increases when a person approaches another person. This study wanted to see whether the same is true when a person approaches an agent (virtual human).
18
Copresence Results
19
Questionnaire Results
Participants who experienced the talking agents (condition 4) were significantly more likely to want to interact with the agents than participants in any other condition. Participants who experienced the responsive and talking agents (conditions 3 and 4) were more likely to perceive the agents as being aware of them than those in the static (1) or moving (2) conditions.
20
Copresence significance
Condition 3 & 4 (responsive and talking) are significantly different from 1 (Static) & 2 (moving), but Condition 2 (moving) is not significantly different from the static 1 condition. Eugene
21
Computer usage The greater the degree of computer usage, the more the agents were responded to as a computer interface rather than as human-like.
22
Participant Behavior In condition 4 (talking) there is a significant difference in how participants reported that their behavior was actually affected by the agents' presence, for example avoiding the agents or talking back to them.
23
Perceived agent awareness
The extent to which participants perceived the agents to be aware of them in various ways. Conditions 3 and 4 are significantly higher than conditions 1 and 2.
24
Physiological Measures
Data were not available for all participants because the equipment did not always function correctly. EDA and heart rate are significantly higher for condition 3 (responsive). The heart-rate increase diminishes with greater computer use.
25
Physiological Measures
26
Discussion The sense of personal contact was significantly higher in conditions 3 and 4, as described earlier. In conditions 1 (static) and 2 (moving), the fact that the agents did not respond made some participants feel "invisible" and "ghostlike," and unable to engage in two-way interaction of any form.
27
Discussion In spite of this, several people mentioned their surprise that, although they rationalized the agents as computer-driven, they nonetheless responded to them on some level as people.
28
Conclusion Participants who encountered the visually responsive agents in condition 3 experienced a significantly higher sense of personal contact with the agents. The effect diminished for experienced computer users.
29
References Maia Garau, Mel Slater, David-Paul Pertaub, and Sharif Razzaque. The Responses of People to Virtual Humans in an Immersive Virtual Environment. Presence, Vol. 14, No. 1, February 2005.