Conveying User Facial Expressions into Virtual World E-Learning Applications
Mike Procter
GSRDD 2014
Goal: to investigate the usability of inexpensive consumer-grade devices for affect detection
Part of the MScIS thesis "A Multi-agent Framework to support User-Aware Conversational Agents in an E-learning Environment"
Supervisors: Fuhua (Oscar) Lin, Bob Heller
Background: Conversational Agents
Freudbot and Piagetbot: role-playing actor agents that simulate speaking with Sigmund Freud or Jean Piaget about their lives, theories, and colleagues (Heller & Procter, 2011)
They use conversational rules to switch between or progress through topics
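Below is a minimal sketch of the kind of rule-based topic handling described above; the topics and keywords are illustrative placeholders, not Freudbot's actual rule set.

```python
# Illustrative rule table: topic name -> trigger keywords (hypothetical examples).
TOPIC_RULES = {
    "dreams": ["dream", "sleep", "unconscious"],
    "colleagues": ["jung", "adler", "breuer"],
    "theories": ["ego", "id", "superego", "psychoanalysis"],
}

def select_topic(user_utterance: str, current_topic: str) -> str:
    """Switch topics when the user's utterance matches a rule, else stay on topic."""
    text = user_utterance.lower()
    for topic, keywords in TOPIC_RULES.items():
        if any(keyword in text for keyword in keywords):
            return topic
    return current_topic  # no rule fired: progress within the current topic

if __name__ == "__main__":
    print(select_topic("Tell me about your work with Jung", "dreams"))  # -> colleagues
```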
Background: 3D Virtual Worlds
Platforms: Second Life, OpenSim, Open Wonderland
Related concepts: Embodied Conversational Agent (ECA) (Cassell, 2001); Animated Pedagogical Agent (APA) (Johnson, Rickel, & Lester, 2000)
Examples: AU Island (Second Life and OpenSim), Montclair U Theorists Project, Sim-on-a-Stick
Background: Brain-Computer Interface (BCI)
Consumer devices, mostly aimed at gamers, e.g. the Emotiv EPOC headset
Question: Can we use the Emotiv device to make Freudbot aware of the user's emotions?
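To make the question concrete, here is a sketch of how an emotion agent might consume facial-expression events from a headset. The `read_expression_event` function is a hypothetical stand-in for the Emotiv SDK's expression detections, not its actual API, and the threshold value is an assumption.

```python
import random
import time

def read_expression_event():
    """Hypothetical stand-in for a headset expression event (random values for demo)."""
    return {"smile": random.random(), "frown": random.random()}

def classify_affect(event, threshold=0.6):
    """Map raw expression intensities to the coarse labels used in this work."""
    if event["smile"] > threshold:
        return "smile"
    if event["frown"] > threshold:
        return "frown"
    return "neutral"

if __name__ == "__main__":
    for _ in range(3):
        print(classify_affect(read_expression_event()))
        time.sleep(0.5)
```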
A Prototype System
1. Facial expression and emotional state (e.g. smile, frown, attentive, bored, confused) are detected and analysed by an emotion agent on the user's computer
2. Summarised emotion data is transmitted to the user's avatar in the virtual world
3. The emotion data is relayed from the user's avatar to Freudbot
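A minimal sketch of the summarise-and-transmit steps, assuming the in-world avatar exposes an HTTP endpoint; the URL and message format are hypothetical, and the prototype's actual transport is not specified here.

```python
from collections import Counter
import json
import urllib.request

# Hypothetical endpoint exposed by the user's avatar in the virtual world.
AVATAR_ENDPOINT = "http://example.org/avatar/emotion"

def summarise(labels):
    """Collapse a window of per-event affect labels into one dominant emotion."""
    dominant, _count = Counter(labels).most_common(1)[0]
    return {"emotion": dominant, "window_size": len(labels)}

def transmit(summary):
    """POST the summarised emotion data to the avatar, which relays it to Freudbot."""
    request = urllib.request.Request(
        AVATAR_ENDPOINT,
        data=json.dumps(summary).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.status

if __name__ == "__main__":
    print(summarise(["smile", "smile", "neutral", "frown", "smile"]))
```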
Demonstration
AU Science and Technology Showcase at Telus World of Science, May 2014
Adults and children participated using the Emotiv headset
Freudbot tells a joke and gauges the user's response
Main issues: hair, sensors, individual differences
Conclusions & Future Work Proof of concept To be useful need to Improve the reliability of the interface (more sophisticated s/w, training) Incorporate other affective cues relevant to learning process (boredom, interest, frustration, …) Update e-learning application (e.g. Freudbot) to make use of these cues to improve learning outcomes, engage user Controlled user trials
Questions?

References
Cassell, J. (2001). Embodied conversational agents: Representation and intelligence in user interfaces. AI Magazine, 22(3), 67-83.
Johnson, W.L., Rickel, J.W., & Lester, J.C. (2000). Animated pedagogical agents: Face-to-face interaction in interactive learning environments. International Journal of Artificial Intelligence in Education, 11, 47-78.
Heller, R.B., & Procter, M. (2011). Embedded and embodied intelligence: Virtual actors on virtual stages. In S. Graf, F. Lin, Kinshuk, & R. McGreal (Eds.), Intelligent and Adaptive Learning Systems: Technology Enhanced Support for Learners and Teachers. Athabasca University Press.

Acknowledgements
Graduate Student Research Fund
Alberta Innovates Graduate Student Scholarship