eNTERFACE ‘08, Project 4: Design and Usability Issues for Multimodal Cues in Interface Design / Virtual Environments

Presentation transcript:


Project Team
Team Leader: Catherine Guastavino, McGill University, Canada
Team Members:
- Emma Murphy, McGill University, Canada
- Charles Verron, France Telecom R&D and Laboratoire de Mécanique et d'Acoustique (LMA), Marseille, France
- Camille Moussette, Umeå Institute of Design, Sweden

Project Proposal
The main aim of this project was to run usability tests investigating the integration and effectiveness of information delivery across the audio and haptic modalities. An interface built around a target-finding task with audio and haptic (touch) feedback was proposed as relevant to this aim. Target finding using multimodal cues is also relevant to the field of new instrument design and to the wider field of human-computer interaction.

Literature
Various studies have indicated that non-speech audio and haptics can help improve access to graphical user interfaces by reducing the burden on the visual channel (Mynatt and Weber, 1994; Ramstein et al., 1996). Studies have specifically investigated the use of audio and haptics to convey object location in a spatial structure (Wood et al., 2003; Lahav and Mioduser, 2004; Murphy et al., 2007). Previous studies have also investigated the use of 3D audio with gesture for target finding in virtual environments (Marentakis and Brewster, 2005).

Interface Design
Audio Feedback:
- Non-speech auditory cues
- Freesound
- MAX/MSP
- IRCAM Spat object
Haptic Feedback:
- PHANTOM OMNI
- H3D API

Audio-Haptic Design
The proposed idea was to implement a target-finding task with haptic feedback from the PHANTOM and non-speech audio cues. A virtual environment composed of a number of parallel planes was created, with a target located randomly on one of the planes.
Haptics: a magnetic effect was used to create a rigid surface on the planes and on the target.
Audio: auditory cues were designed from a string instrument (a cello) using 3D spatial audio.
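The magnetic surface effect can be sketched as a simple spring pull: when the stylus comes within a snap distance of a plane, a force draws it onto the surface. This is a minimal one-dimensional illustration only; the snap distance and stiffness values are assumptions, not the project's actual H3D settings.

```python
# Sketch of a "magnetic" haptic surface: within a snap distance of a plane,
# the device is pulled toward it with a spring force. Gains and distances
# are illustrative assumptions.
SNAP_DISTANCE = 0.01   # metres (assumed)
STIFFNESS = 300.0      # N/m (assumed)

def magnetic_force(stylus_z, plane_z):
    """1-D force pulling the stylus onto the plane when it is close enough."""
    d = stylus_z - plane_z
    if abs(d) <= SNAP_DISTANCE:
        return -STIFFNESS * d  # spring pull toward the plane
    return 0.0                 # outside the magnetic region: no force
```

The same rule, applied along the plane normal, gives the rigid-feeling snap on both the planes and the target.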

Audio-Haptic Design
Horizontal and vertical conditions.
Audio:
- Crossing-the-planes cue
- Target location cue
- Target found cue
Haptics:
- Magnetic effect on the surface of planes and target

Audio-Haptic Design
We used the IRCAM Spat object for 3D sound spatialization over headphones, using binaural rendering with the HRTF database of Gardner and Martin (1994). The virtual sound source (the bowed-cello sound) is spatialized using the “ears in hand” metaphor: it is played only when the target and the stylus are located on the same plane (horizontal or vertical, according to the configuration).
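The audibility rule and the “ears in hand” metaphor described above can be sketched as follows: the source is rendered only while stylus and target share a plane, and its direction is computed relative to the stylus position, as if the listener's ears were at the hand. The function names are illustrative, not the project's actual SPAT patch.

```python
import math

def should_render_audio(stylus_plane, target_plane):
    """The cello sound plays only while stylus and target share a plane."""
    return stylus_plane == target_plane

def relative_source_position(stylus, target):
    """'Ears in hand': vector from the stylus to the target source."""
    return tuple(t - s for s, t in zip(stylus, target))

# Example: stylus and target both on plane 2 -> the source is spatialized
stylus, target = (0.1, 0.0, 0.4), (0.3, 0.2, 0.4)
if should_render_audio(2, 2):
    direction = relative_source_position(stylus, target)
    azimuth = math.degrees(math.atan2(direction[1], direction[0]))
```

The azimuth (and, in 3D, elevation) derived this way would then drive the binaural renderer.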

Experimental Design
Independent variables:
- Feedback: audio-only or audio-haptic
- Orientation: vertical or horizontal
resulting in four experimental conditions:
- Audio-Haptic Vertical
- Audio-Haptic Horizontal
- Audio-Only Vertical
- Audio-Only Horizontal
Dependent variables:
- Completion times
- Trajectories
- Perceived effectiveness and ease of use
- Cognitive strategies
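The 2x2 design above is just the cross product of the two independent variables; a minimal sketch (labels are illustrative):

```python
from itertools import product

# Crossing the two independent variables yields the four conditions.
feedbacks = ["audio-only", "audio-haptic"]
orientations = ["vertical", "horizontal"]
conditions = [f"{f} / {o}" for f, o in product(feedbacks, orientations)]
```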

Experimental Hypotheses
1. Users would find the audio-only condition more difficult to navigate without the support of the haptic planes.
2. Flipping the planes from vertical to horizontal orientation would affect performance in both the audio-only and audio-haptic conditions.

Demo Video

Experiment: Target-Finding Task
23 participants.
Training introduction: users became familiar with the audio-haptic cues using a visual representation of the planes. Users were asked to navigate the planes, first finding the plane with the target and then locating the target on it.
Trial experiment: users were presented with 8 trials, 2 per condition. Users were not given any information about the 3D audio mappings or the haptic feedback.
Main experiment: 44 trials (11 per condition). Condition order and target position within and across planes were randomised.
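A trial list like the one above (11 trials per condition, randomised order and target placement) could be generated as follows. The number of planes and the coordinate ranges are assumptions for illustration, not taken from the slides.

```python
import random

N_PLANES = 5  # assumed; the slides do not state the plane count

def build_trial_list(conditions, trials_per_condition=11, seed=None):
    """11 trials per condition, randomised target plane/position and order."""
    rng = random.Random(seed)
    trials = []
    for cond in conditions:
        for _ in range(trials_per_condition):
            trials.append({
                "condition": cond,
                "target_plane": rng.randrange(N_PLANES),
                "target_xy": (rng.uniform(-1, 1), rng.uniform(-1, 1)),
            })
    rng.shuffle(trials)  # randomise presentation order across conditions
    return trials

conditions = ["AH-V", "AH-H", "AO-V", "AO-H"]
trials = build_trial_list(conditions)  # 44 trials in total
```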

Initial Results: Completion Times
Initial completion-time analyses confirm the hypothesis: completion times in the audio-only condition are significantly longer than in the audio-haptic condition. Furthermore, the vertical condition took significantly longer than the horizontal condition.

Initial Results: Interaction Effects
Factorial ANOVA showed a significant main effect of feedback (audio vs. audio-haptic, p < .001) and of orientation (vertical vs. horizontal, p < .01).
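The main effects tested by the factorial ANOVA compare marginal mean completion times for each level of feedback and orientation; a sketch of how those cell summaries could be computed from trial logs (the times here are made up for illustration, not the project's data):

```python
from statistics import mean
from collections import defaultdict

# Illustrative trial log: one mean time per design cell (fabricated values).
trials = [
    {"feedback": "audio", "orientation": "vertical", "time": 21.4},
    {"feedback": "audio", "orientation": "horizontal", "time": 15.2},
    {"feedback": "audio-haptic", "orientation": "vertical", "time": 12.8},
    {"feedback": "audio-haptic", "orientation": "horizontal", "time": 9.5},
]

def marginal_means(trials, factor):
    """Mean completion time for each level of one factor."""
    groups = defaultdict(list)
    for t in trials:
        groups[t[factor]].append(t["time"])
    return {level: mean(times) for level, times in groups.items()}

by_feedback = marginal_means(trials, "feedback")
by_orientation = marginal_means(trials, "orientation")
```

With real per-trial data, these marginal means are what the two significant main effects describe: audio slower than audio-haptic, vertical slower than horizontal.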

Analysis
Strategies:
- From observation, the most efficient users were those who immediately grasped the structure of the virtual environment and understood the 3D audio cues.
- Spatial audio: elevation cues were more difficult to perceive without individualized HRTFs.
Gestures:
- Interesting gestural use of the haptic device.
- Some participants changed their hand movement according to the orientation of the planes.
- One participant had an interesting gestural strategy of recreating the haptic planes with his free hand in the audio-only condition.
Further analysis:
- Post-task questionnaires
- Trajectories, to further analyse gestural control and to investigate chance identifications of the target source

Demo: Recreating User Trajectories

Future Work
Further analysis:
- Post-task questionnaires
- Trajectories
- Further quantitative results
Further evaluations:
- Develop the cues
- Implement the experiment using other haptic devices
Applications:
- Visually impaired users
- Small-screen devices

Summary
The aim of this project was to highlight usability issues for audio-haptic cues by conducting user evaluations of a multimodal interface. The experiments confirmed our initial hypothesis that the audio-only condition would be the most difficult for users to navigate. Further analysis will focus on the qualitative comments from the post-task questionnaires, together with trajectory analysis and gestural movements. We intend to develop this study in terms of the audio-haptic cues, use other haptic devices, and extend the perceptual evaluation to investigate other aspects of multimodal integration.