A Cross-modal Electronic Travel Aid Device
F. Fontana, A. Fusiello, M. Gobbi, V. Murino, D. Rocchesso, L. Sartor, A. Panuccio
Università di Verona, Dipartimento di Informatica

Overview
- Motivations
- The system
- Visual analysis
- Auditory display
- Work in progress

Motivations
- Humans use several senses simultaneously to explore and experience the environment.
- Multimodal displays can enhance the user experience and the sense of engagement.
- The redundancy of our sensory system can be exploited to choose the display that is most convenient for a given application.
- Our interest: exploring new ways of transferring information across different sensory modalities, especially in the context of interactive systems.

- Cross-modality: methods that perform analysis tasks in one modality and synthesis (display) tasks in another.
- Examples: auditory display of visual information, visualization of auditory scenes, sonification of haptic sensations.
- Design problem: how to render a given percept in a modality that may not be the most obvious one for the stimulus at hand.

A cross-modality instance
- Auditory display of visual information
  – More effective human-computer interfaces
  – Aid devices for visually impaired people
- Auditory display requires sonification of the (visual) objects being displayed.
- Sonification is the acoustic analogue of visualization.

The system
- The blind person is equipped with a stereo camera and earphones.
- He/she uses a laser pointer as a cane.
- The system computes the 3D position of the laser spot and sonifies this piece of information (a sketch of the overall loop follows below).
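The processing loop implied by this slide can be summarized in a few lines. This is a minimal sketch, not the prototype's code: the grab_pair, detect_spot, triangulate and sonify callables are hypothetical placeholders for the stages detailed in the following slides.

```python
from typing import Callable, Optional, Tuple

Point2D = Tuple[float, float]
Point3D = Tuple[float, float, float]

def travel_aid_loop(grab_pair: Callable[[], Tuple[object, object]],
                    detect_spot: Callable[[object], Optional[Point2D]],
                    triangulate: Callable[[Point2D, Point2D], Point3D],
                    sonify: Callable[[Point3D], None]) -> None:
    """Locate the laser spot in both views, triangulate it, and sonify the result."""
    while True:
        left, right = grab_pair()                    # synchronized stereo frames
        spot_l, spot_r = detect_spot(left), detect_spot(right)
        if spot_l is None or spot_r is None:
            continue                                 # spot not visible this frame
        sonify(triangulate(spot_l, spot_r))          # spatialized "label" sound
```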

Visual analysis
- Track the position of the laser spot in the two images
- Compute the 3D position of the laser spot by triangulation

Stereopsis: the concept
The same 3D point projects onto two different pixels: the difference between their positions is the disparity, which is inversely related to the depth of the point (see the depth-from-disparity sketch below).
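For a rectified stereo pair the relation between disparity and depth has a simple closed form. The sketch below is illustrative only: the focal length f (pixels), baseline B (metres) and principal point (cx, cy) are assumed calibration values, not parameters of the Verona prototype.

```python
def triangulate(x_left: float, x_right: float, y: float,
                f: float, B: float, cx: float, cy: float):
    """Return the 3D point (X, Y, Z) in the left-camera frame of a rectified pair."""
    disparity = x_left - x_right           # pixels; larger disparity = closer point
    if disparity <= 0:
        raise ValueError("non-positive disparity: point at infinity or a mismatch")
    Z = f * B / disparity                  # depth is inversely proportional to disparity
    X = (x_left - cx) * Z / f
    Y = (y - cy) * Z / f
    return X, Y, Z

# Example: spot at x=420 px (left) and x=396 px (right), f=700 px, B=0.12 m
print(triangulate(420.0, 396.0, 250.0, f=700.0, B=0.12, cx=320.0, cy=240.0))
```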

Laser spot tracking
- Red band-pass filter
- Brightness normalization
- Temporal averaging
- Thresholding
- Size filter
- Epipolar constraint
- Kalman filter
(A hedged code sketch of the per-camera part of this pipeline follows below.)
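A sketch of the per-camera part of the chain above, using OpenCV and NumPy. The red-dominance test, the exponential averaging and all threshold values are illustrative assumptions; the epipolar constraint and the Kalman filter would then run on the candidates returned by this function for the two cameras.

```python
import cv2
import numpy as np

def detect_laser_spot(frame_bgr, prev_avg, alpha=0.3, min_area=4, max_area=200):
    """Return (centroid, updated_average) for the most plausible laser-spot blob."""
    b, g, r = cv2.split(frame_bgr.astype(np.float32))
    # "Red band-pass": keep pixels where red clearly dominates the other channels
    redness = r - 0.5 * (g + b)
    # Brightness normalization: rescale the red response to [0, 1] per frame
    redness = (redness - redness.min()) / (np.ptp(redness) + 1e-6)
    # Temporal averaging: exponential moving average over successive frames
    avg = alpha * redness + (1 - alpha) * prev_avg
    # Thresholding
    mask = (avg > 0.8).astype(np.uint8)
    # Size filter: keep a blob whose area is plausible for a laser spot
    n, labels, stats, centroids = cv2.connectedComponentsWithStats(mask)
    for i in range(1, n):                       # label 0 is the background
        if min_area <= stats[i, cv2.CC_STAT_AREA] <= max_area:
            return tuple(centroids[i]), avg     # (x, y) of the candidate spot
    return None, avg
```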

Auditory display
- Sonification: the most important visual objects are associated with sounds that "label" them.
- Visual objects are:
  – The laser spot
  – Disparity blobs
- For disparity blobs we sonify their area, using pitch and loudness (see the mapping sketch below).
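The slide only states that blob area drives pitch and loudness; the mapping below (logarithmic in area, with an assumed 200-2000 Hz range) is one plausible choice, not the one used in the system.

```python
import math

def blob_to_tone(area_px: float, area_min: float = 10.0, area_max: float = 5000.0,
                 f_low: float = 200.0, f_high: float = 2000.0):
    """Map a disparity-blob area to a (frequency_hz, amplitude) pair."""
    a = min(max(area_px, area_min), area_max)
    t = (math.log(a) - math.log(area_min)) / (math.log(area_max) - math.log(area_min))
    frequency = f_low * (f_high / f_low) ** t    # larger blobs -> higher pitch
    amplitude = 0.2 + 0.8 * t                    # larger blobs -> louder
    return frequency, amplitude
```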

- Spatialization: sounds are positioned in a virtual scenario and displayed in 3D
  – Relative angular position
  – Relative distance

Spatial attributes
- Distance is conveyed by processing sounds as if they were located inside a tube (a stand-in sketch follows below)
- Angular position:
  – Interaural delay
  – Head diffraction (low-pass)
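The tube model itself is not detailed on the slide; the sketch below encodes distance with a generic direct-to-reverberant balance, which is the perceptual cue such a virtual tube ultimately provides. Treat it as a stand-in under that assumption, with an arbitrary fixed impulse response playing the role of the tube.

```python
import numpy as np

def render_distance(dry: np.ndarray, distance_m: float, tube_ir: np.ndarray,
                    ref_distance_m: float = 1.0, wet_gain: float = 0.3) -> np.ndarray:
    """Attenuate the direct sound with distance while keeping the reverberant
    part roughly constant, so the direct-to-reverberant ratio encodes distance."""
    direct_gain = ref_distance_m / max(distance_m, ref_distance_m)
    wet = np.convolve(dry, tube_ir)[: len(dry)]   # fixed "tube" response
    return direct_gain * dry + wet_gain * wet
```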

The prototype

Work in progress
- Empirical evaluation
- Porting to PocketPC
- Sketch-based interface
- Other sonification modes (global)
- Looking for other interesting applications of cross-modality

Questions?

Brown and Duda model
- Interaural distance → time delay
- Head diffraction → low-pass filter on the opposite ear
- Brown & Duda, IEEE Transactions on Speech and Audio Processing, 6(5), 1998. (A simplified sketch follows below.)
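A simplified binaural sketch in the spirit of the Brown-Duda structural model: the interaural time difference is approximated with the spherical-head (Woodworth) formula and head diffraction with a first-order low-pass on the far ear. The head radius, sample rate and azimuth-to-cutoff mapping are assumptions; the published model uses a single-pole, single-zero filter whose parameters vary continuously with azimuth.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, assumed spherical head

def spatialize_azimuth(mono: np.ndarray, azimuth_rad: float, fs: int = 44100):
    """Return (left, right) signals for a source at the given azimuth
    (0 = straight ahead, positive = to the listener's right)."""
    theta = abs(azimuth_rad)
    itd = HEAD_RADIUS / SPEED_OF_SOUND * (theta + np.sin(theta))   # Woodworth ITD
    delay = int(round(itd * fs))                   # interaural delay in samples
    # First-order low-pass standing in for head-shadow diffraction at the far ear
    cutoff = 5000.0 * (1.0 - 0.8 * theta / np.pi)  # assumed azimuth-to-cutoff mapping
    b = 1.0 - np.exp(-2.0 * np.pi * cutoff / fs)
    far = np.empty_like(mono, dtype=np.float64)
    y = 0.0
    for n, x in enumerate(mono):
        y += b * (x - y)                           # one-pole low-pass
        far[n] = y
    far = np.concatenate([np.zeros(delay), far])[: len(mono)]  # delayed, muffled far ear
    near = mono.astype(np.float64)
    return (near, far) if azimuth_rad < 0 else (far, near)
```

In the full display this azimuth stage would be combined with the distance rendering sketched earlier before the two channels are sent to the earphones.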