21st February 2013. The voices in your head & utilizing head movement in hearing-aid signal processing. Alan Boyd (CeSIP/IHR). Supervisors: Prof. John Soraghan (CeSIP) & Wm. Whitmer (IHR)

Internalization/externalization

Initial aims and objectives
Find the causes of internalization in hearing-impaired listeners.
Study the effect of hearing-aid signal processing on internalization.
Determine the prevalence of the perception of internalization in the hearing-impaired population.
Improve hearing-aid signal processing to reduce internalization.

Externalization I
Externalized condition vs. internalized condition. Headphones rendered 'acoustically transparent'; sources presented at 30°, 3 m in both conditions. Internalized condition: no HRTF, but ITD preserved.

Results for 1 talker. Boyd et al., JASA (2012)

Results for 4 talkers. Boyd et al., JASA (2012)

Externalization II
Listener with/without hearing aids. Continuous single male talker at 30°, 2 m; impulsive noise bursts at 2 m.

Externalization II: results for normal-hearing and hearing-impaired listeners.

Internalization survey
Participants: 267 respondents (70 without hearing aids; 122 unilaterally aided; 75 bilaterally aided).
Prevalence, as the percentage of respondents who experience internalization: 15.7% without hearing aids; 19.6% unilaterally aided; 35.1% bilaterally aided.

Psychoacoustic research summary
Hearing-impaired listeners have a compressed perception of externalization.
Results suggest that internalization occurs when auditory cues are not prominent, with no effect of the hearing aid itself.
Survey results show that internalization becomes more prevalent as the number of hearing aids increases.
No scope for a signal-processing solution...

f1-attenuation: TOTAL FAILURE

Head movement & hearing aids
Listeners move: the position of the ears (and hearing aids) relative to a sound source changes with head movement.
Hearing aids work best when static: the performance of adaptive noise-reduction algorithms can be reduced by head movements or moving sources.
Head movements may provide useful information: changing user behaviour may call for a different program setting, and algorithms could compensate for head movements.

Hardware
MEMS triple-axis gyroscope (ITG-3200) provides angular-velocity information. Programmed via Arduino and USB serial. Small enough to fit on a hearing aid. Patents exist for basic use of a gyroscope in hearing aids.
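The basic processing of the gyroscope's output can be sketched as: convert raw counts to angular velocity, then integrate to track head rotation. A minimal sketch, assuming the ITG-3200's nominal datasheet sensitivity of 14.375 LSB/(°/s) and a simple stationary-bias calibration; the actual Arduino firmware is not shown in the slides.

```python
import numpy as np

SENS_LSB_PER_DPS = 14.375  # ITG-3200 nominal sensitivity (datasheet value)

def calibrate_bias(raw_counts):
    """Estimate the zero-rate bias from samples taken while the head is still."""
    return float(np.mean(raw_counts))

def counts_to_dps(raw_counts, bias):
    """Convert raw gyroscope counts to angular velocity in degrees per second."""
    return (np.asarray(raw_counts, dtype=float) - bias) / SENS_LSB_PER_DPS

def integrate_yaw(omega_dps, dt):
    """Integrate yaw rate into a cumulative head-rotation angle (degrees)."""
    return np.cumsum(np.asarray(omega_dps) * dt)
```

For example, 100 samples at a constant 10 °/s with dt = 0.01 s integrate to a 10° head turn; in practice the residual bias makes the integrated angle drift, which motivates the sensor-fusion improvements mentioned later.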

Version 1: program selection based on user behaviour
Directional programs are useful for listening to one sound source in noisy environments; omni-directional programs are useful for localizing sounds.
Gyroscope information is used to smoothly mix between programs: directional when the listener is stationary, omni-directional when the listener is "searching".
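The stationary/searching mix described above can be sketched as a smoothed head-movement activity measure mapped to a blend weight. A minimal illustration; the threshold, time constant and control rate are placeholders, not the values used in the actual system.

```python
import numpy as np

def mix_programs(dir_out, omni_out, yaw_rate, thresh=20.0, tau=0.5, fs=100):
    """Blend directional and omni-directional outputs by head-movement activity.

    dir_out, omni_out: per-frame program outputs
    yaw_rate:          head yaw rate per frame (deg/s)
    thresh:            activity (deg/s) treated as fully "searching" (placeholder)
    tau, fs:           smoothing time constant (s) and control rate (Hz)
    """
    alpha = np.exp(-1.0 / (tau * fs))      # one-pole smoothing coefficient
    act = 0.0
    w = np.empty(len(yaw_rate))
    for i, r in enumerate(yaw_rate):
        act = alpha * act + (1 - alpha) * abs(r)   # smoothed movement activity
        w[i] = min(act / thresh, 1.0)              # 0 = stationary, 1 = searching
    return (1 - w) * dir_out + w * omni_out
```

With no head movement the output stays fully directional; sustained rotation above the threshold fades it to omni-directional, avoiding audible hard switches.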

Adaptive directional mic array Teutsch & Elko, 2001

Behavioural switching [block diagram: gyroscope (GYRO) controls switching between directional (DIR) and omni-directional (OMNI) programs A and B].

Version 3
Binaurally-linked mics. Compensates for head movements during direction-of-arrival (DOA) estimation. Uses generalized cross-correlation (GCC-PHAT). [Diagram: DOA histogram bins accumulating estimates for SOURCE 1 and SOURCE 2.]

Version 3 – System
Signal path: Mic 1 & Mic 2 → STFT → cross-correlation → over-sampled IFFT → find correlation peak → delay → angle of arrival → update histogram.
Gyroscope path: find angle rotated since previous time-step → shift current histogram against the rotated angle.
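The pipeline above can be sketched end-to-end. A minimal illustration, assuming two free-field mics with a hypothetical spacing of 0.15 m and c = 343 m/s; the FFT sizes, oversampling factor and histogram resolution are placeholders, not the values used in the actual system.

```python
import numpy as np

C = 343.0   # speed of sound (m/s), assumed
D = 0.15    # mic spacing (m), hypothetical

def gcc_phat_delay(sig, ref, fs, interp=8):
    """GCC-PHAT: delay (s) of `sig` relative to `ref` via an over-sampled IFFT."""
    n = len(sig) + len(ref)
    cross = np.fft.rfft(sig, n=n) * np.conj(np.fft.rfft(ref, n=n))
    cross /= np.abs(cross) + 1e-12            # phase transform: keep phase only
    cc = np.fft.irfft(cross, n=interp * n)    # over-sampled IFFT
    max_shift = interp * n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift]))
    shift = np.argmax(np.abs(cc)) - max_shift  # correlation peak
    return shift / float(interp * fs)

def delay_to_angle(delay):
    """Map an interaural delay to a lateral angle of arrival (degrees)."""
    return np.degrees(np.arcsin(np.clip(C * delay / D, -1.0, 1.0)))

def update_histogram(hist, angle, head_rotation, bin_width=5.0):
    """Shift the running DOA histogram by the head rotation since the last
    time-step, then add the new estimate, keeping the histogram in world
    (not head) coordinates. Wrap-around is kept for simplicity."""
    hist = np.roll(hist, int(round(head_rotation / bin_width)))
    idx = int((angle + 90.0) // bin_width) % len(hist)
    hist[idx] += 1
    return hist
```

Peaks in the histogram then indicate source directions that remain stable even while the listener's head moves between estimates.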

Version 3 – One source. Boyd et al., Proc. ICA (2013)

Version 3 – Four sources in reverb. Boyd et al., Proc. ICA (2013)

Version 3 – One source in noise. Boyd et al., Proc. ICA (2013)

Version 4: front/back detector
GCC-PHAT measures angles within ±90°, so a measurement is the same in the front and rear hemispheres (the cone of confusion). Over a short measurement period, the detector compares peak movement to head movement:
decreasing source angle + clockwise rotation = Position 1
increasing source angle + clockwise rotation = Position 2
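The comparison rule on this slide can be written directly. A minimal sketch, assuming angles in degrees with clockwise rotation positive, and assuming Position 1 is the frontal position and Position 2 the rear one (the slide's diagram is not available here, so that mapping is an assumption); the minimum-rotation threshold is a placeholder.

```python
def front_or_back(angle_change, head_rotation, min_rotation=2.0):
    """Resolve the cone of confusion over a short measurement period.

    angle_change:  change in the GCC-PHAT lateral angle (deg)
    head_rotation: head rotation over the same period (deg, clockwise positive)
    """
    if abs(head_rotation) < min_rotation:
        return "ambiguous"   # too little head movement to decide
    if angle_change * head_rotation < 0:
        return "front"       # angle falls as the head turns clockwise: Position 1
    return "back"            # angle rises as the head turns clockwise: Position 2
```

The geometric intuition: turning the head clockwise sweeps a frontal source towards smaller lateral angles, while its rear mirror image sweeps towards larger ones, so the sign of the product disambiguates the two.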

Improvements & combinations
Improve gyroscope performance using accelerometers & magnetometers (Razor Attitude and Heading Reference System).
Combine the gyro system with Navin Chatlani's steerable beamformer.
Record interaural time differences, level differences and head movements in real-world situations (ISRA 2013).
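One common way to combine a gyroscope with an accelerometer, as AHRS boards like the Razor do, is a complementary filter: the gyro tracks fast changes while the accelerometer's gravity reference corrects slow drift. A minimal one-axis sketch under that assumption; the Razor firmware itself uses a fuller attitude filter, and the blend coefficient here is a placeholder.

```python
def complementary_filter(gyro_rate, accel_angle, dt, k=0.98):
    """Fuse gyro rate (deg/s) with an accelerometer-derived angle (deg).

    High-pass the integrated gyro (weight k), low-pass the accelerometer
    angle (weight 1 - k), so gyro bias cannot accumulate indefinitely.
    """
    angle = accel_angle[0]
    out = []
    for rate, a in zip(gyro_rate, accel_angle):
        angle = k * (angle + rate * dt) + (1 - k) * a
        out.append(angle)
    return out
```

With a constant 1 °/s gyro bias and a stationary accelerometer reference, the estimate settles at a small bounded offset instead of drifting without limit as pure integration would.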

Conclusion
Perception of internalization of sound occurs in a significant minority of the hearing-impaired population.
The hearing aid may not cause it, but it may exacerbate the perception where it already exists.
Head-movement information can be used for: automatic program selection based on behaviour; robust DOA estimates; longer-timescale techniques.

Thanks for listening Any questions?

Externalization over headphones

Head movement & hearing aids [block diagrams comparing a standard adaptive beamformer with a gyroscopically guided beamformer, each combining directional (DIR) and omni-directional (OMNI) programs A and B under gyroscope (GYRO) control].