Localization cues with bilateral cochlear implants
Bernhard U. Seeber and Hugo Fastl (2007)
Presented by Maria Andrey Berezina, HST.723, April 8th, 2009

Goals and Background
Investigate the good localization ability that some people with bilateral cochlear implants (CI) have.
Localization in normal-hearing subjects:
1) ITD is the dominant cue for low-frequency tones (low-passed noises).
2) ILD is the dominant cue for high-frequency tones (high-passed noises).
3) For wideband noises both cues carry significant weight, with ITD weighted more heavily than ILD.
4) Ongoing ITD cues in the envelope of high-passed noise also contribute to localization.
Present study: CI processors transmit ILDs and envelope ITDs, so the contribution of these two cues to localization is of interest.
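The duplex-theory background above can be made concrete with the classic Woodworth spherical-head formula for the ITD. This is a hedged illustration, not part of the study: the head radius and speed of sound are assumed textbook values.

```python
import numpy as np

# Illustrative sketch of the Woodworth rigid-sphere ITD model
# (assumed values: head radius 8.75 cm, speed of sound 343 m/s;
# not taken from the Seeber & Fastl study).
def woodworth_itd(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Interaural time difference (s) for a source at a given azimuth.

    ITD = (r / c) * (theta + sin(theta)), theta in radians,
    valid for azimuths in [-90, 90] degrees.
    """
    theta = np.radians(azimuth_deg)
    return (head_radius_m / c) * (theta + np.sin(theta))

# A source 90 degrees to the side yields the maximum ITD, ~0.66 ms.
print(round(woodworth_itd(90.0) * 1e3, 2))
```

This maximum of roughly two-thirds of a millisecond is why low-frequency carrier ITDs are unambiguous only below about 1.5 kHz, where the period of the waveform exceeds the largest possible interaural delay.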

Method
Two subjects:
BW: 50 y.o. male; both processors: 300 – 5547 Hz over 12 channels
DF: 54 y.o. male; left processor: Hz for 8 channels; right processor: Hz for 11 channels
Localization measure: light-pointer method – the subject adjusted a movable light spot to the perceived direction.
Experiment 1: Localization of free-field sound sources with different spectral content and temporal envelope structure.
Experiment 2: Localization of wideband noise with acoustic effects of the head minimized.
Experiment 3: ITDs and ILDs manipulated in virtual acoustic space.

Experiment 1
Localization of free-field sound sources with different spectral content and temporal envelope structure.
Low-pass noise (LPN): restricted availability of ILDs; localization on the basis of carrier or envelope ITDs.
LPN, 200 ms env. – some ILD, carrier ITD, no envelope ITD
LPN, pulsed – some ILD, carrier ITD, envelope ITD
High-pass noise (HPN): localized on the basis of ILDs; ITDs could contribute.
HPN, scrambled – ILD, some envelope ITD
Wideband noise (WBN): ILDs as well as both carrier and envelope ITD cues are available.
WBN, pulsed – ILD, carrier ITD, envelope ITD
WBN-CI, 200 ms env. – ILD, carrier ITD, some envelope ITD
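The stimulus classes above differ only in passband and temporal envelope. A minimal sketch of how such stimuli can be constructed, with assumed cutoff frequencies and pulse rates chosen for demonstration (the paper's exact parameters are not reproduced here):

```python
import numpy as np

# Hedged illustration, not the authors' exact stimuli: band-limited
# noise shaped by either no envelope or a pulsed on/off envelope.
def bandlimited_noise(fs, dur_s, f_lo, f_hi, seed=0):
    """White noise restricted to [f_lo, f_hi] Hz via FFT masking."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(int(fs * dur_s))
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
    return np.fft.irfft(spec, n=len(x))

def pulsed_envelope(fs, dur_s, rate_hz=4.0, duty=0.5):
    """On/off envelope that strengthens the envelope-ITD cue."""
    t = np.arange(int(fs * dur_s)) / fs
    return ((t * rate_hz) % 1.0 < duty).astype(float)

fs = 16000
lpn = bandlimited_noise(fs, 1.0, 50, 800)     # low-pass noise (assumed band)
lpn_pulsed = lpn * pulsed_envelope(fs, 1.0)   # "LPN, pulsed" analogue
```

Pulsing imposes sharp envelope onsets in both ears, which is what makes the envelope-ITD cue usable even when carrier fine structure is not transmitted.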

Experiment 1: Results (LPN)
BW – unable to localize.
DF – can localize: LPN with slow envelope changes relatively well; when the envelope-ITD cue is strengthened (LPN, pulsed), localization improves.

Experiment 1: Results (WBN)
Both subjects are able to localize. BW localizes WBN of either temporal structure equally well; DF localizes both WBN stimuli well, but does slightly poorer on the pulsed WBN stimulus.

Experiment 1: Results (HPN, scrambled)
Both subjects are able to localize; ILDs or envelope ITDs must have been used.

Experiment 2
Localization of wideband noise with acoustic effects of the head minimized.
Subject: DF
Stimulus: WBN, pulsed – ILDs as well as both carrier and envelope ITD cues are available.
Configurations:
"Together" – processors brought together at the center of the head, 18 cm above it (minimal ITD and ILD).
"Distanced" – processors placed directly above the ears, separated by approximately the head diameter (ITDs preserved, but nearly no ILD).
"Plate" – processors placed next to each other but separated by a cardboard plate with an area approximating the head's cross-section (ITDs nearly absent, while ILDs are evoked).

Experiment 2: Results
"Together" – no localization.
"Distanced" – only right–left differentiation.
"Plate" – localization improved significantly.
The "Plate" results suggest that ILD is a strong cue, while the "Distanced" results indicate only a small contribution of ITD cues.

Experiment 3
ITDs and ILDs manipulated in virtual acoustic space.
Subject: BW
Stimuli: WBN, pulsed; HPN, 100 ms env.; HPN, pulsed
Virtual acoustics with CIs:
1) Head-related transfer functions (HRTFs) were derived: responses to maximum-length sequences (MLS) were measured via the CI processors, with the processors set to minimum amplification, i.e. the least compressive setting.
2) The test stimuli were also recorded directly off the processors, with the amplification and compression settings commonly used by the subject; the recorded stimuli thus contained all the directional information.
3) An ITD or ILD bias was introduced by filtering the prerecorded stimulus with an ITD-only or ILD-only filter.
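The idea of biasing a single binaural cue can be sketched in simplified form. In this hedged illustration the "ITD-only filter" is reduced to an integer-sample delay and the "ILD-only filter" to a broadband gain on one channel; the actual study used measured CI-processor HRTFs and frequency-dependent filters, so the functions below are hypothetical stand-ins.

```python
import numpy as np

# Simplified cue-bias sketch (assumed, not the authors' filters):
# shift one channel in time for an ITD bias, or scale it for an ILD bias.
def apply_itd_bias(left, right, itd_s, fs):
    """Delay the right channel by itd_s seconds (positive = right lags)."""
    n = int(round(itd_s * fs))
    delayed = np.concatenate([np.zeros(n), right])[: len(right)]
    return left, delayed

def apply_ild_bias(left, right, ild_db):
    """Attenuate the right channel by ild_db decibels."""
    gain = 10.0 ** (-ild_db / 20.0)
    return left, right * gain

fs = 16000
sig = np.random.default_rng(1).standard_normal(fs)
l, r = apply_itd_bias(sig, sig, 500e-6, fs)   # +500 us ITD bias
l2, r2 = apply_ild_bias(sig, sig, 6.0)        # 6 dB ILD bias
```

Because the bias is applied on top of a stimulus that already carries the full directional information, the shift in the subject's pointing response relative to the size of the imposed bias gives a direct estimate of that cue's perceptual weight.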

Experiment 3: Results
Pulsed WBN: localization follows an ILD offset by about 50%. This confirms that ILDs provided the predominant information for localizing wideband sounds; the relative contribution of ITD cues is small.
HPN with slow envelope changes: the ILD weight increases and the ITD influence appears absent.
Pulsed HPN: the dominance of ILDs did not change. The pulsation emphasized envelope ITDs and their weight increased, but it still remained low.

Summary
Both subjects consistently relied on ILDs for localization of all tested sound types, while ITDs contributed only a small amount of information.