AUDITORY LOCALIZATION Lynn E. Cook, AuD Occupational Audiologist NNMC, Bethesda, MD.

How do we tell where a sound is coming from? LOCALIZATION: the ability to identify the direction and distance of a sound source outside the head. LATERALIZATION: occurs when headphones are used and the sound appears to come from within the head.

LOCALIZATION A complex perceptual process involving the sensory integration of a variety of cues. There is still no consensus on how these cues are weighted, the frequency range over which each is viable, the regions of auditory space where each is important, or the relative accuracy of each.

Horizontal Localization (L vs. R) Perceived by comparing the signal input between the two ears: Interaural time difference (ITD) Interaural phase difference Interaural level difference (ILD)

ITD Sounds arrive earlier at the ear closest to the source. The difference in arrival time = ITD – Dependent on the speed of sound and the size of the head ITD = 0 for frontally incident sound; ITD ~ 0.7 msec for 90° azimuth (maximum)
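A minimal sketch of how these ITD values arise, assuming Woodworth's spherical-head approximation with an illustrative 8.75 cm head radius and 343 m/s speed of sound (none of which appear on the slide):

```python
import math

def itd_woodworth(azimuth_deg, head_radius_m=0.0875, c=343.0):
    """Approximate interaural time difference (seconds) for a source at the
    given azimuth, using Woodworth's spherical-head model:
    ITD = (a / c) * (sin(theta) + theta), with theta in radians."""
    theta = math.radians(azimuth_deg)
    return (head_radius_m / c) * (math.sin(theta) + theta)

print(itd_woodworth(0))    # 0.0 s for a frontally incident sound
print(itd_woodworth(90))   # ~0.00066 s, roughly the 0.7 msec maximum cited above
```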

Interaural Phase Difference Coincident with the time delay (ITD). Varies systematically with source azimuth and wavelength due to distance from the source and diffraction around the head. Useful for frequencies up to about 700 Hz. The sound envelope provides similar information for higher frequencies, but to a lesser degree. Dominant cue for horizontal localization for frequencies up to 1500 Hz.
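A short sketch of why the phase cue is only useful up to about 700 Hz, assuming the ~0.7 msec maximum ITD from the previous slide: once the interaural phase difference exceeds half a cycle, more than one source angle is consistent with the same phase measurement. The cutoff expression 1/(2 × ITD_max) is an illustrative simplification:

```python
import math

itd_max = 0.0007                     # ~0.7 msec maximum ITD (previous slide)
f_ambiguous = 1 / (2 * itd_max)
print(f"Phase cue unambiguous below ~{f_ambiguous:.0f} Hz")   # ~714 Hz

def ipd(frequency_hz, itd_s):
    """Interaural phase difference in radians (before wrapping)."""
    return 2 * math.pi * frequency_hz * itd_s

print(ipd(500, itd_max))    # < pi: a single source angle fits the measurement
print(ipd(2000, itd_max))   # > pi: the phase wraps and several angles fit
```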

Interaural Level Difference (ILD) Due to head shadow effects: head and pinna diffraction attenuates sound at the far ear while boosting the sound at the near ear. Greatest for high-frequency sounds; most pronounced for frequencies > 1500 Hz. About 20 dB at 6 kHz, almost 0 at 200 Hz.

Horizontal localization poorest at 1500 Hz. Most precise at 800 Hz, esp. when source is directly in front of listener.

Horizontal Localization Low frequencies: timing cues dominate. High frequencies: intensity cues dominate.
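A toy illustration of this duplex-theory summary, with an assumed hard crossover at 1500 Hz; real listeners combine both cues and the transition is gradual, which is consistent with localization being poorest near 1500 Hz, where neither cue is strong:

```python
def dominant_cue(frequency_hz, crossover_hz=1500.0):
    """Toy duplex-theory rule of thumb: timing cues (ITD/phase) dominate below
    the crossover, level cues (ILD) dominate above it.  The hard switch is an
    illustrative simplification, not a model of real listeners."""
    return "timing (ITD)" if frequency_hz < crossover_hz else "level (ILD)"

for f in (250, 800, 1500, 4000):
    print(f, "Hz ->", dominant_cue(f))
```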

Accurate horizontal localization is possible ONLY when the relevant acoustic cues are clearly audible in BOTH EARS

Vertical localization (Up/Down) Determined from pinna cues: the listener's intimate knowledge of the complex geometry of the pinna helps pinpoint elevation for frequencies above 5 kHz. Shoulder reflections cause changes in the signal in the 2-3 kHz range.

Front/Back Localization Less understood. Spectral balance is the primary cue – high-frequency sounds are boosted by the pinna when they arrive from the front and attenuated when they arrive from behind. Front/back confusion is the MOST COMMON LOCALIZATION ERROR!

Reducing ambiguity Head movement – Feasible for sources up to about 18 feet away – The listener must be able to turn the head, and the source must be repeated or continuous for sufficient time to allow multiple head orientations – Provides information about front vs. back and distance – The cues are found in the changes in ITDs and ILDs as the listener moves the head
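To illustrate the head-movement cue, the sketch below uses a simple two-point ITD model (ITD = d/c × sin(azimuth), ignoring head diffraction; the 18 cm ear spacing and 10° turn are illustrative assumptions). Mirror-image front and rear sources produce the same static ITD, but the ITD changes in opposite directions as the head turns, which resolves the front/back ambiguity:

```python
import math

def itd_simple(azimuth_deg, ear_distance_m=0.18, c=343.0):
    """Two-point (no diffraction) ITD model: ITD = (d / c) * sin(azimuth).
    Azimuth is measured from straight ahead, positive to the right."""
    return (ear_distance_m / c) * math.sin(math.radians(azimuth_deg))

front, back = 30.0, 150.0   # mirror-image positions across the interaural axis
turn = 10.0                 # small head turn to the right, in degrees

print(itd_simple(front), itd_simple(back))               # same ITD: front/back ambiguous
print(itd_simple(front - turn), itd_simple(back - turn))
# After the turn, the front source's ITD shrinks while the rear source's ITD grows;
# the opposite sign of the change tells the listener front from back.
```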

Reducing ambiguity (cont.) Non-acoustic cues may also contribute – Visual cues – Source familiarity Comparison with stored patterns – Once the head reaches its final size and distance between the ears, nothing will change these stored patterns except ear disease, trauma, or hearing changes – Listeners can adapt to a stable unilateral hearing loss, assuming sound remains audible on both sides.

Why is auditory localization important? Allows us to pinpoint a sound of interest Locate the position of another person Locate direction and distance of a moving sound source Allows us to quickly locate and attend to a speaker, esp. in multi-talker situations

Visual localization Just as accurate, but not nearly as efficient Not possible in low or reduced light situations, or when the source of the sound cannot be visualized

Effects of hearing loss on localization ability Horizontal localization ability decreases with increasing low-frequency hearing loss (below 1500 Hz) Sounds must be audible (at least 10 dB above threshold) Vertical localization ability decreases with increasing high-frequency hearing loss

Unilateral hearing loss Severely disrupts horizontal localization ability Front-to-back localization remains intact (though other studies dispute this) Vertical localization is only slightly affected, provided the other ear is adequate

Monaural localization May be possible, but not as accurate as binaural localization The time delay between the direct and pinna-reflected sound is the dominant cue for monaural localization The skill is disrupted when the pinna is taped flat, filled with putty, or bypassed with glass tubes

Repetitions plus head movement The first occurrence of the sound is random in terms of spatial orientation; the listener makes an effort to turn towards the source for the second repetition; a third repetition, with the head at a third (random) angle, provides refined information

Conductive hearing loss Results in a marked decrease in localization ability – As the conductive component increases, bone-conducted (B/C) information becomes dominant, and bone conduction has essentially no interaural attenuation – Conductive hearing loss also disrupts the phase information critical to localization

How do we measure localization ability? There is no standardized way to directly measure this ability. Testing must be done through the listener's own pinnae; therefore, headphone tests (lateralization tasks) are not the same thing, even when head-related transfer functions (HRTFs) are considered.

Effects of noise on localization The greatest decrease in accuracy is found in judgments of front/back differences; up/down errors occur less frequently; left/right judgments are least affected. Accuracy decreases as the signal-to-noise ratio decreases.

Source Azimuth in Noise Test (SAINT) Vermiglio, 1999 The listener sits in a clock-like array of 12 speakers. The task is to detect a signal (pistol shot, female vocalization) in quiet and in noise (helicopter noise, crowd noise) for a variety of presentation azimuths. May be tested under headphones (no pinna cues for horizontal localization).

Hearing in Noise Test (HINT) Soli and Nilsson, 1994 NOT a localization test. It may, however, provide indirect evidence of binaural superiority, as many subjects with unilateral loss will fail the portion of the HINT in which noise is directed towards the good ear.

Establishing an audiometric standard

Suggested guidelines Applicants must have adequate and usable hearing in both ears, particularly for the all-important speech frequencies. SRT MUST BE 25 dB OR BETTER IN EACH EAR WHEN TESTED UNDER HEADPHONES.

Suggested guidelines (cont.) Low-frequency hearing loss in one or both ears averaging 50 dB at the frequencies of 500 and 1000 Hz should be disqualifying in and of itself, regardless of performance on any other applicable audiometric tests.

Suggested guidelines (cont.) Conditions involving fluctuating hearing loss, such as Meniere's disease, should be disqualifying until the hearing loss remains stable for at least 30 days. If the thresholds at 500, 1000, and 2000 Hz differ by 25 dB or more in either ear across two audiograms separated by at least 48 hours, hearing levels may be considered unstable.

Suggested guidelines (cont.) Unresolved or chronic conductive hearing loss in one or both ears, where the air-bone gap exceeds an average of 25 dB at the frequencies 500 and 1000 Hz, should be disqualifying until or unless the condition can be successfully resolved through medical and/or surgical means.
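The numeric criteria on the three preceding guideline slides could be consolidated into a simple screening check. The sketch below is purely illustrative: the data structure and function names are hypothetical, and it covers only the SRT, low-frequency average, and air-bone-gap criteria, not the 30-day stability requirement:

```python
def low_freq_average(thresholds):
    """Average of the thresholds at 500 and 1000 Hz (dB HL)."""
    return (thresholds[500] + thresholds[1000]) / 2

def passes_screening(ear):
    """Illustrative check of the numeric criteria from the guideline slides.
    `ear` is a hypothetical dict: 'srt' in dB, plus 'ac' (air-conduction) and
    'bc' (bone-conduction) thresholds keyed by frequency in Hz."""
    if ear["srt"] > 25:                        # SRT must be 25 dB or better
        return False
    if low_freq_average(ear["ac"]) >= 50:      # low-frequency loss averaging 50 dB
        return False
    air_bone_gap = low_freq_average(ear["ac"]) - low_freq_average(ear["bc"])
    if air_bone_gap > 25:                      # unresolved conductive loss
        return False
    return True

ear = {"srt": 15, "ac": {500: 20, 1000: 25}, "bc": {500: 15, 1000: 20}}
print(passes_screening(ear))   # True for this example ear
```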

Use of hearing aids Hearing aids alter both time and intensity cues. Digital processing can delay the sound by several msec, and the signal is further delayed as it travels through tubing, transducers, etc. Vented hearing aids allow the listener to receive two different signals, which can cause ambiguity in time, phase, and intensity cues. Coupling of the device to the ear eliminates the critical pinna cues needed for vertical and front/back localization.
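A back-of-the-envelope comparison of a typical digital processing delay with the natural timing cue; the 5 msec figure is an assumed illustrative value (the slide says only "several msec"):

```python
max_natural_itd_ms = 0.7     # largest natural interaural time difference (90° azimuth)
processing_delay_ms = 5.0    # illustrative digital hearing-aid processing delay

# A delay of this size on one side, or mismatched between sides, is several times
# larger than the entire range of natural timing cues.
print(processing_delay_ms / max_natural_itd_ms)   # ~7x the maximum natural ITD
```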