Auditory cortex activation to audiovisual speech in normal-hearing subjects and CI-users measured with functional near-infrared spectroscopy

L.P.H. van de Rijt 1,2, E.A.M. Mylanus 1, A.J. van Opstal 2, A.F.M. Snik 1, M.M. van Wanrooij 2
¹Department of Otorhinolaryngology, Head and Neck Surgery, Radboudumc; ²Department of Biophysics, Donders Institute for Brain, Cognition and Behaviour, Radboud University

Contact: Luuk.vandeRijt@radboudumc.nl

Introduction
Viewing a talking person’s face and mouth may enhance speech understanding in noisy environments (Helfer et al., 1997). This is regarded as a form of multisensory integration, in which unisensory signals from multiple modalities are merged to form a coherent and enhanced percept (Stein & Meredith, 1993). In cochlear implant (CI) patients, however, most neuroimaging techniques (e.g. fMRI, EEG) are not feasible for studying this. Here, we characterize multisensory integration in speech processing with an alternative, non-invasive method of recording neural activity: functional near-infrared spectroscopy (fNIRS).

Objective
To test whether multisensory integration in human auditory cortex can be demonstrated with fNIRS in normal-hearing subjects and post-lingually deaf CI-users.

Methods
Subjects. 33 normal-hearing adults, 15 female (pure-tone air-conduction thresholds ≤ 15 dB HL; 18-57 years). 5 post-lingually deaf CI-users, 5 female (> 5 years of cochlear-implant use).
Task. Subjects were instructed to view and listen to segments of a story, and were afterwards asked whether they understood the gist of the storyline (‘passive listening’).
Paradigm. The segments were played in chronological order, each followed by a silent, dark period of 25-40 s. The segments were divided into three blocks of 12. Within every block, the 12 segments were pseudo-randomly assigned to an experimental sensory-modality condition (visual, auditory, or audiovisual).
Recordings. Data were collected with a pulsed continuous-wave NIRS instrument (Oxymon MK III, Artinis Medical Systems BV). Normal-hearing subjects were measured bilaterally; CI-users were measured contralateral to the implant.

Figure 1. (A) Schematic layout of optical sources (open circles) and photodetectors (filled circles) on the left hemisphere. (B) Schematic top view of probe layout.

Results
Figure 2. Grand average responses of normal-hearing subjects (A/C) and CI-users (B/D). Line colors denote visual (red), auditory (blue), audiovisual (green), and the additive model (grey).
Figure 3. Pooled β-regression coefficients for normal-hearing subjects (open circles) and CI-users (filled squares).

Discussion
These data indicate that fNIRS is suitable for measuring cortical responses in both normal-hearing subjects and CI-users. Auditory cortex showed evoked responses to all stimulus modalities: the visual-only condition elicited the smallest response, as expected, while auditory and audiovisual stimuli evoked clear responses.

Conclusion
This study represents a preliminary attempt to determine whether the novel neuroimaging modality fNIRS could be useful for assessing auditory function in subjects using CIs.
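The block design described under Paradigm could be sketched as follows. This is a hypothetical illustration: the poster states only that 12 segments per block were pseudo-randomly assigned to a condition, so the even split of 4 segments per condition per block and the `build_paradigm` helper are assumptions.

```python
import random

# Three experimental sensory-modality conditions from the poster.
CONDITIONS = ["visual", "auditory", "audiovisual"]

def build_block(rng):
    """Assign 12 segments to conditions (assumed 4 per condition), shuffled."""
    assignments = CONDITIONS * 4
    rng.shuffle(assignments)
    return assignments

def build_paradigm(n_blocks=3, seed=0):
    """Build a trial list: 3 blocks of 12 segments, each followed by
    a silent, dark period drawn from 25-40 s (jitter assumed uniform)."""
    rng = random.Random(seed)
    trials = []
    for block in range(n_blocks):
        for condition in build_block(rng):
            trials.append({
                "block": block,
                "condition": condition,
                "silence_s": round(rng.uniform(25.0, 40.0), 1),
            })
    return trials

trials = build_paradigm()  # 36 trials, 12 per condition
```

Segments remain in chronological order within the trial list; only their condition labels are shuffled, matching the description that the story was played in order while modality varied pseudo-randomly.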
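The grey "additive model" trace in Figure 2 can be understood as the null hypothesis of no multisensory interaction: the predicted audiovisual response is the sum of the two unisensory responses. A minimal sketch, with toy time courses that are purely illustrative (the poster does not publish the underlying data):

```python
def additive_prediction(auditory, visual):
    """Pointwise sum of unisensory response time courses,
    assuming both are sampled on the same time grid."""
    return [a + v for a, v in zip(auditory, visual)]

# Toy hemodynamic time courses in arbitrary units (hypothetical values).
auditory = [0.0, 0.5, 1.0, 0.5]
visual = [0.0, 0.25, 0.25, 0.25]
predicted_av = additive_prediction(auditory, visual)
# A measured audiovisual response below this prediction would indicate
# sub-additive integration; above it, super-additive integration.
```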

