© 2015 Psychology Press / Taylor & Francis


Chapter 7: Prosody

Two main kinds of prosody: emotional and linguistic.

A comprehensive model of the neural substrates of prosody has not yet been developed. However, two main proposals have been made about one of the key issues, namely cerebral lateralization:

Proposal #1: The key factor involves acoustic features:
- right-lateralized: long-duration pitch variation
- left-lateralized: short-duration temporal variation

Proposal #2: The key factor involves functional features:
- right-lateralized: emotional prosody
- left-lateralized: linguistic prosody

Each proposal has received some support, but neither one can account for all the data.

1. Emotional Prosody
1.1. Perception
1.1.1. The Right Mid-to-Anterior Superior Temporal Cortex: Auditory Integration

Stimuli: sentences with emotional meanings produced in two ways:
- by actors who used appropriate prosody
- by Kali, a software program that builds expressions out of naturally spoken syllables but lacks emotional prosody

Contrast: the actor condition MINUS the Kali condition. The cluster of activation included the human voice-sensitive area (circles).

1. Emotional Prosody
1.1. Perception
1.1.1. The Right Mid-to-Anterior Superior Temporal Cortex: Auditory Integration

This region receives input from early auditory areas and is sensitive to the combination of multiple prosodic parameters:
- stimulus duration
- mean intensity
- mean pitch
- pitch variability

It is engaged by the emotional prosody of utterances even when listeners concentrate on other aspects of the stimuli, such as the semantic content, whether the speaker is male or female, or whether the sounds are presented to the right or left ear. This suggests that the region integrates prosodic information automatically.
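The four parameters above can all be estimated directly from a waveform. The sketch below is a minimal illustration, not the analysis pipeline used in the studies: intensity is approximated by RMS amplitude, and pitch by a crude per-frame autocorrelation estimate; the function name, frame size, and thresholds are invented for the example.

```python
import numpy as np

def prosodic_parameters(signal, sr, frame=0.03):
    """Estimate four coarse prosodic parameters from a mono waveform.

    Pitch is estimated per 30 ms frame by picking the autocorrelation
    peak in the 75-400 Hz lag range (illustrative only).
    """
    duration = len(signal) / sr                            # stimulus duration (s)
    mean_intensity = float(np.sqrt(np.mean(signal ** 2)))  # RMS amplitude
    hop = int(frame * sr)
    f0 = []
    for start in range(0, len(signal) - hop, hop):
        w = signal[start:start + hop]
        w = w - w.mean()
        ac = np.correlate(w, w, mode="full")[len(w) - 1:]  # lags 0..N-1
        lo, hi = int(sr / 400), int(sr / 75)               # 75-400 Hz search band
        if hi >= len(ac):
            continue
        lag = lo + int(np.argmax(ac[lo:hi]))
        if ac[lag] > 0.3 * ac[0]:                          # crude voicing check
            f0.append(sr / lag)
    f0 = np.array(f0)
    return {
        "duration": duration,
        "mean_intensity": mean_intensity,
        "mean_pitch": float(f0.mean()) if len(f0) else 0.0,
        "pitch_variability": float(f0.std()) if len(f0) else 0.0,
    }

# Synthetic test: a 1-second tone gliding from 120 Hz to 180 Hz
sr = 16000
t = np.arange(sr) / sr
glide = np.sin(2 * np.pi * (120 * t + 30 * t ** 2))  # instantaneous freq 120 + 60t Hz
params = prosodic_parameters(glide, sr)
```

A rising glide like this yields a mean pitch near the middle of the sweep and a clearly nonzero pitch variability, which is the kind of combined profile the right superior temporal region is described as integrating.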

1. Emotional Prosody
1.1. Perception
1.1.2. The Amygdala: Relevance Detection

Mixed findings:
Evidence for involvement in perceiving emotional prosody:
- activated in many fMRI studies
- damage can disrupt recognition
Evidence against involvement in perceiving emotional prosody:
- not activated in some fMRI studies
- damage does not always disrupt recognition

A more nuanced approach: recent work suggests that the responsiveness of the amygdala to different tones of voice depends on the following factors:
- subjective relevance, i.e., how much the listener cares about other people
- contextual novelty, i.e., the degree to which the prosodic pattern is unexpected
- acoustic salience, i.e., how clearly the prosodic pattern stands out

1. Emotional Prosody
1.1. Perception
1.1.3. The Right Ventral Frontoparietal Cortex: Emotion Simulation

A lesion study analyzed behavioral and lesion data for 66 patients, who rated the degree to which the intonation of utterances expressed each of five types of emotion: happy, sad, angry, afraid, and surprised.

Patients were divided into two subgroups on the basis of their performance:
- low-scoring (bottom half, N = 33)
- high-scoring (top half, N = 33)

The lesion sites of the high-scoring subgroup were subtracted from those of the low-scoring subgroup, and three main regions of damage were found:
- right mid-to-anterior superior temporal cortex
- bilateral orbitofrontal and inferior frontal cortices
- right ventral frontoparietal cortex
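The subtraction logic of this lesion analysis can be sketched in a few lines. The maps below are simulated random data, not the patients' real lesion maps; only the subgroup sizes (33 each) follow the study's design, and the 0.2 threshold is arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical binary lesion maps: True where a voxel is damaged.
# 33 patients per subgroup, as in the study; the data are simulated.
n_voxels = 1000
low_scoring = rng.random((33, n_voxels)) < 0.10   # bottom half on the rating task
high_scoring = rng.random((33, n_voxels)) < 0.10  # top half on the rating task

# Voxelwise overlap: proportion of patients in each subgroup damaged there.
overlap_low = low_scoring.mean(axis=0)
overlap_high = high_scoring.mean(axis=0)

# Subtraction map: voxels damaged more often in low scorers than high scorers.
# Peaks in this map mark regions whose damage is associated with poor
# emotional-prosody recognition.
subtraction = overlap_low - overlap_high
implicated = np.flatnonzero(subtraction > 0.2)    # arbitrary threshold
```

On real data, the suprathreshold voxels of such a map would cluster in the three regions listed above.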

1. Emotional Prosody
1.1. Perception
Summary

- Right mid-to-anterior superior temporal cortex: auditory integration.
- Amygdalae: relevance detection.
- Right ventral frontoparietal region: emotion simulation.
- Basal ganglia: emotion simulation, sequence decoding, and/or response triggering.
- Bilateral orbitofrontal and inferior frontal cortices: cognitive evaluation.

1. Emotional Prosody
1.2. Production
Right Hemisphere

1. Emotional Prosody
1.2. Production
Left Hemisphere

Task: repetition of prosodic patterns carried by three types of utterance:
- "I'm going to the other movie"
- "ba ba ba ba ba ba"
- "aaaaaahhhh"

Patients with right brain damage (RBD) were consistently impaired, but patients with left brain damage (LBD) performed increasingly better across these three conditions. This suggests that the left hemisphere may be more sensitive to verbal-articulatory demands than to emotional prosody per se.

1. Emotional Prosody
1.2. Production
Basal Ganglia

A case study of a patient with bilateral basal ganglia damage used two tasks:
- Elicitation: produce semantically neutral sentences with angry, happy, sad, or surprised intonation.
- Repetition: copy the examiner's renditions of the same sentences with the same intonation patterns.
Listeners subjectively rated the "goodness" of each emotion type and tried to objectively identify each emotion type.

Impaired production of emotional prosody is also observed in Parkinson's disease, which likewise affects the basal ganglia.

2. Linguistic Prosody
2.1. Perception

Three main domains of linguistic prosody:
- Syntactic:
  "Sam is going to the party too." vs. "Sam is going to the party too?"
  "The boy said, 'The girl is cute'" vs. "The boy, said the girl, is cute"
- Lexical:
  OBject (noun) vs. obJECT (verb)
  green HOUSE (phrase) vs. GREENhouse (compound)
  "I like the BIG dog, not the little one" (contrastive stress)
- Tonal (Mandarin):
  ma with a high level tone = "mother"
  ma with a rising tone = "numb," "numbness," "hemp," or "cannabis"
  ma with a falling-rising tone = "horse"
  ma with a falling tone = "scold"
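The four Mandarin tones are conventionally described with Chao's five-level pitch notation (T1 = 55, T2 = 35, T3 = 214, T4 = 51). The sketch below synthesizes illustrative F0 contours from that notation and classifies them by shape; the Hz values and the crude classifier are invented for the example.

```python
import numpy as np

t = np.linspace(0.0, 1.0, 50)  # normalized syllable time
# Chao pitch level -> illustrative F0 in Hz (values are invented)
levels = {1: 100, 2: 130, 3: 160, 4: 190, 5: 220}

def contour(points):
    """Interpolate a sequence of Chao pitch levels into an F0 track."""
    xs = np.linspace(0, 1, len(points))
    return np.interp(t, xs, [levels[p] for p in points])

tones = {
    "T1": contour([5, 5]),     # high level (55), ma = "mother"
    "T2": contour([3, 5]),     # rising (35), ma = "hemp"
    "T3": contour([2, 1, 4]),  # falling-rising (214), ma = "horse"
    "T4": contour([5, 1]),     # falling (51), ma = "scold"
}

def classify(f0):
    """Crude contour classifier: level, rising, falling, or dipping."""
    start, mid, end = f0[0], f0[len(f0) // 2], f0[-1]
    if mid < start and mid < end:
        return "dipping"
    if abs(end - start) < 10:
        return "level"
    return "rising" if end > start else "falling"
```

The point of the example is that tones are categorical pitch contours on a single syllable, which is exactly the kind of cue whose perception the studies below localize.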

2. Linguistic Prosody
2.1. Perception
Syntactic Domain

The anterior superior temporal region was engaged bilaterally not only during the perception of normal speech, but also during the perception of pseudo speech (no real content words) and degraded speech (no segmental, lexical, or syntactic cues); this could reflect prosodic processing. In addition, for pseudo speech and degraded speech, but not for normal speech, there was activation in several frontal areas bilaterally, which could reflect effortful top-down processing.

Lesion studies support the view that the perception of prosodic cues for declarative/interrogative mood is subserved bilaterally. But there may be a mild RH bias, since patients with RBD are worse than patients with LBD at understanding filtered utterances.

2. Linguistic Prosody
2.1. Perception
Syntactic Domain

Examples of sentences with (a) one or (b) two intonational phrase boundaries (#):
(a) Peter verspricht Anna zu arbeiten # und das Büro zu putzen. ("Peter promises Anna to work and to clean the office.")
(b) Peter verspricht # Anna zu entlasten # und das Büro zu putzen. ("Peter promises to support Anna and to clean the office.")

The condition with two intonational phrase boundaries MINUS the condition with one boundary yielded bilateral mid superior temporal activation for natural sentences, but only left mid superior temporal activation for hummed sentences.

2. Linguistic Prosody
2.1. Perception
Tonal Domain

[Figure: Wong et al. (2004)]


2. Linguistic Prosody
2.1. Perception
Tonal Domain

A discrimination task involved two kinds of stimuli:
- Chinese words (CC), in which Chinese tones were superimposed on Chinese syllables
- tonal chimeras (CT), in which Thai tones were superimposed on Chinese syllables

Contrasts identified the regions most responsive to native tones:
- Chinese group: CC > CT
- Thai group: CT > CC

Overlapping activation for both contrasts was found in the left planum temporale. This region displayed a double dissociation:
- Chinese group: stronger response to Chinese than to Thai tones
- Thai group: stronger response to Thai than to Chinese tones

2. Linguistic Prosody
2.1. Perception
Tonal Domain

Neuropsychological studies support left-hemisphere dominance for linguistic tone perception; the size of the left primary auditory cortex may even be relevant. There is also evidence that experience perceiving linguistic tones modifies the brainstem.

2. Linguistic Prosody
2.1. Perception
Summary

- Prosodic perception in the syntactic domain appears to be bilateral, with the following caveats: a mild RH bias for declarative/interrogative distinctions, and a mild LH bias for intonational phrase boundaries.
- Prosodic perception in the lexical domain appears to be governed primarily by the LH.
- Prosodic perception in the tonal domain appears to be governed primarily by the LH.
- Acoustic considerations can explain most of these patterns of hemispheric asymmetry, but functional considerations must be invoked to explain why linguistic tone perception is strongly left-lateralized whereas non-linguistic tone perception is strongly right-lateralized.
- The basal ganglia may also contribute to the perception of linguistic prosody.

2. Linguistic Prosody
2.2. Production

- Prosodic production in the syntactic domain appears to be bilateral.
- Prosodic production in the lexical domain appears to be governed primarily by the LH.
- Prosodic production in the tonal domain appears to be governed primarily by the LH.