
1 The two different parts of speech: Speech Production and Speech Perception

2 Three basic components: Respiration, Phonation, Articulation

3 Air starts in the lungs: the diaphragm pushes it out of the lungs, through the trachea, up to the larynx
At the larynx, the air passes between the vocal folds
Phonation – the process of the vocal folds vibrating as air is pushed out of the lungs

4 Vocal Cords: within the larynx, just above the trachea

5

6 The Vocal Tract: the airway above the larynx used to produce speech, including the Oral Tract and the Nasal Tract

7 Articulation
Speech sounds are most often described in terms of articulation
Vowel sounds are made with a relatively open vocal tract
– The shape of the mouth and lips influences the sound of vowels
– Vowels vary with how high or low, and how far forward or back, the tongue is placed in the oral tract
Consonant sounds can be classified according to three articulatory dimensions influencing airflow (see the sketch after slide 10):
1. Place of Articulation – where the airflow is obstructed (at the lips, behind the teeth, etc.)
2. Manner of Articulation – total or partial obstruction
3. Voicing – vocal cords vibrating or not vibrating

8 Place of Articulation
Airflow can be obstructed:
- At the lips (bilabial speech sounds: ‘ba’, ‘pa’)
- At the alveolar ridge just behind the teeth (alveolar speech sounds: ‘dee’, ‘tee’)
- At the soft palate (velar speech sounds: ‘ga’, ‘ka’)

9 Manner of Articulation
“Manner” of airflow can be:
- Totally obstructed (stops: ‘ba’)
- Partially obstructed (fricatives: ‘es’)
- Only slightly obstructed (laterals: ‘ar’, ‘eL’; glides: ‘wa’, ‘ya’)
- First blocked, then allowed to sneak through (affricates: ‘cha’)
- Blocked at first from going through the mouth but allowed to go through the nasal passage (nasals: ‘na’, ‘ma’)

10 Voicing
Whether the vocal cords are:
- Vibrating (voiced consonants: ‘ba’)
- Not vibrating (voiceless consonants: ‘pa’)
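
The three dimensions from slides 7-10 amount to a small feature table. As a minimal illustrative sketch (not part of the original slides; the feature values are assumptions chosen to match the slide examples), each consonant can be stored with its place, manner, and voicing:

```python
# Minimal sketch: classifying a few consonants by the three articulatory
# dimensions from slides 7-10 (place, manner, voicing).
# The feature values below are illustrative, matching the slide examples.

CONSONANTS = {
    # phoneme: (place of articulation, manner of articulation, voicing)
    "b": ("bilabial", "stop", "voiced"),
    "p": ("bilabial", "stop", "voiceless"),
    "d": ("alveolar", "stop", "voiced"),
    "t": ("alveolar", "stop", "voiceless"),
    "g": ("velar", "stop", "voiced"),
    "k": ("velar", "stop", "voiceless"),
}

def describe(phoneme: str) -> str:
    """Return a one-line articulatory description of a consonant."""
    place, manner, voicing = CONSONANTS[phoneme]
    return f"/{phoneme}/ is a {voicing} {place} {manner}"

for p in CONSONANTS:
    print(describe(p))   # e.g. "/b/ is a voiced bilabial stop"
```

Note that ‘ba’ vs. ‘pa’ differ only in the voicing dimension, which is exactly the contrast slide 10 highlights.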

11 Coarticulation & Speech Perception
Coarticulation: the overlap of articulation in space and time – the production of one speech sound overlaps the production of the next
It does not cause much trouble for listeners understanding speech in their native language, but it makes speech perception harder when learning new languages (and for researchers)

12 Components of Speech Perception

13 Components of language

14 Crucial need for language stimulation
Talking to the baby is essential: there is a correlation between the mother’s speech and the baby’s language
“Infant-Directed Speech” (“motherese”) is how we teach the baby about our “mother tongue” and how our society communicates

15 Crucial need for language stimulation
Infant-Directed Speech & hearing the sounds of language:
– Prosody: rhythm, tempo, cadence, melody, intonation
– Categorical perception: differentiating the phonemes (sounds) of your “mother tongue”
– Turn-taking & social reciprocity: the communication dance

16 “Categorical” Speech Perception
Learning the phonemes (speech sounds) of your “mother tongue”:
– About 200 “speech sounds” (sound “categories”) can be heard universally by all newborns
– Roughly 45 sound categories (/bah/, /pah/, /rah/, /lah/, etc.) are heard by adults in a typical language
– Why are they called “categories” of sound? Because “bah” is always “bah” no matter who says it
– You hear “bah” whether it is said by a 2-year-old or a 22-year-old, by a female or a male speaker, etc.

17

18 “Categorical” Speech Perception
Learning the phonemes (speech sounds) of your “mother tongue”:
– What happened to the other 150 or so sound categories that young babies can hear but that are lost by adulthood?
– Example: adult Japanese speakers do not distinguish the sounds /rah/ vs. /lah/ -- “flied lice” for “fried rice”
– The Japanese language does not differentiate these sound “categories” (they are the same sound in Japanese)
– English does differentiate these sounds
– Which sounds do babies continue to “hear”? Those that they are routinely exposed to -- “Use it, or lose it” (see the sketch below)
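
As a loose illustrative sketch of “use it, or lose it” (not from the slides; the symbols and set contents are placeholders), an adult’s phoneme categories can be modeled as the subset of the newborn’s universal inventory that the ambient language routinely exercises:

```python
# Loose sketch of "use it, or lose it": an adult keeps only the sound
# categories that the ambient language routinely exposes them to.
# The inventories below are tiny placeholders (newborns start with ~200).

newborn_inventory = {"bah", "pah", "dah", "tah", "rah", "lah"}
english_exposure  = {"bah", "pah", "dah", "tah", "rah", "lah"}  # keeps /rah/ vs. /lah/
japanese_exposure = {"bah", "pah", "dah", "tah", "rah"}         # /rah/ and /lah/ collapse

def adult_categories(universal, ambient):
    """Keep only the categories the ambient language routinely uses."""
    return universal & ambient

print(adult_categories(newborn_inventory, english_exposure))   # both "rah" and "lah" survive
print(adult_categories(newborn_inventory, japanese_exposure))  # the "lah" category is lost
```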

19

20 Speech Perception through your eyes? The McGurk Effect
What you see influences what you hear:
http://www.youtube.com/watch?v=G-lN8vWm3m0

21 Crucial need for language stimulation
Critical Periods:
– Victor, the “wild child” of Aveyron, France (1800): learned only a few words
– Genie, imprisoned in her home until 13 years of age: developed “toddler-like” language (“Father take piece wood. Hit. Cry.”)

22 Language Deprivation
Oxana Malaya -- http://www.youtube.com/watch?v=2PyUfG9u-P4
Genie -- http://www.youtube.com/watch?v=ICUZN462qMw

23 Language Development: the need for a human brain

24

25 Primate Communication and Symbolic Skills (tool use)
Imitation and discourse (Oprah piece) -- http://www.youtube.com/watch?v=jKauXrp9dl4&feature=related
Vocabulary -- http://www.youtube.com/watch?v=wRM7vTrIIis
Receptive vocabulary -- http://www.youtube.com/watch?v=h7IdghtkKmA&feature=related

26 The human brain & language

27

28 Most linguists ignore the Basal Ganglia (BG)
Swearing is associated with BG involvement (see Steve Pinker, 2008)
Parkinson’s patients’ “loss” of language implicates the importance of the BG

29 Speech In The Brain – “hemispheric lateralization”
The development of brain-imaging techniques, e.g. positron emission tomography (PET) and functional magnetic resonance imaging (fMRI), has made it possible to learn more about how speech is processed in the brain
For most people, the left hemisphere of the brain is dominant for language processing:
– 95% of right-handed people are left temporal lobe dominant
– About 19% of left-handed people are right temporal lobe dominant
– About 20% of left-handed people have “bilateral” language dominance across the hemispheres

30 Speech In The Brain – “hemispheric lateralization”
Processing of complex sounds relies on additional areas of cortex adjacent to A1, called the belt and parabelt regions, usually referred to as “secondary” or “association” areas
These areas are activated when listeners hear speech, music, and all other sounds
Activity in response to music, speech, and other complex sounds is relatively balanced across the two hemispheres
– But there are “preferences” for some sounds to be processed in one hemisphere (language in the left) vs. the other (non-speech in the right)

31 Speech & “hemispheric lateralization”

