Do infants distinguish between statements and questions using audiovisual information?
Susan Geffen, Suzanne Curtin and Susan Graham
Department of Psychology, University of Calgary

Background
Cross-linguistically, questions and statements have distinct prosodic and syntactic structures. Infants can distinguish between statements and questions based on prosodic [1,2] and distributional information [3]. Adults use a combination of auditory and visual (e.g. eyebrow raising, head tilt) prosody to distinguish between statements and questions [4]. Infants as young as 4 months are sensitive to the relationship between audio and visual stimuli [e.g. 5]. By 8 months, infants distinguish between point-light displays of faces speaking different infant-directed declarative sentences [6]. 10-month-olds detect audiovisual synchrony when naturally produced infant-directed speech is paired with only the upper part of a face, but not when the whole face is visible except for the mouth [7]. Given this early sensitivity to the relationship between audio and visual stimuli [e.g. 5], we asked the following question:

Research Question
Do infants distinguish between statements and questions using audiovisual information?

Participants
Participants were 33 English-learning 4- and 6-month-olds (M = 5.89 months) with no hearing or cognitive impairments.

Stimuli
A 30-second video recording was made of the speaker reciting a story (for warm-up). Twelve sentences were audio- and video-recorded, each as a statement and as an echo question, read by a female speaker in an infant-directed register. The paired videos were matched on the duration of the audio stimuli.

Statement Stimuli
1. The man wore a yellow bowtie. Anne likes blueberry muffins. She made a tuna sandwich.
2. Sally bought a green sweater. There are flowers on the table. We had French toast for breakfast.
3. The teddy bear gives nice kisses. She drew a pretty picture. Billy kicked the soccer ball.
4. Mommy has a pink jacket. He did a double somersault. The parrot has red and yellow feathers.

Question Stimuli
1. She made a tuna sandwich? Billy kicked the soccer ball? There are flowers on the table?
2. We had French toast for breakfast? The parrot has red and yellow feathers? Anne likes blueberry muffins?
3. She drew a pretty picture? The man wore a yellow bowtie? Mommy has a pink jacket?
4. He did a double somersault? Sally bought a green sweater? The teddy bear gives nice kisses?

Results
Proportions of hits to statement and question trials were calculated. One-sample t-tests demonstrated that neither sentence type significantly differed from chance for 4-month-olds (p's > 0.5), but looking time to questions was significantly above chance for 6-month-olds (p = 0.004). Paired t-tests found no significant difference in looking time between sentence types for 4-month-olds, but a trend towards significance for 6-month-olds (p = 0.058).

Figure 2. Mean looking time duration for statement and question trials. Error bars represent +1 standard error of the mean.
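To make the analysis above concrete, here is a minimal sketch of this kind of looking-time analysis (not the authors' code): per-infant proportions of looking to the audio-matching face are tested against chance with one-sample t-tests and compared across sentence types with a paired t-test. The data and all variable names below are hypothetical, used only for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-infant looking times (seconds) to the face that
# matches the audio and to the non-matching face, for each trial type.
n_infants = 16
statement_match = rng.uniform(2, 6, size=n_infants)
statement_other = rng.uniform(2, 6, size=n_infants)
question_match = rng.uniform(2, 6, size=n_infants)
question_other = rng.uniform(2, 6, size=n_infants)

# Proportion of total looking directed to the matching face ("hits").
stmt_props = statement_match / (statement_match + statement_other)
ques_props = question_match / (question_match + question_other)

# One-sample t-tests against chance (0.5), as described in the Results.
t_s, p_s = stats.ttest_1samp(stmt_props, popmean=0.5)
t_q, p_q = stats.ttest_1samp(ques_props, popmean=0.5)

# Paired t-test comparing the two sentence types within infants.
t_pair, p_pair = stats.ttest_rel(stmt_props, ques_props)

print(f"Statements vs. chance: t = {t_s:.2f}, p = {p_s:.3f}")
print(f"Questions vs. chance:  t = {t_q:.2f}, p = {p_q:.3f}")
print(f"Statements vs. questions (paired): t = {t_pair:.2f}, p = {p_pair:.3f}")
```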
Discussion
Results suggest that neither age group is consistently matching auditory and visual stimuli for sentence-type discrimination, although 6-month-olds may be beginning to do so. These results may be attributable to developmental differences in attention to auditory and visual cues. Six months may be too young for infants to integrate audiovisual information in complex stimuli (i.e. sentences). We are currently testing 10-month-olds to further evaluate the developmental trajectory of this ability. Infants may also be attending to different parts of the face (e.g. mouth, eyebrows) at different ages [8]: 4-month-olds look longer at the eyes, 6-month-olds look equally at eyes and mouth, 8- and 10-month-olds look longer at the mouth, and 12-month-olds look equally at eyes and mouth. These findings address the gap in audiovisual integration between infants' ability to match vowels [e.g. 5] and sentences [7].

Figure 1. Illustration of a sample video pair ("Billy kicked the soccer ball?"). The movie consisted of two talking faces (one saying a statement, one saying an echo question) displayed side-by-side, while a synchronized audio track matching one of the faces (either statement or question) was heard.

Design & Procedure
We used a version of the Intermodal Preferential Looking Procedure [e.g. 5, 7]. The warm-up trial consisted of a 30-second video of the speaker reciting a story, to familiarize infants with the speaker and with the simultaneous audiovisual stimuli. The test phase consisted of two blocks of 4 trials each; half were statements and half were questions. Each trial consisted of three sets of side-by-side videos (presented one at a time) of the same speaker reciting the same sentence as a statement and an echo question, with a single accompanying audio stimulus.

References
[1] Frota, S., Butler, J., & Vigário, M. (2014). Infants' perception of intonation: Is it a statement or a question? Infancy, 19(2).
[2] Soderstrom, M., Ko, E., & Nevzorova, U. (2011). It's a question? Infants attend differently to polar questions and declaratives. Infant Behavior and Development, 34.
[3] Geffen, S., & Mintz, T. H. (2015). Can you believe it? 12-month-olds use word order to distinguish between declaratives and polar interrogatives. Language Learning and Development, 11(3).
[4] Srinivasan, R. J., & Massaro, D. W. (2003). Perceiving prosody from the face and voice: Distinguishing statements from echoic questions in English. Language and Speech, 46(1), 1-22.
[5] Kuhl, P., & Meltzoff, A. (1982). The bimodal perception of speech in infancy. Science, 218.
[6] Kitamura, C., Guellaï, B., & Kim, J. (2014). Motherese by eye and ear: Infants perceive visual prosody in point-line displays of talking heads. PLoS ONE, 9(10).
[7] Blossom, M., & Morgan, J. L. (2006). Does the face say what the mouth says? A study of infants' sensitivity to visual prosody. In 30th Annual Boston University Conference on Language Development, Somerville, MA.
[8] Lewkowicz, D., & Hansen-Tift, A. (2012). Infants deploy selective attention to the mouth of a talking face when learning speech. Proceedings of the National Academy of Sciences, 109(5).