Infant Discrimination of Voices: Predictions from the Intersensory Redundancy Hypothesis
Lorraine E. Bahrick, Robert Lickliter, Melissa A. Shuman, Laura C. Batista, and Claudia Grandez
Florida International University

Abstract

Bahrick and Lickliter (2000) proposed an "intersensory redundancy hypothesis," which holds that, in early development, experiencing an event redundantly across two senses facilitates perception of amodal properties (e.g., synchrony, tempo, rhythm), whereas experiencing an event in one sense modality alone facilitates perception of modality-specific aspects of stimulation (e.g., pitch, timbre, color, pattern, configuration). Therefore, discrimination of individual voices (modality-specific information) should be enhanced when the voices are presented unimodally, in the absence of intersensory redundancy, and attenuated when they are presented bimodally, in the presence of intersensory redundancy (where amodal properties are attended). Thirty-two 3-month-old infants were habituated to the voice of a woman speaking in the context of intersensory redundancy (along with the synchronously moving face) or no redundancy (with a static face). Test trials played the voice of a novel woman speaking. Results supported our prediction and demonstrated significant discrimination (measured by visual recovery to the novel voice) in the nonredundant but not the redundant voice condition. These findings converge with those of our prior studies and demonstrate that in early development, infants attend to different properties of events as a function of whether the stimulation is multimodal or unimodal.

Introduction

Most early learning occurs in the context of close face-to-face interactions. Although research demonstrates that young infants are excellent perceivers of faces and voices, we know very little about their perception of naturalistic, dynamic, multimodal person displays. For example, we do not know under what conditions infants attend to information available in the face alone or the voice alone (modality-specific information), or to information available in the face and voice together (amodal information). Bahrick and Lickliter (2000, 2002) provided evidence for an "intersensory redundancy hypothesis," which makes predictions about infant perception of amodal and modality-specific properties in the context of multimodal versus unimodal stimulation. According to the hypothesis (see Figure 1), in early development, (1) information experienced redundantly across two sensory modalities selectively recruits attention to amodal properties of events (e.g., synchrony, tempo, rhythm) at the expense of modality-specific properties, whereas (2) information experienced in one sense modality alone selectively recruits attention to modality-specific aspects of the event (e.g., color, pattern, orientation, pitch, timbre) and facilitates perceptual learning of these properties at the expense of others. Therefore, infants should attend to and perceive modality-specific properties, such as the pitch and timbre of a voice, better when the voice is experienced unimodally (no intersensory redundancy) than when it is experienced bimodally with the synchronously moving face (intersensory redundancy). In contrast, in bimodal face-voice displays, infant attention should be recruited to amodal properties that are redundantly presented (such as rhythm, tempo, and synchrony), at the expense of modality-specific properties.
The present study tested these predictions by asking whether 3-month-old infants differentiate between two unfamiliar women's voices better when the voice is experienced without, rather than with, the synchronously moving face.

Method

Thirty-two infants were habituated, in an infant-controlled procedure, to face-voice displays of one of two women speaking a nursery rhyme, in the presence versus absence of intersensory redundancy (see Figure 2). Intersensory redundancy was provided by accompanying the speaking voice with the natural, synchronously moving face of the woman speaking (bimodal, redundant condition, N = 16). Intersensory redundancy was eliminated by presenting the speaking voice along with a static image of the woman's face (nonredundant condition, N = 16). Following habituation, infants received test trials with the voice of a novel woman speaking the same nursery rhyme, under their respective condition (with no change in the identity of the familiar face). Visual recovery to the change in voice served as the measure of discrimination. It was expected that infants would show visual recovery to the change in voice when it was experienced nonredundantly. However, when the change in voice was experienced bimodally and redundantly with the synchronously moving face, no significant visual recovery should be observed, because greater attention would be directed to amodal properties of stimulation.

Results

Results supported our predictions and demonstrated significant visual recovery (p < .05) to the new voice under the nonredundant condition but no significant recovery under the bimodal, redundant condition (see Figure 3). Three-month-old infants showed no evidence of discriminating a change in voice when the voice was accompanied by a synchronously moving face, whereas when amodal, redundant information was eliminated by presenting the voice without the moving face, vocal discrimination was evident.
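For readers who want the logic of the discrimination measure spelled out, the sketch below is a minimal Python illustration, not the authors' analysis code. It assumes the conventional definition of visual recovery (test-trial looking time minus looking time at the end of habituation), uses placeholder numbers rather than the study's data, and tests whether mean recovery within each condition exceeds zero with a one-sample t-test.

import numpy as np
from scipy import stats

def visual_recovery(final_habituation, test_looking):
    """Visual recovery (s): looking time on the novel-voice test trials
    minus looking time on the final habituation trials (an assumed,
    conventional definition; not taken verbatim from the poster)."""
    return np.asarray(test_looking, dtype=float) - np.asarray(final_habituation, dtype=float)

# Placeholder looking times in seconds, for illustration only (N = 16 per condition);
# these are NOT the study's data.
rng = np.random.default_rng(0)
nonredundant = visual_recovery(rng.uniform(4, 8, 16), rng.uniform(6, 12, 16))
redundant = visual_recovery(rng.uniform(4, 8, 16), rng.uniform(4, 9, 16))

# Discrimination within a condition: is mean visual recovery reliably greater than zero?
for name, scores in (("nonredundant (static face)", nonredundant),
                     ("redundant (moving face)", redundant)):
    t, p = stats.ttest_1samp(scores, popmean=0)
    print(f"{name}: mean recovery = {scores.mean():.2f} s, t(15) = {t:.2f}, p = {p:.3f}")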
Conclusion

These results demonstrate that, in the domain of person perception, infant differentiation between individual voices appears to be facilitated when the voices are heard without seeing the accompanying moving face and attenuated when the moving face is visible. These findings support the predictions of the intersensory redundancy hypothesis and suggest that infants attend to different properties of events as a function of whether or not stimulation provides intersensory redundancy. In unimodal stimulation, where no redundancy is available, attention to modality-specific properties is promoted, whereas in bimodal stimulation, where redundancy is available, attention to amodal properties is promoted. This has important consequences for what is perceived, learned, and remembered in naturalistic events.

References

Bahrick, L. E., & Lickliter, R. (2000). Intersensory redundancy guides attentional selectivity and perceptual learning in infancy. Developmental Psychology, 36.
Bahrick, L. E., & Lickliter, R. (2002). Intersensory redundancy guides early cognitive and perceptual development. In R. V. Kail (Ed.), Advances in child development and behavior (Vol. 30). New York: Academic Press.

Figure 1. Schematic of the intersensory redundancy hypothesis: stimulation available for exploration (multimodal, auditory-visual, versus unimodal, auditory or visual) crossed with stimulus properties (amodal, redundant, versus modality specific, nonredundant).
Figure 2. Face-voice displays presented in the redundant (moving face) and nonredundant (static face) conditions.
Figure 3. Mean visual recovery in seconds (and SD) to a change in voice for the bimodal and unimodal conditions. *p < .05.

Presented at the Society for Research in Child Development Biennial Meeting, April 2003, Tampa, FL.

This research was supported by NIMH grants RO1 MH and RO1 MH to the first and second authors, respectively. Requests for reprints should be sent to the first author.