The Development of Emotional Interactions Across the Senses: Interactions between visual and auditory emotional information in children versus adults

Keri Swenson, Hiu Mei Chow, Sarah Izen, and Vivian Ciaramitaro
Psychology Department, University of Massachusetts, Boston, MA

INTRODUCTION
Our senses allow us to receive a wealth of information. In a social environment, voices and faces act as primary means of communicating social information (Ghazanfar et al., 2005). Together, visual and auditory inputs interact cross-modally in ways that should influence the perception of facial affect. In a phenomenon known as a perceptual aftereffect, visual or auditory adaptation to one affective category (e.g., angry) causes subsequently viewed ambiguous facial affect to appear opposite to the adapted category (e.g., happy). Though this effect is well studied in adults, less is known about its development; in particular, there is a discrepancy in the literature as to whether multisensory integration is innate at birth or modified by experience (Brandwein et al., 2011). In our study, we examine audiovisual affective correspondence in children and adults. We hypothesize that (1) the strength of adaptation in both children and adults will be stronger for congruent (matching) than for incongruent (non-matching) affect, and (2) adaptation to happy affective stimuli will be more robust across conditions for both children and adults.

EXPECTED RESULTS
[Figure: Psychometric functions at baseline and post-adaptation, plotting faces judged happy (%) against face morphs (% emotion). The PSE at baseline is the 0% morph; after adaptation, the same morph appears 25% angrier, and the size of the curve shift is the adaptation effect.]

METHOD: AFFECTIVE ADAPTATION
In the baseline condition, participants judged a series of faces drawn from a morphed emotional continuum as happy or angry via button press, in a two-alternative forced-choice (2AFC) paradigm. From these data we established each participant's point of subjective equality (PSE). In the subsequent adaptation condition, participants were adapted to one of four affective conditions: Congruent Happy, Incongruent Happy, Congruent Angry, or Incongruent Angry. After adaptation, participants again judged the morphed faces (24 trials). Adaptation is quantified as the change in PSE from before to after adaptation.
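The poster reports PSEs but does not describe how they were estimated. A common approach for 2AFC data like these is to fit a logistic psychometric function to the proportion of "happy" judgments at each morph level and read off the 50% point; the adaptation effect is then the difference between the post-adaptation and baseline PSEs. The Python sketch below illustrates that approach only; the signed morph axis, the response proportions, and the scipy-based fit are illustrative assumptions, not the authors' pipeline.

# Minimal sketch: estimating the PSE from 2AFC happy/angry judgments by
# fitting a logistic psychometric function. Morph levels and response
# proportions are hypothetical; the poster does not specify its fitting method.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    # Probability of a "happy" response at morph level x; the PSE is the 50% point.
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

# Signed morph axis: -80 (80% angry) through 0 (neutral) to +80 (80% happy),
# matching the morph levels listed on the poster.
morph = np.array([-80, -40, -20, -10, 0, 10, 20, 40, 80], dtype=float)

def fit_pse(p_happy):
    # Fit the logistic and return the estimated PSE in % morph units.
    params, _ = curve_fit(logistic, morph, p_happy, p0=[0.0, 0.1])
    return params[0]

# Hypothetical proportions of "happy" responses at each morph level.
baseline   = np.array([0.02, 0.10, 0.25, 0.40, 0.50, 0.60, 0.75, 0.90, 0.98])
post_adapt = np.array([0.01, 0.05, 0.12, 0.20, 0.28, 0.38, 0.55, 0.80, 0.95])

pse_base = fit_pse(baseline)
pse_post = fit_pse(post_adapt)

# Adaptation effect: the shift in PSE from baseline to post-adaptation.
print(f"Baseline PSE:   {pse_base:+.1f}% morph")
print(f"Post-adapt PSE: {pse_post:+.1f}% morph")
print(f"Adaptation effect (delta PSE): {pse_post - pse_base:+.1f}%")

On this sign convention, adapting to happy makes the ambiguous morphs look angrier, so more physical "happy" is needed to reach subjective neutrality and the PSE shifts in the positive direction, consistent with the positive shift for happy adaptation reported in the results.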
Visual Stimuli
All face stimuli were selected from the NimStim database (Tottenham et al.). For adaptation, 30 unique faces (15 female, 15 male) were displayed at either 100% happy or 100% angry. MorphMan software was used to create probe images from a subset of 4 unique adaptation faces (2 female, 2 male), morphed along an emotional continuum from angry through neutral to happy and containing 80%, 40%, 20%, 10%, or 0% of a given emotion.

Auditory Stimuli
Auditory adaptation used 30 unique crowd sounds (15 positive, 15 negative).

Participants
26 children (6-12 years of age) and 52 adults (18-35 years of age) were recruited from the Living Laboratory at the Museum of Science in Boston, Massachusetts. Before presenting the task, we obtained consent or assent and measured each participant's current emotional state using the adult or child version of the Positive and Negative Affect Schedule (PANAS). Before debriefing, we administered the PANAS again to capture participants' post-task affect.

RESULTS – Demographics

Condition                   Adults                       Children
Adapt Happy, Congruent      5 male, 5 female, 2 unknown  4 male, 3 female
Adapt Happy, Incongruent    4 male, 8 female             2 male, 3 female
Adapt Angry, Congruent      7 male, 3 female, 4 unknown  3 male, 4 female, 1 unknown
Adapt Angry, Incongruent    10 male, 4 female            2 male, 4 female

RESULTS – PSE shift
[Figure: PSE shifts for the Adapt Happy and Adapt Angry conditions, plotted separately for adults and children.]

Summary of findings: These results reveal differences between adults and children. Congruency effect: in adults, the strength of adaptation is similar for congruent and incongruent affect, whereas in children adaptation is stronger for congruent than incongruent affect in the adapt-happy condition, with the opposite pattern in the adapt-angry condition. Emotion effect: in both adults and children, adaptation strength is similar across emotions; happy adaptation produces a positive PSE shift, and angry adaptation produces a negative shift.

CONCLUSIONS
Given these differences, our findings support the view that multisensory integration and the perceptual processing of emotional information are processes that develop through childhood and adolescence into adulthood.

ACKNOWLEDGMENTS
This research was supported in part by the UMass Boston Dean's Office and an RTF grant from the UMass Boston Psychology Department.

REFERENCES
Dionne-Dostie, E., Paquette, N., Lassonde, M., & Gallagher, A. (2015). Multisensory integration and child neurodevelopment. Brain Sciences. doi:10.3390/brainsci5010032
Ghazanfar, A. A., Maier, J. X., Hoffman, K. L., & Logothetis, N. K. (2005). Multisensory integration of dynamic faces and voices in rhesus monkey auditory cortex. Journal of Neuroscience. http://www.ncbi.nlm.nih.gov/pubmed/15901781