Categorizing Emotion in Spoken Language
Janine K. Fitzpatrick and John Logan

INTRODUCTION

We understand emotion in spoken language through two types of cues:
- Semantic content (what is being said)
- Prosodic content (changes in pitch, amplitude, and duration)

People with psychopathy show lower accuracy when identifying emotions from spoken words, particularly fear (Blair et al., 2002). Bagley, Abramowitz, and Kosson (2009) found that psychopaths classified affective stimuli less accurately than non-psychopaths, but their experimental design included no fear category. A pilot study indicated that even non-psychopathic listeners have trouble identifying fear from prosodic content alone.

The current study aimed to replicate the findings of Bagley et al. (2009) in a non-psychopathic population, with fear included as a response category. The results will provide a normative sample for further research with a psychopathic population.

METHOD

Participants
- 36 monolingual English-speaking Carleton University undergraduate students
- All non-psychopathic, as measured by the Self-Report Psychopathy Scale (SRP-II; Williams, Paulhus, & Hare, 2007)

Ratings Task
- 384 sentences in total (18-20 in each emotion category, spoken in English and French by 4 speakers)
- Participants rated affect by choosing from 5 emotion categories (see the scoring sketch after this section)
- 7-point intensity scale (1 = low intensity; 4 = moderate intensity; 7 = high intensity)

Design
- Semantic condition: English sentences produced with neutral prosody (no prosodic cues)
- Prosodic condition: French sentences produced with appropriate prosodic cues (no semantic cues for monolingual English listeners)
- 2 male and 2 female speakers
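The following is an illustration only, not part of the original poster: a minimal Python sketch of how individual trial responses from the ratings task might be coded and scored to produce the per-condition, per-emotion accuracies reported in the Results. All field names and example trials are hypothetical.

```python
# Minimal sketch (hypothetical field names): score forced-choice responses
# and summarize accuracy by condition and intended emotion.
from collections import defaultdict

# Each trial records the condition (semantic vs. prosodic), the emotion the
# speaker intended, the category the listener chose, and the 1-7 intensity rating.
trials = [
    {"condition": "semantic", "intended": "fear", "chosen": "fear", "intensity": 5},
    {"condition": "prosodic", "intended": "fear", "chosen": "sadness", "intensity": 3},
    # ... remaining trials from the 384-sentence set
]

hits = defaultdict(int)
counts = defaultdict(int)
for t in trials:
    cell = (t["condition"], t["intended"])
    counts[cell] += 1
    hits[cell] += int(t["chosen"] == t["intended"])

for cell in sorted(counts):
    print(cell, round(hits[cell] / counts[cell], 2))  # proportion correct per cell
```

Averaging each participant's proportions within a condition × emotion cell would yield the group means (and standard errors) plotted in Figure 1.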
RESULTS

Figure 1. Mean accuracy for sentence categorization by emotion in the semantic and prosodic conditions. Error bars represent ±1 standard error of the mean.

Table 1
Mean categorization accuracy (SD) of high and low SRP-III participants by emotion and condition

Condition   Emotion       Low (n = 18)   High (n = 18)
Semantic    Happiness**   .85 (.23)      .58 (.31)
            Sadness       .82 (.14)      .75 (.17)
            Anger**       .79 (.17)      .56 (.24)
            Fear**        .84 (.16)      .64 (.22)
            Neutral       .83 (.22)      .74 (.20)
Prosodic    Happiness     .54 (.16)      .50 (.17)
            Sadness       .63 (.20)      .55 (.22)
            Anger         .77 (.14)      .73 (.17)
            Fear          .28 (.17)      .18 (.16)
            Neutral       .63 (.22)      .63 (.20)

Note. ** p < .01 for low versus high SRP-III group comparisons.

Figure 2. Relationship between SRP-III score and response accuracy for sentences expressing fear in the semantic condition, r = -.36.

DISCUSSION

The ratings task depends on the perception and categorization of emotional cues.
- Participants relied more on semantic cues when identifying happiness, sadness, and fear in speech, and more on prosodic cues when identifying anger.
- Next step: analyze the confusion data for a multidimensional scaling solution (a rough sketch follows the references).
- Even subclinical levels of psychopathy may be implicated in deficits in processing emotional language; might the dysfunctional fear hypothesis (reduced aversive arousal to punishment; Blair et al., 2005) account for this?
- Future iterations will examine categorization accuracy within a psychopathic population.

REFERENCES

Bagley, A. D., Abramowitz, C. S., & Kosson, D. S. (2009). Vocal affect recognition and psychopathy: Converging findings across traditional and cluster analytic approaches to assessing the construct. Journal of Abnormal Psychology, 118(2).
Blair, J., Mitchell, D. R., & Blair, K. (2005). The psychopath: Emotion and the brain. London: Blackwell Publishing Professional.
Blair, R. J. R., Mitchell, D. G. V., Richell, R. A., Kelly, S., & Leonard, A. (2002). Turning a deaf ear to fear: Impaired recognition of vocal affect in psychopathic individuals. Journal of Abnormal Psychology, 111(4).
Scherer, K. R., Johnstone, T., & Klasmeyer, G. (2003). Vocal expression of emotion. In R. J. Davidson, K. R. Scherer, & H. Goldsmith (Eds.), Handbook of Affective Sciences. New York and Oxford: Oxford University Press.
Williams, K., & Paulhus, D. (2002). Factor structure of the Self-Report Psychopathy scale (SRP-II) in non-forensic samples. Personality and Individual Differences, 37.
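As a rough illustration of the multidimensional scaling step mentioned in the Discussion (not an analysis reported on the poster), the sketch below applies two-dimensional MDS to a hypothetical 5 × 5 confusion matrix of response proportions using scikit-learn; all matrix values are made-up placeholders.

```python
# Rough sketch of an MDS solution for confusion data, assuming a 5x5 matrix of
# response proportions (rows = intended emotion, columns = chosen emotion).
# The values below are illustrative placeholders, not the study's data.
import numpy as np
from sklearn.manifold import MDS

emotions = ["happiness", "sadness", "anger", "fear", "neutral"]
confusion = np.array([
    [.54, .10, .08, .12, .16],
    [.09, .63, .05, .13, .10],
    [.05, .06, .77, .07, .05],
    [.15, .25, .14, .28, .18],
    [.10, .12, .08, .07, .63],
])

# Symmetrize and treat high confusability as low dissimilarity: emotions that
# listeners confuse often should land close together in the MDS space.
similarity = (confusion + confusion.T) / 2
dissimilarity = 1 - similarity
np.fill_diagonal(dissimilarity, 0)

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)
for name, (x, y) in zip(emotions, coords):
    print(f"{name}: ({x:.2f}, {y:.2f})")
```

Distances in the resulting two-dimensional configuration indicate which emotion categories listeners treat as perceptually similar, which is the kind of structure the planned confusion-data analysis would examine.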