DynEmo: A Database of Dynamic and Spontaneous Emotional Facial Expressions

Tcherkassof Anna*, Dupré Damien*, Dubois Michel*, Mandran Nadine, Meillon Brigitte, Boussard Gwenn, Adam Jean-Michel, Caplier Alice, Guérin-Dugué Anne, Benoît Anne-Marie & Mermillod Martial
* LIP, University of Grenoble 2; LIG, University of Grenoble 1; GIPSA-LAB, University of Grenoble 1; IEP, University of Grenoble 2; LAPSCO, University of Clermont-Ferrand
ISRE 2009, August 6-8, Leuven, Belgium

Overview

Research on facial expression highlights several relevant expressive characteristics. To examine them in more detail, researchers need comprehensive facial expression databases. DynEmo is a database of videos of dynamic and spontaneous emotional facial expressions, that is, naturalistic recordings of emotional facial displays in realistic settings. It combines dynamic, spontaneous expressive material with emotional self-report data (from the expressers) and on-line assessment data (from observers). The faces of ordinary participants were videotaped while they carried out an emotion-inducing task, and the recordings were later watched by observers who assessed the emotions displayed. The database thus consists of 358 videos (1 to 15 min long), each associated with two types of data: (1) the emotional state of the expresser, self-reported once the inducing task was completed, and (2) a timeline of observers' assessments of the emotions displayed throughout the recording.

Method

Emotional Induction. Encoders (358 ordinary participants: 182 women and 176 men, aged 25 to 65; M = 48, SD = 9.21) were recruited for a study ostensibly devoted to a visual ergonomics task (cover story). They were covertly videotaped by two hidden cameras (Fig. 1) while carrying out an emotion-inducing task. The tasks targeted ten emotions: Annoyance / Astonishment / Boredom / Cheerfulness / Disgust / Fright / Curiosity / Moved / Pride / Shame. Once the task was over, the encoders filled out a 51-scale questionnaire (6-point scales) about their emotional state:
- 35 action readiness scales. Ex.: "The visual task you just carried out stirs up a tendency to approach, to make contact."
- 3 dimensional scales. Ex.: "The state you feel after carrying out this visual task is: Unpleasant … Pleasant."
- 1 comfort state scale. Ex.: "During this visual task, you were: Ill at ease … At ease."
- 12 emotional label scales. Ex.: "How much did this visual task make you feel disgusted?"; "How much did this visual task make you feel annoyed?"

Figure 1. Whole face (camera 1), overview (camera 2), and participant's screen (emotion-induction task).

Dynamic Assessment. Decoders (171 students) assessed the videos via Oudjat, a data-processing interface (Tcherkassof et al., 2007) that lets decoders assess on-line the emotions they perceive in the encoder's face (Fig. 2). First, the decoder marks the video on-line, pressing a key each time he or she perceives the beginning or the end of an emotion displayed by the face (Fig. 2.1). Then, the decoder rates each previously marked emotional sequence by selecting one of the 12 proposed emotional terms: Pride, Disgust, Annoyance, Astonishment, Curiosity, Boredom, Moved, Shame, Humiliation, Fright, Disappointment, Cheerfulness (a "No Emotion" option is also available) (Fig. 2.2). Each video was assessed by about 20 decoders.

Figure 2. Oudjat interface for real-time emotional assessment. Figure 2.1. Tagging device: during the expression's unfolding, the judge indicates the emotional intervals. Figure 2.2. Rating device: the judge selects the emotional label best describing each emotional interval.

Results

Emotional Self-Report. The questionnaire provides an indicator of the encoder's emotional state during the task. (At present, only the 12 emotional label scales have been analyzed.)

Figure 3. Encoder's self-report on the emotional labels questionnaire while carrying out a disgust-inducing task (Video 26_1).

Dynamic Assessment. To characterize the decoders' assessments from a dynamic point of view, we calculated, for every 1/10 second of a video, the between-decoder agreement for each label (a computational sketch follows below). This yields a timeline (cf. Fig. 4) on which the decoders' emotional assessments are superimposed as mass curves, 1/10 sec. after 1/10 sec., across the whole video. At a glance, one can identify when the target emotion is displayed. In the present video of a woman confronted with a disgusting stimulus, 70% of decoders considered that she was expressing disgust (blue mass curve), essentially from sec. 34 to sec. 54. During the same interval, about 30% of decoders rated her face as expressing fright instead (cf. yellow mass curve). From sec. 0 to sec. 33, several labels compete to describe what her face expresses.

Figure 4. Video of an encoder carrying out the disgust-induction task (Video 26_1), with its corresponding emotional expressive timeline underneath.
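The timeline computation is straightforward to express in code. Here is a minimal sketch, not the authors' implementation: the Annotation record and every name in it are hypothetical stand-ins for Oudjat's actual output format, but the logic matches the description above (interval annotations from roughly 20 decoders, binned at 1/10-second resolution, with per-label agreement expressed as a proportion of the decoder panel).

```python
"""Minimal sketch of the agreement-timeline ("mass curve") computation.
Hypothetical data format, not the authors' code."""
from collections import namedtuple

# One Oudjat-style judgment: a decoder tags an interval and labels it.
Annotation = namedtuple("Annotation", ["decoder_id", "start_s", "end_s", "label"])

def agreement_timelines(annotations, video_length_s, step_s=0.1):
    """Return {label: [proportion of decoders asserting label, per 0.1-s step]}."""
    n_steps = int(round(video_length_s / step_s))
    decoders = {a.decoder_id for a in annotations}
    curves = {a.label: [0] * n_steps for a in annotations}
    for a in annotations:
        first = int(a.start_s / step_s)
        last = min(int(a.end_s / step_s), n_steps - 1)
        for t in range(first, last + 1):
            curves[a.label][t] += 1
    # Counts -> proportions of the decoder panel (about 20 per video).
    return {label: [c / len(decoders) for c in counts]
            for label, counts in curves.items()}

# Toy example: two decoders, overlapping disgust intervals.
anns = [Annotation(1, 34.0, 54.0, "disgust"),
        Annotation(2, 35.0, 50.0, "disgust"),
        Annotation(2, 10.0, 20.0, "fright")]
curves = agreement_timelines(anns, video_length_s=60.0)
print(curves["disgust"][400])  # agreement at sec. 40 -> 1.0
print(curves["fright"][150])   # agreement at sec. 15 -> 0.5
```

Plotting one such curve per label against the time axis reproduces the superimposed mass-curve view of Figure 4.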
Discussion

To support the investigation of facial expressions of emotion, DynEmo offers a comprehensive database of videos of dynamic and spontaneous faces. Each video is associated with the expresser's emotional state and with the on-line ratings of decoders who assessed all the emotions displayed during the unfolding of the expression (the emotional expressive timeline). Spontaneous, dynamic expressions are thus characterized very precisely in real time. The emotional expressive timelines (Fig. 4) instantly signal, for each video, when the target emotion is displayed by the face (see the sketch after the references for a simple way to operationalize this). They also signal the periods during which observers decode different emotions, that is, when consensus between judges about what the face displays is weak (ambiguous expression). The timelines demonstrate that facial expressions of emotion are rarely prototypical and that idiosyncratic characteristics of expressers are often salient. DynEmo therefore provides expressive material close to natural social interaction (human-human communication).

To date, DynEmo is the most thorough database of its kind. All 358 videos, together with the expressers' emotional self-reports, are accessible. Of these, 33 videos have been judged with real-time emotional assessments, and the emotional expressive timelines of these 33 videos are also available (the remaining videos are currently in the judgment process). Free access to DynEmo (videos of dynamic and spontaneous emotional facial expressions and associated emotional data): http://havok.imag.fr/

References

Tcherkassof, A., Bollon, T., Dubois, M., Pansu, P., & Adam, J. M. (2007). Facial expressions of emotions: A methodological contribution to the study of spontaneous and dynamic emotional faces. European Journal of Social Psychology, 37, 1325-1345.
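The Discussion's claim that a timeline instantly signals when the target emotion is displayed amounts, computationally, to thresholding an agreement curve. The helper below is again a hypothetical sketch rather than part of DynEmo or Oudjat; the 0.7 default mirrors the 70% disgust consensus reported for Video 26_1 and is an illustrative choice, not a DynEmo rule.

```python
"""Hypothetical helper: extract the spans of one agreement curve (as
produced by the agreement_timelines sketch above) that stay at or
above a consensus threshold."""

def consensus_spans(curve, step_s=0.1, threshold=0.7):
    """Return (start_s, end_s) pairs where agreement >= threshold."""
    spans, start = [], None
    for i, value in enumerate(curve):
        if value >= threshold and start is None:
            start = i * step_s                 # a consensus span opens
        elif value < threshold and start is not None:
            spans.append((start, i * step_s))  # the span closes
            start = None
    if start is not None:                      # still open at video end
        spans.append((start, len(curve) * step_s))
    return spans

# Toy curve sampled at 0.1 s: agreement exceeds 0.7 from step 3 to step 5.
toy = [0.0, 0.2, 0.5, 0.8, 0.9, 0.75, 0.4]
print(consensus_spans(toy))  # ~[(0.3, 0.6)]
```

On a real DynEmo timeline, running such a helper over the disgust curve of Video 26_1 would be expected to recover roughly the sec. 34 to sec. 54 window described in the Results.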