Angry Faces Capture Attention But Do They Hold It?

Angry Faces Capture Attention But Do They Hold It? Laura Jenkins (Supervisor: Dr Paul Engelhardt) MRes Psychology 2012, Department of Psychology, Northumbria University

Introduction
The Anger Superiority Effect was first reported by Hansen and Hansen (1988). Since then, the literature has highlighted the importance of angry faces, including the proposal that angry faces are detected more quickly when displayed in crowds (Fox et al., 2000; Tipples, Atkinson, & Young, 2002). Pinkham et al. (2010) investigated the Anger Superiority Effect using real-life facial images arranged in grids, whereas Fox et al. (2000) used faces arranged in circles. The current research extended Pinkham et al. (2010) by recording participants' eye movements to address the research question: 'Angry faces capture attention, but do they hold it?'

Method
Facial images were taken from Young, Perrett, Calder, Sprengelmeyer, and Ekman (2002) and The Nottingham Face Database.

Design and Participants
13 participants from the Newcastle-upon-Tyne area were recruited through an email advertisement.
IV = emotion type, with three levels (angry, happy, other).
DVs:
1) First saccade (eye movement to a face).
2) The section of the screen that participants focus on (e.g. face 1, 2, 3 or 4).
3) Score on the memory task.

Procedure
Head movements were minimized using a chin rest. Each participant completed 60 trials (30 critical, 30 filler); an example is shown in Figure 1. The entire experimental session lasted 30 minutes. Participants were shown a fixation cross, then an encoding image, and then a response image. They had to decide whether a face in the response image was in the same position as in the previously shown encoding image. Each array contained four faces. Where an angry face was presented, a happy face was presented opposite it. The final two faces (on the diagonal) had either neutral expressions or other emotions.
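The array construction described above (an angry face, a happy face opposite it, and the remaining diagonal pair filled with neutral/other expressions) can be sketched as follows. This is a hypothetical illustration only: the position labels, the opposite-pair mapping, and the function names are assumptions, not the stimulus code actually used in the study.

```python
import random

# Assumed layout: four positions where (0, 1) and (2, 3) are opposite pairs.
POSITIONS = [0, 1, 2, 3]
OPPOSITE = {0: 1, 1: 0, 2: 3, 3: 2}


def make_critical_trial(rng: random.Random) -> dict:
    """Build one critical array: an angry face with a happy face placed
    opposite it, and the remaining (diagonal) pair sharing a neutral or
    other-emotion expression, as described in the Procedure."""
    angry_pos = rng.choice(POSITIONS)
    happy_pos = OPPOSITE[angry_pos]
    remaining = [p for p in POSITIONS if p not in (angry_pos, happy_pos)]
    filler = rng.choice(["neutral", "other"])
    trial = {angry_pos: "angry", happy_pos: "happy"}
    trial.update({p: filler for p in remaining})
    return trial


rng = random.Random(1)
trials = [make_critical_trial(rng) for _ in range(30)]  # 30 critical trials
```

A real experiment script would additionally interleave the 30 filler trials and attach image files to each emotion label; this sketch only captures the placement constraint.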
Analysis and Results
Mean first fixation times showed a marginal effect of emotion, F(2,22) = 2.78, p = .08. Angry faces were fixated more quickly than happy faces (1220 ms vs. 1315 ms), t(11) = 2.01, p = .07, and more quickly than the other faces (1220 ms vs. 1290 ms), t(11) = 2.02, p = .068. See Figure 3 for a graphical representation.
Mean total dwell times also showed a marginal effect of emotion, F(2,22) = 2.57, p = .099. Participants spent more time looking at the angry face than at the other faces, t(11) = 2.23, p < .05. See Figure 2 for a graphical representation. None of the other paired comparisons were significant.
Figure 2: Graph showing the mean total dwell times, in milliseconds, for angry, happy and other emotions.
Figure 3: Graph showing the mean first fixation times, in milliseconds, for angry, happy and other emotions.

Aims/Rationale
The study aimed to investigate the Anger Superiority Effect.
Eye tracking was incorporated (as Pinkham et al., 2010, suggested) because past literature had not used this method of data collection; it helped distinguish between holding attention on the angry face and avoiding it.
The methodologies of Pinkham et al. (2010) and Fox et al. (2000) were used to create similar stimuli (and a comparable set of images) for this study, as these methods have been shown to work.

Hypothesis/Prediction
Mean first fixation times will be lower for the angry faces (fixated more quickly than the other emotions), indicating that participants find the angry face faster than the other emotions.
Dwell times will be shorter for the angry faces, indicating that participants avoid the angry face after locating it.

Discussion and Future Directions
A slight Anger Superiority Effect was present in the data. However, this could be described as more of an 'anger pop-out' than Anger Superiority.
Participants were quicker to locate the angry face but did not avoid it; they spent more time fixating on the angry faces (shown by the mean dwell times). The results still support the background literature (Tipples, Atkinson, & Young, 2002) in suggesting that angry faces can be detected more quickly in a crowd situation; this was shown by the first fixation times.
Future research ideas include:
1) Use arrays of different sizes (6, 8 and 9) to represent a crowd situation more clearly.
2) Applications to a clinical population, such as those with a brain injury or prosopagnosia.
3) Create a new set of colour photos to increase ecological validity; a comparison between an older set (Ekman's) and a new set could then be made.

Figure 1: Visual display of the method – an example of one trial (fixation cross, encoding phase, memory response; display durations of 5000 ms and 2000 ms).

References
Fox, E., Lester, V., Russo, R., Bowles, R. J., Pichler, A., & Dutton, K. (2000). Facial expressions of emotion: Are angry faces detected more efficiently? Cognition and Emotion, 14(1), 61-92.
Hansen, C. H., & Hansen, R. D. (1988). Finding the face in the crowd: An anger superiority effect. Journal of Personality and Social Psychology, 54(6), 917-924.
Pinkham, A. E., Griffin, M., Baron, R., Sasson, N. J., & Gur, R. C. (2010). The face in the crowd effect: Anger superiority when using real faces and multiple identities. Emotion, 10(1), 141-146.
Tipples, J., Atkinson, A. P., & Young, A. W. (2002). The eyebrow frown: A salient social signal. Emotion, 2(3), 288-296.
Young, A., Perrett, D., Calder, A., Sprengelmeyer, R., & Ekman, P. (2002). Facial Expressions of Emotion – Stimuli and Tests (FEEST). London: Thames Valley Test Company.
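As a supplementary note on the Analysis and Results section: the pairwise comparisons reported there (e.g. angry vs. happy first fixation times, t(11) = 2.01) are paired-samples t-tests. A minimal stdlib-only sketch of that computation is shown below; the fixation times are fabricated illustrative values, not the study's data.

```python
import math
from statistics import mean, stdev


def paired_t(xs, ys):
    """Paired-samples t statistic: t = mean(d) / (sd(d) / sqrt(n)),
    where d are the per-participant differences; returns (t, df)."""
    diffs = [x - y for x, y in zip(xs, ys)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n)), n - 1


# Hypothetical per-participant first fixation times in ms (12 participants,
# matching the df = 11 reported on the poster). Illustration only.
angry = [1180, 1250, 1210, 1190, 1230, 1260, 1200, 1240, 1170, 1220, 1250, 1210]
happy = [1300, 1320, 1290, 1330, 1280, 1340, 1310, 1350, 1270, 1330, 1320, 1300]

t_stat, df = paired_t(angry, happy)  # negative t: angry fixated earlier
```

With real data, the p-values reported on the poster would come from the t distribution with the returned df; a statistics package (rather than this hand-rolled sketch) would normally be used for that step.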