Data-Driven Methods to Diversify Knowledge of Human Psychology


Data-Driven Methods to Diversify Knowledge of Human Psychology. Rachael E. Jack, Carlos Crivelli, Thalia Wheatley. Trends in Cognitive Sciences, Volume 22, Issue 1, Pages 1–5 (January 2018). DOI: 10.1016/j.tics.2017.10.002. Copyright © 2017 The Author(s).

Figure I. Recognition Accuracy of the Six Universal Facial Expressions of Emotion across Cultures. (A) Color-coded circles show the average recognition accuracy of each of the six universal facial expressions of emotion in different parts of the world (see color bar to the right). Circles outlined in black indicate performance below 75%. Figure reproduced, with permission, from [14]. (B) Recognition accuracy of the six universal facial expressions of emotion in Western and East Asian culture. Each face shows an example of the standardized universal facial expressions of the six basic emotions (see labels above each face) used in these studies. Blue and red bars to the left of each face show the variance of recognition accuracy (median absolute deviation) in Western (blue) and East Asian (red) culture. Trends in Cognitive Sciences 2018, 22, 1–5. DOI: 10.1016/j.tics.2017.10.002. Copyright © 2017 The Author(s).

Figure I. (A) Data-Driven Modeling of Dynamic Facial Expressions of Emotion. On each experimental trial, a dynamic facial expression generator [10] creates a random facial animation by randomly sub-sampling a set of individual face movements called Action Units (AUs) – here, Upper Lid Raiser (AU5), Nose Wrinkler (AU9), and Upper Lip Raiser (AU10) – and by assigning a random movement to each AU using several temporal parameters (see labels illustrating the red curve). The randomly activated AUs are then combined to produce a facial animation (see bottom row of faces), which the cultural observer categorizes by emotion and rates by intensity (here, 'disgust' at 'medium' intensity) if it matches their prior knowledge of that facial expression; otherwise, they select 'other'. Statistical tools can then be used to model the relationship between the dynamic AU patterns presented on each trial and the observer's responses, yielding a precise mathematical model of the face movement patterns that communicate emotions to individuals within a society. These models can then be analyzed to reveal the cross-cultural facial expression patterns and their specific accents. (B) Cross-cultural facial expression patterns and specific accents. Color-coded face maps (top row) show four core facial expression patterns associated with 60+ emotions across two distinct societies (Western and East Asian). The bottom row of faces shows the specific accents that modify each core expressive pattern to create different facial expressions such as 'contempt', 'shame', 'ecstatic', and 'fury'. Trends in Cognitive Sciences 2018, 22, 1–5. DOI: 10.1016/j.tics.2017.10.002. Copyright © 2017 The Author(s).
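The trial-by-trial logic described in the legend above can be sketched in code. The following is a minimal, illustrative simulation — not the authors' actual pipeline: the AU set, the toy observer, and its thresholds are assumptions chosen only to show how random AU sampling plus reverse correlation can recover the face movements diagnostic of an emotion category.

```python
import random

# Hypothetical AU set (AU numbers follow FACS conventions; this
# particular selection is an illustrative assumption).
AUS = ["AU4_BrowLowerer", "AU5_UpperLidRaiser", "AU9_NoseWrinkler",
       "AU10_UpperLipRaiser", "AU12_LipCornerPuller"]

def random_animation(rng):
    """One trial's stimulus: a random AU subset, each AU given a
    random peak amplitude in [0, 1)."""
    chosen = rng.sample(AUS, k=rng.randint(1, len(AUS)))
    return {au: rng.random() for au in chosen}

def simulated_observer(animation):
    """Toy observer: responds 'disgust' only when nose wrinkling and
    upper-lip raising are both strongly present; otherwise 'other'."""
    if (animation.get("AU9_NoseWrinkler", 0.0) > 0.5
            and animation.get("AU10_UpperLipRaiser", 0.0) > 0.5):
        return "disgust"
    return "other"

def reverse_correlate(n_trials=20000, seed=1):
    """Score each AU by the difference in its mean amplitude on
    'disgust' versus 'other' trials (a simple reverse-correlation
    estimate of how diagnostic that AU is for the category)."""
    rng = random.Random(seed)
    sums = {r: {au: 0.0 for au in AUS} for r in ("disgust", "other")}
    counts = {"disgust": 0, "other": 0}
    for _ in range(n_trials):
        anim = random_animation(rng)
        resp = simulated_observer(anim)
        counts[resp] += 1
        for au in AUS:
            sums[resp][au] += anim.get(au, 0.0)
    return {au: sums["disgust"][au] / max(counts["disgust"], 1)
              - sums["other"][au] / max(counts["other"], 1)
            for au in AUS}

model = reverse_correlate()
diagnostic = max(model, key=model.get)
```

In this toy setup the two AUs the simulated observer actually uses receive much larger weights than the irrelevant ones, mirroring how the real method lets observers' culturally specific criteria emerge from random stimulation rather than from experimenter-imposed expression prototypes.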