
Emotion Schemas are Embedded in the Human Visual System

Publication: Journal Article
Kragel, P; Reddan, M; LaBar, K; Wager, T
2018

Theorists have suggested that emotions are canonical responses to situations ancestrally linked to survival. If so, then emotions may be afforded by features of the sensory environment. However, few computationally explicit models describe how combinations of stimulus features evoke different emotions. Here we develop a convolutional neural network that accurately decodes images into 11 distinct emotion categories. We validate the model using over 25,000 images and movies and show that image content is sufficient to predict the category and valence of human emotion ratings. In two fMRI studies, we demonstrate that patterns of human visual cortex activity encode emotion category-related model output and can decode multiple categories of emotional experience. These results suggest that rich, category-specific emotion representations are embedded within the human visual system.
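The abstract gives no implementation details. Purely as an illustration of the kind of model described (a convolutional neural network mapping images to 11 emotion categories), the sketch below repurposes a generic pretrained backbone with an 11-way classification head. The framework (PyTorch), backbone choice, and layer indices are assumptions for the example, not the authors' published architecture or code.

# Illustrative sketch only: an 11-way emotion-category image classifier built by
# replacing the final layer of a pretrained CNN backbone. Backbone, layer names,
# and all hyperparameters are assumptions, not the authors' model.
import torch
import torch.nn as nn
from torchvision import models

NUM_EMOTION_CATEGORIES = 11  # distinct emotion categories, as in the abstract

# Start from a generic pretrained visual backbone (assumed AlexNet-style).
backbone = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)

# Swap the final fully connected layer for an 11-way classification head.
in_features = backbone.classifier[6].in_features
backbone.classifier[6] = nn.Linear(in_features, NUM_EMOTION_CATEGORIES)

# Forward pass: image batch -> per-category scores -> probabilities.
backbone.eval()
images = torch.randn(4, 3, 224, 224)              # placeholder batch of RGB images
with torch.no_grad():
    logits = backbone(images)                     # shape: (4, 11)
probabilities = torch.softmax(logits, dim=1)      # per-image category probabilities
predicted = probabilities.argmax(dim=1)           # most likely emotion category
print(predicted)

In practice the head would be trained on images labeled with emotion categories; the untrained head here is only to show the input/output shape of such a classifier.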

DOI
10.1101/470237

Publication Date
2018

Citation

APA: Kragel, P., Reddan, M., LaBar, K., & Wager, T. (2018). Emotion Schemas are Embedded in the Human Visual System. https://doi.org/10.1101/470237
Chicago: Kragel, Philip, Marianne Reddan, Kevin LaBar, and Tor Wager. “Emotion Schemas are Embedded in the Human Visual System,” 2018. https://doi.org/10.1101/470237.
ICMJE: Kragel P, Reddan M, LaBar K, Wager T. Emotion Schemas are Embedded in the Human Visual System. 2018.
MLA: Kragel, Philip, et al. Emotion Schemas are Embedded in the Human Visual System. 2018. Epmc, doi:10.1101/470237.
NLM: Kragel P, Reddan M, LaBar K, Wager T. Emotion Schemas are Embedded in the Human Visual System. 2018.
