Emotion schemas are embedded in the human visual system.

Journal Article

Theorists have suggested that emotions are canonical responses to situations ancestrally linked to survival. If so, then emotions may be afforded by features of the sensory environment. However, few computational models describe how combinations of stimulus features evoke different emotions. Here, we develop a convolutional neural network that accurately decodes images into 11 distinct emotion categories. We validate the model using more than 25,000 images and movies and show that image content is sufficient to predict the category and valence of human emotion ratings. In two functional magnetic resonance imaging studies, we demonstrate that patterns of human visual cortex activity encode emotion category-related model output and can decode multiple categories of emotional experience. These results suggest that rich, category-specific visual features can be reliably mapped to distinct emotions, and that they are coded in distributed representations within the human visual system.

Cited Authors

  • Kragel, PA; Reddan, MC; LaBar, KS; Wager, TD

Published Date

  • July 24, 2019

Published In

  • Science Advances

Volume / Issue

  • 5 / 7

Start / End Page

  • eaaw4358

PubMed ID

  • 31355334

PubMed Central ID

  • PMC6656543

Electronic International Standard Serial Number (EISSN)

  • 2375-2548

International Standard Serial Number (ISSN)

  • 2375-2548

Digital Object Identifier (DOI)

  • 10.1126/sciadv.aaw4358

Language

  • eng