Published December 2012 | public
Journal Article

Nonconscious emotional processing involves distinct neural pathways for pictures and videos

Abstract

Facial expressions are known to impact observers' behavior, even when they are not consciously identifiable. Relying on visual crowding, a perceptual phenomenon whereby peripheral faces become indiscriminable, we show that participants exposed to happy vs. neutral crowded faces rated the pleasantness of subsequent neutral targets according to the facial expression's valence. Using functional magnetic resonance imaging (fMRI) along with psychophysiological interaction analysis, we investigated the neural determinants of this nonconscious preference bias, induced either by static (i.e., pictures) or dynamic (i.e., videos) facial expressions. We found that while static expressions activated primarily the ventral visual pathway (including task-related functional connectivity between the fusiform face area and the amygdala), dynamic expressions triggered the dorsal visual pathway (i.e., posterior parietal cortex) and the substantia innominata, a structure contiguous with the dorsal amygdala. As temporal cues are known to improve the processing of visible facial expressions, the absence of ventral activation we observed with crowded videos questions the capacity to integrate facial features and facial motion without awareness. Nevertheless, both static and dynamic facial expressions activated the hippocampus and the orbitofrontal cortex, suggesting that nonconscious preference judgments may arise from the evaluation of emotional context and the computation of an aesthetic evaluation.

Additional Information

© 2012 Elsevier Ltd. Received 14 July 2012; received in revised form 22 October 2012; accepted 29 October 2012; available online 6 November 2012. The authors thank two anonymous reviewers for their comments, and Eric Bertasi and Romain Valabregue for their help with data acquisition. This work was supported by a grant from the French Ministry for Higher Education and Research to NF, and grants from the Agence Nationale de la Recherche (ANR) and the European Research Council (ERC) to SK.

Additional details

Created: September 26, 2023
Modified: October 23, 2023