Published June 2012
Journal Article

Processing of Facial Emotion in the Human Fusiform Gyrus

Abstract

Electrophysiological and fMRI-based investigations of the ventral temporal cortex of primates provide strong support for regional specialization for the processing of faces. These responses are most frequently found in or near the fusiform gyrus, but there is substantial variability in their anatomical location and response properties. An outstanding question is the extent to which ventral temporal cortex participates in processing dynamic, expressive aspects of faces, a function usually attributed to regions near the superior temporal cortex. Here, we investigated these issues through intracranial recordings from eight human surgical patients. We compared several different aspects of face processing (static and dynamic faces; happy, neutral, and fearful expressions) with power in the high-gamma band (70–150 Hz) from a spectral analysis. Detailed mapping of the response characteristics as a function of anatomical location was conducted in relation to the gyral and sulcal pattern on each patient's brain. The results document responses highly selective for static or dynamic faces, often showing abrupt changes in response properties between spatially close recording sites and response patterns that were idiosyncratic across subjects. Notably, strong responses to dynamic facial expressions can be found in the fusiform gyrus, just as can responses to static faces. The findings suggest a more complex, fragmented architecture of ventral temporal cortex around the fusiform gyrus, one that includes focal regions of cortex that appear relatively specialized for either static or dynamic aspects of faces.

Additional Information

© 2012 Massachusetts Institute of Technology. Posted Online April 30, 2012. We thank all patients for their participation in this study; John Brugge, Mitchell Steinschneider, Jeremy Greenlee, Paul Poon, Rick Reale, and Rick Jenison for advice and comments; Haiming Chen, Chandan Reddy, Fangxiang Chen, Nader Dahdaleh, and Adam Jackson for data collection and care of participants; and Yota Kimura and Joe Hitchon for help with visual stimuli.

Files

Published - Kawasaki2012p18268J_Cognitive_Neurosci.pdf (670.7 kB)
md5:74a9e039ee817a56f8f1e4a770201292

Additional details

Created: August 19, 2023
Modified: October 17, 2023