Convergent BOLD and Beta-Band Activity in Superior Temporal Sulcus and Frontolimbic Circuitry Underpins Human Emotion Cognition

Mbemba Jabbi, Philip D Kohn, Tiffany Nash, Angela Ianni, Christopher Coutlee, Tom Holroyd, Frederick W Carver, Qiang Chen, Brett Cropp, J Shane Kippenhan, Stephen E Robinson, Richard Coppola, Karen F Berman

Abstract

The processing of social information in the human brain is widely distributed neuroanatomically and finely orchestrated over time. However, a detailed account of the spatiotemporal organization of these key neural underpinnings of human social cognition remains to be elucidated. Here, we applied functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG) in the same participants to investigate spatial and temporal neural patterns evoked by viewing videos of facial muscle configurations. We show that observing the emergence of expressions elicits sustained blood oxygenation level-dependent responses in the superior temporal sulcus (STS), a region implicated in processing meaningful biological motion. We also found corresponding event-related changes in sustained MEG beta-band (14-30 Hz) oscillatory activity in the STS, consistent with the possible role of beta-band activity in visual perception. Dynamically evolving fearful and happy expressions elicited early (0-400 ms) transient beta-band activity in sensorimotor cortex that persisted beyond 400 ms, at which time it became accompanied by a frontolimbic spread (400-1000 ms). In addition, individual differences in sustained STS beta-band activity correlated with speed of emotion recognition, substantiating the behavioral relevance of these signals. This STS beta-band activity showed valence-specific coupling with the time courses of facial movements as they emerged into full-blown fearful and happy expressions (negative and positive coupling, respectively). These data offer new insights into the perceptual relevance and orchestrated function of the STS and interconnected pathways in social-emotion cognition.

Trial registration: ClinicalTrials.gov NCT00004571.

Keywords: STS; coupling; emotion; neural; transient.

Published by Oxford University Press 2014. This work is written by (a) US Government employee(s) and is in the public domain in the US.

Figures

Figure 1.
Sustained neural response to facial dynamics. (A) Left: BOLD response to dynamic > static expressions, thresholded at P < 0.001 for display; orange-to-red clusters survived P < 0.05 FDR correction. Right: fMRI time course of the right STS (note the typical 5–6 s hemodynamic delay in the BOLD signal, reflecting the lower temporal resolution of fMRI relative to the right STS beta-band time course measured with MEG in (B), right). (B) Left: MEG beta-band response to facial dynamics, thresholded at P < 0.05 FDR corrected. Right: time course of right STS beta-band activity. (C) Time–frequency results illustrating the frontal right-hemispheric sensor-level spectral distribution from −0.5 s before to 1.5 s after stimulus onset; the additional 0.5 s of poststimulus time was included to set the scale for the full −0.5 to 1.5 s window. Spectral power was predominantly distributed across the theta (4–8 Hz), alpha (8–14 Hz), and beta (14–30 Hz) bands during the 0–1000-ms window of interest, when the facial expression dynamics were shown; note that blue in the time–frequency map indicates beta-band power. (D) Overlap map of the STS and amygdala showing convergent BOLD and beta-band activity, derived from a conjunction analysis of the BOLD and MEG beta-band responses to facial dynamics (conjunction P < 0.0025, uncorrected); the scatter plot depicts the correlation (Spearman's ρ) between beta-band activity and BOLD signal in the left amygdala. Color bars represent t-maps for the various results, and X, Y, Z values denote MNI coordinates.
Figure 2.
Perceptual correlates of sustained and transient beta-band activity. (A) Location of correlations between left STS beta-band activity and emotion (fear and happiness) recognition speed assessed post-MEG. The accompanying graph shows the corresponding correlation (Pearson's R) between reaction times for fear and happiness recognition and each individual's extracted left posterior STS beta-band activity, further illustrating the relationship identified in the whole-brain regression analysis. (B) Visual (orange) and motor (blue) beta-band activity elicited by the first 200 ms of viewing the facial videos and by the first 200 ms of the gender-identification motor response, respectively. The color bar represents t values in (B); negative t values do not indicate inhibitory responses but rather denote positive t values for motor-evoked relative to visually evoked beta-band activity.
Figure 3.
Transient beta-band activity. Locations and millisecond timing for selected regions in which beta-band activity was significantly recruited during perception of dynamic fearful (A) and happy (B) expressions relative to dynamic neutral expressions during specific 200-ms sliding time windows; see Tables 2 and 3 for complete results.
Figure 4.
Time course of STS beta-band activity coupled with the time course of emerging emotional facial dynamics. (A and B) Top: global facial movement for fearful and happy expressions, respectively, calculated with the PerceptualDiff image analyzer. Middle: time courses of left STS (encircled ROI) beta-band activity evoked by fearful and happy relative to neutral expressions across the entire 1-s viewing epoch. Bottom: scatter plots illustrating the cross-correlation between the STS beta-band activity time courses and the global facial movement of fearful–neutral and happy–neutral expressions over the 400–1000 ms window (gray lines in the middle panels of A and B), during which the expressions fully emerge. The facial-expression time courses (A and B, top) represent the time-dependent, frame-to-frame differential facial movement (relative to neutral dynamic expressions) as the fearful and happy expressions emerged, averaged over the fearful and happy videos, respectively; these stimulus time courses were shown earlier in Jabbi et al. (2013).

Source: PubMed
