Abstract
Current models of emotion simulation propose that intentionally posing a facial expression can change one's subjective feelings, which in turn influences the processing of visual input. However, the underlying neural mechanism whereby one's own facial emotion modulates visual cortical responses to others' facial expressions remains unknown. To understand how one's facial expression affects visual processing, we measured participants' visual evoked potentials (VEPs) during a facial emotion judgment task involving positive and neutral faces. To control for the effects of facial muscle activity on VEPs, we asked participants, in separate blocks, to smile (adopting an expression of happiness), to purse their lips (incompatible with smiling) or to pose a neutral face. Results showed that the smiling expression modulates face-specific visual processing components (N170/vertex positive potential) elicited by viewing others' facial expressions. Specifically, when making a happy expression, participants processed neutral faces similarly to happy faces. When making a neutral expression or pursing the lips, however, responses to neutral and happy faces differed significantly. This effect was source-localized to multisensory associative areas: the angular gyrus, associative visual cortex and somatosensory cortex. We provide novel evidence that one's own emotional expression acts as a top-down influence modulating low-level neural encoding during face perception.
Original language | English |
---|---|
Pages (from-to) | 1316-1322 |
Number of pages | 7 |
Journal | Social Cognitive and Affective Neuroscience |
Volume | 10 |
Issue number | 10 |
Early online date | 24 Feb 2015 |
DOIs | — |
Publication status | Published - Oct 2015 |
Keywords
- face processing
- emotional embodiment
- facial feedback
- VEPs
- N170