Areas Recruited during Action Understanding Are Not Modulated by Auditory or Sign Language Experience

Yuxing Fang, Quanjing Chen, Angelika Lingnau, Zaizhu Han, Yanchao Bi

Research output: Contribution to journal › Article › peer-review

Abstract

The observation of other people’s actions recruits a network of areas including the inferior frontal gyrus (IFG), the inferior parietal lobule (IPL), and the posterior middle temporal gyrus (pMTG). These regions have been shown to be activated through both visual and auditory inputs. Intriguingly, previous studies found no engagement of the IFG and IPL in deaf participants during nonlinguistic action observation, leading to the proposal that auditory experience or sign language usage might shape the functionality of these areas. To understand which variables induce plastic changes in areas recruited during the processing of other people’s actions, we examined the effects of task (action understanding vs. passive viewing) and effector (arm actions vs. leg actions), as well as sign language experience, in a group of 12 congenitally deaf signers and 13 hearing participants. In Experiment 1, we found stronger activation during an action recognition task than during a low-level visual control task in the IFG, IPL, and pMTG in both deaf signers and hearing individuals, but no effect of auditory or sign language experience. In Experiment 2, we replicated the results of the first experiment using a passive viewing task. Together, our results provide robust evidence that the response obtained in the IFG, IPL, and pMTG during action recognition and passive viewing is not affected by auditory or sign language experience, adding further support for the supra-modal nature of these regions.
Original language: English
Article number: 94
Pages (from-to): 1-13
Number of pages: 13
Journal: Frontiers in Human Neuroscience
Volume: 10
DOIs
Publication status: Published - 8 Mar 2016