Areas Recruited during Action Understanding Are Not Modulated by Auditory or Sign Language Experience. / Fang, Yuxing; Chen, Quanjing; Lingnau, Angelika; Han, Zaizhu; Bi, Yanchao.

In: Frontiers in Human Neuroscience, Vol. 10, 94, 08.03.2016, p. 1-13.

Research output: Contribution to journal › Article › peer-review

Published

BibTeX

@article{bc13058377b44a79b4661cbbf9d4b1e8,
title = "Areas Recruited during Action Understanding Are Not Modulated by Auditory or Sign Language Experience",
abstract = "The observation of other people{\textquoteright}s actions recruits a network of areas including the inferior frontal gyrus (IFG), the inferior parietal lobule (IPL), and posterior middle temporal gyrus (pMTG). These regions have been shown to be activated through both visual and auditory inputs. Intriguingly, previous studies found no engagement of IFG and IPL for deaf participants during nonlinguistic action observation, leading to the proposal that auditory experience or sign language usage might shape the functionality of these areas. To understand which variables induce plastic changes in areas recruited during the processing of other people{\textquoteright}s actions, we examined the effects of tasks (action understanding and passive viewing) and effectors (arm actions vs. leg actions), as well as sign language experience in a group of 12 congenitally deaf signers and 13 hearing participants. In Experiment 1, we found a stronger activation during an action recognition task in comparison to a low-level visual control task in IFG, IPL and pMTG in both deaf signers and hearing individuals, but no effect of auditory or sign language experience. In Experiment 2, we replicated the results of the first experiment using a passive viewing task. Together, our results provide robust evidence demonstrating that the response obtained in IFG, IPL, and pMTG during action recognition and passive viewing is not affected by auditory or sign language experience, adding further support for the supra-modal nature of these regions.",
author = "Yuxing Fang and Quanjing Chen and Angelika Lingnau and Zaizhu Han and Yanchao Bi",
year = "2016",
month = mar,
day = "8",
doi = "10.3389/fnhum.2016.00094",
language = "English",
volume = "10",
number = "94",
pages = "1--13",
journal = "Frontiers in Human Neuroscience",
issn = "1662-5161",
publisher = "Frontiers Media S.A.",
}

RIS

TY - JOUR

T1 - Areas Recruited during Action Understanding Are Not Modulated by Auditory or Sign Language Experience

AU - Fang, Yuxing

AU - Chen, Quanjing

AU - Lingnau, Angelika

AU - Han, Zaizhu

AU - Bi, Yanchao

PY - 2016/3/8

Y1 - 2016/3/8

N2 - The observation of other people’s actions recruits a network of areas including the inferior frontal gyrus (IFG), the inferior parietal lobule (IPL), and posterior middle temporal gyrus (pMTG). These regions have been shown to be activated through both visual and auditory inputs. Intriguingly, previous studies found no engagement of IFG and IPL for deaf participants during nonlinguistic action observation, leading to the proposal that auditory experience or sign language usage might shape the functionality of these areas. To understand which variables induce plastic changes in areas recruited during the processing of other people’s actions, we examined the effects of tasks (action understanding and passive viewing) and effectors (arm actions vs. leg actions), as well as sign language experience in a group of 12 congenitally deaf signers and 13 hearing participants. In Experiment 1, we found a stronger activation during an action recognition task in comparison to a low-level visual control task in IFG, IPL and pMTG in both deaf signers and hearing individuals, but no effect of auditory or sign language experience. In Experiment 2, we replicated the results of the first experiment using a passive viewing task. Together, our results provide robust evidence demonstrating that the response obtained in IFG, IPL, and pMTG during action recognition and passive viewing is not affected by auditory or sign language experience, adding further support for the supra-modal nature of these regions.

AB - The observation of other people’s actions recruits a network of areas including the inferior frontal gyrus (IFG), the inferior parietal lobule (IPL), and posterior middle temporal gyrus (pMTG). These regions have been shown to be activated through both visual and auditory inputs. Intriguingly, previous studies found no engagement of IFG and IPL for deaf participants during nonlinguistic action observation, leading to the proposal that auditory experience or sign language usage might shape the functionality of these areas. To understand which variables induce plastic changes in areas recruited during the processing of other people’s actions, we examined the effects of tasks (action understanding and passive viewing) and effectors (arm actions vs. leg actions), as well as sign language experience in a group of 12 congenitally deaf signers and 13 hearing participants. In Experiment 1, we found a stronger activation during an action recognition task in comparison to a low-level visual control task in IFG, IPL and pMTG in both deaf signers and hearing individuals, but no effect of auditory or sign language experience. In Experiment 2, we replicated the results of the first experiment using a passive viewing task. Together, our results provide robust evidence demonstrating that the response obtained in IFG, IPL, and pMTG during action recognition and passive viewing is not affected by auditory or sign language experience, adding further support for the supra-modal nature of these regions.

U2 - 10.3389/fnhum.2016.00094

DO - 10.3389/fnhum.2016.00094

M3 - Article

VL - 10

SP - 1

EP - 13

JO - Frontiers in Human Neuroscience

JF - Frontiers in Human Neuroscience

SN - 1662-5161

M1 - 94

ER -