Functional Identification of Language-Responsive Channels in Individual Participants in MEG Investigations


Abstract

Making meaningful inferences about the functional architecture of the language system requires the ability to refer to the same neural units across individuals and studies. Traditional brain imaging approaches align and average brains in a common space. However, the lateral frontal and temporal cortices, where the language system resides, are characterized by high structural and functional inter-individual variability, which reduces the sensitivity and functional resolution of group-averaging analyses. This issue is compounded by the fact that language areas lie in close proximity to regions of other large-scale networks with different functional profiles. A solution inspired by visual neuroscience is to identify language areas functionally in each individual brain using a 'localizer' task (e.g., a language comprehension task). This approach has proven productive in fMRI, yielding a number of robust and replicable findings about the language system. Here, we extend this approach to MEG. Across two experiments (one in Dutch speakers, n=19; one in English speakers, n=23), we examined neural responses to sentences and to a control condition (nonword sequences). In both the time and frequency domains, the topography of neural responses to language was spatially stable within individuals but varied across individuals. Consequently, analyses that take this inter-individual variability into account achieve greater sensitivity than group-level analyses. In summary, as in fMRI, functional identification within individuals yields benefits in MEG, opening the door to future investigations of language processing, including questions where whole-brain coverage and temporal resolution are both critical.
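The core logic of the localizer approach can be illustrated with a toy simulation. The sketch below is not the authors' analysis pipeline; it is a minimal, hypothetical example (NumPy only, all parameter values invented) of why selecting each participant's own most language-responsive channels, via a sentences-minus-nonwords contrast, yields a larger measured effect than averaging over one fixed channel set chosen at the group level when responsive channels vary in location across individuals.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_channels, top_k = 10, 50, 5  # hypothetical sizes

# Ground truth: each simulated subject has 5 language-responsive
# channels, but their locations differ across subjects.
effects = np.zeros((n_subjects, n_channels))
for s in range(n_subjects):
    responsive = rng.choice(n_channels, size=top_k, replace=False)
    effects[s, responsive] = 1.0

# Observed sentences-minus-nonwords contrast per channel (true effect + noise).
contrast = effects + rng.normal(0.0, 0.3, size=effects.shape)

# Group-level analysis: pick the top-k channels of the group-average
# contrast and apply that one channel set to every subject.
group_top = np.argsort(contrast.mean(axis=0))[-top_k:]
group_effect = contrast[:, group_top].mean()

# Individual-level analysis: pick each subject's own top-k channels.
# (A real analysis would select channels on independent data, e.g. odd
# runs, and measure the effect in even runs, to avoid circularity.)
indiv_effect = np.mean([np.sort(contrast[s])[-top_k:].mean()
                        for s in range(n_subjects)])

print(f"group-defined channels:      {group_effect:.2f}")
print(f"individually defined channels: {indiv_effect:.2f}")
```

Because the responsive channels overlap little across simulated subjects, the group-average contrast is diluted and its top channels capture only a fraction of each individual's true effect, mirroring the sensitivity advantage the abstract reports for individual-level functional identification.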
