Abstract
In sign languages, linguistic information is transmitted through the simultaneous movement of several bodily articulators. This study investigates cortical tracking of sign language and whether experience and knowledge of sign language modulate this tracking. We used a camera with a depth sensor to record videos of semi-spontaneous sign language narratives while tracking the articulators' movement in 3D space. These videos served both to characterize the temporal periodicity of the sign language visual signal and as stimuli for the experiment. Using magnetoencephalography (MEG), we recorded the neurophysiological activity of two groups of hearing participants (proficient signers and sign-naive individuals) while they watched videos in a known and an unknown sign language. Coherence between the preprocessed MEG data and the visual linguistic signal extracted from the different articulators was used as a measure of brain-language tracking. The results show that neural activity tracks sign language input in the delta frequency band (0.5 to 2.5 Hz), reflecting the slow periodicity of articulator movements. Both groups of participants show similar tracking in occipital areas, reflecting low-level visual processing of the videos. Proficient signers show stronger synchronization than sign-naive controls for linguistically relevant articulators in the right temporal cortex, and greater tracking for the known than for the unknown sign language. These findings confirm that cortical tracking of language is a feature of language processing beyond the auditory domain and that it is modulated by language experience.
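The brain-language coherence measure described in the abstract can be sketched in a few lines with synthetic data. This is a minimal illustration, not the study's pipeline: the sampling rate, noise levels, and 100 ms neural lag below are invented for the example, and a simulated "MEG channel" stands in for real sensor data. It shows magnitude-squared coherence between an articulator-speed signal with ~1 Hz periodicity and a noisy signal that tracks it, averaged over a 0.5 to 2.5 Hz delta band.

```python
import numpy as np
from scipy.signal import coherence

rng = np.random.default_rng(0)
fs = 200.0                        # sampling rate in Hz (illustrative)
t = np.arange(0, 120, 1 / fs)     # two minutes of signal

# Simulated articulator speed: slow ~1 Hz periodicity plus noise
articulator = np.sin(2 * np.pi * 1.0 * t) + 0.5 * rng.standard_normal(t.size)

# Simulated MEG channel: follows the articulator with a 100 ms lag, plus noise
lag = int(0.1 * fs)
meg = np.roll(articulator, lag) + rng.standard_normal(t.size)

# Magnitude-squared coherence; nperseg of 10 s gives ~0.1 Hz resolution
f, Cxy = coherence(articulator, meg, fs=fs, nperseg=int(10 * fs))

# Restrict to the delta band of interest (0.5 to 2.5 Hz)
delta = (f >= 0.5) & (f <= 2.5)
print("peak delta coherence:", Cxy[delta].max())
print("peak frequency:", f[delta][np.argmax(Cxy[delta])])
```

In this toy setup, coherence peaks near the shared 1 Hz rhythm and stays low elsewhere; in the study, the same logic is applied per sensor and per articulator, and the resulting coherence values are compared across groups and languages.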