Language Experience Predicts Eye Movements During Online Auditory Comprehension


Abstract

Experience-based theories of language processing suggest that listeners use the properties of their previous linguistic input to constrain comprehension in real time (e.g., MacDonald & Christiansen, 2002; Smith & Levy, 2013; Stanovich & West, 1989; Mishra, Pandey, Singh, & Huettig, 2012). This project tests the prediction that individual differences in language experience yield differences in sentence comprehension. Participants completed a visual world eye-tracking task following Altmann and Kamide (1999), which manipulates whether the verb licenses the anticipation of a specific referent in the scene (e.g., "The boy will eat/move the cake"). Within this paradigm, we ask: (1) are there reliable individual differences in language-mediated eye movements during this task? If so, (2) do individual differences in language experience correlate with these differences, and (3) can this relationship be explained by other, more general cognitive abilities? Study 1 finds evidence that language experience predicts an overall facilitation in fixating the target, and Study 2 replicates this effect and finds that it remains when controlling for working memory, inhibitory control, phonological ability, and perceptual speed.
