Location Coding of Tool-Object Pairs Based on Perceptual Grouping: Evidence from Object-Based Correspondence Effect


Abstract

Motor interactions with single objects, as well as with pairs of objects, can be automatically affected by visual asymmetries produced by protruding parts, whether these are handles or not. Performance is typically faster and more accurate when task-defined responses correspond to the location of such protruding parts than when they do not (i.e., object-based spatial correspondence effects). In two experiments we investigated the mechanisms underlying the spatial coding of tool-object pairs in which semantic relatedness and action alignment were orthogonally combined. Centrally presented pictures of "active" tools (depicted as potentially performing their proper action) were paired, on one side, with a "passive" object (the target of the tool's action). We observed S-R correspondence effects that depended on the location of the protruding side of the tool-object pair, not on the location of the non-protruding side of the tool handle. The results thus further support the location-coding account of the effect over the affordance-activation account. The effect emerged only when tool-object pairs belonged to the same semantic category or were correctly aligned for action, with no further interplay between the two factors. This is not consistent with the idea that action links are coded between tool-object pairs and that the resulting action direction interacts with response spatial codes. Instead, we propose that semantic relatedness and action alignment act, independently of each other, as perceptual grouping criteria that allow the basic spatial coding of visual asymmetries to take place. At the neurocognitive level, this invites speculation about independent processing along the ventral and ventro-dorsal streams.
