Population anisotropy in area MT explains a perceptual difference between near and far disparity motion segmentation


Abstract

Segmentation of the visual scene into relevant object components is a fundamental process for successfully interacting with our surroundings. Many visual cues, including motion and binocular disparity, support segmentation, yet the mechanisms that exploit these cues are unclear. We used a psychophysical motion discrimination task in which noise dots were displaced in depth to investigate the role of disparity cues in segmenting visual motion stimuli (experiment 1). We found a subtle but significant bias: near-disparity noise disrupted the segmentation of motion more than equidistant far-disparity noise. A control experiment showed that this near-far difference could not be attributed to attention (experiment 2). To account for the near-far bias, we constructed a biologically constrained model based on recordings from neurons in the middle temporal area (MT) and used it to simulate human observers' performance on experiment 1. The MT-neuron model showed a near-disparity skew similar to that of human observers. To isolate the cause of the skew, we simulated a model containing units derived from the properties of MT neurons, using phase-modulated Gabor disparity tuning. With a skewed-normal population distribution of preferred disparities, the model reproduced the elevated motion discrimination thresholds for near-disparity noise, whereas a skewed-normal population of phases (creating individually asymmetric units) did not produce any performance skew. These results suggest that the properties of neurons in area MT are computationally sufficient to perform disparity segmentation during motion processing and yield disparity biases similar to those shown by human observers.
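The two model components named in the abstract, phase-modulated Gabor disparity tuning and a skewed-normal population of preferred disparities, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: all parameter values (`sigma`, `freq`, `alpha`, etc.) are hypothetical, and the skew-normal sampler uses the standard delta-construction rather than any fitted MT data.

```python
import numpy as np

def gabor_tuning(disparity, pref, sigma=0.5, freq=0.8, phase=0.0, amp=1.0, base=0.1):
    """Phase-modulated Gabor disparity tuning curve (illustrative parameters).

    A Gaussian envelope centered on the preferred disparity multiplies a
    cosine carrier; varying `phase` makes individual units asymmetric.
    """
    envelope = np.exp(-(disparity - pref) ** 2 / (2.0 * sigma ** 2))
    carrier = np.cos(2.0 * np.pi * freq * (disparity - pref) + phase)
    return base + amp * envelope * carrier

def sample_skew_normal(n, alpha, loc=0.0, scale=1.0, seed=None):
    """Draw skew-normal samples via X = delta*|U0| + sqrt(1-delta^2)*U1.

    alpha < 0 skews the distribution toward negative (near) disparities.
    """
    rng = np.random.default_rng(seed)
    delta = alpha / np.sqrt(1.0 + alpha ** 2)
    u0 = rng.standard_normal(n)
    u1 = rng.standard_normal(n)
    return loc + scale * (delta * np.abs(u0) + np.sqrt(1.0 - delta ** 2) * u1)

# A population whose preferred disparities are skewed toward near (negative)
# values, as in the model variant that reproduced the human bias.
prefs = sample_skew_normal(1000, alpha=-4.0, scale=0.6, seed=0)
```

With `phase=0` the tuning curve peaks at the preferred disparity, so skewing the population of `prefs` (rather than the phases) is what shifts the population response toward near disparities.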
