Rapid visual engagement in neural processing of detailed touch interactions

Abstract

Touch perception is an inherently multisensory process in which vision plays an essential role. However, our understanding of how vision encodes sensory and emotional-affective aspects of observed touch, and the timing of these processes, remains limited. To address this gap, we investigated the neural dynamics of visual touch perception using electroencephalographic (EEG) recordings from participants who viewed videos depicting detailed tactile hand interactions from the Validated Touch-Video Database. We examined how the brain encodes basic body cues, such as hand orientation and viewing perspective, in addition to sensory aspects, including the type of touch (e.g., stroking vs. pressing; hand vs. object touch) and the object involved (e.g., knife, brush), as well as emotional-affective dimensions. Using multivariate decoding, we found that information about body cues emerged within approximately 60 ms, with information about sensory details and valence emerging around 110-160 ms, demonstrating efficient early visual encoding. Information about arousal, threat, and pain was most clearly identified by approximately 260 ms, suggesting that such evaluations require slightly extended neural engagement. Frequency decoding revealed that body cues were processed across a broad spectral range, with strongest contributions in the theta, alpha, and low beta bands (~6-20 Hz), while sensory and emotional-affective features were primarily reflected in delta, theta, and alpha frequencies (~1-13 Hz). Our findings reveal that bottom-up, automatic visual processing is integral to complex tactile assessments, important for rapidly extracting both the personal relevance and the sensory and emotional dimensions of visually observed touch.
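The time-resolved multivariate decoding described above is typically implemented by training a classifier at each time point of the EEG epoch and tracking when decoding accuracy rises above chance. The paper's exact pipeline is not specified here; the sketch below uses synthetic data and an assumed scikit-learn setup (logistic regression, 5-fold cross-validation) purely to illustrate the approach.

```python
# Illustrative sketch of time-resolved multivariate decoding on EEG-like
# data. All data and parameters are synthetic assumptions, not the
# authors' actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic epochs: (trials, channels, time points), two stimulus classes.
n_trials, n_channels, n_times = 80, 32, 50
X = rng.standard_normal((n_trials, n_channels, n_times))
y = np.repeat([0, 1], n_trials // 2)
# Inject a class-dependent signal from time index 20 onward, mimicking
# information that emerges at a fixed latency after stimulus onset.
X[y == 1, :, 20:] += 0.8

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Decode separately at each time point: accuracy hovers near chance (0.5)
# before the signal onset and rises above it afterwards. The first
# above-chance time point estimates when the information becomes available.
accuracy = np.array([
    cross_val_score(clf, X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])
print("pre-onset mean:", accuracy[:20].mean())
print("post-onset mean:", accuracy[25:].mean())
```

In practice, studies like this one run such a classifier over hundreds of time samples per epoch and assess significance against chance with permutation or cluster-based statistics; frequency decoding follows the same logic after band-pass filtering or time-frequency decomposition of the signal.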