The addition of a spatial auditory cue improves spatial updating in a virtual reality navigation task

Abstract

Auditory cues are integrated with visual and body-based self-motion cues for motion perception, balance, and gait, yet little research has evaluated their effectiveness for navigation. Here, we tested whether an auditory cue co-localized with a visual target could improve spatial updating in a virtual reality homing task. Participants completed a triangular homing task with and without an easily localizable spatial audio signal co-located with the home location. The main outcome was unsigned angular error: the absolute difference between the participant's turning response and the correct heading toward the home location. Angular error was significantly smaller with the spatial sound than with an identical head-fixed auditory signal (22.79° vs. 30.09°). Participants who performed worst without the spatial sound showed the greatest improvement when it was added. These results suggest that auditory cues may benefit navigation, particularly for navigators with the poorest spatial updating in the absence of spatial sound.
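The outcome measure described above can be sketched in a few lines. This is a minimal illustration, not the authors' analysis code; in particular, wrapping the difference into the 0–180° range is an assumption (the abstract only states "absolute value of the difference"), though it is the standard convention for angular error:

```python
def unsigned_angular_error(response_deg: float, correct_deg: float) -> float:
    """Unsigned angular error between a turning response and the correct
    heading, in degrees.

    Assumption: error is wrapped to [0, 180] so that, e.g., responses of
    350 deg vs. a correct heading of 10 deg give 20 deg, not 340 deg.
    """
    diff = abs(response_deg - correct_deg) % 360.0
    return min(diff, 360.0 - diff)

# Example: a response of 350 deg against a correct heading of 10 deg
# yields an error of 20 deg.
print(unsigned_angular_error(350.0, 10.0))
```

Averaging this quantity across trials per condition would yield summary values like the 22.79° and 30.09° reported above.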
