Enhancing Walk-Light Detector Usage for the Visually Impaired: A Comparison of VR Exploration and Verbal Instructions



Abstract

People with visual impairments (PVI) increasingly rely on camera-enabled smartphone apps for tasks such as photography, navigation, and text recognition. Despite the growing use of these applications, precise camera aiming remains a significant challenge. This study compares virtual reality (VR) exploration with traditional text/audio (TA) instructions for learning to use a walk-light detector app at traffic intersections. We developed a VR exploration tool based on insights gathered from interviews with PVI, then conducted a user study with 13 PVI participants divided into two groups: VR exploration and TA instructions. Following indoor training with their respective approach, participants from both groups used the walk-light detector app outdoors. According to participants' subjective feedback, a higher proportion of the TA group found the training easier, potentially due to shortcomings in our VR protocol and differences between the real world and VR. However, compared with the TA group, more VR participants gained insights into walk-light detection and reported that they would have been unable to use the detector without the VR training.
