An intelligent framework for visually impaired people through an indoor object detection-based assistive system using YOLO with recurrent neural networks


Abstract

Vision is a fundamental sense that profoundly impacts daily life and independence. For visually impaired people (VIP), the absence or impairment of this sense presents significant challenges, particularly the inability to navigate their environment and identify objects independently, which restricts their daily activities and routines. Various investigations have been conducted in the domain of real-time object detection (OD) using deep learning (DL), and DL-based techniques have been shown to attain strong performance in OD. In this manuscript, an Intelligent Object Detection-Based Assistive System Using Ivy Optimization Algorithm (IODAS-IOA) model is proposed for VIPs. The aim is to develop an effective OD system that helps visually impaired persons navigate their environment safely and independently. To achieve this, the image pre-processing stage first employs the Gaussian filtering (GF) method to eliminate noise. The YOLOv12 method is then employed for the OD process, and the DenseNet161 method is used for feature extraction. In addition, the bidirectional gated recurrent unit with attention mechanism (BiGRU-AM) method is applied to classify the extracted features. Finally, parameter tuning is performed using the Ivy optimization algorithm (IOA) to improve classification performance. Extensive experimentation was performed to validate the performance of the IODAS-IOA approach on the indoor OD dataset. The experimental results show that the IODAS-IOA approach attains a superior accuracy of 99.74% compared with recent techniques.
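For concreteness, the denoising, feature-extraction, and classification stages of the described pipeline can be sketched in PyTorch as follows. This is a minimal illustrative sketch under stated assumptions, not the authors' implementation: the YOLOv12 detection stage is replaced by a synthetic stand-in image, the additive attention formulation, layer sizes, and class count are assumptions, and the Ivy optimization (IOA) tuning stage is omitted.

```python
# Minimal illustrative sketch of the described pipeline (not the authors' code).
# Assumes PyTorch, torchvision, and OpenCV; the YOLOv12 detector is replaced by
# a synthetic stand-in image, and the IOA hyperparameter-tuning stage is omitted.
import cv2
import torch
import torch.nn as nn
import torchvision.models as models
import torchvision.transforms as T


class BiGRUAttentionClassifier(nn.Module):
    """Bidirectional GRU with a simple additive attention head (assumed formulation)."""

    def __init__(self, feat_dim: int, hidden: int = 256, num_classes: int = 10):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)          # scores each time step
        self.fc = nn.Linear(2 * hidden, num_classes)  # final class logits

    def forward(self, seq: torch.Tensor) -> torch.Tensor:
        out, _ = self.gru(seq)                        # (B, T, 2*hidden)
        weights = torch.softmax(self.attn(out), dim=1)
        context = (weights * out).sum(dim=1)          # attention-weighted summary
        return self.fc(context)


def preprocess(bgr_image):
    """Gaussian filtering for denoising, then normalisation for DenseNet161."""
    denoised = cv2.GaussianBlur(bgr_image, (5, 5), sigmaX=1.0)
    rgb = cv2.cvtColor(denoised, cv2.COLOR_BGR2RGB)
    transform = T.Compose([
        T.ToTensor(),
        T.Resize((224, 224)),
        T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])
    return transform(rgb).unsqueeze(0)                # (1, 3, 224, 224)


def extract_features(batch: torch.Tensor) -> torch.Tensor:
    """DenseNet161 convolutional features, flattened into a short sequence."""
    # Use DenseNet161_Weights.DEFAULT for ImageNet-pretrained weights.
    backbone = models.densenet161(weights=None)
    backbone.eval()
    with torch.no_grad():
        fmap = backbone.features(batch)               # (1, 2208, 7, 7)
    # Treat each spatial position as one "time step" for the BiGRU.
    return fmap.flatten(2).transpose(1, 2)            # (1, 49, 2208)


if __name__ == "__main__":
    # In the full system, this crop would come from YOLOv12 detections on the
    # indoor scene; here a synthetic image is used as a stand-in.
    image = (torch.rand(480, 640, 3).numpy() * 255).astype("uint8")
    seq = extract_features(preprocess(image))
    classifier = BiGRUAttentionClassifier(feat_dim=seq.shape[-1])
    logits = classifier(seq)
    print("predicted class:", logits.argmax(dim=1).item())
```

In the full IODAS-IOA system, the IOA stage would additionally search over training hyperparameters (e.g., the GRU hidden size or learning rate) to improve classification performance; those values are fixed here for brevity.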
