Abstract
This paper addresses real-time performance issues in assistive mobility systems by presenting a qualitative histogram of oriented gradients (HOG)-based visual servoing (QHOGVS) approach for autonomous wheelchair navigation. A computationally efficient HOG-based visual servoing implementation is proposed that enables real-time performance on low-power hardware while preserving feature visibility during navigation. The proposed method improves processing speed by 68×, from 0.08 FPS to 5.5 FPS on a Raspberry Pi, while maintaining stable trajectory tracking. This speedup is achieved by optimizing the pipeline with Python vectorization and by incorporating an adaptive activation function to modulate error convergence. Experimental validation, including real-world sequential target-reaching and corridor-following tasks, demonstrates the system's functionality on a physical wheelchair platform. By providing insights into the trade-offs between computational efficiency and navigation precision in embedded systems, this work bridges the gap between theoretical visual servoing principles and practical assistive technology applications. The findings demonstrate the feasibility of the approach and point to clear avenues for further research in adaptive visual navigation for people with limited mobility.