Abstract
In this paper, we present a stair detection algorithm verified on several platforms, including a real quadruped robot. The proposed approach detects dangerous situations, such as proximity to stairs, in real time, enabling the robot to move safely. In our research, we used data collected by a moving four-legged robot, recorded in 18 sequences containing more than 21,000 color images from two cameras positioned at different perspectives. To address the challenge, we employ a deep neural network with RGB inputs for object detection, complemented by pre-processing and post-processing. A key feature of this approach is its adaptability to varying camera views, including both the front and bottom perspectives on the robot, with training that incorporates multi-camera images from both views. We implemented and tested the algorithm on the Unitree Go1 robot, as well as on other embedded platforms. Training a YOLO version 11n network on a single sequence and testing on the remaining 17 sequences, we achieved an average mAP@50 of 51.30 for images containing only stairs and 87.56 for all images. This method enables early hazard detection during stair navigation. The proposed evaluation scenario tests the model's generalization from a single training sequence to multiple unseen sequences, extending existing stair detection methods for quadrupedal robots. The dataset exhibits high variability in stair appearance due to the robot's changing perspective, and the embedded platforms impose limited real-time processing capacity.