Abstract
Unmanned Aerial Vehicles (UAVs) have become a critical focus in robotics research, particularly in the development of autonomous navigation and target-tracking systems. This journal article provides an overview of a multi-year IEEE-hosted drone competition designed to advance UAV autonomy in complex environments. The competition consisted of two primary challenges. In the first, the Rover Chase Challenge, a UAV was tasked with autonomously tracking and following a ground rover as it maneuvered through an obstacle-filled environment; the drone relied on onboard sensors such as cameras and LiDAR to estimate the rover's trajectory and adjust its flight path accordingly. In the second, the Maze Navigation Challenge, the UAV navigated a structured maze using LiDAR-based environment mapping and obstacle avoidance, without relying on external positioning systems such as GPS. Developing robust autonomous drone algorithms for such tasks requires extensive data collection, simulation, and testing, which can be costly and time-intensive. To address this, competitors completed both challenges in a PX4-Gazebo-based simulator. The resulting dataset includes sensor data recorded in rosbag format, comprising LiDAR, IMU, GPS, and other telemetry readings. It enables researchers to benchmark algorithms, conduct reproducible experiments, and develop robust UAV autonomy, perception, and GPS-denied navigation systems in both simulated and real-world contexts.