Abstract
Improving the detection accuracy of widely adopted, stable object detectors while remaining cost-effective has long been a topic of significant interest within the industry. To address this challenge, this paper proposes a general relation-based self-distillation framework for object detection that helps existing detectors achieve a better balance between accuracy and overhead. In contrast to existing self-distillation methods, our framework integrates relation-based knowledge into the self-distillation process. To this end, we propose a relation-based self-distillation method and design a targeted optimization strategy in the form of an adaptive filtering strategy. The relation-based self-distillation method guides the detector to focus on differences in how the same type of object is represented across scenarios, while the adaptive filtering strategy discards low-confidence predictions before the matching mechanism is invoked, keeping training efficient. Extensive experiments show that our method significantly improves the accuracy of existing detectors and reduces their redundant predictions without increasing their computational resource overhead.
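To make the two components concrete, the following is a minimal sketch of (a) a pairwise relation-matrix distillation loss and (b) confidence-based pre-filtering before matching. The function names, shapes, threshold, and the use of cosine-similarity matrices are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def relation_matrix(feats):
    # Cosine-similarity ("relation") matrix between instance embeddings.
    # feats: (N, D) array, one row per detected object (assumed shape).
    normed = feats / np.linalg.norm(feats, axis=1, keepdims=True)
    return normed @ normed.T

def relation_distillation_loss(student_feats, teacher_feats):
    # MSE between student and teacher pairwise relation matrices:
    # the student is pushed to mimic the teacher's inter-object relations
    # rather than its raw per-object features.
    return np.mean((relation_matrix(student_feats)
                    - relation_matrix(teacher_feats)) ** 2)

def adaptive_filter(scores, preds, thresh=0.3):
    # Drop low-confidence predictions before the (costly) matching step;
    # thresh=0.3 is a hypothetical cutoff for illustration.
    keep = scores >= thresh
    return scores[keep], preds[keep]
```

A real implementation would compute the relation loss between the detector's own earlier and later representations (self-distillation) inside the training loop; the sketch only shows the loss shape and the filtering step.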