Abstract
Dissolved Gas Analysis (DGA) of transformer oil is a critical technique for transformer fault diagnosis: by analyzing the concentrations of dissolved gases, potential transformer faults can be detected in a timely manner. To address the large parameter counts and high computational resource demands of deep models in transformer DGA diagnostics, this study proposes a lightweight convolutional neural network (CNN) model that improves on gas ratio methods by combining Knowledge Distillation (KD) with recurrence plots. The approach first extracts features from DGA data using the ratio method and multiscale sample entropy (MSE), then reconstructs the state space of the feature data with recurrence plots to generate interpretable two-dimensional image features. Deep features are extracted by a ResNet50 model integrated with the Convolutional Block Attention Module (CBAM). The Sparrow Search Algorithm (SSA) is then applied to optimize the hyperparameters of the ResNet50 model, which is trained on the DGA data as the teacher model. Finally, a dual-path distillation mechanism transfers the salient features and knowledge from the teacher model to the student model, MobileNetV3-Large. Experimental results show that the distilled model reduces memory usage by 83.5% and computation time by 73.2%, significantly lowering computational complexity while achieving favorable performance across various evaluation metrics. This provides a novel technical solution for improving gas ratio methods.
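The recurrence-plot step described above can be sketched as follows. This is a minimal, generic illustration of turning a 1-D feature vector (e.g., gas ratios and MSE values) into a binary 2-D recurrence image via time-delay embedding; the embedding dimension, delay, and threshold here are illustrative defaults, not the paper's settings, and `recurrence_plot` is a hypothetical helper name.

```python
import numpy as np

def recurrence_plot(x, dim=2, tau=1, eps=None):
    """Binary recurrence plot of a 1-D series via time-delay embedding.

    dim, tau, and eps are illustrative choices, not the paper's values.
    """
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau  # number of reconstructed state vectors
    # Time-delay embedding: each row is one state vector in phase space
    states = np.array([x[i : i + (dim - 1) * tau + 1 : tau] for i in range(n)])
    # Pairwise Euclidean distances between all state vectors
    d = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    if eps is None:
        eps = 0.1 * d.max()  # heuristic threshold on recurrence distance
    # Recurrence matrix: 1 where two states are closer than eps
    return (d <= eps).astype(np.uint8)

# Example: a short vector of DGA-derived features (values are made up)
feats = [0.2, 0.5, 0.9, 0.4, 0.1, 0.6, 0.8, 0.3, 0.7]
rp = recurrence_plot(feats)
print(rp.shape)  # (8, 8)
```

The resulting binary matrix can be saved or tiled as an image and fed to the CNN; in practice the threshold `eps` is tuned so the image is neither saturated nor empty.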