Adversarial class-wise self-knowledge distillation for medical image segmentation


Abstract

Self-knowledge distillation enables knowledge transfer by dynamically constructing the next-stage learning objectives, thus providing more effective path cues for optimizing the compact student. The challenge lies in formulating effective learning objectives for the upcoming stage that mitigate the interference of inter-class similarity in medical image segmentation. This paper presents Adversarial Class-Wise Self-Knowledge Distillation (ACW-SKD). ACW-SKD leverages an auxiliary head to generate coarse segmentation results, which are then used as prediction masks to refine class-wise features; inter-class similarity is subsequently mitigated via class-wise feature distillation. A feature reconstruction module (FRM) is introduced at the penultimate feature layer and the class-wise feature layer to avoid plugging multiple intermediate branches into the network for constructing the next-stage learning objectives. Furthermore, an adversarial temperature loss with a learnable temperature module is incorporated to further address inter-class similarity. Extensive experiments are conducted on three benchmark datasets: Synapse, FLARE2022, and M2caiSeg. The results indicate that ACW-SKD surpasses several offline knowledge distillation methods, self-knowledge distillation methods, and U-Net networks. Ablation studies and visual analysis further validate the efficacy of ACW-SKD. The method notably enhances segmentation accuracy for challenging classes and mitigates the influence of inter-class similarity in medical image segmentation. Moreover, ACW-SKD delivers results comparable to U-Net at a reduced computational cost, making it a viable option for deploying efficient medical image segmentation models on mobile devices. Our code is available at https://github.com/shenjq77/ACW-SKD.
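The core mechanism described above, masking features with a coarse prediction to obtain per-class feature vectors and then distilling between them under a temperature, can be illustrated with a minimal NumPy sketch. All function names are hypothetical, the temperature is a fixed scalar here rather than the paper's learnable adversarial module, and MSE stands in for the distillation loss, whose exact form the abstract does not specify:

```python
import numpy as np

def softmax(x, axis, tau=1.0):
    # Temperature-scaled softmax; in ACW-SKD the temperature would be
    # produced by a learnable module, here it is a fixed scalar.
    z = (x / tau) - (x / tau).max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def class_wise_features(features, coarse_logits, num_classes):
    # features:      (C, H, W) feature map from the backbone
    # coarse_logits: (K, H, W) coarse segmentation from the auxiliary head
    # Returns (K, C): one masked-average-pooled feature vector per class.
    pred = coarse_logits.argmax(axis=0)               # (H, W) hard mask
    out = np.zeros((num_classes, features.shape[0]))
    for k in range(num_classes):
        mask = (pred == k)
        if mask.any():
            out[k] = features[:, mask].mean(axis=1)   # pool class-k region
    return out

def class_wise_kd_loss(student_cw, teacher_cw, tau=2.0):
    # Soften each class-wise vector with temperature tau and penalize
    # the mean squared difference (an MSE stand-in for the paper's loss).
    ps = softmax(student_cw, axis=1, tau=tau)
    pt = softmax(teacher_cw, axis=1, tau=tau)
    return float(((ps - pt) ** 2).mean())

rng = np.random.default_rng(0)
feats_s = rng.normal(size=(8, 4, 4))   # student features (C=8)
feats_t = rng.normal(size=(8, 4, 4))   # next-stage target features
logits = rng.normal(size=(3, 4, 4))    # coarse prediction (K=3 classes)
cw_s = class_wise_features(feats_s, logits, 3)
cw_t = class_wise_features(feats_t, logits, 3)
loss = class_wise_kd_loss(cw_s, cw_t)
```

Pooling within predicted class regions, rather than over the whole feature map, is what lets the distillation signal separate classes whose raw features are similar.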
