Achieving consistency in FedSAM using local adaptive distillation on sports image classification


Abstract

Federated learning (FL) is an effective distributed learning paradigm for protecting client privacy, enabling multiple clients to collaboratively train a global model without uploading private data. It has promising applications in sports image classification. However, FL faces the issue of non-independent and identically distributed (non-IID) data, which leads to excessive variance between local models and hinders the convergence of the global model. Although FedSAM and its variants attempt to reduce this variance by finding smooth solutions between local models, local smoothing does not necessarily result in global smoothing. We refer to this issue as the smoothness inconsistency problem. To address this challenge, we propose a novel FL paradigm, named A-FedSAM, which utilizes adaptive local distillation to achieve consistency in smoothing between local and global models without incurring additional communication overhead, thereby improving the convergence accuracy of the global model. Specifically, A-FedSAM employs the global model as the teacher during local training, dynamically guiding the local models to ensure that their gradients not only maintain smoothness but also align with the global objective. Extensive experiments on sports image classification tasks demonstrate that A-FedSAM outperforms state-of-the-art methods in terms of accuracy across different data heterogeneities and client sampling rates, while requiring fewer communication and computational resources to achieve the same target accuracy.
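To make the idea in the abstract concrete, the following is a minimal sketch of one local update that combines a SAM-style ascent/descent step with distillation from the global model acting as teacher. It uses a plain linear softmax classifier in NumPy; the function name `sam_distill_step` and the hyperparameters `rho` (perturbation radius), `alpha` (distillation weight), and `lr` are illustrative assumptions, not values from the paper.

```python
import numpy as np

def softmax(z):
    """Row-wise softmax with max-subtraction for numerical stability."""
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def sam_distill_step(W, W_global, x, y_onehot, lr=0.1, rho=0.05, alpha=0.5):
    """One local update for a linear softmax classifier, sketching
    SAM's ascent/descent pass combined with distillation toward the
    global (teacher) model's predictions. rho, alpha, and lr are
    illustrative assumptions."""
    # Cross-entropy gradient at the current local weights.
    p = softmax(x @ W)
    g = x.T @ (p - y_onehot) / len(x)

    # SAM ascent: perturb the weights toward the worst-case point
    # within an L2 ball of radius rho around W.
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    W_adv = W + eps

    # At the perturbed point, blend the task gradient with a
    # distillation gradient that pulls the local predictions toward
    # the global teacher's soft outputs (gradient of cross-entropy
    # with soft targets is p_student - p_teacher).
    p_adv = softmax(x @ W_adv)
    p_teacher = softmax(x @ W_global)
    g_task = x.T @ (p_adv - y_onehot) / len(x)
    g_distill = x.T @ (p_adv - p_teacher) / len(x)
    g_total = (1 - alpha) * g_task + alpha * g_distill

    # Descent step applied back at the original (unperturbed) weights,
    # as in SAM's two-step update.
    return W - lr * g_total
```

The distillation term is what ties local smoothing to the global objective: as `alpha` grows, the descent direction is pulled toward agreement with the teacher, so a smooth local solution cannot drift arbitrarily far from the global model.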
