Chinese herbal medicine recognition network based on knowledge distillation and cross-attention


Abstract

To reduce the parameter count of a Chinese herbal medicine recognition model while maintaining accuracy, this paper takes 20 classes of Chinese herbs as the research object and proposes ShuffleCANet (ShuffleNet and Cross-Attention), a recognition network based on knowledge distillation and cross-attention. First, transfer learning experiments were conducted on 20 classic networks, from which DenseNet and RegNet were selected as the dual teacher models. Then, weighing parameter count against recognition accuracy, ShuffleNet was chosen as the student model, and a new cross-attention mechanism was proposed; this cross-attention module replaces Conv5 in ShuffleNet, achieving a lightweight design while preserving accuracy. Finally, experiments on the public dataset NB-TCM-CHM show that the proposed ShuffleCANet reaches 98.8% in both accuracy (ACC) and F1-score with only 128.66M model parameters. Compared with the baseline ShuffleNet, the parameter count is reduced by nearly 50% while accuracy improves by about 1.3%, demonstrating the method's effectiveness.
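The dual-teacher distillation described above can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: it assumes a standard Hinton-style loss in which the two teachers' temperature-softened predictions are averaged with equal weight (the paper's actual weighting and loss formulation are not given in the abstract), and all function names here are hypothetical.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T yields a softer distribution.
    exps = [math.exp(z / T) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def dual_teacher_soft_targets(logits_a, logits_b, T):
    # Average the softened predictions of the two teachers
    # (equal weighting is an assumption for this sketch).
    pa = softmax(logits_a, T)
    pb = softmax(logits_b, T)
    return [(x + y) / 2 for x, y in zip(pa, pb)]

def kl_divergence(p, q):
    # KL(p || q): the soft-target distillation term.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def distillation_loss(student_logits, teacher_a_logits, teacher_b_logits,
                      true_idx, T=4.0, alpha=0.7):
    # Combined loss: alpha * T^2 * KL(teachers || student)
    # + (1 - alpha) * cross-entropy with the hard label.
    # T and alpha are illustrative defaults, not values from the paper.
    soft_t = dual_teacher_soft_targets(teacher_a_logits, teacher_b_logits, T)
    soft_s = softmax(student_logits, T)
    kd = kl_divergence(soft_t, soft_s) * (T ** 2)
    ce = -math.log(softmax(student_logits)[true_idx])
    return alpha * kd + (1 - alpha) * ce
```

In practice the same structure appears in deep-learning frameworks as a KL-divergence term between softened teacher and student outputs plus a hard-label cross-entropy term, applied per training batch.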
