Application of open domain adaptive models in image annotation and classification


Abstract

In computer vision, image annotation and classification has attracted wide attention owing to its demand in applications such as medical image analysis, intelligent surveillance, and image retrieval. However, existing methods show significant limitations when handling unknown target-domain data, manifested as reduced classification accuracy and insufficient generalization ability. To address this, the study proposes an open-set domain-adaptive image annotation and classification model based on dynamic threshold control and a subdomain alignment strategy, targeting the impact of distribution differences between the source and target domains on classification performance. The model incorporates a channel attention mechanism to dynamically extract important features, optimizes cross-domain feature alignment through dynamic weight adjustment and subdomain alignment, and balances the classification of known and unknown categories via dynamic threshold control. Experiments on the ImageNet and COCO datasets show that the proposed model achieves a classification accuracy of up to 93.5% on the unknown target domain and 89.6% on the known target domain, surpassing the best results of existing methods. The model's precision and recall reach up to 89.6% and 90.7%, respectively, with a classification time of only 1.2 seconds, markedly improving both accuracy and efficiency. These results demonstrate that the method effectively improves the robustness and generalization of image annotation and classification in open-set scenarios, and offers a new approach to the domain adaptation problem in real-world settings.
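The threshold-based rejection idea underlying open-set classification can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; the function names, the fixed threshold, and the use of the top softmax probability as a confidence score are all illustrative assumptions (the proposed model adjusts its threshold dynamically during training):

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def open_set_predict(logits, threshold):
    """Assign a known-class label when the top softmax probability
    meets the threshold; otherwise reject the sample as unknown (-1)."""
    probs = softmax(logits)
    top = probs.max(axis=-1)
    labels = probs.argmax(axis=-1)
    return np.where(top >= threshold, labels, -1)

# Example: three samples over four known classes
logits = np.array([[4.0, 0.1, 0.2, 0.1],   # confident -> class 0
                   [1.0, 1.1, 0.9, 1.0],   # flat -> rejected as unknown
                   [0.2, 0.1, 3.5, 0.3]])  # confident -> class 2
print(open_set_predict(logits, threshold=0.5))  # [ 0 -1  2]
```

Raising the threshold rejects more samples as unknown (higher unknown-class recall, lower known-class coverage), which is the trade-off the dynamic threshold control is designed to balance.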
