IGNN: An improved graph neural network with integrated attention and pre-message-passing for few-shot image classification



Abstract

Graph Neural Networks (GNNs) face limitations in few-shot image classification due to insufficient adaptive feature extraction and limited long-range dependency modeling. To address these challenges, this study proposes an Improved Graph Neural Network (IGNN) integrating two key innovations. First, we design an Attention-Enhanced Feature Extraction module, which combines Efficient Channel Attention (ECA) and self-attention mechanisms, enabling the model to focus dynamically on discriminative intra-image details and inter-image contextual relationships, thereby improving the robustness of feature representations. Second, we introduce a gated recurrent unit (GRU)-based pre-message-passing mechanism, which establishes cross-sample associations between the support and query sets before message propagation, effectively capturing long-range dependencies and mitigating information over-smoothing. Experimental results on three public datasets demonstrate that the proposed framework outperforms existing methods and shows significant potential. It offers a pragmatic tool for applications requiring rapid adaptation to limited data, such as remote sensing and medical image analysis.
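To make the ECA component of the Attention-Enhanced Feature Extraction module more concrete, the following pure-Python sketch illustrates the core ECA idea: global average pooling per channel, a small 1D convolution across the channel dimension, and a sigmoid gate that rescales each channel. This is a minimal illustration under assumed details, not the paper's implementation; in particular, the fixed kernel and the function names `eca_weights` / `apply_eca` are illustrative choices (in ECA the 1D-conv kernel is learned and its size is chosen adaptively from the channel count).

```python
import math

def eca_weights(channel_means, kernel):
    """1D convolution across channels (same padding) + sigmoid gate,
    the core of Efficient Channel Attention."""
    k = len(kernel)
    pad = k // 2
    padded = [0.0] * pad + list(channel_means) + [0.0] * pad
    weights = []
    for i in range(len(channel_means)):
        s = sum(padded[i + j] * kernel[j] for j in range(k))
        weights.append(1.0 / (1.0 + math.exp(-s)))  # sigmoid
    return weights

def apply_eca(feature_map):
    """feature_map: list of C channels, each an H x W list of lists.
    Returns the feature map with each channel rescaled by its gate."""
    # Global average pooling over each channel's spatial positions.
    means = [sum(sum(row) for row in ch) / (len(ch) * len(ch[0]))
             for ch in feature_map]
    kernel = [0.25, 0.5, 0.25]  # toy fixed kernel; learned in practice
    gates = eca_weights(means, kernel)
    return [[[v * gates[c] for v in row] for row in ch]
            for c, ch in enumerate(feature_map)]
```

Because the gate is computed only from per-channel averages and a tiny 1D convolution, ECA adds cross-channel interaction at negligible cost, which is why it pairs naturally with a heavier self-attention stage for inter-image context.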
