Abstract
Graph Neural Networks (GNNs) face limitations in few-shot image classification due to insufficiently adaptive feature extraction and limited long-range dependency modeling. To address these challenges, this study proposes an Improved Graph Neural Network (IGNN) integrating two key innovations. First, we design an Attention-Enhanced Feature Extraction module, which combines Efficient Channel Attention (ECA) and self-attention mechanisms, enabling the model to dynamically focus on discriminative intra-image details and inter-image contextual relationships, thereby improving the robustness of feature representations. Second, we introduce a gated recurrent unit (GRU)-based pre-message-passing mechanism, which establishes cross-sample associations between the support and query sets before message propagation, effectively capturing long-range dependencies and mitigating over-smoothing of information. Experimental results on three public datasets demonstrate that the proposed framework outperforms existing methods and shows significant potential. It offers a pragmatic tool for applications requiring rapid adaptation to limited data, such as remote sensing and medical image analysis.