Abstract
The task of community detection in complex networks has garnered increasing attention from researchers. With the emergence of graph neural networks (GNNs), these models have rapidly become the mainstream approach to this task. However, GNNs frequently suffer from the Laplacian oversmoothing problem, which dilutes the neighborhood signals essential for community identification. These signals, particularly those from first-order neighbors, are the core source of information defining community structure and identity. To address this tension, this paper proposes a novel training strategy that strengthens these key local signals. We design a multi-branch learning structure that injects an auxiliary gradient into the GNN layers during backpropagation. This gradient is modulated by the GNN's native message-passing path, precisely supplementing the parameters of the initial layers with first-order topological information. On this basis, we construct the network structure-informed GNN (NIGNN). Extensive experiments show that the proposed method improves on the base models by 0.6%-3.6% across multiple metrics on the community detection task, and the gains are statistically significant under t-tests. The framework is broadly applicable, can be effectively combined with GCN, GAT, and GraphSAGE architectures, and remains robust on networks with incomplete information. This work offers a novel solution for effectively preserving core local information in deep GNNs.
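To make the auxiliary-branch idea concrete, the following is a minimal NumPy sketch of one plausible instantiation: alongside the main task, an extra branch reconstructs the first-order adjacency from the embeddings, and the gradient of that structure loss flows back through the GNN's own message-passing path into the first layer's weights. All names, the toy graph, the reconstruction loss, and the learning rate are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 4 nodes, symmetric adjacency with self-loops (assumed example).
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
D_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
A_hat = D_inv_sqrt @ A @ D_inv_sqrt      # symmetrically normalized adjacency

X = rng.normal(size=(4, 3))              # node features
W = rng.normal(size=(3, 2)) * 0.1        # first GCN layer weights

# Forward pass of one (linear) GCN layer: H = A_hat @ X @ W.
H = A_hat @ X @ W

# Auxiliary branch: reconstruct first-order links from the embeddings
# and penalize the mismatch with the observed adjacency.
S = H @ H.T                              # pairwise similarity scores
L_aux = 0.5 * np.sum((S - A) ** 2)       # first-order structure loss

# Backpropagate the auxiliary loss through the message-passing path
# (A_hat @ X), so the gradient reaching W carries first-order topology.
dS = S - A                               # dL/dS
dH = (dS + dS.T) @ H                     # dL/dH for S = H @ H.T
dW = (A_hat @ X).T @ dH                  # dL/dW through the GCN layer

# Gradient step that supplements the first layer with structural signal.
W -= 1e-3 * dW

# Recompute the auxiliary loss after the update.
H_new = A_hat @ X @ W
L_aux_new = 0.5 * np.sum((H_new @ H_new.T - A) ** 2)
```

In a full model this gradient would be added to the one from the main community-detection objective; the key point the abstract makes is that the auxiliary signal is routed through the same normalized-adjacency path the GNN already uses, so it specifically reinforces first-order information in the early layers.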