Abstract
Spiking neural networks (SNNs), inspired by biological neurons, offer energy-efficient and interpretable computation but are limited by the simplistic structure of point neurons. We introduce a multi-compartment spiking neuron model (MCN) with trainable cross-compartment connections that simulate soma-dendrite interactions. Theoretically, we prove that these connections act as a spatiotemporal momentum, guiding learning dynamics toward global optima. Building on this insight, we propose a multi-compartment spatiotemporal backpropagation (MCST-BP) algorithm that stabilizes gradient flow during training. Experimental results on multiple benchmark datasets, including S-MNIST, CIFAR-10, Spiking Heidelberg Digits (SHD), and ECG, show that MC-SNNs outperform conventional point-neuron SNNs in both convergence speed and accuracy. Our work bridges neurobiological structure and computational modeling, providing a theoretical and practical foundation for high-performance brain-inspired learning systems.