Abstract
The abundance of both input and process noise in the brain suggests that stochasticity is an integral part of neural computation, yet how spiking neural networks (SNNs) can learn general tasks under correlated variability remains unclear. In this work, we propose a stochastic neural computing (SNC) theory that implements gradient-based learning in SNNs in the noise-driven regime using a moment closure approach. This leads to a new class of deep learning architecture, the moment neural network (MNN), which naturally generalizes rate-based neural networks to second-order statistical moments. Once trained, the parameters of the MNN can be used directly to recover the corresponding SNN without further fine-tuning. The trained model captures realistic firing statistics of biological neurons, including broadly distributed firing rates and Fano factors as well as weak pairwise correlations. The joint manipulation of mean firing rates and correlation structure yields a distributed neural code that maximizes task accuracy while simultaneously minimizing prediction uncertainty, enabling faster inference. We further demonstrate our method on Intel's Loihi neuromorphic hardware. The proposed SNC framework offers insight into how SNNs process uncertainty and a practical way to build biologically plausible neural circuit models with correlated variability.
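To make concrete what "generalizing to second-order statistical moments" means, the following is a minimal sketch, not the paper's implementation: it propagates a mean vector and covariance matrix through one affine layer of a hypothetical MNN. The function name `moment_linear` and all dimensions are illustrative assumptions; the paper's moment-closure approximation applies at the nonlinear neuronal activation, which is omitted here.

```python
# Minimal sketch (assumed, not the paper's code): first- and second-order
# moment propagation through one affine layer of a hypothetical MNN.
import numpy as np

rng = np.random.default_rng(0)

def moment_linear(mu, cov, W, b):
    """Exact moment propagation through an affine map y = W x + b.

    For any random vector x with mean `mu` and covariance `cov`,
    E[y] = W mu + b and Cov[y] = W cov W^T; no approximation is needed
    at this linear stage. The moment closure described in the abstract
    enters only at the nonlinear activation, which is not modeled here.
    """
    return W @ mu + b, W @ cov @ W.T

# Toy dimensions: 4 presynaptic, 3 postsynaptic units (illustrative only).
mu_in = rng.normal(size=4)                  # mean input firing rates
A = rng.normal(size=(4, 4))
cov_in = A @ A.T                            # a valid (PSD) input covariance
W = rng.normal(size=(3, 4)) / np.sqrt(4)    # synaptic weight matrix
b = np.zeros(3)

mu_out, cov_out = moment_linear(mu_in, cov_in, W, b)
print(mu_out.shape, cov_out.shape)          # (3,) (3, 3)
```

Because both the mean and the covariance are carried forward, a network built from such layers can, in principle, train on and manipulate correlated variability rather than firing rates alone, which is the property the abstract attributes to the MNN.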