Learning and inference with correlated neural variability


Abstract

The abundance of both input and process noise in the brain suggests that stochasticity is an integral part of neural computing, but how spiking neural networks (SNNs) can learn general tasks under correlated variability remains unclear. In this work, we propose a stochastic neural computing (SNC) theory to implement gradient-based learning in SNNs in the noise-driven regime using a moment closure approach. This leads to a new class of deep learning architecture called the moment neural network (MNN), which naturally generalizes rate-based neural networks to second-order statistical moments. Once trained, the parameters of the MNN can be directly used to recover the corresponding SNN without further fine-tuning. The trained model captures realistic firing statistics of biological neurons, including broadly distributed firing rates and Fano factors as well as weak pairwise correlations. The joint manipulation of mean firing rate and correlation structure leads to a distributed neural code that maximizes task accuracy while simultaneously minimizing prediction uncertainty, resulting in enhanced inference speed. We further demonstrate the application of our method on Intel's Loihi neuromorphic hardware. The proposed SNC framework offers insight into how SNNs process uncertainty and a practical way to build biologically plausible neural circuit models with correlated variability.
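To illustrate the core idea of generalizing a rate-based network to second-order statistical moments, the following sketch propagates a mean vector and a covariance matrix through a single linear stage. This is a minimal toy illustration, not the paper's implementation; all variable names (`mu_in`, `cov_in`, `W`) are hypothetical, and the moment mapping shown is only the exact linear case, omitting the nonlinear moment-closure activation the MNN would use.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer sizes for a toy example.
n_in, n_out = 4, 3
W = rng.standard_normal((n_out, n_in)) / np.sqrt(n_in)  # synaptic weights

mu_in = rng.uniform(1.0, 5.0, size=n_in)  # mean firing rates of inputs
A = rng.standard_normal((n_in, n_in))
cov_in = A @ A.T + np.eye(n_in)           # correlated input variability (PSD)

# Exact moment mapping for a linear stage:
#   mean:       mu'    = W @ mu
#   covariance: Sigma' = W @ Sigma @ W^T
mu_out = W @ mu_in
cov_out = W @ cov_in @ W.T
```

The output covariance remains symmetric and positive semidefinite, so both moments can be passed to the next stage; in the full MNN, a nonlinear moment-closure step would replace the identity mapping between such linear stages.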
