Bayesian continual learning and forgetting in neural networks

Abstract

Biological synapses effortlessly balance memory retention and flexibility, yet artificial neural networks still struggle with the extremes of catastrophic forgetting and catastrophic remembering. Here, we introduce Metaplasticity from Synaptic Uncertainty (MESU), a Bayesian update rule that scales each parameter's learning by its uncertainty, enabling a principled combination of learning and forgetting without explicit task boundaries. MESU also provides epistemic uncertainty estimates for robust out-of-distribution detection; the main computational cost is weight sampling to compute predictive statistics. Across image-classification benchmarks, MESU mitigates forgetting while maintaining plasticity. On 200 sequential Permuted-MNIST tasks, it surpasses established synaptic-consolidation methods in final accuracy, ability to learn late tasks, and out-of-distribution data detection. In task-incremental CIFAR-100, MESU consistently outperforms conventional training techniques due to its boundary-free streaming formulation. Theoretically, MESU connects metaplasticity, Bayesian inference, and Hessian-based regularization. Together, these results provide a biologically inspired route to robust, perpetual learning.
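The abstract does not spell out MESU's exact update equations, but the core idea — scaling each parameter's step by its posterior uncertainty so that well-consolidated weights resist change while uncertain weights stay plastic — can be illustrated with a generic mean-field Gaussian sketch. Everything below (the toy loss, the `TARGET` optimum, and the squared-gradient precision update) is an illustrative assumption, not the paper's actual rule:

```python
import numpy as np

rng = np.random.default_rng(0)

# Mean-field Gaussian posterior: each weight has a mean mu and std dev sigma.
TARGET = np.array([1.0, -1.0, 0.5, 0.0])  # hypothetical optimum for a toy loss
mu = np.zeros(4)
sigma = np.full(4, 0.5)

def loss_grad(w):
    """Gradient of the toy quadratic loss 0.5 * ||w - TARGET||^2."""
    return w - TARGET

for _ in range(100):
    # Sample weights (reparameterization trick) to get a stochastic gradient.
    w = mu + sigma * rng.standard_normal(mu.shape)
    g = loss_grad(w)
    # Uncertainty-scaled step: each parameter moves in proportion to its
    # posterior variance, so certain (low-variance) weights are protected
    # while uncertain ones adapt quickly -- the metaplasticity intuition.
    mu -= sigma**2 * g
    # Crude precision update (a stand-in for a Bayesian filter step):
    # accumulating squared gradients shrinks sigma where data is informative,
    # consolidating those weights against future interference.
    sigma = 1.0 / np.sqrt(1.0 / sigma**2 + g**2)
```

After training, `mu` sits near `TARGET` and `sigma` has contracted most for the parameters the data constrained; in a continual-learning setting, a mechanism for re-inflating `sigma` (forgetting) would restore plasticity for later tasks.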
