Abstract
Biological synapses effortlessly balance memory retention and flexibility, yet artificial neural networks still struggle with the extremes of catastrophic forgetting and catastrophic remembering. Here, we introduce Metaplasticity from Synaptic Uncertainty (MESU), a Bayesian update rule that scales each parameter's learning rate by its uncertainty, enabling a principled combination of learning and forgetting without explicit task boundaries. MESU also provides epistemic uncertainty estimates for robust out-of-distribution detection; its main computational cost is the weight sampling required to compute predictive statistics. Across image-classification benchmarks, MESU mitigates forgetting while maintaining plasticity. On 200 sequential Permuted-MNIST tasks, it surpasses established synaptic-consolidation methods in final accuracy, in the ability to learn late tasks, and in out-of-distribution detection. In task-incremental CIFAR-100, MESU consistently outperforms conventional training techniques thanks to its boundary-free streaming formulation. Theoretically, MESU connects metaplasticity, Bayesian inference, and Hessian-based regularization. Together, these results provide a biologically inspired route to robust, perpetual learning.
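To make the abstract's central idea concrete, the snippet below is a minimal sketch, not the paper's actual MESU algorithm: it assumes a Gaussian mean-field posterior over weights and preconditions each gradient step by the per-parameter variance sigma**2, so uncertain weights stay plastic while confident weights are consolidated. Every name in the snippet (`mu`, `sigma`, `toy_loss_grad`, `lr`) is an illustrative assumption, and the variance update is a crude stand-in for the paper's Bayesian rule.

```python
import numpy as np

# Minimal sketch of an uncertainty-scaled ("metaplastic") update.
# NOT the exact MESU rule: it only illustrates the idea stated in the
# abstract, namely that each parameter's effective learning rate is
# modulated by its own posterior variance, so uncertain weights remain
# plastic while well-consolidated weights barely move.

rng = np.random.default_rng(0)

n = 4                         # toy number of parameters
mu = rng.normal(size=n)       # posterior means of the weights
sigma = np.full(n, 0.3)       # posterior standard deviations (uncertainty)

def toy_loss_grad(w):
    """Gradient of the toy quadratic loss 0.5 * ||w - 1||^2."""
    return w - 1.0

lr = 0.5
for _ in range(1000):
    # Sample weights from the Gaussian posterior (reparameterization);
    # the abstract notes such sampling is the main computational cost.
    eps = rng.normal(size=n)
    w = mu + sigma * eps
    g = toy_loss_grad(w)                 # gradient w.r.t. sampled weights

    # Uncertainty-scaled updates: steps are preconditioned by sigma**2,
    # so low-variance (confident) parameters are protected from overwriting.
    mu -= lr * sigma**2 * g
    sigma -= lr * sigma**2 * (g * eps)   # d(loss)/d(sigma) = g * eps
    sigma = np.clip(sigma, 1e-3, None)   # keep a residual plasticity floor

print("learned means:", np.round(mu, 2))  # means drift toward the target 1.0
```

In this toy setting the variance shrinks as evidence accumulates, which automatically slows further changes to those parameters; this mirrors, at a cartoon level, how the boundary-free formulation can balance retention and plasticity without being told where one task ends and the next begins.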