Abstract
Neuron signal activation is at the core of deep learning and broadly impacts science and engineering. Despite growing interest in stimulating neuron cells with amplitude-modulated currents, the activation mechanism of biological neurons has seen limited application in deep learning, owing to the lack of a universal mathematical principle suitable for artificial neural networks. Here, we show how deep learning can surpass current learning performance through a newly proposed neuron signal activation mechanism. To this end, we introduce a cross-disciplinary method for modeling neuron signal attenuation, inferring differential equations within generalized linear systems to improve the efficiency of deep learning. We formulate a mathematical model of the resulting activation function, which we call Attenuation (Ant). Ant can represent higher-order derivatives and stabilizes data distributions in deep-learning tasks. We demonstrate the effectiveness, stability, and generalization of Ant on challenging tasks across a variety of neural network architectures.