Abstract
Hopfield networks are widely used models of associative memory. When the number of stored patterns exceeds the network's storage capacity, theory predicts that the overlap between the network's final states and the memorized patterns should vanish. Numerical simulations, however, show that a small but non-zero overlap persists, indicating that the network retains a residual memory. To investigate the origin of this phenomenon, we analyze the network's dynamics during the initial update steps. Using a signal-to-noise ratio analysis, we demonstrate that when a node undergoes a state flip, the signal term of its neighbors is enhanced through the connecting link. This effect improves the stability of the neighboring neurons, allowing a fraction of the network to remain aligned with the memory pattern and preventing a total loss of memory. Our findings elucidate the mechanism by which residual memory traces emerge in Hopfield networks beyond the storage limit.
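The setting described above can be reproduced in a minimal NumPy sketch, assuming the standard Hopfield setup: Hebbian weights, asynchronous sign updates, and overlap m = (1/N) Σᵢ ξᵢ sᵢ. The network size, pattern count (loading ratio α = P/N ≈ 0.30, above the classical capacity α_c ≈ 0.138), and sweep count are illustrative choices, not parameters taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N, P = 400, 120                          # alpha = P/N = 0.30, above capacity
xi = rng.choice([-1, 1], size=(P, N))    # random binary patterns

# Hebbian couplings with zero self-interaction
W = (xi.T @ xi) / N
np.fill_diagonal(W, 0.0)

def overlap(s, pattern):
    """Overlap m = (1/N) sum_i xi_i s_i between a state and a stored pattern."""
    return float(s @ pattern) / len(s)

def run(s, W, sweeps=20, rng=rng):
    """Asynchronous sign updates in random order; returns the final state."""
    s = s.copy()
    for _ in range(sweeps):
        for i in rng.permutation(len(s)):
            h = W[i] @ s                 # local field on neuron i
            s[i] = 1 if h >= 0 else -1
    return s

s0 = xi[0].copy()                        # start exactly on pattern 0
sT = run(s0, W)
m0, mT = overlap(s0, xi[0]), overlap(sT, xi[0])
print(f"initial overlap {m0:.2f}, final overlap {mT:.3f}")
```

Above capacity the initial pattern is not a fixed point, so the dynamics drift away from it; the printed final overlap is well below 1 but typically remains non-zero, which is the residual-memory effect the abstract refers to.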