Abstract
A recent study has suggested that the stimulus responses of cortical neural populations follow a critical power law. More precisely, the eigenvalue spectrum of the covariance matrix of neural responses follows a power law whose exponent indicates that the neural manifold lies at the edge of differentiability. This criticality is hypothesized to balance expressivity and robustness in neural encoding, because population responses on a nondifferentiable fractal manifold are thought to be overly sensitive to perturbations. Contrary to this hypothesis, we prove that neural coding is far more robust than previously assumed. We develop a theoretical framework that yields an analytical expression for the Fisher information of population coding under a small-noise assumption. Our results reveal that, owing to its intrinsic high dimensionality, population coding remains reliable even on a nondifferentiable fractal manifold, despite its sensitivity to perturbations. Furthermore, the theory shows that the trade-off between energetic cost and information makes critical power-law coding the optimal neural encoding of sensory information under a wide range of conditions. In this derivation, we highlight the essential role of a form of neural correlation, known as differential correlation, in power-law population coding. By uncovering the nontrivial nature of high-dimensional information coding, this work deepens our understanding of criticality and power laws in both biological and artificial neural computation.
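The power-law spectrum described above can be illustrated numerically. The following minimal sketch (not from the paper; all parameter names and values are hypothetical) draws synthetic population responses whose covariance eigenvalues decay as lambda_n ∝ n^(-alpha), with alpha = 1 marking the edge of differentiability, and recovers the exponent from the empirical covariance spectrum by a log-log fit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: N neurons, T stimulus presentations, target exponent alpha.
# Responses are built from independent modes scaled so the covariance
# eigenvalues follow the power law lambda_n ~ n^(-alpha).
N, T, alpha = 200, 50_000, 1.0
eigvals = np.arange(1, N + 1, dtype=float) ** (-alpha)
responses = rng.standard_normal((T, N)) * np.sqrt(eigvals)

# Empirical covariance spectrum, sorted in decreasing order.
spectrum = np.sort(np.linalg.eigvalsh(np.cov(responses.T)))[::-1]

# Fit the power-law exponent on a log-log scale, skipping edge modes
# where finite-sample noise dominates.
n = np.arange(1, N + 1)
sl = slice(4, 100)
slope, _ = np.polyfit(np.log(n[sl]), np.log(spectrum[sl]), 1)
print(f"estimated exponent: {-slope:.2f}")  # should be close to alpha
```

In real recordings the exponent would instead be estimated from measured responses, e.g. via cross-validated PCA, rather than from a known generative model.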