Abstract
How physical neuronal networks, bound by spatio-temporal locality constraints, can perform efficient credit assignment remains an intriguing question. Both backward- and forward-propagation algorithms rely on assumptions that violate this locality in various ways. We introduce Generalized Latent Equilibrium (GLE), a framework for fully local spatio-temporal credit assignment in physical, dynamical neuronal networks. From an energy function defined in terms of neuron-local mismatches, we derive the neuronal dynamics via stationarity conditions and the parameter dynamics via gradient descent. The result is an online approximation of backpropagation through space and time in deep networks of cortical microcircuits, with continuously active, local synaptic plasticity. GLE exploits dendritic morphology to enable complex information storage and processing in single neurons, as well as their ability to react in anticipation of their future input. This "prospective coding" enables the computation of spatio-temporal convolutions in the forward stream and the approximation of adjoint variables in the backward stream.
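The anticipatory "prospective coding" mentioned above can be illustrated with a minimal numerical toy, a sketch under assumptions rather than the paper's actual model: a leaky-integrator membrane potential u obeys τu̇ = -u + x(t), so the instantaneous output φ(u) lags the input, while the prospective output φ(u + τu̇) compensates that low-pass delay and tracks φ(x) directly (all function and variable names here are illustrative, not from the source).

```python
import numpy as np

def prospective_demo(tau=0.1, dt=1e-3, T=2.0):
    """Toy comparison of retrospective vs prospective readout
    for a leaky-integrator membrane (illustrative sketch only)."""
    t = np.arange(0.0, T, dt)
    x = np.sin(2 * np.pi * t)        # time-varying input current
    phi = np.tanh                    # activation function
    u = np.zeros_like(t)             # membrane potential
    r_retro = np.zeros_like(t)       # instantaneous output phi(u)
    r_pros = np.zeros_like(t)        # prospective output phi(u + tau * du/dt)
    for i in range(1, len(t)):
        # Euler step of tau * du/dt = -u + x
        u[i] = u[i - 1] + dt * (-u[i - 1] + x[i - 1]) / tau
        r_retro[i] = phi(u[i])
        du_now = (-u[i] + x[i]) / tau
        # u + tau * du/dt = x, so the prospective rate tracks phi(x) exactly
        r_pros[i] = phi(u[i] + tau * du_now)
    target = phi(x)
    s = slice(len(t) // 4, None)     # discard the initial transient
    err_retro = np.mean((r_retro[s] - target[s]) ** 2)
    err_pros = np.mean((r_pros[s] - target[s]) ** 2)
    return err_retro, err_pros
```

In this sketch the prospective readout eliminates the membrane's low-pass lag, which is the single-neuron ingredient the abstract invokes for computing forward spatio-temporal convolutions and backward adjoint estimates.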