Abstract
Neural circuits construct internal 'world models' to guide behavior. The predictive processing framework posits that neural activity signaling sensory predictions while concurrently computing prediction errors is a signature of those internal models. To understand how the brain generates predictions for complex sensorimotor signals, we investigate the emergence of high-dimensional, multimodal predictive representations in recurrent networks. Contrary to previous proposals of functionally specialized cell types, stimulus and prediction-error representations are desegregated in networks performing robust predictive processing. We confirmed these model predictions by using a rich stimulus set to violate animals' learned expectations. We propose that predictive processing is optimal when excitation/inhibition balance is loose, and we reveal distinct functional roles for excitatory and inhibitory neurons. Together, our results demonstrate that neural representations of internal models are highly distributed, yet structured to support flexible readout of behaviorally relevant information. These findings advance the understanding of how internal models are computed by incorporating diverse computations into a unifying model.