Abstract
One of the hallmark features of neocortical anatomy is the presence of extensive top-down projections into primary sensory areas. One hypothesized role of these top-down projections is to carry contextual information that helps animals resolve ambiguities in sensory data. One proposed mechanism of contextual integration is the combination of input streams at the distinct apical and basal dendrites of pyramidal neurons. Computationally, however, it has yet to be demonstrated how such an architecture could leverage distinct compartments for flexible contextual integration and sensory processing. Here, we implement a deep neural network with distinct apical and basal compartments that integrates (a) contextual information from top-down projections to apical compartments and (b) sensory representations driven by bottom-up projections to basal compartments. In addition, we develop a new contextual integration task using generative modeling. The performance of deep neural networks augmented with our "apical prior" exceeds that of single-compartment networks. We find that a sparse subset of neurons belonging to the context-relevant categories receives the largest top-down signals, and we further show that this sparse gain modulation is necessary for the performance benefit. Altogether, this suggests that the "apical prior" could be key to handling the ambiguities that animals encounter in the real world.
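The two-compartment integration described above can be sketched as follows. This is a minimal illustration only, not the paper's implementation: the multiplicative form of the apical gain, the layer sizes, and all variable names (`W_basal`, `W_apical`, `two_compartment_layer`) are assumptions introduced here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 16 sensory features, 8 hidden units, 4 context features.
n_in, n_hidden, n_ctx = 16, 8, 4

W_basal = rng.normal(size=(n_hidden, n_in))    # bottom-up (sensory) weights
W_apical = rng.normal(size=(n_hidden, n_ctx))  # top-down (contextual) weights

def two_compartment_layer(x, c):
    """Combine basal (sensory) drive with apical (contextual) gain.

    The top-down contextual signal arrives at the apical compartment and
    multiplicatively modulates the bottom-up basal drive -- one common
    formalization of apical gain modulation.
    """
    basal = W_basal @ x                        # basal compartment: sensory drive
    gain = 1.0 + np.maximum(W_apical @ c, 0)   # apical compartment: non-negative gain
    return np.maximum(basal * gain, 0)         # gain-modulated ReLU output

x = rng.normal(size=n_in)   # bottom-up sensory input
c = rng.normal(size=n_ctx)  # top-down contextual input
y = two_compartment_layer(x, c)
```

Under this formalization, a context vector that drives large apical gains in only a few units would produce the sparse gain modulation the abstract reports.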