Abstract
Understanding how the brain integrates information across senses is a fundamental challenge. We investigated the mechanisms underlying reciprocal yet temporally asymmetric auditory-visual interactions using high-density EEG during an auditory spatial attention task with auditory-only (A) and audiovisual (AV) stimuli. In the AV condition, sounds were paired with a central, task-irrelevant visual input. Participants attended to sounds on one side (attended) and ignored the other (unattended). Results showed that the visual input affected auditory processing through initial stimulus-driven suppression, reflected in the elimination of the auditory contralateral sensory bias (selection negativity, 220-320 ms) and the attenuation of fronto-temporal connectivity and of the auditory contralateral occipital positivity (ACOP, 300-500 ms). Subsequently, auditory-to-visual cross-modal attentional spreading exhibited attention-dependent facilitation, emerging over occipital (300-600 ms) and centro-parietal (500-600 ms) regions. These findings support a competitive-facilitative framework in which stimulus-driven suppression precedes goal-driven facilitation, providing a key temporal constraint for computational and neurocognitive models of cross-modal integration.