Abstract
In humans, the superior temporal sulcus (STS) combines auditory and visual information. However, the extent to which it draws on visual input from the ventral versus the dorsal stream remains uncertain. To address this question, we analyzed openly available functional magnetic resonance imaging data collected from 15 participants (6 females, 9 males) as they watched a movie. Using artificial neural networks, we examined the relationships between multivariate response patterns in auditory cortex, the two visual streams, and the rest of the brain. We found that distinct portions of the STS combine information from each of the two visual streams with auditory information.
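The kind of cross-regional pattern analysis described above can be illustrated with a minimal sketch. The code below is not the authors' pipeline: it substitutes cross-validated ridge regression for the ANN-based models, and all region-of-interest arrays, shapes, and names are hypothetical placeholders. It shows one way to ask how well each visual stream, combined with auditory cortex, accounts for STS response patterns.

```python
# Minimal sketch (assumed analysis, not the authors' method): relate STS
# response patterns to ventral-stream, dorsal-stream, and auditory-cortex
# patterns using cross-validated ridge regression.
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_tp = 600                                   # movie time points (placeholder)
ventral = rng.standard_normal((n_tp, 200))   # ventral-stream ROI patterns
dorsal = rng.standard_normal((n_tp, 200))    # dorsal-stream ROI patterns
auditory = rng.standard_normal((n_tp, 150))  # auditory-cortex patterns
sts = rng.standard_normal((n_tp, 100))       # STS target patterns

def mean_cv_r2(X, Y, alphas=np.logspace(-2, 4, 7), folds=5):
    """Average cross-validated R^2 for predicting each target voxel."""
    scores = []
    for v in range(Y.shape[1]):
        model = RidgeCV(alphas=alphas)
        r2 = cross_val_score(model, X, Y[:, v], cv=folds, scoring="r2")
        scores.append(r2.mean())
    return float(np.mean(scores))

# Compare how well each visual stream, together with auditory input,
# predicts STS responses; a higher R^2 suggests a stronger link.
for name, stream in [("ventral", ventral), ("dorsal", dorsal)]:
    X = np.hstack([stream, auditory])
    print(f"{name} + auditory -> STS: mean CV R^2 = {mean_cv_r2(X, sts):.3f}")
```

With real data, running this comparison separately across STS subregions would indicate which portions are better explained by ventral-plus-auditory versus dorsal-plus-auditory predictors; with the random placeholders above, the scores are uninformative by design.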