Abstract
Audiovisual integration occurs automatically and affects visual processing. This study investigated whether temporally synchronized auditory signals enhance monocular signals during binocular viewing. In Experiment 1, 16 participants performed a visual target localization task. A mirror stereoscope presented a rapid serial visual presentation (RSVP) stream of distractors to both eyes, with a visual target inserted into both eyes, the dominant eye only, or the non-dominant eye only. A continuous stream of low tones was synchronized with the distractors, and the tone accompanying the target was either the same low tone (non-salient) or a high tone (salient). Detection facilitation rates for each tone type were compared using multiple-comparison tests. Detection was significantly enhanced only when the target appeared in the non-dominant eye. In Experiment 2, 16 participants viewed a similar RSVP stream but performed an orientation discrimination task on parafoveally presented texture stimuli composed of 17 vertical Gabor patches. The tilt angle and the proportion of tilted patches were manipulated simultaneously, and orientation discrimination thresholds were estimated by logistic regression. Contrary to predictions, salient tones did not reduce these thresholds. Together, these findings suggest that temporally synchronized auditory signals can selectively enhance the monocular processing of weaker visual signals (i.e., those from the non-dominant eye) before binocular fusion, particularly for spatial localization. However, this enhancement did not extend to the identification of visual content (i.e., orientation) or to stable visual signals (i.e., dominant-eye or binocular signals). These results highlight the role of audiovisual integration in supporting unstable monocular signals and suggest potential applications in low-vision training.