Abstract
Exploring our environment through touch often entails integrating tactile input with auditory and/or visual cues. The mechanisms by which mechanosensation integrates with other sensory modalities during active touch remain poorly understood, despite their ecological importance. Here, we investigated auditory-tactile integration in the context of edge localization during active tactile exploration. We assessed how accurately participants could judge the position of their moving finger, relative to a visually displayed midline, at the onset of tactile, auditory, or auditory-tactile stimuli. We hypothesized that localization precision would improve with combined auditory-tactile stimulation. In Experiment 1, the auditory, tactile, and auditory-tactile conditions were presented in separate blocks, whereas in Experiment 2 they were interleaved within blocks. In both experiments, we found that concurrent auditory-tactile stimulation did not increase localization precision. Across all modalities, we also observed a bias to localize the finger position towards the right, possibly induced by the left-to-right finger movement. This bias was reduced in the auditory-tactile condition of the second experiment, suggesting that when the stimulus modality was not predictable, integration of auditory and tactile input may have yielded a more accurate representation of finger position at stimulation onset. In conclusion, we show that combined auditory-tactile input may reduce biases in reconstructing the spatial location of a tactile stimulus generated by sliding the finger across a flat surface. These observations have potential implications for the design of haptic technologies involving active touch.