Abstract
Music makes people move. This human propensity to coordinate movement with musical rhythm requires multiscale temporal integration, allowing the fast sensory events that compose rhythmic input to be mapped onto slower, behavior-relevant internal templates such as periodic beats. Relatedly, beat perception has been shown to involve an enhanced representation of beat periodicities in neural activity. However, the extent to which this ability to move to the beat and the related "periodized" neural representation are shared across the senses beyond audition remains unknown. Here, we addressed this question by separately recording the electroencephalographic (EEG) responses and finger tapping to a rhythm conveyed through either acoustic or tactile inputs in healthy volunteers of either sex. The EEG responses to the acoustic rhythm, spanning a low-frequency range (below 15 Hz), showed enhanced representation of the perceived periodic beat, compatible with behavior. In contrast, the EEG responses to the tactile rhythm, spanning a broader frequency range (up to 25 Hz), did not show significant beat-related periodization and yielded less stable tapping. Together, these findings suggest a preferential role of low-frequency neural activity in supporting the neural representation of the beat. Most importantly, we show that this neural representation, as well as the ability to move to the beat, is not systematically shared across the senses. More generally, these results, highlighting multimodal differences in beat processing, reveal a process of multiscale temporal integration that allows the auditory system to go beyond mere tracking of onset timing and to support higher-level internal representation of, and motor entrainment to, rhythm.