Abstract
The YOTO (You Only Think Once) dataset is a human electroencephalography (EEG) resource for exploring multisensory perception and mental imagery. Twenty-six participants performed tasks involving both unimodal and multimodal stimuli while EEG was recorded at a 1000 Hz sampling rate, providing high temporal resolution for neural activity related to internal mental representations. The protocol incorporated visual, auditory, and combined cues to investigate the integration of multiple sensory modalities, and participants provided self-reported vividness ratings as an index of subjective perceptual strength. Technical validation using event-related potential (ERP) and power spectral density (PSD) analyses demonstrated the reliability of the data and confirmed distinct neural responses across stimulus conditions. The dataset is publicly available and is intended to support research on neural decoding, perception, and cognitive modeling, as well as applications in multimodal mental imagery.