Abstract
Despite advances in brain-computer interfaces, decoding high-level language representations prior to speech remains challenging. Prior efforts have focused largely on acoustic or articulatory features, leaving the spatiotemporal dynamics of semantic category decoding unclear. Here, we investigated how semantic representations unfold over time by analyzing high-gamma (HG; 70-170 Hz) electrocorticography signals from 20 subjects (7 females and 13 males) performing a word-reading task with body- and nonbody-related words. HG activity was examined from word presentation to 500 ms post-onset. Group-level time-resolved decoding within each Brodmann area (BA) revealed above-chance classification accuracy in both hemispheres (p < 0.05, FDR-corrected). In the left hemisphere, peak BAs followed a frontal-temporal-occipital-parietal cascade: dorsolateral prefrontal cortex (dlPFC; 50 ms), inferior temporal and fusiform gyri (350-400 ms), and supramarginal gyrus (SMG; 500 ms). In contrast, the right hemisphere exhibited an occipital-temporal-frontal-temporal-parietal sequence: visual and temporal pole (TP) regions (50-100 ms), dlPFC (200 ms), fusiform gyrus (400 ms), and angular gyrus (450 ms). This occipital-initiated sequence contrasts with the left hemisphere's frontal-initiated cascade, underscoring hemispheric differences in the timing of peak decoding loci. Cross-temporal regression revealed predictive interregional engagement. In the left hemisphere, early dlPFC activity (0-150 ms) predicted later SMG responses (300-350 ms). In the right hemisphere, a strong predictive link emerged from the TP to the angular gyrus (200-300 ms; peak R² ≈ 0.70). These findings demonstrate that semantic category decoding relies on temporally structured interregional interactions, revealing distinct hemispheric patterns.
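
To make the time-resolved decoding step concrete, the sketch below shows one way such an analysis could be run within a single Brodmann area. It is a minimal illustration, not the paper's exact pipeline: it assumes HG power has already been epoched into an array hg of shape (n_trials, n_electrodes, n_samples) for that area's electrodes, with binary labels y (body vs. nonbody words) and a sampling rate sfreq; the window length, classifier, and cross-validation scheme are illustrative choices. Significance testing across BAs and time points (e.g., permutation tests with FDR correction, as reported in the abstract) is not shown.

```python
# Hypothetical time-resolved decoding sketch (not the authors' exact code).
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def time_resolved_decoding(hg, y, sfreq, win_ms=50, step_ms=50, t_max_ms=500):
    """Cross-validated accuracy in sliding windows from word onset to t_max_ms."""
    win = int(win_ms * sfreq / 1000)
    step = int(step_ms * sfreq / 1000)
    n_max = int(t_max_ms * sfreq / 1000)
    clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    times, scores = [], []
    for start in range(0, n_max - win + 1, step):
        # Average HG power within the window -> one feature per electrode.
        X = hg[:, :, start:start + win].mean(axis=-1)
        acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
        times.append(1000 * (start + win / 2) / sfreq)  # window centre, in ms
        scores.append(acc)
    return np.array(times), np.array(scores)
```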
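The cross-temporal regression analysis can be sketched in a similar hedged way: earlier activity in a seed region (e.g., dlPFC or TP) is used to predict later activity in a target region (e.g., SMG or angular gyrus) across trials, yielding an R² value for each pair of time windows. The snippet below assumes window-averaged HG arrays seed and target of shape (n_trials, n_electrodes, n_windows); the ridge regressor and cross-validated R² scoring are assumptions for illustration, not the reported method.

```python
# Hypothetical cross-temporal regression sketch between two regions.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

def cross_temporal_r2(seed, target):
    """R^2 matrix: rows index seed windows, columns index later target windows."""
    n_win = seed.shape[-1]
    r2 = np.full((n_win, n_win), np.nan)
    for t1 in range(n_win):
        X = seed[:, :, t1]                      # seed-region features at time t1
        for t2 in range(t1 + 1, n_win):         # only windows after the seed window
            y = target[:, :, t2].mean(axis=1)   # mean target-region HG at time t2
            r2[t1, t2] = cross_val_score(Ridge(alpha=1.0), X, y,
                                         cv=5, scoring="r2").mean()
    return r2
```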