Abstract
Communication involves the translation of sensory information (e.g., heard words) into abstract concepts according to abstract rules (e.g., the meaning of those words). Accordingly, using language involves an interplay between unimodal brain areas that process sensory information and transmodal areas that respond to linguistic input regardless of input modality (e.g., reading sentences or listening to speech). Previous work has shown that intrinsic functional connectivity (iFC), when estimated within individuals, can delineate a distributed language network (LANG) that overlaps in detail with regions activated by a reading task. The network is widely distributed across multiple brain regions, recapitulating an organization characteristic of association cortex and suggesting that the LANG network serves transmodal, not unimodal, functions. Here, we tested whether LANG encapsulates transmodal functions by assessing its degree of overlap with two language tasks: one auditory (i.e., listening to speech) and one visual (i.e., reading sentences). The results show that the LANG network aligns well with regions activated by both tasks, supporting a transmodal function. Further, the boundaries of the distributed language network along the lateral temporal cortex serve as a good proxy for the division between transmodal language and unimodal auditory functions: listening to sounds (i.e., filtered, incomprehensible speech) evoked activity that fell largely outside the LANG network but closely followed its boundaries. These findings indicate that individualized iFC estimates can delineate the division between sensory-linked and abstract linguistic functions. We conclude that within-individual iFC may be a viable tool for language mapping in individuals with aphasia who cannot perform language tasks in the scanner.