Abstract
The core use of human language is to transmit complex ideas from one mind to another. In everyday conversations, comprehension and production are intertwined, as speakers and listeners alternate roles. Nonetheless, the neural systems underlying these two faculties are typically studied in isolation, using paradigms that cannot capture interactive communication. Here, we used fMRI hyperscanning to simultaneously record brain activity from dyads engaged in real-time conversations. We used language model embeddings to quantify the degree to which production and comprehension rely on shared neural representations, both within and across brains. We found that the two processes recruit overlapping neural systems with similar neural tuning, spanning the cortical language network. Speaker-listener coupling extended beyond the language network into areas associated with social cognition. Our results suggest that the neural systems for speech comprehension and production align on common linguistic features encoded in a broad cortical network for language and communication.