Evidence Against Syntactic Encapsulation in Large Language Models


Abstract

Transformer-based large language models (LLMs) have recently demonstrated exceptional performance in a variety of linguistic tasks. LLMs primarily combine information across words in a sentence using the attention mechanism, implemented by "attention heads": these components assign numerical weights linking different words in the input to one another, capturing different relationships between these words. Some attention heads automatically learn to assign weights that accurately encode meaningful linguistic features including, importantly, heads that appear specialized for identifying particular syntactic dependencies. Are syntactic computations in such heads "encapsulated", i.e., impenetrable to the influence of non-syntactic information? Such encapsulated computations would be strikingly different from those of the human mind, where non-syntactic information sources (e.g., semantics) influence parsing from the earliest moments of online processing, and where syntax and semantics are tightly linked in the mental lexicon. Here, we tested whether the activity of "syntax-specialized" attention heads in transformer-based LLMs is modulated by one type of semantic information: plausibility. In each of three LLMs (BERT, GPT-2, and Llama 2), we first identified attention heads specialized for various dependency types; in nearly all cases tested, we then found that implausible semantic information reduces attention between the words that constitute the dependency for which a head is specialized. These results demonstrate that, even in attention heads that are the best a priori candidates for syntactic encapsulation, syntactic information is penetrable to semantics. These data are broadly consistent with the integration of syntax and semantics in human minds.
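The per-head weights the abstract describes are standard scaled dot-product attention: each head produces a matrix of weights in which row i gives how strongly token i attends to every other token. The sketch below is a minimal, self-contained illustration of that computation using random toy projections, not the actual internals or parameters of BERT, GPT-2, or Llama 2.

```python
import numpy as np

def attention_weights(Q, K):
    """Scaled dot-product attention weights for a single head.

    Q, K: (num_tokens, head_dim) query/key projections.
    Returns a (num_tokens, num_tokens) matrix; row i holds the
    weights token i assigns to all tokens, summing to 1.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # softmax over the key dimension (numerically stabilized)
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Toy example: 4 tokens with 8-dimensional head projections
# (random values purely for illustration).
rng = np.random.default_rng(0)
Q = rng.standard_normal((4, 8))
K = rng.standard_normal((4, 8))
W = attention_weights(Q, K)
print(W.shape)         # one weight per (query, key) token pair
print(W.sum(axis=-1))  # each row is a probability distribution
```

In the study's terms, a head "specialized" for a dependency would place an unusually large weight W[i, j] between the two tokens forming that dependency; the reported effect is that this weight shrinks when the sentence is made semantically implausible.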
