Hierarchical text classification (HTC) is a challenging task that requires classifiers to solve a series of multi-label subtasks while respecting hierarchical dependencies among labels. Recent studies have introduced prompt tuning to create closer connections between the language model (LM) and the complex label hierarchy. However, we find that the model's attention to the prompt gradually decreases as the prompt moves from the input to the output layer, revealing a limitation of previous prompt tuning methods for HTC. Given the success of prefix-tuning-based studies in natural language understanding tasks, we introduce Structural entroPy guIded pRefIx Tuning (SPIRIT). Specifically, we extract the essential structure of the label hierarchy via structural entropy minimization and decode the abstractive structural information as the prefix to prompt all intermediate layers in the LM. Additionally, a depth-wise reparameterization strategy is developed to enhance optimization and propagate the prefix throughout the LM layers. Extensive evaluation on four popular datasets demonstrates that SPIRIT achieves state-of-the-art performance.
SPIRIT: Structural Entropy Guided Prefix Tuning for Hierarchical Text Classification.
Authors: Zhu He, Xia Jinxiang, Liu Ruomei, Deng Bowen
| Journal: | Entropy | Impact factor: | 2.000 |
| Published: | 2025 | Citation: | 2025 Jan 26; 27(2):128 |
| DOI: | 10.3390/e27020128 | | |
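
The abstract is compact, so a brief unpacking may help. Structural entropy, in the graph formulation the paper builds on, measures the uncertainty of a random walk on a graph under a hierarchical encoding; in its simplest one-dimensional form it is $H^1(G) = -\sum_i \frac{d_i}{2m}\log_2\frac{d_i}{2m}$, where $d_i$ is the degree of node $i$ and $m$ is the number of edges, and minimizing a higher-dimensional variant over coding trees yields the "essential structure" of the label hierarchy the abstract refers to. The depth-wise reparameterization, meanwhile, can be pictured as a small shared network that emits a separate prefix for every transformer layer. The PyTorch sketch below illustrates only that idea; the module name, shapes, and bottleneck design are assumptions made for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

class DepthwisePrefix(nn.Module):
    """Hypothetical sketch of depth-wise prefix reparameterization:
    a shared low-rank prefix embedding plus a per-layer depth embedding
    pass through one MLP to produce a prefix for every LM layer."""

    def __init__(self, num_layers: int, prefix_len: int,
                 hidden: int, bottleneck: int = 64):
        super().__init__()
        # Learnable prefix slots, shared across depth (a stand-in for the
        # decoded structural information described in the abstract).
        self.base = nn.Parameter(torch.randn(prefix_len, bottleneck))
        # Depth embedding lets the shared MLP specialize each layer's prefix.
        self.depth = nn.Embedding(num_layers, bottleneck)
        self.mlp = nn.Sequential(
            nn.Linear(bottleneck, bottleneck),
            nn.Tanh(),
            nn.Linear(bottleneck, hidden),
        )

    def forward(self) -> torch.Tensor:
        # Broadcast-add depth embeddings to the prefix slots, then project:
        # the result has shape (num_layers, prefix_len, hidden).
        mixed = self.base.unsqueeze(0) + self.depth.weight.unsqueeze(1)
        return self.mlp(mixed)

prefixes = DepthwisePrefix(num_layers=12, prefix_len=8, hidden=768)()
print(prefixes.shape)  # torch.Size([12, 8, 768])
```

Because every layer's prefix flows through one shared reparameterization network, gradients from all layers update the same parameters, which is the usual motivation for reparameterized rather than directly optimized prefixes.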
