Low-resource MobileBERT for emotion recognition in imbalanced text datasets mitigating challenges with limited resources


Abstract

Modern dialogue systems rely on emotion recognition in conversation (ERC) as a core component for empathetic, human-like interaction. However, the weak correlation between emotions and semantics poses a significant challenge: semantically similar utterances can express different emotions depending on the context or speaker. To tackle this challenge, this paper proposes a novel loss function, Focal Weighted Loss (FWL), combined with adversarial training and the compact language model MobileBERT. FWL addresses imbalanced emotion classification without requiring large batch sizes or additional computational resources. We evaluate our approach on four text emotion recognition benchmark datasets, MELD, EmoryNLP, DailyDialog and IEMOCAP, where it achieves competitive performance. Extensive experiments on these benchmarks validate the effectiveness of FWL with adversarial training, enabling more human-like interactions on digital platforms and showing that performance comparable to large language models can be delivered under limited resource constraints.
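The abstract does not give the exact formulation of Focal Weighted Loss. As a rough illustration only, the name suggests the standard building blocks of focal loss (a focusing factor that down-weights easy, confidently classified examples) combined with per-class weights for rare emotion labels. A minimal sketch under that assumption, with all names and parameters hypothetical, might look like:

```python
import math

def focal_weighted_loss(probs, labels, class_weights, gamma=2.0):
    """Hypothetical sketch of a focal loss with per-class weights.

    probs:         predicted probability of the true class, per sample
    labels:        true class index, per sample
    class_weights: one weight per class (larger for rare emotions)
    gamma:         focusing parameter; gamma=0 recovers weighted
                   cross-entropy, larger gamma down-weights easy samples
    """
    total = 0.0
    for p, y in zip(probs, labels):
        # (1 - p)^gamma shrinks the loss of well-classified samples,
        # so training focuses on hard (often minority-class) examples.
        total += -class_weights[y] * (1.0 - p) ** gamma * math.log(p)
    return total / len(probs)
```

With gamma = 2, a confidently correct prediction (p = 0.9) contributes roughly a thousandth of the loss of a hard one (p = 0.2), which is the mechanism that lets imbalanced classes be handled without enlarging the batch size.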
