A deep sentiment model combining ALBERT-driven context and EHO-optimized architecture


Abstract

Sentiment analysis, a branch of natural language processing, focuses on detecting the emotions or opinions conveyed in user-generated content, most often in text form. Recent advances in natural language processing, particularly the adoption of neural network architectures such as LSTMs and bidirectional LSTMs, have considerably enhanced text analysis. However, state-of-the-art transformer models such as BERT are computationally expensive due to their large number of parameters, making them resource-intensive for real-world applications. In this study, we propose a unified framework that integrates ALBERT embeddings, a Bi-GRU, a Bi-LSTM, and a multi-head attention mechanism, together with Elk Herd Optimization (EHO), to address these challenges. Specifically, ALBERT embeddings are employed to enhance contextual representation, while the combined Bi-GRU and Bi-LSTM architecture, augmented with multi-head attention, strengthens the model's capacity to capture salient sentiment-related features. Elk Herd Optimization is employed to improve parameter tuning and avoid local minima during training, thereby enhancing overall classification robustness, particularly on imbalanced and noisy sentiment datasets. The proposed model is evaluated using accuracy and F-score on benchmark sentiment analysis datasets. On Laptop14, Twitter, Restaurant14, Restaurant15, and Restaurant16, it achieves average improvements of 6.1% and 6.5%, 6.4% and 5.9%, 4.0% and 4.8%, 0.6% and 18.4%, and 2.7% and 5.7% in accuracy and F-score, respectively. On the SST-5 dataset, the proposed model achieves average improvements of 16.8% and 17.8% over graph-embedding and hypergraph-based models. Furthermore, an extensive ablation study is conducted to analyze the impact of the learning rate on accuracy, along with assessments of execution time and memory consumption.
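
The pipeline described in the abstract (ALBERT embeddings feeding stacked bidirectional recurrent layers, topped with multi-head attention and a classifier) can be sketched roughly as below. This is a minimal illustrative sketch assuming a PyTorch/HuggingFace stack; the checkpoint name `albert-base-v2`, the hidden sizes, head count, layer ordering, and mean-pooling strategy are all assumptions rather than the authors' reported configuration, and the EHO-based parameter tuning is not shown.

```python
# Minimal sketch of an ALBERT -> Bi-GRU -> Bi-LSTM -> multi-head attention
# sentiment classifier. Layer sizes and wiring are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AlbertModel, AlbertTokenizer

class AlbertBiRnnAttention(nn.Module):
    def __init__(self, num_classes=3, hidden=128, heads=4):
        super().__init__()
        # Pretrained ALBERT supplies contextual token embeddings.
        self.albert = AlbertModel.from_pretrained("albert-base-v2")
        emb = self.albert.config.hidden_size  # 768 for albert-base-v2
        # Stacked bidirectional GRU and LSTM over the token embeddings.
        self.bigru = nn.GRU(emb, hidden, batch_first=True, bidirectional=True)
        self.bilstm = nn.LSTM(2 * hidden, hidden, batch_first=True,
                              bidirectional=True)
        # Multi-head self-attention highlights sentiment-salient tokens.
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        x = self.albert(input_ids=input_ids,
                        attention_mask=attention_mask).last_hidden_state
        x, _ = self.bigru(x)
        x, _ = self.bilstm(x)
        # key_padding_mask expects True at padding positions.
        x, _ = self.attn(x, x, x, key_padding_mask=~attention_mask.bool())
        # Mean-pool over valid tokens, then classify.
        mask = attention_mask.unsqueeze(-1).float()
        pooled = (x * mask).sum(1) / mask.sum(1).clamp(min=1e-9)
        return self.classifier(pooled)

tokenizer = AlbertTokenizer.from_pretrained("albert-base-v2")
model = AlbertBiRnnAttention()
batch = tokenizer(["The battery life is great."], return_tensors="pt",
                  padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])
```

In a sketch like this, EHO (or any population-based optimizer) would sit outside the model, proposing candidate hyperparameters such as the learning rate and hidden sizes and keeping the candidates with the best validation score; the abstract does not specify which parameters EHO tunes, so that loop is omitted here.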
