Abstract
Sentiment analysis, a branch of natural language processing (NLP), focuses on detecting the emotions or opinions conveyed in user-generated content, most often text. Recent advances in NLP, particularly the adoption of neural architectures such as LSTMs and bidirectional LSTMs, have considerably enhanced text analysis. However, state-of-the-art transformer models such as BERT are computationally expensive due to their large number of parameters, making them resource-intensive for real-world applications. In this study, we propose a unified framework that integrates ALBERT embeddings, a Bi-GRU, a Bi-LSTM, and a multihead attention mechanism, with Elk Herd Optimization (EHO) applied for parameter tuning, to address these challenges. Specifically, ALBERT embeddings enhance contextual representation, while the combined Bi-GRU and Bi-LSTM architecture, augmented with multihead attention, strengthens the model's capacity to capture salient sentiment-related features. EHO improves parameter tuning and helps avoid local minima during training, thereby enhancing classification robustness, particularly on imbalanced and noisy sentiment datasets. The proposed model is evaluated using Accuracy and F-score on benchmark sentiment analysis datasets. On Laptop14, Twitter, Restaurant14, Restaurant15, and Restaurant16, it achieves average improvements of 6.1% and 6.5%, 6.4% and 5.9%, 4% and 4.8%, 0.6% and 18.4%, and 2.7% and 5.7% in Accuracy and F-score, respectively. On the SST-5 dataset, the proposed model achieves average improvements of 16.8% and 17.8% over graph-embedding and hypergraph-based models. Furthermore, an extensive ablation study analyzes the impact of the learning rate on accuracy, along with assessments of execution time and memory consumption.
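The role EHO plays in parameter tuning can be illustrated with a minimal population-based search. The sketch below is a simplified, generic metaheuristic loosely inspired by the bull/harem structure of Elk Herd Optimization; the objective function, bounds, movement rule, and all parameter names are illustrative assumptions, not the exact update equations used in the paper.

```python
import random

def eho_sketch(objective, bounds, pop_size=30, generations=60,
               bull_frac=0.2, seed=0):
    """Simplified population-based search loosely inspired by Elk Herd
    Optimization (EHO). The bull/harem split and the movement rule are
    illustrative assumptions, not the paper's exact algorithm."""
    rng = random.Random(seed)
    lo, hi = bounds
    herd = [rng.uniform(lo, hi) for _ in range(pop_size)]
    n_bulls = max(1, int(bull_frac * pop_size))
    best = min(herd, key=objective)
    for _ in range(generations):
        herd.sort(key=objective)
        bulls = herd[:n_bulls]          # fittest candidates lead the herd
        new_herd = list(bulls)          # elitism: bulls survive unchanged
        for elk in herd[n_bulls:]:
            bull = rng.choice(bulls)
            # move toward a bull with a random step plus Gaussian noise,
            # which helps the search escape local minima
            step = rng.random() * (bull - elk)
            cand = elk + step + rng.gauss(0, 0.05 * (hi - lo))
            new_herd.append(min(max(cand, lo), hi))  # clip to bounds
        herd = new_herd
        gen_best = min(herd, key=objective)
        if objective(gen_best) < objective(best):
            best = gen_best
    return best

# Toy surrogate: a hypothetical validation loss as a function of the
# learning rate, with its minimum at lr = 0.003
loss = lambda lr: (lr - 0.003) ** 2
best_lr = eho_sketch(loss, bounds=(1e-4, 1e-1))
```

In this toy run the search converges near the surrogate's optimum; in the actual framework the objective would instead be validation performance of the Bi-GRU/Bi-LSTM model, and the search space would cover its tunable parameters.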