Position-context additive transformer-based model for classifying text data on social media


Abstract

In recent years, the continuing growth of text data on social media has motivated reliance on pre-training methods to develop new text classification models, especially transformer-based models, which have proven effective in most natural language processing tasks. This paper introduces a new Position-Context Additive transformer-based model (PCA model) that consists of two phases and aims to increase the accuracy of text classification on social media. Phase I develops a new way to extract text characteristics by attending to the position and context of each word in the input layer: an improved word embedding method captures position, and it is integrated with a Bi-LSTM network that strengthens the connection of each word to the words around it (the context). Phase II focuses on developing a transformer-based model built primarily on an improved additive attention mechanism. The PCA model was evaluated on the classification of health-related social media texts across six datasets. Results showed F1-score improvements of 0.2% to 10.2% on five datasets compared with the best published results. The PCA model was also compared with three transformer-based models known for high accuracy in text classification; it outperformed them on four datasets, with F1-score improvements of 0.1% to 2.1%. The results also indicate a direct correlation between the volume of training data and performance: larger training sets yield higher F1-scores.
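The two phases described above rest on two generic building blocks: position-aware input embeddings and additive (Bahdanau-style) attention. The following is a minimal NumPy sketch of those generic mechanisms only; the paper's specific improvements to the embedding method and the attention formula are not reproduced here, and the standard sinusoidal encoding is used as a stand-in for the "improved word embedding method".

```python
import numpy as np

rng = np.random.default_rng(0)
T, d, a = 5, 8, 4  # sequence length, embedding size, attention size (toy values)

# Phase I idea: inject each word's position into the input layer.
# Sinusoidal encoding is one standard choice (assumption, not the paper's method).
pos = np.arange(T)[:, None]
i = np.arange(d)[None, :]
angles = pos / np.power(10000.0, (2 * (i // 2)) / d)
pe = np.where(i % 2 == 0, np.sin(angles), np.cos(angles))  # (T, d)

X = rng.normal(size=(T, d)) + pe  # position-aware word vectors

# Phase II idea: additive attention over a sequence of hidden states.
# In the paper these states would come from the Bi-LSTM; here the
# position-aware inputs stand in for them for brevity.
W1 = rng.normal(size=(d, a))
b = rng.normal(size=(a,))
v = rng.normal(size=(a,))

scores = np.tanh(X @ W1 + b) @ v       # (T,) one relevance score per word
weights = np.exp(scores - scores.max())
weights /= weights.sum()               # softmax over the sequence
context = weights @ X                  # (d,) attention-weighted summary vector

print(context.shape)                   # (8,)
```

In a full classifier, the `context` vector would feed a final dense layer with a softmax over class labels; the attention weights also give a per-word relevance score that can be inspected for interpretability.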
