With the rapid development of computer technology, the loss of long-distance information during transmission has become a prominent problem in English machine translation. This study combines the self-attention (SA) mechanism with a convolutional neural network (CNN) and a long short-term memory (LSTM) network, proposes an English intelligent translation model based on LSTM-SA, and compares its performance with other deep neural network models. SA is added to the LSTM neural network to construct an LSTM-SA attention-embedded English translation model. Compared with other deep learning algorithms such as RNN and GRU, the LSTM-SA algorithm converges faster and reaches a lower loss, with the loss value finally stabilizing at about 8.6. Under all three adaptability values, the accuracy of the LSTM-SA network is higher than that of LSTM alone; when the adaptability value is 1, its accuracy improves the fastest, by nearly 20%. The LSTM-SA algorithm also achieves better translation performance than the other algorithms under all three hidden-layer settings. The proposed LSTM-SA model thus supports English intelligent translation more effectively, enhances the representation of source-language context, and improves the performance and quality of the English machine translation model.
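The key idea of attention embedding is to apply scaled dot-product self-attention over the sequence of LSTM hidden states, so each output position is a context-weighted mix of the whole source sentence rather than a single recurrent state. The sketch below is a minimal NumPy illustration of that mechanism only; the dimensions, random projection matrices, and toy hidden states are illustrative assumptions, not the paper's trained model or configuration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(H, seed=0):
    """Scaled dot-product self-attention over LSTM hidden states H of shape (T, d)."""
    T, d = H.shape
    rng = np.random.default_rng(seed)
    # Illustrative random query/key/value projections; in the real model these are learned.
    Wq, Wk, Wv = (rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(3))
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = Q @ K.T / np.sqrt(d)   # (T, T) alignment scores between positions
    A = softmax(scores, axis=-1)    # attention weights; each row sums to 1
    return A @ V, A                 # context-enhanced states and the weight matrix

# Toy "LSTM hidden states" for a 5-token sentence with an 8-dimensional state.
H = np.random.default_rng(1).standard_normal((5, 8))
context, A = self_attention(H)
```

Each row of `A` shows how strongly one token attends to every other token, which is how the model keeps long-distance source information available at every output step.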
Application of LSTM Neural Network Technology Embedded in English Intelligent Translation
Author: Yang, Yifang
| Journal: | Computational Intelligence and Neuroscience | Impact factor: | 0.000 |
| Date: | 2022 | Pages: | 2022 Sep 27; 2022:1085577 |
| DOI: | 10.1155/2022/1085577 | | |
