ITM Web of Conferences (Jan 2022)

Neural machine translation model combining dependency syntax and LSTM

  • Zheng Xin,
  • Chen Hailong,
  • Ma Yuqun,
  • Wang Qing

DOI
https://doi.org/10.1051/itmconf/20224702038
Journal volume & issue
Vol. 47
p. 02038

Abstract


To address the Transformer neural machine translation model's lack of linguistic knowledge and the insufficient flexibility of its positional encoding, this paper introduces dependency syntax analysis and a long short-term memory (LSTM) network. Source-language syntactic structure information is incorporated into the neural machine translation system, and more accurate position information is obtained by exploiting the memory characteristics of the LSTM. Experiments show that the improved model gains 1.23 BLEU points on the IWSLT14 Chinese-English translation task.
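The abstract does not give implementation details, but the core idea of replacing fixed positional encoding with LSTM memory can be sketched as follows. This is a minimal, illustrative PyTorch sketch under that assumption: the `LSTMPositionalEncoding` module and all names and hyperparameters below are hypothetical, not taken from the paper.

```python
import torch
import torch.nn as nn

class LSTMPositionalEncoding(nn.Module):
    """Illustrative module: derive position-aware representations by
    running a single-layer LSTM over the token embeddings instead of
    adding fixed sinusoidal positional encodings."""

    def __init__(self, d_model: int):
        super().__init__()
        # batch_first=True: input shape is (batch, seq_len, d_model)
        self.lstm = nn.LSTM(d_model, d_model, num_layers=1, batch_first=True)

    def forward(self, token_embeddings: torch.Tensor) -> torch.Tensor:
        # The LSTM's recurrence carries order information, so its
        # outputs implicitly encode each token's sequence position.
        position_aware, _ = self.lstm(token_embeddings)
        # Residual connection preserves the original embedding content.
        return token_embeddings + position_aware

# Usage: feed the position-aware embeddings to a standard Transformer
# encoder layer (dimensions chosen arbitrarily for the example).
embed = nn.Embedding(10000, 512)
pos_enc = LSTMPositionalEncoding(512)
encoder_layer = nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True)

tokens = torch.randint(0, 10000, (2, 16))  # (batch, seq_len)
x = pos_enc(embed(tokens))                 # position-aware embeddings
out = encoder_layer(x)                     # (2, 16, 512)
```

Because the recurrence is learned rather than fixed, this kind of positional signal can adapt to the training data, which is the flexibility gain the abstract alludes to.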
