Applied Sciences (Sep 2021)

Reusing Monolingual Pre-Trained Models by Cross-Connecting Seq2seq Models for Machine Translation

  • Jiun Oh,
  • Yong-Suk Choi

DOI
https://doi.org/10.3390/app11188737
Journal volume & issue
Vol. 11, no. 18
p. 8737

Abstract


This work uses sequence-to-sequence (seq2seq) models pre-trained on monolingual corpora for machine translation. We pre-train two seq2seq models on monolingual corpora for the source and target languages, then combine the encoder of the source-language model with the decoder of the target-language model, i.e., the cross-connection. Because the two modules are pre-trained completely independently, we add an intermediate layer between the pre-trained encoder and decoder to help them map to each other. These monolingual pre-trained models can serve as a multilingual pre-trained model, because one model can be cross-connected with another model pre-trained on any other language, while its capacity is not affected by the number of languages. We demonstrate that our method improves translation performance significantly over the random baseline. Moreover, we analyze the appropriate choice of the intermediate layer, the importance of each part of a pre-trained model, and how performance changes with the size of the bitext.
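
The cross-connection described in the abstract can be sketched roughly as below. This is a minimal PyTorch illustration under stated assumptions, not the authors' implementation: the TransformerEncoder/TransformerDecoder stand-ins, the bridge's Linear-GELU-LayerNorm composition, and the model dimension of 512 are all hypothetical choices.

```python
import torch
import torch.nn as nn

d_model = 512

# Stand-ins for the two independently pre-trained seq2seq models; in practice
# the encoder and decoder weights would be loaded from the monolingual
# pre-training checkpoints of the source and target languages (assumption).
src_encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True), num_layers=6
)
tgt_decoder = nn.TransformerDecoder(
    nn.TransformerDecoderLayer(d_model, nhead=8, batch_first=True), num_layers=6
)

# Intermediate (bridge) layer inserted between the cross-connected encoder and
# decoder to map the encoder's representation space toward the decoder's.
bridge = nn.Sequential(
    nn.Linear(d_model, d_model),
    nn.GELU(),
    nn.LayerNorm(d_model),
)

def translate_step(src_embeds, tgt_embeds):
    """One forward pass: encode source, map through the bridge, decode target."""
    memory = bridge(src_encoder(src_embeds))   # (batch, src_len, d_model)
    return tgt_decoder(tgt_embeds, memory)     # decoder cross-attends to the bridged memory

# Toy usage with already-embedded token sequences.
out = translate_step(torch.randn(2, 10, d_model), torch.randn(2, 7, d_model))
print(out.shape)  # torch.Size([2, 7, 512])
```

In this sketch only the bridge is newly initialized; the encoder and decoder start from their respective monolingual pre-trained weights and the whole pipeline is then fine-tuned on the bitext.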

Keywords