Jisuanji Kexue (Computer Science), Jan 2022
Incorporating Language-specific Adapter into Multilingual Neural Machine Translation
Abstract
Multilingual neural machine translation (mNMT) uses a single encoder-decoder model to translate between multiple language pairs. mNMT can encourage knowledge transfer among related languages, improve low-resource translation, and enable zero-shot translation. However, existing mNMT models are weak at modeling language diversity and perform poorly on zero-shot translation. To address these problems, we first propose a variable-dimension bilingual adapter based on the existing adapter architecture. Bilingual adapters are inserted between every two Transformer sub-layers to extract language-pair-specific features, and the language-pair-specific capacity of the encoder or decoder can be adjusted by changing the inner dimension of the adapters. We then propose a shared monolingual adapter to model the unique features of each language. Experiments on the IWSLT dataset show that the proposed model remarkably outperforms the multilingual baseline, and that the monolingual adapter improves zero-shot translation without degrading multilingual translation performance.
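To make the adapter design concrete, the following is a minimal PyTorch sketch of the standard residual bottleneck adapter that the abstract says it builds on. The class name `Adapter`, the parameter names `model_dim` and `inner_dim`, and the per-language-pair `ModuleDict` keying are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Residual bottleneck adapter inserted after a Transformer sub-layer.

    A sketch of the standard adapter design: the inner (bottleneck)
    dimension controls the language-pair-specific capacity, so varying
    inner_dim per pair is one way to realize a "variable-dimension"
    adapter as described in the abstract.
    """

    def __init__(self, model_dim: int, inner_dim: int):
        super().__init__()
        self.layer_norm = nn.LayerNorm(model_dim)
        self.down_proj = nn.Linear(model_dim, inner_dim)  # project down to bottleneck
        self.up_proj = nn.Linear(inner_dim, model_dim)    # project back up
        self.activation = nn.ReLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The residual connection preserves the base model's output, so the
        # adapter only learns a small language-pair-specific correction.
        return x + self.up_proj(self.activation(self.down_proj(self.layer_norm(x))))


# Hypothetical usage: one adapter per language pair, with different inner
# dimensions allocating different capacity to different pairs.
adapters = nn.ModuleDict({
    "en-de": Adapter(model_dim=512, inner_dim=256),
    "en-fr": Adapter(model_dim=512, inner_dim=64),
})
hidden = torch.randn(8, 20, 512)          # (batch, sequence, model_dim)
hidden = adapters["en-de"](hidden)        # apply the pair-specific adapter
```

Under this reading, the base multilingual model stays shared across all pairs while each adapter adds a small, independently sized set of pair-specific parameters between sub-layers.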
Keywords