Applied Sciences (May 2024)

Branch-Transformer: A Parallel Branch Architecture to Capture Local and Global Features for Language Identification

  • Zeen Li,
  • Shuanghong Liu,
  • Zhihua Fang,
  • Liang He

DOI
https://doi.org/10.3390/app14114681
Journal volume & issue
Vol. 14, no. 11
p. 4681

Abstract


Transformer and conformer models are increasingly used for language identification and achieve outstanding results. Transformer models based on self-attention, however, capture only global information and miss finer local details. Other approaches therefore employ conformer models, which stack convolutional neural networks and transformers to capture both local and global information. However, this static single-branch architecture is difficult to interpret and modify, and it incurs higher inference and computational costs than dual-branch models. In this paper, we therefore propose a novel model called Branch-transformer (B-transformer). In contrast to the traditional transformer, it consists of a parallel dual-branch structure. One branch uses self-attention to capture global information, while the other employs a Convolutional Gated Multi-Layer Perceptron (cgMLP) module to extract local information. We also investigate various methods for fusing the global and local information and experimentally validate the effectiveness of our approach on the NIST LRE 2017 dataset.
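To make the dual-branch idea concrete, below is a minimal PyTorch sketch of one such block. It is not the authors' implementation: the module names, dimensions, kernel size, and the concatenation-plus-projection fusion are illustrative assumptions. One branch applies self-attention for global context, the other applies a convolutional gated MLP for local context, and the two outputs are fused before the residual connection.

```python
import torch
import torch.nn as nn


class ConvGatedMLP(nn.Module):
    """Local branch: a cgMLP-style sketch.

    Projects channels up, gates half of them with a depthwise convolution
    over time, and projects back down (dimensions are assumptions).
    """

    def __init__(self, d_model: int, expansion: int = 4, kernel_size: int = 31):
        super().__init__()
        d_hidden = d_model * expansion
        self.norm = nn.LayerNorm(d_model)
        self.proj_in = nn.Linear(d_model, d_hidden)
        # Depthwise convolution acts only along the time axis, per channel.
        self.dw_conv = nn.Conv1d(
            d_hidden // 2, d_hidden // 2, kernel_size,
            padding=kernel_size // 2, groups=d_hidden // 2,
        )
        self.proj_out = nn.Linear(d_hidden // 2, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, time, d_model)
        y = self.proj_in(self.norm(x))
        value, gate = y.chunk(2, dim=-1)                  # split channels for gating
        gate = self.dw_conv(gate.transpose(1, 2)).transpose(1, 2)
        return self.proj_out(value * gate)                # gated local features


class BranchTransformerBlock(nn.Module):
    """Parallel dual-branch block: self-attention (global) in parallel with a
    cgMLP branch (local), fused here by concatenation and a linear projection,
    one of several possible fusion strategies."""

    def __init__(self, d_model: int = 256, n_heads: int = 4):
        super().__init__()
        self.attn_norm = nn.LayerNorm(d_model)
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.local = ConvGatedMLP(d_model)
        self.fuse = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, time, d_model)
        g = self.attn_norm(x)
        g, _ = self.attn(g, g, g)        # global branch (self-attention)
        l = self.local(x)                # local branch (cgMLP)
        return x + self.fuse(torch.cat([g, l], dim=-1))  # fusion + residual


if __name__ == "__main__":
    feats = torch.randn(8, 200, 256)              # (batch, frames, features)
    print(BranchTransformerBlock()(feats).shape)  # torch.Size([8, 200, 256])
```

Because the branches run in parallel rather than in series (as in a conformer block), either branch or the fusion step can be inspected or swapped independently, which is the modifiability argument made in the abstract.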

Keywords