IEEE Access (Jan 2020)

Multi-Attention-Based Capsule Network for Uyghur Personal Pronouns Resolution

  • Qimeng Yang,
  • Long Yu,
  • Shengwei Tian,
  • Jinmiao Song

DOI
https://doi.org/10.1109/ACCESS.2020.2989665
Journal volume & issue
Vol. 8, pp. 76832–76840

Abstract

Anaphora resolution in Uyghur is a challenging task because of the language's complex structure and limited corpus. We propose a multi-attention-based capsule network model for Uyghur personal pronoun resolution, which can effectively obtain multi-layer and implicit semantic information. An independently recurrent neural network (IndRNN) is applied in this model to capture long-distance interdependent features. Moreover, the capsule network can extract richer textual information to improve expressive ability. Compared with a single-attention model combined with Long Short-Term Memory (LSTM), the multi-attention-based capsule network can capture multi-layer semantic information through a multi-attention mechanism without using any external parsing results. Experimental results on a Uyghur dataset show that our approach surpasses state-of-the-art models, achieving the highest F-score of 83.85%. These results demonstrate that the proposed method effectively improves the performance of Uyghur personal pronoun resolution.
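The IndRNN mentioned in the abstract differs from a standard RNN in that its recurrent connection is elementwise (one scalar weight per neuron) rather than a full matrix, which is what allows it to model long-distance dependencies stably. As a rough, hedged sketch of that recurrence only (not the paper's full architecture; all weights and dimensions here are illustrative):

```python
def indrnn_step(x_t, h_prev, W, u, b):
    """One IndRNN step with ReLU activation.

    Key property: the recurrent weight u is a vector, applied
    elementwise (u[i] * h_prev[i]), so each neuron's hidden state
    depends only on its own past -- unlike a vanilla RNN, where a
    full recurrent matrix mixes all neurons at every step.
    """
    n_in, n_hid = len(x_t), len(h_prev)
    return [
        max(0.0,  # ReLU
            sum(W[i][j] * x_t[j] for j in range(n_in))
            + u[i] * h_prev[i]
            + b[i])
        for i in range(n_hid)
    ]

def indrnn(sequence, W, u, b):
    """Run the IndRNN over a sequence of input vectors."""
    h = [0.0] * len(u)
    for x_t in sequence:
        h = indrnn_step(x_t, h, W, u, b)
    return h
```

With a 1-unit toy configuration (W=[[1.0]], u=[0.5], b=[0.0]) and inputs [[1.0], [1.0]], the hidden state after two steps is relu(1.0 + 0.5 * 1.0) = 1.5, showing how the elementwise recurrent weight scales each neuron's own history.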

Keywords