IEEE Access (Jan 2023)

Multi-Head Self-Attention Gated-Dilated Convolutional Neural Network for Word Sense Disambiguation

  • Chun-Xiang Zhang,
  • Yu-Long Zhang,
  • Xue-Yao Gao

DOI
https://doi.org/10.1109/ACCESS.2023.3243574
Journal volume & issue
Vol. 11
pp. 14202 – 14210

Abstract

Word sense disambiguation (WSD) determines the correct sense of an ambiguous word based on its context. WSD is widely used in text classification, machine translation, information retrieval, and other tasks. WSD accuracy remains low because disambiguation features cannot cover enough linguistic phenomena and the discriminative ability of WSD models is limited. To improve the accuracy of simplified Chinese WSD, a WSD model based on multi-head self-attention and a gated-dilated convolutional neural network (AGDCNN) is proposed. With the ambiguous word as the center, 4 adjacent lexical units are extracted successively toward its left and right sides. Words, parts of speech, and semantic categories in the adjacent lexical units are vectorized, and the vectorized results are input into the gated-dilated convolutional neural network to obtain discriminative features. Then, multi-head self-attention is adopted to fully learn the differences and connections among the discriminative features. Finally, classification weights are output from an adaptive average pooling layer. Experiments are conducted on SemEval-2007: Task#5 and SemEval-2021: Task#2. Experimental results show that the AGDCNN model achieves higher accuracy than other methods. Our goal is to improve the quality of simplified Chinese WSD as much as possible given current linguistic resources and machine learning methods. The challenge is to extract effective discriminative features and design a high-quality disambiguation model. Our novelty lies in combining gated-dilated convolution with multi-head self-attention to extract effective discriminative features and learn their differences and connections from word forms, parts of speech, and semantic categories.
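The pipeline described in the abstract (vectorized lexical-unit context → gated-dilated convolution → multi-head self-attention → adaptive average pooling → sense scores) can be sketched as follows. This is a minimal numpy illustration, not the authors' implementation: the sequence length (a 9-unit window: the ambiguous word plus 4 lexical units on each side), embedding dimension, kernel size, dilation rate, head count, and number of senses are all assumed values, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def gated_dilated_conv(x, Wf, Wg, dilation=2):
    """GLU-style gated dilated 1D convolution over a (T, d) sequence.

    Wf, Wg: (k, d, d) kernels for the feature path and the gate path.
    Output at position t mixes inputs t, t+dilation, ... with 'same' padding,
    then multiplies the feature path by a sigmoid gate.
    """
    T, d = x.shape
    k = Wf.shape[0]
    pad = (k - 1) * dilation // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    feat = np.zeros((T, d))
    gate = np.zeros((T, d))
    for t in range(T):
        for j in range(k):
            feat[t] += xp[t + j * dilation] @ Wf[j]
            gate[t] += xp[t + j * dilation] @ Wg[j]
    return feat * (1.0 / (1.0 + np.exp(-gate)))  # element-wise gating

def multi_head_self_attention(x, Wq, Wk, Wv, heads=4):
    """Scaled dot-product self-attention with per-head slicing of Q, K, V."""
    T, d = x.shape
    dh = d // heads
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    outs = []
    for h in range(heads):
        q = Q[:, h * dh:(h + 1) * dh]
        k = K[:, h * dh:(h + 1) * dh]
        v = V[:, h * dh:(h + 1) * dh]
        attn = softmax(q @ k.T / np.sqrt(dh))  # (T, T) attention weights
        outs.append(attn @ v)
    return np.concatenate(outs, axis=1)  # heads re-joined to (T, d)

# Assumed sizes: 9 lexical units (center word +/- 4), 16-dim concatenated
# word/POS/semantic-category vectors, kernel size 3, 4 candidate senses.
T, d, k, senses = 9, 16, 3, 4
x = rng.standard_normal((T, d))                 # vectorized context window
Wf = rng.standard_normal((k, d, d)) * 0.1
Wg = rng.standard_normal((k, d, d)) * 0.1
Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
Wo = rng.standard_normal((d, senses)) * 0.1

h = gated_dilated_conv(x, Wf, Wg)               # discriminative features
h = multi_head_self_attention(h, Wq, Wk, Wv)    # relate features across units
pooled = h.mean(axis=0)                         # adaptive average pooling to (d,)
probs = softmax(pooled @ Wo)                    # scores over candidate senses
```

The gating lets the convolution suppress uninformative context positions, while the dilation widens the receptive field over the 9-unit window without extra parameters; self-attention then relates features from distant lexical units before pooling.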

Keywords