Journal of Applied Mathematics (Jan 2021)

Key n-Gram Extractions and Analyses of Different Registers Based on Attention Network

  • Haiyan Wu,
  • Ying Liu,
  • Shaoyun Shi,
  • Qingfeng Wu,
  • Yunlong Huang

DOI: https://doi.org/10.1155/2021/5264090
Journal volume & issue: Vol. 2021

Abstract


Key n-gram extraction can be seen as extracting the n-grams that distinguish different registers. Keyword extraction (the case n=1, since a 1-gram is a keyword) models are generally designed from two aspects: feature extraction and model design. By summarizing the advantages and disadvantages of existing models, we propose a novel key n-gram extraction model, the "attentive n-gram network" (ANN), based on the attention mechanism and a multilayer perceptron, in which the attention mechanism scores each n-gram in a sentence by mining the internal semantic relationships between words, and the importance of each n-gram is given by its score. Experimental results on a real corpus show that the key n-grams extracted by our model distinguish novels, news, and textbooks very well, and the accuracy of our model is significantly higher than that of the baseline models. We also conduct clustering experiments on the key n-grams extracted from these registers, and they turn out to be well clustered. Furthermore, we carry out statistical analyses of the key n-gram extraction results and find that the key n-grams extracted by our model are highly interpretable from a linguistic perspective.
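
The sketch below illustrates the general idea the abstract describes: word embeddings are composed into n-gram vectors, an attention layer scores each n-gram, and a multilayer perceptron predicts the register from the attention-weighted sentence representation. The embedding size, n-gram length, window composition, and classifier head here are illustrative assumptions, not the authors' exact ANN architecture.

```python
# Minimal sketch of attention-based n-gram scoring with an MLP register
# classifier, in the spirit of the "attentive n-gram network" (ANN).
# Hyperparameters and layer choices are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AttentiveNGramNetwork(nn.Module):
    def __init__(self, vocab_size, embed_dim=64, n=2, num_registers=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Composes each n-gram window of word embeddings into one vector.
        self.ngram_proj = nn.Linear(n * embed_dim, embed_dim)
        # Scores each composed n-gram; the softmax over these scores is the
        # attention distribution that ranks n-gram importance.
        self.score = nn.Linear(embed_dim, 1)
        # Multilayer perceptron that predicts the register from the
        # attention-weighted sentence representation.
        self.mlp = nn.Sequential(
            nn.Linear(embed_dim, embed_dim),
            nn.ReLU(),
            nn.Linear(embed_dim, num_registers),
        )
        self.n = n

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices.
        emb = self.embed(token_ids)                      # (B, T, D)
        # Slide a window of length n over the sentence and concatenate
        # the word embeddings inside each window.
        windows = emb.unfold(1, self.n, 1)               # (B, T-n+1, D, n)
        windows = windows.transpose(2, 3).flatten(2)     # (B, T-n+1, n*D)
        ngram_vecs = torch.tanh(self.ngram_proj(windows))
        attn = F.softmax(self.score(ngram_vecs).squeeze(-1), dim=1)
        # Sentence vector = attention-weighted sum of n-gram vectors.
        sent = torch.einsum("bt,btd->bd", attn, ngram_vecs)
        logits = self.mlp(sent)
        # attn gives the per-n-gram importance scores used for extraction.
        return logits, attn


# Toy usage: score bigrams in a batch of two 6-token "sentences".
model = AttentiveNGramNetwork(vocab_size=1000, n=2)
tokens = torch.randint(0, 1000, (2, 6))
logits, ngram_scores = model(tokens)
print(logits.shape, ngram_scores.shape)  # torch.Size([2, 3]) torch.Size([2, 5])
```

Once such a model is trained to classify registers, the attention weights can be aggregated over a corpus and the highest-scoring n-grams read off as the key n-grams for each register.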