IEEE Access (Jan 2019)

Text Classification Based on Conditional Reflection

  • Yanliang Jin,
  • Can Luo,
  • Weisi Guo,
  • Jinfei Xie,
  • Dijia Wu,
  • Rui Wang

DOI
https://doi.org/10.1109/ACCESS.2019.2921976
Journal volume & issue
Vol. 7
pp. 76712 – 76719

Abstract

Text classification is an essential task in many natural language processing (NLP) applications. In each sentence, typically only a few words play an important role in classification, while the remaining words have little effect on the result, so finding these keywords has an important impact on classification accuracy. In this paper, we propose a network model, recurrent convolutional neural networks with attention (RCNNA), which is modeled on the human conditional reflex for text classification. The model combines a bidirectional LSTM (BLSTM), an attention mechanism, and convolutional neural networks (CNNs) as the receptors, nerve center, and effectors of the reflex arc, respectively. The receptors obtain context information through the BLSTM, the nerve center extracts the important information of the sentence through the attention mechanism, and the effectors capture further key information through the CNN. Finally, the model outputs the classification result via a softmax function. We test our algorithm on four text classification datasets covering Chinese and English, comparing randomly initialized word vectors with pre-trained word vectors. The experiments show that RCNNA achieves the best performance compared with state-of-the-art baseline methods.
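As a rough illustration of the pipeline the abstract describes, the sketch below shows the "nerve center" step: an attention distribution computed over per-token hidden states (stand-ins for BLSTM outputs) and collapsed into a single sentence representation. All dimensions, the random stand-in tensors, and the learned query vector are hypothetical assumptions for illustration, not the authors' implementation; the CNN effector and softmax classifier stages are omitted.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy dimensions (hypothetical; the paper does not fix these here)
T, H = 6, 8  # sequence length, hidden size
rng = np.random.default_rng(0)

# Stand-in for BLSTM outputs (the "receptors"): one hidden vector per token
hidden = rng.standard_normal((T, H))

# Attention (the "nerve center"): score each token against a learned query
query = rng.standard_normal(H)      # stand-in for a trained parameter vector
scores = hidden @ query             # (T,) one score per token
weights = softmax(scores)           # attention distribution over tokens
context = weights @ hidden          # (H,) weighted sentence representation

# In the full model, a CNN "effector" stage and a softmax classifier follow
assert np.isclose(weights.sum(), 1.0)
assert context.shape == (H,)
```

The key property is that `weights` sums to one, so `context` is a convex combination of the token states, letting the few important words dominate the sentence vector.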

Keywords