Tongxin xuebao (Apr 2017)
New method of text representation model based on neural network
Abstract
A text representation model based on neural networks was proposed to extract word embeddings from text features. Firstly, a dual word-embedding list was created, in which each word embedding is indexed by its dictionary index and the corresponding part-of-speech index. Then, feature vectors were obtained from these word embeddings by a Bi-LSTM recurrent neural network. Finally, sentence vectors were produced by a mean-pooling layer, and text categorization was performed by a softmax layer. The training behavior and feature-extraction performance of the combined Bi-LSTM and dual word-embedding neural network model were verified. The experimental results show that this model not only performs well in producing high-quality text feature vectors and representing sequences, but also significantly outperforms three other neural network models, namely LSTM, LSTM with a context window, and Bi-LSTM.
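The abstract describes a pipeline of dual embeddings (word index plus part-of-speech index), a Bi-LSTM encoder, mean-pooling, and a softmax classifier. The following is a minimal sketch of that pipeline, assuming a PyTorch implementation; the class name, layer sizes, vocabulary sizes, and number of classes are illustrative assumptions, not values taken from the paper.

    import torch
    import torch.nn as nn

    class DualEmbeddingBiLSTM(nn.Module):
        # Sketch of the described pipeline: two embedding tables (word index
        # and part-of-speech index), a Bi-LSTM over their concatenation,
        # mean-pooling over time, and a linear layer feeding a softmax.
        # All sizes are assumed for illustration only.
        def __init__(self, vocab_size=10000, pos_size=50,
                     word_dim=100, pos_dim=20, hidden_dim=128, num_classes=5):
            super().__init__()
            self.word_emb = nn.Embedding(vocab_size, word_dim)  # dictionary-index embedding
            self.pos_emb = nn.Embedding(pos_size, pos_dim)      # part-of-speech-index embedding
            self.bilstm = nn.LSTM(word_dim + pos_dim, hidden_dim,
                                  batch_first=True, bidirectional=True)
            self.classifier = nn.Linear(2 * hidden_dim, num_classes)

        def forward(self, word_ids, pos_ids):
            # word_ids, pos_ids: (batch, seq_len) integer tensors
            x = torch.cat([self.word_emb(word_ids), self.pos_emb(pos_ids)], dim=-1)
            h, _ = self.bilstm(x)            # (batch, seq_len, 2 * hidden_dim)
            sent_vec = h.mean(dim=1)         # mean-pooling over the sequence
            return self.classifier(sent_vec) # logits; softmax applied by the loss

    # Dummy usage: a batch of 2 sentences with 12 tokens each.
    model = DualEmbeddingBiLSTM()
    words = torch.randint(0, 10000, (2, 12))
    tags = torch.randint(0, 50, (2, 12))
    logits = model(words, tags)              # shape (2, num_classes)

In this sketch the softmax itself is folded into a cross-entropy loss at training time, a common convention; the paper's actual training setup and hyperparameters are not reproduced here.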