IEEE Access (Jan 2017)

Long Short-Term Memory With Quadratic Connections in Recursive Neural Networks for Representing Compositional Semantics

  • Dong Wu,
  • Mingmin Chi

DOI
https://doi.org/10.1109/ACCESS.2016.2647384
Journal volume & issue
Vol. 5
pp. 16077–16083

Abstract


Long short-term memory (LSTM) has been widely used in applications such as natural language processing, speech recognition, and computer vision, built on either a recurrent neural network (RNN) or a recursive neural network (RvNN), a tree-structured RNN. In addition, the LSTM-RvNN has been used to represent compositional semantics through the connections of hidden vectors over child units. However, the linear connections in existing LSTM networks cannot capture the complex semantic representations of natural language texts. For example, complex structures in natural language texts often denote intricate relationships between words, such as negated sentiment or sentiment strength. In this paper, quadratic connections are proposed for the LSTM model in terms of RvNNs (abbreviated as qLSTM-RvNN) to address the problem of representing compositional semantics. The proposed qLSTM-RvNN model is evaluated on benchmark data sets involving semantic compositionality: sentiment analysis on the Stanford Sentiment Treebank and semantic relatedness on the Sentences Involving Compositional Knowledge (SICK) data set. Empirical results show that it outperforms state-of-the-art RNN, RvNN, and LSTM networks on both semantic compositionality tasks, increasing classification accuracy and sentence correlation while significantly decreasing computational complexity.
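
The abstract does not give the paper's exact gating equations, but the core idea it names can be sketched as follows: where a standard LSTM-RvNN composes two child hidden vectors only linearly, h = W[h_l; h_r] + b, a quadratic connection adds a second-order (bilinear) term h_l^T V h_r. The sketch below illustrates one such composition step under that assumption; all names (d, W, V, b, qlstm_node) are illustrative, not the authors' code, and a full cell would repeat this combination for each gate.

import numpy as np

d = 4                                         # hidden size (toy value)
rng = np.random.default_rng(0)

# Parameters for one gate/candidate computation.
W = rng.normal(scale=0.1, size=(d, 2 * d))    # linear weights over [h_l; h_r]
V = rng.normal(scale=0.1, size=(d, d, d))     # quadratic (bilinear) tensor
b = np.zeros(d)

def qlstm_node(h_l, h_r):
    """Compose two child hidden vectors with linear + quadratic connections."""
    linear = W @ np.concatenate([h_l, h_r])           # standard LSTM-RvNN term
    quadratic = np.einsum('i,kij,j->k', h_l, V, h_r)  # h_l^T V h_r per output dim
    return np.tanh(linear + quadratic + b)

h_left, h_right = rng.normal(size=d), rng.normal(size=d)
print(qlstm_node(h_left, h_right))            # parent representation, shape (d,)

The quadratic term lets the parent representation depend on multiplicative interactions between the children, which is what allows phenomena like negation to flip or scale a child's sentiment rather than merely shift it additively.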

Keywords