Mathematics (Jan 2023)

Time-Varying Sequence Model

  • Sneha Jadhav,
  • Jianxiang Zhao,
  • Yepeng Fan,
  • Jingjing Li,
  • Hao Lin,
  • Chenggang Yan,
  • Minghan Chen

DOI
https://doi.org/10.3390/math11020336
Journal volume & issue
Vol. 11, no. 2
p. 336

Abstract

Traditional machine learning sequence models, such as RNNs and LSTMs, solve sequential-data problems by maintaining internal memory states. However, their neuron units and weights are shared across all time steps to reduce computational cost, which limits their ability to learn time-varying relationships between model inputs and outputs. To address this, this paper proposes two methods for characterizing the dynamic relationships in real-world sequential data: the internal time-varying sequence model (ITV model) and the external time-varying sequence model (ETV model). Both methods use an automated basis-expansion module to adapt internal or external parameters at each time step without incurring high computational complexity. Extensive experiments on synthetic and real-world data demonstrated prediction and classification results superior to those of conventional sequence models. The proposed ETV model is particularly effective at handling long sequences.
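
To make the basis-expansion idea concrete, the following is a minimal hypothetical sketch (not the authors' implementation): the recurrent weight matrix W(t) is expressed as a linear combination of K fixed basis functions of the time index, so only K coefficient matrices are learned instead of a separate weight matrix per step. The basis choice, dimensions, and function names here are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of time-varying recurrent weights via basis expansion.
# W(t) = sum_k phi_k(t) * C_k, so the number of learnable parameters scales
# with the basis size K rather than the sequence length T.

rng = np.random.default_rng(0)

T, d_in, d_h, K = 50, 3, 8, 4   # sequence length, input/hidden dims, basis size

def basis(t, T, K):
    """Fourier-style basis evaluated at the normalized time t/T (assumed choice)."""
    s = t / T
    return np.array([np.cos(np.pi * k * s) for k in range(K)])  # shape (K,)

C = rng.normal(scale=0.1, size=(K, d_h, d_h))  # coefficient matrices (learnable)
U = rng.normal(scale=0.1, size=(d_h, d_in))    # input weights (shared over time)

x = rng.normal(size=(T, d_in))                 # toy input sequence
h = np.zeros(d_h)
for t in range(T):
    W_t = np.tensordot(basis(t, T, K), C, axes=1)  # combine bases: (d_h, d_h)
    h = np.tanh(W_t @ h + U @ x[t])                # RNN step with step-specific W(t)

print(h.shape)  # (8,) final hidden state
```

In a trainable version, the coefficient matrices C_k would be optimized by gradient descent along with the shared input weights; the fixed basis keeps the per-step weight generation cheap.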

Keywords