Industrial Artificial Intelligence (Mar 2024)

Universal approximation property of stochastic configuration networks for time series

  • Jin-Xi Zhang,
  • Hangyi Zhao,
  • Xuefeng Zhang

DOI
https://doi.org/10.1007/s44244-024-00017-7
Journal volume & issue
Vol. 2, no. 1
pp. 1 – 17

Abstract

To process sequential data such as time series, and to address the challenge of manually tuning the architecture of traditional recurrent neural networks (RNNs), this paper introduces a novel approach: the Recurrent Stochastic Configuration Network (RSCN). The network is constructed using the random incremental algorithm of stochastic configuration networks. Building on the basic structure of recurrent neural networks, the learning model starts from a small-scale recurrent network with a single hidden layer containing a single hidden node. The hidden-layer node parameters are then incrementally augmented through a random configuration process, with the corresponding weights assigned constructively, and this iterative expansion continues until the network satisfies predefined termination criteria. Notably, the algorithm adapts well to time series data and outperforms traditional recurrent neural networks of similar architecture. The experimental results presented in this paper demonstrate the efficacy of the proposed RSCN for sequence data processing and showcase its advantages over conventional recurrent neural networks in the experiments performed.
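The incremental construction described in the abstract — start with one hidden node, add randomly configured nodes that pass a supervisory condition, and refit the output weights after each addition until a termination criterion is met — can be sketched as follows. This is a simplified feedforward stochastic configuration network, not the paper's recurrent variant; the function name `scn_fit`, the tanh activation, the candidate count, and the specific form of the supervisory score are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def scn_fit(X, y, max_nodes=50, tol=1e-2, candidates=30, r=0.99, rng=None):
    """Incrementally build a single-hidden-layer network in the spirit of
    stochastic configuration networks: add one randomly configured hidden
    node at a time, keep a candidate only if it explains enough of the
    current residual, then refit the output weights by least squares.
    Simplified feedforward sketch, not the paper's recurrent RSCN."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    H = np.empty((n, 0))            # hidden outputs, one column per node
    beta = np.zeros(0)              # output weights (empty at the start)
    e = y.copy()                    # current residual
    for _ in range(max_nodes):
        best, best_score = None, -np.inf
        for _ in range(candidates):
            w = rng.uniform(-1, 1, d)       # random input weights
            b = rng.uniform(-1, 1)          # random bias
            h = np.tanh(X @ w + b)
            # supervisory-style condition: the new node must reduce the
            # residual by more than a (1 - r) fraction of its energy
            score = (e @ h) ** 2 / (h @ h) - (1 - r) * (e @ e)
            if score > best_score:
                best, best_score = h, score
        if best_score <= 0:         # no admissible candidate: stop growing
            break
        H = np.column_stack([H, best])
        beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # refit output weights
        e = y - H @ beta
        if np.linalg.norm(e) < tol:  # termination criterion met
            break
    return H, beta

# Toy usage: approximate a 1-D function with incrementally added nodes.
X = np.linspace(-1, 1, 200).reshape(-1, 1)
y = np.sin(3 * X[:, 0])
H, beta = scn_fit(X, y, rng=0)
print(np.linalg.norm(y - H @ beta))  # residual norm after construction
```

The key difference from a fixed-architecture RNN is that no hidden-layer size is chosen in advance: the loop stops either when the error tolerance is reached or when no random candidate satisfies the admissibility condition.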

Keywords