CAAI Transactions on Intelligence Technology (Sep 2023)

A structural developmental neural network with information saturation for continual unsupervised learning

  • Zhiyong Ding,
  • Haibin Xie,
  • Peng Li,
  • Xin Xu

DOI
https://doi.org/10.1049/cit2.12169
Journal volume & issue
Vol. 8, no. 3
pp. 780–795

Abstract

In this paper, we propose a structural developmental neural network to address the plasticity-stability dilemma, computational inefficiency, and the lack of prior knowledge in continual unsupervised learning. The model uses competitive learning rules and dynamic neurons with information saturation to achieve parameter adjustment and adaptive structure development. A dynamic neuron updates its information saturation after winning the competition and uses this parameter to modulate both the adjustment of its parameters and the timing of its division. By dividing to generate new neurons, the network not only remains sensitive to novel features but can also subdivide classes that are learnt repeatedly. Dynamic neurons with information saturation, together with the division mechanism, simulate the long short-term memory of the human brain, which enables the network to continually learn new samples while maintaining previous learning results. The parent-child relationships between neurons that arise from neuronal division allow the network to simulate the human cognitive process of gradually refining the perception of objects. By setting the clustering-layer parameter, users can choose the desired degree of class subdivision. Experimental results on artificial and real-world datasets demonstrate that the proposed model is feasible for unsupervised learning tasks in instance-increment and class-increment scenarios and outperforms prior structural developmental neural networks.
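The abstract outlines the core mechanism: neurons compete for each input, the winner's information saturation modulates how strongly its parameters are adjusted, and a sufficiently saturated neuron divides to produce a child neuron. The following is a minimal sketch of that idea, assuming Euclidean winner-take-all competition, a scalar saturation counter, and a fixed division threshold; these choices, and names such as base_lr and division_threshold, are illustrative assumptions rather than the authors' exact formulation.

```python
# Illustrative sketch only: competitive learning with dynamic neurons whose
# information saturation modulates the learning rate and triggers division.
# The update rule, saturation increment, and reset behaviour are assumptions.
import numpy as np


class DynamicNeuron:
    def __init__(self, weight, parent=None):
        self.weight = np.asarray(weight, dtype=float)
        self.saturation = 0.0   # information saturation (assumed scalar counter)
        self.parent = parent    # parent-child link created by division


class SDNNSketch:
    def __init__(self, dim, base_lr=0.5, division_threshold=5.0):
        self.neurons = [DynamicNeuron(np.random.randn(dim))]
        self.base_lr = base_lr                      # assumed hyperparameter
        self.division_threshold = division_threshold  # assumed hyperparameter

    def learn(self, x):
        x = np.asarray(x, dtype=float)
        # Competition: the neuron closest to the sample wins.
        winner = min(self.neurons, key=lambda n: np.linalg.norm(n.weight - x))
        # Saturation damps the learning rate, so well-learnt prototypes
        # change little (stability) while fresh neurons adapt quickly.
        lr = self.base_lr / (1.0 + winner.saturation)
        winner.weight += lr * (x - winner.weight)
        winner.saturation += 1.0
        # Division: a saturated neuron spawns a child near the new sample,
        # keeping the network sensitive to novel features (plasticity).
        if winner.saturation >= self.division_threshold:
            self.neurons.append(DynamicNeuron(x.copy(), parent=winner))
            winner.saturation = 0.0  # assumed reset after division
```

Feeding samples one at a time through learn grows the neuron set incrementally, loosely mirroring the instance-increment setting mentioned in the abstract; the model's actual update and division rules are defined in the paper itself.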

Keywords