IEEE Access (Jan 2022)

Single-Head Lifelong Learning Based on Distilling Knowledge

  • Yen-Hsiang Wang,
  • Chih-Yang Lin,
  • Tipajin Thaipisutikul,
  • Timothy K. Shih

DOI
https://doi.org/10.1109/ACCESS.2022.3155451
Journal volume & issue
Vol. 10
pp. 35469 – 35478

Abstract

Within the machine learning field, the main purpose of lifelong learning, also known as continuous learning, is to enable neural networks to learn continuously, as humans do. Lifelong learning accumulates the knowledge learned from previous tasks and transfers it to support the neural network in future tasks. This technique not only avoids catastrophic forgetting of previous tasks when training on new tasks, but also makes the model more robust to temporal evolution. Motivated by recent developments in lifelong learning, this paper presents a novel feature-based knowledge distillation method that differs from existing knowledge distillation methods in lifelong learning. Specifically, our proposed method utilizes the features from intermediate layers and compresses them in a unique way that involves global average pooling and fully connected layers. We then use the output of this branch network to deliver information from previous tasks to the model in future tasks. Extensive experiments show that our proposed model consistently outperforms the state-of-the-art baselines on the accuracy metric, with at least a two percent improvement under different experimental settings.
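
The abstract only sketches the branch network at a high level. As a rough illustration, below is a minimal PyTorch-style sketch of such a feature-based distillation branch, assuming global average pooling followed by a fully connected layer over an intermediate feature map, with an MSE matching loss against a frozen copy of the previous-task model; the class names, embedding dimension, and choice of loss are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DistillBranch(nn.Module):
    """Branch network (assumed structure): compress an intermediate
    feature map with global average pooling, then a fully connected layer."""
    def __init__(self, in_channels: int, embed_dim: int = 128):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)   # global average pooling
        self.fc = nn.Linear(in_channels, embed_dim)

    def forward(self, feat: torch.Tensor) -> torch.Tensor:
        # feat: (batch, channels, H, W) taken from an intermediate layer
        x = self.pool(feat).flatten(1)        # -> (batch, channels)
        return self.fc(x)                     # -> (batch, embed_dim)

def feature_distillation_loss(student_feat: torch.Tensor,
                              teacher_feat: torch.Tensor,
                              branch_new: DistillBranch,
                              branch_old: DistillBranch) -> torch.Tensor:
    """Hypothetical distillation objective: match the new model's branch
    output to the frozen branch of the old (previous-task) model, so that
    knowledge from earlier tasks is preserved while training the new task."""
    with torch.no_grad():
        target = branch_old(teacher_feat)     # previous-task knowledge, frozen
    pred = branch_new(student_feat)
    return F.mse_loss(pred, target)
```

In this sketch the distillation loss would be added to the classification loss of the current task; the actual weighting and layer selection used by the authors are given in the paper itself.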

Keywords