IEEE Access (Jan 2023)

Meta-Learning Based Tasks Similarity Representation for Cross Domain Lifelong Learning

  • Mingge Shen,
  • Dehu Chen,
  • Teng Ren

DOI
https://doi.org/10.1109/ACCESS.2023.3264769
Journal volume & issue
Vol. 11
pp. 36692 – 36701

Abstract


Deep neural networks outperform humans on many specific single tasks, but they struggle to handle a sequence of new tasks drawn from different domains. Deep learning-based models typically need to retain the parameters of previously learned tasks to perform well on new ones, and they forfeit the ability to generalize from previous data, which is inconsistent with human learning. We propose a novel lifelong learning framework that guides the model to learn new knowledge without forgetting old knowledge by learning a similarity representation based on meta-learning. Specifically, we first employ a cross-domain triplets network (CDTN) that minimizes the maximum mean discrepancy (MMD) between the current task and the knowledge base to learn a domain-invariant similarity representation among tasks from different domains, and we add a self-attention module to enhance the extraction of similarity features. Second, we propose a soft attention network (SAN) that assigns different weights according to the learned similarity representation of tasks. In addition, a low-level feature enhancement module (LLEM) based on self-attention mechanisms is developed to capture domain-invariant similarity information. Experimental results show that our method effectively reduces catastrophic forgetting compared with state-of-the-art methods when learning many tasks. Moreover, the proposed method hardly forgets old knowledge while continuously improving performance on old tasks, which is more in line with the way humans learn.
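As a rough illustration of the MMD objective the abstract attributes to the CDTN, the sketch below computes a Gaussian-kernel MMD between a batch of current-task embeddings and a batch of knowledge-base embeddings. This is not the authors' code; the tensor shapes, bandwidth values, and function names are illustrative assumptions, and in the paper's setting this term would be combined with the triplet loss during training.

```python
# Minimal sketch (assumed, not from the paper) of a multi-bandwidth Gaussian-kernel
# MMD loss between current-task features and knowledge-base features.
import torch


def gaussian_kernel(x, y, bandwidths=(1.0, 2.0, 4.0)):
    """Sum of RBF kernels over all row pairs of x and y (shapes: (n, d) and (m, d))."""
    dist = torch.cdist(x, y, p=2).pow(2)                      # pairwise squared distances
    return sum(torch.exp(-dist / (2.0 * b ** 2)) for b in bandwidths)


def mmd_loss(source_feats, target_feats):
    """Estimate of squared MMD between two feature batches."""
    k_ss = gaussian_kernel(source_feats, source_feats).mean()
    k_tt = gaussian_kernel(target_feats, target_feats).mean()
    k_st = gaussian_kernel(source_feats, target_feats).mean()
    return k_ss + k_tt - 2.0 * k_st


# Hypothetical usage: embeddings of the current task vs. embeddings drawn from
# the knowledge base; minimizing this term encourages domain-invariant features.
cur = torch.randn(32, 128)
base = torch.randn(32, 128)
loss = mmd_loss(cur, base)
```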

Keywords