IEEE Access (Jan 2021)

Lifelong Language Learning With the Most Forgotten Knowledge

  • Heejeong Choi,
  • Pilsung Kang

DOI
https://doi.org/10.1109/ACCESS.2021.3071787
Journal volume & issue
Vol. 9
pp. 57941–57948

Abstract

Lifelong language learning enables a language model to accumulate knowledge by training on a stream of text data. Recent research on lifelong language learning relies on samples of previous tasks drawn from an episodic memory or a generative model. LAMOL, a representative generative-model-based lifelong language learning method, preserves previous information with generated pseudo-old samples, which are suboptimal. In this paper, we propose an improved version of LAMOL, MFK-LAMOL, which constructs the generative replay with a more effective method. When a new task arrives, MFK-LAMOL replays a sufficient amount of previous data and retrieves the important examples to train alongside the new task. Specifically, it selects the examples carrying the most forgotten knowledge from previous tasks, measured by the extent to which the knowledge they contain has been forgotten after learning new information. We show that the proposed method outperforms LAMOL on a stream of three different natural language processing tasks.
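To illustrate the selection idea described above, the sketch below ranks replay candidates by a simple forgetting score, assumed here to be the increase in an example's loss after the model is updated on the new task. This is only a minimal illustration of the abstract's description, not the paper's actual implementation; the names select_most_forgotten, loss_before, loss_after, and k are hypothetical.

    # Minimal sketch: pick the k examples whose loss increased the most
    # after training on a new task (i.e., the "most forgotten" ones).
    from typing import Callable, List, Sequence, Tuple


    def select_most_forgotten(
        examples: Sequence[str],
        loss_before: Callable[[str], float],  # loss under the model before the new-task update
        loss_after: Callable[[str], float],   # loss under the model after the new-task update
        k: int,
    ) -> List[Tuple[str, float]]:
        """Return the k examples with the largest loss increase (forgetting score)."""
        scored = []
        for ex in examples:
            # Forgetting score: how much worse the updated model handles
            # this previously learned example than the pre-update model did.
            score = loss_after(ex) - loss_before(ex)
            scored.append((ex, score))
        # Keep the most forgotten examples for replay alongside the new task.
        scored.sort(key=lambda pair: pair[1], reverse=True)
        return scored[:k]


    if __name__ == "__main__":
        # Toy usage with dummy losses standing in for a language model.
        pseudo_old_samples = ["sample A", "sample B", "sample C"]
        before = {"sample A": 1.0, "sample B": 0.8, "sample C": 1.2}
        after = {"sample A": 1.1, "sample B": 2.0, "sample C": 1.3}
        top = select_most_forgotten(
            pseudo_old_samples,
            loss_before=lambda s: before[s],
            loss_after=lambda s: after[s],
            k=2,
        )
        print(top)  # "sample B" ranks first: its loss increased the most.

In this toy setup, the examples would be the pseudo-old samples produced by the generative replay model, and the losses would come from evaluating the language model before and after the new-task update.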

Keywords