智能科学与技术学报 (Mar 2022)

Exploration of the continual learning ability that supports the application-ecosystem evolution of the large-scale pretrained Peng Cheng series of open source models

  • Yue YU,
  • Xin LIU,
  • Fangqing JIANG,
  • Han ZHANG,
  • Hui WANG,
  • Wei ZENG

Journal volume & issue: Vol. 4, pp. 97–108

Abstract


Large-scale pre-trained models have achieved great success in natural language processing by exploiting large-scale corpora and pre-training tasks. As large models continue to develop, their continual learning ability has become a new research focus. This paper introduces the continual learning technology of the Peng Cheng series of large models, practical explorations, and the challenges that remain: the Peng Cheng continual learning techniques based on task expansion, data increment, and knowledge reasoning; multi-task continual learning with Peng Cheng PANGU; the practical exploration of the continual learning ability of the Peng Cheng TONGYAN open source large model; and the vocabulary updates, semantic mapping, and knowledge conflicts that large models face during continual learning.
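The abstract's notion of continual learning through task expansion and data increment can be illustrated with a toy sketch. This is not the paper's method; it is a minimal experience-replay baseline (a standard mitigation for catastrophic forgetting), with all class and parameter names invented for illustration:

```python
import numpy as np

class ReplayContinualLearner:
    """Toy continual learner: a logistic-regression model extended to
    new tasks while replaying stored samples from earlier tasks, a
    common way to mitigate catastrophic forgetting. Hypothetical
    illustration only; not the Peng Cheng training procedure."""

    def __init__(self, dim, lr=0.1, buffer_per_task=50):
        self.w = np.zeros(dim)          # linear weights
        self.lr = lr                    # gradient-step size
        self.buffer_per_task = buffer_per_task
        self.buffer = []                # (X, y) samples kept from old tasks

    def _step(self, X, y):
        # One full-batch gradient step on the logistic loss.
        p = 1.0 / (1.0 + np.exp(-X @ self.w))
        self.w -= self.lr * X.T @ (p - y) / len(y)

    def learn_task(self, X, y, epochs=200):
        # Data increment: mix the new task's data with replayed samples.
        if self.buffer:
            Xb = np.vstack([X] + [b[0] for b in self.buffer])
            yb = np.concatenate([y] + [b[1] for b in self.buffer])
        else:
            Xb, yb = X, y
        for _ in range(epochs):
            self._step(Xb, yb)
        # Task expansion: retain a small sample of this task for replay.
        rng = np.random.default_rng(0)
        idx = rng.choice(len(X), min(self.buffer_per_task, len(X)),
                         replace=False)
        self.buffer.append((X[idx], y[idx]))

    def accuracy(self, X, y):
        return np.mean((X @ self.w > 0) == (y > 0.5))
```

Calling `learn_task` once per task keeps earlier tasks represented in every later update, so performance on old data degrades far less than with naive sequential fine-tuning.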

Keywords