Jisuanji kexue [Computer Science] (Jul 2022)

Advances in Chinese Pre-training Models

  • HOU Yu-tao, ABULIZI Abudukelimu, ABUDUKELIMU Halidanmu

DOI
https://doi.org/10.11896/jsjkx.211200018
Journal volume & issue
Vol. 49, no. 7
pp. 148–163

Abstract

In recent years, pre-training models have flourished in the field of natural language processing, aiming to model and represent the implicit knowledge of natural language. However, most mainstream pre-training models target English, and work in the Chinese domain started relatively late. Given the importance of Chinese to natural language processing, extensive research has been conducted in both academia and industry, and numerous Chinese pre-training models have been proposed. This paper presents a comprehensive review of research on Chinese pre-training models. It first introduces an overview of pre-training models and their development history, then reviews Transformer and BERT, the two classical models on which most Chinese pre-training models are built, then proposes a classification of Chinese pre-training models by model category, and summarizes the various evaluation benchmarks in the Chinese domain. Finally, future development trends of Chinese pre-training models are discussed. The survey aims to help researchers gain a more comprehensive understanding of the development of Chinese pre-training models and to provide ideas for the design of new models.

Keywords