Applied Sciences (Nov 2024)

Few Adjustable Parameters Prediction Model Based on Lightweight Prefix-Tuning: Learning Session Dropout Prediction Model Based on Parameter-Efficient Prefix-Tuning

  • Yuantong Lu,
  • Zhanquan Wang

DOI
https://doi.org/10.3390/app142310772
Journal volume & issue
Vol. 14, no. 23
p. 10772

Abstract

In response to the challenge of low predictive accuracy in scenarios with limited data, we propose a few-adjustable-parameters prediction model based on lightweight prefix-tuning (FAP-Prefix). Prefix-tuning is an efficient fine-tuning method that adjusts only prefix vectors while keeping the model's original parameters frozen. In each transformer layer, the prefix vectors are concatenated with the internal key and value vectors of the attention mechanism. By training on the combined sequence of prefix and original input with masked learning, the transformer model learns the features of individual learning behaviors and also discovers hidden connections among consecutive learning behaviors. During fine-tuning, all parameters of the pre-trained model remain frozen, and downstream task learning is accomplished by adjusting only the prefix parameters. The trainable prefix vectors influence the subsequent vector representations, from which the session dropout predictions are generated. Experiments show that FAP-Prefix significantly outperforms traditional methods in data-limited settings, with AUC improvements of +4.58%, +3.53%, and +8.49% under 30%, 10%, and 1% data conditions, respectively. It also surpasses state-of-the-art models in overall prediction performance (AUC +5.42%, ACC +5.3%, F1 score +5.68%).
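
The prefix-tuning mechanism the abstract describes, trainable prefix vectors concatenated with each attention layer's keys and values while the pre-trained weights stay frozen, can be sketched as follows. This is a minimal illustration assuming a PyTorch-style implementation; the names (PrefixAttention, prefix_len, d_model) are hypothetical and not taken from the paper.

```python
# Minimal sketch of a prefix-tuned attention layer (assumed PyTorch >= 2.0).
# Hypothetical names; not the paper's actual implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PrefixAttention(nn.Module):
    def __init__(self, d_model: int, n_heads: int, prefix_len: int):
        super().__init__()
        self.n_heads = n_heads
        self.d_head = d_model // n_heads
        # Projections standing in for the pre-trained model's weights; frozen,
        # as the paper's fine-tuning stage keeps original parameters fixed.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        for proj in (self.q_proj, self.k_proj, self.v_proj):
            for w in proj.parameters():
                w.requires_grad = False
        # Trainable prefix key/value vectors, one set per attention head.
        self.prefix_k = nn.Parameter(torch.randn(n_heads, prefix_len, self.d_head) * 0.02)
        self.prefix_v = nn.Parameter(torch.randn(n_heads, prefix_len, self.d_head) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, t, _ = x.shape
        split = lambda z: z.view(b, t, self.n_heads, self.d_head).transpose(1, 2)
        q = split(self.q_proj(x))
        k = split(self.k_proj(x))
        v = split(self.v_proj(x))
        # Concatenate the trainable prefixes with the frozen keys/values, so
        # every query also attends to the prefix positions.
        pk = self.prefix_k.unsqueeze(0).expand(b, -1, -1, -1)
        pv = self.prefix_v.unsqueeze(0).expand(b, -1, -1, -1)
        k = torch.cat([pk, k], dim=2)
        v = torch.cat([pv, v], dim=2)
        out = F.scaled_dot_product_attention(q, k, v)
        return out.transpose(1, 2).reshape(b, t, -1)
```

In this sketch only prefix_k and prefix_v carry gradients, so an optimizer built over the trainable parameters updates just 2 × prefix_len × d_model values per layer, which is how prefix-tuning keeps the adjustable-parameter count small relative to full fine-tuning.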

Keywords