IEEE Access (Jan 2024)

Adaptive Fine-Tuning in Degradation-Time-Series Forecasting via Generating Source Domain

  • Jinxin Pan
  • Bo Jin
  • Shenglong Wang
  • Xiaotong Yuwen
  • Xiaoxuan Jiao

DOI
https://doi.org/10.1109/ACCESS.2023.3341159
Journal volume & issue
Vol. 12
pp. 15093–15104

Abstract

Parameter-Efficient Fine-Tuning is widely used to transfer models between domains. However, for some high-reliability equipment, degradation proceeds slowly and fluctuates continually, which makes it difficult to extract features effectively. Moreover, assembling an integrated source domain for high-reliability equipment is difficult because the related datasets contain only small samples. To address the transfer problem for time-series prediction models, this research proposes an LSTM-fine-tune model in which the model parameters are explicitly trained and partly frozen, so that a small number of gradient steps on a small amount of training data from a new task yields good generalization performance on that task. The algorithm is benchmarked on sinusoidal functions whose data were randomly generated with different phases and amplitudes. The results show that the LSTM-fine-tune model can learn knowledge from different sinusoidal data and fit a new sinusoid quickly and with high accuracy. This paper also addresses two practical problems. The first is transferring oxygen concentrator data from the experimental condition to the actual service condition; the results show that accuracy is largely improved. The second extracts more general degradation knowledge from the Wiener process and transfers it to the degradation data of the oxygen concentrator; the results show that the model quickly achieves higher prediction accuracy with this knowledge. The code and data from the test bed are accessible at https://github.com/panjinxin123/Adaptive-finetuning-in-degradation-time-series-forecasting-via-generating-source-domain.
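
To illustrate the fine-tuning scheme the abstract describes (a pre-trained LSTM whose parameters are partly frozen and then adapted with a few gradient steps on a small sample of a new sinusoidal task), the following is a minimal PyTorch sketch. It is not the authors' code; the module names, the choice of freezing the recurrent layers while updating only the output head, and all hyperparameters are illustrative assumptions.

    # Minimal sketch (not the paper's implementation): fine-tune an LSTM forecaster
    # on a new sinusoidal task while freezing its recurrent weights.
    import torch
    import torch.nn as nn

    class LSTMForecaster(nn.Module):
        def __init__(self, hidden_size=32):
            super().__init__()
            self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
            self.head = nn.Linear(hidden_size, 1)

        def forward(self, x):                   # x: (batch, seq_len, 1)
            out, _ = self.lstm(x)
            return self.head(out[:, -1, :])     # predict the next value

    def make_sine_task(amplitude, phase, seq_len=20, n=64):
        """Generate (input window, next value) pairs from one sinusoid."""
        t = torch.rand(n, 1) * 2 * torch.pi
        steps = torch.arange(seq_len + 1) * 0.1
        series = amplitude * torch.sin(t + phase + steps)   # (n, seq_len + 1)
        return series[:, :seq_len].unsqueeze(-1), series[:, -1:]

    model = LSTMForecaster()
    # ... assume the LSTM was pre-trained on many random sinusoids
    # (the generated source domain), then adapted as follows:

    # Freeze the recurrent feature extractor; fine-tune only the linear head.
    for p in model.lstm.parameters():
        p.requires_grad = False
    opt = torch.optim.Adam(model.head.parameters(), lr=1e-2)

    x, y = make_sine_task(amplitude=2.0, phase=0.5)   # small target-task sample
    for _ in range(10):                               # a few gradient steps
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x), y)
        loss.backward()
        opt.step()

The same pattern carries over to the degradation setting described above: pre-train on source trajectories (e.g., generated from a Wiener process), freeze the shared layers, and adapt the remaining parameters on the scarce target-condition data.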

Keywords