IEEE Access (Jan 2025)
Temporal Forecasting of Distributed Temperature Sensing in a Thermal Hydraulic System With Machine Learning and Statistical Models
Abstract
We benchmark the performance of a long short-term memory (LSTM) machine learning model and an autoregressive integrated moving average (ARIMA) statistical model for temporal forecasting of distributed temperature sensing (DTS). The data in this study consist of fluid temperature transients measured with two co-located Rayleigh-scattering fiber optic sensors (FOSs) in the forced-convection mixing zone of a thermal tee. We treat each gauge of a FOS as an independent temperature sensor. We first study prediction of DTS time series using Vanilla LSTM and ARIMA models trained on the prior history of the same FOS that is used for testing. The results yield a maximum absolute percentage error (MaxAPE) and a root mean squared percentage error (RMSPE) of 1.58% and 0.06% for ARIMA, and 3.14% and 0.44% for LSTM, respectively. Next, we investigate zero-shot forecasting (ZSF) with LSTM and ARIMA trained on the history of the co-located FOS only, which is advantageous when limited training data are available. The ZSF MaxAPE and RMSPE values for ARIMA are comparable to those of the Vanilla use case, while the error values for LSTM increase. We show that in ZSF, the performance of the LSTM network can be improved by training on the most correlated gauges between the two FOSs, identified by computing the Pearson correlation coefficient. The improved ZSF MaxAPE and RMSPE for LSTM are 4.4% and 0.33%, respectively. The performance of the ZSF LSTM can be further enhanced through transfer learning (TL), in which the LSTM is re-trained on a subset of the FOS that is the target of forecasting. We show that an LSTM pre-trained on the correlated dataset and re-trained on 30% of the testing target dataset achieves MaxAPE and RMSPE values of 2.32% and 0.28%, respectively.
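For concreteness, the error metrics quoted above (MaxAPE, RMSPE) and the Pearson correlation coefficient used to match gauges between the two FOSs can be sketched as below. This is a minimal illustration using standard textbook definitions of these quantities; the function names and the toy temperature traces are our own assumptions, not taken from the paper.

```python
import math

def maxape(actual, predicted):
    """Maximum absolute percentage error, in percent (standard definition)."""
    return max(abs((a - p) / a) for a, p in zip(actual, predicted)) * 100.0

def rmspe(actual, predicted):
    """Root mean squared percentage error, in percent (standard definition)."""
    n = len(actual)
    return math.sqrt(
        sum(((a - p) / a) ** 2 for a, p in zip(actual, predicted)) / n
    ) * 100.0

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical short temperature traces from two co-located gauges (deg C);
# in the paper, pearson() would rank gauge pairs to pick training data.
actual = [300.0, 305.0, 310.0, 308.0]
predicted = [301.0, 304.0, 312.0, 307.0]
print(maxape(actual, predicted))   # worst-case pointwise error, %
print(rmspe(actual, predicted))    # aggregate error, %
print(pearson(actual, predicted))  # similarity of the two traces
```

In the zero-shot setting, a correlation threshold (or a top-k ranking) over such pairwise coefficients would select which gauges of the co-located FOS contribute training sequences.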
Keywords