Gong-kuang zidonghua (Mar 2024)

Transformer-based time series prediction method for mine internal caused fire

  • WANG Shubin,
  • WANG Xu,
  • YAN Shiping,
  • WANG Ke

DOI
https://doi.org/10.13272/j.issn.1671-251x.2023100084
Journal volume & issue
Vol. 50, no. 3
pp. 65 – 70, 91

Abstract


Although traditional machine learning based methods for predicting mine internal caused fire have a certain predictive capability, they cannot effectively capture the global dependencies among complex multivariate data, resulting in low prediction accuracy. To solve this problem, a Transformer-based time series prediction method for mine internal caused fire is proposed. Firstly, the Hampel filter is used to detect outliers in the data and the Lagrange interpolation method is used to fill in missing values. Secondly, the self-attention mechanism of the Transformer is used to extract features from the time series data and predict their trends. Finally, by adjusting the size and step of the sliding window, the model is trained on different time scales with different time steps and prediction lengths. Combined with the gas analysis method, the indicator gases generated by mine fires (CO, O2, N2, CO2, C2H2, C2H4, C2H6) are used as model inputs, with CO as the target variable of the model output and O2, N2, CO2, C2H2, C2H4 and C2H6 as covariates of the model input. The tube bundle monitoring data of the S1206 return air corner fire warning at Ningtiaota Coal Mine of Shanmei Coal Group were selected for experimental verification, and the results show the following. ① Univariate and multivariate prediction of CO show that multivariate prediction achieves higher accuracy than univariate prediction, indicating that multivariate prediction improves the model by capturing the correlations between sequences. ② When the time step is fixed, the prediction accuracy of the Transformer-based mine internal caused fire prediction model decreases as the prediction length increases; when the prediction length is fixed, the accuracy improves as the time step increases. ③ The prediction accuracy of the Transformer algorithm is 7.1%-12.6% higher than that of the long short-term memory (LSTM) algorithm and 20.9%-24.9% higher than that of the recurrent neural network (RNN) algorithm.
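
The preprocessing step described in the abstract lends itself to a short illustration. The Python sketch below shows one way a Hampel filter and Lagrange interpolation could be combined to clean a gas concentration series; the window size, threshold, neighbor count, and column names are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of the preprocessing step: a Hampel filter flags outliers in a
# gas series, and local Lagrange interpolation fills the resulting gaps.
# All parameters and column names below are assumptions for illustration.
import numpy as np
import pandas as pd
from scipy.interpolate import lagrange

def hampel_filter(series: pd.Series, window: int = 11, n_sigmas: float = 3.0) -> pd.Series:
    """Replace points farther than n_sigmas * MAD from the rolling median with NaN."""
    k = 1.4826  # scale factor relating MAD to standard deviation for Gaussian data
    median = series.rolling(window, center=True, min_periods=1).median()
    mad = (series - median).abs().rolling(window, center=True, min_periods=1).median()
    outlier = (series - median).abs() > n_sigmas * k * mad
    return series.mask(outlier)

def fill_with_lagrange(series: pd.Series, n_neighbors: int = 4) -> pd.Series:
    """Fill NaN values with a Lagrange polynomial fitted to nearby valid points."""
    filled = series.copy()
    valid_idx = np.flatnonzero(series.notna().to_numpy())
    for i in np.flatnonzero(series.isna().to_numpy()):
        # take the valid samples closest to the gap
        nearest = valid_idx[np.argsort(np.abs(valid_idx - i))[:n_neighbors]]
        poly = lagrange(nearest, series.to_numpy()[nearest])
        filled.iloc[i] = poly(i)
    return filled

# Example usage (column name "CO" is assumed, not taken from the paper):
# df = pd.read_csv("tube_bundle.csv")
# df["CO"] = fill_with_lagrange(hampel_filter(df["CO"]))
```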
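Likewise, the sliding-window setup and a Transformer predictor for CO with the other gases as covariates can be sketched as follows. The window size, step, prediction length, model dimensions, and the assumption that CO occupies column 0 are illustrative choices; positional encoding is omitted for brevity, so this is not the authors' implementation.

```python
# Minimal sketch: sliding windows over 7 gas channels feed a small
# Transformer encoder that predicts the next pred_len CO values.
import torch
import torch.nn as nn

def make_windows(data: torch.Tensor, input_len: int, pred_len: int, step: int):
    """data: (T, 7) with CO assumed in column 0. Returns (inputs, targets)."""
    xs, ys = [], []
    for start in range(0, data.shape[0] - input_len - pred_len + 1, step):
        xs.append(data[start:start + input_len])                            # all gases
        ys.append(data[start + input_len:start + input_len + pred_len, 0])  # future CO
    return torch.stack(xs), torch.stack(ys)

class COTransformer(nn.Module):
    def __init__(self, n_features=7, d_model=64, nhead=4, num_layers=2, pred_len=6):
        super().__init__()
        self.embed = nn.Linear(n_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, dim_feedforward=128,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.head = nn.Linear(d_model, pred_len)

    def forward(self, x):                 # x: (batch, input_len, n_features)
        h = self.encoder(self.embed(x))   # self-attention over the time axis
        return self.head(h[:, -1])        # pred_len future CO values

# Toy usage with random data standing in for the 7 gas channels.
series = torch.randn(500, 7)
x, y = make_windows(series, input_len=48, pred_len=6, step=1)
model = COTransformer()
loss = nn.MSELoss()(model(x[:32]), y[:32])
loss.backward()
```

Varying `input_len`, `step`, and `pred_len` in `make_windows` corresponds to training the model on different time scales, as the abstract describes.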

Keywords