IEEE Access (Jan 2022)

Shared Temporal Attention Transformer for Remaining Useful Lifetime Estimation

  • Gavneet Singh Chadha
  • Sayed Rafay Bin Shah
  • Andreas Schwung
  • Steven X. Ding

DOI: https://doi.org/10.1109/ACCESS.2022.3187702
Journal volume & issue: Vol. 10, pp. 74244–74258

Abstract

This paper proposes a novel deep learning architecture for estimating the remaining useful lifetime (RUL) of industrial components that relies solely on the recently developed transformer architecture. RUL estimation amounts to analysing degradation patterns in multivariate time series signals. Hence, we propose a novel shared temporal attention block that detects RUL-relevant patterns as they evolve over time. Furthermore, we develop a split-feature attention block that attends to features from the different sensor channels. The shared temporal attention layer in the encoder first attends to temporal degradation patterns in each individual sensor signal before a shared correlation is formed across the feature range. Based on these novel attention blocks, we develop two transformer architectures specifically designed for multivariate time series data. We apply the architectures to the well-known C-MAPSS benchmark dataset and provide extensive hyperparameter studies to analyse their impact on performance. In addition, we provide a thorough comparison with recently presented state-of-the-art approaches and show that the proposed transformer architectures outperform the existing methods by a considerable margin.
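For readers wanting a concrete picture of the two mechanisms the abstract describes, the following is a minimal PyTorch sketch, not the authors' implementation: a temporal self-attention whose weights are shared across all sensor channels, followed by a self-attention over the feature (sensor) axis. The module names, dimensions, pooling, and regression head are illustrative assumptions.

    # Minimal sketch of the two attention ideas from the abstract (assumed
    # PyTorch realisation, not the paper's exact architecture): temporal
    # attention with weights shared across sensor channels, then attention
    # over the feature (sensor) axis.
    import torch
    import torch.nn as nn


    class SharedTemporalAttention(nn.Module):
        """Self-attention over time, applied per channel with shared weights."""

        def __init__(self, d_model: int, n_heads: int = 4):
            super().__init__()
            self.embed = nn.Linear(1, d_model)  # lift each scalar reading
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, time, channels) -> (batch, time, channels, d_model)
            b, t, c = x.shape
            h = self.embed(x.unsqueeze(-1))
            # Fold channels into the batch: each channel is attended to
            # independently, but by the *same* (shared) attention weights.
            h = h.permute(0, 2, 1, 3).reshape(b * c, t, -1)
            h, _ = self.attn(h, h, h)
            return h.reshape(b, c, t, -1).permute(0, 2, 1, 3)


    class FeatureAttention(nn.Module):
        """Self-attention across sensor channels at each time step."""

        def __init__(self, d_model: int, n_heads: int = 4):
            super().__init__()
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

        def forward(self, h: torch.Tensor) -> torch.Tensor:
            # h: (batch, time, channels, d_model)
            b, t, c, d = h.shape
            h = h.reshape(b * t, c, d)
            h, _ = self.attn(h, h, h)
            return h.reshape(b, t, c, d)


    class RULRegressor(nn.Module):
        """Temporal then feature attention, pooled into a scalar RUL estimate."""

        def __init__(self, d_model: int = 32):
            super().__init__()
            self.temporal = SharedTemporalAttention(d_model)
            self.feature = FeatureAttention(d_model)
            self.head = nn.Linear(d_model, 1)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            h = self.feature(self.temporal(x))
            # Pool over time and channels before the regression head.
            return self.head(h.mean(dim=(1, 2))).squeeze(-1)


    # Shapes match C-MAPSS-style input: 30-step windows of 14 sensor channels.
    model = RULRegressor()
    rul = model(torch.randn(8, 30, 14))  # -> (8,) predicted RUL values
    print(rul.shape)

Folding the channel dimension into the batch is what makes the temporal attention "shared": every sensor signal passes through the same attention parameters, while the subsequent feature attention models the cross-channel correlation.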

Keywords