Scientific Reports (Feb 2024)

Advanced hybrid LSTM-transformer architecture for real-time multi-task prediction in engineering systems

  • Kangjie Cao,
  • Ting Zhang,
  • Jueqiao Huang

DOI
https://doi.org/10.1038/s41598-024-55483-x
Journal volume & issue
Vol. 14, no. 1
pp. 1–24

Abstract

In the field of engineering systems—particularly in underground drilling and green stormwater management—real-time predictions are vital for enhancing operational performance, ensuring safety, and increasing efficiency. Addressing this niche, our study introduces a novel LSTM-transformer hybrid architecture, uniquely specialized for multi-task real-time predictions. Building on advancements in attention mechanisms and sequence modeling, our model integrates the core strengths of LSTM and Transformer architectures, offering a superior alternative to traditional predictive models. Further enriched with online learning, our architecture dynamically adapts to variable operational conditions and continuously incorporates new field data. Utilizing knowledge distillation techniques, we efficiently transfer insights from larger, pretrained networks, thereby achieving high predictive accuracy without sacrificing computational resources. Rigorous experiments on sector-specific engineering datasets validate the robustness and effectiveness of our approach. Notably, our model exhibits clear advantages over existing methods in terms of predictive accuracy, real-time adaptability, and computational efficiency. This work contributes a pioneering predictive framework for targeted engineering applications, offering actionable insights.
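To make the described approach concrete, the sketch below shows one plausible way to combine an LSTM stage with a Transformer encoder and to distill from a larger pretrained teacher during online updates. The abstract does not specify implementation details, so the class and function names, layer sizes, multi-task heads, and the distillation loss weighting here are illustrative assumptions, not the authors' exact architecture.

```python
# Hypothetical PyTorch sketch of an LSTM-Transformer hybrid for multi-task
# sequence prediction with knowledge distillation; all hyperparameters and
# the loss formulation are illustrative, not taken from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LSTMTransformerHybrid(nn.Module):
    def __init__(self, input_dim, hidden_dim=128, n_heads=4, n_layers=2, n_tasks=2):
        super().__init__()
        # LSTM stage captures local temporal dynamics of the sensor stream.
        self.lstm = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        # Transformer encoder stage models longer-range dependencies via self-attention.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=n_heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        # One regression head per prediction task (multi-task output).
        self.heads = nn.ModuleList([nn.Linear(hidden_dim, 1) for _ in range(n_tasks)])

    def forward(self, x):  # x: (batch, seq_len, input_dim)
        h, _ = self.lstm(x)
        h = self.encoder(h)
        last = h[:, -1]  # final time step used for one-step-ahead prediction
        return [head(last) for head in self.heads]


def online_distillation_step(student, teacher, x, targets, optimizer, alpha=0.5):
    """One online update that mixes the multi-task loss with soft-target
    distillation from a larger pretrained teacher (illustrative weighting)."""
    student.train()
    with torch.no_grad():
        teacher_preds = teacher(x)
    preds = student(x)
    task_loss = sum(F.mse_loss(p, t) for p, t in zip(preds, targets))
    distill_loss = sum(F.mse_loss(p, tp) for p, tp in zip(preds, teacher_preds))
    loss = alpha * task_loss + (1 - alpha) * distill_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this reading, the LSTM handles short-horizon temporal structure while the attention layers model longer-range dependencies, and the teacher's predictions regularize the compact student as new field data arrives; the actual design choices in the paper may differ.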

Keywords