IEEE Access (Jan 2023)

MTLFormer: Multi-Task Learning Guided Transformer Network for Business Process Prediction

  • Jiaojiao Wang,
  • Jiawei Huang,
  • Xiaoyu Ma,
  • Zhongjin Li,
  • Yaqi Wang,
  • Dingguo Yu

DOI
https://doi.org/10.1109/ACCESS.2023.3298305
Journal volume & issue
Vol. 11
pp. 76722 – 76738

Abstract


Predictive business process monitoring focuses on forecasting the performance of business process execution, i.e., predicting the next activity, the execution time of the next activity, and the remaining time for an ongoing process instance, based on knowledge gained from historical event logs. Although these three tasks are closely related, recent research has typically trained a separate prediction model for each task, resulting in high cost and time complexity. Additionally, existing techniques are limited in their ability to capture long-distance dependencies within process instances, further impeding prediction performance. To address these issues, this paper proposes the MTLFormer approach, which leverages the self-attention mechanism of the Transformer network and performs multi-task parallel training through a feature representation shared across tasks. Our approach reduces the time complexity of model training while simultaneously improving prediction performance. We extensively evaluate our approach on four real-life event logs, demonstrating its capability to achieve multi-task online real-time prediction and to effectively improve prediction accuracy.
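The core idea described above, a self-attention encoder whose output is shared by three task-specific heads (next activity, next-activity execution time, remaining time), can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the single-head attention without learned projections, the embedding size, and the random weights are all placeholder assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X):
    # Single-head scaled dot-product self-attention over the event prefix.
    # Learned Q/K/V projections are omitted for brevity (assumption).
    d = X.shape[-1]
    scores = X @ X.T / np.sqrt(d)
    return softmax(scores, axis=-1) @ X

# Hypothetical setup: a running case with a prefix of 5 events, each
# embedded in d=16 dimensions, over a vocabulary of 8 activity types.
n_events, d, n_activities = 5, 16, 8
prefix = rng.standard_normal((n_events, d))

# Shared feature representation: attend over the prefix, then pool.
shared = self_attention(prefix).mean(axis=0)

# Three task-specific heads consuming the SAME shared vector, so the
# encoder is trained once for all tasks (multi-task parallel training).
W_act  = rng.standard_normal((d, n_activities))  # next-activity classifier
w_time = rng.standard_normal(d)                  # next-activity time regressor
w_rem  = rng.standard_normal(d)                  # remaining-time regressor

next_activity_probs = softmax(shared @ W_act)    # distribution over activities
next_event_time     = float(shared @ w_time)     # scalar regression output
remaining_time      = float(shared @ w_rem)      # scalar regression output
```

In training, the three heads' losses (cross-entropy plus two regression losses) would be summed so gradients flow through the one shared encoder, which is what removes the need for three separate models.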
