Scientific Reports (Aug 2025)

MMTransformer: a multivariate time-series resource forecasting model for multi-component applications

  • Guangzhang Cui,
  • Tao Hu,
  • Wei Zhang,
  • Hujun Bao

DOI
https://doi.org/10.1038/s41598-025-07162-8
Journal volume & issue
Vol. 15, no. 1
pp. 1–16

Abstract

Efficient resource forecasting in multi-component application scenarios requires comprehensive consideration of inter-component dependencies and resource interaction characteristics. Existing methods primarily rely on single-step predictions, adopt univariate models, and ignore inter-component dependencies, making them less effective at capturing the complex dynamics of multi-component applications. To address these challenges, this study introduces MMTransformer, a multivariate time-series forecasting model designed for multi-component applications. The model offers several innovations: (1) a segmented embedding strategy to effectively capture sequence features; (2) a multi-stage attention mechanism to model intricate inter-variable dependencies; and (3) a multi-scale encoder-decoder structure to adapt to dynamic variations in local and global information. To evaluate the model’s performance, we constructed workload datasets for courseware production and digital human video creation systems from real-world application scenarios, and established three key performance metrics by monitoring core resource states. Experimental results indicate that MMTransformer achieves average reductions of 42.15% in MSE and 35.37% in MAE compared with traditional time-series models such as LSTM, GRU, and RNN. Compared with state-of-the-art time-series models such as Fedformer, Autoformer, and Informer, it reduces MSE and MAE by an average of 27.14% and 25.55%, respectively. These findings confirm that MMTransformer significantly enhances resource prediction accuracy in multi-component applications.
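
The paper's implementation is not reproduced on this page. As an illustration only, the following is a minimal PyTorch sketch of what a segmented (patch-style) embedding for multivariate resource series might look like, assuming a fixed segment length and a linear projection per segment; the class name SegmentedEmbedding, the segment length, and the model dimension are hypothetical and not taken from the paper.

import torch
import torch.nn as nn

class SegmentedEmbedding(nn.Module):
    """Split each variable's series into fixed-length segments and
    project every segment to a d_model-dimensional token.
    Illustrative sketch only; not the authors' implementation."""

    def __init__(self, seg_len: int, d_model: int):
        super().__init__()
        self.seg_len = seg_len
        self.proj = nn.Linear(seg_len, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_vars, seq_len); seq_len is assumed divisible by seg_len
        b, v, t = x.shape
        x = x.reshape(b, v, t // self.seg_len, self.seg_len)  # segment each series
        return self.proj(x)  # (batch, n_vars, n_segments, d_model)

if __name__ == "__main__":
    emb = SegmentedEmbedding(seg_len=16, d_model=64)
    tokens = emb(torch.randn(8, 3, 96))  # e.g. 3 resource metrics, 96 time steps
    print(tokens.shape)                  # torch.Size([8, 3, 6, 64])

The resulting per-variable segment tokens could then be fed to attention layers that model dependencies across segments and across variables, in the spirit of the multi-stage attention described above.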

Keywords