Complex & Intelligent Systems (Apr 2023)
A time-sensitive learning-to-rank approach for cloud simulation resource prediction
Abstract
Predicting the computing resources required by simulation applications can yield a more reasonable resource-allocation scheme for efficient execution. Existing machine-learning prediction methods, such as classification and regression, must typically predict the runtime of a simulation application accurately and then select the optimal computing-resource allocation scheme by sorting the predicted runtimes. The ranking results are therefore easily degraded by errors in the runtime prediction. This study proposes a time-sensitive learning-to-rank (LTR) approach for cloud simulation resource prediction. First, we use Shapley additive explanation (SHAP) values from the field of explainable artificial intelligence (XAI) to analyze the impact of relevant factors on the simulation runtime and to extract the feature dimensions that significantly affect it. Second, by modifying the target loss function of the RankBoost algorithm and training a time-sensitive LTR model on simulation features, we can accurately predict the computing-resource allocation scheme that maximizes the execution efficiency of a simulation application. Compared with traditional machine-learning prediction algorithms, the proposed method improves the average ranking performance by 3%–48% and accurately predicts the computing resources that allow a simulation application to execute in the shortest time.
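To make the two steps of the abstract concrete, the following is a minimal sketch, not the authors' code: it shows (1) SHAP-based feature screening with a surrogate model and (2) a time-sensitive pairwise ranking loss in the spirit of RankBoost. The toy feature matrix, the runtime function, and the runtime-gap weighting in the loss are illustrative assumptions; the paper's actual loss modification may differ.

```python
# Sketch of the approach described in the abstract (assumed details marked).
import numpy as np
import shap
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy stand-in for simulation features (e.g., entity count, CPU cores, memory).
X = rng.uniform(size=(200, 5))
runtime = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(scale=0.1, size=200)

# Step 1: fit a surrogate model and rank features by mean |SHAP| value,
# keeping only the dimensions that most influence the simulation runtime.
model = GradientBoostingRegressor().fit(X, runtime)
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X)
importance = np.abs(shap_values).mean(axis=0)
top_features = np.argsort(importance)[::-1][:3]

# Step 2: a time-sensitive pairwise loss. Plain RankBoost uses an exponential
# loss over misordered pairs; here each pair is additionally weighted by the
# runtime gap (an assumed form of "time sensitivity"), so confusing two
# schemes with very different runtimes costs more than confusing close ones.
def time_sensitive_pair_loss(scores, runtimes):
    loss = 0.0
    for i in range(len(scores)):
        for j in range(len(scores)):
            if runtimes[i] < runtimes[j]:  # scheme i should rank ahead of j
                gap = runtimes[j] - runtimes[i]  # sensitivity weight
                loss += gap * np.exp(-(scores[i] - scores[j]))
    return loss
```

Under this weighting, a boosting-style learner that minimizes the loss is pushed hardest to order the resource-allocation schemes whose runtimes differ most, which matches the abstract's goal of reliably identifying the fastest scheme rather than predicting every runtime exactly.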
Keywords