IEEE Access (Jan 2021)

Stability Metrics for Enhancing the Evaluation of Outcome-Based Business Process Predictive Monitoring

  • Jongchan Kim,
  • Marco Comuzzi

DOI
https://doi.org/10.1109/ACCESS.2021.3115759
Journal volume & issue
Vol. 9
pp. 133461 – 133471

Abstract

Outcome-based predictive process monitoring deals with predicting the outcomes of running cases in a business process using feature vectors extracted from completed traces in an event log. Traditionally, in outcome-based predictive monitoring, feature vectors are grouped into buckets, each containing a different type of feature vector, and a different model is developed for each bucket. This allows us to extend the traditional evaluation of the quality of process outcome prediction models beyond simply measuring overall performance, developing a quality assessment framework based on three metrics: one considering the overall performance on all feature vectors; one considering the different levels of performance achieved on the feature vectors belonging to individual buckets, i.e., the stability of the performance across buckets; and one considering the stability of the individual predictions obtained, accounting for how close the predicted probabilities are to the cutoff thresholds used to determine the predicted labels. Given a set of alternative designs, i.e., combinations of classifier and bucketing method, the proposed metrics allow the quality of the predictions of each alternative to be evaluated. For this evaluation, we suggest using either the concept of Pareto-optimality or a scenario-based scoring method. We discuss an evaluation of the proposed framework conducted with real-life event logs.
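The three metrics and the Pareto-based comparison of alternatives can be illustrated with a minimal sketch. The exact formulas are not given in the abstract, so the definitions below are plausible stand-ins and assumptions: overall performance as AUC over all feature vectors, cross-bucket stability as one minus the standard deviation of per-bucket AUC, and prediction stability as the mean distance of predicted probabilities from the cutoff threshold.

```python
# Hypothetical sketch of the three quality metrics described in the abstract;
# the concrete formulas here are illustrative assumptions, not the authors'.
from statistics import mean, stdev


def auc(y_true, y_prob):
    """Plain-Python AUC via pairwise comparison (Mann-Whitney statistic)."""
    pos = [p for y, p in zip(y_true, y_prob) if y == 1]
    neg = [p for y, p in zip(y_true, y_prob) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))


def quality_metrics(buckets, cutoff=0.5):
    """buckets: list of (y_true, y_prob) pairs, one per bucket.

    Returns (overall performance, cross-bucket stability,
    prediction stability); higher is better for all three.
    """
    all_y = [y for ys, _ in buckets for y in ys]
    all_p = [p for _, ps in buckets for p in ps]
    overall = auc(all_y, all_p)                       # metric 1: all vectors
    per_bucket = [auc(ys, ps) for ys, ps in buckets]  # metric 2: per bucket
    bucket_stability = 1 - (stdev(per_bucket) if len(per_bucket) > 1 else 0.0)
    # metric 3: how far predicted probabilities sit from the cutoff
    prediction_stability = mean(abs(p - cutoff) for p in all_p)
    return overall, bucket_stability, prediction_stability


def pareto_front(alternatives):
    """Keep alternatives (metric triples) not dominated on all three metrics."""
    def dominates(a, b):
        return all(x >= y for x, y in zip(a, b)) and \
               any(x > y for x, y in zip(a, b))
    return [a for a in alternatives
            if not any(dominates(b, a) for b in alternatives if b != a)]
```

In this sketch, each alternative design (classifier plus bucketing method) yields one metric triple, and `pareto_front` discards any alternative that another alternative beats on every metric, mirroring the Pareto-optimality option mentioned in the abstract.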

Keywords