IEEE Access (Jan 2023)

Early Exit Strategies for Learning-to-Rank Cascades

  • Francesco Busolin,
  • Claudio Lucchese,
  • Franco Maria Nardini,
  • Salvatore Orlando,
  • Raffaele Perego,
  • Salvatore Trani

DOI
https://doi.org/10.1109/ACCESS.2023.3331088
Journal volume & issue
Vol. 11
pp. 126691–126704

Abstract

The ranking pipelines of modern search platforms commonly exploit complex machine-learned models and have a significant impact on the query response time. In this paper, we discuss several techniques to speed up the document scoring process based on large ensembles of decision trees without hindering ranking quality. Specifically, we study the problem of document early exit within the framework of a cascading ranker made of three components: 1) an efficient but sub-optimal ranking stage; 2) a pruner that exploits signals from the previous component to force the early exit of documents classified as not relevant; and 3) a final high-quality component aimed at finely ranking the documents that survived the previous phase. To maximize speedup and preserve effectiveness, we aim to increase the accuracy of the pruner in identifying non-relevant documents without early exiting documents that are likely to be ranked among the final top-$k$ results. We propose an in-depth study of heuristic and machine-learning techniques for designing the pruner. While the heuristic technique only exploits the score/ranking information supplied by the first sub-optimal ranker, the machine-learned solution named LEAR uses these signals as additional features along with those representing query-document pairs. Moreover, we study alternative solutions to implement the first ranker, either a small prefix of the original forest or an auxiliary machine-learned ranker explicitly trained for this purpose. We evaluated our techniques through reproducible experiments using publicly available datasets and state-of-the-art competitors. The experiments confirm that our early-exit strategies achieve speedups ranging from $3\times$ to $10\times$ without statistically significant differences in effectiveness.
