Proceedings of the XXth Conference of Open Innovations Association FRUCT (Apr 2022)

AGBoost: Attention-based Modification of Gradient Boosting Machine

  • Andrei Konstantinov,
  • Lev Utkin,
  • Stanislav Kirpichenko

DOI
https://doi.org/10.23919/FRUCT54823.2022.9770928
Journal volume & issue
Vol. 31, no. 1
pp. 96 – 101

Abstract

A new attention-based model for the gradient boosting machine (GBM), called AGBoost (attention-based gradient boosting), is proposed for solving regression problems. The main idea behind the proposed AGBoost model is to assign attention weights with trainable parameters to the iterations of GBM, under the condition that decision trees are the base learners in GBM. The attention weights are determined by applying properties of decision trees and by using Huber's contamination model, which provides an interesting linear dependence between the trainable parameters of the attention and the attention weights. This peculiarity allows us to train the attention weights by solving a standard quadratic optimization problem with linear constraints. The attention weights also depend on a discount factor, a tuning parameter that determines how much the impact of a weight decreases with the number of iterations. Numerical experiments performed for two types of base learners, original decision trees and extremely randomized trees, on various regression datasets illustrate the proposed model.
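The mechanism described in the abstract can be sketched in code. The snippet below is not the authors' implementation; it is a minimal illustration under several assumptions: the per-iteration quantities being reweighted are taken to be the cumulative GBM predictions, the fixed attention component is a discounted distribution over iterations (the discount factor as tuning parameter), the trainable component enters linearly through a contamination-style mixture `w = (1 - eps) * p + eps * g`, and the resulting quadratic problem with simplex constraints is solved by projected gradient descent rather than a QP solver. All function and parameter names are invented for this sketch.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def fit_agboost_sketch(X, y, n_iter=20, lr=0.1, depth=3,
                       discount=0.9, eps=0.3, steps=300, step_size=0.5):
    """Fit a plain GBM with tree base learners, then learn attention
    weights over its iterations (an illustrative reading of AGBoost)."""
    trees, F_cols = [], []
    f = np.full(len(y), y.mean())
    for _ in range(n_iter):
        # standard gradient boosting step for squared loss: fit residuals
        tree = DecisionTreeRegressor(max_depth=depth).fit(X, y - f)
        f = f + lr * tree.predict(X)
        trees.append(tree)
        F_cols.append(f.copy())          # cumulative prediction after step t
    F = np.column_stack(F_cols)

    # fixed attention component: discounted weights favouring later iterations
    p = discount ** np.arange(n_iter)[::-1]
    p = p / p.sum()

    # trainable component g: because w depends linearly on g, minimizing
    # ||F @ ((1-eps)*p + eps*g) - y||^2 over the simplex is a quadratic
    # program with linear constraints; solved here by projected gradient
    g = np.full(n_iter, 1.0 / n_iter)
    for _ in range(steps):
        r = F @ ((1 - eps) * p + eps * g) - y
        g = project_simplex(g - step_size * eps * F.T @ r / len(y))
    return trees, lr, y.mean(), (1 - eps) * p + eps * g

def predict_agboost(trees, lr, y0, w, X):
    """Attention-weighted combination of the per-iteration predictions."""
    f = np.full(len(X), y0)
    out = np.zeros(len(X))
    for t, tree in enumerate(trees):
        f = f + lr * tree.predict(X)
        out += w[t] * f
    return out
```

With uniform weights the combination reduces to an average of the intermediate GBM models; the learned weights instead emphasize the iterations that best explain the training targets, subject to the simplex constraint implied by the contamination model.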

Keywords