Applied AI Letters (Dec 2022)

Twin neural network regression

  • Sebastian Johann Wetzel,
  • Kevin Ryczko,
  • Roger Gordon Melko,
  • Isaac Tamblyn

DOI
https://doi.org/10.1002/ail2.78
Journal volume & issue
Vol. 3, no. 4
pp. n/a – n/a

Abstract


We introduce twin neural network regression (TNNR). This method predicts differences between the target values of two different data points rather than the targets themselves. The solution of a traditional regression problem is then obtained by averaging over an ensemble of all predicted differences between the target of an unseen data point and the targets of all training data points. Whereas ensembles are normally costly to produce, TNNR intrinsically creates an ensemble of predictions twice the size of the training set while training only a single neural network. Since ensembles have been shown to be more accurate than single models, this property naturally transfers to TNNR. We show that TNNs can match or exceed the accuracy of other state-of-the-art methods on a variety of data sets. Furthermore, TNNR is constrained by self-consistency conditions. We find that the violation of these conditions provides a signal for the prediction uncertainty.
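The inference procedure described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: `diff_model` stands in for the trained twin network F(x1, x2) ≈ y1 − y2 (here a toy exact-difference oracle), and the function names are hypothetical. Each training point contributes two estimates of the unseen target, one from each argument order, yielding the ensemble of twice the training-set size; the spread of the ensemble plays the role of the self-consistency-based uncertainty signal.

```python
import numpy as np

def diff_model(x1, x2):
    """Stand-in for the trained twin network F(x1, x2) ~ y(x1) - y(x2).

    Here a toy oracle over a hidden linear target; in practice this
    would be a neural network trained on pairs of data points.
    """
    f = lambda x: 3.0 * x + 1.0  # hidden target function (toy assumption)
    return f(x1) - f(x2)

def tnnr_predict(x_new, X_train, y_train):
    """Average an ensemble of pairwise-difference predictions.

    Each training point (x_j, y_j) yields two estimates of y(x_new):
        y_j + F(x_new, x_j)   and   y_j - F(x_j, x_new),
    so the ensemble has size 2 * len(X_train).
    """
    preds = []
    for x_j, y_j in zip(X_train, y_train):
        preds.append(y_j + diff_model(x_new, x_j))
        preds.append(y_j - diff_model(x_j, x_new))
    # The ensemble mean is the prediction; the spread reflects violations
    # of self-consistency (e.g., F(a, b) != -F(b, a)) and signals uncertainty.
    return float(np.mean(preds)), float(np.std(preds))

X_train = np.array([0.0, 1.0, 2.0])
y_train = 3.0 * X_train + 1.0
mean, spread = tnnr_predict(1.5, X_train, y_train)
print(mean, spread)
```

With the exact toy oracle every ensemble member agrees, so the spread is zero; with a trained network the residual spread serves as the uncertainty signal mentioned in the abstract.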

Keywords