Mathematics (Aug 2022)

An Ensemble and Iterative Recovery Strategy Based kGNN Method to Edit Data with Label Noise

  • Baiyun Chen,
  • Longhai Huang,
  • Zizhong Chen,
  • Guoyin Wang

DOI
https://doi.org/10.3390/math10152743
Journal volume & issue
Vol. 10, no. 15
p. 2743

Abstract


Learning with label noise is gaining increasing attention across a variety of disciplines, particularly in supervised machine learning for classification tasks. The k nearest neighbors (kNN) classifier is often used as a natural way to edit training sets because of its sensitivity to label noise. However, a kNN-based editor may remove too many instances if it is not carefully designed to handle label noise. In addition, the one-sided nearest neighbor (NN) rule is unconvincing, as it considers neighbors only from the perspective of the query sample. In this paper, we propose an ensemble and iterative recovery strategy-based kGNN method (EIRS-kGNN) to edit data with label noise. EIRS-kGNN first uses general nearest neighbors (GNN) to expand the one-sided NN rule into a binary-sided NN rule that also takes the neighborhoods of the queried samples into account. Then, it ensembles the predictions of the kGNN over a finite set of k values to prudently judge the noise level of each sample. Finally, two loops, an inner loop and an outer loop, are used to iteratively detect label noise. A frequency indicator derived from the iterative process guides a mixture of approaches, relabeling and removing, to deal with the detected label noise. The goal of EIRS-kGNN is to recover the distribution of the data set as if it had not been corrupted. Experimental results on both synthetic data sets and UCI benchmarks, including binary and multi-class data sets, demonstrate the effectiveness of the proposed EIRS-kGNN method.
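To make the workflow described above concrete, the following is a minimal Python sketch of the main ideas: a general (bidirectional) neighborhood, an ensemble vote over several values of k, and an iterative loop whose flagging frequency decides between relabeling and removal. The helper names, the k grid, the unanimity voting rule, and the thresholds are illustrative assumptions, not the paper's exact algorithm.

import numpy as np
from collections import Counter
from sklearn.neighbors import NearestNeighbors

def general_neighbors(X, k):
    """General nearest neighbors: each sample's kNN plus its reverse kNN."""
    _, idx = NearestNeighbors(n_neighbors=k + 1).fit(X).kneighbors(X)
    gnn = [set(row[1:]) for row in idx]        # drop self (column 0)
    for i, row in enumerate(idx):
        for j in row[1:]:
            gnn[j].add(i)                      # i is a reverse neighbor of j
    return gnn

def majority(labels):
    return Counter(labels).most_common(1)[0][0]

def edit_labels(X, y, ks=(3, 5, 7), n_rounds=3, remove_thr=0.9):
    """Iteratively flag samples whose label disagrees with the majority of
    their general neighborhood for every k in the ensemble; tentatively
    relabel them, then remove samples flagged in nearly every round."""
    y = np.asarray(y).copy()
    flag_counts = np.zeros(len(y))
    for _ in range(n_rounds):                  # outer iterative loop
        gnns = {k: general_neighbors(X, k) for k in ks}
        votes = np.zeros(len(y))
        for k in ks:                           # inner ensemble over k values
            for i, neigh in enumerate(gnns[k]):
                if neigh and majority([y[j] for j in neigh]) != y[i]:
                    votes[i] += 1
        flagged = np.flatnonzero(votes == len(ks))   # unanimous disagreement
        flag_counts[flagged] += 1
        for i in flagged:                      # tentative relabeling
            y[i] = majority([y[j] for j in gnns[max(ks)][i]])
    freq = flag_counts / n_rounds              # frequency indicator
    keep = freq < remove_thr                   # drop persistently noisy samples
    return X[keep], y[keep]

In this sketch, a sample whose relabeled value comes to agree with its general neighborhood stops being flagged in later rounds and is kept with the corrected label, while samples that remain in conflict across rounds accumulate a high flagging frequency and are removed, mirroring the mixture of relabeling and removing guided by the frequency indicator.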

Keywords