Open Computer Science (Oct 2019)

Semi-Supervised learning with Collaborative Bagged Multi-label K-Nearest-Neighbors

  • Settouti Nesma,
  • Douibi Khalida,
  • Bechar Mohammed El Amine,
  • Daho Mostafa El Habib,
  • Saidi Meryem

DOI
https://doi.org/10.1515/comp-2019-0017
Journal volume & issue
Vol. 9, no. 1
pp. 226 – 242

Abstract


Over the last few years, multi-label classification has received significant attention from researchers as a way to solve issues in many fields. The manual annotation of available datasets is time-consuming and requires a huge effort from the expert, especially for multi-label applications in which each learning example is associated with many labels at once. To overcome the drawback of manual annotation, and to take advantage of the large amounts of unlabeled data, many semi-supervised approaches have been proposed in the literature to provide more sophisticated and faster solutions supporting the automatic labeling of unlabeled data. In this paper, a Collaborative Bagged Multi-label K-Nearest-Neighbors (CobMLKNN) algorithm is proposed, which extends the co-training paradigm with the Multi-label K-Nearest-Neighbors (MLKNN) algorithm. Experiments on ten real-world multi-label datasets show the effectiveness of the CobMLKNN algorithm at improving the performance of MLKNN when learning from a small number of labeled samples by exploiting unlabeled samples.
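To make the overall idea concrete, below is a minimal sketch of a co-training-style loop with two bagged multi-label KNN learners that pseudo-label confident unlabeled samples. The abstract does not specify the paper's exact CobMLKNN procedure, so the bag construction, confidence criterion, threshold, and stopping rule here are illustrative assumptions, and a one-vs-rest KNeighborsClassifier stands in for the MLKNN base learner; the function and parameter names (cob_mlknn_sketch, thresh, rounds) are hypothetical.

```python
# Illustrative sketch only: a simplified collaborative-bagging loop with two
# multi-label KNN learners. The confidence rule, bag construction and stopping
# criterion are assumptions, not the paper's exact CobMLKNN procedure.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.multioutput import MultiOutputClassifier


def bootstrap(X, Y, rng):
    """Draw a bootstrap replicate (bagging) of the labeled pool."""
    idx = rng.choice(len(X), size=len(X), replace=True)
    return X[idx], Y[idx]


def prob_positive(est, X):
    """P(label = 1) from a per-label binary KNN estimator."""
    p = est.predict_proba(X)
    if 1 in est.classes_:
        return p[:, list(est.classes_).index(1)]
    return np.zeros(len(X))  # bag contained only negatives for this label


def cob_mlknn_sketch(X_lab, Y_lab, X_unlab, k=10, rounds=5, thresh=0.9, seed=0):
    rng = np.random.default_rng(seed)
    X_l, Y_l, X_u = X_lab.copy(), Y_lab.copy(), X_unlab.copy()
    learners = []
    for _ in range(rounds):
        if len(X_u) == 0:
            break
        # Two collaborating learners, each trained on its own bootstrap bag.
        learners = []
        for _ in range(2):
            Xb, Yb = bootstrap(X_l, Y_l, rng)
            clf = MultiOutputClassifier(KNeighborsClassifier(n_neighbors=k))
            learners.append(clf.fit(Xb, Yb))
        # Average the per-label posteriors over the two learners.
        p_mean = np.mean(
            [np.column_stack([prob_positive(est, X_u) for est in clf.estimators_])
             for clf in learners],
            axis=0,
        )
        # Pseudo-label unlabeled samples whose least certain label prediction
        # is still above the threshold (assumed confidence criterion).
        conf = np.maximum(p_mean, 1.0 - p_mean).min(axis=1)
        keep = conf >= thresh
        if not keep.any():
            break
        X_l = np.vstack([X_l, X_u[keep]])
        Y_l = np.vstack([Y_l, (p_mean[keep] >= 0.5).astype(int)])
        X_u = X_u[~keep]
    return learners, X_l, Y_l
```

In this sketch the two bagged learners "collaborate" by averaging their label posteriors before pseudo-labeling, which is one common way to combine co-training with bagging; the published algorithm may use a different exchange or confidence scheme.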

Keywords