ETRI Journal (Dec 2024)

Improved contrastive learning model via identification of false-negatives in self-supervised learning

  • Joonsun Auhn,
  • Changsik Cho,
  • Seon-tae Kim

DOI
https://doi.org/10.4218/etrij.2023-0285
Journal volume & issue
Vol. 46, no. 6
pp. 1020–1029

Abstract

Self-supervised learning is a method that learns data representations from unlabeled data. It is efficient because it learns from large-scale unlabeled data, and through continued research it has reached performance comparable to that of supervised learning. Contrastive learning, a type of self-supervised learning algorithm, utilizes data similarity to perform instance-level learning within an embedding space. However, it suffers from the problem of false-negatives: samples of the same class that are incorrectly treated as negatives while training the data representation. These result in a loss of information and deteriorate the performance of the model. This study employs cosine similarity and temperature simultaneously to identify false-negatives and mitigate their impact, thereby improving the performance of the contrastive learning model. The proposed method exhibited a performance improvement of up to 2.7% over the existing algorithm on the CIFAR-100 dataset. Improved performance was also observed on other datasets such as CIFAR-10 and ImageNet.
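The abstract does not give implementation details, but the general idea can be illustrated with a minimal sketch: an NT-Xent-style contrastive loss in which negatives whose cosine similarity to the anchor exceeds a threshold are masked out as suspected false-negatives. The `fn_threshold` masking rule below is a hypothetical stand-in for the paper's identification method, not the authors' exact algorithm.

```python
import numpy as np

def nt_xent_with_fn_mask(z, temperature=0.5, fn_threshold=0.9):
    """Illustrative NT-Xent loss with false-negative masking.

    z: (2N, d) array of embeddings; rows i and i+N are the two augmented
    views of the same image. Negatives whose cosine similarity to the
    anchor exceeds `fn_threshold` are dropped from the denominator, on
    the assumption that they belong to the same class (false-negatives).
    The threshold rule is a hypothetical stand-in for the paper's method.
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)   # cosine via dot product
    n = z.shape[0] // 2
    cos = z @ z.T                                      # pairwise cosine similarity
    sim = cos / temperature                            # temperature-scaled logits
    np.fill_diagonal(sim, -np.inf)                     # exclude self-pairs
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])  # positive indices
    # Mask suspected false-negatives: non-positive pairs with high similarity.
    mask = cos > fn_threshold
    mask[np.arange(2 * n), pos] = False                # never mask the true positive
    sim = np.where(mask, -np.inf, sim)
    logits = sim - sim.max(axis=1, keepdims=True)      # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

With `fn_threshold >= 1`, no negatives are masked and the function reduces to the standard NT-Xent loss; lowering the threshold trades off discarding true negatives against removing likely false-negatives.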

Keywords