Entropy (Oct 2024)

Mixed Mutual Transfer for Long-Tailed Image Classification

  • Ning Ren,
  • Xiaosong Li,
  • Yanxia Wu,
  • Yan Fu

DOI
https://doi.org/10.3390/e26100839
Journal volume & issue
Vol. 26, no. 10
p. 839

Abstract

Real-world datasets often follow a long-tailed distribution, where a few majority (head) classes contain a large number of samples, while many minority (tail) classes contain significantly fewer. This imbalance creates an information disparity between head and tail classes, which can degrade the performance of deep networks. Some knowledge transfer techniques attempt to mitigate this gap by generating additional minority samples, either through oversampling the tail classes or transferring knowledge from the head classes to the tail classes. However, these methods often limit the diversity of the generated minority samples, since they focus on transferring information only to the tail classes. This paper introduces a simple yet effective method for long-tailed classification, called mixed mutual transfer (MMT), which facilitates the mutual transfer of knowledge between head and tail classes by blending samples. The core idea of our method is to create new samples by blending head and tail samples. Head samples are selected using a uniform sampler that retains the long-tailed distribution, while tail samples are selected using a differential sampler that reverses the long-tailed distribution to alleviate imbalance. Our approach thus diversifies both the tail and head classes. During training, the generated samples are used to update the original dataset on which the deep networks are trained. Mixed mutual transfer simultaneously enhances the performance of both head and tail classes. Experimental results on various class-imbalanced datasets show that the proposed method significantly outperforms existing methods, demonstrating its effectiveness in improving the performance of long-tailed deep networks.
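The sampling-and-blending scheme described in the abstract can be illustrated in code. The paper's exact blending rule and training schedule are not given here, so the following is a minimal PyTorch sketch, assuming a mixup-style convex combination and inverse-class-frequency weights for the reversed sampler; the function names (reversed_sampler, mix_batches) and the Beta-distributed mixing coefficient are illustrative assumptions, not the authors' implementation.

    import torch
    from torch.utils.data import WeightedRandomSampler

    def reversed_sampler(labels, num_samples):
        # Differential sampler (assumed form): weight each sample by the
        # inverse of its class frequency, so the long-tailed distribution
        # is reversed and tail classes are drawn more often.
        labels = torch.as_tensor(labels)
        class_counts = torch.bincount(labels).float()
        weights = 1.0 / class_counts[labels]  # per-sample weight
        return WeightedRandomSampler(weights, num_samples, replacement=True)

    def mix_batches(x_uniform, y_uniform, x_reversed, y_reversed, alpha=1.0):
        # Blend a uniformly sampled batch (dominated by head classes) with
        # a reverse-sampled batch (dominated by tail classes). A mixup-style
        # convex combination is assumed; batches must share the same shape.
        lam = torch.distributions.Beta(alpha, alpha).sample().item()
        x_mixed = lam * x_uniform + (1.0 - lam) * x_reversed
        return x_mixed, y_uniform, y_reversed, lam

    # Hypothetical training step: the mixed batch contributes to both label
    # sets, so head and tail classes are updated simultaneously.
    # out = model(x_mixed)
    # loss = lam * criterion(out, y_uniform) + (1 - lam) * criterion(out, y_reversed)

In this sketch, one data loader would use plain uniform sampling (preserving the long-tailed distribution) while a second would use reversed_sampler, and corresponding batches would be blended at each training step.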

Keywords