IEEE Access (Jan 2021)

Unsupervised Domain Adaptation Based on Pseudo-Label Confidence

  • Tingting Fu,
  • Ying Li

DOI
https://doi.org/10.1109/ACCESS.2021.3087867
Journal volume & issue
Vol. 9
pp. 87049 – 87057

Abstract


Unsupervised domain adaptation aims to align the distributions of data in the source and target domains and to assign labels to data in the target domain. In this paper, we propose a new method named Unsupervised Domain Adaptation based on Pseudo-Label Confidence (UDA-PLC). Concretely, UDA-PLC first learns a new feature representation by projecting the data of the source and target domains into a latent subspace. In this subspace, the distributions of the two domains are aligned and the discriminability of features in both domains is improved. Then, UDA-PLC applies Structured Prediction (SP) and Nearest Class Prototype (NCP) to predict pseudo-labels for data in the target domain, and it carries only a fraction of high-confidence samples, rather than all pseudo-labeled target samples, into the next iteration of learning. Finally, experimental results validate that the proposed method outperforms several state-of-the-art methods on three benchmark data sets.
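To make the pseudo-labeling step more concrete, the following sketch illustrates one plausible reading of Nearest Class Prototype (NCP) assignment combined with confidence-based sample selection. The function name ncp_pseudo_labels, the cosine-similarity confidence measure, and the keep_ratio parameter are illustrative assumptions, not details taken from the paper; the full UDA-PLC method additionally uses Structured Prediction and an iteratively learned subspace.

    import numpy as np

    def ncp_pseudo_labels(Xs, ys, Xt, keep_ratio=0.3):
        """Assign pseudo-labels to target samples via nearest class prototype
        and keep only the most confident fraction (a minimal sketch).

        Xs: (n_s, d) source features; ys: (n_s,) source labels;
        Xt: (n_t, d) target features; keep_ratio: fraction of target samples kept.
        Returns indices of the kept target samples and their pseudo-labels.
        """
        classes = np.unique(ys)
        # Class prototypes: mean source feature per class, L2-normalised.
        prototypes = np.stack([Xs[ys == c].mean(axis=0) for c in classes])
        prototypes /= np.linalg.norm(prototypes, axis=1, keepdims=True) + 1e-12

        Xt_norm = Xt / (np.linalg.norm(Xt, axis=1, keepdims=True) + 1e-12)
        # Cosine similarity of each target sample to each class prototype.
        sim = Xt_norm @ prototypes.T                 # shape (n_t, n_classes)
        pseudo = classes[sim.argmax(axis=1)]         # nearest-prototype label
        confidence = sim.max(axis=1)                 # similarity used as a confidence proxy

        # Keep only the top keep_ratio most confident target samples
        # for the next round of learning.
        n_keep = max(1, int(keep_ratio * len(Xt)))
        keep_idx = np.argsort(-confidence)[:n_keep]
        return keep_idx, pseudo[keep_idx]

In an iterative scheme of this kind, the selected high-confidence target samples and their pseudo-labels would be fed back into the next round of subspace learning, while the remaining low-confidence samples are re-labeled in later iterations.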

Keywords