International Journal of Computational Intelligence Systems (Jun 2023)

Multi-source Transfer Learning Based on the Power Set Framework

  • Bingbing Song,
  • Jianhan Pan,
  • Qiaoli Qu,
  • Zexin Li

DOI
https://doi.org/10.1007/s44196-023-00281-y
Journal volume & issue
Vol. 16, no. 1
pp. 1 – 14

Abstract


Transfer learning is a powerful technique that can leverage knowledge from label-rich domains to address problems in similar domains that lack labeled data. Most previous works focus on single-source transfer, assuming the source domain contains sufficient labeled data and is close to the target domain. In practical applications, however, this assumption rarely holds, and labeled data are scattered across different domains. To improve the adaptability of transfer learning models to multi-source scenarios, many existing methods exploit the commonality and specificity across source domains. They either map all source domains together with the target domain into a common feature space for knowledge transfer, or combine multiple classifiers, each trained on a pair of one source and the target, to form a target classifier. However, the correlations across multiple source domains, which can significantly affect learning performance, are ignored. In light of this, we propose a novel multi-source transfer learning method based on the power set framework (PSF-MSTL). First, PSF-MSTL constructs a power set framework that enables different source domains to be interrelated. Second, PSF-MSTL makes the source-domain framework integral and able to provide complementary knowledge using a dual-promotion strategy. Additionally, PSF-MSTL is formulated as an optimization problem, and an iterative algorithm is presented to solve it. Finally, we conduct extensive experiments showing that PSF-MSTL outperforms many advanced multi-source transfer learning methods.
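The abstract does not detail the power set framework itself, but its core building block, relating source domains through all their possible combinations, can be sketched. In the snippet below the function name and the domain labels are illustrative, not taken from the paper; it merely shows how the subsets over which source domains could be interrelated would be enumerated.

```python
from itertools import combinations

def source_power_set(domains):
    """Enumerate all non-empty subsets of the given source domains.

    Each subset is a candidate group of interrelated sources; for n
    domains this yields 2^n - 1 groups (the empty set is excluded).
    """
    return [list(c)
            for r in range(1, len(domains) + 1)
            for c in combinations(domains, r)]

# Hypothetical labels for three labeled source domains.
sources = ["S1", "S2", "S3"]
groups = source_power_set(sources)
print(len(groups))  # 2^3 - 1 = 7 non-empty subsets
```

A framework built over these subsets can let each group of sources contribute knowledge jointly rather than pairing every source with the target in isolation, which is the limitation the abstract attributes to earlier methods.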

Keywords