IEEE Access (Jan 2020)

FSPMTL: Flexible Self-Paced Multi-Task Learning

  • Lijian Sun,
  • Yun Zhou

DOI
https://doi.org/10.1109/ACCESS.2020.3009988
Journal volume & issue
Vol. 8
pp. 132012–132020

Abstract


Multi-Task Learning (MTL) is a method that simultaneously exploits commonalities and differences across tasks to improve learning performance with limited data. However, most real-world problems contain many noisy samples, which can significantly degrade the performance of MTL. To address this challenge, the Self-Paced Learning (SPL) method is introduced, which improves performance by gradually increasing the number of training instances, proceeding from the easiest samples to the most difficult ones. In current self-paced multi-task learning methods, SPL is introduced as a term in the optimization objective, which significantly limits how SPL and MTL can be combined. In this paper, we propose a new flexible framework that combines MTL with SPL through a two-stage learning process, making it better suited to learning difficult samples and tasks. With this framework, we are able to take advantage of both existing MTL models and existing SPL models. Experiments on synthetic and real-world datasets demonstrate that our approach outperforms other state-of-the-art models.
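The abstract's core idea of self-paced learning, admitting samples into training from easiest to hardest, can be sketched with the classic hard-weighting scheme (weight 1 if a sample's loss falls below an "age" parameter λ, else 0). This is a minimal illustrative sketch of generic SPL, not the paper's specific FSPMTL framework; the function name and the toy loss values are hypothetical.

```python
import numpy as np

def self_paced_weights(losses, lam):
    """Hard self-paced weighting: admit only samples whose current loss
    is below the age parameter lam (easy samples first); harder samples
    are deferred until lam is raised in a later round."""
    return (losses < lam).astype(float)

# Toy per-sample losses (hypothetical). As lam grows across training
# rounds, progressively harder samples are admitted.
losses = np.array([0.1, 0.5, 1.2, 2.0])
for lam in (0.6, 1.5, 2.5):
    print(lam, self_paced_weights(losses, lam))
```

Raising λ each round grows the active training set until all samples, including the hardest, participate.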

Keywords