Applied Sciences (Jul 2022)

Intra-Domain Transfer Learning for Fault Diagnosis with Small Samples

  • Liangwei Zhang,
  • Junyan Zhang,
  • Yeping Peng,
  • Jing Lin

DOI
https://doi.org/10.3390/app12147032
Journal volume & issue
Vol. 12, no. 14
p. 7032

Abstract

The concept of deep transfer learning has spawned broad research into fault diagnosis with small samples. A considerable covariate shift between the source and target domains, however, can cause negative transfer and degrade the accuracy of the fault diagnosis task. To alleviate the adverse impact of negative transfer, this research proposes an intra-domain transfer learning strategy that exploits knowledge from a data-abundant source domain akin to the target domain. Concretely, a pre-trained model in the source domain is built via a vanilla transfer from an off-the-shelf inter-domain deep neural network. The model is then transferred to the target domain by freezing its shallow layers and fine-tuning the remaining layers with the small target-domain samples. In a case study on rotating machinery, the proposed strategy improved both training efficiency and prediction accuracy. To demystify the learned neural network, we propose a heat-map visualization method that takes a channel-wise average over the final convolutional layer and up-samples it with interpolation. The findings revealed that the most active neurons coincide with the corresponding fault characteristics.
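The two mechanical steps named in the abstract, freezing the shallow layers before fine-tuning on the small target set and building a heat map from a channel-wise average of the final convolutional layer followed by interpolation, can be illustrated with a short sketch. The snippet below is a minimal PyTorch illustration, not the authors' implementation: the network layout, checkpoint file name, and hyperparameters are assumptions made only for the example.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Sketch of intra-domain transfer: shallow-layer freezing plus fine-tuning on a
# small target-domain sample set. Architecture and names are illustrative only.
class FaultCNN(nn.Module):
    def __init__(self, n_classes: int = 4):
        super().__init__()
        self.shallow = nn.Sequential(            # early layers: frozen after transfer
            nn.Conv1d(1, 16, kernel_size=64, stride=8), nn.BatchNorm1d(16), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=3), nn.BatchNorm1d(32), nn.ReLU(),
        )
        self.last_conv = nn.Sequential(          # final conv block: source of the heat map
            nn.Conv1d(32, 64, kernel_size=3), nn.BatchNorm1d(64), nn.ReLU(),
        )
        self.head = nn.Sequential(nn.AdaptiveAvgPool1d(1), nn.Flatten(), nn.Linear(64, n_classes))

    def forward(self, x):
        return self.head(self.last_conv(self.shallow(x)))

model = FaultCNN()
# Hypothetical checkpoint pre-trained on the data-abundant source domain.
model.load_state_dict(torch.load("source_domain_pretrained.pt"))

# Freeze the shallow layers so only the deeper layers adapt to the small target set.
for p in model.shallow.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam((p for p in model.parameters() if p.requires_grad), lr=1e-4)
criterion = nn.CrossEntropyLoss()

def finetune_step(x, y):
    """One fine-tuning step on a small batch of target-domain samples."""
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    return loss.item()

def heat_map(x):
    """Channel-wise average over the final conv layer, up-sampled to the input length."""
    with torch.no_grad():
        feat = model.last_conv(model.shallow(x))      # (batch, channels, L')
        heat = feat.mean(dim=1, keepdim=True)         # channel-wise average
        return F.interpolate(heat, size=x.shape[-1], mode="linear", align_corners=False)
```

In this sketch only the unfrozen parameters are handed to the optimizer, so the shallow features learned in the source domain are preserved while the deeper layers and classifier adapt to the target domain; the heat map simply overlays the interpolated activation profile on the raw signal to indicate which segments drive the prediction.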

Keywords