Scientific Reports (Apr 2023)

Light convolutional neural network by neural architecture search and model pruning for bearing fault diagnosis and remaining useful life prediction

  • Diwang Ruan,
  • Jinzhao Han,
  • Jianping Yan,
  • Clemens Gühmann

DOI
https://doi.org/10.1038/s41598-023-31532-9
Journal volume & issue
Vol. 13, no. 1
pp. 1–19

Abstract


Convolutional Neural Networks (CNNs) have been extensively used in bearing fault diagnosis and Remaining Useful Life (RUL) prediction. However, the CNN's increasing performance comes with a deeper network structure and a growing parameter size, which prevents its deployment in industrial applications with limited computational resources. To this end, this paper proposes a two-step method to build a light, cell-based CNN by Neural Architecture Search (NAS) and weights-ranking-based model pruning. In the first step, a cell-based CNN was constructed from the searched optimal cells, and the number of stacked cells was limited to reduce the network size after an influence analysis. To search for the optimal cells, a base CNN model with stacked cells was first built, and Differentiable Architecture Search (DARTS) was applied after continuous relaxation. In the second step, the connections in the resulting cell-based CNN were further reduced by weights-ranking-based pruning. Experimental data from Case Western Reserve University (CWRU) was used for validation on the fault-classification task. Results showed that a CNN with only two cells achieved a test accuracy of 99.969% and remained at 99.968% even when 50% of the connections were removed. Furthermore, compared with the base CNN, the parameter size of the 2-cell CNN was reduced from 9.677 MB to 0.197 MB. Finally, after minor modification, the network structure was adapted to bearing RUL prediction and validated with the PRONOSTIA test data. Both tasks confirmed the feasibility and superiority of constructing a light cell-based CNN with NAS and pruning, demonstrating the potential to realize a light CNN on embedded systems.
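The continuous relaxation behind DARTS, which the abstract mentions, can be illustrated with a minimal NumPy sketch: each edge in a cell computes a softmax-weighted mixture of candidate operations, so the architecture parameters become differentiable. The candidate operations below are toy stand-ins (the paper's actual search space of convolution and pooling primitives is not reproduced here, and `alpha` is set by hand rather than learned).

```python
import numpy as np

def softmax(a):
    """Numerically stable softmax over architecture parameters."""
    e = np.exp(a - a.max())
    return e / e.sum()

# Toy candidate operations standing in for DARTS primitives
# (a real search space would use conv, pooling, skip-connect, zero, ...).
ops = [
    lambda x: x,                 # skip-connect
    lambda x: np.zeros_like(x),  # zero operation
    lambda x: np.maximum(x, 0),  # ReLU as a cheap nonlinearity stand-in
]

def mixed_op(x, alpha):
    """Continuous relaxation: output is the softmax(alpha)-weighted sum of all ops."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops))

# Hypothetical architecture parameters; in DARTS these are trained by gradient
# descent on validation loss, then the op with the largest weight is kept.
alpha = np.array([2.0, -1.0, 0.5])
x = np.array([-1.0, 2.0])
y = mixed_op(x, alpha)
chosen = int(np.argmax(alpha))  # discretization step: keep the strongest op
```

After the search, discretization replaces each mixed edge with its highest-weighted operation, yielding the fixed cell that is then stacked to form the final network.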
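The second step, weights-ranking-based pruning, can likewise be sketched: rank all connections by absolute weight and zero out the smallest fraction, as in the 50%-removal experiment reported above. This is a generic magnitude-pruning sketch under the assumption of a single global ranking; the paper's exact ranking and retraining procedure may differ.

```python
import numpy as np

def prune_by_magnitude(weights, ratio):
    """Zero out the fraction `ratio` of connections with the smallest |w|."""
    flat = np.abs(weights).ravel()
    k = int(flat.size * ratio)          # number of connections to remove
    if k == 0:
        return weights.copy()
    threshold = np.partition(flat, k - 1)[k - 1]  # k-th smallest magnitude
    mask = np.abs(weights) > threshold  # keep only connections above the cut
    return weights * mask

# Hypothetical 4x4 weight matrix; in practice this is applied to the
# trained cell-based CNN's layers, followed by fine-tuning.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = prune_by_magnitude(w, 0.5)     # remove 50% of the connections
```

Zeroed connections can then be stored in sparse form, which is where the parameter-size reduction for embedded deployment comes from.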