Energy Reports (Nov 2023)

Broad Transfer Learning Network based Li-ion battery lifetime prediction model

  • Ping-Huan Kuo,
  • Yung-Ruen Tseng,
  • Po-Chien Luan,
  • Her-Terng Yau

Journal volume & issue
Vol. 10
pp. 881–893

Abstract


The Broad Transfer Learning Network (BTLN) model uses only about one-third of the parameters of a common neural network to achieve prediction performance similar to that of a Multi-Layer Perceptron (MLP) model. This brings considerable benefits to AI-related studies that rely heavily on computational power. This study also proposes a feature mapping technique that applies a linear transformation to the original features to improve the performance of various learning models. The study combines broad learning and transfer learning techniques, and data augmentation is used to expand the training dataset. It shows that, under certain conditions, a model with a broader network can perform better: the broad network structure acts as an effective feature extractor, while the transfer learning algorithm increases training efficiency by reducing the number of trainable parameters. The improvement for neural network models is particularly notable, with performance improving by 3.5% without any change to the model architecture. In this paper, the Root Mean Square Error (RMSE) of the proposed BTLN model improves by 18.5%. Compared with common neural network models, the trainable parameters and overall parameters are reduced to 14.57% and 36.28%, respectively.
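
To make the parameter-reduction idea in the abstract concrete, the sketch below is an illustrative, hypothetical PyTorch model (not the authors' code): a linear feature-mapping layer, a broad feature-expansion layer that is frozen so it acts as a fixed feature extractor, and a small trainable regression head. All layer sizes and the activation choice are assumptions for illustration only.

```python
# Illustrative sketch only (not the published BTLN implementation).
# Shows a linear feature mapping, a frozen "broad" expansion layer, and a
# small trainable head, so that trainable parameters are a fraction of the
# total parameter count, as described in the abstract.
import torch
import torch.nn as nn

class BroadTransferSketch(nn.Module):
    def __init__(self, n_features=10, n_broad=256, n_out=1):
        super().__init__()
        # Feature mapping: a linear transformation of the original features.
        self.feature_map = nn.Linear(n_features, n_features)
        # Broad expansion acting as a fixed feature extractor; frozen weights
        # do not count toward the trainable-parameter total.
        self.broad = nn.Linear(n_features, n_broad)
        for p in self.broad.parameters():
            p.requires_grad = False
        # Small trainable head for battery-lifetime regression.
        self.head = nn.Linear(n_broad, n_out)

    def forward(self, x):
        z = self.feature_map(x)        # linearly transformed features
        h = torch.tanh(self.broad(z))  # broad, non-trainable expansion
        return self.head(h)

model = BroadTransferSketch()
trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(f"trainable parameters: {trainable} / total parameters: {total}")
```

Freezing the broad layer is one simple way to realize the transfer-learning effect the abstract describes, where training efficiency improves because only a small subset of parameters is updated.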

Keywords