IEEE Access (Jan 2021)

Privacy-Aware Resource Sharing in Cross-Device Federated Model Training for Collaborative Predictive Maintenance

  • Sourabh Bharti,
  • Alan McGibney

DOI
https://doi.org/10.1109/ACCESS.2021.3108839
Journal volume & issue
Vol. 9
pp. 120367–120379

Abstract

The proliferation of Industry 4.0 has made modern industrial assets a rich source of data that can be leveraged to optimise operations, ensure efficiency, and minimise maintenance costs. The availability of data is advantageous for asset management; however, attempts to maximise the value of this data often fall short due to additional constraints, such as privacy concerns and data being stored in distributed silos that are difficult to access and share. Federated Learning (FL) has been explored to address these challenges: it provides a mechanism that allows highly distributed data to be mined in a privacy-preserving manner, offering new opportunities for a collaborative approach to asset management. Despite these benefits, FL has challenges that need to be overcome before it is fully compatible with asset management and, more specifically, predictive maintenance applications. FL requires a set of clients to participate in the model training process; however, orchestration, device heterogeneity, and scalability can hinder training speed and model accuracy in the context of collaborative predictive maintenance. To address these challenges, this work proposes a split-learning-based framework (SplitPred) that enables FL clients to maximise the resources available within their local network without compromising the benefits of an FL approach (i.e., privacy and shared learning). Experiments performed on the benchmark C-MAPSS dataset demonstrate the advantages of applying SplitPred in the FL process in terms of efficient resource use, i.e., model convergence time, model accuracy, and network load.
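To make the split-learning idea concrete: the sketch below is a minimal, hypothetical PyTorch illustration of the generic pattern the abstract refers to, in which a resource-constrained FL client runs the first layers of a model (the "head") and offloads the remaining layers (the "tail") to a better-resourced node in its local network. The class names, layer sizes, cut point, and dummy RUL-style regression data are all assumptions made for illustration, not the paper's SplitPred architecture, and a real deployment would serialise the cut-layer activations and gradients over the network rather than running both segments in one process.

    import torch
    import torch.nn as nn

    class ClientHead(nn.Module):
        """Front segment of the model, run on the edge device (assumed cut point)."""
        def __init__(self, n_features=24, hidden=32):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())

        def forward(self, x):
            return self.net(x)

    class HelperTail(nn.Module):
        """Back segment, run on a better-resourced helper in the local network."""
        def __init__(self, hidden=32):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(hidden, 16), nn.ReLU(),
                                     nn.Linear(16, 1))

        def forward(self, h):
            return self.net(h)

    head, tail = ClientHead(), HelperTail()
    opt = torch.optim.SGD(list(head.parameters()) + list(tail.parameters()),
                          lr=1e-2)
    loss_fn = nn.MSELoss()

    x = torch.randn(8, 24)   # dummy mini-batch of sensor readings
    y = torch.randn(8, 1)    # dummy remaining-useful-life targets

    # Client side: forward through the head; only these "smashed" activations
    # would cross the network, never the raw sensor data.
    smashed = head(x)

    # Helper side: finish the forward pass and compute the loss.
    pred = tail(smashed)
    loss = loss_fn(pred, y)

    opt.zero_grad()
    loss.backward()          # gradients flow back across the cut into the head
    opt.step()

The design point this illustrates is that only the cut-layer activations and their gradients cross the link, so raw data stays on the client, while the choice of cut layer trades local compute against network load; the client could then contribute its updated weights to the usual FL aggregation round.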

Keywords