Entropy (Aug 2020)

On the Use of Concentrated Time–Frequency Representations as Input to a Deep Convolutional Neural Network: Application to Non Intrusive Load Monitoring

  • Sarra Houidi,
  • Dominique Fourer,
  • François Auger

DOI
https://doi.org/10.3390/e22090911
Journal volume & issue
Vol. 22, no. 9
p. 911

Abstract


For decades, time–frequency (TF) analysis has demonstrated its capability to efficiently handle non-stationary multi-component signals, which are ubiquitous in a large number of applications. TF analysis allows us to estimate meaningful physics-related parameters (e.g., F0, group delay, etc.) and can provide sparse signal representations when the method parameters are suitably tuned. On the other hand, deep learning with Convolutional Neural Networks (CNN) is the current state-of-the-art approach for pattern recognition and allows us to automatically extract relevant signal features, even though the trained models can suffer from a lack of interpretability. Hence, this paper proposes to combine these two approaches to benefit from their respective advantages and addresses, as a “toy” problem, non-intrusive load monitoring (NILM), which consists of identifying a home electrical appliance (HEA) from its measured energy consumption signal. This study investigates the role of the TF representation, synchrosqueezed or not, when used as the input of a 2D CNN applied to a pattern recognition task. We also propose a solution for interpreting the information conveyed by the trained CNN across different neural architectures, by establishing a link with our previously proposed “handcrafted” interpretable features thanks to the layer-wise relevance propagation (LRP) method. Our experiments on the publicly available PLAID dataset show excellent appliance recognition results (accuracy above 97%) using the suitable TF representation and allow an interpretation of the trained model.
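To illustrate the core idea of the approach summarized above, the following minimal sketch computes a short-time Fourier transform magnitude (a plain spectrogram, without the synchrosqueezing refinement the paper studies) of a synthetic 50 Hz tone, as one might do for a mains current waveform, yielding a 2D array that could serve as the image-like input of a 2D CNN. The signal, sampling rate, and window/hop parameters are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

def stft_spectrogram(x, win_len=256, hop=64):
    """Magnitude STFT of a 1D signal, returned as (freq_bins, time_frames)."""
    win = np.hanning(win_len)
    n_frames = 1 + (len(x) - win_len) // hop
    frames = np.stack([x[i * hop : i * hop + win_len] * win
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1)).T

# Hypothetical example: a 50 Hz sinusoid sampled at 1 kHz for 2 s,
# standing in for a measured appliance current signal.
fs = 1000
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 50 * t)

S = stft_spectrogram(x)
print(S.shape)  # (129, 28): 129 frequency bins x 28 time frames
```

The resulting TF image would then be fed to a 2D CNN exactly as a photograph would; the paper's contribution is to compare such standard spectrograms against their synchrosqueezed (more concentrated) counterparts as CNN inputs.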

Keywords