IEEE Access (Jan 2020)

SensCapsNet: Deep Neural Network for Non-Obtrusive Sensing Based Human Activity Recognition

  • Cuong Pham,
  • Son Nguyen-Thai,
  • Huy Tran-Quang,
  • Son Tran,
  • Hai Vu,
  • Thanh-Hai Tran,
  • Thi-Lan Le

DOI
https://doi.org/10.1109/ACCESS.2020.2991731
Journal volume & issue
Vol. 8
pp. 86934 – 86946

Abstract

The recent advancement of deep learning, with its capacity for automatic high-level feature extraction, has achieved promising performance for sensor-based human activity recognition (HAR). Among deep learning methods, Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM) networks have been widely adopted. However, the scalar outputs and pooling operations in CNNs capture only invariance, not equivariance. Capsule networks (CapsNet), with their vector outputs and routing by agreement, are able to capture equivariance. In this paper, we propose a method for recognizing human activity from wearable sensors based on a capsule network named SensCapsNet. The architecture of SensCapsNet is designed to suit the spatial-temporal data coming from wearable sensors. Experimental results show that the proposed network outperforms CNN and LSTM methods. The performance of the proposed CapsNet architecture is assessed by altering the dynamic routing between capsule layers. With one routing iteration, the proposed SensCapsNet yields improved accuracy values of 77.7% and 70.5% on two testing datasets, compared with baseline CNN and LSTM methods that yield F1-scores of 67.7% and 69.2% on the first dataset and 65.3% and 67.6% on the second dataset, respectively. Moreover, even though several human activity datasets are available, privacy invasion and obtrusiveness concerns have not been carefully taken into consideration in their construction. Toward building a non-obtrusive sensing based human activity recognition method, we design and collect a dataset named 19NonSens from twelve subjects wearing e-Shoes and a smart watch while performing 19 activities under multiple contexts. This dataset will be made publicly available. Finally, thanks to the promising results obtained by the proposed method, we develop a life-logging application that achieves real-time computation and an accuracy rate greater than 80% for five common upper-body activities.
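For context, the routing-by-agreement step that the abstract varies ("altering the dynamic routing between capsule layers") can be sketched generically. Below is a minimal NumPy sketch of the routing algorithm from Sabour et al.'s original capsule networks, not the authors' SensCapsNet implementation; the capsule counts and dimensions (32 input capsules, 19 activity capsules, 16-dimensional output vectors) are illustrative assumptions.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    """Squashing non-linearity: keeps each vector's orientation, maps its norm into (0, 1)."""
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def dynamic_routing(u_hat, num_iterations=1):
    """Routing by agreement between two capsule layers.

    u_hat: prediction ("vote") vectors of shape (num_in, num_out, dim_out),
           i.e. each lower-level capsule's prediction for each higher-level capsule.
    Returns the output capsule vectors, shape (num_out, dim_out).
    """
    num_in, num_out, _ = u_hat.shape
    b = np.zeros((num_in, num_out))                 # routing logits
    for _ in range(num_iterations):
        e = np.exp(b - b.max(axis=1, keepdims=True))
        c = e / e.sum(axis=1, keepdims=True)        # coupling coefficients (softmax over output capsules)
        s = (c[..., None] * u_hat).sum(axis=0)      # weighted sum of votes per output capsule
        v = squash(s)                               # output capsule vectors
        b = b + np.einsum('ijk,jk->ij', u_hat, v)   # raise logits where votes agree with outputs
    return v

# Hypothetical usage: 32 sensor-derived primary capsules voting for 19 activity capsules.
u_hat = np.random.randn(32, 19, 16)
v = dynamic_routing(u_hat, num_iterations=1)        # one routing iteration, the setting reported above
activity_scores = np.linalg.norm(v, axis=-1)        # capsule length serves as the class score
```

The length of each output capsule vector acts as the class score, so changing the number of routing iterations directly changes how the votes are pooled, which is the effect the abstract reports when comparing routing settings.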

Keywords