IEEE Access (Jan 2024)

Human Activity Recognition via Temporal Fusion Contrastive Learning

  • Inkyung Kim,
  • Juwan Lim,
  • Jaekoo Lee

DOI
https://doi.org/10.1109/ACCESS.2024.3357143
Journal volume & issue
Vol. 12
pp. 20854–20866

Abstract

With recent advancements in wearable devices and the Internet of Things (IoT), human activity recognition (HAR) has attracted increasing interest in the wearable technology market. However, for sensor-based HAR, collecting sufficient labeled data for deep neural network training is difficult because experts must find visually recognizable patterns in time-series data. Data collection is further complicated by privacy concerns. To overcome these limitations, self-supervised learning (SSL)-based HAR methods have recently been proposed; these can learn representations without labeled data. However, such methods utilize only sensor data and do not incorporate the sensor wearer’s biometric information. A learning method that excludes biometric information can identify typical movement patterns but cannot effectively learn personalized movement patterns. Thus, in this paper, we propose the Temporal Fusion Contrastive Learning (TFCL) method, which considers a sensor wearer’s biometric information during training. Experimental results demonstrate that, when fine-tuned with biometric information, the proposed TFCL method achieved the highest F1 scores of 0.9791 and 0.7433 on the DLR and MobiAct datasets, respectively. Furthermore, the results obtained when TFCL was used to learn the representation and then applied to the downstream task were similar to or better than those obtained with supervised learning from scratch. These results indicate that representations can be learned effectively through TFCL. The experimental code is available on GitHub at https://github.com/IKKIM00/temporal-fusion-contrastive-learning
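To make the contrastive-learning setup concrete: the paper does not spell out its loss in the abstract, but SSL methods of this family typically train on pairs of augmented views of the same sensor window with an NT-Xent (normalized temperature-scaled cross-entropy) objective. Below is a minimal, dependency-free sketch of that loss; the function names, the temperature value, and the use of NT-Xent itself are illustrative assumptions, not a reproduction of TFCL.

```python
import math

def cosine(u, v):
    # Cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def nt_xent_loss(views_a, views_b, tau=0.5):
    """NT-Xent contrastive loss over N positive pairs (illustrative sketch).

    views_a[i] and views_b[i] are embeddings of two augmented views of the
    same sensor window (a positive pair); every other embedding in the
    batch serves as a negative. tau is the temperature hyperparameter.
    """
    z = views_a + views_b            # 2N embeddings in one list
    n = len(views_a)
    total = 0.0
    for i in range(2 * n):
        j = (i + n) % (2 * n)        # index of i's positive partner
        pos = math.exp(cosine(z[i], z[j]) / tau)
        denom = sum(math.exp(cosine(z[i], z[k]) / tau)
                    for k in range(2 * n) if k != i)
        total += -math.log(pos / denom)
    return total / (2 * n)
```

Intuitively, the loss is lowest when each window's two views are mapped close together while all other windows are pushed apart; swapping the positive pairings (so each view's "partner" is actually a different window) should increase the loss.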

Keywords