IEEE Access (Jan 2024)

Recognition of Human Activities Based on Ambient Audio and Vibration Data

  • Marcel Koch,
  • Thomas Pfitzinger,
  • Fabian Schlenke,
  • Fabian Kohlmorgen,
  • Roland Groll,
  • Hendrik Wöhrle

DOI
https://doi.org/10.1109/ACCESS.2024.3457912
Journal volume & issue
Vol. 12
pp. 174399–174412

Abstract

The majority of human activity recognition (HAR) systems are based on computer vision or wearable sensors. However, these methods have inherent limitations, including privacy concerns, the need for high computational power, and the requirement for frequent battery recharging. This paper presents a distributed multisensor system for the recognition of human activities based on ambient audio and vibration data. The system comprises multiple ambient multisensor nodes (AMSNs) situated within a smart home setting, together with a data transfer and analysis subsystem consisting of an Internet of Things (IoT) gateway and an MQTT broker. The data obtained by the AMSNs is classified using two distinct neural networks: a ResNet model for the analysis of ambient acoustic signals, and an encoder network for the classification of vibration data. An empirical evaluation of the proposed approach was conducted on a dataset comprising 25 different activities performed by 14 subjects in a real-world environment. The results indicate that audio and vibration data can be leveraged for accurate activity detection, eliminating the need for specialized sensor equipment. These findings have several implications for smart home environments, such as improving user experience and comfort or enhancing safety and security for inhabitants.
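The abstract describes a data path from the AMSNs through an IoT gateway and MQTT broker to the classifiers, but does not specify the message format. The following is a minimal sketch, assuming a JSON payload and a hypothetical topic layout (neither is given in the paper), of how an AMSN might package one sensor frame for MQTT transport. Only payload construction is shown; an actual node would hand the result to an MQTT client library such as paho-mqtt.

```python
import json
import time

# Hypothetical topic layout -- the paper does not specify one.
TOPIC_FMT = "smarthome/amsn/{node_id}/{modality}"

def build_payload(node_id: str, modality: str, samples: list,
                  sample_rate_hz: int) -> tuple:
    """Package one sensor frame as (topic, JSON payload) for an MQTT publish."""
    topic = TOPIC_FMT.format(node_id=node_id, modality=modality)
    payload = json.dumps({
        "node_id": node_id,
        "modality": modality,          # "audio" or "vibration"
        "sample_rate_hz": sample_rate_hz,
        "timestamp": time.time(),      # epoch seconds at capture
        "samples": samples,            # raw frame from the sensor
    })
    return topic, payload

topic, payload = build_payload("amsn-03", "vibration", [0.01, -0.02, 0.05], 1000)
# In a real deployment, something like:
#   paho.mqtt.client.Client().publish(topic, payload)
# would deliver the frame to the broker for the analysis subsystem.
```

On the analysis side, a subscriber would route messages by the `modality` field to the appropriate network (ResNet for audio, encoder network for vibration).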
