EURASIP Journal on Advances in Signal Processing (Feb 2019)

A multisource fusion framework driven by user-defined knowledge for egocentric activity recognition

  • Haibin Yu,
  • Wenyan Jia,
  • Zhen Li,
  • Feixiang Gong,
  • Ding Yuan,
  • Hong Zhang,
  • Mingui Sun

DOI
https://doi.org/10.1186/s13634-019-0612-x
Journal volume & issue
Vol. 2019, no. 1
pp. 1–23

Abstract

Recently, egocentric activity recognition has attracted considerable attention in the pattern recognition and artificial intelligence communities because of its widespread applicability to human systems, including the evaluation of dietary and physical activity and the monitoring of patients and older adults. In this paper, we present a knowledge-driven multisource fusion framework for the recognition of egocentric activities of daily living (ADL). This framework employs Dezert–Smarandache theory (DSmT) across three information sources: the wearer’s knowledge, images acquired by a wearable camera, and sensor data from wearable inertial measurement units and GPS. A simple likelihood table is designed to provide routine ADL information for each individual. A well-trained convolutional neural network is then used to produce a set of textual tags that, together with the routine information and the other sensor data, are used to recognize ADLs using information-theoretic statistics and a support vector machine. Our experiments show that the proposed method accurately recognizes 15 predefined ADL classes, including a variety of sedentary activities that have previously been difficult to recognize. When applied to real-life data recorded with a self-constructed wearable device, our method outperforms previous approaches, achieving an average accuracy of 85.4% across the 15 ADLs.
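To make the fusion step concrete, the following is a minimal sketch (not the authors’ code) of how evidence from the three sources could be combined over a small frame of ADL classes. It uses a simplified conjunctive combination restricted to singleton hypotheses; the paper’s DSmT formulation additionally assigns mass to intersections of hypotheses and treats conflicting evidence more carefully. All class names and mass values below are illustrative assumptions.

```python
# Minimal sketch of three-source evidence fusion over a small set of ADL
# classes. Simplification: masses live only on singleton hypotheses and are
# combined conjunctively (m(h) proportional to m1(h) * m2(h)); full DSmT also
# assigns mass to intersections of hypotheses. All numbers are illustrative.

ADLS = ["eating", "walking", "watching_tv"]

def normalize(masses):
    """Scale a basic belief assignment so its masses sum to 1."""
    total = sum(masses.values())
    return {h: m / total for h, m in masses.items()}

def combine(m1, m2):
    """Conjunctive combination on singleton hypotheses, then renormalize."""
    return normalize({h: m1[h] * m2[h] for h in ADLS})

# Source 1: wearer's knowledge -- a routine likelihood table giving the
# prior probability of each ADL at the current time of day (assumed values).
routine = {"eating": 0.5, "walking": 0.2, "watching_tv": 0.3}

# Source 2: camera evidence -- CNN-derived textual tags mapped to per-class
# scores (e.g., tags like "plate" or "food" raise the mass of "eating").
camera = {"eating": 0.7, "walking": 0.1, "watching_tv": 0.2}

# Source 3: inertial/GPS evidence (e.g., low motion energy, indoors).
sensors = {"eating": 0.4, "walking": 0.1, "watching_tv": 0.5}

fused = combine(combine(routine, camera), sensors)
print(max(fused, key=fused.get), fused)  # -> "eating" with fused masses
```

Chaining pairwise combinations in this way illustrates how such a framework can absorb each additional information source as it becomes available, with the wearer’s routine knowledge acting as a per-individual prior.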

Keywords