Proceedings (Oct 2018)

High-Level Features for Recognizing Human Actions in Daily Living Environments Using Wearable Sensors

  • Irvin Hussein López-Nava,
  • Angélica Muñoz-Meléndez

DOI
https://doi.org/10.3390/proceedings2191238
Journal volume & issue
Vol. 2, no. 19
p. 1238

Abstract


Action recognition is important for various applications, such as ambient intelligence, smart devices, and healthcare. Automatic recognition of human actions in daily living environments, mainly using wearable sensors, is still an open research problem in the field of pervasive computing. This research focuses on extracting a set of features related to human motion, in particular the motion of the upper and lower limbs, in order to recognize actions in daily living environments using time series of joint orientations. Ten actions were performed by five test subjects in their homes: cooking, doing housework, eating, grooming, mouth care, ascending stairs, descending stairs, sitting, standing, and walking. The joint angles of the right upper limb and the left lower limb were estimated using information from five wearable inertial sensors placed on the back, right upper arm, right forearm, left thigh, and left leg. The set of features was used to build classifiers using three inference algorithms: Naive Bayes, K-Nearest Neighbours, and AdaBoost. The average F-measure for classifying the ten actions with the three classifiers built using the proposed set of features was 0.806 (σ = 0.163).
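To illustrate the evaluation pipeline described in the abstract, the following is a minimal sketch (not the authors' code): it trains the three named classifiers on a hypothetical feature matrix X (motion features derived from joint-angle time series) and labels y (the ten actions), and reports a cross-validated macro F-measure. The data, fold count, and hyperparameters are placeholder assumptions, not values from the paper.

```python
# Sketch only: placeholder data stands in for the paper's motion features and action labels.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))      # hypothetical feature vectors (e.g., joint-angle statistics)
y = rng.integers(0, 10, size=200)   # hypothetical labels for the ten actions

classifiers = {
    "Naive Bayes": GaussianNB(),
    "K-Nearest Neighbours": KNeighborsClassifier(n_neighbors=5),
    "AdaBoost": AdaBoostClassifier(),
}

for name, clf in classifiers.items():
    # Macro-averaged F-measure over cross-validation folds
    scores = cross_val_score(clf, X, y, cv=5, scoring="f1_macro")
    print(f"{name}: F-measure = {scores.mean():.3f} (sigma = {scores.std():.3f})")
```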

Keywords