IEEE Access (Jan 2023)

Contact Part Detection From 3D Human Motion Data Using Manually Labeled Contact Data and Deep Learning

  • Changgu Kang,
  • Meejin Kim,
  • Kangsoo Kim,
  • Sukwon Lee

DOI
https://doi.org/10.1109/ACCESS.2023.3331687
Journal volume & issue
Vol. 11
pp. 127608 – 127618

Abstract

Research on the interaction between users and their environment has been conducted in various fields, including human activity recognition (HAR), human-scene interaction (HSI), computer graphics (CG), and virtual reality (VR). Typically, the interaction process commences with the movement of a human body part and involves contact with a target object or the environment. The choice of which body part makes contact depends on the interaction's purpose and affordance, making contact a fundamental aspect of interaction. However, detecting the specific body parts in contact, especially in the context of 3D motion and complex environments, poses computational challenges. To address this challenge, this study proposes a method for contact detection using motion data. The motion data utilized in this study are limited to actions feasible in an office environment. Since the contact states of different body parts are independent, the proposed method comprises two distinct models: a feature model generating common features for each body part and a part model recognizing the contact state of each body part. The feature model employs a bidirectional long short-term memory (Bi-LSTM) structure to capture the sequential nature of motion data, ensuring the incorporation of continuous data characteristics. In contrast, the part model employs separate weights optimized for each body part within the deep neural network. Experimental results demonstrate the proposed method's high accuracy, recall, and precision, with values of 0.99, 0.97, and 0.95, respectively.
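The two-model architecture outlined in the abstract — a shared Bi-LSTM feature model feeding independent per-part classifiers — could be sketched as follows. This is a hedged illustration, not the authors' implementation: the body-part list, pose dimension, hidden size, and head layout are all assumptions made for the example.

```python
# Hedged sketch (not the paper's code): a shared Bi-LSTM "feature model"
# feeding independent per-part "part models", per the abstract's description.
# PARTS, pose_dim, and hidden sizes are illustrative assumptions.
import torch
import torch.nn as nn

PARTS = ["left_hand", "right_hand", "left_foot", "right_foot", "hip"]  # assumed

class ContactDetector(nn.Module):
    def __init__(self, pose_dim=69, hidden=128):
        super().__init__()
        # Feature model: Bi-LSTM over the motion sequence, shared by all parts,
        # so common temporal features are extracted once.
        self.feature = nn.LSTM(pose_dim, hidden, batch_first=True,
                               bidirectional=True)
        # Part models: separate weights per body part, each predicting a
        # per-frame contact logit for that part.
        self.heads = nn.ModuleDict({
            p: nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU(),
                             nn.Linear(hidden, 1))
            for p in PARTS
        })

    def forward(self, motion):           # motion: (batch, frames, pose_dim)
        feats, _ = self.feature(motion)  # (batch, frames, 2 * hidden)
        # One independent contact prediction per part, per frame.
        return {p: head(feats).squeeze(-1) for p, head in self.heads.items()}

model = ContactDetector()
logits = model(torch.randn(2, 30, 69))   # 2 clips, 30 frames each
probs = {p: torch.sigmoid(v) for p, v in logits.items()}
```

Because the per-part contact states are treated as independent, each head can be trained with its own binary cross-entropy loss against the manually labeled contact data, while gradients from all heads jointly update the shared Bi-LSTM.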

Keywords