Advanced Intelligent Systems (Apr 2024)

Multimodal Learning‐Based Proactive Human Handover Intention Prediction Using Wearable Data Gloves and Augmented Reality

  • Rui Zou,
  • Yubin Liu,
  • Jie Zhao,
  • Hegao Cai

DOI
https://doi.org/10.1002/aisy.202300545
Journal volume & issue
Vol. 6, no. 4

Abstract


Efficient object handover between humans and robots is of significant importance in collaborative manufacturing environments. Improving the efficacy of human–robot handovers requires enabling robots to comprehend and anticipate human handover intentions. This article introduces a human‐teaching–robot‐learning–prediction framework that allows robots to learn from diverse human demonstrations and anticipate human handover intentions. The framework enables humans to program robots through demonstrations using augmented reality and a wearable data glove, in accordance with task requirements and individual working preferences. Robots then enhance their cognitive capabilities by assimilating insights from these human handover demonstrations using deep neural network algorithms. Furthermore, when confronted with ambiguous human intentions, robots can proactively seek clarification from humans via the augmented reality system, mirroring how humans seek clarity from their counterparts. This proactive approach empowers robots to anticipate human intentions and assist human partners during handovers. Empirical results underscore the benefits of the proposed approach, demonstrating highly accurate prediction of human intentions in human–robot handover tasks.

Keywords