IEEE Access (Jan 2024)

Evaluating Visual-Spatiotemporal Co-Registration of a Physics-Based Virtual Reality Haptic Interface

  • Syed T. Mubarrat,
  • Suman K. Chowdhury,
  • Antonio S. Fernandes

DOI
https://doi.org/10.1109/ACCESS.2024.3391186
Journal volume & issue
Vol. 12
pp. 57017 – 57032

Abstract

This study aimed to evaluate the visual-spatiotemporal co-registration of real and virtual objects’ movement dynamics by designing a low-cost, physics-based virtual reality (VR) system that provides actual cutaneous and kinesthetic haptic feedback from a real object rather than computer-generated haptic feedback. Twelve healthy participants performed three human-robot collaborative (HRC) sequential pick-and-place lifting tasks while a motion capture system and the VR system simultaneously traced the movements of the real and virtual objects, respectively. We used an iterative closest point algorithm to transform and align the 3D coordinates of the VR point clouds with those of the motion capture system. We introduced a new method to calculate and analyze the precision of visual and spatiotemporal co-registration between virtual and real objects. Results showed a high correlation ($r > 0.96$) between real and virtual objects’ movement dynamics, with linear and angular co-registration errors of less than 5 cm and 8°, respectively. Temporal registration error was also low (< 12 ms) and was observed only along the vertical axis. The visual registration data indicated that using real objects to provide cutaneous and kinesthetic haptics in the VR setting enhanced the users’ overall proprioception and visuomotor functions.
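The alignment step described in the abstract, registering VR point clouds to motion capture coordinates with an iterative closest point (ICP) algorithm, can be illustrated with a minimal sketch. The function names, convergence tolerance, and use of a KD-tree for nearest-neighbour correspondences below are illustrative assumptions, not the authors' implementation.

```python
# Minimal ICP sketch: rigid alignment of a VR point cloud to a motion-capture
# point cloud (illustrative only; not the paper's actual pipeline).
import numpy as np
from scipy.spatial import cKDTree


def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def icp(vr_pts, mocap_pts, max_iter=50, tol=1e-6):
    """Iteratively align the VR point cloud (N x 3) to the mocap point cloud (M x 3)."""
    src = vr_pts.copy()
    tree = cKDTree(mocap_pts)
    prev_err = np.inf
    err = prev_err
    for _ in range(max_iter):
        dists, idx = tree.query(src)                  # nearest-neighbour correspondences
        R, t = best_fit_transform(src, mocap_pts[idx])
        src = src @ R.T + t                           # apply the incremental transform
        err = dists.mean()
        if abs(prev_err - err) < tol:                 # stop when the mean error plateaus
            break
        prev_err = err
    # Overall rigid transform mapping the original VR frame into the mocap frame
    R, t = best_fit_transform(vr_pts, src)
    return R, t, err
```

Once aligned, the per-frame residuals between transformed VR coordinates and motion-capture coordinates give the linear co-registration error the abstract reports; the same idea extends to angular and temporal errors.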

Keywords