Frontiers in Virtual Reality (Nov 2023)
Investigating the perceptual attribution of a virtual robotic limb synchronizing with hand and foot simultaneously
Abstract
Introduction: Incorporating an additional limb that synchronizes with multiple body parts enables the user to achieve high task accuracy and smooth movement. In this case, the visual appearance of the wearable robotic limb contributes to the sense of embodiment, and the user's motor function changes as a result of this embodiment. However, it remains unclear how users perceive the attribution of a wearable robotic limb that synchronizes with multiple body parts (perceptual attribution), and the influence of visual similarity in this context is also unknown.

Methods: This study investigated the perceptual attribution of a virtual robotic limb by examining proprioceptive drift and the bias of visual similarity under single-body-part conditions (synchronizing with hand or foot motion only) and a multiple-body-part condition (synchronizing with the averaged motion of the hand and foot). Participants performed a point-to-point task using a virtual robotic limb that synchronized with their hand and foot motions simultaneously. Furthermore, the visual appearance of the end-effector was altered to explore the influence of visual similarity.

Results: The experiment revealed that only the participants' proprioception of their foot aligned with the virtual robotic limb, while the frequency of error correction during the point-to-point task did not differ across conditions. Conversely, subjective illusions of embodiment occurred for both the hand and foot. Here, the visual appearance of the robotic limb contributed to the correlations of proprioceptive drift and the subjective embodiment illusion for the hand and foot, respectively.

Discussion: These results suggest that proprioception is specifically attributed to the foot through motion synchronization, whereas subjective perceptions are attributed to both the hand and foot.