Virtual Reality & Intelligent Hardware (Apr 2024)

Exploring the effect of fingertip aero-haptic feedforward cues in directing eyes-free target acquisition in VR

  • Xiaofei Ren,
  • Jian He,
  • Teng Han,
  • Songxian Liu,
  • Mengfei Lv,
  • Rui Zhou

Journal volume & issue
Vol. 6, no. 2
pp. 113–131

Abstract


Background: The sense of touch plays a crucial role in interactive behavior within virtual spaces, particularly when visual attention is unavailable. Although haptic feedback has been widely used to compensate for missing visual cues, the use of tactile information as a predictive feedforward cue to guide hand movements remains underexplored and lacks a theoretical foundation.

Methods: This study introduces a fingertip aero-haptic rendering method and investigates its effectiveness in directing hand movements during eyes-free spatial interactions. The wearable device incorporates a multichannel micro-airflow chamber that delivers adjustable tactile effects on the fingertips.

Results: The first study verified that tactile directional feedforward cues significantly improve user performance in eyes-free target acquisition, and that users rely heavily on haptic indications rather than spatial memory to control their hands. A subsequent study examined how enriched tactile feedforward cues help users determine precise target positions during eyes-free interactions, and assessed the learning effort required.

Conclusions: The haptic feedforward effect holds great practical promise for eyes-free design in virtual reality. In future work, we aim to integrate cognitive models with tactile feedforward cues and to apply richer tactile feedforward information to compensate for users' perceptual limitations.
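
To make the directional feedforward idea concrete, the following is a minimal, hypothetical sketch of how a hand-to-target direction could be mapped onto per-channel airflow intensities for a multichannel fingertip device. The channel count, coordinate convention, and cosine falloff are illustrative assumptions, not details taken from the paper.

import math

# Assumed number of micro-airflow outlets arranged in a ring around the fingertip.
NUM_CHANNELS = 8

def direction_to_channel_intensities(hand_pos, target_pos):
    """Return per-channel airflow intensities (0..1) cueing the target direction.

    hand_pos, target_pos: (x, y) positions in the horizontal interaction plane.
    """
    dx = target_pos[0] - hand_pos[0]
    dy = target_pos[1] - hand_pos[1]
    target_angle = math.atan2(dy, dx)

    intensities = []
    for ch in range(NUM_CHANNELS):
        channel_angle = 2 * math.pi * ch / NUM_CHANNELS
        # Wrapped angular distance between this outlet and the cue direction.
        diff = math.atan2(math.sin(channel_angle - target_angle),
                          math.cos(channel_angle - target_angle))
        # Cosine falloff: the outlet facing the target blows hardest,
        # neighbours taper off, and outlets facing away stay silent.
        intensities.append(max(0.0, math.cos(diff)))
    return intensities

if __name__ == "__main__":
    # Example: target lies up and to the right of the hand.
    print(direction_to_channel_intensities((0.0, 0.0), (1.0, 1.0)))

In such a scheme, the relative intensities across outlets encode direction before the hand reaches the target, which is the feedforward role the abstract describes; the actual device's channel layout and intensity control may differ.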

Keywords