IEEE Access (Jan 2023)

You Can’t Hide Behind Your Headset: User Profiling in Augmented and Virtual Reality

  • Pier Paolo Tricomi
  • Federica Nenna
  • Luca Pajola
  • Mauro Conti
  • Luciano Gamberini

DOI
https://doi.org/10.1109/ACCESS.2023.3240071
Journal volume & issue
Vol. 11
pp. 9859–9875

Abstract

Augmented and Virtual Reality (AR and VR), collectively known as Extended Reality (XR), are increasingly gaining traction thanks to their technical advancement and the need for remote connections, recently accentuated by the pandemic. Remote surgery, telerobotics, and virtual offices are only some examples of their successes. As users interact with XR, they generate extensive behavioral data, usually leveraged for measuring human activity, which could also be used to profile users’ identities or personal information (e.g., gender). However, several factors affect the effectiveness of profiling, such as the technology employed, the action performed, the mental workload, the presence of bias, and the sensors available. To date, no study has considered all of these factors together, limiting the current understanding of XR profiling. In this work, we provide a comprehensive study of user profiling in virtual technologies (i.e., AR and VR). Specifically, we employ machine learning on behavioral data (i.e., head, controller, and eye data) to identify users and infer their individual attributes (i.e., age, gender). To this end, we propose a general framework that can potentially infer any personal information from any virtual scenario. We test our framework on eleven generic actions (e.g., walking, searching, pointing) involving low and high mental loads, derived from two distinct use cases: an everyday AR application (34 participants) and VR robot teleoperation (35 participants). Our framework reduces the burden of creating technology- and action-dependent algorithms and mitigates the experimental bias evidenced in previous work, providing a simple (yet effective) baseline for future studies. We identified users with up to a 97% F1-score in VR and 80% in AR. Gender and age inference was also easier in VR, reaching F1-scores of up to 82% and 90%, respectively. Through an in-depth analysis of each sensor’s impact, we found VR profiling to be more effective than AR profiling, mainly because of the presence of eye-tracking sensors.
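
To make the pipeline described above concrete, the sketch below shows one plausible reading of it: per-channel summary statistics extracted from fixed-length windows of head, controller, and eye telemetry, fed to an off-the-shelf classifier and scored with F1. This is a minimal illustration of the general idea, not the authors’ published implementation; the feature set, the random-forest classifier, the window shape, and all names and data are illustrative assumptions.

    # Minimal sketch (illustrative, not the paper's code): summary statistics
    # over windows of XR telemetry (head, controllers, eye) feed a standard
    # classifier that predicts the user's identity or an attribute.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import f1_score
    from sklearn.model_selection import train_test_split

    def extract_features(window: np.ndarray) -> np.ndarray:
        """Compress a (timesteps x channels) telemetry window into
        per-channel summary statistics (mean, std, min, max)."""
        return np.concatenate([
            window.mean(axis=0),
            window.std(axis=0),
            window.min(axis=0),
            window.max(axis=0),
        ])

    # Hypothetical stand-in data: 500 windows of 90 timesteps over 21
    # channels (e.g., head pose, two controllers, gaze direction), with
    # one identity label per window across 34 participants.
    rng = np.random.default_rng(0)
    windows = rng.normal(size=(500, 90, 21))
    labels = rng.integers(0, 34, size=500)

    X = np.array([extract_features(w) for w in windows])
    X_train, X_test, y_train, y_test = train_test_split(
        X, labels, test_size=0.25, random_state=0, stratify=labels)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    print("weighted F1:", f1_score(y_test, clf.predict(X_test),
                                   average="weighted"))

The same interface covers both tasks the abstract mentions: with user IDs as labels it performs identification, and with gender or age-group labels it performs attribute inference, which is what allows a single framework to span technologies and actions.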

Keywords