Frontiers in Psychology (Nov 2022)
Virtual reality stimulation and organizational neuroscience for the assessment of empathy
Abstract
This study aimed to evaluate the viability of a new procedure based on machine learning (ML), virtual reality (VR), and implicit measures to discriminate empathy levels. Specifically, eye-tracking and decision-making patterns were used to classify individuals according to their level on each empathy dimension while they were immersed in virtual environments representing social workplace situations. The virtual environments were designed using an evidence-centered design approach. Interaction and gaze patterns were recorded for 82 participants, who were classified as high or low on each of the following empathy dimensions: perspective-taking, emotional understanding, empathetic stress, and empathetic joy. The dimensions were assessed using the Cognitive and Affective Empathy Test. An ML-based model combining behavioral outputs and eye-gaze patterns was developed to predict each participant's level (high or low) on each empathy dimension. The analysis indicated that levels on the different dimensions could be differentiated from eye-gaze patterns and behaviors during immersive VR, and the eye-tracking measures contributed more to this differentiation than the behavioral metrics. In summary, this study illustrates the potential of a novel organizational VR environment coupled with ML to discriminate levels of the empathy dimensions. However, the results should be interpreted with caution, as the small sample does not allow general conclusions to be drawn; further studies with a larger sample are required to corroborate these findings.
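The abstract does not specify the classifier, features, or validation scheme used in the study. The following is a minimal illustrative sketch only, assuming a generic supervised pipeline (here a random forest from scikit-learn with cross-validation) trained on hypothetical, randomly generated eye-tracking and behavioral features to predict a high/low label for one empathy dimension; the names, feature counts, and model choice are assumptions, not the authors' method.

```python
# Illustrative sketch only: the actual model, features, and hyperparameters
# are not described in the abstract. The data below are random placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_participants = 82  # sample size reported in the abstract

# Hypothetical feature matrix: eye-tracking measures (e.g., fixation counts,
# dwell times on areas of interest) concatenated with behavioral outputs
# (e.g., decisions made in the virtual workplace scenarios).
eye_tracking_features = rng.normal(size=(n_participants, 10))
behavioral_features = rng.normal(size=(n_participants, 4))
X = np.hstack([eye_tracking_features, behavioral_features])

# Binary target for a single empathy dimension (1 = high, 0 = low),
# e.g., perspective-taking as assessed by the questionnaire.
y = rng.integers(0, 2, size=n_participants)

# A simple supervised classifier evaluated with stratified cross-validation,
# a common setup for small samples; the study's actual algorithm may differ.
model = make_pipeline(StandardScaler(), RandomForestClassifier(random_state=0))
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
print(f"Mean CV accuracy: {scores.mean():.2f} (+/- {scores.std():.2f})")
```

With random placeholder data the reported accuracy hovers around chance; the sketch is only meant to show how eye-tracking and behavioral features can be combined into one feature matrix for per-dimension binary classification.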
Keywords