NeuroImage (Aug 2020)
A naturalistic viewing paradigm using 360° panoramic video clips and real-time field-of-view changes with eye-gaze tracking
Abstract
The naturalistic viewing of a video clip allows participants to obtain more information than conventional viewing of a static image. Because changing the field-of-view (FoV) yields new visual information, we were motivated to investigate whether naturalistic viewing with an FoV driven by active eye movements can enhance the viewing experience of natural stimuli, such as a video clip with a 360° FoV, inside an MRI scanner. To this end, we developed a novel naturalistic viewing paradigm based on real-time eye-gaze tracking while participants watched a 360° panoramic video during fMRI acquisition. Each participant's gaze position was recorded on an eye-tracking computer and transmitted to a stimulus-presentation computer via a TCP/IP connection. The identified gaze position was then used to alter the participant's FoV of the video clip in real time, so participants could shift their FoV to fully explore the 360° video clip (referred to in this paper as active viewing). The gaze positions recorded from one participant while watching a video were also used to change the FoV of the same video clip for a paired participant (referred to as yoked or passive viewing). Four 360° panoramic videos were used as stimuli, categorized by brightness level (i.e., bright vs. dark) and location (i.e., nature vs. city). Each participant actively viewed one of the two nature videos and one of the two city videos, passively viewed the other video in each category, and, after each active or passive viewing, watched the clip conventionally with a fixed FoV (referred to as fixed viewing). Forty-eight healthy volunteers participated in the study, and data from 42 of them were used in the analysis.
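The gaze-driven FoV coupling described above can be sketched in Python. This is a minimal illustration, not the authors' implementation: the `"x,y"` packet format, screen geometry, rotation gain, and dead zone are all hypothetical assumptions about how a gaze sample received over TCP/IP might be turned into a viewport update for a 360° video.

```python
def parse_gaze_packet(packet: bytes) -> tuple[float, float]:
    """Parse a hypothetical 'x,y' gaze packet (sent over TCP/IP by the
    eye-tracking computer) into screen-pixel coordinates."""
    x_str, y_str = packet.decode("ascii").strip().split(",")
    return float(x_str), float(y_str)


def update_fov(yaw_deg: float, pitch_deg: float,
               gaze_x: float, gaze_y: float,
               screen_w: int = 1920, screen_h: int = 1080,
               gain: float = 0.05, dead_zone: float = 0.1) -> tuple[float, float]:
    """Shift the 360° viewport toward the current gaze direction.

    The gaze offset from screen centre, normalised to [-1, 1], drives the
    rotation on each frame; offsets inside the dead zone leave the view
    unchanged so the image stays stable during central fixation.
    """
    dx = (gaze_x - screen_w / 2) / (screen_w / 2)
    dy = (gaze_y - screen_h / 2) / (screen_h / 2)
    if abs(dx) > dead_zone:
        yaw_deg = (yaw_deg + gain * dx * 90.0) % 360.0      # wrap horizontally
    if abs(dy) > dead_zone:
        # clamp vertically so the view cannot flip over the poles
        pitch_deg = max(-90.0, min(90.0, pitch_deg - gain * dy * 90.0))
    return yaw_deg, pitch_deg
```

In active viewing the packets would come from the participant's own eye tracker; in yoked (passive) viewing the same function could simply replay the gaze stream recorded from the paired participant.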
Representational similarity analysis (RSA) was conducted in a multiple regression framework using representational dissimilarity matrix (RDM) codes to jointly model the neuronal activations from the fMRI analysis and the participants' subjective ratings of their viewing experience across the four video clips and the two contrasting viewing conditions (i.e., “active–fixed” and “passive–fixed”). The participants' naturalistic viewing experience of the video clips was substantially more immersive with active viewing than with passive or fixed viewing. The RSA using the RDM codes revealed brain regions associated with the viewing experience, including regions for eye movement in the superior frontal area (Brodmann area 6) and for spatial navigation in the inferior/superior parietal areas. Brain regions potentially associated with cognitive and affective processing during viewing, such as the default-mode network and the insular/Rolandic operculum areas, were also identified. To the best of our knowledge, this is the first study to use participants' eye movements to interactively change their FoV of 360° panoramic video clips in real time. Our method of utilizing the MRI environment can be extended to other settings such as electroencephalography and behavioral research. It would also be feasible to apply our method to virtual reality and/or augmented reality systems to maximize the user experience based on eye movements.
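The regression form of RSA mentioned above can be sketched as follows: each RDM is vectorized to its upper triangle, and the neural RDM is regressed onto the model RDMs to obtain a beta weight per model. This is a generic illustration of the technique, assuming NumPy; the function names and the least-squares estimator are choices made here, not details taken from the paper.

```python
import numpy as np


def rdm_to_vector(rdm: np.ndarray) -> np.ndarray:
    """Vectorize a square, symmetric RDM by taking its upper triangle
    (excluding the zero diagonal), the standard representation in RSA."""
    i, j = np.triu_indices(rdm.shape[0], k=1)
    return rdm[i, j]


def rsa_regression(neural_rdm: np.ndarray,
                   model_rdms: list[np.ndarray]) -> np.ndarray:
    """Fit the neural RDM as a weighted sum of model RDMs plus intercept.

    Returns one ordinary-least-squares beta weight per model RDM,
    quantifying how strongly each candidate code explains the measured
    dissimilarity structure.
    """
    y = rdm_to_vector(neural_rdm)
    X = np.column_stack([np.ones_like(y)]
                        + [rdm_to_vector(m) for m in model_rdms])
    betas, *_ = np.linalg.lstsq(X, y, rcond=None)
    return betas[1:]  # drop the intercept term
```

In a study like this one, the model RDMs could encode, for example, video category or condition contrasts such as “active–fixed” and “passive–fixed”, with the regression run per voxel or searchlight.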