Frontiers in Robotics and AI (Aug 2016)

Multi-modal Virtual Scenario Enhances Neurofeedback Learning

  • Avihay Cohen,
  • Jackob Nimrod Keynan,
  • Gilan Jackont,
  • Nili Green,
  • Iris Rashap,
  • Ofir Shany,
  • Fred Charles,
  • Marc Cavazza,
  • Talma Hendler,
  • Gal Raz

DOI
https://doi.org/10.3389/frobt.2016.00052
Journal volume & issue
Vol. 3

Abstract

In the past decade, neurofeedback has become the focus of a growing body of research. With real-time fMRI enabling online monitoring of emotion-related areas such as the amygdala, many have begun testing its therapeutic benefits. However, most existing neurofeedback procedures still use monotonous uni-modal interfaces, possibly limiting user engagement and weakening learning efficiency. The current study tested a novel multi-sensory animated neurofeedback scenario aimed at enhancing user experience and improving learning. We examined whether, relative to a simple uni-modal 2D interface, learning via a complex multi-modal 3D scenario would result in improved neurofeedback learning. As a neural probe, we used the recently developed fMRI-inspired EEG model of amygdala activity (amygdala-EEG fingerprint; amygdala-EFP), enabling low-cost and mobile limbic neurofeedback training. Amygdala-EFP was reflected in the animated scenario by the unrest level of a hospital waiting room, in which virtual characters become impatient, approach the admission desk, and complain loudly. Successful down-regulation was reflected as an easing of the room's unrest level. We tested whether, relative to a standard uni-modal 2D graphic thermometer interface, this animated scenario could facilitate more effective learning and improve the training experience. Thirty participants underwent two separate neurofeedback sessions (one week apart), practicing down-regulation of the amygdala-EFP signal. In the first session, half trained via the animated scenario and half via a thermometer interface. Learning efficiency was tested by three parameters: (a) effect size of the change in amygdala-EFP following training, (b) sustainability of the learned down-regulation in the absence of online feedback, and (c) transferability to an unfamiliar context.
Comparing amygdala-EFP signal amplitude between the last and first neurofeedback trials revealed that the animated scenario produced a higher effect size. In addition, neurofeedback via the animated scenario showed better sustainability, as indicated by a no-feedback trial conducted in session 2, and better transferability to a new, unfamiliar interface. Lastly, participants reported that the animated scenario was more engaging and more motivating than the thermometer. Together, these results demonstrate the promising potential of integrating realistic virtual environments into neurofeedback to enhance learning and improve user experience.

Keywords