Frontiers in Neuroscience (Nov 2014)

Hybrid fNIRS-EEG based classification of auditory and visual perception processes

  • Felix Putze,
  • Sebastian Hesslinger,
  • Chun-Yu Tse,
  • YunYing Huang,
  • Christian Herff,
  • Cuntai Guan,
  • Tanja Schultz

DOI
https://doi.org/10.3389/fnins.2014.00373
Journal volume & issue
Vol. 8

Abstract

For multimodal Human-Computer Interaction (HCI), it is very useful to identify the modalities on which the user is currently processing information. This would enable a system to select complementary output modalities to reduce the user's workload. In this paper, we develop a hybrid Brain-Computer Interface (BCI) which uses Electroencephalography (EEG) and functional Near Infrared Spectroscopy (fNIRS) to discriminate and detect visual and auditory stimulus processing. We describe the experimental setup used to collect our data corpus from 12 subjects. We present cross-validation results for different classification conditions. We show that our subject-dependent systems achieved a classification accuracy of 97.8% for discriminating visual and auditory perception processes from each other and a classification accuracy of up to 94.8% for detecting modality-specific processes independently of other cognitive activity. The same classification conditions could also be discriminated in a subject-independent fashion with accuracies of up to 94.6% and 86.7%, respectively. We also look at the contributions of the two signal types and show that the fusion of classifiers using different features significantly increases accuracy.
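The classifier fusion mentioned at the end of the abstract can be illustrated with a minimal late-fusion sketch: two classifiers are trained on different feature sets (stand-ins for EEG and fNIRS features here) and their predicted class probabilities are averaged before taking the argmax. All data, feature dimensions, and the choice of logistic regression are hypothetical assumptions for illustration; the paper's actual features and classifiers differ.

```python
# Hypothetical late-fusion sketch (not the paper's actual pipeline):
# average the posterior probabilities of two classifiers trained on
# different feature sets, then predict the most probable class.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
labels = rng.integers(0, 2, n)                    # 0 = auditory, 1 = visual
eeg = rng.normal(labels[:, None], 1.0, (n, 8))    # synthetic "EEG" features
nirs = rng.normal(labels[:, None], 1.5, (n, 4))   # synthetic "fNIRS" features

clf_eeg = LogisticRegression().fit(eeg, labels)
clf_nirs = LogisticRegression().fit(nirs, labels)

# Late fusion: average the two posterior distributions, then argmax
p_fused = (clf_eeg.predict_proba(eeg) + clf_nirs.predict_proba(nirs)) / 2
fused_pred = p_fused.argmax(axis=1)
acc = (fused_pred == labels).mean()
print(f"fused training accuracy: {acc:.2f}")
```

On well-separated synthetic data like this, the fused prediction typically matches or exceeds either single-modality classifier; the paper reports the analogous effect on real EEG and fNIRS features.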

Keywords