IEEE Access (Jan 2021)

Parameter-Free Attention in fMRI Decoding

  • Yong Qi,
  • Huawei Lin,
  • Yanping Li,
  • Jiashu Chen

DOI
https://doi.org/10.1109/ACCESS.2021.3068921
Journal volume & issue
Vol. 9
pp. 48704 – 48712

Abstract

An fMRI decoder aims to infer the type of task stimulus from given fMRI data. Recently, deep learning techniques have attracted attention in fMRI decoding, yet they have not demonstrated outstanding decoding performance because of ultra-high-dimensional data, extremely complex computation, and subtle differences between tasks. In this work, we propose a parameter-free attention module called the Skip Attention Module (SAM), consisting of a weight branch and a skip branch, which attends to more informative areas to enhance data features. SAM contains no trainable parameters and adds no training burden; thus, it can be stacked on any Convolutional Neural Network (CNN) architecture, or even on a pre-trained model. Our experiments on seven tasks of the large-scale Human Connectome Project (HCP) S1200 data set, which contains about 1200 subjects, show that the architecture with SAM achieves a significant performance improvement over the non-attention architecture. Across many experiments, the average decoding accuracy reaches 88.7%, and the average decoding error of the architecture with SAM is 1.2%–3.1% lower than that of the architecture without SAM. For a single task, the decoding accuracy with SAM increases by up to 11.1%. The proposed method also performs well on the ADHD-200 dataset, indicating its generality. These results establish that the proposed SAM can be superimposed on any architecture and effectively improves fMRI decoding accuracy.
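The abstract does not give SAM's exact formulation, but a parameter-free module with a weight branch and a skip branch could be sketched as follows. This is a minimal illustrative NumPy sketch under assumed design choices (channel-mean energy, standardization, sigmoid weighting, additive skip), not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def skip_attention(x):
    """Hypothetical sketch of a parameter-free attention module
    with a weight branch and a skip branch (assumed design).

    x: feature map of shape (C, H, W).
    """
    # Weight branch: derive per-location weights directly from the
    # activations themselves -- no trainable parameters involved.
    energy = x.mean(axis=0)                               # (H, W)
    energy = (energy - energy.mean()) / (energy.std() + 1e-5)
    weights = sigmoid(energy)                             # values in (0, 1)
    # Skip branch: the identity path preserves the original features,
    # so the module can be stacked on any CNN without retraining it.
    return x * weights[None, :, :] + x
```

Because the weights are computed from the input rather than learned, such a module adds no parameters and can in principle be inserted into a pre-trained network, consistent with the stackability claim in the abstract.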

Keywords