PLoS Computational Biology (Aug 2024)
Convolutional neural networks can identify brain interactions involved in decoding spatial auditory attention.
Abstract
Human listeners have the ability to direct their attention to a single speaker in a multi-talker environment. The neural correlates of selective attention can be decoded from a single trial of electroencephalography (EEG) data. In this study, using source-reconstructed and anatomically resolved EEG data as inputs, we sought to employ a convolutional neural network (CNN) as an interpretable model to uncover task-specific interactions between brain regions, rather than simply to utilize it as a black-box decoder. To this end, our CNN model was specifically designed to learn pairwise interaction representations for 10 cortical regions from five-second inputs. By exclusively utilizing these features for decoding, our model was able to attain a median accuracy of 77.56% for within-participant and 65.14% for cross-participant classification. Through ablation analysis, together with dissecting the features of the models and applying cluster analysis, we were able to discern the presence of alpha-band-dominated inter-hemisphere interactions, as well as alpha- and beta-band-dominant interactions that were either hemisphere-specific or were characterized by a contrasting pattern between the right and left hemispheres. These interactions were more pronounced in parietal and central regions for within-participant decoding, but in parietal, central, and partly frontal regions for cross-participant decoding. These findings demonstrate that our CNN model can effectively utilize features known to be important in auditory attention tasks, and suggest that the application of domain-knowledge-inspired CNNs to source-reconstructed EEG data can offer a novel computational framework for studying task-relevant brain interactions.
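To make the scale of the pairwise-interaction design concrete, the sketch below illustrates (in plain numpy, not the authors' actual implementation) how a forward pass over all region pairs might look. The sampling rate, kernel length, and pooling choices here are hypothetical; only the 10 regions, the five-second window, and the pairwise structure come from the abstract.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Hypothetical dimensions (the paper's sampling rate and filter
# sizes are not stated in the abstract):
n_regions, fs, win_s = 10, 100, 5
T = fs * win_s                             # 500 samples per 5-s window
x = rng.standard_normal((n_regions, T))    # one source-reconstructed trial

pairs = list(combinations(range(n_regions), 2))  # C(10, 2) = 45 pairs

k = 16                                     # hypothetical 1-D kernel length
W = rng.standard_normal((len(pairs), 2, k)) * 0.01  # one filter pair per region pair

def pair_feature(xi, xj, w):
    """Convolve each region's trace with its kernel, sum, ReLU, then pool."""
    y = (np.convolve(xi, w[0], mode="valid")
         + np.convolve(xj, w[1], mode="valid"))
    return np.maximum(y, 0).mean()         # global average pooling

# One interaction feature per region pair; a classifier head would
# then decode attention from this 45-dimensional representation.
feats = np.array([pair_feature(x[i], x[j], W[p])
                  for p, (i, j) in enumerate(pairs)])
print(feats.shape)  # → (45,)
```

The point of the sketch is the bookkeeping: 10 regions yield 45 candidate interactions, so a model restricted to pairwise features stays small enough to ablate and inspect pair by pair, which is what enables the interpretability analyses described above.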