Scientific Reports (Nov 2024)
An adaptive multi-graph neural network with multimodal feature fusion learning for MDD detection
Abstract
Major Depressive Disorder (MDD) is an affective disorder that can lead to persistent sadness and a decline in quality of life, increasing the risk of suicide. Utilizing multimodal data such as electroencephalograms and audio recordings of patient interviews can facilitate the timely detection of MDD. However, existing depression detection methods either consider only a single modality or, in multimodal approaches, do not fully account for the differences and similarities between modalities, potentially overlooking latent information inherent in the various modal data. To address these challenges, we propose EMO-GCN, a multimodal depression detection method based on an adaptive multi-graph neural network. By employing graph-based methods to model data from various modalities and extract features from them, the potential correlations between modalities are uncovered. The model performs strongly on the MODMA dataset, achieving an accuracy (ACC) of 96.30%. Ablation studies further confirm the effectiveness of the model's individual components. The experimental results of EMO-GCN demonstrate the application prospects of graph-based multimodal analysis in the field of mental health, offering new perspectives for future research.
Keywords