Abstract
Identifying depression from electroencephalogram (EEG) data is a formidable challenge because of the intricacy of cerebral networks and the substantial individual variability in neural activity. Conventional models often fail to (i) capture EEG brain connectivity beyond simple pairwise interactions, (ii) account for inter-channel spatial relationships in the brain and (iii) integrate a variety of EEG-related features. Addressing these shortcomings, this article presents a novel model: a unified brain network that captures multiple spatiotemporal features by leveraging a K-Nearest Neighbour (KNN)-based channel-channel relational matrix and a Graph Convolutional Gated Recurrent Unit (GCGRU) for depression detection and classification from EEG data, combining Graph Convolutional Networks with Gated Recurrent Units to process both the spatial and temporal features of EEG signals. Experimental results demonstrate that the proposed model achieves an accuracy of 83.67% in major depressive disorder (MDD) detection, with F1-score, recall and precision each reaching 84%. Compared with existing state-of-the-art models for EEG-based depression detection, the proposed model achieves an 8% improvement in MDD detection accuracy.
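As a rough illustration of the architecture sketched in the abstract, the following NumPy snippet builds a KNN-based channel-channel adjacency matrix and runs one GCGRU step in which the GRU's linear maps are replaced by graph convolutions. This is a minimal sketch under stated assumptions: the distance metric (Euclidean), the symmetric normalisation, the weight shapes and the helper names (`knn_adjacency`, `normalized_adjacency`, `gcgru_step`) are illustrative choices, not the paper's implementation.

```python
import numpy as np

def knn_adjacency(features, k=3):
    """Symmetric KNN adjacency over EEG channels.

    features: (channels, feature_dim) per-channel feature vectors.
    Assumption: Euclidean distance; the paper's metric may differ.
    """
    n = features.shape[0]
    dist = np.linalg.norm(features[:, None, :] - features[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)          # exclude self from neighbour search
    A = np.zeros((n, n))
    for i in range(n):
        A[i, np.argsort(dist[i])[:k]] = 1.0  # connect each channel to its k nearest
    return np.maximum(A, A.T)                # symmetrise the directed KNN graph

def normalized_adjacency(A):
    """GCN-style normalisation: D^{-1/2} (A + I) D^{-1/2}."""
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    return A_tilde / np.sqrt(np.outer(d, d))

def gcgru_step(x, h, A_hat, Wz, Uz, Wr, Ur, Wh, Uh):
    """One GCGRU update: standard GRU gating where every linear map
    is a graph convolution A_hat @ X @ W (weights are illustrative)."""
    gconv = lambda X, W: A_hat @ X @ W
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    z = sigmoid(gconv(x, Wz) + gconv(h, Uz))          # update gate
    r = sigmoid(gconv(x, Wr) + gconv(h, Ur))          # reset gate
    h_tilde = np.tanh(gconv(x, Wh) + gconv(r * h, Uh))  # candidate state
    return (1 - z) * h + z * h_tilde
```

Iterating `gcgru_step` over successive EEG time windows yields per-channel hidden states that mix spatial (graph) and temporal (recurrent) information, which is the combination of GCN and GRU the abstract describes.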