Global Adaptive Transformer for Cross-Subject Enhanced EEG Classification
Overview
Rehabilitation Medicine
Due to individual differences, EEG signals from other subjects (the source domain) can hardly be used to decode the mental intentions of a target subject. Although transfer learning methods have shown promising results, they still suffer from poor feature representation or neglect long-range dependencies. In light of these limitations, we propose the Global Adaptive Transformer (GAT), a domain adaptation method that exploits source data for cross-subject enhancement. Our method first uses parallel convolution to capture temporal and spatial features. It then employs a novel attention-based adaptor that implicitly transfers source features to the target domain, emphasizing the global correlation of EEG features. We also use a discriminator to explicitly reduce the marginal distribution discrepancy by learning adversarially against the feature extractor and the adaptor. In addition, an adaptive center loss is designed to align the conditional distribution. With the aligned source and target features, a classifier can be optimized to decode EEG signals. Experiments on two widely used EEG datasets demonstrate that our method outperforms state-of-the-art methods, primarily due to the effectiveness of the adaptor. These results indicate that GAT has good potential to enhance the practicality of brain-computer interfaces (BCIs).
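The abstract outlines a pipeline of parallel convolutional feature extraction, an attention-based adaptor, an adversarial discriminator for marginal alignment, and a center loss for conditional alignment. The sketch below shows one plausible way to wire these components together in PyTorch; the layer sizes, the gradient-reversal adversarial scheme, the pseudo-labeling of target trials, and the fixed (rather than adaptively updated) class centers are illustrative assumptions, not the authors' exact implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureExtractor(nn.Module):
    """Parallel temporal and spatial convolutions over EEG trials shaped (B, 1, channels, samples)."""
    def __init__(self, n_channels=22, n_filters=16):
        super().__init__()
        self.temporal = nn.Sequential(
            nn.Conv2d(1, n_filters, kernel_size=(1, 25), padding=(0, 12)),
            nn.BatchNorm2d(n_filters), nn.ELU(), nn.AdaptiveAvgPool2d((1, 32)))
        self.spatial = nn.Sequential(
            nn.Conv2d(1, n_filters, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(n_filters), nn.ELU(), nn.AdaptiveAvgPool2d((1, 32)))

    def forward(self, x):                           # x: (B, 1, channels, samples)
        t = self.temporal(x).flatten(1)             # temporal branch -> (B, 512)
        s = self.spatial(x).flatten(1)              # spatial branch  -> (B, 512)
        return torch.stack([t, s], dim=1)           # two feature "tokens" per trial

class AttentionAdaptor(nn.Module):
    """Self-attention over the feature tokens, capturing global correlations before pooling."""
    def __init__(self, dim=512, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, tokens):                      # tokens: (B, 2, dim)
        out, _ = self.attn(tokens, tokens, tokens)
        return self.norm(tokens + out).mean(dim=1)  # pooled feature: (B, dim)

class GradReverse(torch.autograd.Function):
    """Gradient reversal so the extractor and adaptor learn against the domain discriminator."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)
    @staticmethod
    def backward(ctx, grad):
        return -ctx.lam * grad, None

n_classes = 4
extractor, adaptor = FeatureExtractor(), AttentionAdaptor()
classifier = nn.Linear(512, n_classes)
discriminator = nn.Sequential(nn.Linear(512, 64), nn.ReLU(), nn.Linear(64, 2))
centers = torch.zeros(n_classes, 512)               # class centers; fixed here, adaptive in the paper

def center_loss(feat, labels):
    """Pull features toward their class centers (conditional-distribution alignment)."""
    return ((feat - centers[labels]) ** 2).sum(dim=1).mean()

def training_step(xs, ys, xt):
    fs, ft = adaptor(extractor(xs)), adaptor(extractor(xt))          # source / target features
    cls_loss = F.cross_entropy(classifier(fs), ys)                   # supervised source loss
    dom_logits = discriminator(GradReverse.apply(torch.cat([fs, ft]), 0.1))
    dom_labels = torch.cat([torch.zeros(len(fs)), torch.ones(len(ft))]).long()
    adv_loss = F.cross_entropy(dom_logits, dom_labels)               # marginal alignment
    pseudo = classifier(ft).detach().argmax(dim=1)                   # pseudo-labels for target trials
    cen_loss = center_loss(fs, ys) + center_loss(ft, pseudo)
    return cls_loss + adv_loss + 0.01 * cen_loss

In a training loop, xs and ys would be labeled source trials and xt unlabeled target trials, each batch shaped (batch, 1, channels, samples); the returned scalar is backpropagated through all modules at once because the gradient reversal handles the adversarial direction.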