Abstract
Motor imagery (MI) electroencephalogram (EEG) decoding plays a critical role in brain-computer interfaces but remains challenging due to large inter-subject variability and limited training data. Existing approaches often struggle with few-shot cross-subject adaptation, as they either require extensive fine-tuning or fail to capture individualized neural dynamics. To address this issue, we propose Task-Conditioned Prompt Learning (TCPL), which integrates a Task-Conditioned Prompt (TCP) module with a hybrid Temporal Convolutional Network (TCN) and Transformer backbone under a meta-learning framework. Specifically, the TCP module encodes subject-specific variability as prompt tokens, the TCN extracts local temporal patterns, the Transformer captures global dependencies, and meta-learning enables rapid adaptation from minimal samples. The proposed TCPL model is validated on three widely used public datasets, GigaScience, Physionet, and BCI Competition IV 2a, demonstrating strong generalization and efficient adaptation to unseen subjects. These results highlight the feasibility of TCPL for practical few-shot EEG decoding and its potential to advance the development of personalized brain-computer interface systems.