
BELT: Bootstrapped EEG-to-Language Training by Natural Language Supervision

Authors :
Jinzhao Zhou
Yiqun Duan
Yu-Cheng Chang
Yu-Kai Wang
Chin-Teng Lin
Source :
IEEE Transactions on Neural Systems and Rehabilitation Engineering, Vol 32, pp. 3278-3288 (2024)
Publication Year :
2024
Publisher :
IEEE, 2024.

Abstract

Decoding natural language from noninvasive brain signals has been an exciting topic with the potential to expand the applications of brain-computer interface (BCI) systems. However, current methods face limitations in decoding sentences from electroencephalography (EEG) signals. Improving decoding performance requires the development of a more effective encoder for the EEG modality. Nonetheless, learning generalizable EEG representations remains a challenge due to the relatively small scale of existing EEG datasets. In this paper, we propose enhancing the EEG encoder to improve subsequent decoding performance. Specifically, we introduce the discrete Conformer encoder (D-Conformer) to transform EEG signals into discrete representations and bootstrap the learning process by imposing EEG-language alignment from the early training stage. The D-Conformer captures both local and global patterns from EEG signals and discretizes the EEG representation, making the representation more resilient to variations, while early-stage EEG-language alignment mitigates the limitations of small EEG datasets and facilitates the learning of semantic representations from EEG signals. These enhancements result in improved EEG representations and decoding performance. We conducted extensive experiments and ablation studies to thoroughly evaluate the proposed method. Utilizing the D-Conformer encoder and the bootstrapping training strategy, our approach demonstrates superior decoding performance across various tasks, including word-level, sentence-level, and sentiment-level decoding from EEG signals. Specifically, in word-level classification, we show that our encoding method produces more distinctive representations and higher classification performance compared to the EEG encoders from existing methods. At the sentence level, our model outperformed the baseline by 5.45%, achieving a BLEU-1 score of 42.31%. Furthermore, in sentiment classification, our model exceeded the baseline by 14%, achieving a sentiment classification accuracy of 69.3%.
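The two ideas the abstract names, discretizing continuous EEG features and aligning EEG embeddings with language embeddings early in training, can be illustrated with a minimal sketch. This is not the paper's implementation; the nearest-codebook quantizer and the CLIP-style symmetric InfoNCE loss below are generic stand-ins for the D-Conformer's discretization step and the EEG-language alignment objective, and all names and shapes here are assumptions for illustration.

```python
import numpy as np

def quantize(z, codebook):
    """Discretization sketch: snap each continuous feature vector in z
    (n, d) to its nearest codebook entry (k, d), returning the quantized
    vectors and their discrete code indices."""
    dists = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=-1)  # (n, k)
    idx = dists.argmin(axis=1)
    return codebook[idx], idx

def alignment_loss(eeg_emb, text_emb, temperature=0.07):
    """EEG-language alignment sketch: symmetric contrastive (InfoNCE)
    loss over matched EEG/text embedding pairs, CLIP-style. Row i of
    eeg_emb is assumed to correspond to row i of text_emb."""
    eeg_emb = eeg_emb / np.linalg.norm(eeg_emb, axis=1, keepdims=True)
    text_emb = text_emb / np.linalg.norm(text_emb, axis=1, keepdims=True)
    logits = eeg_emb @ text_emb.T / temperature  # pairwise similarities
    labels = np.arange(len(eeg_emb))

    def cross_entropy(l):
        l = l - l.max(axis=1, keepdims=True)  # numerical stability
        log_probs = l - np.log(np.exp(l).sum(axis=1, keepdims=True))
        return -log_probs[labels, labels].mean()  # diagonal = matched pairs

    return 0.5 * (cross_entropy(logits) + cross_entropy(logits.T))
```

In this toy setting, well-aligned pairs drive the loss toward zero, while mismatched pairs keep it high; the paper's contribution is applying such alignment from the earliest training stage so the small EEG dataset is bootstrapped by language supervision.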

Details

Language :
English
ISSN :
1534-4320 and 1558-0210
Volume :
32
Database :
Directory of Open Access Journals
Journal :
IEEE Transactions on Neural Systems and Rehabilitation Engineering
Publication Type :
Academic Journal
Accession number :
edsdoj.61eb1968737841b5965af14dbdcd22a9
Document Type :
article
Full Text :
https://doi.org/10.1109/TNSRE.2024.3450795