Phairot Autthasan, Rattanaphon Chaisaen, Thapanun Sudhawiyangkul, Phurin Rangpong, Suktipol Kiatthaveephong, Nat Dilokthanakul, Gun Bhakdisongkhram, Huy Phan, Cuntai Guan, and Theerawit Wilaiprasitporn
Objective: Advances in motor imagery (MI)-based brain-computer interfaces (BCIs) allow control of several applications by decoding neurophysiological phenomena, usually recorded non-invasively by electroencephalography (EEG). Despite significant advances in MI-based BCI, EEG rhythms are subject-specific and vary over time. These issues pose significant challenges to enhancing classification performance, especially in a subject-independent manner. Methods: To overcome these challenges, we propose MIN2Net, a novel end-to-end multi-task learning framework. We integrate deep metric learning into a multi-task autoencoder to learn a compact and discriminative latent representation from EEG and to perform classification simultaneously. Results: This approach reduces pre-processing complexity and yields a significant improvement in EEG classification performance. Experimental results in a subject-independent setting show that MIN2Net outperforms state-of-the-art techniques, achieving F1-score improvements of 6.72% and 2.23% on the SMR-BCI and OpenBMI datasets, respectively. Conclusion: We demonstrate that MIN2Net improves the discriminative information in the latent representation. Significance: This study indicates the possibility and practicality of using this model to develop MI-based BCI applications for new users without calibration.

This work was supported in part by PTT Public Company Limited, in part by The SCB Public Company Limited, in part by Thailand Science Research and Innovation under Grant SRI62W1501, in part by the Office of the Permanent Secretary of the Ministry of Higher Education, Science, Research, and Innovation, Thailand under Grant RGNS63-252, and in part by the National Research Council of Thailand under Grant N41A640131.
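To make the multi-task objective described above concrete, the following is a minimal sketch, assuming PyTorch, of an autoencoder whose latent code is shaped jointly by a reconstruction loss, a deep-metric (triplet) loss, and a classification loss. Layer sizes, loss weights, and the triplet sampling scheme are illustrative placeholders, not the authors' published MIN2Net configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiTaskAE(nn.Module):
    """Toy multi-task autoencoder: encode, reconstruct, and classify."""
    def __init__(self, in_dim=1000, latent_dim=64, n_classes=2):
        # in_dim stands for a flattened EEG trial (channels x time); hypothetical.
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, 256), nn.ELU(),
                                     nn.Linear(256, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 256), nn.ELU(),
                                     nn.Linear(256, in_dim))
        self.classifier = nn.Linear(latent_dim, n_classes)

    def forward(self, x):
        z = self.encoder(x)                      # compact latent representation
        return self.decoder(z), self.classifier(z), z

def multitask_loss(model, anchor, positive, negative, labels,
                   w_rec=1.0, w_trip=1.0, w_cls=1.0):
    """Sum the three task losses into a single training objective."""
    x_hat, logits, z_a = model(anchor)
    z_p = model.encoder(positive)                # same-class trial
    z_n = model.encoder(negative)                # different-class trial
    rec = F.mse_loss(x_hat, anchor)              # autoencoder reconstruction
    trip = F.triplet_margin_loss(z_a, z_p, z_n, margin=1.0)  # metric learning
    cls = F.cross_entropy(logits, labels)        # supervised classification
    return w_rec * rec + w_trip * trip + w_cls * cls
```

The triplet term pulls latent codes of same-class trials together while pushing different classes apart, which is the mechanism the abstract credits for producing a more discriminative latent representation than reconstruction or classification alone.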