
A 36-Class Bimodal ERP Brain-Computer Interface Using Location-Congruent Auditory-Tactile Stimuli.

Authors :
Zhang, Boyang
Zhou, Zongtan
Jiang, Jing
Source :
Brain Sciences (2076-3425). Aug 2020, Vol. 10, Issue 8, p. 524. 1 p.
Publication Year :
2020

Abstract

To date, traditional visual-based event-related potential brain-computer interface (ERP-BCI) systems continue to dominate mainstream BCI research. However, these conventional BCIs are unsuitable for individuals who have partially or completely lost their vision. Given the poor performance of gaze-independent ERP-BCIs, it is necessary to study techniques that improve the performance of these systems. In this paper, we developed a novel 36-class bimodal ERP-BCI system based on tactile and auditory stimuli, in which six-virtual-direction audio files produced via head-related transfer functions (HRTFs) were delivered through headphones while location-congruent electro-tactile stimuli were simultaneously delivered to the corresponding positions using electrodes placed on the abdomen and waist. We selected the eight best channels, trained a Bayesian linear discriminant analysis (BLDA) classifier, and determined the optimal number of trials for target selection in the online process. The average online information transfer rate (ITR) of the bimodal ERP-BCI reached 11.66 bit/min, improvements of 35.11% and 36.69% over the auditory (8.63 bit/min) and tactile (8.53 bit/min) approaches, respectively. These results demonstrate that the performance of the bimodal system is superior to that of either unimodal system. They indicate that the proposed bimodal system has potential utility as a gaze-independent BCI in future real-world applications. [ABSTRACT FROM AUTHOR]
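The reported relative improvements follow directly from the three ITR values in the abstract. A minimal sketch checking that arithmetic, together with the standard Wolpaw ITR definition commonly used for N-class BCIs (the abstract does not state which ITR formula the authors used, so `wolpaw_itr` is an illustrative assumption, not the paper's code):

```python
import math

def wolpaw_itr(n_classes, accuracy, secs_per_selection):
    """Wolpaw ITR in bit/min for an n-class BCI with the given
    selection accuracy and time per selection (illustrative only;
    the paper does not specify its ITR formula)."""
    p, n = accuracy, n_classes
    if p >= 1.0:
        bits = math.log2(n)          # perfect accuracy: log2(N) bits per selection
    elif p <= 1.0 / n:
        bits = 0.0                   # at or below chance: no information transferred
    else:
        bits = (math.log2(n) + p * math.log2(p)
                + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits * 60.0 / secs_per_selection

# Reproduce the relative improvements reported in the abstract.
bimodal, auditory, tactile = 11.66, 8.63, 8.53   # bit/min

gain_aud = (bimodal - auditory) / auditory * 100
gain_tac = (bimodal - tactile) / tactile * 100

print(f"vs. auditory: {gain_aud:.2f}%")  # → 35.11%
print(f"vs. tactile:  {gain_tac:.2f}%")  # → 36.69%
```

At perfect accuracy a 36-class selection carries log2(36) ≈ 5.17 bits, which is why a large class count can yield a high ITR even with modest selection speed.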

Details

Language :
English
ISSN :
2076-3425
Volume :
10
Issue :
8
Database :
Academic Search Index
Journal :
Brain Sciences (2076-3425)
Publication Type :
Academic Journal
Accession Number :
145187402
Full Text :
https://doi.org/10.3390/brainsci10080524