1. NeuroBind: Towards Unified Multimodal Representations for Neural Signals
- Authors
Yang, Fengyu; Feng, Chao; Wang, Daniel; Wang, Tianye; Zeng, Ziyao; Xu, Zhiyang; Park, Hyoungseob; Ji, Pengliang; Zhao, Hanbin; Li, Yuanning; and Wong, Alex
- Subjects
Quantitative Biology - Neurons and Cognition; Computer Science - Machine Learning
- Abstract
Understanding neural activity and information representation is crucial for advancing knowledge of brain function and cognition. Neural activity, measured through techniques such as electrophysiology and neuroimaging, reflects various aspects of information processing. Recent advances in deep neural networks offer new approaches to analyzing these signals using pre-trained models. However, challenges arise from discrepancies between neural signal modalities and the limited scale of high-quality neural data. To address these challenges, we present NeuroBind, a general representation that unifies multiple brain signal types, including EEG, fMRI, calcium imaging, and spiking data. To achieve this, we align neural signals from image-paired neural datasets to pre-trained vision-language embeddings. NeuroBind is the first model to study different neural modalities interconnectedly and to leverage high-resource modality models for various neuroscience tasks. By combining information from multiple neural signal modalities mapped to the same space, NeuroBind improves downstream performance and demonstrates the complementary strengths of different neural modalities. This approach holds significant potential for advancing neuroscience research, improving AI systems, and developing neuroprosthetics and brain-computer interfaces.
- Published
2024
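
The abstract describes aligning neural-signal encoders to pre-trained vision-language embeddings using image-paired neural data. The sketch below illustrates the general idea with a CLIP-style symmetric contrastive loss; it is not the paper's implementation, and the encoder architecture, dimensions, and hyperparameters are illustrative assumptions.

```python
# Hypothetical sketch: aligning an EEG encoder to frozen image embeddings
# with a symmetric InfoNCE (CLIP-style) contrastive loss. Shapes, the MLP
# architecture, and the temperature are assumptions, not the paper's design.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EEGEncoder(nn.Module):
    """Toy EEG encoder: flattens channels x time and projects into the
    shared vision-language embedding space (e.g. a 512-dim CLIP space)."""
    def __init__(self, n_channels=64, n_timesteps=256, embed_dim=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(n_channels * n_timesteps, 1024),
            nn.GELU(),
            nn.Linear(1024, embed_dim),
        )

    def forward(self, x):
        return self.net(x)

def clip_style_alignment_loss(neural_emb, image_emb, temperature=0.07):
    """Symmetric InfoNCE loss: matched (neural, image) pairs on the
    diagonal are positives; all other pairs in the batch are negatives."""
    neural_emb = F.normalize(neural_emb, dim=-1)
    image_emb = F.normalize(image_emb, dim=-1)
    logits = neural_emb @ image_emb.t() / temperature
    targets = torch.arange(logits.size(0), device=logits.device)
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))

# Usage with random stand-in data; in practice image_emb would come from a
# frozen pre-trained vision-language image encoder applied to the paired images.
encoder = EEGEncoder()
eeg = torch.randn(8, 64, 256)       # batch of image-paired EEG clips (stand-in)
image_emb = torch.randn(8, 512)     # frozen image embeddings (stand-in)
loss = clip_style_alignment_loss(encoder(eeg), image_emb.detach())
loss.backward()
print(f"alignment loss: {loss.item():.4f}")
```

The same pattern would apply per modality (fMRI, calcium imaging, spiking data), with each modality-specific encoder trained against the frozen vision-language space so that all neural signals land in one shared embedding space.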