
Population Transformer: Learning Population-level Representations of Neural Activity

Authors:
Chau, Geeling
Wang, Christopher
Talukder, Sabera
Subramaniam, Vighnesh
Soedarmadji, Saraswati
Yue, Yisong
Katz, Boris
Barbu, Andrei
Publication Year: 2024

Abstract

We present a self-supervised framework that learns population-level codes for arbitrary ensembles of neural recordings at scale. We address two key challenges in scaling models with neural time-series data: sparse and variable electrode distribution across subjects and datasets. The Population Transformer (PopT) stacks on top of pretrained representations and enhances downstream decoding by enabling learned aggregation of multiple spatially-sparse data channels. The pretrained PopT lowers the amount of data required for downstream decoding experiments, while increasing accuracy, even on held-out subjects and tasks. Compared to end-to-end methods, this approach is computationally lightweight and more interpretable, while still retaining competitive performance. We further show how our framework is generalizable to multiple time-series embeddings and neural data modalities. Beyond decoding, we interpret the pretrained PopT and fine-tuned models to show how they can be used to extract neuroscience insights from massive amounts of data. We release our code as well as a pretrained PopT to enable off-the-shelf improvements in multi-channel intracranial data decoding and interpretability.

Comment: 19 pages, 11 figures, submitted to ICLR 2025
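To make the abstract's aggregation idea concrete, below is a minimal, hypothetical PyTorch sketch of a transformer that pools a variable number of per-channel embeddings into one population-level code via a learnable summary token. The class name `PopulationAggregator`, the tensor shapes, the positional-embedding scheme, and the [CLS]-style pooling are assumptions for illustration only; they are not the authors' released code or the exact PopT architecture.

```python
import torch
import torch.nn as nn

class PopulationAggregator(nn.Module):
    """Hypothetical sketch: a transformer that pools an arbitrary set of
    per-channel embeddings into a single population-level representation."""

    def __init__(self, dim: int = 256, depth: int = 4, heads: int = 8):
        super().__init__()
        # Learnable summary token; its output serves as the ensemble code.
        self.cls = nn.Parameter(torch.randn(1, 1, dim))
        layer = nn.TransformerEncoderLayer(
            d_model=dim, nhead=heads, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)

    def forward(self, chan_emb: torch.Tensor, pos_emb: torch.Tensor) -> torch.Tensor:
        # chan_emb: (batch, n_channels, dim) outputs of a frozen, pretrained
        # single-channel time-series model; n_channels may differ per subject.
        # pos_emb: (batch, n_channels, dim) embedding of electrode locations,
        # letting the model handle sparse, variable electrode layouts.
        tokens = chan_emb + pos_emb
        cls = self.cls.expand(tokens.size(0), -1, -1)
        out = self.encoder(torch.cat([cls, tokens], dim=1))
        return out[:, 0]  # population-level code read off the summary token


# Usage: aggregate 60 electrode embeddings from one subject (shapes invented).
emb = torch.randn(2, 60, 256)   # from a frozen pretrained channel encoder
pos = torch.randn(2, 60, 256)   # projected electrode coordinates
pop_code = PopulationAggregator()(emb, pos)  # shape: (2, 256)
```

Because self-attention is agnostic to sequence length, the same weights apply to any electrode count, which is one way a stacked aggregator of this kind can cope with the sparse and variable channel distributions the abstract highlights.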

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2406.03044
Document Type: Working Paper