
Enhancing SNN-based spatio-temporal learning: A benchmark dataset and Cross-Modality Attention model.

Authors :
Zhou S
Yang B
Yuan M
Jiang R
Yan R
Pan G
Tang H
Source :
Neural networks : the official journal of the International Neural Network Society [Neural Netw] 2024 Dec; Vol. 180, pp. 106677. Date of Electronic Publication: 2024 Sep 03.
Publication Year :
2024

Abstract

Spiking Neural Networks (SNNs), renowned for their low power consumption, brain-inspired architecture, and spatio-temporal representation capabilities, have garnered considerable attention in recent years. As with Artificial Neural Networks (ANNs), high-quality benchmark datasets are of great importance to the advancement of SNNs. However, our analysis indicates that many prevalent neuromorphic datasets lack strong temporal correlation, preventing SNNs from fully exploiting their spatio-temporal representation capabilities. Meanwhile, the integration of event and frame modalities offers more comprehensive visual spatio-temporal information, yet SNN-based cross-modality fusion remains underexplored. In this work, we present a neuromorphic dataset called DVS-SLR that better exploits the inherent spatio-temporal properties of SNNs. Compared to existing datasets, it offers higher temporal correlation, larger scale, and more varied scenarios. In addition, the dataset contains corresponding frame data, which can be used for developing SNN-based fusion methods. Leveraging the dual-modal nature of the dataset, we propose a Cross-Modality Attention (CMA) based fusion method. The CMA model efficiently exploits the unique advantages of each modality, allowing SNNs to learn both temporal and spatial attention scores from the spatio-temporal features of the event and frame modalities and to allocate these scores across modalities to enhance their synergy. Experimental results demonstrate that our method not only improves recognition accuracy but also ensures robustness across diverse scenarios.

Competing Interests: Declaration of competing interest. The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

(Copyright © 2024 Elsevier Ltd. All rights reserved.)
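To make the fusion idea described in the abstract concrete, the following is a minimal, hypothetical sketch of cross-modality attention in PyTorch. It is not the authors' CMA implementation: the class name, the [T, B, C, H, W] feature layout, and the particular temporal/spatial score functions are illustrative assumptions. It only captures the abstract's stated pattern of learning temporal and spatial attention scores from each modality's spatio-temporal features and applying those scores across modalities.

```python
# Hypothetical sketch of cross-modality attention fusion (NOT the paper's CMA code).
# Assumed feature layout: [T, B, C, H, W] = (time steps, batch, channels, height, width).
import torch
import torch.nn as nn

class CrossModalityAttention(nn.Module):
    def __init__(self, channels: int):
        super().__init__()
        # Temporal attention: one score per time step, from spatially pooled features.
        self.temporal_fc = nn.Linear(channels, 1)
        # Spatial attention: one score per pixel, from a 1x1 convolution.
        # For brevity these modules are shared between modalities.
        self.spatial_conv = nn.Conv2d(channels, 1, kernel_size=1)

    def _temporal_scores(self, x: torch.Tensor) -> torch.Tensor:
        # x: [T, B, C, H, W] -> scores broadcastable as [T, B, 1, 1, 1]
        pooled = x.mean(dim=(3, 4))                      # [T, B, C]
        s = torch.sigmoid(self.temporal_fc(pooled))      # [T, B, 1]
        return s.unsqueeze(-1).unsqueeze(-1)             # [T, B, 1, 1, 1]

    def _spatial_scores(self, x: torch.Tensor) -> torch.Tensor:
        # x: [T, B, C, H, W] -> scores broadcastable as [T, B, 1, H, W]
        t, b, c, h, w = x.shape
        s = self.spatial_conv(x.reshape(t * b, c, h, w))  # [T*B, 1, H, W]
        return torch.sigmoid(s).reshape(t, b, 1, h, w)

    def forward(self, event_feat: torch.Tensor, frame_feat: torch.Tensor) -> torch.Tensor:
        # Cross-modality allocation: scores learned from one modality
        # modulate the features of the other, then the results are fused.
        event_out = event_feat * self._temporal_scores(frame_feat) * self._spatial_scores(frame_feat)
        frame_out = frame_feat * self._temporal_scores(event_feat) * self._spatial_scores(event_feat)
        return event_out + frame_out                     # fused [T, B, C, H, W] features

# Example usage with toy tensors:
# cma = CrossModalityAttention(channels=64)
# ev, fr = torch.rand(8, 2, 64, 32, 32), torch.rand(8, 2, 64, 32, 32)
# fused = cma(ev, fr)  # -> [8, 2, 64, 32, 32]
```

Using sigmoid gates rather than a softmax over time or space is one of several plausible design choices here; the key point this sketch illustrates is the cross-allocation of attention scores between the event and frame streams.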

Details

Language :
English
ISSN :
1879-2782
Volume :
180
Database :
MEDLINE
Journal :
Neural networks : the official journal of the International Neural Network Society
Publication Type :
Academic Journal
Accession number :
39260008
Full Text :
https://doi.org/10.1016/j.neunet.2024.106677