S7: Selective and Simplified State Space Layers for Sequence Modeling
- Authors
- Soydan, Taylan, Zubić, Nikola, Messikommer, Nico, Mishra, Siddhartha, and Scaramuzza, Davide
- Subjects
Computer Science - Machine Learning, Electrical Engineering and Systems Science - Signal Processing, Mathematics - Dynamical Systems
- Abstract
A central challenge in sequence modeling is efficiently handling tasks with extended contexts. While recent state-space models (SSMs) have made significant progress in this area, they often lack input-dependent filtering or require substantial increases in model complexity to handle input variability. We address this gap by introducing S7, a simplified yet powerful SSM that can handle input dependence while incorporating stable reparameterization and specific design choices to dynamically adjust state transitions based on input content, maintaining efficiency and performance. We prove that this reparameterization ensures stability in long-sequence modeling by keeping state transitions well-behaved over time. Additionally, it controls the gradient norm, enabling efficient training and preventing issues like exploding or vanishing gradients. S7 significantly outperforms baselines across various sequence modeling tasks, including neuromorphic event-based datasets, Long Range Arena benchmarks, and various physical and biological time series. Overall, S7 offers a more straightforward approach to sequence modeling without relying on complex, domain-specific inductive biases, achieving significant improvements across key benchmarks.
- Comment
- 23 pages, 3 figures, 11 tables. Equal contribution by Taylan Soydan and Nikola Zubić.
- Published
- 2024
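The abstract describes an SSM whose state transitions are adjusted dynamically from the input, with a reparameterization that keeps the recurrence stable. A minimal NumPy sketch of that general idea is below; this is not the paper's actual S7 layer, and all names (`s7_like_scan`, `Wa`, `Wb`, `C`) and the sigmoid-based stability bound are illustrative assumptions, standing in for the paper's stable reparameterization.

```python
import numpy as np

def s7_like_scan(u, Wa, Wb, C, eps=1e-3):
    """Illustrative input-dependent diagonal SSM scan (not the paper's S7).

    At each step the diagonal transition a_t is computed from the input
    u_t and squashed into (0, 1 - eps), so every transition magnitude
    stays strictly below 1 and the recurrence cannot blow up over long
    sequences -- a crude stand-in for stable reparameterization.
    """
    T, _ = u.shape
    x = np.zeros(Wa.shape[0])          # hidden state
    ys = []
    for t in range(T):
        # Input-dependent transition, bounded away from 1 for stability.
        a_t = (1.0 - eps) / (1.0 + np.exp(-(Wa @ u[t])))
        b_t = Wb @ u[t]                # input-dependent drive into the state
        x = a_t * x + b_t              # selective state update
        ys.append(C @ x)               # linear readout
    return np.stack(ys)
```

Because `a_t` depends on `u[t]`, the layer can gate how much past state is retained at each step (input-dependent filtering), while the sigmoid bound keeps gradients from exploding through the product of transitions.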