
User-friendly Foundation Model Adapters for Multivariate Time Series Classification

Authors:
Feofanov, Vasilii
Ilbert, Romain
Tiomoko, Malik
Palpanas, Themis
Redko, Ievgen
Publication Year:
2024

Abstract

Foundation models, while highly effective, are often resource-intensive, requiring substantial inference time and memory. This paper addresses the challenge of making these models more accessible with limited computational resources by exploring dimensionality reduction techniques. Our goal is to enable users to run large pre-trained foundation models on standard GPUs without sacrificing performance. We investigate classical methods such as Principal Component Analysis alongside neural network-based adapters, aiming to reduce the dimensionality of multivariate time series data while preserving key features. Our experiments show up to a 10x speedup compared to the baseline model, without performance degradation, and enable up to 4.5x more datasets to fit on a single GPU, paving the way for more user-friendly and scalable foundation models.

Comment: The first two authors contributed equally
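To illustrate the kind of dimensionality reduction the abstract describes, below is a minimal sketch of channel-wise PCA applied to a multivariate time series before it is fed to a pre-trained model. The array shapes, function name `reduce_channels`, and the choice of reducing 32 channels to 8 are illustrative assumptions, not the authors' exact pipeline.

```python
# Hypothetical sketch: compress the channel dimension of a multivariate
# time series with PCA so a smaller input reaches the foundation model.
import numpy as np
from sklearn.decomposition import PCA

def reduce_channels(x: np.ndarray, n_components: int) -> np.ndarray:
    """Reduce a series of shape (n_channels, seq_len) to
    (n_components, seq_len) via PCA over the channel axis."""
    # Each timestep is treated as one observation with n_channels features.
    pca = PCA(n_components=n_components)
    reduced = pca.fit_transform(x.T)  # (seq_len, n_components)
    return reduced.T                  # (n_components, seq_len)

# Example: a 32-channel series of length 512 compressed to 8 channels.
x = np.random.randn(32, 512)
z = reduce_channels(x, n_components=8)
print(z.shape)  # (8, 512)
```

A learned alternative, per the abstract, would replace the PCA step with a small neural adapter trained to produce the low-dimensional representation; the downstream foundation model then operates on fewer channels, which is where the reported speedup and memory savings come from.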

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2409.12264
Document Type:
Working Paper