
A Study of Quantisation-aware Training on Time Series Transformer Models for Resource-constrained FPGAs

Authors:
Ling, Tianheng
Qian, Chao
Einhaus, Lukas
Schiele, Gregor
Publication Year:
2023

Abstract

This study explores quantisation-aware training (QAT) for time series Transformer models. We propose a novel adaptive quantisation scheme that dynamically selects between symmetric and asymmetric schemes during the QAT phase. Our approach demonstrates that matching the quantisation scheme to the real data distribution can reduce computational overhead while maintaining acceptable precision. Moreover, our approach remains robust when applied to real-world data and mixed-precision quantisation, where most objects are quantised to 4 bits. Our findings inform model quantisation and deployment decisions while providing a foundation for advancing quantisation techniques.

Comment: 12 pages, 1 figure
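To make the scheme-selection idea concrete, below is a minimal NumPy sketch of choosing between symmetric and asymmetric quantisation based on the observed data range, in the spirit of the adaptive scheme the abstract describes. The selection threshold, function names, and the 4-bit default are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def choose_quant_params(x, num_bits=4):
    """Pick symmetric or asymmetric quantisation parameters for tensor x.

    Heuristic (an assumption for illustration): a roughly zero-centred
    range favours the symmetric scheme, whose zero_point of 0 removes
    zero-point arithmetic at inference; a skewed range favours the
    asymmetric scheme, which uses the full integer range.
    """
    x_min, x_max = float(x.min()), float(x.max())
    span = max(x_max - x_min, 1e-8)           # guard against constant tensors
    qmax_s = 2 ** (num_bits - 1) - 1          # e.g. +7 for signed 4-bit

    if abs(x_max + x_min) <= 0.1 * span:      # range roughly centred on zero
        scale = max(abs(x_min), abs(x_max), 1e-8) / qmax_s
        return "symmetric", scale, 0
    qmax_u = 2 ** num_bits - 1                # e.g. 15 for unsigned 4-bit
    scale = span / qmax_u
    zero_point = int(round(-x_min / scale))
    return "asymmetric", scale, zero_point

def fake_quantise(x, scale, zero_point, num_bits=4, symmetric=True):
    """Quantise-dequantise pass of the kind used inside QAT forward steps."""
    if symmetric:
        qmin, qmax = -(2 ** (num_bits - 1)), 2 ** (num_bits - 1) - 1
    else:
        qmin, qmax = 0, 2 ** num_bits - 1
    q = np.clip(np.round(x / scale) + zero_point, qmin, qmax)
    return (q - zero_point) * scale

# Example: zero-centred weights tend to pick symmetric; non-negative
# activations tend to pick asymmetric.
weights = np.random.randn(256) * 0.05
activations = np.abs(np.random.randn(256))
print(choose_quant_params(weights)[0])       # typically "symmetric"
print(choose_quant_params(activations)[0])   # typically "asymmetric"
```

In a QAT loop, a function like `fake_quantise` would wrap weights and activations in the forward pass so the model learns to tolerate the chosen low-bit scheme.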

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2310.02654
Document Type:
Working Paper