
DAISY: Data Adaptive Self-Supervised Early Exit for Speech Representation Models

Authors:
Lin, Tzu-Quan
Lee, Hung-yi
Tang, Hao
Publication Year:
2024

Abstract

Self-supervised speech models have been shown to be useful for various tasks, but their large size limits their use on devices with low computing power and memory. In this work, we explore early exit, an approach for reducing latency by exiting the forward pass of a network early. Most early exit approaches require a separate early exit model for each task, and some even require fine-tuning the entire pretrained model. We introduce Data Adaptive Self-Supervised Early Exit (DAISY), an approach that decides when to exit based on the self-supervised loss, eliminating the need for multiple rounds of training and fine-tuning. DAISY matches the performance of HuBERT on the MiniSUPERB benchmark but with much faster inference times. Our analysis of the adaptivity of DAISY shows that the model exits early (using fewer layers) on clean data and exits late (using more layers) on noisy data, dynamically adjusting the computational cost of inference based on the noise level of each sample.

Comment: Accepted by Interspeech 2024
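The abstract describes deciding when to exit based on the self-supervised loss. Below is a minimal sketch of that general idea, not the paper's actual method: it assumes a per-layer head that estimates a self-supervised loss proxy and a fixed exit threshold. All names, layer counts, dimensions, and the threshold value are illustrative.

```python
import torch
import torch.nn as nn

class EarlyExitEncoder(nn.Module):
    """Toy loss-based early-exit encoder (illustrative, not DAISY's code).

    Assumption: each layer has an auxiliary head that predicts a proxy for
    the self-supervised loss at that depth; we exit as soon as the proxy
    falls below `threshold`.
    """

    def __init__(self, dim=64, num_layers=12, threshold=0.1):
        super().__init__()
        self.layers = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_layers)])
        self.loss_heads = nn.ModuleList([nn.Linear(dim, 1) for _ in range(num_layers)])
        self.threshold = threshold  # hypothetical exit criterion

    def forward(self, x):
        for i, (layer, head) in enumerate(zip(self.layers, self.loss_heads)):
            x = torch.relu(layer(x))
            # Proxy for the self-supervised loss at this depth, averaged
            # over the batch and time axes; a low value suggests the input
            # is "easy" (e.g., clean speech), so later layers are skipped.
            proxy = head(x).mean()
            if proxy.item() < self.threshold:
                return x, i + 1  # exit early; report number of layers used
        return x, len(self.layers)

# Usage: under this scheme, clean inputs would tend to exit earlier
# (fewer layers) than noisy ones, matching the adaptivity the paper reports.
model = EarlyExitEncoder()
feats = torch.randn(1, 100, 64)  # (batch, time, dim) dummy features
out, layers_used = model(feats)
print(f"exited after {layers_used} layers")
```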

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2406.05464
Document Type:
Working Paper