SHiFT: An Efficient, Flexible Search Engine for Transfer Learning

Authors :
Renggli, Cedric
Yao, Xiaozhe
Kolar, Luka
Rimanic, Luka
Klimovic, Ana
Zhang, Ce
Publication Year :
2022

Abstract

Transfer learning can be seen as a data- and compute-efficient alternative to training models from scratch. The emergence of rich model repositories, such as TensorFlow Hub, enables practitioners and researchers to unleash the potential of these models across a wide range of downstream tasks. As these repositories keep growing exponentially, efficiently selecting a good model for the task at hand becomes paramount. By carefully comparing various selection and search strategies, we realize that no single method outperforms the others, and hybrid or mixed strategies can be beneficial. Therefore, we propose SHiFT, the first downstream task-aware, flexible, and efficient model search engine for transfer learning. These properties are enabled by a custom query language SHiFT-QL together with a cost-based decision maker, which we empirically validate. Motivated by the iterative nature of machine learning development, we further support efficient incremental executions of our queries, which requires a careful implementation when jointly used with our optimizations.
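The abstract describes task-aware model selection under a cost-based decision maker. As a rough illustration of that idea (not SHiFT's actual API; the `Candidate` fields and `select_model` helper below are hypothetical), one can imagine filtering a model repository by a resource budget and ranking the remaining candidates by a cheap, downstream-task-aware proxy score such as linear-probe accuracy:

```python
# Hypothetical sketch of task-aware model selection; names and scores
# are illustrative assumptions, not SHiFT's real interface or data.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    download_cost: float  # e.g. model size in GB
    proxy_score: float    # e.g. linear-probe accuracy on the downstream task

def select_model(candidates, max_cost):
    # Filter by a resource budget, then rank by the task-aware proxy score.
    feasible = [c for c in candidates if c.download_cost <= max_cost]
    return max(feasible, key=lambda c: c.proxy_score, default=None)

zoo = [
    Candidate("resnet50", 0.10, 0.82),
    Candidate("vit-large", 1.20, 0.91),
    Candidate("mobilenet", 0.02, 0.74),
]
best = select_model(zoo, max_cost=0.5)  # vit-large exceeds the budget
```

A hybrid strategy, as the abstract suggests, would combine several such proxy scores or fall back to partial fine-tuning when the cheap proxies disagree.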

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1333761770
Document Type :
Electronic Resource