
To Share or not to Share: Predicting Sets of Sources for Model Transfer Learning

Authors:
Lange, Lukas
Strötgen, Jannik
Adel, Heike
Klakow, Dietrich
Publication Year:
2021

Abstract

In low-resource settings, model transfer can help to overcome a lack of labeled data for many tasks and domains. However, predicting useful transfer sources is a challenging problem, as even the most similar sources might lead to unexpected negative transfer results. Thus, ranking methods based on task and text similarity -- as suggested in prior work -- may not be sufficient to identify promising sources. To tackle this problem, we propose a new approach to automatically determine which and how many sources should be exploited. For this, we study the effects of model transfer on sequence labeling across various domains and tasks and show that our methods based on model similarity and support vector machines are able to predict promising sources, resulting in performance increases of up to 24 F1 points.

Comment: Accepted at EMNLP 2021
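To make the idea of similarity-based source selection concrete, here is a minimal, hypothetical sketch: represent each candidate source model and the target by a feature vector (e.g. flattened model parameters or embeddings), rank sources by cosine similarity to the target, and keep every source above a threshold. This is an illustration only, not the paper's actual method; the function names, the feature representation, and the threshold are all assumptions.

```python
import math


def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)


def select_sources(target_vec, source_vecs, threshold=0.5):
    """Rank candidate sources by similarity to the target model and
    return the names of all sources whose similarity clears the
    threshold, most similar first. (Hypothetical sketch; the paper's
    approach additionally uses an SVM to decide which and how many
    sources to keep.)"""
    scored = sorted(
        ((name, cosine(target_vec, vec)) for name, vec in source_vecs.items()),
        key=lambda item: item[1],
        reverse=True,
    )
    return [name for name, score in scored if score >= threshold]


# Illustrative usage with made-up domain names and vectors:
target = [1.0, 0.0, 1.0]
sources = {
    "news": [0.9, 0.1, 1.1],   # close to the target
    "bio": [0.0, 1.0, 0.0],    # orthogonal, likely a poor source
    "legal": [1.0, 0.2, 0.8],  # also close to the target
}
print(select_sources(target, sources))  # ['news', 'legal']
```

A simple threshold on one similarity score is of course the crudest possible selector; the abstract's point is precisely that single similarity rankings can miss negative transfer, which is why a learned classifier over richer features is used instead.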

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2104.08078
Document Type:
Working Paper