
Don't Wait, Just Weight: Improving Unsupervised Representations by Learning Goal-Driven Instance Weights

Authors:
Ericsson, Linus
Gouk, Henry
Hospedales, Timothy M.
Publication Year:
2020

Abstract

In the absence of large labelled datasets, self-supervised learning techniques can boost performance by learning useful representations from unlabelled data, which is often more readily available. However, there is often a domain shift between the unlabelled collection and the data of the downstream target problem. We show that by learning Bayesian instance weights for the unlabelled data, we can improve downstream classification accuracy by prioritising the most useful instances. Additionally, we show that training time can be reduced by discarding unnecessary datapoints. Our method, BetaDataWeighter, is evaluated using the popular self-supervised rotation prediction task on STL-10 and Visual Decathlon. We compare against related instance-weighting schemes, both hand-designed heuristics and meta-learning methods, as well as conventional self-supervised learning. BetaDataWeighter achieves both the highest average accuracy and the highest rank across datasets, and on STL-10 it prunes up to 78% of unlabelled images without significant loss in accuracy, corresponding to over 50% reduction in training time.
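
The abstract itself contains no code. Purely as an illustration of the kind of instance weighting it describes, the PyTorch sketch below attaches a learnable Beta-distributed weight to each unlabelled image and uses sampled weights to rescale a rotation-prediction loss; images with low expected weight are candidates for pruning. All class and function names (BetaInstanceWeights, weighted_rotation_loss) are hypothetical, the threshold is arbitrary, and the sketch omits the goal-driven (meta-learned) objective the paper uses to fit these weights to downstream performance.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class BetaInstanceWeights(nn.Module):
        # One learnable Beta(alpha_i, beta_i) weight distribution per unlabelled instance.
        def __init__(self, num_instances):
            super().__init__()
            # Softplus keeps the Beta parameters strictly positive.
            self.raw_alpha = nn.Parameter(torch.zeros(num_instances))
            self.raw_beta = nn.Parameter(torch.zeros(num_instances))

        def forward(self, idx):
            alpha = F.softplus(self.raw_alpha[idx]) + 1e-4
            beta = F.softplus(self.raw_beta[idx]) + 1e-4
            dist = torch.distributions.Beta(alpha, beta)
            # rsample() gives reparameterised samples, so gradients reach alpha and beta.
            return dist.rsample(), dist.mean

    def weighted_rotation_loss(model, weighter, images, idx):
        # Self-supervised rotation prediction: apply one of four rotations to each
        # image and train the model to classify which rotation was used, with each
        # image's loss rescaled by its sampled instance weight.
        ks = torch.randint(0, 4, (images.size(0),), device=images.device)
        rotated = torch.stack([torch.rot90(img, int(k), dims=(1, 2))
                               for img, k in zip(images, ks)])
        logits = model(rotated)                               # (B, 4) rotation logits
        per_image = F.cross_entropy(logits, ks, reduction="none")
        weights, expected_weights = weighter(idx)
        return (weights * per_image).mean(), expected_weights

    # Pruning sketch: after some training, discard images whose expected weight is
    # low, e.g. keep_mask = expected_weights > 0.1, and continue on the reduced set.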

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2006.12360
Document Type:
Working Paper