
An Information-Theoretic Approach to Semi-supervised Transfer Learning

Authors :
Jakubovitz, Daniel
Uliel, David
Rodrigues, Miguel
Giryes, Raja
Publication Year :
2023

Abstract

Transfer learning is a valuable tool in deep learning as it allows propagating information from one "source dataset" to another "target dataset", especially in the case of a small number of training examples in the latter. Yet, discrepancies between the underlying distributions of the source and target data are commonplace and are known to have a substantial impact on algorithm performance. In this work we suggest novel information-theoretic approaches for the analysis of the performance of deep neural networks in the context of transfer learning. We focus on the task of semi-supervised transfer learning, in which unlabeled samples from the target dataset are available during network training on the source dataset. Our theory suggests that one may improve the transferability of a deep neural network by incorporating regularization terms on the target data based on information-theoretic quantities, namely the Mutual Information and the Lautum Information. We demonstrate the effectiveness of the proposed approaches in various semi-supervised transfer learning experiments.

Comment: arXiv admin note: substantial text overlap with arXiv:1904.01670
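For reference, the two quantities named in the abstract have standard definitions; they are not stated in this listing, and the abstract does not specify how the corresponding regularizers are estimated in practice. With X and Y denoting generic random variables with joint density p(x,y) and marginals p(x), p(y):

    % Mutual Information: KL divergence from the joint to the product of marginals
    I(X;Y) = D_{\mathrm{KL}}\big( p(x,y) \,\|\, p(x)p(y) \big)
           = \mathbb{E}_{p(x,y)}\!\left[ \log \frac{p(x,y)}{p(x)\,p(y)} \right]

    % Lautum Information (Palomar & Verdu): the same divergence with its arguments swapped
    L(X;Y) = D_{\mathrm{KL}}\big( p(x)p(y) \,\|\, p(x,y) \big)
           = \mathbb{E}_{p(x)p(y)}\!\left[ \log \frac{p(x)\,p(y)}{p(x,y)} \right]

The Lautum Information is thus the "reverse" KL counterpart of the Mutual Information; both vanish exactly when X and Y are independent.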

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2306.06731
Document Type :
Working Paper