
DT-LET: Deep Transfer Learning by Exploring where to Transfer

Authors:
Lin, Jianzhe
Wang, Qi
Ward, Rabab
Wang, Z. Jane
Publication Year:
2018
Publisher:
arXiv, 2018.

Abstract

Previous transfer learning methods based on deep networks assume that knowledge should be transferred between the same hidden layers of the source and target domains. This assumption does not always hold, especially when the data from the two domains are heterogeneous with different resolutions. In such cases, the most suitable numbers of layers for the source domain data and the target domain data would differ, and as a result the high-level knowledge from the source domain would be transferred to the wrong layer of the target domain. Based on this observation, "where to transfer", as proposed in this paper, should be a novel research frontier. We propose a new mathematical model named DT-LET to solve this heterogeneous transfer learning problem. To select the best matching of layers for transferring knowledge, we define a specific loss function to estimate the correspondence between high-level features of data in the source domain and the target domain. To verify the proposed cross-layer model, experiments on two cross-domain recognition/classification tasks are conducted, and the superior results achieved demonstrate the necessity of layer correspondence searching.

Comment: Conference paper submitted to AAAI 2019
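The core idea described in the abstract, searching over source/target layer pairs with an alignment loss to decide "where to transfer", can be illustrated with a minimal sketch. The paper's actual DT-LET loss is not reproduced here; the mean-feature distance and the dimension truncation below are hypothetical stand-ins (a learned projection would normally handle the mismatched layer widths), and all function names are illustrative rather than the authors' implementation.

```python
import torch

def alignment_loss(fs: torch.Tensor, ft: torch.Tensor) -> torch.Tensor:
    """Hypothetical alignment score between two batches of layer features:
    squared distance between their mean high-level feature vectors.
    (A stand-in for the paper's layer-correspondence loss.)"""
    return ((fs.mean(dim=0) - ft.mean(dim=0)) ** 2).sum()

def best_layer_pair(source_feats, target_feats):
    """Search all (source layer, target layer) pairs and return the pair
    whose features are easiest to align -- the 'where to transfer' choice."""
    best_loss, best_pair = float("inf"), None
    for i, fs in enumerate(source_feats):
        for j, ft in enumerate(target_feats):
            # Naive placeholder for a learned projection: compare only the
            # first d shared dimensions when layer widths differ.
            d = min(fs.shape[1], ft.shape[1])
            loss = alignment_loss(fs[:, :d], ft[:, :d]).item()
            if loss < best_loss:
                best_loss, best_pair = loss, (i, j)
    return best_pair, best_loss

# Example with random activations from 3 source layers and 2 target layers.
source_feats = [torch.randn(32, d) for d in (256, 128, 64)]
target_feats = [torch.randn(32, d) for d in (96, 48)]
print(best_layer_pair(source_feats, target_feats))
```

In practice the layer-matching criterion would be optimized jointly with the networks rather than evaluated once on fixed activations; the sketch only shows the pairwise search over layer correspondences.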

Details

Database:
OpenAIRE
Accession number:
edsair.doi.dedup.....8f65d56aaa08050ed40be4580e23de7c
Full Text:
https://doi.org/10.48550/arxiv.1809.08541