
A prototype-oriented framework for deep transfer learning applications

Authors :
Tanwisuth, Korawat
0009-0003-5875-5414
Publication Year :
2023
Publisher :
The University of Texas at Austin, 2023.

Abstract

Deep learning models achieve state-of-the-art performance in many applications but often require large-scale data. Deep transfer learning studies the ability of deep learning models to transfer knowledge from source tasks to related target tasks, enabling data-efficient learning. This dissertation develops novel methodologies that tackle three different transfer learning applications for deep learning models: unsupervised domain adaptation, unsupervised fine-tuning, and source-private clustering. The key idea behind the proposed methods is to minimize the distributional discrepancy between class prototypes and target data via an optimal transport framework. For each scenario, we design our algorithms to suit different data and model requirements. In unsupervised domain adaptation, we leverage the source domain data to construct class prototypes and minimize the transport cost between the prototypes and target data. In unsupervised fine-tuning, we apply our framework to prompt-based zero-shot learning to adapt large pre-trained models directly on the target data, bypassing the source data requirement. In source-private clustering, we incorporate a knowledge distillation framework with our prototype-oriented clustering to address the problem of data and model privacy. All three approaches show consistent performance gains over the baselines.
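To make the central idea concrete, the following is a minimal sketch (not the dissertation's actual implementation) of a prototype-to-target transport cost: each target feature is softly assigned to class prototypes, and the expected assignment cost serves as the discrepancy to minimize. The cosine-distance cost, the softmax assignment, and the `temperature` parameter are all illustrative assumptions.

```python
import numpy as np

def transport_cost(prototypes, targets, temperature=1.0):
    """Expected cost of transporting target features onto class prototypes.

    Hypothetical sketch: cosine distance as the cost, a per-point softmax
    over prototypes as the (row-wise) transport plan.
    """
    # Normalize rows to unit length so dot products are cosine similarities.
    P = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    X = targets / np.linalg.norm(targets, axis=1, keepdims=True)
    sim = X @ P.T                        # (n_targets, n_classes) similarities
    cost = 1.0 - sim                     # cosine distance as transport cost
    # Soft assignment of each target point to the prototypes.
    logits = sim / temperature
    plan = np.exp(logits - logits.max(axis=1, keepdims=True))
    plan /= plan.sum(axis=1, keepdims=True)
    # Expected transport cost, averaged over target points.
    return float((plan * cost).sum(axis=1).mean())
```

In a training loop, this scalar would be backpropagated (in an autodiff framework) through the target feature extractor, pulling target features toward their nearest class prototypes; target features aligned with a prototype yield a lower cost than features far from every prototype.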

Details

Language :
English
Database :
OpenAIRE
Accession number :
edsair.doi...........eae7c5c390527811681762d640685bc5
Full Text :
https://doi.org/10.26153/tsw/47308