1. Learning Target-Domain-Specific Classifier for Partial Domain Adaptation
- Author
- Peiyi Yang, Chuan-Xian Ren, Shuicheng Yan, and Pengfei Ge
- Subjects
- FOS: Computer and information sciences, Computer Science - Computer Vision and Pattern Recognition (cs.CV), Domain adaptation, Feature extraction, Pattern recognition, Outlier, Classifier, Task analysis, Artificial intelligence, Computer Networks and Communications, Computer Science Applications, Software
- Abstract
Unsupervised domain adaptation (UDA) aims at reducing the distribution discrepancy when transferring knowledge from a labeled source domain to an unlabeled target domain. Previous UDA methods assume that the source and target domains share an identical label space, which is unrealistic in practice because the label information of the target domain is unknown. This article focuses on a more realistic UDA scenario, i.e., partial domain adaptation (PDA), where the target label space is a subset of the source label space. In the PDA scenario, source outlier classes that are absent from the target domain may be wrongly matched to the target domain (a phenomenon known as negative transfer), leading to performance degradation of UDA methods. This article proposes a novel target-domain-specific classifier learning-based domain adaptation (TSCDA) method. TSCDA presents a soft-weighed maximum mean discrepancy criterion to partially align feature distributions and alleviate negative transfer. In addition, it learns a target-specific classifier for the target domain with pseudolabels and multiple auxiliary classifiers to further address the classifier shift. A module named peers-assisted learning minimizes the prediction difference between multiple target-specific classifiers, which makes the classifiers more discriminative for the target domain. Extensive experiments conducted on three PDA benchmark data sets show that TSCDA outperforms other state-of-the-art methods by a large margin, e.g., by 4% and 5.6% on average on Office-31 and Office-Home, respectively.
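As a rough illustration of the two mechanisms the abstract describes, the sketch below is a minimal NumPy version of (i) a soft-weighted MMD, where per-class weights estimated from averaged target predictions down-weight source-only (outlier) classes, and (ii) a peers-assisted discrepancy term that penalizes disagreement between two classifiers' target predictions. All function names, the normalization choices, and the kernel bandwidth are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch of soft-weighted MMD and peer-discrepancy terms (not the TSCDA code).
import numpy as np


def gaussian_kernel(a, b, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def class_weights(target_probs):
    """Soft class weights: average target prediction per class, max-normalized.
    Source-only classes receive small weights because target samples rarely
    predict them."""
    w = target_probs.mean(axis=0)
    return w / (w.max() + 1e-12)


def soft_weighted_mmd(src_feat, src_labels, tgt_feat, target_probs, sigma=1.0):
    """Weighted MMD^2: each source sample is weighted by its class weight,
    so outlier-class samples contribute little to the alignment term."""
    w = class_weights(target_probs)[src_labels]    # per-sample weight
    w = w / (w.sum() + 1e-12)                      # normalize to sum to 1
    v = np.full(len(tgt_feat), 1.0 / len(tgt_feat))
    k_ss = gaussian_kernel(src_feat, src_feat, sigma)
    k_tt = gaussian_kernel(tgt_feat, tgt_feat, sigma)
    k_st = gaussian_kernel(src_feat, tgt_feat, sigma)
    return w @ k_ss @ w + v @ k_tt @ v - 2.0 * (w @ k_st @ v)


def peer_discrepancy(probs_a, probs_b):
    """Mean absolute difference between two classifiers' target predictions."""
    return np.abs(probs_a - probs_b).mean()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.normal(size=(32, 8))
    tgt = rng.normal(size=(24, 8))
    labels = rng.integers(0, 10, size=32)          # 10 source classes
    probs = rng.dirichlet(np.ones(10), size=24)    # target predictions
    print(soft_weighted_mmd(src, labels, tgt, probs))
    print(peer_discrepancy(probs, rng.dirichlet(np.ones(10), size=24)))
```

In a training loop, terms like these would typically be added to the source classification loss; the weighting step is what distinguishes this partial-alignment objective from a standard (unweighted) MMD.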
- Published
- 2020