Visually Source-Free Domain Adaptation via Adversarial Style Matching.
- Source :
- IEEE transactions on image processing : a publication of the IEEE Signal Processing Society [IEEE Trans Image Process] 2024; Vol. 33, pp. 1032-1044. Date of Electronic Publication: 2024 Jan 30.
- Publication Year :
- 2024
Abstract
- The majority of existing works explore Unsupervised Domain Adaptation (UDA) under the ideal assumption that samples from both domains are available and complete. In real-world applications, however, this assumption does not always hold. For instance, as data privacy becomes a growing concern, source domain samples may not be publicly available for training, leading to the typical Source-Free Domain Adaptation (SFDA) problem. Traditional UDA methods fail to handle SFDA because two challenges stand in the way: the data incompleteness issue and the domain gap issue. In this paper, we propose a visual SFDA method named Adversarial Style Matching (ASM) to address both issues. Specifically, we first train a style generator to generate source-style samples given the target images, solving the data incompleteness issue. We use the auxiliary information stored in the pre-trained source model to ensure that the generated samples are statistically aligned with the source samples, and use pseudo labels to maintain semantic consistency. Then, we feed the target domain samples and the corresponding source-style samples into a feature generator network to reduce the domain gap with a self-supervised loss. An adversarial scheme is employed to further expand the distributional coverage of the generated source-style samples. The experimental results verify that our method achieves performance comparable even to traditional UDA methods that use source samples for training.
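The statistical-alignment step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes the "auxiliary information stored in the pre-trained source model" takes the common form of per-channel feature statistics (e.g., batch-norm running mean and variance), and all names below are hypothetical:

```python
import numpy as np

def stat_alignment_loss(gen_feats, src_mean, src_var):
    """Penalize mismatch between the per-channel statistics of features
    from generated source-style samples and the statistics stored in the
    pre-trained source model (e.g., batch-norm running mean/variance)."""
    mu = gen_feats.mean(axis=0)   # per-channel mean of the generated batch
    var = gen_feats.var(axis=0)   # per-channel variance of the generated batch
    return float(np.mean((mu - src_mean) ** 2) +
                 np.mean((var - src_var) ** 2))

# Toy check: features drawn from the stored statistics yield a small loss,
# while distribution-shifted features yield a large one.
rng = np.random.default_rng(0)
src_mean = np.array([0.0, 1.0, -0.5])
src_var = np.array([1.0, 0.5, 2.0])
matched = rng.normal(src_mean, np.sqrt(src_var), size=(4096, 3))
shifted = rng.normal(src_mean + 3.0, np.sqrt(src_var), size=(4096, 3))
loss_matched = stat_alignment_loss(matched, src_mean, src_var)
loss_shifted = stat_alignment_loss(shifted, src_mean, src_var)
```

Minimizing such a loss with respect to the style generator's parameters would push the generated samples toward the source feature distribution without ever touching source data, which is the constraint that defines the SFDA setting.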
Details
- Language :
- English
- ISSN :
- 1941-0042
- Volume :
- 33
- Database :
- MEDLINE
- Journal :
- IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
- Publication Type :
- Academic Journal
- Accession number :
- 38241118
- Full Text :
- https://doi.org/10.1109/TIP.2024.3353539