Generalizing to Unseen Domains: A Survey on Domain Generalization
- Source :
- IEEE Transactions on Knowledge and Data Engineering, August 2023, Vol. 35, Issue 8, pp. 8052-8072 (21 pages)
- Publication Year :
- 2023
Abstract
- Machine learning systems generally assume that the training and testing distributions are the same; in practice this assumption often fails, so a key requirement is to develop models that can generalize to unseen distributions. Domain generalization (DG), i.e., out-of-distribution generalization, has attracted increasing interest in recent years. Domain generalization deals with a challenging setting where one or several different but related source domains are given, and the goal is to learn a model that generalizes to an unseen test domain. Great progress has been made in this area in recent years, and this paper presents the first review of these advances. First, we provide a formal definition of domain generalization and discuss several related fields. Second, we thoroughly review the theories related to domain generalization and carefully analyze the theory behind generalization, and we categorize recent algorithms into three classes: data manipulation, representation learning, and learning strategy, presenting several popular algorithms in detail for each category. Third, we introduce the commonly used datasets, applications, and our open-sourced codebase for fair evaluation. Finally, we summarize the existing literature and present some potential research topics for the future.
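- To make the problem setting concrete, the sketch below illustrates the DG setup described in the abstract: a model is trained on several related source domains and evaluated on a domain never seen during training. It is a minimal illustration, not the survey's method; the synthetic domains, the network, and the pooled empirical risk minimization (ERM) baseline are all assumptions chosen for brevity, and the data manipulation, representation learning, and learning strategy methods the survey categorizes would replace or augment this baseline.

```python
# Minimal sketch of the domain generalization setting (illustrative only).
# Several source domains share a labeling rule but differ in P(x); the test
# domain is unseen during training. ERM over pooled sources is the baseline.
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_domain(shift: float, n: int = 512, d: int = 16):
    """Synthetic domain: the shift changes P(x) but not the labeling rule."""
    x = torch.randn(n, d) + shift
    # Shared labeling rule across domains (shift cancels between the halves).
    y = (x[:, : d // 2].sum(dim=1) > x[:, d // 2:].sum(dim=1)).long()
    return x, y

# Three observed source domains and one unseen test domain.
source_domains = [make_domain(s) for s in (-1.0, 0.0, 1.0)]
x_test, y_test = make_domain(2.5)  # distribution never seen during training

model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# ERM over the pooled source domains: the simplest baseline that DG methods
# aim to improve upon.
for epoch in range(200):
    for x, y in source_domains:
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

with torch.no_grad():
    acc = (model(x_test).argmax(dim=1) == y_test).float().mean().item()
print(f"accuracy on unseen domain: {acc:.3f}")
```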
Details
- Language :
- English
- ISSN :
- 1041-4347 and 1558-2191
- Volume :
- 35
- Issue :
- 8
- Database :
- Supplemental Index
- Journal :
- IEEE Transactions on Knowledge and Data Engineering
- Publication Type :
- Periodical
- Accession number :
- ejs63523600
- Full Text :
- https://doi.org/10.1109/TKDE.2022.3178128