Wasserstein Distributionally Robust Optimization and Variation Regularization
- Source: Operations Research
- Publication Year: 2022
- Publisher: Institute for Operations Research and the Management Sciences (INFORMS), 2022
Abstract
- Wasserstein distributionally robust optimization (DRO) has recently achieved empirical success in various applications in operations research and machine learning, owing partly to its regularization effect. Although the connection between Wasserstein DRO and regularization has been established in several settings, existing results often require restrictive assumptions, such as smoothness or convexity, that are not satisfied by many problems. In this paper, we develop a general theory of the variation regularization effect of Wasserstein DRO, a new form of regularization that generalizes total-variation regularization, Lipschitz regularization, and gradient regularization. Our results cover possibly non-convex and non-smooth losses as well as losses on non-Euclidean spaces. Examples include the multi-item newsvendor problem, portfolio selection, linear prediction, neural networks, manifold learning, and intensity estimation for Poisson processes. As an application of our theory of variation regularization, we derive new generalization guarantees for adversarially robust learning.
- Comment: The paper was previously titled "Wasserstein Distributional Robustness and Regularization in Statistical Learning"
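The regularization effect the abstract refers to can be illustrated in its simplest known case, which predates this paper: for a linear predictor with absolute loss, type-1 Wasserstein DRO over a ball of radius eps (with Euclidean transport cost on the features) equals empirical risk plus the gradient-norm penalty eps * ||w||_2. The sketch below is an illustrative numerical check of that identity, not code from the paper; all names and data are hypothetical.

```python
# Illustrative sketch (assumption: absolute loss l(w; x, y) = |y - w.x|,
# type-1 Wasserstein ball of radius eps with Euclidean cost on x).
# In this setting the worst-case loss is known to equal the empirical loss
# plus eps * ||w||_2 -- a special case of "variation regularization".
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true + 0.1 * rng.normal(size=n)

w = rng.normal(size=d)   # an arbitrary candidate predictor
eps = 0.3                # Wasserstein radius

# Right-hand side: empirical loss + gradient-norm (Lipschitz) penalty.
emp_loss = np.mean(np.abs(y - X @ w))
rhs = emp_loss + eps * np.linalg.norm(w)

# Left-hand side: worst-case loss. The supremum is attained by shifting
# every feature vector a distance eps along -sign(residual) * w / ||w||_2,
# which enlarges each absolute residual by exactly eps * ||w||_2.
r = y - X @ w
delta = -eps * np.sign(r)[:, None] * w / np.linalg.norm(w)
worst_case = np.mean(np.abs(y - (X + delta) @ w))

print(worst_case, rhs)   # the two sides coincide
```

The same mechanism, with the gradient norm replaced by a more general variation functional, is what the paper extends to non-smooth, non-convex losses and non-Euclidean spaces.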
- Subjects:
- FOS: Computer and information sciences
Computer Science - Machine Learning
Statistics - Machine Learning
Optimization and Control (math.OC)
FOS: Mathematics
Machine Learning (stat.ML)
Management Science and Operations Research
Mathematics - Optimization and Control
Machine Learning (cs.LG)
Computer Science Applications
Details
- ISSN: 1526-5463 and 0030-364X
- Database: OpenAIRE
- Journal: Operations Research
- Accession number: edsair.doi.dedup.....19daf9627808eda0e49b50937ad0ab11