
Statistical Optimization in High Dimensions.

Authors :
Xu, Huan
Caramanis, Constantine
Mannor, Shie
Source :
Operations Research; Jul/Aug 2016, Vol. 64 Issue 4, p958-979, 22p
Publication Year :
2016

Abstract

We consider optimization problems whose parameters are known only approximately, based on noisy samples. In large-scale applications, the number of samples one can collect is typically of the same order as (or even less than) the dimensionality of the problem. This so-called high-dimensional statistical regime has been the object of intense recent research in machine learning and statistics, primarily due to phenomena inherent to this regime, such as the fact that the noise one sees here often dwarfs the magnitude of the signal itself. While relevant in numerous important operations research and engineering optimization applications, this setup falls far outside the traditional scope of robust and stochastic optimization. We propose three algorithms to address this setting, combining ideas from statistics, machine learning, and robust optimization. Our algorithms are motivated by three natural optimization objectives: minimizing the number of grossly violated constraints; maximizing the number of exactly satisfied constraints; and, finally, developing algorithms whose running time scales with the intrinsic dimension of a problem, as opposed to its observed dimension, a mismatch that, as we discuss in detail, can be dire in settings where constraints are meant to describe preferences of behaviors. The key ingredients of our algorithms are dimensionality reduction techniques from machine learning, robust optimization, and concentration of measure tools from statistics. [ABSTRACT FROM AUTHOR]
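The regime described in the abstract, where the number of noisy samples is of the same order as the problem dimension, can be made concrete with a small numerical sketch. The snippet below is an illustrative assumption of mine, not code or an algorithm from the paper: it estimates a sparse parameter vector by averaging noisy samples and shows that, when the sample count matches the dimension, the estimation error can be comparable to (or larger than) the signal norm itself.

```python
import numpy as np

# Hypothetical illustration (not from the paper): with the number of noisy
# samples n on the same order as the dimension d, the noise in a simple
# sample-mean estimate can dwarf the signal being estimated.
rng = np.random.default_rng(0)

d = 1000                      # observed problem dimension
n = d                         # number of samples, same order as d
signal = np.zeros(d)
signal[:10] = 1.0             # sparse "true" parameter vector (low intrinsic dimension)

# Each sample is the true parameter plus isotropic Gaussian noise.
samples = signal + rng.normal(scale=5.0, size=(n, d))
estimate = samples.mean(axis=0)

error = np.linalg.norm(estimate - signal)
signal_norm = np.linalg.norm(signal)
print(f"signal norm: {signal_norm:.2f}, estimation error: {error:.2f}")
# Typical output: the error is on the order of 5 while the signal norm is
# about 3.2, i.e., the noise dominates the signal in this regime.
```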

Details

Language :
English
ISSN :
0030-364X
Volume :
64
Issue :
4
Database :
Complementary Index
Journal :
Operations Research
Publication Type :
Academic Journal
Accession number :
117109339
Full Text :
https://doi.org/10.1287/opre.2016.1504