1. Dimension reduction for nonelliptically distributed predictors
- Author
- Yuexiao Dong and Bing Li
- Subjects
- Statistics and Probability; Mathematics - Statistics Theory (math.ST); Statistics, Probability and Uncertainty; FOS: Mathematics; Sufficient dimension reduction; Sliced inverse regression; Kernel inverse regression; Parametric inverse regression; Inverse regression; Central solution spaces; Dimensionality reduction; Curse of dimensionality; Elliptical distribution; Multivariate normal distribution; Joint probability distribution; Asymptotic distribution; Canonical correlation; Mathematical optimization; MSC: 62H12 (Primary), 62G08, 62G09
- Abstract
Sufficient dimension reduction methods often require stringent conditions on the joint distribution of the predictor or, when such conditions are not satisfied, rely on marginal transformation or reweighting to fulfill them approximately. For example, a typical dimension reduction method would require the predictor to have an elliptical or even multivariate normal distribution. In this paper, we reformulate the commonly used dimension reduction methods via the notion of "central solution space," so as to circumvent such strong assumptions while preserving the desirable properties of the classical methods, such as $\sqrt{n}$-consistency and asymptotic normality. Imposing elliptical distributions or even stronger assumptions on predictors is often considered a necessary tradeoff for overcoming the "curse of dimensionality," but the development of this paper shows that this need not be the case. The new methods are compared with existing methods by simulation and applied to a data set.
- Comment
- Published in at http://dx.doi.org/10.1214/08-AOS598 the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org)
- Published
- 2009
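To make the classical baseline concrete, the following is a minimal sketch of sliced inverse regression (SIR), the standard method the abstract contrasts against; it is an illustrative implementation of the textbook algorithm, not the central-solution-space estimator proposed in this paper, and the function name and parameters are chosen here for illustration only.

```python
import numpy as np

def sliced_inverse_regression(X, y, n_slices=10, n_directions=1):
    """Illustrative SIR sketch: estimate central-subspace directions
    by eigendecomposing the covariance of slice means of the
    standardized predictor. Assumes an elliptically distributed X,
    which is exactly the condition the paper aims to relax."""
    n, p = X.shape
    # Standardize the predictor: Z = (X - mean) Sigma^{-1/2}
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / n
    evals, evecs = np.linalg.eigh(cov)
    inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ inv_sqrt
    # Partition observations into slices by sorted response values
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    # Weighted covariance of the slice means of Z
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of M, mapped back to the original X scale
    _, v = np.linalg.eigh(M)
    return inv_sqrt @ v[:, ::-1][:, :n_directions]
```

On simulated data with a normal predictor and a single-index response, the leading estimated direction should be nearly collinear with the true index vector; under nonelliptical predictors this guarantee breaks down, which motivates the reformulation in the paper.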