
Nonparametric regression with adaptive truncation via a convex hierarchical penalty.

Authors :
Haris, Asad
Shojaie, Ali
Simon, Noah
Source :
Biometrika. Mar 2019, Vol. 106, Issue 1, p87-107. 21p.
Publication Year :
2019

Abstract

We consider the problem of nonparametric regression with a potentially large number of covariates. We propose a convex, penalized estimation framework that is particularly well suited to high-dimensional sparse additive models and combines the appealing features of finite basis representation and smoothing penalties. In the case of additive models, a finite basis representation provides a parsimonious representation for fitted functions but is not adaptive when component functions possess different levels of complexity. In contrast, a smoothing spline-type penalty on the component functions is adaptive but does not provide a parsimonious representation. Our proposal simultaneously achieves parsimony and adaptivity in a computationally efficient way. We demonstrate these properties through empirical studies and show that our estimator converges at the minimax rate for functions within a hierarchical class. We further establish minimax rates for a large class of sparse additive models. We also develop an efficient algorithm that scales similarly to the lasso with the number of covariates and sample size.
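To make the setting concrete, here is a minimal sketch of the two ingredients the abstract combines: a finite (truncated) basis representation of each component function, and a sparsity-inducing penalty over covariates. This is not the paper's hierarchical penalty or its algorithm; it uses a plain group lasso on a polynomial basis, fitted by proximal gradient descent, and all sizes and penalty levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 200, 10, 4  # samples, covariates, basis functions per covariate (illustrative)

# Simulated sparse additive model: only the first covariate carries signal.
X = rng.uniform(-1.0, 1.0, size=(n, p))
y = np.sin(2.0 * X[:, 0]) + 0.1 * rng.standard_normal(n)

# Finite basis representation: truncated polynomial basis for each covariate,
# centred and scaled so the penalty treats all columns comparably.
B = np.concatenate([X[:, [j]] ** np.arange(1, k + 1) for j in range(p)], axis=1)
B = (B - B.mean(axis=0)) / B.std(axis=0)

# Group-lasso fit via proximal gradient descent, one group per covariate.
lam = 0.15                            # penalty level (illustrative choice)
step = n / np.linalg.norm(B, 2) ** 2  # 1 / Lipschitz constant of the gradient
beta = np.zeros(p * k)
for _ in range(2000):
    z = beta - step * (B.T @ (B @ beta - y) / n)  # gradient step on squared loss
    for j in range(p):                # group soft-thresholding (the proximal step)
        g = slice(j * k, (j + 1) * k)
        nrm = np.linalg.norm(z[g])
        z[g] *= max(0.0, 1.0 - step * lam / nrm) if nrm > 0 else 0.0
    beta = z

# Covariates whose basis coefficients survive the penalty.
active = [j for j in range(p) if np.linalg.norm(beta[j * k:(j + 1) * k]) > 1e-8]
print(active)
```

With the penalty set high enough, whole groups of basis coefficients are zeroed out, so irrelevant covariates drop from the fitted model. The paper's contribution, by contrast, is a hierarchical penalty that additionally adapts the *number* of basis functions used per component, rather than fixing k in advance as this sketch does.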

Details

Language :
English
ISSN :
00063444
Volume :
106
Issue :
1
Database :
Academic Search Index
Journal :
Biometrika
Publication Type :
Academic Journal
Accession number :
134757305
Full Text :
https://doi.org/10.1093/biomet/asy056