
Stochastic Gradient Descent for Additive Nonparametric Regression

Authors:
Chen, Xin
Klusowski, Jason M.
Publication Year:
2024

Abstract

This paper introduces an iterative algorithm for training additive models that has favorable memory and computational requirements. The algorithm can be viewed as the functional counterpart of stochastic gradient descent, applied to the coefficients of a truncated basis expansion of the component functions. We show that the resulting estimator satisfies an oracle inequality that allows for model mis-specification. In the well-specified setting, by choosing the learning rate carefully across three distinct stages of training, we demonstrate that its risk is minimax optimal in terms of the dependence on the dimensionality of the data and the size of the training sample. We further illustrate the computational benefits by comparing the approach with traditional backfitting on two real-world datasets.
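As a rough illustration (not the authors' code), the sketch below runs SGD on the coefficients of a truncated basis expansion of each component function of an additive model. The cosine basis, the truncation level K, the synthetic data, and the simple decaying step size are all illustrative assumptions; the decay in particular is only a stand-in for the paper's carefully tuned three-stage schedule.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d, K = 2000, 5, 10          # sample size, input dimension, basis truncation

    # Synthetic well-specified additive data: each component function lies in
    # the span of the cosine basis used below, plus Gaussian noise.
    X = rng.uniform(size=(n, d))
    y = (np.cos(np.pi * X[:, 0])
         + 0.5 * np.cos(3 * np.pi * X[:, 1])
         + rng.normal(scale=0.1, size=n))

    def basis(u, K):
        # Truncated cosine basis phi_k(u) = sqrt(2) * cos(pi * k * u), k = 1..K.
        k = np.arange(1, K + 1)
        return np.sqrt(2.0) * np.cos(np.pi * k * u[..., None])

    beta = np.zeros((d, K))        # coefficients of the d component functions

    for t in range(10 * n):        # stream over one sample at a time
        i = rng.integers(n)
        phi = basis(X[i], K)                       # (d, K) basis features at x_i
        resid = float(np.sum(beta * phi)) - y[i]   # f(x_i) - y_i
        lr = 0.5 / (d * K) / (1.0 + t / n)         # illustrative decaying step size
        beta -= lr * resid * phi                   # SGD step on the coefficients

    pred = np.einsum('jk,njk->n', beta, basis(X, K))
    print("training MSE:", np.mean((pred - y) ** 2))

Each update touches only the d-by-K coefficient matrix and a single sample, which illustrates the kind of memory and computational savings the abstract contrasts with backfitting, whose smoothing steps repeatedly sweep the full training set.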

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2401.00691
Document Type:
Working Paper