
Asymptotics of Stochastic Gradient Descent with Dropout Regularization in Linear Models

Authors :
Li, Jiaqi
Schmidt-Hieber, Johannes
Wu, Wei Biao
Publication Year :
2024

Abstract

This paper proposes an asymptotic theory for online inference of the stochastic gradient descent (SGD) iterates with dropout regularization in linear regression. Specifically, we establish the geometric-moment contraction (GMC) for constant step-size SGD dropout iterates to show the existence of a unique stationary distribution of the dropout recursive function. By the GMC property, we provide quenched central limit theorems (CLTs) for the difference between dropout and $\ell^2$-regularized iterates, regardless of initialization. The CLT for the difference between the Ruppert-Polyak averaged SGD (ASGD) with dropout and $\ell^2$-regularized iterates is also presented. Based on these asymptotic normality results, we further introduce an online estimator for the long-run covariance matrix of ASGD dropout, which enables recursive inference that is efficient in both computation time and memory. Numerical experiments demonstrate that, for sufficiently large samples, the proposed confidence intervals for ASGD with dropout nearly achieve the nominal coverage probability.

Comment: 77 pages, 5 figures, 4 tables
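As an illustration only (not the paper's exact algorithm or notation), the sketch below runs constant step-size SGD with dropout on a simulated linear model and maintains the Ruppert-Polyak (ASGD) average of the iterates. The dimension, sample size, step size, retain probability, and the inverted-dropout scaling by 1/p are all assumed choices made for this example.

```python
import numpy as np

# Minimal sketch, assuming a linear model y = x^T beta* + noise:
# constant step-size SGD where a random dropout mask is applied to the
# current iterate at each step, plus a running Ruppert-Polyak (ASGD) average.

rng = np.random.default_rng(0)
d, n = 5, 50_000           # dimension and number of online samples (assumed)
p, alpha = 0.8, 0.01       # dropout retain probability and step size (assumed)
beta_star = rng.normal(size=d)

beta = np.zeros(d)         # SGD dropout iterate
beta_bar = np.zeros(d)     # Ruppert-Polyak average of the iterates

for t in range(1, n + 1):
    x = rng.normal(size=d)
    y = x @ beta_star + rng.normal()
    mask = rng.binomial(1, p, size=d) / p            # inverted dropout mask
    resid = y - x @ (mask * beta)                    # residual with dropped-out iterate
    grad = -resid * (mask * x)                       # gradient of 0.5 * resid^2 w.r.t. beta
    beta = beta - alpha * grad                       # constant step-size dropout SGD update
    beta_bar += (beta - beta_bar) / t                # recursive ASGD average

print("ASGD dropout estimate:", beta_bar)
print("true coefficients:   ", beta_star)
```

Averaging the iterates is what the paper's CLT-based confidence intervals are built on; the recursive long-run covariance estimator used for those intervals is not sketched here.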

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2409.07434
Document Type :
Working Paper