
On the Generalization Power of Overfitted Two-Layer Neural Tangent Kernel Models

Authors:
Ju, Peizhong
Lin, Xiaojun
Shroff, Ness B.
Publication Year:
2021

Abstract

In this paper, we study the generalization performance of min $\ell_2$-norm overfitting solutions for the neural tangent kernel (NTK) model of a two-layer neural network with ReLU activation and no bias term. We show that, depending on the ground-truth function, the test error of overfitted NTK models exhibits characteristics different from the "double descent" of other overparameterized linear models with simple Fourier or Gaussian features. Specifically, for a class of learnable functions, we provide a new upper bound on the generalization error that approaches a small limiting value, even when the number of neurons $p$ approaches infinity. This limiting value further decreases with the number of training samples $n$. For functions outside of this class, we provide a lower bound on the generalization error that does not diminish to zero even when $n$ and $p$ are both large.

Comment: Published in ICML 2021. This version fixes an error in Lemma 31 and the parts affected by it. The main results remain the same except for small changes to certain coefficients in Eq. (9).
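To make the setup concrete, the sketch below is a minimal illustration (not the paper's code; the data, the target function, and all variable names are hypothetical) of a min $\ell_2$-norm overfitting solution for NTK features of a two-layer ReLU network with no bias. It assumes the standard NTK linearization in which the gradient features $\phi(x) = \frac{1}{\sqrt{p}}\big[\mathbb{1}\{w_j^\top x > 0\}\, x\big]_{j=1}^{p}$ are frozen at random initialization, and the weight perturbation with the smallest $\ell_2$ norm that interpolates the training data is $\Delta w = \Phi^\top(\Phi\Phi^\top)^{-1} y$.

import numpy as np

# Minimal illustrative sketch (not the paper's code): the min l2-norm
# solution that interpolates the training data using NTK features of a
# two-layer ReLU network with no bias term.
rng = np.random.default_rng(0)
n, d, p = 50, 5, 2000            # samples, input dim, neurons (p*d >> n)

# Inputs on the unit sphere; the target function here is purely hypothetical.
X = rng.standard_normal((n, d))
X /= np.linalg.norm(X, axis=1, keepdims=True)
y = X[:, 0] ** 2

# Random initial weights freeze the NTK feature map
# phi(x) = (1/sqrt(p)) [ 1{w_j . x > 0} x ]_{j=1..p}  in R^{p*d}.
W = rng.standard_normal((p, d))
gates = (X @ W.T > 0).astype(float)                    # n x p ReLU gate patterns
Phi = (gates[:, :, None] * X[:, None, :]).reshape(n, p * d) / np.sqrt(p)

# Min l2-norm interpolator: delta_w = Phi^T (Phi Phi^T)^{-1} y.
delta_w = Phi.T @ np.linalg.solve(Phi @ Phi.T, y)

print("training residual:", np.linalg.norm(Phi @ delta_w - y))   # ~ 0

With $p\,d \gg n$, $\Phi$ generically has full row rank, so the residual is numerically zero: the model perfectly fits the training set, which is precisely the overfitted regime whose test error the paper bounds.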

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2103.05243
Document Type:
Working Paper