Non-Vacuous Generalization Bounds for Large Language Models

Authors :
Lotfi, Sanae
Finzi, Marc
Kuang, Yilun
Rudner, Tim G. J.
Goldblum, Micah
Wilson, Andrew Gordon
Publication Year :
2023

Abstract

Modern language models can contain billions of parameters, raising the question of whether they can generalize beyond the training data or simply parrot their training corpora. We provide the first non-vacuous generalization bounds for pretrained large language models (LLMs), indicating that language models are capable of discovering regularities that generalize to unseen data. In particular, we derive a compression bound that is valid for the unbounded log-likelihood loss using prediction smoothing, and we extend the bound to handle subsampling, accelerating bound computation by orders of magnitude on massive datasets. To achieve the extreme level of compression required for non-vacuous bounds, we devise SubLoRA, a simple low-dimensional nonlinear parameterization that leads to non-vacuous generalization bounds for models with nearly a billion parameters. Finally, we use our bounds to understand LLM generalization and find that larger models have better generalization bounds and are more compressible than smaller models.

Comment: ICML 2024
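As a rough sketch of the prediction-smoothing idea mentioned in the abstract (the mixing weight \alpha and vocabulary size V are symbols introduced here for illustration, not taken from the record), mixing the model's next-token distribution with a uniform distribution over the vocabulary bounds the otherwise unbounded log-likelihood loss:

    p_{\text{smooth}}(y \mid x) = (1 - \alpha)\, p_\theta(y \mid x) + \alpha / V,
    0 \le -\log p_{\text{smooth}}(y \mid x) \le \log(V / \alpha).

Because the smoothed loss then lies in a bounded interval of width at most \log(V/\alpha), compression-style generalization bounds for bounded losses become applicable, which is the role prediction smoothing plays in the abstract's argument.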

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2312.17173
Document Type :
Working Paper