Wilsonian Renormalization of Neural Network Gaussian Processes
- Publication Year :
- 2024
Abstract
- Separating relevant and irrelevant information is key to any modeling process or scientific inquiry. Theoretical physics offers a powerful tool for achieving this in the form of the renormalization group (RG). Here we demonstrate a practical approach to performing Wilsonian RG in the context of Gaussian Process (GP) Regression. We systematically integrate out the unlearnable modes of the GP kernel, thereby obtaining an RG flow of the GP in which the data sets the IR scale. In simple cases, this results in a universal flow of the ridge parameter, which becomes input-dependent in the richer scenario in which non-Gaussianities are included. In addition to being analytically tractable, this approach goes beyond structural analogies between RG and neural networks by providing a natural connection between RG flow and learnable vs. unlearnable modes. Studying such flows may improve our understanding of feature learning in deep neural networks, and enable us to identify potential universality classes in these models.
- Comment: 17 pages, 1 figure; rewrote introduction, added references, section IIIA, section IVA, and appendix C
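The abstract does not spell out the procedure, but the core idea can be illustrated numerically: in GP regression, kernel eigenmodes whose eigenvalues fall below the noise (ridge) scale are effectively unlearnable, and discarding them can be largely compensated by an effective, renormalized ridge parameter. The sketch below is a toy illustration of that statement only, not the paper's method; the RBF kernel, the cutoff rule, and the way the discarded spectral weight is folded back into the ridge term are all illustrative assumptions.

```python
# Toy sketch: truncating "unlearnable" kernel modes vs. renormalizing the ridge.
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = np.linspace(-1.0, 1.0, n)
y = np.sin(3.0 * X) + 0.1 * rng.standard_normal(n)

# RBF kernel Gram matrix on the training inputs (lengthscale 0.2 is arbitrary).
K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / 0.2**2)
ridge = 1e-2  # bare ridge / noise parameter

# Spectral decomposition: modes with eigenvalues below the noise scale are
# effectively unlearnable, since the data cannot separate them from noise.
evals, evecs = np.linalg.eigh(K)
keep = evals > ridge  # cutoff choice is an assumption for this sketch

# Exact GP posterior-mean weights with the full kernel.
alpha_full = np.linalg.solve(K + ridge * np.eye(n), y)

# "Integrate out" the unlearnable modes: keep the low-pass kernel and absorb
# the mean discarded spectral weight into an effective ridge parameter.
K_low = (evecs[:, keep] * evals[keep]) @ evecs[:, keep].T
ridge_eff = ridge + (evals[~keep].mean() if (~keep).any() else 0.0)
alpha_rg = np.linalg.solve(K_low + ridge_eff * np.eye(n), y)

# The two predictors agree closely on the training data.
print("max deviation between full and truncated + renormalized fits:",
      np.abs(K @ alpha_full - K_low @ alpha_rg).max())
```

In this toy setup the deviation between the two fits is small, which is the qualitative point: the effect of the discarded (UV) modes on the learnable (IR) predictions can be summarized by a shift of the ridge parameter, mirroring the abstract's claim of a universal flow of the ridge in simple cases.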
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2405.06008
- Document Type :
- Working Paper