1. Dyson Brownian motion and random matrix dynamics of weight matrices during learning
- Authors
Aarts, Gert, Hajizadeh, Ouraman, Lucini, Biagio, and Park, Chanju
- Subjects
Condensed Matter - Disordered Systems and Neural Networks, Computer Science - Machine Learning, High Energy Physics - Lattice
- Abstract
During training, weight matrices in machine learning architectures are updated using stochastic gradient descent or variations thereof. In this contribution we employ concepts of random matrix theory to analyse the resulting stochastic matrix dynamics. We first demonstrate that the dynamics can generically be described using Dyson Brownian motion, leading to e.g. eigenvalue repulsion. The level of stochasticity is shown to depend on the ratio of the learning rate and the mini-batch size, explaining the empirically observed linear scaling rule. We verify this linear scaling in the restricted Boltzmann machine. Subsequently we study weight matrix dynamics in transformers (a nano-GPT), following the evolution from a Marchenko-Pastur distribution for eigenvalues at initialisation to a combination with additional structure at the end of learning.
- Comment
7 pages. Contribution accepted in the NeurIPS 2024 workshop "Machine Learning and the Physical Sciences"
- Published
2024
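The abstract's central picture, stochastic weight updates driving Dyson Brownian motion of the eigenvalues with an effective temperature set by the ratio of learning rate to mini-batch size, can be illustrated numerically. The sketch below is not the paper's code: it simply adds symmetric Gaussian (GOE-type) noise to a symmetric matrix standing in for a weight matrix, with the step size `dt` as a hypothetical stand-in for the learning-rate/batch-size ratio, and checks that the eigenvalues exhibit level repulsion (nearest-neighbour spacings bounded away from zero).

```python
import numpy as np

rng = np.random.default_rng(0)

N = 8          # matrix size (illustrative choice, not from the paper)
steps = 2000
dt = 1e-3      # noise strength; plays the role of the effective temperature,
               # which the paper relates to learning rate / mini-batch size

# Symmetric matrix standing in for a weight matrix at initialisation.
M = rng.normal(size=(N, N))
M = (M + M.T) / np.sqrt(2 * N)

spectra = [np.linalg.eigvalsh(M)]
for _ in range(steps):
    # Additive GOE noise: the eigenvalues of M then follow Dyson Brownian
    # motion, whose drift ~ sum_j 1/(x_i - x_j) produces eigenvalue repulsion.
    G = rng.normal(size=(N, N))
    dW = (G + G.T) / np.sqrt(2 * N)
    M = M + np.sqrt(dt) * dW
    spectra.append(np.linalg.eigvalsh(M))

spectra = np.array(spectra)
spacings = np.diff(spectra[-1])
print("final eigenvalues:", np.round(spectra[-1], 3))
print("min / mean nearest-neighbour spacing:",
      spacings.min(), "/", spacings.mean())
```

Under these assumptions the minimum spacing stays a sizeable fraction of the mean spacing, the signature of repulsion; independent (Poissonian) eigenvalues would instead show spacings arbitrarily close to zero.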