1. Entropy, Thermodynamics and the Geometrization of the Language Model
- Author
- Yang, Wenzhe
- Subjects
Computer Science - Computation and Language, Condensed Matter - Statistical Mechanics, High Energy Physics - Theory, Mathematics - Differential Geometry, 68T01, I.2.7
- Abstract
In this paper, we discuss how pure mathematics and theoretical physics can be applied to the study of language models. Using set theory and analysis, we formulate mathematically rigorous definitions of language models and introduce the concept of the moduli space of distributions for a language model. We formulate a generalized distributional hypothesis using functional analysis and topology. We define the entropy function associated with a language model and show how it allows us to understand many interesting phenomena in languages. We argue that the zero points of the entropy function, and the points where the entropy is close to 0, are the key obstacles for an LLM to approximate an intelligent language model, which explains why good LLMs need billions of parameters. Using the entropy function, we formulate a conjecture about AGI. We then show how thermodynamics gives an immediate interpretation of language models. In particular, we define the concepts of partition function, internal energy and free energy for a language model (see the sketch after this entry), which offer insights into how language models work. Based on these results, we introduce a general concept of the geometrization of language models and define what is called the Boltzmann manifold; the current LLMs are special cases of the Boltzmann manifold.
- Comment
- 18 pages
- Published
- 2024
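
For orientation only, here is a minimal sketch of how the quantities named in the abstract are usually written, assuming the standard statistical-mechanics forms applied to a next-token distribution; the paper's own definitions may differ in detail. The symbols $x$ (context), $V$ (vocabulary), $E$ (an energy, e.g. negative logits) and $T$ (temperature) are assumptions of this sketch, not taken from the abstract.

```latex
% Hedged sketch: standard textbook definitions, not necessarily the paper's.
% Entropy of the next-token distribution p(. | x) over a vocabulary V:
\[
  S(x) = -\sum_{w \in V} p(w \mid x)\,\log p(w \mid x)
\]
% Boltzmann weights and partition function for an assumed energy E(w | x):
\[
  Z(x) = \sum_{w \in V} e^{-E(w \mid x)/T}, \qquad
  p(w \mid x) = \frac{e^{-E(w \mid x)/T}}{Z(x)}
\]
% Internal energy and free energy; F = -T log Z = U - T S follows directly.
\[
  U(x) = \sum_{w \in V} p(w \mid x)\,E(w \mid x), \qquad
  F(x) = -T \log Z(x) = U(x) - T\,S(x)
\]
```

In this form, $S(x) = 0$ exactly when $p(\cdot \mid x)$ puts all of its mass on a single token, which is the regime the abstract identifies as the key obstacle for an LLM approximating an intelligent language model.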