Equivalence of information production and generalized entropies in complex processes
- Publication Year :
- 2022
Abstract
- Complex systems that are characterized by strong correlations and fat-tailed distribution functions have been argued to be incompatible with the framework of Boltzmann-Gibbs entropy. As an alternative, so-called generalized entropies were proposed and intensively studied. Here we show that this incompatibility is a misconception. For a broad class of processes, Boltzmann entropy (the logarithm of the multiplicity) remains the valid entropy concept; however, for non-i.i.d., non-multinomial, and non-ergodic processes, Boltzmann entropy is not of Shannon form. The correct form of Boltzmann entropy can be shown to be identical to generalized entropies. We derive this result for all processes that can be mapped reversibly to adjoint representations in which the processes are i.i.d. In these representations, the information production is given by the Shannon entropy. We prove that, over the original sampling space, this yields functionals that are identical to generalized entropies. The problem of constructing adequate context-sensitive entropy functionals can therefore be translated into the much simpler problem of finding adjoint representations. The method provides a comprehensive framework for a statistical physics of strongly correlated systems and complex processes.
- Comment: 14 pages paper + SI, 1 figure
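- A minimal sketch of the stated equivalence, in notation assumed here rather than taken from the paper: let a correlated process have distribution $p$ over the original sample space and admit a reversible map $\phi$ to an adjoint representation in which the process is i.i.d. with distribution $\tilde{p} = \phi(p)$. In the adjoint representation the information production is the Shannon entropy, and pulling it back through $\phi$ defines a functional on the original space:
  $H[\tilde{p}] = -\sum_k \tilde{p}_k \log \tilde{p}_k, \qquad S[p] := H[\phi(p)].$
  Because $\phi$ is in general nonlinear, $S[p]$ is not of Shannon form in $p$; the abstract's claim is that it coincides with a generalized entropy, for example a trace-form expression $S[p] = \sum_i g(p_i)$ with a deformed function $g$ determined by $\phi$ (the trace-form shape is an illustrative assumption, not a statement from the abstract).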
Details
- Database :
- arXiv
- Publication Type :
- Report
- Accession number :
- edsarx.2208.06201
- Document Type :
- Working Paper