1. Assembly Theory Reduced to Shannon Entropy and Rendered Redundant by Naive Statistical Algorithms
- Author
Luan Ozelim, Abicumaran Uthamacumaran, Felipe S. Abrahão, Santiago Hernández-Orozco, Narsis A. Kiani, Jesper Tegnér, and Hector Zenil
- Subjects
Computer Science - Information Theory; Computer Science - Computational Complexity
- Abstract
We respond to arguments against our criticism that claim to show a divergence of Assembly Theory (AT) from popular compression. We have proven that any implementation of the concept of 'copy number' underlying AT and its assembly index (Ai) is equivalent to Shannon Entropy and is not fundamentally or methodologically different from algorithms such as ZIP compression. We show that the weak empirical correlation between Ai and LZW, which the authors offered as a defense against the proof that the assembly index calculation method is a compression scheme, is based on an incomplete and misleading experiment. When the experiment is completed, the asymptotic convergence to LZ compression and Shannon Entropy is evident and aligned with the mathematical proof previously offered. This completes the theoretical and empirical demonstration that any variation of the copy-number concept underlying AT, which counts object repetitions 'to arrive at a measure for life', is equivalent to statistical compression and Shannon Entropy. We demonstrate that the authors' 'we-are-better-because-we-are-worse' defense against compression does not withstand basic scrutiny, and that their empirical results separating organic from inorganic compounds have not only been previously reported (without any claims to unify physics and biology) but are also driven solely by molecular length, not by any particular feature of life captured by their assembly index. Finally, we show that Ai is a particular case of our Block Decomposition Method (BDM), introduced almost a decade earlier, and that arguments attributing special stochastic properties to Ai are misleading and not unique: they are exactly the properties that Shannon Entropy already possesses and was designed for, and which we have proven to be equivalent to Ai, making AT redundant even in practice when applied to the authors' own experimental data.
- Comment
12 figures, 53 pages (minor tweaks and added references to previous relevant work not cited by the authors of AT)
- Published
2024
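
As a rough illustration of the relationship the abstract describes (copy-number counting behaving like statistical compression and Shannon Entropy), below is a minimal, hypothetical Python sketch. It compares an LZ78-style phrase count (a stand-in for LZ/ZIP-family compression) with an empirical block-entropy estimate on a repetitive string versus a pseudo-random one. The helper names lz78_phrases and block_entropy, the block length k, and the toy strings are illustrative assumptions, not code or data from the paper.

```python
import math
import random
from collections import Counter


def lz78_phrases(s: str) -> int:
    """Count phrases in an LZ78-style greedy parse.

    Each new phrase is the shortest prefix of the remaining string not yet
    in the dictionary; the phrase count grows with the string's statistical
    (Shannon) complexity, which is why LZ-family schemes converge to entropy.
    """
    dictionary, phrase, count = set(), "", 0
    for ch in s:
        phrase += ch
        if phrase not in dictionary:
            dictionary.add(phrase)
            count += 1
            phrase = ""
    # Account for a possible unfinished trailing phrase.
    return count + (1 if phrase else 0)


def block_entropy(s: str, k: int = 4) -> float:
    """Empirical Shannon entropy (bits per length-k block) of a string."""
    blocks = [s[i:i + k] for i in range(len(s) - k + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    return -sum(c / n * math.log2(c / n) for c in counts.values())


if __name__ == "__main__":
    repetitive = "AB" * 500                      # highly compressible
    random.seed(0)
    random_str = "".join(random.choice("AB") for _ in range(1000))
    for label, s in [("repetitive", repetitive), ("random", random_str)]:
        print(label, lz78_phrases(s), round(block_entropy(s), 3))
```

On the repetitive string both measures come out low, and on the random string both come out high, which is the kind of joint behavior that the abstract's asymptotic-convergence argument refers to; this sketch is only a toy illustration of that correspondence, not a reproduction of the paper's experiments.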