[Umm, I dunno, what do you think? ‘Meta-Ockham 1’ seems sensible but (2) seems to move the focus of the ‘science’ potentially from explanation to description? And in “discovering and encoding regularities in irreducibly high dimensional phenomena”, the ‘encoding’ slips in there rather subtly in a way which I think deserves more conversation? Certainly interesting anyway]
Front. Complex Syst., 18 October 2023
Sec. Complex Systems Theory
Volume 1 – 2023 | https://doi.org/10.3389/fcpxs.2023.1235202
Insights in Complex Systems Theory
Unifying complexity science and machine learning
David C. Krakauer, Santa Fe Institute, Santa Fe, NM, United States
Complexity science and machine learning are two complementary approaches to discovering and encoding regularities in irreducibly high-dimensional phenomena. Whereas complexity science represents a coarse-grained paradigm of understanding, machine learning is a fine-grained paradigm of prediction. Both approaches seek to solve the “Wigner Reversal,” or the unreasonable ineffectiveness of mathematics in the adaptive domain, where broken symmetries and broken ergodicity dominate. In order to integrate these paradigms, I introduce the idea of “Meta-Ockham,” which (1) moves minimality from the description of a model for a phenomenon to the description of a process for generating a model, and (2) describes low-dimensional features, or schema, in these models. Reinforcement learning and natural selection are both parsimonious in this revised sense of minimal processes that parameterize arbitrarily high-dimensional inductive models containing latent, low-dimensional regularities. I describe these models as “super-Humean” and discuss the scientific value of analyzing their latent dimensions as encoding functional schema.
https://www.frontiersin.org/articles/10.3389/fcpxs.2023.1235202/full
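The abstract is conceptual, but its central move can be made concrete. The sketch below is purely illustrative and not from the article: the toy task, the dimensions, the mutation scale, and the choice of a (1+1) evolution strategy are all assumptions made here. It instantiates the Meta-Ockham distinction, minimality of the generating process rather than of the model, by letting a few-line mutate-and-select loop (standing in for natural selection) tune an overparameterized model on data built around a single latent regularity, then probing the trained weights with an SVD for that low-dimensional “schema.”

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical task (an assumption of this sketch): targets depend on a
# single latent direction of a 50-dimensional input, so the data contain
# exactly one low-dimensional regularity for the model to discover.
d_in, d_hidden = 50, 100
latent = rng.normal(size=d_in)
latent /= np.linalg.norm(latent)
X = rng.normal(size=(200, d_in))
y = np.tanh(X @ latent)

def loss(W, v):
    """Mean squared error of a wide, overparameterized one-hidden-layer model."""
    return np.mean((np.tanh(X @ W) @ v - y) ** 2)

# The parsimonious *process*: a mutate-and-select loop (a (1+1) evolution
# strategy) standing in for natural selection. The model it parameterizes
# has ~5,000 weights; the process itself is just these few lines.
W = 0.01 * rng.normal(size=(d_in, d_hidden))
v = 0.01 * rng.normal(size=d_hidden)
best = loss(W, v)
for _ in range(3000):
    dW = 0.02 * rng.normal(size=W.shape)   # random mutation
    dv = 0.02 * rng.normal(size=v.shape)
    trial = loss(W + dW, v + dv)
    if trial < best:                        # selection: keep improvements only
        W, v, best = W + dW, v + dv, trial

# Probe the trained high-dimensional model for a latent "schema": to the
# extent the search succeeded, the leading left-singular vector of W tends
# to align with the generative direction the task was built from.
U, S, _ = np.linalg.svd(W)
print(f"final loss: {best:.4f}")
print(f"share of spectrum in top singular value: {S[0] / S.sum():.2f}")
print(f"|cosine with latent direction|: {abs(U[:, 0] @ latent):.2f}")
```

A policy-gradient reinforcement learner would serve equally well as the minimal process; mutation plus selection is used here only because it is about the shortest “process description” available, which is exactly the sense in which the abstract calls such processes parsimonious.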