The origin of life is arguably one of the greatest unanswered questions in science. A primary challenge is that without a proper definition of life -- a notoriously difficult problem in its own right -- the question of how life began is not well posed. Here we propose that the transition from non-life to life may correspond to a fundamental shift in causal structure, where information gains direct, context-dependent causal efficacy over matter, a transition that may be mapped to a nontrivial distinction in how living systems process information. Dr. Walker will discuss potential measures of such a transition, which may be amenable to laboratory study, and how the proposed mechanism corresponds to the onset of the unique mode of (algorithmic) information processing characteristic of living systems.
If you’re not already following his work, start with the sites below (or let me know if I’ve missed anything).
His most recent paper on arXiv:
Low Algorithmic Complexity Entropy-deceiving Graphs
A common practice in the estimation of the complexity of objects, in particular of graphs, is to rely on graph- and information-theoretic measures. Here, using integer sequences with properties such as Borel normality, we explain how these measures are not independent of the way in which a single object, such as a graph, can be described. From descriptions that can reconstruct the same graph and are therefore essentially translations of the same description, we will see that not only is it necessary to pre-select a feature of interest where there is one when applying a computable measure such as Shannon Entropy, and to make an arbitrary selection where there is not, but that more general properties, such as the causal likelihood of a graph as a measure (as opposed to randomness), can be largely misrepresented by computable measures such as Entropy and Entropy rate. We introduce recursive and non-recursive (uncomputable) graphs and graph constructions based on integer sequences, whose different lossless descriptions have disparate Entropy values, thereby enabling the study and exploration of a measure’s range of applications and demonstrating the weaknesses of computable measures of complexity.
Subjects: Information Theory (cs.IT); Computational Complexity (cs.CC); Combinatorics (math.CO)
Cite as: arXiv:1608.05972 [cs.IT] (or arXiv:1608.05972v4 [cs.IT])
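The abstract’s central point -- that a computable measure like Shannon Entropy is not a property of the graph itself but of the particular lossless description you feed it -- is easy to see in a toy sketch. The graph, the two encodings, and the helper function below are my own illustrative choices, not taken from the paper:

```python
from collections import Counter
from math import log2

def shannon_entropy(seq):
    """Shannon entropy in bits per symbol of a sequence."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * log2(c / n) for c in counts.values())

# The same 4-node path graph, described in two lossless ways.
edges = [(0, 1), (1, 2), (2, 3)]

# Description 1: flattened symmetric adjacency matrix (row-major bit string).
adj = [[0] * 4 for _ in range(4)]
for u, v in edges:
    adj[u][v] = adj[v][u] = 1
adjacency_bits = "".join(str(b) for row in adj for b in row)

# Description 2: the edge list written out as a digit string.
edge_digits = "".join(f"{u}{v}" for u, v in edges)

print(adjacency_bits, shannon_entropy(adjacency_bits))
print(edge_digits, shannon_entropy(edge_digits))
```

Both strings reconstruct the same graph, yet their per-symbol entropies differ substantially (the sparse adjacency matrix is 0-heavy, while the edge-list digits are nearly uniform) -- a small-scale version of the description-dependence the paper demonstrates.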
Yesterday he also posted two new introductory videos on his YouTube channel. There’s nothing overly technical here, but they’re nice short productions that introduce some of his work. (I wish more scientists communicated like this.) I’m hoping he’ll post them to his blog and write a bit more there in the future as well.
Universal Measures of Complexity
- A Decomposition Method for Global Evaluation of Shannon Entropy and Local Estimations of Algorithmic Complexity by Hector Zenil, Fernando Soler-Toscano, Narsis A. Kiani, Santiago Hernández-Orozco, Antonio Rueda-Toicen
- Calculating Kolmogorov Complexity from the Output Frequency Distributions of Small Turing Machines by F. Soler-Toscano, H. Zenil, J.-P. Delahaye and N. Gauvrit; PLoS ONE 9(5): e96223, 2014.
- Numerical Evaluation of Algorithmic Complexity for Short Strings: A Glance into the Innermost Structure of Randomness by Jean-Paul Delahaye, Hector Zenil; Applied Mathematics and Computation 219, pp. 63-77, 2012.
- Cross-boundary Behavioural Reprogrammability Reveals Evidence of Pervasive Turing Universality by Jürgen Riedel, Hector Zenil; preprint available at http://arxiv.org/abs/1510.01671
Ed.: 9/7/16: Updated videos with links to relevant literature