Organisms live and die by the amount of information they acquire about their environment. The systems analysis of complex metabolic networks allows us to ask how such information translates into fitness. A metabolic network transforms nutrients into biomass. The better it uses information about nutrient availability, the faster it will allow a cell to divide.
I here use metabolic flux balance analysis to show that the accuracy I (in bits) with which a yeast cell can sense a limiting nutrient's availability relates logarithmically to fitness as indicated by biomass yield and cell division rate. For microbes like yeast, natural selection can resolve fitness differences of genetic variants smaller than 10⁻⁶, meaning that cells would need to estimate nutrient concentrations to very high accuracy (greater than 22 bits) to ensure optimal growth. I argue that such accuracies are not achievable in practice. Natural selection may thus face fundamental limitations in maximizing the information processing capacity of cells.
The analysis of metabolic networks opens a door to understanding cellular biology from a quantitative, information-theoretic perspective.
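The logarithmic relationship the abstract describes admits a quick back-of-envelope check (a sketch of my own, not the paper's flux-balance calculation): if selection can resolve a relative fitness difference s, a naive information-theoretic estimate of the sensing accuracy required is on the order of log₂(1/s) bits, which for s = 10⁻⁶ lands in the same ballpark as the paper's >22-bit figure.

```python
import math

# Selection in large microbial populations can resolve relative fitness
# differences down to roughly s = 1e-6 (per the abstract).
s = 1e-6

# If fitness scales logarithmically with sensing accuracy, resolving a
# relative difference s naively requires on the order of log2(1/s) bits.
bits = math.log2(1 / s)
print(f"~{bits:.1f} bits")  # prints ~19.9 bits
```

The full flux-balance model evidently tightens this rough figure to the >22 bits quoted above, but the order of magnitude already makes the point: such accuracies are implausibly high for a single cell.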
Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fueled by the production of entropy. Here, we undertake to make this intuition rigorous and quantitative by deriving a lower bound for the amount of heat that is produced during a process of self-replication in a system coupled to a thermal bath. We find that the minimum value for the physically allowed rate of heat production is determined by the growth rate, internal entropy, and durability of the replicator, and we discuss the implications of this finding for bacterial cell division, as well as for the pre-biotic emergence of self-replicating nucleic acids. https://doi.org/10.1063/1.4818538
Recent studies of active matter have stimulated interest in the driven self-assembly of complex structures. Phenomenological modeling of particular examples has yielded insight, but general thermodynamic principles unifying the rich diversity of behaviors observed have been elusive. Here, we study the stochastic search of a toy chemical space by a collection of reacting Brownian particles subject to periodic forcing. We observe the emergence of an adaptive resonance in the system matched to the drive frequency, and show that the increased work absorption by these resonant structures is key to their stabilization. Our findings are consistent with a recently proposed thermodynamic mechanism for far-from-equilibrium self-organization.
A qualitatively more diverse range of possible behaviors emerge in many-particle systems once external drives are allowed to push the system far from equilibrium; nonetheless, general thermodynamic principles governing nonequilibrium pattern formation and self-assembly have remained elusive, despite intense interest from researchers across disciplines. Here, we use the example of a randomly wired driven chemical reaction network to identify a key thermodynamic feature of a complex, driven system that characterizes the “specialness” of its dynamical attractor behavior. We show that the network’s fixed points are biased toward the extremization of external forcing, causing them to become kinetically stabilized in rare corners of chemical space that are either atypically weakly or strongly coupled to external environmental drives.
A chemical mixture that continually absorbs work from its environment may exhibit steady-state chemical concentrations that deviate from their equilibrium values. Such behavior is particularly interesting in a scenario where the environmental work sources are relatively difficult to access, so that only the proper orchestration of many distinct catalytic actors can power the dissipative flux required to maintain a stable, far-from-equilibrium steady state. In this article, we study the dynamics of an in silico chemical network with random connectivity in an environment that makes strong thermodynamic forcing available only to rare combinations of chemical concentrations. We find that the long-time dynamics of such systems are biased toward states that exhibit a fine-tuned extremization of environmental forcing.
Take chemistry, add energy, get life. The first tests of Jeremy England’s provocative origin-of-life hypothesis are in, and they appear to show how order can arise from nothing.
Interesting article with some great references I’ll need to delve into and read.
The situation changed in the late 1990s, when the physicists Gavin Crooks and Chris Jarzynski derived “fluctuation theorems” that can be used to quantify how much more often certain physical processes happen than reverse processes. These theorems allow researchers to study how systems evolve — even far from equilibrium.
I want to take a look at these papers, as well as the several papers the article is directly about.
Any claims that it has to do with biology or the origins of life, he added, are “pure and shameless speculations.”
Some truly harsh words from his former supervisor? Wow!
maybe there’s more that you can get for free
Most of what’s here in this article (and likely in the underlying papers) sounds to me to have been heavily influenced by the writings of W. Loewenstein and S. Kauffman. They’ve laid out some models/ideas that need more rigorous testing and work, and this seems like a reasonable start to the process. The “get for free” phrase itself is very S. Kauffman in my mind. I’m curious how many times it appears in his work?
14-16 May 2018;
Auditorium Enric Casassas, Faculty of Chemistry, University of Barcelona, Barcelona, Spain
One of the most frequently used scientific words is "entropy". The reason is that it is related to two main scientific domains: physics and information theory. Its origin goes back to the start of physics (thermodynamics), but since Shannon it has also become central to information theory. This conference is an opportunity to bring researchers from these two communities together and create a synergy. The main topics and sessions of the conference cover:
Physics: classical and quantum thermodynamics
Statistical physics and Bayesian computation
Geometrical science of information, topology and metrics
Maximum entropy principle and inference
Kullback and Bayes or information theory and Bayesian inference
Entropy in action (applications)
Interdisciplinary contributions from both theoretical and applied perspectives are very welcome, including papers addressing conceptual and methodological developments, as well as new applications of entropy and information theory.
All accepted papers will be published in the proceedings of the conference. Authors of selected invited and contributed talks presented during the conference will be invited to submit an extended version of their paper for a special issue of the open access journal Entropy.
Biomolecular systems like molecular motors or pumps, transcription and translation machinery, and other enzymatic reactions, can be described as Markov processes on a suitable network. We show quite generally that, in a steady state, the dispersion of observables, like the number of consumed or produced molecules or the number of steps of a motor, is constrained by the thermodynamic cost of generating it. An uncertainty ε requires at least a cost of 2k_B T/ε^2 independent of the time required to generate the output.
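The bound in that last sentence is simple enough to tabulate directly. A minimal sketch (the function name is mine) of the cost floor in units of k_B T:

```python
# Barato-Seifert thermodynamic uncertainty relation: achieving a relative
# uncertainty eps in a steady-state output (e.g. steps of a molecular motor)
# dissipates at least 2 * kB * T / eps**2 of free energy, independent of
# how long the process takes.
def min_cost_kT(eps):
    """Minimum thermodynamic cost, in units of kB*T, for relative uncertainty eps."""
    return 2.0 / eps**2

for eps in (0.1, 0.01):
    print(f"eps = {eps}: at least {min_cost_kT(eps):,.0f} kB*T")
```

Note the quadratic scaling: each tenfold improvement in precision costs a hundredfold more dissipation, which is why highly precise molecular machines are so energetically expensive.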
Life was long thought to obey its own set of rules. But as simple systems show signs of lifelike behavior, scientists are arguing about whether this apparent complexity is all a consequence of thermodynamics.
This is a nice little general interest article by Philip Ball that does a relatively good job of covering several of my favorite topics (information theory, biology, complexity) for the layperson. While it stays relatively basic, it links to a handful of really great references, many of which I’ve already read, though several appear to be new to me. 
While Ball has a broad range of interests and coverage in his work, he's certainly one of the best journalists working in this subarea today. I highly recommend his work to those who find this area interesting.
E. Mayr, What Makes Biology Unique? Cambridge University Press, 2004.
A. Wissner-Gross and C. Freer, "Causal entropic forces," Phys Rev Lett, vol. 110, no. 16, p. 168702, Apr. 2013. [PubMed]
A. Barato and U. Seifert, "Thermodynamic uncertainty relation for biomolecular processes," Phys Rev Lett, vol. 114, no. 15, p. 158101, Apr. 2015. [PubMed]
J. Shay and W. Wright, "Hayflick, his limit, and cellular ageing," Nat Rev Mol Cell Biol, vol. 1, no. 1, pp. 72–6, Oct. 2000. [PubMed]
X. Dong, B. Milholland, and J. Vijg, “Evidence for a limit to human lifespan,” Nature, vol. 538, no. 7624. Springer Nature, pp. 257–259, 05-Oct-2016 [Online]. Available: http://dx.doi.org/10.1038/nature19793
R. Landauer, “Irreversibility and Heat Generation in the Computing Process,” IBM Journal of Research and Development, vol. 5, no. 3. IBM, pp. 183–191, Jul-1961 [Online]. Available: http://dx.doi.org/10.1147/rd.53.0183
It is argued that if the non-unitary measurement transition, as codified by Von Neumann, is a real physical process, then the "probability assumption" needed to derive the Second Law of Thermodynamics naturally enters at that point. The existence of a real, indeterministic physical process underlying the measurement transition would therefore provide an ontological basis for Boltzmann's Stosszahlansatz and thereby explain the unidirectional increase of entropy against a backdrop of otherwise time-reversible laws. It is noted that the Transactional Interpretation (TI) of quantum mechanics provides such a physical account of the non-unitary measurement transition, and TI is brought to bear in finding a physically complete, non-ad hoc grounding for the Second Law.
Remarkable progress in quantum information theory (QIT) has allowed the formulation of mathematical theorems for conditions under which data transmission or data processing occurs with a non-negative entropy gain. However, the relation of these results, formulated in terms of entropy gain in quantum channels, to the temporal evolution of real physical systems is not thoroughly understood. Here we build on the mathematical formalism provided by QIT to formulate the quantum H-theorem in terms of physical observables. We discuss the manifestation of the second law of thermodynamics in quantum physics and uncover special situations where the second law can be violated. We further demonstrate that the typical evolution of energy-isolated quantum systems occurs with non-diminishing entropy.
G. B. Lesovik, A. V. Lebedev, I. A. Sadovskyy, M. V. Suslov, and V. M. Vinokur, “H-theorem in quantum physics,” Scientific Reports, vol. 6. Springer Nature, p. 32815, 12-Sep-2016 [Online]. Available: http://dx.doi.org/10.1038/srep32815
Running a brain-twisting thought experiment for real shows that information is a physical thing – so can we now harness the most elusive entity in the cosmos?
This is a nice little overview article of some of the history of thermodynamics relating to information in physics and includes some recent physics advances as well. There are a few references to applications in biology at the micro level as well.
David Auerbach's review of Stanislaw Lem's Summa Technologiae, from the Los Angeles Review of Books.
AT LAST WE have it in English. Summa Technologiae, originally published in Polish in 1964, is the cornerstone of Stanislaw Lem’s oeuvre, his consummate work of speculative nonfiction. Trained in medicine and biology, Lem synthesizes the current science of the day in ways far ahead of most science fiction of the time.
I came across this book review quite serendipitously today via an Auerbach article in Slate, which I've bookmarked. I found a copy of the book and have added it to the top of my reading pile. As I'm currently reading an advance reader edition of Sean Carroll's The Big Picture, I can only imagine how well the two may go together despite being written more than 50 years apart.