Organisms live and die by the amount of information they acquire about their environment. The systems analysis of complex metabolic networks allows us to ask how such information translates into fitness. A metabolic network transforms nutrients into biomass. The better it uses information about nutrient availability, the faster it will allow a cell to divide.
I here use metabolic flux balance analysis to show that the accuracy I (in bits) with which a yeast cell can sense a limiting nutrient's availability relates logarithmically to fitness as indicated by biomass yield and cell division rate. For microbes like yeast, natural selection can resolve fitness differences of genetic variants smaller than 10⁻⁶, meaning that cells would need to estimate nutrient concentrations to very high accuracy (greater than 22 bits) to ensure optimal growth. I argue that such accuracies are not achievable in practice. Natural selection may thus face fundamental limitations in maximizing the information processing capacity of cells.
The analysis of metabolic networks opens a door to understanding cellular biology from a quantitative, information-theoretic perspective.
Received: 01 March 2007 Accepted: 30 July 2007 Published: 30 July 2007
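As a back-of-envelope check on the numbers in that abstract (my own arithmetic, not from the paper, and assuming "accuracy in bits" means log₂ of the number of resolvable concentration levels): resolving relative fitness differences of 10⁻⁶ corresponds to about 20 bits, so the paper's greater-than-22-bit figure sits comfortably above that threshold.

```python
import math

# Bits needed to distinguish one part in 10^6 (my assumption: accuracy in
# bits = log2 of the number of resolvable levels).
bits_needed = math.log2(1e6)   # ~19.93 bits

# Resolution implied by a 22-bit estimate:
resolution_22 = 2 ** -22       # ~2.4e-7, i.e. finer than 1e-6

print(round(bits_needed, 2), resolution_22 < 1e-6)
```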
Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fueled by the production of entropy. Here, we undertake to make this intuition rigorous and quantitative by deriving a lower bound for the amount of heat that is produced during a process of self-replication in a system coupled to a thermal bath. We find that the minimum value for the physically allowed rate of heat production is determined by the growth rate, internal entropy, and durability of the replicator, and we discuss the implications of this finding for bacterial cell division, as well as for the pre-biotic emergence of self-replicating nucleic acids.
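For reference, the bound itself (my paraphrase of the paper's result, so treat the notation as approximate): for a replicator with per-capita growth rate g and decay rate δ, releasing heat Δq into a bath at inverse temperature β while its internal entropy changes by Δs_int, the inequality reads roughly as below, so a replicator that grows quickly and is durable must dissipate correspondingly more heat.

```latex
% Lower bound on heat production for a self-replicator (my paraphrase;
% g = growth rate, \delta = decay rate, \Delta q = heat released,
% \Delta s_{\mathrm{int}} = internal entropy change):
\beta\,\Delta q + \Delta s_{\mathrm{int}} \;\geq\; \ln\!\left(\frac{g}{\delta}\right)
```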
ABSTRACT Recent studies of active matter have stimulated interest in the driven self-assembly of complex structures. Phenomenological modeling of particular examples has yielded insight, but general thermodynamic principles unifying the rich diversity of behaviors observed have been elusive. Here, we study the stochastic search of a toy chemical space by a collection of reacting Brownian particles subject to periodic forcing. We observe the emergence of an adaptive resonance in the system matched to the drive frequency, and show that the increased work absorption by these resonant structures is key to their stabilization. Our findings are consistent with a recently proposed thermodynamic mechanism for far-from-equilibrium self-organization.
Significance
A qualitatively more diverse range of possible behaviors emerge in many-particle systems once external drives are allowed to push the system far from equilibrium; nonetheless, general thermodynamic principles governing nonequilibrium pattern formation and self-assembly have remained elusive, despite intense interest from researchers across disciplines. Here, we use the example of a randomly wired driven chemical reaction network to identify a key thermodynamic feature of a complex, driven system that characterizes the “specialness” of its dynamical attractor behavior. We show that the network’s fixed points are biased toward the extremization of external forcing, causing them to become kinetically stabilized in rare corners of chemical space that are either atypically weakly or strongly coupled to external environmental drives.

Abstract
A chemical mixture that continually absorbs work from its environment may exhibit steady-state chemical concentrations that deviate from their equilibrium values. Such behavior is particularly interesting in a scenario where the environmental work sources are relatively difficult to access, so that only the proper orchestration of many distinct catalytic actors can power the dissipative flux required to maintain a stable, far-from-equilibrium steady state. In this article, we study the dynamics of an in silico chemical network with random connectivity in an environment that makes strong thermodynamic forcing available only to rare combinations of chemical concentrations. We find that the long-time dynamics of such systems are biased toward states that exhibit a fine-tuned extremization of environmental forcing.
Take chemistry, add energy, get life. The first tests of Jeremy England’s provocative origin-of-life hypothesis are in, and they appear to show how order can arise from nothing.
The situation changed in the late 1990s, when the physicists Gavin Crooks and Chris Jarzynski derived “fluctuation theorems” that can be used to quantify how much more often certain physical processes happen than reverse processes. These theorems allow researchers to study how systems evolve — even far from equilibrium.
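To make that "how much more often forward than reverse" idea concrete, here's a minimal numerical illustration (my own sketch, not from the article, with k_B T = 1): for a Gaussian work distribution with variance 2⟨W⟩/β, which is the form slow driving of an overdamped system with ΔF = 0 produces, the Crooks relation P(+W)/P(−W) = exp(βW) holds exactly, and the Jarzynski equality ⟨exp(−βW)⟩ = 1 follows.

```python
import math

beta = 1.0                 # inverse temperature, 1/(k_B T)
mean_w = 0.5               # mean dissipated work (Delta F = 0 in this toy case)
var_w = 2 * mean_w / beta  # Gaussian variance consistent with Crooks

def p(w):
    """Gaussian work distribution P(W)."""
    return math.exp(-(w - mean_w) ** 2 / (2 * var_w)) / math.sqrt(2 * math.pi * var_w)

# Crooks: forward work values are exponentially more likely than their reversals.
crooks_ok = all(
    abs(p(w) / p(-w) - math.exp(beta * w)) < 1e-9
    for w in (0.25, 1.0, 2.5)
)

# Jarzynski: <exp(-beta W)> = exp(-beta*mean_w + beta^2*var_w/2) = 1 when Delta F = 0.
jarzynski_avg = math.exp(-beta * mean_w + beta ** 2 * var_w / 2)

print(crooks_ok, jarzynski_avg)
```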
I want to take a look at these papers as well as several of the others the article directly covers.
Any claims that it has to do with biology or the origins of life, he added, are “pure and shameless speculations.”
Some truly harsh words from his former supervisor? Wow!
maybe there’s more that you can get for free
Most of what’s here in this article (and likely in the underlying papers) sounds to me as if it has been heavily influenced by the writings of W. Loewenstein and S. Kauffman. They’ve laid out some models/ideas that need more rigorous testing and work, and this seems like a reasonable start to the process. The “get for free” phrase itself is very S. Kauffman in my mind. I’m curious how many times it appears in his work.
14-16 May 2018; Auditorium Enric Casassas, Faculty of Chemistry, University of Barcelona, Barcelona, Spain
One of the most frequently used scientific words is “entropy”. The reason is that it relates to two major scientific domains: physics and information theory. Its origin goes back to the beginnings of physics (thermodynamics), but since Shannon it has also become central to information theory. This conference is an opportunity to bring researchers from these two communities together and create a synergy. The main topics and sessions of the conference cover:
- Physics: classical and quantum thermodynamics
- Statistical physics and Bayesian computation
- Geometrical science of information, topology and metrics
- Maximum entropy principle and inference
- Kullback and Bayes or information theory and Bayesian inference
- Entropy in action (applications)
Interdisciplinary contributions from both theoretical and applied perspectives are very welcome, including papers addressing conceptual and methodological developments as well as new applications of entropy and information theory.
All accepted papers will be published in the conference proceedings. Authors of a selection of invited and contributed talks presented during the conference will be invited to submit an extended version of their paper for a special issue of the open-access journal Entropy.
An introductory course in statistical mechanics.
There’s also a corresponding video lecture series available on YouTube
As physicists extend the 19th-century laws of thermodynamics to the quantum realm, they’re rewriting the relationships among energy, entropy and information.
Biomolecular systems like molecular motors or pumps, transcription and translation machinery, and other enzymatic reactions, can be described as Markov processes on a suitable network. We show quite generally that, in a steady state, the dispersion of observables, like the number of consumed or produced molecules or the number of steps of a motor, is constrained by the thermodynamic cost of generating it. An uncertainty ε requires at least a cost of 2k_B T/ε^2 independent of the time required to generate the output.
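A quick simulation (my own sketch, with k_B T = 1 and toy rates I chose myself) illustrates this thermodynamic uncertainty relation for the simplest case, a continuous-time biased random walk: the product of the total entropy production and the squared relative uncertainty ε² should stay at or above 2.

```python
import math
import random

random.seed(42)

k_plus, k_minus = 2.0, 1.0   # forward/backward hopping rates (my toy choice)
T = 50.0                     # observation time
n_traj = 20000

def net_steps():
    """Gillespie simulation of a biased walk; net displacement by time T."""
    t, x = 0.0, 0
    while True:
        t += random.expovariate(k_plus + k_minus)
        if t > T:
            return x
        x += 1 if random.random() < k_plus / (k_plus + k_minus) else -1

samples = [net_steps() for _ in range(n_traj)]
mean = sum(samples) / n_traj
var = sum((s - mean) ** 2 for s in samples) / (n_traj - 1)

eps_sq = var / mean ** 2   # squared relative uncertainty of the output
# Total entropy production over time T, in units of k_B:
entropy_prod = (k_plus - k_minus) * T * math.log(k_plus / k_minus)

tur_product = entropy_prod * eps_sq   # analytic value here: 3*ln(2) ~ 2.08 >= 2
print(round(tur_product, 2))
```

Near equilibrium (k_plus → k_minus) the analytic product approaches the bound of 2, which is where the cost 2k_B T/ε² is tight.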
Life was long thought to obey its own set of rules. But as simple systems show signs of lifelike behavior, scientists are arguing about whether this apparent complexity is all a consequence of thermodynamics.
While Ball covers a broad range of interests in his work, he’s certainly one of the best journalists working in this particular subarea today. I highly recommend his work to those who find this area interesting.
It is argued that if the non-unitary measurement transition, as codified by Von Neumann, is a real physical process, then the "probability assumption" needed to derive the Second Law of Thermodynamics naturally enters at that point. The existence of a real, indeterministic physical process underlying the measurement transition would therefore provide an ontological basis for Boltzmann's Stosszahlansatz and thereby explain the unidirectional increase of entropy against a backdrop of otherwise time-reversible laws. It is noted that the Transactional Interpretation (TI) of quantum mechanics provides such a physical account of the non-unitary measurement transition, and TI is brought to bear in finding a physically complete, non-ad hoc grounding for the Second Law.
Remarkable progress in quantum information theory (QIT) has allowed the formulation of mathematical theorems for conditions under which data transmission or data processing occurs with a non-negative entropy gain. However, the relation of these results, formulated in terms of entropy gain in quantum channels, to the temporal evolution of real physical systems is not thoroughly understood. Here we build on the mathematical formalism provided by QIT to formulate the quantum H-theorem in terms of physical observables. We discuss the manifestation of the second law of thermodynamics in quantum physics and uncover special situations where the second law can be violated. We further demonstrate that the typical evolution of energy-isolated quantum systems occurs with non-diminishing entropy.
Jeremy England, a 31-year-old physicist at MIT, thinks he has found the underlying physics driving the origin and evolution of life.
- Jeremy L. England Lab
- Statistical physics of self-replication, Jeremy L. England; J. Chem. Phys. 139, 121923 (2013); doi: 10.1063/1.4818538
- Statistical Physics of Adaptation, Nikolai Perunov, Robert Marsland, and Jeremy England, arXiv, December 8, 2014
- Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences, Gavin E. Crooks, arXiv, February 1, 2008
- Life as a manifestation of the second law of thermodynamics, E.D. Schneider, J.J. Kay, doi:10.1016/0895-7177(94)90188-0, Mathematical and Computer Modelling, Volume 19, Issues 6–8, March–April 1994, Pages 25-48
Running a brain-twisting thought experiment for real shows that information is a physical thing – so can we now harness the most elusive entity in the cosmos?
- Second Law of Thermodynamics with Discrete Quantum Feedback Control by Takahiro Sagawa and Masahito Ueda; Phys. Rev. Lett. 100, 080403 – Published 26 February 2008
- Work and information processing in a solvable model of Maxwell’s demon by Dibyendu Mandal and Christopher Jarzynski; PNAS vol. 109 no. 29, July 17, 2012
- Thermodynamic Costs of Information Processing in Sensory Adaptation by Pablo Sartori, Léo Granger, Chiu Fan Lee, and Jordan M. Horowitz; PLOS December 11, 2014 http://dx.doi.org/10.1371/journal.pcbi.1003974
- Intermittent transcription dynamics for the rapid production of long transcripts of high fidelity by M. Depken, J.M. Parrondo, and S.W. Grill; Cell Rep. 2013 Oct 31;5(2):521-30. doi: 10.1016/j.celrep.2013.09.007
- The stepping motor protein as a feedback control ratchet by Martin Bier; BioSystems 88 (2007) 301–307