Self-Organized Resonance during Search of a Diverse Chemical Space (Physical Review Letters)
Abstract
Recent studies of active matter have stimulated interest in the driven self-assembly of complex structures. Phenomenological modeling of particular examples has yielded insight, but general thermodynamic principles unifying the rich diversity of behaviors observed have been elusive. Here, we study the stochastic search of a toy chemical space by a collection of reacting Brownian particles subject to periodic forcing. We observe the emergence of an adaptive resonance in the system matched to the drive frequency, and show that the increased work absorption by these resonant structures is key to their stabilization. Our findings are consistent with a recently proposed thermodynamic mechanism for far-from-equilibrium self-organization.
Suggested by First Support for a Physics Theory of Life in Quanta Magazine.
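To get an intuition for why work absorption would single out resonant structures, here's a minimal toy sketch in Python (my own analogue, not the paper's model of reacting Brownian particles): the textbook expression for the time-averaged power absorbed by a damped, driven harmonic oscillator, swept over natural frequencies, peaks when the oscillator happens to be tuned to the drive.

```python
import numpy as np

def mean_absorbed_power(omega0, omega_d=1.0, gamma=0.1, F0=1.0, m=1.0):
    """Time-averaged power absorbed in steady state by a damped harmonic
    oscillator (damping force m*gamma*v) with natural frequency omega0,
    driven sinusoidally at omega_d. Standard textbook result."""
    denom = (omega0**2 - omega_d**2) ** 2 + (gamma * omega_d) ** 2
    return F0**2 * gamma * omega_d**2 / (2 * m * denom)

# Sweep the oscillator's natural frequency: absorption peaks sharply when
# omega0 matches the drive, i.e. the structure "tuned" to the drive soaks
# up the most work.
for w0 in np.linspace(0.5, 1.5, 11):
    print(f"omega0 = {w0:.2f}   <P> = {mean_absorbed_power(w0):.3f}")
```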
Spontaneous fine-tuning to environment in many-species chemical reaction networks (Proceedings of the National Academy of Sciences)
Significance
A qualitatively more diverse range of possible behaviors emerge in many-particle systems once external drives are allowed to push the system far from equilibrium; nonetheless, general thermodynamic principles governing nonequilibrium pattern formation and self-assembly have remained elusive, despite intense interest from researchers across disciplines. Here, we use the example of a randomly wired driven chemical reaction network to identify a key thermodynamic feature of a complex, driven system that characterizes the “specialness” of its dynamical attractor behavior. We show that the network’s fixed points are biased toward the extremization of external forcing, causing them to become kinetically stabilized in rare corners of chemical space that are either atypically weakly or strongly coupled to external environmental drives.

Abstract
A chemical mixture that continually absorbs work from its environment may exhibit steady-state chemical concentrations that deviate from their equilibrium values. Such behavior is particularly interesting in a scenario where the environmental work sources are relatively difficult to access, so that only the proper orchestration of many distinct catalytic actors can power the dissipative flux required to maintain a stable, far-from-equilibrium steady state. In this article, we study the dynamics of an in silico chemical network with random connectivity in an environment that makes strong thermodynamic forcing available only to rare combinations of chemical concentrations. We find that the long-time dynamics of such systems are biased toward states that exhibit a fine-tuned extremization of environmental forcing.
Suggested by First Support for a Physics Theory of Life in Quanta Magazine.

First Support for a Physics Theory of Life by Natalie Wolchover (Quanta Magazine)
Take chemistry, add energy, get life. The first tests of Jeremy England’s provocative origin-of-life hypothesis are in, and they appear to show how order can arise from nothing.
Interesting article with some great references I’ll need to delve into and read.


The situation changed in the late 1990s, when the physicists Gavin Crooks and Chris Jarzynski derived “fluctuation theorems” that can be used to quantify how much more often certain physical processes happen than reverse processes. These theorems allow researchers to study how systems evolve — even far from equilibrium.

I want to take a look at these papers as well as the several that the article directly covers.
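As a warm-up, the flavor of these fluctuation theorems is easy to reproduce numerically. Here's a minimal sketch (my own toy example, not taken from the papers) of the Jarzynski equality for an overdamped Brownian particle dragged in a harmonic trap: the average work is strictly positive, yet the exponential average <exp(-W/k_BT)> comes out equal to exp(-ΔF/k_BT) = 1, since translating the trap leaves the free energy unchanged.

```python
import numpy as np

rng = np.random.default_rng(1)

kBT, gamma, k = 1.0, 1.0, 5.0     # temperature, friction, trap stiffness
v, tau, dt = 1.0, 1.0, 1e-3       # trap speed, protocol duration, time step
n_traj = 20000
n_steps = int(tau / dt)

# start each trajectory equilibrated in the trap centered at lambda = 0
x = rng.normal(0.0, np.sqrt(kBT / k), size=n_traj)
work = np.zeros(n_traj)

for i in range(n_steps):
    lam = v * i * dt                              # trap center lambda(t)
    # work done on the particle by shifting the trap: dW = (dU/dlam) * dlam
    work += -k * (x - lam) * v * dt
    # overdamped Langevin step (Euler-Maruyama)
    x += -k * (x - lam) / gamma * dt + np.sqrt(2 * kBT * dt / gamma) * rng.normal(size=n_traj)

# Dragging the trap costs work on average, yet the exponential average
# recovers exp(-dF/kBT) = 1 because dF = 0 for a rigidly translated trap.
print("mean work / kBT :", work.mean())
print("<exp(-W/kBT)>   :", np.exp(-work / kBT).mean())   # should be close to 1
```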


Any claims that it has to do with biology or the origins of life, he added, are “pure and shameless speculations.”

Some truly harsh words from his former supervisor? Wow!


maybe there’s more that you can get for free

Most of what’s here in this article (and likely in the underlying papers) sounds to me to have been heavily influenced by the writings of W. Loewenstein and S. Kauffman. They’ve laid out some models/ideas that need more rigorous testing and work, and this seems like a reasonable start to the process. The “get for free” phrase itself is very S. Kauffman in my mind. I’m curious how many times it appears in his work?

Might be attending Entropy 2018: From Physics to Information Sciences and Geometry
14-16 May 2018; Auditorium Enric Casassas, Faculty of Chemistry, University of Barcelona, Barcelona, Spain

One of the most frequently used scientific words is the word “Entropy”. The reason is that it is related to two main scientific domains: physics and information theory. Its origin goes back to the start of physics (thermodynamics), but since Shannon, it has become related to information theory. This conference is an opportunity to bring researchers from these two communities together and create a synergy. The main topics and sessions of the conference cover:

  • Physics: classical Thermodynamics and Quantum
  • Statistical physics and Bayesian computation
  • Geometrical science of information, topology and metrics
  • Maximum entropy principle and inference
  • Kullback and Bayes or information theory and Bayesian inference
  • Entropy in action (applications)

Interdisciplinary contributions from both theoretical and applied perspectives are very welcome, including papers addressing conceptual and methodological developments, as well as new applications of entropy and information theory.

All accepted papers will be published in the proceedings of the conference. Authors of selected invited and contributed talks presented during the conference will be invited to submit an extended version of their paper for a special issue of the open-access journal Entropy.

Entropy 2018 Conference

Statistical Mechanics, Spring 2016 (Physics 12c) by John Preskill (Caltech)
An introductory course in statistical mechanics.
Recommended textbook: Thermal Physics by Charles Kittel and Herbert Kroemer

There’s also a corresponding video lecture series available on YouTube
https://www.youtube.com/playlist?list=PL0ojjrEqIyPzgJUUW76koGcSCy6OGtDRI

Thermodynamic Uncertainty Relation for Biomolecular Processes (Phys. Rev. Lett. 114, 158101 (2015) - journals.aps.org)
Biomolecular systems like molecular motors or pumps, transcription and translation machinery, and other enzymatic reactions can be described as Markov processes on a suitable network. We show quite generally that, in a steady state, the dispersion of observables, like the number of consumed or produced molecules or the number of steps of a motor, is constrained by the thermodynamic cost of generating it. An uncertainty ε requires at least a cost of 2k_B T/ε^2, independent of the time required to generate the output.
[1]
A. C. Barato and U. Seifert, “Thermodynamic Uncertainty Relation for Biomolecular Processes,” Physical Review Letters, vol. 114, no. 15. American Physical Society (APS), 15-Apr-2015 [Online]. Available: http://dx.doi.org/10.1103/PhysRevLett.114.158101 [Source]
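To get a feel for the numbers: a 10% relative uncertainty (ε = 0.1) already demands at least 2k_B T/0.01 = 200 k_B T of dissipation. The bound is also easy to check on the standard biased-random-walk example (my own sketch below, not anything from the paper): the product ε² × cost always stays at or above 2k_B T.

```python
import numpy as np

kBT = 1.0  # work in units of k_B * T

def tur_product(k_plus, k_minus, t=1000.0):
    """For a biased random walk with forward rate k_plus and backward rate
    k_minus, return (relative uncertainty)^2 times the dissipated cost after
    time t. The thermodynamic uncertainty relation says this is >= 2 k_B T."""
    mean = (k_plus - k_minus) * t              # mean net displacement (steps)
    var = (k_plus + k_minus) * t               # variance of net displacement
    eps_sq = var / mean**2                     # squared relative uncertainty
    affinity = kBT * np.log(k_plus / k_minus)  # dissipation per net step
    cost = mean * affinity                     # total dissipated work
    return eps_sq * cost                       # independent of t

for kp, km in [(1.1, 1.0), (2.0, 1.0), (10.0, 1.0)]:
    print(f"k+ = {kp}, k- = {km}:  eps^2 * cost = {tur_product(kp, km):.3f}  (bound: 2.0)")
```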
How Life (and Death) Spring From Disorder (Quanta Magazine)
Life was long thought to obey its own set of rules. But as simple systems show signs of lifelike behavior, scientists are arguing about whether this apparent complexity is all a consequence of thermodynamics.
This is a nice little general interest article by Philip Ball that does a relatively good job of covering several of my favorite topics (information theory, biology, complexity) for the layperson. While it stays relatively basic, it links to a handful of really great references, many of which I’ve already read, though several appear to be new to me. [1][2][3][4][5][6][7][8][9][10]

While Ball covers a broad range of areas in his work, he's certainly one of the best journalists working in this particular subarea today. I highly recommend his work to those who find this area interesting.

References

[1]
E. Mayr, What Makes Biology Unique? Cambridge University Press, 2004.
[2]
A. Wissner-Gross and C. Freer, “Causal entropic forces,” Phys Rev Lett, vol. 110, no. 16, p. 168702, Apr. 2013. [PubMed]
[3]
A. Barato and U. Seifert, “Thermodynamic uncertainty relation for biomolecular processes,” Phys Rev Lett, vol. 114, no. 15, p. 158101, Apr. 2015. [PubMed]
[4]
J. Shay and W. Wright, “Hayflick, his limit, and cellular ageing,” Nat Rev Mol Cell Biol, vol. 1, no. 1, pp. 72–6, Oct. 2000. [PubMed]
[5]
X. Dong, B. Milholland, and J. Vijg, “Evidence for a limit to human lifespan,” Nature, vol. 538, no. 7624. Springer Nature, pp. 257–259, 05-Oct-2016 [Online]. Available: http://dx.doi.org/10.1038/nature19793
[6]
H. Morowitz and E. Smith, “Energy Flow and the Organization of Life,” Santa Fe Institute, 07-Aug-2006. [Online]. Available: http://samoa.santafe.edu/media/workingpapers/06-08-029.pdf. [Accessed: 03-Feb-2017]
[7]
R. Landauer, “Irreversibility and Heat Generation in the Computing Process,” IBM Journal of Research and Development, vol. 5, no. 3. IBM, pp. 183–191, Jul-1961 [Online]. Available: http://dx.doi.org/10.1147/rd.53.0183
[8]
C. Rovelli, “Meaning = Information + Evolution,” arXiv, Nov. 2006 [Online]. Available: https://arxiv.org/abs/1611.02420
[9]
N. Perunov, R. A. Marsland, and J. L. England, “Statistical Physics of Adaptation,” Physical Review X, vol. 6, no. 2. American Physical Society (APS), 16-Jun-2016 [Online]. Available: http://dx.doi.org/10.1103/PhysRevX.6.021036 [Source]
[10]
S. Still, D. A. Sivak, A. J. Bell, and G. E. Crooks, “Thermodynamics of Prediction,” Physical Review Letters, vol. 109, no. 12. American Physical Society (APS), 19-Sep-2012 [Online]. Available: http://dx.doi.org/10.1103/PhysRevLett.109.120604 [Source]
A Physical Basis for the Second Law of Thermodynamics: Quantum Nonunitarity (arxiv.org)
It is argued that if the non-unitary measurement transition, as codified by Von Neumann, is a real physical process, then the "probability assumption" needed to derive the Second Law of Thermodynamics naturally enters at that point. The existence of a real, indeterministic physical process underlying the measurement transition would therefore provide an ontological basis for Boltzmann's Stosszahlansatz and thereby explain the unidirectional increase of entropy against a backdrop of otherwise time-reversible laws. It is noted that the Transactional Interpretation (TI) of quantum mechanics provides such a physical account of the non-unitary measurement transition, and TI is brought to bear in finding a physically complete, non-ad hoc grounding for the Second Law.
Download .pdf copy
H-theorem in quantum physics (Nature.com)

Abstract

Remarkable progress in quantum information theory (QIT) has allowed the formulation of mathematical theorems for the conditions under which data transmission or data processing occurs with a non-negative entropy gain. However, the relation of these results, formulated in terms of entropy gain in quantum channels, to the temporal evolution of real physical systems is not thoroughly understood. Here we build on the mathematical formalism provided by QIT to formulate the quantum H-theorem in terms of physical observables. We discuss the manifestation of the second law of thermodynamics in quantum physics and uncover special situations where the second law can be violated. We further demonstrate that the typical evolution of energy-isolated quantum systems occurs with non-diminishing entropy. [1]

Footnotes

[1]
G. B. Lesovik, A. V. Lebedev, I. A. Sadovskyy, M. V. Suslov, and V. M. Vinokur, “H-theorem in quantum physics,” Scientific Reports, vol. 6. Springer Nature, p. 32815, 12-Sep-2016 [Online]. Available: http://dx.doi.org/10.1038/srep32815
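As a small illustration of the sort of entropy-gain statement at issue (my own sketch, not from the paper): for a unital qubit channel such as depolarizing noise, the von Neumann entropy of any input state can only increase, a toy analogue of the non-diminishing entropy the authors claim for typical evolutions of energy-isolated systems.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), in nats."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop (numerically) zero eigenvalues
    return float(-np.sum(evals * np.log(evals)))

def depolarize(rho, p):
    """Unital depolarizing channel on a qubit: rho -> (1-p)*rho + p*I/2."""
    return (1 - p) * rho + p * np.eye(2) / 2

# a partially mixed qubit state (Hermitian, trace one, positive)
rho = np.array([[0.9, 0.1],
                [0.1, 0.1]], dtype=complex)

for p in (0.0, 0.2, 0.5, 1.0):
    print(f"p = {p:.1f}   S = {von_neumann_entropy(depolarize(rho, p)):.3f} nats")
# The entropy grows monotonically with p: a unital channel never lowers S(rho).
```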
A New Physics Theory of Life by Natalie Wolchover (quantamagazine.org)
Jeremy England, a 31-year-old physicist at MIT, thinks he has found the underlying physics driving the origin and evolution of life.

Matter, energy… knowledge: How to harness physics' demonic power (New Scientist)
Running a brain-twisting thought experiment for real shows that information is a physical thing – so can we now harness the most elusive entity in the cosmos?
This is a nice little overview article of some of the history of thermodynamics relating to information in physics and includes some recent physics advances as well. There are a few references to applications in biology at the micro level as well.

Devourer of Encyclopedias: Stanislaw Lem's "Summa Technologiae" (The Los Angeles Review of Books)
David Auerbach's review of Stanislaw Lem's Summa Technologiae from the Los Angeles Review of Books.

Summa Technologiae

At last we have it in English. Summa Technologiae, originally published in Polish in 1964, is the cornerstone of Stanislaw Lem’s oeuvre, his consummate work of speculative nonfiction. Trained in medicine and biology, Lem synthesizes the current science of the day in ways far ahead of most science fiction of the time.

His subjects, among others, include:

  • Virtual reality
  • Artificial intelligence
  • Nanotechnology and biotechnology
  • Evolutionary biology and evolutionary psychology
  • Artificial life
  • Information theory
  • Entropy and thermodynamics
  • Complexity theory, probability, and chaos
  • Population and ecological catastrophe
  • The “singularity” and “transhumanism”

Source: Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae” – The Los Angeles Review of Books

I came across this book review quite serendipitously today via an Auerbach article in Slate, which I’ve bookmarked. I found a copy of the book and have added it to the top of my reading pile. As I’m currently reading an advance reader edition of Sean Carroll’s The Big Picture, I can only imagine how well the two may go together despite being written more than 50 years apart.

What is Information? [1601.06176] (arxiv.org)
Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
A proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
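Adami's distinction between entropy and information is easy to make concrete. A minimal sketch (my own example, not from the paper): for a noisy binary channel, the entropy H(X) measures the uncertainty about X, while the information I(X;Y) = H(X) - H(X|Y) is the reduction of that uncertainty, i.e. how much better X can be predicted once Y is known.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability vector p."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Joint distribution p(x, y): X is a fair coin, Y is X flipped with prob. 0.1
pxy = np.array([[0.45, 0.05],
                [0.05, 0.45]])

px, py = pxy.sum(axis=1), pxy.sum(axis=0)         # marginals
H_X = entropy(px)                                 # uncertainty about X alone: 1 bit
H_X_given_Y = entropy(pxy.ravel()) - entropy(py)  # uncertainty left once Y is known
I_XY = H_X - H_X_given_Y                          # information Y carries about X (~0.53 bits)

print(f"H(X) = {H_X:.3f}  H(X|Y) = {H_X_given_Y:.3f}  I(X;Y) = {I_XY:.3f} bits")
```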

Comments: 19 pages, 2 figures. To appear in Philosophical Transactions of the Royal Society A
Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)
Cite as: arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)

From: Christoph Adami
[v1] Fri, 22 Jan 2016 21:35:44 GMT (151kb,D) [.pdf]

Source: Christoph Adami [1601.06176] What is Information? on arXiv