# Tag: thermodynamics

## 🔖 Statistical Mechanics, Spring 2016 (Caltech, Physics 12c with videos) by John Preskill

Statistical Mechanics, Spring 2016 (Physics 12c) by John Preskill *(Caltech)*

## 👓 The Quantum Thermodynamics Revolution | Quanta Magazine

The Quantum Thermodynamics Revolution *(Quanta Magazine)*

## 🔖 Thermodynamic Uncertainty Relation for Biomolecular Processes, Phys. Rev. Lett. 114, 158101 (2015)

Thermodynamic Uncertainty Relation for Biomolecular Processes by A. C. Barato and U. Seifert *(Phys. Rev. Lett. 114, 158101 (2015), journals.aps.org)*

## 🔖 How Life (and Death) Spring From Disorder | Quanta Magazine

How Life (and Death) Spring From Disorder by Philip Ball *(Quanta Magazine)*


## 🔖 A Physical Basis for the Second Law of Thermodynamics: Quantum Nonunitarity

A Physical Basis for the Second Law of Thermodynamics: Quantum Nonunitarity *(arxiv.org)*

## 🔖 H-theorem in quantum physics by G. B. Lesovik, et al.

H-theorem in quantum physics by G. B. Lesovik, et al. *(Nature.com)*



## A New Thermodynamics Theory of the Origin of Life | Quanta Magazine

A New Physics Theory of Life *(quantamagazine.org)*

## Matter, energy… knowledge: How to harness physics’ demonic power | New Scientist

Matter, energy… knowledge: How to harness physics’ demonic power *(New Scientist)*

## Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae”

Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae” by David Auerbach *(The Los Angeles Review of Books)*

## What is Information? by Christoph Adami

What is Information? [1601.06176] by Christoph Adami *(arxiv.org)*

## The Information Theory of Life | Quanta Magazine

The Information Theory of Life *(Quanta Magazine)*

## The Information Universe Conference



## NIMBioS Workshop: Information Theory and Entropy in Biological Systems

## Resources for Information Theory and Biology

## BIRS Workshop on Biological and Bio-Inspired Information Theory | Storify Stream

## Information Theory is the New Central Discipline

Musings of a Modern Day Cyberneticist

An introductory course in statistical mechanics.

Recommended textbook *Thermal Physics* by Charles Kittel and Herbert Kroemer

There’s also a corresponding video lecture series available on YouTube


As physicists extend the 19th-century laws of thermodynamics to the quantum realm, they’re rewriting the relationships among energy, entropy and information.

Biomolecular systems like molecular motors or pumps, transcription and translation machinery, and other enzymatic reactions can be described as Markov processes on a suitable network. We show quite generally that, in a steady state, the dispersion of observables, like the number of consumed or produced molecules or the number of steps of a motor, is constrained by the thermodynamic cost of generating it. An uncertainty ε requires at least a cost of 2k_B T/ε^2, independent of the time required to generate the output.

[1]

A. C. Barato and U. Seifert, “Thermodynamic Uncertainty Relation for Biomolecular Processes,” *Physical Review Letters*, vol. 114, no. 15. American Physical Society (APS), 15-Apr-2015 [Online]. Available: http://dx.doi.org/10.1103/PhysRevLett.114.158101 [Source]
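The bound quoted in the abstract is simple enough to check numerically. Below is a minimal sketch (my own illustration, with constant and function names of my choosing, not code from the paper) of the cost floor 2·k_B·T/ε² and its scaling:

```python
# Illustrative sketch of the thermodynamic uncertainty relation's cost bound:
# reaching relative uncertainty eps requires at least 2*k_B*T/eps^2 of
# free energy, regardless of how long the process runs.

K_B = 1.380649e-23  # Boltzmann constant, J/K

def min_cost(eps: float, temperature: float = 300.0) -> float:
    """Minimum thermodynamic cost (joules) to reach relative uncertainty eps."""
    if eps <= 0:
        raise ValueError("uncertainty must be positive")
    return 2 * K_B * temperature / eps**2

# Halving the uncertainty quadruples the minimum cost:
c1 = min_cost(0.1)
c2 = min_cost(0.05)
assert abs(c2 / c1 - 4.0) < 1e-9
```

The quadratic scaling is the punchline: precision in a molecular motor's output is paid for steeply in dissipated free energy.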

Life was long thought to obey its own set of rules. But as simple systems show signs of lifelike behavior, scientists are arguing about whether this apparent complexity is all a consequence of thermodynamics.

This is a nice little general interest article by Philip Ball that does a relatively good job of covering several of my favorite topics (information theory, biology, complexity) for the layperson. While it stays relatively basic, it links to a handful of really great references, many of which I’ve already read, though several appear to be new to me. [1][2][3][4][5][6][7][8][9][10]

While Ball has a broad area of interests and coverage in his work, he’s certainly one of the best journalists working in this subarea of interests today. I highly recommend his work to those who find this area interesting.

[1]

E. Mayr, *What Makes Biology Unique?* Cambridge University Press, 2004.

[2]

A. Wissner-Gross and C. Freer, “Causal entropic forces,” *Phys Rev Lett*, vol. 110, no. 16, p. 168702, Apr. 2013. [PubMed]

[3]

A. Barato and U. Seifert, “Thermodynamic uncertainty relation for biomolecular processes,” *Phys Rev Lett*, vol. 114, no. 15, p. 158101, Apr. 2015. [PubMed]

[4]

J. Shay and W. Wright, “Hayflick, his limit, and cellular ageing,” *Nat Rev Mol Cell Biol*, vol. 1, no. 1, pp. 72–76, Oct. 2000. [PubMed]

[5]

X. Dong, B. Milholland, and J. Vijg, “Evidence for a limit to human lifespan,” *Nature*, vol. 538, no. 7624. Springer Nature, pp. 257–259, 05-Oct-2016 [Online]. Available: http://dx.doi.org/10.1038/nature19793

[6]

H. Morowitz and E. Smith, “Energy Flow and the Organization of Life,” *Santa Fe Institute*, 07-Aug-2006. [Online]. Available: http://samoa.santafe.edu/media/workingpapers/06-08-029.pdf. [Accessed: 03-Feb-2017]

[7]

R. Landauer, “Irreversibility and Heat Generation in the Computing Process,” *IBM Journal of Research and Development*, vol. 5, no. 3. IBM, pp. 183–191, Jul-1961 [Online]. Available: http://dx.doi.org/10.1147/rd.53.0183

[8]

C. Rovelli, “Meaning = Information + Evolution,” *arXiv*, Nov. 2006 [Online]. Available: https://arxiv.org/abs/1611.02420

[9]

N. Perunov, R. A. Marsland, and J. L. England, “Statistical Physics of Adaptation,” *Physical Review X*, vol. 6, no. 2. American Physical Society (APS), 16-Jun-2016 [Online]. Available: http://dx.doi.org/10.1103/PhysRevX.6.021036 [Source]

[10]

S. Still, D. A. Sivak, A. J. Bell, and G. E. Crooks, “Thermodynamics of Prediction,” *Physical Review Letters*, vol. 109, no. 12. American Physical Society (APS), 19-Sep-2012 [Online]. Available: http://dx.doi.org/10.1103/PhysRevLett.109.120604 [Source]

It is argued that if the non-unitary measurement transition, as codified by Von Neumann, is a real physical process, then the "probability assumption" needed to derive the Second Law of Thermodynamics naturally enters at that point. The existence of a real, indeterministic physical process underlying the measurement transition would therefore provide an ontological basis for Boltzmann's Stosszahlansatz and thereby explain the unidirectional increase of entropy against a backdrop of otherwise time-reversible laws. It is noted that the Transactional Interpretation (TI) of quantum mechanics provides such a physical account of the non-unitary measurement transition, and TI is brought to bear in finding a physically complete, non-ad hoc grounding for the Second Law.

The remarkable progress of quantum information theory (QIT) has allowed mathematical theorems to be formulated for the conditions under which data transmission or data processing occurs with a non-negative entropy gain. However, the relation of these results, formulated in terms of entropy gain in quantum channels, to the temporal evolution of real physical systems is not thoroughly understood. Here we build on the mathematical formalism provided by QIT to formulate the quantum H-theorem in terms of physical observables. We discuss the manifestation of the second law of thermodynamics in quantum physics and uncover special situations where the second law can be violated. We further demonstrate that the typical evolution of energy-isolated quantum systems occurs with non-diminishing entropy. [1]

[1]

G. B. Lesovik, A. V. Lebedev, I. A. Sadovskyy, M. V. Suslov, and V. M. Vinokur, “H-theorem in quantum physics,” *Scientific Reports*, vol. 6. Springer Nature, p. 32815, 12-Sep-2016 [Online]. Available: http://dx.doi.org/10.1038/srep32815
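The entropy non-decrease the authors formalize can be illustrated in a toy finite-dimensional setting. The sketch below (my own illustration, not the paper's construction) checks that a simple unital channel, full dephasing, never lowers von Neumann entropy:

```python
# Toy check: for a unital quantum channel such as full dephasing,
# von Neumann entropy cannot decrease -- a small finite-dimensional
# analogue of the H-theorem statement quoted above.
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -Tr(rho ln rho), in nats; zero eigenvalues contribute nothing."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

def dephase(rho: np.ndarray) -> np.ndarray:
    """A unital channel: discard the off-diagonal terms (full dephasing)."""
    return np.diag(np.diag(rho))

# The pure state |+><+| has zero entropy; dephasing raises it to ln 2.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
s_before = von_neumann_entropy(plus)
s_after = von_neumann_entropy(dephase(plus))
assert s_before < 1e-9 and abs(s_after - np.log(2)) < 1e-9
```

The interesting part of the paper is precisely the "special situations" where this intuition fails; the toy above only illustrates the generic, entropy-non-diminishing case.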

Jeremy England, a 31-year-old physicist at MIT, thinks he has found the underlying physics driving the origin and evolution of life.

References:

- Jeremy L. England Lab
- Talks
- *Statistical physics of self-replication*, Jeremy L. England; J. Chem. Phys. 139, 121923 (2013); doi: 10.1063/1.4818538
- *Statistical Physics of Adaptation*, Nikolai Perunov, Robert Marsland, and Jeremy England; arXiv, December 8, 2014
- *Entropy production fluctuation theorem and the nonequilibrium work relation for free energy differences*, Gavin E. Crooks; *arXiv*, February 1, 2008
- *Life as a manifestation of the second law of thermodynamics*, E. D. Schneider and J. J. Kay; *Mathematical and Computer Modelling*, Volume 19, Issues 6–8, March–April 1994, Pages 25–48; doi: 10.1016/0895-7177(94)90188-0

`[ hypothesis user = 'chrisaldrich' tags = 'EnglandQM']`

Running a brain-twisting thought experiment for real shows that information is a physical thing – so can we now harness the most elusive entity in the cosmos?

This is a nice little overview article of some of the history of thermodynamics relating to information in physics and includes some recent physics advances as well. There are a few references to applications in biology at the micro level as well.

- *Second Law of Thermodynamics with Discrete Quantum Feedback Control* by Takahiro Sagawa and Masahito Ueda; Phys. Rev. Lett. **100**, 080403, published 26 February 2008
- *Work and information processing in a solvable model of Maxwell’s demon* by Dibyendu Mandal and Christopher Jarzynski; PNAS, vol. 109, no. 29, July 17, 2012
- *Thermodynamic Costs of Information Processing in Sensory Adaptation* by Pablo Sartori, Léo Granger, Chiu Fan Lee, and Jordan M. Horowitz; PLOS Computational Biology, December 11, 2014; doi: 10.1371/journal.pcbi.1003974
- *Intermittent transcription dynamics for the rapid production of long transcripts of high fidelity* by M. Depken, J. M. Parrondo, and S. W. Grill; Cell Rep. 2013 Oct 31;5(2):521–30; doi: 10.1016/j.celrep.2013.09.007
- *The stepping motor protein as a feedback control ratchet* by Martin Bier; BioSystems 88 (2007) 301–307
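The quantitative anchor behind the demon literature above is Landauer's 1961 bound: erasing one bit must dissipate at least k_B·T·ln 2 of heat. A quick back-of-envelope sketch (my own, purely illustrative):

```python
# Landauer's bound: the minimum heat dissipated when one bit of
# information is erased at temperature T is k_B * T * ln(2).
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temperature: float = 300.0) -> float:
    """Minimum heat per erased bit, in joules, at the given temperature."""
    return K_B * temperature * math.log(2)

# At room temperature the bound is about 2.87e-21 J per bit -- so tiny
# that verifying it experimentally only became feasible recently.
e_bit = landauer_limit(300.0)
assert 2.8e-21 < e_bit < 3.0e-21
```

The number's smallness is the point of the "demonic power" framing: real computers dissipate many orders of magnitude more than this floor, so information-as-fuel schemes have enormous theoretical headroom.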

A review of Summa Technologiae by Stanislaw Lem by David Auerbach from the Los Angeles Review of Books.

## Summa Technologiae

AT LAST WE have it in English.

Summa Technologiae, originally published in Polish in 1964, is the cornerstone of Stanislaw Lem’s oeuvre, his consummate work of speculative nonfiction. Trained in medicine and biology, Lem synthesizes the current science of the day in ways far ahead of most science fiction of the time. His subjects, among others, include:

- Virtual reality
- Artificial intelligence
- Nanotechnology and biotechnology
- Evolutionary biology and evolutionary psychology
- Artificial life
- Information theory
- Entropy and thermodynamics
- Complexity theory, probability, and chaos
- Population and ecological catastrophe
- The “singularity” and “transhumanism”

Source: *Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae” – The Los Angeles Review of Books*

I came across this book review quite serendipitously today via an Auerbach article in Slate, which I’ve bookmarked. I found a copy of the book and have added it to the top of my reading pile. As I’m currently reading an advance reader edition of Sean Carroll’s *The Big Picture*, I can only imagine how well the two may go together despite being written more than 50 years apart.

Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.

A proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.

Comments: 19 pages, 2 figures. To appear in Philosophical Transaction of the Royal Society A

Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)

Cite as: arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)

From: Christoph Adami

**[v1]** Fri, 22 Jan 2016 21:35:44 GMT (151kb,D) [.pdf]

Source: *Christoph Adami [1601.06176] What is Information? on arXiv*
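The distinction Adami draws between entropy and information can be made concrete in a few lines. A minimal sketch (the toy distributions are my own, not from the paper): entropy measures uncertainty in one variable, while mutual information measures how well one variable predicts another.

```python
# Entropy vs. information, in Adami's prediction-centric sense:
# H(X) is uncertainty; I(X;Y) is how much X tells you about Y.
import math

def entropy(p):
    """Shannon entropy in bits of a discrete distribution (list of probs)."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for a joint probability matrix."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    h_joint = entropy([x for row in joint for x in row])
    return entropy(px) + entropy(py) - h_joint

# A fair coin copied perfectly: one bit of entropy, one bit of information.
perfect_copy = [[0.5, 0.0], [0.0, 0.5]]
assert abs(mutual_information(perfect_copy) - 1.0) < 1e-9

# Two independent fair coins: each carries a bit of entropy, yet neither
# carries any information about the other -- the two concepts come apart.
independent = [[0.25, 0.25], [0.25, 0.25]]
assert abs(mutual_information(independent)) < 1e-9
```

The second case is exactly the care the abstract asks for: maximal entropy with zero predictive information.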

The Information Theory of Life: The polymath Christoph Adami is investigating life’s origins by reimagining living things as self-perpetuating information strings.

"The Information Universe" Conference in The Netherlands in October hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology.

Yesterday, via a notification from Lanyrd, I came across a notice for the upcoming conference “The Information Universe” which hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology. It is scheduled to occur from October 7-9, 2015 at the Infoversum Theater in Groningen, The Netherlands.

I’ll let their site speak for itself below, but they already have an interesting lineup of speakers including:

- Erik Verlinde, Professor Theoretical Physics, University of Amsterdam, Netherlands
- Alex Szalay, Alumni Centennial Professor of Astronomy, The Johns Hopkins University, USA
- Gerard ‘t Hooft, Professor Theoretical Physics, University of Utrecht, Netherlands
- Gregory Chaitin, Professor Mathematics and Computer Science, Federal University of Rio de Janeiro, Brazil
- Charley Lineweaver, Professor Astronomy and Astrophysics, Australian National University, Australia
- Lude Franke, Professor System Genetics, University Medical Center Groningen, Netherlands

Additional details about the conference including the participants, program, venue, and registration can also be found at their website.

Web resources for participants in the NIMBioS Workshop on Information Theory and Entropy in Biological Systems.

Over the next few days, I’ll be maintaining a Storify story covering information related to and coming out of the Information Theory and Entropy Workshop being sponsored by NIMBioS at the University of Tennessee, Knoxville.

For those in attendance or participating by watching the live streaming video (or even watching the video after-the-fact), please feel free to use the official hashtag #entropyWS, and I’ll do my best to include your tweets, posts, and material into the story stream for future reference.

For journal articles and papers mentioned in/at the workshop, I encourage everyone to join the Mendeley.com group ITBio: Information Theory, Microbiology, Evolution, and Complexity and add them to the group’s list of papers. Think of it as a collaborative online journal club of sorts.

Those participating in the workshop are also encouraged to take a look at a growing collection of researchers and materials I maintain here. If you have materials or resources you’d like to contribute to the list, please send me an email or include them via the suggestions/submission form or include them in the comments section below.

- Researchers
- References and Journal Articles
- Books
- Related Academic, Research Institutes, Societies, Groups, and Organizations
- Conferences, Workshops, and Symposia
- Bionet.Info-Theory (Google Group/Usenet Group)
- #ITBio on Twitter

RSS Feed for BoffoSocko posts tagged with #ITBio


Over the span of the coming week, I'll be updating (and archiving) the stream of information coming out of the BIRS Workshop on Biological and Bio-Inspired Information Theory.


[My comments posted to the original Facebook post follow below.]

I’m coming to this post a bit late as I’m playing a bit of catch up, but agree with it wholeheartedly.

In particular, applications to molecular biology and medicine are really beginning to come to a heavy boil in just the past five years. This particular year appears to mark the start of the biggest renaissance in applying information theory to biology since Hubert Yockey, Henry Quastler, and Robert L. Platzman’s “Symposium on Information Theory in Biology” at Gatlinburg, Tennessee in 1956.

Upcoming/recent conferences/workshops on information theory in biology include:

- BIRS Workshop: Biological and Bio-Inspired Information Theory
- Entropy and Information in Biological Systems at NIMBios
- CECAM Workshop: Entropy in Biomolecular Systems
- ALife breakout session on Information Theoretic Incentives for Artificial Life (which will also spawn off a special issue of the journal Entropy):

At the beginning of September, Christoph Adami posted an awesome and very sound paper on arXiv entitled “Information-theoretic considerations concerning the origin of life” which promises to turn the science of the origin of life on its head.

I’ll note in passing, for those interested, that Claude Shannon’s famous master’s thesis at MIT (in which he applied Boolean algebra to electric circuits, enabling the digital revolution) and his subsequent “A Mathematical Theory of Communication” were so revolutionary that nearly everyone forgets his MIT Ph.D. thesis, “An Algebra for Theoretical Genetics,” which presaged cybernetics and the current applications of information theory to microbiology, and which is probably as seminal as Sir R. A. Fisher’s applications of statistics to science in general and biology in particular.

For those commenting on the post who were interested in a layman’s introduction to information theory, I recommend John Robinson Pierce’s *An Introduction to Information Theory: Symbols, Signals and Noise* (Dover has a very inexpensive edition). After this, one should take a look at Claude Shannon’s original paper. (The MIT Press printing includes an excellent overview by Warren Weaver along with the paper itself.) The mathematics in the paper really isn’t too technical, and most of it should be comprehensible to advanced high school students.

For those who don’t understand the concept of entropy, I HIGHLY recommend Arieh Ben-Naim’s book *Entropy Demystified: The Second Law Reduced to Plain Common Sense with Seven Simulated Games*. He really does break the concept down into its most basic form in a way I haven’t seen others come remotely close to, and which even my mother can comprehend (with no mathematics at all). (I recommend this presentation even to those with Ph.D.’s in physics because it is so truly fundamental.)
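In the spirit of Ben-Naim’s simulated games (the specific setup below is my own illustration, not taken from the book), a few lines of code show the same idea: start a system in its most ordered configuration, make random local changes, and watch it drift toward the macrostate with the most microstates.

```python
# A dice-game-style toy of the second law: N two-sided "dice" all start on
# the same face; each step flips one die chosen at random. The tally drifts
# from the ordered extreme toward N/2 -- order fading by plain counting.
import random

def run(n=100, steps=5000, seed=42):
    random.seed(seed)
    dice = [0] * n          # all dice start showing the same face
    counts = []
    for _ in range(steps):
        i = random.randrange(n)
        dice[i] = 1 - dice[i]   # flip one randomly chosen die
        counts.append(sum(dice))
    return counts

counts = run()
assert counts[0] == 1                    # one flip away from perfect order
assert 30 < sum(counts[-100:]) / 100 < 70  # settled near n/2, the largest macrostate
```

Nothing pushes the system toward 50; there are simply vastly more microstates there, which is the whole of Ben-Naim’s "plain common sense" argument.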

For the more advanced mathematicians, physicists, and engineers, Arieh Ben-Naim does a truly spectacular job of extending E. T. Jaynes’s work on information theory and statistical mechanics, coming up with a more coherent mathematical theory conjoining the entropy of physics/statistical mechanics with that of Shannon’s information theory, in *A Farewell to Entropy: Statistical Thermodynamics Based on Information*.

For the advanced readers/researchers interested in more at the intersection of information theory and biology, I’ll also mention that I maintain a list of references, books, and journal articles in a Mendeley group entitled “ITBio: Information Theory, Microbiology, Evolution, and Complexity.”
