# Tag: entropy

## 🔖 Statistical Mechanics, Spring 2016 (Caltech, Physics 12c with videos) by John Preskill

Statistical Mechanics, Spring 2016 (Physics 12c) by John Preskill *(Caltech)*

## Entropy | Special Issue: Maximum Entropy and Bayesian Methods

Entropy | Special Issue: Maximum Entropy and Bayesian Methods *(mdpi.com)*

## 🔖 Irreversibility and Heat Generation in the Computing Process by R. Landauer

Irreversibility and Heat Generation in the Computing Process by R. Landauer *(ieeexplore.ieee.org)*

## 🔖 Causal Entropic Forces, Phys. Rev. Lett. 110, 168702 (2013)

Causal Entropic Forces by A. D. Wissner-Gross and C. E. Freer *(Phys. Rev. Lett. 110, 168702 (2013), journals.aps.org)*

## 🔖 How Life (and Death) Spring From Disorder | Quanta Magazine

How Life (and Death) Spring From Disorder by Philip Ball *(Quanta Magazine)*

## 🔖 H-theorem in quantum physics by G. B. Lesovik, et al.

H-theorem in quantum physics by G. B. Lesovik, et al. *(Nature.com)*

## Chris Aldrich is reading “Department of Energy May Have Broken the Second Law of Thermodynamics”

Department of Energy May Have Broken the Second Law of Thermodynamics by *(Inverse)*

## Physicists Hunt For The Big Bang’s Triangles | Quanta Magazine

Physicists Hunt for the Big Bang's Triangles by *(Quanta Magazine)*

## Introduction to Information Theory | SFI’s Complexity Explorer

## Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae”

Devourer of Encyclopedias: Stanislaw Lem's "Summa Technologiae" *(The Los Angeles Review of Books)*

## What is Information? by Christoph Adami

What is Information? [1601.06176] by Christoph Adami *(arxiv.org)*

## Forthcoming ITBio-related book from Sean Carroll: “The Big Picture: On the Origins of Life, Meaning, and the Universe Itself”

## Donald Forsdyke Indicates the Concept of Information in Biology Predates Claude Shannon


## The Information Universe Conference

### Keynote speakers

### Conference synopsis from their homepage:

## Videos from the NIMBioS Workshop on Information and Entropy in Biological Systems


An introductory course in statistical mechanics.

Recommended textbook *Thermal Physics* by Charles Kittel and Herbert Kroemer

There’s also a corresponding video lecture series available on YouTube


Open for submission now

Deadline for manuscript submissions: 31 August 2017

A special issue of *Entropy* (ISSN 1099-4300).
## Special Issue Editor

## Special Issue Information

Dear Colleagues,

Whereas Bayesian inference has now achieved mainstream acceptance and is widely used throughout the sciences, associated ideas such as the principle of maximum entropy (implicit in the work of Gibbs, and developed further by Ed Jaynes and others) have not. There are strong arguments that the principle (and variations, such as maximum relative entropy) is of fundamental importance, but the literature also contains many misguided attempts at applying it, leading to much confusion.

This Special Issue will focus on Bayesian inference and MaxEnt. Some open questions that spring to mind are: Which proposed ways of using entropy (and its maximisation) in inference are legitimate, which are not, and why? Where can we obtain constraints on probability assignments, the input needed by the MaxEnt procedure?

More generally, papers exploring any interesting connections between probabilistic inference and information theory will be considered. Papers presenting high quality applications, or discussing computational methods in these areas, are also welcome.

Dr. Brendon J. Brewer

*Guest Editor*

**Submission**

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. *Entropy* is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs).

No papers have been published in this special issue yet.

Source: Entropy | Special Issue: Maximum Entropy and Bayesian Methods
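As a concrete illustration of the MaxEnt procedure discussed in the call for papers, the sketch below works Jaynes's classic dice problem: among all distributions over faces 1–6 with a prescribed mean, entropy is maximized by an exponential-family distribution whose Lagrange multiplier is pinned down by the constraint. The target mean of 4.5 is an illustrative choice of mine, not anything from the call, and the sketch assumes NumPy and SciPy are available.

```python
import numpy as np
from scipy.optimize import brentq

# Jaynes's dice problem: among all distributions over faces 1..6 with a
# given mean, the maximum-entropy one has the form p_i ∝ exp(-lam * i).
faces = np.arange(1, 7)

def mean_given(lam):
    """Mean of the MaxEnt distribution for Lagrange multiplier lam."""
    w = np.exp(-lam * faces)
    p = w / w.sum()
    return p @ faces

target_mean = 4.5  # the constraint fed into the MaxEnt procedure (illustrative)

# Solve for the multiplier that satisfies the constraint, then build p.
lam = brentq(lambda l: mean_given(l) - target_mean, -5, 5)
p = np.exp(-lam * faces)
p /= p.sum()

entropy = -(p * np.log(p)).sum()
print(p, p @ faces, entropy)
```

Setting `target_mean` to 3.5 recovers the uniform distribution (multiplier zero), which is the sanity check that unconstrained MaxEnt returns maximum ignorance.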

It is argued that computing machines inevitably involve devices which perform logical functions that do not have a single-valued inverse. This logical irreversibility is associated with physical irreversibility and requires a minimal heat generation, per machine cycle, typically of the order of kT for each irreversible function. This dissipation serves the purpose of standardizing signals and making them independent of their exact logical history. Two simple, but representative, models of bistable devices are subjected to a more detailed analysis of switching kinetics to yield the relationship between speed and energy dissipation, and to estimate the effects of errors induced by thermal fluctuations.

A classic paper in the history of entropy.
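The "of the order of kT" figure in the abstract is what is now called the Landauer limit, kT ln 2 of heat per erased bit. A quick back-of-the-envelope evaluation (room temperature chosen here purely for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact in SI since 2019)
T = 300.0           # room temperature in kelvin, an illustrative choice

# Landauer's bound: erasing one bit of information dissipates
# at least k_B * T * ln(2) of heat into the environment.
E_min = k_B * T * math.log(2)
print(E_min)  # on the order of 3e-21 J per bit at room temperature
```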

Recent advances in fields ranging from cosmology to computer science have hinted at a possible deep connection between intelligence and entropy maximization, but no formal physical relationship between them has yet been established. Here, we explicitly propose a first step toward such a relationship in the form of a causal generalization of entropic forces that we find can cause two defining behaviors of the human “cognitive niche”—tool use and social cooperation—to spontaneously emerge in simple physical systems. Our results suggest a potentially general thermodynamic model of adaptive behavior as a nonequilibrium process in open systems.

[1]

A. D. Wissner-Gross and C. E. Freer, “Causal Entropic Forces,” *Physical Review Letters*, vol. 110, no. 16. American Physical Society (APS), 19-Apr-2013 [Online]. Available: http://dx.doi.org/10.1103/PhysRevLett.110.168702 [Source]


Life was long thought to obey its own set of rules. But as simple systems show signs of lifelike behavior, scientists are arguing about whether this apparent complexity is all a consequence of thermodynamics.

This is a nice little general interest article by Philip Ball that does a relatively good job of covering several of my favorite topics (information theory, biology, complexity) for the layperson. While it stays relatively basic, it links to a handful of really great references, many of which I’ve already read, though several appear to be new to me. [1][2][3][4][5][6][7][8][9][10]

While Ball has a broad area of interests and coverage in his work, he’s certainly one of the best journalists working in this subarea of interests today. I highly recommend his work to those who find this area interesting.

[1]

E. Mayr, *What Makes Biology Unique?* Cambridge University Press, 2004.

[2]

A. Wissner-Gross and C. Freer, “Causal entropic forces,” *Phys Rev Lett*, vol. 110, no. 16, p. 168702, Apr. 2013. [PubMed]

[3]

A. Barato and U. Seifert, “Thermodynamic uncertainty relation for biomolecular processes,” *Phys Rev Lett*, vol. 114, no. 15, p. 158101, Apr. 2015. [PubMed]

[4]

J. Shay and W. Wright, “Hayflick, his limit, and cellular ageing,” *Nat Rev Mol Cell Biol*, vol. 1, no. 1, pp. 72–6, Oct. 2000. [PubMed]

[5]

X. Dong, B. Milholland, and J. Vijg, “Evidence for a limit to human lifespan,” *Nature*, vol. 538, no. 7624. Springer Nature, pp. 257–259, 05-Oct-2016 [Online]. Available: http://dx.doi.org/10.1038/nature19793

[6]

H. Morowitz and E. Smith, “Energy Flow and the Organization of Life,” *Santa Fe Institute*, 07-Aug-2006. [Online]. Available: http://samoa.santafe.edu/media/workingpapers/06-08-029.pdf. [Accessed: 03-Feb-2017]

[7]

R. Landauer, “Irreversibility and Heat Generation in the Computing Process,” *IBM Journal of Research and Development*, vol. 5, no. 3. IBM, pp. 183–191, Jul-1961 [Online]. Available: http://dx.doi.org/10.1147/rd.53.0183

[8]

C. Rovelli, “Meaning = Information + Evolution,” *arXiv*, Nov. 2006 [Online]. Available: https://arxiv.org/abs/1611.02420

[9]

N. Perunov, R. A. Marsland, and J. L. England, “Statistical Physics of Adaptation,” *Physical Review X*, vol. 6, no. 2. American Physical Society (APS), 16-Jun-2016 [Online]. Available: http://dx.doi.org/10.1103/PhysRevX.6.021036 [Source]

[10]

S. Still, D. A. Sivak, A. J. Bell, and G. E. Crooks, “Thermodynamics of Prediction,” *Physical Review Letters*, vol. 109, no. 12. American Physical Society (APS), 19-Sep-2012 [Online]. Available: http://dx.doi.org/10.1103/PhysRevLett.109.120604 [Source]

Remarkable progress of quantum information theory (QIT) allowed to formulate mathematical theorems for conditions that data-transmitting or data-processing occurs with a non-negative entropy gain. However, relation of these results formulated in terms of entropy gain in quantum channels to temporal evolution of real physical systems is not thoroughly understood. Here we build on the mathematical formalism provided by QIT to formulate the quantum H-theorem in terms of physical observables. We discuss the manifestation of the second law of thermodynamics in quantum physics and uncover special situations where the second law can be violated. We further demonstrate that the typical evolution of energy-isolated quantum systems occurs with non-diminishing entropy. [1]

[1]

G. B. Lesovik, A. V. Lebedev, I. A. Sadovskyy, M. V. Suslov, and V. M. Vinokur, “H-theorem in quantum physics,” *Scientific Reports*, vol. 6. Springer Nature, p. 32815, 12-Sep-2016 [Online]. Available: http://dx.doi.org/10.1038/srep32815
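The paper's claim that typical evolution occurs with non-diminishing entropy can be illustrated with a minimal numerical sketch. This is not the authors' construction; it is the standard textbook fact, in the same spirit, that a unital channel (here depolarizing, with an arbitrary illustrative strength) never decreases von Neumann entropy.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), in nats, skipping zero eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-(evals * np.log(evals)).sum())

# A pure qubit state |0><0|, which has zero entropy...
rho = np.array([[1.0, 0.0], [0.0, 0.0]])

# ...sent through a depolarizing channel, a standard unital channel;
# unital channels cannot decrease von Neumann entropy.
p = 0.3  # illustrative depolarizing strength
I2 = np.eye(2) / 2
rho_out = (1 - p) * rho + p * I2

assert von_neumann_entropy(rho_out) >= von_neumann_entropy(rho)
print(von_neumann_entropy(rho), von_neumann_entropy(rho_out))
```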

“Quantum-based demons” sound like they’d be at home in *Stranger Things*.

“The notion that counting more shapes in the sky will reveal more details of the Big Bang is implied in a central principle of quantum physics known as “unitarity.” Unitarity dictates that the probabilities of all possible quantum states of the universe must add up to one, now and forever; thus, information, which is stored in quantum states, can never be lost — only scrambled. This means that all information about the birth of the cosmos remains encoded in its present state, and the more precisely cosmologists know the latter, the more they can learn about the former.”
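The unitarity statement in the quoted passage (probabilities always summing to one, information scrambled but never lost) is easy to check numerically for a small system; the dimension and random seed below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random unitary from the QR decomposition of a complex
# Gaussian matrix; Q from QR satisfies Q^dagger Q = I.
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
U, _ = np.linalg.qr(A)

# A normalized quantum state: its probabilities sum to one...
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)

# ...and unitary evolution keeps it that way. The information is
# scrambled, not lost: applying U^dagger recovers the original state.
phi = U @ psi
print(np.linalg.norm(phi))          # ≈ 1.0
recovered = U.conj().T @ phi
print(np.allclose(recovered, psi))  # True
```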

The Santa Fe Institute's free online course "Introduction to Information Theory" taught by Seth Lloyd via Complexity Explorer.

Many readers often ask me for resources for delving into the basics of information theory. I hadn’t posted it before, but the Santa Fe Institute recently had an online course *Introduction to Information Theory* through their Complexity Explorer, which has some other excellent offerings. It included videos, fora, and other resources and was taught by the esteemed physicist and professor Seth Lloyd. There are a number of currently active students still learning and posting there.

## Introduction to Information Theory

## About the Tutorial:

This tutorial introduces fundamental concepts in information theory. Information theory has made considerable impact in complex systems, and has in part co-evolved with complexity science. Research areas ranging from ecology and biology to aerospace and information technology have all seen benefits from the growth of information theory.

In this tutorial, students will follow the development of information theory from bits to modern application in computing and communication. Along the way Seth Lloyd introduces valuable topics in information theory such as mutual information, boolean logic, channel capacity, and the natural relationship between information and entropy.

Lloyd coherently covers a substantial amount of material while limiting discussion of the mathematics involved. When formulas or derivations are considered, Lloyd describes the mathematics such that less advanced math students will find the tutorial accessible. Prerequisites for this tutorial are an understanding of logarithms, and at least a year of high-school algebra.
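Several of the topics Lloyd covers (entropy, mutual information, channel capacity) meet in one small computation: for a binary symmetric channel with uniform input, the mutual information between input and output equals the channel capacity 1 - H2(eps). A sketch, with an illustrative crossover probability of my choosing:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Joint distribution of a uniform input bit X sent through a binary
# symmetric channel with crossover probability eps.
eps = 0.1
joint = np.array([[0.5 * (1 - eps), 0.5 * eps],
                  [0.5 * eps, 0.5 * (1 - eps)]])

px = joint.sum(axis=1)  # marginal of the input
py = joint.sum(axis=0)  # marginal of the output

# I(X;Y) = H(X) + H(Y) - H(X,Y); with uniform input this equals
# the BSC capacity 1 - H2(eps).
mutual_info = H(px) + H(py) - H(joint.ravel())
print(mutual_info)
```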

## About the Instructor(s):

Professor Seth Lloyd is a principal investigator in the Research Laboratory of Electronics (RLE) at the Massachusetts Institute of Technology (MIT). He received his A.B. from Harvard College in 1982, the Certificate of Advanced Study in Mathematics (Part III) and an M. Phil. in Philosophy of Science from Cambridge University in 1983 and 1984 under a Marshall Fellowship, and a Ph.D. in Physics in 1988 from Rockefeller University under the supervision of Professor Heinz Pagels.

From 1988 to 1991, Professor Lloyd was a postdoctoral fellow in the High Energy Physics Department at the California Institute of Technology, where he worked with Professor Murray Gell-Mann on applications of information to quantum-mechanical systems. From 1991 to 1994, he was a postdoctoral fellow at Los Alamos National Laboratory, where he worked at the Center for Nonlinear Systems on quantum computation. In 1994, he joined the faculty of the Department of Mechanical Engineering at MIT. Since 1988, Professor Lloyd has also been an adjunct faculty member at the Santa Fe Institute.

Professor Lloyd has performed seminal work in the fields of quantum computation and quantum communications, including proposing the first technologically feasible design for a quantum computer, demonstrating the viability of quantum analog computation, proving quantum analogs of Shannon’s noisy channel theorem, and designing novel methods for quantum error correction and noise reduction.

Professor Lloyd is a member of the American Physical Society and the American Society of Mechanical Engineers.

## Tutorial Team:

Yoav Kallus is an Omidyar Fellow at the Santa Fe Institute. His research at the boundary of statistical physics and geometry looks at how and when simple interactions lead to the formation of complex order in materials and when preferred local order leads to system-wide disorder. Yoav holds a B.Sc. in physics from Rice University and a Ph.D. in physics from Cornell University. Before joining the Santa Fe Institute, Yoav was a postdoctoral fellow at the Princeton Center for Theoretical Science in Princeton University.

How to use Complexity Explorer: How to use Complexity Explorer

Prerequisites: At least one year of high-school algebra

Like this tutorial? Donate to help fund more like it

## Syllabus

- Introduction
- Forms of Information
- Information and Probability
- Fundamental Formula of Information
- Computation and Logic: Information Processing
- Mutual Information
- Communication Capacity
- Shannon’s Coding Theorem
- The Manifold Things Information Measures
- Homework

A review of Summa Technologiae by Stanislaw Lem by David Auerbach from the Los Angeles Review of Books.

## Summa Technologiae

AT LAST WE have it in English.

Summa Technologiae, originally published in Polish in 1964, is the cornerstone of Stanislaw Lem’s oeuvre, his consummate work of speculative nonfiction. Trained in medicine and biology, Lem synthesizes the current science of the day in ways far ahead of most science fiction of the time. His subjects, among others, include:

- Virtual reality
- Artificial intelligence
- Nanotechnology and biotechnology
- Evolutionary biology and evolutionary psychology
- Artificial life
- Information theory
- Entropy and thermodynamics
- Complexity theory, probability, and chaos
- Population and ecological catastrophe
- The “singularity” and “transhumanism”

Source: *Devourer of Encyclopedias: Stanislaw Lem’s “Summa Technologiae” – The Los Angeles Review of Books*

I came across this book review quite serendipitously today via an Auerbach article in Slate, which I’ve bookmarked. I found a copy of the book and have added it to the top of my reading pile. As I’m currently reading an advance reader edition of Sean Carroll’s The Big Picture, I can only imagine how well the two may go together despite being written nearly 60 years apart.

Information is a precise concept that can be defined mathematically, but its relationship to what we call "knowledge" is not always made clear. Furthermore, the concepts "entropy" and "information", while deeply related, are distinct and must be used with care, something that is not always achieved in the literature. In this elementary introduction, the concepts of entropy and information are laid out one by one, explained intuitively, but defined rigorously. I argue that a proper understanding of information in terms of prediction is key to a number of disciplines beyond engineering, such as physics and biology.
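Adami's distinction can be made concrete in a few lines: entropy measures uncertainty, while information is the reduction in uncertainty about X gained by predicting it from an observation Y, i.e. I(X;Y) = H(X) - H(X|Y). A sketch using a hypothetical joint distribution of my own devising:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Hypothetical joint distribution of a state X and a measurement Y
# that partially predicts it.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

px = joint.sum(axis=1)                     # marginal of X
H_X = H(px)                                # uncertainty about X alone
H_X_given_Y = H(joint.ravel()) - H(joint.sum(axis=0))  # H(X|Y) = H(X,Y) - H(Y)

# Information as prediction: how much knowing Y reduces uncertainty about X.
info = H_X - H_X_given_Y
print(H_X, H_X_given_Y, info)
```

A perfect predictor would drive H(X|Y) to zero, making the information equal to the full entropy of X; an uninformative one leaves it unchanged, yielding zero information.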

Comments: 19 pages, 2 figures. To appear in Philosophical Transaction of the Royal Society A

Subjects: Adaptation and Self-Organizing Systems (nlin.AO); Information Theory (cs.IT); Biological Physics (physics.bio-ph); Quantitative Methods (q-bio.QM)

Cite as: arXiv:1601.06176 [nlin.AO] (or arXiv:1601.06176v1 [nlin.AO] for this version)

From: Christoph Adami

**[v1]** Fri, 22 Jan 2016 21:35:44 GMT (151kb,D) [.pdf]

Source: *Christoph Adami [1601.06176] What is Information? on arXiv*

Physicist Sean Carroll has a forthcoming book entitled The Big Picture: On the Origins of Life, Meaning, and the Universe Itself (Dutton, May 10, 2016) that will be of interest to many of our readers.

In catching up on blogs/reading from the holidays, I’ve noticed that physicist Sean Carroll has a forthcoming book entitled *The Big Picture: On the Origins of Life, Meaning, and the Universe Itself* (Dutton, May 10, 2016) that will be of interest to many of our readers. One can already pre-order the book via Amazon.

Prior to the holidays Sean wrote a blogpost that contains a full overview table of contents, which will give everyone a stronger idea of its contents. For convenience I’ll excerpt it below.

I’ll post a review as soon as a copy arrives, but it looks like a strong new entry in the category of popular science books on information theory, biology and complexity as well as potentially the areas of evolution, the origin of life, and physics in general.

As a side bonus, for those reading this today (1/15/16), I’ll note that Carroll’s 12 part lecture series from The Great Courses *The Higgs Boson and Beyond* (The Learning Company, February 2015) is 80% off.

THE BIG PICTURE: ON THE ORIGINS OF LIFE, MEANING, AND THE UNIVERSE ITSELF

- 0. Prologue

* Part One: Cosmos

- 1. The Fundamental Nature of Reality
- 2. Poetic Naturalism
- 3. The World Moves By Itself
- 4. What Determines What Will Happen Next?
- 5. Reasons Why
- 6. Our Universe
- 7. Time’s Arrow
- 8. Memories and Causes

* Part Two: Understanding

- 9. Learning About the World
- 10. Updating Our Knowledge
- 11. Is It Okay to Doubt Everything?
- 12. Reality Emerges
- 13. What Exists, and What Is Illusion?
- 14. Planets of Belief
- 15. Accepting Uncertainty
- 16. What Can We Know About the Universe Without Looking at It?
- 17. Who Am I?
- 18. Abducting God

* Part Three: Essence

- 19. How Much We Know
- 20. The Quantum Realm
- 21. Interpreting Quantum Mechanics
- 22. The Core Theory
- 23. The Stuff of Which We Are Made
- 24. The Effective Theory of the Everyday World
- 25. Why Does the Universe Exist?
- 26. Body and Soul
- 27. Death Is the End

* Part Four: Complexity

- 28. The Universe in a Cup of Coffee
- 29. Light and Life
- 30. Funneling Energy
- 31. Spontaneous Organization
- 32. The Origin and Purpose of Life
- 33. Evolution’s Bootstraps
- 34. Searching Through the Landscape
- 35. Emergent Purpose
- 36. Are We the Point?

* Part Five: Thinking

- 37. Crawling Into Consciousness
- 38. The Babbling Brain
- 39. What Thinks?
- 40. The Hard Problem
- 41. Zombies and Stories
- 42. Are Photons Conscious?
- 43. What Acts on What?
- 44. Freedom to Choose

* Part Six: Caring

- 45. Three Billion Heartbeats
- 46. What Is and What Ought to Be
- 47. Rules and Consequences
- 48. Constructing Goodness
- 49. Listening to the World
- 50. Existential Therapy

* Appendix: The Equation Underlying You and Me

- Acknowledgments
- Further Reading
- References
- Index

Source: Sean Carroll | The Big Picture: Table of Contents

In the 1870s Ewald Hering in Prague and Samuel Butler in London laid the foundations. Butler's work was later taken up by Richard Semon in Munich, whose writings inspired the young Erwin Schrödinger in the early decades of the 20th century.

As it was published, I had read Kevin Hartnett’s article and interview with Christoph Adami The Information Theory of Life in Quanta Magazine. I recently revisited it and read through the commentary and stumbled upon an interesting quote relating to the history of information in biology:

These two historical references predate Claude Shannon’s mathematical formalization of information in *A Mathematical Theory of Communication* (The Bell System Technical Journal, 1948) and even Erwin Schrödinger‘s lecture (1943) and subsequent book *What is Life* (1944).

For those interested in reading more on this historical tidbit, I’ve dug up a copy of the primary Forsdyke reference which first appeared on arXiv (prior to its ultimate publication in *History of Psychiatry [.pdf]*):

🔖 [1406.1391] ‘A Vehicle of Symbols and Nothing More.’ George Romanes, Theory of Mind, Information, and Samuel Butler by Donald R. Forsdyke [1]

Submitted on 4 Jun 2014 (v1), last revised 13 Nov 2014 (this version, v2)

Abstract: Today’s ‘theory of mind’ (ToM) concept is rooted in the distinction of nineteenth century philosopher William Clifford between ‘objects’ that can be directly perceived, and ‘ejects,’ such as the mind of another person, which are inferred from one’s subjective knowledge of one’s own mind. A founder, with Charles Darwin, of the discipline of comparative psychology, George Romanes considered the minds of animals as ejects, an idea that could be generalized to ‘society as eject’ and, ultimately, ‘the world as an eject’ – mind in the universe. Yet, Romanes and Clifford only vaguely connected mind with the abstraction we call ‘information,’ which needs ‘a vehicle of symbols’ – a material transporting medium. However, Samuel Butler was able to address, in informational terms depleted of theological trappings, both organic evolution and mind in the universe. This view harmonizes with insights arising from modern DNA research, the relative immortality of ‘selfish’ genes, and some startling recent developments in brain research.

Comments: Accepted for publication in History of Psychiatry. 31 pages including 3 footnotes. Based on a lecture given at Santa Clara University, February 28th 2014, at a Bannan Institute Symposium on ‘Science and Seeking: Rethinking the God Question in the Lab, Cosmos, and Classroom.’

The original arXiv article also referenced two lectures which are appended below:

[Original Draft of this was written on December 14, 2015.]

[1]

D. Forsdyke R., “‘A vehicle of symbols and nothing more’. George Romanes, theory of mind, information, and Samuel Butler,” *History of Psychiatry*, vol. 26, no. 3, Aug. 2015 [Online]. Available: http://journals.sagepub.com/doi/abs/10.1177/0957154X14562755

"The Information Universe" Conference in The Netherlands in October hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology.

Yesterday, via a notification from Lanyard, I came across a notice for the upcoming conference “The Information Universe” which hits several of the sweet spots for areas involving information theory, physics, the origin of life, complexity, computer science, and microbiology. It is scheduled to occur from October 7-9, 2015 at the Infoversum Theater in Groningen, The Netherlands.

I’ll let their site speak for itself below, but they already have an interesting line up of speakers including:

- Erik Verlinde, Professor Theoretical Physics, University of Amsterdam, Netherlands
- Alex Szalay, Alumni Centennial Professor of Astronomy, The Johns Hopkins University, USA
- Gerard ‘t Hooft, Professor Theoretical Physics, University of Utrecht, Netherlands
- Gregory Chaitin, Professor Mathematics and Computer Science, Federal University of Rio de Janeiro, Brasil
- Charley Lineweaver, Professor Astronomy and Astrophysics, Australian National University, Australia
- Lude Franke, Professor System Genetics, University Medical Center Groningen, Netherlands

Additional details about the conference including the participants, program, venue, and registration can also be found at their website.

Videos from the NIMBioS workshop on Information and Entropy in Biological Systems from April 8-10, 2015 are slowly starting to appear on YouTube.

John Baez, one of the organizers of the workshop, is also going through them and adding some interesting background and links on his Azimuth blog for those who are looking for additional details and depth.

Additional resources from the workshop:

- NIMBios Workshop page
- Participants list
- Workshop Agenda [.pdf download]
- Information and Entropy WordPress site
- YouTube playlist of videos
- Storify archive from the workshop
