Bookmarked Nonadditive Entropies Yield Probability Distributions with Biases not Warranted by the Data by Ken Dill (academia.edu)
Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.
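
A quick way to see the consistency requirement at stake: for two independent events the joint distribution is the product of the marginals, and the Boltzmann-Gibbs entropy of that product is exactly the sum of the marginal entropies, while a nonadditive entropy like Tsallis's is not. A minimal numerical sketch of my own (not code from the paper), using the textbook definitions S = −Σ p ln p and S_q = (1 − Σ p^q)/(q − 1):

```python
import numpy as np

def shannon_entropy(p):
    """Boltzmann-Gibbs / Shannon entropy in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def tsallis_entropy(p, q):
    """Tsallis entropy with index q (q != 1)."""
    p = np.asarray(p, dtype=float)
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

# Two independent subsystems with arbitrary marginal distributions.
pA = np.array([0.2, 0.3, 0.5])
pB = np.array([0.6, 0.4])
pAB = np.outer(pA, pB).ravel()  # joint distribution of independent events

# The Boltzmann-Gibbs entropy is additive over independent subsystems ...
print(shannon_entropy(pAB), shannon_entropy(pA) + shannon_entropy(pB))  # equal

# ... while the Tsallis entropy (q = 2 here) is not.
q = 2.0
print(tsallis_entropy(pAB, q), tsallis_entropy(pA, q) + tsallis_entropy(pB, q))  # differ
```
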
Bookmarked Statistical Physics of Self-Replication by Jeremy L. England (J. Chem. Phys. 139, 121923 (2013))
Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fueled by the production of entropy. Here, we undertake to make this intuition rigorous and quantitative by deriving a lower bound for the amount of heat that is produced during a process of self-replication in a system coupled to a thermal bath. We find that the minimum value for the physically allowed rate of heat production is determined by the growth rate, internal entropy, and durability of the replicator, and we discuss the implications of this finding for bacterial cell division, as well as for the pre-biotic emergence of self-replicating nucleic acids.
https://doi.org/10.1063/1.4818538
Syndicated copy also available on arXiv: https://arxiv.org/abs/1209.1179

Hat tip to Paul Davies in The Demon in the Machine

🔖 The Negentropy Principle of Information by Leon Brillouin | Journal of Applied Physics: Vol 24, No 9

Bookmarked The Negentropy Principle of Information by Leon Brillouin (Journal of Applied Physics 24, 1152 (1953))

The statistical definition of information is compared with Boltzmann's formula for entropy. The immediate result is that information I corresponds to a negative term in the total entropy S of a system: S = S0 − I. A generalized second principle states that S must always increase. If an experiment yields an increase ΔI of the information concerning a physical system, it must be paid for by a larger increase ΔS0 in the entropy of the system and its surrounding laboratory. The efficiency ε of the experiment is defined as ε = ΔI/ΔS0 ≤ 1. Moreover, there is a lower limit k ln 2 (k, Boltzmann's constant) for the ΔS0 required in an observation. Some specific examples are discussed: length or distance measurements, time measurements, observations under a microscope. In all cases it is found that higher accuracy always means lower efficiency. The information ΔI increases as the logarithm of the accuracy, while ΔS0 goes up faster than the accuracy itself. Exceptional circumstances arise when extremely small distances (of the order of nuclear dimensions) have to be measured, in which case the efficiency drops to exceedingly low values. This stupendous increase in the cost of observation is a new factor that should probably be included in the quantum theory.

https://doi.org/10.1063/1.1721463

First appearance of the word “negentropy” that I’ve seen in the literature.
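
A back-of-the-envelope sketch of the quantities in the abstract (my own numbers and a toy scaling, not Brillouin's actual expressions): the k ln 2 floor on ΔS0 per bit of information, and the way the efficiency ε = ΔI/ΔS0 falls when ΔI grows only logarithmically in the accuracy while ΔS0 grows at least linearly:

```python
import numpy as np

k_B = 1.380649e-23          # Boltzmann's constant, J/K

# Brillouin's lower limit on the entropy increase needed to acquire one bit:
dS0_min = k_B * np.log(2)   # ~9.57e-24 J/K
print(dS0_min)

# Toy illustration of "higher accuracy always means lower efficiency":
# let the information gained grow as the log of the accuracy a, while the
# entropy cost grows linearly in a (both measured in units of k ln 2).
for a in (10, 1_000, 100_000):
    dI = np.log2(a)         # bits of information, ~log of accuracy
    dS0 = a                 # entropy cost, growing faster than dI (toy scaling)
    print(a, dI / dS0)      # efficiency falls as accuracy rises
```
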

👓 Differential privacy, an easy case | accuracyandprivacy.substack.com

Read Differential privacy, an easy case (accuracyandprivacy.substack.com)
By law, the Census Bureau is required to keep our responses to its questionnaires confidential. And so, over decades, it has applied several “disclosure avoidance” techniques when it publishes data — these have been meticulously catalogued by Laura McKenna
I could envision some interesting use cases for differential privacy like this within an IndieWeb framework for aggregated data potentially used for web discovery.
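
For the curious, the simplest flavor of differential privacy is the Laplace mechanism. This is a minimal sketch of my own (not the Census Bureau's production system discussed in the article), with a hypothetical webring count query standing in for the kind of aggregate I have in mind:

```python
import numpy as np

def laplace_count(true_count, epsilon, rng=None):
    """Release a count with epsilon-differential privacy via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person changes
    it by at most 1), so Laplace noise with scale 1/epsilon suffices.
    """
    rng = rng or np.random.default_rng()
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Hypothetical IndieWeb-style aggregate for web discovery:
# "how many sites in this webring posted about entropy this month?"
true_count = 42
print(laplace_count(true_count, epsilon=0.5))  # noisy but still useful answer
```
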

📑 #LoveBombs for Thimble: Saying Goodbye to Teacher, Mentor, Friend | INTERTEXTrEVOLUTION

Annotated #LoveBombs for Thimble: Saying Goodbye to Teacher, Mentor, Friend by J. Gregory McVerry (jgmac1106homepage.glitch.me)
Everything that gets launched gets shuttered.  
True of almost everything in life.

🎧 Episode 085 How Networks Learn An Interview with Cesar Hidalgo | Human Current

Listened to Episode 085 How Networks Learn An Interview with Cesar Hidalgo by Haley Campbell-Gross from HumanCurrent

In this episode, Haley talks with physicist, complexity scientist, and MIT professor Cesar Hidalgo. Hidalgo discusses his interest in the physics of networks and complex systems science and shares why he believes these fields are so important. He talks about his book, Why Information Grows: The Evolution of Order, from Atoms to Economies, which takes a scientific look at global economic complexity. Hidalgo also shares how economic development is linked to making networks more knowledgeable.

Cesar Hidalgo

Quotes from this episode:

“Thinking about complexity is important because people have a tendency to jump into micro explanations for macro phenomenon.” — Cesar Hidalgo

“I think complex systems give you not only some practical tools to think about the world, but also some sort of humbleness because you have to understand that your knowledge and understanding of how the systems work is always very limited and that humbleness gives you a different attitude and perspective and gives you some peace.” — Cesar Hidalgo

“The way that we think about entropy in physics and information theory come from different traditions and sometimes that causes a little bit of confusion, but at the end of the day it’s the number of different ways in which you can arrange something.” — Cesar Hidalgo

“To learn more complex activities you need more social reinforcement.” — Cesar Hidalgo

“When we lead groups we have to be clear about the goals and the main goal to keep in mind is that of learning.” — Cesar Hidalgo

“Everybody fails, but not everyone learns from their failures.” — Cesar Hidalgo

“Learning is not just something that is interesting to study, it is actually a goal.” — Cesar Hidalgo

A solid interview here with Cesar Hidalgo. His book has been incredibly influential on my thoughts for the past two years, so I obviously highly recommend it. He’s got a great description of entropy here. I was most surprised by his conversation about loneliness, but I have a gut feeling that he’s really caught onto something with his thesis.

I also appreciated some of how he expanded on learning in the last portion of the interview. Definitely worth revisiting.
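
His entropy line above (“the number of different ways in which you can arrange something”) is easy to make concrete. A tiny sketch of my own connecting that counting view to the information-theory tradition he contrasts it with:

```python
from math import comb, log

# "The number of different ways in which you can arrange something":
# 10 coins with exactly 5 heads up can be arranged in W ways, and
# Boltzmann's entropy (setting k = 1) is just ln W.
W = comb(10, 5)
S_boltzmann = log(W)

# The information-theory tradition counts the same thing per coin:
# a fair coin carries ln 2 nats, so 10 coins carry 10 ln 2,
# which ln W approaches (per coin) as the number of coins grows.
S_shannon = 10 * log(2)

print(W, S_boltzmann, S_shannon)   # 252, ~5.53, ~6.93
```
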

👓 Stephen Hawking, Who Examined the Universe and Explained Black Holes, Dies at 76 | The New York Times

Read Stephen Hawking, Who Examined the Universe and Explained Black Holes, Dies at 76 by Dennis Overbye (nytimes.com)
A physicist and best-selling author, Dr. Hawking did not allow his physical limitations to hinder his quest to answer “the big question: Where did the universe come from?”
Some sad news after getting back from Algebraic Geometry class tonight. RIP Stephen Hawking.

🔖 Upcoming Special Issue “Information Theory in Neuroscience” | Entropy (MDPI)

Bookmarked Special Issue "Information Theory in Neuroscience" (Entropy | MDPI)
As the ultimate information processing device, the brain naturally lends itself to be studied with information theory. Application of information theory to neuroscience has spurred the development of principled theories of brain function, has led to advances in the study of consciousness, and to the development of analytical techniques to crack the neural code, that is, to unveil the language used by neurons to encode and process information. In particular, advances in experimental techniques enabling precise recording and manipulation of neural activity on a large scale now enable, for the first time, the precise formulation and quantitative testing of hypotheses about how the brain encodes and transmits across areas the information used for specific functions. This Special Issue emphasizes contributions on novel approaches in neuroscience using information theory, and on the development of new information-theoretic results inspired by problems in neuroscience. Research work at the interface of neuroscience, information theory, and other disciplines is also welcome.

A special issue of Entropy (ISSN 1099-4300). This special issue belongs to the section "Information Theory".
Deadline for manuscript submissions: 1 December 2017

📅 Entropy 2018: From Physics to Information Sciences and Geometry

RSVPed Might be attending Entropy 2018: From Physics to Information Sciences and Geometry
14-16 May 2018; Auditorium Enric Casassas, Faculty of Chemistry, University of Barcelona, Barcelona, Spain

One of the most frequently used scientific words is the word “Entropy”. The reason is that it is related to two main scientific domains: physics and information theory. Its origin goes back to the start of physics (thermodynamics), but since Shannon, it has become related to information theory. This conference is an opportunity to bring researchers of these two communities together and create a synergy. The main topics and sessions of the conference cover:

  • Physics: classical Thermodynamics and Quantum
  • Statistical physics and Bayesian computation
  • Geometrical science of information, topology and metrics
  • Maximum entropy principle and inference
  • Kullback and Bayes or information theory and Bayesian inference
  • Entropy in action (applications)

Inter-disciplinary contributions from both theoretical and applied perspectives are very welcome, including papers addressing conceptual and methodological developments, as well as new applications of entropy and information theory.

All accepted papers will be published in the proceedings of the conference. Authors of a selection of invited and contributed talks presented during the conference will be invited to submit an extended version of their paper for a special issue of the open access journal Entropy.

Entropy 2018 Conference

🔖 Can entropy be defined for and the Second Law applied to the entire universe? by Arieh Ben-Naim | arXiv

Bookmarked Can entropy be defined for and the Second Law applied to the entire universe? (arXiv)
This article provides answers to the two questions posed in the title. It is argued that, contrary to many statements made in the literature, neither entropy nor the Second Law may be used for the entire universe. The origin of this misuse of entropy and the second law may be traced back to Clausius himself. More recent (erroneous) justification is also discussed.

🔖 Statistical Mechanics, Spring 2016 (Caltech, Physics 12c with videos) by John Preskill

Bookmarked Statistical Mechanics, Spring 2016 (Physics 12c) by John Preskill (Caltech)
An introductory course in statistical mechanics.
Recommended textbook: Thermal Physics by Charles Kittel and Herbert Kroemer

There’s also a corresponding video lecture series available on YouTube
https://www.youtube.com/playlist?list=PL0ojjrEqIyPzgJUUW76koGcSCy6OGtDRI

🔖 Entropy | Special Issue: Maximum Entropy and Bayesian Methods

Bookmarked Entropy | Special Issue: Maximum Entropy and Bayesian Methods (mdpi.com)
A special issue of Entropy (ISSN 1099-4300). Open for submission now.
Deadline for manuscript submissions: 31 August 2017

Special Issue Editor

Guest Editor
Dr. Brendon J. Brewer
Department of Statistics, The University of Auckland, Private Bag 92019, Auckland 1142, New Zealand
Interests: Bayesian inference, Markov chain Monte Carlo, nested sampling, MaxEnt

Special Issue Information

Dear Colleagues,

Whereas Bayesian inference has now achieved mainstream acceptance and is widely used throughout the sciences, associated ideas such as the principle of maximum entropy (implicit in the work of Gibbs, and developed further by Ed Jaynes and others) have not. There are strong arguments that the principle (and variations, such as maximum relative entropy) is of fundamental importance, but the literature also contains many misguided attempts at applying it, leading to much confusion.

This Special Issue will focus on Bayesian inference and MaxEnt. Some open questions that spring to mind are: Which proposed ways of using entropy (and its maximisation) in inference are legitimate, which are not, and why? Where can we obtain constraints on probability assignments, the input needed by the MaxEnt procedure?

More generally, papers exploring any interesting connections between probabilistic inference and information theory will be considered. Papers presenting high quality applications, or discussing computational methods in these areas, are also welcome.

Dr. Brendon J. Brewer
Guest Editor

Submission

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. Papers will be published continuously (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles as well as communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are refereed through a peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Entropy is an international peer-reviewed Open Access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1500 CHF (Swiss Francs).

No papers have been published in this special issue yet.
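
As an aside, the MaxEnt procedure the call for papers refers to is easy to demonstrate on Jaynes's classic die example: given only a measured mean, maximize the entropy subject to that constraint. A minimal sketch of my own (the Lagrange-multiplier solution reduced to a one-dimensional root find):

```python
import numpy as np
from scipy.optimize import brentq

# The only "data" is a measured mean of 4.5 for a six-sided die; which
# assignment of probabilities to the faces maximizes the entropy?
faces = np.arange(1, 7)
target_mean = 4.5

def mean_for(lam):
    """Mean of the exponential-family solution p_i proportional to exp(lam * x_i)."""
    w = np.exp(lam * faces)
    return (w / w.sum()) @ faces

# Maximizing -sum(p ln p) subject to sum(p_i x_i) = 4.5 reduces, via a
# Lagrange multiplier, to a one-dimensional root find for lam.
lam = brentq(lambda l: mean_for(l) - target_mean, -5.0, 5.0)
p = np.exp(lam * faces)
p /= p.sum()

print(p)           # the MaxEnt assignment, tilted toward the high faces
print(p @ faces)   # ~4.5, reproducing the constraint and nothing more
```
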

🔖 Irreversibility and Heat Generation in the Computing Process by R. Landauer

Bookmarked Irreversibility and Heat Generation in the Computing Process (ieeexplore.ieee.org)
It is argued that computing machines inevitably involve devices which perform logical functions that do not have a single-valued inverse. This logical irreversibility is associated with physical irreversibility and requires a minimal heat generation, per machine cycle, typically of the order of kT for each irreversible function. This dissipation serves the purpose of standardizing signals and making them independent of their exact logical history. Two simple, but representative, models of bistable devices are subjected to a more detailed analysis of switching kinetics to yield the relationship between speed and energy dissipation, and to estimate the effects of errors induced by thermal fluctuations.
A classic paper in the history of entropy.
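
A quick numerical sketch of the kT-scale bound the abstract mentions, with my own assumed room temperature of 300 K (not numbers from the paper):

```python
import numpy as np

k_B = 1.380649e-23        # Boltzmann's constant, J/K
T = 300.0                 # assumed room temperature, K

# Landauer's bound: a logically irreversible step such as erasing one bit
# must dissipate at least kT ln 2 of heat.
q_bit = k_B * T * np.log(2)      # ~2.9e-21 J per bit
print(q_bit)

# Scaled up: the thermodynamic floor for irreversibly erasing one gigabyte.
bits = 8e9
print(bits * q_bit)              # ~2.3e-11 J, far below what real hardware dissipates
```
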

🔖 Causal Entropic Forces, Phys. Rev. Lett. 110, 168702 (2013)

Bookmarked Causal Entropic Forces (Phys. Rev. Lett. 110, 168702 (2013), journals.aps.org)
Recent advances in fields ranging from cosmology to computer science have hinted at a possible deep connection between intelligence and entropy maximization, but no formal physical relationship between them has yet been established. Here, we explicitly propose a first step toward such a relationship in the form of a causal generalization of entropic forces that we find can cause two defining behaviors of the human “cognitive niche”—tool use and social cooperation—to spontaneously emerge in simple physical systems. Our results suggest a potentially general thermodynamic model of adaptive behavior as a nonequilibrium process in open systems.
[1]
A. D. Wissner-Gross and C. E. Freer, “Causal Entropic Forces,” Physical Review Letters, vol. 110, no. 16. American Physical Society (APS), 19-Apr-2013 [Online]. Available: http://dx.doi.org/10.1103/PhysRevLett.110.168702 [Source]
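
For orientation, my own schematic gloss (not equations quoted from the paper): an ordinary entropic force is a temperature-weighted gradient of entropy, and the causal generalization replaces the instantaneous entropy with the entropy of the macrostates causally reachable over a future horizon τ, so the force favors states that keep the most futures open:

```latex
% Ordinary entropic force on a coarse-grained state X (T an effective temperature):
F(X) = T \,\nabla_X S(X)

% Causal entropic force (schematic): S_c is the entropy of the future paths
% causally reachable from X within a time horizon \tau under the dynamics.
F(X_0, \tau) = T_c \,\nabla_X S_c(X, \tau)\,\Big|_{X = X_0}
```
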

 

🔖 How Life (and Death) Spring From Disorder | Quanta Magazine

Bookmarked How Life (and Death) Spring From Disorder by Philip Ball (Quanta Magazine)
Life was long thought to obey its own set of rules. But as simple systems show signs of lifelike behavior, scientists are arguing about whether this apparent complexity is all a consequence of thermodynamics.
This is a nice little general interest article by Philip Ball that does a relatively good job of covering several of my favorite topics (information theory, biology, complexity) for the layperson. While it stays relatively basic, it links to a handful of really great references, many of which I’ve already read, though several appear to be new to me. [1][2][3][4][5][6][7][8][9][10]

While Ball covers a broad range of topics in his work, he’s certainly one of the best journalists working in this particular subarea today. I highly recommend his work to those who find this area interesting.

References

[1]
E. Mayr, What Makes Biology Unique? Cambridge University Press, 2004.
[2]
A. Wissner-Gross and C. Freer, “Causal entropic forces,” Phys Rev Lett, vol. 110, no. 16, p. 168702, Apr. 2013. [PubMed]
[3]
A. Barato and U. Seifert, “Thermodynamic uncertainty relation for biomolecular processes,” Phys Rev Lett, vol. 114, no. 15, p. 158101, Apr. 2015. [PubMed]
[4]
J. Shay and W. Wright, “Hayflick, his limit, and cellular ageing,” Nat Rev Mol Cell Biol, vol. 1, no. 1, pp. 72–6, Oct. 2000. [PubMed]
[5]
X. Dong, B. Milholland, and J. Vijg, “Evidence for a limit to human lifespan,” Nature, vol. 538, no. 7624. Springer Nature, pp. 257–259, 05-Oct-2016 [Online]. Available: http://dx.doi.org/10.1038/nature19793
[6]
H. Morowitz and E. Smith, “Energy Flow and the Organization of Life,” Santa Fe Institute, 07-Aug-2006. [Online]. Available: http://samoa.santafe.edu/media/workingpapers/06-08-029.pdf. [Accessed: 03-Feb-2017]
[7]
R. Landauer, “Irreversibility and Heat Generation in the Computing Process,” IBM Journal of Research and Development, vol. 5, no. 3. IBM, pp. 183–191, Jul-1961 [Online]. Available: http://dx.doi.org/10.1147/rd.53.0183
[8]
C. Rovelli, “Meaning = Information + Evolution,” arXiv, Nov. 2006 [Online]. Available: https://arxiv.org/abs/1611.02420
[9]
N. Perunov, R. A. Marsland, and J. L. England, “Statistical Physics of Adaptation,” Physical Review X, vol. 6, no. 2. American Physical Society (APS), 16-Jun-2016 [Online]. Available: http://dx.doi.org/10.1103/PhysRevX.6.021036 [Source]
[10]
S. Still, D. A. Sivak, A. J. Bell, and G. E. Crooks, “Thermodynamics of Prediction,” Physical Review Letters, vol. 109, no. 12. American Physical Society (APS), 19-Sep-2012 [Online]. Available: http://dx.doi.org/10.1103/PhysRevLett.109.120604 [Source]