Congratulations to Christoph Adami (@ChristophAdami) on release day for The Evolution of Biological Information: How Evolution Creates Complexity, from Viruses to Brains! I’m awaiting the post for my own hardcover copy. 

Bookmarked The Evolution of Biological Information by Christoph Adami (press.princeton.edu)
Why information is the unifying principle that allows us to understand the evolution of complexity in nature
Don’t think I’m just holding my breath waiting for this awesome book. Sometimes I turn blue and fall off my chair. 😰🪑

At least the press is saying Jan 16, 2024 now. Tough luck for those doing their holiday shopping for me.

Replied to a tweet by codexeditor (Twitter)
@brunowinck @codexeditor @alanlaidlaw When thinking about this, recall that in the second paragraph of The Mathematical Theory of Communication (University of Illinois Press, 1949), Claude Shannon explicitly separates the semantic meaning from the engineering problem of communication. 
Highlight from the book with the underlined sentence: "These semantic aspects of communication are irrelevant to the engineering problem."
Bookmarked Cellular Homeostasis, Epigenesis and Replication in Randomly Aggregated Macromolecular Systems by Stuart A. Kauffman (Journal of Cybernetics Volume 1, 1971 - Issue 1)
Pages 71-96 | Published online: 15 Apr 2008
https://doi.org/10.1080/01969727108545830
Proto-organisms probably were randomly aggregated nets of chemical reactions. The hypothesis that contemporary organisms are also randomly constructed molecular automata is examined by modeling the gene as a binary (on-off) device and studying the behavior of large, randomly constructed nets of these binary “genes.” The results suggest that, if each “gene” is directly affected by two or three other “genes,” then such random nets: behave with great order and stability; undergo behavior cycles whose length predicts cell replication time as a function of the number of genes per cell; possess different modes of behavior whose number per net predicts roughly the number of cell types in an organism as a function of its number of genes; and under the stimulus of noise are capable of differentiating directly from any mode of behavior to at most a few other modes of behavior. Cellular differentiation is modeled as a Markov chain among the modes of behavior of a genetic net. The possibility of a general theory of metabolic behavior is suggested. Analytic approaches to the behavior of switching nets are discussed in Appendix 1, and some implications of the results for the origin of self-replicating macromolecular systems are discussed in Appendix 6.
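To make the model concrete, here's a minimal toy sketch of a Kauffman-style random Boolean network (my own illustrative code, not from the paper): each "gene" is on/off, its next state is a random Boolean function of K randomly chosen genes, and iterating the deterministic dynamics until a state repeats gives the length of the behavior cycle (attractor) the abstract refers to.

```python
import random

# Random Boolean network sketch: N genes, K = 2 inputs per gene.
N, K = 12, 2
random.seed(1)

# Each gene gets K random input genes and a random truth table (2^K entries).
inputs = [random.sample(range(N), K) for _ in range(N)]
tables = [[random.randint(0, 1) for _ in range(2 ** K)] for _ in range(N)]

def step(state):
    # Look up each gene's next value from its inputs' current bits.
    return tuple(
        tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
        for i in range(N)
    )

state = tuple(random.randint(0, 1) for _ in range(N))
seen, t = {}, 0
while state not in seen:          # iterate until a state recurs
    seen[state] = t
    state = step(state)
    t += 1
print("cycle length:", t - seen[state])  # short cycles are typical for K = 2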

Michael Marshall in He may have found the key to the origins of life. So why have so few heard of him? (Science)

Read He may have found the key to the origins of life. So why have so few heard of him? by Michael Marshall (Science)
Hungarian biologist Tibor Gánti is an obscure figure. Now, more than a decade after his death, his ideas about how life began are finally coming to fruition.
Good to see Tibor Gánti finally getting some credit. This is a great little article with a nice overview of the Origin of Life problem (and references). The author Michael Marshall has a new book out on the topic.

Peter Molnar in IndieWeb Chat

Bookmarked The ergodicity problem in economics by Ole Peters (Nature Physics, volume 15, pages 1216–1221 (2019))
The ergodic hypothesis is a key analytical device of equilibrium statistical mechanics. It underlies the assumption that the time average and the expectation value of an observable are the same. Where it is valid, dynamical descriptions can often be replaced with much simpler probabilistic ones — time is essentially eliminated from the models. The conditions for validity are restrictive, even more so for non-equilibrium systems. Economics typically deals with systems far from equilibrium — specifically with models of growth. It may therefore come as a surprise to learn that the prevailing formulations of economic theory — expected utility theory and its descendants — make an indiscriminate assumption of ergodicity. This is largely because foundational concepts to do with risk and randomness originated in seventeenth-century economics, predating by some 200 years the concept of ergodicity, which arose in nineteenth-century physics. In this Perspective, I argue that by carefully addressing the question of ergodicity, many puzzles besetting the current economic formalism are resolved in a natural and empirically testable way.
Kevin Marks retweet of Simon Wardley (@swardley) on Twitter: “Anyway, this is a fabulous paper – The ergodicity problem in economics – https://t.co/fzS3toWvT5 … well worth the read.”
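The core point is easiest to see in the multiplicative coin-toss gamble Peters analyzes (heads multiplies your wealth by 1.5, tails by 0.6). A quick simulation sketch, my own code rather than anything from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Expectation value per round is 0.5*1.5 + 0.5*0.6 = 1.05 (5% growth),
# but the time-average growth factor is sqrt(1.5 * 0.6) ≈ 0.95, so a
# typical single trajectory decays: the gamble is not ergodic.
rounds, trials = 20, 100_000
factors = rng.choice([1.5, 0.6], size=(trials, rounds))
wealth = factors.cumprod(axis=1)

print("ensemble mean:", wealth[:, -1].mean())        # ~1.05**20 ≈ 2.7
print("median trajectory:", np.median(wealth[:, -1]))  # ~0.9**10 ≈ 0.35
```

The ensemble average grows while the median player loses money; that gap between the expectation value and the time average is exactly the ergodicity problem of the title.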
Read - Want to Read: Shape: The Hidden Geometry of Information, Biology, Strategy, Democracy, and Everything Else by Jordan Ellenberg (Penguin Press)
From the New York Times-bestselling author of How Not to Be Wrong, himself a world-class geometer, a far-ranging exploration of the power of geometry, which turns out to help us think better about practically everything
How should a democracy choose its representatives? How can you stop a pandemic from sweeping the world? How do computers learn to play chess, and why is learning chess so much easier for them than learning to read a sentence? Can ancient Greek proportions predict the stock market? (Sorry, no.) What should your kids learn in school if they really want to learn to think? All these are questions about geometry.
For real. If you're like most people, geometry is a sterile and dimly remembered exercise you gladly left behind in the dust of 9th grade, along with your braces and active romantic interest in pop singers. If you recall any of it, it's plodding through a series of minuscule steps, only to prove some fact about triangles that was obvious to you in the first place. That's not geometry. OK, it is geometry, but only a tiny part, a border section that has as much to do with geometry in all its flush modern richness as conjugating a verb has to do with a great novel.
Shape reveals the geometry underneath some of the most important scientific, political, and philosophical problems we face. Geometry asks: where are things? Which things are near each other? How can you get from one thing to another thing? Those are important questions. The word geometry, from the Greek, has the rather grand meaning of measuring the world. If anything, that's an undersell. Geometry doesn't just measure the world - it explains it. Shape shows us how.
Read FedEx Bandwidth (what-if.xkcd.com)

If you want to transfer a few hundred gigabytes of data, it’s generally faster to FedEx a hard drive than to send the files over the internet. This isn’t a new idea—it’s often dubbed SneakerNet—and it’s how Google transfers large amounts of data internally.

But will it always be faster?

Cisco estimates that total internet traffic currently averages 167 terabits per second. FedEx has a fleet of 654 aircraft with a lift capacity of 26.5 million pounds daily. A solid-state laptop drive weighs about 78 grams and can hold up to a terabyte.

That means FedEx is capable of transferring 150 exabytes of data per day, or 14 petabits per second—almost a hundred times the current throughput of the internet.
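The arithmetic holds up. A quick sanity check of the quoted numbers (my own back-of-the-envelope code):

```python
# 26.5 million pounds of daily lift, 78 g per 1 TB solid-state drive.
drives_per_day = 26.5e6 * 453.6 / 78         # pounds -> grams, then drives
bytes_per_day = drives_per_day * 1e12        # 1 TB per drive
print(f"{bytes_per_day / 1e18:.0f} EB/day")  # ~154 EB/day ("150 exabytes")

bits_per_second = bytes_per_day * 8 / 86400
print(f"{bits_per_second / 1e15:.1f} Pb/s")  # ~14 petabits per second
print(f"vs internet: {bits_per_second / 167e12:.0f}x")  # ~85x, "almost a hundred"
```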

Read Passphrases That You Can Memorize — But That Even the NSA Can’t Guess by Micah Lee (The Intercept)

It’s getting easier to secure your digital privacy. iPhones now encrypt a great deal of personal information; hard drives on Mac and Windows 8.1 computers are now automatically locked down; even Facebook, which made a fortune on open sharing, is providing end-to-end encryption in the chat tool WhatsApp. But none of this technology offers as much protection as you may think if you don’t know how to come up with a good passphrase.
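The method the article goes on to recommend is Diceware: rolling dice to pick random words from a fixed list. A minimal sketch of the same idea in Python (the word-list filename is my assumption; the EFF long list is one common choice):

```python
import secrets

# Load a Diceware-style word list (one word per line, or EFF's
# "dice-roll<TAB>word" format; eff_large_wordlist.txt is assumed here).
with open("eff_large_wordlist.txt") as f:
    words = [line.split()[-1] for line in f if line.strip()]

# Seven words from a 7,776-word list gives about 90 bits of entropy
# (log2(7776**7) ≈ 90), plenty against offline guessing attacks.
passphrase = " ".join(secrets.choice(words) for _ in range(7))
print(passphrase)
```

The security comes entirely from the random selection, not from the words themselves, so resist the urge to swap in words you like better.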

Read What Is an Individual? Biology Seeks Clues in Information Theory. (Quanta Magazine)
To recognize strange extraterrestrial life and solve biological mysteries on this planet, scientists are searching for an objective definition for life’s basic units.
I’ve been following a bit of David’s work, but obviously there’s some newer material I need to catch up on. I like the general philosophical thrust of their direction here. I can see some useful abstractions to higher math here, maybe an analogy to a “calculus of biology” that doesn’t look at single points, but at the rates of change of those points.
Watched Learn Morse Code from a Memory Champ (in 15 minutes) by Nelson Dellis from YouTube
This is a video I've been wanting to do for a while (in part because I've wanted to learn Morse Code myself, for years!) and I've also had many requests for it.
The first method is also useful for letter frequencies (or playing something like Wheel of Fortune), while the second is actually useful for the sound memory needed to practice Morse code.
Bookmarked Application of information theory in systems biology by Shinsuke Uda (SpringerLink)
Over recent years, new light has been shed on aspects of information processing in cells. The quantification of information, as described by Shannon’s information theory, is a basic and powerful tool that can be applied to various fields, such as communication, statistics, and computer science, as well as to information processing within cells. It has also been used to infer the network structure of molecular species. However, the difficulty of obtaining sufficient sample sizes and the computational burden associated with the high-dimensional data often encountered in biology can result in bottlenecks in the application of information theory to systems biology. This article provides an overview of the application of information theory to systems biology, discussing the associated bottlenecks and reviewing recent work.
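As a concrete illustration of the sample-size bottleneck the abstract mentions, here's a minimal plug-in mutual-information estimator (the variable names and toy signal/readout model are mine, not the paper's). Histogram-based estimates like this are biased upward when samples are scarce relative to the number of bins, which is exactly the problem with high-dimensional biological data:

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Plug-in estimate of I(X;Y) in bits from paired samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                    # joint distribution
    px = pxy.sum(axis=1, keepdims=True)          # marginal of X
    py = pxy.sum(axis=0, keepdims=True)          # marginal of Y
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=200)               # e.g., upstream signal level
y = x + rng.normal(size=200)           # noisy downstream readout
print(mutual_information(x, y))        # overshoots the true 0.5 bits at small n
```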
Bookmarked Nonadditive Entropies Yield Probability Distributions with Biases not Warranted by the Data by Ken Dill (academia.edu)
Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.
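For reference, the two entropy forms at issue, in standard textbook notation rather than anything quoted from the paper. The Boltzmann-Gibbs entropy is additive over independent subsystems; the Tsallis entropy picks up a cross term, which is the nonadditivity the title refers to:

```latex
% Boltzmann-Gibbs entropy: additive over independent subsystems A and B
S_{\mathrm{BG}} = -k \sum_i p_i \ln p_i,
\qquad S_{\mathrm{BG}}(A{+}B) = S_{\mathrm{BG}}(A) + S_{\mathrm{BG}}(B).

% Tsallis entropy (q -> 1 recovers Boltzmann-Gibbs): nonadditive cross term
S_q = k\,\frac{1 - \sum_i p_i^q}{q - 1},
\qquad S_q(A{+}B) = S_q(A) + S_q(B) + \frac{1-q}{k}\,S_q(A)\,S_q(B).
```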