👓 What can Schrödinger’s cat say about 3D printers on Mars? | Aeon | Aeon Essays

Read What can Schrödinger’s cat say about 3D printers on Mars? by Michael Lachmann and Sara Walker (Aeon | Aeon Essays)
A cat is alive, a sofa is not: that much we know. But a sofa is also part of life. Information theory tells us why

A nice little essay in my area, but I’m not sure there’s anything new in it for me. It is nice that they’re trying to break some of the problem down into smaller components before building it back up into something else. Reframing things can always be helpful. Here, in particular, they’re reframing the definitions of life and alive.

🔖 Origins Of Life | Complexity Explorer

Bookmarked Origins Of Life (complexityexplorer.org)

About the Course:

This course aims to push the field of Origins of Life research forward by bringing new and synthetic thinking to the question of how life emerged from an abiotic world.

This course begins by examining the chemical, geological, physical, and biological principles that give us insight into origins of life research. We look at the chemical and geological environment of early Earth from the perspective of likely environments for life to originate.

Taking a look at modern life, we ask what it can tell us about the origin of life by winding the clock backwards. We explore which elements of modern life are absolutely essential for life, and which are arbitrary. We ponder how life arose from the huge chemical space and what this early 'living chemistry' may have looked like.

We examine phenomena that may seem particularly lifelike but are in fact likely to arise from physical dynamics alone. We analyze which physical concepts and laws bound the possibilities for life and its formation.

Insights gained from modern evolutionary theory will be applied to proto-life. Once life emerges, we consider how living systems impact the geosphere and evolve complexity. 

The study of Origins of Life is highly interdisciplinary, touching on concepts and principles from earth science, biology, chemistry, and physics. We hope the course can draw students from a broad range of fields to explore how life originated.

The course will make use of basic algebra, chemistry, and biology, but potentially difficult topics will be reviewed, and help is available in the course discussion forum and by instructor email. There will be pointers to additional resources for those who want to dig deeper.

This course is Complexity Explorer's first Frontiers Course.  A Frontiers Course gives students a tour of an active interdisciplinary research area. The goals of a Frontiers Course are to share the excitement and uncertainty of a scientific area, inspire curiosity, and possibly draw new people into the research community who can help this research area take shape!

I’m totally in for this!

Hat tip for the reminder to:

Replied to a tweet by John Stewart (Twitter)

I bookmarked a great post by Jim Luke (@econproph) a few weeks ago on scale and scope. I suspect that tech’s effect on education is heavily (if not permanently) scale-limited, but scope may be a better avenue going forward.

I also suspect that Cesar Hidalgo’s text Why Information Grows: The Evolution of Order, from Atoms to Economies may provide a strong clue with some details. To some extent I think we’ve generally reached the Shannon limit for how much information we can pour into a single brain. We now need to rely on distributed and parallel networking among people to proceed forward.

📑 Solomon Golomb (1932–2016) | Stephen Wolfram Blog

Annotated Solomon Golomb (1932–2016) by Stephen Wolfram (blog.stephenwolfram.com)

As it happens, he’d already done some work on coding theory—in the area of biology. The digital nature of DNA had been discovered by Jim Watson and Francis Crick in 1953, but it wasn’t yet clear just how sequences of the four possible base pairs encoded the 20 amino acids. In 1956, Max Delbrück—Jim Watson’s former postdoc advisor at Caltech—asked around at JPL if anyone could figure it out. Sol and two colleagues analyzed an idea of Francis Crick’s and came up with “comma-free codes” in which overlapping triples of base pairs could encode amino acids. The analysis showed that exactly 20 amino acids could be encoded this way. It seemed like an amazing explanation of what was seen—but unfortunately it isn’t how biology actually works (biology uses a more straightforward encoding, where some of the 64 possible triples just don’t represent anything).  

I recall talking to Sol about this very thing when I sat in on a course he taught at USC on combinatorics. He gave me his paper on it and a few related issues, as I was very interested at the time in the applications of information theory to biology.

I’m glad I managed to sit in on the class and still have the audio recordings and notes. While I can’t say that Newton taught me calculus, I can say I learned combinatorics from Golomb.
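
Golomb's counting argument is fun to replay computationally. Here's a small Python sketch of my own (not Sol's actual construction): count the cyclic classes of non-periodic triples over a four-letter alphabet, which gives the upper bound of (4³ − 4)/3 = 20, then backtrack to an explicit 20-codeword comma-free code.

```python
from itertools import product

ALPHABET = "ACGT"
K = 3

def rotations(w):
    return {w[i:] + w[:i] for i in range(len(w))}

def is_comma_free(code):
    # No overlapping triple inside the concatenation of any two codewords
    # may itself be a codeword.
    s = set(code)
    for u in s:
        for v in s:
            pair = u + v
            for i in range(1, K):
                if pair[i:i + K] in s:
                    return False
    return True

# The 64 triples minus the 4 periodic ones (AAA, CCC, GGG, TTT) fall into
# (4**3 - 4) / 3 = 20 cyclic classes -- the upper bound on the size of a
# comma-free code, and exactly the number of amino acids.
words = ["".join(p) for p in product(ALPHABET, repeat=K)]
classes = [sorted(c) for c in {frozenset(rotations(w)) for w in words
                               if len(rotations(w)) == K}]
print(len(classes))  # 20

def search(chosen, remaining):
    # Pick one rotation from each cyclic class so the set stays comma-free.
    if not remaining:
        return chosen
    for w in remaining[0]:
        if is_comma_free(chosen + [w]):
            found = search(chosen + [w], remaining[1:])
            if found:
                return found
    return None

code = search([], classes)
print(len(code))  # 20 -- the bound is achieved, as Golomb and colleagues showed
```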

👓 Solomon Golomb (1932–2016) | Stephen Wolfram

Read Solomon Golomb (1932–2016) by Stephen Wolfram (blog.stephenwolfram.com)

The Most-Used Mathematical Algorithm Idea in History

An octillion. A billion billion billion. That’s a fairly conservative estimate of the number of times a cellphone or other device somewhere in the world has generated a bit using a maximum-length linear-feedback shift register sequence. It’s probably the single most-used mathematical algorithm idea in history. And the main originator of this idea was Solomon Golomb, who died on May 1—and whom I knew for 35 years.

Solomon Golomb’s classic book Shift Register Sequences, published in 1967—based on his work in the 1950s—went out of print long ago. But its content lives on in pretty much every modern communications system. Read the specifications for 3G, LTE, Wi-Fi, Bluetooth, or for that matter GPS, and you’ll find mentions of polynomials that determine the shift register sequences these systems use to encode the data they send. Solomon Golomb is the person who figured out how to construct all these polynomials.

A fantastic and pretty comprehensive obituary for Sol. It does miss some of Sol’s youth, as well as his cross-town chess rivalry with Basil Gordon when the two lived in Baltimore, long before they found themselves living across town from each other again in Los Angeles.

Many of the fantastical-seeming stories here, as well as the portrait of Sol’s personality, read very true to the man I knew for almost two decades.
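
Out of curiosity, the core mechanism fits in a few lines. A minimal Python sketch, assuming a generic Fibonacci-style register rather than any particular standard's polynomial: with feedback taps taken from the primitive polynomial x⁴ + x³ + 1, a 4-bit register cycles through all 15 nonzero states, producing a maximum-length sequence.

```python
def lfsr_bits(n_bits):
    # 4-bit Fibonacci LFSR with the primitive polynomial x^4 + x^3 + 1,
    # i.e. the recurrence s_n = s_{n-1} XOR s_{n-4} over GF(2).
    state = [1, 0, 0, 0]            # any nonzero seed works
    out = []
    for _ in range(n_bits):
        out.append(state[-1])       # emit the oldest bit
        new = state[0] ^ state[-1]  # feedback from the tap positions
        state = [new] + state[:-1]  # shift the register
    return out

seq = lfsr_bits(30)
print(seq[:15])              # one full period: 2**4 - 1 = 15 bits
print(seq[:15] == seq[15:])  # True -- the maximum-length period
```

The systems Wolfram mentions use much longer registers and carefully chosen polynomials, but the principle is the same.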

📑 Solomon Golomb (1932–2016) | Stephen Wolfram Blog

Annotated Solomon Golomb (1932–2016) by Stephen Wolfram (blog.stephenwolfram.com)
in June 1955 he wrote his final report, “Sequences with Randomness Properties”—which would basically become the foundational document of the theory of shift register sequences.  

❤️ lpachter tweeted I once asked Robert McEliece whether he would mentor me.

Liked a tweet by Lior Pachter (Twitter)

👓 Robert J. McEliece, 1942–2019 | Caltech

Read Robert J. McEliece, 1942–2019 (caltech.edu)
Alumnus and engineering faculty member Robert J. McEliece has passed away.

May is apparently the month that many of the greats in information theory pass away. I was reminded of Sol Golomb’s passing in May 2016 the other day.

I didn’t know him well, but met Dr. McEliece a handful of times and at least a few of the books in my personal information theory library are hand-me-down copies from his personal library. He’ll definitely be missed.

Three open books piled on top of each other, with McEliece's signature and dates in the top right-hand corner of the first pages, along with Caltech bookstore price stamps.

📺 The Bit Player (Trailer) | IEEE Information Theory Society

Watched The Bit Player (Trailer) from IEEE Information Theory Society

In a blockbuster paper in 1948, Claude Shannon introduced the notion of a "bit" and laid the foundation for the information age. His ideas ripple through nearly every aspect of modern life, influencing such diverse fields as communication, computing, cryptography, neuroscience, artificial intelligence, cosmology, linguistics, and genetics. But when interviewed in the 1980s, Shannon was more interested in showing off the gadgets he’d constructed — juggling robots, a Rubik’s Cube solving machine, a wearable computer to win at roulette, a unicycle without pedals, a flame-throwing trumpet — than rehashing the past. Mixing contemporary interviews, archival film, animation and dialogue drawn from interviews conducted with Shannon himself, The Bit Player tells the story of an overlooked genius who revolutionized the world, but never lost his childlike curiosity.

👓 Bob Gallager on Shannon’s tips for research | An Ergodic Walk

Annotated Bob Gallager on Shannon’s tips for research (An Ergodic Walk)

Gallager gave a nice concise summary of what he learned from Shannon about how to do good theory work:

  1. Simplify the problem
  2. Relate it to other problems
  3. Restate the problem in as many ways as possible
  4. Break the problem into pieces
  5. Avoid getting locked into thinking ruts
  6. Generalize

As he said, “it’s a process of doing research… each one [step] gives you a little insight.” It’s tempting, as a theorist, to claim that at the end of this process you’ve solved the “fundamental” problem, but Gallager admonished us to remember that the first step is to simplify, often dramatically. As Alfred North Whitehead said, we should “seek simplicity and distrust it.”

I know I’ve read this before, but it deserves a re-read/review every now and then.

👓 Blue Brain solves a century-old neuroscience problem | ScienceDaily

Read Blue Brain solves a century-old neuroscience problem (ScienceDaily)
New research explains how the shapes of neurons can be classified using mathematical methods from the field of algebraic topology. Neuroscientists can now start building a formal catalogue for all the types of cells in the brain. Onto this catalogue of cells, they can systematically map the function and role in disease of each type of neuron in the brain.
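
The underlying idea is appealingly simple. A toy Python sketch of the general persistence-barcode trick (my illustration only, not the Blue Brain team's actual topological morphology descriptor pipeline): encode a neuron's branching tree as intervals of radial distance, with each leaf opening a bar and each branch point closing all but the farthest-reaching one.

```python
def barcode(node, bars):
    # node = (distance_from_soma, children). A leaf opens a bar at its
    # distance; at a branch point the farthest-reaching subtree survives
    # and the rest close, yielding (birth, death) intervals.
    dist, children = node
    if not children:
        return dist
    tips = [barcode(child, bars) for child in children]
    survivor = max(tips)          # farthest-reaching branch survives
    tips.remove(survivor)
    for tip in tips:
        bars.append((tip, dist))  # the others die at this junction
    return survivor

# A tiny mock neuron: soma at distance 0 with two branches.
tree = (0.0, [(1.0, [(3.0, []), (2.0, [])]),
              (1.5, [(4.0, [])])])

bars = []
bars.append((barcode(tree, bars), 0.0))  # longest path closes at the soma
print(bars)  # [(2.0, 1.0), (3.0, 0.0), (4.0, 0.0)]
```

Trees with different shapes produce different interval sets, which is what makes such a summary usable as a classifier.
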
🔖 From bit to it: How a complex metabolic network transforms information into living matter | Andreas Wagner | BMC Systems Biology

Bookmarked From bit to it: How a complex metabolic network transforms information into living matter by Andreas Wagner (BMC Systems Biology)

Background

Organisms live and die by the amount of information they acquire about their environment. The systems analysis of complex metabolic networks allows us to ask how such information translates into fitness. A metabolic network transforms nutrients into biomass. The better it uses information on nutrient availability, the faster it will allow a cell to divide.

Results

I here use metabolic flux balance analysis to show that the accuracy I (in bits) with which a yeast cell can sense a limiting nutrient's availability relates logarithmically to fitness as indicated by biomass yield and cell division rate. For microbes like yeast, natural selection can resolve fitness differences of genetic variants smaller than 10⁻⁶, meaning that cells would need to estimate nutrient concentrations to very high accuracy (greater than 22 bits) to ensure optimal growth. I argue that such accuracies are not achievable in practice. Natural selection may thus face fundamental limitations in maximizing the information processing capacity of cells.

Conclusion

The analysis of metabolic networks opens a door to understanding cellular biology from a quantitative, information-theoretic perspective.

https://doi.org/10.1186/1752-0509-1-33

Received: 01 March 2007 Accepted: 30 July 2007 Published: 30 July 2007

Hat tip to Paul Davies in The Demon in the Machine
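
The headline numbers are easy to sanity-check. A back-of-the-envelope Python snippet reflecting my rough reading of the abstract (not Wagner's actual flux balance model):

```python
import math

# If selection can resolve relative fitness differences of about s, and
# fitness scales roughly logarithmically with sensing accuracy I (in bits),
# then optimal growth requires estimating the limiting nutrient to on the
# order of log2(1/s) bits.
s = 1e-6                 # resolvable fitness difference for microbes
print(math.log2(1 / s))  # ~19.9 bits, the same ballpark as the >22 bits
                         # quoted in the abstract
```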

🔖 Statistical Physics of Self-Replication | Jeremy L. England | Journal of Chemical Physics

Bookmarked Statistical Physics of Self-Replication by Jeremy L. England (J. Chem. Phys. 139, 121923 (2013))
Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fueled by the production of entropy. Here, we undertake to make this intuition rigorous and quantitative by deriving a lower bound for the amount of heat that is produced during a process of self-replication in a system coupled to a thermal bath. We find that the minimum value for the physically allowed rate of heat production is determined by the growth rate, internal entropy, and durability of the replicator, and we discuss the implications of this finding for bacterial cell division, as well as for the pre-biotic emergence of self-replicating nucleic acids.
https://doi.org/10.1063/1.4818538

Syndicated copy also available on arXiv: https://arxiv.org/abs/1209.1179

Hat tip to Paul Davies in The Demon in the Machine
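
For my own notes, the bound takes roughly the following shape. The dependence on growth rate, internal entropy, and durability is straight from the abstract, but the exact form below is my recollection of the paper rather than a quotation, so check the source before relying on it:

```latex
% Schematic form of England's lower bound: the heat released per
% replication (in units of kT) plus the replicator's internal entropy
% change is at least the log of its growth rate g over its decay rate
% \delta (the inverse of its durability).
\beta \langle \Delta q \rangle + \Delta s_{\mathrm{int}}
  \;\geq\; \ln\!\left( \frac{g}{\delta} \right)
```

In words: the faster something replicates and the more durable the copies, the more heat it must dissipate at a minimum.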

🔖 The notion of information in biology, an appraisal | Jérôme Segal | Journal BIO Web of Conferences

Bookmarked The notion of information in biology, an appraisal by Jérôme Segal (Journal BIO Web of Conferences Volume 4, Page 00017, 2015; ORIGINS – Studies in Biological and Cultural Evolution)

Developed during the first half of the 20th century in three different fields (theoretical physics, statistics applied to agronomy, and telecommunication engineering), the notion of information became a scientific concept in the context of the Second World War. It is in this highly interdisciplinary environment that “information theory” emerged, combining the mathematical theory of communication and cybernetics. This theory has grown exponentially in many disciplines, including biology. The discovery of the genetic “code” benefited from the development of a common language based on information theory and fostered an almost imperialist development of molecular genetics, which culminated in the Human Genome Project. This project, however, could not fulfill all the expectations it raised, and epigenetics has shown the limits of this approach. Still, the theory of information continues to be applied in current research, whether in the application of self-correcting coding theory to explain the conservation of genomes on a geological scale or in aspects of the theory of evolution.

[pdf]

https://doi.org/10.1051/bioconf/20150400017