Replied to a tweet by John Stewart (Twitter)
I bookmarked a great post by Jim Luke (@econproph) a few weeks ago on scale and scope. I suspect that tech’s effect on education is heavily (if not permanently) scale-limited, but scope may be a better avenue going forward.

I also suspect that César Hidalgo's book Why Information Grows: The Evolution of Order, from Atoms to Economies may provide some strong clues about the details. To some extent, I think we've generally reached the Shannon limit for how much information we can pour into a single brain; to move forward, we now need to rely on distributed and parallel networking among people.

📑 Solomon Golomb (1932–2016) | Stephen Wolfram Blog

Annotated Solomon Golomb (1932–2016) by Stephen Wolfram (blog.stephenwolfram.com)

As it happens, he’d already done some work on coding theory—in the area of biology. The digital nature of DNA had been discovered by Jim Watson and Francis Crick in 1953, but it wasn’t yet clear just how sequences of the four possible base pairs encoded the 20 amino acids. In 1956, Max Delbrück—Jim Watson’s former postdoc advisor at Caltech—asked around at JPL if anyone could figure it out. Sol and two colleagues analyzed an idea of Francis Crick’s and came up with “comma-free codes” in which overlapping triples of base pairs could encode amino acids. The analysis showed that exactly 20 amino acids could be encoded this way. It seemed like an amazing explanation of what was seen—but unfortunately it isn’t how biology actually works (biology uses a more straightforward encoding, where some of the 64 possible triples just don’t represent anything).  

I recall talking to Sol about this very thing when I sat in on a course he taught at USC on combinatorics. He gave me his paper on it, along with a few related ones, as I was very interested at the time in the applications of information theory to biology.
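The counting argument Wolfram describes is easy to check computationally. Here's a minimal Python sketch (my own illustration, not Golomb's paper's presentation) that builds a maximal comma-free code of word length 3 over a 4-letter alphabet and verifies by brute force both its size — exactly 20, the number of amino acids — and the comma-free property:

```python
from itertools import product

K, N = 4, 3  # alphabet size (the four bases), word length (a codon)

# One standard maximal comma-free code for word length 3: words xyz whose
# middle letter is strictly greater than the first and at least the last.
code = [(x, y, z) for x, y, z in product(range(K), repeat=N) if x < y and z <= y]

# The theoretical maximum for length 3 is (k^3 - k)/3 = 20: one word per
# aperiodic rotation class, and exactly the number of amino acids.
assert len(code) == (K**N - K) // N == 20

# Comma-free property: no codeword may appear as an out-of-frame window
# inside the concatenation of any two codewords.
for u, v in product(code, repeat=2):
    w = u + v
    for i in range(1, N):
        assert w[i:i + N] not in code

print("comma-free code of size", len(code))
```

The comma-free property is what made the idea so seductive: a reader could recover the reading frame without any punctuation, since every out-of-frame window is guaranteed not to be a codeword, and the number 20 falls out of the mathematics. As Wolfram notes, it just isn't how biology works.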

I’m glad I managed to sit in on the class and still have the audio recordings and notes. While I can’t say that Newton taught me calculus, I can say I learned combinatorics from Golomb.

👓 Solomon Golomb (1932–2016) | Stephen Wolfram

Read Solomon Golomb (1932–2016) by Stephen Wolfram (blog.stephenwolfram.com)

The Most-Used Mathematical Algorithm Idea in History

An octillion. A billion billion billion. That’s a fairly conservative estimate of the number of times a cellphone or other device somewhere in the world has generated a bit using a maximum-length linear-feedback shift register sequence. It’s probably the single most-used mathematical algorithm idea in history. And the main originator of this idea was Solomon Golomb, who died on May 1—and whom I knew for 35 years.

Solomon Golomb’s classic book Shift Register Sequences, published in 1967—based on his work in the 1950s—went out of print long ago. But its content lives on in pretty much every modern communications system. Read the specifications for 3G, LTE, Wi-Fi, Bluetooth, or for that matter GPS, and you’ll find mentions of polynomials that determine the shift register sequences these systems use to encode the data they send. Solomon Golomb is the person who figured out how to construct all these polynomials.
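For anyone who hasn't met these, here is a minimal sketch of the idea in Python — a generic Fibonacci-style LFSR of my own construction, not any particular standard's implementation. With feedback taps taken from a primitive polynomial, an n-bit register cycles through all 2^n − 1 nonzero states, emitting a maximum-length sequence:

```python
def m_sequence(taps=(4, 1), nbits=4, seed=0b0001):
    """One full period of a maximum-length LFSR sequence (Fibonacci form).

    `taps` are the exponents of the feedback polynomial -- here x^4 + x + 1,
    which is primitive, under one common bit-ordering convention."""
    state, out = seed, []
    for _ in range((1 << nbits) - 1):       # period of an m-sequence: 2^n - 1
        fb = 0
        for t in taps:                      # feedback = XOR of the tapped bits
            fb ^= (state >> (t - 1)) & 1
        out.append(state & 1)               # emit the register's low bit
        state = (state >> 1) | (fb << (nbits - 1))
    assert state == seed  # with primitive taps, all 15 nonzero states visited
    return out

print(m_sequence())  # 15 bits before the pattern repeats
```

Real systems use much longer registers — the GPS C/A codes, for instance, combine two 10-stage registers with period 1023 — but the mechanism is the same handful of XORs, which is why this idea has plausibly been executed more times than any other algorithm in history.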

A fantastic and pretty comprehensive obit for Sol. It does leave out some of Sol’s youth, as well as his cross-town chess rivalry with Basil Gordon, which began when they both lived in Baltimore and resumed when they later found themselves across town from each other again in Los Angeles.

Many of the fantastical-seeming stories here, as well as the portrait of Sol’s personality, ring very true to the man I knew for almost two decades.

📑 Solomon Golomb (1932–2016) | Stephen Wolfram Blog

Annotated Solomon Golomb (1932–2016) by Stephen Wolfram (blog.stephenwolfram.com)
in June 1955 he wrote his final report, “Sequences with Randomness Properties”—which would basically become the foundational document of the theory of shift register sequences.  

❤️ lpachter tweeted I once asked Robert McEliece whether he would mentor me.

Liked a tweet by Lior Pachter (Twitter)

👓 Robert J. McEliece, 1942–2019 | Caltech

Read Robert J. McEliece, 1942–2019 (caltech.edu)
Alumnus and engineering faculty member Robert J. McEliece has passed away.
May is apparently the month that many of the greats in information theory pass away. I was reminded of Sol Golomb’s passing in May 2016 the other day.

I didn’t know him well, but met Dr. McEliece a handful of times and at least a few of the books in my personal information theory library are hand-me-down copies from his personal library. He’ll definitely be missed.

Three open books piled on top of each other, with McEliece's signature and dates in the top right corner of the first page, and Caltech bookstore price stamps in them as well.

📺 The Bit Player (Trailer) | IEEE Information Theory Society

Watched The Bit Player (Trailer) from IEEE Information Theory Society


In a blockbuster paper in 1948, Claude Shannon introduced the notion of a "bit" and laid the foundation for the information age. His ideas ripple through nearly every aspect of modern life, influencing such diverse fields as communication, computing, cryptography, neuroscience, artificial intelligence, cosmology, linguistics, and genetics. But when interviewed in the 1980s, Shannon was more interested in showing off the gadgets he’d constructed — juggling robots, a Rubik’s Cube solving machine, a wearable computer to win at roulette, a unicycle without pedals, a flame-throwing trumpet — than rehashing the past. Mixing contemporary interviews, archival film, animation and dialogue drawn from interviews conducted with Shannon himself, The Bit Player tells the story of an overlooked genius who revolutionized the world, but never lost his childlike curiosity.

👓 Bob Gallager on Shannon’s tips for research | An Ergodic Walk

Annotated Bob Gallager on Shannon’s tips for research (An Ergodic Walk)

Gallager gave a nice concise summary of what he learned from Shannon about how to do good theory work:

  1. Simplify the problem
  2. Relate it to other problems
  3. Restate the problem in as many ways as possible
  4. Break the problem into pieces
  5. Avoid getting locked into thinking ruts
  6. Generalize

As he said, “it’s a process of doing research… each one [step] gives you a little insight.” It’s tempting, as a theorist, to claim that at the end of this process you’ve solved the “fundamental” problem, but Gallager admonished us to remember that the first step is to simplify, often dramatically. As Alfred North Whitehead said, we should “seek simplicity and distrust it.”

I know I’ve read this before, but it deserves a re-read/review every now and then.
Read Blue Brain solves a century-old neuroscience problem (ScienceDaily)
New research explains how the shapes of neurons can be classified using mathematical methods from the field of algebraic topology. Neuroscientists can now start building a formal catalogue for all the types of cells in the brain. Onto this catalogue of cells, they can systematically map the function and role in disease of each type of neuron in the brain.
Bookmarked From bit to it: How a complex metabolic network transforms information into living matter by Andreas Wagner (BMC Systems Biology)

Background

Organisms live and die by the amount of information they acquire about their environment. The systems analysis of complex metabolic networks allows us to ask how such information translates into fitness. A metabolic network transforms nutrients into biomass. The better it uses information on nutrient availability, the faster it will allow a cell to divide.

Results

I here use metabolic flux balance analysis to show that the accuracy I (in bits) with which a yeast cell can sense a limiting nutrient's availability relates logarithmically to fitness as indicated by biomass yield and cell division rate. For microbes like yeast, natural selection can resolve fitness differences of genetic variants smaller than 10^−6, meaning that cells would need to estimate nutrient concentrations to very high accuracy (greater than 22 bits) to ensure optimal growth. I argue that such accuracies are not achievable in practice. Natural selection may thus face fundamental limitations in maximizing the information processing capacity of cells.

Conclusion

The analysis of metabolic networks opens a door to understanding cellular biology from a quantitative, information-theoretic perspective.

https://doi.org/10.1186/1752-0509-1-33

Received: 01 March 2007 Accepted: 30 July 2007 Published: 30 July 2007
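As a rough sanity check on the abstract's numbers (my own gloss, not the paper's flux-balance calculation): resolving a relative fitness difference of one part in 10^6 requires on the order of log2(10^6) ≈ 19.9 bits of sensing accuracy, so the paper's figure of more than 22 bits sits just above this naive information-theoretic floor.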

Hat tip to Paul Davies in The Demon in the Machine
Bookmarked Statistical Physics of Self-Replication by Jeremy L. England (J. Chem. Phys. 139, 121923 (2013))
Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fueled by the production of entropy. Here, we undertake to make this intuition rigorous and quantitative by deriving a lower bound for the amount of heat that is produced during a process of self-replication in a system coupled to a thermal bath. We find that the minimum value for the physically allowed rate of heat production is determined by the growth rate, internal entropy, and durability of the replicator, and we discuss the implications of this finding for bacterial cell division, as well as for the pre-biotic emergence of self-replicating nucleic acids.
https://doi.org/10.1063/1.4818538
Syndicated copy also available on arXiv: https://arxiv.org/abs/1209.1179

Hat tip to Paul Davies in The Demon in the Machine

🔖 The notion of information in biology, an appraisal | Jérôme Segal | Journal BIO Web of Conferences

Bookmarked The notion of information in biology, an appraisal by Jérôme Segal (Journal BIO Web of Conferences Volume 4, Page 00017, 2015; ORIGINS – Studies in Biological and Cultural Evolution)

Developed during the first half of the 20th century in three different fields (theoretical physics, statistics applied to agronomy, and telecommunication engineering), the notion of information became a scientific concept in the context of the Second World War. It is in this highly interdisciplinary environment that “information theory” emerged, combining the mathematical theory of communication and cybernetics. This theory has grown exponentially in many disciplines, including biology. The discovery of the genetic “code” benefited from the development of a common language based on information theory and fostered an almost imperialist development of molecular genetics, which culminated in the Human Genome Project. This project, however, could not fulfill all the expectations it raised, and epigenetics has shown the limits of this approach. Still, the theory of information continues to be applied in current research, whether in the application of self-correcting coding theory to explain the conservation of genomes on a geological scale or in aspects of the theory of evolution.

[pdf]

https://doi.org/10.1051/bioconf/20150400017

🔖 The Negentropy Principle of Information by Leon Brillouin | Journal of Applied Physics: Vol 24, No 9

Bookmarked The Negentropy Principle of Information by Leon Brillouin (Journal of Applied Physics 24, 1152 (1953))

The statistical definition of information is compared with Boltzmann's formula for entropy. The immediate result is that information I corresponds to a negative term in the total entropy S of a system: S = S0 − I. A generalized second principle states that S must always increase. If an experiment yields an increase ΔI of the information concerning a physical system, it must be paid for by a larger increase ΔS0 in the entropy of the system and its surrounding laboratory. The efficiency ε of the experiment is defined as ε = ΔI/ΔS0 ≤ 1. Moreover, there is a lower limit k ln 2 (k, Boltzmann's constant) for the ΔS0 required in an observation. Some specific examples are discussed: length or distance measurements, time measurements, observations under a microscope. In all cases it is found that higher accuracy always means lower efficiency. The information ΔI increases as the logarithm of the accuracy, while ΔS0 goes up faster than the accuracy itself. Exceptional circumstances arise when extremely small distances (of the order of nuclear dimensions) have to be measured, in which case the efficiency drops to exceedingly low values. This stupendous increase in the cost of observation is a new factor that should probably be included in the quantum theory.

https://doi.org/10.1063/1.1721463

First appearance of the word “negentropy” that I’ve seen in the literature.
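Brillouin's lower limit is minuscule in absolute terms, which is easy to forget. A quick back-of-the-envelope (my own numbers, using the exact SI value of Boltzmann's constant) puts it in perspective:

```python
import math

k_B = 1.380649e-23              # Boltzmann's constant, J/K (exact in the 2019 SI)

dS_min = k_B * math.log(2)      # Brillouin's lower limit on entropy paid per bit
print(f"minimum entropy cost per bit: {dS_min:.3e} J/K")    # ~9.57e-24 J/K

T = 300.0                       # room temperature, K
dQ_min = k_B * T * math.log(2)  # corresponding minimum heat dissipated at T
print(f"minimum heat at 300 K:       {dQ_min:.3e} J")       # ~2.87e-21 J
```

This is the same kT ln 2 figure that later appears as the Landauer limit for erasing one bit, which is presumably part of why the negentropy framing kept resurfacing in the thermodynamics-of-computation literature.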

👓 Celebrating the Work and Life of Claude Elwood Shannon | IEEE Foundation

Read Celebrating the Work and Life of Claude Elwood Shannon (ieeefoundation.org)

Claude Shannon

In 2014, IEEE Information Theory Society President Michelle Effros knew that something had to be done. The man who coined the very phrase “information theory” had largely been forgotten. Given his importance, and the growing impact his work was having on society at large, she led the IEEE Information Theory Society on a quest to use the centennial of Claude Shannon’s birth to right this injustice.

A series of activities were planned, including a dual IEEE Milestone dedicated at both Nokia Bell Labs and MIT. Such was his stature that both institutions were intent on honoring the work he accomplished on their respective sites. His work, after all, foresaw and paved the way for the Information Revolution that we are experiencing, making possible everything from cell phones to GPS to Bitcoin.

By the time of the Nokia Bell Labs event, the keystone project, a documentary on Shannon’s life, was in the formative stages. IEEE Information Theory Society leadership had secured the services of Mark Levinson, of Particle Fever acclaim. The script was being written and preliminary plans were underway.

To make the film a reality, a coalition of individuals, foundations, and corporations came together with the common objective of bringing the story of Shannon to as wide an audience as possible. An effective partnership was forged with the IEEE Foundation, which was undertaking its own unique project: its first-ever major fundraising campaign. The combination proved to be a winning one, and the Shannon Centennial quickly became exemplary of the impact that can occur when the power of volunteers is bolstered by effective staff support.

On 19 June, the finished film had its world premiere: The Bit Player was screened to a full house on the big screen at the IEEE Information Theory Society’s meeting in Vail, CO, US. The film was met with enthusiastic acclaim. Following the screening, attendees were treated to a Q&A with the film’s director and star.

Among the techniques used to tell Shannon’s story was the testimony of current luminaries in the fields he inspired. All spoke of his importance and the need for his impact to be recognized. As one contributor, Andrea Goldsmith, Stephen Harris Professor in the School of Engineering, Stanford University, put it, “Today everyone carries Shannon around in their pocket”.

Based on this article, the Claude Shannon movie The Bit Player has already had its premiere. I updated the IMDb entry, but I still have to wonder whether it will ever get distribution so the rest of us can see it.