The Most-Used Mathematical Algorithm Idea in History
An octillion. A billion billion billion. That’s a fairly conservative estimate of the number of times a cellphone or other device somewhere in the world has generated a bit using a maximum-length linear-feedback shift register sequence. It’s probably the single most-used mathematical algorithm idea in history. And the main originator of this idea was Solomon Golomb, who died on May 1—and whom I knew for 35 years.
Solomon Golomb’s classic book Shift Register Sequences, published in 1967—based on his work in the 1950s—went out of print long ago. But its content lives on in pretty much every modern communications system. Read the specifications for 3G, LTE, Wi-Fi, Bluetooth, or for that matter GPS, and you’ll find mentions of polynomials that determine the shift register sequences these systems use to encode the data they send. Solomon Golomb is the person who figured out how to construct all these polynomials.
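For the curious, a maximum-length LFSR takes only a few lines of code. The sketch below is mine, not Golomb’s notation: a Fibonacci-style register using the primitive polynomial x⁴ + x³ + 1 over GF(2), which drives a 4-bit register through all 2⁴ − 1 = 15 nonzero states before repeating.

```python
def lfsr_bits(taps, width, state, n):
    """Fibonacci LFSR: return n output bits.

    taps  -- shift amounts whose bits are XORed to form the feedback bit
    width -- register length in bits
    state -- nonzero initial register value
    """
    bits = []
    for _ in range(n):
        bits.append(state & 1)                     # output the low bit
        fb = 0
        for t in taps:
            fb ^= (state >> t) & 1                 # XOR the tapped bits
        state = (state >> 1) | (fb << (width - 1)) # shift, insert feedback at MSB
    return bits

# Taps {0, 1} encode the primitive polynomial x^4 + x^3 + 1;
# the resulting m-sequence has the maximum period 2^4 - 1 = 15.
seq = lfsr_bits(taps=[0, 1], width=4, state=0b1000, n=45)
print(seq[:15])                   # one full period of the m-sequence
assert seq[:15] == seq[15:30] == seq[30:45]
```

Real systems use much longer registers (GPS’s C/A code, for instance, combines two 10-stage registers), but the balance and run properties Golomb characterized already show up at this toy size: each period of 15 bits contains exactly eight 1s and seven 0s.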
A fantastic and fairly comprehensive obit for Sol. It does, however, miss out on more of Sol’s youth, as well as his cross-town chess rivalry with Basil Gordon, first when they both lived in Baltimore and later when they again lived across town from each other in Los Angeles.
Many of the fantastical-seeming stories here, as well as the portrait of Sol’s personality, ring very true to me with respect to the man I knew for almost two decades.
I once asked Robert McEliece whether he would mentor me. I told him I liked combinatorics and it seemed relevant for EE. He explained that "you find an interesting problem and then use the relevant tools, not the other way around." Formative experience. https://t.co/3IDTgdSJsF
Alumnus and engineering faculty member Robert J. McEliece has passed away.
May is apparently the month that many of the greats in information theory pass away. I was reminded of Sol Golomb’s passing in May 2016 the other day.
I didn’t know him well, but met Dr. McEliece a handful of times and at least a few of the books in my personal information theory library are hand-me-down copies from his personal library. He’ll definitely be missed.
In a blockbuster paper in 1948, Claude Shannon introduced the notion of a "bit" and laid the foundation for the information age. His ideas ripple through nearly every aspect of modern life, influencing such diverse fields as communication, computing, cryptography, neuroscience, artificial intelligence, cosmology, linguistics, and genetics. But when interviewed in the 1980s, Shannon was more interested in showing off the gadgets he’d constructed — juggling robots, a Rubik’s Cube solving machine, a wearable computer to win at roulette, a unicycle without pedals, a flame-throwing trumpet — than rehashing the past. Mixing contemporary interviews, archival film, animation and dialogue drawn from interviews conducted with Shannon himself, The Bit Player tells the story of an overlooked genius who revolutionized the world, but never lost his childlike curiosity.
Gallager gave a nice concise summary of what he learned from Shannon about how to do good theory work:
Simplify the problem
Relate it to other problems
Restate the problem in as many ways as possible
Break the problem into pieces
Avoid getting locked into thinking ruts
As he said, “it’s a process of doing research… each one [step] gives you a little insight.” It’s tempting, as a theorist, to claim that at the end of this process you’ve solved the “fundamental” problem, but Gallager admonished us to remember that the first step is to simplify, often dramatically. As Alfred North Whitehead said, we should “seek simplicity and distrust it.”
I know I’ve read this before, but it deserves a re-read/review every now and then.
New research explains how the shapes of neurons can be classified using mathematical methods from the field of algebraic topology. Neuroscientists can now start building a formal catalogue for all the types of cells in the brain. Onto this catalogue of cells, they can systematically map the function and role in disease of each type of neuron in the brain.
Organisms live and die by the amount of information they acquire about their environment. The systems analysis of complex metabolic networks allows us to ask how such information translates into fitness. A metabolic network transforms nutrients into biomass. The better it uses information about nutrient availability, the faster it will allow a cell to divide.
I here use metabolic flux balance analysis to show that the accuracy I (in bits) with which a yeast cell can sense a limiting nutrient's availability relates logarithmically to fitness as indicated by biomass yield and cell division rate. For microbes like yeast, natural selection can resolve fitness differences of genetic variants smaller than 10⁻⁶, meaning that cells would need to estimate nutrient concentrations to very high accuracy (greater than 22 bits) to ensure optimal growth. I argue that such accuracies are not achievable in practice. Natural selection may thus face fundamental limitations in maximizing the information processing capacity of cells.
The analysis of metabolic networks opens a door to understanding cellular biology from a quantitative, information-theoretic perspective.
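As a rough back-of-envelope check (my arithmetic, not the paper's model): if distinguishing a relative fitness difference of size s takes on the order of log₂(1/s) bits of sensing accuracy, then s = 10⁻⁶ already demands about 20 bits, the same order of magnitude as the greater-than-22-bit figure quoted above.

```python
import math

s = 1e-6                  # smallest fitness difference selection can resolve
bits = math.log2(1 / s)   # bits needed to distinguish differences of size s
print(f"{bits:.1f} bits") # ~19.9 bits, same order as the >22 bits cited
```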
Self-replication is a capacity common to every species of living thing, and simple physical intuition dictates that such a process must invariably be fueled by the production of entropy. Here, we undertake to make this intuition rigorous and quantitative by deriving a lower bound for the amount of heat that is produced during a process of self-replication in a system coupled to a thermal bath. We find that the minimum value for the physically allowed rate of heat production is determined by the growth rate, internal entropy, and durability of the replicator, and we discuss the implications of this finding for bacterial cell division, as well as for the pre-biotic emergence of self-replicating nucleic acids. https://doi.org/10.1063/1.4818538
Developed during the first half of the 20th century in three different fields (theoretical physics, statistics applied to agronomy, and telecommunication engineering), the notion of information became a scientific concept in the context of the Second World War. It is in this highly interdisciplinary environment that “information theory” emerged, combining the mathematical theory of communication and cybernetics. The theory has since grown exponentially across many disciplines, including biology. The discovery of the genetic “code” benefited from the development of a common language based on information theory and fostered an almost imperialist expansion of molecular genetics, which culminated in the Human Genome Project. That project, however, could not fulfill all the expectations it raised, and epigenetics has shown the limits of this approach. Still, information theory continues to be applied in current research, whether in applying self-correcting coding theory to explain the conservation of genomes on a geological timescale, or in aspects of the theory of evolution.
The statistical definition of information is compared with Boltzmann's formula for entropy. The immediate result is that information I corresponds to a negative term in the total entropy S of a system.
A generalized second principle states that S must always increase. If an experiment yields an increase ΔI of the information concerning a physical system, it must be paid for by a larger increase ΔS0 in the entropy of the system and its surrounding laboratory. The efficiency ε of the experiment is defined as ε = ΔI/ΔS0 ≤ 1. Moreover, there is a lower limit k ln2 (k, Boltzmann's constant) for the ΔS0 required in an observation. Some specific examples are discussed: length or distance measurements, time measurements, observations under a microscope. In all cases it is found that higher accuracy always means lower efficiency. The information ΔI increases as the logarithm of the accuracy, while ΔS0 goes up faster than the accuracy itself. Exceptional circumstances arise when extremely small distances (of the order of nuclear dimensions) have to be measured, in which case the efficiency drops to exceedingly low values. This stupendous increase in the cost of observation is a new factor that should probably be included in the quantum theory.
First appearance of the word “negentropy” that I’ve seen in the literature.
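Brillouin's lower limit k ln 2 is minuscule in absolute terms. A quick computation (mine) of that minimum entropy cost per bit, and the corresponding energy cost at room temperature, which is the quantity Landauer's later bound made famous:

```python
import math

k = 1.380649e-23          # Boltzmann's constant, J/K (exact in the 2019 SI)
T = 300.0                 # room temperature, K

dS_min = k * math.log(2)  # Brillouin's minimum entropy cost of one bit of observation
E_min = dS_min * T        # corresponding minimum energy cost at temperature T

print(f"min entropy per bit: {dS_min:.3e} J/K")  # ~9.57e-24 J/K
print(f"min energy at 300 K: {E_min:.3e} J")     # ~2.87e-21 J
```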
In 2014, IEEE Information Theory Society President Michelle Effros knew that something had to be done. The man who coined the very phrase “information theory” had largely been forgotten. Given his importance, and the growing impact his work was having on society at large, she led the IEEE Information Theory Society on a quest to use the centennial of Claude Shannon’s birth to right this injustice.
A series of activities were planned, including a dual IEEE Milestone dedicated at both Nokia Bell Labs and MIT. Such was his stature that both institutions were intent on honoring the work he accomplished on their respective sites. His work, after all, foresaw and paved the way for the Information Revolution that we are experiencing, making possible everything from cell phones to GPS to Bitcoin.
By the time of the Nokia Bell Labs event, the keystone project, a documentary on Shannon’s life, was in the formative stages. IEEE Information Theory Society leadership had secured the services of Mark Levinson, of Particle Fever acclaim. The script was being written and preliminary plans were underway.
To make the film a reality, a coalition of individuals, foundations and corporations came together with the common objective of bringing the story of Shannon to as wide an audience as possible. An effective partnership was forged with the IEEE Foundation, which was undertaking its own unique project - its first ever major fundraising campaign. The combination proved a winning one, and the Shannon Centennial quickly became exemplary of the impact that can occur when the power of volunteers is bolstered by effective staff support.
19 June marked the world premiere of the finished product. The Bit Player was screened to a full house on the big screen at the IEEE Information Theory Society’s meeting in Vail, CO, US. The film was met with enthusiastic acclaim. Following the screening, attendees were treated to a Q&A with the film’s director and star.
Among the techniques used to tell Shannon’s story was the testimony of current luminaries in the fields he inspired. All spoke of his importance and the need for his impact to be recognized. As one contributor, Andrea Goldsmith, Stephen Harris Professor in the School of Engineering, Stanford University, put it, “Today everyone carries Shannon around in their pocket”.
McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence.
Quick note of a factual and temporal error: the article indicates:
After all, it had been Wiener who discovered a precise mathematical definition of information: The higher the probability, the higher the entropy and the lower the information content.
In fact, it was Claude E. Shannon, one of Wiener’s colleagues, who wrote the influential A Mathematical Theory of Communication, published in the Bell System Technical Journal in 1948, almost five years after the 1943 point in the timeline the article describes. Not only did Wiener not write the paper, it wouldn’t have existed yet to be a factor in Pitts’s choice of a school or adviser at the time. While Wiener may have been a tremendous polymath, I suspect that his mathematical expertise during those years lay closer to analysis than to probability theory.
To put Pitts & McCulloch’s work into additional context, Claude Shannon’s stunning MIT master’s thesis A symbolic analysis of relay and switching circuits in 1940 applied Boolean algebra to electronic circuits for the first time and as a result largely allowed the digital age to blossom. It would be nice to know if Pitts & McCulloch were aware of it when they published their work three years later.
Walter Pitts was used to being bullied. He’d been born into a tough family in Prohibition-era Detroit, where his father, a boiler-maker,…
Highlights, Quotes, Annotations, & Marginalia
McCulloch was a confident, gray-eyed, wild-bearded, chain-smoking philosopher-poet who lived on whiskey and ice cream and never went to bed before 4 a.m. ❧
Now that is a business card title!
March 03, 2019 at 06:01PM
McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence. ❧
March 03, 2019 at 06:06PM
Gottfried Leibniz. The 17th-century philosopher had attempted to create an alphabet of human thought, each letter of which represented a concept and could be combined and manipulated according to a set of logical rules to compute all knowledge—a vision that promised to transform the imperfect outside world into the rational sanctuary of a library. ❧
I don’t think I’ve ever heard this quirky story…
March 03, 2019 at 06:08PM
Which got McCulloch thinking about neurons. He knew that each of the brain’s nerve cells only fires after a minimum threshold has been reached: Enough of its neighboring nerve cells must send signals across the neuron’s synapses before it will fire off its own electrical spike. It occurred to McCulloch that this set-up was binary—either the neuron fires or it doesn’t. A neuron’s signal, he realized, is a proposition, and neurons seemed to work like logic gates, taking in multiple inputs and producing a single output. By varying a neuron’s firing threshold, it could be made to perform “and,” “or,” and “not” functions. ❧
I’m curious what year this was, particularly in relation to Claude Shannon’s master’s thesis in which he applied Boolean algebra to electronics.
Based on their meeting date, it would have to be after 1940. And they published in 1943: https://link.springer.com/article/10.1007%2FBF02478259
March 03, 2019 at 06:14PM
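The threshold behavior described in that highlight is easy to see in code. Here is a minimal McCulloch-Pitts-style unit (my sketch, not their 1943 notation): a neuron fires exactly when the weighted sum of its inputs reaches its threshold, and “and,” “or,” and “not” fall out of the choice of threshold and of an inhibitory input.

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fire (1) iff the weighted input sum meets the threshold."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

# AND: both excitatory inputs must be active (threshold 2)
AND = lambda a, b: mp_neuron([a, b], [1, 1], threshold=2)
# OR: any single excitatory input suffices (threshold 1)
OR = lambda a, b: mp_neuron([a, b], [1, 1], threshold=1)
# NOT: one inhibitory (negative-weight) input with threshold 0
NOT = lambda a: mp_neuron([a], [-1], threshold=0)

assert [AND(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 0, 0, 1]
assert [OR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 1]
assert [NOT(a) for a in (0, 1)] == [1, 0]
```

Since these three gates are functionally complete, chaining such units suffices for any Boolean function, which is the bridge from neurons to logic that the article describes.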
McCulloch and Pitts alone would pour the whiskey, hunker down, and attempt to build a computational brain from the neuron up. ❧
A nice way to pass the time to be sure. Naturally mathematicians would have been turning “coffee into theorems” instead of whiskey.
March 03, 2019 at 06:15PM
“an idea wrenched out of time.” In other words, a memory. ❧
March 03, 2019 at 06:17PM
McCulloch and Pitts wrote up their findings in a now-seminal paper, “A Logical Calculus of Ideas Immanent in Nervous Activity,” published in the Bulletin of Mathematical Biophysics. ❧
it had been Wiener who discovered a precise mathematical definition of information: The higher the probability, the higher the entropy and the lower the information content. ❧
Oops, I think this article is confusing Wiener with Claude Shannon?
March 03, 2019 at 06:34PM
By the fall of 1943, Pitts had moved into a Cambridge apartment, was enrolled as a special student at MIT, and was studying under one of the most influential scientists in the world. ❧
March 03, 2019 at 06:32PM
Thus formed the beginnings of the group who would become known as the cyberneticians, with Wiener, Pitts, McCulloch, Lettvin, and von Neumann its core. ❧
Wiener always did like “cyberneticians” for its parallelism with “mathematicians”….
March 03, 2019 at 06:38PM
In the entire report, he cited only a single paper: “A Logical Calculus” by McCulloch and Pitts. ❧
First Draft of a Report on the EDVAC by John von Neumann
March 03, 2019 at 06:43PM
Oliver Selfridge, an MIT student who would become “the father of machine perception”; Hyman Minsky, the future economist; and Lettvin. ❧
March 03, 2019 at 06:44PM
at the Second Cybernetic Conference, Pitts announced that he was writing his doctoral dissertation on probabilistic three-dimensional neural networks. ❧
March 03, 2019 at 06:44PM
In June 1954, Fortune magazine ran an article featuring the 20 most talented scientists under 40; Pitts was featured, next to Claude Shannon and James Watson. ❧
March 03, 2019 at 06:46PM
Lettvin, along with the young neuroscientist Patrick Wall, joined McCulloch and Pitts at their new headquarters in Building 20 on Vassar Street. They posted a sign on the door: Experimental Epistemology. ❧
March 03, 2019 at 06:47PM
“The eye speaks to the brain in a language already highly organized and interpreted,” they reported in the now-seminal paper “What the Frog’s Eye Tells the Frog’s Brain,” published in 1959. ❧
March 03, 2019 at 06:50PM
There was a catch, though: This symbolic abstraction made the world transparent but the brain opaque. Once everything had been reduced to information governed by logic, the actual mechanics ceased to matter—the tradeoff for universal computation was ontology. Von Neumann was the first to see the problem. He expressed his concern to Wiener in a letter that anticipated the coming split between artificial intelligence on one side and neuroscience on the other. “After the great positive contribution of Turing-cum-Pitts-and-McCulloch is assimilated,” he wrote, “the situation is rather worse than better than before. Indeed these authors have demonstrated in absolute and hopeless generality that anything and everything … can be done by an appropriate mechanism, and specifically by a neural mechanism—and that even one, definite mechanism can be ‘universal.’ Inverting the argument: Nothing that we may know or learn about the functioning of the organism can give, without ‘microscopic,’ cytological work any clues regarding the further details of the neural mechanism.” ❧
March 03, 2019 at 06:54PM
Nature had chosen the messiness of life over the austerity of logic, a choice Pitts likely could not comprehend. He had no way of knowing that while his ideas about the biological brain were not panning out, they were setting in motion the age of digital computing, the neural network approach to machine learning, and the so-called connectionist philosophy of mind. ❧
March 03, 2019 at 06:55PM
by stringing them together exactly as Pitts and McCulloch had discovered, you could carry out any computation. ❧
I feel like this is something more akin to what may have been already known from Boolean algebra and Whitehead/Russell by this time. Certainly Shannon would have known of it?
Henry Quastler (November 11, 1908 – July 4, 1963) was an Austrian physician and radiologist who became a pioneer in the field of information theory applied to biology after emigrating to America. His work with Sidney Dancoff led to the publication of what is now commonly called Dancoff's Law.
Spent a moment to make a few additions to the page as well…