👓 Celebrating the Work and Life of Claude Elwood Shannon | IEEE Foundation

Read Celebrating the Work and Life of Claude Elwood Shannon (ieeefoundation.org)

Claude Shannon

In 2014, IEEE Information Theory Society President Michelle Effros knew that something had to be done. The man who coined the very phrase "information theory" had largely been forgotten. Given his importance and the growing impact his work was having on society at large, she led the IEEE Information Theory Society on a quest to use the centennial of Claude Shannon's birth to right this injustice.

A series of activities was planned, including dual IEEE Milestone dedications at both Nokia Bell Labs and MIT. Such was his stature that both institutions were intent on honoring the work he accomplished on their respective sites. His work, after all, foresaw and paved the way for the Information Revolution we are experiencing, making possible everything from cell phones to GPS to Bitcoin.

By the time of the Nokia Bell Labs event, the keystone project, a documentary on Shannon's life, was in its formative stages. IEEE Information Theory Society leadership had secured the services of Mark Levinson, of Particle Fever acclaim. The script was being written and preliminary plans were underway.

To make the film a reality, a coalition of individuals, foundations, and corporations came together with the common objective of bringing the story of Shannon to as wide an audience as possible. An effective partnership was forged with the IEEE Foundation, which was undertaking its own unique project: its first-ever major fundraising campaign. The combination proved to be a winning one, and the Shannon Centennial quickly became exemplary of the impact that can occur when the power of volunteers is bolstered by effective staff support.

The finished product had its world premiere on 19 June. The Bit Player was screened to a full house on the big screen at the IEEE Information Theory Society's meeting in Vail, CO, US, and was met with enthusiastic acclaim. Following the screening, attendees were treated to a Q&A with the film's director and star.

Among the techniques used to tell Shannon’s story was the testimony of current luminaries in the fields he inspired. All spoke of his importance and the need for his impact to be recognized. As one contributor, Andrea Goldsmith, Stephen Harris Professor in the School of Engineering, Stanford University, put it, “Today everyone carries Shannon around in their pocket”.

Based on this article, the Claude Shannon movie The Bit Player has already had its premiere. I updated the IMDb entry, but I still have to wonder whether it will ever get distribution so that the rest of us can see it.

Reply to The Man Who Tried to Redeem the World with Logic | Nautilus

Replied to The Man Who Tried to Redeem the World with Logic by Amanda Gefter (Nautilus)
McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence.

A quick note on a factual and temporal error: the article indicates:

After all, it had been Wiener who discovered a precise mathematical definition of information: The higher the probability, the higher the entropy and the lower the information content.

In fact, it was Claude E. Shannon, one of Wiener's colleagues, who wrote the influential A Mathematical Theory of Communication, published in the Bell System Technical Journal in 1948, almost five years after the 1943 point in the article's timeline. Not only did Wiener not write the paper, it didn't yet exist to be a factor in Pitts's decision to choose a school or adviser at the time. While Wiener may have been a tremendous polymath, I suspect that his mathematical area of expertise during those years was closer to analysis than to probability theory.
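For concreteness, here is a minimal sketch (my own illustration, not from either article) of the definitions Shannon published in 1948: the self-information of an outcome is the negative base-2 logarithm of its probability, so more probable events carry less information, and entropy is the expected self-information across a distribution.

```python
import math

def self_information(p):
    """Shannon self-information of an outcome with probability p, in bits:
    I(x) = -log2 p(x). Rarer outcomes carry more information."""
    return -math.log2(p)

def entropy(dist):
    """Shannon entropy of a discrete distribution, in bits:
    H = -sum(p * log2 p); zero-probability terms contribute nothing."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A fair coin is maximally unpredictable for two outcomes: 1 bit per flip.
print(entropy([0.5, 0.5]))
# A heavily biased coin is more predictable, hence lower entropy.
print(entropy([0.99, 0.01]))
# An unlikely outcome is more informative when it actually occurs.
print(self_information(0.01))
```

Note how this runs opposite to the quoted sentence: higher probability means lower information content and, for a lopsided distribution, lower entropy.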

To put Pitts & McCulloch's work into additional context, Claude Shannon's stunning 1940 MIT master's thesis A Symbolic Analysis of Relay and Switching Circuits applied Boolean algebra to switching circuits for the first time and, as a result, largely allowed the digital age to blossom. It would be nice to know whether Pitts & McCulloch were aware of it when they published their work three years later.

👓 The Man Who Tried to Redeem the World with Logic | Issue 21: Information – Nautilus

Read The Man Who Tried to Redeem the World with Logic (Nautilus)
Walter Pitts was used to being bullied. He’d been born into a tough family in Prohibition-era Detroit, where his father, a boiler-maker,…

Highlights, Quotes, Annotations, & Marginalia

McCulloch was a confident, gray-eyed, wild-bearded, chain-smoking philosopher-poet who lived on whiskey and ice cream and never went to bed before 4 a.m.  

Now that is a business card title!

March 03, 2019 at 06:01PM

McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence.  

tl;dr

March 03, 2019 at 06:06PM

Gottfried Leibniz. The 17th-century philosopher had attempted to create an alphabet of human thought, each letter of which represented a concept and could be combined and manipulated according to a set of logical rules to compute all knowledge—a vision that promised to transform the imperfect outside world into the rational sanctuary of a library.  

I don’t think I’ve ever heard this quirky story…

March 03, 2019 at 06:08PM

Which got McCulloch thinking about neurons. He knew that each of the brain’s nerve cells only fires after a minimum threshold has been reached: Enough of its neighboring nerve cells must send signals across the neuron’s synapses before it will fire off its own electrical spike. It occurred to McCulloch that this set-up was binary—either the neuron fires or it doesn’t. A neuron’s signal, he realized, is a proposition, and neurons seemed to work like logic gates, taking in multiple inputs and producing a single output. By varying a neuron’s firing threshold, it could be made to perform “and,” “or,” and “not” functions.  

I’m curious what year this was, particularly in relation to Claude Shannon’s master’s thesis in which he applied Boolean algebra to electronics.
Based on their meeting date, it would have to be after 1940. And they published in 1943: https://link.springer.com/article/10.1007%2FBF02478259

March 03, 2019 at 06:14PM
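As a toy illustration of the passage above (my own sketch, not from the article or the 1943 paper's notation), a McCulloch-Pitts-style unit fires only when no inhibitory input is active and enough excitatory inputs are on; varying the threshold yields "and," "or," and "not":

```python
def mp_neuron(inputs, threshold, inhibitory=()):
    """A McCulloch-Pitts-style unit: outputs 1 iff no inhibitory input is
    active and the count of active excitatory inputs meets the threshold."""
    if any(inputs[i] for i in inhibitory):
        return 0
    excitatory = [x for i, x in enumerate(inputs) if i not in inhibitory]
    return 1 if sum(excitatory) >= threshold else 0

# Varying the threshold (and using inhibition) yields the basic gates:
AND = lambda a, b: mp_neuron([a, b], threshold=2)            # both required
OR  = lambda a, b: mp_neuron([a, b], threshold=1)            # either suffices
NOT = lambda a: mp_neuron([a], threshold=0, inhibitory=(0,)) # pure inhibition

print(AND(1, 1), OR(0, 1), NOT(1))
```

Strung together, such units compute any Boolean function, which is the bridge to Shannon's switching-circuit result.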

McCulloch and Pitts alone would pour the whiskey, hunker down, and attempt to build a computational brain from the neuron up.  

A nice way to pass the time, to be sure. Naturally, mathematicians would have been turning "coffee into theorems" instead of whiskey.

March 03, 2019 at 06:15PM

“an idea wrenched out of time.” In other words, a memory.  

March 03, 2019 at 06:17PM

McCulloch and Pitts wrote up their findings in a now-seminal paper, “A Logical Calculus of Ideas Immanent in Nervous Activity,” published in the Bulletin of Mathematical Biophysics.  

March 03, 2019 at 06:21PM

I really like this picture here. Perhaps for a business card?
colorful painting of man sitting with abstract structure around him
March 03, 2019 at 06:23PM

it had been Wiener who discovered a precise mathematical definition of information: The higher the probability, the higher the entropy and the lower the information content.  

Oops, I think this article is confusing Wiener with Claude Shannon?

March 03, 2019 at 06:34PM

By the fall of 1943, Pitts had moved into a Cambridge apartment, was enrolled as a special student at MIT, and was studying under one of the most influential scientists in the world.  

March 03, 2019 at 06:32PM

Thus formed the beginnings of the group who would become known as the cyberneticians, with Wiener, Pitts, McCulloch, Lettvin, and von Neumann its core.  

Wiener always did like "cyberneticians" for its parallelism with "mathematicians"….

March 03, 2019 at 06:38PM

In the entire report, he cited only a single paper: “A Logical Calculus” by McCulloch and Pitts.  

First Draft of a Report on the EDVAC by John von Neumann

March 03, 2019 at 06:43PM

Oliver Selfridge, an MIT student who would become “the father of machine perception”; Hyman Minsky, the future economist; and Lettvin.  

March 03, 2019 at 06:44PM

at the Second Cybernetic Conference, Pitts announced that he was writing his doctoral dissertation on probabilistic three-dimensional neural networks.  

March 03, 2019 at 06:44PM

In June 1954, Fortune magazine ran an article featuring the 20 most talented scientists under 40; Pitts was featured, next to Claude Shannon and James Watson.  

March 03, 2019 at 06:46PM

Lettvin, along with the young neuroscientist Patrick Wall, joined McCulloch and Pitts at their new headquarters in Building 20 on Vassar Street. They posted a sign on the door: Experimental Epistemology.  

March 03, 2019 at 06:47PM

“The eye speaks to the brain in a language already highly organized and interpreted,” they reported in the now-seminal paper “What the Frog’s Eye Tells the Frog’s Brain,” published in 1959.  

March 03, 2019 at 06:50PM

There was a catch, though: This symbolic abstraction made the world transparent but the brain opaque. Once everything had been reduced to information governed by logic, the actual mechanics ceased to matter—the tradeoff for universal computation was ontology. Von Neumann was the first to see the problem. He expressed his concern to Wiener in a letter that anticipated the coming split between artificial intelligence on one side and neuroscience on the other. “After the great positive contribution of Turing-cum-Pitts-and-McCulloch is assimilated,” he wrote, “the situation is rather worse than better than before. Indeed these authors have demonstrated in absolute and hopeless generality that anything and everything … can be done by an appropriate mechanism, and specifically by a neural mechanism—and that even one, definite mechanism can be ‘universal.’ Inverting the argument: Nothing that we may know or learn about the functioning of the organism can give, without ‘microscopic,’ cytological work any clues regarding the further details of the neural mechanism.”  

March 03, 2019 at 06:54PM

Nature had chosen the messiness of life over the austerity of logic, a choice Pitts likely could not comprehend. He had no way of knowing that while his ideas about the biological brain were not panning out, they were setting in motion the age of digital computing, the neural network approach to machine learning, and the so-called connectionist philosophy of mind.  

March 03, 2019 at 06:55PM

by stringing them together exactly as Pitts and McCulloch had discovered, you could carry out any computation.  

I feel like this is something more akin to what may have been already known from Boolean algebra and Whitehead/Russell by this time. Certainly Shannon would have known of it?

March 03, 2019 at 06:58PM

👓 Henry Quastler | Wikipedia

Read Henry Quastler (Wikipedia)
Henry Quastler (November 11, 1908 – July 4, 1963) was an Austrian physician and radiologist who became a pioneer in the field of information theory applied to biology after emigrating to America. His work with Sidney Dancoff led to the publication of what is now commonly called Dancoff's Law.

Spent a moment to make a few additions to the page as well…

👓 Dr. Hubert Yockey of Bel Air, Director of APG Reactor; Manhattan Project Nuclear Physicist, Dies at 99 | The Dagger

Read Dr. Hubert Yockey of Bel Air, Director of APG Reactor; Manhattan Project Nuclear Physicist, Dies at 99 (The Dagger - Local News with an Edge)
Hubert Palmer Yockey, 99, died peacefully under hospice care at his home in Bel Air, MD, on January 31, 2016, with his daughter, Cynthia Yockey, at his side. Born in Alexandria, Minnesota, he was t…

👓 Book review by Nicolas Rashevsky of Information theory in biology | The bulletin of mathematical biophysics

Read Book review of Information theory in biology by Nicolas Rashevsky (The bulletin of mathematical biophysics, June 1954, Volume 16, Issue 2, pp 183–185)

While sifting through some old bookmarks from CiteULike, which is going to disappear from the web soon, I ran across one for this book review of Henry Quastler's book Information Theory in Biology (1953).

The last page of the review had an interesting information-theoretical take not only on book reviews, but on the level of information they contain with respect to improved teaching and learning, in an era prior to Mihaly Csikszentmihalyi's ideas about "flow".

As it isn’t the easiest thing to track down, I’ll quote the relevant paragraphs from page 185:

The purpose of a scientific book (we at least hope!) is to store and convey information in a given field. The purpose of a review is to convey  information about a book. It is therefore legitimate to attempt a mathematical theory of writing books and to find the optimal conditions which make a book good. At first it may seem that the optimal conditions consist of maximizing the amount of information per page, that is, in minimizing the redundancy. But a certain amount of redundancy may not only be desirable, but necessary. When presenting a new subject to young students who have never heard of it, a judicious amount of repetition is good pedagogy. Giving an exact abstract definition and then illustrating it by an example already constitutes a logical redundancy. But how useful it frequently is! The minimum of redundancy that is found in some well-known and excellent mathematical books (nomina sunt odiosa!) occasionally makes those books difficult to read even for mathematicians.
The optimum amount of redundancy is a function of the information and intelligence of the reader for whom the book is written. The analytical form of this function is to be determined by an appropriate mathematical theory of learning. Writing a book even in a field which belongs entirely to the domains of Her Majesty the Queen of Sciences is, alas, still more an art than a science. Is it not possible, however, that in the future it may become an exact science?
If a reviewer’s information and intelligence are exactly equal to the value for which the book has been optimized, then he will perceive as defects in the book only deviations from the optimal conditions. His criticism will be objective and unbiased. If, however, the reviewer’s information and intelligence deviate in any direction from the value for which the book is intended, then he will perceive shortcomings which are not due to the deviation of the book from the optimum, but to the reviewer’s personal characteristics. He may also perceive some advantages in the same way. If in the society of the future every individual will be tagged, through appropriate tests, as to his information and intelligence at a given time, expressed in appropriate units, then a reviewer will be able to calculate the correction for his personal bias. These are fantastic dreams of today, which may become reality in the future.

Some of this is very indicative of why one has to spend significant time finding and recommending the right textbooks [1][2] for students, and why things like personalized learning and improvements in pedagogy are so painfully difficult. Sadly, on the pedagogy side we haven't come as far as he may have hoped in nearly 70 years, and, in fact, we may have regressed.

I've often seen web developers in the IndieWeb community mention the idea that "naming things is hard," so I can't help noticing that this 1950s reviewer uses the Latin catchphrase nomina sunt odiosa, which translates as "names are odious", a very similar but far older sentiment about naming. It was apparently a problem for the ancients as well.

📑 Walter Pitts by Neil Smalheiser | Journal Perspectives in Biology and Medicine

Bookmarked Walter Pitts by Neil Smalheiser (Perspectives in Biology and Medicine, Volume 43, Issue 2, pp. 217–226)
Walter Pitts was pivotal in establishing the revolutionary notion of the brain as a computer, which was seminal in the development of computer design, cybernetics, artificial intelligence, and theoretical neuroscience. He was also a participant in a large number of key advances in 20th-century science.  

This looks like an interesting bio to read.

📑 A logical calculus of the ideas immanent in nervous activity by Warren S. McCulloch, Walter Pitts

Bookmarked A logical calculus of the ideas immanent in nervous activity by Warren S. McCulloch, Walter Pitts (The bulletin of mathematical biophysics December 1943, Volume 5, Issue 4, pp 115–133)
Because of the “all-or-none” character of nervous activity, neural events and the relations among them can be treated by means of propositional logic. It is found that the behavior of every net can be described in these terms, with the addition of more complicated logical means for nets containing circles; and that for any logical expression satisfying certain conditions, one can find a net behaving in the fashion it describes. It is shown that many particular choices among possible neurophysiological assumptions are equivalent, in the sense that for every net behaving under one assumption, there exists another net which behaves under the other and gives the same results, although perhaps not in the same time. Various applications of the calculus are discussed.

Found reference to this journal article in a review of Henry Quastler’s book Information Theory in Biology. It said:

A more serious thing, in the reviewer’s opinion, is the complete absence of contributions dealing with information theory and the central nervous system, which may be the field par excellence for the use of such a theory. Although no explicit reference to information theory is made in the well-known paper of W. McCulloch and W. Pitts (1943), the connection is quite obvious. This is made explicit in the systematic elaboration of the McCulloch-Pitts’ approach by J. von Neumann (1952). In his interesting book J. T. Culbertson (1950) discussed possible neural mechanisms for recognition of visual patterns, and particularly investigated the problems of how greatly a pattern may be deformed without ceasing to be recognizable. The connection between this problem and the problem of distortion in the theory of information is obvious. The work of Anatol Rapoport and his associates on random nets, and especially on their applications to rumor spread (see the series of papers which appeared in this Journal during the past four years), is also closely connected with problems of information theory.

Electronic copy available at: http://www.cse.chalmers.se/~coquand/AUTOMATA/mcp.pdf

🔖 Information Theory in Biology by Henry Quastler (editor) | University of Illinois Press (1953)

Bookmarked Information Theory in Biology (University of Illinois Press (1953))

I'd love to have a copy of this book, which I don't think I'd heard of before. I've already got his later Symposium on Information Theory in Biology (1958). That volume credits this prior book as inspiration for the symposium.

I suspect, based on the Wikipedia article for Quastler, that this may also be the same book as the slightly differently titled Essays on the Use of Information Theory in Biology (Urbana: University of Illinois Press, 1953). There's also a 1955 review of the text under this name.

Google uses the first title with 273 pages, and the Symposium text specifically cites Information Theory in Biology as the correct title several times.

The tough part is that very few copies are available online, and those that are tend to be used, in poor condition, and priced at $100+. Ugh…

📖 Read pages 54-60 of 251 of The Demon in the Machine by Paul Davies

📖 Read pages 54-60 of 251 of The Demon in the Machine: How Hidden Webs of Information Are Finally Solving the Mystery of Life by Paul Davies

I've seen a few places in the text where he references "group(s) of Japanese scientists" in a collective way, whereas when the scientists are from the West he tends to name at least a principal investigator, if not multiple members of a team. Is this implicit bias? I hope not, but it feels very conspicuous and regular to me, and I wish it weren't there.

Photo of the book The Demon in the Machine by Paul Davies sitting on a wooden table. The cover is primarily the title in a large font superimposed on a wireframe of a bird in which the wireframe is meant to look like nodes in a network

📺 Kinesin protein walking on microtubule | YouTube

Watched Kinesin protein walking on microtubule from YouTube

Extracted from The Inner Life of a Cell by Cellular Visions and Harvard (http://www.studiodaily.com/2006/07/cellular-visions-the-inner-life-of-a-cell/)

hat tip to reference in note 21 on page 221 of The Demon in the Machine by Paul Davies.

I’m pretty certain that I’ve seen this or something very similar to it in another setting. (Television perhaps?)

👓 Differential privacy, an easy case | accuracyandprivacy.substack.com

Read Differential privacy, an easy case (accuracyandprivacy.substack.com)
By law, the Census Bureau is required to keep our responses to its questionnaires confidential. And so, over decades, it has applied several “disclosure avoidance” techniques when it publishes data — these have been meticulously catalogued by Laura McKenna

I could envision some interesting use cases for differential privacy like this within an IndieWeb framework for aggregated data potentially used for web discovery.
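As a sketch of the general idea, here is randomized response, one of the oldest mechanisms satisfying a form of differential privacy (the Census Bureau's actual disclosure-avoidance techniques are more elaborate; the function names below are my own, purely illustrative):

```python
import random

def randomized_response(truth, p_honest=0.75):
    """Answer honestly with probability p_honest; otherwise answer with a
    fair coin flip. Any single response is plausibly deniable."""
    if random.random() < p_honest:
        return truth
    return random.random() < 0.5

def estimate_rate(responses, p_honest=0.75):
    """Invert the noise: E[observed] = p_honest * true_rate + (1 - p_honest) * 0.5."""
    observed = sum(responses) / len(responses)
    return (observed - (1 - p_honest) * 0.5) / p_honest

random.seed(42)
# Simulate 100,000 respondents, 30% of whom hold some sensitive attribute.
truths = [random.random() < 0.3 for _ in range(100_000)]
noisy = [randomized_response(t) for t in truths]
print(estimate_rate(noisy))  # close to the true rate of 0.30
```

The trade-off is the same one the article series explores: each individual answer is noisy enough to protect, yet the aggregate remains accurate, which is exactly the property an aggregated web-discovery dataset would want.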

📖 Read pages 1-26 of 251 of The Demon in the Machine by Paul Davies

📖 Read pages 1-26 of 251 of The Demon in the Machine: How Hidden Webs of Information Are Finally Solving the Mystery of Life by Paul Davies

He seems to have a reasonable opening history here. He references several researchers I’m familiar with and trust at least.

Photo of the book The Demon in the Machine by Paul Davies sitting on a wooden table. The cover is primarily the title in a large font superimposed on a wireframe of a bird in which the wireframe is meant to look like nodes in a network

Acquired The Demon in the Machine: How Hidden Webs of Information Are Finally Solving the Mystery of Life by Paul Davies

Acquired The Demon in the Machine: How Hidden Webs of Information Are Finally Solving the Mystery of Life by Paul Davies (Allen Lane)

How does life create order from chaos? And just what is life, anyway? Leading physicist Paul Davies argues that to find the answers, we must first answer a deeper question: 'What is information?' To understand the origins and nature of life, Davies proposes a radical vision of biology which sees the underpinnings of life as similar to circuits and electronics, arguing that life as we know it should really be considered a phenomenon of information storage. In an extraordinary deep dive into the real mechanics of what we take for granted, Davies reveals how biological processes, from photosynthesis to birds' navigation abilities, rely on quantum mechanics, and explores whether quantum physics could prove to be the secret key of all life on Earth. Lively and accessible, Demons in the Machine boils down intricate interdisciplinary developments to take readers on an eye-opening journey towards the ultimate goal of science: unifying all theories of the living and the non-living, so that humanity can at last understand its place in the universe.

book cover The Demon in the Machine

Ordered from Amazon on February 4th and had it shipped from the UK because I wasn’t sure when the book was going to finally be released in the US.