McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence.
A quick note on a factual and temporal error: the article states:
After all, it had been Wiener who discovered a precise mathematical definition of information: The higher the probability, the higher the entropy and the lower the information content.
In fact, it was Claude E. Shannon, one of Wiener’s colleagues, who wrote the influential A Mathematical Theory of Communication, published in the Bell System Technical Journal in 1948, nearly five years after the 1943 point in the article’s timeline. Not only did Wiener not write the paper, it didn’t yet exist to have influenced Pitts’ choice of a school or adviser at the time. While Wiener may have been a tremendous polymath, I suspect that his mathematical area of expertise during those years was closer to analysis than to probability theory.
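For reference (my own addition here, not something from the article), the definition being attributed is Shannon’s from 1948: the less probable a message, the more information it carries, and entropy is the average information of a source:

```latex
% Shannon (1948): self-information of an outcome x with probability p(x)
I(x) = -\log_2 p(x) \quad \text{(bits)}
% Entropy of a source X = the expected self-information over its outcomes
H(X) = -\sum_{x} p(x) \log_2 p(x)
```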
To put Pitts & McCulloch’s work into additional context, Claude Shannon’s stunning MIT master’s thesis, A Symbolic Analysis of Relay and Switching Circuits (written in 1937 and published in 1938), applied Boolean algebra to relay and switching circuits for the first time and, in doing so, laid much of the groundwork for the digital age. It would be nice to know whether Pitts & McCulloch were aware of it when they published their own work five years later.
Walter Pitts was used to being bullied. He’d been born into a tough family in Prohibition-era Detroit, where his father, a boiler-maker,…
Highlights, Quotes, Annotations, & Marginalia
McCulloch was a confident, gray-eyed, wild-bearded, chain-smoking philosopher-poet who lived on whiskey and ice cream and never went to bed before 4 a.m. ❧
Now that is a business card title!
March 03, 2019 at 06:01PM
McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence. ❧
tl;dr
March 03, 2019 at 06:06PM
Gottfried Leibniz. The 17th-century philosopher had attempted to create an alphabet of human thought, each letter of which represented a concept and could be combined and manipulated according to a set of logical rules to compute all knowledge—a vision that promised to transform the imperfect outside world into the rational sanctuary of a library. ❧
I don’t think I’ve ever heard this quirky story…
March 03, 2019 at 06:08PM
Which got McCulloch thinking about neurons. He knew that each of the brain’s nerve cells only fires after a minimum threshold has been reached: Enough of its neighboring nerve cells must send signals across the neuron’s synapses before it will fire off its own electrical spike. It occurred to McCulloch that this set-up was binary—either the neuron fires or it doesn’t. A neuron’s signal, he realized, is a proposition, and neurons seemed to work like logic gates, taking in multiple inputs and producing a single output. By varying a neuron’s firing threshold, it could be made to perform “and,” “or,” and “not” functions. ❧
I’m curious what year this was, particularly in relation to Claude Shannon’s master’s thesis in which he applied Boolean algebra to electronics.
Based on their meeting date, it would have to be after 1940. And they published in 1943: https://link.springer.com/article/10.1007%2FBF02478259 (see the threshold-unit sketch below).
March 03, 2019 at 06:14PM
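To make the threshold idea in that highlight concrete, here’s a minimal sketch of a McCulloch–Pitts-style unit in Python. The weighted-sum formulation and the particular weights and thresholds are my own illustrative choices (the 1943 paper phrases things in terms of excitatory and inhibitory inputs rather than weights), but the gates fall out the same way:

```python
# Illustrative sketch of a McCulloch-Pitts-style threshold unit.
# Inputs and outputs are binary; the unit "fires" (returns 1) only when
# the weighted sum of its inputs meets or exceeds its threshold.

def mp_unit(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Different thresholds (and one inhibitory weight) yield the basic gates:
def AND(a, b):
    return mp_unit([a, b], [1, 1], threshold=2)   # fires only if both inputs fire

def OR(a, b):
    return mp_unit([a, b], [1, 1], threshold=1)   # fires if either input fires

def NOT(a):
    return mp_unit([a], [-1], threshold=0)        # a single inhibitory input

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(f"a={a} b={b}  AND={AND(a, b)}  OR={OR(a, b)}  NOT a={NOT(a)}")
```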
McCulloch and Pitts alone would pour the whiskey, hunker down, and attempt to build a computational brain from the neuron up. ❧
A nice way to pass the time, to be sure. Naturally, mathematicians would have been turning “coffee into theorems” instead of whiskey.
March 03, 2019 at 06:15PM
“an idea wrenched out of time.” In other words, a memory. ❧
March 03, 2019 at 06:17PM
McCulloch and Pitts wrote up their findings in a now-seminal paper, “A Logical Calculus of Ideas Immanent in Nervous Activity,” published in the Bulletin of Mathematical Biophysics. ❧
it had been Wiener who discovered a precise mathematical definition of information: The higher the probability, the higher the entropy and the lower the information content. ❧
Oops, I think this article is confusing Wiener with Claude Shannon?
March 03, 2019 at 06:34PM
By the fall of 1943, Pitts had moved into a Cambridge apartment, was enrolled as a special student at MIT, and was studying under one of the most influential scientists in the world. ❧
March 03, 2019 at 06:32PM
Thus formed the beginnings of the group who would become known as the cyberneticians, with Wiener, Pitts, McCulloch, Lettvin, and von Neumann its core. ❧
Wiener always did like the term cyberneticians for its parallelism with mathematicians…
March 03, 2019 at 06:38PM
In the entire report, he cited only a single paper: “A Logical Calculus” by McCulloch and Pitts. ❧
First Draft of a Report on the EDVAC by John von Neumann
March 03, 2019 at 06:43PM
Oliver Selfridge, an MIT student who would become “the father of machine perception”; Hyman Minsky, the future economist; and Lettvin. ❧
March 03, 2019 at 06:44PM
at the Second Cybernetic Conference, Pitts announced that he was writing his doctoral dissertation on probabilistic three-dimensional neural networks. ❧
March 03, 2019 at 06:44PM
In June 1954, Fortune magazine ran an article featuring the 20 most talented scientists under 40; Pitts was featured, next to Claude Shannon and James Watson. ❧
March 03, 2019 at 06:46PM
Lettvin, along with the young neuroscientist Patrick Wall, joined McCulloch and Pitts at their new headquarters in Building 20 on Vassar Street. They posted a sign on the door: Experimental Epistemology. ❧
March 03, 2019 at 06:47PM
“The eye speaks to the brain in a language already highly organized and interpreted,” they reported in the now-seminal paper “What the Frog’s Eye Tells the Frog’s Brain,” published in 1959. ❧
March 03, 2019 at 06:50PM
There was a catch, though: This symbolic abstraction made the world transparent but the brain opaque. Once everything had been reduced to information governed by logic, the actual mechanics ceased to matter—the tradeoff for universal computation was ontology. Von Neumann was the first to see the problem. He expressed his concern to Wiener in a letter that anticipated the coming split between artificial intelligence on one side and neuroscience on the other. “After the great positive contribution of Turing-cum-Pitts-and-McCulloch is assimilated,” he wrote, “the situation is rather worse than better than before. Indeed these authors have demonstrated in absolute and hopeless generality that anything and everything … can be done by an appropriate mechanism, and specifically by a neural mechanism—and that even one, definite mechanism can be ‘universal.’ Inverting the argument: Nothing that we may know or learn about the functioning of the organism can give, without ‘microscopic,’ cytological work any clues regarding the further details of the neural mechanism.” ❧
March 03, 2019 at 06:54PM
Nature had chosen the messiness of life over the austerity of logic, a choice Pitts likely could not comprehend. He had no way of knowing that while his ideas about the biological brain were not panning out, they were setting in motion the age of digital computing, the neural network approach to machine learning, and the so-called connectionist philosophy of mind. ❧
March 03, 2019 at 06:55PM
by stringing them together exactly as Pitts and McCulloch had discovered, you could carry out any computation. ❧
I feel like this is something that may already have been known from Boolean algebra and from Whitehead and Russell by this time. Certainly Shannon would have known of it?
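As a toy illustration of the “stringing together” point (and, per the note above, of how close this is to plain Boolean algebra), here is a hypothetical two-layer composition of the same kind of threshold unit computing XOR, something no single unit can do on its own; the weights and layout are my own, not anything from the 1943 paper:

```python
# Toy example: composing threshold units into a two-layer network for XOR.

def mp_unit(inputs, weights, threshold):
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

def XOR(a, b):
    h1 = mp_unit([a, b], [1, -1], threshold=1)     # fires for "a and not b"
    h2 = mp_unit([a, b], [-1, 1], threshold=1)     # fires for "b and not a"
    return mp_unit([h1, h2], [1, 1], threshold=1)  # OR of the two hidden units

assert [XOR(a, b) for a, b in ((0, 0), (0, 1), (1, 0), (1, 1))] == [0, 1, 1, 0]
```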
Companies and apps constantly ask for ratings, but all that data may just be noise in the system.
A great framing of the crazy digital exhaust that online services and apps are collecting that doesn’t do much. I’ve also thought for a while about the signal-to-noise ratio of these types of data, as well as their quantization levels, which often don’t make much sense to me. I don’t think there are any IndieWeb realizations of these sorts of (mostly business) systems in the wild yet, but this is an important area to begin considering when they do appear.
A review of Stanislaw Lem’s Summa Technologiae by David Auerbach from the Los Angeles Review of Books.
Summa Technologiae
AT LAST WE have it in English. Summa Technologiae, originally published in Polish in 1964, is the cornerstone of Stanislaw Lem’s oeuvre, his consummate work of speculative nonfiction. Trained in medicine and biology, Lem synthesizes the current science of the day in ways far ahead of most science fiction of the time.
I came across this book review quite serendipitously today via an Auerbach article in Slate, which I’ve bookmarked. I found a copy of the book and have added it to the top of my reading pile. As I’m currently reading an advance reader edition of Sean Carroll’s The Big Picture, I can only imagine how well the two may go together despite being written more than 50 years apart.