Replied to a tweet by codexeditor (Twitter)
@brunowinck @codexeditor @alanlaidlaw When thinking about this, recall that in the second paragraph of The Mathematical Theory of Communication (University of Illinois Press, 1949), Claude Shannon explicitly separates the semantic meaning from the engineering problem of communication. 
Highlight from the book with the underlined sentence: "These semantic aspects of communication are irrelevant to the engineering problem."
Replied to The Memex, the Manhattan Project, and the month of July 1945 by Matt Webb (Interconnected)
I wonder about these two legacies, the Memex and the Manhattan Project, and which has had the greater influence on the world.
Some revisionist history here glorifying Bush and the Memex without any mention of the long historical precedent of the commonplace book.

So for Bush’s greatest legacy, my answer would have to be his supervision and support of Claude Shannon’s work at MIT.

Five: The Memex

This is just what Vannevar Bush suggests in his famous article As We May Think in the July 1945 issue of The Atlantic. Here he posits the Memex, and opens up the idea of networked information.

#HeyPresstoConf20


The internet itself could be thought of as a massive living and ever-growing commonplace book which can be digitally queried to provide the answers to nearly every conceivable question.

(Some may forget that Bush was the thesis advisor of Claude Shannon, the father of the modern digital age.)

Watched The Unitive Web from YouTube

This explains our proposal for a new generation of the Web, which we call The Unitive Web. Currently, there is a growing movement from the independence of the web, towards dominant companies. These companies offer organized information, but this comes at a price. We lose our independence more and more. The Unitive Web is a proposal to have both organized information and independence. It offers one generic approach closely compatible with the current web, which makes it possible to create a global open virtual space of information that is responsive and reliable. It offers open customization of user interaction, open bottom-up schema mapping, integration of (AI) algorithms, and facilitates in the protection of privacy. Kickstarter project will be launched if/when there is some interest first.

I only got about halfway through this watching at 1.5x. I’m purposely not embedding the video.

Some of the basic ideas about complexity theory are intriguing here, but it feels more like they're trying to ground their ideas in solid science without actually having the proper grounding. I can't help but think of Claude Shannon's article The Bandwagon and see the same things that happened with information theory now taking place with complexity theory.

They’re also proposing a huge amount of infrastructure that is tenuous at best. I’m more than happy to await a minimal example before considering this further.

It almost kills me that they can’t be bothered to create a Kickstarter without “further interest.”

👓 What can Schrödinger’s cat say about 3D printers on Mars? | Aeon | Aeon Essays

Read What can Schrödinger’s cat say about 3D printers on Mars? by Michael Lachmann and Sara Walker (Aeon | Aeon Essays)
A cat is alive, a sofa is not: that much we know. But a sofa is also part of life. Information theory tells us why
A nice little essay in my area, but I’m not sure there’s anything new in it for me. It is nice that they’re trying to break some of the problem down into smaller components before building it back up into something else. Reframing things can always be helpful. Here, in particular, they’re reframing the definitions of life and alive.

📺 The Bit Player (Trailer) | IEEE Information Theory Society

Watched The Bit Player (Trailer) from IEEE Information Theory Society

In a blockbuster paper in 1948, Claude Shannon introduced the notion of a "bit" and laid the foundation for the information age. His ideas ripple through nearly every aspect of modern life, influencing such diverse fields as communication, computing, cryptography, neuroscience, artificial intelligence, cosmology, linguistics, and genetics. But when interviewed in the 1980s, Shannon was more interested in showing off the gadgets he’d constructed — juggling robots, a Rubik’s Cube solving machine, a wearable computer to win at roulette, a unicycle without pedals, a flame-throwing trumpet — than rehashing the past. Mixing contemporary interviews, archival film, animation and dialogue drawn from interviews conducted with Shannon himself, The Bit Player tells the story of an overlooked genius who revolutionized the world, but never lost his childlike curiosity.

👓 Bob Gallager on Shannon’s tips for research | An Ergodic Walk

Annotated Bob Gallager on Shannon’s tips for research (An Ergodic Walk)

Gallager gave a nice concise summary of what he learned from Shannon about how to do good theory work:

  1. Simplify the problem
  2. Relate it to other problems
  3. Restate the problem in as many ways as possible
  4. Break the problem into pieces
  5. Avoid getting locked into thinking ruts
  6. Generalize

As he said, “it’s a process of doing research… each one [step] gives you a little insight.” It’s tempting, as a theorist, to claim that at the end of this process you’ve solved the “fundamental” problem, but Gallager admonished us to remember that the first step is to simplify, often dramatically. As Alfred North Whitehead said, we should “seek simplicity and distrust it.”

I know I’ve read this before, but it deserves a re-read/review every now and then.

👓 Celebrating the Work and Life of Claude Elwood Shannon | IEEE Foundation

Read Celebrating the Work and Life of Claude Elwood Shannon (ieeefoundation.org)

In 2014 IEEE Information Theory Society President, Michelle Effros, knew that something had to be done. The man who coined the very phrase, Information Theory, had largely been forgotten. Given his importance, and the growing impact that his work was having on society at large, she led the IEEE Information Theory Society on a quest to use the Centennial of Claude Shannon’s birth to right this injustice.

A series of activities were planned, including a dual IEEE Milestone dedicated at both Nokia Bell Labs and MIT. Such was his stature that both institutions were intent on honoring the work he accomplished on their respective sites. His work, after all, foresaw and paved the way for the Information Revolution that we are experiencing, making possible everything from cell phones to GPS to Bitcoin.

By the time of the Nokia Bell Labs event, the keystone project – a documentary on Shannon’s life – was in the formative stages. IEEE Information Theory Society leadership had secured the services of Mark Levinson, of Particle Fever acclaim. The script was being written and preliminary plans were underway.

To make the film a reality, a coalition of individuals, foundations and corporations came together with the common objective to bring the story of Shannon to as wide an audience as possible. An effective partnership was forged with the IEEE Foundation which was undertaking its own unique project - its first ever major fundraising campaign. The combination proved to be a winning entry, and the Shannon Centennial quickly became exemplary of the impact that can occur when the power of volunteers is bolstered by effective staff support.

19 June was the World Premiere of the finished product. The Bit Player was screened to a full house on the big screen at the IEEE Information Theory Society’s meeting in Vail, CO, US. The film was met with enthusiastic acclaim. Following the screening attendees were treated to a Q&A with the film’s director and star.

Among the techniques used to tell Shannon’s story was the testimony of current luminaries in the fields he inspired. All spoke of his importance and the need for his impact to be recognized. As one contributor, Andrea Goldsmith, Stephen Harris Professor in the School of Engineering, Stanford University, put it, “Today everyone carries Shannon around in their pocket”.

Based on this article, the Claude Shannon movie The Bit Player has already had its premiere. I updated the IMDb entry, but I still have to wonder whether it will ever get any distribution so that the rest of us might see it.

Reply to The Man Who Tried to Redeem the World with Logic | Nautilus

Replied to The Man Who Tried to Redeem the World with Logic by Amanda Gefter (Nautilus)
McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence.
Quick note of a factual and temporal error: the article indicates:

After all, it had been Wiener who discovered a precise mathematical definition of information: The higher the probability, the higher the entropy and the lower the information content.

In fact, it was Claude E. Shannon, one of Wiener’s colleagues, who wrote the influential A Mathematical Theory of Communication, published in the Bell System Technical Journal in 1948, almost five years after the 1943 point in the article’s timeline. Not only did Wiener not write the paper, it didn’t yet exist at the time Pitts was choosing a school and adviser, so it couldn’t have been a factor in his decision. While Wiener was a tremendous polymath, I suspect that his mathematical expertise during those years lay closer to analysis than to probability theory.
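To see Shannon's actual definition in action: the information content of an event is −log₂ p, so a more probable event carries less information, and a source's entropy is the average of that quantity over its distribution. A minimal numerical sketch of my own (not from either article):

```python
import math

def self_information(p: float) -> float:
    """Information content (in bits) of an event with probability p: -log2(p)."""
    return -math.log2(p)

def entropy(dist: list[float]) -> float:
    """Shannon entropy H(X) = -sum(p * log2 p), in bits per symbol."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# A more probable event carries *less* information:
print(self_information(0.5))   # 1.0 bit
print(self_information(0.99))  # ~0.0145 bits

# A maximally uncertain (uniform) source has maximal entropy:
print(entropy([0.5, 0.5]))    # 1.0 bit
print(entropy([0.99, 0.01]))  # ~0.081 bits
```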

To put Pitts & McCulloch’s work into additional context: Claude Shannon’s stunning MIT master’s thesis, A Symbolic Analysis of Relay and Switching Circuits (written in 1937 and published in 1938), applied Boolean algebra to relay and switching circuits for the first time and, as a result, largely allowed the digital age to blossom. It would be nice to know whether Pitts & McCulloch were aware of it when they published their own work in 1943.

👓 The Man Who Tried to Redeem the World with Logic | Issue 21: Information – Nautilus

Read The Man Who Tried to Redeem the World with Logic (Nautilus)
Walter Pitts was used to being bullied. He’d been born into a tough family in Prohibition-era Detroit, where his father, a boiler-maker,…

Highlights, Quotes, Annotations, & Marginalia

McCulloch was a confident, gray-eyed, wild-bearded, chain-smoking philosopher-poet who lived on whiskey and ice cream and never went to bed before 4 a.m.  

Now that is a business card title!

March 03, 2019 at 06:01PM

McCulloch and Pitts were destined to live, work, and die together. Along the way, they would create the first mechanistic theory of the mind, the first computational approach to neuroscience, the logical design of modern computers, and the pillars of artificial intelligence.  

tl;dr

March 03, 2019 at 06:06PM

Gottfried Leibniz. The 17th-century philosopher had attempted to create an alphabet of human thought, each letter of which represented a concept and could be combined and manipulated according to a set of logical rules to compute all knowledge—a vision that promised to transform the imperfect outside world into the rational sanctuary of a library.  

I don’t think I’ve ever heard this quirky story…

March 03, 2019 at 06:08PM

Which got McCulloch thinking about neurons. He knew that each of the brain’s nerve cells only fires after a minimum threshold has been reached: Enough of its neighboring nerve cells must send signals across the neuron’s synapses before it will fire off its own electrical spike. It occurred to McCulloch that this set-up was binary—either the neuron fires or it doesn’t. A neuron’s signal, he realized, is a proposition, and neurons seemed to work like logic gates, taking in multiple inputs and producing a single output. By varying a neuron’s firing threshold, it could be made to perform “and,” “or,” and “not” functions.  

I’m curious what year this was, particularly in relation to Claude Shannon’s master’s thesis in which he applied Boolean algebra to electronics.
Based on their meeting date, it would have to be after 1940. And they published in 1943: https://link.springer.com/article/10.1007%2FBF02478259

March 03, 2019 at 06:14PM
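To make the threshold idea in that passage concrete, here is a minimal sketch of my own (an illustration, not code from the article) of a McCulloch–Pitts unit, where choosing the threshold (and an inhibitory weight for NOT) turns the very same mechanism into "and," "or," and "not" gates:

```python
def mp_neuron(inputs, weights, threshold):
    """McCulloch-Pitts unit: fire (1) iff the weighted input sum meets the threshold."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def AND(a, b):  # fires only when both inputs fire
    return mp_neuron([a, b], [1, 1], threshold=2)

def OR(a, b):   # fires when at least one input fires
    return mp_neuron([a, b], [1, 1], threshold=1)

def NOT(a):     # inhibitory input: fires only when its input is silent
    return mp_neuron([a], [-1], threshold=0)

# Truth tables, as a quick check:
assert [AND(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 0, 0, 1]
assert [OR(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]] == [0, 1, 1, 1]
assert [NOT(a) for a in (0, 1)] == [1, 0]
```

Stringing such units together, as the article notes later, is enough to carry out any Boolean computation.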

McCulloch and Pitts alone would pour the whiskey, hunker down, and attempt to build a computational brain from the neuron up.  

A nice way to pass the time to be sure. Naturally mathematicians would have been turning “coffee into theorems” instead of whiskey.

March 03, 2019 at 06:15PM

“an idea wrenched out of time.” In other words, a memory.  

March 03, 2019 at 06:17PM

McCulloch and Pitts wrote up their findings in a now-seminal paper, “A Logical Calculus of Ideas Immanent in Nervous Activity,” published in the Bulletin of Mathematical Biophysics.  

March 03, 2019 at 06:21PM

I really like this picture here. Perhaps for a business card?
colorful painting of man sitting with abstract structure around him
  
March 03, 2019 at 06:23PM

it had been Wiener who discovered a precise mathematical definition of information: The higher the probability, the higher the entropy and the lower the information content.  

Oops, I think this article is confusing Wiener with Claude Shannon?

March 03, 2019 at 06:34PM

By the fall of 1943, Pitts had moved into a Cambridge apartment, was enrolled as a special student at MIT, and was studying under one of the most influential scientists in the world.  

March 03, 2019 at 06:32PM

Thus formed the beginnings of the group who would become known as the cyberneticians, with Wiener, Pitts, McCulloch, Lettvin, and von Neumann its core.  

Wiener always did like “cyberneticians” for its parallelism with “mathematicians”….

March 03, 2019 at 06:38PM

In the entire report, he cited only a single paper: “A Logical Calculus” by McCulloch and Pitts.  

First Draft of a Report on the EDVAC by John von Neumann

March 03, 2019 at 06:43PM

Oliver Selfridge, an MIT student who would become “the father of machine perception”; Hyman Minsky, the future economist; and Lettvin.  

March 03, 2019 at 06:44PM

at the Second Cybernetic Conference, Pitts announced that he was writing his doctoral dissertation on probabilistic three-dimensional neural networks.  

March 03, 2019 at 06:44PM

In June 1954, Fortune magazine ran an article featuring the 20 most talented scientists under 40; Pitts was featured, next to Claude Shannon and James Watson.  

March 03, 2019 at 06:46PM

Lettvin, along with the young neuroscientist Patrick Wall, joined McCulloch and Pitts at their new headquarters in Building 20 on Vassar Street. They posted a sign on the door: Experimental Epistemology.  

March 03, 2019 at 06:47PM

“The eye speaks to the brain in a language already highly organized and interpreted,” they reported in the now-seminal paper “What the Frog’s Eye Tells the Frog’s Brain,” published in 1959.  

March 03, 2019 at 06:50PM

There was a catch, though: This symbolic abstraction made the world transparent but the brain opaque. Once everything had been reduced to information governed by logic, the actual mechanics ceased to matter—the tradeoff for universal computation was ontology. Von Neumann was the first to see the problem. He expressed his concern to Wiener in a letter that anticipated the coming split between artificial intelligence on one side and neuroscience on the other. “After the great positive contribution of Turing-cum-Pitts-and-McCulloch is assimilated,” he wrote, “the situation is rather worse than better than before. Indeed these authors have demonstrated in absolute and hopeless generality that anything and everything … can be done by an appropriate mechanism, and specifically by a neural mechanism—and that even one, definite mechanism can be ‘universal.’ Inverting the argument: Nothing that we may know or learn about the functioning of the organism can give, without ‘microscopic,’ cytological work any clues regarding the further details of the neural mechanism.”  

March 03, 2019 at 06:54PM

Nature had chosen the messiness of life over the austerity of logic, a choice Pitts likely could not comprehend. He had no way of knowing that while his ideas about the biological brain were not panning out, they were setting in motion the age of digital computing, the neural network approach to machine learning, and the so-called connectionist philosophy of mind.  

March 03, 2019 at 06:55PM

by stringing them together exactly as Pitts and McCulloch had discovered, you could carry out any computation.  

I feel like this is something more akin to what may have been already known from Boolean algebra and Whitehead/Russell by this time. Certainly Shannon would have known of it?

March 03, 2019 at 06:58PM

🎧 Episode 025 System Theories, Racism & Human Relationships: Interview with TK Coleman | Human Current

Listened to Episode 025 System Theories, Racism & Human Relationships by Haley Campbell-Gross from Human Current

In this episode, Haley interviews TK Coleman to discuss how humans allow their conflicting mental models to influence the way they handle controversial topics like racism. TK also shares how understanding context and patterns within human systems ultimately empowers us to actively contribute to human progress.

I generally prefer the harder sciences among Human Current’s episodes, but even episodes on the applications in other areas are really solid. I’m glad to hear about TK Coleman’s overarching philosophy and the idea of “human beings” versus “human doings.”

Also glad to have the recommendation of General Systems Theory: Beginning With Wholes by Barbara G. Hanson as a more accessible text in comparison to Ludwig von Bertalanffy’s text. The gang at Human Current should set up an Amazon Affiliate link so that when I buy books they recommend (which happens frequently), it helps to support and underwrite their work.

Highlights, Quotes, Annotations, & Marginalia

Reality is objective, but meaning is contextual.

—Barbara Hanson, General Systems Theory: Beginning with Wholes quoted within the episode

This quote is an interesting recap of a sentence in the first two paragraphs of Claude Shannon’s The Mathematical Theory of Communication.

An Information Theory Playlist on Spotify

In honor of tomorrow’s release of Jimmy Soni and Rob Goodman’s book A Mind at Play: How Claude Shannon Invented the Information Age, I’ve created an Information Theory playlist on Spotify.

Songs about communication, telephones, conversation, satellites, love, auto-tune and even one about a typewriter! They all relate at least tangentially to the topic at hand. To up the ante, everyone should realize that digital music would be impossible without Shannon’s seminal work.

Let me know in the comments or by replying to one of the syndicated copies listed below if there are any great tunes that the list is missing.

Enjoy the list and the book!

👓 Meet the Authors of a Mind at Play: How Claude Shannon Invented the Information Age | IEEE Spectrum

Read Meet the Authors of a Mind at Play: How Claude Shannon Invented the Information Age (IEEE Spectrum)
Jimmy Soni and Rob Goodman wrote the first biography of the digital pioneer

📖 Read pages 16-55 of A Mind at Play by Jimmy Soni & Rob Goodman

📖 Read pages 16-55 of A Mind at Play: How Claude Shannon Invented the Information Age by Jimmy Soni & Rob Goodman

Knowing that I’ve read a lot about Shannon and even Vannevar Bush over the years, I’m pleasantly surprised to read some interesting tidbits about them that I’ve not previously come across.  I was a bit worried that this text wouldn’t provide me with much or anything new on the subjects at hand.

I’m really appreciating some of the prose and writing structure, particularly given that it’s a collaborative work between two authors. At times there are some really nonstandard sentence structures, but they’re wonderful in their rule breaking.

They’re doing an excellent job so far of explaining the more difficult pieces of science relating to information theory. In fact, some of the intro was as good as I think I’ve ever seen simple explanations of what is going on within the topic. I’m also pleased that they’ve made some interesting forays into topics like eugenics and the background role it played in the story for Shannon.

They had a chance to do a broader view of the history of computing, but opted against it, or at least must have made a conscious choice to leave out Babbage/Lovelace within the greater pantheon. I can see narratively why they may have done this knowing what is to come later in the text, but a few sentences as a nod would have been welcome.

The book does, however, get on my nerves with one of my personal pet peeves in popular science and biographical works like this: while there are reasonable notes at the end, absolutely no proper footnotes appear at the bottoms of pages or even indicators within the text other than pieces of text with quotation marks. I’m glad the notes even exist in the back, but it just drives me crazy that publishers blatantly hide them this way. The text could at least have had markers indicating where to find the notes. What are we? Animals?

Nota bene: I’m currently reading an advanced reader copy of this; the book won’t be out until mid-July 2017.